Hi, I created a simple application with the following initialization code (shortened):
// Initialize the default output device at 44100 Hz.
if (!Bass.BASS_Init(-1, 44100, BASSInit.BASS_DEVICE_DEFAULT, IntPtr.Zero))
{
throw new Exception("Bass-Init Error!");
}
// Create a stream from the file; a handle of 0 means failure.
if ((_stream = Bass.BASS_StreamCreateFile(_path, 0, -1, BASSFlag.BASS_DEFAULT)) == 0)
{
throw new PlayerInitializationException(Bass.BASS_ErrorGetCode());
}
After creating a stream from a 192000 Hz, 24-bit .flac file, Bass.BASS_ChannelGetLength(_stream, BASSMode.BASS_POS_BYTES) returns a value four times larger than for an .m4a stream of the same duration. Is this intended?
Creating a waveform with the code below produces a bitmap that is far too large to convert into a BitmapImage (in the 24-bit .flac scenario).
// _waveConfig.Speed = 2700;
// _waveConfig.Height = 70;
// Divide the (long) byte length first, then cast, so a large length isn't truncated to int prematurely.
int width = (int)(Bass.BASS_ChannelGetLength(_stream, BASSMode.BASS_POS_BYTES) / _waveConfig.Speed);
var renderedWaveForm = _waveForm.CreateBitmap(width, _waveConfig.Height - 15, -1, -1, false);
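A width derived from the raw byte count will scale with the sample rate, so a 192 kHz file would get a ~4x wider bitmap than a 48 kHz file of the same song. I suppose a rate-independent alternative would be to derive the width from the duration (BASS_ChannelBytes2Seconds gives that on the BASS side). A quick arithmetic sketch (Python, just for the numbers) of the difference, assuming BASS's default 16-bit stereo decoding and a made-up pixels_per_second constant:

```python
BYTES_PER_SAMPLE = 2  # BASS decodes to 16-bit by default (no BASS_SAMPLE_FLOAT)
CHANNELS = 2

def duration_seconds(byte_length, sample_rate):
    # The same arithmetic BASS_ChannelBytes2Seconds would perform
    # for 16-bit stereo sample data.
    return byte_length / (sample_rate * CHANNELS * BYTES_PER_SAMPLE)

# Byte-based width (what the C# code above does): depends on the sample rate.
speed = 2700
width_m4a = 63360000 // speed    # 48 kHz file
width_flac = 253440000 // speed  # 192 kHz file, ~4x wider for the same song

# Duration-based width: identical for both files.
pixels_per_second = 80  # hypothetical scale factor
width_m4a_s = duration_seconds(63360000, 48000) * pixels_per_second
width_flac_s = duration_seconds(253440000, 192000) * pixels_per_second
```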
Different stream lengths (returned by BASS_ChannelGetLength, in bytes):
MQA-encoded file: 63360000 ({MF, 48000Hz, Stereo, 24bit})
FLAC MAX: 253440000 ({MF, 192000Hz, Stereo, 24bit})
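If my math is right (assuming BASS decodes both streams to 16-bit stereo, its default without BASS_SAMPLE_FLOAT), both lengths correspond to the same duration, and the 4x factor is exactly the sample-rate ratio:

```python
def bytes_per_second(rate, channels=2, bytes_per_sample=2):
    # Decoded PCM data rate: samples/sec * channels * bytes per sample.
    return rate * channels * bytes_per_sample

m4a_len, flac_len = 63360000, 253440000

m4a_secs = m4a_len / bytes_per_second(48000)     # duration of the 48 kHz stream
flac_secs = flac_len / bytes_per_second(192000)  # duration of the 192 kHz stream
ratio = flac_len / m4a_len                       # equals 192000 / 48000
```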
Am I missing something? Thanks in advance!