Author Topic: BASS_ATTRIB_NOBUFFER and default update period problem  (Read 555 times)

Falcosoft

  • Posts: 37
Hi Ian,
Using BASS_ChannelSetAttribute(Midistream, BASS_ATTRIB_NOBUFFER, 1) or BASS_ChannelSetAttribute(Midistream, BASS_ATTRIB_BUFFER, 0) works perfectly and results in the best overall audio playback latency. But it has one minor drawback: it ignores the update period defined by BASS_SetConfig(BASS_CONFIG_UPDATEPERIOD, 5) and seems to always use 10ms. Ignoring BASS_CONFIG_UPDATEPERIOD is understandable in the sense that this setting mainly affects audio buffering, and with BASS_ATTRIB_NOBUFFER there is no such thing. But on the other hand, BASS_CONFIG_UPDATEPERIOD has a broader effect on Bass: it also determines the frequency of callback calls. And as we have already discussed, in the case of Bass_VST and VSTi plugins it therefore affects the precision of Midi timing:
Quote
Actually there is a benefit of using a 5 ms update period, but it is not latency related. As I said above, the benefit is the more frequent callback calls, which can be important if you use e.g. Bass_VST and VST instruments. The Midi timing is more precise with a 5 ms update period, since the instrument plugins get the Midi data at the callback rate, and since Bass_VST does not use the deltaFrames member of the VstMidiEvent struct (it's always 0), the precision depends only on the callback frequency.
So the current problem is that with BASS_ATTRIB_NOBUFFER you can have the best latency, but with less VSTi Midi timing precision than with buffering.
May I ask you to add a new option that could reduce the currently fixed 10ms callback period to 5ms (as is available now in buffered mode via BASS_CONFIG_UPDATEPERIOD)?
Thanks in advance.



Ian @ un4seen

  • Administrator
  • Posts: 21861
The BASS_CONFIG_UPDATEPERIOD setting only applies to playback buffering (it determines how often a stream's playback buffer is refilled). When playback buffering is disabled via BASS_ATTRIB_NOBUFFER, it is the device's update period that determines how often the stream is processed. That can be set via the BASS_CONFIG_DEV_PERIOD option but it is a fixed number on Windows (usually 10ms). To break it down into two 5ms chunks, you could use an intermediate custom stream and have the STREAMPROC call BASS_ChannelGetData twice on the VST stream (with half of the "length" parameter each). You would apply new events to the VST stream before each call, depending on which 5ms chunk they should be in.
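A rough sketch of what that intermediate stream could look like (assuming the VST stream ("vststream") is a decoding channel, and ApplyPendingEvents() is a placeholder for your own code that applies the events due in the given half of the block):

Code: [Select]
#include "bass.h"

DWORD CALLBACK SplitStreamProc(HSTREAM handle, void *buffer, DWORD length, void *user)
{
	HSTREAM vststream = *(HSTREAM*)user; // the decoding VST stream
	DWORD half = length / 2; // process the block in two halves (~5ms each at a 10ms device period)
	ApplyPendingEvents(vststream, 0); // placeholder: apply events due in the 1st half
	DWORD got = BASS_ChannelGetData(vststream, buffer, half);
	if (got == (DWORD)-1) return BASS_STREAMPROC_END;
	ApplyPendingEvents(vststream, 1); // placeholder: apply events due in the 2nd half
	DWORD got2 = BASS_ChannelGetData(vststream, (char*)buffer + got, length - got);
	if (got2 != (DWORD)-1) got += got2;
	return got;
}

...

// play the intermediate stream instead of the VST stream (format should match it)
HSTREAM intermediate = BASS_StreamCreate(44100, 2, BASS_SAMPLE_FLOAT, SplitStreamProc, &vststream);
BASS_ChannelSetAttribute(intermediate, BASS_ATTRIB_NOBUFFER, 1);
BASS_ChannelPlay(intermediate, FALSE);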

Falcosoft

  • Posts: 37
Hi Ian,
Thanks for your answer. The problem with your solution is that my code does not use any STREAMPROC or BASS_ChannelGetData; these are handled internally by the Bass_VST library. Bass_VST's architecture in this sense is similar to Bassmidi's. In the case of VSTi plugins you use BASS_VST_ChannelCreate(), which is similar to BASS_MIDI_StreamCreate(), and then you call BASS_VST_ProcessEventRaw(), which is similar to BASS_MIDI_StreamEvents(), to send Midi data. You do not need to create decoding channels; the audio data can be sent to the output directly by Bass (used internally by Bass_VST). The problem is that BASS_VST_ProcessEventRaw() puts the sent Midi data in a queue/buffer without timestamps (more precisely, without 'sample frame stamps' in the deltaFrames field of the VstMidiEvent struct) and uses a constant 0. This way it is not sample accurate, and it can only be as precise as the STREAMPROC callback rate. The callback first sends all buffered Midi data and then gets the rendered samples from the VSTi plugin. The relevant code parts are queueEventRaw() in bass_vst_impl.cpp and callProcess() in bass_vst_process.cpp. I have tried to implement a proper deltaFrames calculation in queueEventRaw() but it does not work properly:
Code: [Select]
static void queueEventRaw(BASS_VST_PLUGIN* this_, char midi0, char midi1, char midi2, const void* sysexDump, size_t sysexBytes, DWORD* error)
{
	// prepare
	EnterCriticalSection(&this_->midiCritical_);

	VstEvent** eSlot = NULL;
	VstInt32 deltaFrames;

	///falco: the deltaFrames field has to be implemented properly. A constant 0 is a very crude solution.
	if (this_ && this_->channelHandle)
	{
		// frames elapsed since the start of the current processing block:
		// current decode position (in sample frames) minus the position recorded at the last process call
		size_t sampleSize = (this_->globalChannelinfo.flags & BASS_SAMPLE_FLOAT) ? sizeof(float) : sizeof(signed short);
		deltaFrames = (VstInt32)((BASS_ChannelGetPosition(this_->channelHandle, BASS_POS_BYTE | BASS_POS_DECODE) / sampleSize / this_->globalChannelinfo.chans) - this_->globalSamplePos);
	}
	else
		deltaFrames = 0;

	///////
	...
}
this_->globalSamplePos refers to the samples accumulated in the process stage of the callback. Unfortunately this solution does not give sample accurate results. If this_->globalSamplePos is equal to the rendered samples counted internally by Bass_VST (this_->vstTimeInfo.samplePos += numSamples;), then the deltaFrames calculation results in only zeros or negative numbers. If this_->globalSamplePos is calculated similarly to deltaFrames, by using BASS_ChannelGetPosition(), then at least I do not get negative numbers, but only zeros or some bigger numbers depending on the update rate, without the necessary fine sample accuracy.
Maybe that's why no attempt was made to implement deltaFrames properly in the first place?

2. I have found another solution, but I'm not sure about its drawbacks. Namely, using BASS_ChannelUpdate() I can achieve even 1ms update rates. I have noticed that after disabling buffering (by BASS_ATTRIB_NOBUFFER) and also disabling automatic updates (by setting BASS_CONFIG_UPDATEPERIOD to 0), I still get automatic updates every 10ms. I assume it's the device period that you said is almost always 10ms on Windows. I would like to ask how these automatic updates and manual BASS_ChannelUpdate() calls interact when buffering is disabled. I have got perfect results without audio glitches even by calling BASS_ChannelUpdate(Midistream, 1), but I do not know what kind of additional latency this solution results in. I have also noticed that the callback rate is related to both the frequency of the BASS_ChannelUpdate() calls and the value given in its length parameter (e.g. a length of 5 results in a similar situation to calling BASS_ChannelUpdate() only at 5ms intervals). I do not know whether these variations give different results regarding latency, or which one is preferable.
Thanks for your time and consideration.
« Last Edit: 10 Apr '19 - 11:53 by Falcosoft »

Ian @ un4seen

  • Administrator
  • Posts: 21861
Using BASS_ChannelUpdate is indeed another way that you could achieve what you want, ie. call it in a worker thread every 5ms. Doing that is basically reenabling playback buffering (BASS_ChannelUpdate puts data in the playback buffer) but you have more control over it. If you buffer sufficient data then BASS won't have to fetch any itself. You can check how much data is buffered with BASS_ChannelGetData (BASS_DATA_AVAILABLE).
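A minimal sketch of such a worker thread could look something like this (the UpdateThread function and stop flag are illustrative, and timeBeginPeriod is used so that Sleep(5) is accurate):

Code: [Select]
#include <windows.h> // timeBeginPeriod/timeEndPeriod (link winmm)
#include "bass.h"

volatile BOOL stopupdates = FALSE; // illustrative stop flag

DWORD WINAPI UpdateThread(LPVOID param)
{
	HSTREAM midistream = *(HSTREAM*)param;
	timeBeginPeriod(1); // raise the timer resolution so Sleep(5) is accurate
	while (!stopupdates) {
		// optionally check how much data is already buffered first:
		// DWORD buffered = BASS_ChannelGetData(midistream, NULL, BASS_DATA_AVAILABLE);
		BASS_ChannelUpdate(midistream, 5); // render ~5ms of data into the playback buffer
		Sleep(5);
	}
	timeEndPeriod(1);
	return 0;
}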

Falcosoft

  • Posts: 37
Hi Ian,
It seems this problem is more general than I originally thought. It also affects Bassmidi when using real time BASS_MIDI_StreamEvents() calls, and not only WASAPI in bufferless mode but also ASIO output mode. I noticed the problem when testing KaleidonKep99's OmniMidi synth driver, which is based on Bassmidi.
The point is that the ASIO buffer size does not only influence audio latency but also Midi timing/precision. Midi timing problems can occur even at reasonable buffer sizes of 20-50 ms.
This problem is very similar to the WASAPI bufferless issues but can cause even more serious timing inaccuracies. I think the problem in the case of ASIO output is also that there is no independent update period like in the Directsound/WASAPI buffered modes; the update period always corresponds to the audio buffer size. This is not a fortunate coupling, since it means any buffer size above ~5ms can cause noticeable timing problems: all Midi data is buffered and played back at once at each buffer update. Video about the issue:
https://youtu.be/z3C7wmCdgZw?t=151
Can't you find some solution that could prevent this unfortunate direct coupling between audio buffer sizes and update periods/Midi precision? I seriously think that audio buffer sizes affecting Midi precision is not fortunate/intuitive at all.
Thanks in advance.

rv

  • Posts: 284
Yes Ian, you can find my email from 6 years ago about this problem :)

The realtime BASS_MIDI_StreamEvents should not depend on the granularity of the audio update period.
This did not bother me too much, as my program is realtime and plays at a low latency of 6ms.

Ian, maybe you can add an option so the output asks the sources for audio data every 1ms and fills the ASIO or WASAPI audio buffer in small parts?

Ian @ un4seen

  • Administrator
  • Posts: 21861
Have you tried using the BASS_MIDI_EVENTS_TIME or BASS_MIDI_EVENTS_ABSTIME option with BASS_MIDI_StreamEvents? If not, please give them a try. They allow you to set a time for each event, meaning the events don't have to always be applied at the start of the next processing block. That will generally be more efficient than using very small (eg. 1ms) processing blocks.

rv

  • Posts: 284
In fact, I don't use BASS_MIDI_StreamEvents but BASS_MIDI_StreamEvent, as all events are played in realtime from a midi keyboard (virtual synth).
Maybe you can make processing BASS_MIDI_StreamEvent immediately the default behaviour?

Ian @ un4seen

  • Administrator
  • Posts: 21861
BASS_MIDI_StreamEvent does apply events immediately, ie. the change will be present in the next output that the MIDI stream produces. That is the next BASS_ChannelGetData call in the case of a decoding channel.

The point of using BASS_MIDI_StreamEvents with BASS_MIDI_EVENTS_TIME or BASS_MIDI_EVENTS_ABSTIME would be to delay events. For example, if you're generating data every 10ms and an event comes in 3ms after the last block of data was generated then you would delay the event by 3ms in the next block. That way all events will sound like they are delayed equally. The code could look something like this:

Code: [Select]
DWORD lastupdate; // timestamp of last output block

...

// send an event
BASS_ChannelLock(midistream, TRUE); // lock to prevent async output processing in the middle of this
BASS_MIDI_EVENT event;
event.pos = BASS_ChannelSeconds2Bytes(midistream, (timeGetTime() - lastupdate) / 1000.0); // set the delay for the event
// set other event info here
BASS_MIDI_StreamEvents(midistream, BASS_MIDI_EVENTS_STRUCT | BASS_MIDI_EVENTS_TIME, &event, 1); // apply the event
BASS_ChannelLock(midistream, FALSE);

...

// generate output
BASS_ChannelLock(midistream, TRUE);
BASS_ChannelGetData(midistream, ...); // generate output
lastupdate = timeGetTime(); // update timestamp
BASS_ChannelLock(midistream, FALSE);

Falcosoft

  • Posts: 37
Hi,
Quote
Have you tried using the BASS_MIDI_EVENTS_TIME or BASS_MIDI_EVENTS_ABSTIME option with BASS_MIDI_StreamEvents? If not, please give them a try
If you remember, maybe I was the first to use the BASS_MIDI_EVENTS_TIME flag :)
http://www.un4seen.com/forum/?topic=17431.msg122310#msg122310
It worked perfectly in a VST environment. The problem with your solution is that it's not general and it can only help in SOME situations. Namely, it works only if you use decoding channels and process the data yourself in callbacks using BASS_ChannelGetData(). In the case of non-decoding channels you have no idea when you should set 'lastupdate', since output generation is not performed by your code but by Bass itself. And there can be many problems with callbacks and managed code/garbage collection, for example. That's why many cannot wait for you to release the new version of BassAsio, where they can use BASS_ASIO_ChannelEnableBASS() without callbacks.
http://www.un4seen.com/forum/?topic=18211.msg127863#msg127863
Also, currently this could be a solution only for Bassmidi but not for Bass_VST, since BASS_VST_ProcessEventRaw() does not support BASS_MIDI_EVENTS_TIME. As you can see some posts above, I had tried to achieve something similar inside Bass_VST. It seems the problem was that BASS_ChannelGetPosition() also depends on the granularity of the update period, so it does not count positions between updates. Maybe sample accurate continuous position reporting (like VST hosts do) could be added to Bass later. It could be more precise/useful than using timeGetTime() calculations to get the necessary delay.
I have just mentioned KaleidonKep99's OmniMidi (and so Bassmidi and ASIO) to demonstrate that it is not just a Bass_VST and WASAPI specific problem, and this peculiarity of Bass is so counter-intuitive that even driver programmers themselves are not aware of it (including Mudlord and Kode54, the authors of the original BassMidi driver that OmniMidi is based on).
« Last Edit: 29 Apr '19 - 19:52 by Falcosoft »

rv

  • Posts: 284
Yes, but what is the period of the BASS_ChannelGetData call? (I use WASAPI exclusive or ASIO outputs managed by BASS.)
It should be very small and not dependent on the selected buffer size, so the Bassmidi granularity will not change with the buffer size.

For example, if the buffer size is 10ms, BASS_ChannelGetData would be called 10 times, once every 1ms, asking for 1ms of data each time.
Not sure what overhead that adds...

Ian @ un4seen

  • Administrator
  • Posts: 21861
Quote
If you remember, maybe I was the first to use the BASS_MIDI_EVENTS_TIME flag :)
http://www.un4seen.com/forum/?topic=17431.msg122310#msg122310
It worked perfectly in a VST environment. The problem with your solution is that it's not general and it can only help in SOME situations. Namely, it works only if you use decoding channels and process the data yourself in callbacks using BASS_ChannelGetData(). In the case of non-decoding channels you have no idea when you should set 'lastupdate', since output generation is not performed by your code but by Bass itself.

I think the newer BASS_MIDI_EVENTS_ABSTIME option could be used in that case? It uses absolute positions, so does not need a "lastupdate" value. A possible issue then could be that your clock (used to calculate the absolute positions) gets out of sync with the MIDI stream's playback, eg. if it stalls. A BASS_SYNC_STALL sync could be used to detect that and resync your clock with BASS_ChannelGetPosition.
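The code could look something like this (SendEventAbs and starttime are illustrative, not part of the API; starttime would be set to timeGetTime() when playback starts):

Code: [Select]
#include <windows.h> // timeGetTime (link winmm)
#include "bass.h"
#include "bassmidi.h"

DWORD starttime; // timeGetTime() value at the moment the stream started playing

// send an event with an absolute position
void SendEventAbs(HSTREAM midistream, DWORD type, DWORD param)
{
	BASS_MIDI_EVENT event = {0}; // set event.chan etc as needed
	event.event = type;
	event.param = param;
	event.pos = BASS_ChannelSeconds2Bytes(midistream, (timeGetTime() - starttime) / 1000.0); // absolute position
	BASS_MIDI_StreamEvents(midistream, BASS_MIDI_EVENTS_STRUCT | BASS_MIDI_EVENTS_ABSTIME, &event, 1);
}

// resync the clock when playback resumes after a stall
void CALLBACK StallSyncProc(HSYNC handle, DWORD channel, DWORD data, void *user)
{
	if (data == 1) { // 0 = stalled, 1 = resumed
		QWORD pos = BASS_ChannelGetPosition(channel, BASS_POS_BYTE);
		starttime = timeGetTime() - (DWORD)(BASS_ChannelBytes2Seconds(channel, pos) * 1000);
	}
}

...

BASS_ChannelSetSync(midistream, BASS_SYNC_STALL, 0, StallSyncProc, NULL);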

Quote
Also, currently this could be a solution only for Bassmidi but not for Bass_VST, since BASS_VST_ProcessEventRaw() does not support BASS_MIDI_EVENTS_TIME. As you can see some posts above, I had tried to achieve something similar inside Bass_VST. It seems the problem was that BASS_ChannelGetPosition() also depends on the granularity of the update period, so it does not count positions between updates. Maybe sample accurate continuous position reporting (like VST hosts do) could be added to Bass later. It could be more precise/useful than using timeGetTime() calculations to get the necessary delay.

Were you using a decoding channel in this case? If not, BASS_ChannelGetPosition should not be affected by the update period unless you use the BASS_POS_DECODE flag to get the decoder's position (the decoder will advance according to the update period).

Quote
Yes, but what is the period of the BASS_ChannelGetData call? (I use WASAPI exclusive or ASIO outputs managed by BASS.)
It should be very small and not dependent on the selected buffer size, so the Bassmidi granularity will not change with the buffer size.

For example, if the buffer size is 10ms, BASS_ChannelGetData would be called 10 times, once every 1ms, asking for 1ms of data each time.
Not sure what overhead that adds...

If you do something like the code snippet I posted above, it doesn't really matter what the ASIO/WASAPI buffer size is (so long as it's constant). You would just fill the buffer with a single BASS_ChannelGetData call each time (no need to split it into 1ms blocks).
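For example, with BASSASIO the ASIOPROC could look something like this ("lastupdate" being the timestamp variable from the earlier snippet, and the MIDI stream a decoding channel):

Code: [Select]
#include <windows.h> // timeGetTime (link winmm)
#include "bass.h"
#include "bassasio.h"

DWORD CALLBACK AsioProc(BOOL input, DWORD channel, void *buffer, DWORD length, void *user)
{
	HSTREAM midistream = *(HSTREAM*)user;
	BASS_ChannelLock(midistream, TRUE); // prevent events being applied mid-block
	DWORD got = BASS_ChannelGetData(midistream, buffer, length); // fill the whole ASIO buffer in one call
	lastupdate = timeGetTime(); // events arriving from now are timed relative to this block
	BASS_ChannelLock(midistream, FALSE);
	return (got == (DWORD)-1) ? 0 : got;
}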