BASS_Encode_ServerInit delay

Started by Chris Oakley, 25 Oct '23 - 21:03

Chris Oakley

If I did this:
_EncoderCable = BassEnc.BASS_Encode_Start(_NodeID, Nothing, BASSEncode.BASS_ENCODE_PCM, Nothing, IntPtr.Zero)
BassEnc.BASS_Encode_ServerInit(_EncoderCable, "0.0.0.0:10015", 32000, 32000, BASSEncodeServer.BASS_ENCODE_SERVER_DEFAULT, Nothing, IntPtr.Zero)
would it be usual for there to be a delay of around 1 to 2 seconds? I'm finding that it isn't consistent: it can be 2 seconds, and then a little later it can be 5 seconds. Is there a way I can make it consistent and also keep the delay to a minimum?

For reference the _NodeID is just a mixer, defined as such:
_NodeID = BassMix.BASS_Mixer_StreamCreate(44100, 2, BASSFlag.BASS_DEFAULT Or BASSFlag.BASS_MIXER_NONSTOP)
The mixer is just outputting to the NoSound device 0.

Ian @ un4seen

What encoding format/options are you using? If it's VBR, that could explain an inconsistent delay. To minimize the delay, you will want to minimize any buffering on both the server and client side. For example, if the client is using BASS then set BASS_CONFIG_NET_PREBUF to 0.

Chris Oakley

Quote: What encoding format/options are you using?

I'm not sure. I'm starting the encoder with plain PCM, as shown:
BassEnc.BASS_Encode_Start(_NodeID, Nothing, BASSEncode.BASS_ENCODE_PCM, Nothing, IntPtr.Zero)
Where else could I specify options for CBR?

The client picking up this stream would be BASS so I'll try setting the BASS_CONFIG_NET_PREBUF to 0.

Ian @ un4seen

Oh, if the data is plain PCM then it is CBR, so that's not the issue. Try the BASS_CONFIG_NET_PREBUF setting and see how that affects the delay.

Chris Oakley

I've tried that setting, and I'm not sure it's really helped much. It's really odd: it starts off with about a 1 second delay and then, while it's playing, it jumps to about a 200ms delay. The same application is outputting via ServerInit and taking the stream back in with:
Bass.BASS_StreamCreateURL(_VirtualCableUrl, 0, BASSFlag.BASS_STREAM_DECODE, Nothing, IntPtr.Zero)

I can hear both the originating audio and the audio which is being received, that's how I can tell there is a jump in the delay.

Ian @ un4seen

The delay drops rather than getting bigger? If so, that sounds like a buffer has overflowed. My guess is it's the server's buffer. 44100Hz stereo 16-bit is 176400 bytes per second, so a 32000 byte buffer covers 181ms, close to the 200ms you mention. Try increasing that number. You could also try disabling playback buffering in the client by setting BASS_ATTRIB_BUFFER to 0, which will make it read smaller amounts more frequently.
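Ian's arithmetic can be sanity-checked in a few lines. This is just a sketch; the 44100Hz/stereo/16-bit format is assumed from the mixer created earlier in the thread:

```python
# PCM byte rate for the mixer format used above: 44100 Hz, stereo, 16-bit
rate, channels, bytes_per_sample = 44100, 2, 2
byte_rate = rate * channels * bytes_per_sample   # 176400 bytes per second

buffer_bytes = 32000                             # the ServerInit buffer size
buffer_ms = buffer_bytes / byte_rate * 1000
print(round(buffer_ms))  # → 181, close to the observed 200ms delay
```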

Chris Oakley

Okay, that seems more stable now. I'm not hearing any change; it stays delayed by about 1 second.

Beyond this initial testing, what would cause the delay to change? Just buffer overflows?

Ian @ un4seen

If the delay suddenly falls then that could be due to a buffer overflow (which means some data gets lost/dropped). In that case, the delay will usually have first been rising as the buffer got fuller, and then suddenly fell once it filled too much and overflowed.

Chris Oakley

Okay, and the way to prevent this potential overflow is to set a higher buffer value in the ServerInit call? I'm also not 100% clear on what the burst value is for; it always just seems to be the same as the buffer value.

Ian @ un4seen

A larger buffer is less likely to overflow, but it isn't impossible. An overflow of this buffer would usually be because the client is consuming the data more slowly than it is being produced. That could still happen with a larger buffer, but less frequently.

The "burst" value is the amount of already buffered data that will be sent to a new client. High values may increase latency, but low values may mean the client has to wait before it has enough data to start playing. If you're trying to minimize latency then the latter may be preferable to the former.

Chris Oakley

Okay great. So with the client consuming the data more slowly, how can that be controlled? Ideally I'd like the client to consume it at the same rate it's being generated so it doesn't get out of sync. Or is that not possible?

Ian @ un4seen

In theory the data should already be produced and consumed at the same rate, but it's possible for either end to not be going precisely at the specified rate, leading to them drifting apart over time and an eventual buffer overflow (or underflow). When drifting is detected (in the buffer fill levels), it is possible for the client to counteract that by making small adjustments to its playback rate (BASS_ATTRIB_FREQ).

Chris Oakley

Okay, but can I check: BASS_ATTRIB_FREQ doesn't work on decode channels. Is that correct, or did I imagine that?

Ian @ un4seen

Correct, BASS_ATTRIB_FREQ has no effect on a decoding channel, unless it's being played by a mixer (a mixer applies its sources' BASS_ATTRIB_FREQ settings).

Chris Oakley

Quote: it is possible for the client to counteract that by making small adjustments to its playback rate (BASS_ATTRIB_FREQ)

Does this mean I would periodically have to set this attribute in order to clear any build up and keep things on track?

Ian @ un4seen

Yes, possibly. If you see the client's buffer level above a high threshold, then you would raise BASS_ATTRIB_FREQ until the buffer level starts falling. If you see the buffer level below a low threshold, then you would lower BASS_ATTRIB_FREQ until the buffer level starts rising. When the buffer level is between the thresholds, then you can leave BASS_ATTRIB_FREQ as it is. Note that the BASS_ATTRIB_FREQ adjustments should be small/gradual so that they aren't noticeable to the listener.
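The scheme Ian describes can be sketched as a small decision function. This is illustrative only: the nominal rate, the ±0.05% step, and all the names here are my own; the real reads and writes would go through BASS_ChannelGetData (BASS_DATA_AVAILABLE) and BASS_ChannelSetAttribute (BASS_ATTRIB_FREQ) in your application.

```python
NOMINAL_FREQ = 44100.0   # the stream's normal sample rate (assumed)
STEP = 1.0005            # ±0.05% per adjustment: small enough to be inaudible

def next_freq(buffer_level, low, high, current_freq):
    """Return the BASS_ATTRIB_FREQ value to apply, given the current
    buffer fill level in bytes and the low/high thresholds in bytes.
    Pure decision logic; the BASS calls themselves are left out."""
    if buffer_level > high:
        return current_freq * STEP       # play faster to drain the buffer
    if buffer_level < low:
        return current_freq / STEP       # play slower to let it refill
    return current_freq                  # between thresholds: leave it alone

# Example: the buffer has overfilled past the high threshold,
# so the returned rate is slightly above nominal
print(next_freq(90000, low=40000, high=80000, current_freq=NOMINAL_FREQ))
```

You would call something like this periodically (e.g. once a second from a timer), and in practice also clamp how far the rate may drift from nominal.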

Chris Oakley

Okay, so my next question would be how do I check the buffer level and what would be acceptable thresholds?

Ian @ un4seen

If the client is using BASS_StreamCreateURL to play the stream then there are two buffers you could monitor: the download buffer (BASS_StreamGetFilePosition with BASS_FILEPOS_AVAILABLE) and the playback buffer (BASS_ChannelGetData with BASS_DATA_AVAILABLE). The playback buffer would probably be best, but are you playing the stream or is it a decoding channel?

Regarding the thresholds, you could check the buffer level at the start and set the thresholds a bit below and above that. If the buffer is already full at the start then it isn't big enough; you can use the BASS_CONFIG_BUFFER or BASS_CONFIG_NET_BUFFER options (with BASS_SetConfig) to enlarge them. You will almost certainly need to enlarge the playback buffer (from the 500ms default) if you choose to monitor it for this.

Chris Oakley

I think we're good with the ServerInit now. I have an issue in a separate application when opening this stream using:

_VirtualCableId = Bass.BASS_StreamCreateURL(_VirtualCableUrl, 0, BASSFlag.BASS_STREAM_DECODE, Nothing, IntPtr.Zero)

The amount of data it buffers is inconsistent and I'm trying to figure out where this could be happening. For example, I'll start the app and the stream could be 500ms delayed, but other times it can be over 1 second, and I need it to be the same every time.

I do add this to a mixer, and if I put a thread sleep between the creation and the adding then I do get a definite delay on the stream:

_VirtualCableId = Bass.BASS_StreamCreateURL(_VirtualCableUrl, 0, BASSFlag.BASS_STREAM_DECODE, Nothing, IntPtr.Zero)
Threading.Thread.Sleep(5000)
BassMix.BASS_Mixer_StreamAddChannel(_Mixer, _VirtualCableId, BASSFlag.BASS_DEFAULT)

Now I can understand why this is happening, because it's buffered 5 seconds of data before it was plugged into the mixer. I suppose my question is, how can I force the stream to get up to date / flush a buffer?

Ian @ un4seen

Quote from: Chris Oakley on 20 Nov '23 - 15:30: I think we're good with the ServerInit now. I have an issue in a separate application when opening this stream using:

_VirtualCableId = Bass.BASS_StreamCreateURL(_VirtualCableUrl, 0, BASSFlag.BASS_STREAM_DECODE, Nothing, IntPtr.Zero)

The amount of data it buffers is inconsistent and I'm trying to figure out where this could be happening. For example, I'll start the app and the stream could be 500ms delayed, but other times it can be over 1 second, and I need it to be the same every time.

Is "_VirtualCableUrl" a BASSenc server started with BASS_Encode_ServerInit, and if so, is it still serving PCM data? If it's a different format now and that happens to be VBR, then the duration of the same amount of buffered data can vary. You can call BASS_StreamGetFilePosition with BASS_FILEPOS_BUFFER before BASS_ChannelPlay/Start to check whether the amount of pre-buffered data is the same each time.

Quote from: Chris Oakley on 20 Nov '23 - 15:30: Now I can understand why this is happening, because it's buffered 5 seconds of data before it was plugged into the mixer. I suppose my question is, how can I force the stream to get up to date / flush a buffer?

You could try using the BASS_POS_DECODETO option with BASS_ChannelSetPosition to flush all buffered data, ie. use up all the buffered data by seeking as far ahead as possible. Like this:

BASS_ChannelSetPosition(handle, -1, BASS_POS_DECODETO);

Note that -1 = the max possible unsigned value. If your compiler complains about that then you can replace it with whatever constant it has for the max 64-bit value (possibly UInt64.MaxValue?).
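To illustrate that last point with a quick, non-BASS-specific check: a signed 64-bit -1 has the same bit pattern as the maximum unsigned 64-bit value, which is why -1 works as "seek as far ahead as possible" here.

```python
import struct

# A signed 64-bit -1 and the maximum unsigned 64-bit value share the
# same bit pattern, so passing -1 means "the furthest possible position".
assert struct.pack("<q", -1) == struct.pack("<Q", 2**64 - 1)
print(2**64 - 1)  # → 18446744073709551615 (UInt64.MaxValue)
```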

Chris Oakley

Quote: Is "_VirtualCableUrl" a BASSenc server started with BASS_Encode_ServerInit, and if so, is it still serving PCM data? If it's a different format now and that happens to be VBR, then the duration of the same amount of buffered data can vary. You can call BASS_StreamGetFilePosition with BASS_FILEPOS_BUFFER before BASS_ChannelPlay/Start to check whether the amount of pre-buffered data is the same each time.

Yes it is started from a ServerInit and it is PCM.

Quote: You could try using the BASS_POS_DECODETO option with BASS_ChannelSetPosition to flush all buffered data, ie. use up all the buffered data by seeking as far ahead as possible.

That seems to do the trick. I was trying to do something similar but I wasn't using the BASS_POS_DECODETO flag.

I'm going to deploy and see if it holds now.

Ian @ un4seen

Quote from: Chris Oakley on 20 Nov '23 - 18:38: Yes it is started from a ServerInit and it is PCM.

What "burst" parameter are you using in the BASS_Encode_ServerInit call? That will determine how much data is sent immediately to new clients, so you may want to lower it for less delay (more data = more delay). With plain PCM data, you can use BASS_ChannelSeconds2Bytes to get a "burst" value.
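For plain PCM the seconds-to-bytes conversion is simple arithmetic; this sketch mirrors what BASS_ChannelSeconds2Bytes would return for the 44100Hz stereo 16-bit format assumed earlier (the helper name and the 100ms example value are mine):

```python
def pcm_seconds_to_bytes(seconds, rate=44100, channels=2, bytes_per_sample=2):
    """Bytes of plain PCM covering the given duration, as a whole number
    of sample frames (mirrors BASS_ChannelSeconds2Bytes for PCM data)."""
    frame = channels * bytes_per_sample
    return int(seconds * rate) * frame

# e.g. a 100ms burst instead of the 64000-byte (~363ms) one:
print(pcm_seconds_to_bytes(0.1))  # → 17640
```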

Chris Oakley

Quote: What "burst" parameter are you using in the BASS_Encode_ServerInit call?

I'm currently using 64000. Should I lower that? If I were to use BASS_ChannelSeconds2Bytes, what am I doing with it? Am I asking it for the bytes in 0.03 seconds, for example?

I'm also noticing that each instance of the application running on the same machine gradually goes out of sync. Not by much: they all start bang on with each other, but over a period of about 12 hours they drift a little. Not a massive amount, only about 500ms to 750ms each, but it's enough for me to be concerned because it's not what I expect.

Ian @ un4seen

Quote from: Chris Oakley on 22 Nov '23 - 12:11: I'm currently using 64000. Should I lower that? If I were to use BASS_ChannelSeconds2Bytes, what am I doing with it? Am I asking it for the bytes in 0.03 seconds, for example?

The meaning of that number will depend on the exact sample format, eg. assuming 44100Hz 16-bit stereo then 64000 = 0.363s (64000/176400). If that's correct in your case then it would be strange to sometimes have a 1 second delay at the start. Are you possibly delaying the BASS_ChannelPlay call sometimes?

Quote from: Chris Oakley on 22 Nov '23 - 12:11: I'm also noticing that each instance of the application running on the same machine gradually goes out of sync. Not by much: they all start bang on with each other, but over a period of about 12 hours they drift a little. Not a massive amount, only about 500ms to 750ms each, but it's enough for me to be concerned because it's not what I expect.

I would generally expect streams on the same system to maintain their initial sync with each other (assuming no stalls). Are you making BASS_ATTRIB_FREQ adjustments as discussed earlier? If so, that could be causing the sync changes, ie. the streams aren't all getting identical adjustments.

Chris Oakley

QuoteThe meaning of that number will depend on the exact sample format, eg. assuming 44100hz 16-bit stereo then 64000 = 0.363s (64000/176400). If that's correct in your case then it would be strange to sometimes have a 1 second delay at the start. Are you possibly delaying the BASS_ChannelPlay call sometimes?

I'm not actually starting it playing. It's being added to a mixer, which is a decode mixer, so it starts as soon as it's attached, if I'm not mistaken about how this works. That mixer is then added to another mixer which is outputting to the No Sound device and playing on that.

QuoteI would generally expect streams on the same system to maintain their initial sync with each other (assuming no stalls). Are you making BASS_ATTRIB_FREQ adjustments as discussed earlier? If so, that could be causing the sync changes, ie. the streams aren't all getting identical adjustments.

I'm not making any adjustments with BASS_ATTRIB_FREQ at all.