Potential bug when using DSP_StreamCopy

Started by Chris Oakley

Chris Oakley

I'm tearing my hair out over an issue we only see on a few machines. There is no obvious pattern or link between the machines suffering from the problem.

The issue we have is that after an indeterminate period of time running our app, the audio playback will begin to pause and then continue. The pauses can be anything from half a second to 6 seconds. It's crazy.

We've had issues before where some systems have delayed audio (https://www.un4seen.com/forum/?topic=20244.msg142401#msg142401), which was frustrating because I had to try to handle it myself in code, which I don't expect to have to do when using a library designed to handle audio.

I thought at first the pausing could be a result of the work done above, but after sidestepping that process I can confirm it's nothing to do with that.

I'm not doing any loops to process the audio myself, I'm relying on the NoSound device to handle playback.

We have 16 outputs and I create a mixer like this on the NoSound device for each one:
_Mixer = BassMix.BASS_Mixer_StreamCreate(48000, 2, BASSFlag.BASS_MIXER_NONSTOP)
I then set the buffer to 0 and start it playing:
Bass.BASS_ChannelSetAttribute(_Mixer, BASSAttribute.BASS_ATTRIB_BUFFER, 0F)
Bass.BASS_ChannelPlay(_Mixer, False)

The reason we have 16 outputs is because we have developed a comprehensive audio matrix so the user can customize the audio routing as they see fit.

Once we have the Buses we need to make what we call an AudioNode on each bus. The AudioNode is used to determine what the Bus's output will actually be. A BUS always goes to the NoSound device. Each BUS has an AudioNode and that is what tells it where to send the actual audio. This could be a physical device, a streaming server or a file on the machine.

In order to get a cloned copy of the BUS, the AudioNode Class does this:
    m_SourceId = _SourceId
    m_DestinationId = _DestinationId
    m_DspClone = New DSP_StreamCopy()
    Dim _info As BASS_INFO = Bass.BASS_GetInfo
    With m_DspClone
        .OutputLatency = _info.latency
        .ChannelHandle = _SourceId 'Mixer handle to clone
        .StreamCopyFlags = .ChannelInfo.flags
        .TargetMixerStream = _DestinationId 'The target mixer handle
        .DSPPriority = -1000
        .IsOutputBuffered = True
        .Start()
    End With

This now sends a synchronized copy of the BUS to the desired destination. So anything that happens on that BUS will be heard at the destination. Pretty straightforward.

I then open a file like this:
_StreamFX = Bass.BASS_StreamCreateFile(_Filename, 0, 0, BASSFlag.BASS_STREAM_DECODE Or BASSFlag.BASS_ASYNCFILE)
Once it's opened I then do this so the item can be tempoed if needs be:
_Stream = BassFx.BASS_FX_TempoCreate(_StreamFX, BASSFlag.BASS_STREAM_DECODE Or BASSFlag.BASS_FX_FREESOURCE)
The next step needs a little explaining. I can't monitor the playback position properly, so I have to make two splits, make one of them silent, and use that one to return the position. I don't recall why, but I had to do it this way:
_SplitMain = BassMix.BASS_Split_StreamCreate(_Stream, BASSFlag.BASS_STREAM_DECODE, Nothing)
_SplitPosition = BassMix.BASS_Split_StreamCreate(_Stream, BASSFlag.BASS_STREAM_DECODE, Nothing)
Bass.BASS_ChannelSetAttribute(_SplitPosition, BASSAttribute.BASS_ATTRIB_VOL, 0)
BassMix.BASS_Mixer_StreamAddChannel(_DestinationID, _SplitPosition, BASSFlag.BASS_DEFAULT)
BassMix.BASS_Mixer_ChannelFlags(_SplitMain, BASSFlag.BASS_MIXER_CHAN_PAUSE, BASSFlag.BASS_MIXER_CHAN_PAUSE)
BassMix.BASS_Mixer_ChannelFlags(_SplitPosition, BASSFlag.BASS_MIXER_CHAN_PAUSE, BASSFlag.BASS_MIXER_CHAN_PAUSE)
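For reference, polling the position from the silent split might then look something like this (a sketch; since _SplitPosition is plugged into a mixer, the mixer-aware position call is used, and I'm assuming the parameterless BASS.NET overload that returns the byte position):

```vbnet
' The silent _SplitPosition split runs in lock-step with _SplitMain,
' so its mixer position reflects the audible playback position.
Dim posBytes As Long = BassMix.BASS_Mixer_ChannelGetPosition(_SplitPosition)
Dim posSeconds As Double = Bass.BASS_ChannelBytes2Seconds(_SplitPosition, posBytes)
```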

Once this has been done the resulting stream is added to the BUS mixer in a paused state:
BassMix.BASS_Mixer_StreamAddChannel(_Mixer, _SplitMain, BASSFlag.BASS_DEFAULT)
When we want to play the audio we reset the pause flag:
Ret = CType(BassMix.BASS_Mixer_ChannelFlags(_SplitMain, BASSFlag.BASS_DEFAULT, BASSFlag.BASS_MIXER_CHAN_PAUSE), BASSError)
Ret = CType(BassMix.BASS_Mixer_ChannelFlags(_SplitPosition, BASSFlag.BASS_DEFAULT, BASSFlag.BASS_MIXER_CHAN_PAUSE), BASSError)

It's like a buffer somewhere is being starved, but all the files are played locally, so there's absolutely no reason for anything to stall. I do have a SYNC on each mixer I create to log if it stalls, but since my mixers are NONSTOP I don't know whether that means they would never report a stall - I've had no logs saying they have.

Am I also correct in thinking that if I have a mixer which I've done a DSP_StreamCopy of, there is no way any problem on the StreamCopy could affect the mixer it was sourced from?

Ian @ un4seen

I've never used DSP_StreamCopy myself, so I can't really comment on it, but it seems like you could replace it with BASSmix splitters in this case. If you would like to try that, you would set the BASS_STREAM_DECODE flag on the mixer and then create a splitter from that for each device that you want to play it on (including the "No Sound" device), something like this:

_Mixer = BassMix.BASS_Mixer_StreamCreate(48000, 2, BASSFlag.BASS_MIXER_NONSTOP Or BASSFlag.BASS_STREAM_DECODE)

' start splitter on "No Sound" device
Bass.BASS_SetDevice(0)
_SplitNoSound = BassMix.BASS_Split_StreamCreate(_Mixer, BASSFlag.BASS_DEFAULT, Nothing)
Bass.BASS_ChannelSetAttribute(_SplitNoSound, BASSAttribute.BASS_ATTRIB_BUFFER, 0F)
Bass.BASS_ChannelPlay(_SplitNoSound, False)

' start splitter (slave) on physical device
Bass.BASS_SetDevice(device)
_SplitPhysical = BassMix.BASS_Split_StreamCreate(_Mixer, BASSFlag.BASS_SPLIT_SLAVE, Nothing)
Bass.BASS_ChannelPlay(_SplitPhysical, False)

You can free (BASS_StreamFree) a splitter when you no longer want to play on its device.

A demo of this sort of thing can be found in the MULTI.C example included in the BASSmix package. There's a pre-compiled version in the C\BIN folder.

Chris Oakley

If the main mixer is a DECODE then I have to process the data myself with a loop and GetChannelData.

I used the DSP_StreamCopy because I can't split from a non decode mixer. I wish that was possible.

I'm trying to avoid having to process data myself because I've had problems in the past with doing that and I'm not confident that I wouldn't just be swapping one problem for another.

Ian @ un4seen

Quote from: Chris Oakley
If the main mixer is a DECODE then I have to process the data myself with a loop and GetChannelData.

You wouldn't have to do that because a splitter will process its source when more data is needed (unless BASS_SPLIT_SLAVE is set). With the code in my last post, _SplitNoSound will process the mixer and _SplitPhysical will just copy the result of that.

radio42

I guess Ian might be right (as he is most of the time ;-)
DSP_StreamCopy is a normal DSP. What it does is all in its DSP implementation: it calls BASS_ChannelGetData to retrieve the data from the source and then calls BASS_StreamPutData to copy the data to the target.
That's basically it.

The DSP is from back in the days when the split functions were not yet available...

Chris Oakley

I appreciate the input, guys. Flipping things around to work the way suggested is an undertaking in the application, and whilst I'd love to try it, I don't want to spend all that time only to find it doesn't solve the problem, because we haven't actually isolated what's causing the issue.

We had an issue today where, on one of the installs, playback just paused mid-song and never started again. It normally restarts. The application hadn't locked up; it was still responsive. But unfortunately, by the time I had been informed, the application had been restarted, so I didn't have an opportunity to inspect the system.

radio42, I presume there must be a loop of some sort in a DSP_StreamCopy in order to process the data? If so, is there anything in there that could hold up that loop?

Same goes for playing a Mixer via NoSound - surely there must be a loop there too which processes the data? Is there anything there that could hang that up?

I'm looking for anything I can right now to try to understand what might be configured wrong on the problem systems.

Ian @ un4seen

Quote from: Chris Oakley
...we haven't actually isolated what's causing the issue.

The "pauses" are most likely stalls caused by running out of data to play. So I would suggest setting BASS_SYNC_STALL syncs on all channels (mixers and sources) and see which (if any) get triggered. Note you should use BASS_Mixer_ChannelSetSync rather than BASS_ChannelSetSync to set the sync on a mixer source, ie. any time you use BASS_Mixer_StreamAddChannel. In the case of a DSP_StreamCopy stream, it looks its handle is in the "StreamCopy" property.

In addition to that, also monitor the BASS_ATTRIB_CPU values of all channels with BASS_ChannelGetAttribute and see if any spike (higher than usual) when the problem occurs. If playback sometimes ends unexpectedly (not just pauses/stalls) then also monitor the BASS_ChannelIsActive values.
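For example, polling the CPU usage from a periodic timer might look something like this (a sketch; the threshold is arbitrary and the Log helper is hypothetical):

```vbnet
' Poll a channel's CPU usage (in percent) and log it if it spikes.
Dim cpu As Single
If Bass.BASS_ChannelGetAttribute(_Mixer, BASSAttribute.BASS_ATTRIB_CPU, cpu) Then
    If cpu > 10.0F Then Log(String.Format("Mixer {0}: CPU {1}%", _Mixer, cpu))
End If
```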

You could also try having an app that keeps the drive awake running alongside your app on the affected systems, and see if that helps. For example:

    https://github.com/stsrki/KeepAliveHD

radio42

Quote from: Chris Oakley
radio42, I presume there must be a loop of some sort in a DSP_StreamCopy in order to process the data? If so, is there anything in there that could hold up that loop?

No, the DSP function is called by BASS itself to process the data, so I doubt it got stuck in there.
I also guess that BASS simply isn't calling the DSP function anymore, as Ian explained.

Ian @ un4seen

Quote from: Ian @ un4seen
I would suggest setting BASS_SYNC_STALL syncs on all channels (mixers and sources) and see which (if any) get triggered. Note you should use BASS_Mixer_ChannelSetSync rather than BASS_ChannelSetSync to set the sync on a mixer source, ie. any time you use BASS_Mixer_StreamAddChannel.

To simplify this, here's a BASSmix update that adds a new BASS_SYNC_MIXER_STALL (0x10203) sync for any mixer sources stalling:

    www.un4seen.com/stuff/bassmix.zip

Instead of setting BASS_SYNC_STALL syncs on every source, you can just set a BASS_SYNC_MIXER_STALL sync on the mixer, and the callback will receive the stalled source handle in its "data" parameter. Note you should still set a BASS_SYNC_STALL sync on the mixer itself. Something like this:

mixer = BASS_Mixer_StreamCreate(...); // create mixer
BASS_ChannelSetSync(mixer, BASS_SYNC_STALL | BASS_SYNC_MIXTIME, 0, StalledMixerSync, 0); // sync for stalled mixer
BASS_ChannelSetSync(mixer, BASS_SYNC_MIXER_STALL | BASS_SYNC_MIXTIME, 0, StalledSourceSync, 0); // sync for stalled sources

void CALLBACK StalledMixerSync(HSYNC handle, DWORD channel, DWORD data, void *user)
{
    if (!data) Log("mixer %x stalled\n", channel);
}

void CALLBACK StalledSourceSync(HSYNC handle, DWORD channel, DWORD data, void *user)
{
    Log("source %x in mixer %x stalled\n", data, channel);
}

Chris Oakley

Thanks for this Ian I'll take a look and see if this helps.

I have a question however. One of the problem clients at the moment has had another hang up this morning and I noted that our application reported a timeout when calling BASS_Encode_CastSetTitle. This is how we send meta to the connected stream.

The question is could this timeout be causing problems on the Encoder and then this has a knock on effect back down the chain?

This is the line that executes it:
BassEnc.BASS_Encode_CastSetTitle(_Encoder, _enc.GetBytes(_Meta), Nothing)
We start the encoder like this:
_Encoder = BassEnc.BASS_Encode_Start(m_Stream, _EncoderString, BASSEncode.BASS_ENCODE_NOHEAD Or BASSEncode.BASS_ENCODE_FP_16BIT, Nothing, IntPtr.Zero)

Ian @ un4seen

What type of cast server is it? Unless it's Shoutcast 2, BASS_Encode_CastSetTitle will open a new connection to set the title, so a problem with that wouldn't necessarily affect the audio data connection. But it could indicate a more general connection problem that is affecting the audio data connection too, and that could perhaps cause a pause/stall in playback like you reported. Adding the BASS_ENCODE_QUEUE flag to the encoder should help prevent encoder problems affecting playback.
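For example, the flag could be added to the BASS_Encode_Start call from your earlier post:

```vbnet
' BASS_ENCODE_QUEUE makes encoding happen asynchronously via a queue,
' so a slow casting connection can't block the source channel's processing.
_Encoder = BassEnc.BASS_Encode_Start(m_Stream, _EncoderString, _
    BASSEncode.BASS_ENCODE_NOHEAD Or BASSEncode.BASS_ENCODE_FP_16BIT Or BASSEncode.BASS_ENCODE_QUEUE, _
    Nothing, IntPtr.Zero)
```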

Chris Oakley

Interesting. So yes, it's Icecast.

So as long as we're using the QUEUE flag, that should mitigate any encoder problems caused by a glitch in the stream?

Ian @ un4seen

It's the other way round. The BASS_ENCODE_QUEUE flag should prevent local playback problems caused by a glitch in the encoder (eg. bad casting connection). The listeners connected to the Icecast server would of course still be affected by such glitches, because they're playing the encoder output.

Chris Oakley

Sorry, that's what I meant. That's interesting then. I'm trying to see if I can force the error with a timeout on BASS_Encode_CastSetTitle, but it's quite a hard situation to replicate, as the connection to the streaming server was still active according to our logs.

If it drops then it logs that it has and then logs when reconnected.

In addition, when it sends the meta it does so to all connected streaming services. In this case there are two, and the other one sent the meta fine. They are different servers and services, so this was definitely not a local internet issue.

To be on the safe side, could I send the meta from a thread, as I can see it held up the application for 5 seconds? Or are there implications around calling it from different threads - or would doing that make no difference?

Ian @ un4seen

Sending the metadata in another thread will be fine.
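For example, something like this with the .NET thread pool (a sketch; it assumes the _Encoder, _enc and _Meta variables from your earlier post, and that _Meta isn't modified concurrently):

```vbnet
' Requires: Imports System.Threading.Tasks
Dim meta As String = _Meta ' capture the current value
Task.Run(Sub()
             ' Runs on a thread-pool thread, so a slow cast-server
             ' response can't hold up the main application thread.
             BassEnc.BASS_Encode_CastSetTitle(_Encoder, _enc.GetBytes(meta), Nothing)
         End Sub)
```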

Chris Oakley

Cool. So with regard to:

Quote from: Ian @ un4seen
Adding the BASS_ENCODE_QUEUE flag to the encoder should help prevent encoder problems affecting playback.

could an odd problem with the streaming server ingesting the data have a knock-on effect? I'm not talking about a connection drop here; I'm thinking more along the lines of the connection still being valid but there being some sort of delay sending the stream data.

Ian @ un4seen

Yes, when the BASS_ENCODE_QUEUE flag isn't set, any delay in the encoding/casting will affect the source channel's other processing (that won't continue until the delay is over).

Chris Oakley

So this could have been the problem all along. I honestly didn't think an encoder casting to a streaming server could affect the playback of the source material going out on a NoSound mixer.

So just for the benefit of clarity, if I:

  • Create a Non Decode mixer outputting to a physical device. Call this Mixer A.
  • Open a file, plug it into Mixer A. File is now playing on Mixer A.
  • Create another Non Decode mixer to NoSound. Call this Mixer B.
  • Make a DSP_StreamCopy of Mixer A and have Mixer B as its target so Mixer A is now also coming through Mixer B.
  • Start an encoder using Mixer B as its source without the QUEUE flag and cast it to a streaming server.

In this situation something blocking the streaming process would pause the playback of the audio playing out of Mixer A?

Note: The reason for creating another mixer for the streaming encoder is to normalize the sample rate, as some command line encoders like AAC and even MP3 can be a royally confusing pain in the neck, so this puts us on a level playing field, so to speak.

Ian @ un4seen

I don't think mixer A would generally be affected in that case. I believe DSP_StreamCopy will be using a "push" stream to get the data from mixer A to mixer B, ie. mixer A writes to the push stream and mixer B separately reads from it. If there's any delay in mixer B's processing then data from mixer A would just build up in the push stream's buffer in the meantime. If the amount of buffered data keeps growing then there could perhaps be some delays for mixer A from having to enlarge the buffer to hold the data. It's possible to limit how much data a push stream can hold via its BASS_ATTRIB_PUSH_LIMIT attribute.
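For example, if the push stream's handle is accessible (the "StreamCopy" property mentioned earlier looks like it may be it), capping its buffer might look something like this (a sketch; the property name and the chosen limit are assumptions):

```vbnet
' Cap the push stream's buffer at roughly 5 seconds of
' 48 kHz stereo floating-point data (48000 * 2 ch * 4 bytes * 5 s).
Bass.BASS_ChannelSetAttribute(m_DspClone.StreamCopy, _
    BASSAttribute.BASS_ATTRIB_PUSH_LIMIT, 48000 * 2 * 4 * 5)
```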

But from what you said earlier, I thought your mixers were the other way round, ie. a mixer on the NoSound device is feeding (via DSP_StreamCopy) a mixer on the physical device? If so, and the encoder is also on the NoSound mixer, then the physical device mixer would be affected by encoding delays because there's no data generated during them.

Chris Oakley

Quote from: Ian @ un4seen
...I thought your mixers were the other way round, ie. a mixer on the NoSound device is feeding (via DSP_StreamCopy) a mixer on the physical device? If so, and the encoder is also on the NoSound mixer, then the physical device mixer would be affected by encoding delays because there's no data generated during them.

I am doing that, you're correct; I was trying to simplify the situation to understand the pipeline a little better. The only slight difference is that the encoder isn't tapping off the NoSound mixer; it's taking a feed from another DSP_StreamCopy, which is plugged into a mixer used solely for the encoder to cast from. This is what I mentioned earlier about normalizing the sample rate for casting.

I've got a test rig running now with the encoder set to QUEUE.

It's quite interesting because I have another encoder which is logging the audio to files of 60 minutes each. Whenever the length of the logged audio is less than 59:59, it's a sign something went wrong. This encoder has always had the QUEUE flag set, but I realise this means nothing with regard to the encoder being used for casting.

I just need to monitor this now and see if it makes any difference.

Sorry it all sounds so complex, but we have an audio routing matrix of 16 BUSES (outputs, which could be a device, a logging file, or a cast stream) and 8 Players, and each of those players can be patched through to any or all of those buses. So there's a lot of splitting and mixing going on.

Ian @ un4seen

Quote from: Chris Oakley
It's quite interesting because I have another encoder which is logging the audio to files of 60 minutes each. Whenever the length of the logged audio is less than 59:59, it's a sign something went wrong. This encoder has always had the QUEUE flag set, but I realise this means nothing with regard to the encoder being used for casting.

Are you using BASS_Encode_StopEx with queue=true to end those encodings? If not, that could explain them being slightly short, because BASS_Encode_Stop won't process any data waiting in the queue before stopping.
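In BASS.NET that would be something like this (assuming the _LogEncoder variable from your post):

```vbnet
' Process anything still waiting in the encoder's queue,
' then stop the encoder.
BassEnc.BASS_Encode_StopEx(_LogEncoder, True)
```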

Chris Oakley

Quote from: Ian @ un4seen
Are you using BASS_Encode_StopEx with queue=true to end those encodings? If not, that could explain them being slightly short, because BASS_Encode_Stop won't process any data waiting in the queue before stopping.

No I'm not using that. I'm just using:
BassEnc.BASS_Encode_Stop(_LogEncoder)
That may explain why sometimes the length is 1:00:00 and other times it's 59:59.

I doubt this will be responsible for a length of 59:52 though.