Author Topic: ASIO time synchronization questions  (Read 9282 times)

RobJellinghaus

  • Posts: 94
ASIO time synchronization questions
« on: 6 Sep '11 - 18:48 »
My app is coming together very well, but last night things went slightly awry in a confusing way.

As previously mentioned in the "Samples and Effects" thread (http://www.un4seen.com/forum/?topic=12912.0), I'm doing an app that implements live looping by:

  • setting up an ASIOPROC for the ASIO input channel
  • setting up a DSPPROC to copy the ASIO input data to a memory buffer when recording
  • feeding the ASIO input data to an input decoding push stream
  • feeding the input decoding push stream to a mixer
  • creating a second ASIOPROC to feed the ASIO output channels from the mixer stream

Then for each live loop I record, I handle it by:

  • creating a decode push stream for the loop
  • creating a SYNCPROC to push memory-buffered data into the loop's push stream whenever it stalls
  • adding the loop's push stream as a mixer input

This is mostly working very well.  I also want to coordinate various visuals with the tempo of the music, and to that end, I'm driving the app's internal clock purely based on the number of input samples consumed by the input ASIOPROC.
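The idea of the sample-driven clock, in a minimal sketch (hypothetical names and plain C, not my actual code):

```c
/* Minimal sketch of a clock driven purely by input sample consumption:
   the input ASIOPROC reports how many samples it consumed, and "now" is
   derived from the running total rather than from any wall clock. */
#define SAMPLE_RATE_HZ 48000

static long long g_samplesConsumed = 0;

/* Called from the input ASIOPROC with the number of samples just consumed. */
void OnInputSamples(int sampleCount)
{
    g_samplesConsumed += sampleCount;
}

/* Current app time, in seconds, as implied by the samples consumed so far. */
double ClockSeconds(void)
{
    return (double)g_samplesConsumed / SAMPLE_RATE_HZ;
}
```

So after 375 callbacks of 128 samples each, the clock reads exactly one second.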

My assumption was (and is) that since I'm basically passing everything through as directly as possible, the number of input samples processed by the input ASIOPROC would at all times be in lock-step with the number of samples consumed by the SYNCPROCs on each loop's push stream.  In other words, I am assuming that the rate at which input flows into the system will be the same as the rate at which recorded data is requested for each loop by the system; my intuition is that the input ASIOPROC, the output ASIOPROC, and all the mixer inputs and outputs will be consuming data at the same rate (48,000 Hz as currently configured).

This was working pretty much as expected yesterday.  The only issue was that I was using a stereo ASIO input with a mono microphone, so all the sound was on the left channel only.  I modified this to use a mono input channel and BASSFlag.BASS_MIXER_DOWNMIX to spread the input over both mixer channels.  I left all the memory buffering as mono and added BASS_MIXER_DOWNMIX on all the additional per-loop mixer inputs, figuring that this would save memory while costing minimal CPU.  (My CPU budget is large  ;) )

I also added some code to continually buffer the last second's worth of input audio, so I could compensate for latency in the input system when starting a new loop.  (For example, when starting a new loop, grab the last 1/20th of a second of input audio and start the loop with it.)
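The retroactive buffering is essentially a ring buffer of the most recent second of input; a rough sketch (hypothetical names and plain C, not my actual implementation):

```c
/* Ring buffer holding the most recent RING_CAPACITY samples of input.
   The DSPPROC writes into it continuously; when a new loop starts, the
   last N samples are copied out to compensate for input latency. */
#define RING_CAPACITY 48000  /* one second at 48 kHz mono */

typedef struct {
    float     samples[RING_CAPACITY];
    int       next;   /* next write position */
    long long total;  /* total samples ever written */
} Ring;

void RingWrite(Ring *r, const float *data, int count)
{
    for (int i = 0; i < count; i++) {
        r->samples[r->next] = data[i];
        r->next = (r->next + 1) % RING_CAPACITY;
    }
    r->total += count;
}

/* Copy the most recent `count` samples (oldest first) into `out`.
   Returns the number copied (may be fewer if not enough written yet). */
int RingReadLast(const Ring *r, float *out, int count)
{
    long long avail = r->total < RING_CAPACITY ? r->total : RING_CAPACITY;
    if (count > avail) count = (int)avail;
    int start = (r->next - count + RING_CAPACITY) % RING_CAPACITY;
    for (int i = 0; i < count; i++)
        out[i] = r->samples[(start + i) % RING_CAPACITY];
    return count;
}
```

Grabbing the last 1/20th of a second is then just RingReadLast(&ring, out, 48000 / 20).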

However, after making these two changes, I noticed that the recorded loop's playback was drifting out of sync with the app's internal clock (which is still driven by the input ASIOPROC).  The input ASIOPROC seemed to be consuming samples faster than the recorded loop's SYNCPROC was consuming them.  In other words, I was truncating the recorded loop to exactly 48,000 samples, assuming this would keep it in lockstep with the input ASIOPROC; but the 48,000 samples in the recorded loop were taking longer to play than the input ASIOPROC took to consume the same number of samples.  (The input ASIOPROC was keeping perfect time, 48,000 samples per second, but the recorded loop was playing back in about 1.01 - 1.05 seconds, so things drifted apart.)

Things were synced very tightly before I made these two changes (the stereo downmixing and the retroactive buffering), so I will experiment tonight to find out which change introduced this drift, but I wanted to queue up this question for Ian and Bernd (since they're on European time and I'm in Seattle).

My questions:
  • Am I clearly describing the situation?
  • Does the DOWNMIX flag alter the rate at which samples are consumed by a mixer?
  • Can the amount of data pushed into a push stream via StreamPutData affect the rate at which the data is consumed?  (e.g. if I push three small chunks of data, would that possibly take longer to consume than pushing a single large chunk?)
  • Overall, is my assumption of lock-step sample synchrony throughout this kind of ASIO-driven BASS pipeline reasonable, or are there places where samples might not be consumed or produced in 1:1 ratio?

Thanks as always for your awesome support -- BASS has already gotten me much further, much more quickly, than I had hoped would be possible  ;D

For reference, a recent version of my code (before last night's sync-breaking changes) is at http://holofunk.codeplex.com

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #1 on: 7 Sep '11 - 05:32 »
...And it turns out that my changes last night weren't the issue at all.  I rolled back to an earlier version of my code, and the same drift-out-of-sync issue exists.

I simplified my code to push the entire 48,000-sample loop (via BASS_StreamPutData) at once.  Here is my stream setup code:

Code: [Select]
            m_mixerHStream = BassMix.BASS_Mixer_StreamCreate(
                48000,
                2,
                BASSFlag.BASS_MIXER_RESUME | BASSFlag.BASS_MIXER_NONSTOP | BASSFlag.BASS_STREAM_DECODE | BASSFlag.BASS_SAMPLE_FLOAT);

            m_trackHStream = Bass.BASS_StreamCreatePush(
                48000,
                1,
                BASSFlag.BASS_SAMPLE_FLOAT | BASSFlag.BASS_STREAM_DECODE,
                new IntPtr(m_id));

            BassMix.BASS_Mixer_StreamAddChannel(
                m_mixerHStream,
                m_trackHStream,
                BASSFlag.BASS_MIXER_DOWNMIX | BASSFlag.BASS_MIXER_NORAMPIN);

            // set to 40% volume to reduce over-leveling
            Bass.BASS_ChannelSetAttribute(m_trackHStream, BASSAttribute.BASS_ATTRIB_VOL, (float)TopMixVolume);

            BassMix.BASS_Mixer_ChannelPlay(m_trackHStream);

            m_trackHSync = BassMix.BASS_Mixer_ChannelSetSync(
                m_trackHStream,
                BASSSync.BASS_SYNC_MIXTIME | BASSSync.BASS_SYNC_STALL,
                0, // ignored
                m_syncProc,
                new IntPtr(0));

Here is my SYNCPROC:

Code: [Select]
        public void SyncProc(int handle, int channel, int data, IntPtr user)
        {
            // data == 0 means the stream has stalled; refill it
            if (data == 0) {
                PushTrackToStream();
            }
        }

        public void PushTrackToStream()
        {
            for (int i = 0; i < m_samples.Count; i++) {
                PushSampleToStream(m_samples[i]);
            }
        }

        protected unsafe override void PushSampleToStream(Sample<float> sample)
        {
            float[] samples = sample.Chunk.Storage;

            // per http://www.un4seen.com/forum/?topic=12912.msg89978#msg89978
            fixed (float* p = &samples[0]) {
                // we ignore the return value from StreamPutData since it queues any excess,
                // so we don't need to track underflows
                Bass.BASS_StreamPutData(m_trackHStream, new IntPtr(p), sample.Length * sizeof(float));
            }
        }

What I observe when running this code is that a clock, driven by the number of samples consumed by my input ASIOPROC, drifts out of sync with the looping sound.  It's as though the loop takes 10 msec or so longer to actually loop than it should, given that the loop is exactly 48,000 samples (i.e. exactly one second at 48 kHz).

It's almost as though BASS_SYNC_MIXTIME is not quite soon enough, or something.  What I want is a SYNCPROC that will fire soon enough, before the mixer stream stalls, that I can put the track data back in with zero lag before starting the loop again.

Is this realistic?  Or is there always going to be some looping delay here, even if only very small?  The problem with small loop delays is that they automatically build up, because, well, loops loop a lot  :(
« Last Edit: 7 Sep '11 - 06:36 by RobJellinghaus »

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #2 on: 7 Sep '11 - 07:53 »
Continuing my tradition of big thread monologues that are hopefully helpful to someone, here's something very interesting.  I decided that if I couldn't make loops restart with zero lag, I might as well add some clock-adjustment logic to correct the drift.  So I started measuring the actual time taken by the loop, relative to the input ASIOPROC.

And color me surprised:
Code: [Select]
Track #1 SyncProc invoked at time [1011712 timepoints, 21.0773333333333 secs, 21.0773333333333 beats]; timepoints since last sync 48128; average sync duration 48128
Track #1 SyncProc invoked at time [1059840 timepoints, 22.08 secs, 22.08 beats]; timepoints since last sync 48128; average sync duration 48128
Track #1 SyncProc invoked at time [1107968 timepoints, 23.0826666666667 secs, 23.0826666666667 beats]; timepoints since last sync 48128; average sync duration 48128
Track #1 SyncProc invoked at time [1156096 timepoints, 24.0853333333333 secs, 24.0853333333333 beats]; timepoints since last sync 48128; average sync duration 48128

This was surprising until I realized my input ASIOPROC buffer length is 512 bytes, or 128 float samples.  So it looks like my actual problem here is clock quantization -- the SYNCPROC runs one ASIOPROC interval after the actual input consumption, so I always see one additional ASIO buffer's worth of samples.
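In other words, a clock that only advances in whole ASIO buffers reports any interval rounded up to the next buffer boundary.  A quick hypothetical illustration of the arithmetic (128-sample buffers):

```c
/* A clock that advances one ASIO buffer (128 samples) at a time can only
   measure durations in multiples of the buffer size; an interval that
   ends mid-buffer is reported rounded up to the next buffer boundary. */
#define ASIO_BUFFER_SAMPLES 128

int MeasuredDuration(int actualSamples)
{
    int buffers = (actualSamples + ASIO_BUFFER_SAMPLES - 1) / ASIO_BUFFER_SAMPLES;
    return buffers * ASIO_BUFFER_SAMPLES;
}
```

So a 48,000-sample loop whose completion is observed even a few samples late reports as 48,128 samples, which is exactly the drift in the log above.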

(I use "timepoints" to refer to "number of measurements" independent of the number of channels.  Right now I'm using mono channels, but I don't want my code to assume that's always the case.)

Simply subtracting 128 (float) samples from my track length didn't work; it had to be a bit less than that.  But I was nervous about hardcoding an adjustment here, since who knows whether it would always be right.

I wound up using some (grade-school) control theory, building in a rolling average of the clock skew.  I progressively shorten my loop (i.e. push fewer samples) as long as I am apparently seeing an excessively long playback (one that takes longer than the exact sample length).  Then, if playback gets too short, I reduce the amount by which I'm shortening the loop.  This results in the following output:

Code: [Select]
Track #1 SyncProc invoked at time [18862080 timepoints, 392.96 secs, 392.96 beats]; average sync length (timepoints): 48025.6; timepoint drift: -25.60156; loop lag (timepoints): 397
Track #1 SyncProc invoked at time [18909696 timepoints, 393.952 secs, 393.952 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 359
Track #1 SyncProc invoked at time [18957824 timepoints, 394.954666666667 secs, 394.954666666667 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 321
Track #1 SyncProc invoked at time [19005952 timepoints, 395.957333333333 secs, 395.957333333333 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 283
Track #1 SyncProc invoked at time [19054080 timepoints, 396.96 secs, 396.96 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 245
Track #1 SyncProc invoked at time [19102208 timepoints, 397.962666666667 secs, 397.962666666667 beats]; average sync length (timepoints): 48025.6; timepoint drift: -25.60156; loop lag (timepoints): 257
Track #1 SyncProc invoked at time [19150336 timepoints, 398.965333333333 secs, 398.965333333333 beats]; average sync length (timepoints): 48128; timepoint drift: -128; loop lag (timepoints): 321
Track #1 SyncProc invoked at time [19198464 timepoints, 399.968 secs, 399.968 beats]; average sync length (timepoints): 48128; timepoint drift: -128; loop lag (timepoints): 385
Track #1 SyncProc invoked at time [19246080 timepoints, 400.96 secs, 400.96 beats]; average sync length (timepoints): 48025.6; timepoint drift: -25.60156; loop lag (timepoints): 397
Track #1 SyncProc invoked at time [19293696 timepoints, 401.952 secs, 401.952 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 359
Track #1 SyncProc invoked at time [19341824 timepoints, 402.954666666667 secs, 402.954666666667 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 321
Track #1 SyncProc invoked at time [19389952 timepoints, 403.957333333333 secs, 403.957333333333 beats]; average sync length (timepoints): 47923.2; timepoint drift: 76.80078; loop lag (timepoints): 283

You can see that this is remaining just about perfectly centered on one second (48,000 samples exactly) per track playback.  There is a small amount of jitter, but it's not noticeable.

Now I can leave the thing running for many minutes and my visuals and my audio stay perfectly synchronized!
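For anyone curious, the correction boils down to a little feedback loop.  Here's a hypothetical reconstruction of the idea (not my actual code), simulating a fixed input latency and a buffer-quantized measurement:

```c
/* Simulates the drift-correction feedback: each cycle we push
   (nominal - shorten) samples; the observed duration is the actual
   duration (including a fixed latency) rounded up to the next ASIO
   buffer boundary.  The controller nudges `shorten` by half the error. */
#define NOMINAL_SAMPLES 48000
#define BUFFER_SAMPLES  128

static int RoundUpToBuffer(int samples)
{
    return ((samples + BUFFER_SAMPLES - 1) / BUFFER_SAMPLES) * BUFFER_SAMPLES;
}

/* Run `cycles` iterations with `latencySamples` of fixed lag;
   returns the last measured loop duration. */
int SimulateDriftCorrection(int latencySamples, int cycles)
{
    int shorten = 0;
    int measured = 0;
    for (int i = 0; i < cycles; i++) {
        measured = RoundUpToBuffer(NOMINAL_SAMPLES - shorten + latencySamples);
        int error = measured - NOMINAL_SAMPLES;  /* positive means running long */
        shorten += error / 2;                    /* nudge toward the target */
    }
    return measured;
}
```

With a small fixed lag, the simulated measurement settles back to exactly 48,000 samples per loop within a couple of cycles.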

It might be nice if BASS exposed its internal clock in more detail, but I'm not 100% sure exactly what that would look like, and this is working well enough for me now :-)  Thanks everyone for your patience with these posts.  And if there is a better / simpler way to handle this, I'm all ears!
« Last Edit: 7 Sep '11 - 08:00 by RobJellinghaus »

radio42

  • Posts: 4573
Re: ASIO time synchronization questions
« Reply #3 on: 7 Sep '11 - 09:29 »
Very interesting findings! THX for that...
...let's wait for a bit more details from Ian...

Ian @ un4seen

  • Administrator
  • Posts: 20393
Re: ASIO time synchronization questions
« Reply #4 on: 7 Sep '11 - 16:01 »
If I'm reading it right, you're currently using a BASS_SYNC_STALL sync to determine when the push stream needs more data. The problem with that will be that a mixer won't immediately check for more data following a BASS_SYNC_STALL sync, so it won't receive the newly "pushed" data until the next BASS_ChannelGetData call.

You could instead use a BASS_SYNC_END sync, as the mixer will be able to immediately receive more data following that. When doing that, you would also need to use the BASS_STREAMPROC_END flag in a BASS_StreamPutData call to signify the end of the data (so that the sync is triggered), and then call BASS_ChannelSetPosition (with pos=0) in the SYNCPROC function to reset the stream before feeding it more data. For example, it could look something like this...

Code: [Select]
BASS_ChannelSetSync(pushstream, BASS_SYNC_END|BASS_SYNC_MIXTIME, 0, EndSyncProc, 0); // set a sync at the end of the stream
BASS_StreamPutData(pushstream, loopdata, looplen|BASS_STREAMPROC_END); // feed the data to the stream

...

void CALLBACK EndSyncProc(HSYNC handle, DWORD channel, DWORD data, void *user)
{
    BASS_ChannelSetPosition(channel, 0, BASS_POS_BYTE); // reset the stream so it's no longer ended
    BASS_StreamPutData(channel, loopdata, looplen|BASS_STREAMPROC_END); // feed the loop data to the stream again
}

radio42

  • Posts: 4573
Re: ASIO time synchronization questions
« Reply #5 on: 7 Sep '11 - 17:35 »
Just a side question - since this is not clear to me:
Why are you using a SYNC callback anyhow?
Wouldn't it be 'easier' to push the data directly from within the input ASIOPROC?
You could just push it as soon as new data arrives...?

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #6 on: 7 Sep '11 - 19:51 »
Thanks for that, Ian, I'll try it.  Glad I figured out the other technique too, as a fallback.

Bernd, sorry for all the excess detail.  Here it is in a nutshell.  My app does both live microphone playback *and* live looping of microphone-recorded sound, like so:

- Input ASIOPROC does put data into a push stream, and that input push stream goes straight into the mixer, so I can hear the microphone live with minimal latency.

- A DSPPROC on the input push stream copies the samples to a separate memory buffer whenever I'm recording a new loop.

- Once I am done recording that new loop, I then create a separate loop push stream, and drive it with the SYNCPROC.  That loop push stream then also goes into the mixer, and now I can hear the pre-recorded loop and the live microphone input together, with minimal latency.  Works really great, actually.

So the SYNCPROC is only for recorded data.

Does that make more sense?
« Last Edit: 7 Sep '11 - 19:54 by RobJellinghaus »

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #7 on: 8 Sep '11 - 05:43 »
Well, I tried the BASS_SYNC_END strategy.  Even using BassMix.BASS_Mixer_ChannelSetPosition, and making sure to get the BASS_STREAMPROC_END flag right, I heard the sound loop exactly once, and then never again.  Meanwhile my debug spam went slightly crazy, as though the SYNCPROC was being called over and over.  Basically, it didn't work  :(

My hysteresis code is working fine for now, though I could still do without the audible click at loop transitions.  I may try SYNC_END again at some point, but for now I need to move on to other, more cosmetic features -- got to try to YouTube something tonight!

Thanks again for your epic support here, Ian and Bernd.

Ian @ un4seen

  • Administrator
  • Posts: 20393
Re: ASIO time synchronization questions
« Reply #8 on: 8 Sep '11 - 14:57 »
Well, I tried the BASS_SYNC_END strategy.  Even using BassMix.BASS_Mixer_ChannelSetPosition, and making sure to get the BASS_STREAMPROC_END flag right, I heard the sound loop exactly once, and then never again.  Meanwhile my debug spam went slightly crazy, as though the SYNCPROC was being called over and over.  Basically, it didn't work  :(

I just tried it myself and you're right, there is a problem! It's a bug in BASS_ChannelSetPosition that means it won't clear the BASS_STREAMPROC_END flag from a push stream when called from within a mixtime SYNCPROC. So the stream would immediately end again, and the END sync triggered again. Here's an update to correct that...

   www.un4seen.com/stuff/bass.dll

The END sync solution should work properly now :)

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #9 on: 8 Sep '11 - 18:33 »
As always you guys are amazing.  Not sure when I can get to this, might be a few days, but I will let you know when I do. 

Also, note that I am using BASS_Mixer_ChannelSetPosition, not the ordinary BASS_ChannelSetPosition, since this decode push stream is a mixer input stream.  Is your fix needed and/or does your fix apply to bassmix.dll as well?
« Last Edit: 8 Sep '11 - 18:40 by RobJellinghaus »

Ian @ un4seen

  • Administrator
  • Posts: 20393
Re: ASIO time synchronization questions
« Reply #10 on: 9 Sep '11 - 15:06 »
BASS_Mixer_ChannelSetPosition internally calls BASS_ChannelSetPosition, so it would be affected by the bug/fix too.

RobJellinghaus

  • Posts: 94
Re: ASIO time synchronization questions
« Reply #11 on: 19 Apr '12 - 08:38 »
A mere eight months later, I'm deliberately reviving this exact thread.  I'm finally hacking on my Holofunk project again (the one Beardyman tested: http://www.un4seen.com/forum/?topic=13022.msg90669#msg90669).

I finally tried the BASS_SYNC_END strategy given here, and at first I couldn't make it work.  Finally, I tried the DLL you linked above, INSTEAD OF the one in BASS 2.4.8... and it worked!

Is it possible that the DLL above (which says its version is 2.4.8.15) has fixes in it that haven't yet made it into the official bass.dll release?

If so, shouldn't that be corrected?

Here is my current code for copying a sample to the mixer:

Code: [Select]
        protected unsafe override void PushSampleToStream(Sample<float> sample, int lengthSamples)
        {
            float[] samples = sample.Chunk.Storage;

            // per http://www.un4seen.com/forum/?topic=12912.msg89978#msg89978
            fixed (float* p = &samples[sample.Index]) {
                // reset the stream position so it's no longer ended,
                // per http://www.un4seen.com/forum/?topic=12965.msg90332#msg90332
                Bass.BASS_ChannelSetPosition(m_trackHStream, 0, BASSMode.BASS_POS_BYTES);

                // we ignore the return value from StreamPutData since it queues any excess,
                // so we don't need to track underflows
                Bass.BASS_StreamPutData(
                    m_trackHStream,
                    new IntPtr(p),
                    (lengthSamples * sizeof(float))
                        | (int)BASSStreamProc.BASS_STREAMPROC_END);
            }
        }

And here is my code for setting up the stream:

Code: [Select]
        public void StartPlaying()
        {
            Spam.WriteLine("Starting playing; total track length is now " + m_totalAppendedLengthSamples);

            m_trackHStream = Bass.BASS_StreamCreatePush(
                HolofunkBassAsio.TimepointFrequencyHz,
                HolofunkBassAsio.InputChannelCount,
                BASSFlag.BASS_SAMPLE_FLOAT | BASSFlag.BASS_STREAM_DECODE,
                new IntPtr(m_id));

            m_holofunkBass.AddStreamToMixer(m_trackHStream);

            m_trackHSync = BassMix.BASS_Mixer_ChannelSetSync(
                m_trackHStream,
                BASSSync.BASS_SYNC_MIXTIME | BASSSync.BASS_SYNC_END,
                0, // ignored
                m_syncProc,
                new IntPtr(0));

            // connect peak level meter to input push stream
            m_plmTrack = new DSP_PeakLevelMeter(m_trackHStream, 0);
            m_plmTrack.Notification += new EventHandler(Plm_Track_Notification);

            m_lastSyncTime = m_clock.Now;

            // and boot us up with the whole track's data
            PushTrackToStream();
        }

Thanks very much for your help all those months ago, Ian -- hope you see this reply :-)  If not, I will make a new thread, because this BASS_SYNC_END fix really seems like it should make it into a BASS update.

Ian @ un4seen

  • Administrator
  • Posts: 20393
Re: ASIO time synchronization questions
« Reply #12 on: 19 Apr '12 - 14:37 »
Is it possible that the DLL above (which says its version is 2.4.8.15) has fixes in it that haven't yet made it into the official bass.dll release?

If so, shouldn't that be corrected?

Yes, there are indeed fixes in the update above that aren't yet in the release version on the BASS webpage. A release including those fixes should be available shortly :)

You can continue to use the update above in the meantime.