Update rate and buffer length

Started by Ingolf2

Ingolf2

I was wondering how the update rate (BASS_Init and BASS_RecordInit) works, and how it should be set relative to the buffer length.

I ask because when I set the update rate to 1ms and the buffer length to 10ms, it works (though mostly only for a brief time), but when I set the buffer length to 50ms, the sound is all choppy.

A larger buffer works great, of course, but my application can play files and record in realtime at the same time (like playing along with the music on a guitar). What is the best way of realizing this?

Ian @ un4seen

The "recommended" minimum buffer length is the "minbuf" value (BASS_INFO member) + the update period. There's an example of doing this in the SYNTH example, and also in the BASS_SetBufferLength docs.
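The rule above is just an addition, which can be sketched as follows (a minimal illustration in Python, not BASS code; the figures are made up, and on a real system the "minbuf" value would come from BASS_GetInfo):

```python
# Sketch: recommended minimum playback buffer length, per the rule above.
# Illustrative values only; the real "minbuf" is a BASS_INFO member.
def min_buffer_ms(minbuf_ms, update_period_ms):
    """Recommended buffer = driver minimum ("minbuf") + update period."""
    return minbuf_ms + update_period_ms

# e.g. a driver minimum of 35ms with a 10ms update period:
print(min_buffer_ms(35, 10))  # -> 45
```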

Regarding recording... in your example, are you recording just the guitar (eg. "line-in"), or the whole mix (eg. "what you hear")? And then what are you doing with the recorded data?

Ingolf2

I knew about the minbuf value, but I want as little latency as possible, which I guess means buying a new soundcard?!

I'm recording from line-in.
When the user adds a recording channel (in my app), it reads data from the RecordChan handle and then mixes it to a master channel.
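The mixing step described above amounts to summing the recorded samples into the master channel's samples and clipping the result. A rough sketch of that idea (plain Python on lists of 16-bit sample values; real code would of course work on the raw byte buffers BASS delivers):

```python
# Sketch: mix one 16-bit recorded buffer into a master buffer by summing
# samples and clipping to the int16 range, as a simple software mixer does.
def mix_into(master, recorded):
    """Return master + recorded, sample by sample, clipped to int16."""
    return [max(-32768, min(32767, m + r)) for m, r in zip(master, recorded)]

# Second pair overflows the negative int16 limit and is clipped:
print(mix_into([1000, -32000], [2000, -2000]))  # -> [3000, -32768]
```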

In my custom stream callback, I try to free some data when a lot has been recorded. I don't really understand this part either, because your recording example has a "chunk" variable. Why? Is the first chunk the largest? I use this (3528 for 32 bits, I thought):

    if Handle = H then begin
      // how much recorded data is currently buffered?
      I := BASS_ChannelGetData(RecordChan, nil, BASS_DATA_AVAILABLE);
      I := I - Length;
      // discard anything beyond 3528 bytes to limit latency
      if I > 3528 then begin
        BASS_ChannelGetData(RecordChan, nil, I - 3528);
      end;
    end;

But I guess it frees too much data, because the sound is choppy and BASS_ChannelGetData returns less than requested. When I try 3528 * 10 it works, but I don't want that much latency.

Ian @ un4seen

Quote: "In my custom stream callback, I try to free some data when there has been a lot of recording. I don't understand this really too, because in your recording example, you have a Chunk variable. Why? Is the first chunk the largest? I use this (3528 for 32 bits I thought)"
BASS receives recorded data from the drivers in chunks; it does not receive a constant stream of samples. But the chunk size is not fixed across all systems, so the example detects the size and uses it in its management of the recording buffer.
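The buffer-trimming idea this implies can be sketched like so (a Python illustration of the arithmetic only, with "chunk" standing in for the detected driver chunk size; it is not the example's actual Delphi code): discard older data, but never trim the buffer below one whole chunk, since data arrives a chunk at a time.

```python
# Sketch: keep recording latency near one driver chunk by discarding
# older buffered data, but never dropping below a whole chunk.
# "chunk" is the size (in bytes) the driver delivers per block.
def bytes_to_discard(available, chunk):
    """How much buffered recording can be dropped while keeping one chunk."""
    if available > chunk:
        return available - chunk
    return 0

# With 3528-byte chunks: 5000 bytes buffered -> drop 1472, keep one chunk.
print(bytes_to_discard(5000, 3528))  # -> 1472
print(bytes_to_discard(3000, 3528))  # -> 0
```

Trimming to a fixed byte count instead (as in the snippet earlier in the thread) can cut into a partially delivered chunk, which would explain choppy results on systems with larger chunks.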

What are you doing with the mixed data? If you're just playing it, then it'd be simplest to let the soundcard mix the line-in with your output (you don't even need to bother with recording then).

Ingolf

Sorry, I should have mentioned it. Numerous effects can be selected for processing on any channel in my app (including recording channels), so there needs to be a buffer for the effects to work on. I want to use distortion and phaser effects on my guitar, for example, so I really need the recording.

I'll build in the chunk variable as you say.

Another thing: latency is reported as 6, but minbuf is 127. Why is that value so high, and what is the difference between them?

Ian @ un4seen

Quote: "Another thing, latency is set to 6, but MinBuf is set to 127. Why is this value that high, and then what is the difference?"
127ms is quite high. Which soundcard and Windows version do you have, and do you have hardware acceleration enabled in the advanced audio properties? You could also check whether updated drivers are available.

Ingolf

I recently installed new drivers for multiple speaker support. With the older drivers installed, minbuf was 35, with a latency of 35. On another computer they are both 30.

I have a Realtek on-board 5.1 chipset running Windows XP. I'll check for hardware acceleration.