Author Topic: BASS_CONFIG_DEV_BUFFER on macOS / iOS  (Read 387 times)

elan

  • Posts: 29
BASS_CONFIG_DEV_BUFFER on macOS / iOS
« on: 1 Jun '19 - 22:35 »
The documentation for BASS_CONFIG_DEV_BUFFER says:

Quote
This option is not available on OSX or iOS. The device buffer length on those platforms is twice the device update period, which can be set via the BASS_CONFIG_DEV_PERIOD option.

However, if I modify BASS_CONFIG_DEV_PERIOD I don't see any change in the latency and minbuf values. If I set BASS_CONFIG_DEV_BUFFER on macOS to e.g. 50, I see: latency is 77ms (minimum buffer: 50ms).

Does this option actually do anything now on macOS/iOS?
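
For reference, a minimal sketch of the kind of code involved (assuming bass.h/libbass are available; the BASS_CONFIG_DEV_* options only take effect if set before BASS_Init):

```c
#include <stdio.h>
#include "bass.h" // BASS audio library header (www.un4seen.com)

int main(void)
{
    // Device buffer/period options must be set before BASS_Init.
    BASS_SetConfig(BASS_CONFIG_DEV_BUFFER, 50); // request a 50ms device buffer
    if (!BASS_Init(-1, 44100, 0, NULL, NULL)) return 1;

    BASS_INFO info;
    BASS_GetInfo(&info);
    printf("latency: %ums (minimum buffer: %ums)\n", info.latency, info.minbuf);

    BASS_Free();
    return 0;
}
```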

elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #1 on: 1 Jun '19 - 22:41 »
Actually, I just upgraded to BASS 2.4.14 (from 2.4.13) and now it seems the opposite is true (and documentation is correct). So I guess this recently changed?

elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #2 on: 1 Jun '19 - 23:12 »
(On iOS, no matter what I set those to, info.latency, info.minbuf are both 0.)

I seem to be hearing micro-dropouts (in the simulator, at least), which makes me worry that I'm not able to configure latency via the API.
« Last Edit: 2 Jun '19 - 06:21 by elan »

elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #3 on: 2 Jun '19 - 06:51 »
Talking to myself here, but one more data point: when I tried to raise BASS_CONFIG_DEV_PERIOD to 150ms on macOS (which seems pretty reasonable, and I was trying to avoid some dropouts), it reported: latency: 141ms, minimum buffer: 93ms.

The docs say: "The system may also choose to use a different buffer length if the requested one is too short or long, or needs rounding for granularity" which implies 150ms * 2 was too long?

Basically: I'm confused  ;D

Ian @ un4seen

  • Administrator
  • Posts: 21991
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #4 on: 3 Jun '19 - 13:55 »
Quote
Actually, I just upgraded to BASS 2.4.14 (from 2.4.13) and now it seems the opposite is true (and documentation is correct). So I guess this recently changed?

Yes, that was changed/fixed in 2.4.14. The BASS_CONFIG_DEV_BUFFER option predates the BASS_CONFIG_DEV_PERIOD option. When only the former was available, it could be used to set the output update period (rather than total buffer size) on OSX. Once the latter option appeared, it was more logical to use that, so the change was made.

Quote
(On iOS, no matter what I set those to, info.latency, info.minbuf are both 0.)

Ah, it looks like there's a little bug in the latest iOS release that means those values aren't available immediately after the BASS_Init call. Here's an update that should fix that:

   www.un4seen.com/stuff/libbass.a

Quote
I seem to be hearing micro-dropouts in the simulator, at least, so this makes me worry I'm not able to configure latency via the API.

To narrow down what's causing the dropouts, please try setting a BASS_SYNC_STALL sync on the playback stream and see if that gets triggered. Also, confirm whether you have disabled playback buffering via the BASS_ATTRIB_BUFFER setting and/or you are accessing the output mix via a device stream (created with BASS_StreamCreate + STREAMPROC_DEVICE).
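
For example, a stall sync could be set along these lines (a sketch; `stream` stands for the playback stream/mixer handle):

```c
#include <stdio.h>
#include "bass.h" // BASS audio library header (www.un4seen.com)

// Called at mixtime when the channel stalls (data == 0) or resumes (data == 1).
void CALLBACK StallSync(HSYNC handle, DWORD channel, DWORD data, void *user)
{
    printf(data ? "output resumed\n" : "output stalled!\n");
}

// After creating the playback stream/mixer (handle in `stream`):
// BASS_ChannelSetSync(stream, BASS_SYNC_STALL, 0, StallSync, NULL);
```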

Quote
Talking to myself here, but one more data point: when I tried to raise BASS_CONFIG_DEV_PERIOD to 150ms on macOS (which seems pretty reasonable, and I was trying to avoid some dropouts), it reported: latency: 141ms, minimum buffer: 93ms

The docs say: "The system may also choose to use a different buffer length if the requested one is too short or long, or needs rounding for granularity" which implies 150ms * 2 was too long?

On OSX, the BASS_CONFIG_DEV_PERIOD setting controls the output's kAudioDevicePropertyBufferFrameSize property. The maximum for that is generally 4096 samples (it's available in the kAudioDevicePropertyBufferFrameSizeRange property), and the requested BASS_CONFIG_DEV_PERIOD will be capped at the equivalent number of milliseconds. With a sample rate of 44100 Hz, that is 93ms rounded up (4096/44100 ≈ 0.0929s).

elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #5 on: 3 Jun '19 - 18:49 »
Quote
Here's an update that should fix that:

Thanks so much, it's now showing: sample rate: 44100, latency: 28ms, minimum buffer: 12ms

This seems a bit low for this code, unless I'm misunderstanding.

#elif (TARGET_OS_IPHONE || TARGET_IPHONE_SIMULATOR)
  // iOS.
  BASS_SetConfig(BASS_CONFIG_DEV_PERIOD, 150);
  BASS_SetConfig(BASS_CONFIG_UPDATEPERIOD, 75);
  BASS_SetConfig(BASS_CONFIG_BUFFER, 150);
#else


Quote
To narrow down what's causing the dropouts, please try setting a BASS_SYNC_STALL sync on the playback stream and see if that gets triggered.

Yep, that's the first thing I did; I set it on the mixer so I'd be able to tell if anything "upstream" was causing it. I hear the dropouts but do NOT see the underflow message from the SYNC. No use of BASS_ATTRIB_BUFFER, and no STREAMPROC_DEVICE.

Thanks for your quick reply, as always!


elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #6 on: 4 Jun '19 - 07:38 »
Actually, it looks like I might be being bitten by this bug: https://appleinsider.com/articles/19/02/19/pro-audio-glitch-with-t2-equipped-macs-connected-to-usb-20-connections

I am still curious whether it's possible to raise the buffer size on iOS beyond what was reported, but perhaps it's by design?

Ian @ un4seen

  • Administrator
  • Posts: 21991
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #7 on: 4 Jun '19 - 16:09 »
Were you running in the iOS simulator? It looks like it isn't possible to change the output buffer/period (via AVAudioSession's setPreferredIOBufferDuration method) in the simulator. On a real device, it should be possible to go up to 4096 samples again (like on OSX).

elan

  • Posts: 29
Re: BASS_CONFIG_DEV_BUFFER on macOS / iOS
« Reply #8 on: 5 Jun '19 - 00:43 »
Yep, that was it; thanks so much for your help, Ian!