Author Topic: BASS for iOS (iPhone/iPad)  (Read 606780 times)

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #875 on: 2 Sep '14 - 08:56 »
Hi un4seen and other guys  ;)

I have a few questions:

1. Is it possible to create a BASS stream from an iPod library item directly (in other words, to play an iPod library item directly)?
For example, using MPMediaPickerController and MPMediaItemPropertyAssetURL.
Or do I need something like TSLibraryImport (https://github.com/tapsquare/TSLibraryImport) to fetch the iPod item first, save it locally, and finally use it in BASS?

2. Is it possible to route sound to the internal speaker even if headphones are connected? Or maybe to the 30-pin or Lightning connector? AirPlay? Bluetooth?

Thank you.

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #876 on: 2 Sep '14 - 17:42 »
1. Is it possible to create a BASS stream from an iPod library item directly (in other words, to play an iPod library item directly)?
For example, using MPMediaPickerController and MPMediaItemPropertyAssetURL.
Or do I need something like TSLibraryImport (https://github.com/tapsquare/TSLibraryImport) to fetch the iPod item first, save it locally, and finally use it in BASS?

It's been a little while since I last looked into that in detail, but as far as I know it still isn't possible to access files in the iPod library directly, so you would indeed need to use something like TSLibraryImport.

2. Is it possible to route sound to the internal speaker even if headphones are connected? Or maybe to the 30-pin or Lightning connector? AirPlay? Bluetooth?

You could use the AVAudioSession overrideOutputAudioPort method (with AVAudioSessionPortOverrideSpeaker) to force the output to the speaker, along with an AVAudioSessionRouteChangeNotification observer to reinforce it (eg. when headphones get plugged in), something like this...

Code: [Select]
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(routeChangeHandler:) name:AVAudioSessionRouteChangeNotification object:nil]; // request notification of route changes
RecordInit(0); // initialize recording device for PlayAndRecord category
BASS_Init(-1, 44100, 0, 0, 0); // initialize output device

...

- (void) routeChangeHandler:(NSNotification*)notification
{
	[[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil]; // force output to the speaker
}

Note that the overrideOutputAudioPort method only has effect when the session category is set to AVAudioSessionCategoryPlayAndRecord. BASS will use that category when RecordInit has been called.

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #877 on: 3 Sep '14 - 08:45 »
Hi Ian,

thanks for the info. Helpful as always.

One more thing about my second question (routing sound):
I suspect this will override the sound output for all audio channels?
Is it possible to set the sound output for each audio channel individually?
For example, if I wanted to write a DJ app, could I have the default sound output go to the headphone jack and use the internal speaker or 30-pin/Lightning connector for pre-listening? Is this possible?

Thanks for support.

Regards,
Grega


Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #878 on: 3 Sep '14 - 17:54 »
It isn't currently possible to use 2 different outputs at the same time. It looks like it should be possible to achieve that with the AVAudioSessionCategoryMultiRoute session category. I looked into that option a while ago, but unfortunately it was pretty much undocumented, so I was basically stabbing in the dark and didn't manage to hit anything :) ... I'll have another look to see if there is any more information available on it now.

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #879 on: 4 Sep '14 - 08:18 »
Hi,

thanks.

Note that it is not necessary to route one channel to multiple outputs.
It would just be great if I could tell each channel which output to use - for example, channel 1 to the jack, channel 2 to the internal speaker, channel 3 to the 30-pin/Lightning port. This would give BASS a total advantage in the iOS audio world!
For example, Cubasis, Auria and some other DJ/mixing apps have already achieved that, so it looks like it is possible. Of course, maybe they are using some internal code (not Core Audio) to route the sound; however, I know that it works if the output device is class compliant (a USB audio device), so I am guessing it must be some kind of standard.

One more thing, if I may... I am using BASS_ChannelSetAttribute(chan, BASS_ATTRIB_TEMPO, tempo) to change the channel tempo. Is it normal that the audio becomes "rough", especially when setting a slower tempo? Like some sort of chorus. I have tried Dirac3, which is very smooth - can this be achieved with BASS as well?

Sorry to bother you so much, but I need to be sure that BASS is what I need before I purchase it.

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #880 on: 4 Sep '14 - 17:52 »
By default, the tempo processing quality is set a bit lower on non-desktop platforms to save CPU. That can be changed by setting the BASS_ATTRIB_TEMPO_OPTION_USE_QUICKALGO option to 0 via BASS_ChannelSetAttribute...

Code: [Select]
BASS_ChannelSetAttribute(handle, BASS_ATTRIB_TEMPO_OPTION_USE_QUICKALGO, 0);

The tempo processing is based on the SoundTouch library. If you have another tempo processing library that you would prefer to use, that is possible by using a custom stream (see BASS_StreamCreate) to play the source channel (instead of passing it to BASS_Tempo_StreamCreate), which allows you to pass the decoded data through the tempo processing. For example, you could have a STREAMPROC function that looks something like this...

Code: [Select]
DWORD CALLBACK TempoStreamProc(HSTREAM handle, void *buffer, DWORD length, void *user)
{
	HSTREAM decoder=(HSTREAM)(uintptr_t)user; // the source decoding channel (passed via "user")
	int need=ConvertOutputLengthToInput(length); // calculate how much input data is needed to produce "length" output
	void *temp=malloc(need); // allocate a buffer for the data
	int got=BASS_ChannelGetData(decoder, temp, need); // get data from the decoding channel
	if (got>0) got=ProcessData(temp, got, buffer, length); // process the data
	free(temp);
	if (got==-1) return BASS_STREAMPROC_END; // reached the end
	return got;
}

Please see the BASS_StreamCreate and STREAMPROC documentation for details on them.
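ConvertOutputLengthToInput and ProcessData above are placeholders for the tempo library's own functions. As a rough plain-C illustration of what the length conversion involves (a sketch under assumptions, not BASS or SoundTouch API code; the fixed 16-bit stereo frame size and the function name are made up for the example):

```c
#include <stdint.h>

// Hypothetical helper: how many source bytes are needed to produce
// "outlen" bytes of output at the given tempo ratio, eg. ratio=1.25
// means the processor consumes 1.25 input samples per output sample.
// Assumes 16-bit stereo data (4 bytes per sample frame).
static uint32_t ConvertOutputLengthToInput(uint32_t outlen, double ratio)
{
    const uint32_t framesize = 4; // 2 bytes x 2 channels
    uint32_t outframes = outlen / framesize;
    uint32_t inframes = (uint32_t)(outframes * ratio + 0.5); // round to nearest frame
    return inframes * framesize; // request whole frames only
}
```

A real implementation would also have to account for any data the tempo processor buffers internally, so the calculation is rarely this exact in practice.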

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #881 on: 5 Sep '14 - 06:27 »
Hi,

thanks, but I must say setting the BASS_ATTRIB_TEMPO_OPTION_USE_QUICKALGO attribute to 0 (false) does not make any difference - the sound is the same. Strange.

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #882 on: 5 Sep '14 - 17:09 »
You could also try playing with the other BASS_ATTRIB_TEMPO_OPTION options to see if you can find a combination that works better for you, but it may well be that there are other tempo processing libraries that perform better than SoundTouch. If you already have a licence for Dirac3 (or another tempo processing library), you could try incorporating that with BASS via a custom stream, as described above.

maheep

  • Guest
Re: BASS for iOS (iPhone/iPad)
« Reply #883 on: 11 Sep '14 - 14:03 »
Hi,
Can anyone help me? I am also looking to use BASS in an iOS app.
Can someone guide me on how to get started?
An example would be the best thing to start with.
Thanks

mix1009

  • Posts: 9
Re: BASS for iOS (iPhone/iPad)
« Reply #884 on: 13 Sep '14 - 12:07 »
Could you provide x86_64 architecture support for the simulator?
Xcode 6 (GM) can only run x86_64 simulators for the iPhone 6 / iPhone 6 Plus, which makes it impossible to test at the new iPhone 6 resolutions until I get the device.

Thank you.

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #885 on: 15 Sep '14 - 15:01 »
Can anyone help me? I am also looking to use BASS in an iOS app.
Can someone guide me on how to get started?
An example would be the best thing to start with.

Unfortunately, there currently aren't any iOS examples provided, but BASS usage is very much the same on all platforms, so you could have a look at the examples provided for another platform (eg. OSX or Windows) and then copy the ideas into your iOS app.

Could you provide x86_64 architecture support for the simulator?
Xcode 6 (GM) can only run x86_64 simulators for the iPhone 6 / iPhone 6 Plus, which makes it impossible to test at the new iPhone 6 resolutions until I get the device.

Are you sure x86_64 is required, ie. the architecture can't be set to i386, as in the image in this post?

   www.un4seen.com/forum/?topic=10910.msg110437#msg110437

Anyway, as simulator architectures won't be included in (and so won't bloat) device builds, there's no real harm in including a second simulator architecture even if it does seem a bit unnecessary, so an updated BASS library including the x86_64 architecture is now up in the 1st post.

For those that have been wanting to play iPod library entries directly (ie. without first creating a copy), this update also adds support for that via BASS_StreamCreateURL on iOS 7 and above. For example, you might do something like this with an MPMediaItem object...

Code: [Select]
stream=BASS_StreamCreateURL([[[mediaitem valueForProperty:MPMediaItemPropertyAssetURL] absoluteString] UTF8String], 0, 0, 0, 0);

The BASS_StreamCreateURL offset/proc/user parameters are ignored, as are the BASS_STREAM_RESTRATE/BLOCK/STATUS flags. The iPod library entries are played using CoreAudio codecs, so the stream's "ctype" will always be BASS_CTYPE_STREAM_CA. BASS doesn't have direct access to the file itself, so tags won't be available from BASS_ChannelGetTags (apart from BASS_TAG_CA_CODEC); the MPMediaItem properties could be used instead.

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #886 on: 19 Sep '14 - 13:19 »
Quote
For those that have been wanting to play iPod library entries directly (ie. without first creating a copy), this update also adds support for that via BASS_StreamCreateURL on iOS 7 and above. For example, you might do something like this with an MPMediaItem object...

That is great! How about effects (like reverb and so on), tempo, BPM detection, BPM grid, etc. - does all of that work?

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #887 on: 19 Sep '14 - 14:58 »
Yes, all of that should work as normal. Only things that require BASS to have direct access to the file won't be available, eg. tag reading (BASS_ChannelGetTags) and file position info (BASS_StreamGetFilePosition).

mix1009

  • Posts: 9
Re: BASS for iOS (iPhone/iPad)
« Reply #888 on: 29 Sep '14 - 03:27 »
Quote
Are you sure x86_64 is required, ie. the architecture can't be set to i386, as in the image in this post?

   www.un4seen.com/forum/?topic=10910.msg110437#msg110437

Anyway ... an updated BASS library including the x86_64 architecture is now up in the 1st post.

Thanks Ian.

I was able to run the iPhone 6 simulators with the i386 architecture :)

I haven't tried the above code yet, but it will be very useful.

ppeau

  • Posts: 48
Re: BASS for iOS (iPhone/iPad)
« Reply #889 on: 3 Oct '14 - 03:53 »
Quote
It's been a little while since I last looked into that in detail, but as far as I know it still isn't possible to access files in the iPod library directly, so you would indeed need to use something like TSLibraryImport.

Hi, I don't know if I missed something, but you can in fact get audio data directly from the iPod library: you just have to use AVAssetReaderTrackOutput. The only thing you can't do is use an MPMediaItem directly with Core Audio.

Quick example: you've got an MPMediaItem and you want the PCM data values without writing a file or using something like TSLibraryImport; just do this:

// media you want to read
NSURL *outputURL = [iTuneMedia valueForProperty:MPMediaItemPropertyAssetURL];

AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:outputURL options:nil];
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
AVAssetTrack *songTrack = [asset.tracks objectAtIndex:0];

// set the format of the values you want (in this case PCM in float (always 32-bit), 2 channels)
NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
	[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
	[NSNumber numberWithInt:44100], AVSampleRateKey,
	[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
	[NSNumber numberWithInt:32], AVLinearPCMBitDepthKey,
	[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
	[NSNumber numberWithBool:YES], AVLinearPCMIsFloatKey,
	[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
	nil];

AVAssetReaderTrackOutput *output = [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack outputSettings:outputSettingsDict];
[reader addOutput:output];
output = nil;
[reader startReading];

// bytes per sample for the stream
UInt32 bytesPerSample = sizeof(float);
UInt64 totalBytes = 0;

while (reader.status == AVAssetReaderStatusReading)
{
	AVAssetReaderTrackOutput *trackOutput = (AVAssetReaderTrackOutput *)[reader.outputs objectAtIndex:0];
	CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];
	if (sampleBufferRef)
	{
		CMTime progressTime = CMSampleBufferGetPresentationTimeStamp(sampleBufferRef);
		CMTime sampleDuration = CMSampleBufferGetDuration(sampleBufferRef);

		if (CMTIME_IS_NUMERIC(sampleDuration)) progressTime = CMTimeAdd(progressTime, sampleDuration);
		printf("Progress: %d%%\n", (int)(CMTimeGetSeconds(progressTime) / CMTimeGetSeconds(asset.duration) * 100));

		CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBufferRef);
		size_t length = CMBlockBufferGetDataLength(blockBufferRef);
		totalBytes += length;
		NSMutableData *data = [NSMutableData dataWithLength:length];
		CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, data.mutableBytes);
		float *samples = (float *)data.mutableBytes;
		int sampleCount = (int)length / bytesPerSample;

		for (int i = 0; i < sampleCount; i++)
		{
			// do stuff with samples[i]
		}

		CFRelease(sampleBufferRef); // release the buffer returned by copyNextSampleBuffer
	}
}

if (reader.status == AVAssetReaderStatusFailed || reader.status == AVAssetReaderStatusUnknown)
{
	printf("Something went wrong...\n");
}


Hope this helps someone.

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #890 on: 3 Oct '14 - 09:27 »
Hi,

can someone please explain how to use the BASS_ATTRIB_TEMPO_FREQ attribute?
I can adjust the tempo without a problem using:
Code: [Select]
float tempo = (self.sldBpm.value / chan1bpm - 1.0f) * 100.0f;
BOOL result = BASS_ChannelSetAttribute(chan1, BASS_ATTRIB_TEMPO, tempo);

But for BASS_ATTRIB_TEMPO_FREQ I don't know how to calculate the freq value.
The documentation says: freq = sample rate in Hz (internally calculated by the same percentage as tempo).
But I still need an example, because for everything I put in there for freq I get an invalid parameter error.

Thanks.


Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #891 on: 3 Oct '14 - 13:44 »
The BASS_ATTRIB_TEMPO_FREQ parameter is in Hz, and needs to be within 5% to 5000% of the channel's original sample rate. For example, if the original rate is 44100, then the valid range should be 2205 (5%) to 2205000 (5000%).
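Expressed as code, the limits work out like this (a plain-C sketch of the arithmetic described above, not a BASS call; the helper name is made up for the example):

```c
// BASS_ATTRIB_TEMPO_FREQ must stay within 5% to 5000% of the
// channel's original sample rate, per the explanation above.
static void TempoFreqRange(float srcrate, float *minfreq, float *maxfreq)
{
    *minfreq = srcrate * 0.05f; // 5% of the original rate
    *maxfreq = srcrate * 50.0f; // 5000% of the original rate
}
```

For a 44100Hz source this gives 2205 to 2205000, matching the figures above.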

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #892 on: 3 Oct '14 - 17:27 »
The latest version (2.4.12) of the BASSenc add-on is now up in the 1st post, adding support for user-provided encoders, eg. it is now possible to use the LAME library instead of the standalone LAME executable (which isn't available on iOS).

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #893 on: 6 Oct '14 - 11:13 »
The BASS_ATTRIB_TEMPO_FREQ parameter is in Hz, and needs to be within 5% to 5000% of the channel's original sample rate. For example, if the original rate is 44100, then the valid range should be 2205 (5%) to 2205000 (5000%).

Hi,

thank you. I decided to do it like this (I removed the error-handling from the code to make it simpler to read):

Code: [Select]
float freq = BASS_FX_BPM_Translate(chan1src, self.sldBpm.value, BASS_FX_BPM_TRAN_2FREQ);
BOOL result = BASS_ChannelSetAttribute(chan1, BASS_ATTRIB_TEMPO_FREQ, freq);

I have a BPM slider 'sldBpm' which goes from 20 to 200.
I have a source channel (chan1src) and a tempo channel (chan1) variable.
I calculate the frequency from the BPM using BASS_FX_BPM_Translate.
From a technical point of view it is working great; I just want to know if it is correct, eg. if I set the BPM to 200, the freq is calculated and applied - but is it applied with the correct function (BASS_FX_BPM_Translate), so that the playing track is really set to 200 BPM via the frequency?

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #894 on: 6 Oct '14 - 15:45 »
I haven't tried that myself, but it looks right :)

If you happen to be only using the tempo stream to change the playback rate, then you could use the BASS_ATTRIB_FREQ option to have BASS apply the change instead.
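For reference, the arithmetic behind a BPM-to-frequency translation is simple proportional scaling; a plain-C sketch (the helper name is made up, and the assumption that BASS_FX_BPM_Translate's 2FREQ mode behaves equivalently is based on its description, not confirmed internals):

```c
// Translate a target BPM into a playback frequency: scaling the
// sample rate by targetbpm/srcbpm changes tempo (and pitch) by the
// same proportion. "srcbpm" is the stream's BPM at "srcfreq" Hz.
static float BpmToFreq(float srcfreq, float srcbpm, float targetbpm)
{
    return srcfreq * targetbpm / srcbpm;
}
```

Eg. a 44100Hz track detected at 100 BPM would need a rate of 88200Hz to play at 200 BPM.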

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #895 on: 8 Oct '14 - 08:23 »
 :) :)

If you happen to be only using the tempo stream to change the playback rate, then you could use the BASS_ATTRIB_FREQ option to have BASS apply the change instead.

Sorry, I don't understand what you mean. Can you please explain? Thanks.

Regards,
Grega

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #896 on: 8 Oct '14 - 14:54 »
What I meant is that if you are only using the BASS_ATTRIB_TEMPO_FREQ option (not BASS_ATTRIB_TEMPO or BASS_ATTRIB_TEMPO_PITCH), then you probably don't really need to use tempo processing (ie. BASS_FX_TempoCreate), as you can replace BASS_ATTRIB_TEMPO_FREQ with BASS_ATTRIB_FREQ to have BASS (instead of BASS_FX) change the playback rate. Having BASS handle it is likely to be more efficient, as it avoids the extra buffering involved in the tempo processing. Changes will also have lower latency (ie. be heard sooner), as BASS_ATTRIB_FREQ isn't applied until the final output mix is generated, while BASS_ATTRIB_TEMPO_FREQ is applied as soon as the data is decoded (meaning the already buffered data - see BASS_CONFIG_BUFFER - needs to be played before changes are heard).
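To put a rough number on that buffering difference: with the default 500ms playback buffer (BASS_CONFIG_BUFFER), roughly that much already-decoded audio has to play out before a decode-time change like BASS_ATTRIB_TEMPO_FREQ is heard. A plain-C sketch of the arithmetic (the helper name and the 44100Hz 16-bit stereo figures are just example values):

```c
// Bytes of already-buffered audio that must play out before a change
// applied at decode time becomes audible.
static unsigned BufferedBytes(unsigned buffer_ms, unsigned freq,
                              unsigned chans, unsigned bytespersample)
{
    unsigned long long samples = (unsigned long long)buffer_ms * freq / 1000;
    return (unsigned)(samples * chans * bytespersample);
}
```

Eg. BufferedBytes(500, 44100, 2, 2) gives 88200 bytes of 44100Hz 16-bit stereo data queued ahead of any change, whereas BASS_ATTRIB_FREQ bypasses that queue entirely.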

GregaK

  • Posts: 22
Re: BASS for iOS (iPhone/iPad)
« Reply #897 on: 12 Oct '14 - 09:31 »
Quote
For those that have been wanting to play iPod library entries directly (ie. without first creating a copy), this update also adds support for that via BASS_StreamCreateURL on iOS 7 and above. For example, you might do something like this with an MPMediaItem object...

Code: [Select]
stream=BASS_StreamCreateURL([[[mediaitem valueForProperty:MPMediaItemPropertyAssetURL] absoluteString] UTF8String], 0, 0, 0, 0);

Using the code above and BASS_FX_BPM_DecodeGet to get the BPM value returns error 38 (not a decoding channel).
So, for all other users: adding the BASS_STREAM_DECODE flag to the BASS_StreamCreateURL call solves it.
Code: [Select]
stream=BASS_StreamCreateURL([[[mediaitem valueForProperty:MPMediaItemPropertyAssetURL] absoluteString] UTF8String], 0, BASS_STREAM_DECODE, 0, 0);

Ian, please confirm whether this is the correct way to go.

A question for all BASS users: what was the best scan time in your case for BASS_FX_BPM_DecodeGet? I don't want to scan the whole file - or do I? Please share your thoughts.

Thank you.

Ian @ un4seen

  • Administrator
  • Posts: 21861
Re: BASS for iOS (iPhone/iPad)
« Reply #898 on: 13 Oct '14 - 15:17 »
Yes, BASS_FX_BPM_DecodeGet requires a "decoding channel", so you would indeed need to use the BASS_STREAM_DECODE flag in the BASS_StreamCreateURL call in that case.

Regarding BPM scanning, I haven't really played with the BPM stuff myself, but it seems to me that scanning a section from the middle of the file would generally be best, ie. skipping any intro and outro.
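Picking such a mid-file window could be sketched like this (plain C; MiddleScanWindow is a made-up helper, the window length is arbitrary, and the resulting start/end seconds would then be passed to BASS_FX_BPM_DecodeGet):

```c
// Choose a "window"-second scan range centred in a "duration"-second
// track, skipping the intro and outro. Falls back to the whole track
// when it is shorter than the requested window.
static void MiddleScanWindow(double duration, double window,
                             double *start, double *end)
{
    if (duration <= window) {
        *start = 0;
        *end = duration;
    } else {
        *start = (duration - window) / 2;
        *end = *start + window;
    }
}
```

Eg. a 30-second window in a 4-minute track would scan from 1:45 to 2:15.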

ppeau

  • Posts: 48
Re: BASS for iOS (iPhone/iPad)
« Reply #899 on: 22 Oct '14 - 17:30 »
Hi, is there a way to check the current buffer size setting of a stream that has already been created?

I think I have found an issue with iOS 8 and iOS 8.1 (the same code doesn't behave the same as on iOS 7 or 6, where everything works perfectly).

After a lot of tests on several devices, always fine with iOS 7, I isolated an issue that appears only on iOS 8 and later:

BASS_SetConfig with, for example, BASS_CONFIG_BUFFER now modifies all streams created before the new setting, unlike on iOS 7 and unlike what the manual says: "Using this config option only affects the HMUSIC/HSTREAM channels that are created afterwards, not any that have already been created. So you can have channels with differing buffer lengths by using this config option each time before creating them."

To reproduce, I used, for example, an iPad mini or an iPad Air. I created a stream to play a track with BASS_CONFIG_BUFFER at 500ms, then set the config buffer to 10, and now the first stream is affected and I can hear some dropouts during playback.

If I comment out the second BASS_CONFIG_BUFFER call, the issue disappears.

Can I have confirmation that this is not normal behaviour?

Thanks a lot