BASS for iOS (iPhone/iPad)

Started by Ian @ un4seen, 16 Feb '10 - 16:32

MarkZ68

My iOS app (with libbass.a from November 2017) works fine on all Apple devices (iOS 10, iOS 11) except the iPhone X. I am using the following code to record the audio data (16-bit, recording period = 93 ms):

BASS_RecordInit(-1);
BASS_RecordStart(44100, 1, 6094848, @RecordingCallback, nil); // 6094848 = 0x5D0000: the 93 ms period packed into the high word of the flags

A user of an iPhone X informed me that BASS_RecordStart fails. I do not have any further information because I do not have an iPhone X and I have not implemented BASS_ErrorGetCode checking.
Please, can you help me?

Ian @ un4seen

I haven't tried an iPhone X myself yet either, so I'm not sure what the problem could be. Would it be possible to send a debug version to the user, to get more info on what's happening?
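
For example, a debug build could log the error code right after each call. A minimal sketch in Swift (assuming bass.h is exposed via a bridging header; recordingCallback stands in for your RECORDPROC):

// Log why recording setup fails on the problem device
if BASS_RecordInit(-1) == 0 {
    print("BASS_RecordInit failed, error code \(BASS_ErrorGetCode())")
}
let record = BASS_RecordStart(44100, 1, 6094848, recordingCallback, nil) // same parameters as above
if record == 0 {
    print("BASS_RecordStart failed, error code \(BASS_ErrorGetCode())")
}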

piyatat

Hello,

I'm working with BASS HLS and have a similar issue (cannot read the first metadata):
http://www.un4seen.com/forum/?topic=18003.msg126369#msg126369

But I cannot get the first metadata even if I call
BASS_ChannelGetTags(channel, BASS_TAG_HLS_EXTINF);
right after the stream creation.

The Android version can read it with Ian's suggestion, so I suspect it may be an issue with the iOS version (or iOS itself).

Thanks :)

Ian @ un4seen

It could be that BASSHLS has already begun downloading the 2nd segment when the BASS_StreamCreateURL call returns. To reduce the chances of that, you can try enabling asynchronous pre-buffering via the BASS_CONFIG_NET_PREBUF_WAIT option:

BASS_SetConfig(BASS_CONFIG_NET_PREBUF_WAIT, 0);

Note you need to be using the latest BASS version (2.4.13) for that option.
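
For example (a sketch in Swift; the stream URL is a placeholder, and bass.h/bass_hls.h are assumed to be visible via a bridging header):

// Return from BASS_StreamCreateURL before pre-buffering begins, then read
// the first segment's EXTINF tag immediately
BASS_SetConfig(DWORD(BASS_CONFIG_NET_PREBUF_WAIT), 0)
let channel = BASS_StreamCreateURL("http://example.com/stream.m3u8", 0, 0, nil, nil)
if channel != 0, let tag = BASS_ChannelGetTags(channel, DWORD(BASS_TAG_HLS_EXTINF)) {
    print("EXTINF: \(String(cString: tag))")
}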

p-XDv18

Hello,
I created an app that plays audio with BASS in Swift, but I can't capture the audio controls from the lock screen in iOS 11.
This is the code I use:

        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
            try AVAudioSession.sharedInstance().setActive(true)
            debugPrint("AVAudioSession is Active and Category Playback is set")
            UIApplication.shared.beginReceivingRemoteControlEvents()
            self.becomeFirstResponder()
            setupCommandCenter()
        } catch {
            debugPrint("Error: \(error)")
        }
    } // closes the enclosing setup method
   
    private func setupCommandCenter() {
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: "RADIOXDEVEL"]
       
        let commandCenter = MPRemoteCommandCenter.shared()
        commandCenter.playCommand.isEnabled = true
        commandCenter.pauseCommand.isEnabled = true
        commandCenter.playCommand.addTarget { [weak self] (event) -> MPRemoteCommandHandlerStatus in
            self?.playRadio()
            print("play lockscreen")
            return .success
        }
        commandCenter.pauseCommand.addTarget { [weak self] (event) -> MPRemoteCommandHandlerStatus in
            self?.pauseRadio()
            print("pause lockscreen")
            return .success
        }
    }


Also, I can't configure the metadata on the lock screen using this code:

let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.nextTrackCommand.isEnabled = true

MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyArtist: track.artist,
    MPMediaItemPropertyTitle: track.title,
]


Can you help me, please?

Oleg the soundman

Successfully tested recording on an iPhone X; all recording setup is the same as always.

Oleg the soundman

#1256
@p-XDv18, it looks like in iOS 11 You now have to set the route sharing policy to long-form, i.e. use the new method for setting the AVAudioSession category, the one that takes a routeSharingPolicy parameter with AVAudioSessionRouteSharingPolicyLongForm, to connect Your app to Control Center.

p-XDv18

Thanks for the reply, @Oleg N.

I added this line to my code:

try? AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, mode: AVAudioSessionModeDefault, routeSharingPolicy: AVAudioSessionRouteSharingPolicy.longForm, options: [])

but the Control Center doesn't respond to the player.

Do you have a solution?

Oleg the soundman

#1258
@p-XDv18, is the app allowed to play audio in the background? And do You provide MPNowPlayingInfoPropertyPlaybackRate = 1 to MPNowPlayingInfoCenter too?

Oleg the soundman

Please share an example of how You make BASS available from Swift in Xcode 9.
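
What I have in mind is something like this sketch (BASS is a plain C library, so presumably a bridging header that includes bass.h is enough; names here are placeholders):

// MyApp-Bridging-Header.h would contain:
//   #include "bass.h"
// after which the C API is callable directly from Swift:
if BASS_Init(-1, 44100, 0, nil, nil) == 0 {
    print("BASS_Init failed, error code \(BASS_ErrorGetCode())")
}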

p-XDv18

Hi @Oleg N.
I have enabled audio in the background. I'll show you all the code I use for this below.

In viewDidLoad()

let scc = MPRemoteCommandCenter.shared()
scc.togglePlayPauseCommand.addTarget(self, action: #selector(doPlayPause))
scc.playCommand.addTarget(self, action: #selector(doPlay))
scc.pauseCommand.addTarget(self, action: #selector(doPause))
scc.changePlaybackPositionCommand.isEnabled = false

try? AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, mode: AVAudioSessionModeDefault, routeSharingPolicy: AVAudioSessionRouteSharingPolicy.longForm, options: [])

try? AVAudioSession.sharedInstance().setActive(true)

UIApplication.shared.beginReceivingRemoteControlEvents()
self.becomeFirstResponder()

// Set up MPNowPlayingInfoCenter via MPNowPlayingInfoCenter.default()
updateLockScreen()


These are the other functions for the MPRemoteCommandEvent handlers:

@objc func doPlayPause(_ event: MPRemoteCommandEvent) {
    print("playpause Remote")
    if isPlaying { self.doPause(event) } else { self.doPlay(event) }
}

@objc func doPlay(_ event: MPRemoteCommandEvent) {
    print("play REMOTE")
    toggle()
    let mpic = MPNowPlayingInfoCenter.default()
    if var d = mpic.nowPlayingInfo {
        d[MPNowPlayingInfoPropertyPlaybackRate] = 1
        mpic.nowPlayingInfo = d
    }
}

@objc func doPause(_ event: MPRemoteCommandEvent) {
    print("pause REMOTE")
    toggle()
    let mpic = MPNowPlayingInfoCenter.default()
    if var d = mpic.nowPlayingInfo {
        d[MPNowPlayingInfoPropertyPlaybackRate] = 0
        mpic.nowPlayingInfo = d
    }
}


But the lock screen Control Center does not update and the commands do not work.
Am I doing everything correctly?

Thank you

Oleg the soundman

#1261
@p-XDv18, if You've done it all correctly and it still does not work, the last resort is to reboot Your device.

Other than that, You should have:

- AVAudioSession correctly configured and active; the options brackets are optional in Your case, I think.
- the background mode entitlement including "Audio and AirPlay"
- MPNowPlayingInfoCenter & MPRemoteCommandCenter instantiated and working
- audio actually playing when You set data on MPNowPlayingInfoCenter. Note that iOS will ignore the new data if You send it to MPNowPlayingInfoCenter too often. A common mistake is to send some incomplete data and then send the full metadata less than a second later.

As for Your code, are You checking whether setting the AVAudioSession category and activating it have both succeeded?
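
For example, a minimal sketch of checked setup (instead of discarding errors with try?):

import AVFoundation

do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback,
                                                    mode: AVAudioSessionModeDefault,
                                                    routeSharingPolicy: AVAudioSessionRouteSharingPolicy.longForm,
                                                    options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    debugPrint("AVAudioSession setup failed: \(error)") // now failures are visible
}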

 

Oleg the soundman

#1262
For debugging, You could for example put a breakpoint in doPause() at this position:

d[MPNowPlayingInfoPropertyPlaybackRate] = 0

Play a track, then call this func from within Your app (pause button?) and check what is inside the d dictionary. If it is empty, You probably need to properly configure the Now Playing information when the track has started.
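
For reference, a sketch of setting complete Now Playing information once the track has started (track and updateLockScreen() are from Your earlier post; the exact keys are up to You):

import MediaPlayer

func updateLockScreen() {
    // Set the full dictionary in one go, so d is never empty when a remote command fires
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyArtist: track.artist,
        MPMediaItemPropertyTitle: track.title,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}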

JerryRol

DWORD recordChannel;
HENCODE _encoder;
HSTREAM output = 0;

HSTREAM playStream;

// Recording callback - not doing anything with the data
BOOL CALLBACK DuffRecording(HRECORD handle, const void *buffer, DWORD length, void *user) {
    BASS_StreamPutData(recordChannel, buffer, length);
   
    //BASS_ChannelGetData(playStream, &buffer, length);
    return TRUE; // continue recording
}

DWORD CALLBACK playStreamCallBack(HSTREAM handle, void* writeBuffer, DWORD length, void* user) {
    DWORD r = BASS_ChannelGetData(recordChannel, writeBuffer, length);
    if (r==(DWORD)-1) return BASS_STREAMPROC_END; // signal the end
    return r;
}

#define  AUDIO_FILE @"rec.m4a"

- (void)initRec {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance]; // the app's shared audio session
    BOOL success = [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
    if(!success) {
        NSLog(@"error doing outputaudioportoverride -");
    }
   
    BASS_SetConfig(BASS_CONFIG_IOS_MIXAUDIO,0);
    BASS_SetConfig(BASS_CONFIG_IOS_SPEAKER, 1);

    if (!BASS_Init(-1, 44100, BASS_DEVICE_LATENCY, 0, NULL)) {  // initialize default output device (and measure latency)
        NSLog(@"Can't initialize device");
        return;
    }
   
    if (!BASS_RecordInit(-1)) { // initialize BASS recording (default device)
        NSLog(@"Can't initialize Record");
    }

    BASS_INFO info;
    BASS_GetInfo(&info); // get device info for optimal sample rate
   
    if (!(recordChannel = BASS_RecordStart(44100, 2, 0, DuffRecording, nil))) { // start recording (44100hz stereo 16-bit)
        NSLog(@"Error starting recording: %d", BASS_ErrorGetCode());
        BASS_RecordFree();
    }
   
    BASS_ChannelSetAttribute(recordChannel, BASS_ATTRIB_VOL, 0.9);
   
    NSString *filePath = [self.storagePath stringByAppendingPathComponent:AUDIO_FILE];
    _encoder = BASS_Encode_StartCAFile(recordChannel, 'm4af', 'alac', 0, 0, [filePath cStringUsingEncoding:NSUTF8StringEncoding]);
    if (!_encoder) {
        NSLog(@"Error starting encoder: %d", BASS_ErrorGetCode());
    }
   
    if (!(output = BASS_StreamCreate(info.freq, 2, BASS_SAMPLE_FLOAT, playStreamCallBack, &recordChannel))) { // create stream with same format to play the recording
        NSLog(@"Error starting output stream ------------Error:%d",  BASS_ErrorGetCode());
    }
   
    BASS_ChannelSetAttribute((DWORD)output, BASS_ATTRIB_NOBUFFER, 1); // disable playback buffering on it for lower latency
   
    BOOL setPlay = BASS_ChannelPlay(recordChannel, FALSE);
    BOOL setPlay2 = BASS_ChannelPlay(output, FALSE);
   
    NSLog(@"------%d--- ----- %d------Error:%d", setPlay, setPlay2, BASS_ErrorGetCode());
}

I don't know where this code is going wrong!

Thanks.

p-XDv18

@Oleg N
Thanks for the reply! I solved it by checking things the way you said.
I set MPNowPlayingInfoCenter after playback has started.

But when I stop and then resume playback, MPNowPlayingInfoCenter no longer responds. Why?
Do I have to set MPNowPlayingInfoCenter again?

Ian @ un4seen

Quote from: JerryRol on 28 Mar '18 - 11:19
// Recording callback - not doing anything with the data
BOOL CALLBACK DuffRecording(HRECORD handle, const void *buffer, DWORD length, void *user) {
    BASS_StreamPutData(recordChannel, buffer, length);
   
    //BASS_ChannelGetData(playStream, &buffer, length);
    return TRUE; // continue recording
}

It looks like this is attempting to push the captured data back to the recording channel? That isn't possible, ie. the BASS_StreamPutData call will be failing. BASS_StreamPutData can only be used on streams that are created by BASS_StreamCreate with STREAMPROC_PUSH.

I would suggest that you try not using a RECORDPROC function on the recording channel, which will allow you to fetch data from it (via BASS_ChannelGetData) in the output stream's STREAMPROC function. Like this:

    if (!(recordChannel = BASS_RecordStart(44100, 2, 0, nil, nil))) { // start recording (44100hz stereo 16-bit)
        NSLog(@"Error starting recording: %d", BASS_ErrorGetCode());
        BASS_RecordFree();
    }

Also note that the output stream's format should match the recording's format. So your BASS_StreamCreate call should be modified like this:

    if (!(output = BASS_StreamCreate(44100, 2, 0, playStreamCallBack, &recordChannel))) { // create stream with same format to play the recording
        NSLog(@"Error starting output stream ------------Error:%d",  BASS_ErrorGetCode());
    }

JerryRol

Quote from: Ian @ un4seen on 28 Mar '18 - 15:08

Hi Ian, I hope to achieve this goal: when I put on headphones and start recording, I can hear the sound I am recording in real time.
What should I do?

Ian @ un4seen

I think the changes I mentioned will give you that. If you're planning to also encode the captured data, you should use the BASS_ENCODE_QUEUE flag (eg. in the BASS_Encode_StartCAFile call) to prevent that delaying playback.
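
For example (a sketch; recordChannel and filePath as in your earlier code, and the fourCC helper is a stand-in for the Objective-C 'm4af'/'alac' character literals, shown in Swift):

func fourCC(_ s: String) -> DWORD {
    return s.utf8.reduce(DWORD(0)) { ($0 << 8) | DWORD($1) }
}
// BASS_ENCODE_QUEUE queues the captured data to the encoder asynchronously,
// so a slow encoder cannot delay the real-time playback path
let encoder = BASS_Encode_StartCAFile(recordChannel,
                                      fourCC("m4af"), fourCC("alac"),
                                      DWORD(BASS_ENCODE_QUEUE), 0,
                                      filePath)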

JerryRol

Quote from: Ian @ un4seen on 29 Mar '18 - 15:49
I think the changes I mentioned will give you that. If you're planning to also encode the captured data, you should use the BASS_ENCODE_QUEUE flag (eg. in the BASS_Encode_StartCAFile call) to prevent that delaying playback.

http://dump.bitcheese.net/files/rerymel/MBass.zip

This is my project. I have done what you said, but I still can't hear the sound in real time.
Please help me, thanks Ian.

Oleg the soundman

#1269
JerryRol, first, You never call [AVAudioSession setActive:error:] in Your code, but it should play audio even without that.
Then, add this line in Your recording callback, just before "return TRUE;":

BASS_StreamPutData(output, buffer, length);

This is because You create the output channel as STREAMPROC_PUSH (so You push data into it here).

Third, I would put the Audio Session category/setActive calls into viewDidLoad or something like that, to make sure they're called before You do anything with audio.

Fourth, the output channel creation should look something like:

if (!(output = BASS_StreamCreate(44100, 2, 0, STREAMPROC_PUSH, NULL)))

NULL, not a reference to the rec channel, because this last parameter is for user data, not an input channel.

Oleg the soundman

Quote from: p-XDv18 on 28 Mar '18 - 12:02
@Oleg N
Thanks for the reply! I solved it by checking things the way you said.
I set MPNowPlayingInfoCenter after playback has started.

But when I stop and then resume playback, MPNowPlayingInfoCenter no longer responds. Why?
Do I have to set MPNowPlayingInfoCenter again?

Yes, You need to set it again when You unpause, and that's logical because You need to change the playback rate back to 1 anyway.

Ian @ un4seen

Quote from: JerryRol on 2 Apr '18 - 03:53
http://dump.bitcheese.net/files/rerymel/MBass.zip

This is my project. I have done what you said, but I still can't hear the sound in real time.
Please help me, thanks Ian.

That code doesn't appear to have the changes that I suggested, ie. it's still using a RECORDPROC ("DuffRecording") in the BASS_RecordStart call and STREAMPROC_PUSH in the BASS_StreamCreate call. Perhaps you uploaded an old version of the code?

Paresh Gorasva

Can we use this in an iOS Swift project? If yes, then how do we integrate it into the project? I need some help.

JerryRol

Hi Ian,
How can I do real-time noise reduction while recording? Please help me.

Thanks!

p-XDv18

Hi,

Is the latest version of BASS compatible with Xcode 9.4 and Swift 4.1.2?

I followed the procedure to import BASS, but Xcode returns an error:

Module compiled with Swift 4.0.3 cannot be imported in Swift 4.1.2: /Users/developer/Desktop/xxxxxxxx/Bass.framework/Modules/Bass.swiftmodule/arm64.swiftmodule

Can you help me?