Author Topic: [BASS for Android issue] After applying effects and mixing channels, how to save to a file  (Read 18376 times)

Slim

  • Guest
Hi everyone,

It seems the Android version of BASS isn't complete yet. I think the BASSenc add-on could solve my issue, but it isn't available for Android yet. So I'm doing it like this:

// for example, mix 2 channels into one channel
mchal = BASSmix.BASS_Mixer_StreamCreate(44100, 2, BASS.BASS_SAMPLE_8BITS);

BASS.BASS_StreamFree(bchal);
bchal = BASS.BASS_StreamCreateFile(bfile.getPath(), 0, 0, BASS.BASS_STREAM_DECODE);
BASSmix.BASS_Mixer_StreamAddChannel(mchal, bchal, BASSmix.BASS_MIXER_BUFFER);

BASS.BASS_StreamFree(vchal);
vchal = BASS.BASS_StreamCreateFile(vfile.getPath(), 0, 0, BASS.BASS_STREAM_DECODE);
BASSmix.BASS_Mixer_StreamAddChannel(mchal, vchal, BASSmix.BASS_MIXER_BUFFER);

// play the mixed channel
if (mchal != 0) {
    boolean success = BASS.BASS_ChannelPlay(mchal, false);
}

// register the DSP callback
BASS.BASS_ChannelSetDSP(mchal, dspcallback, 0, 2);

//process callback
   private boolean unlock = true;
   BASS.DSPPROC dspcallback = new BASS.DSPPROC() {
      public void DSPPROC(int handle, int channel, ByteBuffer buffer, int length, Object user) {
         if (unlock) {
            // buffer the data
            try {
               outbuf.put(buffer);
            }
            catch (BufferOverflowException e) {
               // increase buffer size
               ByteBuffer temp;
               try {
                  temp = ByteBuffer.allocateDirect(outbuf.position() + length + BUFSTEP);
               }
               catch (Error e2) {
                  Log.d("test", "out of memory!");
                  return;
               }
               outbuf.limit(outbuf.position());
               outbuf.position(0);
               temp.put(outbuf);
               outbuf = temp;
               outbuf.put(buffer);
            }

         }
      }
   };

//save to a file
unlock = false;

            outbuf.limit(outbuf.position());
            // complete the WAVE header
            outbuf.putInt(4, outbuf.position() - 8);
            outbuf.putInt(40, outbuf.position() - 44);

            Log.d("test",
                  outbuf.get(0) + " " + outbuf.get(1) + " " + outbuf.get(67) + " " + outbuf.get(68) + " " + outbuf.get(69) + " " + outbuf.get(70) + " " + outbuf.get(71)
                        + " " + outbuf.get(72) + " " + outbuf.get(73) + " " + outbuf.get(74) + " " + outbuf.get(75));

            File outfile = new File("/sdcard/mixed.wav");
            try {
               FileChannel fc = new FileOutputStream(outfile).getChannel();
               outbuf.position(0);
               // fc.write(writeHeader(length));
               fc.write(outbuf);
               fc.close();
            }
            catch (IOException e) {
               Log.d("test", "Can't save the file");
            }
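
For reference, the 44-byte WAVE header that the two putInt calls above complete is put into outbuf before buffering starts, roughly like this (a sketch; freq, chans and bits stand in for the actual mixer format):

      outbuf.order(ByteOrder.LITTLE_ENDIAN); // WAVE header fields are little-endian
      outbuf.put(new byte[] { 'R','I','F','F', 0,0,0,0, 'W','A','V','E', 'f','m','t',' ', 16,0,0,0 }); // RIFF/WAVE + "fmt " chunk header
      outbuf.putShort((short)1); // format: PCM
      outbuf.putShort((short)chans); // channel count
      outbuf.putInt(freq); // sample rate
      outbuf.putInt(freq * chans * bits / 8); // average bytes per second
      outbuf.putShort((short)(chans * bits / 8)); // block align
      outbuf.putShort((short)bits); // bits per sample
      outbuf.put(new byte[] { 'd','a','t','a', 0,0,0,0 }); // "data" chunk header (the two size fields are filled in above when saving)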


It can save to a file, but it's not acceptable, for two reasons:
1. It only saves while playing; buffers are written to the file until playback stops (the DSP callback only receives data while playing).
2. The sound quality of the saved file is not acceptable, far too low. I'm certain that playing the channels directly sounds much better than playing the saved file.

Ian @ un4seen

  • Administrator
  • Posts: 26015
To simplify things, an Android version of the BASSenc add-on is now up in the Android thread...

   www.un4seen.com/forum/?topic=13225

SlimDroid

  • Posts: 17
Thanks Ian!
Thank you a lot.
If my research succeeds, my company will buy a full platform license 
It seems BASS is very good.  ;D

SlimDroid

  • Posts: 17
Hi Ian,

I've saved to a file successfully with the BASSenc you've just added. Saving to WAV format works fine, but I can't save to MP3 format.

Quote
int log = BASSenc.BASS_Encode_Start(mchal, "lame -b 128 - /sdcard/mixed.mp3", BASSenc.BASS_ENCODE_AUTOFREE, null, 0);
Log.d("test", "enc: " + log + " | error: " + BASS.BASS_ErrorGetCode());
Log:
enc: 0 | error: 37  =====>  BASS_ERROR_NOTAVAIL

I thought LAME was included in BASS, so I just used that call like the example in the docs. But that's not the case, right?
Please help me again  ;D

 

radio42

  • Posts: 4839
As stated in the docs, the command-line encoder has to be provided by yourself - in the same directory.
So just provide lame.exe and you should be fine.
bass doesn't include lame.exe of course.

SlimDroid

  • Posts: 17
As stated in the docs, the command-line encoder has to be provided by yourself - in the same directory.
So just provide lame.exe and you should be fine.
bass doesn't include lame.exe of course.

How about Android?
I've just downloaded the LAME library and compiled it for Android, but that call still doesn't work. So, should I use ENCODEPROC instead?

Ian @ un4seen

  • Administrator
  • Posts: 26015
You would set a DSP function on the BASS channel that you wish to encode, and have that function feed the data to the LAME encoder. For example, instead of calling BASS_Encode_Start, you might do something like this...

Code: [Select]
BASS.DSPPROC LameDSP = new BASS.DSPPROC() {
    public void DSPPROC(int handle, int channel, ByteBuffer buffer, int length, Object user) {
        // feed "buffer" to the encoder here
    }
};

// initialize the LAME encoder here
lamedsp = BASS.BASS_ChannelSetDSP(handle, LameDSP, LameObject, 0); // set DSP function to feed the encoder

SlimDroid

  • Posts: 17
Code: [Select]
int log = BASSenc.BASS_Encode_Start(mchal, "/sdcard/mixed.wav", BASSenc.BASS_ENCODE_PCM, null, 0);
Log.d("test", "enc: " + log + " | error: " + BASS.BASS_ErrorGetCode());
if (mchal != 0) {
    boolean success = BASS.BASS_ChannelPlay(mchal, false);
    Log.d("test", "mchal_played: " + success);
}
I have another question: is there any way to encode and save to a file without playing the channel? It should be an independent process.
I've found that the BASS_Encode_Start call does nothing without BASS_ChannelPlay; "mixed.wav" ends up with 0 bytes. Encoding and saving the file seem to happen in real time with playback.

Ian @ un4seen

  • Administrator
  • Posts: 26015
Yes, it is possible to write/encode a file without playing. You need to use the BASS_STREAM_DECODE flag when creating the stream to make it a "decoding channel", and then repeatedly call BASS_ChannelGetData (instead of BASS_ChannelPlay) to process it. For example, you could have a worker thread that does something like this...

Code: [Select]
ByteBuffer buf = ByteBuffer.allocateDirect(20000); // allocate buffer for decoding
while (!stop) {
    int r = BASS.BASS_ChannelGetData(handle, buf, buf.capacity()); // decode some data (and send it to WAV writer)
    if (r == -1) break; // end or error
}
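
If you want BASSenc to write the WAV file at the same time, you could set the encoder on the decoding channel before the loop, something like this (untested; using your "mchal" mixer handle)...

Code: [Select]
int encoder = BASSenc.BASS_Encode_Start(mchal, "/sdcard/mixed.wav", BASSenc.BASS_ENCODE_PCM | BASSenc.BASS_ENCODE_AUTOFREE, null, 0); // write the decoded data to a WAV file
ByteBuffer buf = ByteBuffer.allocateDirect(20000); // decoding buffer
while (true) {
    int r = BASS.BASS_ChannelGetData(mchal, buf, buf.capacity()); // decode some data (the encoder receives it automatically)
    if (r == -1) break; // end or error
}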

SlimDroid

  • Posts: 17
Yep, I got it! I've just done it! ;D

SlimDroid

  • Posts: 17
Hi Ian,

I've just found something abnormal.
When I use BASSmix.BASS_Mixer_StreamAddChannelEx(mchan, bchan, BASSmix.BASS_MIXER_BUFFER, delay, 0);, even though the recording and the mixing/encoding are not related to each other, it causes either problem 1 or problem 2 below with my recording:
1. The recorded file can't be played.
2. For the first 1-2 seconds nothing is received from the recording, then the sound breaks up a little. The resulting file contains that broken part followed by the normal part.

Problem 2 example:
https://drive.google.com/file/d/0B6h31E9yLA-1d2ZST05CYzBVVkU/edit?usp=sharing

Normal recorded file:
https://drive.google.com/file/d/0B6h31E9yLA-1eVlsMG9RTDJ4VFk/edit?usp=sharing
===> Here I use BASSmix.BASS_Mixer_StreamAddChannel(mchan, bchan, BASSmix.BASS_MIXER_BUFFER);, so the file receives data from the recording immediately and has no broken part.

---------
PS: Sorry, I posted the wrong link!
« Last Edit: 8 Apr '14 - 04:47 by SlimDroid »

Ian @ un4seen

  • Administrator
  • Posts: 26015
Is "bchan" a recording channel, created with BASS_RecordStart? If so, a problem with delaying its introduction in the mixer is that the recording buffer will overflow in the meantime. You could try using the BASS_RECORD_PAUSE flag to create it in a paused state, and then perhaps use a BASS_SYNC_POS sync on the mixer to resume it (via BASS_ChannelPlay) just before its start position.
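
For example, something like this (untested; assuming "rchan" is the recording channel being fed to the mixer, and "delay" is the mixer byte position where it should start)...

Code: [Select]
rchan = BASS.BASS_RecordStart(FREQ, CHANS, BASS.BASS_RECORD_PAUSE, null, 0); // create the recording paused (no RECORDPROC; the mixer pulls the data)
BASSmix.BASS_Mixer_StreamAddChannelEx(mchan, rchan, BASSmix.BASS_MIXER_BUFFER, delay, 0); // delayed start in the mixer
BASS.BASS_ChannelSetSync(mchan, BASS.BASS_SYNC_POS, delay, new BASS.SYNCPROC() {
    public void SYNCPROC(int handle, int channel, int data, Object user) {
        BASS.BASS_ChannelPlay(rchan, false); // resume the recording (you may want the sync slightly before "delay")
    }
}, 0);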

If you still have a problem with it, please post the full code (including all of the BASS calls).
« Last Edit: 8 Apr '14 - 14:07 by Ian @ un4seen »

SlimDroid

  • Posts: 17
Sorry, I posted the wrong link ;D
Please check my comment above again to see what happens to my file, or click here:
https://drive.google.com/file/d/0B6h31E9yLA-1d2ZST05CYzBVVkU/edit?usp=sharing

Actually, after testing some more, I've figured out that it's not about that call. It's caused by something else that I don't understand. Sometimes it doesn't happen at all.

Here are the details:
https://drive.google.com/file/d/0B6h31E9yLA-1QVpQS2JkWEhsYmM/edit?usp=sharing

-----------
I understand your solution but there is no "BASS_POS_SYNC" flag.

I named:
rchan = record channel
bchan = normal music channel
mchan = mixer channel

-----------
Here is the recording code, if you want to know what I did with my recording ;D
Code: [Select]
rchan = BASS.BASS_RecordStart(FREQ, CHANS, 0, RecordingCallback, 0);
...
private BASS.RECORDPROC RecordingCallback = new BASS.RECORDPROC() {
    public boolean RECORDPROC(int handle, ByteBuffer buffer, int length, Object user) {
        // buffer the data
        try {
            recbuf.put(buffer);
        }
        catch (BufferOverflowException e) {
            // increase buffer size
            ByteBuffer temp;
            try {
                temp = ByteBuffer.allocateDirect(recbuf.position() + length + BUFSTEP);
            }
            catch (Error e2) {
                runOnUiThread(new Runnable() {
                    public void run() {
                        Error("Out of memory!");
                        StopRecording();
                    }
                });
                return false;
            }
            recbuf.limit(recbuf.position());
            recbuf.position(0);
            temp.put(recbuf);
            recbuf = temp;
            recbuf.put(buffer);
        }
        return true; // continue recording
    }
};

If you want to see the full code, please tell me your email and I'll zip my Android project and send it to you. It's like an example with a small amount of code, so it's not hard to read ;D

Ian @ un4seen

  • Administrator
  • Posts: 26015
I understand your solution but there is no "BASS_POS_SYNC" flag.

Oops, I meant to say "BASS_SYNC_POS".

rchan = record channel
bchan = normal music channel
mchan = mixer channel

Is your problem WAV file from the mixer or the recording? If the recording, it looks like your recording code is based on the RECTEST example, so please check whether you can reproduce the problem with the original RECTEST example. Note there was an endianness bug in the header of the example's written WAV files; you can redownload the Android BASS package to get a corrected version (an "order" call added to the RECORDPROC). I don't think that's what's causing your issue, but still worth fixing :)
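
The gist of that fix is to make the buffer little-endian before any multi-byte header values are written into it, ie. something like this...

Code: [Select]
recbuf.order(ByteOrder.LITTLE_ENDIAN); // WAVE header values are little-endian, but Java ByteBuffers default to big-endian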

SlimDroid

  • Posts: 17
Quote
Is your problem WAV file from the mixer or the recording? If the recording, it looks like your recording code is based on the RECTEST example, so please check whether you can reproduce the problem with the original RECTEST example. Note there was an endianness bug in the header of the example's written WAV files; you can redownload the Android BASS package to get a corrected version (an "order" call added to the RECORDPROC). I don't think that's what's causing your issue, but still worth fixing :)
Yep, it's actually based on the RECTEST example ;D
I also fixed the header, to make sure that wasn't the problem.

Sadly, the problem does come from the recording. That's why I don't understand it: the mixer and the recording aren't related to each other, but I thought the problem might come from the mixer, like I said. I was just guessing  ;D

Did you see my picture? Then you'll understand. It's weird, right? I also think maybe the Init takes a long time, longer than it takes me to open the app and press record.
« Last Edit: 10 Apr '14 - 03:30 by SlimDroid »

Ian @ un4seen

  • Administrator
  • Posts: 26015
To help narrow down what's causing the problem, please check if you get the problem when running an unmodified RECTEST example (from the Android BASS package).

SlimDroid

  • Posts: 17
Hi Ian,

I've found out what causes it, and I've already fixed it ;D
It explains why I thought the BASS_Mixer_StreamAddChannelEx call or the Init was causing the issue.
After installing the app for the first time everything is OK, but on later runs the problem appears.
Why? Because the worst case is calling BASS_Init without calling BASS_Free first and then using BASS_Mixer_StreamAddChannelEx. With BASS_Mixer_StreamAddChannel it's OK. It's so weird!

BASS_Init # BASS_RecordInit # BASSmix. I don't think they should affect each other, but it seems they do.

If you do this first, you can do everything else without problems:
Code: [Select]
BASS.BASS_RecordFree();
BASS.BASS_Free();

BASS.BASS_SetConfig(BASSmix.BASS_CONFIG_MIXER_BUFFER, 1);
if (!BASS.BASS_Init(-1, 44100, 0)) {
    Log.d("test", "Can't initialize device");
    return;
}

// setup recording device (using default device)
if (!BASS.BASS_RecordInit(-1)) {
    Error("Can't initialize recording device");
    return;
}

PS: Sorry if my inference is stupid ;D

Wael Adel

  • Guest
have that function feed the data to the LAME encoder. For example, instead of calling BASS_Encode_Start, you might do something like this...

Code: [Select]
BASS.DSPPROC LameDSP = new BASS.DSPPROC() {
    public void DSPPROC(int handle, int channel, ByteBuffer buffer, int length, Object user) {
        // feed "buffer" to the encoder here
    }
};

// initialize the LAME encoder here
lamedsp = BASS.BASS_ChannelSetDSP(handle, LameDSP, LameObject, 0); // set DSP function to feed the encoder

Is there any example or link showing how to feed the buffer to the LAME encoder?

The LAME wrapper in this tutorial http://developer.samsung.com/android/technical-docs/Porting-and-using-LAME-MP3-on-Android-with-JNI only has this function:
Code: [Select]
private native int encodeFile(String sourcePath, String targetPath);
It only accepts a string path to a source file, not a ByteBuffer. So any clue how to feed a ByteBuffer to the LAME encoder, or how to add a function to the wrapper so it handles a ByteBuffer instead of a source file?

Ian @ un4seen

  • Administrator
  • Posts: 26015
You could add an "encodeBuffer" function. For example, the Java function declaration could look like this...

Code: [Select]
private native ByteBuffer encodeBuffer(ByteBuffer buffer, int length);

And the JNI code (added to WRAPPER.C) could look something like this...

Code: [Select]
void *outbuf = NULL; // output buffer
int outsize = 0; // output buffer size

jobject Java_com_samsung_sample_lame4android_LameActivity_encodeBuffer(JNIEnv *env, jobject jobj, jobject buffer, jint length) {
    int outlen = length > BUFFER_SIZE ? length : BUFFER_SIZE; // at least BUFFER_SIZE
    if (outsize < outlen) outbuf = realloc(outbuf, outsize = outlen); // enlarge output buffer
    if (!buffer) { // no ByteBuffer (null) = flush encoder
        outlen = lame_encode_flush(lame, outbuf, outsize); // flush
    } else {
        void *inbuf = (*env)->GetDirectBufferAddress(env, buffer);
        int channels = lame_get_num_channels(lame);
        outlen = lame_encode_buffer_interleaved(lame, inbuf, length / channels / sizeof(short), outbuf, outsize); // encode the data
    }
    if (outlen < 0) return NULL; // failed
    return (*env)->NewDirectByteBuffer(env, outbuf, outlen); // return a new ByteBuffer containing the encoded data
}

Note that the "outbuf" buffer is reused in each call, so you will need to do something with the returned data (e.g. write it to a file) between calls. You can free "outbuf" (and reset outsize=0) in the Java_com_samsung_sample_lame4android_LameActivity_destroyEncoder function.
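
On the Java side, your DSP function could then pass the data to encodeBuffer and write whatever it returns straight to a file, something like this (just a sketch; "mp3out" stands for a FileChannel that you have opened for the output file)...

Code: [Select]
BASS.DSPPROC LameDSP = new BASS.DSPPROC() {
    public void DSPPROC(int handle, int channel, ByteBuffer buffer, int length, Object user) {
        ByteBuffer mp3 = encodeBuffer(buffer, length); // encode the PCM data
        if (mp3 == null) return; // encoding failed
        try {
            mp3out.write(mp3); // write the encoded data now, before the next call reuses outbuf
        } catch (IOException e) {
            Log.d("test", "Can't write MP3 data");
        }
    }
};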

Wael Adel

  • Guest
Code: [Select]
void *outbuf = NULL; // output buffer
int outsize = 0; // output buffer size

jobject Java_com_samsung_sample_lame4android_LameActivity_encodeBuffer(JNIEnv *env, jobject jobj, jobject buffer, jint length) {
    int outlen = length > BUFFER_SIZE ? length : BUFFER_SIZE; // at least BUFFER_SIZE
    if (outsize < outlen) outbuf = realloc(outbuf, outsize = outlen); // enlarge output buffer
    if (!buffer) { // no ByteBuffer (null) = flush encoder
        outlen = lame_encode_flush(lame, outbuf, outsize); // flush
    } else {
        void *inbuf = (*env)->GetDirectBufferAddress(env, buffer);
        int channels = lame_get_num_channels(lame);
        outlen = lame_encode_buffer_interleaved(lame, inbuf, length / channels / sizeof(short), outbuf, outsize); // encode the data
    }
    if (outlen < 0) return NULL; // failed
    return (*env)->NewDirectByteBuffer(env, outbuf, outlen); // return a new ByteBuffer containing the encoded data
}

Well, here is my code, but I have two problems:
1. It's not broadcasting to the Shoutcast server.
2. When I destroy the LAME and BASS encoders, it keeps getting the buffered data until the app crashes.

This is the wrapper.c code:

Code: [Select]
void *outbuf = NULL; // output buffer
int outsize = 0; // output buffer size
//========================

void Java_com_samsung_sample_lame4android_LameActivity_destroyEncoder(JNIEnv *env, jobject jobj) {
    free(outbuf); // free outbuf to start again
    outbuf = NULL;
    outsize = 0; // reset outsize to zero
    int res = lame_close(lame);
    LOGD("Deinit returned: %d", res);
}
//=========================
jobject Java_com_samsung_sample_lame4android_LameActivity_encodeBuffer(JNIEnv *env, jobject jobj, jobject buffer, jint length) {
    int outlen;
    if (length > BUFFER_SIZE) {
        outlen = length;
    } else {
        outlen = BUFFER_SIZE;
    }

    if (outsize < outlen) outbuf = realloc(outbuf, outsize = outlen); // enlarge output buffer
    if (!buffer) { // no ByteBuffer (null) = flush encoder
        outlen = lame_encode_flush(lame, outbuf, outsize); // flush
        LOGD("no ByteBuffer (null) = flush encoder");
    } else {
        void *inbuf = (*env)->GetDirectBufferAddress(env, buffer);
        int channels = lame_get_num_channels(lame);
        outlen = lame_encode_buffer_interleaved(lame, inbuf, length / channels / sizeof(short), outbuf, outsize); // encode the data
        LOGD("encode the data");
    }
    if (outlen < 0) return NULL; // failed
    LOGD("return a new ByteBuffer containing encoded data");
    return (*env)->NewDirectByteBuffer(env, outbuf, outlen); // return a new ByteBuffer containing encoded data
}

And this is my service code, which I use to encode the music and broadcast it to Shoutcast:

Code: [Select]
BASSenc.ENCODERPROC EncodingCallback=new BASSenc.ENCODERPROC() {
        public int ENCODERPROC(int handle, int channel, ByteBuffer buffer, int length, int maxout, Object user) {

            encodedBuffer = encodeBuffer(buffer, length);
            buffer.clear(); // to make buffer ready for writing
            try {
                buffer.put(encodedBuffer); // to write the encoded MP3 buffer on the old PCM buffer
            } catch (BufferOverflowException e) {
                // increase buffer size

                ByteBuffer temp ;
                try {
                    temp=ByteBuffer.allocateDirect(buffer.position()+length+BUFSTEP);
                } catch (Error e2) {
                    Log.d(TAG, "bass buffer Out of memory!" + BASS.BASS_ErrorGetCode());

                    return -1;
                }
                temp.order(ByteOrder.LITTLE_ENDIAN);
                buffer.limit(buffer.position());
                buffer.position(0);
                temp.put(buffer);
                buffer=temp;
                buffer.put(encodedBuffer); // to write the encoded MP3 buffer on the old PCM buffer
            }
           
       
            return buffer.capacity(); // it's supposed to return the amount of data buffered; I'm not sure if it's capacity, maxout or length
        }

    };
//====================================
    private native ByteBuffer encodeBuffer(ByteBuffer buffer, int length);
//====================================

 @Override
    public void onCreate() {
        super.onCreate();
             
        initEncoder(NUM_CHANNELS, SAMPLE_RATE, BITRATE, MODE, QUALITY); //init LAME encoder

        BASS.BASS_Free();
        if (!BASS.BASS_Init(-1, 44100, 0)) {
            Log.d(TAG, "bass Can't initialize device");
            return;
        }
       
        chan = BASS.BASS_StreamCreateURL("http://media.house-mixes.com/m/DeejaymiloSA/f661e0a0-8d77-40f5-9fd7-04eb178e21d3.mp3", 0, 0, null, 0);

    }

//==========================
@Override
    public int onStartCommand(Intent intent, int flags, int startId) {

            BASS.BASS_ChannelPlay(chan, false);// play Bass Chan
            encoder=BASSenc.BASS_Encode_StartUser(chan ,BASSenc.BASS_ENCODE_PCM |BASSenc.BASS_ENCODE_AUTOFREE, EncodingCallback , 0);

            if (encoder == 0) { // can't start encoder.. maybe no internet
                BASS.BASS_ChannelStop(chan);
                chan = 0;
                //return;
            }

            //initiate BASS Cast to ShoutCast server
            if (!BASSenc.BASS_Encode_CastInit(encoder,server,pass,content,name,url,genre,desc,headers,bitrate,pub)) {
                BASS.BASS_ChannelStop(chan);
                chan=0;
            }

            BASSenc.BASS_Encode_CastSetTitle(encoder,"loves Adele songs",null);

        return START_STICKY;

    }

//==========================
@Override
    public void onDestroy() {

            BASSenc.BASS_Encode_Stop(encoder);
            BASS.BASS_ChannelStop(chan);// stop Bass Channel
            chan = 0;
            BASS.BASS_Free(); // free BASS resources
            destroyEncoder(); // destroy LAME encoder

        super.onDestroy();

    }
//=============================
// shoutcast settings
    String domain = "http://trackaty.com"; //192.232.244.57
    int port = 8000;
    int streamId = 1; // 897
    String server = domain+":"+port;
    String pass = "xxxxx";
    String username = "wael";
    String content = BASSenc.BASS_ENCODE_TYPE_MP3;
    String name = "Qalb Trancaya";
    String url = "http://www.trackaty.com/897-karim-adel/profile";
    String genre = "Trance";
    String desc = null;
    String headers = null;
    int bitrate = 96;
    boolean pub = false;

What am I doing wrong?
Why can't I send data to Shoutcast?
Did I put the encoded MP3 buffer into the same old buffer correctly?
What should I return from the ENCODERPROC EncodingCallback to represent the "amount of data": buffer.capacity(), length or maxout?
Why does the ENCODERPROC EncodingCallback keep getting buffered data forever, even after I stop the encoder and it sends length=BASS_ENCODER_CLOSE? Isn't it supposed to get just the remaining data and then close?

Wael Adel

  • Guest
Actually, it can broadcast to Shoutcast, but it's very slow and it pauses every second.

Ian @ un4seen

  • Administrator
  • Posts: 26015
And this is my service code, which I use to encode the music and broadcast it to Shoutcast:

Code: [Select]
BASSenc.ENCODERPROC EncodingCallback=new BASSenc.ENCODERPROC() {
        public int ENCODERPROC(int handle, int channel, ByteBuffer buffer, int length, int maxout, Object user) {

            encodedBuffer = encodeBuffer(buffer, length);
            buffer.clear(); // to make buffer ready for writing
            try {
                buffer.put(encodedBuffer); // to write the encoded MP3 buffer on the old PCM buffer
            } catch (BufferOverflowException e) {
                // increase buffer size

                ByteBuffer temp ;
                try {
                    temp=ByteBuffer.allocateDirect(buffer.position()+length+BUFSTEP);
                } catch (Error e2) {
                    Log.d(TAG, "bass buffer Out of memory!" + BASS.BASS_ErrorGetCode());

                    return -1;
                }
                temp.order(ByteOrder.LITTLE_ENDIAN);
                buffer.limit(buffer.position());
                buffer.position(0);
                temp.put(buffer);
                buffer=temp;
                buffer.put(encodedBuffer); // to write the encoded MP3 buffer on the old PCM buffer
            }
            
        
            return buffer.capacity(); // it's supposed to return the amount of data buffered; I'm not sure if it's capacity, maxout or length
        }

    };

The encoded data needs to be placed in the same buffer as was received in the "buffer" parameter, ie. don't allocate a new buffer. If the encoder returns more than "maxout" bytes (which will be equal to "buffer.capacity()"), then you would need to retain the excess data and return it in the next ENCODERPROC call (it will be called again). To avoid such concerns, it would probably be best to pass the "maxout" value to the encodeBuffer function. It could be modified like this...

Code: [Select]
jobject Java_com_samsung_sample_lame4android_LameActivity_encodeBuffer(JNIEnv *env, jobject jobj, jobject buffer, jint length, jint maxout) {
    int outlen;
    if (outsize < maxout) outbuf = realloc(outbuf, outsize = maxout); // enlarge output buffer
    if (!buffer) { // no ByteBuffer (null) = flush encoder
        outlen = lame_encode_flush(lame, outbuf, maxout); // flush
        LOGD("no ByteBuffer (null) = flush encoder");
    } else {
        void *inbuf = (*env)->GetDirectBufferAddress(env, buffer);
        int channels = lame_get_num_channels(lame);
        outlen = lame_encode_buffer_interleaved(lame, inbuf, length / channels / sizeof(short), outbuf, maxout); // encode the data
        LOGD("encode the data");
    }
    if (outlen < 0) return NULL; // failed
    LOGD("return a new ByteBuffer containing encoded data");
    return (*env)->NewDirectByteBuffer(env, outbuf, outlen); // return a new ByteBuffer containing encoded data
}

You can then do this in your ENCODERPROC...

Code: [Select]
// note: the native declaration gains the extra parameter too: private native ByteBuffer encodeBuffer(ByteBuffer buffer, int length, int maxout);
BASSenc.ENCODERPROC EncodingCallback = new BASSenc.ENCODERPROC() {
    public int ENCODERPROC(int handle, int channel, ByteBuffer buffer, int length, int maxout, Object user) {
        encodedBuffer = encodeBuffer(length == -1 ? null : buffer, length, maxout); // flush upon closing (length=-1)
        buffer.clear(); // to make buffer ready for writing
        buffer.put(encodedBuffer); // write the encoded MP3 data over the old PCM data
        return buffer.position();
    }
};

Code: [Select]
           BASS.BASS_ChannelPlay(chan, false);// play Bass Chan
            encoder=BASSenc.BASS_Encode_StartUser(chan ,BASSenc.BASS_ENCODE_PCM |BASSenc.BASS_ENCODE_AUTOFREE, EncodingCallback , 0);

            if (encoder == 0) { // can't start encoder.. maybe no internet
                BASS.BASS_ChannelStop(chan);
                chan = 0;
                //return;
            }

            //initiate BASS Cast to ShoutCast server
            if (!BASSenc.BASS_Encode_CastInit(encoder,server,pass,content,name,url,genre,desc,headers,bitrate,pub)) {
                BASS.BASS_ChannelStop(chan);
                chan=0;
            }

The encoder and caster should generally be set up before starting the source, ie. move the BASS_ChannelPlay call to after BASS_Encode_CastInit. Alternatively, you can start the encoder in a paused state via the BASS_ENCODE_PAUSE flag. It isn't a big deal with headerless formats like MP3, but if you happen to want to support Ogg Vorbis or FLAC in future, then it is important to set up the caster before the encoder begins processing; the caster might otherwise miss some of the encoder output.
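
For example, a rough reordering of your calls (untested)...

Code: [Select]
encoder = BASSenc.BASS_Encode_StartUser(chan, BASSenc.BASS_ENCODE_PCM | BASSenc.BASS_ENCODE_AUTOFREE, EncodingCallback, 0); // setup the encoder first
if (encoder != 0 && BASSenc.BASS_Encode_CastInit(encoder, server, pass, content, name, url, genre, desc, headers, bitrate, pub)) {
    BASS.BASS_ChannelPlay(chan, false); // start the source only after the caster is ready
} else { // encoder or caster setup failed
    BASS.BASS_ChannelStop(chan);
    chan = 0;
}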
« Last Edit: 8 Oct '14 - 14:57 by Ian @ un4seen »

Wael Adel

  • Guest
Thank you so much, it worked! I managed to broadcast to Shoutcast successfully. You are my hero.

One more question. I use my app to broadcast my player's songs to Shoutcast, so which is the best practice?

1. Use a record channel to record what the player is playing. Is it possible on Android to record "What you hear" or the master device?
2. Play the same source file used by the player, but of course using a decode channel, and send its data to the BASS encoder using BASS.BASS_ChannelGetData.

Also, I need to quickly change the stream source file when the next song plays. Right now my app looks like it's not responding for a few seconds, because for every new song I have to stop the channel and the encoder, initiate them again, and start a new CastInit. I wonder if BASS.BASS_ChannelGetData can send a new stream to the encoder and CastInit without having to stop and re-initiate them for every new song.

Here is what I do when the next song plays:

Code: [Select]
public void onChanChanged() { //when Bass Channel changed

        if(chan != 0){
           BASS.BASS_ChannelPause(chan);//Stop channel to apply new one
        }

        trackSource = Uri.parse(PlaybackService.getCurrentQuery().getPreferredTrackResult().getPath());
        chan = BASS.BASS_StreamCreateFile(trackSource.toString(), 0, 0, 0);
           
        if(encoder != 0){
            BASSenc.BASS_Encode_Stop(encoder); // stop encoder to apply new one
        }

        // initiate BASS encoder to get LAME-encoded MP3 and send it to the caster
        encoder = BASSenc.BASS_Encode_StartUser(chan, BASSenc.BASS_ENCODE_PCM | BASSenc.BASS_ENCODE_LIMIT, EncodingCallback, 0);

        //initiate BASS Cast to ShoutCast server
       if (!BASSenc.BASS_Encode_CastInit(encoder,server,pass,content,name,url,genre,desc,headers,BITRATE,pub)) {
            Log.d(TAG, "bass Couldn't setup Init connection with server" + BASS.BASS_ErrorGetCode());
            BASS.BASS_ChannelStop(chan);
            chan=0;
        }

        BASS.BASS_ChannelPlay(chan, false);// play Bass Chan
        BASSenc.BASS_Encode_CastSetTitle(encoder,artistName +" - " +trackName,null);

    }

Ian @ un4seen

  • Administrator
  • Posts: 26015
Good to hear that the casting is working well. Regarding your question, I don't think there is any "What you hear" recording option on Android, so you would need to set the encoder/caster directly on what you're playing. It shouldn't be necessary to create a separate decoding channel for that, ie. set the encoder/caster directly on the playback stream. BASS_Encode_SetChannel can be used to move an encoder/caster to another stream, but the new stream's sample format must match the old stream's (and encoder's) format, ie. same sample rate and channel count. If you need to play files with varying sample formats, then you could use a mixer to play them all in a common sample format and keep the encoder/caster set on the mixer throughout.

By the way, the BASS_ENCODE_PCM flag doesn't apply to BASS_Encode_StartUser (it will be ignored), so you may as well remove that from your call.
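
For example, the mixer approach could look roughly like this (untested; "mixer" and "newchan" are just placeholder names)...

Code: [Select]
// setup once: a mixer with a fixed output format, and the encoder/caster set on it
int mixer = BASSmix.BASS_Mixer_StreamCreate(44100, 2, 0);
encoder = BASSenc.BASS_Encode_StartUser(mixer, BASSenc.BASS_ENCODE_AUTOFREE, EncodingCallback, 0);
BASSenc.BASS_Encode_CastInit(encoder, server, pass, content, name, url, genre, desc, headers, bitrate, pub);
BASS.BASS_ChannelPlay(mixer, false);

// for each new song: create a decoding stream and plug it into the mixer (the encoder/caster stay untouched)
int newchan = BASS.BASS_StreamCreateFile(trackSource.toString(), 0, 0, BASS.BASS_STREAM_DECODE);
BASSmix.BASS_Mixer_StreamAddChannel(mixer, newchan, BASS.BASS_STREAM_AUTOFREE); // the previous song can be removed with BASS_Mixer_ChannelRemove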

Wael Adel

  • Guest
Hi,

I get this error in my release version, even though the debug version works fine. Any idea how to solve it?

 E/AndroidRuntime﹕ FATAL EXCEPTION: Thread-3705
    java.lang.NoSuchMethodError: no method with name='ENCODERPROC' signature='(IILjava/nio/ByteBuffer;IILjava/lang/Object;)I' in class /services/StreamService$2;
            at com.un4seen.bass.BASSenc.BASS_Encode_StartUser(Native Method)