
Passing Sound (wav) File To Javascript From Objective C

I am recording a sound file (wav format) in Objective-C. I want to pass this back to JavaScript using Objective-C's stringByEvaluatingJavaScriptFromString. I am thinking that I wil…

Solution 1:

Well, this was not as straightforward as I expected, so here is how I was able to achieve it.

Step 1: I recorded the audio in caf format using AVAudioRecorder.

NSArray *dirPaths;
NSString *docsDir;

dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);

docsDir = [dirPaths objectAtIndex:0];

soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];

NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];

NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:AVAudioQualityMin],
    AVEncoderAudioQualityKey,
    [NSNumber numberWithInt:16],
    AVEncoderBitRateKey,
    [NSNumber numberWithInt:2],
    AVNumberOfChannelsKey,
    [NSNumber numberWithFloat:44100],
    AVSampleRateKey,
    nil];

NSError *error = nil;

audioRecorder = [[AVAudioRecorder alloc]
                 initWithURL:soundFileURL
                 settings:recordSettings error:&error];

if(error)
{
    NSLog(@"error: %@", [error localizedDescription]);
} else {
    [audioRecorder prepareToRecord];
}

After this, you just need to call audioRecorder.record to record the audio; it will be recorded in caf format. If you want to see my recordAudio function, here it is.

- (void) recordAudio
{
    if(!audioRecorder.recording)
    {
        _playButton.enabled = NO;
        _recordButton.title = @"Stop";
        [audioRecorder record];
        [self animate1:nil finished:nil context:nil];
    }
    else
    {
        [_recordingImage stopAnimating];
        [audioRecorder stop];
        _playButton.enabled = YES;
        _recordButton.title = @"Record";
    }
}
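
The answer never shows where the conversion in Step 2 is kicked off; presumably it happens once recording stops. A minimal sketch using the standard AVAudioRecorderDelegate callback (this wiring is an assumption, not part of the original answer):

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
                           successfully:(BOOL)flag
{
    // Hypothetical wiring: convert the finished caf recording to wav (Step 2).
    // Assumes self is set as audioRecorder.delegate.
    if (flag) {
        [self exportAssetAsWaveFormat:soundFilePath];
    }
}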

Step 2: Convert the caf file to wav format. I was able to do this with the following function.

-(BOOL)exportAssetAsWaveFormat:(NSString*)filePath
{
NSError *error = nil;

NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                              [ NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [ NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                              [ NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                              [ NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                              [ NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                              [ NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                              [ NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                              [ NSData data], AVChannelLayoutKey, nil ];

NSString *audioFilePath = filePath;
AVURLAsset * URLAsset = [[AVURLAsset alloc]  initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];

if (!URLAsset) return NO;

AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
if (error) return NO;

NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
if (![tracks count]) return NO;

AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
                                               assetReaderAudioMixOutputWithAudioTracks:tracks
                                               audioSettings:audioSetting];

if (![assetReader canAddOutput:audioMixOutput]) return NO;

[assetReader addOutput:audioMixOutput];

if (![assetReader startReading]) return NO;



NSString *title = @"WavConverted";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex:0];
NSString *outPath = [[docDir stringByAppendingPathComponent:title]
                     stringByAppendingPathExtension:@"wav"];

// Remove any previous output file; ignore the error if it doesn't exist yet
// (bailing out when removal fails would make the export fail on the first run).
[[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL];

soundFilePath = outPath;

NSURL *outURL = [NSURL fileURLWithPath:outPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
                                                      fileType:AVFileTypeWAVE
                                                         error:&error];
if (error) return NO;

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                          outputSettings:audioSetting];
assetWriterInput.expectsMediaDataInRealTime = NO;

if (![assetWriter canAddInput:assetWriterInput]) return NO;

[assetWriter addInput:assetWriterInput];

if (![assetWriter startWriting]) return NO;


//[assetReader retain];//[assetWriter retain];

[assetWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("assetWriterQueue", NULL);

[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{

    NSLog(@"start");

    while (1)
    {
        if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {

            CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];

            if (sampleBuffer) {
                [assetWriterInput appendSampleBuffer:sampleBuffer];
                CFRelease(sampleBuffer);
            } else {
                [assetWriterInput markAsFinished];
                break;
            }
        }
    }

    [assetWriter finishWriting];

    //[self playWavFile];
    NSError *err;
    NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options:0 error:&err];
    [self.audioDelegate doneRecording:audioData];
    //[assetReader release]; //[assetWriter release];
    NSLog(@"soundFilePath=%@", soundFilePath);
    NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
    NSLog(@"size of wav file = %@", [dict objectForKey:NSFileSize]);
    //NSLog(@"finish");
}];

return YES; // the actual writing completes asynchronously on the queue above
}

In this function I call the audioDelegate method doneRecording: with audioData, which is in wav format. Here is the code for doneRecording.

-(void) doneRecording:(NSData *)contents
{
    myContents = [[NSData dataWithData:contents] retain];
    [self returnResult:alertCallbackId args:@"Recording Done.", nil];
}
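
The audioDelegate property used above is never declared in the answer; a minimal sketch of the wiring it assumes (protocol and property names are inferred from the calls, MRC-era style):

@protocol AudioDelegate <NSObject>
- (void)doneRecording:(NSData *)contents;
@end

// On the recorder class:
@property (nonatomic, assign) id<AudioDelegate> audioDelegate;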

// Call this function when you have results to send back to javascript callbacks
// callbackId : int comes from handleCall function
// args: list of objects to send to the javascript callback
- (void)returnResult:(int)callbackId args:(id)arg, ...
{
  if (callbackId==0) return;

  va_list argsList;
  NSMutableArray *resultArray = [[NSMutableArray alloc] init];

  if(arg != nil){
    [resultArray addObject:arg];
    va_start(argsList, arg);
    while((arg = va_arg(argsList, id)) != nil)
      [resultArray addObject:arg];
    va_end(argsList);
  }

   NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
   [self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:) withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);",callbackId,resultArrayString] waitUntilDone:NO];
   [resultArray release];    
}
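
For clarity, this is what a call and the script it evaluates look like (callback id 1 is just an example):

// Report success to the javascript callback registered as id 1.
[self returnResult:1 args:@"Recording Done.", nil];
// The string evaluated in the UIWebView is then:
//   NativeBridge.resultForCallback(1,["Recording Done."]);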

Step 3: Now it is time to communicate back to the javascript inside the UIWebView that we are done recording the audio, so it can start accepting the data in blocks. I am using websockets to transfer the data back to javascript. The data is transferred in blocks because the server (https://github.com/benlodotcom/BLWebSocketsServer) I was using was built on libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).

This is how you start the server in the delegate class.

- (id)initWithFrame:(CGRect)frame 
{
  if (self = [super initWithFrame:frame]) {

      [self _createServer];
      [self.server start];
      myContents = [NSData data];

    // Set delegate in order for "shouldStartLoadWithRequest" to be called
    self.delegate = self;

    // Set non-opaque in order to make "body{background-color:transparent}" work
    self.opaque = NO;

    // Instantiate JSON parser library
    json = [SBJSON new];

    // Load our html file
    NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
    [self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
  }
  return self;
}
-(void) _createServer
{
    /* Create a simple echo server */
    self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
    [self.server setHandleRequestBlock:^NSData *(NSData *data) {

        NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
        NSLog(@"Received Request...%@",convertedString);

        if([convertedString isEqualToString:@"start"])
        {
            NSLog(@"myContents size: %d",[myContents length]);

            int contentSize = [myContents length];
            int chunkSize = 64*1023;
            chunksCount = (contentSize/chunkSize)+1;

            NSLog(@"ChunkSize=%d",chunkSize);
            NSLog(@"chunksCount=%d",chunksCount);

            chunksArray =  [[NSMutableArray array] retain];

            int index = 0;
            //NSRange chunkRange;
            for(int i=1; i<=chunksCount; i++)
            {

                if(i==chunksCount)
                {
                    NSRange chunkRange = {index,contentSize-index};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,contentSize-index);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    [chunksArray addObject:dataChunk];
                    break;
                }
                else
                {
                    NSRange chunkRange = {index, chunkSize};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,chunkSize);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    index += chunkSize;
                    [chunksArray addObject:dataChunk];
                }
            }

            return [chunksArray objectAtIndex:0];

        }
        else
        {
            int chunkNumber = [convertedString intValue];

            if(chunkNumber>0 && (chunkNumber+1)<=chunksCount)
            {
                return [chunksArray objectAtIndex:(chunkNumber)];
            }


        }

        NSLog(@"Releasing Array");
        [chunksArray release];
        chunksCount = 0;
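        // "Stop" decodes from base64 to exactly 3 bytes; the javascript side
        // uses this 3-byte message as the end-of-stream marker (e.data.size != 3).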
        return [NSData dataWithBase64EncodedString:@"Stop"];
    }];
}
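
The snippets above also reference several instance variables that are never declared in the answer; a plausible interface sketch (names are taken from the code, the class name and memory-management choices are assumptions):

// Matches the subprotocol the javascript side requests in new WebSocket(..., 'echo-protocol').
static NSString *const echoProtocol = @"echo-protocol";

@interface MyWebView : UIWebView <UIWebViewDelegate, AudioDelegate>
{
    SBJSON *json;                 // JSON writer used by returnResult:
    NSData *myContents;           // recorded wav data handed over by doneRecording:
    NSMutableArray *chunksArray;  // wav data split into 64*1023-byte chunks
    int chunksCount;              // number of chunks in chunksArray
    int alertCallbackId;          // javascript callback id captured by handleCall
}
@property (nonatomic, retain) BLWebSocketsServer *server;
@end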

The code on the javascript side is:

var socket;
var chunkCount = 0;
var soundBlob, soundUrl;
var smallBlobs = new Array();

function captureMovieCallback(response)
{
    if(socket)
    {
        try{
            socket.send('start');
        }
        catch(e)
        {
            log('Socket is not valid object');
        }

    }
    else
    {
        log('socket is null');
    }
}

function closeSocket(response)
{
    socket.close();
}


function connect(){
    try{
        window.WebSocket = window.WebSocket || window.MozWebSocket;

        socket = new WebSocket('ws://127.0.0.1:9000',
                               'echo-protocol');

        socket.onopen = function(){
        }

        socket.onmessage = function(e){
            var data = e.data;
            if(e.data instanceof ArrayBuffer)
            {
                log('its arrayBuffer');
            }
            else if(e.data instanceof Blob)
            {
                if(soundBlob)
                   log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size);

                if(e.data.size != 3)
                {
                    //log('its Blob of size = '+ e.data.size);
                    smallBlobs[chunkCount]= e.data;
                    chunkCount = chunkCount +1;
                    socket.send(''+chunkCount);
                }
                else
                {
                    //alert('End Received');
                    try{
                    soundBlob = new Blob(smallBlobs, { "type" : "audio/wav" });
                    var myURL = window.URL || window.webkitURL;
                    soundUrl = myURL.createObjectURL(soundBlob);
                    log('soundURL='+soundUrl);
                    }
                    catch(e)
                    {
                        log('Problem creating blob and url.');
                    }

                    try{
                        var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
                        var xhr = new XMLHttpRequest();
                        xhr.open('POST',serverUrl,true);
                        xhr.setRequestHeader("content-type","multipart/form-data");
                        xhr.send(soundBlob);
                    }
                    catch(e)
                    {
                        log('error uploading blob file');
                    }

                    socket.close();
                }

                //alert(JSON.stringify(msg, null, 4));
            }
            else
            {
                log('dont know');
            }
        }

        socket.onclose = function(){
            //message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
            log('final blob size:'+soundBlob.size);
        }

    } catch(exception){
       log('<p>Error: '+exception);
    }
}

function log(msg) {
    NativeBridge.log(msg);
}
function stopCapture() {
    NativeBridge.call("stopMovie", null,null);
}

function startCapture() {
    NativeBridge.call("captureMovie",null,captureMovieCallback);
}

NativeBridge.js

var NativeBridge = {
  callbacksCount : 1,
  callbacks : {},

  // Automatically called by native layer when a result is available
  resultForCallback : function resultForCallback(callbackId, resultArray) {
    try {


    var callback = NativeBridge.callbacks[callbackId];
    if (!callback) return;
    console.log("calling callback for "+callbackId);
    callback.apply(null,resultArray);
    } catch(e) {alert(e)}
  },

  // Use this in javascript to request native objective-c code
  // functionName : string (I think the name is explicit :p)
  // args : array of arguments
  // callback : function with n-arguments that is going to be called when the native code returns
  call : function call(functionName, args, callback) {

    //alert("call"); //alert('callback='+callback);
    var hasCallback = callback && typeof callback == "function";
    var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;

    if (hasCallback)
      NativeBridge.callbacks[callbackId] = callback;

    var iframe = document.createElement("IFRAME");
    iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId+ ":" + encodeURIComponent(JSON.stringify(args)));
    document.documentElement.appendChild(iframe);
    iframe.parentNode.removeChild(iframe);
    iframe = null;

  },

    log : function log(message) {

        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "ios-log:"+encodeURIComponent(JSON.stringify("#iOS#" + message)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;

    }

};
  1. We call connect() on the javascript side on body load in the html page.

  2. Once we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.

  3. The server on the objective c side splits the wav audio data into small chunks of chunkSize = 64*1023 bytes (matching the code above) and stores them in an array.

  4. It sends the first block back to the javascript side.

  5. Javascript accepts this block and sends back the number of the next block that it needs from the server.

  6. The server sends the block indicated by this number. This process is repeated until the last block is sent to javascript.

  7. At the end we send a stop message back to the javascript side indicating that we are done. It is 3 bytes in size (the base64 string "Stop" decodes to 3 bytes), which is used as the criterion to break out of this loop.

  8. Every block is stored as a small blob in an array. We then create one bigger blob from these small blobs using the following line:

    soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });

    This blob is uploaded to a server, which writes it out as a wav file. We can then pass the url of this wav file as the src of an audio tag to replay it on the javascript side.

  9. We close the websocket connection after sending the blob to the server.

    Hope this is clear enough to understand.

Solution 2:

If all you want to do is play the sound, you'd be much better off using one of the native audio playback APIs in iOS rather than the HTML audio tag.
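
For example, a minimal AVAudioPlayer sketch that plays the recorded file natively (soundFilePath is the wav path from Solution 1; player should be an ivar or property so it isn't deallocated mid-playback):

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
NSURL *fileURL = [NSURL fileURLWithPath:soundFilePath];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
if (player) {
    [player prepareToPlay];
    [player play];
} else {
    NSLog(@"Playback error: %@", [error localizedDescription]);
}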
