Hi, 

I have the task of recording the audio of a WebRTC local stream and a WebRTC 
remote stream.
I have succeeded in recording the audio with MediaRecorder (using timeSlice), 
which gives me recorded chunk Blobs (audio/ogg) for both the local and the 
remote audio (roughly the sketch below).
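
For reference, this is roughly how I record each stream; mediaStream here 
stands for either the local getUserMedia stream or the remote stream from the 
RTCPeerConnection, and the other names are just placeholders from my code:

// Sketch of the recording side: one MediaRecorder per stream, with a
// timeSlice so dataavailable fires periodically with an audio/ogg Blob.
function recordStream(mediaStream, onChunk) {
  const recorder = new MediaRecorder(mediaStream); // Firefox gives me audio/ogg
  recorder.ondataavailable = (event) => {
    if (event.data && event.data.size > 0) {
      onChunk(event.data); // a Blob of type audio/ogg
    }
  };
  recorder.start(1000); // timeSlice: a chunk roughly every 1000 ms
  return recorder;
}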

Now, before I send this to the server, I would like to mix the recorded local 
and remote audio chunks into one chunk using the Web Audio API, and this is 
where I run into problems.

These are the steps I take:

    1. When both the local and the remote audio Blob are available, I use a 
FileReader to get an ArrayBuffer for each.

    2. I pass each ArrayBuffer to AudioContext.decodeAudioData to get an 
AudioBuffer. Here I get the errors "The buffer passed to decodeAudioData 
contains an unknown content type." and "The buffer passed to decodeAudioData 
contains invalid content which cannot be decoded successfully." (Steps 1 and 2 
are roughly the first sketch after this list.)

    3. I then thought I could use an OfflineAudioContext, connect both 
AudioBuffers to its destination, and render the mixed AudioBuffer (roughly the 
second sketch after this list). Is this approach OK?

    4. After the OfflineAudioContext rendering is done, I need to convert the 
resulting AudioBuffer back into a Blob, so I can get a data URL with 
FileReader and send it to the server. How do I get a Blob from an AudioBuffer?
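
For steps 1 and 2, this is roughly what I have (blobToAudioBuffer is just a 
placeholder name, error handling trimmed):

// Steps 1 and 2, roughly: Blob -> ArrayBuffer -> AudioBuffer.
// audioCtx is an AudioContext created elsewhere.
function blobToAudioBuffer(audioCtx, blob) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => {
      // This is where decodeAudioData reports "unknown content type" /
      // "invalid content which cannot be decoded successfully".
      audioCtx.decodeAudioData(reader.result, resolve, reject);
    };
    reader.onerror = reject;
    reader.readAsArrayBuffer(blob);
  });
}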
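
And for step 3, what I had in mind is something like this (the sample rate and 
channel count are just assumptions; they should probably match the decoded 
buffers):

// Step 3, roughly: play both AudioBuffers into an OfflineAudioContext
// and let it render the mix.
function mixBuffers(localBuffer, remoteBuffer) {
  const sampleRate = 48000; // assumed
  const length = Math.max(localBuffer.length, remoteBuffer.length);
  const offlineCtx = new OfflineAudioContext(2, length, sampleRate);

  [localBuffer, remoteBuffer].forEach((buffer) => {
    const source = offlineCtx.createBufferSource();
    source.buffer = buffer;
    source.connect(offlineCtx.destination);
    source.start(0);
  });

  // startRendering resolves with the mixed AudioBuffer.
  return offlineCtx.startRendering();
}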

Please help,
Thank You


 
_______________________________________________
dev-media mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-media
