On Wed, Jun 18, 2014 at 11:03 AM, AirMike <[email protected]> wrote:

> I have task of recording webrtc local stream audio and webrtc remote
> stream audio.
> I succeeded in recording audio with MediaRecorder (using timeSlice), which
> gives me recorded chunk Blobs (audio/ogg) for the local and remote audio.
>
> Now, before I send this to the server I would like to mix the recorded
> local and remote audio chunks into one chunk using the Web Audio API, and
> this is where I have some problems.
>

It sounds like you're using MediaRecorder to compress the local and remote
audio chunks on the client, and then trying to uncompress them on the
client, mix them, recompress them and send the result to the server. Is
that right? If so, why are you doing the first compression step instead of
just leaving them uncompressed?
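FWIW, once you do have two decoded AudioBuffers, "mixing" is just
sample-wise addition of the channel data — the same summing a Web Audio
graph does automatically when two source nodes connect to one destination.
A minimal sketch (the helper name is mine, not part of any API):

```javascript
// Sample-wise mixer for two mono Float32Arrays of decoded PCM.
// (Hypothetical helper for illustration; in a Web Audio graph this
// summing happens automatically when two nodes feed one destination.)
function mixSamples(a, b) {
  const n = Math.max(a.length, b.length);
  const out = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const s = (a[i] || 0) + (b[i] || 0);   // sum the two streams
    out[i] = Math.max(-1, Math.min(1, s)); // clamp to the valid [-1, 1] range
  }
  return out;
}
```

The clamp is the crudest way to avoid overflow; scaling each input by 0.5
before summing is the usual alternative if clipping is audible.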

> I used these steps:
>
>     1. when both local and remote audio Blobs are available, I'm using
> FileReader to get an ArrayBuffer for each
>
>     2. using AudioContext.decodeAudioData to get an AudioBuffer (here I get
> the errors: "The buffer passed to decodeAudioData contains an unknown
> content type." and "The buffer passed to decodeAudioData contains invalid
> content which cannot be decoded successfully.")
>

Is your initial compression step using timeSlice to produce multiple Blobs
from a single MediaRecorder? If so, those Blobs must be concatenated to get
a single resource which you can decode successfully. E.g. you can't pass
just the second Blob created by a MediaRecorder to
AudioContext.decodeAudioData and expect it to work.
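Concretely, something like this (a sketch, not tested against your setup;
`recorder` and `audioCtx` are assumed to exist already):

```javascript
// Given ALL the timeSlice chunks from one MediaRecorder, in order, build a
// single decodable resource: a later chunk on its own is not a valid Ogg
// file, so concatenate first and decode once.
function assembleRecording(chunks) {
  // Blob concatenation simply joins the chunks' bytes in order.
  return new Blob(chunks, { type: 'audio/ogg' });
}

// Browser usage (sketch):
//   const chunks = [];
//   recorder.ondataavailable = (e) => chunks.push(e.data);
//   recorder.onstop = async () => {
//     const buf = await assembleRecording(chunks).arrayBuffer();
//     audioCtx.decodeAudioData(buf,
//       (audioBuffer) => { /* mix / use the decoded AudioBuffer here */ },
//       (err) => console.error('decode failed:', err));
//   };
```

(Blob.arrayBuffer() saves the FileReader round-trip; FileReader's
readAsArrayBuffer works just as well if you prefer it.)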

Rob
-- 
_______________________________________________
dev-media mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-media
