Hello all,

        I've noticed that encoding/decoding an mp3 introduces a timeshift.  For 
example, encoding a 136,044 byte 44.1/16-bit WAV and then decoding and resampling 
gives a 136,074 byte 44.1/16-bit WAV.  The surprising thing is that the extra data 
seems to be inserted at the *beginning*, not the end (where I would've guessed extra 
silence may have gone to fill out a frame or something).  I can tell this because 
taking the mathematical difference of the two produces a WAV that (aurally) matches them.
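
In case it helps anyone reproduce this, here's a rough sketch of how the offset could be measured programmatically instead of by ear: slide the decoded stream against the original and find the shift that minimizes the sample-wise difference. The arrays below are synthetic stand-ins for the two files' sample streams (the 576-sample shift in the demo is just an illustrative value, not a claim about what any particular encoder does):

```python
# Sketch: locate the offset (in samples) at which a decoded signal best
# aligns with the original, by brute-force minimum-difference search.

def find_offset(orig, decoded, max_shift=2000):
    """Return the shift of `decoded` relative to `orig` that minimizes
    the summed absolute difference over the overlapping region."""
    best_shift, best_err = 0, float("inf")
    for shift in range(max_shift + 1):
        n = min(len(orig), len(decoded) - shift)
        if n <= 0:
            break
        err = sum(abs(orig[i] - decoded[i + shift]) for i in range(n))
        if err < best_err:
            best_err, best_shift = err, shift
    return best_shift

# Synthetic demo: pretend the codec prepended 576 samples of silence.
orig = [((i * 37) % 101) - 50 for i in range(5000)]
decoded = [0] * 576 + orig
print(find_offset(orig, decoded))  # -> 576
```

Run against the real sample streams (e.g. read via the `wave` module), this would report the inserted leading delay directly.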

        Is there a way to predict this timeshift?  Does anyone know whether it 
varies with the encoding parameters or is constant?

Ross Vandegrift
[EMAIL PROTECTED]
_______________________________________________
mp3encoder mailing list
[EMAIL PROTECTED]
http://minnie.tuhs.org/mailman/listinfo/mp3encoder
