Hi all,

        I've done some tests on a bunch of songs in different compressed
formats (sample rate = 44100): MP3 and Ogg. For the MP3 format I tested various
bitrates and found that during playback this format has a value of
26.12 milliseconds/frame (meaning that every frame covers 26.12 ms!).

For the Ogg format I tested only a nominal bitrate of 192 kb/s and found a
value of 23.22 milliseconds/frame. These values were measured with an error of
about 10^(-6) seconds.
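
In case it helps, here is my own guess at where these numbers come from (not
something I have verified against the specs): an MPEG-1 Layer III frame decodes
to 1152 PCM samples, and a Vorbis frame at this setting seems to yield about
1024 samples, so:

    # Sanity check of the measured values, in Python.
    # 1152 samples/frame is the MPEG-1 Layer III figure; the 1024 for
    # Vorbis is only my guess, worked backwards from the measurement.
    print(1152 / 44100 * 1000)   # ~26.12 ms, matches my MP3 measurement
    print(1024 / 44100 * 1000)   # ~23.22 ms, matches my Ogg measurement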

        My question: is there a general algorithm to calculate the ms/frame value for all
compressed formats (see the sketch below for the kind of thing I mean)? Could someone
confirm my values? Are these values affected by some parameters (the sample rate,
surely, I think...)? Why different values for Ogg and MP3? Is Ogg affected by the
bitrate?
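
To make the question concrete, this is the kind of helper I would like to write,
assuming the rule is simply frame duration = samples per frame / sample rate.
The samples-per-frame table is my assumption (the Vorbis entry especially, since
Vorbis uses variable block sizes), so please correct it if it is wrong:

    # Sketch of a generic ms-per-frame calculation, assuming every format
    # decodes a fixed number of PCM samples per frame.
    SAMPLES_PER_FRAME = {
        "mp3 (MPEG-1 Layer III)": 1152,      # standard value
        "mp3 (MPEG-2/2.5 Layer III)": 576,   # standard value
        "vorbis (long block?)": 1024,        # my guess only
    }

    def ms_per_frame(fmt: str, sample_rate: int) -> float:
        """Frame duration in ms: samples_per_frame / sample_rate * 1000."""
        return SAMPLES_PER_FRAME[fmt] / sample_rate * 1000.0

    print(ms_per_frame("mp3 (MPEG-1 Layer III)", 44100))  # ~26.12 ms
    print(ms_per_frame("vorbis (long block?)", 44100))    # ~23.22 ms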


Thanks, everyone!

                                     J_Zar
                            Gianluca Romanin 
