Hi all,

I'm trying to understand the semantics of AVCodecContext.sample_rate.

When decoding, this field is usually set by the decoder itself, but I
think there are cases where the user has to set it to tell the decoder
the expected sample rate of the encoded data (analogous to the
rawvideo decoder, where you have to explicitly tell the decoder the
size of each video frame).
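For instance, with a headerless format such as raw PCM I would expect
to have to do something like this (just a sketch of my guess, not
tested; I'm assuming the pcm_s16le decoder here):

```c
#include <libavcodec/avcodec.h>

void open_pcm_decoder(void)
{
    /* Sketch: a raw PCM stream carries no header, so (I assume) the
     * caller must supply the sample rate and channel count, just as
     * rawvideo needs width/height set by the user. */
    AVCodec *dec = avcodec_find_decoder(CODEC_ID_PCM_S16LE);
    AVCodecContext *ctx = avcodec_alloc_context();

    ctx->sample_rate = 44100; /* known out-of-band, not from the stream */
    ctx->channels    = 2;

    if (avcodec_open(ctx, dec) < 0) {
        /* handle error */
    }
}
```

Is that the intended use of the field on the decoding side?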

Then, when encoding, the encoder expects 16-bit samples in the input
buffer, but which sample rate does it assume for them?

I think it expects some default sample rate (44100 Hz?); otherwise you
have to tell the encoder about it somehow (and how, in that case?).
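To put it in code, I would have guessed something like the following
before opening the encoder (again only a sketch of my assumption, with
MP2 picked arbitrarily as the codec):

```c
#include <libavcodec/avcodec.h>

void open_audio_encoder(void)
{
    AVCodec *enc = avcodec_find_encoder(CODEC_ID_MP2);
    AVCodecContext *ctx = avcodec_alloc_context();

    ctx->bit_rate    = 64000;
    ctx->sample_rate = 44100; /* is this where the encoder learns the rate? */
    ctx->channels    = 2;

    if (avcodec_open(ctx, enc) < 0) {
        /* handle error */
    }
    /* ...then feed 16-bit interleaved samples at that rate
     * to avcodec_encode_audio(). */
}
```

Is setting the field like this the correct way, or does the encoder
really fall back to a default?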

Many thanks in advance for your advice.
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user
