Greetings -
I’m attempting to encode a sequence of frames with libx264. For testing, I’m
using the sample code from http://www.imc-store.com.au/Articles.asp?ID=276
<http://www.imc-store.com.au/Articles.asp?ID=276>. This is all pretty vanilla
FFmpeg API usage, and is similar to what one might write oneself. It produces
correct output with libx264, however the bitrate I specify is not honored at
all…
I can set the bitrate (m_AVIMOV_BPS) to whatever number I choose, and the
actual bitrate used is some other value:
AVCodec *m_video_codec =
avcodec_find_encoder(m_fmt->video_codec);
if (!(m_video_codec)) {
return;
}
AVStream *st = avformat_new_stream(m_oc, m_video_codec);
AVCodecContext *m_c = st->codec;
m_c->codec_id = m_fmt->video_codec;
m_c->bit_rate = m_AVIMOV_BPS;
Googling this issue, I find that it’s the subject of a lot of discussion.
Evidently, simply setting the bit rate in the context is not sufficient — I can
set m_AVIMOV_BPS to 400 million, but the bitrate used in encoding ends up being
something like 176 Kbps (using the info inspector in one of any number of
players/utilities). As well, visually I see significant undersampling
artifacts. I assume the effective bitrate used is computed by ffmpeg or the
encoder itself, based on some other (default) parameters. But in any case, my
value is being ignored, making the output useless to me.
One forum I post I found (
http://libav-users.943685.n4.nabble.com/Setting-libx264-bitrate-via-API-td4655453.html
<http://libav-users.943685.n4.nabble.com/Setting-libx264-bitrate-via-API-td4655453.html>
) suggests that the reason the encoder (or ffmpeg itself?) is ignoring the bit
rate is because of the use of “pts” and “dts” in the encoder and output stream
writer. Specifically, the encoder’s input should use, say, integer frame
numbers, while the stream writer should use values in its own time base:
My code was sending pictures into the encoder using a pts in the stream's
time_base of 1/90000 (e.g. 3003, 6006, 9009). The solution was to first
rescale the AVFrame's pts from the stream's time_base to the codec time_base
to get a simple frame number (e.g. 1, 2, 3).
pic->pts = av_rescale_q(pic->pts, ost->time_base, enc->time_base);
avcodec_encode_video2(enc, &newpkt, pic, &got_packet_ptr);
Then when a packet is received from the encoder, you need to rescale pts and
dts back to the stream time_base.
newpkt.pts = av_rescale_q(newpkt.pts, enc->time_base, ost->time_base);
newpkt.dts = av_rescale_q(newpkt.dts, enc->time_base, ost->time_base);
av_interleaved_write_frame(out, &newpkt);
However, this is not working for me. I suspect it may be due to other
differences between my code and theirs.. :-(
In any case, surely someone out there has the understanding (and code
snippets?) of how to get the libx264 encoder to honor the specified bitrate?
Any help/pointers/advice/code would be greatly appreciated!
Thanks!
_______________________________________________
Libav-user mailing list
[email protected]
http://ffmpeg.org/mailman/listinfo/libav-user