On Thu, 11 Nov 2021 at 22:06, wrote:
> You may need a modified version of av_read_frame(). See this:
>
>
>
> https://gist.github.com/bsenftner/ba3d493fa36b0b201ffd995e8c2c60a2
>
>
>
> That’s for an earlier version of the libs, but you can see at line 31, the
> modification is simply providing a
On Tue, 9 Nov 2021 at 15:43, Yurii Monakov wrote:
> You can add "timeout" to the options dictionary (avformat_open_input).
>
> AVDictionary* opt = NULL;
> av_dict_set_int(&opt, "timeout", 1000000, 0); /* value is in microseconds */
> avformat_open_input(..., &opt);
>
> For TCP this will set open/receive timeout to 1 second.
>
> Best
Hi,
I'm trying to stop av_read_frame from blocking when the stream is stopped
(stream over TCP, and the sender pauses sending). Reading various forum
posts, it would appear that I need to use the AVIOInterruptCB structure and
assign it to my format context. So I do this:
/* retrieve stream
I am using av_read_frame to read from an incoming TCP audio stream, that
was opened by avformat_open_input with ("tcp://127.0.0.1:62011?listen") as
the source file.
The server connects to this and sends the data, and av_read_frame returns
frames of audio data that avcodec_decode_audio4 decodes
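A minimal sketch of the interrupt-callback idea discussed above. The deadline bookkeeping is my own illustration, not code from this thread; FFmpeg polls the callback during blocking I/O, and a non-zero return makes calls such as avformat_open_input() and av_read_frame() abort with AVERROR_EXIT.

```c
#include <time.h>

/* Opaque state handed to the callback; the struct name and the
 * deadline scheme are illustrative choices, not FFmpeg API. */
typedef struct {
    time_t deadline; /* absolute wall-clock cutoff */
} TimeoutCtx;

/* Matches the AVIOInterruptCB callback shape: int (*)(void *).
 * Return non-zero to make the blocked libavformat call give up. */
static int interrupt_cb(void *opaque)
{
    const TimeoutCtx *t = (const TimeoutCtx *)opaque;
    return time(NULL) > t->deadline; /* 1 = abort, 0 = keep waiting */
}

/* Wiring it up (sketch, assuming an AVFormatContext *fmt_ctx):
 *   TimeoutCtx t = { time(NULL) + 5 };
 *   fmt_ctx->interrupt_callback.callback = interrupt_cb;
 *   fmt_ctx->interrupt_callback.opaque   = &t;
 * Refresh t.deadline before each av_read_frame() so a paused sender
 * trips the timeout instead of blocking forever. */
```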
On Tue, 12 Oct 2021 at 15:41, drwho wrote:
> One of the audio hifi streaming devices used to drop or insert samples.
> In newer versions they fine tune the audio pll to match the stream rate.
> Beware Sonos has a patent on the behavior you just described. What is your
> application?
>
> Thanks.
Hi,
I'm using the ffmpeg decode engine to receive opus encoded audio over IP
and push it into my buffer which connects to my audio driver (custom
firmware, not a PC). The audio driver expects audio at 48kHz and plays it
at 48kHz locked to its system clock rate. However, the audio coming in is
implying that
avconfig.h is built in the build process?
Thanks,
Simon
On Mon, 30 Nov 2020 at 23:36, Carl Eugen Hoyos wrote:
> On Mon, 30 Nov 2020 at 17:04, Simon Brown wrote:
> >
> > I am currently getting a compiler error saying libavutil/avconfig.h not
> found
I am currently getting a compiler error saying libavutil/avconfig.h not
found.
This is included from libavutil/common.h
I've just downloaded the latest git, and it's the same, avconfig.h not
found. And I've searched on github and there common.h still includes
avconfig.h and avconfig.h doesn't
On Fri, 26 Jun 2020 at 10:07, Info || Non-Lethal Applications <
i...@non-lethal-applications.com> wrote:
> With Apple’s recent announcement of moving from Intel to their own (ARM
> based) processors, I’m wondering what this means for FFmpeg users.
> I’m using the FFmpeg libraries as base for a -
>
>
> Your first attempt - iiuc - was not to use the skip_frame option for h264
> and
> vc1 input but to re-implement it in your code (which is technically
> difficult or
> impossible because only while decoding can you decide what to drop).
> I suggest you use the option instead of
>
>
>
> You can do --enable-decoder=aac,h264,vc1 to make the configure line more
> readable.
>
> Thanks - it's good to learn even basic things like this!
> libavcodec's skip_frame option works fine for h264 and vc1; I strongly
> suggest
> that you don't try to reimplement it after testing with
>
>
> But the question was how you compiled FFmpeg.
>
> > I tried skipping frames, but then it had too many errors
>
> Can you reproduce these errors with ffmpeg (the command line interface)?
> (Assuming you tested the skip_frame option)
>
> Carl Eugen
>
> Thank you Carl, and apologies for
On Fri, 17 Apr 2020 at 22:50, Carl Eugen Hoyos wrote:
> On Fri, 17 Apr 2020 at 20:38, Simon Brown wrote:
>
> > Thanks Devin, found the right place, but finding frames to skip seems to
> be taking more CPU time.
> >
> > Is there any way of building ffmpe
On Fri, 17 Apr 2020 at 13:53, Devin Heitmueller
wrote:
> Not through the standard APIs, as implementations of FIFOs within
> input modules vary depending on the input. In my case (which is the
> UDP source), I added instrumentation to libavformat/udp.c which once a
> second calls
I am trying to decode a live stream using the libav libraries from ffmpeg.
I open a udp input using avformat_open_input.
I get the stream information and open_codec_contexts for the video and
audio streams.
Then, when everything is set up, I go through a loop of:
av_read_frame
decode_packet
Hi,
Is there any way to specify the number of threads for the h264 decoder when
you call avcodec_decode_video2?
I'm trying to speed up the decoder on an embedded ARM and although it could
decode the sintel animation at full speed (in PAL) it's now struggling with
real live video.
Cheers,
Simon
On Tue, 3 Mar 2020, 22:25 Carl Eugen Hoyos, wrote:
> On Tue, 3 Mar 2020 at 19:02, Simon Brown <simon.k.br...@gmail.com> wrote:
>
> > Can anyone help with a) getting a bitrate measure from the libav
> libraries?
>
> I may misunderstand but in gener
Hi,
If I use ffmpeg to produce -f ldash it takes a long time to sync, and I
need to first produce an h264 output stream and then feed that back into
ffmpeg to pass to the dash encoder. I have an h264 stream coming in, and I
don't want to re-encode.
So I now want to use code written based on the
>
>
> Ok, next problem - I have parsed all the command line parameters and
> assigned them to the output context using:
>
> ret = avformat_write_header(ofmt_ctx, &opt);
>
> where opt contains the options dictionary. I know it is accepting this
> because the returned dictionary is empty. Also, I
On Wed, 26 Feb 2020 at 11:28, Simon Brown wrote:
> See the "remuxing.c" example.
> Use avcodec_parameters_copy to copy the codecpar from the demuxer's
> AVStream to the muxer's one.
>
> Thanks James - that's exactly what I needed. Now I just need to pass all
the parameters I need into the dash muxer and I'll be sorted.
Cheers,
Simon
I'm trying to convert an incoming transport stream to DASH without
re-encoding.
I've done this using two separate ffmpeg commands and it works, but as each
instance of FFmpeg spends time locking onto the stream, finding the first
I-frame, etc the resulting latency is unacceptable. I thought I
drop=0 speed=0.373x
video:326700kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB
muxing overhead: 0.00%
Regards,
Simon Brown
___
Libav-user mailing list
Libav-user@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/libav-user
To unsubscribe,
Hi,
I'm trying to run some code to decode FFmpeg frames and output them in a
custom format. I have started with the demux_decode sample file and it
runs, and decodes frames. However, I'm running on a Raspberry Pi 4 and
have built FFmpeg and libraries from the latest git head with the MMAL
On Tue, 13 Aug 2019 at 13:25, Carl Eugen Hoyos wrote:
> "..." is the important part here as libavformat is not self-contained.
> libavutil is a requirement for all other FFmpeg libraries, libavcodec
> is also needed for libavformat.
>
> Carl Eugen
>
> Thank you Carl, yes - I am aware I needed
I have a modified version of the demuxing_decoding.c program (with ffmpeg
from the latest git head), that I copied into a C++ file (to link it with
other software that requires C++). The program compiles, but when linking
it fails to find the libraries, so produces undefined references to all the
>
>
> Thanks,
> If I understand what you said, I have to proceed like this:
>
> //Declaration of yuv arrays
> uint16_t yval[mb_height * mb_width][256];
> uint16_t uval[mb_height * mb_width][256];
> uint16_t vval[mb_height * mb_width][256];
>
> for(int mby=0; mby < mb_height; mby++) {
> for(int mbx=0; mbx < mb_width; mbx++)
On Tue, 9 Apr 2019 at 14:43, NDJORE BORIS wrote:
>
> Thank you simon.
>
> You are probably right. The first loop is to obtain all macroblocks in the
> frame and the second is to obtain each pixel in the macroblock, I think.
> But in what you gave, if I do this for two different blocks, I think that I
On Tue, 9 Apr 2019 at 14:20, NDJORE BORIS wrote:
> Hello guys,
>
> I am trying to find the pixel values for a given macroblock in a frame.
> What I did is the following :
>
> //for all macroblock in this frame
> for(int mby=0; mby < mb_height; mby++) {
> for(int mbx=0; mbx < mb_width; mbx++) {
> int xy =
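The quoted loops are truncated in the archive; as my own illustration (not code from the thread), the index arithmetic they are presumably building, assuming 16x16 macroblocks and a luma plane of stride linesize, is:

```c
/* Sketch: map macroblock coordinates (mbx, mby) plus an in-block
 * pixel offset (px, py) to a byte offset in the luma plane.
 * Assumes 16x16 macroblocks; linesize is the plane's row stride. */
static int mb_pixel_offset(int mbx, int mby, int px, int py, int linesize)
{
    int x = mbx * 16 + px; /* absolute column in the plane */
    int y = mby * 16 + py; /* absolute row in the plane */
    return y * linesize + x;
}
```

Indexing the frame's data[0] with this offset yields the Y value; the chroma planes additionally need the subsampling factors of the actual pixel format folded in.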
>
>
>
> Ok, Thanks!
>
> But can you tell me how to get the frame pixel format, please?
> I use frame->format, but the output is always zero (0).
>
> Regards
>
>>
>> I have:
av_get_pix_fmt_name(frame->format)
to get a format name, but also:
pix_fmt = video_dec_ctx->pix_fmt;
This should
On Fri, 5 Apr 2019 at 12:08, NDJORE BORIS wrote:
> Hello,
>
> Can someone help me to find the value of each pixel on a given frame,
> please.
> I use frame.data[] and frame.linesize[].
> Then I do this :
>
> for(int i = 0; i < height; i++)
> {
>
> for(int j = 0; j < width; j++) {
>
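One detail the snippet above has to get right: rows in an AVFrame plane are linesize bytes apart, and linesize is usually larger than the visible width because of alignment padding. A self-contained sketch of the addressing (my own illustration, assuming 8-bit samples):

```c
#include <stdint.h>

/* Read the sample at column x, row y of a plane whose rows are
 * 'linesize' bytes apart. Row y starts at data + y * linesize,
 * never at y * width. */
static uint8_t pixel_at(const uint8_t *data, int linesize, int x, int y)
{
    return data[y * linesize + x];
}
```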
2016-10-03 14:26 GMT+02:00 Alex Grosu :
> > I am just calling avcodec_alloc_context(). This sets thread_count
> > to 1. I tried setting it to 4, but it seems there is no difference.
>
> When I asked about increasing thread count, someone helpfully pointed me
at this thread:
>
> With ffmpeg or writing code to use the libs?
> Probably get more ffmpeg help over in ffmpeg-u...@ffmpeg.org
With writing code using the libs - I can get it to work with ffmpeg - so I
know it's possible!
>
>> For my use though, I don't want to wade through ffmpeg.c to find what
>> it's
On 23 September 2016 at 09:27, Simon Brown <simon.k.br...@gmail.com> wrote:
> Hi,
> I'm trying to serve a transport stream generated by hardware out as an
> RTP stream. Is there an example I can use to help me do this? I've been
> looking through ffserver.c but that is muc
Hi,
I'm trying to serve a transport stream generated by hardware out as an
RTP stream. Is there an example I can use to help me do this? I've been
looking through ffserver.c but that is much more complicated than I need,
and seems to be largely aimed at serving files out.
Cheers,
Simon
On 8 September 2016 at 14:12, Carl Eugen Hoyos wrote:
> 2016-09-07 17:59 GMT+02:00 Kristijonas Malisauskas :
> > First thing I would suggest doing is setting your
> > AVCodecContext to use more threads
> >
> > for ex: videoCtx->thread_count = 4;
>
> I
CPU, I think.
Cheers,
Simon
On 8 September 2016 at 11:44, Nikita Orlov <nikitos1...@yandex.ru> wrote:
> I see. Do you have exactly model name of cpu?
>
> 08.09.2016, 13:40, "Simon Brown" <simon.k.br...@gmail.com>:
> > That's excellent - thank you. I had change
ijonas Malisauskas <k...@sportcaster.dk>
>> wrote:
>>
>> First thing I would suggest doing is setting your AVCodecContext to use
>> more threads
>>
>> for ex: videoCtx->thread_count = 4;
>>
>> On Wed, Sep 7, 2016 at 3:05 PM, Simon Brown <simon
I'm running a decode operation using doc/examples/demuxing_decoding.c. I'm
running on an ARM A9 with NEON enabled. Sending a 1280x720p stream at
24fps, it can only decode about 15fps. Is there any way of speeding up the
decode operation?
Regards,
Simon