On Tuesday 28 October 2014 09:42:15 am Rio Kierkels wrote:
I'll be making an effort to contribute to the project
Just to make sure:
All contributions (code, documentation, reproducible bug
reports and financial contributions) for FFmpeg are very
welcome (and needed)!
To the best of my
On Monday 27 October 2014 04:02:24 pm Chaganti, Ravi (RE-CON) wrote:
OS is Oracle Solaris 10.
Please install a compiler on your system or pass the compiler path
with --cc=.
root@ # ./configure
--extra-cflags="-fpic -fPIC"
Remove this.
--extra-ldflags="-L/usr/local/lib -R/usr/local/lib"
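When configure cannot find a compiler on its own, it can be pointed at one explicitly with --cc=. A minimal sketch for Solaris 10, assuming gcc was installed under /usr/local (the paths are assumptions, adjust to the actual install):

```shell
# Hypothetical gcc location; adjust PATH / --cc= to the real install.
export PATH=/usr/local/bin:$PATH
./configure --cc=/usr/local/bin/gcc \
    --extra-ldflags="-L/usr/local/lib -R/usr/local/lib"
```

Note the quotes around the multi-word --extra-ldflags value, so the shell passes it as a single argument.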
On Monday 27 October 2014 08:36:37 pm Paulo Fidalgo wrote:
On 27/10/14 00:04, Carl Eugen Hoyos wrote:
Paulo Fidalgo paulo.fidalgo.pt at gmail.com writes:
After making another test with lame and ffmpeg
Please test the following:
$ lame -V 0 -q 0 2L38_01_96kHz.wav
Thank you for your help Moritz, I tried your suggestion and I can say it
works.
I'm now trying to implement the same thing using WebM streaming, with
the same configuration, just replacing the Format and VideoCodec:
Stream test.webm # Output stream URL definition
Feed
But shouldn't the ffserver configuration file provide the encoding options,
namely VideoCodec libvpx?
With the ffmpeg command I'm just feeding the raw video stream...
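For reference, in ffserver the encoding options do live in the Stream section of ffserver.conf; the feeding ffmpeg command only has to deliver raw frames to the feed. A minimal WebM sketch (feed name, frame size, and bitrate are assumptions, not taken from the thread):

```
<Stream test.webm>
    Feed feed1.ffm          # must match a <Feed> section defined earlier
    Format webm
    VideoCodec libvpx
    VideoSize 640x480
    VideoBitRate 500
    NoAudio
</Stream>
```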
On Tue, Oct 28, 2014 at 12:28 PM, Moritz Barsnick barsn...@gmx.net wrote:
On Tue, Oct 28, 2014 at 12:00:46 +, Ricardo Mota wrote:
Hello ffmpeg-users,
I am currently working on a project where I would like to take a series of
images and create a video; however, I would like each image to be displayed
for a certain, but different, amount of time.
For example, I have a series of 3 images, image1, image2, and image3. I would
Hi,
Using the command line below,
./ffmpeg -f s16le -i ~/file.raw -acodec mlp -b:a 18m -r 29.97 -strict
unofficial -sample_fmt s16 ~/final.mlp
I got the following error:
ffmpeg version 2.4.git Copyright (c) 2000-2014 the FFmpeg developers
built on Oct 27 2014 22:58:52 with gcc 4.8 (Ubuntu/Linaro
On septidi 7 Brumaire, year CCXXIII, Ryan Vincent wrote:
I am currently working on a project where I would like to take a series of
images and create a video; however, I would like each image to be displayed
for a certain, but different, amount of time.
You can do that with the concat demuxer:
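A sketch of the concat demuxer approach, using its per-entry `duration` directive (filenames and durations below are placeholders):

```
# list.txt -- ffconcat input list with a duration per image
ffconcat version 1.0
file 'image1.png'
duration 3.0
file 'image2.png'
duration 5.0
file 'image3.png'
duration 2.0
file 'image3.png'
```

The last file is listed twice because the demuxer otherwise drops the final duration. Then something along the lines of: ffmpeg -f concat -i list.txt -vsync vfr -pix_fmt yuv420p out.mp4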
This thread came to my attention today:
http://ffmpeg.org/pipermail/ffmpeg-user/2014-July/022377.html
Although the problem is solved, I want to point to the root cause, which I
reported a long time ago, here:
https://code.google.com/p/android/issues/detail?id=38423
Because Android build system
-filter_complex
[0:a][1:a]amerge[aout] -map [aout] output.m4a -report
Thanks,
Mark
Full log output:
ffmpeg started on 2014-10-28 at 17:05:14
Report written to ffmpeg-20141028-170514.log
Command line:
ffmpeg -y -i rtsp://172.24.0.31:554/axis-media/media.amp?camera=1 -i
rtsp://172.24.0.32:554
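Spelled out with hypothetical local input files instead of the RTSP sources, an amerge command of the shape quoted above might look like this (input names, codec, and stream indices are assumptions):

```shell
# input0.mp3 / input1.mp3 are placeholders for the two audio sources
ffmpeg -i input0.mp3 -i input1.mp3 \
       -filter_complex "[0:a][1:a]amerge=inputs=2[aout]" \
       -map "[aout]" -c:a aac output.m4a -report
```

Quoting the filtergraph and the -map label protects the square brackets from shell globbing.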
Diogo Serrano diogopmserrano at gmail.com writes:
ffmpeg -y -i http://URL/File.mov -c:v libx264 -f mp4 bla.mp4
(very slow encode, and the QuickTime format does lots of
206 HTTP requests to my server to process the video).
Complete, uncut console output missing.
Did you run qt-faststart on
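When the moov atom sits at the end of an MP4/MOV file, players over HTTP have to issue many ranged (206) requests before playback can start. A hedged sketch of moving it to the front, either after the fact with qt-faststart or at encode time (filenames are placeholders):

```shell
# Move the moov atom to the front of an existing file
qt-faststart bla.mp4 bla-fast.mp4

# Or do it during the encode with ffmpeg's own flag
ffmpeg -i input.mov -c:v libx264 -movflags +faststart bla-fast.mp4
```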
Hi all,
I am using the FFmpeg libraries to mux H.264 and AAC frames into a Matroska
(.mkv) file. I can do that both using the command line and a C program.
Now, instead of writing the muxed Matroska data into a file, I want to
write this muxed data directly to a socket or pipe. My actual goal is
to write a
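On the command line, sending the muxed Matroska stream to stdout and piping it onward can be sketched as follows (input name, host, and port are assumptions):

```shell
# Mux to Matroska on stdout; nc forwards the bytes to a remote socket
ffmpeg -i input.h264 -c copy -f matroska pipe:1 | nc 192.168.0.10 9000
```

In the C program the equivalent is opening the output AVIOContext on a pipe: URL, e.g. avio_open(&fmt_ctx->pb, "pipe:1", AVIO_FLAG_WRITE), or supplying a custom AVIOContext whose write callback sends each buffer to the socket.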