On 06.03.2019 at 21:53, Paul B Mahol wrote:
On 3/6/19, Michael Koch wrote:
On 06.03.2019 at 20:45, Michael Koch wrote:
Your interpretation of test results is invalid.
You should count frames.
Well, I ran the video in VLC and measured the duration with a
stopwatch; it is 5 seconds.
I'd like to try some tweaks to the fillborders filter, so I need some
16-bit-per-plane images for testing. Can you help me with that?
You can make 16-bit images with the latest version of GIMP. Use "Export
As", set *.png as the filename, then select "16bpc RGB" as the pixel
format, then click on "
ASTRO ELECTRONIC Dipl.-Ing. Michael Koch
Raabestr. 43 37412 Herzberg
www.astro-electronic.de
Tel. +49 5521 854265 Fax +49 5521 854266
___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
On 09.03.2019 at 19:14, Gerardo Ballabio wrote:
Hello all,
I have a video that was filmed under bad lighting conditions: the
background is too bright and the people in the foreground are dark. I'm
trying to use ffmpeg to correct it. Please kindly help me.
For video processing like this you best us
On 10.03.2019 at 08:22, Venkateswaran.S wrote:
Hi all,
I want to detect glitches in my video. Is there any filter to achieve this?
What does the glitch look like? If it's black, you could try the
blackdetect or blackframe filters.
Michael
On 10.03.2019 at 21:33, Gerardo Ballabio wrote:
Thank you Michael -- but that means I still have to find good values
of brightness, contrast, gamma, saturation and hue to use with Gimp.
When I've done that, then couldn't I just apply the same values in
ffmpeg using the "eq" filter?
That won't
On 10.03.2019 at 22:05, Michael Koch wrote:
On 10.03.2019 at 21:33, Gerardo Ballabio wrote:
Thank you Michael -- but that means I still have to find good values
of brightness, contrast, gamma, saturation and hue to use with Gimp.
When I've done that, then couldn't I just apply the s
On 11.03.2019 at 09:01, Gerardo Ballabio wrote:
Ok, thank you Michael. I'll try haldclut then. This means I have to do
some research into Gimp too...
For the described workflow you can also use any other graphics program,
if it can read and write uncompressed or lossless compressed images
(l
ffmpeg -i input.webm -c copy -metadata:s:v:0 rotate=90 output.webm
As far as I know, rotating without re-encoding isn't possible. Try this:
ffmpeg -i input.webm -vf rotate=PI/2 output.webm
Michael
On 23.03.2019 at 12:00, Rex East wrote:
On Sat, Mar 23, 2019 at 7:46 PM Michael Koch
wrote:
As far as I know, rotating without re-encoding isn't possible. Try this:
ffmpeg -i input.webm -vf rotate=PI/2 output.webm
Thank you Michael for the reply and information.
However, that co
Hi all,
I'd like to make a suggestion for improvement. It's easy to make a video
from many pictures, but it's difficult to make a video which runs in
reverse order. I propose a new parameter "-number_decrease" which tells
ffmpeg to decrease the file number, instead of increasing it.
Example:
second.
Michael
On 27.01.2016 at 13:21, pra...@devrepublic.nl wrote:
Hi Carl,
I am running this command
ffmpeg -framerate 1/5 -pattern_type glob -i
'/home/devprj01/domains/devrepublic-projects.nl/public_html/development/houseview2/uploads/projects/52/images/temp/*.jpg'
-i
'/home/devprj01/domains/devrepubl
Hi all,
I found a bug in the documentation of the lenscorrection filter:
https://www.ffmpeg.org/ffmpeg-all.html#lenscorrection
The documentation states "0.5 means no correction" for the coefficients k1 and
k2.
This can't be right, or the formula is wrong. It's obvious that the
formula makes no corre
Hi all,
I'd like to make a deep zoom-in from several images. For example, begin
with image1 (which was taken with a 50mm objective) and use the zoompan
filter to make a 5 seconds zoom-in from 1x to 4x, then use image2 (taken
with a 200mm objective) and continue to zoom in, and so on. There may
Paul,
Something like this:
zoompan=zoom='if(mod(in\,40)\,zoom\,0)+0.01':d=40:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)'
thank you very much for this hint. The _in_ parameter does indeed work.
With this parameter it's for example possible to make a slideshow and
specify different durations fo
Hi all,
I have a problem with the zoompan filter. The video is jerking around.
This is especially visible when the zoom factor is quite high, as in the
below example, where it varies over 100 frames from 5.0 to 5*1.005^99=8.19
Another question: what's the meaning of the "deprecated pixel format
ffmpeg -i Baum.JPG -vf
"scale=iw*8:ih*8,zoompan=z='if(eq(zoom,1),5,zoom*1.005)':d=100:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)'"
-c:v mpeg4 -s 1800x1200 -b:v 5M -q:v 2 test.mp4
Error message: Picture size 43776x29184 is invalid. Failed to inject
frame into filter network: Cannot allocate memory
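The zoom progression in the command above is geometric: starting at 5.0 and multiplying by 1.005 on each of the remaining 99 frames ends at 5*1.005^99 ≈ 8.19, as stated. A quick sketch to verify:

```python
# Geometric zoom progression from the zoompan example above:
# start at zoom 5.0, multiply by 1.005 on each of the remaining 99 frames.
zoom = 5.0
for _ in range(99):
    zoom *= 1.005

print(round(zoom, 2))  # -> 8.19
```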
Ugh, then you need to crop it and scale the cropped picture.
That's a workaround which has other problems...
The jerking must be a bug in ffmpeg. I just made this test:
1. Use a 5472x3548 PNG picture as input. Jerking is barely visible, only
about 1 pixel.
2. Convert the PNG picture to JPG and use
2. It would be nice to have an expression for a lookup table, for example
lookup(x, y0, y1, y2, ... , z)
Evaluate x; if the result is 0 return y0, if the result is 1
return y1, and so on; if the result is out of range, return z.
I found a workaround for a lookup table. The f
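The proposed lookup() semantics can be sketched as follows (a hypothetical helper illustrating the feature request, not an existing FFmpeg expression function):

```python
def lookup(x, ys, z):
    """Proposed lookup-table semantics: evaluate x; if the integer
    result indexes into ys, return that entry; otherwise return the
    out-of-range value z."""
    i = int(x)
    if 0 <= i < len(ys):
        return ys[i]
    return z

print(lookup(1, [10, 20, 30], -1))  # -> 20 (in range)
print(lookup(5, [10, 20, 30], -1))  # -> -1 (out of range)
```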
Hello all,
let's assume I have two images and want to see them alternating at
framerate 1:
image1, image2, image1, image2, and so on, stopping after 60 seconds.
I could make 60 copies of the images and rename them with numbers from 0
to 59.
Is there an easier way to do this, without making s
ffmpeg -r 1 -loop 1 -i cherry-left.jpg -r 1 -loop 1 -i
cherry-right.jpg -lavfi
[0:v]setpts=N*2[a],[1:v]setpts=N*2+1[b],[a][b]interleave /tmp/o.nut
I didn't understand the meaning of the output path, but finally I got it
working after I replaced
/tmp/o.nut
by
-t 10 -r 4 test.mp4
Thanks,
Michael
I found a bug in the documentation of the lenscorrection filter:
https://www.ffmpeg.org/ffmpeg-all.html#lenscorrection
It's written there "0.5 means no correction" for the coefficients k1
and k2.
This can't be right, or the formula is wrong. It's obvious that the
formula makes no correction i
I can't help you much, but you are free to look at the source code, and
see whether you can see the formula from it:
https://github.com/FFmpeg/FFmpeg/blob/master/libavfilter/vf_lenscorrection.c
I did already try that, but there are no comments in the source code,
and it's really difficult to u
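From my own reading of vf_lenscorrection.c, the mapping appears to scale each radius by a factor 1 + k1·r² + k2·r⁴ (with r the normalized distance from the distortion center); this is an assumption about the source, not something the documentation confirms. Under that reading, k1 = k2 = 0 gives the identity mapping, i.e. no correction, not 0.5:

```python
def correction_factor(r, k1, k2):
    # Assumed lenscorrection model (my reading of the source, unverified):
    # scale = 1 + k1*r^2 + k2*r^4, with r the normalized radius.
    r2 = r * r
    return 1.0 + k1 * r2 + k2 * r2 * r2

# With k1 = k2 = 0 the factor is 1 everywhere -> no correction.
print(correction_factor(0.5, 0.0, 0.0))  # -> 1.0
```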
On 23.02.2016 at 10:00, Michael Koch wrote:
I found a bug in the documentation of the lenscorrection filter:
https://www.ffmpeg.org/ffmpeg-all.html#lenscorrection
It's written there "0.5 means no correction" for the coefficients k1
and k2.
This can't be right, or the fo
On 23.02.2016 at 18:51, Michael Koch wrote:
r_src is the radial coordinate in object space.
r_tgt is the radial coordinate in the image, which was taken with a
distorted lens.
Sorry, I confused these two. This is correct:
r_tgt is the radial coordinate in object space.
r_src is the radial
Please consider sending a patch that updates the documentation.
I recommend making the following changes in chapter 38.73.1:
The filter accepts the following options:
cx
Relative x-coordinate of the center of the distortion. This value
has a range [0,1] and is expressed as fractions of
Thanks for advice, I've tried running the following command, but I get an
error as follows:
./ffmpeg -framerate 25 -pattern_type glob -i
/root/video-source/gif/yeay.gif /root/video-source/gif/outgif/outyeay.mp4
Do you have many gif images, or do you have one animated gif?
The answer from Moritz
Great, I thought, and since I will need a day's worth of pictures (1 every
minute gives 1440 pictures) I'll rename them 0601.jpg to 1200.jpg and try
ffmpeg.exe -r 30 -i C:\Users\kostas\Documents\cams\timelapse\test2\%04d.jpg
timelapse9.mp4
You forgot to specify -start_number 601
Michael
Hi all,
over the last months I've collected a list what's missing in FFmpeg's
documentation.
At the end of the list there are a few feature requests.
"amplify" filter:
-- Many options are in the [0..65535] range. It's unclear how these
options must be set for 8-bit videos. Is it [0..255] in t
On 15.04.2019 at 14:55, Carl Eugen Hoyos wrote:
2019-04-15 14:45 GMT+02:00, Michael Koch :
over the last months I've collected a list what's missing in
FFmpeg's documentation.
While some of your suggestions will not be accepted (for example
to document for some filters t
On 15.04.2019 at 15:59, Paul B Mahol wrote:
Feature request:
-- Decay filter, makes short flashes of light appear longer, with an
exponential decay curve. Like on an analog oscilloscope screen with long
persistence time. Could be used for videos of the night sky to make fast
meteors appear lon
While some of your suggestions will not be accepted (for example
to document for some filters that they can be concatenated),
Why not? atempo does it.
I suspect nearly all filters can be concatenated and given that the
documentation is already hard to read, I don't think it should be
mentioned
Hi Gyan,
Contrast can go from -1000 to 1000. The range was expanded a few years
ago; the doc wasn't updated.
Do note that what contrast does is scale the distance of a pixel's
value from the median value, i.e. 128 for an 8-bit input. So, if a pixel
channel has a value of 100, then a contrast o
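Gyan's description can be checked with a small sketch, assuming the model is output = median + (value - median) × contrast, clipped to [0, 255] for 8-bit input (the clipping is my assumption; the message is truncated before the worked example):

```python
def apply_contrast(value, contrast, median=128):
    # Assumed model of the eq filter's contrast option for 8-bit input:
    # scale the pixel's distance from the median, then clip to [0, 255].
    out = median + (value - median) * contrast
    return max(0, min(255, round(out)))

# A pixel value of 100 is 28 below the median; contrast 2 doubles
# that distance: 128 - 56 = 72.
print(apply_contrast(100, 2.0))  # -> 72
```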
On 23.04.2019 at 11:43, Jon bae wrote:
Hello,
I just have a general question:
As far as I know, denoisers are specialized for specific noise patterns.
Does it make a difference if I denoise after scaling a video, or is the
quality better when I denoise before scaling?
In terms of speed it would be better
On 24.04.2019 at 10:43, Ulf Zibis wrote:
On 23.04.19 at 11:43, Jon bae wrote:
Does it make a difference if I denoise after scaling a video, or is the
quality better when I denoise before scaling?
In terms of speed it would be better for me, denoising after scaling, but
when the quality is better
Hi,
I'd like to report a problem when loading large PGM files, which are
required for the remap filter.
These files load extremely slowly in the latest ffmpeg version,
much slower than a few weeks ago.
I found out that the problem began between April 10 and 11.
The PGM file that I use
, all these are much
slower.
I'm testing on the same computer with the same command line and same
input file, of course.
On my computer it's reproducible.
Michael
On 27.04.2019 at 13:07, Carl Eugen Hoyos wrote:
2019-04-27 0:41 GMT+02:00, Michael Koch :
I'd like to report a problem when loading large PGM files,
which are required for the remap filter.
These files are loading extremely slow in the latest ffmpeg version.
Much slower than a few
Feature request:
-- Decay filter, makes short flashes of light appear longer, with an
exponential decay curve. Like on an analog oscilloscope screen with long
persistence time. Could be used for videos of the night sky to make fast
meteors appear longer. Should work as follows:
For each pixel do
On 15.04.2019 at 18:10, Gyan wrote:
Hi Michael,
I'll look at these in detail later on, but just to touch on one point..
On 15-04-2019 06:15 PM, Michael Koch wrote:
"eq" filter:
-- It might be helpful to add a note that multiple eq filters can be
cascaded, for exam
On 13.05.2019 at 12:04, Ben wrote via ffmpeg-user:
From *.jpg pictures I know that there is a hidden header which
contains e.g. info about the original creation time of the picture/photo.
Is there something similar in *.MP4 video files?
yes, and you can view the data with exiftool:
https://w
Or the fps filter.
This filter and the r option have different algorithms that have advantages
(and disadvantages) in different situations.
Are the differences explained somewhere in the documentation?
Michael
Hi,
in the documentation for the blend filter is written:
"By default, the output terminates when the longest input terminates."
What's used as input when the shortest input has terminated? Is the last
frame of the shorter input duplicated?
Is there a trick to let the output terminate when the
Hi all,
I want to delay a video in a filter chain by 5 seconds, and then combine
it with the original video with hstack. I use the shortest=1 option, so
that the output contains only the duration where both videos overlap:
ffmpeg -f lavfi -i testsrc=duration=10:size=vga:rate=25 -filter_comple
On 10.06.2019 at 13:11, Paul B Mahol wrote:
Can not reproduce.
Make sure that audio is not taken in output duration.
I did remove the audio track from the input video, but the problem is
still the same.
Provide your input file by uploading it somewhere.
http://www.astro-electronic.de/308.m
On 10.06.2019 at 15:43, Paul B Mahol wrote:
On 6/10/19, Michael Koch wrote:
On 10.06.2019 at 13:11, Paul B Mahol wrote:
Can not reproduce.
Make sure that audio is not taken in output duration.
I did remove the audio track from the input video, but the problem is
still the same.
Provide
I want to delay a video in a filter chain by 5 seconds, and then
combine it with the original video with hstack. I use the shortest=1
option, so that the output contains only the duration where both
videos overlap:
ffmpeg -f lavfi -i testsrc=duration=10:size=vga:rate=25
-filter_complex
s
Hi all,
I just tried this command line and it crashes FFmpeg:
ffmpeg -i meteor308.mp4 -vf fade=in:0:0.5 -q:v 1 -y 308.mp4
I already know what's wrong. I thought the fade duration is expressed in
seconds. But (with this syntax) it must be expressed in frames and must
be an integer. It would be
On 29.07.2019 at 06:57, Gyan wrote:
On 28-07-2019 11:04 PM, Michael Koch wrote:
Hi all,
I just tried this command line and it crashes FFmpeg:
ffmpeg -i meteor308.mp4 -vf fade=in:0:0.5 -q:v 1 -y 308.mp4
I already know what's wrong. I thought the fade duration is expressed
in seconds
I am facing an issue while cropping video. I am using the following
command to crop; it works perfectly for landscape videos, but
portrait videos are always cropped to the top.
Please help with this.
-y -ss 00:00.0 -i input.mp4 -t 00:06.235 -vf crop=480:480:240:0:exact=0 output.mp
Hello all,
I want to hstack a video with a delayed version of itself.
First test:
ffmpeg -f lavfi -i testsrc2=size=600x400:duration=10 -y test1.mp4
This produces a video of 10 seconds length. The clock is running from 0
to 10, as expected.
Second test:
ffmpeg -f lavfi -i testsrc2=size=600
Hi Moritz,
Without actually trying to reproduce, there is one obvious (to me)
error in your command line:
setpts=PTS-4[3]
This isn't doing what you expect it to. PTS is in "ticks", i.e. certain
incremental units, but normally not in seconds. That timebase is
"1/TB". TB is what you see as "tb
Hello all,
this is not a question. During the last few years I've tested many
things with FFmpeg, and now I've written it all in this document:
http://www.astro-electronic.de/FFmpeg_Book.pdf
It contains many examples for FFmpeg and a few other programs, also for
audio processing, color grading
Hi Rik,
It might be worth putting this online as a separate website (kinda like an
additional wiki), which would make it easier to update and folks would
always have the latest version/info.
Just save the link to my document, and when you need the latest version
just download it again.
There
Hi Rik,
It might be worth putting this online as a separate website (kinda
like an
additional wiki), which would make it easier to update and folks would
always have the latest version/info.
Just save the link to my document, and when you need the latest
version just download it again.
Ther
Hello,
I'm using the following batch file (Windows 7) for a live ultrasound
conversion.
set "SR=44100" :: Sample rate
set "F=14000" :: Subtracted frequency
set "VOL=10" :: Volume factor
set /a "N=4096*%F%/%SR%" :: N = 4096 * F / SR
c:\ffmpeg\ffmpeg -f
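The integer arithmetic in the batch file above can be verified; cmd.exe's `set /a` truncates the division, so for the values shown N comes out to 1300:

```python
SR = 44100            # sample rate, as set in the batch file
F = 14000             # subtracted frequency
N = 4096 * F // SR    # same truncating integer division as cmd.exe's "set /a"
print(N)              # -> 1300
```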
Hello Paul,
ffplay with a pipe gives you a huge delay. By using mpv and the filtergraph
directly you would get a much smaller delay.
The default delay introduced by this filter is the one set by win_size in
number of samples, which is 4096 by default.
If you need even less delay than this and can not
Hello all,
I have used the remap filter to make a video with size 1920x960 and the
content is a spherical video in equirectangular projection. But VLC
player doesn't recognize it as spherical, so I can't use the mouse to
change the direction of view in VLC.
I think (but I'm not sure) that some
Google next time before you ask here.
You need a special Python tool, available on GitHub, which inserts the side data.
I don't know what state that tool is in now; it seemed pretty abandoned the
last time I looked.
I saw the Python tool, but I thought it might also be possible to do
that with FFmpeg. That's wh
Hello all,
for a special effect I want to overlay a small video over the main
video, where the position of the overlay is a function of time. It's a
complicated function and can't be expressed by a simple formula.
Segment-wise linear interpolation might work, but many segments are
required an
Hello Gyan,
You can just read the whole filtergraph from a file.
ffmpeg -i in -filter_complex_script graph.txt ...
where graph.txt is
e.g.
[0:v][2:v]overlay=...[out]
That's a good idea!
Are line feeds allowed in the text file, or must everything be written on one line?
Michael
That's a good idea!
Are line feeds allowed in the text file, or must everything be written on one line?
Try it?
It works with line feeds.
Michael
On 03.09.2019 at 02:27, suraj kandukuri wrote:
Hi,
I have a long video (~8 hrs) to be converted to individual frames, saved in
a folder with filenames in the format
"InputVideoName_Frame_Number_TimeStampoftheframeinthevideo.jpg", for
example InputVideo2_234_130425, in Python code. My system
Hi all,
I see the progress on the V360 filter... that's very good!
A few suggestions for improvement:
-- (Single-)Fisheye as input and output format, with a selectable field
of view up to 360° (little planet).
-- A "color" option for filling those areas that have got no data from
the input.
--
On 06.09.2019 at 15:14, Michael Koch wrote:
Hi all,
I see the progress on the V360 filter... that's very good!
A few suggestions for improvement:
-- (Single-)Fisheye as input and output format, with a selectable
field of view up to 360° (little planet).
-- A "color" option fo
Paul,
Stereographic projection aka little planet added as both input and output.
Single fisheye is a little different.
v_fov is limited to 90°, that's not enough for little planet.
-- I think the d_flip parameter needs some more documentation. What's an
"in-depth" flip?
It swaps back with f
Paul,
v_fov is limited to 90°, that's not enough for little planet.
I increased it.
ok, I will test it as soon as it's available at Zeranoe.
I think the field of view isn't correct in the stereographic output.
Here is a script for making really nice test images, for measuring field
of vi
ffmpeg -i %IN% -i %IN% -lavfi
"[0]transpose=1[left];[1]transpose=2,geq=r='255-r(X,Y)':g='255-g(X,Y)':b='255-b(X,Y)'[right];[left][right]hstack"
-y %OUT%
Can't you use the negate filter here instead of geq?
Oops, yes, that would be better. I searched for "invert" and didn't
find it...
Michael
Paul,
I think the field of view isn't correct in the stereographic output.
Here is a script for making really nice test images, for measuring field
of view:
Well spotted, thanks, should be fixed.
The field of view of the stereographic output isn't correct. Here are a
few examples:
set in c
On 11.09.2019 at 13:13, Michael Koch wrote:
Paul,
I think the field of view isn't correct in the stereographic output.
Here is a script for making really nice test images, for measuring
field
of view:
Well spotted, thanks, should be fixed.
The field of view of the stereographic o
On 11.09.2019 at 14:23, Michael Koch wrote:
On 11.09.2019 at 13:13, Michael Koch wrote:
Paul,
I think the field of view isn't correct in the stereographic output.
Here is a script for making really nice test images, for measuring
field
of view:
Well spotted, thanks, should be
Paul,
Make this correction:
new_fov = 180 * tan(fov/4)
where fov is the field of view you get from the command line, and
new_fov is the value that you use for the filter.
You must exclude values too close to 360°, because a 360° stereographic
projection is impossible.
P.S. of course fov mus
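The correction above can be sketched numerically; the fov is in degrees, so the tangent needs a radians conversion, and at fov = 360° the tangent of 90° diverges, which is why such values must be excluded:

```python
import math

def corrected_fov(fov_deg):
    # new_fov = 180 * tan(fov/4), with fov in degrees.
    # Values at or above 360 are excluded: tan(90 deg) diverges,
    # i.e. a 360-degree stereographic projection is impossible.
    if fov_deg >= 360.0:
        raise ValueError("360-degree stereographic projection is impossible")
    return 180.0 * math.tan(math.radians(fov_deg / 4.0))

print(corrected_fov(180.0))  # tan(45 deg) = 1 -> 180.0
```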
On 12.09.2019 at 14:05, Paul B Mahol wrote:
On 9/12/19, Michael Koch wrote:
Paul,
Make this correction:
new_fov = 180 * tan(fov/4)
where fov is the field of view you get from the command line, and
new_fov is the value that you use for the filter.
You must exclude values too close to 360
Paul,
I'm working on dual fisheye output
I've tested double-fisheye input and output and found no problems. This
can also be used for tilting of 180° single-fisheye images:
set "IN=IMG_077.jpg" :: Input image or video
set "PITCH=0" :: Rotation angle around X axis
set "YA
On 12.09.2019 at 14:55, Paul B Mahol wrote:
Question:
If the input format is stereographic and the output format is
equirectangular, what's then the meaning of the h_fov and v_fov parameters?
Do they define the field of view of the stereographic input (that would
make sense),
or do they define
Paul,
I'm working on dual fisheye output
I've tested the new "ball" output format and found no problems. It gives
the same output as my workaround with the remap filter.
The only thing that's still missing is single-fisheye as input and
output format, with a user-defined field of view.
Nicolas,
So, to compute the timestamp of a frame with variable speed:
* Express your frame rate as a complete formula: t → v
Is t the time in the input video, or the time in the output video?
Michael
So, to compute the timestamp of a frame with variable speed:
* Express your frame rate as a complete formula: t → v
* Integrate it: t → f.
* Find the reciprocal: f → t.
Let's give it a try. My input video has framerate=20 and length=10s.
Let's change the framerate linearly from 20 at the b
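Nicolas's recipe can be worked through with an assumed linear rate v(t) = 20 - t (falling from 20 to 10 over the 10 s clip; the assumption is mine, chosen to match the expression (20-sqrt(400-2*N)) tried later in the thread): integrating gives f(t) = 20t - t²/2, and inverting gives t(f) = 20 - sqrt(400 - 2f). A numerical check:

```python
import math

def frame_count(t):
    # f(t) = integral of v(t) = 20 - t  ->  20*t - t^2/2
    return 20.0 * t - t * t / 2.0

def timestamp(f):
    # inverse: t(f) = 20 - sqrt(400 - 2*f)
    return 20.0 - math.sqrt(400.0 - 2.0 * f)

# Round-trip: the inverse really undoes the integral.
for t in (0.0, 2.5, 5.0, 10.0):
    assert abs(timestamp(frame_count(t)) - t) < 1e-9

print(frame_count(10.0))  # frames emitted in 10 s: 200 - 50 = 150.0
```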
On 23.09.2019 at 23:07, Carl Eugen Hoyos wrote:
On Mon, 23 Sep 2019 at 23:04, Michael Koch wrote:
ffmpeg -f lavfi -i testsrc2=size=vga:duration=10:rate=20 -lavfi
"setpts='(20-sqrt(400-2*N))/TB' " -y out.mp4
(Does this syntax really work?)
As said before, ou
On 23.09.2019 at 14:45, RazrFalcon wrote:
Hi!
I have a 240 FPS video made by a GoPro and I'm looking for a way to gradually
change the video framerate. If I want to simply slow down a video I can use:
ffmpeg -i raw.mp4 -filter_complex "[v:0]setpts=8*PTS" -r 30 out.mp4
and it works fine. But when I
Hi,
I'm using this animated GIF as input http://gosper.org/sidereal.gif
and process it with the most simple FFmpeg command line:
ffmpeg -i sidereal.gif -y out.gif
Why are the colors wrong in the output?
I did already try to add -vf format=pix_fmts=rgb24 but the output is
the same.
Michael
On 26.09.2019 at 15:06, Carl Eugen Hoyos wrote:
On Thu, 26 Sep 2019 at 15:00, Michael Koch wrote:
I'm using this animated GIF as input http://gosper.org/sidereal.gif
and process it with the most simple FFmpeg command line:
ffmpeg -i sidereal.gif -y out.gif
Did you also tr
Hi Javier,
I suspect it is something about the generated palette.
I have been using palettegen and paletteuse filters lately, and they have
worked out fine for me.
This command is working as you might expect:
ffmpeg -y -i sidereal.gif -filter_complex
"split[s0][s1];[s0]palettegen[p];[s1][
On 26.09.2019 at 23:06, Carl Eugen Hoyos wrote:
On Thu, 26 Sep 2019 at 21:13, Verachten Bruno wrote:
I have different environments (X86, ARM, Jetson Nano) on which I'd like to
use hardware encoders and decoders if available.
I guess that I won't be able to use them if they have not be
Carl Eugen,
@Michael: Please understand that (at least most) developers do
not have the time to copy the change you already suggested into
their local tree. Instead install a git client of your choice and send
the documentation patches to the development mailing list where
they will find more fr
Hi,
in the documentation is a new example for afftfilt:
afftfilt="real='hypot(re,im)*sin(0)':imag='hypot(re,im)*cos(0)':win_size=512:overlap=0.75"
What's the point of multiplying by sin(0) and cos(0)? Isn't this exactly
the same as
afftfilt="real=0:imag='hypot(re,im)':win_size=512"
Michael
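The equivalence suspected above is just trigonometry: sin(0) = 0 and cos(0) = 1, so for every bin the real part becomes 0 and the imaginary part becomes hypot(re, im). A one-bin check:

```python
import math

# One FFT bin, arbitrary example values:
re, im = 3.0, 4.0
mag = math.hypot(re, im)        # magnitude -> 5.0

real = mag * math.sin(0.0)      # hypot(re,im)*sin(0) -> 0.0
imag = mag * math.cos(0.0)      # hypot(re,im)*cos(0) -> 5.0

print(real, imag)               # -> 0.0 5.0
```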
Hello,
FFmpeg can read PGM (Portable Gray Map) files that are either ASCII
coded (beginning with "P2") or binary coded (beginning with "P5"). When
writing a PGM file, the default format is binary. Is it also possible to
write an ASCII coded PGM file?
Thanks,
Michael
On 20.10.2019 at 23:56, arthur brogard wrote via ffmpeg-user:
I have been trying to convert a bunch of mkv files to mp4 using ffmpeg in batch
mode via a .bat file but I can't get it right.
I've followed a number of suggested commands from various hits I've got on
google but none of them work.
set OUT=%IN:mkv=mp4%
please note that this renaming will give unexpected results if "mkv" is
also part of the filename, for example
aamkv.mkv will be renamed to
aamp4.mp4
If someone knows a solution to this problem, please let us know.
Michael
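The pitfall, and a safer extension-only replacement, can be illustrated in Python (the batch syntax %IN:mkv=mp4% replaces every occurrence of the substring, not just the extension):

```python
import os

name = "aamkv.mkv"

# Naive substring replacement, like %IN:mkv=mp4% in cmd.exe:
naive = name.replace("mkv", "mp4")
print(naive)           # -> aamp4.mp4  (the base name is damaged too)

# Safer: replace only the extension.
stem, ext = os.path.splitext(name)
safe = stem + ".mp4"
print(safe)            # -> aamkv.mp4
```

In a batch file, looping with `for %%F in (*.mkv)` and building the output name as `%%~nF.mp4` (name without extension) avoids the substring pitfall.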
Hi,
I have a video which looks fine in FFplay and also in VLC, but after
uploading to Facebook the first frame is corrupted, as can be seen here:
https://www.facebook.com/12490928195/videos/2597554727004199/
I don't know if this is a problem in my file or not. The console output
is below.
On 30.10.2019 at 10:07, Michael Koch wrote:
Hi,
I have a video which looks fine in FFplay and also in VLC, but after
uploading to Facebook the first frame is corrupted, as can be seen
here: https://www.facebook.com/12490928195/videos/2597554727004199/
I don't know if this is a probl
You may want to reconsider your pixel format, and whether that's really
required for the video content you are presenting. If "anything" is
fine, just add "-pix_fmt yuv420p" to your ffmpeg output options (or
experiment with other 8-bit formats such as yuv444p).
with yuv420p it works fine !
A
Hi,
I'm just making some experiments with the drawgraph filter, and it took
me more than an hour to figure out what was wrong:
The colors of the graphs must be specified in format 0xAABBGGRR, but the
background color must be in format 0xRRGGBBAA.
That's very confusing!
Michael
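The byte-order difference reported above can be made concrete: with R=0x11, G=0x22, B=0x33, A=0x44, the graph colors expect 0x44332211 (AABBGGRR) while the background expects 0x11223344 (RRGGBBAA). A sketch of the two packings (based on the report in this thread, not on the filter documentation):

```python
def aabbggrr(r, g, b, a):
    # Byte order reportedly expected by drawgraph's graph colors.
    return (a << 24) | (b << 16) | (g << 8) | r

def rrggbbaa(r, g, b, a):
    # Byte order reportedly expected by drawgraph's background color.
    return (r << 24) | (g << 16) | (b << 8) | a

print(hex(aabbggrr(0x11, 0x22, 0x33, 0x44)))  # -> 0x44332211
print(hex(rrggbbaa(0x11, 0x22, 0x33, 0x44)))  # -> 0x11223344
```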
On 03.11.2019 at 06:55, Carl Eugen Hoyos wrote:
On Sun, 3 Nov 2019 at 00:08, Michael Koch wrote:
I'm just making some experiments with the drawgraph filter and it took
me more than one hour to figure out what was wrong:
The colors of the graphs must be specified in format 0xAAB
On 15.11.2019 at 16:35, David Previs wrote:
Hello
I have the following Filters:
Watermark:
"overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2"
Text Overlay:
drawtext="fontsize=24:fontcolor=00@0.5:fontfile=/Windows/Fonts/Arial.ttf
:text='Test Overlay':x=(w-text_w)/2:y=
On 15.11.2019 at 16:53, David Previs wrote:
-y -i "c:\ffmpeg\Black.mp4" -i "C:\pw\temp\watermark99.png" -filter_complex
"overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2",
drawtext="fontsize=24:fontcolor=00@0.5:fontfile=/Windows/Fonts/Arial.ttf
:text='Huntley Film
Archives':x=
Hello all,
I'd like to draw a curve of the RMS audio level with the following
command line. The problem is that the output is a black video and no
curve is visible.
c://ffmpeg/ffmpeg -i P1000479.mov -lavfi astats,adrawgraph=m1="lavfi.astats.RMS level dB":mode=line:slide=scroll:min=-100:max=
On 24.11.2019 at 22:17, Michael Koch wrote:
Hello all,
I'd like to draw a curve of the RMS audio level with the following
command line. The problem is that the output is a black video and no
curve is visible.
c://ffmpeg/ffmpeg -i P1000479.mov -lavfi astats,adrawgraph=
m1="lavfi.