[FFmpeg-user] Need to Convert MP4 (or MKV) with AAC *and* AC-3 Audio to Multi-Rendition HLS (each with *both* audio streams)

2022-08-30 Thread Clay
Hi folks,
Can you help me fix this ffmpeg script:
The goal:
Take in an mp4 (or mkv) with one video stream and TWO audio streams (one
AAC/2ch and one AC-3/5.1ch)... and output four video renditions (1920,
1280, 720, 480) that all keep *both* audio streams.
The input file is around 4 Mbps and I am attempting to clamp the video
bitrates of the four renditions as close as possible to 4M/3M/2M/1M.
For HLS compatibility the AAC must remain in the output file, and the
AC-3 needs to remain as well.
I thought this would be easy... just passing the audio through and
copying it to each rendition output... but this is kicking my butt.
If you have a better way to achieve this, I am ***all ears***.
Here is the code I have so far:
#!/bin/bash
ffmpeg -i ../InputFile-1920x804-24f.mp4 \
-filter_complex "[0:v]split=4[v1][v2][v3][v4]; [v1]copy[v1out];
[v2]scale=w=1280:h=536[v2out]; [v3]scale=w=720:h=300[v3out];
[v4]scale=w=480:h=200[v4out]"\
-map [v1out] -c:v:0 libx264 -x264-params -b:v:0 -maxrate:v:0 4M
-minrate:v:0 4M -bufsize:v:0 4M -crf 17 -preset slower -g 48
-sc_threshold 0 -keyint_min 48 \
-map [v2out] -c:v:1 libx264 -x264-params -b:v:1 -maxrate:v:1 3M
-minrate:v:1 3M -bufsize:v:1 3M -crf 17 -preset slower -g 48
-sc_threshold 0 -keyint_min 48 \
-map [v3out] -c:v:2 libx264 -x264-params -b:v:2 -maxrate:v:2 2M
-minrate:v:2 2M -bufsize:v:2 2M -crf 17 -preset slower -g 48
-sc_threshold 0 -keyint_min 48 \
-map [v4out] -c:v:3 libx264 -x264-params -b:v:3 -maxrate:v:3 1M
-minrate:v:3 1M -bufsize:v:3 1M -crf 17 -preset slower -g 48
-sc_threshold 0 -keyint_min 48 \
-map 0:a -c:a:0 \
-map 0:a -c:a:1 \
-map 0:a -c:a:2 \
-map 0:a -c:a:3 \
-f hls \
-hls_time 2 \
-hls_playlist_type vod \
-hls_flags independent_segments \
-hls_segment_type mpegts \
-hls_segment_filename OutputFile_%v/data%02d.ts \
-master_pl_name master.m3u8 \
-var_stream_map "v:0,a:0 v:1,a:1 v:2,a:2 v:3,a:3"OutputFile_%v.m3u8
This code breaks when I attempt to run it, giving the following error:
[NULL @ 0x55f0357bbd40] Unable to find a suitable output format for '0:a'
0:a: Invalid argument
I have tried many varieties of the audio stream mapping... 0:a, a:0,
a:0:1.. etc.
I am hoping one of you gurus here takes pity on me and shows me the way :-)


Re: [FFmpeg-user] Basic Video Manipulation Questions

2022-10-27 Thread Clay

Carl's comments are great. I normally try to trim duplicate content, but
this Q&A format may allow for leaving Carl's initial reply inline.
>> 1. Why does it make sense to convert from a lossy format to raw?
>> You're not
>> gaining any more detail.
>
> It doesn't, unless you need raw for some further processing (that's
> what happens in the ffmpeg pipeline- demux, decode, filter/transform,
> encode, mux, see https://ffmpeg.org/ffmpeg.html#Detailed-description).
>
>> 2. Does re-rendering video over and over at the same resolution, frame
>> rate, and bit rate cause degradation? For example, if I want to make
>> several changes to the video, like colour correction, splicing out
>> sections, transitions, titling, etc. should I be trying to do that
>> all in
>> one command?
>
> It can, and yes; either do it all in one command, or store the
> intermediate files in a lossless format (which could be raw frames in
> a container). I prefer intermediate files, as the command lines can get
> complex and if I mess something up or don't like it, part of the work
> has already been done (as an ancient IT person, keeping backups of each
> step is assumed).
>
> Or use some ffmpeg-library-based software that allows you to
> effectively script/render the output.
The stacked command-line ability of ffmpeg is a blessing and a curse;
python-ffmpeg (and other) wrappers really help to harness the
complexity. There are several Python wrapper efforts; two of the most
popular are https://github.com/PyAV-Org/PyAV and
https://github.com/kkroening/ffmpeg-python (a search on GitHub turns up
around 850 repos).
>> 3. I understand that -c copy is a great way to eliminate time and
>> preserve
>> detail but what's the best way to do this when converting from one
>> codec to
>> another, or when performing editing or other changes?
>
> Copy does just that, it copies; changing the
> encoding/size/pixel-format/etc requires going through an
> uncompressed/raw state (the pipeline).
We use stream copy to preserve quality between successive ffmpeg
operations whenever possible.
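For example (file names hypothetical), trimming with stream copy avoids an
extra encode generation entirely; the trade-off is that the cut points snap
to the nearest keyframes rather than exact frames:

ffmpeg -ss 00:01:00 -i source.mp4 -t 60 -c copy clip.mp4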
>
>> 4. How important is it to keep standard resolutions, frame rates, and
>> bit
>> rates? Do hardware decoders do better with those? Do software decoders
>> handle weird resolutions better?  Will strange decoding artifacts appear
>> more often with non-standard parameters?
>
> Importance is relative. If the final output must be played on a wide
> variety of players, then stick to the standards and it's more likely
> to work. Bit rates: AFAIK there are no "standards" for compressed
> video & audio, only conventions or guidelines to use a certain max
> rate to get a given quality (and some media, like dvd/bluray, can only
> go so fast). I can't answer for how s/w decoders play with
> non-standard resolutions. And as I understand it, some decoders get
> wonky when the resolution isn't divisible by 8.
HLS (HTTP Live Streaming) has Apple-recommended bitrates and frame sizes.
For example, Apple recommends a 2 Mbps average bitrate for Wi-Fi streaming
and 730 kbps for cellular streaming, along with a slew of resolution
recommendations
(https://developer.apple.com/documentation/http_live_streaming/http_live_streaming_hls_authoring_specification_for_apple_devices).



Re: [FFmpeg-user] What is a "pad" in the context of an "input pad", an "output pad" and a "filter pad"

2022-10-28 Thread Clay
Thanks for the perspective, Phil :-)  As I am not an ffmpeg expert, when
I [attempt to] mentally digest the documentation, I drill down into the
minutiae until I hit "descriptively fundamental particles"... often,
where I expect to find fundamental descriptions, there is just a hole
(in the meaning)... so I drop a rock, as a sounding method (oh crap, more
audio terminology), and listen for the splash. In the [filtergraph]
context of "pad", that splash never came :-D

So... it seems like it's more of a temporary gateway/portal for
specifically crunched data.
>> In the case of ffmpeg _filters_, it looks like #3 is closest as a point of
>> interconnection but #2 could apply to 'pad' and 'apad', although IMHO 'fill'
>> would be a better term.
> I would tend to agree that what's going on here is that the English word 
> "pad" has a very large number of meanings and ffmpeg appears to be mixing 
> those meanings in a way that might reasonably be expected to cause confusion.
> At least part of this is happening because ffmpeg, by its nature, crosses 
> disciplines between IT and media production. Because of changes in the media 
> industry over the life of the ffmpeg project, this has become more and more 
> true over time, as digital post production has become ubiquitous. It's sort 
> of inevitable this would happen and it probably isn't anyone's fault. Dafter 
> things have happened to Premiere.
> It's also not the first time that this sort of collision has occurred 
> (witness the state of colour management in ffmpeg until fairly recently, and 
> I'll never forget the time someone fairly senior started complaining that 
> drop-frame timecode was untidy, to a reaction from more experienced hands 
> that ranged from mirth to disbelief).
> It seems that a cleanup of terminology is in order and at least something's 
> going to have to give.
> P  




Re: [FFmpeg-user] 5.1 AAC without lowpass on the LFE channel

2022-09-15 Thread Clay
Is there a way to specify the LFE bitrate separately from the other
full-range channels?

Steven Ruppert via ffmpeg-user wrote on 9/15/22 16:24:
> On 9/12/22 09:33, Steven Ruppert via ffmpeg-user wrote:
>> I couldn't find any references whether that's inherent to the AAC 5.1
>> encoding, something that ffmpeg does, or inherent to the decoding
>> process. (I did decode my same test files using something that uses
>> Microsoft MediaFoundation and the LFE channel was still lowpassed).
>
> Turns out the AAC codec inherently limits the LFE bandwidth, so
> there's no way around it.
>
> The actual ISO/IEC 13818-7:2006(en) Standard costs a bunch of money to
> read as usual, but in the free glossary there is an entry:
>
> > low frequency enhancement (LFE) channel: limited bandwidth channel
> > for low frequency audio effects in a multichannel system
>
> https://www.iso.org/obp/ui/#iso:std:iso-iec:13818:-7:ed-4:v1:en



Re: [FFmpeg-user] ffmpeg-user archive

2022-11-11 Thread Clay



Mark Filipak wrote on 11/11/22 16:57:
> In the ffmpeg-user archive, could you PLEASE hide the sender's address
> so it can't be harvested?
>
> --Mark.
+1


Re: [FFmpeg-user] encode to RAW video

2022-11-02 Thread Clay

> Clay via ffmpeg-user (12022-11-02):
>> Doesn't this serial ordering of the same command (-c:v <codec>) twice just
>> drive cpu workload up for no actual benefit?
>>
>> To clarify: executing -c:v <codec-1> and then executing -c:v <codec-2> just
>> causes one output: <one encoded stream>. Thus you are forcing the CPU to
>> decode:encode:decode:encode rather than just decode:encode... can
>> someone confirm or correct me here?
> No, except for the ridiculously negligible CPU workload necessary to
> process the option itself.
Doesn't -c enable encoding or decoding (as well as the certainly trivial
'copy') of a stream, depending upon the choice and the existing stream's
codec?  Encoding is not a trivial workload :-)  Can you clarify that a
bit more? *I learn more from ffmpeg-user mail than my best efforts using
TheGoogle :-D
> ffmpeg does not have the infrastructure to make multiple decode-encode
> cycles, what you describe is not possible on top of being useless. What
> happens is just that the second -c:v option will override the first one.
>
This is great to know!

What you are saying is: ffmpeg reads in the entire serial string of
options and, when duplicate options apply to the same stream, simply
uses the last one specified.

Is this the case with all options?
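If I follow that correctly, then in a command like the following (file names
hypothetical) only libx264 is actually used; the earlier -c:v rawvideo is
discarded during option parsing and a single decode/encode pass is performed:

ffmpeg -i input.mp4 -c:v rawvideo -c:v libx264 -crf 18 output.mp4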



Re: [FFmpeg-user] ffserver

2022-11-09 Thread Clay


Nicolas George wrote on 11/9/22 03:18:
> Dave Blanchard (12022-11-08):
>> Are you seriously asking me this? What a stupid question.
> I hope you do not expect getting help with that kind of attitude.
>
> To vent your frustration, please go elsewhere, this list is not made for
> that. (I suggest you prefer a therapist over violence towards people
> around you.)
>
> I would also suggest you use a mail software that does not break
> threads, but it will not matter much anyway.
>
While I *really* understand Dave's frustration (including the added
pressure of dealing with major IRL problems), Nicolas is right: this is
not the forum for high-temperature venting. This mailing list is one of
the best, most beneficial resources "out there" and is certainly top
tier for ffmpeg. Please keep it that way.

Regarding ffserver: man, were I but a coder! I like to think I'd take it
up where it left off, because it is a good starting point. It seems
NGINX+ffmpeg is a solid combination (at least for my use case).

Do any of you know which, if any, of the servers indicated therein
(https://en.wikipedia.org/wiki/List_of_streaming_media_systems#Servers)
are based upon ffmpeg or ffserver?



Re: [FFmpeg-user] Need to Convert MP4 (or MKV) with AAC *and* AC-3 Audio to Multi-Rendition HLS (each with *both* audio streams)

2022-09-01 Thread Clay Lambert
Thank you Moritz!

I will apply your advice and report back!

On Thu, Sep 1, 2022, 02:26 Moritz Barsnick  wrote:

> Hi Clay,
>
> On Tue, Aug 30, 2022 at 18:51:13 -0400, Clay wrote:
> > If you have a better way to achieve this, I am ***all ears***.
> > Here is the code I have so far:
>
> I think you need to fix your command line in various places:
>
> > ffmpeg -i ../InputFile-1920x804-24f.mp4 \
> > -filter_complex "[0:v]split=4[v1][v2][v3][v4]; [v1]copy[v1out];
> > [v2]scale=w=1280:h=536[v2out]; [v3]scale=w=720:h=300[v3out];
> > [v4]scale=w=480:h=200[v4out]"\
> > -map [v1out] -c:v:0 libx264 -x264-params -b:v:0 -maxrate:v:0 4M
>
> "-x264-params" needs an argument - that's missing here.
> "-b:v:0" needs an argument - that's missing here.
>
> And for bash's sake, I believe you need to quote "[v1out]" and the
> similar ones below (but I may be overambitious here).
>
> > -minrate:v:0 4M -bufsize:v:0 4M -crf 17 -preset slower -g 48
> > -sc_threshold 0 -keyint_min 48 \
> > -map [v2out] -c:v:1 libx264 -x264-params -b:v:1 -maxrate:v:1 3M
> > -minrate:v:1 3M -bufsize:v:1 3M -crf 17 -preset slower -g 48
> > -sc_threshold 0 -keyint_min 48 \
> > -map [v3out] -c:v:2 libx264 -x264-params -b:v:2 -maxrate:v:2 2M
> > -minrate:v:2 2M -bufsize:v:2 2M -crf 17 -preset slower -g 48
> > -sc_threshold 0 -keyint_min 48 \
> > -map [v4out] -c:v:3 libx264 -x264-params -b:v:3 -maxrate:v:3 1M
> > -minrate:v:3 1M -bufsize:v:3 1M -crf 17 -preset slower -g 48
> > -sc_threshold 0 -keyint_min 48 \
> > -map 0:a -c:a:0 \
> > -map 0:a -c:a:1 \
> > -map 0:a -c:a:2 \
> > -map 0:a -c:a:3 \
>
> "-c:a:x" needs an argument.
>
> > -f hls \
> > -hls_time 2 \
> > -hls_playlist_type vod \
> > -hls_flags independent_segments \
> > -hls_segment_type mpegts \
> > -hls_segment_filename OutputFile_%v/data%02d.ts \
> > -master_pl_name master.m3u8 \
> > -var_stream_map "v:0,a:0 v:1,a:1 v:2,a:2 v:3,a:3"OutputFile_%v.m3u8
>
> > This code breaks when I attempt to run it, giving the following error:
> > [NULL @ 0x55f0357bbd40] Unable to find a suitable output format for '0:a'
> > 0:a: Invalid argument
>
> Probably because ffmpeg's whole option parsing is messed up due to your
> missing arguments.
>
> > I have tried many varieties of the audio stream mapping... 0:a, a:0,
> > a:0:1.. etc.
>
> And this is not the actual issue.
>
> Cheers,
> Moritz
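For the archive, here is an untested sketch of the command with Moritz's
fixes applied. Assumptions worth flagging: -crf is dropped (it fights a
target bitrate), -minrate is dropped, both audio streams are stream-copied
so the AAC and the AC-3 ride along unchanged, and the two audio streams are
shared across all four variants through an HLS audio group (whether mixing
two audio codecs in one rendition group satisfies Apple's authoring spec is
a separate question, and the OutputFile_* directories may need to exist
before the run):

#!/bin/bash
ffmpeg -i ../InputFile-1920x804-24f.mp4 \
  -filter_complex "[0:v]split=4[v1][v2][v3][v4]; [v1]copy[v1out]; [v2]scale=w=1280:h=536[v2out]; [v3]scale=w=720:h=300[v3out]; [v4]scale=w=480:h=200[v4out]" \
  -map "[v1out]" -c:v:0 libx264 -b:v:0 4M -maxrate:v:0 4M -bufsize:v:0 4M \
  -map "[v2out]" -c:v:1 libx264 -b:v:1 3M -maxrate:v:1 3M -bufsize:v:1 3M \
  -map "[v3out]" -c:v:2 libx264 -b:v:2 2M -maxrate:v:2 2M -bufsize:v:2 2M \
  -map "[v4out]" -c:v:3 libx264 -b:v:3 1M -maxrate:v:3 1M -bufsize:v:3 1M \
  -preset:v slower -g:v 48 -keyint_min:v 48 -sc_threshold:v 0 \
  -map 0:a:0 -map 0:a:1 -c:a copy \
  -f hls -hls_time 2 -hls_playlist_type vod \
  -hls_flags independent_segments -hls_segment_type mpegts \
  -hls_segment_filename "OutputFile_%v/data%02d.ts" \
  -master_pl_name master.m3u8 \
  -var_stream_map "a:0,agroup:aud a:1,agroup:aud v:0,agroup:aud v:1,agroup:aud v:2,agroup:aud v:3,agroup:aud" \
  OutputFile_%v.m3u8

As I understand the hls muxer, the audio is then written once as alternate
renditions and every video variant references that group in master.m3u8, so
each rendition effectively offers both audio tracks.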


Re: [FFmpeg-user] Crop and transpose filters combined produces large output

2017-06-29 Thread Clay D. Montgomery

On 6/29/2017 11:15 AM, Moritz Barsnick wrote:

On Thu, Jun 29, 2017 at 21:32:23 +0530, Gyan wrote:

There's no cropping taking place, as the video filter argument has been
replaced.

Ah, d'uh, I missed that. That's what he meant with size.

Clay, why did you write "wrong size" and then quote the file size next
to it, if you meant the dimensions (i.e. 1080x1920)? Confusing
indeed.

Moritz

Gyan, Moritz,

  Yes, I did have the syntax wrong for combining the filters.  It is 
working for me now like this:


ffmpeg -i File1.mp4 -vf crop=1080:1920:0:0,transpose=2 File3.mp4

File1.mp4 79,717,759
File3.mp4   6,816,685

Thanks again for all your help. I really appreciate it. You guys Rock!

Regards, Clay




[FFmpeg-user] Crop and transpose filters combined produces large output

2017-06-29 Thread Clay D. Montgomery

Hello,

I have been experimenting with using the video crop and transpose 
filters with ffmpeg (version 2.5.2 on Windows) to convert 4K video into 
1080p video without scaling. When I combine these filters in a 1 pass 
operation, I get an output (File4.mp4) that is about 4X larger than it 
should be. But, when I split the same operation into 2 passes, I get an 
output (File3.mp4) that seems to be the correct size. The details are:


Crop and transpose in 1 pass:
ffmpeg -i File1.mp4 -vf crop=1080:1920:0:0 -vf transpose=2 File4.mp4

File sizes in bytes:
File1.mp4  79,889,077  (Input 4K Video)
File4.mp4  25,918,758  (Excessive Size)

Crop and transpose in 2 passes:
ffmpeg -i File1.mp4 -vf crop=1080:1920:0:0 File2.mp4
ffmpeg -i File2.mp4 -vf transpose=2 File3.mp4

File sizes in bytes:
File1.mp4  79,889,077  (Same Input 4K Video)
File2.mp4   6,994,548
File3.mp4   5,777,801  (Correct Size)

All of these files can be played with ffplay, but I use other video 
players and they have trouble playing File4.mp4.
It is not a big problem to use 2 passes, but I'm curious if this is a 
known issue or am I missing something?


Thanks, Clay


[FFmpeg-user] Crop 4K Video into Four 1080p Videos?

2017-06-26 Thread Clay D. Montgomery

Hello,

I'm wondering if it is possible to use ffmpeg to slice 4K video 
into a set of four 1080p videos and do it without scaling.


I have experimented with using the -s size option before the input, but 
I have not found a way to offset the position for cropping the video.


This also might require rotating the output videos by 90 degrees.

I would appreciate any advice on this.


Thanks, Clay



Re: [FFmpeg-user] Crop 4K Video into Four 1080p Videos?

2017-06-26 Thread Clay D. Montgomery

Thank You Gyan!


On 6/26/2017 2:02 PM, Gyan wrote:

On Mon, Jun 26, 2017 at 11:58 PM, Clay D. Montgomery <c...@montgomery1.com>
wrote:


Hello,

 I'm wondering if it is possible to use ffmpeg to slice 4K video into a
set of four 1080p videos and do it without scaling.


You need to use the crop filter. For four distinct outputs,

 ffmpeg -i input -vf crop=1920:1080:0:0 output1 -vf
crop=1920:1080:1920:0 output2 -vf crop=1920:1080:0:1080 output3 -vf
crop=1920:1080:1920:1080 output4

(I've ignored audio. The above command will transcode it. Add -c:a copy
before each output to avoid that)
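For the archive, with the audio-copy suggestion applied this might look like
the following (untested; input/output names hypothetical). Each -vf/-c:a pair
applies only to the output file that follows it:

ffmpeg -i input.mp4 \
  -vf crop=1920:1080:0:0       -c:a copy output1.mp4 \
  -vf crop=1920:1080:1920:0    -c:a copy output2.mp4 \
  -vf crop=1920:1080:0:1080    -c:a copy output3.mp4 \
  -vf crop=1920:1080:1920:1080 -c:a copy output4.mp4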



[FFmpeg-user] What is a "pad" in the context of an "input pad", an "output pad" and a "filter pad"

2022-10-27 Thread Clay via ffmpeg-user
Dumb ffmpeg question alert:

What is a "pad" in the context of an "input pad", an "output pad" and a
"filter pad"?

I understand the concept of padding a video with horizontal or vertical
bars, padding audio with dead air or some other audio, but this is
different.
Is there a technical description of the actual term "pad" used to
describe command line items in the ffmpeg execution string?

There are a few references in the main ffmpeg documentation
(https://ffmpeg.org/documentation.html) and a search of the ffmpeg-user
archives brings many usages of the term, yet no actual description (385
hits:
https://www.mail-archive.com/search?l=ffmpeg-user%40ffmpeg.org=+pad+).

While I can infer the meaning, it would be great to get a meaningfully
pedantic description from someone in the know.

On the interwebs I found this description:

PAD [https://www.computerhope.com/jargon/p/pad.htm]
Updated: 10/17/2017 by Computer Hope

PAD may refer to any of the following:
1. When referring to a network, PAD, short for packet
assembler/disassembler, is a device capable of converting from one
packet into another. 

...I presume this means to describe a device that converts packetized
data from one packet format into another packet format while preserving
the data [or information] contained within the original packet.
...This seems a little bit like converting a stream/file from one codec
(e.g. h.264) into another (e.g. h.265) but that doesn't quite match the
inference. 



Re: [FFmpeg-user] What is a "pad" in the context of an "input pad", an "output pad" and a "filter pad"

2022-10-28 Thread Clay via ffmpeg-user

>
> 36.39 apad
> Pad the end of an audio stream with silence.
> This can be used together with ffmpeg -shortest to extend audio
> streams to the same length as the video stream
>
> 39.181 pad
> Add paddings to the input image, and place the original input at the
> provided x, y coordinates.
> It accepts the following parameters: width, w height, h
> Specify an expression for the size of the output image with the
> paddings added. If the value for width or height is 0, the
> corresponding input size is used for the output.
>
> pad, p
> If set to 1, the filter will pad the last audio packet with silence,
> so that it will contain the same number of samples (or roughly the
> same number of samples, see frame_rate) as the previous ones. Default
> value is 1.
>
> first_pts
> For swr only, assume the first pts should be this value. The time unit
> is 1 / sample rate. This allows for padding/trimming at the start of
> stream. By default, no assumption is made about the first frame’s
> expected pts, so no padding or trimming is done
As previously indicated, padding audio or video is covered. 

I am referring to the logical and syntactical meaning as it relates to
filters/filterchains. This is a minor issue, I know. :-D

My question is simply: what is the definition of a "pad" in the context
of Chapter 32? [Thanks to Michael for the chapter callout!]

Even in chapter 32, the meaning of "pad" is presumptive rather than
defined (note how, in 32.1, the "link label" is so clearly defined?
This is what I am looking for):

32 Filtergraph description

A filtergraph is a directed graph of connected filters. It can contain
cycles, and there can be multiple links between a pair of filters. Each
link has one input pad on one side connecting it to one filter from
which it takes its input, and one output pad on the other side
connecting it to one filter accepting its output.

Each filter in a filtergraph is an instance of a filter class registered
in the application, which defines the features and the number of input
and output pads of the filter.

A filter with no input pads is called a "source", and a filter with
no output pads is called a "sink".

32.1 Filtergraph syntax
The name and arguments of the filter are optionally preceded and
followed by a list of link labels. A link label allows one to name a link
and associate it to a filter output or input pad. The preceding labels
in_link_1 ... in_link_N, are associated to the filter input pads, the
following labels out_link_1 ... out_link_M, are associated to the output
pads.

When two link labels with the same name are found in the filtergraph, a
link between the corresponding input and output pad is created.

If an output pad is not labelled, it is linked by default to the first
unlabelled input pad of the next filter in the filterchain. For example
in the filterchain

nullsrc, split[L1], [L2]overlay, nullsink
the split filter instance has two output pads, and the overlay filter
instance two input pads. The first output pad of split is labelled "L1",
the first input pad of overlay is labelled "L2", and the second output
pad of split is linked to the second input pad of overlay, which are
both unlabelled.

In a filter description, if the input label of the first filter is not
specified, "in" is assumed; if the output label of the last filter is
not specified, "out" is assumed.

In a complete filterchain all the unlabelled filter input and output
pads must be connected. A filtergraph is considered valid if all the
filter input and output pads of all the filterchains are connected.

We know the definition of a Source (a filter with no input pads) and the
definition of a Sink (a filter with no output pads) but we still do not
have the definition of what a "pad" actually is.

Maybe something like this?:

A pad is [here's where I am looking for help] a/an
tunnel-interface/connector/teleportation-endpoint [...?] used to apply
input to, or take output from, a filter within a filtergraph or filterchain.

Pads may be called into being (established) by appending link labels to
filters. Pads are also established without link labels on the first
filter (assuming it to be an input pad) and on the last filter (assuming
it to be an output pad) within a filterchain.  

32.1 Filtergraph syntax does an admirably good job of defining much of
the filtergraph's relevant parts, but it glosses over this bit.
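To make the question concrete, here is a small untested example (file names
hypothetical) where the pads are easy to point at:

ffmpeg -i input.mp4 -filter_complex \
  "[0:v]split=2[a][b]; [a]scale=1280:-2[big]; [b]scale=640:-2[small]" \
  -map "[big]" big.mp4 -map "[small]" small.mp4

Here the split instance has one input pad (fed by the link labelled [0:v])
and two output pads (exposed through the labels [a] and [b]); each scale
instance has one input pad and one output pad. The labels name the links;
the pads are the attachment points on the filters that those links plug
into, and that attachment-point sense is what I am fishing for above.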





Re: [FFmpeg-user] Re-encoding mkv files from makemkv

2022-10-12 Thread Clay via ffmpeg-user
Pehache,

Thank you for this simple explanation/reminder:

> Reminder: by default ffmpeg keeps only 1 video track and 1 audio
> track (so here you are implicitly dropping the second audio track and
> the subtitle track). "-map 0" means that you keep all the tracks
> present in the input file. "-map 0:0 -map 0:2 -map 0:3" means that you
> want to keep the video track, the second audio track, and the subtitle
> track.
When digging through the archives for guidance, this kind of advice,
however simple it may seem, is gold.
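A quick sketch of that reminder in command form (untested; file names
hypothetical):

# keep every stream from the input, without re-encoding anything
ffmpeg -i input.mkv -map 0 -c copy everything.mkv

# cherry-pick streams: 0:0 (video), 0:2 (second audio), 0:3 (subtitles)
ffmpeg -i input.mkv -map 0:0 -map 0:2 -map 0:3 -c copy picked.mkv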


Re: [FFmpeg-user] How to transcode to HLS and DASH with mp4 and webm all at once

2022-10-11 Thread Clay via ffmpeg-user


--Hi Steve


Re: [FFmpeg-user] How to transcode to HLS and DASH with mp4 and webm all at once

2022-10-11 Thread Clay via ffmpeg-user
Hi Steve

Have you looked into CMAF for this? I am also working on some complex
transcoding and packaging actions for HLS ABR delivery. I am looking
into using CMAF to reduce the overall storage footprint (GPU/CPU
threads still cost a lot more than storage). Also, I am experimenting
with ffmpeg-python (https://github.com/kkroening/ffmpeg-python); have
you (or anyone reading this) played around with it?

*I apologize to the forum for the accidental prior message (inadvertent
click :-P)

Liberty & Regard,
Clay



Re: [FFmpeg-user] encode to RAW video

2022-11-02 Thread Clay via ffmpeg-user

> Le 31/10/2022 à 14:23, Naveen.B a écrit :
>> I observed some weird behaviour with fast and medium preset,
>>
>> fast preset:
>> *ffmpeg -pixel_format gray10le -s 1600x1300 -r 30 -i
>> CapturedImage-%03d.raw
>> -c:v rawvideo -pixel_format yuv420p -f rawvideo -c:v libx264 -preset
>> fast
>> -crf 18 test.raw*
>
> "-c:v rawvideo" then "-c:v libx264" makes no sense
>
> At some point you should be clear if you want a raw or a h264 stream!
>
Doesn't this serial ordering of the same command (-c:v <codec>) twice just
drive cpu workload up for no actual benefit?

To clarify: executing -c:v <codec-1> and then executing -c:v <codec-2> just
causes one output: <one encoded stream>. Thus you are forcing the CPU to
decode:encode:decode:encode rather than just decode:encode... can
someone confirm or correct me here?


Re: [FFmpeg-user] encode to RAW video

2022-10-31 Thread Clay via ffmpeg-user
Carl Zwanzig wrote on 10/31/22 19:01:
> On 10/31/2022 6:23 AM, Naveen.B wrote:
>> I observed some weird behaviour with fast and medium preset,
>
> We've observed that you're -still- top-posting, please stop it.
>
>
Reminder:  Top Posting:  Replying ABOVE the Quote... :-)
*Don't ask me how I know :-)
> (reformatted)
>> fast preset:
>> *ffmpeg -pixel_format gray10le -s 1600x1300 -r 30 -i
>> CapturedImage-%03d.raw
> Specifies the input file(s) and their characteristics.
>
> All that follows is filters and output specs.
>
>> -c:v rawvideo -pixel_format yuv420p 
> Use the video codec "rawvideo" with that pixel format for output,
> except AFAICT that there is no video encoder "rawvideo" (that should
> throw an error, which because of the missing command output, we don't
> see).
>
>
>> -f rawvideo 
> Force the output format to rawvideo ("Raw muxers accept a single
> stream matching the designated codec. They do not store timestamps or
> metadata. The recognized extension is the same as the muxer name
> unless indicated otherwise.") Don't use this unless there's a _really_
> good reason.
>
>
>> -c:v libx264 -preset fast -crf 18 
> But now use the x264 codec!!! (into a "raw" file)
>
>> test.raw*
> but call the output file "test.raw" (not "test.mp4" or something like
> that).
>
>
> What do you really want? Why this insistence on "raw" output, either
> as an "encoding" or output file?  That's making life difficult and
> likely obscuring other issues.
>
>
>> command output:
> (but the _complete_ output!! Stop making us guess about things and
> you'll get better answers.)
>
>
> z!
>
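To make Carl's point concrete, here is how I read the two coherent
alternatives (untested sketches; the input options are copied verbatim from
Naveen's command and the output names are hypothetical):

# a genuinely raw result: uncompressed yuv420p frames, no container
ffmpeg -pixel_format gray10le -s 1600x1300 -r 30 -i CapturedImage-%03d.raw \
       -pix_fmt yuv420p -f rawvideo test.yuv

# or an H.264 result in a normal container
ffmpeg -pixel_format gray10le -s 1600x1300 -r 30 -i CapturedImage-%03d.raw \
       -pix_fmt yuv420p -c:v libx264 -preset fast -crf 18 test.mp4

Pick one or the other; mixing the two in a single output, as in the quoted
command, is what makes the result ambiguous.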
