Re: [FFmpeg-user] Possible to set color_range on the command line for mjpeg?

2015-08-25 Thread Paul B Mahol
On 25 Aug 2015 at 17:50, Robert Krüger krue...@lesspain.de wrote:

 Hi,

 for the purpose of debugging something in the libraries and submitting a
 corresponding bug report, I would like to specify the color_range of the
 codec context on the command line, i.e. in this case I want to create an
 mjpeg file with full range without using the deprecated yuvj pixel
formats.

 I tried this here:

 ffmpeg -i
 ~/lesspain/samples/software/fcp_7/xdcam_ex/fcp7_xdcam_ex_1080_25p_1s.mov
 -vf scale=out_range=full -c:v mjpeg -an -pix_fmt yuv444p
 mjpeg-fullrange-ffmpeg-noyuvj.mov
 ffmpeg version N-74034-gce46627 Copyright (c) 2000-2015 the FFmpeg
 developers
   built with Apple LLVM version 6.1.0 (clang-602.0.49) (based on LLVM
 3.6.0svn)
   configuration: --enable-gpl --enable-libx264
   libavutil      54. 29.100 / 54. 29.100
   libavcodec     56. 55.100 / 56. 55.100
   libavformat    56. 40.101 / 56. 40.101
   libavdevice    56.  4.100 / 56.  4.100
   libavfilter     5. 29.100 /  5. 29.100
   libswscale      3.  1.101 /  3.  1.101
   libswresample   1.  2.101 /  1.  2.101
   libpostproc    53.  3.100 / 53.  3.100
 [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fddd181bc00] nclc: pri 1 trc 1 matrix 1
 Guessed Channel Layout for Input Stream #0.1 : stereo
 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from

'/Users/krueger/lesspain/samples/software/fcp_7/xdcam_ex/fcp7_xdcam_ex_1080_25p_1s.mov':
   Metadata:
 major_brand : qt
 minor_version   : 537199360
 compatible_brands: qt
 creation_time   : 2013-12-11 15:38:12
   Duration: 00:00:01.00, start: 0.00, bitrate: 25365 kb/s
 Stream #0:0(eng): Video: mpeg2video (Main) (xdve / 0x65766478),
 yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 23773 kb/s, 25 fps, 25
 tbr, 25 tbn, 50 tbc (default)
 Metadata:
   creation_time   : 2013-12-11 15:38:12
   handler_name: Apple Alias-Datensteuerung
   encoder : XDCAM EX 1080p25 (35 Mb/s VBR)
   timecode: 01:00:00:00
 Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, 2
 channels, s16, 1536 kb/s (default)
 Metadata:
   creation_time   : 2013-12-11 15:38:12
   handler_name: Apple Alias-Datensteuerung
 Stream #0:2(eng): Data: none (tmcd / 0x64636D74), 0 kb/s (default)
 Metadata:
   creation_time   : 2013-12-11 15:38:16
   handler_name: Apple Alias-Datensteuerung
   timecode: 01:00:00:00
 Incompatible pixel format 'yuv444p' for codec 'mjpeg', auto-selecting
 format 'yuvj444p'
 [swscaler @ 0x7fddd104a000] deprecated pixel format used, make sure you
did
 set range correctly
 Output #0, mov, to 'mjpeg-fullrange-ffmpeg-noyuvj.mov':
   Metadata:
 major_brand : qt
 minor_version   : 537199360
 compatible_brands: qt
 encoder : Lavf56.40.101
 Stream #0:0(eng): Video: mjpeg (jpeg / 0x6765706A), yuvj444p(pc),
 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 25 fps, 12800 tbn, 25 tbc
 (default)
 Metadata:
   creation_time   : 2013-12-11 15:38:12
   handler_name: Apple Alias-Datensteuerung
   timecode: 01:00:00:00
   encoder : Lavc56.55.100 mjpeg
 Stream mapping:
   Stream #0:0 -> #0:0 (mpeg2video (native) -> mjpeg (native))
 Press [q] to stop, [?] for help
 frame=   24 fps=0.0 q=24.8 size=2066kB time=00:00:00.96 bitrate=17626.5kbits/s
 frame=   25 fps=0.0 q=24.8 Lsize=2147kB time=00:00:01.00 bitrate=17589.0kbits/s
 video:2146kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB
 muxing overhead: 0.067178%

 but obviously the mjpeg encoder refuses the pixel format, although it should
 not do this if the deprecated pixel formats are really not to be used. Am I
 missing something, or is the mjpeg encoder simply not yet capable of working
 with color_range instead of the deprecated yuvj pixel formats?

Nobody yet sent patch to fix it.
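Until someone does, a possible workaround for just getting a full-range
mjpeg sample is to keep the deprecated pixel format but make the range
explicit on the scaler; an untested sketch, with placeholder file names:

ffmpeg -i input.mov -vf scale=in_range=tv:out_range=pc -pix_fmt yuvj444p -c:v mjpeg -an output.mov

This still relies on yuvj444p to signal full range, which is exactly what
the question tries to avoid, but at least the range conversion is then
explicit.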


 Thanks in advance,

 Robert


Re: [FFmpeg-user] ffmpeg overlay last frame remaining in video

2015-08-25 Thread Chad Horton
Thank you. I'm learning. What I set up for all the various parameters is
what I pieced together while trying to figure out how to create the
overlays: through trial and error, reading the documentation on
ffmpeg.org, and asking experts for guidance.

This particular ffmpeg build is used in an Android app I've developed.
It's very painful to constantly recompile ffmpeg via the Android NDK,
rebuild the APK, and push it out to the Google Play store.

> ... for your main issue, your overlay needs the eof_action option:
> overlay=eof_action=pass

Worked like a charm. Thank you!


> What is this -itsoffset for?

Users can set when they want to start the overlay video.

> Why use -c:v libx264 *and* -vcodec mpeg4? These are mutually exclusive,
> and mpeg4 is chosen as the encoder since it was declared later.

I have corrected this.

> Your main input is 3/1001. Why are you setting the output frame rate
> with -r 30?

I have removed this.

> Why do you use single pass with -b:v instead of -crf? See:
> https://trac.ffmpeg.org/wiki/Encode/H.264

Interesting. I'm definitely going to test with -crf.
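For reference, the reworked command now looks roughly like this (untested
sketch with the same placeholder file names: one encoder, no forced output
-r, eof_action on the overlay, and CRF instead of a fixed bitrate):

ffmpeg -y -i basevideo.mov -itsoffset 00:00:00.000 -i overlayvideo.mov \
  -filter_complex "[1:v]scale=1280:720[ovrl];[0:v][ovrl]overlay=eof_action=pass[outv];[0:a][1:a]amix[outa]" \
  -map "[outv]" -map "[outa]" -c:v libx264 -crf 23 -c:a aac -strict experimental finalvideo.mp4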







-Original Message-
From: ffmpeg-user [mailto:ffmpeg-user-boun...@ffmpeg.org] On Behalf Of Lou
Sent: Sunday, August 23, 2015 7:12 PM
To: ffmpeg-user@ffmpeg.org
Subject: Re: [FFmpeg-user] ffmpeg overlay last frame remaining in video

On Sun, Aug 23, 2015, at 04:50 PM, Chad Horton wrote:

 ffmpeg -y -i basevideo.mov -r 30 -itsoffset 00:00:00.000 -i
 overlayvideo.mov -filter_complex
 
[1:v]scale=1280:720[ovrl];[0:v][ovrl]overlay=0:0[outv];[0:a][1:a]amix[out
a]
 -map [outv] -map [outa] -c:v libx264 -vcodec mpeg4 -r 30 -strict
 experimental -b:v 150 finalvideo.mp4

Why do you place -r as an input option for overlayvideo.mov?

What is this -itsoffset for?

Why use -c:v libx264 *and* -vcodec mpeg4? These are mutually exclusive,
and mpeg4 is chosen as the encoder since it was declared later.

Your main input is 3/1001. Why are you setting the output frame rate
with -r 30?

Why do you use single pass with -b:v instead of -crf? See:
https://trac.ffmpeg.org/wiki/Encode/H.264

Anyway, for your main issue, your overlay needs the eof_action option:
overlay=eof_action=pass

See overlay filter docs for more info:
https://ffmpeg.org/ffmpeg-filters.html#overlay

 ffmpeg version N-70223-g7296716 Copyright (c) 2000-2015 the FFmpeg
 developers

This version is about 6 months old, which is considered geriatric; FFmpeg
development is very active.

Top-posting should be avoided on this mailing list.


[FFmpeg-user] Possible to set color_range on the command line for mjpeg?

2015-08-25 Thread Robert Krüger
Hi,

for the purpose of debugging something in the libraries and submitting a
corresponding bug report, I would like to specify the color_range of the
codec context on the command line, i.e. in this case I want to create an
mjpeg file with full range without using the deprecated yuvj pixel formats.

I tried this here:

ffmpeg -i
~/lesspain/samples/software/fcp_7/xdcam_ex/fcp7_xdcam_ex_1080_25p_1s.mov
-vf scale=out_range=full -c:v mjpeg -an -pix_fmt yuv444p
mjpeg-fullrange-ffmpeg-noyuvj.mov
ffmpeg version N-74034-gce46627 Copyright (c) 2000-2015 the FFmpeg
developers
  built with Apple LLVM version 6.1.0 (clang-602.0.49) (based on LLVM
3.6.0svn)
  configuration: --enable-gpl --enable-libx264
  libavutil      54. 29.100 / 54. 29.100
  libavcodec     56. 55.100 / 56. 55.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 29.100 /  5. 29.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fddd181bc00] nclc: pri 1 trc 1 matrix 1
Guessed Channel Layout for Input Stream #0.1 : stereo
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from
'/Users/krueger/lesspain/samples/software/fcp_7/xdcam_ex/fcp7_xdcam_ex_1080_25p_1s.mov':
  Metadata:
major_brand : qt
minor_version   : 537199360
compatible_brands: qt
creation_time   : 2013-12-11 15:38:12
  Duration: 00:00:01.00, start: 0.00, bitrate: 25365 kb/s
Stream #0:0(eng): Video: mpeg2video (Main) (xdve / 0x65766478),
yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 23773 kb/s, 25 fps, 25
tbr, 25 tbn, 50 tbc (default)
Metadata:
  creation_time   : 2013-12-11 15:38:12
  handler_name: Apple Alias-Datensteuerung
  encoder : XDCAM EX 1080p25 (35 Mb/s VBR)
  timecode: 01:00:00:00
Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, 2
channels, s16, 1536 kb/s (default)
Metadata:
  creation_time   : 2013-12-11 15:38:12
  handler_name: Apple Alias-Datensteuerung
Stream #0:2(eng): Data: none (tmcd / 0x64636D74), 0 kb/s (default)
Metadata:
  creation_time   : 2013-12-11 15:38:16
  handler_name: Apple Alias-Datensteuerung
  timecode: 01:00:00:00
Incompatible pixel format 'yuv444p' for codec 'mjpeg', auto-selecting
format 'yuvj444p'
[swscaler @ 0x7fddd104a000] deprecated pixel format used, make sure you did
set range correctly
Output #0, mov, to 'mjpeg-fullrange-ffmpeg-noyuvj.mov':
  Metadata:
major_brand : qt
minor_version   : 537199360
compatible_brands: qt
encoder : Lavf56.40.101
Stream #0:0(eng): Video: mjpeg (jpeg / 0x6765706A), yuvj444p(pc),
1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 25 fps, 12800 tbn, 25 tbc
(default)
Metadata:
  creation_time   : 2013-12-11 15:38:12
  handler_name: Apple Alias-Datensteuerung
  timecode: 01:00:00:00
  encoder : Lavc56.55.100 mjpeg
Stream mapping:
  Stream #0:0 -> #0:0 (mpeg2video (native) -> mjpeg (native))
Press [q] to stop, [?] for help
frame=   24 fps=0.0 q=24.8 size=2066kB time=00:00:00.96 bitrate=17626.5kbits/s
frame=   25 fps=0.0 q=24.8 Lsize=2147kB time=00:00:01.00 bitrate=17589.0kbits/s
video:2146kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB
muxing overhead: 0.067178%

but obviously the mjpeg encoder refuses the pixel format, although it should
not do this if the deprecated pixel formats are really not to be used. Am I
missing something, or is the mjpeg encoder simply not yet capable of working
with color_range instead of the deprecated yuvj pixel formats?

Thanks in advance,

Robert


[FFmpeg-user] overlay AND concatenate with ffmpeg - merging to one command?

2015-08-25 Thread Chad Horton
I have an Android app where users record a video. I then add an overlay
on the video and then concatenate a 4-second video to the end of it.

I'm currently using two executions, as provided below. The second
(concatenate) execution TAKES FOREVER. Well over 2 minutes.

1.) is there a much faster way to do this?
2.) is there a way to execute this in a single ffmpeg command instead of
two?

ffmpeg -y 
-i baselinevideo.mp4
-itsoffset 00:00:0.
-i overlayvideo.mp4
-filter_complex 
[1:v]scale=1280:1024[ovrl];[0:v][ovrl]overlay=eof_action=pass[outv];[0:a][
1:a]amix[outa]
-map [outv]
-map [outa]
-vcodec mpeg4
-strict experimental
-crf
finalvideo-temp.mp4

ffmpeg -y
-i finalvideo-temp.mp4
-i concatvideo.mp4
-filter_complex [0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]
-map [v]
-map [a]
-strict experimental
-crf
finalvideo.mp4
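On question 2.), the overlay and the concat can in principle be combined
into one filter graph, along these lines (an untested sketch reusing the
placeholder file names above; concatvideo.mp4 needs to match the overlaid
video in resolution, so scale it first if it does not, and everything is
still re-encoded once, so this is not automatically faster):

ffmpeg -y -i baselinevideo.mp4 -i overlayvideo.mp4 -i concatvideo.mp4 \
  -filter_complex "[1:v]scale=1280:1024[ovrl];[0:v][ovrl]overlay=eof_action=pass[main];[0:a][1:a]amix[maina];[main][maina][2:v][2:a]concat=n=2:v=1:a=1[v][a]" \
  -map "[v]" -map "[a]" -vcodec mpeg4 -strict experimental finalvideo.mp4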



Re: [FFmpeg-user] Not using all resources

2015-08-25 Thread Henk D. Schoneveld

 On 25 Aug 2015, at 11:34, Carl Eugen Hoyos ceho...@ag.or.at wrote:
 
 Henk D. Schoneveld belcampo at zonnet.nl writes:
 
 Command line and complete, uncut console output missing.
 
 They were in the screenshots, those screenshots were 
 png’s made under OSX in X-terminal
 
 This is not welcome here, instead copy & paste the 
 command line that allows to reproduce your issue 
 together with the complete, uncut console output.
 
 You don't have to paste the output of top, we believe 
 you if you write that you observe x% cpu usage.
This one is the one that loads the CPU to almost max
arte.sh 305 4092.96
ffmpeg version N-74320-g8015150 Copyright (c) 2000-2015 the FFmpeg developers
  built with gcc 4.8.2 (GCC)
  configuration: --prefix=/usr --enable-libfdk_aac --enable-libx264 
--enable-gpl --enable-nonfree
  libavutil      54. 30.100 / 54. 30.100
  libavcodec     56. 57.100 / 56. 57.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 32.100 /  5. 32.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
Last message repeated 2 times
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
Last message repeated 2 times
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
Last message repeated 1 times
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
Last message repeated 1 times
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[mp3 @ 0x3747320] Header missing
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 0x369d320] decode_slice_header error
[h264 @ 0x369d320] no frame!
[h264 @ 0x369d320] non-existing PPS 0 referenced
Last message repeated 1 times
[h264 @ 

Re: [FFmpeg-user] overlay AND concatenate with ffmpeg - merging to one command?

2015-08-25 Thread Chad Horton
Correction on my inputs for both executions:

ffmpeg -y -i baselinevideo.mp4 -itsoffset 00:00:0. -i overlayvideo.mp4
-filter_complex
"[1:v]scale=1280:720[ovrl];[0:v][ovrl]overlay=eof_action=pass[outv];[0:a][1:a]amix[outa]"
-map [outv] -map [outa] -vcodec mpeg4 -strict experimental
finalvideo-temp.mp4

ffmpeg -y -i finalvideo-temp.mp4 -i concatvideo.mp4 -filter_complex
"[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]" -map [v] -map [a]
-vcodec mpeg4 -strict experimental finalvideo.mp4






On 8/25/15, 9:32 AM, ffmpeg-user on behalf of Chad Horton
ffmpeg-user-boun...@ffmpeg.org on behalf of
chor...@hotsalsainteractive.com wrote:

I have an android app where users record a video.  I then add an overlay
on the video and then concatenate a 4 second video to the end of the
video.

I'm currently using two executions, as provided below.  The second
(concatenate) execution TAKES FOREVER.  Well over 2 minutes.

1.) is there a much faster way to do this?
2.) is there a way to execute this in a single ffmpeg command instead of
two?

ffmpeg -y 
   -i baselinevideo.mp4
   -itsoffset 00:00:0.
   -i overlayvideo.mp4
   -filter_complex 
[1:v]scale=1280:1024[ovrl];[0:v][ovrl]overlay=eof_action=pass[outv];[0:a]
[
1:a]amix[outa]
   -map [outv]
   -map [outa]
   -vcodec mpeg4
   -strict experimental
   -crf
   finalvideo-temp.mp4

ffmpeg -y
   -i finalvideo-temp.mp4
   -i concatvideo.mp4
   -filter_complex [0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]
   -map [v]
   -map [a]
   -strict experimental
   -crf
   finalvideo.mp4



Re: [FFmpeg-user] DNXHD coding fails in current GIT-Head

2015-08-25 Thread Carl Eugen Hoyos
Rens Dijkshoorn rens at offlinemedia.nl writes:

 ffmpeg -i test.mov -pix_fmt yuv422p -vcodec dnxhd 
 -b:v 120M -c:a pcm_s16le test_dnxhd.mov

The following works fine here:
$ ffmpeg -f lavfi -i testsrc=s=hd1080 -f lavfi -i sine 
-pix_fmt yuv422p -vcodec dnxhd -b:v 120M -c:a pcm_s16le out.mov

Please provide your input file.

Carl Eugen



Re: [FFmpeg-user] Not using all resources

2015-08-25 Thread Henk D. Schoneveld

 On 25 Aug 2015, at 10:14, Henk D. Schoneveld belca...@zonnet.nl wrote:
 
 
 On 24 Aug 2015, at 23:14, Carl Eugen Hoyos ceho...@ag.or.at wrote:
 
 Henk D. Schoneveld belcampo at zonnet.nl writes:
 
 I’m encoding from several DVB-S (satellite) sources, always with the
 same parameters, but for recordings from some sources the CPU load is
 about 50% of what it could be, while others use the maximum available,
 as the attachments show.
 
 Command line and complete, uncut console output missing.
 They were in the screenshots, those screenshots were png’s made under OSX in 
 X-terminal
Another try:
top - 22:22:32 up 15 days, 21:00,  2 users,  load average: 7.04, 2.09, 0.75
Tasks: 173 total,   2 running, 171 sleeping,   0 stopped,   0 zombie
Cpu(s): 27.6%us,  0.7%sy, 59.7%ni, 11.9%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   3976112k total,  3863768k used,   112344k free,  424k buffers
Swap:0k total,0k used,0k free,   977788k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
17450 belcampo  20   0 1293m 246m 5048 R  701  6.4   6:47.10 ffmpeg
12682 belcampo  20   0 2790m 1.3g 1.2g S    2 33.5 403:11.84 VBoxHeadless
12543 belcampo  20   0 1648m 442m 420m S    1 11.4 212:31.19 VBoxHeadless
12824 belcampo  20   0 1394m 163m 146m S    1  4.2 157:57.99 VBoxHeadless
  320 root       0 -20     0    0    0 S    0  0.0   3:53.56 kworker/7:1H
12401 belcampo  20   0  554m 3640  348 S    0  0.1  21:50.01 VBoxSVC
26078 plex      20   0  325m  18m  584 S    0  0.5   3:22.19 Plex DLNA Serve
    1 root      20   0 46412 2228  152 S    0  0.1   0:00.53 systemd
    2 root      20   0     0    0    0 S    0  0.0   0:00.03 kthreadd
    3 root      20   0     0    0    0 S    0  0.0   0:12.87 ksoftirqd/0
    5 root       0 -20     0    0    0 S    0  0.0   0:00.00 kworker/0:0H
    7 root      20   0     0    0    0 S    0  0.0   2:11.80 rcu_sched
    8 root      20   0     0    0    0 S    0  0.0   0:00.00 rcu_bh
    9 root      RT   0     0    0    0 S    0  0.0   0:00.19 migration/0
   10 root      RT   0     0    0    0 S    0  0.0   0:00.32 migration/1
   11 root      20   0     0    0    0 S    0  0.0   0:02.22 ksoftirqd/1
   13 root       0 -20     0    0    0 S    0  0.0   0:00.00 kworker/1:0H
   14 root      RT   0     0    0    0 S    0  0.0   0:00.20 migration/2
   15 root      20   0     0    0    0 S    0  0.0   0:02.21 ksoftirqd/2
   17 root       0 -20     0    0    0 S    0  0.0   0:00.00 kworker/2:0H
   18 root      RT   0     0    0    0 S    0  0.0   0:00.20 migration/3
   19 root      20   0     0    0    0 S    0  0.0   0:02.23 ksoftirqd/3
   21 root       0 -20     0    0    0 S    0  0.0   0:00.00 kworker/3:0H
[belcampo@base4 ~]$ ps ax|grep ffmp
17450 pts/0    Sl+    7:41 ffmpeg -y -ss 305 -i base.ts -t 4092.96 -c:v
libx264 -crf 22 -preset veryfast -crf 22 -g 50 -me_method umh -c:a libfdk_aac
-ab 128k -ac 2 -r 25 -map 0:0 -map 0:4 base.mp4

With the same command line, encoding another source file leaves the CPU
idle at about 60%.
 
 Please do not attach screenshots unless there is a 
 very good reason.
 I thought/think I had a good reason.
 
 Carl Eugen


Re: [FFmpeg-user] Not using all resources

2015-08-25 Thread Henk D. Schoneveld

 On 24 Aug 2015, at 23:14, Carl Eugen Hoyos ceho...@ag.or.at wrote:
 
 Henk D. Schoneveld belcampo at zonnet.nl writes:
 
 I’m encoding from several DVB-S (satellite) sources, always with the
 same parameters, but for recordings from some sources the CPU load is
 about 50% of what it could be, while others use the maximum available,
 as the attachments show.
 
 Command line and complete, uncut console output missing.
They were in the screenshots, those screenshots were png’s made under OSX in 
X-terminal
 
 Please do not attach screenshots unless there is a 
 very good reason.
I thought/think I had a good reason.
 
 Carl Eugen


Re: [FFmpeg-user] relocation R_X86_64_PC32 against symbol `ff_w1111' can not be used when making a shared object; recompile with -fPIC

2015-08-25 Thread Carl Eugen Hoyos
Gabriel Pettier gabriel.pettier at gmail.com writes:

 I tried a lot of different ones, here are a few (by order)

Did you run make distclean after each try?
What version of FFmpeg is this?

I was unable to reproduce your issue with the following 
command:

$ ./configure && make libswscale/libswscale.a
$ gcc libswscale/libswscale.a -shared -o out.so
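If the goal is to link the static libswscale.a into a shared object, the
usual fix for this relocation error is to build FFmpeg as
position-independent code, for example (untested here):

$ make distclean
$ ./configure --enable-pic && make

or configure with --enable-shared and link against the shared libraries
instead.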

Carl Eugen



Re: [FFmpeg-user] Grab stills on scene change math question

2015-08-25 Thread Moritz Barsnick
Hi Bouke,

On Tue, Aug 25, 2015 at 13:00:40 +0200, Bouke (VideoToolShed) wrote:
 assuming that gt(scene\,0.4) will return a number that I could do math on, 
 but this fails.

ffmpeg's expressions give the select filter a true/false (take or don't
take this frame) decision. In this case: whether scene detection shows a
change of 0.4 or more, in principle.

 And to make it more complex, can I also grab a shot out of the middle
 of a scene? (that would require to get the current and next
 framenumber, subtract and divide by two to an integer and add to the
 first...)

I wish I knew. I don't think such a filter exists. Part of the issue is
that ffmpeg would need to look ahead, and buffer a lot of data (which
it does for other filters as well).

What you could do is to parse the video for scene changes first,
calculate the images you want, and present them to the select filter in
the second run.
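A first pass to get the scene-change frame numbers could look roughly
like this (a sketch; the 0.4 threshold and file name are only examples).
The showinfo lines on stderr contain the frame number in their n: field:

ffmpeg -i video.mkv -vf "select=gt(scene\,0.4),showinfo" -f null - 2>&1 | grep showinfo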

Moritz


Re: [FFmpeg-user] Grab stills on scene change math question

2015-08-25 Thread Moritz Barsnick
On Tue, Aug 25, 2015 at 14:22:31 +0200, Bouke (VideoToolShed) wrote:
  calculate the images you want, and present them to the select filter in
  the second run.
 
 But I'm clueless on how to do that
 (except restarting FFmpeg for each frame with a -ss).
 Btw, is there a -ss option that lets me input frames instead of time?

Assuming you can get the scene detection filter to print you the frame
numbers, and they turn out to be (e.g.) 0 22 134 157 222 331 413, you
can do this shell magic (line broken with '\' for readability):

FRAMESPEC=""; \
for f in 0 22 134 157 222 331 413; do \
  if [ "$FRAMESPEC" = "" ]; then FRAMESPEC="eq(n\,${f})"; else \
    FRAMESPEC="${FRAMESPEC}+eq(n\,${f})"; fi; \
done; \
ffmpeg -i video.mkv -vf select="$FRAMESPEC" -vsync 0 video.%04d.jpg

(The if term is a bit crude and can be improved by some ${?bla}
construct. I didn't bother.)

In other words: Give the select filter an expression which evaluates to
!= 0 on the correct frames (frame number 'n'). That's a logical OR
expression, achieved in ffmpeg by adding the eq() expression terms.
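With the example frame numbers above, that select expression ends up as:

select=eq(n\,0)+eq(n\,22)+eq(n\,134)+eq(n\,157)+eq(n\,222)+eq(n\,331)+eq(n\,413)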

The -vsync was added so that the image2 muxer doesn't blow the
resulting stream back up to 25 fps, but only passes through the frames
from the filter one by one.

(The %04d does _not_ represent 'n', as the muxer has no knowledge of
that - it just counts upward. You shouldn't mind.)

I verified this with a video where each frame was overlayed with its
number (drawtext filter with text=%{n}), and by selecting those
frames with my script and checking whether the correct ones were
extracted.
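That numbered test clip can be generated with something along these lines
(a sketch, assuming a build with libfreetype/fontconfig so drawtext finds
a default font):

ffmpeg -f lavfi -i testsrc=d=30:r=25 -vf "drawtext=text=%{n}:x=10:y=10:fontsize=48:fontcolor=white" numbered.mkv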

Try that,
Moritz

P.S.: The whole thing will break once the Unix command line or the
amount of command line bytes to be evaluated by ffmpeg get too long.
;-)


Re: [FFmpeg-user] Grab stills on scene change math question

2015-08-25 Thread Bouke (VideoToolShed)

Hi Moritz,
Thanks!


Hi Bouke,

On Tue, Aug 25, 2015 at 13:00:40 +0200, Bouke (VideoToolShed) wrote:
assuming that gt(scene\,0.4) will return a number that I could do math 
on,

but this fails.


ffmpeg's expressions give the select filter a true/false (take or don't
take this frame) decision. In this case: scene detection shows a change
of 0.4 or more, principally.


Ah, that makes sense!


And to make it more complex, can I also grab a shot out of the middle
of a scene? (that would require to get the current and next
framenumber, subtract and divide by two to an integer and add to the
first...)


I wish I knew. I don't think such a filter exists. Part of the issue is
that ffmpeg would need to look ahead, and buffer a lot of data (which
it does for other filters as well).


Well, in theory it should only have to remember the last frame number...
But this is academic, as your solution is way easier.


What you could do is to parse the video for scene changes first,


Ok, this I can do.


calculate the images you want, and present them to the select filter in
the second run.


But I'm clueless on how to do that
(except restarting FFmpeg for each frame with a -ss).
Btw, is there a -ss option that lets me input frames instead of time?

Thanks,
Bouke


Moritz


Re: [FFmpeg-user] Unable to get udp multicast stream

2015-08-25 Thread Lucas da Vila
I've tried with VLC but I was not able to get the multicast on my machine.
On the server I receive the multicast (I saw it with tcpdump), but it's like
ffmpeg doesn't see it.
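One thing still worth trying here might be to join the group on a specific
interface via the udp localaddr option (a sketch; the address is a
placeholder for the interface that actually receives the multicast):

ffmpeg -v 9 -loglevel 99 -re -i "udp://239.255.0.1:56000?localaddr=<interface-ip>"

The udp protocol also takes a fifo_size option, but binding to the right
interface is usually the first thing to check.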

On Wed, Aug 19, 2015 at 5:41 PM, Lucas da Vila lucas8...@gmail.com wrote:

 I've waited like 15 minutes and it's still in the same state

 ffmpeg -v 9 -loglevel 99 -re -i udp://@239.255.0.1:56000
 ffmpeg version N-74369-g55a07cf Copyright (c) 2000-2015 the FFmpeg
 developers
   built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
   configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg
 --enable-avresample --disable-debug --enable-nonfree --enable-gpl
 --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb
 --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse
 --enable-libdcadec --enable-libfreetype --enable-libx264 --enable-libx265
 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus
 --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth
 --enable-libsoxr --enable-libxvid --enable-libvo-aacenc --enable-libvidstab
   libavutil      54. 30.100 / 54. 30.100
   libavcodec     56. 57.100 / 56. 57.100
   libavformat    56. 40.101 / 56. 40.101
   libavdevice    56.  4.100 / 56.  4.100
   libavfilter     5. 32.100 /  5. 32.100
   libavresample   2.  1.  0 /  2.  1.  0
   libswscale      3.  1.101 /  3.  1.101
   libswresample   1.  2.101 /  1.  2.101
   libpostproc    53.  3.100 / 53.  3.100
 Splitting the commandline.
 Reading option '-v' ... matched as option 'v' (set logging level) with
 argument '9'.
 Reading option '-loglevel' ... matched as option 'loglevel' (set logging
 level) with argument '99'.
 Reading option '-re' ... matched as option 're' (read input at native
 frame rate) with argument '1'.
 Reading option '-i' ... matched as input file with argument 'udp://@
 239.255.0.1:56000'.
 Finished splitting the commandline.
 Parsing a group of options: global .
 Applying option v (set logging level) with argument 9.
 Successfully parsed a group of options.
 Parsing a group of options: input file udp://@239.255.0.1:56000.
 Applying option re (read input at native frame rate) with argument 1.
 Successfully parsed a group of options.
 Opening an input file: udp://@239.255.0.1:56000.
 [udp @ 0x2b6bc00] end receive buffer size reported is 131072

 When I hit ctrl-c appear this line
 [AVIOContext @ 0x2134100] Statistics: 0 bytes read, 0 seeks
 udp://239.255.0.1:56000: Immediate exit requested
 Exiting normally, received signal 2.


 Maybe tomorrow I will be able to test on my own computer with vlc.
 Over ssh the only thing I can do is this, really don't know if it helps
 cvlc udp://239.255.0.7:1002
 VLC media player 2.1.6 Rincewind (revision 2.1.6-0-gea01d28)
 [0x1118238] main interface error: no suitable interface module
 [0x1052118] main libvlc error: interface globalhotkeys,none
 initialization failed
 [0x1118238] dbus interface error: Failed to connect to the D-Bus session
 daemon: Unable to autolaunch a dbus-daemon without a $DISPLAY for X11
 [0x1118238] main interface error: no suitable interface module
 [0x1052118] main libvlc error: interface dbus,none initialization failed
 [0x1117b38] dummy interface: using the dummy interface module...
 ^C[0x7f22b0001998] main stream error: cannot pre fill buffer


 mplayer udp://239.255.0.7:1002
 MPlayer SVN-r37401 (C) 2000-2012 MPlayer Team
 mplayer: could not connect to socket
 mplayer: No such file or directory
 Failed to open LIRC support. You will not be able to use your remote
 control.

 Playing udp://239.255.0.7:1002.
 STREAM_UDP, URL: udp://239.255.0.7:1002
 Failed to connect to server
 udp_streaming_start failed
 No stream found to handle url udp://239.255.0.7:1002


 Exiting... (End of file)

 Thanks for the advice with aac :)
 Lucas


 On Wed, Aug 19, 2015 at 3:07 PM, Carl Eugen Hoyos ceho...@ag.or.at
 wrote:

 Lucas da Vila lucas8666 at gmail.com writes:

  ffmpeg -v 9 -loglevel 99 -re -i udp:// at 239.255.0.1:56000

 What happens if you just specify the above (and wait)?
 Which application does work? (vlc, MPlayer?)

  -c:v copy -c:a:0 libvo_aacenc

 This is the worst of several existing aac encoders,
 please don't use it.

 Carl Eugen



[FFmpeg-user] How to ffprobe the stream generated by ffmpeg

2015-08-25 Thread xiong xu
Hi,

I want to analyze the RTP H264 params with ffprobe,
but ffprobe complains about "Unable to receive RTP payload type 122
without an SDP file describing it".

Can someone help?

-Xiong

Here's the ffprobe log:

D:\tools\ffmpeg-20150818-git-737aa90-win64-static\bin>ffprobe
-show_streams rtp://10.17.41.163:45900
ffprobe version N-74462-g737aa90 Copyright (c) 2007-2015 the FFmpeg developers
  built with gcc 4.9.3 (GCC)
  configuration: --enable-gpl --enable-version3 --disable-w32threads
--enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r
--enable-gnutls --enable-iconv --enable-libass --enable-libbluray
--enable-libbs2b --enable-libcaca --enable-libdcadec
--enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc
--enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb
--enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus
--enable-librtmp --enable-libschroedinger --enable-libsoxr
--enable-libspeex --enable-libtheora --enable-libtwolame
--enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc
--enable-libvorbis --enable-libvpx --enable-libwavpack
--enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs
--enable-libxvid --enable-lzma --enable-decklink --enable-zlib
  libavutil      54. 30.100 / 54. 30.100
  libavcodec     56. 57.100 / 56. 57.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 34.100 /  5. 34.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
[rtp @ 04f2bf60] Unable to receive RTP payload type 122
without an SDP file describing it
Input #0, rtp, from 'rtp://10.17.41.163:45900':
  Duration: N/A, bitrate: N/A



Here's the ffmpeg log generating RTP stream:

pi@raspberrypi ~/p2p-sip/src $ ffmpeg -f video4linux2 -i /dev/video0
-vcodec h264 -b 9 -payload_type 122 -s 320*240 -r 20 -profile:v
baseline -level 1.2 -f rtp rtp://10.17.41.163:45900
ffmpeg version N-74455-g3afca32 Copyright (c) 2000-2015 the FFmpeg developers
  built with gcc 4.6 (Debian 4.6.3-14+rpi1)
  configuration: --arch=armel --target-os=linux --enable-gpl
--enable-libx264 --enable-nonfree
  libavutil      54. 30.100 / 54. 30.100
  libavcodec     56. 57.100 / 56. 57.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 33.100 /  5. 33.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 1440553547.642477, bitrate: 752025 kb/s
Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p,
1920x1088, 752025 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Please use -b:a or -b:v, -b is ambiguous
[libx264 @ 0x289efc0] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x289efc0] profile Constrained Baseline, level 1.2
Output #0, rtp, to 'rtp://10.17.41.163:45900':
  Metadata:
encoder : Lavf56.40.101
Stream #0:0: Video: h264 (libx264), yuv420p, 320x240, q=-1--1, 90
kb/s, 20 fps, 90k tbn, 20 tbc
Metadata:
  encoder : Lavc56.57.100 libx264
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 10.17.41.163
t=0 0
a=tool:libavformat 56.40.101
m=video 45900 RTP/AVP 122
b=AS:90
a=rtpmap:122 H264/9
a=fmtp:122 packetization-mode=1

Press [q] to stop, [?] for help
Past duration 0.653969 too large
frame=14219 fps= 20 q=-1.0 Lsize=7989kB time=00:11:50.95 bitrate=
92.1kbits/s dup=9814 drop=22
video:7815kB audio:0kB subtitle:0kB other streams:0kB global
headers:0kB muxing overhead: 2.227252%
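Since ffmpeg already prints the SDP it generates (the v=0 ... a=fmtp:122
block above), one approach that should work is to save that block to a
file and point ffprobe at the file instead of the raw rtp:// URL (sketch,
file name is arbitrary):

ffprobe -show_streams stream.sdp

ffprobe then knows from the SDP that payload type 122 is H264 and can
receive the stream it describes.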


[FFmpeg-user] Multiple output files from one -map?

2015-08-25 Thread Jonathan Viney
Hi,

Here's my ffmpeg command:

ffmpeg -i input1.mkv -i input2.mkv -filter_complex
myfilter[output1][output2] -map [output1] -f null /dev/null -map
[output2] -y high.mp4 medium.mp4 low.mp4

myfilter is a custom filter that uses dualinput and gives two outputs.

high.mp4 is encoded correctly with the contents of output2, but
medium.mp4 and low.mp4 are copies of input1.mkv. Is there a way to
encode multiple copies of output2 using -map? Specifying -map
[output2] more than once causes errors saying it's already been
used.

Inserting a split=3 filter into the filter chain works fine and allows
each output to be mapped individually:

ffmpeg -i input1.mkv -i input2.mkv -filter_complex
myfilter[output1][output2];[output2]split=3[one][two][three] -map
[output1] -f null /dev/null -map [one] high.mp4 -map [two]
medium.mp4 -map [three] low.mp4

Is that the best way to do this or can -map be used somehow? Would the
split filter potentially be doing extra frame copies unnecessarily?
I'm using current ffmpeg head.

Thanks,
-Jonathan.