Re: [FFmpeg-user] How can I recode the movie in a way that I can step forward frame by frame?
I assume the movie is encoded using IPB frames, so I can't step from any frame to the previous/next frame, only from I-frame to I-frame. Therefore I'd like to convert the movie to an I-frame-only format. I don't want to convert it to a series of still pictures like BMP or JPG.

On 21.10.2017 at 17:50, Moritz Barsnick wrote:
> On Sat, Oct 21, 2017 at 17:44:01 +0200, LaHu wrote:
>> I like to watch movies in detail. In order to check a movie for logical errors I watch some scenes in slow motion or even step through them forwards and backwards. I can't really step +/-1 frame; sometimes it jumps several seconds ahead.
> Nice info. But that neither tells me whether my suggested command line suits your needs and whether you tried it, nor does it answer any of my questions. Is everything else fine now?
> Moritz

___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
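One common way to get an I-frame-only file is to re-encode with a GOP length of 1, which forces every frame to be a keyframe. A sketch under assumptions (input.ts, output.mp4, and the CRF value are placeholders; this was not part of the thread):

```shell
# Re-encode so every frame is an I-frame (GOP size 1).
# Stepping/seeking then never depends on neighbouring frames.
# Expect a much larger output file than the original.
ffmpeg -i input.ts -c:v libx264 -g 1 -keyint_min 1 -crf 18 -c:a copy output.mp4
```

With `-g 1` every frame is coded intra, so any player that supports frame stepping can move exactly one frame in either direction.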
Re: [FFmpeg-user] How can I recode the movie in a way that I can step forward frame by frame?
I like to watch movies in detail. In order to check a movie for logical errors I watch some scenes in slow motion or even step through them forwards and backwards. I can't really step +/-1 frame; sometimes it jumps several seconds ahead.

On 21.10.2017 at 15:02, Moritz Barsnick wrote:
> On Sat, Oct 21, 2017 at 13:38:02 +0200, LaHu wrote:
>> I have a recording of a satellite feed. I want to convert it to full frames so I can step through the movie frame by frame. How would I do that using ffmpeg? Thanks for helping!
> Several players are capable of stepping frame by frame. Certainly mplayer, likely its derivatives such as mpv, and also VLC can do that, IIRC. What were you considering using for viewing frame by frame?
> I also don't understand what you mean by "convert it to full frames". You can't mean deinterlacing; your material doesn't seem to be interlaced. Do you mean "single pictures", i.e. a separate image file for each frame? If so, check the image2 muxer, or just use:
> $ ffmpeg -i input out%06d.jpg
> Cheers, Moritz

___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
[FFmpeg-user] How can I recode the movie in a way that I can step forward frame by frame?
Hi,

I have a recording of a satellite feed. I want to convert it to full frames so I can step through the movie frame by frame. How would I do that using ffmpeg? Thanks for helping!

This is the result of ffprobe:

ffprobe version N-82324-g872b358 Copyright (c) 2007-2016 the FFmpeg developers
built with gcc 5.4.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-libebur128 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
libavutil      55. 36.100 / 55. 36.100
libavcodec     57. 66.101 / 57. 66.101
libavformat    57. 57.100 / 57. 57.100
libavdevice    57.  2.100 / 57.  2.100
libavfilter     6. 66.100 /  6. 66.100
libswscale      4.  3.100 /  4.  3.100
libswresample   2.  4.100 /  2.  4.100
libpostproc    54.  2.100 / 54.  2.100
[h264 @ 009ee400] SPS unavailable in decode_picture_timing
[h264 @ 009ee400] non-existing PPS 0 referenced
[h264 @ 009ee400] SPS unavailable in decode_picture_timing
[h264 @ 009ee400] non-existing PPS 0 referenced
[h264 @ 009ee400] decode_slice_header error
[h264 @ 009ee400] no frame!
[the same h264 error block repeats for the remainder of the log; output truncated]
[FFmpeg-user] Capturing without break
Hi,

I'd like to know if there's a way to keep recording even when the source disappears for a while.

Reason: I plan to record several sources at a time as independent recordings. Source A will become Video A, Source B will become Video B, and so on. Source A will be the "master", and the duration of its recording is the reference for all other recordings. If Source A lasts 30 minutes, then all other recordings should last 30 minutes as well, no matter whether their sources deliver valid signals or not.

It is not strictly necessary that all recordings have exactly the same duration. If there's a loss of signal and the recording stops, it is good enough to automatically start a second recording, provided the gap between the end of the first recording and the start of the next is no longer than one second. I would then use a script to put both streams together.

Thanks in advance!

___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
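One way to approximate the "restart within a second and join later" idea described above is a shell loop that relaunches ffmpeg whenever it exits on signal loss, numbering the segments, plus a lossless join with the concat demuxer afterwards. A sketch, not a tested pipeline; $SOURCE_URL and the segment names are placeholders:

```shell
#!/bin/sh
# Relaunch the recorder every time it exits (e.g. on loss of signal),
# writing a new numbered segment each time. Stop the loop manually.
i=0
while true; do
  i=$((i + 1))
  ffmpeg -i "$SOURCE_URL" -c copy "segment_$i.ts"
  sleep 0.2   # keep the gap between consecutive recordings short
done

# Afterwards, list the segments and join them without re-encoding:
#   for f in segment_*.ts; do echo "file '$f'"; done > list.txt
#   ffmpeg -f concat -safe 0 -i list.txt -c copy joined.ts
```

Note this leaves real gaps where the signal was absent; if the recordings must stay time-aligned with the master, the missing time would still have to be filled (e.g. with generated black video) before joining.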
Re: [FFmpeg-user] How can I use a certain font and fontsize when using drawtext with the parameter textfile?
Sorry - I just noticed the mistake. My fault, please ignore my original question. All works fine. ___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
[FFmpeg-user] How can I use a certain font and fontsize when using drawtext with the parameter textfile?
Hi,

The command looks like this:

ffmpeg.exe -i desk2.avi -vf drawtext=enable='between(t,0,5)':fontfile='C\:\\Windows\\Fonts\\arial.ttf':box=1:boxcolor=black:fontcolor=white:x=(w-tw)/2:y=h-th:textfile=sampletext.txt result.mp4

It works, except that font and fontsize stay the same no matter what parameters I enter. When I use text instead of textfile, it works as expected. How can I select another font and fontsize when overlaying text from a file? Can I store text in a variable and then refer to that variable instead of using text="abc" or textfile=sampletext.txt?

Thanks for helping, LaHu

___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
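For reference, the drawtext filter does accept a fontsize option alongside textfile, so one thing to try is adding it explicitly and quoting the whole filter string so the shell does not split it. A sketch (fontsize=48 is an arbitrary value; quoting may need adjusting on Windows cmd):

```shell
# Same overlay as above, with an explicit fontsize added and the
# entire -vf argument quoted as one string.
ffmpeg -i desk2.avi -vf "drawtext=enable='between(t,0,5)':fontfile='C\:\\Windows\\Fonts\\arial.ttf':fontsize=48:box=1:boxcolor=black:fontcolor=white:x=(w-tw)/2:y=h-th:textfile=sampletext.txt" result.mp4
```

If fontsize is still ignored, checking ffmpeg's console output for a warning about an unrecognized or misparsed option usually reveals where the filter string was cut.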
[FFmpeg-user] how to combine 2x stereo.wav to 1x stereo.wav while changing the panorama?
Hi,

I have two different .wav files with different durations.
File a: English narration on channel 1, effects on channel 2
File b: French narration on channel 1, original sound on channel 2

The result should be one stereo .wav file:
English narration and effects mixed together into the left channel
French narration and original sound mixed together into the right channel

I only know how to do one part:

ffmpeg -i engl_fx.wav -af pan=stereo:c0=c0+c1 engl.wav

How could I do it all in one step, regardless of the duration of the files?

Thanks for suggestions! LaHu

___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
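One possible single-step approach (a sketch; the second file name fr_orig.wav and the 0.5 mixing gains are assumptions, not from the thread) is to downmix each input to mono with pan and then join the two mono streams into one stereo output:

```shell
# Left channel:  English narration + effects (input 0, both channels mixed).
# Right channel: French narration + original sound (input 1, both channels mixed).
ffmpeg -i engl_fx.wav -i fr_orig.wav -filter_complex \
  "[0:a]pan=mono|c0=0.5*c0+0.5*c1[left]; \
   [1:a]pan=mono|c0=0.5*c0+0.5*c1[right]; \
   [left][right]join=inputs=2:channel_layout=stereo[out]" \
  -map "[out]" combined.wav
```

The 0.5 gains halve each source before mixing to avoid clipping; c0+c1 as in the original command works too if the sources are quiet enough. Since the inputs differ in duration, one side may end early; padding the shorter input with apad (or trimming with -t) is one way to equalize the lengths.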