[FFmpeg-user] testing rtmp input
Hi,

What would be the best way to test whether an RTMP stream is present? I'm reading an RTMP stream as input and creating an MP4 as output, but the stream isn't always present. I was thinking of doing a quick test for stream presence, maybe reading 5 seconds of input as a test prior to attempting the output.

thanks
Ricardo

___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user
To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
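One way to do the presence test, sketched here rather than taken from the thread: run a short ffprobe under coreutils `timeout` and branch on its exit status. The URL is a placeholder, and the 10-second cap is illustrative.

```shell
# Sketch: test RTMP stream presence before starting the real encode.
# The URL is a placeholder; `timeout` is GNU coreutils.
URL="rtmp://example.com/live/stream"

if timeout 10 ffprobe -v error -i "$URL" >/dev/null 2>&1; then
  STREAM_OK=1   # ffprobe opened the stream within 10 seconds
else
  STREAM_OK=0   # timed out, connection refused, or no stream published
fi
echo "stream present: $STREAM_OK"
```

If the probe succeeds, start the real ffmpeg job against the same URL; otherwise sleep and retry.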
[FFmpeg-user] audio noise filtering
Hi,

I know almost nothing about audio filtering, and I wanted to better understand it. I would like to filter out some low-level hum/noise from an MP4 file (h264/aac), and I understand there are different types of filters, for example a noise gate. Can someone provide me with different configurations (command line syntax) for noise filtering that I can experiment with on the MP4 file and see the results?

If it helps, the video is a recording from a church service (kind of like an auditorium-type location), and I was looking for ways to enhance the audio. Any help is appreciated.

thanks
Ricardo
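Two starting points, offered as sketches rather than tuned settings: a high-pass filter to cut low-frequency hum, and the `agate` noise gate to attenuate low-level background noise between phrases. `input.mp4` is a placeholder, the numeric values are guesses to tune by ear, and the `|| true` just keeps the sketch from aborting if the file is absent.

```shell
IN=input.mp4   # placeholder input file

# 1) High-pass filter: remove rumble/hum below ~100 Hz, keep video as-is.
HP="highpass=f=100"
ffmpeg -y -i "$IN" -c:v copy -af "$HP" out_highpass.mp4 || true

# 2) Noise gate (agate): attenuate audio below the threshold; threshold,
#    ratio, attack (ms) and release (ms) here are starting values, not
#    recommendations for this particular recording.
GATE="agate=threshold=0.03:ratio=2:attack=20:release=250"
ffmpeg -y -i "$IN" -c:v copy -af "$GATE" out_gate.mp4 || true
```

The two filters can also be chained in one expression, e.g. `-af "highpass=f=100,agate=threshold=0.03"`, and compared against the original by ear.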
[FFmpeg-user] appending (or concatenating) video on hls
Hi,

I have two videos that I want to concat together; the end result is the HLS representation of the two. I already have one video in HLS format, with the m3u8 playlist file and all of the ts files. I want to append/concat a second video to the m3u8 playlist.

Is it possible to tell ffmpeg to simply take new ts segments and append them to the m3u8 playlist? For example, I can take the 2nd video and convert it to HLS segments; would ffmpeg be able to take the additional segments and just append them to the m3u8 playlist?

thanks
Ricardo
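One manual approach, as a sketch (filenames, segment names and the `101` offset are placeholders, and this only plays cleanly if both videos share codec parameters): segment the second video with its segment numbers starting after the first playlist's last segment, then splice its `#EXTINF`/URI pairs into the first playlist ahead of the end tag.

```shell
# 1) Segment the second video, numbering segments after the existing ones
#    (assume the first playlist's last segment is seg100.ts).
ffmpeg -i second.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 \
       -start_number 101 -hls_segment_filename 'seg%d.ts' second.m3u8 || true

# 2) Build a combined playlist: first playlist minus its end tag,
#    then the second playlist's #EXTINF/segment pairs, then the end tag.
grep -v '^#EXT-X-ENDLIST' first.m3u8 > combined.m3u8 || true
grep -A1 '^#EXTINF' second.m3u8 | grep -v '^--$' >> combined.m3u8 || true
echo '#EXT-X-ENDLIST' >> combined.m3u8
```

Newer ffmpeg builds may also have an hls muxer flag for exactly this (`-hls_flags append_list`, which reopens and extends an existing playlist); check `ffmpeg -h muxer=hls` to see whether your build supports it.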
Re: [FFmpeg-user] Decklink input buffer overrun
On Wed, Sep 23, 2015 at 12:47 PM, Flávio Pontes wrote:
> Hi,
>
> I need to stream the 2 inputs from a DeckLink Duo card in FHD.
> So far I could compile ffmpeg with decklink support and capture one input
> at a time without problems.
>
> When I try to capture from the 2 inputs I start getting the following
> message repeatedly on both terminals:
> [decklink @ 0x32287e0] Decklink input buffer overrun!
>
> I use these commands in the terminals:
>
> /opt/ffmpeg_build/bin/ffmpeg -f decklink -re -i 'DeckLink SDI (1)@7' -g 90 \
>   -s hd1080 -profile:v baseline -pix_fmt yuv420p -preset veryfast \
>   -c:v libx264 -b:v 2000k -bufsize 1400k -minrate 2000k -maxrate 2000k \
>   -c:a libfdk_aac -b:a 96k -threads 4 -f flv rtmp://10.12.20.62/myapp/tvines
>
> /opt/ffmpeg_build/bin/ffmpeg -f decklink -re -i 'DeckLink SDI (2)@7' -g 90 \
>   -s hd1080 -profile:v baseline -pix_fmt yuv420p -preset veryfast \
>   -c:v libx264 -b:v 2000k -bufsize 1400k -minrate 2000k -maxrate 2000k \
>   -c:a libfdk_aac -b:a 96k -threads 4 -f flv rtmp://10.12.20.62/myapp/tvescola
>
> I tried many combinations but couldn't get it to work.
> I wonder if the input buffer is shared between the 2 inputs.
> Is there any way to set an input buffer size independently for each
> command?
>
> Thx in advance!
> Flávio.

Hi Flávio,

I'm not sure if this directly addresses your problem, but have you tried the -thread_queue_size parameter? A deeper queue might help; the default value is only 8. I've used much larger values, even 2048. Granted, I think 2048 is overkill, but I don't see any problem in setting up larger queues.

Ricardo
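A sketch of the suggestion applied to the first command above (the device name and URL come from the thread; the queue depth of 512 is illustrative). Note that `-thread_queue_size` is an input option, so it must appear before the `-i` it applies to, and each command/input gets its own queue.

```shell
# Raise the input thread queue so capture packets are buffered while the
# encoder catches up; the default depth is only 8.
TQS=512   # illustrative; try larger if overruns persist

ffmpeg -f decklink -thread_queue_size "$TQS" -re -i 'DeckLink SDI (1)@7' \
       -g 90 -s hd1080 -profile:v baseline -pix_fmt yuv420p -preset veryfast \
       -c:v libx264 -b:v 2000k -bufsize 1400k -minrate 2000k -maxrate 2000k \
       -c:a libfdk_aac -b:a 96k -threads 4 \
       -f flv rtmp://10.12.20.62/myapp/tvines || true
```

The second capture command would get the same option independently, which answers the "independently for each command" question: the queue is per input, not shared.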
[FFmpeg-user] ffserver: how to create ffm file anew each time a feed comes in?
I've noticed that when I send a stream to ffserver, it keeps updating the ffm file if one already exists. Is there a config such that the ffm starts anew when a new feed comes in? Otherwise I guess I would have to delete the ffm file?

thanks
Ricardo
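Absent a config option for this, one workaround (a sketch; both paths are placeholders matching a hypothetical `<Feed>` entry in ffserver.conf) is to remove the feed file before (re)starting ffserver so it gets created fresh:

```shell
# Remove the stale feed file so ffserver creates it anew on startup.
# /tmp/feed1.ffm is the File path from the <Feed> section of ffserver.conf.
FEED=/tmp/feed1.ffm

rm -f "$FEED"
ffserver -f /etc/ffserver.conf >/dev/null 2>&1 &
FFSERVER_PID=$!
```

The same `rm -f` could instead run just before each new ffmpeg feed is launched, if restarting ffserver itself is not desirable.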
[FFmpeg-user] ensuring output of #EXT-X-ENDLIST to m3u8 file for HLS
Hi,

I have ffserver and ffmpeg working together to produce HLS streaming files. Basically ffserver has an flv stream, and an ffmpeg process reads the stream and generates the HLS segments.

However, once the flv stream ends, ffmpeg doesn't react by ending cleanly on its own. I'm forced to kill the process, otherwise it just sits indefinitely waiting for input from the flv. The side effect is that the m3u8 file misses the last line, which typically contains #EXT-X-ENDLIST.

Since I'm running the ffmpeg process in the background, I can't cleanly "hit q" to quit it. Does ffmpeg have a command option to cleanly exit (or time out) if the input is halted for a period of time? Alternatively, is there a signal I can send to ffmpeg for it to exit cleanly and output #EXT-X-ENDLIST?

thanks
Ricardo
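On the signal question, a sketch (input URL and output name are placeholders): ffmpeg treats the first SIGINT/SIGTERM as a graceful-stop request, much like pressing 'q', so the muxer gets a chance to finalize the playlist. A second signal force-kills it, so send only one and then wait.

```shell
# Start the HLS-generating ffmpeg in the background.
ffmpeg -i http://localhost:8090/feed1.flv -c copy \
       -f hls -hls_list_size 0 out.m3u8 >/dev/null 2>&1 &
FFMPEG_PID=$!

# ... later, when the flv source has ended, ask ffmpeg to finish cleanly:
kill -INT "$FFMPEG_PID" 2>/dev/null || true
wait "$FFMPEG_PID" 2>/dev/null || true
```

For the timeout side of the question, some builds expose protocol-level read-timeout options (e.g. a `timeout`/`rw_timeout` AVOption on network inputs); whether they apply depends on the protocol and ffmpeg version, so check `ffmpeg -h full` for your build.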
[FFmpeg-user] help with live capture and audio sync issues
Hi,

I have very inconsistent results with live capture. Sometimes the resulting file has perfect audio sync, but sometimes it doesn't. I've tried different combinations, but the results are never consistent.

I am capturing video from a Blackmagic Design UltraStudio Mini Recorder and the audio from an external mic. This is done on a MacBook Pro. Here's the capture command:

ffmpeg -y \
 -thread_queue_size 2048 -f avfoundation -async 1 -i none:'C-Media USB Audio Device' \
 -thread_queue_size 2048 -f decklink -vsync 0 -i 'UltraStudio Mini Recorder@12':none \
 -c:v libx264 -preset veryfast -profile:v baseline -vf scale=640:360 \
 -pix_fmt yuv420p \
 -c:a libfdk_aac -b:a 80k \
 -map 1:1 -map 0:0 \
 video_tests/test-avsync.mp4

Here's the ffmpeg output:

ffmpeg version N-73104-g7604358 Copyright (c) 2000-2015 the FFmpeg developers
  built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn)
  configuration: --prefix=/usr/local --enable-gpl --enable-nonfree --enable-ffplay --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-decklink --extra-cflags=-I/Users/ptl/blackmagicsdk/Mac/include --extra-ldflags=-L/Users/ptl/blackmagicsdk/Mac/include
  libavutil      54. 27.100 / 54. 27.100
  libavcodec     56. 44.100 / 56. 44.100
  libavformat    56. 38.100 / 56. 38.100
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 17.100 /  5. 17.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.100 /  1.  2.100
  libpostproc    53.  3.100 / 53.  3.100
Input #0, avfoundation, from 'none:C-Media USB Audio Device':
  Duration: N/A, start: 1057.070635, bitrate: 1411 kb/s
  Stream #0:0: Audio: pcm_f32le, 44100 Hz, mono, flt, 1411 kb/s
[decklink @ 0x7fdc23843600] Found Decklink mode 1920 x 1080 with rate 29.97(i)
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, decklink, from 'UltraStudio Mini Recorder@12:none':
  Duration: N/A, start: 0.00, bitrate: 1536 kb/s
  Stream #1:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
  Stream #1:1: Video: rawvideo (UYVY / 0x59565955), uyvy422, 1920x1080, -5 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.10:first_pts=0.
[libx264 @ 0x7fdc23845c00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x7fdc23845c00] profile Constrained Baseline, level 3.0
[libx264 @ 0x7fdc23845c00] 264 - core 144 r2533 c8a773e - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=1:0:0 analyse=0x1:0x111 me=hex subme=2 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=10 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'video_tests/test-avsync.mp4':
  Metadata:
    encoder : Lavf56.38.100
  Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 640x360, q=-1--1, 29.97 fps, 30k tbn, 29.97 tbc
    Metadata:
      encoder : Lavc56.44.100 libx264
  Stream #0:1: Audio: aac (libfdk_aac) ([64][0][0][0] / 0x0040), 44100 Hz, mono, s16, 80 kb/s
    Metadata:
      encoder : Lavc56.44.100 libfdk_aac
Stream mapping:
  Stream #1:1 -> #0:0 (rawvideo (native) -> h264 (libx264))
  Stream #0:0 -> #0:1 (pcm_f32le (native) -> aac (libfdk_aac))
Press [q] to stop, [?] for help
frame=   18 fps=0.0 q=29.0 size=       0kB time=00:00:00.03 bitrate=  11.5kbits/
frame=   33 fps= 33 q=29.0 size=      79kB time=00:00:00.53 bitrate=1216.8kbits/
frame=   48 fps= 32 q=29.0 size=     148kB time=00:00:01.03 bitrate=1173.1kbits/
frame=   63 fps= 31 q=29.0 size=     245kB time=00:00:01.53 bitrate=1305.0kbits/
frame=   79 fps= 31 q=29.0 size=     369kB time=00:00:02.06 bitrate=1460.7kbits/
frame=   94 fps= 31 q=29.0 size=     491kB time=00:00:02.56 bitrate=1565.9kbits/
frame=  109 fps= 31 q=29.0 size=     593kB time=00:00:03.06 bitrate=1583.1kbits/
frame=  124 fps= 31 q=29.0 size=     691kB time=00:00:03.57 bitrate=1585.9kbits/
frame=  139 fps= 31 q=29.0 size=     785kB ... time=00:00:26.79 bitrate=1108.6kbits/
frame=  822 fps= 30 q=-1.0 Lsize=    3732kB time=00:00:27.42 bitrate=1114.6kbits/s
video:3445kB audio:263kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.636530%
[libx264 @ 0x7fdc23845c00] frame I:9 Avg QP:25.83 size: 18818
[libx264 @ 0x7fdc23845c00] frame P:813 Avg QP:27.67 size: 4130
[libx264 @ 0x7fdc23845c00] mb I I16..4: 32.9% 0.0% 67.1%
[libx264 @ 0x7fdc23845c00] mb P I16..4: 5.6% 0.0%
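One variable worth isolating when chasing inconsistent sync (a sketch, not a known fix): replace the global `-async 1` flag with an explicit `aresample` filter on the audio chain, so the resampler behavior is stated rather than implied and can be tuned. Devices and output path are the same placeholders as in the command above.

```shell
# Same capture as above, but with the async resampling spelled out via -af.
AF="aresample=async=1:first_pts=0"

ffmpeg -y \
  -thread_queue_size 2048 -f avfoundation -i none:'C-Media USB Audio Device' \
  -thread_queue_size 2048 -f decklink -vsync 0 -i 'UltraStudio Mini Recorder@12':none \
  -map 1:1 -map 0:0 \
  -c:v libx264 -preset veryfast -profile:v baseline -vf scale=640:360 -pix_fmt yuv420p \
  -c:a libfdk_aac -b:a 80k -af "$AF" \
  video_tests/test-avsync.mp4 || true
```

Raising `async` (e.g. `async=1000`) allows the resampler to stretch/squeeze more samples per second toward the timestamps, which can help when the two devices' clocks drift apart.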
Re: [FFmpeg-user] fixing audio sync on mp4
Thanks.

On Aug 9, 2015, at 7:18 PM, Quinn Wood mtcja...@gmail.com wrote:
> Alternatively you can edit the file in one operation if the desync can be
> fixed by adjusting the delay.
>
> ffmpeg -i original.avi -itsoffset 0.2 -i original.avi -map 0:0 -map 1:1 -acodec copy -vcodec copy synced.avi
>
> http://alien.slackbook.org/blog/fixing-audio-sync-with-ffmpeg/

I can try. The problem is it's not a fixed delay; the audio drifts.
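For drift rather than a constant offset, two hedged sketches (filenames are placeholders and the ratio is something you would measure, e.g. from how far out of sync the end of the file is): stretch the audio by a fixed ratio with `atempo`, or let `aresample` continuously pull the audio toward its timestamps.

```shell
IN=original.avi   # placeholder input

# a) If the audio runs ~0.1% fast over the whole file, slow it by the
#    same ratio (atempo accepts values between 0.5 and 2.0):
TEMPO=0.999
ffmpeg -i "$IN" -c:v copy -af "atempo=$TEMPO" fixed_tempo.avi || true

# b) Or let the resampler stretch/squeeze audio toward the timestamps,
#    allowing up to 1000 samples/s of correction:
ffmpeg -i "$IN" -c:v copy -af "aresample=async=1000" fixed_async.avi || true
```

Both re-encode the audio (the video is stream-copied), so pick an audio codec/bitrate appropriate for the source when applying this for real.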
Re: [FFmpeg-user] ffmpeg and multiple outputs
Thanks everyone for the follow-ups. Does anyone know if SDL also works on OS X, or what would be the equivalent?

On Mon, Aug 3, 2015 at 8:11 AM, MrNice wxcvbn2...@iol.ie wrote:
> On 03/08/15 14:35, Christian Ebert wrote:
>> * MrNice on Monday, August 03, 2015 at 12:06:20 +0100
>>> Obviously I don't have SDL output:
>>> ./ffmpeg -formats | grep SDL
>>
>> ffmpeg -devices narrows it down better for this purpose imho.
>
> Thanks Moritz and Christian. I installed SDL-devel and recompiled the
> last snapshot, working fine now :-)
Re: [FFmpeg-user] ffmpeg and multiple outputs
Hi Carl,

On Aug 2, 2015, at 10:08 AM, Carl Eugen Hoyos ceho...@ag.or.at wrote:
> Ricardo Kleemann ricardo at americasnet.com writes:
>> I know I can do more than one output but is it possible to use ffplay
>> in conjunction with that? I believe it's possible to pipe ffmpeg output
>> to ffplay, but what about doing ffplay as well as streaming?
>
> ffmpeg (the application) allows to display without using ffplay.

Good point. How would I display ffmpeg output on OS X? I'm not quite sure what the output device would be.

thanks
Ricardo
[FFmpeg-user] ffmpeg and multiple outputs
Hi,

I know I can do more than one output, but is it possible to use ffplay in conjunction with that? I believe it's possible to pipe ffmpeg output to ffplay, but what about doing ffplay as well as streaming? I'd like to use ffplay as a monitor of what I'm grabbing with ffmpeg, but then also need to stream it out.

thanks
Ricardo
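One way to monitor and stream from a single process, as a sketch (the capture device, URL and window title are placeholders, and it assumes an ffmpeg build with the SDL output device enabled): add a second output that renders locally via `-f sdl` alongside the streamed output.

```shell
STREAM_URL="rtmp://example.com/live/stream"   # placeholder

# Output 1: encode and stream; Output 2: raw local monitor window via SDL.
ffmpeg -f avfoundation -i "0:0" \
  -map 0:v -map 0:a -c:v libx264 -preset veryfast -c:a libfdk_aac \
  -f flv "$STREAM_URL" \
  -map 0:v -c:v rawvideo -pix_fmt yuv420p -f sdl "ffmpeg monitor" || true
```

Alternatively, a second output can be piped to ffplay, e.g. `... -f flv "$STREAM_URL" -f nut pipe:1 | ffplay -f nut -i -`, at the cost of a second process.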
Re: [FFmpeg-user] [SOLVED] taking audio and video inputs from different devices
Thanks Carl,

One of the things that was also happening was an insufficient thread queue size. I was seeing this error:

Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)

I've been able to get audio and video synced properly, video coming from one input device and audio from the other, as follows:

ffmpeg -y -thread_queue_size 512 -f decklink -vsync 0 -i 'UltraStudio Mini Recorder@12' \
  -thread_queue_size 512 -f avfoundation -async 1 -i none:1 \
  -c:v libx264 -preset veryfast -vf scale=720:405 \
  -c:a libfdk_aac -ac 2 -ar 48000 \
  -map 0:1 -map 1:0 out.mp4

So far things seem to be working ok.

Ricardo

On Jul 28, 2015, at 9:04 AM, Carl Eugen Hoyos ceho...@ag.or.at wrote:
> Ricardo Kleemann ricardo at americasnet.com writes:
>> ffmpeg -f avfoundation -i none:1 -f decklink -i
>
> You could try to move -vsync 0 in front of the first input. If this does
> not help, it may be possible to insert the setpts filter to make the
> timestamp of the first frame 0 (so it matches the audio starting
> timestamp).
>
> Carl Eugen
Re: [FFmpeg-user] taking audio and video inputs from different devices
On Jul 26, 2015, at 3:28 AM, Carl Eugen Hoyos ceho...@ag.or.at wrote:
> Ricardo Kleemann ricardo at americasnet.com writes:
>> Input #0, avfoundation, from 'none:1': Duration: N/A, start: 28853.224036
>> Input #1, decklink, from 'UltraStudio Mini Recorder at 9': Duration: N/A, start: 0.00
>
> The different start times are an issue.
> Does it work if you only record audio?
>
> Carl Eugen

I'm not sure where the crazy start time comes from, but yes, the audio works fine if done separately. Maybe I need to do something differently to mix the audio and video?

$ ffmpeg -f avfoundation -i none:1 -ac 2 -vn audio.mp4
ffmpeg version N-73104-g7604358 Copyright (c) 2000-2015 the FFmpeg developers
Input #0, avfoundation, from 'none:1':
  Duration: N/A, start: 66610.243492, bitrate: 2822 kb/s
  Stream #0:0: Audio: pcm_f32le, 44100 Hz, stereo, flt, 2822 kb/s
Output #0, mp4, to 'audio.mp4':
  Metadata:
    encoder : Lavf56.38.100
  Stream #0:0: Audio: aac (libfdk_aac) ([64][0][0][0] / 0x0040), 44100 Hz, stereo, s16, 128 kb/s
    Metadata:
      encoder : Lavc56.44.100 libfdk_aac
Stream mapping:
  Stream #0:0 -> #0:0 (pcm_f32le (native) -> aac (libfdk_aac))
Press [q] to stop, [?] for help
size=     330kB time=00:00:20.81 bitrate= 129.9kbits/s
video:0kB audio:326kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.301418%
Re: [FFmpeg-user] what program to use to grab device and send to ffserver?
Hi guys,

I've had the chance to try out all 16 modes…

On Jun 24, 2015, at 10:04 AM, Ricardo Kleemann rica...@americasnet.com wrote:
> Thanks. I was thinking about that last night. With Flash Media Live
> Encoder, the only one that works is the 1920x1080 60fps. I guess I'll
> have to try each one and see.

I only got video from modes 9 and 12:

[decklink @ 0x7f94b081ba00] 9 1920x1080 at 30000/1001 fps
[decklink @ 0x7f94b081ba00] 12 1920x1080 at 30000/1001 fps (interlaced, upper field first)

What is the effective difference between interlaced and non-interlaced, in terms of the resulting video file? And in terms of streaming?

thank you
Ricardo

On Wed, Jun 24, 2015 at 1:08 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote:
> Hello,
>
> Make sure you are using the right input format for your device. BMD is
> quite touchy in that field. For instance, with a 1080i 50fps camera, I
> must choose the following format: 11 1920x1080 at 25000/1000 fps
> (interlaced, upper field first). If I use any other, I also get the mire
> (vertical color bars). Do not hesitate to try all available formats. One
> of them will suit your needs (typically 8, 10, 11 or 13 if you use a full
> HD camera / 14, 16 for a screen grabber). If none of them works, I am not
> sure what to do in that case ;)
>
> Arnaud.
> Arnaud WIJNS
> ULB Podcast | Informaticien
> http://podcast.ulb.ac.be
> Téléphone : 02/650.29.26
> Email : arnaud.wi...@ulb.ac.be
> Université libre de Bruxelles
> Campus du Solbosch - CP 160/26
> Avenue F.D. Roosevelt, 50 - 1050 Ixelles
>
>> On 24 June 2015 at 04:42, Ricardo Kleemann rica...@americasnet.com wrote:
>> Hello Arnaud,
>> Thanks again for your great help. Please see below…
>>
>>> On Jun 22, 2015, at 5:28 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote:
>>> Hello Ricardo,
>>> I am working on a project which uses the same devices for the same
>>> purpose. You can use both bmdtools (with pipe to FFMPEG) or decklink
>>> support for FFMPEG (the second solution is easier and works better imo).
>>>
>>> 1) Install the Mac driver for BMD (BMD Desktop)
>>> 2) Download the Decklink SDK (from
>>>    https://www.blackmagicdesign.com/support/download/f3e35f03b97440c4893fdf7e0dfdf97c/Mac%20OS%20X)
>>> 3) Install all dependencies for FFMPEG (use Brew as explained in the
>>>    FFMPEG compilation guide for Mac OS):
>>>    brew install automake fdk-aac git lame libass libtool libvorbis libvpx \
>>>      opus sdl shtool texi2html theora wget x264 xvid yasm
>>> 4) Download FFMPEG from Git:
>>>    git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
>>>    cd ffmpeg
>>> 5) Prepare the configuration (adapt according to your needs, BUT
>>>    --enable-decklink is required for BMD). Adapt the path for your
>>>    Decklink SDK:
>>>    ./configure --prefix=/usr/local --enable-gpl --enable-nonfree --enable-libass \
>>>      --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus \
>>>      --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 \
>>>      --enable-libxvid --enable-decklink \
>>>      --extra-cflags=-I/somepath/blackmagicsdk/Mac/include \
>>>      --extra-ldflags=-L/somepath/blackmagicsdk/Mac/include
>>> 6) If you use Mac OS X 10.9 or above, edit « config.mak »:
>>>    vi config.mak
>>>    Remove all references to « -std=c99 » (there are 2)
>>> 7) Edit DeckLinkAPIDispatch.cpp (in /somepath/blackmagicsdk/Mac/include):
>>>    add the word static to the beginning of lines 56, 77, and 157
>>>      56: void InitDeckLinkAPI (void)
>>>      77: bool IsDeckLinkAPIPresent (void)
>>>      157: void InitBMDStreamingAPI (void)
>>> 8) Make:
>>>    make
>>>    sudo make install
>>>
>>> Now that FFMPEG is installed with Decklink support, you can get the
>>> audio-video stream from your BMD devices directly in FFMPEG. Use the
>>> following commands:
>>>    ffmpeg -f decklink -list_devices 1 -i dummy
>>>    ffmpeg -f decklink -list_formats 1 -i 'UltraStudio Mini Recorder'
>>>    ffmpeg -f decklink -i 'UltraStudio Mini Recorder@16' output.mov
>>
>> I was able to build ffmpeg with support for BMD. These commands are now
>> all working and I get the set of formats supported. My problem now is
>> that ffmpeg (and even ffplay) shows nothing but the multi-colored
>> vertical bars. If I run BMD Media Express, however, I see the video
>> coming from the camera. So something seems quite strange in how ffmpeg
>> is acquiring the source. I get the error of "No input signal detected"
>>
>> $ ffmpeg -f decklink -i 'UltraStudio Mini Recorder@10' output.mov
>> ffmpeg version N-73104-g7604358 Copyright (c) 2000-2015 the FFmpeg developers
>> built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM
Re: [FFmpeg-user] what program to use to grab device and send to ffserver?
Hello Arnaud,

On Jun 24, 2015, at 1:08 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote:
> Make sure you are using the right input format for your device. Do not
> hesitate to try all available formats. [...]

I thought the blackmagic device already provided H.264 in hardware; however, it seems the drivers are seeing rawvideo. Do you know how to get h264 video?

$ ffprobe -f decklink -i 'UltraStudio Mini Recorder@12'
ffprobe version N-73104-g7604358 Copyright (c) 2007-2015 the FFmpeg developers
[decklink @ 0x7f9182802a00] Found Decklink mode 1920 x 1080 with rate 29.97(i)
[decklink @ 0x7f9182802a00] Frame received (#1) - Input returned - Frames dropped 1
Input #0, decklink, from 'UltraStudio Mini Recorder@12':
  Duration: N/A, start: 0.00, bitrate: 1536 kb/s
  Stream #0:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
  Stream #0:1: Video: rawvideo (UYVY / 0x59565955), uyvy422, 1920x1080, -5 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
[FFmpeg-user] taking audio and video inputs from different devices
Hi,

I'm not having much success mixing in audio from the default audio input with the video. The video comes out fine, but the audio is basically inaudible, with a little bit of intermittent static. I don't know if I have the proper settings; I do know that if I don't use the async flag then the video plays back in very fast forward.

The avfoundation "none:1" is to use the default built-in audio.

$ ffmpeg -f avfoundation -i none:1 -ac 2 -f decklink -re -i 'UltraStudio Mini Recorder@9' -vcodec libx264 -crf 20 -preset veryfast -vf scale=640:360 -async 1 out7.mp4
ffmpeg version N-73104-g7604358 Copyright (c) 2000-2015 the FFmpeg developers
Input #0, avfoundation, from 'none:1':
  Duration: N/A, start: 28853.224036, bitrate: 2822 kb/s
  Stream #0:0: Audio: pcm_f32le, 44100 Hz, stereo, flt, 2822 kb/s
[decklink @ 0x7f8362024800] Found Decklink mode 1920 x 1080 with rate 29.97
[decklink @ 0x7f8362024800] Frame received (#1) - No input signal detected - Frames dropped 1
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, decklink, from 'UltraStudio Mini Recorder@9':
  Duration: N/A, start: 0.00, bitrate: 1536 kb/s
  Stream #1:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
  Stream #1:1: Video: rawvideo (UYVY / 0x59565955), uyvy422, 1920x1080, -5 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
File 'out7.mp4' already exists. Overwrite ? [y/N]
[decklink @ 0x7f8362024800] Frame received (#2) - Input returned - Frames dropped 2
y
No pixel format specified, yuv422p for H.264 encoding chosen.
Use -pix_fmt yuv420p for compatibility with outdated media players.
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.10:first_pts=0.
[libx264 @ 0x7f83618ad000] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x7f83618ad000] profile High 4:2:2, level 3.0, 4:2:2 8-bit
[libx264 @ 0x7f83618ad000] 264 - core 144 r2533 c8a773e - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=1 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=2 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=1 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=10 rc=crf mbtree=1 crf=20.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'out7.mp4':
  Metadata:
    encoder : Lavf56.38.100
  Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv422p, 640x360, q=-1--1, 29.97 fps, 30k tbn, 29.97 tbc
    Metadata:
      encoder : Lavc56.44.100 libx264
  Stream #0:1: Audio: aac (libfdk_aac) ([64][0][0][0] / 0x0040), 44100 Hz, stereo, s16, 128 kb/s
    Metadata:
      encoder : Lavc56.44.100 libfdk_aac
Stream mapping:
  Stream #1:1 -> #0:0 (rawvideo (native) -> h264 (libx264))
  Stream #0:0 -> #0:1 (pcm_f32le (native) -> aac (libfdk_aac))
Press [q] to stop, [?] for help
[decklink @ 0x7f8362024800] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
frame=   17 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A dup=10 dr
[avfoundation @ 0x7f8362015800] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
frame=   32 fps= 32 q=26.0 size=      52kB time=00:00:00.34 bitrate=1230.7kbits/
frame=   47 fps= 31 q=26.0 size=     194kB time=00:00:00.90 bitrate=1754.8kbits/
frame=   62 fps= 31 q=26.0 size=     338kB time=00:00:01.36 bitrate=2023.0kbits/
Re: [FFmpeg-user] what program to use to grab device and send to ffserver?
Thanks. I was thinking about that last night. With Flash Media Live Encoder, the only one that works is the 1920x1080 60fps. I guess I'll have to try each one and see. On Wed, Jun 24, 2015 at 1:08 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote: Hello, Make sure you are using the good input format for your device. BMD is quite touchy in that field. For instance, with a 1080i 50fps camera, I must choose the following format : 11 1920x1080 at 25000/1000 fps (interlaced, upper field first). If I use any other, I also have the mire (vertical color bars). Do not hesitate to try all available formats. One of them will suit your needs (typically 8, 10, 11 or 13 if you use a full HD camera / 14, 16 for a screen grabber). If none of them works, I am not sure what to do in that case ;) Arnaud. Arnaud WIJNS ULB Podcast | Informaticien http://podcast.ulb.ac.be http://podcast.ulb.ac.be/ Téléphone : 02/650.29.26 Email : arnaud.wi...@ulb.ac.be mailto:arnaud.wi...@ulb.ac.be Université libre de Bruxelles Campus du Solbosch - CP 160/26 Avenue F.D. Roosevelt, 50 - 1050 Ixelles Le 24 juin 2015 à 04:42, Ricardo Kleemann rica...@americasnet.com a écrit : Hello Arnaud, Thanks again for your great help. Please see below… On Jun 22, 2015, at 5:28 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote: Hello Ricardo, I am working on a project which uses the same devices for the same purpose. You can use both bmdtools (with pipe to FFMPEG) or decklink support for FFMPEG (the second solution is easier and works better imo). 
1) Install the Mac driver for BMD (BMD Desktop) 2) Download the Decklink SDK (from https://www.blackmagicdesign.com/support/download/f3e35f03b97440c4893fdf7e0dfdf97c/Mac%20OS%20X ) 3) Install all dependencies for FFMPEG (use Brew as explained in the FFMPEG compilation guide for Mac OS) — brew install automake fdk-aac git lame libass libtool libvorbis libvpx \ opus sdl shtool texi2html theora wget x264 xvid yasm 4) Download FFMPEG from Git — git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg cd ffmpeg 5) Prepare the configuration file (adapt according to your needs BUT —enable-deckling is required for BMD). Adapt the path for your Decklink SDK — ./configure --prefix=/usr/local --enable-gpl --enable-nonfree --enable-libass \ --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus \ --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-decklink --extra-cflags=-I/somepath/blackmagicsdk/Mac/include --extra-ldflags=-L/somepath/blackmagicsdk/Mac/include 6) If you use Mac OS X 10.9 or above, edit « config.mak » — vi config.mak Remove all references to « -std=c99 » (there are 2) 7) Edit DeckLinkAPIDispatch.cpp (in /somepath/blackmagicsdk/Mac/include) — add the word static to the beginning of lines 56, 77, and 157 56: void InitDeckLinkAPI (void) 77: boolIsDeckLinkAPIPresent (void) 157: void InitBMDStreamingAPI(void) 8) Make sudo make install Now that FFMPEG in installed with Decklink support, you can get the audio-video stream from your BMD devices directly in FFMPEG. Use the following commands: ffmpeg -f decklink -list_devices 1 -i dummy ffmpeg -f decklink -list_formats 1 -i ‘UltraStudio Mini Recorder’ ffmpeg -f decklink -i ‘UltraStudio Mini Recorder@16’ output.mov I was able to build ffmpeg with support for BMD. These commands are now all working and I get the set of formats supported. My problem now is that ffmpeg (and even ffplay) all they show is the multi-colored vertical bars. 
If I run BMD Media Express, however, I see the video coming from the camera. So something seems quite strange in how ffmpeg is acquiring the source. I get the error “No input signal detected”: $ ffmpeg -f decklink -i 'UltraStudio Mini Recorder@10' output.mov ffmpeg version N-73104-g7604358 Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --prefix=/usr/local --enable-gpl --enable-nonfree --enable-ffplay --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-decklink --extra-cflags=-I/Users/ptl/blackmagicsdk/Mac/include --extra-ldflags=-L/Users/ptl/blackmagicsdk/Mac/include libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 44.100 / 56. 44.100 libavformat 56. 38.100 / 56. 38.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 17.100 / 5. 17.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 [decklink @ 0x7fba0980] Found Decklink mode 1920 x 1080 with rate 30.00 [decklink @ 0x7fba0980] Frame received (#1) - No input signal detected
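One way to avoid trying the format indices by hand: loop over the indices that -list_formats reported and flag the first one that does not log “No input signal detected”. A hedged sketch (the device name and index list are examples; FFMPEG is overridable so the loop itself can be exercised without capture hardware):

```shell
# Sketch: try each DeckLink format index for a couple of seconds and report
# whether a signal was detected. Device name / index list are examples only.
FFMPEG=${FFMPEG:-ffmpeg}
probe_decklink_formats() {
  dev=$1; shift
  for idx in "$@"; do
    if "$FFMPEG" -f decklink -i "${dev}@${idx}" -t 2 -f null - 2>&1 |
         grep -q 'No input signal detected'; then
      echo "format $idx: no signal"
    else
      echo "format $idx: signal detected"
    fi
  done
}
# probe_decklink_formats 'UltraStudio Mini Recorder' 8 10 11 13 14 16
```

Run it with the indices your own -list_formats output shows; a format that matches the camera should stop logging the no-signal message.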
Re: [FFmpeg-user] what program to use to grab device and send to ffserver?
Thanks again Arnaud. On Tue, Jun 23, 2015 at 12:04 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote: Hi, 1) It is indeed the standard desktop video driver. 2) You can unzip it wherever you want, just make sure you use the same path when you prepare the configuration file. 3) You can run « brew update » and « brew upgrade program » to make sure you have the latest version of all dependencies. One curiosity... in terms of how you plan to use this, would you be using ffplay to display whatever ffmpeg is processing? How do you plan on getting visual feedback of whatever ffserver is processing from the BMD device? Have fun :) Arnaud. On 23 June 2015 at 03:19, Ricardo Kleemann rica...@americasnet.com wrote: Hello Arnaud, Thanks for the great suggestion, I will give it a try. More below… On Jun 22, 2015, at 5:28 AM, Arnaud Wijns arnaud.wi...@ulb.ac.be wrote: Hello Ricardo, I am working on a project which uses the same devices for the same purpose. You can use either bmdtools (with a pipe to FFmpeg) or the decklink support in FFmpeg (the second solution is easier and works better imo). 1) Install the Mac driver for BMD (BMD Desktop) I assume you mean the standard desktop video driver (which is installed together with Media Express)? 2) Download the Decklink SDK (from https://www.blackmagicdesign.com/support/download/f3e35f03b97440c4893fdf7e0dfdf97c/Mac%20OS%20X ) Where should I unzip the SDK, does it matter?
3) Install all dependencies for FFmpeg (use Brew as explained in the FFmpeg compilation guide for Mac OS) — brew install automake fdk-aac git lame libass libtool libvorbis libvpx \ opus sdl shtool texi2html theora wget x264 xvid yasm If I’ve run brew before to install the stuff, should I remove and reinstall? 4) Download FFmpeg from Git — git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg cd ffmpeg 5) Prepare the configuration (adapt according to your needs, BUT --enable-decklink is required for BMD). Adapt the path for your Decklink SDK — ./configure --prefix=/usr/local --enable-gpl --enable-nonfree --enable-libass \ --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus \ --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-decklink --extra-cflags=-I/somepath/blackmagicsdk/Mac/include --extra-ldflags=-L/somepath/blackmagicsdk/Mac/include 6) If you use Mac OS X 10.9 or above, edit « config.mak » — vi config.mak Remove all references to « -std=c99 » (there are 2) 7) Edit DeckLinkAPIDispatch.cpp (in /somepath/blackmagicsdk/Mac/include) — add the word static to the beginning of lines 56, 77, and 157 56: void InitDeckLinkAPI (void) 77: bool IsDeckLinkAPIPresent (void) 157: void InitBMDStreamingAPI (void) 8) Build and install — make && sudo make install Now that FFmpeg is installed with Decklink support, you can get the audio-video stream from your BMD devices directly in FFmpeg. Use the following commands: ffmpeg -f decklink -list_devices 1 -i dummy ffmpeg -f decklink -list_formats 1 -i 'UltraStudio Mini Recorder' ffmpeg -f decklink -i 'UltraStudio Mini Recorder@16' output.mov Thanks again!! I’m sure I’ll soon be writing with more requests for assistance ;-) Ricardo Cheers, Arnaud.
On 16 June 2015 at 19:30, Ricardo Kleemann rica...@americasnet.com wrote: On Tue, Jun 16, 2015 at 9:03 AM, Carl Eugen Hoyos ceho...@ag.or.at wrote: Ricardo Kleemann ricardo at americasnet.com writes: I have a BlackMagic Design UltraStudio Mini Recorder connected to a MacBook Pro Thunderbolt port. But it seems the only program that recognizes the device is FMLE. What is FMLE? Why can't you feed whatever FMLE produces to ffmpeg (the application)? Sorry, it is Flash Media Live Encoder, and it does its own encoding and sends to a server like Red5 or Flash Media Server. I was hoping to not have to use FMLE and instead something that can push directly to ffserver. Certainly I can feed the FMLE output to ffmpeg and then to ffserver but then I'm
Re: [FFmpeg-user] how do I list the devices?
Hello, Regarding the options for config/compile, I didn't follow any specific set of instructions or logic; I took a script used for building ffmpeg on a Mac. I can certainly change the script based on your helpful feedback. More below... On Sun, Jun 21, 2015 at 11:56 PM, Carl Eugen Hoyos ceho...@ag.or.at wrote: Ricardo Kleemann ricardo at americasnet.com writes: configuration: Unrelated: --enable-postproc --enable-swscale --enable-avfilter --enable-pthreads These options have no effect, please remove them to make debugging your configure line easier. --enable-hardcoded-tables Out of curiosity: Why? --enable-libfaac libfdk offers superior quality. --enable-version3 From a quick look, this is unneeded (or isn't it?), I suggest you remove it. [AVFoundation input device @ 0x7fcd3ae00960] AVFoundation video devices: [AVFoundation input device @ 0x7fcd3ae00960] [0] FaceTime HD Camera (Built-in) [AVFoundation input device @ 0x7fcd3ae00960] [1] Capture screen 0 [AVFoundation input device @ 0x7fcd3ae00960] AVFoundation audio devices: [AVFoundation input device @ 0x7fcd3ae00960] [0] Blackmagic Audio [AVFoundation input device @ 0x7fcd3ae00960] [1] Built-in Input The Blackmagic shows for Audio, but not for video. Which application that records the screen and the FaceTime camera is able to record Blackmagic video? What makes you think this isn't just a BlackMagic driver issue? There's an app from Blackmagic (Media Express) which works fine with the driver. There's also the Flash Media Live Encoder (from Adobe) which recognizes the Blackmagic device and works well, so I can only assume the driver is working fine. Unfortunately neither ffmpeg nor VLC recognize the device, so to them the driver somehow isn't working, no idea why... :-/ Carl Eugen ___ ffmpeg-user mailing list ffmpeg-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/ffmpeg-user
Re: [FFmpeg-user] how do I list the devices?
Thanks for pointing that out… my bad! I was using an existing script for compiling and didn’t notice the option, I’ll rebuild and try again. On Jun 20, 2015, at 4:12 AM, Moritz Barsnick barsn...@gmx.net wrote: Hi Ricardo, On Fri, Jun 19, 2015 at 18:52:02 -0700, Ricardo Kleemann wrote: And tried the command -devices and I don’t understand the output. I’m trying to see which devices are available And ffmpeg correctly tells you: No input devices. Devices: D. = Demuxing supported .E = Muxing supported -- E sdl SDL output device Which is pretty obvious, because of the way you built your ffmpeg: configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads --disable-indevs Do you see that very last option? --disable-indevs What were you trying to achieve when using it? Moritz
Re: [FFmpeg-user] how do I list the devices?
Thanks again for your help. I’ve rebuilt ffmpeg and now I do see devices. But I don’t know how to differentiate. From the output below it seems to me there’s only 1 device which very likely would be the built-in webcam of the macbook pro. Which, if that is the case, then my thunderbolt device isn’t being recognized. I do know that the capture device works, I can see it if I run the Flash Media Live Encoder application or the Media Express application that comes with the capture device. $ /usr/local/bin/ffmpeg -devices ffmpeg version N-73015-g8edc17b Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 37.100 / 56. 37.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 17.100 / 5. 17.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 Devices: D. = Demuxing supported .E = Muxing supported -- D avfoundation AVFoundation input device D lavfi Libavfilter virtual input device D qtkit QTKit input device E sdl SDL output device On Jun 20, 2015, at 4:12 AM, Moritz Barsnick barsn...@gmx.net wrote: Hi Ricardo, On Fri, Jun 19, 2015 at 18:52:02 -0700, Ricardo Kleemann wrote: And tried the command -devices and I don’t understand the output. I’m trying to see which devices are available And ffmpeg correctly tells you: No input devices. Devices: D.
= Demuxing supported .E = Muxing supported -- E sdl SDL output device Which is pretty obvious, because of the way you built your ffmpeg: configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads --disable-indevs Do you see that very last option? --disable-indevs What were you trying to achieve when using it? Moritz
Re: [FFmpeg-user] how do I list the devices?
Thank you for your feedback. I apologize, I was building from a script and didn’t notice the option. I will try again. :-) Ricardo On Jun 20, 2015, at 4:12 AM, Moritz Barsnick barsn...@gmx.net wrote: Hi Ricardo, On Fri, Jun 19, 2015 at 18:52:02 -0700, Ricardo Kleemann wrote: And tried the command -devices and I don’t understand the output. I’m trying to see which devices are available And ffmpeg correctly tells you: No input devices. Devices: D. = Demuxing supported .E = Muxing supported -- E sdl SDL output device Which is pretty obvious, because of the way you built your ffmpeg: configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads --disable-indevs Do you see that very last option? --disable-indevs What were you trying to achieve when using it? Moritz
Re: [FFmpeg-user] how do I list the devices?
Hi Moritz On Jun 20, 2015, at 11:01 AM, Moritz Barsnick barsn...@gmx.net wrote: On Sat, Jun 20, 2015 at 09:56:13 -0700, Ricardo Kleemann wrote: $ /usr/local/bin/ffmpeg -devices What does -list_devices show you now? Is the qtkit indev of any use to you? (I don't have a Mac or know much about it, but if it knows both your devices, they should both be presented to ffmpeg as well.) I was hopeful this time but still having trouble. This is what I see: $ ffmpeg -f avfoundation -list_devices true -i ffmpeg version N-73015-g8edc17b Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 37.100 / 56. 37.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 17.100 / 5. 17.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 [AVFoundation input device @ 0x7fcd3ae00960] AVFoundation video devices: [AVFoundation input device @ 0x7fcd3ae00960] [0] FaceTime HD Camera (Built-in) [AVFoundation input device @ 0x7fcd3ae00960] [1] Capture screen 0 [AVFoundation input device @ 0x7fcd3ae00960] AVFoundation audio devices: [AVFoundation input device @ 0x7fcd3ae00960] [0] Blackmagic Audio [AVFoundation input device @ 0x7fcd3ae00960] [1] Built-in Input : Input/output error The Blackmagic shows for Audio, but not for video. I tried to see if at least the “capture screen” would be the one, but that’s a special device which captures the current screen (output to the screen/monitor). I don’t know what the Input/output error at the end means… but I’m not able to get the Blackmagic video device, unfortunately.
I know you’re not a Mac person, is anyone else able to help? thanks Ricardo
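For what it's worth, the devices that do show up in the listing can be addressed by their index pair; a hedged sketch (the indices, frame rate and output name are examples, and FFMPEG is overridable so the wrapper can be exercised without a capture device):

```shell
# Sketch: record a short clip from an AVFoundation "video:audio" index pair,
# e.g. "0:1" for FaceTime HD Camera + Built-in Input per the listing above.
FFMPEG=${FFMPEG:-ffmpeg}
capture_avfoundation() {
  pair=$1 out=$2
  "$FFMPEG" -f avfoundation -framerate 30 -i "$pair" -t 5 "$out"
}
# capture_avfoundation "0:1" webcam_test.mov
```

If a 5-second clip from the FaceTime camera records fine, the avfoundation indev itself is working and the problem is specific to the Blackmagic video side.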
[FFmpeg-user] how do I list the devices?
Hi, I just built ffmpeg from a snapshot. I am on a MacBook Pro, OS X 10.10.3. I tried using the -list_devices command but get an error: $ /usr/local/bin/ffmpeg -f avfoundation -list_devices true -i ffmpeg version N-73015-g8edc17b Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads --disable-indevs libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 37.100 / 56. 37.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 17.100 / 5. 17.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 Unrecognized option 'list_devices'. Error splitting the argument list: Option not found And I tried the command -devices but I don’t understand the output. I’m trying to see which devices are available $ /usr/local/bin/ffmpeg -devices ffmpeg version N-73015-g8edc17b Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --enable-nonfree --enable-gpl --enable-version3 --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libfaac --enable-libxvid --enable-libx264 --enable-libvpx --enable-hardcoded-tables --enable-shared --enable-pthreads --disable-indevs libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 37.100 / 56. 37.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 17.100 / 5. 17.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 Devices: D.
= Demuxing supported .E = Muxing supported -- E sdl SDL output device I don’t even see the standard webcam device from the Mac. I have an A/V capture device connected to the Thunderbolt port (Blackmagic Design UltraStudio Mini Recorder) and I wanted to see which device to use with ffmpeg thanks Ricardo
Re: [FFmpeg-user] what program to use to grab device and send to ffserver?
On Tue, Jun 16, 2015 at 9:03 AM, Carl Eugen Hoyos ceho...@ag.or.at wrote: Ricardo Kleemann ricardo at americasnet.com writes: I have a BlackMagic Design Ultrastudio mini recorder connected to macbook pro thunderbolt. But it seems the only program that recognizes the device is FMLE. What is FMLE? Why can't you feed whatever FMLE produces to ffmpeg (the application)? Sorry, it is Flash Media Live Encoder, and it does its own encoding and sends to a server like Red5 or Flash Media Server. I was hoping to not have to use FMLE and instead something that can push directly to ffserver. Certainly I can feed the FMLE output to ffmpeg and then to ffserver but then I'm just adding another component into the chain. I was hoping to go directly to ffserver, if for example ffmpeg was able to see the input video device. thanks Ricardo
[FFmpeg-user] what program to use to grab device and send to ffserver?
I want to feed ffserver but I’m not sure what program to use. I have a BlackMagic Design UltraStudio Mini Recorder connected to a MacBook Pro Thunderbolt port. But it seems the only program that recognizes the device is FMLE. ffmpeg won’t list the video device when I run -list_devices (it only shows the audio device): $ ffmpeg -f avfoundation -list_devices true -i ffmpeg version 2.6.3 Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn) configuration: --prefix=/usr/local/Cellar/ffmpeg/2.6.3 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-libfreetype --enable-libvorbis --enable-libvpx --enable-libass --enable-ffplay --enable-libfdk-aac --enable-libopus --enable-libquvi --enable-libx265 --enable-nonfree --enable-vda libavutil 54. 20.100 / 54. 20.100 libavcodec 56. 26.100 / 56. 26.100 libavformat 56. 25.101 / 56. 25.101 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 11.102 / 5. 11.102 libavresample 2. 1. 0 / 2. 1. 0 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53. 3.100 [AVFoundation input device @ 0x7f8038c23220] AVFoundation video devices: [AVFoundation input device @ 0x7f8038c23220] [0] FaceTime HD Camera (Built-in) [AVFoundation input device @ 0x7f8038c23220] [1] Capture screen 0 [AVFoundation input device @ 0x7f8038c23220] AVFoundation audio devices: [AVFoundation input device @ 0x7f8038c23220] [0] Blackmagic Audio [AVFoundation input device @ 0x7f8038c23220] [1] Built-in Input I tried VLC and it also doesn’t show anything other than the built-in webcam. Is there any other program that could be used for feeding into ffserver? I don’t believe FMLE is capable of feeding ffserver.
thanks Ricardo
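If ffmpeg (or a rebuilt ffmpeg) did see the device, feeding ffserver would just be pointing an avfoundation capture at the feed URL; a hedged sketch (the feed URL and device index pair are assumptions matching a typical ffserver.conf, and FFMPEG is overridable):

```shell
# Sketch: push an AVFoundation capture straight into an ffserver feed.
# The feed URL and "video:audio" device pair are placeholders.
FFMPEG=${FFMPEG:-ffmpeg}
feed_ffserver() {
  pair=$1 feed=$2
  "$FFMPEG" -f avfoundation -framerate 30 -i "$pair" "$feed"
}
# feed_ffserver "0:0" http://localhost:8090/feed1.ffm
```

The encoding parameters would then come from the Stream definitions in ffserver.conf rather than from the ffmpeg command line.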
[FFmpeg-user] ffserver - hls/segment options in ffserver.conf?
Hi, Anyone know if any of the segment (or hls) options used with ffmpeg can be set for a Stream in ffserver.conf? For example, hls_segment_filename, hls_time, etc... I have a working set of hls commands for ffmpeg but wanted to be able to have ffserver produce a stream from that. I've tried feeding ffserver using the -override_ffserver option in an attempt to have the ffmpeg options passed over to the ffserver stream, but that's not working. I tried setting up ffserver.conf with the following stream: <Stream live.m3u8> Feed feed1.ffm Format hls </Stream> And the command to feed ffserver is $ ffmpeg -re -i ~/video/sample.mp4 -c copy -flags +global_header -bsf:v h264_mp4toannexb -hls_time 10 -hls_list_size 60 -hls_wrap 60 -hls_allow_cache 1 -hls_segment_filename '/shared/www/html/video/live/live%03d.ts' -segment_list_flags +live -override_ffserver http://localhost:8090/feed1.ffm ffmpeg version 2.6.git Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree libavutil 54. 23.101 / 54. 23.101 libavcodec 56. 35.101 / 56. 35.101 libavformat 56. 31.100 / 56. 31.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.100 / 5. 16.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53.
3.100 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/ubuntu/video/sample.mp4': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : 1970-01-01 00:00:00 encoder : Lavf52.73.0 Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler encoder : libx264 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 99 kb/s (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler Output #0, ffm, to 'http://localhost:8090/feed1.ffm': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : now encoder : Lavf56.31.100 Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p, 424x240, q=2-31, 420 kb/s, 24 fps, 24 tbr, 1000k tbn, 24 tbc (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler encoder : libx264 Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, 99 kb/s (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler Stream mapping: Stream #0:0 -> #0:0 (copy) Stream #0:1 -> #0:1 (copy) Press [q] to stop, [?] for help frame= 13 fps=0.0 q=-1.0 size= 16kB time=00:00:00.54 bitrate= 242.0kbits/s frame= 25 fps= 25 q=-1.0 size= 32kB time=00:00:01.04 bitrate= 250.8kbits/s frame= 37 fps= 24 q=-1.0 size= 56kB time=00:00:01.55 bitrate= 294.6kbits/s So when I feed this to ffserver, none of the segment/hls options take effect, more importantly the ones with the filename paths.
All I see are ts files being written to the local directory where ffserver was launched, and the m3u8 file is only written as .tmp and cannot be renamed. $ sudo bin/ffserver ffserver version 2.6.git Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree libavutil 54. 23.101 / 54. 23.101 libavcodec 56. 35.101 / 56. 35.101 libavformat 56. 31.100 / 56. 31.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.100 / 5. 16.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53. 3.100 /etc/ffserver.conf:82: Setting default value for audio bit rate = 64000. Use NoDefaults to disable it. /etc/ffserver.conf:82: Setting default value for audio sample rate = 22050. Use NoDefaults to disable it. /etc/ffserver.conf:82: Setting default value for audio channel count = 1. Use NoDefaults to
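One workaround, since the hls_* options do not carry through to ffserver: skip ffserver for the HLS leg and let ffmpeg segment straight into the web root. A hedged sketch reusing the options from the command quoted above (paths are examples, FFMPEG is overridable so the wrapper can be checked without a running server):

```shell
# Sketch: write HLS segments and playlist directly with ffmpeg, bypassing
# ffserver, using the same hls_* options as the command quoted above.
FFMPEG=${FFMPEG:-ffmpeg}
hls_direct() {
  in=$1 dir=$2
  "$FFMPEG" -re -i "$in" -c copy -bsf:v h264_mp4toannexb -f hls \
    -hls_time 10 -hls_list_size 60 -hls_wrap 60 \
    -hls_segment_filename "$dir/live%03d.ts" "$dir/live.m3u8"
}
# hls_direct ~/video/sample.mp4 /shared/www/html/video/live
```

Any plain web server can then serve the m3u8 and ts files; ffserver is only needed if you want it to do the muxing itself.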
[FFmpeg-user] Does anyone have a working ffmpeg - ffserver config for hls/m3u8?
I've been searching around the net and found only a few examples, but so far I'm not having much success. My objective is to take a camera stream (available as input to ffmpeg as m3u8) and feed it to ffserver, and play it out also as m3u8. First, I'm simulating that by taking an mp4 file as input instead. Here's the info on the sample file: ~/bin/ffmpeg -i sample.mp4 ffmpeg version 2.6.git Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree libavutil 54. 23.101 / 54. 23.101 libavcodec 56. 35.101 / 56. 35.101 libavformat 56. 31.100 / 56. 31.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.100 / 5. 16.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53. 3.100 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : 1970-01-01 00:00:00 encoder : Lavf52.73.0 Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler encoder : libx264 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 99 kb/s (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler Here's my ffmpeg feed to ffserver.
~/bin/ffmpeg -re -i sample.mp4 -c copy -hls_time 10 -hls_list_size 6 -hls_wrap 18 -start_number 1 -override_ffserver http://localhost:8090/feed1.ffm ffmpeg version 2.6.git Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree libavutil 54. 23.101 / 54. 23.101 libavcodec 56. 35.101 / 56. 35.101 libavformat 56. 31.100 / 56. 31.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.100 / 5. 16.100 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53. 3.100 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : 1970-01-01 00:00:00 encoder : Lavf52.73.0 Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler encoder : libx264 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 99 kb/s (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler Output #0, ffm, to 'http://localhost:8090/feed1.ffm': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : now encoder : Lavf56.31.100 Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p, 424x240, q=2-31, 420 kb/s, 24 fps, 24 tbr, 1000k tbn, 24 tbc (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler encoder : libx264 Stream #0:1(eng): Audio: aac
(mp4a / 0x6134706D), 48000 Hz, stereo, 99 kb/s (default) Metadata: creation_time : 1970-01-01 00:00:00 handler_name: DataHandler Stream mapping: Stream #0:0 -> #0:0 (copy) Stream #0:1 -> #0:1 (copy)
Re: [FFmpeg-user] ffserver avc1 incompatible with codec id '28'
: 1970-01-01 00:00:00 handler_name: DataHandler encoder : Lavc56.35.101 libx264 Stream mapping: Stream #0:1 -> #0:0 (aac (native) -> aac (libfdk_aac)) Stream #0:0 -> #0:1 (h264 (native) -> h264 (libx264)) On Sun, May 3, 2015 at 9:31 AM, Ricardo Kleemann rica...@americasnet.com wrote: Hi, I'm trying to feed a stream (mp4 file) to ffserver via ffmpeg, and when trying to access it on ffserver I get the error: Sun May 3 09:18:25 2015 127.0.0.1 - - [GET] /feed1.ffm HTTP/1.1 200 4175 Sun May 3 09:18:32 2015 [mp4 @ 0x24edf10] Tag avc1/0x31637661 incompatible with output codec id '28' ([33][0][0][0]) Sun May 3 09:18:32 2015 Error writing output header for stream 'live.mp4': Invalid data found when processing input The ffserver config for testing is very simple: <Stream live.mp4> Feed feed1.ffm Format mp4 AVOptionVideo flags +global_header AVOptionAudio flags +global_header </Stream> (I've tried the test with and without the line VideoCodec libx264, with the same results... without that line the intent is to copy the input stream's codec.) When I start up ffserver this is what I get: ~$ sudo ffserver ffserver version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/usr --extra-version='1ubuntu1~trusty6' --build-suffix=-ffmpeg --toolchain=hardened --extra-cflags= --extra-cxxflags= --libdir=/usr/lib/x86_64-linux-gnu --shlibdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-fontconfig --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex
--enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-opengl --enable-x11grab --enable-libxvid --enable-libx265 --enable-libdc1394 --enable-libiec61883 --enable-libzvbi --enable-libzmq --enable-frei0r --enable-libx264 --enable-libsoxr --enable-openal --enable-libopencv libavutil 54. 7.100 / 54. 7.100 libavcodec 56. 1.100 / 56. 1.100 libavformat56. 4.101 / 56. 4.101 libavdevice56. 0.100 / 56. 0.100 libavfilter 5. 1.100 / 5. 1.100 libavresample 2. 1. 0 / 2. 1. 0 libswscale 3. 0.100 / 3. 0.100 libswresample 1. 1.100 / 1. 1.100 libpostproc53. 0.100 / 53. 0.100 Sun May 3 09:27:58 2015 FFserver started. Here is my line for ffmpeg: $ ffmpeg -re -i sample.mp4 -c copy http://localhost:8090/feed1.ffm ffmpeg version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1) configuration: --prefix=/usr --extra-version='1ubuntu1~trusty6' --build-suffix=-ffmpeg --toolchain=hardened --extra-cflags= --extra-cxxflags= --libdir=/usr/lib/x86_64-linux-gnu --shlibdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-fontconfig --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-opengl --enable-x11grab --enable-libxvid --enable-libx265 --enable-libdc1394 --enable-libiec61883 --enable-libzvbi --enable-libzmq --enable-frei0r --enable-libx264 --enable-libsoxr 
--enable-openal --enable-libopencv libavutil 54. 7.100 / 54. 7.100 libavcodec 56. 1.100 / 56. 1.100 libavformat56. 4.101 / 56. 4.101 libavdevice56. 0.100 / 56. 0.100 libavfilter 5. 1.100 / 5. 1.100 libavresample 2. 1. 0 / 2. 1. 0 libswscale 3. 0.100 / 3. 0.100 libswresample 1. 1.100 / 1. 1.100 libpostproc53. 0.100 / 53. 0.100 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4': Metadata: major_brand : qt minor_version : 512 compatible_brands: qt creation_time : 1970-01-01 00:00:00 encoder : Lavf52.73.0 Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps
[FFmpeg-user] help with live streaming with ffserver
Hello,

I've written a couple of messages over the last couple of days; for now, please ignore them. I've been doing a lot of testing with many different configurations, and honestly I'm pretty lost on how to get this working.

I'm running a simple test: I'm simulating a live stream by reading a file with ffmpeg and feeding it to ffserver. The file I have is mp4 with h264 and aac. I want to be able to play it in an HTML5 video element by accessing it over http from ffserver.

So my first question: is this even possible, with a video element over http? What would be the appropriate ffserver config and ffmpeg command line?

My current command line to feed ffserver is:

$ ffmpeg -re -i sample.mp4 -c copy http://localhost:8090/feed1.ffm
ffmpeg version 2.6.git Copyright (c) 2000-2015 the FFmpeg developers
  built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
  configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
  libavutil      54. 23.101 / 54. 23.101
  libavcodec     56. 35.101 / 56. 35.101
  libavformat    56. 31.100 / 56. 31.100
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 16.100 /  5. 16.100
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  3.100 / 53.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4':
  Metadata:
    major_brand     : qt
    minor_version   : 512
    compatible_brands: qt
    creation_time   : 1970-01-01 00:00:00
    encoder         : Lavf52.73.0
  Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s
    Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default)
    Metadata:
      creation_time   : 1970-01-01 00:00:00
      handler_name    : DataHandler
      encoder         : libx264
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 99 kb/s (default)
    Metadata:
      creation_time   : 1970-01-01 00:00:00
      handler_name    : DataHandler
[libx264 @ 0x2379000] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x2379000] profile High, level 2.1
[libx264 @ 0x2379000] 264 - core 142 r2389 956c8d8 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=5 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=abr mbtree=1 bitrate=64 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=128 vbv_bufsize=128 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
[libx264 @ 0x2361d40] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x2361d40] profile High, level 1b
[libx264 @ 0x2361d40] 264 - core 142 r2389 956c8d8 - H.264/MPEG-4 AVC codec - (options identical to above)
[libx264 @ 0x2364520] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x2364520] profile High, level 2.1
[libx264 @ 0x2364520] 264 - core 142 r2389 956c8d8 - H.264/MPEG-4 AVC codec - (options identical to above)
Output #0, ffm, to
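On the first question: in principle yes, an HTML5 video element can point straight at an ffserver stream URL over http. Below is a minimal, untested page sketch; the stream name "live.mp4" and port 8090 are assumptions matching this test setup. One caveat: a plain mp4 file normally has its index (moov atom) written at the end, which makes non-fragmented mp4 awkward for live HTTP delivery, so browsers may refuse to play such a stream even when the URL is reachable and a browser-friendly live format may be needed instead.

```
<!-- Minimal sketch (untested). Stream name and port are assumptions
     taken from the ffserver test setup described in this thread. -->
<!DOCTYPE html>
<html>
  <body>
    <video controls autoplay>
      <source src="http://localhost:8090/live.mp4" type="video/mp4">
    </video>
  </body>
</html>
```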
[FFmpeg-user] ffserver avc1 incompatible with codec id '28'
Hi,

I'm trying to feed a stream (an mp4 file) to ffserver via ffmpeg. When I try to access the stream on ffserver, I get this error:

Sun May 3 09:18:25 2015 127.0.0.1 - - [GET] /feed1.ffm HTTP/1.1 200 4175
Sun May 3 09:18:32 2015 [mp4 @ 0x24edf10] Tag avc1/0x31637661 incompatible with output codec id '28' ([33][0][0][0])
Sun May 3 09:18:32 2015 Error writing output header for stream 'live.mp4': Invalid data found when processing input

The ffserver config for testing is very simple:

<Stream live.mp4>
  Feed feed1.ffm
  Format mp4
  AVOptionVideo flags +global_header
  AVOptionAudio flags +global_header
</Stream>

(I've tried the test with and without the line "VideoCodec libx264", with the same results. When that line is left out, the intent is to copy the input stream's codec.)

When I start up ffserver, this is what I get:

~$ sudo ffserver
ffserver version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers
  built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
  configuration: --prefix=/usr --extra-version='1ubuntu1~trusty6' --build-suffix=-ffmpeg --toolchain=hardened --extra-cflags= --extra-cxxflags= --libdir=/usr/lib/x86_64-linux-gnu --shlibdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-fontconfig --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-opengl --enable-x11grab --enable-libxvid --enable-libx265 --enable-libdc1394 --enable-libiec61883 --enable-libzvbi --enable-libzmq --enable-frei0r --enable-libx264 --enable-libsoxr --enable-openal --enable-libopencv
  libavutil      54.  7.100 / 54.  7.100
  libavcodec     56.  1.100 / 56.  1.100
  libavformat    56.  4.101 / 56.  4.101
  libavdevice    56.  0.100 / 56.  0.100
  libavfilter     5.  1.100 /  5.  1.100
  libavresample   2.  1.  0 /  2.  1.  0
  libswscale      3.  0.100 /  3.  0.100
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  0.100 / 53.  0.100
Sun May 3 09:27:58 2015 FFserver started.

Here is my ffmpeg command line:

$ ffmpeg -re -i sample.mp4 -c copy http://localhost:8090/feed1.ffm
ffmpeg version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers
  built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
  configuration: (identical to the ffserver build above)
  (library versions identical to the ffserver build above)
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4':
  Metadata:
    major_brand     : qt
    minor_version   : 512
    compatible_brands: qt
    creation_time   : 1970-01-01 00:00:00
    encoder         : Lavf52.73.0
  Duration: 00:09:56.46, start: 0.00, bitrate: 524 kb/s
    Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 424x240, 420 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default)
    Metadata:
      creation_time   : 1970-01-01 00:00:00
      handler_name    : DataHandler
      encoder         : libx264
    Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 99 kb/s (default)
    Metadata:
      creation_time   : 1970-01-01 00:00:00
      handler_name    : DataHandler
Output #0, ffm, to