Hi List,

as stated in my previous message, the rkmpp H.264 encoder has not been
finalized yet, so I'm stuck for the moment.

gstreamer has the hardware encoding part, so I thought of piping the
output of ffmpeg (grabbing an HDMI-IN pseudo-camera) into gstreamer for
H.264 encoding, and then back into ffmpeg to stream to YouTube.
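The whole chain I have in mind would look roughly like this. This is
only a sketch: I am assuming the Rockchip GStreamer plugin provides the
mpph264enc element, and the rawvideoparse caps would have to match what
the capture device actually delivers (STREAM_KEY is a placeholder):

```shell
# Sketch only: grab raw NV12 frames from the HDMI-IN device with ffmpeg,
# hand them to gstreamer for hardware H.264 encoding, then stream with
# ffmpeg again. mpph264enc is assumed to come from gstreamer-rockchip.
ffmpeg -f v4l2 -pix_fmt nv12 -s 1920x1080 -i /dev/video0 \
       -c:v rawvideo -f rawvideo - \
  | gst-launch-1.0 fdsrc fd=0 \
      ! rawvideoparse format=nv12 width=1920 height=1080 framerate=30/1 \
      ! mpph264enc ! h264parse ! fdsink fd=1 \
  | ffmpeg -f h264 -i - -c:v copy \
       -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY
```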

Unfortunately, I have not been able to create the first part of the
pipe. What I'd like to do is get the video in raw format, to avoid
unnecessary CPU use and compression, and then hand it to gstreamer for
encoding.

So far, I have tried
ffmpeg -f v4l2 -pix_fmt nv12 -s 1920x1080 -r 30 -i /dev/video0 -c:v rawvideo -f yuv - >/dev/null
and got
ffmpeg version N-95733-g73ee53f317 Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 9 (Debian 9.2.1-19)
  configuration: --arch=arm --bindir=/home/linaro/bin
--disable-stripping --enable-libaom --enable-avisynth
--enable-avresample --enable-chromaprint --enable-frei0r --enable-gpl
--enable-ladspa --enable-libass --enable-libbs2b --enable-libcaca
--enable-libcdio --enable-libdc1394 --enable-libfdk-aac
--enable-libfontconfig --enable-libfreetype --enable-libfribidi
--enable-libgme --enable-libgsm --enable-libiec61883
--enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt
--enable-libopus --enable-libpulse --enable-librubberband
--enable-libshine --enable-libsnappy --enable-libsoxr
--enable-libspeex --enable-libssh --enable-libtheora
--enable-libtwolame --enable-libvorbis --enable-libvpx
--enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
--enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx
--enable-openal --enable-opengl --enable-sdl2 --enable-shared
--libdir=/usr/lib/arm-linux-gnueabihf --enable-nonfree --enable-rkmpp
--enable-version3 --enable-libdrm
--extra-libs='-lpthread -lm' --pkg-config-flags=--static
  libavutil      56. 35.101 / 56. 35.101
  libavcodec     58. 62.100 / 58. 62.100
  libavformat    58. 35.100 / 58. 35.100
  libavdevice    58.  9.100 / 58.  9.100
  libavfilter     7. 66.100 /  7. 66.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  6.100 /  5.  6.100
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
[video4linux2,v4l2 @ 0xab664e80] The driver does not permit changing
the time per frame
[video4linux2,v4l2 @ 0xab664e80] Time per frame unknown
[video4linux2,v4l2 @ 0xab664e80] Stream #0: not enough frames to
estimate rate; consider increasing probesize
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: rawvideo (NV12 / 0x3231564E), nv12, 1920x1080,
1000k tbr, 1000k tbn, 1000k tbc
[NULL @ 0xab6682b0] Requested output format 'yuv' is not a suitable
output format
pipe:: Invalid argument
as a result.

When I replace -f yuv with -f mpegts, it does work, but with two cores
at 50% CPU, which I would like to avoid.
What output format could I use to just copy the raw video?
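From reading the muxer list, I suspect the format name ffmpeg expects
for headerless raw frames is "rawvideo" rather than "yuv" ("nut" might
be an alternative if the pipe needs to carry format and timing
metadata), i.e. something like:

```shell
# Assumption: '-f rawvideo' writes the NV12 frames to the pipe with no
# container overhead; '-f nut' would instead wrap them in a lightweight
# container that preserves pixel format and timestamps.
ffmpeg -f v4l2 -pix_fmt nv12 -s 1920x1080 -i /dev/video0 \
       -c:v rawvideo -f rawvideo - >/dev/null
```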

Thanks a lot,
Bruno Verachten
ffmpeg-user mailing list
