I'm building a small tool to run on a Raspberry Pi while I shoot timelapses: it transfers the small JPG files shot alongside my raw files and uses them for exposure checking and correction. Since the JPGs are already on the Raspberry Pi, I'd also like to render a small preview movie clip from them. For that I want to use libav/ffmpeg with the OMX plugin for hardware encoding, feeding it the images directly from RAM, since I already have the JPG files open for exposure analysis.

To start playing with libav I built a small app that opens a bunch of JPG files one after another, and I wanted to add libav encoding to it. For starters I copied the encode function from the "encode_video.c" example and tried to compile it, just to check that the includes and linking work, but linking always fails.

My example file (main.cpp), the CMakeLists.txt for building it and my compiler output are here:

(My files are in /home/pi/dev/test/ffmpeg-encode, built in a separate directory /home/pi/dev/test/ffmpeg-encode/build. ffmpeg is compiled and installed from source with OMX support, with PREFIX=/home/pi/bin/ffmpeg: ./configure --prefix=/home/pi/bin/ffmpeg --enable-mmal --enable-omx --enable-omx-rpi --enable-shared --enable-pic)
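Since my actual CMakeLists.txt isn't shown above, here is a hypothetical minimal version for this layout, pointing the build at the FFmpeg installation under the custom prefix (library names and the OpenImageIO dependency taken from my g++ command below):

```cmake
# Hypothetical minimal CMakeLists.txt for this setup (sketch, not the original).
cmake_minimum_required(VERSION 3.7)
project(encode-test CXX)
set(CMAKE_CXX_STANDARD 11)

# FFmpeg was installed from source with --prefix=/home/pi/bin/ffmpeg.
set(FFMPEG_PREFIX /home/pi/bin/ffmpeg)
include_directories(${FFMPEG_PREFIX}/include)
link_directories(${FFMPEG_PREFIX}/lib)

add_executable(encode-test main.cpp)
target_link_libraries(encode-test
    OpenImageIO avformat avcodec avutil swresample swscale)
```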

I searched for a while but I can't find a clue as to what's going wrong here. Can anyone help me with this? Why does linking fail in this case? I get the same result with `g++ -std=c++11 -Wall -I /home/pi/bin/ffmpeg/include -o encode-test main.cpp -lOpenImageIO -L/home/pi/bin/ffmpeg/lib -lavformat -lavcodec -lavutil -lswresample -lswscale`

Kind regards,
Libav-user mailing list
