After 7.5 years of waiting, my banana plant is finally flowering! I want to do 
a time-lapse capture of the flowering and fruiting process. Due to its 
location, the easiest way for me to get a camera out there is to use a little 
WyzeCam v3 with the RTSP firmware and the Wyze lamp socket. Unfortunately, the 
WyzeCam doesn’t (yet) have an externally accessible JPG snapshot feature, so I 
have a cron job set up to run:

./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@$IPAddress/live 
-frames:v 1 $outfile

every hour. The results are OK, but not fantastic:

https://www.kan.org/pictures/BananaTimeLapseFirstImage.jpg
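For context, the crontab entry behind that hourly capture looks roughly like 
this (the ffmpeg path, camera IP, and timestamped output name are all 
placeholders, and $IPAddress has to be defined in the crontab or inlined):

```shell
# Camera address -- placeholder; set to the real IP or inline it below.
IPAddress=192.168.1.50
# Run at the top of every hour; % must be escaped as \% inside a crontab.
# Each run writes one timestamped snapshot, e.g. BananaLapse/2024-05-01_13.jpg.
0 * * * * /home/me/ffmpeg -rtsp_transport tcp -i "rtsp://anonymous:password@$IPAddress/live" -frames:v 1 "/home/me/BananaLapse/$(date +\%F_\%H).jpg"
```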

Is there a way to tell ffmpeg to collect N frames of video and output a single 
averaged image to improve the SNR? Even if there’s some wind, the flower stalk 
shouldn’t be moving much. 

I tried:

./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@$IPAddress/live 
-frames:v 10 ~/BananaLapse/MultiFrame%03d.jpg

and that results in N JPGs. I suppose I could have a second ffmpeg command that 
averages those 10 JPGs, but can this all be done in one pass? Thanks!
_______________________________________________
ffmpeg-user mailing list
[email protected]
https://ffmpeg.org/mailman/listinfo/ffmpeg-user
