Hello,

We're building a wireless 3G IP camera system that will FTP a large 
(2.5 Mbyte) 10-megapixel JPEG image every 30 minutes to a remote server, for 
use in a time-lapse imaging application. Using a cron job, we pull the image 
from the attached IP cam via curl http://<local cam IP address>/img.jpg 
 >/tmp/image.jpg, and then FTP it to the remote server. This works fine when 
the 3G connection is working well (around 300-400K upload bandwidth). But 
when the 3G connection slows to a crawl, which happens multiple times each 
day, the FTP transfer hangs or times out.
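For context, here is roughly what our cron script does today, plus the kind of timeout/retry wrapping we've been bolting on. This is a sketch only: the camera address, FTP URL, and timeout values are placeholders, not our real settings, and retry_cmd is a helper we wrote, not a standard tool. It relies on curl's --connect-timeout / --max-time flags and its ability to upload over FTP with -T.

```shell
#!/bin/sh
# Sketch of a more defensive fetch-and-upload cron script.
# CAM_URL and FTP_URL are hypothetical placeholders.
CAM_URL="http://192.168.0.10/img.jpg"
FTP_URL="ftp://user:pass@remote.example.com/timelapse/"
IMG=/tmp/image.jpg

# retry_cmd TRIES DELAY CMD...: run CMD up to TRIES times,
# sleeping DELAY seconds between failed attempts.
retry_cmd() {
    tries=$1; delay=$2; shift 2
    n=0
    while [ "$n" -lt "$tries" ]; do
        "$@" && return 0
        n=$((n + 1))
        sleep "$delay"
    done
    return 1
}

if [ "$1" = "run" ]; then
    # Hard timeouts so a hung camera or stalled 3G link can't wedge cron:
    # --connect-timeout bounds TCP setup, --max-time the whole transfer.
    retry_cmd 3 10 curl -s --connect-timeout 15 --max-time 60 \
        -o "$IMG" "$CAM_URL" || exit 1
    # curl can also do the FTP upload itself with -T.
    retry_cmd 5 60 curl -s --connect-timeout 30 --max-time 600 \
        -T "$IMG" "$FTP_URL"
fi
```

Invoked from cron as "script run"; even with this wrapping, a single 2.5 Mbyte transfer still fails outright when the link degrades for long stretches.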

I tested a script that uses split to divide the 2.5 Mbyte image into smaller 
50k chunks, which are individually FTP'd and then reassembled at the server 
using cat. This works, but it will require a fair amount of experimentation 
and additional code to make it reasonably robust against missing files, 
slowdowns, timeouts, retries, etc. My question: is there a better 
approach, or existing code (for a bare-bones slack 2.4.23 environment), that 
would automate this process and reliably handle the transfer of the large 
file to the remote server under erratic bandwidth conditions?
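To be concrete, here is the shape of the chunking scheme I tested, with one robustness piece added: a checksum manifest shipped alongside the chunks so the server can verify the reassembled image before using it. The function names and paths are mine, invented for illustration; the per-chunk FTP/retry step is omitted here.

```shell
#!/bin/sh
# Sketch of the split/reassemble approach with end-to-end verification.
# CHUNK_DIR stands in for both the local staging dir and the server-side
# directory the chunks land in; in reality those are two machines.
CHUNK_DIR=/tmp/chunks

make_chunks() {   # $1 = source image; run on the camera side
    rm -rf "$CHUNK_DIR"; mkdir -p "$CHUNK_DIR"
    split -b 50k "$1" "$CHUNK_DIR/part."
    # Record a checksum so the server can verify the reassembled file.
    md5sum "$1" | awk '{print $1}' > "$CHUNK_DIR/image.md5"
}

reassemble() {    # $1 = output file; run server-side after all parts arrive
    # split's aa, ab, ac... suffixes sort lexically, so cat gets the
    # chunks back in the right order.
    cat "$CHUNK_DIR"/part.* > "$1"
    [ "$(md5sum "$1" | awk '{print $1}')" = "$(cat "$CHUNK_DIR/image.md5")" ]
}
```

The missing middle (uploading each part with retries, detecting which parts never arrived, re-requesting them) is exactly the code I'd rather not write and debug from scratch, hence the question.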

Thanks,

Tom S. 

_______________________________________________
PLUG mailing list
[email protected]
http://lists.pdxlinux.org/mailman/listinfo/plug
