>>>>> "Tom" == Tom Sharples <[email protected]> writes:

Tom> Hello, We're building a wireless 3G IP camera system that will
Tom> FTP a large (2.5Mbyte) 10 megapixel jpeg image every 30 minutes
Tom> to a remote server, for use in a time-lapse image
Tom> application. Using a cron job, we pull the image from the
Tom> attached IP cam via curl -o /tmp/image.jpg
Tom> http://<local cam IP address>/img.jpg, and then FTP it to the
Tom> remote server. This works
Tom> fine when the 3G connection is working well (around 300-400K
Tom> upload bandwidth). But when the 3G connection slows to a crawl,
Tom> which happens multiple times each day, the FTP transfer hangs or
Tom> times out.
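
For reference, curl can do the FTP push itself, and a build new enough
to have --retry can also retry and resume on its own.  A minimal sketch
of both steps follows; the camera address, server name, credentials and
paths are all made up:

    #!/bin/sh
    # Pull the current frame from the camera, then push it by FTP with curl.
    # Camera address, server, user/pass and paths are placeholders.
    curl --silent --show-error --max-time 60 \
        -o /tmp/image.jpg "http://192.168.1.64/img.jpg" || exit 1

    # --speed-limit/--speed-time abort the upload if it crawls below ~1 KB/s
    # for a minute, --retry re-tries transient failures, and -C - asks the
    # FTP server to resume a partial upload where it supports that.
    curl --silent --show-error --connect-timeout 30 \
        --speed-limit 1024 --speed-time 60 \
        --retry 5 --retry-delay 60 -C - \
        -T /tmp/image.jpg \
        "ftp://user:[email protected]/incoming/image.jpg"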

Tom> I tested a script that uses split to divide the 2.5Mbyte image
Tom> into smaller 50k chunks, which are individually ftp'd, then
Tom> reassembled at the server using cat. This works but will require
Tom> a fair amount of experimentation and additional code to make it
Tom> reasonably robust to deal with missing files, slowdowns,
Tom> timeouts, retries, etc. etc. My question - is there a better
Tom> approach or code out there (for a bare-bones slack 2.4.23
Tom> environment) that would automate this process and reliably handle
Tom> the transfer of the large file to the remote server under erratic
Tom> bandwidth conditions?
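
In case it helps, the chunk-and-reassemble idea can be scripted with
per-chunk retries plus a marker file, so the server never stitches
together a half-finished set.  Everything below (hostname, credentials,
paths, chunk size) is a placeholder, and the server end still needs a
matching job that waits for the marker before running
cat part_* > image.jpg:

    #!/bin/sh
    IMG=/tmp/image.jpg
    WORK=/tmp/chunks
    SERVER="ftp://user:[email protected]/incoming/"

    rm -rf "$WORK" && mkdir -p "$WORK"
    split -b 50k "$IMG" "$WORK/part_"

    # Upload each 50k chunk, retrying a few times with a pause in between.
    for f in "$WORK"/part_*; do
        tries=0
        until curl --silent --show-error --max-time 120 -T "$f" "$SERVER"; do
            tries=$((tries + 1))
            [ "$tries" -ge 5 ] && { echo "giving up on $f" >&2; exit 1; }
            sleep 30
        done
    done

    # Send a tiny marker last; the server-side job assembles the image only
    # after this file shows up, so partial sets are never cat'd together.
    echo ok > "$WORK/done"
    curl --silent --show-error -T "$WORK/done" "$SERVER"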

No software is going to make your 3G network not suck, but have you
tried scp or rsync?
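
If rsync is available on both ends, rsync over ssh with --partial is
probably the least code: an interrupted upload leaves the partial file
on the server, and the next attempt resumes against it instead of
starting from byte zero.  A minimal sketch, with host, user and paths
invented:

    #!/bin/sh
    # --partial keeps a half-sent file on the server instead of deleting it,
    # so the next run only has to transfer what is still missing.
    # --timeout drops a stalled link so cron can simply try again later.
    rsync --partial --timeout=120 -e ssh \
        /tmp/image.jpg [email protected]:/var/timelapse/image.jpg

As long as each retry targets the same destination filename, the
partial copy gets reused; renaming to the final time-stamped name on
the server only after the transfer completes keeps the time-lapse job
from ever picking up half an image.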


-- 
Russell Senior, President
[email protected]
