Give rsync a try. If you don't want the overhead of ssh encryption, I believe you can still have rsync use the r commands (rsh) as the transport. Can't remember the details; it's been a long time.
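Something like this, roughly, for the transfer leg of the cron job (untested; the camera address, key file, remote host, and paths are all made up, and the exact flags are worth checking against your rsync version):

    #!/bin/sh
    # Grab the current frame from the camera, then hand it to rsync.
    curl -s http://192.168.1.64/img.jpg > /tmp/image.jpg || exit 1

    # --partial              keep a partial file so a retry can reuse it
    # --timeout=120          give up on a stalled link instead of hanging
    # --checksum             compare content, not just size and mtime
    # --remove-source-files  delete the local copy only after success
    rsync --partial --timeout=120 --checksum --remove-source-files \
        -e "ssh -i /root/.ssh/camera_key" \
        /tmp/image.jpg user@remote.example.com:/data/timelapse/

    # To skip encryption and use the r commands instead:  -e rsh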
Rsync options can also take care of deleting the file once it has been successfully transferred. As a success check, you may want to look at the option that makes rsync compare by checksum rather than just file size. One question, though: does each file have a unique name? If not, a re-attempt 30 seconds later could delete the previous file.

-pete

On Wed, Apr 27, 2011 at 9:10 AM, Tom Sharples <[email protected]> wrote:
> Hello,
>
> We're building a wireless 3G IP camera system that will FTP a large
> (2.5 Mbyte) 10-megapixel jpeg image every 30 minutes to a remote server,
> for use in a time-lapse image application. Using a cron job, we pull the
> image from the attached IP cam via curl http://<local cam IP address>/img.jpg
> >/tmp/image.jpg, and then FTP it to the remote server. This works fine
> when the 3G connection is working well (around 300-400K upload bandwidth).
> But when the 3G connection slows to a crawl, which happens multiple times
> each day, the FTP transfer hangs or times out.
>
> I tested a script that uses split to divide the 2.5 Mbyte image into
> smaller 50k chunks, which are individually ftp'd, then reassembled at the
> server using cat. This works, but it will require a fair amount of
> experimentation and additional code to make it reasonably robust against
> missing files, slowdowns, timeouts, retries, etc. My question: is there a
> better approach or code out there (for a bare-bones slack 2.4.23
> environment) that would automate this process and reliably handle the
> transfer of the large file to the remote server under erratic bandwidth
> conditions?
>
> Thanks,
>
> Tom S.
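For the split-and-reassemble route described above, a rough sketch of the upload side (untested; the FTP URL, credentials, chunk prefix, and timings are all placeholders):

    #!/bin/sh
    # Split the image into 50k pieces, then push each piece over FTP,
    # retrying until it goes through. Each piece is removed once sent.
    split -b 50k /tmp/image.jpg /tmp/chunk.
    for piece in /tmp/chunk.*; do
        # --retry handles transient failures; --max-time caps a stalled
        # transfer instead of letting it hang.
        until curl --retry 5 --max-time 60 -T "$piece" \
                   --user camuser:campass \
                   ftp://remote.example.com/incoming/; do
            sleep 30    # back off during a slow spell, then try again
        done
        rm -f "$piece"
    done
    # On the server, reassemble once all pieces have arrived:
    #   cat /incoming/chunk.* > image.jpg

Giving each image's chunks a unique prefix (a timestamp, say) would also sidestep the overwrite problem mentioned above.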
