I've built a Rails app that retrieves zip files from a remote server and processes them (loads the CSV files contained therein into a MySQL database).
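In case it helps to see the shape of the download code: here's a minimal sketch of the kind of Net::SFTP pull I'm doing, with a progress callback added so the log records how far the transfer got before stopping. This assumes the net-sftp gem; the host, credentials, and paths are placeholders, and `download_with_progress` is just a name I made up for illustration.

```ruby
# Sketch only: net-sftp may not be installed everywhere, so guard the require.
begin
  require 'net/sftp'
rescue LoadError
  # Gem absent; the method below still illustrates the API shape.
end

# Download a remote file, logging progress via download!'s event callback.
# Placeholder arguments -- not my real host/credentials.
def download_with_progress(host, user, password, remote, local)
  Net::SFTP.start(host, user, password: password) do |sftp|
    sftp.download!(remote, local) do |event, _downloader, *args|
      case event
      when :get
        # For :get events, args is [file, offset, data]. Logging the offset
        # shows exactly where a stalled 8.5GB transfer gave up.
        _file, offset, data = args
        puts "at offset #{offset}, received #{data.bytesize} bytes"
      when :finish
        puts 'download finished'
      end
    end
  end
end
```

The idea is that if Net::SFTP (or the server) silently bails at the 4GB mark, the last logged offset should land right at that boundary.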
The file retrieval is done using Ruby's Net::SFTP module, running in the background via DelayedJob. It works well unless the zip file is over 4GB, in which case it quits pretty much exactly after transferring 4GB of data. There are no errors whatsoever in the log -- I just see that the file download started and never finished. The file is actually 8.5GB, and downloading it manually with sftp in the terminal works just fine.

I've searched the interwebs for any information about this problem without any success. The only clue I have is that trying to unzip the files using rubyzip also failed for large zip files, because rubyzip doesn't handle the Zip64 format, which any zip file over 4GB uses. I'm wondering if Net::SFTP has a similar 32-bit limit on the size of file it can handle. Nothing in the Net::SFTP docs says anything about 32-bit limitations.

Thoughts? If anyone has a clue, I'm all ears. I'm not sure if it's a Ruby problem, a Rails problem, or perhaps a problem on the other end, with the server disconnecting after 4GB has been transferred. Odd, to say the least.

Thanks,
Chris

--
SD Ruby mailing list
[email protected]
http://groups.google.com/group/sdruby
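For what it's worth, the cutoff lines up exactly with the largest value an unsigned 32-bit field can hold, which is also why plain (non-Zip64) zip entries top out just under 4GB:

```ruby
# Largest byte count an unsigned 32-bit field can represent:
max_u32  = 2**32 - 1        # 4_294_967_295 bytes, just under 4 GiB
four_gib = 4 * 1024**3      # 4_294_967_296 bytes

file_size = (8.5 * 1024**3).to_i  # the 8.5GB file in question

puts max_u32 < four_gib     # => true: 4 GiB overflows a 32-bit length field
puts file_size > max_u32    # => true: an 8.5GB size can't fit in 32 bits
```

So a transfer that dies at almost exactly 4GB smells like a 32-bit offset or length somewhere in the stack, whether in the Ruby side or the server.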
