I've already got it working using Net::FTP. The problem is that it runs
slowly over FTP. Here is an example of what I'm trying to do:
my $h = $ftp->{handle};

foreach my $directory ( @directories ) {
    $h->cwd( $directory )
        or die "can't change to directory: $directory $!";

    my $dir_ls = $h->ls;
    foreach my $file_name ( @$dir_ls ) {
        next if substr( $file_name, 0, 1 ) eq ".";    # skip dotfiles

        # one server round trip per file to fetch its long listing
        my $dir_nfo = $h->dir( $directory . $file_name );
        my $line    = $dir_nfo->[0];
        $line =~ s/\s+/ /g;                   # collapse runs of whitespace
        my @file_nfo  = split / /, $line;
        my $file_size = $file_nfo[4];         # size field of an ls -l line
        if ( $file_size ) {
            # add to database here
        }
    }
}
$h->quit;
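If the slowness comes from issuing a separate dir() call for every file
(one server round trip each), one common fix is to fetch the long listing
once per directory and parse every line of it. A minimal sketch of that
idea, assuming the server returns ls -l style lines and that file names
contain no spaces:

foreach my $directory ( @directories ) {
    $h->cwd( $directory )
        or die "can't change to directory: $directory $!";

    # one round trip for the whole directory instead of one per file
    my $listing = $h->dir;
    foreach my $line ( @$listing ) {
        next if $line =~ /^total/;    # skip the summary line some servers emit
        $line =~ s/\s+/ /g;
        my ( $perms, $links, $owner, $group, $size, @rest ) = split / /, $line;
        next if $perms =~ /^d/;       # skip subdirectories
        my $file_name = $rest[-1];    # assumes no spaces in file names
        next if substr( $file_name, 0, 1 ) eq ".";
        if ( $size ) {
            # add to database here
        }
    }
}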
I tried using $ftp->size( $directory . $file_name ), but it seems to
return a size only for small files, at least on my OS X box.
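One thing worth checking (an assumption, since the failing server isn't
shown): the FTP SIZE command behaves differently in ASCII mode, where the
server has to compute the transfer size and may refuse or time out on
large files. Switching to binary mode first usually makes size() report
the stored byte count directly:

$ftp->binary();    # SIZE in binary mode reports the stored byte count
my $file_size = $ftp->size( $directory . $file_name );
if ( defined $file_size && $file_size != 0 ) {
    # add to database here
}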
Thanks,
Boysenberry
On Aug 1, 2005, at 5:54 PM, Philip M. Gollucci wrote:
Boysenberry Payne wrote:
I'm not sure if HEAD would work.
Basically, I'm trying to read a directory's files. After I confirm a
file exists and doesn't have zero size, I check that it has the
appropriate extension for the directory, then I add the directory
address, file name and extension to a table in our database.
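For the extension check described above, a minimal sketch, where %ext_for
(mapping each directory to its expected extension) is a hypothetical name:

# hypothetical map of directory => expected extension
my %ext_for = (
    '/images/' => 'jpg',
    '/docs/'   => 'pdf',
);

my ( $extension ) = $file_name =~ /\.([^.]+)\z/;
if ( defined $extension
     && defined $ext_for{$directory}
     && lc $extension eq $ext_for{$directory} ) {
    # add directory address, file name and extension to the database
}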
We actually do something very similar to this, involving pictures
uploaded from digital cameras to eventually be published on a website.
Cronjob1:
Poll the destination directory and move the files to a temp location.
(The destination directory is where the camera puts them.)
Cronjob2:
Poll the temp directory, move each image into its permanent
location and insert a row into our "images" table.
It's split only so that if some part breaks, the uploading from cameras
does not, and people can continue to upload from them. Digital
cameras [at least the ones the government uses :)] upload with the
same non-unique file names for each upload, so we have to process each
batch rather quickly.
I didn't write this, but I can say that in 3 years it's only crashed once,
and it makes us millions.
[snipped for brevity of course]
Cronjob1:

use Net::FTP;

my $ftp = Net::FTP->new( $PEERADDR, Debug => 0, Timeout => 30 )
    || die "Connect to server failed\n";
$ftp->login( $USERNAME, $PASSWORD )
    || die "Cannot login to FTP server\n";
$ftp->binary();

# recursive listing; assumes the server passes -R through to ls
my @files = $ftp->ls('-R');
foreach my $file ( @files ) {
    # unless ( ...some criteria... ) { next }   # condition snipped above
    $ftp->get( "$dir/$file", $localFilename );
}
$ftp->quit();
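Cronjob2 wasn't shown; here is a minimal sketch of the move-plus-insert
step it describes, assuming DBI, and using hypothetical paths, connection
details and column names:

use strict;
use warnings;
use File::Copy qw(move);
use DBI;

# hypothetical locations and connection details
my $temp_dir = '/var/spool/camera_tmp';
my $perm_dir = '/var/www/images';
my $dbh = DBI->connect( 'dbi:mysql:database=site', 'user', 'password',
                        { RaiseError => 1 } );

my $sth = $dbh->prepare(
    'INSERT INTO images (directory, file_name) VALUES (?, ?)' );

opendir my $dh, $temp_dir or die "can't open $temp_dir: $!";
foreach my $file ( grep { !/^\./ } readdir $dh ) {
    # give the file a unique name so repeated camera batches can't collide
    my $unique = time() . "_$file";
    move( "$temp_dir/$file", "$perm_dir/$unique" )
        or die "move failed for $file: $!";
    $sth->execute( $perm_dir, $unique );
}
closedir $dh;
$dbh->disconnect;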