Actually, it could be worse. Because you're dealing with network latency and
the chance that the connection will be dropped, it's not as predictable as
working with the local file system.

My suggestion would be to use a queued implementation. Log into the FTP
server and get the full contents of the directory. For each directory, push
the path into an array of directories to visit, and push all the files into a
download array. Then, when you've finished gathering all the files for
download, you can begin recreating the folder structure and downloading the
files into their proper directories.
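The queue approach above can be sketched roughly like this (Python here just for illustration; `list_dir` is a hypothetical stand-in for whatever a single FTP directory listing call returns, and the toy `tree` dict stands in for the remote server):

```python
from collections import deque

def gather_paths(list_dir, root):
    """Breadth-first walk of a remote tree, no recursion.

    list_dir(path) must return (subdirs, files) for that path --
    a stand-in for one FTP directory listing call.
    Returns the directories to recreate and the files to download.
    """
    dirs_to_create = []
    files_to_download = []
    queue = deque([root])
    while queue:
        path = queue.popleft()
        subdirs, files = list_dir(path)
        for d in subdirs:
            full = path.rstrip("/") + "/" + d
            dirs_to_create.append(full)   # recreate locally later
            queue.append(full)            # visit later instead of recursing
        for f in files:
            files_to_download.append(path.rstrip("/") + "/" + f)
    return dirs_to_create, files_to_download

# Toy listing standing in for the FTP server.
tree = {
    "/": (["a"], ["top.txt"]),
    "/a": (["b"], ["one.txt"]),
    "/a/b": ([], ["two.txt"]),
}
dirs, files = gather_paths(lambda p: tree[p], "/")
# dirs  -> ["/a", "/a/b"]
# files -> ["/top.txt", "/a/one.txt", "/a/b/two.txt"]
```

One nice side effect of separating the gathering pass from the download pass is that you can retry a dropped connection mid-download without re-walking the whole tree.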

I used the same implementation for a webspider to generate sitemaps, and it
worked *way* faster than the recursive method I had initially come up with.

Cheers,

Kevin

-----Original Message-----
From: Jeff Chastain [mailto:[EMAIL PROTECTED] 
Sent: February 22, 2006 1:43 PM
To: CF-Talk
Subject: Re: Get Entire Directory Tree via FTP

Well, in the sometimes you have to do what you have to do category ... it
can't be much worse than recursively deleting an entire folder structure
which CF does not allow us to do either.  That was why I asked if there were
any better ideas for doing this when all I have is an FTP connection to this
folder structure and I need to download the entire thing and parse it.

Thanks
-- Jeff



~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
Message: http://www.houseoffusion.com/lists.cfm/link=i:4:233172
Archives: http://www.houseoffusion.com/cf_lists/threads.cfm/4
Subscription: http://www.houseoffusion.com/lists.cfm/link=s:4
Unsubscribe: http://www.houseoffusion.com/cf_lists/unsubscribe.cfm?user=89.70.4
Donations & Support: http://www.houseoffusion.com/tiny.cfm/54