Hi all,

We use JSch as the library to download files over SFTP in our system. The
input to the SFTP module is a folder on a remote host; this folder often
contains a lot of subfolders with many small files, and the total amount of
data can be a couple of hundred GB. Right now I have a very standard way of
downloading each file in all the subfolders on the host, so I was wondering
if there are any tricks I could use to speed things up, e.g. some sort of
batch download of a directory. After each file has been downloaded we add it
to a tar archive. I guess I could use 'scp -r', but would there be any
implications with respect to performance etc.?
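For context, the straightforward per-file recursion I mean looks roughly like
this sketch using JSch's ChannelSftp (connection/session setup omitted; class
and variable names are just for illustration):

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.SftpException;
import java.io.File;
import java.util.Vector;

public class RecursiveSftpDownload {

    // Walk remoteDir recursively and fetch each regular file into localDir.
    // Every sftp.get() is a separate request, so many small files mean many
    // round trips -- which is where the slowdown tends to come from.
    static void downloadDir(ChannelSftp sftp, String remoteDir, File localDir)
            throws SftpException {
        localDir.mkdirs();
        @SuppressWarnings("unchecked")
        Vector<ChannelSftp.LsEntry> entries = sftp.ls(remoteDir);
        for (ChannelSftp.LsEntry entry : entries) {
            String name = entry.getFilename();
            if (name.equals(".") || name.equals("..")) {
                continue; // skip directory self/parent entries
            }
            String remotePath = remoteDir + "/" + name;
            File localPath = new File(localDir, name);
            if (entry.getAttrs().isDir()) {
                downloadDir(sftp, remotePath, localPath); // recurse into subfolder
            } else {
                sftp.get(remotePath, localPath.getAbsolutePath());
            }
        }
    }
}
```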

Any tips on this?

Cheers, Håkon
_______________________________________________
JSch-users mailing list
JSch-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/jsch-users