Oliver Fromme wrote:
O. Hartmann <[email protected]> wrote:
> I run into a problem I cannot solve. I need to fetch a whole directory
> tree from a public remote site. The top-level directory and its
> subdirectories are accessible via ftp:// and http://, so I tried fetch,
> but fetch only retrieves data on a per-file basis and does not copy a
> whole directory tree recursively. The remote site does not offer
> sftp/sshd for that purpose.
>
> Is there a simple way to perform such a task with FreeBSD's own tools (I
> try to avoid installing 'wget' and siblings)? I need to keep it simple;
> the task should be performed via a cron job.

I'm afraid you can't do that with FreeBSD base tools.

An alternative to wget would be "omi" (ports/ftp/omi)
which is a simple FTP mirroring tool, written in C
without any dependencies.  Usage is simple:

$ omi -s server.name.com -r /remote/dir -l ./local/dir

Note that, by default, it tries to synchronize the local
dir perfectly, i.e. if the remote dir is empty, it will
wipe out the local dir.  (The option "-P 0" will prevent
omi from removing anything.)
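Since the original question asked for a cron job, the command above could be scheduled directly from a crontab; a minimal sketch, reusing the placeholder server and paths from the example (the "-P 0" flag keeps omi from deleting local files, as noted above):

```shell
# crontab entry: mirror the remote dir nightly at 03:15.
# server.name.com and both paths are placeholders -- adjust for your site.
15 3 * * *  /usr/local/bin/omi -s server.name.com -r /remote/dir -l ./local/dir -P 0
```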

Best regards
   Oliver


Thanks for the many answers.

I tried 'omi', but the tool does not seem to traverse deeper than one level into a directory, so subdirectories are left out. I will try wget, although that tool would not be my first choice.
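If wget is also unwanted, another workaround is a small script using Python's standard ftplib to walk the remote tree. This is only a sketch, not anything from the thread: the host name and directory paths below are placeholders, and it relies on the common FTP-server behavior that CWD to a plain file fails with a permission error.

```python
#!/usr/bin/env python
"""Minimal recursive FTP mirror sketch using only the standard library."""
import ftplib
import os


def mirror(ftp, remote_dir, local_dir):
    """Recursively copy remote_dir on the ftp connection into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        if name in (".", ".."):
            continue
        remote = remote_dir.rstrip("/") + "/" + name
        local = os.path.join(local_dir, name)
        try:
            ftp.cwd(remote)              # succeeds only for directories
            mirror(ftp, remote, local)   # recurse into the subdirectory
        except ftplib.error_perm:        # not a directory: fetch the file
            with open(local, "wb") as f:
                ftp.retrbinary("RETR " + remote, f.write)
    ftp.cwd("/")


if __name__ == "__main__":
    ftp = ftplib.FTP("server.name.com")  # placeholder host
    ftp.login()                          # anonymous login
    mirror(ftp, "/remote/dir", "./local/dir")
    ftp.quit()
```

Unlike omi's default, this never deletes anything locally; it only adds or overwrites files.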


Thanks,
Oliver
_______________________________________________
[email protected] mailing list
http://lists.freebsd.org/mailman/listinfo/freebsd-questions
To unsubscribe, send any mail to "[email protected]"