Matthew Palmer wrote:

> This will not work if the site contains dynamic content, and I would be
> utterly astounded if there was any client-side tool which could take a
> server-useful copy of a dynamic website.
Any WebDAV tool will do this, as long as the ISP supports
updating the web pages using WebDAV (they might know it
as "support for Windows 2000/XP web folders").  Apache
ships with mod_dav.

If the ISP doesn't have DAV support you might want to suggest
it to them, as most authoring tools now have DAV support and
work better with DAV than alternatives such as FTP.

WebDAV pulls the source of the page, not the processed HTML.

As one scary example, I've copied dynamic content websites
using a DAV filesystem and the standard Unix utilities.

  # mount both sites via davfs
  mount -t davfs https://dav.development.example.edu.au/ /mnt/development
  mount -t davfs https://dav.www.example.edu.au/ /mnt/www
  # clear out the live site, then copy the development tree into it
  rm -rf /mnt/www/*
  cp -r /mnt/development/. /mnt/www/
  umount /mnt/development
  umount /mnt/www

You can do a similar thing (called "copying a collection") with
most WebDAV clients, although you may need to copy the entire
website to local disk and then copy it from local disk to the new
website.  davfs lets you transfer websites without needing any
space on your local hard drive.  Davfs is beta software and is
occasionally really unstable.  Command-line clients like
cadaver are solid.
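For instance, a cadaver session copying a collection might look
something like this (server and collection names are made up, and
this assumes both trees live on the same DAV server, since a
server-side COPY can't generally cross to a different host):

  $ cadaver https://dav.example.edu.au/
  dav:/> copy development www
  dav:/> quit

If the two sites are on different servers you're back to the
download-then-upload route mentioned above.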

Cheers,
Glen

--
SLUG - Sydney Linux User's Group - http://slug.org.au/
More Info: http://lists.slug.org.au/listinfo/slug