> Images are loaded from other sites. Use -H to allow Wget to go to
> other sites and -D feedle.com to limit the search to *.feedle.com.

But -H together with -m would download the whole internet, and I don't
want that. Sometimes it creates problems, as I described before:

http://www.mail-archive.com/wget%40sunsite.dk/msg07441.html

Ideally, in this particular case (i.e. http://andyk.feedle.com/) I
would like total control over what is downloaded (or mirrored, really)
and where:

1. I would like to download all graphics from the page, even those
   hosted on other servers.

2. I would like the link http://skrypty.feedle.com/ to be mirrored
   into a local "skrypty" directory, and the mirrored andyk page
   should link to that mirrored skrypty page. The same goes for
   several other links: I would like each of them downloaded into a
   separate directory named by me, reachable both through the
   mirrored andyk page and on its own... However, one or two links of
   my choosing on the andyk page should be left intact.

I guess such requirements are too much for wget to cope with in one
go? I suppose I'll have to mirror each page separately, because only
then can I name the directory it gets downloaded into.

> -p should probably go to other sites as well by default, but it
> doesn't do so yet.

Well, that's what I wanted to use this option for! The manual says
about -p:

"Note that Wget will behave as if -r had been specified, but only that
single page and its requisites will be downloaded. Links from that
page to external documents will not be followed. Actually, to download
a single page and all its requisites (even if they exist on separate
websites), and make sure the lot displays properly locally, this
author likes to use a few options in addition to -p:"

So I suggest correcting the manual, because it's misleading! :(

> Wget -- *many* bugs have been fixed since 1.7.

I'm going to send a note about that to the admin.

a.
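P.S. For concreteness, a sketch of the kind of invocations discussed
above. All flags here are documented wget options; whether they
combine the way I want is exactly my question, and the "skrypty"
directory name is just the one from my example:

```shell
# Mirror andyk.feedle.com, letting wget span onto other hosts (-H)
# but restricting the spanning to *.feedle.com (-D):
wget -m -H -Dfeedle.com http://andyk.feedle.com/

# Fetch one page plus all its requisites (images, CSS), even from
# other sites, and rewrite links (-k) so it displays locally -- the
# combination the manual's author recommends alongside -p:
wget -p -H -k http://andyk.feedle.com/

# Mirror a linked site separately into a directory of one's own
# choosing (-P sets the prefix, -nH drops the hostname directory);
# wiring the two local mirrors together would still be manual work:
wget -m -nH -P skrypty http://skrypty.feedle.com/
```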