Hi all,

I have been using wget for a very long time, but I've never encountered
this problem before. I usually back up sites that I like, because they
tend to disappear quite often, and I use "wget -r -k" with either the -I
or -D switch, depending on whether I want to back up a section or the
whole site. I came across an interesting site that has all of its
content spread across many subdomains, of the format sub1.domain.com,
and I am looking for suggestions on how to back it up. I tried using
"-D*.domain.com" but it didn't work.

Does anyone have any suggestions? 


Thank you,

Brian Palter
