Hi,

could you tell us the exact URLs you are trying?

If you can't post those URLs in public, either set up a test server, or send 
them to me privately.

I think what you need is the -r/--recursive option, but I can't be sure 
without seeing the URLs.

Also, bear in mind that Wget does not support any kind of content coding. This means 
that if the content comes gzipped, Wget will not decompress it for you. So passing 
'--header="Accept-encoding: gzip"' probably won't do what you expect.
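In other words, the gzipped body ends up on disk exactly as received, and decompressing it is a separate step you have to do yourself. A minimal sketch of that step (the wget line and the example.com URL are only illustrative; the runnable part just simulates a gzipped response locally):

```shell
# Illustrative fetch -- wget would save the gzipped bytes verbatim:
#
#   wget --header="Accept-encoding: gzip" -O page.html.gz "http://example.com/"
#
# Stand-in for such a gzipped response, then the manual decompression:
printf '<html>hello</html>\n' | gzip > page.html.gz
gunzip -f page.html.gz        # produces page.html
cat page.html
```

This also shows why combining the header with -p/-r doesn't help: even if every asset arrived gzipped, each file would need this extra decompression pass before a browser-like local copy exists.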

On 09/22/2015 07:57 PM, andreas wpv wrote:
All,
I am trying to download all files of a web page - compressed if they come
compressed, and regular if not. That is, get all the files the way a
browser would.

so, this works for the html file itself:

wget --header="Accept-encoding: gzip" "url"

and this for itself works to download all elements:

wget -p -H "url"

So, now I want these combined:

wget -p -H  --header="Accept-encoding: gzip" "url"

Unfortunately this only pulls the HTML files (because where I pull them
from, they are compressed), and not the other scripts, stylesheets and so
on, although at least a few of those are compressed as well.


Ideas, tips?


Regards,
- AJ
