Randy Kramer wrote:
I just joined the list and I'm jumping the gun a little bit (because I usually
lurk on a list for a little while before posting), but...
I'm trying to save a local copy of this page with all the graphs:
http://www.businessinsider.com/what-wall-street-protesters-are-so-angry-about-2011-10?op=1
After finally finding the wget manual and the examples there, I thought I
had the right command, so I tried:
wget -p --convert-links -nH -nd -Pdownload http://www.businessinsider.com/what-wall-street-protesters-are-so-angry-about-2011-10?op=1
That saves the page, but not the graphs. Can anybody give me a clue as to
what I need to do to also save the graphs?
The images are served from a different host (e.g. static5.businessinsider.com), so -p by
itself won't fetch them. You need to add -H (--span-hosts) so that wget will also retrieve
page requisites from other hosts.
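Something along these lines should work (untested sketch; the -D list is my guess at the
hosts that actually serve the images, adjust it as needed, and quoting the URL keeps the
shell from touching the '?'):

  wget -p -H -Dbusinessinsider.com --convert-links -nH -nd -Pdownload 'http://www.businessinsider.com/what-wall-street-protesters-are-so-angry-about-2011-10?op=1'

The -D (--domains) list just keeps -H from wandering off to unrelated hosts while it picks
up the inlined images.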
Interestingly, wget is wrongly recursing into inline HTML from that page (the "code for
your site" snippet, and markup inside a <script> tag).