I'd like to download a sequence of pages produced by someone's ASP 
application so that I can read them while I'm offline.

Is there a parameter to wget that will allow me to do this?

The URL for the first page is something like

http://www.something.com/junk.asp&thepageIwant=1

I can use the "--html-extension" option to make each downloaded page get a 
.html extension, so my web browser knows what to do with the file.  However, 
I can't seem to get wget to follow the link within that page to the next 
page, because the link is given as a parameter to the ASP application.  That 
is, the page contains HTML like this:

<p>Click the following to go to the
<a href="http://www.something.com/junk.asp&thepageIwant=2">next page</a>.</p>

What I need is for wget to understand that the stuff following the "&" in 
the URL identifies a distinctly different page, and that it should go 
recursively retrieve it.  The --recursive option doesn't seem to help.
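For reference, here is roughly the sort of invocation I've been trying (the 
hostname and query parameter are just the placeholders from the example 
above, not a real site):

```shell
# Recursive fetch, limited depth, saving pages with .html extensions so a
# browser can open them offline; --convert-links rewrites the links in the
# saved pages to point at the local copies.
# The URL must be quoted, since an unquoted "&" is a shell control operator.
wget --recursive --level=5 --html-extension --convert-links \
     "http://www.something.com/junk.asp&thepageIwant=1"
```

Even with a command along those lines, wget fetches only the first page and 
doesn't descend into the linked ones.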

Any help you can give me is appreciated.

Mike
-- 
Michael D. Crawford
GoingWare Inc. - Expert Software Development and Consulting
http://www.goingware.com/
[EMAIL PROTECTED]

      Tilting at Windmills for a Better Tomorrow.

