Thanks everyone for the replies so far...

The problem I am having is that the customer's site uses ASP and JavaScript. The URL stays 
the same as I click through the links, so simply running "wget URL" for the page I want may 
not work (I may be wrong). Any suggestions on how I can tackle this?
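
For reference, here is the kind of thing I was planning to try, assuming the links
actually submit a form via POST and the session is kept in a cookie. The URLs and
field names below (login.asp, report.asp, user, pass, reportId) are just placeholders
I would have to dig out of the page source, not the real ones:

    # Log in once and save the session cookie.
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'user=MYUSER&pass=MYPASS' \
         -O login-result.html http://customer.example.com/login.asp

    # Fetch the page I want, replaying the form fields that the
    # JavaScript link would normally submit.
    wget --load-cookies cookies.txt \
         --post-data 'reportId=123' \
         -O page-source.html http://customer.example.com/report.asp

I don't know yet whether that is the right approach for this particular site.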

Thanks,
Suhas

----- Original Message ----- 
From: "Hrvoje Niksic" <[EMAIL PROTECTED]>
To: "Suhas Tembe" <[EMAIL PROTECTED]>
Cc: <[EMAIL PROTECTED]>
Sent: Monday, October 06, 2003 5:19 PM
Subject: Re: Web page "source" using wget?


> "Suhas Tembe" <[EMAIL PROTECTED]> writes:
> 
> > Hello Everyone,
> >
> > I am new to this wget utility, so pardon my ignorance.. Here is a
> > brief explanation of what I am currently doing:
> >
> > 1). I go to our customer's website every day & log in using a User Name & Password.
> > 2). I click on 3 links before I get to the page I want.
> > 3). I right-click on the page & choose "view source". It opens it up in Notepad.
> > 4). I save the "source" to a file & subsequently perform various tasks on that 
> > file.
> >
> > As you can see, it is a manual process. What I would like to do is
> > automate this process of obtaining the "source" of a page using
> > wget. Is this possible? Maybe you can give me some suggestions.
> 
> It's possible, in fact it's what Wget does in its most basic form.
> Disregarding authentication, the recipe would be:
> 
> 1) Write down the URL.
> 
> 2) Type `wget URL' and you get the source of the page in a file named
>    SOMETHING.html, where SOMETHING is the file name that the URL ends
>    with.
> 
> Of course, you will also have to specify the credentials to the page,
> and Tony explained how to do that.
> 
