Hi Gurus, 

I am very new to this tool, though I have some familiarity with CLI browsers. 
Some time back I tried wget to fetch a website and parse it, but I had a 
constraint: I could not store the HTML file on my machine and instead had to 
grep and parse the content directly. (I agree that a browser still has to 
cache/buffer the file content internally in order to render it, but that is 
implicit.) 

Here is the kind of thing I was not able to accomplish with wget:
==================================================================
pa...@hdhctdam45375 -bash-3.00# ./links -no-g -http-proxy 129.156.85.11:8080 
-dump www.google.com | wc -l
      15
pa...@hdhctdam45375 -bash-3.00# ./links -no-g -http-proxy 129.156.85.11:8080 
-dump www.google.com | grep -i search
    ________________________________________________________   Advanced Search
            [ Google Search ] [ I'm Feeling Lucky ]            Preferences
                   Search: [ ] the web [ ] pages from the UK
==================================================================
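
(For comparison, wget can at least stream a page to stdout with "-O -" so that 
nothing gets saved to disk, e.g.

    http_proxy=http://129.156.85.11:8080 wget -q -O - www.google.com | grep -i search

but as far as I know that only gives the raw, un-rendered HTML, whereas the 
-dump output above is already rendered text.)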

So far so good. But wget has options for passing a username and password to 
the website, like these (a usage sketch follows the listing):
bash-3.00# wget --help | grep user
     --user=USER               set both ftp and http user to USER.
     --http-user=USER        set http user to USER.
     --ftp-user=USER         set ftp user to USER.
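
For reference, a typical invocation using those options would look something 
like this (the host and credentials below are just placeholders, not real ones):

bash-3.00# wget --http-user=myuser --http-password=mypass \
    -O - http://intranet.example.com/page.html | grep -i search

(If your wget is older it may spell the password option --http-passwd; "-O -" 
again keeps the page on stdout instead of writing it to disk.)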


Do we have any such option in links? I hope this expectation is valid. 
(The only related option I could find in links is for the anonymous FTP user!)
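
One thing I have wondered about, though I have not verified that links honours 
it, is embedding the credentials in the URL itself, the way HTTP basic-auth 
URLs are usually written (again with placeholder host and credentials):

./links -no-g -http-proxy 129.156.85.11:8080 \
    -dump http://myuser:mypass@intranet.example.com/page.html | grep -i search

If that is not supported, is there another way to supply a username and 
password from the command line?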

Any help would be greatly appreciated. 

Regards,
Paras



