Re: wget -O writes empty file on failure

2006-01-24 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes:

> the following patch (just committed into the trunk) should solve the
> problem.

I don't think that patch is such a good idea.

-O, as currently implemented, is simply a way to specify redirection.
You can think of it as analogous to command > file in the shell.  In
that light, leaving empty files makes perfect sense (that's what the
shell does with nosuchcommand > foo).

Most people, on the other hand, expect -O to simply change the
destination file name of the current download (and fail to even
consider what should happen when multiple URLs are submitted to Wget).
For them, the current behavior doesn't make sense.

Until -O is changed to really just change the destination file name, I
believe the current behavior should be retained.
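The shell analogy can be seen directly: the redirection target is opened
(and truncated) before the command runs, so a failing command still leaves
an empty file behind. A minimal sketch, using `false` as a stand-in for a
failing download:

```shell
rm -f foo
# The shell opens and truncates foo *before* running the command,
# so even a command that fails leaves an empty foo behind -- just
# as wget -O foo <url> opens foo before the download starts.
false > foo || true
ls -l foo    # foo exists, size 0
```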


--page-requisites option

2006-01-24 Thread Adidi Tiflopin
Hello,

I don't have a .wgetrc. I'm on Kubuntu.
I tried with the Kubuntu wget, and I also compiled wget from
http://ftp.gnu.org/pub/gnu/wget/

This command:
wget --page-requisites http://www.cplusplus.com

doesn't download the images that are needed to display a given HTML page.

The picture at the top left of the HTML page is written like this:
<a href="/">
<img src="/img/headlogo1.gif" width="165" height="75" border="0">
</a>

For me, wget should write the image to www.cplusplus.com/img/headlogo1.gif,
but it doesn't.

Maybe I don't understand this option; if that is the case, could you
explain to me how to use wget on this address: http://www.cplusplus.com.

Sorry for my English, I'm French.

Anyway, Thanks for wget.

Cordially 

Florent Ourth



Re: wget -O writes empty file on failure

2006-01-24 Thread Mauro Tortonesi

Hrvoje Niksic wrote:

> Mauro Tortonesi [EMAIL PROTECTED] writes:
>
>> the following patch (just committed into the trunk) should solve the
>> problem.
>
> I don't think that patch is such a good idea.
>
> -O, as currently implemented, is simply a way to specify redirection.
> You can think of it as analogous to command > file in the shell.  In
> that light, leaving empty files makes perfect sense (that's what the
> shell does with nosuchcommand > foo).
>
> Most people, on the other hand, expect -O to simply change the
> destination file name of the current download (and fail to even
> consider what should happen when multiple URLs are submitted to Wget).
> For them, the current behavior doesn't make sense.
>
> Until -O is changed to really just change the destination file name, I
> believe the current behavior should be retained.


You might actually be right. The real problem here is that the semantics
of -O are too generic and not well-defined. As you say, we should split
the redirection and output-filename functions into two different commands.

In this case, the redirection command would simply write all the
downloaded data to the output without performing any transformation. On
the other hand, the output-filename command could perform more complex
operations, like saving downloaded resources to a temporary file,
parsing them for new URLs (maybe also providing a programming hook for
external parsers), and then writing the resources to their destination,
archiving them in a well-defined format in the case of multiple downloads.

What do you think?
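One thing the proposed output-filename semantics could buy is avoiding the
empty-file problem entirely, by downloading to a temporary file first. A
hypothetical sketch of that pattern (the function name and the use of
urllib are illustrative, not wget code):

```python
import os
import tempfile
import urllib.request

def save_atomically(url, dest):
    """Download url to dest without ever leaving an empty or partial
    dest behind: write to a temporary file in the same directory and
    rename it into place only once the download has succeeded."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest) or ".")
    os.close(fd)
    try:
        with urllib.request.urlopen(url) as resp, open(tmp, "wb") as out:
            out.write(resp.read())
    except Exception:
        os.unlink(tmp)       # failure: discard the temporary file
        raise
    os.replace(tmp, dest)    # success: atomically move into place
```

On failure, dest is never created; on success, the rename is atomic on
POSIX filesystems, so readers never observe a half-written file.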

--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi  http://www.tortonesi.com

University of Ferrara - Dept. of Eng.http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
Ferrara Linux User Group http://www.ferrara.linux.it


Local source port binding

2006-01-24 Thread Geoff Silver
Hi there,

I'm wondering if anyone has a patch to allow a user to select the source port
to use when running wget on the command line, or if not, if anyone would be
willing to add that?  I have an HTTP-based application which I need to ensure
is being called by the client as root, and the easiest way to do this is for
me to bind to a privileged port.  Unfortunately I've not been able to find any
command-line HTTP client (wget, curl, elinks, lynx, etc) which has the ability
to do this.

Thanks for any assistance!  (I've not subscribed to the list yet, btw, so
please CC me on any replies.  Thanks again!)

-- 
Geoff Silver
Sr. Systems Administrator
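For reference, fixing the source port of a client connection is only a few
lines at the socket level, which is all such a wget option would need to
do. A hypothetical Python sketch (the function name is illustrative); the
privileged-port enforcement described above comes from the OS, which only
lets root bind ports below 1024:

```python
import socket

def connect_from_port(host, port, source_port):
    """Open a TCP connection whose *source* port is chosen in advance.
    Binding a source port below 1024 requires root (or
    CAP_NET_BIND_SERVICE on Linux), which is what makes this usable
    as a crude 'client is running as root' check on the server side."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Allow quick reuse of the source port across successive runs.
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", source_port))   # fix the local port before connecting
    s.connect((host, port))
    return s
```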


Re: --page-requisites option

2006-01-24 Thread Steven M. Schweda
   wget -V should tell us which Wget version you are using.  1.10.2 is
the latest released version.  http://directory.fsf.org/wget.html

   Adding -d to the command may generate some useful output.
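Concretely, the two suggested checks look like this (assuming wget is on
PATH; the log file name is just an example):

```shell
# Check which wget is actually being run -- the Kubuntu package and the
# self-compiled build may be different versions:
wget -V | head -n 1

# Re-run with debug output; the log shows whether the <img> URL is
# discovered in the page and why it is or is not retrieved:
wget -d --page-requisites http://www.cplusplus.com/ 2>&1 | tee wget.log
```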



   Steven M. Schweda   (+1) 651-699-9818
   382 South Warwick Street[EMAIL PROTECTED]
   Saint Paul  MN  55105-2547


Re: wget -O writes empty file on failure

2006-01-24 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes:

> You might actually be right. The real problem here is that the
> semantics of -O are too generic and not well-defined.

The semantics of -O are well-defined, but they're not what people
expect.  In other words, -O breaks the principle of least surprise.

> In this case, the redirection command would simply write all the
> downloaded data to the output without performing any
> transformation. On the other hand, the output-filename command could
> perform more complex operations,

That seems to break the principle of least surprise as well.  If such
an option is specified, maybe Wget should simply refuse to accept more
than a single URL.
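The multiple-URL surprise follows from the redirection analogy: one
redirection means one concatenated output, not one file per URL. A
minimal shell sketch of the same effect:

```shell
# Shell analogue of: wget -O all.txt url1 url2
# A single redirection collects *both* outputs into one file,
# which is not the per-URL renaming most users expect.
{ echo "body of url1"; echo "body of url2"; } > all.txt
cat all.txt
rm -f all.txt
```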