wget 1.11 no-clobber still sending GET request

2008-01-31 Thread Charles
of a new request (in line 1434). Thanks in advance. --- Charles

Wget continue option and buggy webserver

2008-02-19 Thread Charles
producing somefile.1 . So, has there been any change to this behavior, or can this be filed as a bug/enhancement? Thanks. --- Charles

Re: Wget continue option and buggy webserver

2008-02-19 Thread Charles
On Feb 19, 2008 11:25 PM, Steven M. Schweda [EMAIL PROTECTED] wrote: From: Charles In wget 1.10, [...] Have you tried this in something like a current release (1.11, or even 1.10.2)? My wget version is 1.10.2. It isn't really a problem for me; I just want to know if this is a known

Re: Wget continue option and buggy webserver

2008-02-19 Thread Charles
On Feb 20, 2008 2:12 AM, Micah Cowan [EMAIL PROTECTED] wrote: We could have Wget treat 200 OK exactly as 416 Requested Range Not Satisfiable; but then it won't properly handle servers that legitimately do not support byte ranges for some or all files. Yes, what I would ask is that wget compare

Re: .wgetrc to have a log from where I am downloading stuff

2008-03-02 Thread Charles
be creating a simple wget wrapper $ mkdir ~/bin $ cat > ~/bin/wget #!/bin/sh echo $* >> ~/.wget_history /usr/bin/wget $* ^D $ chmod 755 ~/bin/wget $ export PATH=~/bin:$PATH --- Charles
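The flattened wrapper above can be sketched out as a runnable script. This is a cleaned-up reconstruction, not the exact text of the original post: the `~/bin` location and `~/.wget_history` file follow the thread, while quoting the arguments as `"$@"` (so URLs with spaces survive) is my addition.

```shell
# Wrapper that logs every wget invocation before delegating to the real
# binary. ~/bin and ~/.wget_history come from the thread; "$@" quoting
# is an improvement over the original $*.
mkdir -p ~/bin
cat > ~/bin/wget <<'EOF'
#!/bin/sh
echo "$@" >> ~/.wget_history
exec /usr/bin/wget "$@"
EOF
chmod 755 ~/bin/wget
export PATH=~/bin:"$PATH"
```

With `~/bin` first in `PATH`, every plain `wget URL` call is logged transparently.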

Re: .wgetrc to have a log from where I am downloading stuff

2008-03-07 Thread Charles
$* - Some notes: $(date) : captures the output of the date command '['$(date) : string concatenation $* : all the command-line arguments. If you want to customize the date format, see the man page of date. --- Charles
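A minimal sketch of the timestamped logging line being described, assuming the same history-file idea; the format string and the file name `wget_history.txt` are illustrative, and any % directives from date(1) would work.

```shell
# Prefix each logged URL with a timestamp; '+%Y-%m-%d %H:%M:%S' is one
# possible format string (see date(1) for others). The URL and file name
# here are placeholders.
stamp=$(date '+%Y-%m-%d %H:%M:%S')
echo "[$stamp] http://example.com/file" >> wget_history.txt
```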

Re: how to parse a webpage to download links of certain type?

2008-03-09 Thread Charles
-A .odf http://site-url --- Charles
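Only the `-A .odf http://site-url` fragment survives in the preview above. One possible expansion, written out as a script: the URL is the placeholder from the reply, and `-r`, `-l1`, and `-nd` are my assumptions about the surrounding flags, not quoted text.

```shell
# Hypothetical full form of the reply's fragment: recurse one level,
# keep only .odf files (-A is wget's accept list), and drop the local
# directory tree (-nd). http://site-url/ is a placeholder.
cat > fetch-odf.sh <<'EOF'
#!/bin/sh
wget -r -l1 -nd -A .odf http://site-url/
EOF
chmod +x fetch-odf.sh
```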

Re: wget aborts when file exists

2008-03-12 Thread Charles
there; not retrieving. FINISHED --20:31:41-- Downloaded: 0 bytes in 0 files I think wget 1.10.2's behavior is more correct. In any case, it did not abort for me. --- Charles

Re: wget aborts when file exists

2008-03-12 Thread Charles
On Thu, Mar 13, 2008 at 1:17 AM, Hrvoje Niksic [EMAIL PROTECTED] wrote: It assumes, though, that the preexisting index.html corresponds to the one that you were trying to download; it's unclear to me how wise that is. That's what -nc does. But the question is why it assumes that

Re: Meta-Database

2008-03-17 Thread Charles
the parser first) or a proprietary binary format. OK, that's some suggestions I have. Thanks for your time :D --- Charles.

Re: Meta-Database

2008-03-17 Thread Charles
On Mon, Mar 17, 2008 at 3:20 PM, Micah Cowan [EMAIL PROTECTED] wrote: echo http://something links echo http://anotherthing links echo wget http://something | at 23:30 wget -i links Sure, I used to do this. The only problem I have is that all the links have to be collected first
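The workflow quoted above, written out so the pieces are visible: collect URLs into a list file first, then run (or schedule) one batch download over it. The URLs and the 23:30 time are from the quoted message; everything runs offline except the commented scheduling line.

```shell
# Collect the links first, one per line, then feed the whole list to a
# single wget run with -i.
echo 'http://something'    >  links
echo 'http://anotherthing' >> links
# Schedule the batch for 23:30 via at(1), as in the quoted suggestion:
# echo 'wget -i links' | at 23:30
```

The limitation Charles raises is exactly this: the list has to be complete before `wget -i links` starts.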

Re: Meta-Database

2008-03-17 Thread Charles
On Mon, Mar 17, 2008 at 4:41 PM, Micah Cowan [EMAIL PROTECTED] wrote: Is that true? I thought wget actually read the input file in a streaming fashion. If that is the case, then I think it's possible to add links to the list while wget is already running. I don't expect that a single

Re: Save request headers

2008-03-17 Thread Charles
that it is the way you want :-). Maybe a better way is to run wget in the background so that it produces a wget-log that can be used to trace the URLs, or 'tee' the output of wget to a file. --- Charles
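The two alternatives suggested here, as a sketch; the URL is a placeholder, and in the runnable line an `echo` stands in for wget so the example works without network access.

```shell
# Two ways to keep a record of wget's output (URL is a placeholder):
# 1. Background mode appends everything to ./wget-log automatically:
#      wget -b http://example.com/file
# 2. Or tee the output while still watching it; wget writes progress to
#    stderr, hence the 2>&1. echo stands in for wget here so the sketch
#    runs offline:
echo 'simulated wget output: http://example.com/file' 2>&1 | tee wget-output.log
```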

Re: Session Info Database API concepts [regarding wget and gsoc]

2008-03-20 Thread Charles
On Fri, Mar 21, 2008 at 2:33 AM, Micah Cowan [EMAIL PROTECTED] wrote: The intent is that all the writer operations would take virtually no time at all. The sidb_read function should take at most O(N log N) time on the size of the SIDB file, and should take less than a second under normal

Re: Session Info Database API concepts [regarding wget and gsoc]

2008-03-21 Thread Charles
On Sat, Mar 22, 2008 at 12:14 AM, Micah Cowan [EMAIL PROTECTED] wrote: YAML uses UTF-8; I'm beginning to think YAML may not be what we want, though, given that the definition for a given entry may be interposed with defining content for other entries; I don't want to kludge that by

Re: cannot log on to Oracle portal/apache - full request - ignore previous

2008-03-26 Thread Charles
On Wed, Mar 26, 2008 at 11:17 PM, [EMAIL PROTECTED] wrote: Can you help me figure out how to use wget to log in to this page? Once logged in, I am intending to do a recursive download, or mirror. Normally the steps should be like these: 1. wget --post-data=uname=username&pwd=password
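The preview cuts off after step 1. A sketch of the usual two-step login-then-mirror pattern, written to a script: the hostname, form field names (uname/pwd), and login path are placeholders, and the cookie-handling flags are the standard wget options for this job rather than text quoted from the reply.

```shell
# Sketch of the login-then-mirror approach. portal.example.com, /login,
# and the uname/pwd field names are placeholders.
cat > mirror-portal.sh <<'EOF'
#!/bin/sh
# 1. Post the login form and keep the session cookie.
wget --save-cookies=cookies.txt --keep-session-cookies \
     --post-data='uname=username&pwd=password' \
     http://portal.example.com/login
# 2. Reuse the cookie jar for the recursive download.
wget --load-cookies=cookies.txt --mirror http://portal.example.com/
EOF
chmod +x mirror-portal.sh
```

`--keep-session-cookies` matters because many login cookies carry no expiry and would otherwise be discarded when the first wget exits.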

Re: Problem with libeay32.dll, ordinal 2253

2008-09-19 Thread Charles
On Wed, Sep 17, 2008 at 11:02 PM, Tobias Opialla [EMAIL PROTECTED] wrote: Hey all, I hope this is the right address, and you can help me. I'm currently trying to run a Perl script including some wget commands, but if I try to run it, it says: The ordinal 2253 could not be located in the

Variant on wget -m

2004-11-19 Thread Charles Gagnon
/ file1 file2 20041119/ file1 file2 So download what you would have with a normal wget -m, but store it in this subdirectory. Any ideas? -- Charles Gagnon | My views are my views and they http://unixrealm.com | do not represent those of anybody charlesg
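One way to get the per-day layout being asked for is to compute the directory name first and pass it to `wget -m` with `-P`/`--directory-prefix`. A sketch under those assumptions; `archive/` and the URL are placeholders, and the wget line is left commented so the sketch runs offline.

```shell
# Mirror into a dated subdirectory, e.g. archive/20041119/, matching the
# layout in the message above. archive/ and the URL are placeholders.
day=$(date +%Y%m%d)
mkdir -p "archive/$day"
# wget -m -P "archive/$day" http://example.com/
```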

Thanks RE: Makefile hassles

2005-03-02 Thread Belov, Charles
Up and running, thanks. -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Wednesday, March 02, 2005 12:34 PM To: Belov, Charles Cc: wget@sunsite.dk Subject: Re: Makefile hassles Belov, Charles [EMAIL PROTECTED] writes: I would like to use wget 1.9.1 instead

--convert-links vs. non-recursion

2005-03-02 Thread Belov, Charles
to upload/download from a PC. Thanks in advance, Charles Chas Belov

RE: --convert-links vs. non-recursion

2005-03-02 Thread Belov, Charles
need to post-edit my new files outside of wget to fix the links? Note: The target site is on a Un*x box, but I have to be able to upload/download from a PC. Thanks in advance, Charles Chas Belov

Set-Cookie parsing error

2002-10-21 Thread Charles M. Hannum
wget does not correctly handle trailing whitespace after the last ; in a Set-Cookie tag. This causes it to spew repeated `premature end of string' errors with some web sites generated by Cold Fusion. E.g., the following is legal but not accepted (quotes added for clarity): Set-Cookie: FOO=bar;
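The quoted example is truncated, so here is a made-up header in the same shape to illustrate the report: legal per the cookie spec, but with trailing whitespace after the final `;`, which is what tripped wget's parser.

```shell
# Illustration of the reported bug: a legal Set-Cookie header with
# trailing whitespace after the final ';'. The cookie name/value and
# path are made up; only the trailing-space shape matters.
header=$(printf 'Set-Cookie: FOO=bar; path=/;  ')
printf '%s\n' "$header"
```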