of a new request (in line 1434).
Thanks in advance.
---
Charles
producing somefile.1. So, has there been any change to this behavior, or
can this be filed as a bug/enhancement?
Thanks.
---
Charles
On Feb 19, 2008 11:25 PM, Steven M. Schweda [EMAIL PROTECTED] wrote:
From: Charles
In wget 1.10, [...]
Have you tried this in something like a current release (1.11, or
even 1.10.2)?
My wget version is 1.10.2. It isn't really a problem for me; I just
want to know if this is a known issue.
On Feb 20, 2008 2:12 AM, Micah Cowan [EMAIL PROTECTED] wrote:
We could have Wget treat 200 OK exactly as 416 Requested Range Not
Satisfiable; but then it won't properly handle servers that legitimately
do not support byte ranges for some or all files.
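For context, the responses being contrasted look roughly like this (sketched headers, not captured from a real server); when wget resumes with -c it sends a Range header, and the server may answer any of:

```
HTTP/1.1 206 Partial Content                   (server honored the range)
Content-Range: bytes 1024-4095/4096

HTTP/1.1 416 Requested Range Not Satisfiable   (range starts at or past the
                                                end; file already complete)

HTTP/1.1 200 OK                                (server ignored the Range
Content-Length: 4096                            header; full body follows)
```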
Yes, what I would ask is that wget compare
be
creating a simple wget wrapper
$ mkdir -p ~/bin
$ cat > ~/bin/wget
#!/bin/sh
echo "$*" >> ~/.wget_history
exec /usr/bin/wget "$@"
^D
$ chmod 755 ~/bin/wget
$ export PATH=~/bin:$PATH
---
Charles
echo '['$(date)'] '$* >> ~/.wget_history
-
Some notes:
$(date) : captures the output of the date command
'['$(date) : string concatenation (prefixes the timestamp with a bracket)
$* : all the command-line arguments
If you want to customize the date format, see the man page of date.
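Putting the notes together, the logging line expands like this (the sample arguments and the temporary log path below are only illustrations, not from the thread):

```shell
# Simulate the wrapper's positional parameters with a sample invocation.
set -- -c http://example.com/file.iso

# The wrapper's logging line: a bracketed timestamp, then all arguments,
# appended to a history file (a temp file here instead of ~/.wget_history).
log=$(mktemp)
echo '['$(date)'] '"$*" >> "$log"

cat "$log"
```

Each run appends one line, so the history file becomes a chronological record of every wget invocation.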
---
Charles
-A .odf http://site-url
---
Charles
there; not retrieving.
FINISHED --20:31:41--
Downloaded: 0 bytes in 0 files
I think wget 1.10.2's behavior is more correct. Anyway, it did not abort
in my case.
---
Charles
On Thu, Mar 13, 2008 at 1:17 AM, Hrvoje Niksic [EMAIL PROTECTED] wrote:
It assumes, though, that the preexisting index.html corresponds to
the one that you were trying to download; it's unclear to me how
wise that is.
That's what -nc does. But the question is why it assumes that
the parser first) or a proprietary
binary format.
OK, those are some suggestions I have. Thanks for your time :D
---
Charles.
On Mon, Mar 17, 2008 at 3:20 PM, Micah Cowan [EMAIL PROTECTED] wrote:
echo http://something >> links
echo http://anotherthing >> links
echo wget http://something | at 23:30
wget -i links
Sure, I used to do this. The only problem I have is that all the links
have to be collected first
On Mon, Mar 17, 2008 at 4:41 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Is that true? I thought wget actually read the input file in a streaming
fashion.
If that is the case, then I think it's possible to add links to the
list while wget is already running.
I don't expect that a single
that it is the way you want :-).
Maybe a better way is to run wget in the background so that it produces
a wget-log that can be used to trace the URLs, or to 'tee' the output of
wget to a file.
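Concretely, the two alternatives look like this; fetch is a stand-in for wget (which writes its progress to stderr), and the real commands are kept in the comments:

```shell
# Background mode: `wget -b -i links` writes progress to ./wget-log,
# which can be followed with `tail -f wget-log`.

# Foreground with tee (real command: wget -i links 2>&1 | tee wget.log):
fetch() { echo "fetching: $1" >&2; }   # stand-in logging to stderr like wget
fetch http://example.com/file 2>&1 | tee wget.log

cat wget.log
```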
---
Charles
On Fri, Mar 21, 2008 at 2:33 AM, Micah Cowan [EMAIL PROTECTED] wrote:
The intent is that all the writer operations would take virtually no
time at all. The sidb_read function should take at most O(N log N) time
on the size of the SIDB file, and should take less than a second under
normal
On Sat, Mar 22, 2008 at 12:14 AM, Micah Cowan [EMAIL PROTECTED] wrote:
YAML uses UTF-8; I'm beginning to think YAML may not be what we want,
though, given that the definition for a given entry may be interposed
with defining content for other entries; I don't want to kludge that by
On Wed, Mar 26, 2008 at 11:17 PM, [EMAIL PROTECTED] wrote:
Can you help me figure out how to use wget to log in to this page? Once
logged in, I am intending to do a recursive download, or mirror.
Normally the steps should be like these:
1. wget --post-data='uname=username&pwd=password'
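A fuller sketch of that sequence (the uname/pwd field names and URLs are placeholders; the real field names have to be read from the login form's HTML, so the network-touching wget calls are left commented out):

```shell
# The two form fields must be joined with '&', which mail archives often eat:
post_data='uname=username&pwd=password'

# 1. POST the credentials and save the session cookie:
#    wget --save-cookies cookies.txt --keep-session-cookies \
#         --post-data "$post_data" http://example.com/login
# 2. Reuse the cookie for the recursive download / mirror:
#    wget --load-cookies cookies.txt -m http://example.com/members/

echo "$post_data"
```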
On Wed, Sep 17, 2008 at 11:02 PM, Tobias Opialla
[EMAIL PROTECTED] wrote:
Hey all,
I hope this is the right address, and you can help me.
I'm currently trying to run a Perl script including some wget commands, but if
I try to run it, it says:
The ordinal 2253 could not be located in the
/
file1 file2
20041119/
file1 file2
So: download everything a normal wget -m would fetch, but store it in
this subdirectory.
Any ideas?
--
Charles Gagnon | My views are my views and they
http://unixrealm.com | do not represent those of anybody
charlesg
Up and running, thanks.
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, March 02, 2005 12:34 PM
To: Belov, Charles
Cc: wget@sunsite.dk
Subject: Re: Makefile hassles
Belov, Charles [EMAIL PROTECTED] writes:
I would like to use wget 1.9.1 instead
to
upload/download from a PC.
Thanks in advance,
Charles Chas Belov
need to post-edit my new files outside of wget to fix the links?
Note: The target site is on a Un*x box, but I have to be able to
upload/download from a PC.
Thanks in advance,
Charles Chas Belov
wget does not correctly handle trailing whitespace after the last ; in
a Set-Cookie header. This causes it to spew repeated `premature end of
string' errors with some web sites generated by Cold Fusion. E.g.,
the following is legal but not accepted (quotes added for clarity):
"Set-Cookie: FOO=bar; "