A forum has topics that are available only to members. How can I use
wget to download a copy of the pages in that case? How do I get the
proper cookies, and how do I get wget to use them correctly? I use IE
on a PC running Windows and wget on a Unix machine. I could use Lynx
on the Unix machine if needed.
Quoting Micah Cowan [EMAIL PROTECTED]:
And the only other code I found which parses the remote date is in
the part which handles the logic around the timestamping option. In
older versions this was a conditional block starting with
if (!got_head) ..., now it starts with if
Jochen Roderburg wrote:
Quoting Micah Cowan [EMAIL PROTECTED]:
Hm... that change came from the Content-Disposition fixes. I'll investigate.
OK, but I hope I am still allowed to help a little with the investigation ;-)
Oh, I'm always
Hi guys,
ohloh.net keeps track of FLOSS authors and projects and compiles some
interesting stats and numbers. Wget is listed too:
http://www.ohloh.net/projects/7947?p=Wget
(No, I'm not involved with the site in any way; I'm just a happy
visitor and registered user.)
I haven't discovered why yet, but all of addictivecode.org's internet
services went down last night around 7:30 pm PDT (02:30 UTC). The web
and ssh services were brought back up in response to an email query
around 2:30 am PDT (09:30 UTC), but it
On 9/9/07, Jochen Roderburg [EMAIL PROTECTED] wrote:
Hi,
This is now an easy case for a change ;-)
In the log output for wget -c we have the line:
The sizes do not match (local 0) -- retrieving.
In the current svn version this always shows 0 as the local size.
The variable which is
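For anyone wanting to reproduce the report, a minimal sketch (the URL
is a placeholder): interrupt a download partway, then resume it and
check the local size the log prints:
  wget http://example.com/big.file      # interrupt partway (Ctrl-C)
  wget -c http://example.com/big.file   # the "(local N)" in the log should
                                        # show the partial file's size, not 0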
On 9/12/07, Juhana Sadeharju [EMAIL PROTECTED] wrote:
A forum has topics that are available only to members. How can I use
wget to download a copy of the pages in that case? How do I get the
proper cookies, and how do I get wget to use them correctly? I use IE
on a PC running Windows and wget on a Unix
Problem:
I'm using
wget -q -T 0 -O - 'http://some.remote.host/cgi-bin/some_script?...'
to access a Perl CGI script on a remote computer running Apache httpd
that is configured with a 300-second timeout.
The script sometimes takes more than 300 seconds to begin sending
data (because there is
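One client-side stopgap is to wrap the request in a retry loop (a
sketch; output.dat is an arbitrary name, and this only helps if the
script sometimes does respond within Apache's limit):
  # Keep retrying until wget exits successfully; Apache will still
  # drop any single request that stays silent past its Timeout.
  until wget -q -T 0 -O output.dat 'http://some.remote.host/cgi-bin/some_script?...'; do
      sleep 30
  done
The real fix is on the server side: raising Apache's Timeout
directive, or having the script emit some output before the limit expires.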
When I try to execute the command (minus quotes) "wget -P
ftp.usask.ca -r -np --passive-ftp ftp://ftp.usask.ca/pub/mirrors/apple/",
wget works for a bit and then terminates with the following error:
xmalloc.c:186: failed assertion `ptr != NULL'
Abort trap
What causes this error? What does this error
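One generic way to pin down where the assertion fires (a sketch,
assuming gdb is installed on the machine) is to run wget under a
debugger and grab a backtrace:
  gdb --args wget -P ftp.usask.ca -r -np --passive-ftp \
      ftp://ftp.usask.ca/pub/mirrors/apple/
  # inside gdb:
  #   run    -- reproduce the crash
  #   bt     -- print the backtrace once the assertion aborts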
On 9/11/07, Hex Star [EMAIL PROTECTED] wrote:
When I try to execute the command (minus quotes) "wget -P
ftp.usask.ca -r -np --passive-ftp ftp://ftp.usask.ca/pub/mirrors/apple/",
wget works for a bit and then terminates with the following error:
xmalloc.c:186: failed assertion `ptr != NULL'
Abort
On 9/13/07, Josh Williams [EMAIL PROTECTED] wrote:
failed assertion means that at some point along the line, one of the
variables' values was not what it should have been.
I'll check into it. Thanks!
Ok great, thanks :)
Oh, and the configuration on which wget was running is: PowerBook G4,
1.5 GHz (PowerPC), 768 MB RAM, Mac OS X 10.4.10
Micah Cowan wrote:
I haven't discovered why yet, but all of addictivecode.org's internet
services went down last night around 7:30 pm PDT (02:30 UTC).
Note that the addictivecode.org failure was completely unrelated to the
main Wget mailing list
Hex Star wrote:
Oh, and the configuration on which wget was running is: PowerBook G4,
1.5 GHz (PowerPC), 768 MB RAM, Mac OS X 10.4.10
One crucial bit of information you've left out is which version of
Wget you're running. :)
Sorry if it took a
Hello,
If I run:
wget http://server.domain/file
how can I differentiate between a network problem that made wget fail
and the server sending back an HTTP 404 error?
(I have a use case described in Debian bug http://bugs.debian.org/422088)
I think it would be nice if the exit code of wget
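Until such differentiation exists, one workaround is to probe the
server response directly (a sketch; server.domain is the placeholder
host from the question, and the 404 is matched in the status line
that -S prints):
  # Exit status alone can't tell the cases apart here, so inspect
  # the response headers: -S prints them, --spider skips the body.
  if wget -q http://server.domain/file; then
      echo "downloaded"
  elif wget -S --spider http://server.domain/file 2>&1 | grep -q ' 404 '; then
      echo "server returned HTTP 404"
  else
      echo "network or other failure"
  fi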
Quoting Micah Cowan [EMAIL PROTECTED]:
Btw, continued downloads (wget -c) are also
broken now in this case (probably for the same reason).
Really? I've been using this Wget version for a bit, and haven't noticed
this problem. Could you give an invocation that produces this problem?
Alex Owen wrote:
Hello,
If I run:
wget http://server.domain/file
how can I differentiate between a network problem that made wget fail
and the server sending back an HTTP 404 error?
(I have a use case described in Debian bug
Todd Plessel wrote:
Q1. Is there a way that I can run wget that somehow avoids this
timeout? For example, by sending an out-of-band ack to stderr every
30 seconds so httpd does not disconnect.
By out-of-band, I mean it cannot be included in the
On 13/09/2007, Micah Cowan [EMAIL PROTECTED] wrote:
Alex Owen wrote:
I think it would be nice if the exit code of wget could be inspected
to determine whether wget failed because of a 404 error or for some
other reason.
Hi Alex,
We do plan to evaluate differentiation of exit statuses at
Micah Cowan wrote:
Todd Plessel wrote:
Q2. If not, then could the Perl CGI script be modified to spawn a
thread that writes an ack to stderr to keep httpd from timing out?
If so, can you point me to some sample code?
This would be the
Hi!
I'm doing a master's thesis on online news at the University of Oslo,
and need software that can download HTML pages based on RSS feeds.
I suspect that Wget could be modified to do this.
- Do you know if there are any ways to get Wget to read RSS files and
download new files every hour or
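Even without modifying Wget, a small wrapper script gets close (a
sketch; the feed URL is made up, and the sed line assumes each item's
<link>...</link> fits on one line):
  # Extract item URLs from the feed and hand them to wget.
  wget -q -O - http://news.example.com/feed.rss \
    | sed -n 's|.*<link>\([^<]*\)</link>.*|\1|p' > urls.txt
  wget -p -k -i urls.txt
  # Run it hourly from cron, e.g.:
  #   0 * * * * /path/to/fetch-feed.sh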
Josh Williams wrote:
On 9/12/07, Erik Bolstad [EMAIL PROTECTED] wrote:
Hi!
I'm doing a master's thesis on online news at the University of Oslo,
and need software that can download HTML pages based on RSS feeds.
I suspect that Wget could be
On 9/12/07, Erik Bolstad [EMAIL PROTECTED] wrote:
Hi!
I'm doing a master's thesis on online news at the University of Oslo,
and need software that can download HTML pages based on RSS feeds.
I suspect that Wget could be modified to do this.
- Do you know if there are any ways to get Wget to
On 9/13/07, Micah Cowan [EMAIL PROTECTED] wrote:
One crucial bit of information you've left out, is which version of Wget
you're running. :)
Oops sorry about that, the version is...
wget 1.9+cvs-dev
On 9/13/07, Hex Star [EMAIL PROTECTED] wrote:
wget 1.9+cvs-dev
Try it with either the latest release or (preferably) the Subversion
trunk and let us know if you still have the same problem. The version
you're using is an old trunk version, so we can safely assume that it
has plenty of bugs that have since been fixed
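For reference, a build-from-source sketch (the tarball name is a
placeholder; pick the newest version listed on the GNU mirror):
  # Releases are listed at http://ftp.gnu.org/gnu/wget/; substitute
  # the current version for 1.x below.
  wget http://ftp.gnu.org/gnu/wget/wget-1.x.tar.gz
  tar xzf wget-1.x.tar.gz && cd wget-1.x
  ./configure && make
  ./src/wget --version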