Thierry Vignaud <[EMAIL PROTECTED]> writes:
> Hrvoje Niksic <[EMAIL PROTECTED]> writes:
>
>> > i'm maintaining wget in mandrake linux distribution.
>> >
>> > here're some patches we apply on top of wget:
>> [...]
>>
>> Thanks for sharing the patches. The file names imply that they
>> apply to different versions of Wget? Is this really the case?
[...]
On Thu, 11 Dec 2003, Hrvoje Niksic wrote:
> IIRC passive FTP is not documented by RFC 959
It was.
--
-=- Daniel Stenberg -=- http://daniel.haxx.se -=-
ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol
Noèl Köthe <[EMAIL PROTECTED]> writes:
> I configure passive-ftp as default in the Debian packages, too. Is
> it possible to make it the default for wget or is there a reason
> against it, which I don't see?
IIRC passive FTP is not documented by RFC 959, so it wasn't the
default. I don't have a [...]
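For a single run, the same behavior can be requested explicitly; a
minimal example, with a placeholder host:

    wget --passive-ftp ftp://ftp.example.com/pub/file.tar.gz

--passive-ftp makes Wget send PASV and open the data connection
itself, which is what clients behind firewalls or NAT usually need.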
On Thu, 11 Dec 2003 at 13:13, Thierry Vignaud wrote:
Hello,
> some patches that were site specific -- /etc/ in doc, passive ftp by
> default,...
I configure passive-ftp as default in the Debian packages, too.
Is it possible to make it the default for wget or is there a reason
against it, which I don't see?
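For reference, the packaging change is presumably a one-line setting
in the system-wide startup file (path assumed to be /etc/wgetrc):

    # /etc/wgetrc: make passive FTP the default for all users
    passive_ftp = on

Individual users can still override this in their own ~/.wgetrc with
`passive_ftp = off'.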
On Thu, 11 Dec 2003, Roman Bednarek wrote:
>
> Hi.
>
> The --convert-links option does not always work correctly with the -E
> and --restrict-file-names options.
> For example:
> index.php?s=a
> was saved as
> index.php@s=a
>
> but the links in other files were not converted with the -k option.
[...]
Currently there is no way around it. In fact, the problem might get
worse when we implement the support for the `Content-Disposition'
header.
My plan for a future version was to change the behavior of `-P' so
that it works more like Mozilla's "save entire web page". Then you
could do something like [...]
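For contrast, `-P' today only sets the directory prefix under which
downloads are stored; a minimal illustration with a placeholder URL:

    wget -P scrapbook http://example.com/story.html
    # saves the page as scrapbook/story.html

The Mozilla-like behavior would presumably gather the page and all of
its requisites together under that prefix.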
Hi,
I'm trying to create a little program which parses e-mails from Google News
Alerts, downloads the referenced stories, and creates an index of the stories
to make a sort of "virtual scrapbook" of new stories on a particular subject.
I'm using wget --page-requisites (and other options) to download [...]
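A plausible shape for that invocation (the URL is a placeholder and
the exact option set is guesswork):

    wget --page-requisites --convert-links -nd -P stories http://news.example.com/article.html

--page-requisites pulls in the images and stylesheets each story
needs, and --convert-links rewrites the saved copy to reference those
local files.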
Hi.
The --convert-links option does not always work correctly with the -E
and --restrict-file-names options.
For example:
index.php?s=a
was saved as
index.php@s=a
but the links in other files were not converted with the -k option.
I hope it can be fixed without much [...]
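A minimal reproduction, with a placeholder site, would be something
like:

    wget -r -E -k --restrict-file-names=windows "http://site/index.php?s=a"

-E appends .html to such files and --restrict-file-names=windows
rewrites the `?' to `@', so -k has to track both renames when it
converts links -- which is where the mismatch appears.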
Thierry Vignaud <[EMAIL PROTECTED]> writes:
>> Also, the French translation should be taken up with the French
>> translation team at the GNU Translation Project. I normally take my
>> translations from the TP.
>
> hum. it's an obvious typo
I know, but the problem is twofold:
1. I don't know French [...]
Hrvoje Niksic <[EMAIL PROTECTED]> writes:
> > i'm maintaining wget in mandrake linux distribution.
> >
> > here're some patches we apply on top of wget:
> [...]
>
> Thanks for sharing the patches. The file names imply that they
> apply to different versions of Wget? Is this really the case?
no
Roman Bednarek <[EMAIL PROTECTED]> writes:
> Is there a way to do this with wget?
Unfortunately, this is currently not possible (without modifying the
source code, that is).
Hi.
I am using version 1.9.1 of wget under Windows. I am trying to
recursively download some pages from a site. All pages on the site are
generated by a PHP script, index.php, and I tried to limit the pages to
download based on the arguments to the index.php script.
For example:
http://site/index.php [...]
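The natural attempt (hypothetical pattern, shown only to make the
question concrete) would be an accept rule such as:

    wget -r -A "index.php?s=a*" http://site/index.php

but, as noted elsewhere in the thread, wget 1.9.1 cannot restrict
recursion by the query-string arguments without source changes.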
Thierry Vignaud <[EMAIL PROTECTED]> writes:
> i'm maintaining wget in mandrake linux distribution.
>
> here're some patches we apply on top of wget:
[...]
Thanks for sharing the patches. The file names imply that they apply
to different versions of Wget? Is this really the case?
I'm primarily [...]
"Bretton Ford" <[EMAIL PROTECTED]> writes:
> wget --http-user=username --http-passwd=password -P=c:\temp -nd
Note that `-P=c:\temp' is wrong, `=' is allowed only with long
options. You can use `-P c:\temp' or even `-Pc:\temp'.
> http://www.sophos.com/downloads/ide/*.zip
You cannot use globbing with HTTP URLs; Wget supports globbing only
for FTP. [...]
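A common workaround is to let recursive retrieval do the matching
instead; a sketch, reusing the credentials from the original command:

    wget --http-user=username --http-passwd=password -nd -P c:\temp -r -l1 -A "*.zip" http://www.sophos.com/downloads/ide/

-r -l1 fetches the index page plus everything it links to, one level
deep, and -A "*.zip" keeps only the ZIP files -- the closest HTTP
gets to `*.zip' globbing.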
hi.
i'm maintaining wget in mandrake linux distribution.
here're some patches we apply on top of wget:
- this patch prints more useful progress info:
--- ./src/progress.c.orig 2003-09-23 22:48:10.0 +0200
+++ ./src/progress.c 2003-12-11 10:29:23.0 +0100
@@ -253,10 +253,58 @@
}
[...]
Hi Guys / Gals
I am trying to use your application to retrieve a ZIP file from the
Sophos website, but I'm receiving an error when doing so. The command that I'm
using is as follows:
wget --http-user=username --http-passwd=password -P=c:\temp -nd
http://www.sophos.com/downloads/ide/*.zip
I have [...]