According to http://www.gnu.org/manual/wget/html_mono/wget.html#SEC49 they
are not encrypted.
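For context, wget of that era took credentials via the `--http-user`/`--http-passwd` options (the user, password, and URL below are hypothetical). With Basic authentication the header is only base64-encoded, so only an https:// URL gives confidentiality on the wire. A sketch, built as a command string rather than executed:

```shell
# Hypothetical credentials -- Basic auth sends base64("user:password"),
# which is an encoding, not encryption.
USER=alice
PASS=secret
CMD="wget --http-user=$USER --http-passwd=$PASS https://www.example.com/private/file.zip"
echo "$CMD"
```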
-Original Message-
From: Laurent [mailto:[EMAIL PROTECTED]
Sent: Monday, August 04, 2003 9:57 AM
To: [EMAIL PROTECTED]
Subject: https authentication with wget
Hi to the ML.
I would like to
Hi
I was wondering if you could help me.
I am trying to use wget to download zip files from the web. The problem is
that because I use a proxy server to access the web, wget won't work. How
can I tell wget to use a specific proxy IP address and port number to
access the web?
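For what it's worth, wget takes proxy settings from the standard environment variables, or per invocation via `-e` wgetrc overrides; the proxy address and port below are placeholders. A minimal sketch:

```shell
# Route wget through a specific proxy (placeholder address/port):
export http_proxy=http://192.0.2.1:8080/
export ftp_proxy=http://192.0.2.1:8080/
# With these set, a plain invocation uses the proxy:
#   wget http://example.com/files/archive.zip
# The same can be done per run, without touching the environment:
#   wget -e use_proxy=yes -e http_proxy=192.0.2.1:8080 http://example.com/files/archive.zip
echo "$http_proxy"
```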
thank-you very
Windows (DOS) wget 1.8.2.
When I go to download anything it says:
Resolving site.. done (instant)
Connecting to site [IP]:80... connected (instant)
--- WAIT ABOUT 8-10sec each time this is run...
Continues downloading (any time to download now is network traffic, so I
don't care).
Does this
Hi all :))
After asking in the wget list (with no success), and after having
a look at the sources (a *little* look), I think that this is a bug,
so I've decided to report here.
Let's get to the matter: when I download some hierarchy through FTP,
the spaces are translated as '%20'.
I use wget 1.8.2:
-r -nH -P /usr/file/somehost.com somehost.com http://somehost.com
Bug description:
If some script http://somehost.com/cgi-bin/rd.cgi returns an HTTP header with
status 302 and redirects to http://anotherhost.com, then
the first page of http://anotherhost.com/index.html is accepted and
Hi,
I am helping a friend transfer 12 big sites/domains from one remote Linux box to
another. I currently use ftp to do this. Can I use wget (and how) to do this
easily?
Assume I have a domain example.com on 1.2.3.4 which I want to move to
4.3.2.1. I ftp from one and download/upload files; can wget be more
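A hedged sketch of the download half: wget's mirror mode can pull the whole tree from the old box over FTP, but wget only downloads, so pushing the tree to 4.3.2.1 still needs an ftp (or scp) client. The login below is a placeholder; the command is built as a string here rather than run:

```shell
# Mirror everything from the old server (placeholder credentials):
OLD_HOST=1.2.3.4
# -m (--mirror) is roughly -r -N -l inf: recursive, timestamped, unlimited depth.
CMD="wget -m ftp://user:password@$OLD_HOST/"
echo "$CMD"
# Then upload the mirrored tree to the new host with a normal ftp client.
```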
Well after further reviewing and testing I have found:
(A) Someone was able to run this and they are using straight DSL (no
router)
(B) To eliminate my windows/network config as a potential problem, I
disconnected from the router/firewall and went straight to the internet.
Now the pause was
Hi there,
I got the same problems as Volker, so I took a look into the source
code:
line 527 in CVS src/recur.c checks if it is necessary to download the file:
if (!acceptable (u->file))
This only checks the first part of the filename, e.g.
on the following URL:
On Tue, 12 Aug 2003, Tony Lewis wrote:
Daniel Stenberg wrote:
The GNU project is looking for a new maintainer for wget, as the
current one wishes to step down.
I think that means we need someone who:
1) is proficient in C
2) knows Internet protocols
3) is willing to learn the
Searching the web I found out that cygwin has wget, and there's also this:
http://kimihia.org.nz/projects/cygwget/
/a
On Wed, 13 Aug 2003, Shell Gellner wrote:
Dear Sirs,
I've downloaded the GNU software but when I try to run the WGET.exe file
it keeps telling me 'is linked to missing
I use wget 1.8.1 ported to Windows and try to mirror an FTP server. Some
directories and files there are named with Cyrillic chars, and during updating
they are copied again, but they have names like
@[EMAIL PROTECTED]@[EMAIL PROTECTED]@[EMAIL PROTECTED]@[EMAIL PROTECTED]@[EMAIL
PROTECTED]@[EMAIL
Daniel Stenberg wrote:
The GNU project is looking for a new maintainer for wget, as the current
one
wishes to step down.
I think that means we need someone who:
1) is proficient in C
2) knows Internet protocols
3) is willing to learn the intricacies of wget
4) has the time to go through
On Tue, 12 Aug 2003, Aaron S. Hawley wrote:
1) is proficient in C
That is next to common knowledge.
2) knows Internet protocols
The HTTP 1.0 and FTP (RFC 959) protocols that wget supports today are pretty basic and
no-frills. Besides, there are many proof-readers in the audience that help
identifying
Hello, Everyone :)
I am new to the list and I have a few questions. But before I ask any
questions, can anyone tell me how I can get a download of the complete
archives for this list? As long as when I export the file/files I end
up with the individual messages and not the digest format.