Ed [EMAIL PROTECTED] writes:
Sorry for the off-topic post, but as you can see I can't get anything else to work.
If I try to join the list at sunsite.dk I get a message in Danish
asking me to do something to verify myself, I tried translating it
but it made no sense.
How exactly are you trying to
Tony Lewis [EMAIL PROTECTED] writes:
A) This is the list for reporting bugs. Questions should go to
wget@sunsite.dk
For what it's worth, [EMAIL PROTECTED] is simply redirected to
[EMAIL PROTECTED] It is still useful to have a separate address for
bug reports, for at least two reasons. One,
Axel Boldt [EMAIL PROTECTED] writes:
I would like to advocate for a multithreading/parallel download
feature and I believe the last quoted sentence above is simply
false; parallel downloading provides considerable speedups in almost
all settings.
The most noticeable speedup in the
[EMAIL PROTECTED] writes:
Oh, I understand. I only hoped it to be known that the accepted
spelling is a bad spelling.
It is known. The documentation says:
referer = STRING
Set HTTP `Referer:' header just like `--referer=STRING'. (Note it
was the folks who wrote the HTTP spec who got the
Nicle Yang [EMAIL PROTECTED] writes:
Does wget support HTTP/1.1 chunked data?
It doesn't. Wget sends out an HTTP/1.0 request, so it shouldn't (in
theory) ever encounter chunked data.
Jochen Roderburg [EMAIL PROTECTED] writes:
Petr Kras wrote:
When a transfer is broken and needs to be resumed,
it doesn't work for files greater than 4GB (not checked for 2GB)
when the break point is beyond the 4GB (2GB) limit.
--13:58:54-- ftp://streamlib.pan.eu/Streams/TVDC_SS_01100.ts
=
Mauro Tortonesi [EMAIL PROTECTED] writes:
you're right, of course. the patch included in attachment should fix
the problem. since the new HTTP code supports Content-Disposition
and delays the decision of the destination filename until it
receives the response header, the best solution i could
Matthias Kuehn [EMAIL PROTECTED] writes:
it seems wget sorts the v4/v6 IPs before creating a table of the
returned IPs.
Wget doesn't really do that. It does prefer IPv4 addresses to IPv6
addresses, and it caches the addresses resolved (but only during a
single Wget run), but it doesn't sort
www.mail [EMAIL PROTECTED] writes:
The following command crashes wget 1.11-alpha-1 on Windows 2000 SP4:
wget --output-document=- --no-content-disposition http://www.google.com/;
Fixed now, thanks for the report.
Christopher G. Lewis [EMAIL PROTECTED] writes:
Let's edit that and comment out ENABLE_DEBUG:
config.h
/* Define if you want the debug output support compiled in. */
/* #define ENABLE_DEBUG 1 */
After futzing around for a little while, I got it to work.
That should work without any
Christopher G. Lewis [EMAIL PROTECTED] writes:
For some reason, a change that was made in log.c between 1.8 and 1.9
has broken the ability to do a build without debug enabled.
Basically, in config.h if you change ENABLE_DEBUG to 0, wget will no
longer build.
That's not how it works, you're
Jochen Roderburg [EMAIL PROTECTED] writes:
E.g., a file which was supposed to have the name B&W.txt came with the header:
Content-Disposition: attachment; filename=B&amp;W.txt;
All programs I tried (the new wget and several browsers and my own script ;-)
seemed to stop parsing at the first
Mauro Tortonesi [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Gisle Vanem [EMAIL PROTECTED] writes:
Kinda misleading that wget prints `login incorrect' here. Why
couldn't it just print the 530 message?
You're completely right. It was an ancient design decision made by me
when I wasn't
Tony Lewis [EMAIL PROTECTED] writes:
I don't think that's valid HTML. According to RFC 1866: An HTML user
agent should treat end of line in any of its variations as a word
space in all contexts except preformatted text. I don't see any
provision for end of line within the HREF attribute of
Jamie Zawinski [EMAIL PROTECTED] writes:
If I specify -O, it is able to download the data; but if wget is
picking the file name itself, it is unable to write the file
(invalid argument). Neither --restrict-file-names=unix nor
--restrict-file-names=windows affects it.
It could be that your
bruce [EMAIL PROTECTED] writes:
add to 'struct cmdline_option option_data[]'
.
.
{ "version", 'V', OPT_FUNCALL, (void *) print_version, no_argument },
{ "wait", 'w', OPT_VALUE, "wait", -1 },
{ "college-file", 'C', OPT_VALUE, "collegefile", -1 },
{ "waitretry", 0,
bruce [EMAIL PROTECTED] writes:
btw, i just saw the comment you mentioned in the init.c file. is
it possible to add this same comment to main.c?
The comment there would be misleading because options in main.c are in
fact not alphabetically sorted. They don't need to be because they
are
bruce [EMAIL PROTECTED] writes:
as you guys create/go forth in dealing with windows.. are you
focused on XP, or 2000 as well... keep in mind, there are a lot of
2000 users still around!!
Historically Wget was supposed to compile and run on Windows 98 as
well. I haven't been able to test
Herold Heiko [EMAIL PROTECTED] writes:
The --ignore-case option (and the corresponding wgetrc option) doesn't
seem to be documented in the texi.
Fixed now, thanks for the report!
bruce [EMAIL PROTECTED] writes:
i tried to follow the instructions provided with the 1.10 source on
the site...
I don't think that has anything to do with Windows 2000 vs. Windows
XP. You need to make sure that the correct windows/config*.h is being
picked for your compiler. The fact that
bruce [EMAIL PROTECTED] writes:
any idea as to who's working on this feature?
No one, as far as I know.
Note that, even if you don't import 1.11 into the current Fedora, you
can always take a look at the bugfixes in the 1.10 branch. For
example:
$ svn diff http://svn.dotsrc.org/repo/wget/tags/WGET_1_10_2 \
http://svn.dotsrc.org/repo/wget/branches/1.10
long.
This means that only the last 1 or 2 bytes should be used in the base
64 algorithm.
You're right; thanks for reporting this. I have now installed this fix:
2006-06-19 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c (base64_encode): Would read past end of STR.
Reported
I understand and agree with the reasoning behind removing the GPL as
the invariant section; but why also remove the GFDL as an invariant
section?
Robert Nicholson [EMAIL PROTECTED] writes:
When I set my acceptlist to
viewtopic.php?t=5098
I notice that when it actually does the check it only sends
thru viewtopic.php leaving off the rest of the query string
Is that intentional?
It's intentional -- the acceptlist and friends check
Robert Nicholson [EMAIL PROTECTED] writes:
When wget is traversing a url what stops it visiting that url again?
It keeps a table of visited URLs.
and assuming it checks the url is it only checking for the exact
string?
It is.
ie. different url but same response because the url it's
Robert Nicholson [EMAIL PROTECTED] writes:
It looks like some modules are going to send back this and in some
cases I'd like it to retry rather than continue just because it
thinks it got the file.
How do you propose for Wget to differentiate between 200 ok and 200
Service Temporarily
Mauro Tortonesi [EMAIL PROTECTED] writes:
Noèl Köthe wrote:
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
i wonder if it makes sense to add generic support for multiple headers
in wget, for instance by extending the --header
Mauro Tortonesi [EMAIL PROTECTED] writes:
Toni Casueps wrote:
I use Wget 1.10 for Linux. If I use -O and there was already a file
in the current directory with the same name it overwrites it, even
if I use -nc. Is this a bug or intentional?
IMVHO, this is a bug. if hrvoje does not provide a
Don Armstrong [EMAIL PROTECTED] writes:
Summary: The issue with wget.texi is that the GNU GPL is an Invariant
Section; since the GNU GPL cannot be modified anyway, this just forces
gpl.texi to always be distributed with wget.texi, even when you're
just distributing the manual.
The GPL text
Don Armstrong [EMAIL PROTECTED] writes:
On Thu, 18 May 2006, Hrvoje Niksic wrote:
If the point you're making is that someone might want to remove the
GPL text from the manual, for example to make it shorter, I guess
that's a valid concern.
Yes, that's the issue.
I see. But that's still
yy :) [EMAIL PROTECTED] writes:
I ran wget -P /tmp/.test "http://192.168.1.10" in a SUSE system (SLES 9)
and found that it saved the file in /tmp/_test.
This command works fine in RedHat; is it a bug?
I believe the bug is introduced by SuSE in an attempt to protect the
user. Try reporting it
J. Grant [EMAIL PROTECTED] writes:
Could an extra value be added which lists the average rate? average
rate: xx.xx K/s ?
Unfortunately it would have problems fitting on the line.
J. Grant [EMAIL PROTECTED] writes:
I think I may have found a bug, the ETA is listed, but not the
K/s rate, the ETA must have been calculated using an K/s rate
determined by the current time into the download.
The --.-- current download rate means that the download is currently
not
J. Grant [EMAIL PROTECTED] writes:
The --.-- current download rate means that the download is
currently
not progressing. The ETA calculation is based on the average download
rate, and is always available.
is the K/s rate not the average rate ?
No, it's the current rate.
Jeff Dickey [EMAIL PROTECTED] writes:
It would be very nice if there were a complement to the --limit-rate
parameter to specify a minimum allowable transfer rate, such as -M
or --minimum-rate; this would abort a transfer and cause wget to
terminate with a nonzero exit code when the transfer
ks [EMAIL PROTECTED] writes:
How do i define inside .wgetrc that it should not follow any links
(a href=...) inside html files, only download the files that are present
in the directory?
The only way to find which files are present *is* to follow a
href=... (and other elements). If you don't
Kat [EMAIL PROTECTED] writes:
Something like ...
wget --title="News Server #1" http://www.etc.com/latest_news.html
Doesn't work.
I believe that was intended as a suggestion, not as a description of
an existing feature.
www.mail [EMAIL PROTECTED] writes:
Something like ...
wget --title="News Server #1" http://www.etc.com/latest_news.html
So that News Server #1 appears as the console title rather than the URL
(or its possible redirect).
I think the standard Windows console application (cmd.exe) always
[ Moving the discussion to the Wget mailing list.
Jerry's patch implements a --random option that shuffles the list
of addresses returned by getaddrinfo. ]
Jerry Lundström [EMAIL PROTECTED] writes:
A user scenario could be that wget with ipv6 enabled always picks a
broken website since RFC
www.mail [EMAIL PROTECTED] writes:
Changing console title was IMHO a mistaken feature to implement in the
first place
I agree.
In the case of Windows, I believe that no console application should
alter the window title, as such applications will overwrite the title
which was specified at
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
The idea behind the feature is that you can see which URL is
*currently* being downloaded (you can specify several). That's
somewhat different than just seeing the command line. I still
consider
18 mao [EMAIL PROTECTED] writes:
[...]
To answer the question in your subject line: no, Wget does not
auto-convert downloaded text files. It strives to download all files
unchanged byte-for-byte.
i use wget to download some html files, and use vim to edit them. i
found that the newline has been
[ Please leave the Cc to [EMAIL PROTECTED] ]
18 mao [EMAIL PROTECTED] writes:
To answer the question in your subject line: no, Wget does not
auto-convert downloaded text files. It strives to download all files
unchanged byte-for-byte.
maybe that's the reason about the character
Jesse Cantara [EMAIL PROTECTED] writes:
A quick resolution to the problem is to use the -nH command line
argument, so that wget doesn't attempt to create that particular
directory. It appears as if the problem is with the creation of a
directory with a ':' in the name, which I cannot do
Maxim Brandwajn [EMAIL PROTECTED] writes:
Hi guys, I keep getting this error at random files/times:
[...]
What version of Wget are you using?
Greg Stark [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
What Wget could do to ensure consistency is unset the variable
POSIXLY_CORRECT during option processing. All other effects of
POSIXLY_CORRECT on getopt (such as use of `illegal' rather than
`invalid' in the error
Gregory Stark [EMAIL PROTECTED] writes:
However wget includes an internal copy of GNU getopt. In that case
it would be reasonable for wget to rip out this feature so that the
behaviour always matches the documentation.
You have a point, but it is not entirely true that Wget includes an
Vladimir Volovich [EMAIL PROTECTED] writes:
MT == Mauro Tortonesi writes:
are there any news on the wget update?
MT hrvoje fixed this problem more than one month ago. from the
MT ChangeLog:
i don't see the official source at ftp.gnu.org/gnu/wget/
that's what i'm asking about.
The
Is there any interest in finishing the GnuTLS support in Wget? The
support currently available in the repository can be tested using
`./configure --with-ssl=gnutls'. It should enable you to download
from SSL servers using --no-check-certificate, but it is not yet
finished. Specifically, and in
Thomas Braby [EMAIL PROTECTED] writes:
eta_hrs = (int) (eta / 3600), eta %= 3600;
Yes that also works. The cast is needed on Windows x64 because eta is
a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a
warning is issued.
The same is the case on 32-bit Windows, and also
cliff [EMAIL PROTECTED] writes:
Just an FYI since wget exposes this bug, you may see more questions about
it. The solution to my problem was
https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=186592
Specifically, removing nisplus from the hosts line in /etc/nsswitch.conf
resolved the
Maybe you should file a bug with the Fedora people. I don't think
Wget is doing anything wrong in the IPv6 department. It basically
calls getaddrinfo and accepts both types of addresses (preferring IPv4
addresses for connecting, unless specified otherwise). That
getaddrinfo should fail means
cliff [EMAIL PROTECTED] writes:
Thanks
$ gcc a.c
$ ./a.out yahoo.com
success
$ wget yahoo.com
--12:27:32-- http://yahoo.com/
=> `index.html'
Resolving yahoo.com... failed: No such file or directory.
That is not good because it means that either Wget has a so far
unencountered
Mauro Tortonesi [EMAIL PROTECTED] writes:
Scott Scriven wrote:
* Mauro Tortonesi [EMAIL PROTECTED] wrote:
wget -r --filter=-domain:www-*.yoyodyne.com
This appears to match www.yoyodyne.com, www--.yoyodyne.com,
www---.yoyodyne.com, and so on, if interpreted as a regex.
not really. it
Mauro Tortonesi [EMAIL PROTECTED] writes:
wget -r --filter=-domain:www-*.yoyodyne.com
This appears to match www.yoyodyne.com, www--.yoyodyne.com,
www---.yoyodyne.com, and so on, if interpreted as a regex.
not really. it would not match www.yoyodyne.com.
Why not?
i may be wrong, but if -
Wincent Colaiuta [EMAIL PROTECTED] writes:
Are you sure that `www-*' matches `www'?
Yes.
As far as I know `www-*' matches one w, another w, a third w, a
hyphen, then 0 or more hyphens.
That would be `www--*' or `www-+'.
Tony Lewis [EMAIL PROTECTED] writes:
Mauro Tortonesi wrote:
no. i was talking about regexps. they are more expressive
and powerful than simple globs. i don't see what's the
point in supporting both.
The problem is that users who are expecting globs will try things like
Tony Lewis [EMAIL PROTECTED] writes:
I didn't miss the point at all. I'm trying to make a completely different
one, which is that regular expressions will confuse most users (even if you
tell them that the argument to --filter is a regular expression).
Well, most users will probably not use
Zembower, Kevin [EMAIL PROTECTED] writes:
[EMAIL PROTECTED]:/tmp$ wget --timestamping --no-host-directories --glob=on
--recursive --cut-dirs=4
'ftp://xxx:[EMAIL PROTECTED]/%2Fccp1/data/shared/news/motd/qotd.txt'
If you need a double slash, you must spell it explicitly:
wget [...]
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
I don't think such a thing is necessary in practice, though; remember
that even if you don't escape the dot, it still matches the (intended)
dot, along with other characters. So for quick&dirty usage
Greg Hurrell [EMAIL PROTECTED] writes:
On 28/03/2006, at 20:43, Tony Lewis wrote:
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use `,' here. Why not write the line
Mauro Tortonesi [EMAIL PROTECTED] writes:
for instance, the syntax for --filter presented above is basically the
following:
--filter=[+|-][file|path|domain]:REGEXP
I think there should also be `url' for filtering on the entire URL.
People have been asking for that kind of thing a lot over the
Jim Wright [EMAIL PROTECTED] writes:
what definition of regexp would you be following? or would this be
making up something new?
It wouldn't be new, Mauro is definitely referring to regexps as
normally understood. The regexp API's found on today's Unix systems
might be usable, but
Thomas Braby [EMAIL PROTECTED] writes:
With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64
I was getting no ETA until late in the transfer, when I'd get things
like:
49:49:49 then 48:48:48 then 47:47:47 etc.
So I checked the eta value in seconds and it was correct, so
thomas [EMAIL PROTECTED] writes:
i tried adding '-r -l1 -A.pdf' but that removes the html page and all the
'-p' files.
How about -r -l1 -R.html? That would download the HTML and the linked
contents, but not other HTMLs.
thomas [EMAIL PROTECTED] writes:
i feel like the desired behavior is closer to -p than -r. it seems
kind of unnatural to me that --accept totally overrides -p but on
the other hand the current -A behavior is important in the context
of -r.
I think you have a point there -- -A shouldn't so
[ Moving the discussion from the patches list to the general
discussion list, followed by more people. ]
Juho Vähä-Herttua [EMAIL PROTECTED] writes:
Thank you for mentioning this feature, I forgot to explicitly mention
it in my mail. Currently wget doesn't handle the charset at all on
HTML
Juho Vähä-Herttua [EMAIL PROTECTED] writes:
It is very much related to IDN. If wget would detect the correct
charset of web page content it would be trivial to do the
conversions to IDN knowing that, this is what most web browsers do.
Wget never even detects the correct charset.
And that is
Juho Vähä-Herttua [EMAIL PROTECTED] writes:
It's not that hard, either -- you can always transform UTF-16 into
UTF-8 and work with that.
No you can't. Then the filenames in URLs that should be escaped UTF-16 will
be transformed into escaped UTF-8.
Can you elaborate on this? What I had
Juho Vähä-Herttua [EMAIL PROTECTED] writes:
On 22.3.2006, at 17:10, Hrvoje Niksic wrote:
Can you elaborate on this? What I had in mind was:
1. start with a stream of UTF-16 sequences
2. convert that into a string of UCS code points
3. encode that into UTF-8
4. now work with UTF-8
Mauro Tortonesi [EMAIL PROTECTED] writes:
unfortunately, i am not familiar with windows shell
scripting. however, if you use python something like this should work:
[...]
os.execlp ("wget", format % t)
I don't think os.execlp will work under Windows. Recent Python
versions have a more
Alain Bench [EMAIL PROTECTED] writes:
Sure: I installed Subversion, and began learning it, just to put my
hands on the said trunk (found no tar.gz snapshots?).
There are no tar.gz snapshots yet. They would be easy to
autogenerate, it's just that no one has volunteered to set up such a
script,
Denis Solovyov [EMAIL PROTECTED] writes:
Is it true or false that if --connect-timeout is set to a value
larger than timeout implemented by system libraries, it will make no
sense because system timeout will take precedence (i.e. it will
happen earlier than wget's internal timeout)?
True.
satisfied with it and would be very interested in any
suggestions and, of course, bugs. hash.c was written to be
reuse-friendly.
Also note that you can get the latest version of the file (this fix
included) from http://svn.dotsrc.org/repo/wget/trunk/src/hash.c .
2006-03-06 Hrvoje Niksic [EMAIL
Alain Bench [EMAIL PROTECTED] writes:
Seems a little bit like unusable nonsense to me. Either there is a
magic option I missed, or I'd recommend to treat Borland as C locale
(forcing comma separator and grouping by 3).
That's totally weird. I suggest we do the latter, as I don't think
all
Alain Bench [EMAIL PROTECTED] writes:
On Thursday, March 2, 2006 at 7:51:43 +0100, Hrvoje Niksic wrote:
Then the code could look like this:
Seems good to me. I can help testing, if someone compiles.
Note that you can get a free compiler from here:
http://www.borland.com/downloads
Alain Bench [EMAIL PROTECTED] writes:
the setlocale invocation should look like this:
Hum... You dropped the fallback to ANSI when GetConsoleOutputCP()
returns 0.
Ah, I didn't know it could return 0. The code was based on your
description, which said Call GetConsoleOutputCP(), get [for]
I'll be visiting Boston in mid-March, most likely March 11-18. If
some of you would like to get together and discuss life, Wget, and
everything else over a beer, please email me.
to be retained. In other words,
that Wget does so is no accident, it had to be separately coded into
path_simplify, as shown by this ChangeLog entry:
2003-11-14 Hrvoje Niksic [EMAIL PROTECTED]
(path_simplify): Don't swallow ..'s at the beginning of string.
E.g. simplify foo
Alain Bench [EMAIL PROTECTED] writes:
Call setlocale(LC_ALL, ".OCP") which will select the default OEM
charset of the current Windows language. OCP means OEM Code Page,
and console apps by default need to use this OEM charset: Probably
CP-852 for you, CP-850 for me, and so on. Here this
It would help if you posted the output with the `-d' flag. Otherwise
we can't know where exactly things went wrong. I can't connect to
that site.
Jasmo Hiltula [EMAIL PROTECTED] writes:
I tried today an easy hack for one problem. It led me to a situation
where wget should be able to follow more than 20
http-redirections. Now wget says 20 redirections exceeded. I
searched the manual even read the source code trying to find some
way to
Valery Kondakoff [EMAIL PROTECTED] writes:
I'm not a programmer, so I may be wrong, but I'm pretty sure the
problem lies in wrong ANSI/OEM character encoding conversion.
I've seen that mentioned before, but I don't know what it refers to.
What are the steps a Windows console program needs to
于斌 [EMAIL PROTECTED] writes:
I am using a version of GNU Wget 1.5.3.1
Old versions of Wget always read the HTTP socket until EOF. This does
not hold well with some modern servers, which expect the client to
only read the amount of bytes specified in the Content-Length header.
This is fixed in
Valery Kondakoff [EMAIL PROTECTED] writes:
When downloading wget displays 'a' character instead of '.' (dot) in
a file length. Here is a screenshot:
http://www.nncron.ru/temp/wget.jpg (GNU Wget 1.10.1 under WinXP
SP2). Is this a bug or this is intentional behaviour? Am I doing
smth wrong?
Dan Jacobson [EMAIL PROTECTED] writes:
Maybe it should rather vary between 0.5*wait and 1.5*wait?
There you go again making assumptions about what the user wants.
Simply looking for a more reasonable default.
Mauro Tortonesi [EMAIL PROTECTED] writes:
* The local name is copied from the header verbatim without inspecting
it for dangerous characters, such as / (on Windows also \).
* There seems to be no code to check for uniqueness of file name. So
far Wget's philosophy has been not to
Dan Jacobson [EMAIL PROTECTED] writes:
--random-wait causes the time between requests to vary between 0 and
2 * wait seconds, where wait was specified using the --wait option,
So one can no longer specify a minimum wait time!
Good point. Maybe it should rather vary between 0.5*wait and
Hrvoje Niksic [EMAIL PROTECTED] writes:
The code in question is:
if (!hs->local_file)
  {
    if (resp_header_copy (resp, "Content-Disposition",
                          hdrval, sizeof (hdrval)))
      /* Honor Content-Disposition. */
      {
        hs->local_file = xstrdup (hdrval
Mauro Tortonesi [EMAIL PROTECTED] writes:
You might want to put the URL last in line, so that the (try...)
text is not confused with what is being downloaded.
do you mean:
--13:20:41-- (try:02) http://www.tortonesi.com/
Yes, why not?
Is the empty line intentional?
yes. i thought that
Mauro Tortonesi [EMAIL PROTECTED] writes:
Resolving download.skype.com... 198.63.211.251, 212.72.49.140, 195.215.8.138
Connecting to download.skype.com|198.63.211.251|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7906384 (7,5M) [application/x-debian-package]
BTW it looks like the Content-Disposition header isn't being parsed
correctly. For example:
$ ./wget http://www.mininova.org/get/212851
--22:11:52-- http://www.mininova.org/get/212851
Resolving www.mininova.org... 83.149.119.115
Connecting to www.mininova.org|83.149.119.115|:80... connected.
Naotoshi Seo [EMAIL PROTECTED] writes:
Why does not wget do url_unescape for saved local files?
It does. It would help if you described which problem you are having,
and with which URL. See also the option --restrict-file-names.
Mauro Tortonesi [EMAIL PROTECTED] writes:
[EMAIL PROTECTED]:~/code/svn/wget/src$ ./wget http://www.tortonesi.com
--13:20:41-- http://www.tortonesi.com/ (try:02)
Resolving www.tortonesi.com... 62.149.140.31
You might want to put the URL last in line, so that the (try...)
text is not
Mauro Tortonesi [EMAIL PROTECTED] writes:
what are you exactly suggesting to do? to keep the current behavior
of -O allowing only single downloads?
Huh? -O doesn't currently allow only single downloads -- multiple
downloads are appended to the same output file.
let me rephrase. are you
Mauro Tortonesi [EMAIL PROTECTED] writes:
That seems to break the principle of least surprise as well. If such
an option is specified, maybe Wget should simply refuse to accept more
than a single URL.
what are you exactly suggesting to do? to keep the current behavior
of -O allowing only
Mauro Tortonesi [EMAIL PROTECTED] writes:
the following patch (just committed into the trunk) should solve the
problem.
I don't think that patch is such a good idea.
-O, as currently implemented, is simply a way to specify redirection.
You can think of it as analogous to command file in the
Mauro Tortonesi [EMAIL PROTECTED] writes:
you might be actually right. the real problem here is that the
semantics of -O are too generic and not well-defined.
The semantics of -O are well-defined, but they're not what people
expect. In other words, -O breaks the principle of least surprise.
Taduri, Prakash [EMAIL PROTECTED] writes:
I am having a problem while connecting to a FTP server using wget command.
There is a \ in the username i.e., \
I had checked the documentation of wget, but could not figure out any
option of overcoming \ in the username. I tried putting