David wrote:
In replies to the post requesting support of the “file://” scheme, requests
were made for someone to provide a compelling reason to want to do this.
Perhaps the following is such a reason.
Hi David,
thank you for your interesting example. Support for the “file://” scheme
will be
I have a CD with HTML content (it is a CD of abstracts from a scientific conference), however for space
Hi All,
first of all, keep in mind that rsync already handles local (and remote)
file/dir transfers very well; IMHO rsync is the best solution, ever, for
copying files when you have shell access.
You can have a look here: http://rsync.samba.org/ftp/rsync/rsync.html
just look at the details of
I, too, see little value in using Wget to copy files which are
accessible locally, but let's say that someone wished to add this
feature. Given a link like file:///a/b.c, what would be the
destination for the downloaded file on the local file system? How
would link conversion work?
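One way to answer the destination question is to treat file:/// URLs the way Wget already treats http:// URLs when mirroring, placing them under a synthetic host directory. This is only a sketch of that idea, not Wget's actual behavior; the function name and the "localhost" prefix are illustrative assumptions.

```python
from urllib.parse import urlparse
import os

def file_url_to_destination(url, prefix="localhost"):
    """Hypothetical mapping: mirror file:///a/b.c under a synthetic
    host directory, analogous to how Wget mirrors http://host/path."""
    parsed = urlparse(url)
    # file:///a/b.c parses with an empty netloc and path '/a/b.c'
    host = parsed.netloc or prefix
    return os.path.join(host, parsed.path.lstrip("/"))

print(file_url_to_destination("file:///a/b.c"))  # on POSIX: localhost/a/b.c
```

With such a mapping, link conversion could in principle rewrite file:/// links the same way it rewrites remote ones, since every source URL gets a deterministic local path.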
Also,
[EMAIL PROTECTED] writes:
This may be useful when some network shares are mounted to the local file
system.
*** OK, so why do you want to download the file from the local file system
to the local file system?
Because Wget can show the download speed, restart a download with
`-c', etc. I second the
I believe this is already on the todo list. However, this is made
harder by the fact that, to implement this kind of rejection, you have to
start downloading the file. This is very different from the
filename-based rejection, where the decision can be made at a very
early point in the download
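The point above is that the Content-Type is only known from the server's response, not from the URL. One common workaround is to issue a HEAD request first and decide before fetching the body. This is a sketch of that approach, not Wget's implementation; the helper names and the accept-prefix policy are illustrative assumptions, and servers that mishandle HEAD would still need a fallback.

```python
import urllib.request

def content_type_allowed(ctype, accept_prefixes=("text/",)):
    # Pure policy check on a Content-Type header value,
    # ignoring parameters such as '; charset=utf-8'.
    return ctype.split(";")[0].strip().startswith(tuple(accept_prefixes))

def mime_allowed(url, accept_prefixes=("text/",)):
    """HEAD the URL and decide before downloading the body."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return content_type_allowed(resp.headers.get("Content-Type", ""),
                                    accept_prefixes)

print(content_type_allowed("text/html; charset=utf-8"))  # True
print(content_type_allowed("application/pdf"))           # False
```

An early HEAD avoids transferring rejected bodies at all, which is exactly the cost the filename-based rejection never has to pay.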
Hi,
I am forwarding Debian wishlist bug 21148
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21148&repeatmerged=yes
While wget allows me to include/exclude documents based on their
extension,
it doesn't allow me to do the same based on mime type (for example,
if I only want to save text
Debian wishlist bug 104122
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=104122&repeatmerged=yes
It would be extremely useful to have a 'quirks' mode which would do the
following (for instance, other things can be added):
- If a URL with \ characters gets a 404, try again with s
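Reading the truncated item as "try again with slashes" (an assumption on my part), the quirk amounts to: on a 404 for a URL containing backslashes, retry once with them replaced by forward slashes. A minimal sketch, with the fetcher supplied by the caller so the retry logic stays independent of any HTTP library; all names here are hypothetical:

```python
def quirks_retry_url(url):
    """Given a URL that returned 404, propose a corrected variant to
    retry, or None. Only one quirk is sketched: '\\' -> '/'."""
    if "\\" in url:
        return url.replace("\\", "/")
    return None

def fetch_with_quirks(url, fetch):
    """`fetch` is a caller-supplied function returning (status, body);
    on 404, retry once with the quirks-corrected URL."""
    status, body = fetch(url)
    if status == 404:
        retry = quirks_retry_url(url)
        if retry is not None:
            status, body = fetch(retry)
    return status, body

# Toy fetcher standing in for a real HTTP client:
pages = {"http://example.com/a/b.html": (200, b"ok")}
fake_fetch = lambda u: pages.get(u, (404, b""))
print(fetch_with_quirks(r"http://example.com/a\b.html", fake_fetch))
# → (200, b'ok')
```

Keeping each quirk as a URL-rewriting rule like this makes it easy to add the "other things" the bug report mentions without touching the retry loop.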
Debian wishlist bug 105278
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=105278&repeatmerged=yes
It would be nice if, upon noticing that it's getting a lot of invalid
port errors, wget would automatically try a passive FTP download unless
there had been some explicit configuration
Quoting Dan Harkless ([EMAIL PROTECTED]):
the file's size). This feature would enable the writing of cool scripts to
do something like multi-threaded retrieval at file level.
[...]
Hi, Alec. You're the second person within a few days to ask for such a
feature. I've added it to the
Jan Prikryl [EMAIL PROTECTED] writes:
Hi ppl,
first of all I want to thank you for the great tool.
I think a nice feature would be to make wget able to retrieve a chunk from a
file (I suppose it shouldn't be hard to enhance the download-resume code to do
that: the start offset would be taken from the command line instead of
looking
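Over HTTP, the chunk request both posts describe maps onto the standard Range header: a start offset with an optional length becomes `bytes=start-end`, and an open-ended `bytes=start-` is exactly what resume (`-c`) already sends. A small sketch of building such a request; the helper name and example URL are illustrative, and the server must support range requests (206 Partial Content) for this to work:

```python
import urllib.request

def build_range_request(url, start, length=None):
    """Construct a GET asking for bytes [start, start+length).
    With length=None the request is open-ended ('bytes=start-'),
    matching a command-line start offset with no end."""
    end = "" if length is None else str(start + length - 1)
    req = urllib.request.Request(url)
    req.add_header("Range", f"bytes={start}-{end}")
    return req

req = build_range_request("http://example.com/big.iso", 1024, 512)
print(req.get_header("Range"))  # → bytes=1024-1535
```

This is also the primitive behind the "multi-threaded retrieval at file level" idea quoted above: each worker requests a disjoint byte range of the same file.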