Michelle Konzack wrote:
On 2008-09-20 22:05:35, Micah Cowan wrote:
I'm confused. If you can successfully download the files from
HOSTINGPROVIDER in the first place, then why would a difference exist?
And if you can't, then this wouldn't be an effective way to find out.
I mean, IF you have a local (master) mirror…
…in 2006 describing what I consider a compelling reason to support
file://. Here is what I wrote then:
At 03:45 PM 26/06/2006, David wrote:
In replies to the post requesting support of the file:// scheme,
requests were made for someone to provide a compelling reason to want to
do this. Perhaps the following is such a reason. I have a CD with HTML
content…
…to be in the business of duplicating the system cp command, but I
might conceivably not mind file:// support if it means simple
_content_ transfer, and not actual file duplication.
Also in need of addressing is what recursion should mean for file://.
Between ftp:// and http://, recursion currently…
Michelle Konzack wrote:
Imagine you have a local mirror of your website and you want to know why
the site @HOSTINGPROVIDER has some extra files, or the like.
You can spider the website @HOSTINGPROVIDER recursively into a local
tmp1 directory and then…
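A minimal sketch of that comparison, assuming the site is served over
plain HTTP; the host name and the "mirror/" master-copy path are
hypothetical:

  $ mkdir tmp1
  $ wget --recursive --no-host-directories --directory-prefix=tmp1 \
        http://www.example.com/
  $ diff -rq mirror/ tmp1/

diff -rq then lists every file that differs or exists on only one side,
which covers exactly the "some extra files" case.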
Petri Koistinen wrote:
Hi,
It would be nice if wget would also support file://.
Feel free to file an issue for this (I'll mark it Needs Discussion and
set it at low priority). I'd thought there was already an issue for
this, but can't find it (either…
David wrote:
In replies to the post requesting support of the “file://” scheme,
requests were made for someone to provide a compelling reason to want to
do this. Perhaps the following is such a reason. I have a CD with HTML
content (it is a CD of abstracts from a scientific conference), however
for space…
Hi David,
Thank you for your interesting example. Support for the “file://”
scheme…
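Until wget learns file://, one way to get its link-following onto a
disc like this today is to export the CD through a throwaway local HTTP
server and mirror from that; the mount point, port, and output path
below are hypothetical:

  $ cd /media/cdrom
  $ python -m SimpleHTTPServer 8000 &
  $ wget --recursive --convert-links --directory-prefix=/tmp/subset \
        http://localhost:8000/
  $ kill %1

This gives recursive retrieval and link conversion without wget having
to know anything about the local file system.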
Hi All,
First of all, keep in mind that rsync already handles local (and
remote) file/directory transfers very well; IMHO rsync is the best
solution, ever, for copying files when you have shell access.
You can have a look here: http://rsync.samba.org/ftp/rsync/rsync.html
Just look at the details of…
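For example (the paths and host are hypothetical), the local and the
over-ssh cases look like:

  $ rsync -av /media/cdrom/ /tmp/copy/
  $ rsync -avz user@host.example.com:public_html/ mirror/

The trailing slash on the source means "the contents of this directory"
rather than the directory itself.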
I, too, see little value in using Wget to copy files which are
accessible locally, but let's say that someone wished to add this
feature. Given a link like file:///a/b.c, what would be the
destination for the downloaded file on the local file system? How
would link conversion work?
Also,…
[EMAIL PROTECTED] writes:
This may be useful when some network shares are mounted to the local
file system.
*** OK, so why do you want to download the file from the local file
system to the local file system?
Because Wget can show the download speed, restart a download with
`-c', etc. I second the…
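To make the wish concrete (the share path is hypothetical, and the
first command is exactly the syntax wget rejects today):

  $ wget -c file:///mnt/share/big.iso

A rough equivalent that already works, showing a transfer rate and
picking up an interrupted copy where it left off (assuming the partial
file is intact), is:

  $ rsync --progress --append /mnt/share/big.iso .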