Hi,
I need to download all .html and .jpg files from a server at a fixed
location, i.e. http://ip address/desig_fr/
Is it possible through wget, and if yes, how? The file names are not
fixed. In short, I need to download all the *.html and *.jpg files.
Thanks in advance.
With warm regards,
-Payal
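A minimal sketch of what such a command might look like, assuming the
placeholder address above and that the pages are linked from the
starting directory: -r recurses, -np stays below desig_fr/, and -A
keeps only the named patterns.

$ wget -r -np -A '*.html,*.jpg' http://ipaddress/desig_fr/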
Hi,
I had asked this on the rsync list, but looking at it again today I
find it is more closely related to wget and its capabilities.
I have a client uploading a few designs (25-30 MB) daily to a remote
ftp server. We download them in the morning. Since we have a slow
connection we daily waste a
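One possible approach, assuming the goal is to avoid re-fetching
designs that have not changed since the last run (host, user and path
are hypothetical): wget's -N flag compares remote timestamps against
local copies and skips files that are already up to date.

$ wget -N -r ftp://username:password@ftp.example.com/designs/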
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
Hi,
I have 5-7 user accounts in /home whose data is important. Every day
at 12:00 I want to back up their data to a different backup machine.
The remote machine has an ftp server.
Can I use wget for this? If yes, how do I proceed? I am keen to use
wget rather than rsync for this.
I want to preserve
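A minimal sketch, with one caveat: wget can only download, so it
would have to run on the backup machine and pull /home from an ftp
server on this machine (host, user and target directory are
hypothetical). A crontab entry on the backup machine for 12:00 might
look like:

0 12 * * * wget --mirror --no-host-directories -P /backup ftp://backupuser:password@source.example.com/home/

Note that ftp does not carry Unix ownership, so wget cannot preserve
file owners the way rsync can.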
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
But if I run it through crontab, where will it store the downloaded files?
I want it to store as it is in
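One way to pin this down, assuming a target directory of /backup
(hypothetical): cron starts jobs in the invoking user's home
directory, and wget writes into the current directory unless told
otherwise, so -P (--directory-prefix) names the destination
explicitly.

wget --mirror --no-host-directories -P /backup ftp://username:[EMAIL PROTECTED]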
Hi,
I am helping a friend transfer 12 big sites/domains from one remote
Linux box to another. I currently use ftp to do this. Can I use wget
(and how) to do this easily?
Assume I have a domain example.com on 1.2.3.4 which I want to shift
to 4.3.2.1. I ftp from one and download/upload files; can wget be more
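A sketch of one way to do it, assuming wget runs on the new box
(4.3.2.1) and each site is reachable over ftp (credentials and path
are hypothetical); pulling straight onto the destination box removes
the separate download/upload round trip.

$ wget --mirror --no-host-directories ftp://user:password@1.2.3.4/var/www/example.com/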
Hi all,
I need some kind of method to download only the questions and answers
listed in this url. I don't want any pictures, just questions and
their answers.
The url is http://qmail.faqts.com/
It is harder than it looks.
I want to download all the questions and answers. Can anyone suggest any
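One possible starting point, assuming the questions and answers are
ordinary HTML pages on that site (the depth of 3 is a guess): recurse
with -r and reject image files with -R so only the text pages are
kept.

$ wget -r -l3 -R jpg,jpeg,gif,png http://qmail.faqts.com/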
Hi,
I want to download all .html, .htm and .txt files from http://cr.yp.to,
so I gave,
$ wget -l7 -r -nc -A .html,.htm,.txt http://cr.yp.to
But I found that it was downloading all other files also, plus files
from domains such as joker.com and others. Can someone tell me the
exact syntax for this
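A guess at tightening this, assuming the stray files come from links
leading away from the starting point: -np refuses to ascend above the
start directory, and -D limits which domains may be followed.

$ wget -r -l7 -nc -np -D cr.yp.to -A .html,.htm,.txt http://cr.yp.to

Note that -A only filters what is kept, not what is fetched: wget
still downloads pages it needs to follow links, then deletes the ones
that do not match the accept list.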
* Fred Holmes [EMAIL PROTECTED] [2003-02-23 01:43]:
-e robots=off
-i filespec
where filespec is an ASCII file containing the list of URLs to be
downloaded.
Thanks a lot for the suggestion. I haven't got a chance to try the
first one yet, as I cannot think of such a site right now.
Thanks and
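For the -i suggestion, a minimal usage sketch (urls.txt is a
hypothetical file name): put one URL per line in a plain text file,
then pass it with -i; -e robots=off can be combined on the same
command line.

$ cat urls.txt
http://example.com/page1.html
http://example.com/page2.html
$ wget -e robots=off -i urls.txt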
Hi all,
Can I tell wget to ignore robots.txt? If so, how do I do it?
Also, if I have 10 different URLs to retrieve, can I list all of them
in a file and have wget read the file and retrieve them without any
manual intervention? How do I do it?
Thanks a lot and bye.
With warm