downloading multiple files

2005-03-07 Thread Payal Rathod
Hi, I need to download all .html and .jpg files from a server at a fixed location, i.e. http://ip address/desig_fr/ Is it possible thru' wget, and if yes, how? The names of the files are not fixed. In short, I need to download all the *.html and *.jpg Thanks in advance. With warm regards, -Payal
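A minimal sketch of one way to do this, assuming the files are all linked from a page under that directory (wget can only follow links it finds on pages it fetches); the IP address below is a placeholder:

  wget -r -l1 -np -nd -A html,jpg http://192.0.2.1/desig_fr/

Here -r -l1 recurses one level from the starting page, -np refuses to ascend above /desig_fr/, -nd saves everything into the current directory without recreating the remote tree, and -A keeps only files ending in .html or .jpg. Increase -l if the pages link to each other more deeply.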

site mirroring?

2005-01-27 Thread Payal Rathod
Hi, I had asked this on the rsync list, but now today when I look at it again I find it is more closely related to wget and its capabilities. I have a client uploading a few designs (25-30 MB) daily to a remote ftp server. We download them in the morning. Since we have a slow connection we daily waste a
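For the slow-connection case, wget's mirror mode only fetches what is new or changed since the last run; a sketch with hypothetical credentials and paths:

  wget --mirror --no-host-directories -P /local/designs ftp://user:password@ftp.example.com/designs/

--mirror turns on timestamping (-N), so files that were already downloaded and are unchanged on the server are skipped on the next run. Adding -c on top of that lets wget resume a transfer that was cut off partway instead of starting the file over.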

Re: downloading files for ftp

2003-10-02 Thread Payal Rathod
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote: Payal Rathod [EMAIL PROTECTED] writes: On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote: The way to do it with Wget would be something like: wget --mirror --no-host-directories ftp://username:[EMAIL
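Spelled out with hypothetical values in place of the address the archive has redacted, the suggested command looks like:

  wget --mirror --no-host-directories ftp://backup:secret@remote.example.com/home/

One caveat: a password embedded in the URL is visible to other local users in the process list; wget also reads ~/.netrc, so putting the credentials there instead avoids that.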

downloading files for ftp

2003-10-01 Thread Payal Rathod
Hi, I have 5-7 user accounts in /home whose data is important. Every day at 12:00 I want to back up their data to a different backup machine. The remote machine has an ftp server. Can I use wget for this? If yes, how do I proceed? I am keen to use wget rather than rsync for this. I want to preserve
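Since wget can download but not upload, a job like this has to run on the receiving side, pulling from an ftp server on the machine that holds /home. A sketch of how the crontab entry on the backup machine might look, with host, account, and paths as placeholders:

  0 12 * * * wget --mirror --no-host-directories -P /backup/home -o /var/log/home-backup.log ftp://backup:secret@homes.example.com/home/

Note that wget saves the files under the local account running the job, so Unix ownership and permissions from the source machine are not preserved over ftp.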

Re: downloading files for ftp

2003-10-01 Thread Payal Rathod
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote: The way to do it with Wget would be something like: wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED] But if I run it thru' crontab, where will it store the downloaded files? I want it to store as it is in
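Wget saves into the directory it is started from, which for a cron job is normally the home directory of the account running it. The -P (--directory-prefix) option pins the destination explicitly; a sketch with placeholder values:

  wget --mirror --no-host-directories -P /backup/home ftp://backup:secret@remote.example.com/home/

With --no-host-directories the remote paths are recreated directly under /backup/home rather than under an extra hostname subdirectory.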

transferring files

2003-08-14 Thread Payal Rathod
Hi, I am helping a friend transfer 12 big sites/domains from one remote Linux box to another. I use ftp to do this. Can I use wget (and how) to do this easily? Assume I have a domain example.com on 1.2.3.4 which I want to shift to 4.3.2.1; I ftp from one and download/upload files. Can wget be more
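wget only handles half of this job: it can mirror the files off the old box but cannot upload them to the new one. A sketch of the download half, with hypothetical credentials (the upload still needs an ftp client, or scp/rsync if shell access exists):

  wget --mirror --no-host-directories -P /tmp/example.com ftp://user:secret@1.2.3.4/

For moving whole domains, ftp access to the document root is the safer source; mirroring over HTTP would only capture what the pages happen to link to.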

method to download this url

2003-06-05 Thread Payal Rathod
Hi all, I need some kind of method to download only the questions and answers listed in this url. I don't want any pictures, just the questions and their answers. The url is http://qmail.faqts.com/ It is harder than it looks. I want to download all the questions and answers. Can anyone suggest any
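A hedged starting point: recurse through the site and reject the common image suffixes. The depth and reject list below are guesses, and a database-backed site like this may use URLs that defeat suffix matching:

  wget -r -l5 -np -k -R gif,jpg,jpeg,png http://qmail.faqts.com/

-k (--convert-links) rewrites the saved pages so the question/answer links work when browsed locally.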

downloading specific files

2003-03-09 Thread Payal Rathod
Hi, I want to download all .html, .htm and .txt files from http://cr.yp.to, so I gave, $ wget -l7 -r -nc -A .html,.htm,.txt http://cr.yp.to But I found that it was downloading all other files also, plus files from domains such as joker.com and others. Can someone tell me the exact syntax for this
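By default -r stays on the starting host, so files turning up from other domains suggest redirects or host aliases are in play; a hedged guess at a tighter invocation pins the walk to cr.yp.to and below the start URL:

  wget -r -l7 -nc -np -A .html,.htm,.txt --domains=cr.yp.to http://cr.yp.to/

Note that even with -A, wget temporarily fetches pages it needs to parse for further links and deletes the ones that do not match, which can look like it is "downloading everything".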

Re: say no to robots

2003-02-24 Thread Payal Rathod
* Fred Holmes [EMAIL PROTECTED] [2003-02-23 01:43]: -e robots=off -i filespec where filespec is an ASCII file containing the list of URLs to be downloaded. Thanks a lot for the suggestion. I haven't got a chance to try the first one yet, as I cannot think of such a site right now. Thanks and
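The -e switch simply executes a .wgetrc-style command for one invocation, so the same setting can be made permanent by adding a line to ~/.wgetrc:

  robots = off

After that, every wget run ignores robots.txt without the -e flag having to be given each time.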

say no to robots

2003-02-22 Thread Payal Rathod
Hi all, Can I tell wget to ignore robots.txt? If so, how do I do it? Also, if I have 10 different URLs to retrieve, can I specify all of them in a file and ask wget to get them from the file and retrieve them without any manual intervention? How do I do it? Thanks a lot and bye. With warm
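A minimal sketch of both pieces together, with a hypothetical file name: put the ten URLs in a plain file, one per line, then point -i at it:

  wget -e robots=off -i urls.txt

wget then fetches each listed URL in turn with no further prompting.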