Re: Wget 1.9-beta2 is available for testing

2003-10-01 Thread DervishD
Hi Hrvoje :) * Hrvoje Niksic [EMAIL PROTECTED] dixit: This beta includes several important bug fixes since 1.9-beta1, most notably the fix for correct file name quoting with recursive FTP downloads. That works, at least for me. I've tested with the ftp repository that previously
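For anyone who wants to reproduce this kind of test, a minimal sketch of a recursive FTP retrieval against a directory containing awkwardly named files might look like the following; the host and path here are placeholders, not the repository mentioned above:

    # Hypothetical test: recursively fetch a directory whose files contain
    # spaces or other characters that need quoting in FTP commands.
    wget --recursive --level=2 ftp://ftp.example.org/pub/test-dir/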

Re: Option to save unfollowed links

2003-10-01 Thread Hrvoje Niksic
[ Added Cc to [EMAIL PROTECTED] ] Tony Lewis [EMAIL PROTECTED] writes: The following patch adds a command line option to save any links that are not followed by wget. For example: wget http://www.mysite.com --mirror --unfollowed-links=mysite.links will result in mysite.links containing all
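Assuming the proposed patch behaves as described, usage would presumably look something like the sketch below; the site name and output file are only illustrative:

    # Mirror a site and record every link wget decides not to follow
    # (requires the proposed --unfollowed-links option to be applied).
    wget --mirror --unfollowed-links=mysite.links http://www.mysite.com/

    # mysite.links would then list the skipped URLs, for instance links
    # pointing to other hosts that --mirror does not descend into.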

Submitting a `.pot' file to the Translation Project

2003-10-01 Thread Hrvoje Niksic
Does anyone know the current procedure for submitting the `.pot' file to the GNU Translation Project? At the moment, the project home page at http://www.iro.umontreal.ca/contrib/po/HTML/ appears dead.

Re: Option to save unfollowed links

2003-10-01 Thread Tony Lewis
Hrvoje Niksic wrote: I'm curious: what is the use case for this? Why would you want to save the unfollowed links to an external file? I use this to determine what other websites a given website refers to. For example: wget

Re: Option to save unfollowed links

2003-10-01 Thread Hrvoje Niksic
Tony Lewis [EMAIL PROTECTED] writes: Hrvoje Niksic wrote: I'm curious: what is the use case for this? Why would you want to save the unfollowed links to an external file? I use this to determine what other websites a given website refers to. For example: wget

downloading files for ftp

2003-10-01 Thread Payal Rathod
Hi, I have 5-7 user accounts in /home whose data is important. Every day at 12:00 I want to back up their data to a different backup machine. The remote machine has an FTP server. Can I use wget for this? If yes, how do I proceed? I am keen to use wget rather than rsync for this. I want to preserve

Re: Option to save unfollowed links

2003-10-01 Thread Hrvoje Niksic
Tony Lewis [EMAIL PROTECTED] writes: Would something like the following be what you had in mind? 301 http://www.mysite.com/ 200 http://www.mysite.com/index.html 200 http://www.mysite.com/followed.html 401 http://www.mysite.com/needpw.html --- http://www.othersite.com/notfollowed.html Yes,

Re: downloading files for ftp

2003-10-01 Thread Hrvoje Niksic
Payal Rathod [EMAIL PROTECTED] writes: I have 5-7 user accounts in /home whose data is important. Every day at 12:00 I want to back up their data to a different backup machine. The remote machine has an FTP server. Can I use wget for this? If yes, how do I proceed? The way to do it with Wget
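Spelled out in full, the suggested invocation would presumably take the form below; the user name, password and host are placeholders:

    # Mirror a user's home directory from the backup machine's FTP server.
    # --mirror enables recursion and timestamping; --no-host-directories
    # keeps wget from creating a top-level directory named after the host.
    wget --mirror --no-host-directories ftp://username:password@backup.example.com/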

Wget 1.9-beta3 is available for testing

2003-10-01 Thread Hrvoje Niksic
Not many changes from the previous beta. This is for the purposes of the Translation Project, to which I've submitted `wget.pot', and which might wonder where to get the source of wget-1.9-beta3. Get it from: http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta3.tar.gz Mauro's IPv6

Re: downloading files for ftp

2003-10-01 Thread Payal Rathod
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote: The way to do it with Wget would be something like: wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED] But if I run it through crontab, where will it store the downloaded files? I want it to store as it is in
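By default wget writes into the directory it is started from, which for a cron job is normally the user's home directory. One way to pin down the destination is -P (--directory-prefix); the crontab entry below is only an illustrative sketch, with made-up paths and credentials:

    # Run the mirror at 12:00 every day, force the download tree under
    # /backup with -P, and send wget's output to a log file with -o.
    0 12 * * * wget --mirror --no-host-directories -P /backup -o /backup/wget.log ftp://username:password@backup.example.com/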