Re: suggestion for wget

2005-02-05 Thread Ryan Underwood

On Sat, Feb 05, 2005 at 02:04:26PM +0200, Sorin wrote:
> hi there ::)
>
> it would be good to have 2 or more downloads at the same time, because
> some files are big and the host limits the speed...

You could use a multithreaded download manager (for example, d4x); many
of these packages use wget as a backend.  You could also use the screen
utility to run several wgets concurrently, or just background them in
the current shell (but your screen will become a mess...).
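
For instance, assuming a POSIX-ish shell (the hosts and filenames below
are placeholders), backgrounding looks like:

  $ wget http://host/file1.iso & wget http://host/file2.iso & wait

and with screen, each transfer can live in its own detached session:

  $ screen -dmS dl1 wget -c http://host/file1.iso
  $ screen -dmS dl2 wget -c http://host/file2.iso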

-- 
Ryan Underwood, [EMAIL PROTECTED]


Re: wget has issues with large files (over 2GB)

2005-01-25 Thread Ryan Underwood

On Tue, Jan 25, 2005 at 02:44:16PM +, Andrew Robb wrote:
> First of all, many thanks for a rock-solid utility!
>
> OS: SuSE Linux 9.1:
> rpm -q wget
> wget-1.9.1-45
>
> I am resuming the download of a DVD image, but the sizes seem to
> overflow a 32-bit signed integer.
>
> wget -c ftp://mirror.mcs.anl.gov/pub/suse/i386/9.2/iso/SUSE-Linux-9.2-FTP-DVD.iso
>
> ==> RETR SUSE-Linux-9.2-FTP-DVD.iso ... done.
> Length: -931,424,256 [-1,990,048,240 to go] (unauthoritative)

This could be a problem with the FTP server.  What is the FTP server
software?
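
For what it's worth, the negative length is exactly the pattern a
32-bit signed wraparound produces.  If the real image is 3,363,543,040
bytes (that figure is just inferred back from the log), truncating it
to a signed 32-bit value gives -931,424,256, which is what wget
printed.  A quick check in a shell with 64-bit arithmetic:

  $ echo $((3363543040 - 4294967296))
  -931424256

wget 1.9.x also stores sizes in a plain long (32 bits on i386), so the
client can overflow here even when the server reports the size
correctly.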

-- 
Ryan Underwood, [EMAIL PROTECTED]


Re: mirror

2003-09-19 Thread Ryan Underwood

Hi,

On Thu, Sep 18, 2003 at 01:31:35PM +0200, Thijs Thiessens wrote:
> Hey!
>
> How can I mirror a small FTP location using wget? I want to sync the
> FTP location with a local directory. Or is it better to use another
> program?

man wget is your friend.

hint: wget -np -m ftp://your.server.com/location/
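
(-m is shorthand for the mirror options -r -N -l inf
--no-remove-listing, and -np/--no-parent stops wget from ascending
above /location/, so only that subtree is synced.)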

-- 
Ryan Underwood, nemesis at icequake.net, icq=10317253


Re: wget and download.com

2003-08-27 Thread Ryan Underwood

Hello,

On Wed, Aug 27, 2003 at 06:19:17AM -0400, joe j wrote:
> Hello,
>
> I am trying to download files with wget from download.com. I am using
> a Windows system. For some reason wget doesn't know how to deal with
> download.com links. For example, the Kazaa link:
> http://download.com.com/redir?pid=10214919&merid=88691&mfgid=88691&edId=3&siteId=4&oId=3002-20-10214919&ontId=20&ltype=dl_dlnow&lop=link&destUrl=http%3A%2F%2Fdownload.kazaa.com%2Fkmd.exe

$ wget 
'http://download.com.com/redir?pid=10214919&merid=88691&mfgid=88691&edId=3&siteId=4&oId=3002-20-10214919&ontId=20&ltype=dl_dlnow&lop=link&destUrl=http%3A%2F%2Fdownload.kazaa.com%2Fkmd.exe'

works fine for me.  Note the single quotes (otherwise the shell catches
the '&' characters).
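
Without the quotes, the shell treats each '&' as its background
operator: wget would be started with only the URL up to the first '&',
and fragments like 'merid=88691' would be parsed as separate shell
commands.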

-- 
Ryan Underwood, nemesis at icequake.net, icq=10317253


conditional url encoding

2003-02-21 Thread Ryan Underwood

Hi,

I don't know how stupid of a question this is, but it was worth a hack for
me, so maybe other users might benefit from it too.

It seems that some servers are broken: to fetch files with certain
names, characters that are normally percent-encoded in URLs must be
sent through unencoded.  For example, the other day I was fetching
files from a server at the URL:
http://server.com/~foobar/files

Sending the normal request for
GET /%7Efoobar/files

caused the server to return a 404.  However, I hacked wget and added a
noencodetilde option, and changed the following line around url.c:109:

#define UNSAFE_CHAR(c) (urlchr_test(c, urlchr_unsafe))

to:
#define UNSAFE_CHAR(c) (urlchr_test(c, urlchr_unsafe) \
                        && !(opt.noencodetilde && c=='~'))

This caused the tilde to be sent through unencoded, and the files were fetched
properly.

The reason I mention this is that a general option giving wget a list
of characters to leave unencoded (ones it would normally encode) might
be useful for corner cases like this, as sketched below.
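
A rough sketch of how that could look around the same spot in url.c
(the no_encode_chars option name is hypothetical, not something wget
actually has):

/* Leave any character the user listed in no_encode_chars unencoded.
   Needs <string.h> for strchr(); with no_encode_chars defaulting to
   NULL, behavior is unchanged. */
#define UNSAFE_CHAR(c) (urlchr_test(c, urlchr_unsafe) \
                        && !(opt.no_encode_chars != NULL \
                             && strchr(opt.no_encode_chars, (c)) != NULL))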

Thanks!

-- 
Ryan Underwood, nemesis at icequake.net, icq=10317253