Re: add tar option

2002-04-23 Thread Herold Heiko
I think wget sometimes (often) needs to reread what it wrote to disk (HTML conversion). This means something like that wouldn't work, or rather, would be too specialized. What would work better is a switch (requested a few times in the past) to write a list of everything retrieved out to a file
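
Until such a switch exists, the list can be approximated by scraping wget's own log. A minimal sketch in Python, assuming the `- `FILE' saved [...]' log line format, which varies across wget versions and locales:

    # Sketch: build a "list of everything retrieved" from a log produced
    # by `wget -r -o wget.log URL'. The line format is an assumption.
    import re

    saved = []
    with open("wget.log") as log:
        for line in log:
            m = re.search(r"- `(.+?)' saved \[", line)
            if m:
                saved.append(m.group(1))

    with open("retrieved.txt", "w") as out:
        out.write("\n".join(saved) + "\n")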

Unsubscribe due to rebrand.

2002-04-23 Thread Ian . Pellew
Guys, apologies for using this route. My EMAIL CHANGED since I subscribed. Please can the administrator unsubscribe/help me, because my email address has changed (company rebrand). I am present on your list as [EMAIL PROTECTED], not the above. Unsubscribing as going on vacation. Happy

ScanMail Message: To Recipient virus found or matched file blocking setting.

2002-04-23 Thread System Attendant
ScanMail for Microsoft Exchange has taken action on the message; please refer to the contents of this message for further details. Sender = [EMAIL PROTECTED] Recipient(s) = [EMAIL PROTECTED]; Subject = To your DTD. Scanning Time = 04/23/2002 10:45:48 Engine/Pattern = 6.150-1001/269 Action on

Re: segmentation fault on bad url

2002-04-23 Thread Ian Abbott
On 22 Apr 2002 at 21:38, Renaud Saliou wrote: Hi, wget -t 3 -d -r -l 3 -H --random-wait -nd --delete-after -A.jpg,.gif,.zip,.png,.pdf http://http://www.microsoft.com DEBUG output created by Wget 1.8.1 on linux-gnu. zsh: segmentation fault wget -t 3 -d -r -l 3 -H --random-wait -nd
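
The trigger here is the doubled scheme in `http://http://www.microsoft.com'. Wget 1.8.1 should reject such a URL instead of crashing; as a rough illustration of the kind of validation involved (a Python sketch, not wget's actual code, which is C):

    # Sketch: refuse a URL whose "hostname" is itself a scheme, as in
    # http://http://www.microsoft.com -- the input that crashed wget 1.8.1.
    from urllib.parse import urlparse

    def looks_sane(url):
        parts = urlparse(url)
        if parts.scheme not in ("http", "https", "ftp"):
            return False
        host = parts.hostname or ""
        # A doubled scheme leaves "http" where the hostname should be.
        return host not in ("http", "https", "ftp") and "." in host

    print(looks_sane("http://http://www.microsoft.com"))  # False
    print(looks_sane("http://www.microsoft.com"))          # True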

Re: apache irritations

2002-04-23 Thread Maciej W. Rozycki
On Mon, 22 Apr 2002, Tony Lewis wrote: I'm not sure what you are referring to. We are discussing a common problem with the index.html pages Apache generates by default for server filesystem directories that provide no default page. Really? The original posting from

wget does not honour content-length http header [http://bugs.debian.org/143736]

2002-04-23 Thread Noel Koethe
Hello, if the HTTP Content-Length header differs from the actual data length, wget disregards the HTTP specification as follows: 1) if Content-Length is greater than the actual data, wget keeps retrying to receive the whole file indefinitely. Using the command-line parameter --ignore-length fixes this
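
For reference, the behaviour being requested amounts to giving up after a bounded number of short reads instead of retrying forever. A minimal illustrative sketch in Python, not wget's C source:

    # Sketch: treat a persistent Content-Length/body mismatch as fatal
    # after a few attempts, rather than retrying indefinitely.
    import urllib.request

    def fetch(url, tries=3):
        for _ in range(tries):
            with urllib.request.urlopen(url) as resp:
                expected = resp.headers.get("Content-Length")
                data = resp.read()
            if expected is None or len(data) >= int(expected):
                return data
            # Short read: the header promised more than was sent.
        raise IOError("body stayed shorter than Content-Length "
                      "after %d attempts" % tries)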

Re: RFE: add tar option

2002-04-23 Thread Maciej W. Rozycki
On Mon, 22 Apr 2002, Max Waterman wrote: Someone (rudely) suggested it was unacceptable to ask for a 'cc' rather than joining the mailing list. If this is so, I apologise, but would like to point out that I was only following the suggestion on the wget web page: I believe such suggestions

Re: wget does not honour content-length http header [http://bugs.debian.org/143736]

2002-04-23 Thread Hrvoje Niksic
Noel Koethe [EMAIL PROTECTED] writes: If the HTTP Content-Length header differs from the actual data length, wget disregards the HTTP specification as follows: It doesn't disregard the HTTP specification. As far as I'm aware, HTTP simply specifies that the information provided by Content-Length

Re: add tar option

2002-04-23 Thread Hrvoje Niksic
Herold Heiko [EMAIL PROTECTED] writes: I think wget sometimes (often) needs to reread what it wrote to disk (HTML conversion). This means something like that wouldn't work, or rather, would be too specialized. In the long run, I hope to fix that. The first step has already been done --

Re: add tar option

2002-04-23 Thread Ian Abbott
On 23 Apr 2002 at 18:19, Hrvoje Niksic wrote: On technical grounds, it might be hard to shoehorn Wget's mode of operation into what `tar' expects. For example, Wget might need to revisit directories in random order. I'm not sure if a tar stream is allowed to do that. You can add stuff to
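
On the ordering question: the tar format itself imposes no directory order; members may appear in any sequence and may even repeat, with later entries overriding earlier ones on extraction. A quick illustration with Python's tarfile module (the paths are made up):

    # Sketch: append members in "random" directory order; tar allows it.
    import tarfile

    with tarfile.open("mirror.tar", "w") as tar:
        for path in ("site/a/index.html", "site/b/pic.gif",
                     "site/a/page2.html"):   # revisits site/a/ later
            tar.add(path)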

Re: Name changing

2002-04-23 Thread Hrvoje Niksic
Caddell, Travis [EMAIL PROTECTED] writes: I'm stuck with Windows at my office :( But what option offered by wget would allow the user to specify the name of the folder that the web site would be saved in? For example, if I were to wget -cdr www.cnn.com the folder would be named
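
For reference, GNU wget's `-P'/`--directory-prefix' option sets the top-level download directory, and `-nH'/`--no-host-directories' drops the hostname component, so something like `wget -r -nH -P cnn http://www.cnn.com' should place the mirror under ./cnn/ rather than ./www.cnn.com/. Check `wget --help' on the Windows build to confirm both options are present.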

Feature request

2002-04-23 Thread Frederic Lochon (crazyfred)
Hello, I'd like to know if there is a simple way to 'mirror' only the images from a gallery (i.e. without the thumbnails). Maybe a new feature could be useful. It could be done in these ways: - mirroring only images that are a link - mirroring only 'last' links from a tree - a more general
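
As a stopgap for the first idea (fetching only images that are link targets, i.e. the full-size pictures behind gallery thumbnails), a rough sketch in Python; the gallery URL and extension list are assumptions:

    # Sketch: download only images that appear as <a href> targets,
    # skipping inline <img> thumbnails. Crude regex; a real tool would
    # use an HTML parser.
    import re
    from urllib.parse import urljoin
    from urllib.request import urlopen, urlretrieve

    gallery = "http://example.org/gallery/"   # assumed URL
    html = urlopen(gallery).read().decode("latin-1", "replace")

    for href in re.findall(r'<a\s[^>]*href="([^"]+)"', html, re.I):
        if href.lower().endswith((".jpg", ".jpeg", ".gif", ".png")):
            target = urljoin(gallery, href)
            urlretrieve(target, target.rsplit("/", 1)[-1])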