wget bug report

2005-06-13 Thread A.Jones
Sorry for the crosspost, but the wget Web site is a little confusing on the 
point of where to send bug reports/patches.

Just installed wget 1.10 on Friday. Over the weekend, my scripts failed with the following error (once for each wget run):
Assertion failed: wget_cookie_jar != NULL, file http.c, line 1723
Abort - core dumped

All of my command lines are similar to this:
/home/programs/bin/wget -q --no-cache --no-cookies -O /home/programs/etc/alte_seiten/xsr.html 'http://www.enterasys.com/download/download.cgi?lib=XSR'

After taking a look at it, I implemented the following change to http.c and tried again. It works for me, but I don't know what other implications my change might have.

--- http.c.orig Mon Jun 13 08:04:23 2005
+++ http.c  Mon Jun 13 08:06:59 2005
@@ -1715,6 +1715,7 @@
   hs->remote_time = resp_header_strdup (resp, "Last-Modified");
 
   /* Handle (possibly multiple instances of) the Set-Cookie header. */
+  if (opt.cookies)
   {
     char *pth = NULL;
     int scpos;
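
In case it helps anyone hitting this before an official fix: the assertion seems to fire only because the cookie jar is never created when --no-cookies is given, so an untested interim workaround is to run the same fetch without --no-cookies. Nothing saves the cookies afterwards, so they are simply discarded when wget exits:

/home/programs/bin/wget -q --no-cache -O /home/programs/etc/alte_seiten/xsr.html 'http://www.enterasys.com/download/download.cgi?lib=XSR'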


Kind regards

MVV Energie AG
Department AI.C

Andrew Jones

Telefon: +49 621 290-3645
Fax: +49 621 290-2677
E-Mail: [EMAIL PROTECTED] Internet: www.mvv.de
MVV Energie · Luisenring 49 · 68159 Mannheim
Commercial register no. HRB 1780
Chairman of the Supervisory Board: Lord Mayor Gerhard Widder
Executive Board: Dr. Rudolf Schulten (Chairman) · Dr. Werner Dub · Hans-Jürgen Farrenkopf · Karl-Heinz Trautmann


wget to push

2005-06-13 Thread Jared Greenwald
Is there any utility like wget that will PUT instead of GET? I need something that works over the FTP protocol.

Thanks,
Jared
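
For what it's worth: curl can upload a file over FTP with its -T option (for FTP this issues a STOR of the local file into the given remote directory). The host, credentials and paths here are placeholders:

curl -T localfile.txt ftp://user:password@ftp.example.com/incoming/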


Can I block all access to a named directory?

2005-06-13 Thread Johann Schoonees
I have read the wget man and info pages, searched the archive of this 
list and googled all over the place but I still don't have a 
satisfactory answer to a simple question:


Can wget be asked not to retrieve *anything* - not even .html pages - 
from a given directory and its subdirectories?


This is relevant in situations where one wants to mirror a site with many links into a restricted part of the site that requires authorization but is otherwise of no interest.  With wget-1.9.1 my log file contains hundreds of authorization-failure messages.


For example:

wget -nv -w1 -kpE -m -X /restricted http://www.example.com/ 

will still attempt to download URLs like 
http://www.example.com/restricted/index.html and 
http://www.example.com/restricted/subdir/rubbish.html


Looking through the source of the newly released wget-1.10, it looks as though wget fetches .html pages even when they are in the exclude-directories list, so presumably wget-1.10 will behave the same way.


I am not sure if this is related, but something similar is logged in 
Bugzilla: https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=124867


Can anyone confirm the behaviour I have seen, or suggest a work-around?
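
One untested idea, in case the exclusion is special-cased only for .html files: invert the logic and whitelist just the directories of interest with -I/--include-directories (the directory names here are made up). I have not checked whether the .html special case bypasses -I as well:

wget -nv -w1 -kpE -m -I /public,/docs http://www.example.com/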

Many thanks in advance,
Johann