Why no -nc with -N?

2004-02-03 Thread Dan LeGate
I'd love to have an option so that, when mirroring, wget backs up only
those files it replaces because they are newer on the source system
(time-stamping).

Is there a reason these can't be enabled together?
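
For example (example.com is just a placeholder):

wget -m http://example.com/somedir/
wget -m -nc http://example.com/somedir/

The first works (-m implies -N); wget refuses the second, saying it
can't timestamp and not clobber old files at the same time.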



bug in connect.c

2004-02-03 Thread francois eric
Problem: wget can't download with this command:
wget --bind-address=Your.External.Ip.Address -d -c -v -b -a "logo.txt" ftp://anonymous:[EMAIL PROTECTED]/incoming/Xenos/knigi/Programming/LinuxUnix/SHELL/PICTURES/LOGO.GIF
logo.txt contains:
--
DEBUG output created by Wget 1.9.1 on freebsd4.5.
--11:36:30--  ftp://anonymous:[EMAIL PROTECTED]/incoming/Xenos/knigi/Programming/LinuxUnix/SHELL/PICTURES/LOGO.GIF
  => `LOGO.GIF'
Connecting to 193.233.88.66:21... Releasing 0x807a0d0 (new refcount 0).
Deleting unused 0x807a0d0.
Closing fd 4
failed: Can't assign requested address.
Releasing 0x807a0b0 (new refcount 0).
Deleting unused 0x807a0b0.
Retrying.
.
--
So the failure is in the bind call.  I tested the same command without
--bind-address, and LOGO.GIF appeared on my HDD:
--
DEBUG output created by Wget 1.9.1 on freebsd4.5.
--11:39:22--  ftp://anonymous:[EMAIL PROTECTED]/incoming/Xenos/knigi/Programming/LinuxUnix/SHELL/PICTURES/LOGO.GIF
  => `LOGO.GIF'
Connecting to 193.233.88.66:21... connected.
Created socket 4.
Releasing 0x807a0a0 (new refcount 0).
Deleting unused 0x807a0a0.
Logging in as anonymous ... 220 diamond.stup.ac.ru FTP server (Version wu-2.6.2-8) ready.
...
--
After some testing:
the bug occurs when: FTP, with username and password, with a bind address specified
the bug does not occur when: HTTP, or FTP without username and password
It looks like uninitialized memory (the sockaddr contains stack garbage)
rather than a leak, so I made a modification just before the bind:
src/connect.c:
--
...
 /* Bind the client side to the requested address. */
 wget_sockaddr bsa;

 /* New: zero the sockaddr first, so that no stale stack garbage is
left in fields wget_sockaddr_set_address does not fill in. */
 memset (&bsa, 0, sizeof (bsa));

 wget_sockaddr_set_address (&bsa, ip_default_family, 0, &bind_address);
 if (bind (sock, &bsa.sa, sockaddr_len ()))
...
--
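To show the correct zero-then-fill pattern outside of wget, here is a
minimal standalone sketch (nothing wget-specific; 127.0.0.1 is just a
placeholder address):
--
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main (void)
{
  int sock = socket (AF_INET, SOCK_STREAM, 0);
  struct sockaddr_in sin;

  /* Zero the whole struct first: fields left as stack garbage
     (sin_zero, or sin_len on BSD) can make bind () fail with
     EADDRNOTAVAIL ("Can't assign requested address").  */
  memset (&sin, 0, sizeof sin);
  sin.sin_family = AF_INET;
  sin.sin_port = htons (0);   /* 0 = any free local port */
  inet_pton (AF_INET, "127.0.0.1", &sin.sin_addr);

  if (bind (sock, (struct sockaddr *) &sin, sizeof sin) != 0)
    perror ("bind");
  close (sock);
  return 0;
}
--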
After the memset, all downloads succeed.
I think it would be better to do the memset inside
wget_sockaddr_set_address itself, but that is your choice; a sketch of
that alternative follows.
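A minimal sketch of that alternative (the signature is inferred from the
call site above; the existing field assignments are elided):
--
void
wget_sockaddr_set_address (wget_sockaddr *sa, int family,
                           unsigned short port, ip_address *addr)
{
  /* Clear the whole union up front, so that no caller can ever
     bind () with stale stack bytes in fields left unset below.  */
  memset (sa, 0, sizeof (*sa));

  /* ... existing assignments of family, port and address unchanged ... */
}
--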
Best regards,
P.S. Sorry for my English 8(



Re: downloading multiple files question...

2004-02-03 Thread Jens Rösner
Hi Ron!

If I understand you correctly, you could probably use the
-A acclist
--accept acclist
accept = acclist
option.

So, depending on your site, the syntax should be something like:
wget -r -A "*.pdf" URL
wget -r -A "*.pdf" -np URL
or, if wget has to recurse through multiple HTML files, it may be
necessary (or at least beneficial) to use:
wget -r -l0 -A "*.pdf,*.htm*" -np URL
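
(I quoted the patterns above so that the shell does not expand them
against files in your current directory.)

As far as I understand wget's -A behavior, wget still has to fetch the
HTML pages to discover links, and, depending on the version, pages that
do not match the accept list are deleted again after parsing; accepting
"*.htm*" keeps those local copies. So, with a placeholder URL, a typical
invocation would be:

wget -r -l1 -np -A "*.pdf" http://example.com/docs/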

Hope that helps (and is correct ;) )
Jens


> In the docs I've seen on wget, I see that I can use wildcards to 
> download multiple files on ftp sites.  So using *.pdf would get me all 
> the pdfs in a directory.  It seems that this isn't possible with http 
> sites though.  For work I often have to download lots of pdfs when 
> there's new info I need, so is there any way to download multiple files 
> of the same type from an http web page?
> 
> I'd like to be cc'd in replies to my post please as I'm not subscribed 
> to the mailing list.
> 




RE: apt-get via Windows with wget

2004-02-03 Thread Jens Rösner
Hi Heiko!

> > Until now, I linked to your main page. 
> > Would you mind if people short-cut this? 
> Linking to the directory is bad since people would download 

Sorry, I meant linking directly to the "latest" zip.
However, I personally prefer to read what the provider 
(in this case you) has to say about a download anyway.


> Do link to the complete url if you prefer to, although I like to keep 
> some stats.

Understood.


> for example since start of the year
> there have been 7 referrals from www.jensroesner.de/wgetgui 

Wow, that's massive... 
...not!
;-)


> Since that is about 0.05% stats shouldn't be 
> altered too much if you link directly to the archive ;-)

Thanks for pointing that out ;-}


> > What do you think about adding a "latest-ssl-libraries.zip"?
> I don't think so.
> If you get the "latest complete wget" archive those are included anyway 
> and you are sure it will work. 

Oh, I'm very sorry, I must have read over/misunderstood that.
I thought the "latest" zip would not contain the SSL libraries.
That's great.


> I'd prefer not to force an unneeded (admittedly small) download by
> bundling the ssl libraries in every package.

Very true.
Especially as wget seems to be used by quite a few people on slow
connections.


Kind regards
Jens







downloading multiple files question...

2004-02-03 Thread Ron
In the docs I've seen on wget, I see that I can use wildcards to 
download multiple files on ftp sites.  So using *.pdf would get me all 
the pdfs in a directory.  It seems that this isn't possible with http 
sites though.  For work I often have to download lots of pdfs when 
there's new info I need, so is there any way to download multiple files 
of the same type from an http web page?

I'd like to be cc'd in replies to my post please as I'm not subscribed 
to the mailing list.