Tony Lewis wrote:
Bruce wrote:
any idea as to who's working on this feature?
Mauro Tortonesi sent out a request for comments to the mailing list on March
29. I don't know whether he has started working on the feature or not.
Mauro Tortonesi replied:
yes. i haven't started coding it yet, though. i am still working
hey tony...
any idea as to who's working on this feature?
thanks..
-bruce
-Original Message-
From: Tony Lewis [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 22, 2006 4:36 PM
To: [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Subject: RE: wget - tracking urls/web crawling
bruce [EMAIL PROTECTED] writes:
any idea as to who's working on this feature?
No one, as far as I know.
to crawl through a site and extract the
required information
thanks
-bruce
bruce wrote:
hi...
i'm testing wget on a test site.. i'm using the recursive function of wget
to crawl through a portion of the site...
it appears that wget is hitting a link within the crawl that's causing it to
begin to crawl through the section of the site again...
i know wget isn't as
Try using the -np (no parent) parameter.
Mark Post
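A minimal sketch of the --no-parent approach Mark suggests; the site and section path below are placeholders, not taken from the thread, and the depth limit is an assumed value:

```shell
# Recursive crawl confined to one section of a site.
# -r  (--recursive)  : follow links found in retrieved pages
# -np (--no-parent)  : never ascend above the starting directory, so a
#                      link pointing back to the rest of the site is not
#                      followed and the crawl cannot restart elsewhere
# -l 3               : recursion depth limit (assumed value for illustration)
wget -r -np -l 3 http://www.example.com/section/
```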
-Original Message-
From: bruce [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 22, 2006 4:15 PM
To: 'Frank McCown'; wget@sunsite.dk
Subject: RE: wget - tracking urls/web crawling
hi frank...
there must be something simple i'm missing ... the pages i need to exclude are based on information that's in the query portion of the url...
-bruce
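For context on why this is hard: wget's -A/-R accept/reject lists are documented as matching file names, so filtering on the query portion of a URL is not reliable with them. Later wget releases (1.14 and up, which postdate this thread) added --accept-regex/--reject-regex, which are matched against the complete URL, query string included. A hedged sketch, with the pattern and site as placeholder assumptions:

```shell
# Hypothetical example: while crawling recursively, skip any URL whose
# query string contains "action=edit". --reject-regex (wget 1.14+) is
# applied to the whole URL, including the query portion.
wget -r --reject-regex 'action=edit' http://www.example.com/
```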
-Original Message-
From: Frank McCown [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 22, 2006 2:34 PM
To: [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Subject: Re: wget - tracking urls/web