Mauro Tortonesi wrote:
> perhaps we should modify wget in order to print the list of "touched"
> URLs as well? maybe only in case -v is given? what do you think?
On June 28, 2005, I submitted a patch to write unfollowed links to a file.
It would be pretty simple to have a similar --followed-links
Christopher G. Lewis wrote:
OK, the Win32 compile is working, I've got both the SVN Trunk and the 1.11
alpha branch from
ftp://alpha.gnu.org/pub/pub/gnu/wget/wget-1.11-alpha-1.tar.gz . We'll
obviously work through the warnings that are coming up, and re-address the
CL parameters to fit with
Christopher G. Lewis wrote:
Hi all -
I've published the latest alpha Win32 binaries using a similar format
to Heiko's Win32 page.
Many thanks, Christopher. We should try to move your page to
wget.sunsite.dk, if that's OK with you.
Hopefully I'll be able to keep up with what Heiko's d
bruce wrote:
when you guys are building/testing wget, are you ever using any kind of IDE?
no, I only use vim:
http://www.vim.org
and while I can get it to build using Eclipse on my Linux box, I can't seem
to figure out what I need to do within the settings to actually be able to
step i
Antonio Mignolli wrote:
Looking at proxy usage with wget 1.2.10 in the man page, we can find:
--no-proxy
Don't use proxies, even if the appropriate *_proxy
environment variable is defined.
For more information about the use of proxies with Wget,
And the last
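To make the quoted man-page entry concrete, here is a minimal sketch of how
the *_proxy environment variables interact with --no-proxy (the proxy host
and fetched URL below are made-up examples, not from the thread):

```shell
# wget consults the http_proxy / https_proxy / ftp_proxy environment
# variables by default (proxy host here is hypothetical):
export http_proxy=http://proxy.example.com:8080/

# This request would go through the proxy:
wget http://example.com/file.txt

# --no-proxy forces a direct connection for this invocation,
# even though http_proxy is still set:
wget --no-proxy http://example.com/file.txt
```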
Tony Lewis wrote:
Run the command with -d and post the output here.
In this case, -S can provide more useful information than -d. Be careful to
obfuscate passwords, though!
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.torto
bruce wrote:
Is there an option that I've missed that will allow me to crawl, but just
return the list of URLs/Links that I've touched? Can't seem to find an
option that does this...
Hi Bruce,
using the -r and --spider options in combination provides the closest
feature to what you're
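The combination suggested above can be sketched like this (a rough sketch,
assuming a wget 1.x build with --spider; the log format varies between
versions, so the grep pattern may need adjusting):

```shell
# Crawl recursively without saving any files, logging what was visited:
wget -r --spider -nv -o spider.log http://example.com/

# Pull the touched URLs out of the log, deduplicated:
grep -o 'https\?://[^ ]*' spider.log | sort -u
```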