If someone took the trouble to write a robots.txt, please observe it.
A website with limited bandwidth can take a big hit from too many
full-site spiders. In resonance with another active thread, I still take
it as an old-fashioned duty of the "hacker ethic" that we lead by example
in an age where the law and business are no longer a reliable guide.
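
As an aside, GNU wget already honours robots.txt by default when it
recurses; it only ignores the file if you explicitly ask it to with
"-e robots=off". If you really must mirror something, a polite
invocation looks roughly like this (the URL is a placeholder, and the
rate limits are just sensible guesses, not magic numbers):

  # robots.txt is honoured by default; --wait and --limit-rate
  # keep the load on the server low
  wget --recursive --wait=2 --limit-rate=50k http://example.org/stuff/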

I'm happy to spend time with any beginner off-list to demonstrate CVS or
wget; seriously, just drop me a line and we'll go through it. But it's a
little off-topic to be filling the list traffic with, and IOhannes and
Frank have already given the essentials. Please read those FAQs again,
guys.
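
For the record, an anonymous checkout needs no script at all. A minimal
sketch, assuming a SourceForge-style anonymous pserver setup (substitute
the actual repository path and module name from the project pages):

  # log in anonymously; just press Enter at the password prompt
  cvs -d:pserver:[email protected]:/cvsroot/pure-data login
  # initial checkout of the module
  cvs -d:pserver:[email protected]:/cvsroot/pure-data checkout pd
  # later, from inside the working copy, fetch only what changed
  cvs update -dP

That last command is the whole argument: "cvs update" transfers only the
differences, which is exactly the bandwidth-friendly behaviour a full
wget spider can't give you.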

On Thu, 8 Mar 2007 10:36:35 +0100
Frank Barknecht <[EMAIL PROTECTED]> wrote:

> Hello,
> Pete Redest wrote:
> 
> > Anyways, for the benefit of Jared and of others interested in downloading
> > the examples/tutorial/workshop from CVS, I am attaching a script,
> > which surely can be adjusted/improved in several ways. The script
> > is just what worked for me.
> 
> Please never use this script!!! 
> 
> It's bad practice, not polite at all and in some circles even
> considered extremely rude to ignore the instructions in "robots.txt"!
> 
> And it's not necessary at all: Instead of abusing wget as a CVS-tool,
> one should just get comfortable with a real CVS-utility. 
> 
> Ciao
> -- 
>  Frank Barknecht                 _ ______footils.org_ __goto10.org__
