Re: [sqlite] Network Performance

2003-11-25 Thread Greg Obleshchuk
Hi Brian, A few things. The problem is your network speed compared to local disk speed (distance is an issue as well). Disk transfer is measured in megabytes, whereas network I/O is measured in megabits. So while the IDE standard's 33.3 MB/s looks slower than the 100 Mb/s of your network (even switched), it is in fact
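
For anyone wanting to check the arithmetic, here is a quick sketch using only the figures quoted above; real throughput is lower still once protocol overhead and round-trip latency are counted:

    // Rough comparison of the two rates quoted above (8 bits per byte).
    const networkMbps = 100;              // 100 Mb/s Ethernet
    const networkMBps = networkMbps / 8;  // 12.5 MB/s theoretical peak
    const ideMBps = 33.3;                 // Ultra ATA/33 ("33.3MB") burst rate

    console.log(`network peak: ${networkMBps} MB/s, IDE disk: ${ideMBps} MB/s`);
    // network peak: 12.5 MB/s, IDE disk: 33.3 MB/s
    // The "slower looking" local disk is well ahead of the network link.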

Re: [sqlite] Spiders vandalizing the wiki

2003-11-25 Thread Matt Sergeant
On 25 Nov 2003, at 12:48, D. Richard Hipp wrote: In the past couple of days, I've been having problems with spiders vandalizing the Wiki at http://www.sqlite.org/cvstrac/wiki. The damage (so far) has been relatively minor and easy to fix. But I've been monitoring these spiders for a while and

Re: [sqlite] Spiders vandalizing the wiki

2003-11-25 Thread Gerhard Häring
Good suggestions, IMO, Peter. I normally really hate this, but you could try to mangle the email addresses they look for with some JavaScript gimmicks using document.write. OTOH this sucks big time, because it will make the site harder to use for those who surf with JavaScript disabled or
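
A minimal sketch of the document.write trick Gerhard describes (the address and the way it is split are made-up placeholders; the point is only that the literal string never appears in the static HTML a harvester scrapes):

    // Assemble the mailto: link at page-load time instead of shipping it as
    // plain HTML, so source-scraping harvesters never see the full address.
    const user = "someone";              // hypothetical local part
    const host = "example" + ".org";     // hypothetical domain, split to avoid a literal match
    document.write(
      '<a href="mailto:' + user + "@" + host + '">' + user + "@" + host + "</a>"
    );
    // Visitors browsing with JavaScript disabled see nothing here, which is
    // exactly the drawback mentioned above.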

Re: [sqlite] Spiders vandalizing the wiki

2003-11-25 Thread Peter
D. Richard Hipp wrote: If you have any suggestions on what to do about them, I'd like to hear from you. Block the entire IP range, for, say, 2 weeks at a time. I'm guessing that these spiders are coming from spammers looking to harvest email addresses. Last night's attack came from 61.51.123.205.
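
A temporary block along those lines might look like this on a Linux host (a sketch only: it assumes iptables is in use, the /24 is a guess at the range around the single address quoted above, and lifting the block after roughly two weeks is left to a cron job or a manual step):

    # Drop all traffic from the offending range.
    iptables -I INPUT -s 61.51.123.0/24 -j DROP
    # ...later, to lift the block again:
    iptables -D INPUT -s 61.51.123.0/24 -j DROP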

Re: [sqlite] Spiders vandalizing the wiki

2003-11-25 Thread Mrs. Brisby
Google won't submit forms. Robots can't read. Require a challenge before allowing submissions, whether it be an email-based challenge or an image that contains distorted text. I don't know if there are any blind users of SQLite, but they would probably prefer the former. On Tue, 2003-11-25 at
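
A rough sketch of what the email-based variant could look like on the server side (everything here, names included, is hypothetical; it only illustrates the issue-a-token, mail-it-out, require-it-back idea):

    import { randomBytes } from "crypto";  // Node.js built-in

    // One-time tokens: issue one, mail it to the would-be editor (mail step
    // omitted), and only accept a wiki edit that echoes the token back.
    const pending = new Map<string, number>();   // token -> expiry timestamp

    function issueChallenge(): string {
      const token = randomBytes(8).toString("hex");
      pending.set(token, Date.now() + 15 * 60 * 1000);  // valid for 15 minutes
      return token;                                     // send this to the user's address
    }

    function acceptSubmission(token: string): boolean {
      const expires = pending.get(token);
      pending.delete(token);                            // single use
      return expires !== undefined && Date.now() < expires;
    }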

Re: [sqlite] Spiders vandalizing the wiki

2003-11-25 Thread Gerhard Häring
D. Richard Hipp wrote: In the past couple of days, I've been having problems with spiders [...] You could use a robots.txt to guard against those spiders that behave well. If the misbehaving spiders use a certain distinguishable User-Agent header, you could block that. -- Gerhard
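
A minimal robots.txt along those lines might look like this (the Disallow path is a guess based on the wiki URL quoted above, and it only keeps out crawlers that honour the file, which is exactly the caveat about well-behaved spiders):

    # Ask well-behaved crawlers to stay out of the CVSTrac pages altogether.
    User-agent: *
    Disallow: /cvstrac/

Blocking a distinguishable User-Agent header, by contrast, would be a web-server or firewall rule rather than anything in robots.txt.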