On 25/08/07, costasb5 <[EMAIL PROTECTED]> wrote:
>
> Hello,
>
>  I've collected some data and created a db. I am putting together a
>  website allowing users to search the db with a search-query
>  functionality. I am pretty much building a Google-like engine,
>  although the data is stored in MySQL.
>
>  If I use a GET form in the search page (which is what most engines
>  do), then the query results are displayed in pages where the URL
>  contains the query_string (example,
>  www.blahblah.com/search.php?queryitem1=my1&queryitem2=my2+my3).
>  So far so good, but in my case the data is of a local nature (say, a
>  yellow-pages type of thing with some extras), and someone could
>  basically harvest the entire database by running a script like:
>
>  wget www.blahblah.com/search.php?zip=00001
>  wget www.blahblah.com/search.php?zip=00002
>  ..........
>  wget www.blahblah.com/search.php?zip=99999
>
>  This is something I would like to prevent. Could someone please offer
>  their opinion on:
>
>  1. how to best go about it with a GET form?
>
>  2. Is it indeed doable with a POST form in the search page (as opposed
>  to a GET form)?
>
>  3. I've read that the main advantage of a GET vs. POST form is that
>  someone can bookmark a search results page (e.g.
>  www.blahblah.com/search.php?zip=12356) if the form uses GET. Any
>  other advantages/disadvantages that I am missing?
>
>  Looking forward to everyone's input! Thank you in advance,
>
>  Costas
>

Hi,

GET and POST *should* be used for exactly that... GETting and POSTing
data (hence the names), so neither will protect the data by itself; a
script can send POST requests just as easily (wget, for instance, has
a --post-data option). You could put measures in place to slow people
down, for example restricting each client to 5 searches every 5
minutes. You can key the limit on IP address (which often covers more
than one person, e.g. users behind a shared proxy) or on a
cookie/session (but wget doesn't normally maintain a session, so each
request would start a fresh one).
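
As a rough sketch of the session-based throttle (untested, and the
window/limit values are just examples):

<?php
session_start();

$now    = time();
$limit  = 5;   // searches allowed...
$window = 300; // ...per 300 seconds

if (!isset($_SESSION['searches'])) {
    $_SESSION['searches'] = array();
}

// Drop timestamps that have fallen out of the window
foreach ($_SESSION['searches'] as $i => $t) {
    if ($t < $now - $window) {
        unset($_SESSION['searches'][$i]);
    }
}

if (count($_SESSION['searches']) >= $limit) {
    header('HTTP/1.1 503 Service Unavailable');
    exit('Too many searches - please try again in a few minutes.');
}

$_SESSION['searches'][] = $now;

// ... run the actual MySQL query here ...
?>

Remember this only works for clients that keep the session cookie;
you'd want an IP-keyed variant (stored in a MySQL table, say)
alongside it to catch cookie-less scripts.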
You could also restrict the search feature to clients that send a
normal-looking user agent string; wget 1.10.2, for example, reports
itself as "Wget/1.10.2", which is easy enough to detect (though the
header is trivial to change, so this only stops naive scripts).
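
A crude user agent filter along those lines might look like this
(again just a sketch, and easily fooled):

<?php
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// Reject empty user agents and the stock wget/curl strings
// (stripos needs PHP 5; use preg_match() with /i on PHP 4)
if ($ua == '' || stripos($ua, 'wget') !== false
              || stripos($ua, 'curl') !== false) {
    header('HTTP/1.1 403 Forbidden');
    exit('This search is only available to regular browsers.');
}
?>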

Phill
