The key to speed with SQL databases is the INDEX. A query like yours forces a
sequential read of the full contents of each of those fields; no matter what
database you used, performance would be atrocious.
Can you break some keywords out of your text, and assign them to fields
like keywd1, keywd2, etc.?
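As a rough sketch of that idea (column names here are illustrative, not from your schema), the broken-out keywords go into their own indexed columns, which an equality or anchored match can then use:

```sql
-- Hypothetical: store the most significant words in dedicated,
-- indexed columns alongside the full text.
ALTER TABLE search
    ADD keywd1 VARCHAR(64),
    ADD keywd2 VARCHAR(64),
    ADD INDEX (keywd1),
    ADD INDEX (keywd2);

-- A leading-wildcard LIKE ('%keyword%') can never use an index,
-- but plain equality on the keyword columns can:
SELECT id FROM search WHERE keywd1 = 'php' OR keywd2 = 'php';
```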
How often does the data change? Could you periodically parse the contents of
those target fields and build a lookup table, with an exclusion list so that
you don't index "the", "a" and the like, and with pointers from the lookup
table back to the primary key of search?
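That lookup table is essentially a small inverted index. A minimal sketch, assuming search has an integer primary key called id (table and column names are illustrative):

```sql
-- One row per (word, document) pair, populated by a periodic parsing
-- pass over the text fields that skips stop words like "the" and "a".
CREATE TABLE keyword_lookup (
    word      VARCHAR(64) NOT NULL,
    search_id INT         NOT NULL,  -- points at search's primary key
    PRIMARY KEY (word, search_id)
);

-- The search then becomes an indexed join instead of a sequential scan:
SELECT s.*
FROM keyword_lookup k
JOIN search s ON s.id = k.search_id
WHERE k.word = 'php';
```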
Semi-finally, this may not be a good scenario for a database at all. What
about storing these documents as plain text files and using something like
htdig to index them and return search results?
Finally (you can tell this isn't in any logical order), doesn't one of the
beta versions of MySQL support full-text indexing?
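If I remember right, MySQL 3.23 (still in beta/gamma) added FULLTEXT indexes on MyISAM tables. They only work on CHAR, VARCHAR and TEXT columns, so the BLOB column would need converting; something along these lines (untested sketch):

```sql
-- FULLTEXT requires a TEXT-family column on a MyISAM table,
-- so convert the BLOB first, then add the index.
ALTER TABLE search MODIFY content TEXT, ADD FULLTEXT (content);

-- MATCH ... AGAINST uses the full-text index instead of scanning:
SELECT * FROM search WHERE MATCH(content) AGAINST('keyword');
```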
Anyway - some thought-starters. But you will NEVER get good performance
with sequential reads.
HTH - Miles Thompson
At 03:56 PM 7/23/01 +0200, Jome wrote:
>At the moment I'm trying to write a search-engine kind of thing in PHP using
>MySQL, but MySQL tends to be very slow when searching.
>My table is about 100 mb and the query is SELECT * FROM search WHERE
>search.content LIKE '%$keyword%' OR search.filnamn LIKE '%$keyword%'
>I've tried setting INDEX but it didn't work out since I'm using
>BLOB-fields. Is there any other way than using INDEX?
>PHP General Mailing List (http://www.php.net/)
>To unsubscribe, e-mail: [EMAIL PROTECTED]
>For additional commands, e-mail: [EMAIL PROTECTED]
>To contact the list administrators, e-mail: [EMAIL PROTECTED]