Manu, et al.:
The database in alenet.com is rather tiny (about 28 docs, as you
mentioned). I developed the procedure for an intranet with some 5,000
records. It is working fine. It has some 20+ clients that are
constantly hitting the database. The server is a desktop (nothing out
of the ordinary).
Curt, et al:
You just described my procedure! I do exactly that; plus, I also save a
soundex or metaphone code for each word so I can check spelling and
find the closest matches.
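For illustration, PHP's built-in soundex() and metaphone() give codes that collapse similar-sounding spellings, so a misspelled search term can still hit the right keyword row. A minimal sketch — the keywords table and its columns are assumptions, not Cesar's actual schema:

```php
<?php
// Phonetic codes map similar-sounding words to the same string, so a
// misspelled search term can still match the stored keyword.
$term = "peeple";               // user's misspelled search term
$sdx  = soundex($term);         // same code as soundex("people")
$mph  = metaphone($term);

// Hypothetical lookup against a keyword table that also stores codes:
$sql = "SELECT DISTINCT docid FROM keywords WHERE sdx = '$sdx'";

var_dump(soundex("people") === $sdx);  // bool(true)
```

Storing the code alongside the word at index time means the "fuzzy" lookup is still a plain indexed equality match.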
Cesar.
Curt Zirzow wrote:
Instead of just adding a word to the table of words, you add a field
that holds the qty of times it appears in the document.
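Counting occurrences before writing is cheap in PHP, and with MySQL the per-word count can be kept up to date with INSERT ... ON DUPLICATE KEY UPDATE. A sketch — the keywords(docid, word, qty) schema is an assumption:

```php
<?php
// Count per-word occurrences, then upsert one row per word.
$text = "the quick fox and the lazy dog and the cat";
$qty  = array_count_values(explode(" ", $text));

// Hypothetical upsert, assuming keywords(docid, word, qty) with a
// UNIQUE KEY on (docid, word):
foreach ($qty as $word => $n) {
    $sql = "INSERT INTO keywords (docid, word, qty) VALUES (42, '$word', $n)
            ON DUPLICATE KEY UPDATE qty = $n";
    // run $sql with your DB layer
}

echo $qty["the"], "\n";  // 3
```

The qty column then lets you rank results by how often the term appears, still without a single LIKE.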
I used a variant of this scheme a while ago and it worked well, though I had
not reached 2,000 records.
I separated the words using strtok().
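strtok() returns one token per call — you pass the string only on the first call, then just the delimiter set. A minimal word-splitting loop along those lines:

```php
<?php
// Split a string into lowercase words with strtok(), treating
// whitespace and common punctuation as delimiters.
$text   = "Hello, world: hello again";
$delims = " \t\n,.:;!?";
$words  = [];
for ($tok = strtok($text, $delims); $tok !== false; $tok = strtok($delims)) {
    $words[] = strtolower($tok);
}
print_r($words);  // hello, world, hello, again
```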
Manu.
PS: How large is the http://www.alenet.com DB? I searched for the word 'the'
(which is likely to be in every English doc) and it returned only 28 docs.
* Thus wrote Cesar Cordovez ([EMAIL PROTECTED]):
2. Save in the keyword table the non-repeating words in the array, with a
reference to the original document, for example the document id.
3. Then, if you want to search for, let's say, 'people', you will do:
select distinct(docid) from
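The query above was cut off in the archive; presumably it matches the keyword table on word. A hedged PHP sketch (table and column names keywords/docid/word are assumptions) that also extends the idea to several terms, keeping only docs that contain every term:

```php
<?php
// Build an exact-match search over a keyword table for several terms;
// GROUP BY/HAVING keeps only docids matching every term.
$terms  = ["people", "search"];
$quoted = [];
foreach ($terms as $t) {
    $quoted[] = "'" . addslashes($t) . "'";
}
$sql = "SELECT docid FROM keywords"
     . " WHERE word IN (" . implode(",", $quoted) . ")"
     . " GROUP BY docid"
     . " HAVING COUNT(DISTINCT word) = " . count($terms);
echo $sql, "\n";
```

Because every comparison is an equality or IN against an indexed word column, MySQL can use the index for the whole search.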
Sent: October 14, 2003 12:54 AM
Subject: [PHP] Slow searches in large database
Hi there
Wondering if someone could help or give some advice.
We have a MySQL database that has approximately 20,000 records and a
total size of 125 MB. There are approximately 25 fields that we need to
search each time that someone performs a search.
Hi!
I think that you will need a keyword table to speed up this procedure.
The basic idea is to create an index of the words in your fields, so
that you use = instead of LIKE, making things run much, much faster.
The steps for doing this are:
1. Every time you save a record in the table, split its searchable
text into an array of words.
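The save-time indexing step can be sketched in PHP like this — the function name, regex split, and keywords table are all invented for illustration:

```php
<?php
// Break a record's text into words, drop repeats, and store each
// word against the document id in a keyword table.
function index_document(int $docid, string $text): array {
    $words  = preg_split('/[^a-z0-9]+/i', strtolower($text), -1, PREG_SPLIT_NO_EMPTY);
    $unique = array_values(array_unique($words));
    foreach ($unique as $word) {
        $sql = "INSERT INTO keywords (docid, word) VALUES ($docid, '$word')";
        // run $sql with your DB layer
    }
    return $unique;
}

print_r(index_document(7, "People search, people find"));  // people, search, find
```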
Rather than searching every field for every search I usually provide a
select drop down or checkbox that allows the user to indicate what
information they are searching, then only hit those fields in the SQL
query.
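Restricting the query to user-chosen fields is easy to do safely with a whitelist, so a form value never reaches the SQL as an identifier. Column and table names here are invented:

```php
<?php
// Only checkbox values that match a known column make it into the
// WHERE clause; everything else is silently dropped.
$allowed  = ["title", "author", "body"];
$checked  = ["title", "body", "DROP TABLE docs"];   // e.g. from the form
$selected = array_values(array_intersect($checked, $allowed));

$term  = addslashes("people");
$conds = [];
foreach ($selected as $col) {
    $conds[] = "$col LIKE '%$term%'";
}
$sql = "SELECT id FROM docs WHERE " . implode(" OR ", $conds);
echo $sql, "\n";
```

Searching two columns instead of twenty-five also cuts the per-row work of each LIKE dramatically.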
MySQL is fast for simple queries, but it doesn't scale well with larger
data sets. You might want to see if a search engine/indexing tool can
help you; I use:
http://search.mnogo.ru/features.html
On Mon, 2003-10-13 at 18:54, Adrian Teasdale wrote:
[snip]
Hi there
Wondering if someone could help or give some advice.
We have a MySQL database that has approximately 20,000 records and a
total size of 125 MB. There are approximately 25 fields that we need to
search each time that someone performs a search. We have installed
TurckMMCache onto the server.
Adrian Teasdale mailto:[EMAIL PROTECTED]
on Monday, October 13, 2003 3:54 PM said:
An example of one of our search strings is:
[snip]
Basically, is there anything that anyone can immediately suggest that
we need to do to speed things up?
1. You could try changing 'docs.*' to a verbose list of just the columns you need.
Start by checking the MySQL docs to find out whether indexes are used with IN
and LIKE, especially as the latter is using leading wildcards. I suspect not:
an index can serve LIKE 'foo%', but a pattern that starts with a wildcard
('%foo%') forces a full scan.
Given that you are essentially performing a sequential read of the
database and checking all these fields, your performance is remarkably good.