Author: Alexander Barkov
Email: [EMAIL PROTECTED]
Message:
Take a look at sql.c, add
#define DEBUG_SQL 1
and then recompile. Run search.cgi from the command line:
./search.cgi word >/dev/null
You'll see every SQL query sent to the backend, along with the time spent
executing it. Which query takes the longest?
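For reference, the whole sequence looks roughly like this (a sketch only: the
source directory, build commands and the example query string are assumptions
about a typical 3.1.x tree, not exact instructions):

  # enable per-query debug output, rebuild, then run the CGI by hand
  cd mnogosearch-3.1.12/src
  # add the following near the top of sql.c:
  #   #define DEBUG_SQL 1
  cd .. && make && make install
  # page output is discarded; the logged queries and their timings
  # (presumably written to stderr) remain visible on the terminal:
  ./search.cgi 'Kyosho Inferno MP-6' >/dev/null

The slowest query reported there is the one worth looking at first.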
> I have mnoGoSearch (3.1.12) running with MySQL 3.23.22-beta as the back-end database.
> The computer is a Pentium II 350 MHz with 128 MB of RAM running Red Hat 7.0.
> The only non-standard settings are multi-crc mode, not using/reading robots.txt
> on the site, and an extensive stop word table.
>
> Currently, I've got 84 websites indexed (not all of them finished), and searching
> the database has become extremely slow.
>
> indexer -S gives this result:
>
> Database statistics
>
> Status    Expired      Total
> -----------------------------
>      0         32        647 Not indexed yet
>    200          0     122872 OK
>    301          0         29 Moved Permanently
>    302          0      16826 Moved Temporarily
>    304          0       5130 Not Modified
>    403          0          4 Forbidden
>    404          0        395 Not found
>    503          0          1 Service Unavailable
>    504          2       1965 Gateway Timeout
> -----------------------------
>  Total         34     147869
>
>
> I don't really think this is very much (I've got about 1500 more websites to
>index!!).
> Searching for a simple word like "Kyosho" takes 2 seconds and gives about
> 3600 hits... not all that bad.
> A combined search for "Kyosho Inferno MP-6" takes 20 seconds!!
> If I repeat the search immediately afterwards, it takes only 0.7 seconds...
>
> So... have I done anything wrong?? Must I optimize the database somehow? I'm a bit
>worried about this.
>
> All input appreciated. Thanks in advance.
>
> Best regards,
> Ørjan Sandland
Reply: <http://search.mnogo.ru/board/message.php?id=1767>