Thanks,

I am going to play with it.

Andy

"Lutz Brückner" <[EMAIL PROTECTED]> schrieb im Newsbeitrag
[EMAIL PROTECTED]">news:[EMAIL PROTECTED]...
> Hi Andy,
>
> it is difficult to give you advice without knowing how you are
> searching the database. But if you have to search through all entries,
> splitting them up into 250 tables was a bad decision.
>
> Some other hints:
> You are lost without an index, because MySQL has to do a full
> table scan to find the matching rows.
>
> CREATE INDEX cityidx ON cities (city(10));
> creates an index on the leading 10 characters of the column, which
> needs much less table space than a fulltext index and will probably
> fit your needs.
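A sketch of how a prefix index like this interacts with searches (using the gm_cities table and the city column from the schema quoted below; the index name and the 10-character prefix length are just illustrative values to tune):

```sql
-- Prefix index on the first 10 characters of city (MySQL syntax;
-- index name and prefix length are illustrative)
CREATE INDEX cityidx ON gm_cities (city(10));

-- A search anchored at the start of the string can use the index:
SELECT ID, city, province_id FROM gm_cities WHERE city LIKE 'Berl%';

-- A leading wildcard cannot, and falls back to a full table scan:
SELECT ID, city, province_id FROM gm_cities WHERE city LIKE '%burg';
```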
>
> OPTIMIZE TABLE gm_cities;
> reclaims the table space left over after a DROP INDEX or after
> deleting rows.
>
> Do you really need a char(100) for the city names? If you can use
> char(50) instead, the table will be much smaller.
>
> Buy a lot of RAM and fast hard disks, and set up the MySQL server to
> use the RAM properly.
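One way to follow the hint about not splitting the data across 250 tables is a single table with a country column and appropriate indexes. A sketch (the country column, the char(50) width, and the index names are assumptions, not part of the existing schema):

```sql
-- One table for all countries instead of 250 per-country tables
CREATE TABLE cities (
    ID          MEDIUMINT(9) NOT NULL AUTO_INCREMENT,
    country     CHAR(2) NOT NULL,   -- e.g. 'gm' for the rows from gm_cities
    city        CHAR(50) NOT NULL,  -- char(50) instead of char(100), as above
    province_id CHAR(2),
    PRIMARY KEY (ID),
    INDEX country_city_idx (country, city(10)),
    INDEX city_idx (city(10))       -- serves searches across all countries
);

-- Per-country lookup, as the interface does today:
SELECT ID, city FROM cities WHERE country = 'gm' AND city LIKE 'Berl%';

-- Global search over all 2.5 million rows in one query:
SELECT ID, country, city FROM cities WHERE city LIKE 'Berl%';
```

The composite index covers the per-country case; the second index on city alone is what lets the global search avoid a full scan.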
>
> Good luck,
>                Lutz
>
>
> [EMAIL PROTECTED] (Andy) writes:
>
> > Hi there,
> >
> > I used to work with a table containing over 2.5 million cities. This
> > was very slow even using indexes, so I split the huge table into 250
> > tables named after their country, e.g. gm_cities for the German city
> > table. Now performance is OK, because my interface knows which table
> > to look in.
> >
> > A new problem arose: there is a need for a search across all tables
> > to find a specific city. Finding a city even in one table takes 0.5s,
> > which is a lot if you multiply it by 250 tables.
> > I have to admit that there is no index on the city column yet. I
> > tried to put a fulltext index on the city column, but this blew up
> > the table size from 2.5MB to 7.5MB, which made the query take 1s.
> >
> > My questions are as follows:
> >
> > 1. What's wrong with this one table being 7.5MB instead of 2.5MB? I
> > did delete the index and optimize the table, but it is still 7.5MB.
> >
> > 2. What would be the best way to organize this data so that a search
> > over these 2.5 million entries takes no longer than 0.5s? I could put
> > it all into one table again, but this would mean a week of work,
> > because my application is built on this system. Maybe a good hint for
> > putting the right index in place would be the best help for me.
> >
> > My table structure looks like this:
> >
> > ID     ->    mediumint(9) unique index
> > city   ->    char(100)
> > province_id    -> char(2) index
> >
> > Another issue is that it might take even longer, because I have to
> > get the province name and the country name out of the countries and
> > provinces tables.
> >
> > Thank you for any help,
> >
> > Andy



-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
