Re: [PHP] Speed Opinion
Nathan Rixham wrote:
> Ashley Sheridan wrote:
>> [discussion of join speed and indexing snipped]
> and if you use spatial indexes and points instead of integers you can
> join on the biggest of databases with literally no performance hit,
> same speed regardless of table size :p (plus cos a point has two values
> you can use one for id and the other for timestamp ;)
> regards
> ps: i've said this many times before, but not for like 6 months, so
> time for another reminder

MySQL supports spatial extensions to allow the generation, storage, and analysis of geographic features. So, do I only use spatial indexes when creating a geographic map? Are they good for anything else?

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
[PHP] Speed Opinion
Hi all,

I am seeking some knowledge; hopefully I explain this right. I am wondering what you think is faster.

Say you have 1000 records from 2 different tables that you need to get from a MySQL database. A simple table will be displayed for each record; the second table contains related info for each record in table 1.

Is it faster to read all the records from both tables into two arrays, then use PHP to go through the array for table 1 and figure out which records from table 2 are related? Or do you dump all the data from table 1 into an array, then make a database query against table 2 as you go through each record?

PS: I know I can use a join, but I find any time I use a join, the database query is extremely slow. I have tried it with each version of MySQL and PHP for the last few years. The difference is on the order of 100x slower or more.
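For reference, the join the PS refers to might look something like this for the two-table case described above (table and column names are hypothetical: `items` standing in for table 1, `details` for table 2, with `details.item_id` pointing back to `items.id`):

```sql
-- Hypothetical schema: items = table 1, details = table 2.
SELECT i.id, i.name, d.info
FROM items AS i
JOIN details AS d ON d.item_id = i.id;

-- For this to be fast, the join column needs an index:
CREATE INDEX idx_details_item_id ON details (item_id);
```

At 1000 rows, an indexed join like this should return essentially instantly; a 100x slowdown suggests something else is wrong, as the replies below point out.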
Re: [PHP] Speed Opinion
PHP wrote:
> [question about matching two tables in PHP vs. the database snipped]

Make the db do it.

> PS: I know I can use a join, but I find anytime I use a join, the
> database query is extremely slow ... The delay difference is in the
> order of 100x slower or more.

Then you're missing indexes or something; I've joined tables with hundreds of thousands of records and it's very fast.

--
Postgresql php tutorials
http://www.designmagick.com/
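One way to check for the missing index Chris suspects is to ask MySQL for its execution plan (hypothetical table names; the exact output columns vary by MySQL version):

```sql
-- Ask MySQL how it intends to execute the slow join:
EXPLAIN SELECT i.id, d.info
FROM items AS i
JOIN details AS d ON d.item_id = i.id;

-- A row showing type = ALL on `details` means a full table scan for
-- every joined row; adding an index on the join column usually fixes it:
CREATE INDEX idx_details_item_id ON details (item_id);
```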
Re: [PHP] Speed Opinion
PHP wrote:
> [question about matching two tables in PHP vs. the database snipped]

In general MySQL is faster than PHP; do/select as much as you can in MySQL.

-afan
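To illustrate the "do as much as you can in MySQL" point: work that often ends up in a PHP loop, such as counting the related rows for each record, can be pushed into a single grouped query (hypothetical table names again):

```sql
-- One round trip; the database does the counting, not PHP:
SELECT i.id, i.name, COUNT(d.id) AS detail_count
FROM items AS i
LEFT JOIN details AS d ON d.item_id = i.id
GROUP BY i.id, i.name;
```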
Re: [PHP] Speed Opinion
On Wed, 2009-02-04 at 14:42 -0800, PHP wrote:
> [question about matching two tables in PHP vs. the database snipped]

Grab the records from table 1 and build a list of IDs to match in table 2, then use an IN clause. Two queries and no joins, as you requested:

    SELECT * FROM foo_table AS FOO WHERE something something;
    SELECT * FROM fee_table AS FEE WHERE foo_id IN ( list, of, ids );

You REALLY don't want to do a query for every row matched in the first query.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP
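Filled in with hypothetical table names (the ID list would be assembled in PHP from the first result set), Rob's two-query pattern looks like:

```sql
-- Query 1: the main rows.
SELECT id, name FROM items;

-- Query 2: all related rows at once, using IDs collected from query 1
-- (1, 2, 3 here stand in for the real list):
SELECT item_id, info FROM details WHERE item_id IN (1, 2, 3);
```

This avoids the N+1 query problem: 2 round trips to the server instead of 1001.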
Re: [PHP] Speed Opinion
On Thu, 2009-02-05 at 09:44 +1100, Chris wrote:
> PHP wrote:
>> [question about matching two tables in PHP vs. the database snipped]
> Then you're missing indexes or something; I've joined tables with
> hundreds of thousands of records and it's very fast.

I've used joins on tables with millions of rows, and it's still not been too slow to use. Admittedly that was an MSSQL database, which I've always found to be slower, but MySQL was built to be a relational database and can handle many, many millions of records quite happily. The slowdown you experienced comes either from not using indexes on the tables, or from the way you were displaying/manipulating those results within PHP.

Ash
www.ashleysheridan.co.uk
Re: [PHP] Speed Opinion
Ashley Sheridan wrote:
> On Thu, 2009-02-05 at 09:44 +1100, Chris wrote:
>> [question and index discussion snipped]
> I've used joins on tables with millions of rows, and it's still not
> been too slow to use. ... The slowdown you experienced is either not
> using indexes on tables, or the way you were displaying/manipulating
> those results from within PHP.
> Ash
> www.ashleysheridan.co.uk

And if you use spatial indexes and points instead of integers, you can join on the biggest of databases with literally no performance hit: same speed regardless of table size :p (plus, cos a point has two values, you can use one for id and the other for timestamp ;)

regards

ps: i've said this many times before, but not for like 6 months, so time for another reminder
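For anyone curious to try Nathan's trick, a minimal sketch (in MySQL versions of this era, SPATIAL indexes required the MyISAM engine and a NOT NULL geometry column; the table and values are made up):

```sql
CREATE TABLE items (
  pt   POINT NOT NULL,  -- X used as an id, Y as a timestamp, per the suggestion above
  name VARCHAR(64),
  SPATIAL INDEX (pt)
) ENGINE=MyISAM;

-- X = id 42, Y = a unix timestamp:
INSERT INTO items (pt, name) VALUES (POINT(42, 1233790000), 'example');

-- Lookups go through the R-tree index via a bounding-box test around the id:
SELECT name FROM items
WHERE MBRContains(
  GeomFromText('POLYGON((41.5 0, 42.5 0, 42.5 2000000000, 41.5 2000000000, 41.5 0))'),
  pt
);
```

Whether this actually beats a plain B-tree index on an integer column is worth benchmarking rather than taking on faith.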