Re: [PHP] Exporting large data from mysql to html using php

2011-11-03 Thread Ashley Sheridan
On Wed, 2011-11-02 at 19:47 -0400, Jason Pruim wrote:

 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 31, 2011, at 7:52 PM, Ashley Sheridan wrote:
 
  On Mon, 2011-10-31 at 19:29 -0400, Jason Pruim wrote:
  
  Jason Pruim
  li...@pruimphotography.com
  
  
  
  On Oct 31, 2011, at 7:11 PM, Jim Lucas wrote:
  
   On 10/24/2011 5:50 PM, Jason Pruim wrote:
   Now that I've managed to list 3 separate programming languages and 
   somewhat tie it back into php here's the question...
   
   I have about 89 million records in mysql... the initial load of the 
page takes 2 to 3 minutes, I am using pagination, so I have LIMITs on 
the SQL queries... But they just aren't going fast enough...
   
   What I would like to do, is pull the data out of MySQL and store it in 
   the HTML files, and then update the HTML files once a day/week/month... 
   I can figure most of it out... BUT... How do I automatically link to 
   the individual pages?
   
   I have the site working when you pull it from MySQL... Just the load 
   time sucks... Any suggestions on where I can pull some more info from? 
   :)
   
   Thanks in advance!
   
   
   Jason Pruim
   li...@pruimphotography.com
   
   
   Jason,
   
How large a data set are you starting with? How many records in all?
   
   Will you show us your DB schema?
  
  Hey Jim,
  
  I am working with 89 Million records right now... Going to be expanding to 
  a much larger dataset as the site expands.
  
  Here is the main table that I am using:
  
mysql> describe main;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| areacode   | int(3)      | NO   | MUL | NULL    |                |
| exchange   | int(3)      | NO   |     | NULL    |                |
| subscriber | char(4)     | NO   |     | NULL    |                |
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| state      | varchar(20) | YES  |     | NULL    |                |
| config     | text        | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
  
  
  
config is just going to contain various settings for commenting on records, 
and future expansion. State will actually be the state spelled out. 

Thanks for taking a look!
  
  
  
I'd put the spelling of the state in another table and just include a 
reference to it in this table; it will save a lot on storage and it's easy 
to do a join to get it. That way, it's also much faster to look up entries 
by state, as a numerical index is quicker than a string index.
  
  On the subject of indexes, what other ones do you have apart from the 
  primary key there?
  
  -- 
  Thanks,
  Ash
  http://www.ashleysheridan.co.uk
  
  
 
 Hey Ash,
 
 Sorry for the delay... Had a sick kid at home which kept me very busy and 
 away from the computer!
 
You think putting the spelling of the state in another table with some kind 
of a reference code (such as 1, 2, 3, 4, etc.) would be faster than 
searching based on some spelling of the name? I am using the state name in 
the URL that I'm referencing... So something like: 
www.example.com/new-york/212/314 would bring up the entire 10,000 records for 
212-314- 
 
Trying to avoid the normal index.php?vars=blah&you=Blah type stuff...
 
Indexes... I'm still learning about those... I created an index based on the 
areacode and exchange fields... But more reading is needed to figure out 
just how to use it properly :)
 
 Thanks Ash!
 
 
 
 
 


You could still search based on the state name, but you could then use a
join or first search the states table for the ids of each matching state
and use an IN clause in the query against the phone numbers.
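Ash's suggestion could be sketched in SQL along these lines (the `states` table, `state_id` column, and the state value are illustrative assumptions, not taken from Jason's actual schema beyond the `main` table):

```sql
-- Hypothetical lookup table holding each state name once.
CREATE TABLE states (
    id   TINYINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(20) NOT NULL,
    UNIQUE KEY (name)
);

-- main.state would become a small integer foreign key (state_id)
-- instead of a VARCHAR(20) repeated 89 million times.

-- Search by state name via a join:
SELECT m.areacode, m.exchange, m.subscriber
FROM main m
JOIN states s ON s.id = m.state_id
WHERE s.name = 'New York'
LIMIT 100;

-- Or resolve the ids first and use an IN clause:
SELECT areacode, exchange, subscriber
FROM main
WHERE state_id IN (SELECT id FROM states WHERE name = 'New York')
LIMIT 100;
```

Either form lets MySQL compare small integers instead of strings on the big table.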

-- 
Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] Exporting large data from mysql to html using php

2011-11-02 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 31, 2011, at 7:52 PM, Ashley Sheridan wrote:

 On Mon, 2011-10-31 at 19:29 -0400, Jason Pruim wrote:
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 31, 2011, at 7:11 PM, Jim Lucas wrote:
 
  On 10/24/2011 5:50 PM, Jason Pruim wrote:
  Now that I've managed to list 3 separate programming languages and 
  somewhat tie it back into php here's the question...
  
  I have about 89 million records in mysql... the initial load of the page 
takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL 
queries... But they just aren't going fast enough...
  
  What I would like to do, is pull the data out of MySQL and store it in 
  the HTML files, and then update the HTML files once a day/week/month... I 
  can figure most of it out... BUT... How do I automatically link to the 
  individual pages?
  
  I have the site working when you pull it from MySQL... Just the load time 
  sucks... Any suggestions on where I can pull some more info from? :)
  
  Thanks in advance!
  
  
  Jason Pruim
  li...@pruimphotography.com
  
  
  Jason,
  
How large a data set are you starting with? How many records in all?
  
  Will you show us your DB schema?
 
 Hey Jim,
 
 I am working with 89 Million records right now... Going to be expanding to a 
 much larger dataset as the site expands.
 
 Here is the main table that I am using:
 
mysql> describe main;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| areacode   | int(3)      | NO   | MUL | NULL    |                |
| exchange   | int(3)      | NO   |     | NULL    |                |
| subscriber | char(4)     | NO   |     | NULL    |                |
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| state      | varchar(20) | YES  |     | NULL    |                |
| config     | text        | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
 
 
 
config is just going to contain various settings for commenting on records, 
and future expansion. State will actually be the state spelled out. 

Thanks for taking a look!
 
 
 
I'd put the spelling of the state in another table and just include a 
reference to it in this table; it will save a lot on storage and it's easy to 
do a join to get it. That way, it's also much faster to look up entries by 
state, as a numerical index is quicker than a string index.
 
 On the subject of indexes, what other ones do you have apart from the primary 
 key there?
 
 -- 
 Thanks,
 Ash
 http://www.ashleysheridan.co.uk
 
 

Hey Ash,

Sorry for the delay... Had a sick kid at home which kept me very busy and away 
from the computer!

You think putting the spelling of the state in another table with some kind of 
a reference code (such as 1, 2, 3, 4, etc.) would be faster than searching 
based on some spelling of the name? I am using the state name in the URL that 
I'm referencing... So something like: www.example.com/new-york/212/314 would 
bring up the entire 10,000 records for 212-314- 

Trying to avoid the normal index.php?vars=blah&you=Blah type stuff...
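URL schemes like /new-york/212/314 are usually done with a rewrite rule that maps the pretty path onto an ordinary query string. A minimal Apache sketch (the script name and parameter names are assumptions, not Jason's actual setup):

```apache
# .htaccess — map /new-york/212/314 onto index.php internally,
# so visitors never see index.php?state=...&areacode=...
RewriteEngine On
RewriteRule ^([a-z-]+)/([0-9]{3})/([0-9]{3})/?$ index.php?state=$1&areacode=$2&exchange=$3 [L,QSA]
```

The PHP script then reads $_GET['state'], $_GET['areacode'], and $_GET['exchange'] as usual.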

Indexes... I'm still learning about those... I created an index based on the 
areacode and exchange fields... But more reading is needed to figure out just 
how to use it properly :)
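For what it's worth, a hedged sketch of how a composite (areacode, exchange) index earns its keep: filter on its leftmost columns, and paginate by seeking past the last row seen rather than using large OFFSETs (the cut-off id value here is purely illustrative):

```sql
-- A composite index is only usable left-to-right:
-- WHERE areacode = ...                    -> uses the index
-- WHERE areacode = ... AND exchange = ... -> uses the index
-- WHERE exchange = ...                    -> cannot use it

-- "Keyset" pagination: remember the last id of the previous page.
-- LIMIT 100000, 100 forces MySQL to walk and discard 100,000 rows;
-- seeking on id does not.
SELECT id, subscriber
FROM main
WHERE areacode = 212 AND exchange = 314 AND id > 4200
ORDER BY id
LIMIT 100;
```

Each "next page" link carries the last id shown, so every page is equally cheap.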

Thanks Ash!







Re: [PHP] Exporting large data from mysql to html using php

2011-10-31 Thread Jim Lucas
On 10/24/2011 5:50 PM, Jason Pruim wrote:
 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL 
queries... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the individual 
 pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 
 Jason Pruim
 li...@pruimphotography.com
 

Jason,

How large a data set are you starting with? How many records in all?

Will you show us your DB schema?

-- 
Jim Lucas

http://www.cmsws.com/
http://www.cmsws.com/examples/
http://www.bendsource.com/

C - (541) 408-5189
O - (541) 323-9113
H - (541) 323-4219

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Exporting large data from mysql to html using php

2011-10-31 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 31, 2011, at 7:11 PM, Jim Lucas wrote:

 On 10/24/2011 5:50 PM, Jason Pruim wrote:
 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL 
queries... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the 
 individual pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 Jason,
 
How large a data set are you starting with? How many records in all?
 
 Will you show us your DB schema?

Hey Jim,

I am working with 89 Million records right now... Going to be expanding to a 
much larger dataset as the site expands.

Here is the main table that I am using:

mysql> describe main;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| areacode   | int(3)      | NO   | MUL | NULL    |                |
| exchange   | int(3)      | NO   |     | NULL    |                |
| subscriber | char(4)     | NO   |     | NULL    |                |
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| state      | varchar(20) | YES  |     | NULL    |                |
| config     | text        | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+



config is just going to contain various settings for commenting on records, and 
future expansion. State will actually be the state spelled out. 

Thanks for taking a look!




Re: [PHP] Exporting large data from mysql to html using php

2011-10-31 Thread Ashley Sheridan
On Mon, 2011-10-31 at 19:29 -0400, Jason Pruim wrote:

 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 31, 2011, at 7:11 PM, Jim Lucas wrote:
 
  On 10/24/2011 5:50 PM, Jason Pruim wrote:
  Now that I've managed to list 3 separate programming languages and 
  somewhat tie it back into php here's the question...
  
  I have about 89 million records in mysql... the initial load of the page 
takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL 
queries... But they just aren't going fast enough...
  
  What I would like to do, is pull the data out of MySQL and store it in the 
  HTML files, and then update the HTML files once a day/week/month... I can 
  figure most of it out... BUT... How do I automatically link to the 
  individual pages?
  
  I have the site working when you pull it from MySQL... Just the load time 
  sucks... Any suggestions on where I can pull some more info from? :)
  
  Thanks in advance!
  
  
  Jason Pruim
  li...@pruimphotography.com
  
  
  Jason,
  
How large a data set are you starting with? How many records in all?
  
  Will you show us your DB schema?
 
 Hey Jim,
 
 I am working with 89 Million records right now... Going to be expanding to a 
 much larger dataset as the site expands.
 
 Here is the main table that I am using:
 
mysql> describe main;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| areacode   | int(3)      | NO   | MUL | NULL    |                |
| exchange   | int(3)      | NO   |     | NULL    |                |
| subscriber | char(4)     | NO   |     | NULL    |                |
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| state      | varchar(20) | YES  |     | NULL    |                |
| config     | text        | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
 
 
 
config is just going to contain various settings for commenting on records, 
and future expansion. State will actually be the state spelled out. 

Thanks for taking a look!
 
 


I'd put the spelling of the state in another table and just include a
reference to it in this table; it will save a lot on storage and it's
easy to do a join to get it. That way, it's also much faster to look up
entries by state, as a numerical index is quicker than a string index.

On the subject of indexes, what other ones do you have apart from the
primary key there?
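To make Ash's question concrete, the indexes that would most obviously help the queries discussed in this thread might look like the following. This is a sketch: the index names are invented, areacode already shows as MUL in the schema, and whether either pays off depends on the real query mix.

```sql
-- Covers lookups by area code and exchange (the /state/212/314 URLs):
CREATE INDEX idx_area_exchange ON main (areacode, exchange);

-- Covers the per-state listing; only worthwhile until state is
-- normalised into an integer foreign key as suggested above:
CREATE INDEX idx_state ON main (state);
```

Each extra index also slows inserts, so on 89 million rows it pays to add only the ones the queries actually use.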

-- 
Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Lester Caine

Jim Giner wrote:

Your boss wants to give access to phone numbers to the public in general?
Then what?

Glad mine's unlisted.


Is it?
Does it start 518248 ?
I often forget to get a phone number when a parcel has to go by carrier, and 
paypal does not include that info, but only rarely does one not come up on 
google ... even if in your case I had to check the cached copy of the result ;)


--
Lester Caine - G8HFL
-
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php




Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread David Robley
Jim Giner wrote:

 David Robley robl...@aapt.net.au wrote in message
 news:49.50.34068.1b567...@pb1.pair.com...

 Consider running EXPLAIN on all your queries to see if there is something
 Mysql thinks could be done to improve performance.

 
 Why do so many responders seem to think the problem here is in the
 preparation of the query?
 
It is the approach that is the problem.  It needs to be re-addressed, not
simply tinkered with.  I don't care if someone figures out how to spew out
 89M+ records in 5 seconds flat.  What are you doing pulling up that many
 detail records at one time for? Who is going to look at them?  You?  If
 so, see my previous post on how long that is going to take you just to
 examine (!) the first million alone.
 
 This problem needs a re-working of the data processing and user interfaces
 to allow for the selection of an appropriate amount of individual records
 in any result set, otherwise simply preparing a summary of some sort.

Jason originally said that he was using LIMIT to grab a subset of the data,
so I don't see why you would think he is trying to pull the full data set
in one hit. My response suggesting EXPLAIN was made in the knowledge that
he is using LIMIT.
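David's EXPLAIN suggestion in practice: prefix the paginated query with EXPLAIN and look at the `key` and `rows` columns (query shown against the `main` schema from earlier in the thread; the state value is illustrative):

```sql
EXPLAIN SELECT areacode, exchange, subscriber
FROM main
WHERE state = 'New York'
LIMIT 100;

-- If `key` comes back NULL and `rows` is in the millions, MySQL is
-- doing a full table scan for every page and the WHERE column needs
-- an index.
```

Running EXPLAIN on each query the pagination issues is a cheap way to find which one is the 2-3 minute culprit.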


Cheers
-- 
David Robley

I was the first to climb Mount Everest, said Tom hilariously.
Today is Setting Orange, the 8th day of The Aftermath in the YOLD 3177. 





Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Tommy Pham
On Wed, Oct 26, 2011 at 5:47 PM, Jason Pruim li...@pruimphotography.comwrote:


 Jason Pruim
 li...@pruimphotography.com

 The server that's running it is a home computer with a VPS installed...
 It's not my dev environment :)


Home computer used for a production environment?  Wow.. I'm speechless.


The information being searched is specifically phone numbers, and the
bosses want to provide the public a way to browse them, hence the
pagination... Once I removed a COUNT from mysql it started working a lot
better... So I still need to provide a better pagination system, but it's
working.


If you're just showing phone numbers and no other information, what's the
point, since there are already several other white and yellow pages on the
net?  There's even a reverse number look-up too.  There's also a well-known
paid service for getting a DB of contacts (name, address, phone numbers of
persons and businesses).  Just out of curiosity, what is it that your boss
intends to offer that would stand out from already existing services?  I doubt
anyone would be sane enough to sit there and just browse through phone
numbers.  Even if it does show the owner's name of the registered
number, I highly doubt anyone sane would browse unless you provide some kind
of filter/search.


 Oh, and the 89 million is just for one state :) We are talking the
 possibility of I believe 10 billion numbers to cover the US, not to mention
 Canada which I believe uses the same numbering system as we do so that could
 be another 10 billion...

 
 As I've mentioned, something of this magnitude is better left to a DBA;
 work together with that DBA.  Either hire/contract one or become one :)

 I'm working on becoming one on top of web designer and programmer :)


Good luck, that's a LOT of reading.  I'd estimate that's about 3k+ pages of
reading. :)


 
 
  Regards,
  Tommy




Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Nathan Nobbe
On Mon, Oct 24, 2011 at 6:50 PM, Jason Pruim li...@pruimphotography.comwrote:

 Now that I've managed to list 3 separate programming languages and somewhat
 tie it back into php here's the question...

 I have about 89 million records in mysql... the initial load of the page
 takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL
 queries... But they just aren't going fast enough...

 What I would like to do, is pull the data out of MySQL and store it in the
 HTML files, and then update the HTML files once a day/week/month... I can
 figure most of it out... BUT... How do I automatically link to the
 individual pages?

 I have the site working when you pull it from MySQL... Just the load time
 sucks... Any suggestions on where I can pull some more info from? :)

 Thanks in advance!


dial in the db schema (think keys) and queries; then investigate a reverse
proxy like varnish to cache the generated html.

you'll be able to handle a couple thousand requests per second against the
proxy in no time.

might be worth pre-generating some of the pages if they are still really
slow after db optimization.
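Nathan's varnish suggestion, as a minimal sketch (Varnish 3-era VCL; the backend address and the one-hour TTL are assumptions, and real configs need cookie/cache-control handling):

```vcl
# default.vcl — put Varnish in front of the PHP backend and cache
# the generated pages so MySQL is hit at most once per hour per page.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_fetch {
    # Pages are regenerated from MySQL at most once an hour.
    set beresp.ttl = 1h;
}
```

This gets most of the benefit of pre-generating static HTML without a cron job: the first visitor pays the MySQL cost, everyone else gets the cached copy.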

-nathan


Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Jim Giner

 Good luck, that's a LOT of reading.  I'd estimate that's about 3k+ pages 
 of
 reading. :)


 
 
  Regards,
  Tommy




Nice to see someone else is finally getting the point that I've been making. 






Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 27, 2011, at 2:44 PM, Tommy Pham wrote:

 On Wed, Oct 26, 2011 at 5:47 PM, Jason Pruim li...@pruimphotography.com 
 wrote:
 
 Jason Pruim
 li...@pruimphotography.com
 
 The server that's running it is a home computer with a VPS installed... It's 
 not my dev environment :)
 
 
 Home computer used for a production environment?  Wow.. I'm speechless.

Ummm... No... Home computer for a dev environment... The final product will be 
moved to a better host. This is temporary... And actually isn't a bad idea... 
If you get it running well on a cheap home computer, why wouldn't it run 
*BETTER* on the high-end production server?

  
 The information being searched is specifically phone numbers, and the bosses 
 want to provide the public a way to browse them, hence the pagination... Once 
 I removed a COUNT from mysql it started working a lot better... So I still 
 need to provide a better pagination system, but it's working.
 
 
 If you're just showing phone numbers and no other information, what's the point, 
 since there are already several other white and yellow pages on the net?  
 There's even a reverse number look-up too.  There's also a well-known paid 
 service for getting a DB of contacts (name, address, phone numbers of persons 
 and businesses).  Just out of curiosity, what is it that your boss intends to 
 offer that would stand out from already existing services?  I doubt anyone 
 would be sane enough to sit there and just browse through phone numbers. 
  Even if it does show the owner's name of the registered number, I highly 
 doubt anyone sane would browse unless you provide some kind of filter/search.

I'm not expecting people to look at every single page... But I need a way to 
allow curious people to page through results if they want. 
  
 Oh, and the 89 million is just for one state :) We are talking the 
 possibility of I believe 10 billion numbers to cover the US, not to mention 
 Canada which I believe uses the same numbering system as we do so that could 
 be another 10 billion...
 
 
  As I've mentioned, something of this magnitude is better left to a DBA;
  work together with that DBA.  Either hire/contract one or become one :)
 
 I'm working on becoming one on top of web designer and programmer :)
 
 Good luck, that's a LOT of reading.  I'd estimate that's about 3k+ pages of 
 reading. :)

I've always been a fan of reading :) Just wish I had more time to do it! :)




Re: [PHP] Exporting large data from mysql to html using php

2011-10-27 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 27, 2011, at 4:08 PM, Nathan Nobbe wrote:

 On Mon, Oct 24, 2011 at 6:50 PM, Jason Pruim li...@pruimphotography.com 
 wrote:
 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
 takes 2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL 
 queries... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the individual 
 pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 dial in the db schema (think keys) and queries; then investigate a reverse 
 proxy like varnish to cache the generated html.
 
 you'll be able to handle a couple thousand requests per second against the 
 proxy in no time.
 
 might be worth pre-generating some of the pages if they are still really slow 
 after db optimization.

Would it make sense that COUNT()'ing 89 million records would kill the 
performance? Once that was gone... It worked perfectly! 




Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Lester Caine

Tommy Pham wrote:

It turns out the issue was actually in the pagination... I'm reworking the
 whole thing and streamlining it... But in the pagination that I found on
 the internet it used a SELECT COUNT(*) WHERE state='{$state}'; and the
 COUNT was killing the time... Once that was removed, I was displaying
 records faster than I could imagine... So it's off to pagination land to fix
 it! And possibly redo the entire thing!


If you're encountering performance issues while doing SELECT COUNT(*), it
sounds like you have a serious disk IO performance issue.  Is the DB on a RAID
subsystem?  If not, why not? If so, what level?  Also, what type of HDDs?
For something like this, it should be RAID 10 with HDDs spinning at least
7200RPM, 10,000+ RPM recommended, connected to a good RAID controller, like
3ware's.  Also, the controller should be in slot PCI-X or, preferably,
PCI-e.


What a load of twaddle ...

SELECT COUNT(*) is a problem on any fully transactional database, since it has 
to be generated from the currently active view of the data.
Rather than trying to get the database engine to access every record faster, the 
correct action is to either avoid the count altogether or, more practically, 
maintain a separate table with useful counts that have been generated from the 
committed data.


Jason, it is not unreasonable that an initial view will be displaying unfiltered 
data, so you just need to restrict the number of records displayed. As you have 
found out, telling the user how many more records there are is the real problem, 
so if it's not important ... don't, but if it's useful to know, then keep a 
'cache' of counts that link to your initial filter options. Once you are down to 
a few thousand records, then a SELECT COUNT(*) may be appropriate ;) Where it 
becomes a problem is when the user can set up a more complex filter that 
the cache does not cover ...
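Lester's 'cache' of counts can be sketched as a summary table kept current by a trigger. The table and trigger names are illustrative, and MySQL syntax is shown even though Lester works with Firebird, where the idea is the same:

```sql
-- One row per state, maintained incrementally so the pagination
-- header never has to COUNT(*) across 89 million rows.
CREATE TABLE state_counts (
    state   VARCHAR(20) PRIMARY KEY,
    records INT UNSIGNED NOT NULL DEFAULT 0
);

DELIMITER //
CREATE TRIGGER main_count_ins AFTER INSERT ON main
FOR EACH ROW
BEGIN
    INSERT INTO state_counts (state, records) VALUES (NEW.state, 1)
    ON DUPLICATE KEY UPDATE records = records + 1;
END//
DELIMITER ;

-- The page then reads the cached count in constant time:
-- SELECT records FROM state_counts WHERE state = 'New York';
```

A matching AFTER DELETE trigger (decrementing the count) completes the picture; as Lester notes, the cache only covers the filters you planned for.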


--
Lester Caine - G8HFL
-
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php




Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Tommy Pham
On Wed, Oct 26, 2011 at 12:52 AM, Lester Caine les...@lsces.co.uk wrote:

 Tommy Pham wrote:

 It turns out the issue was actually in the pagination... I'm reworking the
  whole thing and streamlining it... But in the pagination that I found
 on
   the internet it used a SELECT COUNT(*) WHERE state='{$state}'; and
 the
   COUNT was killing the time... Once that was removed, I was displaying
  records faster than I could imagine... So it's off to pagination land
 to fix
   it! And possibly redo the entire thing!
 

 If you're encountering performance issues while doing SELECT COUNT(*), it
 sounds like you have a serious disk IO performance issue.  Is the DB on a RAID
 subsystem?  If not, why not? If so, what level?  Also, what type of HDDs?
 For something like this, it should be RAID 10 with HDDs spinning at least
 7200RPM, 10,000+ RPM recommended, connected to a good RAID controller,
 like
 3ware's.  Also, the controller should be in slot PCI-X or, preferably,
 PCI-e.


 What a load of twaddle ...


I wonder ... The real question is what's the purpose of the DB?  Is it for
OLAP or OLTP? ;)
As for dealing with a DB having millions of rows, you're crossing over into
DBA area.


 SELECT COUNT(*) is a problem on any fully transactional database, since it
 has to be generated from the currently active view of the data.
 Rather than trying to get the database engine to access every record faster,
 the correct action is to either avoid the count altogether, or more
 practically maintain a separate table with useful counts that have been
 generated from the committed data.

 Jason, it is not unreasonable that an initial view will be displaying
 unfiltered data, so you just need to restrict the number of records
 displayed. As you have found out, telling the user how many more records
 there are is the real problem, so if it's not important ... don't, but if
 it's useful to know, then keep a 'cache' of counts that link to your initial
 filter options. Once you are down to a few thousand records, then a SELECT
 COUNT(*) may be appropriate ;) Where it becomes a problem is when the
 user can set up a more complex filter that the cache does not cover ...

 --
 Lester Caine - G8HFL
 -
 Contact - http://lsces.co.uk/wiki/?page=contact
 L.S.Caine Electronic Services - http://lsces.co.uk
 EnquirySolve - http://enquirysolve.com/
 Model Engineers Digital Workshop - http://medw.co.uk//
 Firebird - http://www.firebirdsql.org/index.php






Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Lester Caine

Tommy Pham wrote:

I wonder ... The real question is what's the purpose of the DB?  Is it for OLAP
or OLTP? ;)
As for dealing with a DB having millions of rows, you're crossing over into DBA 
area.


Many of my customers have coming up on 20 years of data available. There has 
been a debate on transferring historic data to a separate database, but having 
it available is not causing a problem, except for some counts and larger search 
actions, and being able to see how long a client has been visiting is often 
useful. Statistical analysis is always done on a separate machine, with a 
replicated copy of the data, so as not to affect the active users ...


That said, we are now using the much more detailed LLPG address data rather than 
simple postcode, and that has added another order of magnitude to data that is 
being searched live ...


--
Lester Caine - G8HFL
-
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php




Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Tommy Pham
On Wed, Oct 26, 2011 at 1:40 AM, Lester Caine les...@lsces.co.uk wrote:

 Tommy Pham wrote:

 I wonder ... The real question is what's the purpose of the DB?  Is it for
 OLAP
 or OLTP? ;)
 As for dealing with DB having millions of rows, you're crossing over into
 DBA area.


 Many of my customers have coming up on 20 years of data available. There
 has been a debate on transferring historic data to a separate database, but
 having it available is not causing a problem, except for some counts and
 larger search actions, and being able to see how long a client has been
 visiting is often useful. Statistical analysis is always done on a separate
 machine, with a replicated copy of the data, so as not to affect the active
 users ...


What kind of counts/filters?  What kind of RAID subsystem is the storage?
What's the total size of the DB?  Up to 20 years of data should be in the
peta range.  In that peta range, if you're not having performance issue and
not using either RAID 0, 0+1, 10, 50, or 60, I'd love to hear about the
application and database design in details. :)


 That said, we are now using the much more detailed LLPG address data rather
 than simple postcode, and that has added another order of magnitude to data
 that is being searched live ...


 --
 Lester Caine - G8HFL
 -
 Contact - http://lsces.co.uk/wiki/?page=contact
 L.S.Caine Electronic Services - http://lsces.co.uk
 EnquirySolve - http://enquirysolve.com/
 Model Engineers Digital Workshop - http://medw.co.uk//
 Firebird - http://www.firebirdsql.org/index.php


Since Jason didn't disclose sufficient information, I suggested that RAID
storage based on the worst case scenario being this:  as a business analyst,
I'd want to do drill-downs, filters, counts of a the following for an
up-coming marketing campaign:

* county(ies)
* city(ies)
* zip code(s)
* address type (business and/or residential)
* type of business (commercial, industrial/manufacturing, etc)
* type of residential (homes - single/townhouses - or apartments/condos)

The filters and counts will be any combination of the above, e.g. 5 random zip
codes within the given state, listing all the other criteria and their breakdown.
As Jason mentioned having 89 million rows for a given state, how long would it
take to run the drill-downs if the DB isn't sitting on a fast storage
medium?  That 89 million is most likely the average count in the USA.
For California and New York, the number can easily double.  That's only
the basic filtering.  What of a specific business industry such as
landscaping?  What of filtering by yearly income and/or real estate
value?  BTW, as a business analyst, I don't want to wait hours for the info
to update every time I change a criterion/filter to get the counts before I
look into a few random individual records from the results.

As I've mentioned, something of this magnitude is better left to a DBA;
work together with that DBA.  Either hire/contract one or become one
:)


Regards,
Tommy


Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Lester Caine

Tommy Pham wrote:


Many of my customers are coming up on 20 years of data available. There has
been a debate on transferring historic data to a separate database, but
having it available is not causing a problem, except for some counts and
larger search actions, and being able to see how long a client has been
visiting is often useful. Statistical analysis is always done on a separate
machine, with a replicated copy of the data, so as not to affect the active
users ...


What kind of counts/filters?  What kind of RAID subsystem is the storage on?
What's the total size of the DB?  Up to 20 years of data should be in the peta
range.  In that peta range, if you're not having performance issues and not using
RAID 0, 0+1, 10, 50, or 60, I'd love to hear about the application and
database design in detail. :)


We are still only in the hundreds of MB, and historic data has less detail than 
the current 'transactions'. The current postcode table is 500MB, and while the 
LLPG data would increase that by a factor of 100, it's currently only 
restricted to a council's immediate working area, so we keep the problem 
contained, dropping back to postcode for out-of-area enquiries. Users complain 
if an enquiry takes more than a few seconds, and Firebird is giving me more than 
adequate performance, and allows shadow data to be created via triggers to 
reduce the need for 'counting'.
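That trigger-maintained "shadow data" idea can be sketched roughly like this (a sketch only: SQLite trigger syntax is used so the example is self-contained, while Firebird and MySQL spell their triggers slightly differently; the `main` table and `state` column are taken from the schema posted in the thread):

```php
<?php
// Sketch: keep a per-state row count in a small side table, maintained by
// triggers, so pagination never needs a COUNT(*) over the big table.
// Assumptions: SQLite trigger syntax (MySQL/Firebird differ slightly);
// table `main` with a `state` column, as in the schema posted in the thread.

function installCountShadow(PDO $pdo): void
{
    $pdo->exec("CREATE TABLE IF NOT EXISTS state_counts (
                    state VARCHAR(20) PRIMARY KEY,
                    n     INTEGER NOT NULL DEFAULT 0)");

    // On insert: make sure the state row exists, then bump its counter.
    $pdo->exec("CREATE TRIGGER IF NOT EXISTS main_count_ins
                AFTER INSERT ON main BEGIN
                    INSERT OR IGNORE INTO state_counts (state, n) VALUES (NEW.state, 0);
                    UPDATE state_counts SET n = n + 1 WHERE state = NEW.state;
                END");

    // On delete: decrement the counter for the removed row's state.
    $pdo->exec("CREATE TRIGGER IF NOT EXISTS main_count_del
                AFTER DELETE ON main BEGIN
                    UPDATE state_counts SET n = n - 1 WHERE state = OLD.state;
                END");
}

// Reading the count is then a primary-key lookup instead of a table scan:
function stateCount(PDO $pdo, string $state): int
{
    $stmt = $pdo->prepare("SELECT n FROM state_counts WHERE state = ?");
    $stmt->execute([$state]);
    return (int) $stmt->fetchColumn();
}
```

The lookup table has one row per state, so it stays fast no matter how large `main` grows.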


I have a new 'application' which is using the same search criteria but the data 
volume is growing a lot faster, 10Gb on the test system here, but I am still 
seeing the same search speeds once the correct indexes have been generated. But 
it will take a few more years before that starts reaching the 100Gb level :)


--
Lester Caine - G8HFL
-
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Tommy Pham
On Wed, Oct 26, 2011 at 4:14 AM, Lester Caine les...@lsces.co.uk wrote:

 Tommy Pham wrote:


Many of my customers have coming up on 20 years of data available.
 There has
been a debate on transferring historic data to a separate database, but
having it available is not causing a problem, except for some counts
 and
larger search actions, and being able to see how long a client has been
visiting is often useful. Statistical analysis is always done on a
 separate
machine, with a replicated copy of the data, so as not to affect the
 active
users ...


 What kind of counts/filters?  What kind of RAID subsystem is the storage?
 What's the total size of the DB?  Up to 20 years of data should be in the
 peta
 range.  In that peta range, if you're not having performance issue and not
 using
 either RAID 0, 0+1, 10, 50, or 60, I'd love to hear about the application
 and
 database design in details. :)


 We are still only in hundreds on Mb and historic data is has less detail
 than the current 'transactions'. The current postcode table is 500Mb, and
 while the LLPG data would increase that by the order of 100, it's currently
 only restricted to a councils immediate working area, so we keep the problem
 contained. Dropping back to postcode for out of area enquiries. Users
 complain if an enquiry takes more than a few seconds, and Firebird is giving
 me more than adequate performance, and allows shadow data to be created via
 triggers to reduce the need for 'counting'.

 I have a new 'application' which is using the same search criteria but the
 data volume is growing a lot faster, 10Gb on the test system here, but I am
 still seeing the same search speeds once the correct indexes have been
 generated. But it will take a few more years before that starts reaching the
 100Gb level :)


 --
 Lester Caine - G8HFL
 -
 Contact - 
  http://lsces.co.uk/wiki/?page=contact
 L.S.Caine Electronic Services - http://lsces.co.uk
 EnquirySolve - http://enquirysolve.com/
 Model Engineers Digital Workshop - http://medw.co.uk//
 Firebird - 
  http://www.firebirdsql.org/index.php



I'm just curious.  What's the total row count?  Data accumulated over 20
years that only takes that much space doesn't seem like there's a lot going
on each year.  All the DBAs I know deal with a
minimum addition/import of 1 million rows per week, currently.  I didn't
bother asking them how far back they keep the data, as that amount of rows is
too overwhelming for me, for the moment, as I'm not a DBA :)


Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 26, 2011, at 6:23 AM, Tommy Pham wrote:

 On Wed, Oct 26, 2011 at 1:40 AM, Lester Caine les...@lsces.co.uk wrote:
 
 Tommy Pham wrote:
 
 I wonder ... The real question is what's the purpose of the DB?  Is it for
 OLAP
 or OLTP? ;)
 As for dealing with DB having millions of rows, you're crossing over into
 DBA area.
 
 
 Many of my customers have coming up on 20 years of data available. There
 has been a debate on transferring historic data to a separate database, but
 having it available is not causing a problem, except for some counts and
 larger search actions, and being able to see how long a client has been
 visiting is often useful. Statistical analysis is always done on a separate
 machine, with a replicated copy of the data, so as not to affect the active
 users ...
 
 
 What kind of counts/filters?  What kind of RAID subsystem is the storage?
 What's the total size of the DB?  Up to 20 years of data should be in the
 peta range.  In that peta range, if you're not having performance issue and
 not using either RAID 0, 0+1, 10, 50, or 60, I'd love to hear about the
 application and database design in details. :)
 
 
 That said, we are now using the much more detailed LLPG address data rather
 than simple postcode, and that has added another order of magnitude to data
 that is being searched live ...
 
 
 --
 Lester Caine - G8HFL
 -
 Contact - 
 http://lsces.co.uk/wiki/?page=**contacthttp://lsces.co.uk/wiki/?page=contact
 L.S.Caine Electronic Services - http://lsces.co.uk
 EnquirySolve - http://enquirysolve.com/
 Model Engineers Digital Workshop - http://medw.co.uk//
 Firebird - 
 http://www.firebirdsql.org/**index.phphttp://www.firebirdsql.org/index.php
 
 
 Since Jason didn't disclose sufficient information, I suggested that RAID
 storage based on the worst case scenario being this:  as a business analyst,
 I'd want to do drill-downs, filters, counts of a the following for an
 up-coming marketing campaign:
 
 * county(ies)
 * city(ies)
 * zip code(s)
 * address type (business and/or residential)
 * of business (commercial, industrial/manufacturing, etc)
 * of residential ( homes - single/townhouses - or apartments/condos )
 
 The filters and counts will be any combination of the above, i.e.: 5 random zip
 codes within the given state that list all other criteria and breakdowns.
 As Jason mentioned having 89 million rows for a given state, how long would it
 take to run the drill-downs if the DB isn't sitting on a fast storage
 medium?  That 89 million is most likely the average count in the USA.
 For California and New York, the number can easily double.  That's only
 the basic filtering.  What of a specific business industry such as
 landscaping?  What of the filtering by yearly income and/or real estate
 value?  BTW, as a business analyst, I don't want to wait hours for the info
 to update every time I change a criteria/filter to get the counts before I
 look into a few random individual records from the results.

The server that's running it is a home computer with a VPS installed... It's 
not my dev environment :)  

The information being searched is specifically phone numbers, and the bosses 
want to provide the public a way to browse them, hence the pagination... Once I 
removed a COUNT from MySQL it started working a lot better... So I still need to 
provide a better pagination system, but it's working.

Oh, and the 89 million is just for one state :) We are talking the possibility 
of, I believe, 10 billion numbers to cover the US, not to mention Canada, which I 
believe uses the same numbering system as we do, so that could be another 10 
billion... 

 
 As I've mentioned, something of this magnitude is better to leave it to the
 DBA and work together with that DBA.  Either hire/contract one or become one
 :)

I'm working on becoming one ontop of web designer and programmer :)
 
 
 Regards,
 Tommy





Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Jim Giner
Your boss wants to give access to phone numbers to the public in general? 
Then what?

Glad mine's unlisted.






Re: [PHP] Exporting large data from mysql to html using php

2011-10-26 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 26, 2011, at 9:09 PM, Jim Giner wrote:

 Your boss wants to give access to phone numbers to the public in general? 
 Then what?
 
 Glad mine's unlisted.

There's no identifying information on the phone numbers... Simply just the 
phone number... Eventually maybe some info about who owns the phone line... And 
a few other things. 

The rest is covered under an NDA until the site goes live :) but there 
shouldn't be any privacy related issues...






Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jim Giner
I disagree.  It's not about tuning the queries, it is more about the appl. 
design that currently thinks it SHOULD do such huge queries.

My approach would be to prompt the user for filtering criteria that 
automatically would reduce the result set size.  Although at this time I 
believe the OP mentioned that the db is just telephone numbers so that 
doesn't leave much room for filter-criteria. 






Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 25, 2011, at 10:51 AM, Jim Giner wrote:

 I disagree.  It's not about tuning the queries, it is more about the appl. 
 design that currently thinks it SHOULD do such huge queries.
 
 My approach would be to prompt the user for filtering criteria that 
 automatically would reduce the result set size.  Although at this time I 
 believe the OP mentioned that the db is just telephone numbers so that 
 doesn't leave much room for filter-criteria. 
 
 

Yes it is just phone numbers... The only select that I'm running on the entire 
site is related to the pagination... A simple:
$sqlCount = "SELECT COUNT(*) FROM main WHERE state = '{$state}'";

which limits it to everything inside the state... Unfortunately if you look at 
the possibilities, it's still quite a large dataset... 89 million :)

The rest of the queries will be much more limited to areacode, exchange, and in 
some cases the full phone number... Maybe the better way to do it would be not 
to count the records, but set a variable with the total count... That way I 
don't have to load all the data... The data amount won't change a lot... Easy 
enough to set a variable...  Just need to see if I can integrate that with the 
pagination...

Back to the drawing board! :)
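That "set a variable with the total count" idea can be sketched like this (a sketch, not the OP's code: the cache-file path and one-day refresh interval are made-up illustration values; the `main` table and `state` column come from the schema posted in the thread):

```php
<?php
// Sketch: cache the expensive COUNT(*) result and refresh it on a schedule,
// since the data "won't change a lot".  Page views then pay only a file read.
// Assumptions: $pdo is an open PDO connection; cache path and TTL are made up.

function cachedRowCount(PDO $pdo, string $state, string $cacheFile, int $ttl = 86400): int
{
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return (int) file_get_contents($cacheFile);   // cache hit: no COUNT(*) at all
    }
    $stmt = $pdo->prepare("SELECT COUNT(*) FROM main WHERE state = ?");
    $stmt->execute([$state]);
    $count = (int) $stmt->fetchColumn();
    file_put_contents($cacheFile, (string) $count);   // refresh for the next $ttl seconds
    return $count;
}

// The pagination links only need the page total, derived from the cached count:
function totalPages(int $rows, int $perPage): int
{
    return (int) ceil($rows / $perPage);
}
```

A slightly stale count only means the last pagination link can be off by a page, which is usually an acceptable trade for dropping the full scan on every request.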



 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 





Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jim Giner
Again  why even do a detail query?  Nobody is going to examine pages and pages 
and etc.  
Do a summary qry if u just need a count - no pagination there
jg


On Oct 25, 2011, at 6:26 PM, Jason Pruim li...@pruimphotography.com wrote:

 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 25, 2011, at 10:51 AM, Jim Giner wrote:
 
 I disagree.  It's not about tuning the queries, it is more about the appl. 
 design that currently thinks it SHOULD do such huge queries.
 
 My approach would be to prompt the user for filtering criteria that 
 automatically would reduce the result set size.  Although at this time I 
 believe the OP mentioned that the db is just telephone numbers so that 
 doesn't leave much room for filter-criteria. 
 
 
 
 Yes it is just phone numbers... The only select that I'm running on the 
 entire site is related to the pagination... A simple:
 $sqlCount = "SELECT COUNT(*) FROM main WHERE state = '{$state}'";
 
 which limits it to everything inside the state... Unfortunately if you look 
 at the possibilities, it's still quite a large dataset... 89 million :)
 
 The rest of the query's will be much more limited to areacode, exchange, and 
 in some cases the full phone number... Maybe the better way to do it would be 
 not to count the records But set a variable with the total count... That 
 way I don't have to load all the data... The data amount won't change alot... 
 Easy enough to set a variable...  Just need to see if I can integrate that 
 with the pagination...
 
 Back to the drawing board! :)
 
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 
 
 




Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 25, 2011, at 6:35 PM, Jim Giner wrote:

 Again  why even do a detail query?  Nobody is going to examine pages and 
 pages and etc.  
 Do a summary qry if u just need a count - no pagination there
 jg


The bosses wanted them to be able to page through the results... But I might 
bring that up to them again... Working with 89 million records (With the number 
going up and expanding in the future) could cause lots of issues in the long 
run...


 
 On Oct 25, 2011, at 6:26 PM, Jason Pruim li...@pruimphotography.com wrote:
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 25, 2011, at 10:51 AM, Jim Giner wrote:
 
 I disagree.  It's not about tuning the queries, it is more about the 
 appl. 
 design that currently thinks it SHOULD do such huge queries.
 
 My approach would be to prompt the user for filtering criteria that 
 automatically would reduce the result set size.  Although at this time I 
 believe the OP mentioned that the db is just telephone numbers so that 
 doesn't leave much room for filter-criteria. 
 
 
 
 Yes it is just phone numbers... The only select that I'm running on the 
 entire site is related to the pagination... A simple:
  $sqlCount = "SELECT COUNT(*) FROM main WHERE state = '{$state}'";
 
 which limits it to everything inside the state... Unfortunately if you look 
 at the possibilities, it's still quite a large dataset... 89 million :)
 
 The rest of the query's will be much more limited to areacode, exchange, and 
 in some cases the full phone number... Maybe the better way to do it would 
 be not to count the records But set a variable with the total count... 
 That way I don't have to load all the data... The data amount won't change 
 alot... Easy enough to set a variable...  Just need to see if I can 
 integrate that with the pagination...
 
 Back to the drawing board! :)
 
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 
 
 
 
 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 





Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Bastien




On 2011-10-25, at 6:56 PM, Jason Pruim li...@pruimphotography.com wrote:

 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 25, 2011, at 6:35 PM, Jim Giner wrote:
 
 Again  why even do a detail query?  Nobody is going to examine pages and 
 pages and etc.  
 Do a summary qry if u just need a count - no pagination there
 jg
 
 
 The bosses wanted them to be able to page through the results... But I might 
 bring that up to them again... Working with 89 million records (With the 
 number going up and expanding in the future) could cause lots of issues in 
 the long run...

Guarantee the bosses will never use it. If they had to page thru 89m records, 
they'd rebel too







 
 
 
 On Oct 25, 2011, at 6:26 PM, Jason Pruim li...@pruimphotography.com wrote:
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 25, 2011, at 10:51 AM, Jim Giner wrote:
 
 I disagree.  It's not about tuning the queries, it is more about the 
 appl. 
 design that currently thinks it SHOULD do such huge queries.
 
 My approach would be to prompt the user for filtering criteria that 
 automatically would reduce the result set size.  Although at this time I 
 believe the OP mentioned that the db is just telephone numbers so that 
 doesn't leave much room for filter-criteria. 
 
 
 
 Yes it is just phone numbers... The only select that I'm running on the 
 entire site is related to the pagination... A simple:
  $sqlCount = "SELECT COUNT(*) FROM main WHERE state = '{$state}'";
 
 which limits it to everything inside the state... Unfortunately if you look 
 at the possibilities, it's still quite a large dataset... 89 million :)
 
 The rest of the query's will be much more limited to areacode, exchange, 
 and in some cases the full phone number... Maybe the better way to do it 
 would be not to count the records But set a variable with the total 
 count... That way I don't have to load all the data... The data amount 
 won't change alot... Easy enough to set a variable...  Just need to see if 
 I can integrate that with the pagination...
 
 Back to the drawing board! :)
 
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 
 
 
 
 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 

Bastien Koert
905-904-0334



Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jim Giner
David Robley robl...@aapt.net.au wrote in message 
news:49.50.34068.1b567...@pb1.pair.com...

 Consider running EXPLAIN on all your queries to see if there is something
 Mysql thinks could be done to improve performance.


Why do so many responders seem to think the problem here is in the 
preparation of the query?

It is the approach that is the problem.  It needs to be re-addressed, not 
simply tinkered with.  I don't care if someone figures out how to spew out 
89M+ records in 5 seconds flat.  What are you pulling up that many 
detail records at one time for? Who is going to look at them?  You?  If so, 
see my previous post on how long that is going to take you just to examine 
(!) the first million alone.

This problem needs a re-working of the data processing and user interfaces 
to allow for the selection of an appropriate amount of individual records in 
any result set, otherwise simply preparing a summary of some sort. 






Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread David Robley
Jason Pruim wrote:

 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 25, 2011, at 10:51 AM, Jim Giner wrote:
 
 I disagree.  It's not about tuning the queries, it is more about the
 appl. design that currently thinks it SHOULD do such huge queries.
 
 My approach would be to prompt the user for filtering criteria that
 automatically would reduce the result set size.  Although at this time I
 believe the OP mentioned that the db is just telephone numbers so that
 doesn't leave much room for filter-criteria.
 
 
 
 Yes it is just phone numbers... The only select that I'm running on the
 entire site is related to the pagination... A simple: $sqlCount = "SELECT
 COUNT(*) FROM main WHERE state = '{$state}'";
 
 which limits it to everything inside the state... Unfortunately if you
 look at the possibilities, it's still quite a large dataset... 89 million
 :)
 
 The rest of the query's will be much more limited to areacode, exchange,
 and in some cases the full phone number... Maybe the better way to do it
 would be not to count the records But set a variable with the total
 count... That way I don't have to load all the data... The data amount
 won't change alot... Easy enough to set a variable...  Just need to see if
 I can integrate that with the pagination...
 
 Back to the drawing board! :)
 

Consider running EXPLAIN on all your queries to see if there is something
Mysql thinks could be done to improve performance.
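For illustration, a plan can be captured programmatically too (a sketch: SQLite's EXPLAIN QUERY PLAN is used so the snippet is self-contained; MySQL's EXPLAIN returns different columns such as `type`, `key`, and `rows`, but the question is the same: is an index being used, or is every row scanned?):

```php
<?php
// Sketch: dump a query plan so you can see whether an index is used.
// Assumption: SQLite syntax for portability; with MySQL you'd run
// "EXPLAIN $sql" and inspect the `key` and `type` columns instead.

function queryPlan(PDO $pdo, string $sql): string
{
    $rows = $pdo->query("EXPLAIN QUERY PLAN $sql")->fetchAll(PDO::FETCH_ASSOC);
    return implode("\n", array_column($rows, 'detail'));   // human-readable plan lines
}
```

With an index on `state`, the plan for `SELECT * FROM main WHERE state = 'MI'` names that index; a plan reporting a full scan (MySQL `type: ALL`) means all 89 million rows are read on every query.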



Cheers
-- 
David Robley

Why do they put Braille dots on the keypad of the drive-up ATM?
Today is Prickle-Prickle, the 7th day of The Aftermath in the YOLD 3177. 





Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 25, 2011, at 9:58 PM, Jim Giner wrote:

 David Robley robl...@aapt.net.au wrote in message 
 news:49.50.34068.1b567...@pb1.pair.com...
 
 Consider running EXPLAIN on all your queries to see if there is something
 Mysql thinks could be done to improve performance.
 
 
 Why do so many responders seem to think the problem here is in the 
 preparation of the query?
 
 It is the approach that is the problem.  It needs to be re-addressed, not 
 simply tinkered with.  I don't care if someone figures out how to spew out 
 89M+ records in 5 seconds flat.  What are you doing pulling up that many 
 detail records at one time for? Who is going to look at them?  You?  If so, 
 see my previous post on how long that is going to take you just to examine 
 (!) the first million alone.
 
 This problem needs a re-working of the data processing and user interfaces 
 to allow for the selection of an appropriate amount of individual records in 
 any result set, otherwise simply preparing a summary of some sort. 


It turns out the issue was actually in the pagination... I'm reworking the 
whole thing and streamlining it... But in the pagination that I found on the 
internet it used a SELECT COUNT(*) FROM main WHERE state='{$state}'; and the COUNT was 
killing the time... Once that was removed, I was displaying records faster than 
I could imagine... So it's off to pagination land to fix it! And possibly redo 
the entire thing!






Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jim Giner


- Original Message - 
From: Jason Pruim li...@pruimphotography.com

To: Jim Giner jim.gi...@albanyhandball.com
Cc: php-general@lists.php.net
Sent: Tuesday, October 25, 2011 10:06 PM
Subject: Re: [PHP] Exporting large data from mysql to html using php


It turns out the issue was actually in the pagination... I'm reworking the 
whole thing and stream lining it... But in the pagination that I found on 
the internet it used a SELECT COUNT(*) WHERE state='{$state}'; and the 
COUNT was killing the time... Once that was removed, I was displaying 
records faster than I could imagine... So it's off to pagination land to fix 
it! And possibly redo the entire thing!



So you're still going to generate pages and pages of an unmanageable volume 
of data?


Sorry, but I offer my congratulations on producing a technically marvelous 
solution, and my condolences for a completely worthless application.











Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 25, 2011, at 10:10 PM, Jim Giner wrote:

 
 - Original Message - From: Jason Pruim li...@pruimphotography.com
 To: Jim Giner jim.gi...@albanyhandball.com
 Cc: php-general@lists.php.net
 Sent: Tuesday, October 25, 2011 10:06 PM
 Subject: Re: [PHP] Exporting large data from mysql to html using php
 
 
 It turns out the issue was actually in the pagination... I'm reworking the 
 whole thing and stream lining it... But in the pagination that I found on the 
 internet it used a SELECT COUNT(*) WHERE state='{$state}'; and the COUNT 
 was killing the time... Once that was removed, I was displaying records 
 faster than I could imagine... So it's off to pagination land to fix it! And 
 possibly redo the entire thing!
 
 
 So you're still going to generate pages and pages of an un-manageable volume 
 of data?
 
 Sorry, but I offer my congratulations on producing a technically marvelous 
 solution, and my condolences for a completely worthless application.

It's walking the fine line between what I want, and what the people who are 
paying for it want... What I always try and do is give them what they ask for, 
but also give them what they want... because what they ask for != what they 
want MOST of the time...  So most likely I will end up pulling the pagination 
and really limiting the searching... 

But until I convince the right people... I go on as planned.





Re: [PHP] Exporting large data from mysql to html using php

2011-10-25 Thread Tommy Pham
On Tue, Oct 25, 2011 at 7:06 PM, Jason Pruim li...@pruimphotography.comwrote:


 It turns out the issue was actually in the pagination... I'm reworking the
 whole thing and stream lining it... But in the pagination that I found on
 the internet it used a SELECT COUNT(*) WHERE state='{$state}'; and the
 COUNT was killing the time... Once that was removed, I was displaying
 records faster than I could imagine... So it's off to pagination land to fix
 it! And possibly redo the entire thing!


If you're encountering performance issues while doing SELECT COUNT(*), it
sounds like you have a serious disk I/O performance issue.  Is the DB on a RAID
subsystem?  If not, why not? If so, what level?  Also, what type of HDDs?
For something like this, it should be RAID 10 with HDDs spinning at least
7200 RPM (10,000+ RPM recommended), connected to a good RAID controller, like
3ware's.  Also, the controller should be in a PCI-X or, preferably,
PCI-e slot.

Regards,
Tommy


[PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Jason Pruim
Now that I've managed to list 3 separate programming languages and somewhat tie 
it back into php here's the question...

I have about 89 million records in MySQL... the initial load of the page takes 
2 to 3 minutes, I am using pagination, so I have LIMITs on the SQL queries... 
But they just aren't going fast enough...

What I would like to do, is pull the data out of MySQL and store it in the HTML 
files, and then update the HTML files once a day/week/month... I can figure 
most of it out... BUT... How do I automatically link to the individual pages?

I have the site working when you pull it from MySQL... Just the load time 
sucks... Any suggestions on where I can pull some more info from? :)

Thanks in advance!


Jason Pruim
li...@pruimphotography.com
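The once-a-day static export could be sketched along these lines (a sketch, not the OP's NDA-covered implementation: the output directory, the "STATE-N.html" naming scheme, and the page size are invented for illustration; the `main` table and its columns come from the schema posted in the thread; a nightly cron job would call it to regenerate the files):

```php
<?php
// Sketch: walk the table in id order and write one static HTML file per page.
// A nightly cron job would rerun this.  Assumptions: output directory and
// "STATE-N.html" naming are invented; table/columns are from the thread's schema.

function exportState(PDO $pdo, string $state, string $outDir, int $perPage = 100): int
{
    $page = 1;
    $lastId = 0;
    do {
        // Keyset paging (id > last seen) avoids slow deep OFFSETs on 89M rows.
        $stmt = $pdo->prepare(
            "SELECT id, areacode, exchange, subscriber FROM main
              WHERE state = ? AND id > ? ORDER BY id LIMIT " . (int) $perPage);
        $stmt->execute([$state, $lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {
            break;   // ran out of data before filling a page
        }

        $html = "<ul>\n";
        foreach ($rows as $r) {
            $html .= sprintf("<li>(%03d) %03d-%s</li>\n",
                             $r['areacode'], $r['exchange'], $r['subscriber']);
            $lastId = (int) $r['id'];
        }
        // Predictable file names answer the "how do I link the pages" question:
        // page N always links to page N+1.
        $html .= "</ul>\n<a href=\"{$state}-" . ($page + 1) . ".html\">next</a>\n";

        file_put_contents("$outDir/{$state}-{$page}.html", $html);
        $page++;
    } while (count($rows) === $perPage);   // a short page means it was the last one

    return $page - 1;   // number of files written
}
```

The last file still gets a dangling "next" link in this sketch; a real version would track that, and would probably also emit an index page per areacode/exchange.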







Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Shiplu Mokaddim
2-3 minutes is far too long. I think your SELECT query and MySQL schema have 
something that can be improved.

Sent from a handheld device.




Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Bastien


On 2011-10-24, at 8:50 PM, Jason Pruim li...@pruimphotography.com wrote:

 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
 takes 2 to 3 minutes, I am using pagination, so I have LIMIT's on the SQL 
 query's... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the individual 
 pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 

That's a ton of data. So there are a couple of questions:

1. Is the data ordered in any way? 

The issue is that you might need to regenerate the files if new data needs to 
be interspersed.

2. Why is pagination slow? 

A LIMIT with an OFFSET should be very quick if the table is properly indexed. 
Is there any tuning that you can do to further filter the results? Say by date 
or some other criteria? Have you run an EXPLAIN plan on the db to show how the 
queries are being run?

Other thoughts:

- are your indexes up to date? Have you optimized those indexes to be clean, not 
fragmented, and therefore fast?

- can you organize the data by some criteria to better show the results? Like 
by date? Or by starting alpha character? Or by category?

Bastien Koert
905-904-0334
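On the LIMIT/OFFSET point: even with good indexes, a large OFFSET still walks past every skipped row, so late pages get progressively slower. A sketch of the usual alternative, keyset (seek) pagination, keyed on the auto-increment `id` from the thread's schema ($pdo is assumed to be an open PDO connection):

```php
<?php
// Sketch: keyset ("seek") pagination.  Instead of LIMIT 100 OFFSET 8900000,
// remember the last id shown and ask for rows after it; the primary-key
// index jumps straight there, so every page costs about the same.
// Assumption: table `main` with an auto-increment `id`, as posted in the thread.

function fetchPageAfter(PDO $pdo, string $state, int $afterId, int $perPage = 100): array
{
    $stmt = $pdo->prepare(
        "SELECT id, areacode, exchange, subscriber
           FROM main
          WHERE state = ? AND id > ?
          ORDER BY id
          LIMIT " . (int) $perPage);   // cast, not bind: LIMIT params are awkward in some drivers
    $stmt->execute([$state, $afterId]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```

The "next" link then carries the last id of the current page (e.g. `?after=123456`) instead of a page number; the trade-off is that jumping straight to an arbitrary page number is no longer cheap.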








Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Jim Giner
Why would any user need to have access to 89M records?






Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Bastien

On 2011-10-24, at 10:44 PM, Jim Giner jim.gi...@albanyhandball.com wrote:

 Why would any user need to have access to 89M records?
 
 
 
 -- 
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php
 

History or audit trail data? I can think of lots. I know of an app with a 
poorly designed system that easily gets over 240 million records.

Bastien Koert
905-904-0334



Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Jason Pruim




On Oct 24, 2011, at 10:44 PM, Jim Giner wrote:

 Why would any user need to have access to 89M records?

They don't need access to it to edit it... Just to be able to view it... Also, 
it will be expanding in the future to include a lot more data.



Jason Pruim
li...@pruimphotography.com



Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Jason Pruim

Jason Pruim
li...@pruimphotography.com



On Oct 24, 2011, at 9:20 PM, Bastien wrote:

 
 
 On 2011-10-24, at 8:50 PM, Jason Pruim li...@pruimphotography.com wrote:
 
 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
 takes 2 to 3 minutes, I am using pagination, so I have LIMIT's on the SQL 
 query's... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the 
 individual pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 
 
 
 That's a ton of data. So there are a couple of questions:
 
 1. Is the data ordered in any way? 

Data is in numeric order in the database, but I'm not specifying an ORDER BY on the 
SELECT.
 
 The issue is that you might need to regenerate the files if new data needs to 
 be interspersed.

Indexes are freshly done... I might need to tweak them and learn more about 
them... Working on that right now!
 
 2. Why is pagination slow? 

Pagination isn't all that slow... Just the initial load of the site... The rest 
of it works fairly well... But the guys with the paychecks aren't going for a 
2-3 minute initial load time on a public website.
 
 A LIMIT with an OFFSET should be very quick if the table is properly indexed. 
 Is there any tuning that you can do to further filter the results? Say by 
 date or some other criteria? Have you run an EXPLAIN plan on the db to show 
 how the queries are being run?
 
 Other thoughts:
 
 - are your indexes up to date? Have you optimized those index to be clean, 
 not fragmented, and therefore fast?
 

Just generated the indexes so it should be completely perfect.

 - can you organize the data by some criteria to better show the results? Like 
 by date? Or by starting alpha character? Or by category?

Trying to split up the data some... It's phone numbers... So I have the area 
code, exchange, and the subscriber number stored in 3 separate fields... But 
that doesn't seem any faster than when I had it all in one field.
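Splitting the number into three columns only helps if an index covers all three; a composite index matching that lookup pattern might look like this (a sketch only; the index name is an assumption, column names are from the schema posted earlier):

```sql
-- Hypothetical composite index for full-number lookups:
ALTER TABLE main
  ADD INDEX idx_phone (areacode, exchange, subscriber);

-- With the index above, a full-number search is resolved from the index.
-- The leftmost-prefix rule means it also covers areacode-only and
-- areacode+exchange searches:
SELECT id, state
FROM main
WHERE areacode = 616 AND exchange = 555 AND subscriber = '0123';
```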

I think I just need to rethink the design a little bit, and optimize the entire 
app... But I think that will hit tomorrow.

Thanks for your suggestions!
 
 Bastien Koert
 905-904-0334
 
 
 
 
 
 





Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Jim Giner
Yes - but - we're talking about a user-app that the OP is trying to provide 
89M records to.  Sure - some users might have need of looking at even as 
much as a million records IF they were researching something that needed it. 
But - for the 'general' user of an app - I cannot see a need to be providing 
even that much data.  Think about it - you give the user 1M records, how 
long is that user going to page thru the results?  Let's say there are 20 
results on a page and it takes a mere (wow!) 2 seconds to scan thru them 
looking for something apparently very obvious.  That's 600 results viewed 
per minute at best.  Six hundred into a million is 1,666 minutes, which is 27+ 
hours.  Even at 1 second per page view, it's still more time than a 
normal work day.  And you want to come up with a way to feed him 89M?

The problem needs to be looked at differently - put together a 
business-smart solution to finding the  needle in this haystack of  89M 
pieces of straw and only then apply technology to it.






Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Bastien


On 2011-10-24, at 11:26 PM, Jim Giner jim.gi...@albanyhandball.com wrote:

 Yes - but - we're talking about a user-app that the OP is trying to provide 
 89M records to.  Sure - some users might have need of looking at even as 
 much as a million records IF they were researching something that needed it. 
 But - for the 'general' user of an app - I cannot see a need to be providing 
 even that much data.  Think about it - you give the user 1M records, how 
 long is that user going to page thru the results?  Let's say there are 20 
 results on a page and it takes a mere (wow!) 2 seconds to scan thru them 
 looking for something apparently very obvious.  That's 600 results viewed 
 per minute at best.  Six hundred into a million is 1,666 minutes, which is 27+ 
 hours.  Even at 1 second per page view, it's still more time than a 
 normal work day.  And you want to come up with a way to feed him 89M?
 
 The problem needs to be looked at differently - put together a 
 business-smart solution to finding the  needle in this haystack of  89M 
 pieces of straw and only then apply technology to it.
 
 
 
 

I agree, Jim, which is why I was asking about tuning the queries. Jason 
already indicated he was using limits and offsets, so it was other avenues I was 
looking for.





Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread Bastien

On 2011-10-24, at 11:14 PM, Jason Pruim li...@pruimphotography.com wrote:

 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 On Oct 24, 2011, at 9:20 PM, Bastien wrote:
 
 
 
 On 2011-10-24, at 8:50 PM, Jason Pruim li...@pruimphotography.com wrote:
 
 Now that I've managed to list 3 separate programming languages and somewhat 
 tie it back into php here's the question...
 
 I have about 89 million records in mysql... the initial load of the page 
 takes 2 to 3 minutes, I am using pagination, so I have LIMIT's on the SQL 
 query's... But they just aren't going fast enough...
 
 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the 
 individual pages?
 
 I have the site working when you pull it from MySQL... Just the load time 
 sucks... Any suggestions on where I can pull some more info from? :)
 
 Thanks in advance!
 
 
 Jason Pruim
 li...@pruimphotography.com
 
 
 
 
 
 
 That's a ton of data. So there are a couple of questions:
 
 1. Is the data ordered in any way? 
 
 Data is in numeric order in the database, but not specifing a ORDER BY on the 
 SELECT..
 
 The issue is that you might need to regenerate the files if new data needs 
 to be interspersed.
 
 Indexes are freshly done... I might need to tweak them and learn more about 
 them... Working on that right now!
 
 2. Why is pagination slow? 
 
 Pagination isn't all that slow... Just the initial load of the site... The 
 rest of it works fairly well... But the guys with the paychecks aren't going 
 for a 2-3 minute initial load time on a public website
 
 A LIMIT with an OFFSET should be very quick if the table is properly 
 indexed. Is there any tuning that you can do to further filter the results? 
 Say by date or some other criteria? Have you run an EXPLAIN plan on the db 
 to show how the queries are being run?
 
 Other thoughts:
 
 - are your indexes up to date? Have you optimized those index to be clean, 
 not fragmented, and therefore fast?
 
 
 Just generated the indexes so it should be completely perfect.
 
 - can you organize the data by some criteria to better show the results? 
 Like by date? Or by starting alpha character? Or by category?
 
 Trying to split up the data some... It's phone numbers... So I have the area 
 code, exchange, and the subscriber number stored in 3 separate fields... But 
 that doesn't seem any faster then when I had it all in one field
 
 I think I just need to rethink the design a little bit, and optimize the 
 entire app... But I think that will hit tomorrow.
 
 Thanks for your suggestions!
 
 Bastien Koert
 905-904-0334
 
 
 
 
 
 
 

My preference would be to have that as one numeric field (BIGINT) and search 
on that. DBs like numbers and should be able to return the data very quickly.
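That single-BIGINT idea could be sketched like this against the thread's `main` table (the `phone` column and index name are assumptions; the arithmetic packs a 3-digit area code, 3-digit exchange, and 4-digit subscriber into one integer):

```sql
-- Hypothetical single-column representation of the phone number:
ALTER TABLE main ADD COLUMN phone BIGINT UNSIGNED;

-- areacode * 10^7 + exchange * 10^4 + subscriber,
-- e.g. 616 / 555 / 0123 becomes 6165550123:
UPDATE main
   SET phone = areacode * 10000000
             + exchange * 10000
             + CAST(subscriber AS UNSIGNED);

ALTER TABLE main ADD INDEX idx_phone_num (phone);

-- Lookups then hit a single integer index:
SELECT id, state FROM main WHERE phone = 6165550123;
```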

As for the app start, can you present a tiny (say 10 rows) subset that you 
could read in from a cache? Perhaps the 10 most commonly searched or the last 10 
used? You could cache it for an hour and then refresh it. The same thing applies if 
the start is like a dashboard with various reports: cache out what you can and 
hit the DB for the bare minimum to keep people happy.

Then your app should be right quick to start.

Bastien Koert
905-904-0334
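Bastien's cache-the-front-page idea might be sketched like this (the cache path, TTL, and row source are all assumptions for illustration; `fetchFrontPageRows()` stands in for the real `SELECT ... LIMIT 10` query):

```php
<?php
// Hour-long file cache for the landing-page rows, as suggested above.

function fetchFrontPageRows(): array
{
    // Stand-in for the real query, e.g.
    // SELECT areacode, exchange, subscriber FROM main ORDER BY id LIMIT 10
    return [
        ['areacode' => 616, 'exchange' => 555, 'subscriber' => '0100'],
        ['areacode' => 616, 'exchange' => 555, 'subscriber' => '0101'],
    ];
}

function frontPageHtml(string $cacheFile, int $ttl = 3600): string
{
    // Serve the cached fragment while it is fresh...
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    // ...otherwise rebuild it from the data source and re-cache.
    $html = '';
    foreach (fetchFrontPageRows() as $row) {
        $html .= sprintf("<tr><td>(%03d) %03d-%s</td></tr>\n",
                         $row['areacode'], $row['exchange'], $row['subscriber']);
    }
    file_put_contents($cacheFile, $html);
    return $html;
}

echo frontPageHtml(sys_get_temp_dir() . '/front_page.html');
```

The same pattern extends to each dashboard report: one cache file per fragment, rebuilt on expiry, with only the uncached bits hitting the DB.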





Re: [PHP] Exporting large data from mysql to html using php

2011-10-24 Thread tamouse mailing lists
On Mon, Oct 24, 2011 at 7:50 PM, Jason Pruim li...@pruimphotography.com wrote:
 I have about 89 million records in mysql... the initial load of the page 
 takes 2 to 3 minutes, I am using pagination, so I have LIMIT's on the SQL 
 query's... But they just aren't going fast enough...

 What I would like to do, is pull the data out of MySQL and store it in the 
 HTML files, and then update the HTML files once a day/week/month... I can 
 figure most of it out... BUT... How do I automatically link to the individual 
 pages?

Does this help?

http://peeling.hubpages.com/hub/Generate_Static_HTML_Pages_From_A_Database
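The linked approach — regenerating static pages on a schedule — might be sketched like this (a minimal illustration; the output path, page size, and row source are assumptions, and the real version would pull rows from `main` instead of the placeholder):

```php
<?php
// Nightly static-page generator, run from cron. Each page links to its
// neighbours, so browsing the generated pages needs no PHP or MySQL.

$perPage = 20;
$total   = 100;                // stand-in for SELECT COUNT(*) FROM main
$pages   = (int) ceil($total / $perPage);

for ($p = 1; $p <= $pages; $p++) {
    // Stand-in for: SELECT ... FROM main ORDER BY id LIMIT 20 OFFSET ...
    $rows = "<!-- rows for page $p go here -->\n";

    // Prev/next links answer the OP's "how do I link the pages" question:
    $nav = ($p > 1      ? sprintf('<a href="page%d.html">prev</a> ', $p - 1) : '')
         . ($p < $pages ? sprintf('<a href="page%d.html">next</a>',  $p + 1) : '');

    file_put_contents(sys_get_temp_dir() . "/page$p.html", $rows . $nav);
}
```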
