Re: [PHP] Includes eating up my time [SOLVED]
David,

Thank you for responding.

> __autoload ... which basically only loads classes when they are
> required.

Yes, this is exactly what I needed. Not only was I already on PHP 5, but fortunately I had also already been building my classes so that it was one class per file, as __autoload requires. So this was possibly the easiest and most effective change I've ever implemented.

And the results were impressive. The Zend profiler reported that I went from 500 to 800 milliseconds per page request down to 40 to 60 milliseconds. Occasional database connections raised the time now and again, but I think I have a handle on that.

Thank you for pointing that out to me. Just the right solution.

--
Dave M G
Ubuntu Feisty 7.04
Kernel 2.6.20-16-386

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
[PHP] Includes eating up my time
PHP general list,

This is probably obvious to people who are good at PHP, but I'll ask anyway.

I have a PHP based CMS (content management system) built, which has grown and become quite robust. It's now spread out over about 30 files, and each file represents one class within the object oriented design. Each is a couple hundred lines of code, and two or three of the most critical classes are over a thousand lines of code.

While first building it, I didn't really anticipate quite that many files. What I did was have a file called includes.php, which lists all the files to be included. Then I in turn included includes.php at the beginning of my index.php file. Every page request passes through the index.php file, so that basically means every single file is included at the start of every new page request.

I'm using Zend Studio, which has a profile option that shows how long it takes for my PHP scripts to complete a request. It has a breakdown showing percentages of which scripts are using that processing time.

Currently, my processes are taking under a second, but they can be around half a second or more. Although it all happens too fast for me to really notice as a person, it seems to me that a half second of processing time might be kind of long and lead to scalability problems.

My first question is: Is a half second too long? I'm pretty sure it is, but maybe I'm just being paranoid. What do people consider to be acceptable time frames for processing a web page similar to what Wikipedia delivers?

Most of the time is taken up with the includes. Anywhere from 60% to 90% of the time it takes to process my scripts is coming from the includes.php file. I read somewhere that it's not a good idea to have more than 10 includes in any one place. I'm fine with trying to break up my include requests, but I'm unsure as to how.
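For concreteness, the arrangement described above might look something like this (the file and class names are invented for illustration; only the pattern matters):

```php
<?php
// includes.php -- hypothetical reconstruction of the pattern described:
// every class file is pulled in up front, on every request, whether or
// not this particular page uses it.
require_once 'classes/Database.php';
require_once 'classes/Template.php';
require_once 'classes/User.php';
// ... and so on for all ~30 class files ...

// index.php then begins with:
// require_once 'includes.php';
```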
As each function in each class passes around objects, it's not clear from looking at the code which ones are used at any one time, so I'm unsure how to efficiently include only the necessary classes.

My second question is: Is there a systematic way of determining how to incrementally include only the files that are actually used? Or is it just a constant process of testing and checking?

Thank you for any advice.

--
Dave M G
Ubuntu Feisty 7.04
Kernel 2.6.20-16-386
Re: [PHP] Includes eating up my time
Hi Dave,

> I have a PHP based CMS (content management system) built, which has
> grown and become quite robust. It's now spread out over about 30
> files, and each file represents one class within the object oriented
> design. Each is a couple hundred lines of code, and two or three of
> the most critical classes are over a thousand lines of code.

It may just be a 'dave' thing but my stuff does that too :-)

> While first building it, I didn't really anticipate quite that many
> files. What I did was have a file called includes.php, which lists
> all the files to be included. Then I in turn included includes.php at
> the beginning of my index.php file. Every page request passes through
> the index.php file, so that basically means every single file is
> included at the start of every new page request.

Ahh the 'dave' thing returns :-)

> I'm using Zend Studio, which has a profile option, which shows how
> long it takes for my PHP scripts to complete a request. It has a
> breakdown showing percentages of which scripts are using that
> processing time.

Real daves use vi & top :-P

> My second question is: Is there a systematic way of determining how
> to incrementally include files that people use? Or is it just a
> constant process of testing and checking? Thank you for any advice.

What version of PHP are you using? I moved over to 5 and it has a nice cute feature:

http://uk3.php.net/__autoload

which basically only loads classes when they are required. PHP 5 has some other nice to have features and will cause you a bit of work if you migrate to it BUT it is worth the effort especially in the light of 888 :-)

As for your performance problems, the times are heavily dependent on the hardware, underlying OS, server load etc. Without further info, diagnosing problems will be difficult.
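[Editor's sketch of the on-demand loading Dave describes. The manual page above documents PHP 5's `__autoload()` hook; this example uses `spl_autoload_register()` from the same SPL toolbox, which behaves the same way but lets several loaders coexist. The `Page` class and the temp-directory layout are invented so the example runs standalone, under the one-class-per-file convention.]

```php
<?php
// Create a throwaway class file so the sketch is self-contained.
// (In the real CMS these files already exist, one class per file.)
$classDir = sys_get_temp_dir();
file_put_contents($classDir . '/Page.php',
    '<?php class Page { public function title() { return "Home"; } }');

// Register a loader: it runs only when an undefined class is first
// referenced, so nothing is parsed at startup.
spl_autoload_register(function ($className) {
    // One class per file, file named after the class.
    $file = sys_get_temp_dir() . '/' . $className . '.php';
    if (is_file($file)) {
        require_once $file;
    }
});

$page = new Page();          // the loader fires here, on first use
echo $page->title(), "\n";   // prints "Home"
```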
TTFN

Dave

+----------------------------------------------------------------------+
| Dave Restall, Computer Nerd, Cyclist, Radio Amateur G4FCU, Bodger    |
| Mob +44 (0) 7973 831245   Skype: dave.restall   Radio: G4FCU         |
| email : [EMAIL PROTECTED]   Web : Not Ready Yet :-(                  |
+----------------------------------------------------------------------+
| Big book, big bore.                                                  |
|                                                   -- Callimachus     |
+----------------------------------------------------------------------+
Re: [PHP] Includes eating up my time
Dave M G wrote:

> Currently, my processes are taking under a second, but they can be
> around half a second or more. Although it all happens too fast for me
> to really notice as a person, it seems to me that a half second of
> processing time might be kind of long and lead to scalability
> problems.

That's hardly the worst performance I've seen from a CMS, but you should know that nearly all CMS systems are slow, many slower than this, for similar reasons. The solution is usually to build a front-end cache, either in the CMS itself or using an external tool. For instance, MODx caches internally, while others rely on Apache/Enfold/etc.

> My first question is: Is a half second too long? I'm pretty sure it
> is, but maybe I'm just being paranoid. What do people consider to be
> acceptable time frames for processing a web page similar to what
> Wikipedia delivers?

When you quote Wikipedia, you do realize that they're not a CMS, right, that they're a Wiki? There are some subtle differences. I haven't looked at Wikipedia's Wiki code (I like TWiki) but the Wikis I've used don't actually use a database or a billion classes to get their work done. They're more focused on editing an entire page of static content, which is stored on disk (and thus directly accessible by the server).

If you want that kind of scalability you also MUST implement some sort of caching. PHP is a scripting language, and no scripting language will ever keep up with compiled code, no matter how good (and PHP is good). You might also consider looking at the Zend Optimizer - I've never tried it, but have heard good things.

> My second question is: Is there a systematic way of determining how
> to incrementally include files that people use? Or is it just a
> constant process of testing and checking?

PHP does have an auto-include system called the autoloader. We use this heavily in Blackbird ESB to load classes on the fly when they're referenced. It only works for loading classes, but since you say that's what you have...
Take a look here: http://us.php.net/autoload

Regards,
Chad
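[Editor's sketch of the front-end page cache suggested above: serve a stored copy of the rendered page when a fresh one exists, and only run the full CMS when it does not. The cache path, the five-minute freshness window, and the "rendered page" placeholder are all invented for illustration.]

```php
<?php
// Derive a cache file name from the request URI (defaulting to "/" so
// the sketch also runs from the command line).
$uri       = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
$cacheFile = sys_get_temp_dir() . '/page_' . md5($uri) . '.html';
$maxAge    = 300;   // seconds a cached copy stays fresh (assumed)

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    // Cache hit: no classes loaded, no database touched.
    readfile($cacheFile);
} else {
    // Cache miss: render the page normally, but keep a copy for next time.
    ob_start();
    echo "<html><body>expensively rendered page</body></html>\n";
    $html = ob_get_flush();               // sends output and returns it
    file_put_contents($cacheFile, $html);
}
```

A real CMS would also invalidate (delete) the cache file whenever the underlying content is edited, rather than relying on the age window alone.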
Re: [PHP] Includes eating up my time
On Tuesday 31 July 2007, Dave M G wrote:

> I have a PHP based CMS (content management system) built, which has
> grown and become quite robust. It's now spread out over about 30
> files, and each file represents one class within the object oriented
> design. Each is a couple hundred lines of code, and two or three of
> the most critical classes are over a thousand lines of code.
>
> While first building it, I didn't really anticipate quite that many
> files. What I did was have a file called includes.php, which lists
> all the files to be included. Then I in turn included includes.php at
> the beginning of my index.php file. Every page request passes through
> the index.php file, so that basically means every single file is
> included at the start of every new page request.

Yep, that's the downside of a shared-nothing architecture. Initialization gets slower. Possible solutions include:

- If you're using all classes, use PHP 5's __autoload() or better still spl_autoload_register() to load classes on demand instead of all at once.

- Refactor your code to conditionally include code only when needed. E.g., you probably only need one page handler loaded per page request. I'm in the process of doing that for Drupal right now and the savings are quite substantial.

- Op code cache. This is exactly where an op code cache will get you the biggest win, by saving you the loading and parsing time.

- Page caching. To do page caching best, you should have the system do a partial bootstrap, check to see if it can serve a page from the cache, do so if it can, and if it can't, only then finish loading the rest of the system. That way you skip most of the loading process on cached requests.

- Some combination of the above.

I'd do the op code cache last, as that's a sure-fire easy win while the others take effort. So do those first, and then whatever's left you can throw an op code cache at for an extra boost.
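[Editor's sketch of the "one page handler per request" refactoring suggested above: map the request to a single handler file and include only that one, so the other handlers are never parsed. The handler names and `handle()` function are invented, and the files are generated here so the example runs standalone; the whitelist also keeps request data from selecting arbitrary paths.]

```php
<?php
// Generate two throwaway handler files so the sketch is self-contained.
$dir = sys_get_temp_dir();
file_put_contents($dir . '/article.php',
    '<?php function handle() { return "article page"; }');
file_put_contents($dir . '/login.php',
    '<?php function handle() { return "login page"; }');

// Whitelist of known pages -> handler files. Never build the include
// path directly from user input.
$handlers = array(
    'article' => $dir . '/article.php',
    'login'   => $dir . '/login.php',
);

$page = isset($_GET['page']) ? $_GET['page'] : 'article';
if (!isset($handlers[$page])) {
    $page = 'article';           // fall back to a known handler
}

require_once $handlers[$page];   // only this one file is parsed
echo handle(), "\n";             // prints "article page" by default
```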
--
Larry Garfield
AIM: LOLG42   [EMAIL PROTECTED]   ICQ: 6817012

"If nature has made any one thing less susceptible than all others of exclusive property, it is the action of the thinking power called an idea, which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the receiver cannot dispossess himself of it." -- Thomas Jefferson