Let me start by saying I am not a server management or Apache guru. I've been working with Apache on my own servers for about a decade now, but I am not a guru. I'm the guy who wears many hats: the graphics hat, the CSS hat, the programmer hat, the SEO hat. I'd consider myself guru quality in SEO and in general Internet business management, but in the rest I'm just a jack of all trades.
So, anyway, I've struggled for many years to get my literature site to run smoothly. I think there are three articles on this site about caching PHP pages that were born out of that need. The site is my oldest, at 10 years, and very popular, with unique monthly visitors measured in the millions, but it is also a beast to run.
Currently it is on a pretty high-end dual dual-core server (that's 4 total cores, for you math majors) with 12 GB of RAM, and it was still having problems. All the heavy-hitting non-forum, non-search pages were cached as plain HTML, and it was still having problems; then I cached all the deep content pages as well, and it was still having problems.
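For anyone curious, the "cached as plain HTML" part is nothing fancy: roughly speaking, a rewrite rule checks whether a pre-generated HTML copy of a page exists and serves that instead of hitting PHP at all. This is just a sketch in .htaccess style; the /cache directory and the .html naming scheme are placeholders, not my actual setup:

    # Serve a pre-built static copy if one exists (sketch only)
    RewriteEngine On
    RewriteCond %{REQUEST_METHOD} GET
    RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI}.html -f
    RewriteRule ^(.*)$ /cache/$1.html [L]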
A while back I had set it to use Apache's worker MPM, as opposed to prefork, because I had heard it was better; I believe I was told so in a server optimization thread on the vbulletin.com forums. Regardless, still problems. Though I'm not entirely sure, because the new configuration layout for Apache 2.2 on cPanel servers is a little more confusing, with the config files all spread out and the need to distill them and whatnot (sorry if you have no idea what I mean with that last line).
Now, PHP was compiled as an Apache module. I've done it that way, and requested it that way, for years, going back to when I was first doing search engine friendly URLs (and popularizing the practice through my articles about it on SitePoint). I always thought PHP as CGI ran slowly, and more specifically that it had a bug where a few of the search engine friendly URL methods would fail. That was in 2001, and in the intervening years no one had bothered to change my mind; because I'm not an Apache guru, I don't keep up on developments unless I have a problem I need to solve.
Then last Thursday and Friday nights my site started crashing every 2 hours. Looking into the log files I couldn't detect anything weird, except that I would get a warning about MaxClients being reached and then it would crash. The thing is, though, that on Fridays the site gets about half as much traffic as on any other day of the week, so it made no sense.
Looking at current activity, it seemed like MSN and Yahoo (why do they both still need to crawl?) and an obnoxious VoilaBot were all crawling my site at the same time. That might have done it, but I banned VoilaBot and throttled the other two (see the snippet below), and it was still crashing every two hours.
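For what it's worth, the ban was just a user-agent match in Apache, and the throttling was a Crawl-delay line in robots.txt, which Yahoo's and MSN's crawlers honored at the time. A rough sketch of the ban, assuming Apache 2.2 syntax in an .htaccess or Directory context:

    # Tag requests from the offending bot and refuse them (Apache 2.2 syntax)
    SetEnvIfNoCase User-Agent "VoilaBot" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot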
I still don't know for sure what was causing the problems, but in my search for a solution I came upon some information about when to use Apache's worker MPM and when to use prefork: how worker really isn't helpful when PHP is compiled as an Apache module, and how running PHP as CGI, specifically FastCGI (fcgi), is better in a multiprocessor environment such as mine. The reasons are a little more technical than I want to go into, but maybe someone who IS an Apache guru will comment and explain; I don't want to try because I'm afraid I'll get it wrong.
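The end result, in Apache terms, looks roughly like the sketch below: worker handles connections with a few multithreaded processes, while mod_fcgid keeps a separate pool of PHP processes. This is only an illustration; the numbers, the wrapper path, and even the directive names (older mod_fcgid versions spell them without the Fcgid prefix) vary by setup, and on cPanel servers EasyApache writes most of this for you.

    # Worker MPM: a few processes, many threads each (numbers are illustrative)
    <IfModule worker.c>
        StartServers          2
        MaxClients          150
        MinSpareThreads      25
        MaxSpareThreads      75
        ThreadsPerChild      25
        MaxRequestsPerChild   0
    </IfModule>

    # Hand .php requests to a separate pool of FastCGI processes
    <IfModule mod_fcgid.c>
        AddHandler fcgid-script .php
        # Wrapper path is a guess; cPanel configures its own wrapper script
        FcgidWrapper /usr/local/cpanel/cgi-sys/php5 .php
        FcgidMaxProcesses           20
        # Keep this at or below PHP's PHP_FCGI_MAX_REQUESTS (default 500),
        # or PHP can exit mid-request and leave zombie processes behind
        FcgidMaxRequestsPerProcess 500
    </IfModule>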
So I loaded up cPanel's EasyApache (its Apache configuration and upgrade utility), which is wonderfully easy to use (cPanel is so much better than other server management software I've tried), and made the selections.
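If you'd rather not click through WHM, the same tool can be run from the shell; on cPanel servers the script normally lives at:

    /scripts/easyapache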
Now my server runs differently. Instead of PHP living inside Apache, it exists apart from it, and so when I check current processes and performance data I can see PHP's usage outside of Apache (it turns out PHP must have been the lion's share of Apache's resource usage). This, I believe, lets me monitor things better.
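By "check current processes" I just mean the usual tools; the exact PHP process name (php, php-cgi, php5) depends on the build, so treat these as examples:

    top -c                     # watch Apache and PHP usage side by side
    ps aux | grep "[h]ttpd"    # the Apache worker processes
    ps aux | grep "[p]hp"      # the FastCGI PHP processes, now listed separately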
But more importantly, the site is as fast as I ever remember it being, like wicked fast. The change was immediate, and awesome. The crashes stopped.
Traffic increased 15%; that's right, roughly 15% more visitors and page views a day. It is not an apples-to-apples comparison, because February 22nd is not exactly February 18th, but my site is typically very consistent (and the sample size is large), and I can see no other reason for the higher traffic plateau I've seen this week. I also noticed that the number of active forum users at any one time has increased (a faster forum means more engaged users). Additionally, of course, over time more traffic will beget even more traffic, the way compounding interest works on your bank account.
The server isn't even crashing in the middle of the night when it does the MySQL backups, which are huge and which used to almost always cause a 10-minute "crash" (not really a crash, but unresponsiveness).
This setup is definitely working for me. So, if you’ve got a similar problem, try giving it a go.
February 26th, 2010 at 2:31 am
The problem is that cPanel defaults to running PHP as CGI, which gives poor performance and prevents you from using things like eAccelerator, APC, xcache, etc. The reason it does this is to allow better privilege separation (PHP code runs as the user who owns the domain, rather than as the web server user).
On a multi-user server, this is a good precaution, but it's unnecessary when you're only using the server to host your own sites. In these cases it's probably best just to use PHP via the DSO module in Apache; you can do this via the easyapache command line script.
March 1st, 2010 at 9:43 pm
I'd take a look at alternative web servers as well: lighttpd, Cherokee. Very fast, with a light memory footprint compared with Apache.
September 28th, 2010 at 10:02 am
This post made me take the risk of switching to mod_fcgid/MPM worker, and I couldn't be happier. Load decreased significantly and crashes seem to be a problem of the past. The configuration was a breeze, nothing really hard, although documentation is scarce on the Net.
Still, I'm having minor problems when zombie processes get killed; it seems PHP stops serving when this happens. Still looking for an answer to that, but it's just a minor glitch.
thanks from another Jack-of-all-trades