Are you saying that just taking the source of a rendered page and maybe making slight changes would not work?
The objective would be to render static pages to reduce server load (PHP and MySQL, specifically), thereby increasing the speed at which Apache delivers a page.
I’m not complaining about EE’s ability to cache. It’s awesome. But a Digg, Slashdot, or Fireball hit tends to slow down many servers where the site is served dynamically (Apache, PHP, MySQL).
All spiders and search engines (as well as the Wayback Machine, etc.) effectively capture the rendered page, which seems like what you want to do…
Yeah, the static page should have the same URL as the dynamic page, hence a solution probably mucks with .htaccess, too.
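For what it’s worth, the .htaccess side of that could look something like the sketch below, assuming the cached copies are written under a /cache directory that mirrors the URL structure. The /cache layout and file naming here are my own assumptions, not anything SolSpace or EE prescribes:

```apache
RewriteEngine On

# Serve a pre-rendered static copy if one exists, but only for plain
# GET requests with no query string (anything else may vary per request).
RewriteCond %{REQUEST_METHOD} =GET
RewriteCond %{QUERY_STRING} ^$
RewriteCond %{DOCUMENT_ROOT}/cache/$1/index.html -f
RewriteRule ^(.*)$ /cache/$1/index.html [L]

# Everything else falls through to the dynamic PHP front controller.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ /index.php/$1 [L]
```

Because the rewrite happens server-side, visitors keep hitting the same URL whether they get the static snapshot or the dynamic page.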
My basic advice, based on a very recent experience, is that upgrading the server and serving dynamically is the better way to go. Any site with traffic that heavy should be able to afford the basic upgrades in service!
Life should be that easy.
The problem has more to do with spikes in traffic (as above) and how to handle them without bringing a site down. WordPress sites have this problem regularly (Dugg, Slashdotted, Fireballed), but a couple of very good caching plugins that render content as static files work wonders to mitigate frequent traffic spikes.
In my case, I am spending about 20% more per month, and my page load times went down by a factor of 5 to 10. Sure, static pages MIGHT be quicker, but I actually have some relatively static pages that are still slow, since they pull in Google ads, a Google search box, etc.
Yeah, I avoid the hosts that feature unlimited storage and unlimited bandwidth for $4.99 a month. But even more robust hardware can run into trouble when traffic increases by an order of magnitude or two.
I think throwing hardware and optimization at it is the best deal going…
Sys admins probably love that solution. Average EE clients don’t share the same sentiment. If the traffic is constantly high then the entire business model changes, which justifies a more robust hardware solution. It’s the unpredictable spikes I seek to conquer.
SolSpace’s Static Page Caching seems to be going in the right direction, though I see some limitations in workflow management (it needs to be automated so static pages are produced for the dynamic pages that start to take a heavier hit load).
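That automation could be as simple as a cron job that watches recent hit counts and snapshots the hottest URLs to disk. A rough Python sketch of the idea, where the threshold, paths, and URL names are all hypothetical placeholders:

```python
import os
import urllib.request

CACHE_ROOT = "cache"   # assumed docroot-relative directory Apache serves from
HIT_THRESHOLD = 500    # hits per interval before a page earns a static copy

def pages_to_cache(hit_counts, threshold=HIT_THRESHOLD):
    """Return URL paths whose recent hit count crosses the threshold."""
    return sorted(path for path, hits in hit_counts.items() if hits >= threshold)

def snapshot(base_url, path, cache_root=CACHE_ROOT):
    """Fetch the rendered dynamic page and write it where Apache can
    serve it directly as a static file."""
    html = urllib.request.urlopen(base_url + path).read()
    target_dir = os.path.join(cache_root, path.strip("/"))
    os.makedirs(target_dir, exist_ok=True)
    with open(os.path.join(target_dir, "index.html"), "wb") as f:
        f.write(html)

if __name__ == "__main__":
    # Hit counts would come from the access log or from EE itself;
    # faked here for illustration.
    counts = {"/blog/entry-1": 1200, "/about": 40, "/blog/entry-2": 800}
    print(pages_to_cache(counts))
```

Run on a short cron interval, something like this would let static copies appear only for pages that are actually getting hammered, which is exactly the spike case, while low-traffic pages stay fully dynamic.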