The vast majority of requests hitting the web servers are simple page views, either from non-logged-in users or from logged-in users who are just reading articles. It occurs to me that all of these pages can be generated from [header] + [user toolbar] + [new messages] + [content] + [footer]. Everything except the user toolbar and the new-messages notice can be trivially cached, assuming we implement a cache large enough to store the HTML-rendered version of every article (which shouldn't be all that large).

We can then intercept the HTTP request and send all simple page views to a cache server cluster that is separate from the rest of the site. Besides significantly speeding up normal page views, this would let us serve reads even when editing is not possible (for example, when the backend or database is down). We'd need to update the cached interface (skin) HTML on MediaWiki message changes, and maybe move some more preferences into CSS, but that shouldn't be too hard.
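A minimal sketch of the idea in Python, assuming an in-process dict stands in for the cache cluster (memcached or similar in a real deployment). All names here (`build_page`, `render_article`, `is_simple_view`) are hypothetical illustrations, not MediaWiki APIs:

```python
# Hypothetical fragment store: a dict stands in for the cache cluster.
rendered_cache = {}

def render_article(title):
    # Stand-in for the expensive parse/render path; only hit on a miss.
    return f"<div id='content'>Article: {title}</div>"

def get_cached_content(title):
    # Keep the HTML-rendered version of every article cached.
    if title not in rendered_cache:
        rendered_cache[title] = render_article(title)
    return rendered_cache[title]

HEADER = "<div id='header'>site header</div>"  # cacheable
FOOTER = "<div id='footer'>site footer</div>"  # cacheable

def build_page(title, user=None, new_messages=False):
    # Only the user toolbar and new-messages notice vary per request;
    # everything else comes straight from the cache.
    toolbar = f"<div id='toolbar'>{user}</div>" if user else ""
    notice = ("<div id='messages'>You have new messages</div>"
              if new_messages else "")
    return HEADER + toolbar + notice + get_cached_content(title) + FOOTER

def is_simple_view(method, cookies):
    # Routing rule: anonymous GETs can be answered by the cache
    # cluster; anything else goes to the full application servers.
    return method == "GET" and "session" not in cookies
```

Under this routing rule, anonymous reads never touch the application servers at all, which is what would let the site keep serving pages while the backend or database is down.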