Back to the future? At first sight, anyone can be forgiven for thinking that website generation is going backwards, with static site generators becoming more and more popular. This is no webmaster's whimsy: big companies and institutions are favoring the static route over the currently more popular dynamic, database-driven options.
Size is not a restriction: large websites such as healthcare.gov have been built with site generators, and even Google serves some product pages statically. So are we returning to the 1990s, the beginning of the web?
Since the inception of the web, browsers have matured into not just applications but application environments in their own right. Network speed and technology have improved considerably. The one major irony is that websites in general, or more specifically how they are generated, have not kept pace.
The speed with which a site can be rendered by a browser depends essentially on the resources of the server and the speed of the network. Even on a modest network, the brunt of the workload falls on the site server. If speed is a consideration, and it is, it therefore makes sense to make page delivery as efficient as possible.
A significant number of modern sites are generated by popular Content Management Systems (CMSs such as WordPress, Joomla, MODX and Drupal). These applications are very similar in how they function. When a page is requested, the site server passes the request on for processing, which inevitably means extracting content from a data store (a database such as MySQL or Postgres). The data is extracted, composed into a page and sent off to the browser. That procedure is repeated for every single page request, in real time.
That's the theory; in practice the request -> database -> compile cycle proves very expensive in server resources, and there is a significant time penalty between request and page serving.
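The per-request cycle can be sketched in a few lines. This is an illustrative toy, not any particular CMS's code; the function names and the in-memory SQLite store stand in for a real database such as MySQL or Postgres.

```python
import sqlite3

def setup_demo_store():
    """In-memory stand-in for the CMS database."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE pages (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
    db.execute("INSERT INTO pages VALUES ('about', 'About Us', 'Hello.')")
    return db

def serve_page(db, slug):
    """Runs on EVERY request: query the database, then compile the HTML."""
    row = db.execute(
        "SELECT title, body FROM pages WHERE slug = ?", (slug,)
    ).fetchone()
    if row is None:
        return "<h1>404 Not Found</h1>"
    title, body = row
    # Template assembly also happens per request.
    return f"<html><head><title>{title}</title></head><body>{body}</body></html>"

db = setup_demo_store()
html = serve_page(db, "about")
```

Every visitor pays for the query and the template assembly, which is exactly the cost the static approach eliminates.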
There are many advantages to this type of system: almost instant, easy updating and user-allocated pages. It is ideal where content editors strive to publish in real time. But how many company websites actually get updated on a regular basis? The ability to edit documents at any time may be a useful facility, but it often falls into disuse. Business owners are more concerned with running their business than with becoming proficient at updating or generating web pages.
There are some significant drawbacks:
1. These popular applications are a constant target for hackers. A mismatch of components, or failure to update installations regularly, opens the door to security vulnerabilities. That may not seem like much until you have had the experience of being hacked and having to clean up the mess.
2. Because of the database layer and the constant reference to it, the system requires maintenance and the most efficient queries possible to sustain throughput.
3. No highly trafficked dynamic site runs without some sort of caching policy, with all the complications that involves. Caching means generating segments of content and saving them to the file system so that the database is accessed as little as possible.
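A file-system fragment cache of the kind just described can be sketched as follows. All names here are hypothetical; real CMS caches add invalidation rules, expiry times and locking on top of this basic shape.

```python
import hashlib
import os
import tempfile

# Directory where rendered fragments are stored between requests.
CACHE_DIR = tempfile.mkdtemp(prefix="fragcache-")

def cached_render(key, render):
    """Return the cached fragment for `key`, rendering only on a miss."""
    name = hashlib.sha256(key.encode("utf-8")).hexdigest()
    path = os.path.join(CACHE_DIR, name)
    if os.path.exists(path):                    # cache hit: skip the database
        with open(path, encoding="utf-8") as f:
            return f.read()
    html = render()                             # cache miss: do the expensive work
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)
    return html

calls = []
def expensive_render():
    calls.append(1)                             # count how often we really render
    return "<ul><li>latest post</li></ul>"

first = cached_render("sidebar", expensive_render)
second = cached_render("sidebar", expensive_render)  # served from disk this time
```

The catch is everything around it: deciding when a cached fragment is stale is where the "complications" come from.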
Old technology with a new twist: site generators in one form or another have been around since the beginning, but failed to make any dent in the popularity of content management systems. The modern static site generator is a far cry from its ancestors, like Dreamweaver.
1. Pages are compiled ready for delivery; all required processing has been completed before mounting on the server. Every web server on the planet is capable of handling web-encoded pages with minimum stress and maximum efficiency and speed. It is the fastest way to deliver web pages.
2. Caching technology, databases and the associated time-consuming effort are dispensed with completely. Data is stored in flat files.
3. It is often claimed that static sites are cheaper to develop, but that is not necessarily the case; it depends on the exact purpose.
4. The security issues that inevitably come with popular CMSs are avoided completely.
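The build-once model behind these advantages can be shown with a toy generator: flat source files in, finished HTML out, all work done before the site ever reaches the server. Real generators such as Jekyll or Hugo add templating and Markdown; the file layout and names here are invented for the sketch.

```python
import pathlib
import tempfile

def build_site(src_dir, out_dir):
    """One-time build step: afterwards the server only copies bytes."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for source in pathlib.Path(src_dir).glob("*.txt"):
        # Derive a page title from the flat file's name.
        title = source.stem.replace("-", " ").title()
        body = source.read_text(encoding="utf-8")
        page = (f"<html><head><title>{title}</title></head>"
                f"<body><p>{body}</p></body></html>")
        (out / f"{source.stem}.html").write_text(page, encoding="utf-8")

src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()
pathlib.Path(src, "our-services.txt").write_text("What we do.", encoding="utf-8")
build_site(src, dst)
result = pathlib.Path(dst, "our-services.html").read_text(encoding="utf-8")
```

Note that the expensive step runs once per edit, not once per visitor; serving is reduced to reading files off disk.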
1. Content editing is the biggest drawback, though considerable advances have made it a less formidable task for the non-technical. To really forge ahead and compete with popular blogging software, this issue has to be resolved completely; until then, static sites may never reach their full potential and become mainstream.
2. Depending on the application, user interaction is limited without external services. Web services have vastly increased in number and quality, making sophisticated elements like search, e-commerce shops and commenting commonly available.
Static sites are not just becoming popular now; the trend has been building over the last few years and is still gathering momentum. Significantly, some high-profile agencies have chosen to focus on delivering digital content by static means, to the exclusion of mainstream software.
Most definitely, this resurgence of interest in static site generation is being driven by mobile phones and tablets, the most popular handheld devices, where high performance is demanded. People are no longer content to wait a dozen seconds for a site to appear. Page rendering in four seconds or better is considered not just an attainable target but a requirement.
Undoubtedly, the issue of security vulnerabilities is also playing a part. Because hackers target it, popular CMS software has to be updated regularly with security patches.
A static site can be hosted on a dedicated static hosting facility or on any other server capable of serving HTML-encoded pages, which is all of them. Since a static site can be hosted anywhere, shared hosts can be used with noticeable performance boosts, and that still allows elements of server-side processing if required, for example a contact form.
Clearly, static sites will not be the best fit for every web application; they are probably best suited to information sites and those that do not require frequent additions or editing. Depending on the required update frequency, and on what definition of 'real time' is acceptable, a reasonable approximation to real time can be achieved.
That 'hybrid' approach, using static pages where most appropriate and external services to complete an application, is completely viable given the explosion in web services, from search to shops.
Because of the two burning issues of security and performance, there is no doubt that the upsurge in interest in, and use of, static sites will continue. Certainly, bigger concerns with the resources to engage fully with static site generators will pursue the advantages.