Elega Corporation

Making a Website 12x Faster

In December of 2020, I set out to build a new website for Elega Corporation. The original website was built with Bootstrap 3 and HTML on top of a Python backend running the Django framework. Not only was that overkill in all kinds of ways, but I'm here to argue that it is overkill for most websites, and perhaps just a bad idea in general.

Indeed, what has finally been learned around the web is that we kind of did everything better in the 1990s. You wouldn't think that. When people think of the 1990s, they typically think of slow, dial-up internet speeds. Whether you had a 28.8k or a 56k dial-up modem back in the day: it was slow. If you're young enough that the dial-up era was before your time, then you probably imagine an internet with no social media, perhaps some cobwebs, and maybe low-resolution graphics.

Don't get me wrong. Those were not the glory days. I am not here to say that we should literally make *everything* like the 1990s. But what was true is that most websites were static HTML files. All any web server had to do was find the HTML file requested, return its bytes, and that was that - any JavaScript in those days was usually fairly primitive. It certainly took a while for full-blown games made in JavaScript to emerge.

Certainly, the creators of JavaScript never intended for it to become a wildly popular, first-class language. They definitely never intended for something like NodeJS, the runtime that lets JavaScript run server-side, outside the browser. Indeed, a lot of what has happened has been a mistake.

In case you're not getting it, here is the big difference I am hinting at: websites were way, way simpler than what they have become. There was *no such thing* as a web application. Applications were not on the web. HTML files were considered 'markup' documents. Documents. I hate to sound like the old person again, but young people do not remember that the World Wide Web, which people now treat as synonymous with the internet itself, was sold as a way to have all of the world's knowledge at your fingertips.

All of those original intentions changed. Web 2.0 emerged, desktop applications waned in popularity, often due to security concerns, and people ended up spending a large part of their lives in the web browser, even for things like logging in to their bank account or paying their taxes.

Take, for instance, the MEAN stack. MEAN is an acronym for MongoDB, Express, Angular, and NodeJS. It's a way to create an application completely in JavaScript: the frontend is handled by Angular, and the backend by Express and Node. But what are these applications normally doing? Running queries against databases and spitting data back out at the user. Occasionally, there may be a chart used somewhere, which probably comes from yet another library to rely on.

To make the new Elega website, I rejected all of this. The construction of the pages, including whatever dynamic content is needed, is done through a proprietary static site generator written in Rust. This produces a big library of static files, which are inert: just HTML, with no application backend behind them. They have some markup, and there is some CSS for the visuals. There is no JavaScript at all so far.
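
To make the approach concrete, here is a minimal sketch of a build step like this one, using nothing but the Rust standard library. It is not Elega's actual generator - the content/ and site/ directory names, the template, and the one-file-per-page assumption are placeholders of mine - but it shows the shape of the work: every page is rendered exactly once, ahead of any request, and the output is plain HTML with a stylesheet link and no script tags.

```rust
// A build step, not a server: every page is rendered once, ahead of any request.
use std::fs;
use std::path::Path;

// One shared template: plain HTML plus a stylesheet link, no JavaScript.
fn render_page(title: &str, body: &str) -> String {
    format!(
        "<!DOCTYPE html>\n<html>\n<head>\n<title>{title}</title>\n\
         <link rel=\"stylesheet\" href=\"/style.css\">\n</head>\n\
         <body>\n{body}\n</body>\n</html>\n"
    )
}

fn main() -> std::io::Result<()> {
    fs::create_dir_all("site")?;
    // Every file in content/ becomes one inert .html file in site/.
    for entry in fs::read_dir("content")? {
        let path = entry?.path();
        let body = fs::read_to_string(&path)?;
        let title = path.file_stem().unwrap().to_string_lossy();
        let html = render_page(&title, &body);
        fs::write(Path::new("site").join(format!("{title}.html")), html)?;
    }
    Ok(())
}
```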

The files are then uploaded to an Apache server over SFTP, where the website is served up the same way it was in the 1990s: by returning static files.

Here is what is different from the 1990s: most people are on connections that are orders of magnitude faster. Download speeds have gone from 56 kbit/s to roughly 1,000,000 kbit/s - in other words, we have gone to 1 gigabit speeds, something on the order of a 17,000x jump. I may have some of the technicals sort of skewed here, like for instance that a 56k modem probably only ever approached 53.3 kbit/s, or some other tidbit about gigabit speeds as well. But the point remains despite my vague allusions here: internet speeds are now insanely fast.

Central processing units (CPUs) are not getting that much faster, and the browser isn't either. Calculations can only get done so fast on a single thread, and across multiple threads only certain work can be parallelized, while the rest has state dependencies between steps - or stated differently: it must still be processed serially.

The final result is that I have gone from a website built on abstraction layers to one that strips most of them away. Instead of Python, it is Rust, which means that instead of a slower, garbage-collected runtime, I have compiled machine code with no garbage collector. Instead of a web server that performs computation at the time of an HTTP request, the server spends almost all of its time just returning data. Instead of a database connection being opened and a query being run at the time of an HTTP request, no such connection is made: the result of the query is already there on the page that was generated by the Rust code.
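
To show how little is left to do when a request finally arrives, here is a toy responder in Rust. The real site hands this job to Apache, so nothing below is the production setup; it only illustrates that once every page is pre-rendered, serving a request reduces to mapping a URL onto a file on disk and copying its bytes to the socket.

```rust
// A toy static responder: no queries, no templates, no computation at request time.
use std::fs;
use std::io::{BufRead, BufReader, Write};
use std::net::TcpListener;

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:8080")?;
    for stream in listener.incoming() {
        let mut stream = stream?;
        // Read only the request line, e.g. "GET /blog.html HTTP/1.1".
        let mut request_line = String::new();
        BufReader::new(&stream).read_line(&mut request_line)?;
        let url = request_line.split_whitespace().nth(1).unwrap_or("/");

        // Map the URL straight onto a pre-rendered file. All the real work
        // already happened at build time; this is just disk-to-socket.
        let file = if url == "/" {
            "site/index.html".to_string()
        } else {
            format!("site{url}")
        };
        match fs::read(&file) {
            Ok(bytes) => {
                write!(
                    stream,
                    "HTTP/1.1 200 OK\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
                    bytes.len()
                )?;
                stream.write_all(&bytes)?;
            }
            Err(_) => {
                write!(stream, "HTTP/1.1 404 Not Found\r\nContent-Length: 0\r\n\r\n")?;
            }
        }
    }
    Ok(())
}
```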

In practice, this means going from a load time in excess of 2,800-3,500 milliseconds to a page that is done processing, cascading style sheet (CSS) rendering and all, within 250 milliseconds. Then, the page you just visited is cached on the client side by the browser. The next time you load it, the page likely takes closer to 55 milliseconds.

This is between ~11x and ~12x faster than the original website design - over a full order of magnitude. It delivers the blog, a sprawling visual design, links, all the things a typical website needs, and dynamic content is coming. For dynamic content, the principle is the same: generate and cache everything ahead of the request, then deliver it without any additional computation when the request actually arrives.
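
Here is a sketch of how that could look, again in plain Rust. The Post struct and load_posts() are stand-ins of mine for whatever query or data source actually gets used - the only point being made is that the 'query' runs during the build, and its result is written straight into the HTML, so nothing is computed when a visitor requests the page.

```rust
// Build-time rendering of "dynamic" content: fetch once, bake into the page.
use std::fs;

struct Post {
    title: String,
    body_html: String,
}

// In a real build this would run the query (or read an export) exactly once.
fn load_posts() -> Vec<Post> {
    vec![Post {
        title: "Making a Website 12x Faster".to_string(),
        body_html: "<p>...</p>".to_string(),
    }]
}

fn main() -> std::io::Result<()> {
    let mut blog = String::from("<h1>Blog</h1>\n");
    for post in load_posts() {
        // The "query result" is rendered here, at build time, not per request.
        blog.push_str(&format!(
            "<article><h2>{}</h2>{}</article>\n",
            post.title, post.body_html
        ));
    }
    fs::create_dir_all("site")?;
    fs::write("site/blog.html", blog)
}
```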

The internet desperately needs a return to the fundamentals and a grand simplification of the architecture that gets adopted. The cruft is not your friend - there is a better way.