8 September 2013

Progressive enhancement still matters: Getting the JavaScript balance right

In the last ten years browsers have evolved from document viewers into application runtime engines. This change has been accompanied by the emergence of a plethora of frameworks that have added some much-needed structure to JavaScript development. It is widely regarded as evidence of a paradigm shift in which JavaScript will be at the heart of far more dynamic browser experiences.

This has given rise to a fair amount of dogma that does not leave any room for progressive enhancement in modern web development. In this new world it’s not good enough to start with a full page load and then augment features with script. It is beginning to feel a little like a return to fat client applications.

It’s worth bearing in mind that progressive enhancement has never really been about accommodating users who turn JavaScript off. The point is to create more robust websites that deliver functionality to the widest possible range of users. It reduces your testing effort, particularly on older browsers, and it helps to strike a more sensible balance between server- and client-side execution.
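
To make the principle concrete, here is a minimal sketch (the element IDs and URL are hypothetical): the link performs a normal full page load without script, and JavaScript, where it is available, upgrades it to an in-page update.

    // Markup that works without JavaScript:
    // <a id="show-comments" href="/articles/42/comments">Show comments</a>
    var link = document.getElementById('show-comments');
    if (link && window.XMLHttpRequest) {
      link.addEventListener('click', function (event) {
        event.preventDefault(); // only intercept once we know we can enhance
        var xhr = new XMLHttpRequest();
        xhr.open('GET', link.href, true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            // The server returns an HTML fragment, injected without a reload
            document.getElementById('comments').innerHTML = xhr.responseText;
          }
        };
        xhr.send();
      });
    }

If the script never runs, the link still takes the user to a server-rendered page of comments; nothing is lost and something is gained.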

The hammer for every nail

In recent years there has been a slow creep of JavaScript’s responsibilities. Scripting used to bring the page to life, handle a few events and make any related UI changes. Now it is taking on tasks normally associated with server-side code, being used to control the appearance of the page, handle redirection, render HTML and process any underlying data sources.

There has been a gradual bloat in the size and complexity of JavaScript being used on web pages. The HTTP Archive shows that over the last few years the number of script requests per page has remained fairly constant, though the size of these files has more than doubled. Some of the more commonly used sites on the web are now packing pretty heavy JavaScript payloads:

Site            Size of JS files
Facebook.com    585.9 kB
Google.com      1151.2 kB
Youtube.com     661.2 kB
Yahoo.com       685.3 kB

The download size isn’t so much of an issue, as it can be mitigated with techniques such as GZIP compression, browser caching and content delivery networks. The real problem is the sheer amount of code being executed for such straightforward web pages. Are bloated JavaScript payloads being used to implement features that might be better handled server-side?
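
Those mitigations are also cheap to apply. Here is a minimal sketch, assuming a Node.js server running Express with the compression middleware; the cache lifetime, port and directory name are arbitrary.

    // Serve static scripts GZIP-compressed and with a long cache lifetime
    var express = require('express');
    var compression = require('compression');

    var app = express();
    app.use(compression()); // gzip responses when the client accepts it
    app.use(express.static('public', {
      maxAge: 30 * 24 * 60 * 60 * 1000 // cache for thirty days, in milliseconds
    }));

    app.listen(3000);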

Getting the balance right

There does need to be a more sensible balance between client and server. Ideally, processing should lean on the server as much as possible: it has more power at its disposal and is normally closer to the underlying data. Bear in mind that HTML is just a semantic data format, much like JSON. There’s little point in sending raw data out from the server only to convert it into markup with slower client-side JavaScript.
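
To put it another way, the server can usually render the markup it already knows how to produce, rather than shipping JSON for the client to turn into the same HTML. A rough sketch, assuming an Express-style server where app and db are already set up (the routes, template and data-access call are hypothetical):

    // Option 1: render HTML on the server and send it over the wire
    app.get('/products', function (req, res) {
      db.listProducts(function (err, products) {
        res.render('product-list', { products: products });
      });
    });

    // Option 2: send JSON and make the browser rebuild the same markup
    app.get('/api/products', function (req, res) {
      db.listProducts(function (err, products) {
        res.json(products); // the client must now template this into HTML itself
      });
    });

The second option only really earns its keep when the client genuinely needs the raw data, for example to re-sort or filter it without another round trip.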

Server-based development is also easier to test and maintain as you get to control the environment and all the dependencies. There will be fewer edge cases that throw your code out of kilter, less testing required on older browsers and issues will be easier to replicate.

Many projects split front-end and server-side work between different developers or teams, which can make it particularly difficult to co-ordinate client and server interactions. The end result can be an over-reliance on either JavaScript or server-side processing, caused by a failure of collaboration rather than an explicit design decision.

Speed and optimisation

JavaScript-based interfaces are often thought of as faster, but this isn’t necessarily the case. There needs to be a more sensible trade-off, as Twitter realised in 2012 when they moved a significant proportion of rendering to the server.

In this case, the raw parsing and execution speed of JavaScript was causing significant delays in perceived rendering speed. When a page is rendered on the client, the browser has to wait for the HTML document and the scripts to download and execute before any meaningful content can appear. This can make for a significantly slower initial render, even if subsequent processing is faster than successive round-trips to the server.
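
One way to claw back that initial render is to let the server-rendered markup paint first and pull the enhancement script in afterwards. A minimal sketch (the script URL is hypothetical):

    // Load the enhancement script only after the first render has happened,
    // so the server-rendered content is visible as early as possible.
    window.addEventListener('load', function () {
      var script = document.createElement('script');
      script.src = '/js/enhancements.js';
      script.async = true;
      document.body.appendChild(script);
    });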

In Twitter’s case they decided on the metrics that really mattered to their users, e.g. “time to first Tweet”, and optimised accordingly. The trade-offs will be different for every application – the point is not to allow a dogmatic approach to lead you into a cul-de-sac.

When does a “website” become a “web app”?

The rise of JavaScript has been linked to the development of “web apps” as more feature-rich successors to the document-orientated “websites” of yore. The argument is that these more advanced creations are not bound by the same rules as their more static forebears.

This is an artificial distinction that doesn’t have any clear definition. At one extreme there are document archives with very limited interaction that feel very much like the kind of largely static website you would have encountered in 1995. At the other are highly interactive single-page applications designed to leverage the browser as a runtime environment. Between these two extremes lies a hugely varied and shifting blend of interactivity and content that does not fit easily into a bucket marked “site” or “app”.

This distinction can actually be harmful as it implies that there is a new class of browser-based experience that can be exempt from concerns over accessibility, performance and browser support. This does not reflect how people are still accessing the internet in the real world.

Whatever happened to web accessibility?

Let’s face it – the web is a cold and dark place without JavaScript. Even Firefox has recently thrown in the towel by removing the option to switch scripting off from its preferences. That doesn’t mean that progressive enhancement is somehow obsolete.

It’s easy to forget that there is an incredibly diverse range of people who are accessing the web in ways that you might not expect. They might be using assistive technologies, dated browsers, obsolete hardware or a slow connection. The demands of web accessibility can sometimes get lost in the rush to modern, interactive experiences.

Any decisions about interface technology should be driven by a clear and unambiguous understanding of the audience. This varies from site to site, of course, so aggregated browser statistics from services such as StatCounter won’t necessarily help your decision making. For larger sites with more conservative audiences it can be surprising just how many potentially influential users are excluded by more modern interface technologies.

Filed under Rants, UI Development, Web accessibility.