Software Engineering: Web Development

Reflection on the state of web development or the story of Media Artists AG (1997).

This blog post is part of the Odysseys in Software Engineering series.

Setting the Stage

In my first year at university, I studied physics and computer science in parallel. After that first year I decided to stop the computer science lectures and start working in a start-up instead. This was 1997, the time of the dot-com boom, when everybody wanted a place on the internet, nobody knew what the internet was, and we gave them the internet. The start-up was called Media Artists AG. There were the two founders, and I was their first employee.

In today’s terminology, the product idea was to create a content management system (CMS) for big German publishers. And as the internet and the web were the big things, the natural choice was to build this application with web-native technologies like HTML and JavaScript.

This was also the time of the first browser war, the competition between Microsoft’s Internet Explorer and Netscape’s Navigator for dominance in browser usage share. There was basically no compatibility between browsers, and even minor version upgrades of the same browser broke backward compatibility in substantial ways. Internet Explorer 4 and its dialect of JavaScript, called JScript, were the first technologies that enabled what is now called a single-page application (SPA). I started to work on early beta versions of Internet Explorer 4, as Media Artists AG was tasked with leading Microsoft’s European road-show for the market introduction of Internet Explorer 4. Every minor upgrade of the beta meant that nothing worked any longer, because each one broke backward compatibility in substantial ways.

We used our experience and ingenuity to create a single-page application (SPA) to be used by editors at big German publishers for updating news articles on the publishers’ web-sites. The application (mis-)used hidden frames to communicate with the server, and the server generated valid JScript code to communicate data structures back to the application. This way we avoided writing a parser, in contrast to later implementations that used XML as the “wire format”. Cross-site scripting (XSS) and the browser protection mechanisms against it were no worry at the time, so you could freely exchange data from one frame to another. This was basically what would later be called Asynchronous JavaScript and XML (AJAX) and JavaScript Object Notation (JSON), i.e. technologies that only gained widespread use later, during the Web 2.0 movement.
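For readers who never saw this pattern: the following is a minimal sketch from memory of how such a hidden-frame round-trip looked. All names, URLs, and data shapes are invented for illustration.

```ts
// A minimal sketch (hypothetical names throughout) of the hidden-frame
// trick. The page hosts an invisible frame; a "request" is made by
// pointing that frame at a server URL.

// In the visible frame: a global callback that the server-generated
// JScript will invoke with the data structure.
function onArticlesLoaded(articles: { id: number; title: string }[]): void {
  console.log(`received ${articles.length} articles`);
  // ... update the editor UI ...
}

// Trigger a round-trip by navigating the hidden frame.
function loadArticles(): void {
  (window.frames as any)["hidden"].location.href = "/articles?user=editor1";
}

// The server answers with a page whose script is plain, valid JScript, e.g.
//
//   <script>
//     parent.onArticlesLoaded([{ id: 1, title: "Breaking News" }]);
//   </script>
//
// The data arrives as executable code, so no parser is needed; this is the
// same role that XML, and later JSON over XMLHttpRequest, took over.
```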

While all of this worked fine, the incompatibilities between different versions of Internet Explorer 4 were horrific. Even after the beta phase ended and Internet Explorer 4 was supposed to be stable, every (even minor) update brought countless incompatibilities that we had to account for in our code base with a lot of “if-then-else” logic handling different version numbers of IE4.

This experience was so painful that for a long time I did not touch front-end technologies again and preferred to stay on the back-end. Only in 2019, when my daughter had an idea for a social start-up of her own, did I decide it was time to revisit the topic. So what follows is the story of my re-discovery of web technologies after a 20-year break. And just to complete the story about Media Artists AG: we switched from HTML and JScript to Java, where I worked with early versions of Java Swing, before it became an integral part of the JDK. At least Java Swing stayed backward compatible between releases, so we could focus on the product rather than on incompatibilities.

The App

At the moment you can find the prototype web app at https://joto-prekt-dev.web.app. We decided to use Firebase for the back-end, because I was curious about the serverless movement and we did not want to take care of server administration. We selected React, because its functional nature appealed to me. We originally started with JavaScript, but soon switched to TypeScript.

The Good

The biggest positive surprise was TypeScript. I could not have imagined that a retrofitted type system could work that well, especially after my disappointments with type systems in the Python world. Switching from JavaScript to TypeScript improved stability and reduced the number of tests we had to write by an order of magnitude. It also made the IDE support for refactoring and introspection so much more powerful.
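As a small, invented illustration of where those savings come from: the compiler rejects whole classes of mistakes that plain JavaScript would only reveal at run time, if a test happened to cover them.

```ts
// A made-up example: the type contract replaces a family of unit tests.
interface Article {
  id: number;
  title: string;
  publishedAt?: Date; // optional fields are part of the contract, too
}

function headline(article: Article): string {
  return article.title.toUpperCase();
}

headline({ id: 1, title: "Breaking News" }); // OK

// headline({ id: "1", titel: "Tippfehler" });
// ^ rejected at compile time: wrong type for `id` and a misspelled
//   property that plain JavaScript would have let slip into production.
```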

I also count package management systems like npm as a positive. I am aware that people are sometimes critical of package managers and complain that they “download the whole internet”. But managing external libraries in the past was always a major pain! You had to download them from somewhere, mostly as a zip archive[1], and then keep track of changes yourself. In addition, you had to identify and download all the transitive dependencies yourself. A package system makes all of that a repeatable and stable process.

Another nice encounter was Bootstrap: finally, some useful defaults as a starting point to build upon. But I was a bit disappointed by the state of CSS in general. CSS existed already in 1997, and the improvements since then are limited. Why are pre-processors like Sass or Less still needed? Why is that functionality not already part of the core standard language?

I also enjoyed working with react due to its functional nature. You take one state and transform it into another state. Nice and clean. I always thought that UIs should be built like they are in computer games, e.g. a big while loop where you take inputs from your periphery, update the game state and draw the scene as a derivative of that game state. This is the exact opposite of these tangled balls of mud with UI state scattered over your whole codebase and event chains ending in infinite loops. While this is not one-to-one related to react the state management via redux together with redux-persist to handle situations where users inadvertently close tabs or press F5 also felt like an improvement.
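A minimal sketch of that “one state in, next state out” idea, written as a plain Redux-style reducer with hypothetical state and action shapes (redux-persist then simply snapshots and restores this state across tab closes):

```ts
// State and action shapes are invented for illustration.
interface EditorState {
  draft: string;
  savedAt: number | null;
}

type EditorAction =
  | { type: "draft/changed"; text: string }
  | { type: "draft/saved"; at: number };

const initialState: EditorState = { draft: "", savedAt: null };

// A pure function: given the current state and an action, return the next
// state. No UI state scattered across the codebase, no tangled event chains.
function editorReducer(
  state: EditorState = initialState,
  action: EditorAction
): EditorState {
  switch (action.type) {
    case "draft/changed":
      return { ...state, draft: action.text };
    case "draft/saved":
      return { ...state, savedAt: action.at };
    default:
      return state;
  }
}
```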

While this is not part of the core web technologies, and I am still not 100% sure whether I should count it as a positive, I found the serverless security rules intriguing. I had always wondered how a serverless infrastructure would handle security in the absence of a protected and safe area like the back-end. But writing and testing these rules, and being sure that they actually achieve the intended goal, was a major headache. Perhaps people will come up with better solutions in the future.
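For illustration, here is a hedged sketch of what such rules look like in Firestore’s rules language; the collection and field names are invented, not taken from our app:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Hypothetical collection: everyone may read published articles.
    match /articles/{articleId} {
      allow read: if resource.data.published == true;
      // Only the authenticated owner may write.
      allow write: if request.auth != null
                   && request.auth.uid == resource.data.ownerId;
    }
  }
}
```

With no trusted server code of your own, these few declarative lines are the entire authorization story, which is exactly why getting them right (and testing them) matters so much.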

The Chrome browser dev tools, together with the debugger, JavaScript source maps, and further plug-ins like React Developer Tools and Redux DevTools, made it relatively straightforward to introspect what was going on in the browser.

The Bad

The “bad” is easily summarized as “layers on top of layers of indirection” (or turtles all the way down). It took me quite a lot of time to understand what is (roughly) going on, and I won’t pretend that I understand it all.

It starts with babel[2] and webpack. Actually, we used Create React App to configure a “black box” webpack setup for us, and it took me quite some time to understand what was happening behind the scenes. Due to the still-existing incompatibilities between browsers, as documented by sites like Can I use, the “build” machinery of a web application has to jump through several hoops to make the result work the same way on whatever browser the user runs. One aspect is polyfills, i.e. code that implements a feature on web browsers that do not support it natively. Then you have the compilation from TypeScript to a broadly supported version of ECMAScript, e.g. ES5. I guess that, in principle, the TypeScript compiler should be able to handle that, but somehow babel, a JavaScript-to-JavaScript compiler, is still in the game. And finally, all of the resulting artifacts have to be packaged in pieces as small as possible that can be loaded incrementally, so as not to burden the start-up time of the web page. This aspect is called code splitting.
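To make the polyfill idea concrete, here is a deliberately simplified sketch; real polyfills such as those shipped in core-js handle many more edge cases (NaN, the fromIndex argument, and so on):

```ts
// Simplified polyfill sketch: if the browser lacks a feature,
// patch it in at start-up so newer code keeps working.
if (!Array.prototype.includes) {
  Object.defineProperty(Array.prototype, "includes", {
    value: function (this: unknown[], search: unknown): boolean {
      // Naive fallback implementation in terms of the older indexOf.
      return this.indexOf(search) !== -1;
    },
  });
}
```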

As a side remark: recently I had a look at some Angular code and noticed the many annotations. I was not sure which part of these layers upon layers of indirection takes care of converting the annotations into actual JavaScript, and how. I learned that decorators, as annotations are called in the JavaScript world, are a stage 2 proposal for JavaScript and are available as an experimental feature of TypeScript. It is astonishing that experimental features are promoted as core features of a widely used framework like Angular.
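For illustration, a small sketch of a class decorator as TypeScript accepts it behind the experimentalDecorators compiler flag; the names here are made up:

```ts
// A decorator is just a function that receives the construct it annotates.
function logged<T extends { new (...args: any[]): object }>(ctor: T): T {
  console.log(`registering ${ctor.name}`);
  return ctor; // could also return a modified subclass
}

@logged
class NewsEditor {
  // Angular's @Component rests on the same mechanism: the decorator
  // attaches metadata that the framework later reads.
}
```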

And the Ugly

The ugly part is the “speed” at which everything needs to move in the JavaScript/web world, and especially the constant need for security fixes without a clear path forward on how to deal with them.

People often say that computer science is a young field and therefore a lot of progress is still being made. This nurtures the narrative of fast-paced technological progress and the constant need to keep learning and stay ahead of the curve in order not to become irrelevant. I would argue that most relevant work in computer science was done before the 2000s (with a few exceptions like DNNs and similar) and that what we see as fast-paced technological progress is mostly bike-shedding. Typically, for the difficult problems, there is only one way to do it. For the trivial problems, there are hundreds of ways to do it, and that invites the myriad web frameworks and short-lived hypes out there. I still remember when GWT was the hottest thing on the planet. Then it was dropped by Google and most of its users. Then came “newer technologies” like Polymer, AngularJS, Flutter with Dart, Angular, … and this is only an enumeration of Google technologies. The same is true in the Facebook/React camp with Gatsby, Next.js, Create React App, … and that does not even touch the tens of thousands of other web frameworks out there.

Why are there still no agreed-upon best practices? I may exaggerate here, but as far as I can tell, it looks as if you have to throw away your front-end every one or two years, only because the web framework on which you originally built your site was abandoned by its creators for no better reason than the next fad. How are you supposed to build stable, production-ready software that delivers business value like that?

The constant security alerts worry me especially. There are “solutions” out there like GitHub’s Dependabot that at least warn you about known vulnerabilities, but the process for how you are supposed to deal with them is still not clear to me. I tried to read quite a bit about it, and there are articles like How to fix Security Vulnerabilities in NPM Dependencies in 3 Minutes, but following these recipes almost never leads to the desired result, and you are left with the message:

X vulnerabilities required manual review and could not be updated

The one tool that at least sometimes seems to do the right thing is npm-check-updates. But why is there no canonical, well-described process that every front-end developer knows by heart and can apply in their sleep? At least I did not find anything like that.

I would even argue that most of the (security) problems in web applications stem from the fact that you are using a web application in the first place. Why did rich clients on desktop computers fall so much out of favour? On mobile platforms like iOS and Android, native rich clients are still very common, but on desktop computers rich clients are a threatened species. In a rich client you don’t have to worry about XSS or about what you may or may not store in the browser’s localStorage. You know that the operating system is making sure that no other process has unauthorized access to your process memory. In addition, rich clients could be so much more responsive and productive. Have you recently worked with O365, the cloud version of the Atlassian tools, or any other “cloud native” application for that matter? While waiting for the hundreds of little background requests to make the application usable, my feet fall asleep. This was not the case in the past, when the local rich-client application was actually doing something useful, like letting you start working immediately and syncing with some server in the background.

Perhaps it will help if we introduce a new buzzword like edge computing to make something we already did 20 years ago sound like technological progress.

Summary

All in all I am quite underwhelmed by the state of front-end web development in 2021.

On the one hand, I hope that rich clients will regain favour, avoiding many of the problems I describe above; the problems you would not have had if you had not used a web application in the first place. Such a rich client may actually come as a web application in disguise, like those built on Electron. But things like Java Web Start (by now removed from the JDK but living on as OpenWebStart) and JavaFX deserve their place, too.

On the other hand, I would hope for a reduction of the layers upon layers of indirection. The people from the Modern Web project seem to be thinking in a similar direction and are promoting buildless approaches and workflows. The same people are behind the Open Web Components sister project, which promotes components that are independent of any framework. Several base libraries and some component libraries look promising, at least at first sight.
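As a taste of that framework-independent direction, here is a hedged sketch of a custom element built only on standard browser APIs; the element name is invented:

```ts
// A framework-independent web component: no bundler, no framework.
class HelloCard extends HTMLElement {
  connectedCallback(): void {
    // Shadow DOM keeps the component's markup and styles encapsulated.
    const root = this.attachShadow({ mode: "open" });
    const name = this.getAttribute("name") ?? "world";
    root.innerHTML = `<p>Hello, ${name}!</p>`;
  }
}

customElements.define("hello-card", HelloCard);

// Usage in plain HTML:
// <hello-card name="Weisser Zwerg"></hello-card>
```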

I would hope for something simpler (fewer layers of abstraction) and more stable, with a longer life-cycle (longevity), to protect businesses’ investments rather than forcing them to chase one short-lived hype after another for no good reason.

Footnotes


  1. Keep in mind that SourceForge was only founded in 1999 and did not exist yet in 1997. ↩︎

  2. SWC seems to be the new kid on the block. ↩︎
