"I have seen the future and it doesn't work" -- Zardoz (1974)
We are well into the 21st century now, and "the web" is broken and sick, and the endless band-aids and stitches being applied to keep it alive and working are actually suffocating it. Modern web app development is a nightmare because the web is built on a primitive protocol that is being stretched to absurd lengths to create web apps with the responsive, rich UIs of desktop programs.
Some History
The web that became publicly available around 1992 was based upon a very simple protocol called HTTP: you requested a document from a specified address, and the response hopefully contained the data, usually HTML for display in a browser. The protocol is stateless: connections are short-lived, and the server retains no memory of previous requests or their contents. This was all you needed for a basic web browsing experience.
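For the record, that simple protocol is just plain text. The sketch below builds a minimal HTTP/1.1 GET request (the host and path are placeholders, not a real endpoint) to show how little an exchange carries: every request must contain everything the server needs, because nothing is remembered between requests.

```javascript
// A minimal illustration of a stateless HTTP/1.1 exchange.
// The host and path are placeholders, not a real endpoint.
function buildGetRequest(host, path) {
  // Each request carries everything the server needs;
  // nothing is remembered between requests.
  return [
    `GET ${path} HTTP/1.1`,
    `Host: ${host}`,
    "Connection: close", // short-lived: the connection ends after the response
    "",
    "",
  ].join("\r\n");
}

const request = buildGetRequest("example.org", "/index.html");
console.log(request);
// A typical response begins:
//   HTTP/1.1 200 OK
//   Content-Type: text/html
//   ...followed by the HTML document body.
```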
Then people had the idea that programs could run in the web browser and do some useful "work" in it. The underlying framework was far too primitive for this, so an arms race began to pump it up with cookies, parameters, special HTML tags, scripts and so on, in an attempt to provide the necessary features. By the end of the 1990s, developing web apps was a nightmare: the development tools were primitive, debugging was nearly impossible, the few standards that existed weren't adhered to, and different browsers behaved and rendered differently.
From my usual .NET developer perspective, the release of the Microsoft .NET Framework around 2002 provided ASP.NET Web Forms, so that server-side programs could interact with web browsers and programmatically construct HTML for the browser to render. All this did was take the basic HTTP request-response protocol and insert a ridiculously complicated pipeline of events into the middle of it. The release of ASP.NET MVC several years later gave you more control over the rendering pipeline, but it just replaced one type of complexity with another.
All of the enhancements, extensions and frameworks built around the web in the first 20 years or so to support web apps were basically attempts to dress up a pig in a wedding gown. The underlying engine driving the web is still just built from HTTP, HTML and web browsers which were never designed to support rich, connected and stateful applications.
The REST Disease
For many years on the .NET platform I used SOAP web services, a simple and well-defined network protocol that unfortunately saw little use outside the Microsoft ecosystem. Then, for some stupid reason, REST became really popular in recent years, which is a terrible tragedy, because it's important to realise that unlike SOAP, REST is not a protocol, it's just a convention. As a result, it's not self-documenting, the formatting of message bodies is not defined, the response codes have narrow and contested meanings, error handling is undefined, and everyone argues about what should be conventional. Like JavaScript, REST started as one man's project (a chapter of Roy Fielding's doctoral dissertation) and seems to have spread like a disease without cure.
The set of verbs (GET, PUT, POST, etc.) combined with the HTTP response codes (200 OK, 404 Not Found, 201 Created, etc.) is so restrictive and inflexible that it's practically impossible to fit them over a service that performs realistic business work. It's depressing that something as vague and ill-defined as REST has become so popular. Evidence of this is the fact that the web is jammed with articles arguing about every aspect of REST ... how to report errors, how to paginate large results, which response codes to use, how to transfer binary data, and so on. REST has poisoned the web. See my other blog post on REST, where I attempt to tame it by turning it into a simple protocol.
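As a hedged illustration of the problem (the route, handler and status codes below are invented for the example, not taken from any real service), here is what happens when a realistic business operation, cancelling an order, is squeezed into REST's verb-and-status vocabulary:

```javascript
// Sketch of a business operation squeezed into REST conventions.
// The shape of this handler is invented for illustration; REST itself
// mandates none of these choices, which is exactly the problem.
function cancelOrder(order) {
  // Is "cancel" a DELETE? A POST to /orders/123/cancel? A PATCH of
  // { status: "cancelled" }? The convention gives no single answer.
  if (order.shipped) {
    // A business failure: which code? 400? 409? 422? All are argued for.
    return { status: 409, body: { error: "Order already shipped" } };
  }
  order.status = "cancelled";
  return { status: 200, body: order }; // or 204 with no body; also argued about
}

const result = cancelOrder({ id: 123, shipped: false, status: "open" });
console.log(result.status, result.body.status); // prints: 200 cancelled
```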
The JavaScript Disease
I believe that the craze in recent years for developing JavaScript frameworks to drive web applications on the client-side is cancerous and an evolutionary dead-end.
The JavaScript language was created as a hobby project, with a stupidly misleading name chosen as a marketing ploy to ride on the then-trendy name "Java". I have used a dozen scripting languages in the last 30 years, covering a wide variety of styles and platforms, and by a wide margin JavaScript is one of the worst in all respects. It's a jumble of functional ideas and procedural constructs; it has no standard library, inconsistent scoping, no namespaces, a confusing pair of null and undefined values, no consideration of how to structure large projects, and no concept of a "project" at all. JavaScript lacks absolutely everything necessary to create large, sensibly structured projects. Compared to scripting languages designed and created by skilled developers, JavaScript really looks like someone's unfinished hobby project.
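None of this is exaggeration; the snippet below demonstrates a few of these warts directly, and every assertion in it holds in standard JavaScript:

```javascript
// null/undefined confusion: two distinct "no value" values,
// loosely equal to each other but not strictly equal.
console.assert(null == undefined);        // true: loose equality conflates them
console.assert(null !== undefined);       // true: strict equality does not
console.assert(typeof null === "object"); // a long-standing language wart

// Inconsistent scoping: var ignores block scope entirely.
function scopeDemo() {
  for (var i = 0; i < 3; i++) { /* ... */ }
  return i; // i has leaked out of the loop; returns 3
}
console.assert(scopeDemo() === 3);

// Implicit coercion produces surprising truths.
console.assert("" == 0);  // the empty string coerces to 0
console.assert("0" == 0); // so does the string "0"
console.assert("" != "0"); // yet the two strings are not equal to each other
```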
So here we are, more than 20 years after JavaScript was released, and the language is still being retrofitted with basic features through successive ECMAScript standards. It is so crude and clumsy that people have written whole libraries and pseudo-languages on top of it just to make it usable, jQuery and TypeScript for example. The mere fact that anyone would need to write such things is a strong hint that something is rotten.
In recent months I was compelled to seriously research modern JS frameworks, so that one could be chosen for a possible browser-based app to display and manage marketing statistics. I initially chose the latest Angular because it seemed popular (it even has articles in MSDN Magazine) and it was complete, in the sense that you didn't need to glue together multiple frameworks.
I watched 5 hours of a 10-hour Pluralsight course on Angular and generated multiple starter projects using different IDEs and commands. It was around this time that I became dismayed and quite shocked by what I discovered about the JS ecosystem.
Firstly ... how can a course on writing JavaScript be 10 hours long?! Well, now I know ... it's a gigantic framework full of conventions, templates, components, services, binding, routing, validation, pipes, filters, injection, observables, and so on. Angular looks like the result of a graduate student's indulgent thesis project. Why do you need something so monstrously large and complicated just to write a goddamn app in a web browser?! After 5 hours I realised I was watching the cogs spin on a huge, over-engineered Rube Goldberg machine that was displacing useful stuff out of my brain.
Next ... I thought I was hallucinating, but both of the Angular starter projects I created contained about 27,000 files and 1.6 million lines of code. And so I was cruelly reminded that JavaScript is a "scripting language" that supports none of the compilation, optimisation or other features we normally associate with mainstream languages. I have used scripts for decades to glue things together, make repairs or perform utility work, and it's great that they can be managed as text files, usually of moderate size. It has been a law of programming all my life that if you find yourself writing huge scripts, you're digging a hole for yourself and it's time to migrate to a "real" language. So here we are with JavaScript, and that old sensible rule has been shredded. JavaScript has become a kind of text-based assembly language, despite being utterly unfit for the purpose. As an example, the desktop program Azure Storage Explorer is built on the JavaScript-based Electron framework, and its installation folder ludicrously contains about 5,000 script files and 3,000 folders. This mess is caused by JavaScript dangerously leaking into places where it was never intended to be used.
Summary
So here we are in 2018, attempting to write professional programs for the web on top of 25-year-old technology that was never designed for the purpose. More and more JS frameworks and tools are being released to assist client-side development, but as far as I'm concerned it's like polishing a giant turd while claiming to make it edible.
What's the answer? Imagine a parallel universe where all the big companies and standards bodies worked together to create what I call an "app player" that has nothing to do with web browsers, HTML or scripts. The principle behind Flash, Silverlight and Java applets could be borrowed to create a well-defined virtual machine and communications protocol, with a "player" implemented for each platform's UI. A team of talented students could design and build such a thing.
WebAssembly and Blazor look like a potential cure for the JavaScript disease for .NET developers, as you can write managed C# code to run in the browser. As of early August 2018, the Blazor preview is basically working well, and it gives me hope that JS and its endless frameworks will fade into the background and no longer be a primary web development language.
However, one terrible problem is not solved by WebAssembly ... the rendering. It looks like we will still be completely dependent upon the web browser, with HTML (and probably some JavaScript) rendering the UI. HTML is completely and utterly inadequate for creating business application UIs: there is no real concept of a viewport (screen size), no virtualization of list items, only primitive layout features, no custom controls and little interactivity. HTML was invented in the early 1990s for simple display of text and images, and it is incapable of rendering the rich UIs that are common in desktop applications. So although WebAssembly and Blazor may mercifully displace JavaScript as the primary client-side web development language, we are still stuck with the brain-dead web browser as the application host environment. Not much progress, really.
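To make the "no virtualization of list items" complaint concrete, here is a hedged sketch (the function and parameter names are invented for illustration) of the arithmetic every web app must hand-roll just to render a large list efficiently, something desktop UI toolkits provide out of the box:

```javascript
// Sketch of hand-rolled list virtualization: compute which rows of a
// large list are visible, so only those get rendered as DOM elements.
// Function and parameter names are invented for illustration.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows) {
  const first = Math.floor(scrollTop / rowHeight);
  // The +1 covers a row partially visible at the bottom edge.
  const count = Math.ceil(viewportHeight / rowHeight) + 1;
  const last = Math.min(first + count, totalRows);
  return { first, last }; // render only rows in [first, last)
}

// 10,000 rows of 20px each, in a 600px viewport scrolled down 400px:
console.log(visibleRange(400, 600, 20, 10000)); // { first: 20, last: 51 }
```

Only 31 of the 10,000 rows are materialised; the browser offers nothing like this natively, so every framework reinvents it.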