Mike Driscoll, CTO at Metamarkets, published a provocative post on the future of Web Application architectures. He argues that frameworks like node.js signal the end of LAMP.
Three months ago, we decided to tear down the framework we were using for our dashboard, Python’s Django, and rebuild it entirely in server-side Javascript, using node.js. This decision was driven by a realization: the LAMP stack is dead.
Mike sees 3 phases to the Web:
- 1991-1999: The HTML Age. The HTML Age was about documents.
- 2000-2009: The LAMP Age. The LAMP Age was about databases.
- 2010-??: The Javascript Age. The Javascript Age is about event streams.
Modern web pages are not pages, they are event-driven applications through which information moves.
He explains:
AMP architectures are dead because few web applications want to ship full payloads of markup to the client in response to a small event; they want to update just a fragment of the DOM, using Javascript. AJAX achieved this, but when your server-side LAMP templates are 10% HTML and 90% Javascript, it’s clear that you’re doing it wrong...
Mike sees that the principal role of the server is to ship an application to the client (Javascript), along with data (JSON), and let the client construct the UI from it. The secondary role of the server is to listen in on a stream for events (a new edit, a message, or ticker change) and efficiently push responses back to clients.
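As a rough sketch only (the routes, payloads and port below are invented for illustration, not taken from Mike's post), these two roles can be expressed with nothing but node.js core modules: serve the application and its JSON data, then keep connections open and push small events rather than re-rendering pages:

    // Illustrative node.js sketch: the server ships the app and JSON data,
    // then streams events to connected clients.
    var http = require('http');
    var clients = [];

    http.createServer(function (req, res) {
      if (req.url === '/') {
        // Role 1a: ship the application (HTML + Javascript) to the client.
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<script>/* client-side code builds the UI from JSON */</script>');
      } else if (req.url === '/data') {
        // Role 1b: ship data as JSON; the client constructs the UI from it.
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ ticker: 'ACME', price: 42.0 }));
      } else if (req.url === '/events') {
        // Role 2: hold the connection open and push events as they occur.
        res.writeHead(200, { 'Content-Type': 'text/event-stream' });
        clients.push(res);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8000);

    // Push a small JSON event to every connected client instead of a full page.
    setInterval(function () {
      clients.forEach(function (res) {
        res.write('data: ' + JSON.stringify({ price: Math.random() * 100 }) + '\n\n');
      });
    }, 1000);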
Several people commented:
Bruce Atherton agreed with Mike, but he does not see events flowing through HTTP:
Websockets and SPDY [will] take over the world in that regard, as they are infinitely better suited to the task than HTTP is.
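On the client side, the browser's WebSocket API makes consuming such a stream straightforward; a minimal sketch (the endpoint and element id are hypothetical):

    // Browser-side sketch: receive events over a WebSocket instead of polling over HTTP.
    var socket = new WebSocket('ws://example.com/events');

    socket.onmessage = function (msg) {
      var event = JSON.parse(msg.data);  // a small JSON payload, not a full page of markup
      document.getElementById('price').textContent = event.price;  // update one DOM fragment
    };

    socket.onclose = function () {
      // a real application would reconnect here, with some back-off
    };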
Chase Sechrist showed some concerns about node.js even though he uses it widely:
You still need to know (arguably advanced knowledge) how to debug race conditions and how an event loop works, and even how a call stack works due to recursive callbacks smashing the stack. Because of that, the control flow is very strange and mind-bending to people that have been writing C for 20 years, and even junior engineers that are just getting into programming.
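To make the "recursive callbacks smashing the stack" point concrete, here is a small illustrative node.js example (not from the discussion): when a library invokes its callback synchronously, a recursive callback chain keeps growing the call stack until it overflows, while deferring each step (with setImmediate, or process.nextTick in older versions) lets the stack unwind between iterations:

    // Illustrative only: a "library" that happens to call its callback synchronously.
    function fetchCached(i, callback) {
      callback(null, i);  // synchronous callback: the caller's frame stays on the stack
    }

    // Recursive callback style: each step nests another frame and eventually
    // throws "RangeError: Maximum call stack size exceeded" for large n.
    function processAllUnsafe(i, n, done) {
      if (i === n) return done();
      fetchCached(i, function () {
        processAllUnsafe(i + 1, n, done);  // still inside fetchCached's frame
      });
    }

    // Deferring each step lets the stack unwind between iterations.
    function processAllSafe(i, n, done) {
      if (i === n) return done();
      fetchCached(i, function () {
        setImmediate(function () {
          processAllSafe(i + 1, n, done);
        });
      });
    }

    processAllSafe(0, 100000, function () { console.log('done'); });
    // processAllUnsafe(0, 100000, function () {}); // would overflow the stack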
"Jorjun" noted that, at the current rate of change, even if this new architecture is justified it will not last:
Within 2 years [there will be] a more productive means of encoding valuable IP. Watch your back, the young ones are coming thru, and Java means nothing to them – they weren’t around in the late 1990s. Javascript is a silly name for a silly language. With curly brackets & geek fudgery & intensely annoying artefacts, to old-skoolers like me Javascript looks hasty, nasty & far too easy to obfuscate.
Asher Snyder, co-founder of NOLOH, agrees with the general premise of the post, that "the web should and is moving towards events", but does not believe Javascript will lead the way. He suggests that "we’re heading towards a platform or unified language age, as that’s really the only way the craziness of the web is really manageable for rapid development".
InfoQ talked briefly with Subbu Allamaraju, who recently published some performance numbers comparing node.js with Play:
Personally, I find [frameworks like node.js and Play] exciting for web developers as they bring in some fresh thinking. Web framework land, particularly on the Java side, has not had such simplicity for a while. Play in particular made a good choice by layering it on top of Netty and not the legacy servlet framework.
The evolution of Web Application architecture is indeed accelerating, and it seems that we will land where we started as Web applications get "thicker". In particular, one can only ask what is left of REST in an event-driven world: we actually don't hear much lately about REST, its uniform interface, or whether it successfully changed the architecture of Web applications. What is your take on the future of Web Application Architecture? How do you feel about Javascript becoming a mainstream programming language?
Community comments
Typo?
by Aslak Hellesøy,
"Play in particular made a good choice by not layering it on top of Netty".
Typo? I'm pretty sure Play *is* layered on top of Netty.
Also - you forgot to link to the Driscoll article: metamarketsgroup.com/blog/node-js-and-the-javas...
Re: Typo?
by Jean-Jacques Dubray,
I have added the link, thank you. I'll let Subbu answer your other comment.
wrong date?
by ian green,
Was this intended for April 1?
Re: Typo?
by Subbu Allamaraju,
Thanks for pointing it out. It was a typo. That should have read as
"Play in particular made a good choice by layering it on top of Netty".
Subbu
JavaScript and mainstream
by Florian Salihovic,
Last question answered first: JavaScript was, is and will stay a mainstream language. And actually, it kind of surprises me that such a topic comes up years after the language was introduced. Libraries and frameworks made the language popular, but it has been used by a lot of web developers for years now.
And when it comes to the importance of JavaScript compared to Java, PHP, Python etc., the AMP stack won't disappear in the next decade just because JavaScript, along with the new frameworks and tools, is the next big thing.
The only thing that will happen is that more and more work will be generated, since there will never be a consensus on how things should be done. Is SOAP dead? No. Is SOA dead? No. Is REST dead? No. Is Flash dead? No. Is Cobol dead? No. Is C(++) dead? No. Is Objective-C dead? No.
The most important question to ask will be: what is the right technology for my needs?
- How can I start fast?
- Does it need to scale?
- Is it a reliable technology?
- Can I find good developers for my project?
And I am really bored by these kinds of discussions since they are always just about hype ...
Re: JavaScript and mainstream
by John L,
Seconded. Except... the fast-start question is the ONLY one that seems to matter in the real world, which is why the "superior" solution will always get left behind, and we'll find ourselves maintaining rickety crap 10 years from now, only to be supplanted by the next big thing using a lesser-quality framework that's 10 years more advanced than today's lesser-quality framework.
Re: JavaScript and mainstream
by Jean-Jacques Dubray,
I agree with both John and Florian; in the end, this is why I believe that adopting a model-driven software development approach is key. Your solution will always do pretty much the same thing regardless of the technology you use or the architecture you deploy.
Hypermedia is an Event Filter
by Andrew Wahbe,
Hypermedia and REST have a key role in event processing on the Web -- see: Hypermedia is an Event Filter. We've somehow missed this along the way.
Now what REST doesn't account for are events generated by the server. The current direction of the web is to fall back to Mobile code (Option 2) to deal with this. I think that a better approach can be found. This might be something like extending REST with 1) event filters pushed to the server to help optimize the transmission of data, and 2) a standard protocol that is aware of the event streams (perhaps, in practice, using web sockets as transport) to allow intermediaries to further help in the transmission/processing of data (e.g. forking data streams to multiple clients). We could also use better hypermedia constructs in HTML for declaratively dealing with asynchronous network requests and events.
However, it's likely that the industry will need to get experience with the drawbacks of Mobile Code before looking for something better.
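A minimal sketch of the first idea, i.e. a client registering a declarative event filter so that the server only pushes matching events (the endpoint, filter vocabulary and event shape below are entirely hypothetical, not part of any existing standard):

    // Hypothetical sketch: clients POST a declarative filter and hold the
    // response open; the server only pushes events that match that filter.
    var http = require('http');
    var subscribers = [];  // { filter: {...}, res: http.ServerResponse }

    function matches(filter, event) {
      // Deliberately tiny filter vocabulary: every key in the filter must
      // equal the corresponding key in the event.
      return Object.keys(filter).every(function (k) { return event[k] === filter[k]; });
    }

    http.createServer(function (req, res) {
      if (req.method === 'POST' && req.url === '/subscriptions') {
        var body = '';
        req.on('data', function (chunk) { body += chunk; });
        req.on('end', function () {
          res.writeHead(200, { 'Content-Type': 'application/json' });
          subscribers.push({ filter: JSON.parse(body), res: res });  // keep the connection open
        });
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);

    // Simulated event source: only subscribers whose filter matches get the event.
    setInterval(function () {
      var event = { ticker: 'ACME', price: Math.random() * 100 };
      subscribers.forEach(function (s) {
        if (matches(s.filter, event)) s.res.write(JSON.stringify(event) + '\n');
      });
    }, 1000);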
Re: Hypermedia is an Event Filter
by Jean-Jacques Dubray,
Andrew:
the definition of an "event" is "an occurrence of a state": when a user "clicks" on a link, he requests an action to be performed. The action usually results in a state transition, and the occurrence of the new state is "the event".
Hypermedia defines an action-oriented interface to a resource. At any point in time, the set of actions is limited to the transitions available in the current state of the resource (e.g. if a purchase order is in the "shipped" state, the "cancel PO" action is not available).
REST does not make ANY state of the resource explicit, only actions. Hence, REST has no particular hooks to relate resources and events (as the occurrence of a state). The difference between REST/Hypermedia and Distributed Objects is that REST provides an explicit set of available actions for every resource representation; in the case of DO, the client has to know what's allowed or not at any point. But even though REST provides the explicit set of actions available for a given "state" of the resource, it does not make the states of the resource explicit.
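JJ's purchase-order example can be made concrete with a sketch of two representations of the same resource (the link relations and fields below are invented for illustration): the set of embedded actions changes with the state, but the state itself is only implied by which actions are present:

    // Hypothetical representations of one purchase order in two different states.
    // "cancel" disappears once the order has shipped: hypermedia constrains the
    // client to the transitions that are currently valid.
    var pendingOrder = {
      id: 'po-123',
      links: [
        { rel: 'self',   method: 'GET',    href: '/orders/po-123' },
        { rel: 'cancel', method: 'DELETE', href: '/orders/po-123' },
        { rel: 'pay',    method: 'POST',   href: '/orders/po-123/payment' }
      ]
    };

    var shippedOrder = {
      id: 'po-123',
      links: [
        { rel: 'self',  method: 'GET', href: '/orders/po-123' },
        { rel: 'track', method: 'GET', href: '/orders/po-123/shipment' }
      ]
    };

    // A generic client decides what it can do by inspecting the links,
    // not by hard-coding knowledge of the order's lifecycle.
    function can(representation, rel) {
      return representation.links.some(function (l) { return l.rel === rel; });
    }

    console.log(can(pendingOrder, 'cancel'));  // true
    console.log(can(shippedOrder, 'cancel'));  // false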
Re: Hypermedia is an Event Filter
by Andrew Wahbe,
When most users click a link they are not asking for a "GET" action to be performed on a "resource" -- they don't know what those things are. They are clicking the blue underlined text because it affords clicking and have an expectation that the screen will update with information related to the link text. The role of hypermedia is to communicate information with embedded controls to a user (read Fielding's definition of hypermedia!). The controls afford input from the user and determine how the input is processed. The events I'm referring to in the article are UI events -- a mousedown event reflects that the state of the mouse button is that it is being pressed. Hypermedia translates these events into actions on resources. Just open the hood of a browser and look at what it is doing!
Re: Hypermedia is an Event Filter
by Jean-Jacques Dubray,
Andrew,
I am not quite sure one would actually couple a "mouse-down event" to a resource state transition.
Coupling the resource representation with UI does not seem a good value proposition either.
Re: Hypermedia is an Event Filter
by Andrew Wahbe,
JJ,
I think we must really be misunderstanding or talking past each other, because I'm interpreting your comment roughly as "HTML doesn't seem to be a good idea". A comment thread is likely just the wrong forum for this discussion.
Re: Hypermedia is an Event Filter
by Jean-Jacques Dubray,
Not quite ... as I understand REST, it does not mean serving up a bunch of HTML to a browser. REST is about exchanging resource representations that are directly related to the application state and, by composition, the resource state. The resource representation embeds "links" which define what you can do next from that resource's perspective. This model does not assume any UI, though it works best to support a UI.
Our industry has been looking for ages for a Web Application Architecture that's decent and generic, from the days of WebObjects (dynamic HTML) to node.js today. I would conclude: unsuccessfully. It has been a complete mess, leading to very large numbers of failed projects and underperforming systems. The big guys, from Yahoo to Google or Amazon, not to mention Facebook, love the Web for its ability to bring billions of users to their domain. Very few benefit; everyone else pays the price for a suboptimal architecture (by a large margin), decades-old technologies and a complete lack of evolution.
Telling us that the Web was event driven all along -and we didn't know it- is quite a stretch for a technology that has no eventing mechanism whatsoever. When you see what a hack XMPP is, I can only repeat: whatsoever. Now give me a socket and I'll do anything you want, for sure, but please, let's stop the nonsense of a) telling everyone that Web technologies are well designed for modern information system architectures -they are not- and b) pretending we can fix completely broken technologies with yet another so-called silver bullet.
The fundamental problem our industry refuses to admit is that the world (of information technology) is not "monadic". Everyone comes with a technology or paradigm where "everything is XXX" (we've tried XXX = table, procedure, object, service, (business) process, resource, event, function ...). That monadism is bound to fail, yesterday, today and tomorrow. The world of information technology is polyadic; it requires all these concepts to be carefully articulated in a single programming model. Three years ago I developed WSPER (Web, Service, Process, Event, Resource) to show that it was possible and beneficial to do so. But it looks like we are still looking, after 50 years or so of software engineering, for that elusive monadic solution. Whether it is for the Web, Client/Server, Mainframe, Mobile, ... that solution does not exist. Any reasonable human would have long moved on to expand the horizon. Not our industry.
As Tim Bray would say so well, the emperor has no clothes. The Web as we know it is dead, unable to adapt, while all its followers think it is the ultimate platform. Thinking that "Javascript" can salvage it is actually quite ironic.
Re: Hypermedia is an Event Filter
by Andrew Wahbe,
JJ,
I agree with everything you are saying about silver bullets, "monadic" tech culture etc. Let's not go there because I don't think we disagree (I spend a great deal of my time with non-Web/REST systems, e.g. SIP-based communication systems). I want to be clear that I'm not saying that REST addresses server-side events (or not much beyond say having the server update the state of a resource in response to an event).
My point was:
1) A key role of hypermedia in REST (or at the very least the HTML Web) is to provide declarative instructions on the processing of client-side events (it seems you might disagree but let's get to that later)
2) I am proposing that it is worth exploring how some of the techniques REST brings to client-side event handling might be applied to server-side event handling.
Let me also be clear in saying that the output from (2) would likely not be REST -- it would be some other new style. I would hope that it would be an evolutionary path from REST rather than something brand new, as that would be a more practical outcome if we are looking to extend the current Web (though that would not be as much of a concern for other systems).
The reason I'm saying this is that the current trend towards downloaded scripts + websockets (which I think corresponds to a move away from REST towards the Mobile Code style I examined in my article) has a lot of serious drawbacks, and I think we can and must do better.
So back to client-side event processing: I realize I might have a different take on hypermedia than most. I think that defining hypermedia simply as "linked data" is incorrect. Roy Fielding's definition is as follows: "Hypermedia is defined by the presence of application control information embedded within, or as a layer above, the presentation of information." In his 2008 ApacheCon presentation he summarized hypertext as "Data-Guided Controls". To interpret REST's constraints such as "Hypermedia as the Engine of Application State" you have to use the same definition of "hypermedia" as the author. I think this definition requires more than links to be added to data -- it requires "controls" to be added to data. My interpretation (influenced by my experience building hypermedia-driven systems such as VoiceXML browsers) is that the term "control" implies a client-side input/event processing mechanism. I discuss this further here: Machine-to-Machine Hypermedia
JJ, I often agree with many of your criticisms of REST; however, while I feel that the issues you point out apply to REST as commonly practiced (i.e. using a flawed notion of hypermedia), they may not hold for the fundamental style (i.e. when actual hypermedia controls are used). I don't think sprinkling links in your data magically frees you of client-server coupling. I do think that a client-side event processor driven by declarative programs (i.e. "hypermedia") dynamically downloaded from the server does a lot to reduce coupling. In a nutshell, it gives you the decoupling benefits of an event-driven architecture but delivers event filters to the event source (the client) at run-time to optimize network communication and server load. And that same approach is what I'm suggesting might be worth investigating for server-generated events.
I hope you don't see this as "the same old REST argument" as I wouldn't want to waste your time with that.
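One way to picture the "control" Andrew describes, as opposed to a bare link, is a declarative description downloaded with the data and interpreted by a generic client-side event processor; the vocabulary below is entirely made up for illustration:

    // Hypothetical "control" delivered alongside the data: it declares which UI
    // event to listen for, how to filter input locally, and which resource
    // action the event should be translated into.
    var control = {
      on: 'submit',                          // client-side event to capture
      validate: { quantity: { min: 1 } },    // filtering done on the client
      action: { method: 'POST', href: '/orders' }
    };

    // A generic event processor that knows nothing about orders, only about controls.
    function attach(form, control, send) {
      form.addEventListener(control.on, function (e) {
        e.preventDefault();
        var quantity = Number(form.elements['quantity'].value);
        if (quantity < control.validate.quantity.min) return;  // event filtered locally
        send(control.action.method, control.action.href, { quantity: quantity });
      });
    }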
Re: Hypermedia is an Event Filter
by Jean-Jacques Dubray,
Andrew.
the "old REST argument" doesn't exist anymore. Most people have moved on and understand that the way REST was sold in 2007 to the greater community of developers, just was not true. All the BS about "uniform interface" was complete vaporware -as usual.
To get back to your points:
1) "to provide declarative instructions on the processing of client-side events": well, yes and no. Define "declarative": ATOM gives you some semantics -if you use them- otherwise href doesn't tell you anything about how to process events. I think the general problem is that yes, we can do anything we want by adding semantics anywhere. Once we do that, we create a "proprietary" way of using REST. Lots of people might see it as a best practice. Lots of people might laugh at the standardization process of WS-* (including me), but at the end of the day we have to be very clear and precise about what we mean by declarative.
2) It might, but what's the difference with ASP.NET? Or JSF (which I know less well)? Or GWT? Those guys already established this kind of thing, for some about a decade ago.
In the end, we have to differentiate the solution model from the technology AND the architecture, otherwise we'll be circling like this for another 50 years. The solution model has to describe GUIs, entities, ... in a technology- and architecture-neutral way. Project Volta at Microsoft introduced the concept of Architecture Refactoring; this is the kind of direction we need to go in. We need to be able to express solutions in a technology- and architecture-neutral way.
REST is both a technology (HTTP) and an architecture. Hence it is unsuitable for creating solution models. Yes, the concept of Resource is interesting and a good candidate to make it into the solution model, but REST is deliberately vague about what a resource is (a piece of addressable information).
Our industry should look for solution models, technologies and architectures independent of each other. Why should I rewrite my order entry system? Because the technology I used is no longer supported? Because architectures change? No way ...
In the end you have to decide where this kind of sentence fits: "a client-side event processor driven by declarative programs (i.e. "hypermedia") dynamically downloaded from the server does a lot to reduce coupling."
I don't disagree with it, but where does it belong? Solution model, Technology or Architecture?
So I don't disagree with your statements; they are true in isolation. However, in terms of the general direction of our industry, I think they go in the wrong direction. You are trying to add solution model semantics to technologies and architectures. That should not happen. Things like Machine-to-Machine hypermedia belong to the realm of the solution model. HTTP can be the best technology (today) and REST can be the best underlying architecture; however, the solution model semantics should be independent.
I hope we agree.