
Virtual Panel: How to Survive Asynchronous Programming in JavaScript


Programmers take certain features for granted: sequential programming, for instance, writing down an algorithm that does one thing after the other.

However, if you're writing code in JavaScript that uses blocking I/O or other long-running operations, sequential coding is out of the question, because blocking the only thread in the system is a very bad idea. The solution is to implement algorithms using asynchronous callbacks, i.e. to spread sequential code over multiple callbacks.

That solves the problem, but it means we lose the ability to write down a sequential algorithm; instead, for non-trivial sequential code, we end up with a graph of callbacks.
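To make this concrete, here is a minimal sketch (the step functions are hypothetical, with setTimeout standing in for real I/O) of how three sequential statements end up as nested callbacks:

```javascript
// Each "step" finishes asynchronously and reports back through a callback.
function step1(cb) { setTimeout(function () { cb(null, 1); }, 0); }
function step2(x, cb) { setTimeout(function () { cb(null, x + 1); }, 0); }
function step3(x, cb) { setTimeout(function () { cb(null, x * 2); }, 0); }

// What would be three sequential statements becomes a nested pyramid:
step1(function (err, a) {
  step2(a, function (err, b) {
    step3(b, function (err, c) {
      console.log(c); // 4
    });
  });
});
```

Every additional step deepens the nesting, and any branching or looping multiplies the callback graph.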

This becomes even more critical for large-scale applications that make heavy use of asynchronicity. Callback passing for asynchronous actions does not compose well and can create complex flows of passing callbacks around to handle return values.

The JavaScript community is aware of this, particularly the Node.js community, because Node.js puts an emphasis on asynchronous code.

The CommonJS group has an answer for this in the form of Promises, which aim to provide an interface for interacting with an object that represents the result of an action that is performed asynchronously, and may or may not be finished at any given point in time. This way different components can return promises for asynchronous actions and consumers can utilize the promises in a predictable manner. Promises can also provide the fundamental entity to be leveraged for more syntactically convenient language-level extensions that assist with asynchronicity.
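To illustrate the concept, here is a hand-rolled toy deferred (a sketch, not the CommonJS reference implementation; the helper names are hypothetical): the producer returns a promise immediately, and consumers attach handlers whenever they like.

```javascript
// A deferred pairs a promise with the resolve function that fulfils it.
function defer() {
  var callbacks = [], result, done = false;
  return {
    promise: {
      then: function (cb) {
        if (done) { cb(result); } else { callbacks.push(cb); }
      }
    },
    resolve: function (value) {
      done = true;
      result = value;
      callbacks.forEach(function (cb) { cb(value); });
    }
  };
}

// An async producer returns the promise right away...
function readConfig() {
  var d = defer();
  setTimeout(function () { d.resolve({ port: 8080 }); }, 0);
  return d.promise;
}

// ...and the consumer registers interest in the eventual result.
readConfig().then(function (config) {
  console.log(config.port); // 8080
});
```

The key point is that the result object exists before the computation finishes, so it can be passed around, stored, and composed like any other value.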

Stratified JavaScript is another approach, which fixes this in a superset of the JavaScript language. But if you can't switch languages, the route to go is via a flexible API that allows you to emulate sequential code. If that API allows for a concise notation, it is often referred to as an embedded DSL.

InfoQ took a look at a list of these APIs and DSLs and talked to their creators about how they approached the problem, the design principles and paradigms they follow, and much more. And, of course, where the limits of these solutions lie.

In particular InfoQ has contacted:

InfoQ: What problems does your library address? I.e. is it mostly about removing boilerplate code and manual callback handling from async I/O, or does it also provide orchestration or other functionality (e.g. to help with issuing multiple I/O calls and waiting for them to finish)?

Tim (Step): Step's goal is to both remove boilerplate code and to improve readability of asynchronous code.  It is very minimal and doesn't do anything you can't do manually with copious use of try..catch blocks and counter variables.  The features are easy chaining of serial actions with optional parallel groups within each step.
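As a rough sketch of this style (a stripped-down, hypothetical miniStep written for illustration, not the real Step library): each function's `this` serves as the callback into the next step, thrown errors become the next step's error argument, and synchronous return values are forwarded automatically.

```javascript
// miniStep: run the given functions in sequence; inside each one,
// `this` is the callback that advances to the next function.
function miniStep() {
  var steps = Array.prototype.slice.call(arguments), i = 0;
  function next(err) {
    var fn = steps[i++];
    if (!fn) { return; }
    try {
      var ret = fn.apply(next, arguments);
      // Synchronous steps just return a value; forward it on.
      if (ret !== undefined) { next(null, ret); }
    } catch (e) {
      next(e); // Thrown errors become the next step's `err` argument.
    }
  }
  next(null);
}

miniStep(
  function start(err) { setTimeout(this.bind(null, null, 21), 0); }, // async
  function double(err, n) { return n * 2; },                         // sync
  function report(err, n) { console.log(n); }                        // 42
);
```

This shows why steps can be sync or async with the same syntax: the runner treats a returned value and a callback invocation identically.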

Will (Flow-js): Flow-JS provides a JavaScript construct that is something like a continuation or a fiber found in other languages. Practically speaking, it can be used to eliminate so-called "pyramids" from your multi-step asynchronous logic. Rather than directly nesting callback function literals, you use a special "this" value as a callback to the next function in the flow definition.

Kris Zyp (node-promise): The problem is that typical callback style execution flow (continuation passing style) conflates the concerns of an interface, mixing functional inputs with output handlers. Promises encapsulate the eventual completion of a computation, allowing functions/methods to be executed with purely input parameters while the output, returned promises, hold the result.

I've explained these principles in a little more detail here and here.

By encapsulating an eventual computation, promises work great for orchestrating parallel and serial actions, even with complex conditional logic. The node-promise library includes functions to make this easy (the all() function, and the step() function in promised-io).
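The orchestration idea can be sketched with today's built-in Promises (a descendant of the Promises/A design discussed here) standing in for node-promise's all(); the fetch functions are hypothetical:

```javascript
// Two independent async sources, started in parallel.
function fetchUser() {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve({ name: "ada" }); }, 0);
  });
}
function fetchPosts() {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(["p1", "p2"]); }, 0);
  });
}

// all() waits for every promise, then hands over the combined results.
Promise.all([fetchUser(), fetchPosts()]).then(function (results) {
  console.log(results[0].name, results[1].length); // ada 2
});
```

Because each call returns a value immediately, the parallel/serial structure is expressed in ordinary data flow rather than in callback wiring.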

Also, just FYI, promised-io has kind of superseded node-promise. It is the same core promise library, but promised-io also includes promise-style versions of the Node.js I/O functions and, where available, platform normalization to access similar functions in the browser.

Caolan (Async): Yes, for the most part this is about removing boilerplate. The code for calling functions in series or parallel then waiting for callbacks in JavaScript is pretty verbose but obvious. Shortly after node.js adopted callbacks over promises as the basic means of handling async code, I found myself using the same patterns again and again. Extracting these into a separate library seemed the obvious thing to do.

Since then it has grown to encompass more complex features which allow the orchestration of callbacks based on their dependencies. For the most part, however, this is a fairly low-level library that leaves the overall structure to the developer. However, I find JavaScript frequently suits a more functional programming style, and quickly added asynchronous versions of map, reduce, filter and the other usual suspects. The library really shines when used in this way and allows you to stick with conventional callbacks without using continuations or promise objects.
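The flavor of these asynchronous collection helpers can be sketched with a hand-rolled asyncMap (an illustration of the pattern, not the real Async library):

```javascript
// Apply an async iterator to every item concurrently; collect results
// in input order and call back once when all of them have finished.
function asyncMap(items, iterator, callback) {
  var results = [], pending = items.length, failed = false;
  if (pending === 0) { return callback(null, results); }
  items.forEach(function (item, i) {
    iterator(item, function (err, value) {
      if (failed) { return; }
      if (err) { failed = true; return callback(err); }
      results[i] = value;
      if (--pending === 0) { callback(null, results); }
    });
  });
}

// Usage: "fetch" three values concurrently, keeping input order.
asyncMap([1, 2, 3], function (n, cb) {
  setTimeout(function () { cb(null, n * 10); }, 0);
}, function (err, results) {
  console.log(results); // [ 10, 20, 30 ]
});
```

The caller sticks with plain error-first callbacks throughout; no promise objects or continuations are involved.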

Fabian (Async.js): Async.js tries to simplify common async patterns in JavaScript. Its main purpose is to apply a series of async functions to a set of uniform objects. It evolved from an asynchronous forEach function and generalized the concept. This is especially useful when working with the async file system API of node.js, though it is not bound to node.js and can be used for all similar problems. This snippet demonstrates the gist of async.js:

async.readdir(__dirname)
   .stat()
   .filter(function(file) {
       return file.stat.isFile()
   })
   .readFile("utf8")
   .each(function(file) {
       console.log(file.data)
   })
   .end(function(err) {
       if (err)
           console.log("ERROR: ", err)
       else
           console.log("DONE")
   })

This code operates on all items in the current directory; this is the uniform set of objects. For each item, a sequence of async operations is performed. It first filters out all items which are not files (e.g. directories), then reads each remaining file from disk and prints it to the console. After all items are processed, the final callback is called, indicating an eventual error.

AJ (FuturesJS): Asynchronous and event-driven programming can be somewhat difficult to reason about.

I created Futures primarily to

  • provide a single asynchronous control-flow library for Browser and Server-Side (Node.JS) use.
  • expose a consistent pattern for handling callbacks and errbacks
  • control the flow of an application in which events depend on one another
  • handle callbacks for multiple resources, such as mash-ups
  • encourage good programming practices such as using models and handling errors

Futures.future and Futures.sequence simply reduce the amount of common boilerplate code and provide some flexibility.

Futures.join can join (in similar fashion to how one would join threads) or synchronize (for events which occur at intervals) multiple futures.

Futures.chainify makes it easy to create asynchronous models, similar to the Twitter Anywhere API.

Isaac (slide-flow-control): The problem that slide addresses is that I needed something to talk about for the OakJS meetup, and didn't want to have to come up with a whole new idea, because I am so very lazy.  Mostly, I just wanted to do almost no work, show up, drink some beer, eat some Chinese food, rub elbows with interesting people, bask in a little bit of positive attention, and then go home.  The work-to-payoff ratio is very important to me, in software and in life. So, I just repurposed the dead-simple async helper functions that I use in npm so that they'd fit in a slide deck, called it "slide" because of that restraint, and presented it.

The other problem it addresses is that it shows that it's very easy to write your own flow control library.  Everyone likes their own the best, so it makes sense to just give people a few basic patterns and let them get creative.

InfoQ: Does the library implement ideas from CS research?

Tim (Step): Nothing directly.

Will (Flow-js): Not that I know of. It was just my first stab at making business logic, which tends to make many synchronous calls to outside services, manageable in Node.js.

Kris Zyp (node-promise): Yes, absolutely. Most computer science research on asynchronous design has pointed to promises, in various forms, as the most appropriate mechanism for functional flow and proper separation of concerns. The term "promise" was originally proposed by Daniel P. Friedman and David Wise, dating back to 1976. You can read more of the rich computer science history behind promises in the Wikipedia article.

Caolan (Async): I don't have a CS background and implemented the Async library on a purely pragmatic basis. When I needed a higher-order function to clean up some async JavaScript, and I used it repeatedly, it went into the library.

Fabian (Async.js): The implementation of async.js remotely resembles Haskell's monads, but this is more by accident.

AJ (FuturesJS): Yes. The biggest influences have been

The best thing about asynchronous programming is that it naturally leads to writing more modular code. If you're going to have any sort of asynchronous model, you're forced to follow the principle of always passing parameters in and never passing data which belongs to a model outside of that model.

Isaac (slide-flow-control): It implements the continuation pattern, sort of. A lot of CS research misses the point, I think.  It is a finger pointing at the moon.  To get there, you need a rocket.  Straighter fingers will not help. If you can grok this, then deep mysteries will reveal themselves.

InfoQ: Does the library offer any error handling strategies? How does it interact with exception handling?

Tim (Step): If an exception is thrown in any step, it's caught and forwarded to the next step as the error parameter.  Also non-undefined return values are forwarded to the next step as the callback value.  This way steps can be either sync or async using the same syntax.

Will (Flow-js): Flow-JS doesn't have any built-in exception handling, which is definitely a weakness. Tim Caswell wrote a module based on flow called "Step", which wraps calls to each of the provided functions in try/catch blocks and forwards caught exceptions to the next function in the sequence.

Kris Zyp (node-promise): Yes, promises are designed to provide an asynchronous equivalent of synchronous flow: just as a JavaScript function can throw an error or successfully complete and return a value, a promise can resolve to a successful value or to an error state. Promises that are passed down to callers can propagate errors until an error handler "catches" them, just as a thrown error propagates until it is caught. The node-promise library properly maintains this concept, making it easy to register error handlers or to propagate errors until they are otherwise caught (to avoid silent suppression of errors). By having direct synchronous equivalents to promises, it is easy to reason about promises using existing understanding of code flow.
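The propagation behavior can be sketched with built-in Promises (a descendant of the Promises/A design) standing in for node-promise; the failing function is hypothetical:

```javascript
// An operation that eventually fails.
function failingRead() {
  return new Promise(function (resolve, reject) {
    setTimeout(function () { reject(new Error("disk on fire")); }, 0);
  });
}

failingRead()
  .then(function (data) { return data.toUpperCase(); }) // skipped on error
  .then(null, function (err) {                          // "catches" it
    console.log("recovered:", err.message);
    return "default";
  })
  .then(function (value) { console.log(value); });      // default
```

The success handler in the middle is skipped, exactly as statements after a `throw` are skipped, and flow resumes at the first error handler downstream.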

Caolan (Async): Exception handling with async code can be a little tricky, especially if you're unfamiliar with heavily async environments like node.js. For me, actually handling the errors is more important than whatever style you adopt to do it. In the browser this is especially important, since it's easy to accidentally let an exception bubble up to the top and kill all JavaScript on the page.

There is a lot to be said for following convention here. Exception handling needs to be straightforward and preferably familiar, so it's easy to implement and obvious when you forget. Unfortunately, JavaScript in the browser doesn't help us much here, but node.js has provided a simple convention that can be easily used in both environments.

The Async library adopts exactly this style, using the first argument of a callback to pass errors to the next step in your program. If the first argument is null (or another 'falsy' value) then it can be ignored; otherwise it is treated as an exception. Where possible, execution will be cut short in the Async library to speed things up. If one function of a collection being run in series passes an error to its callback, then subsequent functions in the collection will not be called.
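A hand-rolled series helper (an illustration of the convention, not the Async library itself) shows both the error-first callback style and the early exit:

```javascript
// Run tasks one after another; stop at the first error.
function series(tasks, callback) {
  var results = [];
  (function next(i) {
    if (i === tasks.length) { return callback(null, results); }
    tasks[i](function (err, value) {
      if (err) { return callback(err); } // later tasks never run
      results.push(value);
      next(i + 1);
    });
  })(0);
}

series([
  function (cb) { cb(null, "one"); },
  function (cb) { cb(new Error("boom")); },
  function (cb) { cb(null, "never reached"); }
], function (err, results) {
  console.log(err ? err.message : results); // boom
});
```

The final callback sees either the first error or the full list of results, never a mixture of the two.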

Fabian (Async.js): Async.js builds on the error-handling conventions of node.js. The first argument of each callback is reserved for an error object. If a computation fails or an exception is thrown, the error/exception is passed as the first argument to the callback. Async.js supports two error-handling strategies, which can be configured through the API. On error, either the whole operation on the set is stopped and an error callback is called, or the failing element is skipped from the set.

AJ (FuturesJS): Since exceptions can't be "thrown" asynchronously, the user is instead encouraged to pass any exceptions as the first argument to the callback.

The basic idea is to try {} catch(e) {} the error and pass it rather than stopping the application at some unrelated time. Futures.asyncify() does this for the purpose of using synchronous functions in a predominantly asynchronous environment.

Here's an example:

(function () {
  "use strict";

  var Futures = require('futures'),
    doStuffSync,
    doStuff;

  doStuffSync = function () {
    if (2 % Math.floor(Math.random()*11)) {
      throw new Error("Some Error");
    }
    return "Some Data";
  };

  doStuff = Futures.asyncify(doStuffSync);

  doStuff.whenever(function (err, data) {
    if (err) {
      console.log(err);
      return;
    }
    console.log(data);
  });

  doStuff();
  doStuff();
  doStuff();
  doStuff();
}());
  

Isaac (slide-flow-control): Never throw.  Never ever throw.  Throwing is the devil.  Don't do it. Never ever not at all ever forever. When a callback is called, the first argument is either an error or null.  If it's an error, handle it, or pass it to your callback to handle. Pass an error to a callback to signal that an error has occurred.

InfoQ: Was the library inspired/influenced by the work done with F#'s Workflows, Rx (the JavaScript version), or other projects?

Tim (Step): Yes, the style was inspired directly by flow-js.

Will (Flow-js): Not really. It was just the first solution that occurred to me.

Kris Zyp (node-promise): The node-promise library was influenced by Mark Miller's E language and its use of promises, Tyler Close's ref_send library, Kris Kowal's Q library, Neil Mix's NarrativeJS, Twisted's and Dojo's Deferred implementations, and many other libraries.

Caolan (Async): I'm afraid I've not used F# or Rx so I'm not qualified to comment on its relationship with those projects. It does, however, take inspiration from Underscore.js, an excellent functional programming library for JavaScript. Most of the functions which use iterators in Underscore have been updated to accept callbacks and operate asynchronously in the Async library.

Fabian (Async.js): The chaining character of the API is influenced by jQuery. One of my goals is to provide a jQuery-like API for the node.js file system module. The other big influence is Python-style generators. Each element in the chain generates values, which can be consumed by subsequent chain elements. The whole operation is triggered by the last element in the chain, which "pulls" values through the chain. In this sense async.js is different from jQuery and Rx, where values are pushed from the source. This pull system makes it possible to compute all values lazily and also to have generators which yield an infinite number of values (e.g. all even integers).

AJ (FuturesJS): Not at first, no.

I was building a mash-up site using Facebook and Amazon and my first attempt was a mess because I just didn't understand how to go about handling a model made from two resources. (I wasn't really too familiar with JavaScript at the time; I had been trial-and-error-ing through the "WTFJS"es and using a little jQuery to ease the DOM pain)

What I found was that it is easier to always assume that any data may take some time to retrieve than to ever assume that the data will exist when needed and then have to refactor everything down the entire chain when any one particular dataset can only be handled asynchronously.

I tried, failed, and half-succeeded using a few different methods, and then, fortunately, someone on my local JavaScript user group list made mention of the Crockford on JS lecture series. After watching the whole series (I watched the 3rd section at least 3 times) I finally had a better grasp on how to manage the "problems" (or rather, opportunities) of asynchronous programming, so I found Crockford's slides and started with the promises example that he gave as my base point.

Later on I started playing with Node.JS, and that's when I changed my error-handling strategy (but the documentation didn't reflect that until just a few days ago). In Futures 2.0, which I'll be releasing this upcoming Sunday, I've also bundled Node.JS's EventEmitter for browser use.

Isaac (slide-flow-control): Nope.  It was, I suppose, inspired by the pattern that we came to use in NodeJS for callbacks.  I just am a stickler for consistency, because I'm not smart enough to remember more than one type of thing (or maybe two, on a good day, with lots of coffee) without getting really confused and walking into a closet thinking it's the bathroom and then leaving all the coats smelling like... well... you get the idea.

No.  One kind of thing, everywhere.  That's all.  Functions that take a callback as the last argument.  Callbacks get an error as the first argument, or null/undefined if it worked.  Slide is just a few helper functions that make it easy to do a bunch of things using that pattern.

InfoQ: Are there any new JavaScript language features or changes that could make the library better, e.g. allow it to be more concise?

Tim (Step): Perhaps, but not without major changes to the semantics of the language.  Maybe a preprocessor like CoffeeScript could help with the syntax, but I think it's best to stick to vanilla JavaScript most of the time.

Will (Flow-js): In my opinion, JavaScript desperately needs something like Ruby 1.9's fibers. I've spent a lot of time considering Node.js for my upcoming projects, but at some point the asynchronous programming always gets brain-melting. There are lots of tools for making it more manageable, but I can't help but feel like the existence of so many libs like Flow-JS is just an indication that JavaScript actually sucks for concurrent programming.

I know one of the goals of Node was to avoid modifying the V8 JavaScript engine, but according to the guys at Asana, adding fibers wasn't too much trouble.

Kris Zyp (node-promise): Yes, there has been discussion around single-frame or shallow continuations, similar to generators, which can avoid the need for callbacks that complicate branching and looping flows. This could be used in conjunction with promises to create extremely simple and easy to read asynchronous code.

One more note: the node-promise library also implements the http://wiki.commonjs.org/wiki/Promises/A specification, which means it can interoperate with Dojo and probably future jQuery promises.

Caolan (Async): The Async library was designed to make the most of the language as it stands. To work with the grain, and not try to implement a new language on top of JavaScript.  

That said, the addition of yield to JavaScript 1.7 could have some interesting applications for future projects. With yield, it might be possible to port some Twisted-like features to JavaScript for a more synchronous-looking coding style. This is something a colleague of mine was exploring with Whorl, although the effort now appears to be abandoned.
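The generator-driven style that Kris Zyp and Caolan both allude to can be sketched with a tiny runner (hypothetical helpers, using the generator syntax that later landed in the language): yield a promise, and the runner resumes the function with its value, so the flow reads sequentially.

```javascript
// Drive a generator: each yielded promise resumes the generator
// with its resolved value, emulating a shallow continuation.
function run(genFn) {
  var gen = genFn();
  (function step(value) {
    var next = gen.next(value);
    if (next.done) { return; }
    next.value.then(step);
  })();
}

// A stand-in for any promise-returning async operation.
function later(value) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(value); }, 0);
  });
}

run(function* () {
  var a = yield later(2); // looks sequential, runs asynchronously
  var b = yield later(3);
  console.log(a + b); // 5
});
```

Branching and looping work unchanged inside the generator body, which is exactly what plain callbacks make awkward.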

Fabian (Async.js): A standardization of the generators and iterators supported by Mozilla could make the async.js code more concise and simplify these kinds of problems.

AJ (FuturesJS): The most repeated code in the library is the fudge that makes the same code work in the browser and Node.JS. I know some libraries (such as teleport) have popped up to try to solve this issue, but I haven't played with any of them yet. It certainly would be nice if an asynchronous require had been built into the language.

From my point of view, a language as naturally asynchronous as JavaScript should have something akin to Futures built in.

CommonJS has a few proposals for standardizing server-side promises, but whereas their focus is more on data privacy, Futures is more focused on control-flow, end-developer ease-of-use, and browser compatibility.

Isaac (slide-flow-control): No.  My flow control is the best.  It can't be improved by anyone else, because it's best-ness is directly related to its my-ness, so any outside influence would make it less mine and thus less best. If you want to have this experience, I suggest writing a flow control library.  You'll find that it is the best as soon as you write it.  If any other library seems like it might be better, then you can rush back to your editor, hiding your shame, and quickly reinvent all their ideas in a slightly different way that is yours, and then you will know in your heart that it is now the best.

You can find more information about JavaScript right here on InfoQ!
