September 20, 2010 12:24 am

Promises are a well-established mechanism for modeling future or asynchronous actions. Promises allow asynchronicity while maintaining the core programming principles of composability and encapsulation. Writing asynchronous code in JavaScript can often be a confusing exercise due to the extensive need for callbacks, but promises help to define composable units of asynchronicity to encapsulate actions and reliably separate caller and callee’s concerns.


Promised-IO utilizes promises as an abstraction for I/O operations on top of Node, Narwhal/Rhino, and the browser (where possible). This serves two purposes. First, this package provides the benefits of promise usage: clean separation of concerns and proper encapsulation of eventual values. Second, Promised-IO provides a consistent normalized interface for I/O that will work on multiple platforms without sacrificing any of the advantages of asynchronous I/O, making it easy to build modules that can be used by developers on many platforms.

Usage Principles

One of the characteristics of a good developer is understanding when to apply different abstractions (rather than trying to force a single approach on every situation). Asynchronous code is often more complicated than synchronous code, but can provide big benefits to performance and scalability. In the application of asynchronous I/O there are several different approaches, and each has an appropriate usage.

Synchronous (don’t use asynchronous)

Synchronous calls are usually simpler (no callbacks needed), and for parts of an application that are not on “hot” concurrent code paths and are not performance sensitive, they can be the smart choice. For server startup, long-period scheduled tasks, and infrequently executed code, the simplicity and maintainability of synchronous code can outweigh any performance benefit of asynchronous code.

(There are actually times when synchronous performs better than asynchronous. Asynchronous operations have the additional overhead of handling the low-level callback, adding the JavaScript callback to the event queue, and finally pulling events off the event loop. Synchronous operations do not have this overhead. For example, for file operations that hit the OS cache, the synchronous operation can often be about twice as fast (half as much CPU time) as its asynchronous counterpart. Of course, this is only advantageous for cached data.

Synchronous operations can have a much larger performance advantage when there is no native support for asynchronicity. Fortunately, most of the I/O operations in NodeJS leverage libeio and low-level asynchronous POSIX operations for good performance, but for actions that are natively synchronous, some libraries use libeio’s eio_custom function to make them asynchronous. This function can be very expensive and should only be used for operations that will take a significant amount of time (more than about 0.1 ms). I have observed synchronous operations executing hundreds of times faster than their asynchronous counterparts when eio_custom is used.

All this being said, most of the time asynchronous is a good choice for code that needs to scale, but it is important to know when and when not to use it.)

Event Registration

This is the typical way that one is notified of events in the browser and in NodeJS. You register for an event and are notified of the event anytime after the registration (no notification of past events). Event registration is the most generic and low-level asynchronous API. NodeJS uses this model exclusively because it is a low-level platform for I/O. For some situations (see below) it is appropriate to build abstractions on top of this API. However, event registration is the right choice if multiple events may take place and you only want future events.


Promises

An asynchronous abstraction designed specifically to model an action that has a single eventual completion (successful or failing). Promises act much like a value (returned from the result of a function call or computation), except that the result is asynchronous, and may or may not be immediately available. Promises are more specific than event registration, providing encapsulation (that can easily be passed around) of an eventual value and decoupling the receiver from the timing of the underlying events that fulfill the action (whether they be past or future). Promises are the right choice for asynchronous actions that have a single point of eventual completion.
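The core of the pattern can be sketched with a minimal deferred object (a simplified illustration, not promised-io’s actual implementation):

```javascript
// A minimal deferred: produces a promise for a single eventual value.
function Deferred(){
  var callbacks = [], resolved = false, value;
  this.promise = {
    then: function(callback){
      if (resolved) callback(value); // fulfilled in the past: still notified
      else callbacks.push(callback); // fulfilled in the future: queued
    }
  };
  this.resolve = function(result){
    resolved = true;
    value = result;
    callbacks.forEach(function(cb){ cb(result); });
  };
}

// The receiver is decoupled from when fulfillment happens:
var deferred = new Deferred();
deferred.promise.then(function(v){ console.log("got " + v); });
deferred.resolve(42); // prints "got 42"
deferred.promise.then(function(v){ console.log("again: " + v); }); // prints "again: 42"
```

Note the contrast with event registration: a callback registered after resolution still receives the value, because a promise models one eventual result rather than a stream of events.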


Streams

Streams are an asynchronous abstraction for the progressive flow of sequential data. Streams are important for allowing a sender to feed data to a receiver without requiring large-scale buffering at either end. Promised-IO implements streams with “lazy arrays”, an array-like interface that follows the standard API of JavaScript Arrays. This greatly improves the modularity of streams since they can be used with code that expects standard arrays. Lazy arrays are indeed lazy, following the functional programming principle of lazy evaluation as well. Iterative methods including map, some, filter, and every are purely functional and only iterate through the stream as needed, avoiding any unnecessary buffering or computation. Promised-IO’s streams also utilize promises to signal the eventual completion of a stream (for the end of a file or the end of an HTTP response, for example), for consistency. For situations where you want to model sequential data, streams/lazy arrays are appropriate.
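The idea can be sketched with a simplified, synchronous lazy array (real Promised-IO streams are asynchronous and their iteration returns a promise; the names here are illustrative, not the library’s internals):

```javascript
// A lazy array over a source function: next() yields the next chunk,
// or undefined when the sequence is exhausted.
function lazyArray(next){
  return {
    forEach: function(callback){
      var chunk;
      while ((chunk = next()) !== undefined) callback(chunk);
    },
    // map is lazy: nothing is pulled from the source until iteration
    map: function(fn){
      return lazyArray(function(){
        var chunk = next();
        return chunk === undefined ? undefined : fn(chunk);
      });
    }
  };
}

// A lazy source producing three chunks, never buffering them all at once
var i = 0;
var chunks = lazyArray(function(){
  return i < 3 ? "chunk" + (i++) : undefined;
});

var upper = chunks.map(function(s){ return s.toUpperCase(); });
upper.forEach(function(s){ console.log(s); }); // CHUNK0, CHUNK1, CHUNK2
```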

Promised-IO Modules

There are several key modules available in Promised-IO for access to promise-based I/O.


fs

The “fs” module implements key parts of the CommonJS file system API (the entire API will eventually be supported). It also provides Node’s file system API, where each asynchronous function returns a promise rather than taking callback parameters (fortunately the two APIs can easily coexist). This gives you access to a number of the convenience functions from CommonJS’s API (like makeTree()) and normalized access to asynchronous functions. This module is obviously not available for browser usage. Here is an example of using readFile() (an asynchronous function from Node’s API):

var fs = require("promised-io/fs");
fs.readFile("my-file.txt").then(function(contents){
   // once it is loaded, we can do something with the contents
});

One of the most important functions in the “fs” module is the open() function which can be used in several powerful ways. The open function returns a file descriptor for use with Node’s functions that take a descriptor (the read, write, and close functions). While the open function opens the file asynchronously, the returned object can be directly passed to read, write, and close functions (they will execute once the file is open):

var fs = require("promised-io/fs");
var myFile = fs.open("my-file.txt", "r");
fs.read(myFile, buffer, offset, length, position).then(function(){
  // buffer should now be filled
});

The returned object is also a promise for the completion of the open operation:

var when = require("promised-io/promise").when;
var myFile = when(fs.open("my-file.txt", "r"), onSuccess, onFailure);

The object returned from open also has methods (based on asynchronous versions of the CommonJS filesystem API) for writing and closing:

var myFile = fs.open("my-file.txt", "a");
myFile.write("some data").then(myFile.close);

And finally, the returned file object can be treated like a JavaScript array for streamed reading (lazy arrays):

var myFile = fs.open("my-file.txt", "r");
myFile.forEach(function(buffer){
    // called for each block of data
}).then(myFile.close); // forEach returns a promise for the completion of the reading

The some() JavaScript array method can be particularly useful if you might not need to traverse the whole file:

var myFile = fs.open("my-file.txt", "r");
myFile.some(function(buffer){
    // if we are searching for something, we can exit by returning true at any time
});

One other useful aspect of using these lazy arrays is that we can return the file object directly as the body of JSGI responses and it will automatically be piped to the client.

function SomeJSGIApp(request){
  return {
    status: 200,
    headers: {"content-type":"text/plain"},
    body: fs.open("my-file.txt", "r")
  };
}


http-client

An HTTP client that follows the JSGI API. JSGI was originally designed for handling incoming HTTP requests (for web servers) and returning responses; http-client effectively uses the JSGI API in reverse, allowing for the construction of JSGI requests and the receipt of JSGI responses. The module adds some convenience properties and defaults to make it easy to create a request: you simply provide a “url” property to indicate the target, and all other properties are optional. For example:

var request = require("promised-io/http-client").request;
request({
  url: "http://example.com/", // the target (example URL)
  method: "GET", // optional
  headers: {} // also optional
}).then(function(response){
  // response.status -> HTTP status code
  // response.headers -> HTTP response headers
  // response.body -> a lazy array stream representing the body of the response
});

This module can be used on the server and browser.


delay

This module provides promise-based scheduling and timing in the spirit of setTimeout and setInterval. The two exported functions are delay(ms) and schedule(ms). The delay() function returns a promise that is fulfilled after the specified number of milliseconds. The schedule() function returns a lazy array that iterates periodically at the given interval. For example:

var delay = require("promised-io/delay").delay;
delay(2000).then(function(){
  // and two seconds later this executes
});

This module can also be used on the server and browser.
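Conceptually, delay() is just a promise fulfilled by setTimeout. A hand-rolled sketch (with a simplified promise, not the module’s actual implementation):

```javascript
// A simplified promise that is resolved by a timer
function delay(ms){
  var callbacks = [], done = false;
  setTimeout(function(){
    done = true;
    callbacks.forEach(function(cb){ cb(); });
  }, ms);
  return {
    then: function(callback){
      if (done) callback(); // timer already fired
      else callbacks.push(callback);
    }
  };
}

delay(100).then(function(){
  console.log("about 100 ms later"); // runs once the timer fires
});
```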


system

This module implements the CommonJS “system” module API and provides access to a print() function and process arguments.


Conclusion

Promised-IO provides a robust, cross-platform I/O system based on the proven design principles of promises for efficient, portable, asynchronous JavaScript.
