I've been working with JavaScript, in one form or another, since about 1997. Back then a mouse rollover was impressive; 20 years later JS has reached critical mass, with browser vendors competing to create the best (fastest, least battery usage, whatever) implementations they can. Node.js gives us JS on the server. Progressive Web Apps let JS compete with native code on mobile (the success of that is something for another blog post).
Today I want to write about one of the new features in JS that makes it so much easier to write complex applications with it. Something that lets us do more with less code.
JS is single threaded. Well, it isn't really (lots of events will run in parallel, and exactly which varies by browser), but it doesn't give you control over them. There are multiple threads, but you don't get to pick how to use them. The same code might thread differently across browsers, or even in the same browser depending on what's happening at the time.
What you will have coded is asynchronous events – lots and lots of events. Timer events, user interaction events, callback events, you name it. What we're interested in here are the events where you ask for something and then want to carry out an action against the result when (or if) you get it.
I'm going to pick a common use case: AJAX. I'm going to start with a really basic callback implementation:
/** Simple callback XMLHttpRequest */
function ajaxCallback(url, onSuccess, onFail) {
var request = new XMLHttpRequest();
request.addEventListener('load', function () {
if (request.status >= 200 &&
request.status < 300) {
var json = JSON.parse(request.response);
onSuccess(json);
}
else
onFail(request.statusText);
});
request.addEventListener('error', function () {
onFail('Network');
});
request.open('GET', url);
request.send();
}
ajaxCallback(
'yourApi/getsomething',
function (resultData) {
// Do something on success
},
function (err) {
// Log an error
});
Yes, I realise that your preferred library has a much more complete/stable/whatever AJAX implementation. I'm not recommending this; it's just a really simple example (not intended for production – it only handles HTTP GET requests and JSON results).
You provide a function to call if successful, and that function continues from where you left off. This is fine; it works consistently in lots of browsers; everybody understands what it's doing.
However, it gets messy fast as soon as you have multiple asynchronous actions. Branching logic is even nastier. With two AJAX actions, you can nest them (with the first callback starting the second) or run them concurrently (in which case you need to deal with recording the completion states of both, in whatever order they return). With three actions it's more complicated still.
For instance, suppose we want to:
- call a service to get a list of item IDs,
- then call another service for each of those IDs to retrieve the item,
- and finally sort the combined results and hand them to a single final callback.
(This really should be reduced into fewer requests. It's just an example with both sequential and parallel asynchronous actions – feel free to pick your own.)
function logError(err) { ... }
function finalCallback(items) {
// Do something with the complete list
}
function doTheThing() {
// AJAX get the list
ajaxCallback('yourApi/getlist',
function (resultData) {
// When that completes get the expected number of items
var itemLimit = resultData.items.length;
// For each item make a new AJAX call
var retrievedItems = [];
// We need to track errors that may or may not have fired in a callback
var errored = false;
for(var i in resultData.items) {
if(errored)
break; // Error already fired before loop completed
var itemid = resultData.items[i];
ajaxCallback('yourApi/getitem/' + itemid,
function (item) {
if(errored)
return; // Error fired by another async return
retrievedItems.push(item);
// Have all the callbacks finished then?
if(retrievedItems.length === itemLimit) {
// We've now added all the expected results, so sort
var sorted = retrievedItems.sort(comparer);
try {
// Call the final callback
finalCallback(sorted);
}
catch(ex) {
// We have an error in finalCallback
logError(ex);
}
}
},
function (err) {
// We have to wrap our error logger
logError(err);
// Set an error flag or we'll never finish
errored = true;
});
}
},
logError);
}
This is messy, almost certainly has some corner cases on errors (and generally unpleasant error handling), and is tough to maintain. It also smells bad – that deeply nested finalCallback really should sit outside of this tree, handling errors at each level is duplication at best, and so on. Come back to this six months later to add another asynchronous action (say IndexedDB local caching) and you have to unpick the whole tree of callbacks.
Actually, let's do that, just for fun (feel free to scroll on to the next section about Promises):
function logIndexedDBError(evt) {
// Get the useful error message out of the event object
logError(evt.target.error.message);
}
function setUpObjectStores(evt) {
// Upgrade the IndexedDB instance to the latest version
// I really should check for the version and only add the specific missing stores
evt.currentTarget.result.createObjectStore('results', { keyPath: 'id' });
}
function getIndexedDBStore(storeName, openedStore, write) {
// Open the DB, with a callback - note that older browsers may call this something else
var dbRequest = indexedDB.open('my-cache', 1);
dbRequest.onerror = logIndexedDBError;
dbRequest.onupgradeneeded = setUpObjectStores;
// There's also dbRequest.onblocked, but let's leave that for now
// This fires once the DB has opened
dbRequest.onsuccess = function(evt) {
var db = evt.target.result;
db.onerror = logIndexedDBError;
// Open a transaction for the store, then open the store from that
var transaction = db.transaction(storeName, write ? 'readwrite' : 'readonly');
transaction.onerror = logIndexedDBError;
var objectStore = transaction.objectStore(storeName);
openedStore(objectStore);
};
}
function getCache(id, onDone) {
// Try and get the value from the results cache, and always fire onDone(id)
// so the next callback knows which ID we were trying to get.
getIndexedDBStore('results', function(objectStore) {
var storeRequest = objectStore.get(id);
storeRequest.onerror = function(evt) {
logIndexedDBError(evt);
onDone(id); // Still fire something to let the caller know we're done
};
storeRequest.onsuccess = function(evt) {
onDone(id, evt.target.result);
};
}, false);
}
function setCache(item) {
// Fire and forget to set the cache, because this example is complicated enough
getIndexedDBStore('results', function(objectStore) {
var storeRequest = objectStore.put(item);
storeRequest.onerror = logIndexedDBError;
}, true);
}
OK, that's already pretty horrible. Loads of the error handling is missing, upgrades aren't handled correctly, blocking isn't handled, older browsers aren't handled, the save method is fire and forget, and low level exceptions will cause the top level callbacks to not be called. IndexedDB isn't the most user friendly. However, let's go with it, just for this example:
function doTheThing() {
// AJAX get the list
ajaxCallback('yourApi/getlist',
function (resultData) {
// When that completes get the expected number of items
var itemLimit = resultData.items.length;
// For each item make a new AJAX call
var retrievedItems = [];
// We need to track errors that may or may not have fired in a callback
var errored = false;
for(var i in resultData.items) {
if(errored)
break; // Error already fired before loop completed
// Oh the joys of var in JS, by the time getCache fires its callback
// this variable will hold an ID from later in the loop
var itemid = resultData.items[i];
getCache(itemid, function(callbackid, cacheItem) {
if(cacheItem) {
// Duplicate this block, but for the item from the cache
retrievedItems.push(cacheItem);
if(retrievedItems.length === itemLimit) {
var sorted = retrievedItems.sort(comparer);
try {
finalCallback(sorted);
}
catch(ex) {
logError(ex);
}
}
return;
}
// Not in the cache, so nest an AJAX callback in our IndexedDB callback
// Use callbackid, echoed from getCache(), rather than itemid
ajaxCallback('yourApi/getitem/' + callbackid,
function (item) {
if(errored)
return; // Error fired by another async return
// Really we should make sure this completes, but that's ANOTHER callback!
setCache(item);
retrievedItems.push(item);
// Have all the callbacks finished then?
if(retrievedItems.length === itemLimit) {
// We've now added all the expected results, so sort
var sorted = retrievedItems.sort(comparer);
try {
// Call the final callback
finalCallback(sorted);
}
catch(ex) {
// We have an error in finalCallback
logError(ex);
}
}
},
function (err) {
// We have to wrap our error logger
logError(err);
// Set an error flag or we'll never finish
errored = true;
});
});
}
},
logError);
}
Urg. I feel ill just looking at that block. There are things I could do – refactor this out, add smart wrapper objects, clean up the duplicated code, and such like. They all require significant rewrites and more complicated code.
I was wrong: that wasn't fun at all.
Another awkward issue with callbacks is that in some cases the event may have already fired before you subscribe to it. Then, in addition to your event callback functions, you need yet more logic to handle the case where the asynchronous action finished before your next line of code executed.
With a Promise, instead of providing a callback function, you have an object (the Promise) that keeps track of the asynchronous event.
These aren't a new concept, and have also been called deferred (in jQuery), future, eventual, delay or task (in .NET). Though there are (sometimes drastic) implementation differences between all of them, they all share the same basic idea.
Promises are now supported natively by all the main browsers except IE11 (because IE), though it's fairly simple to shim. Because promises are native they are rapidly becoming the standard for any framework that needs an asynchronous pattern.
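As an aside, the "already fired" problem from the callback world simply goes away here: a handler attached with then() after a promise has settled still runs. A minimal sketch (the timings are made up purely for illustration):
// A promise that settles almost immediately
var done = new Promise(function (resolve) {
    setTimeout(function () { resolve('finished'); }, 10);
});
// Attach the handler long after the promise has resolved - it still fires with the settled value
setTimeout(function () {
    done.then(function (value) {
        console.log(value); // 'finished'
    });
}, 1000);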
We can easily rewrite our basic callback-based AJAX function using Promises:
/** Simple Promise XMLHttpRequest */
function ajaxPromise(url) {
// Init a promise with a function that has two function parameters
return new Promise(function (resolve, reject) {
var request = new XMLHttpRequest();
request.addEventListener('load', function () {
if (request.status >= 200 &&
request.status < 300){
var json = JSON.parse(request.response);
// Calling resolve completes the promise
resolve(json);
}
else
reject(Error(request.statusText));
});
request.addEventListener('error', function () {
reject(Error('Network')); // For nice stack traces this should be an Error
});
request.open('GET', url);
request.send();
});
}
The promise-based function is called in a fairly similar way to the callback version:
ajaxPromise('yourApi/getsomething').then(
function(resultData) {
// Do something on success - inside the promise resolve() has been called
},
function(err) {
// Log an error - inside the promise reject() has been called
});
So far this doesn't look a whole lot different.
Where this becomes really useful is when you have multiple asynchronous actions. Rather than nest callbacks we can now chain them using then – we don't need to cascade tracking and error logic down the callback functions.
Let's go back to our example from before:
function doTheThing() {
// AJAX get the list
ajaxPromise('yourApi/getlist')
.then(function (r) {
var itemPromises = r.items.map(function(itemid){
// For each item make a new AJAX promise
return ajaxPromise('yourApi/getitem/' + itemid);
});
// Return this promise once all of the item promises have finished, in any order
return Promise.all(itemPromises);
})
.then(function(retrievedItems) {
return retrievedItems.sort(comparer);
})
.then(finalCallback)
.catch(logError);
}
This is a lot less code, and it's a lot more maintainable – if we want to add another asynchronous action we can just tag on another then. We can easily refactor out one of our bad code smells too:
function doTheThing() {
// AJAX get the list
return ajaxPromise('yourApi/getlist')
.then(function (resultData) {
var itemPromises = resultData.items.map(function(itemid){
// For each item make a new AJAX promise
return ajaxPromise('yourApi/getitem/' + itemid);
});
// Return this promise once all of the item promises have finished, in any order
return Promise.all(itemPromises);
})
.then(function(retrievedItems) {
return retrievedItems.sort(comparer);
})
.catch(logError);
}
// finalCallback no longer needs the deep nesting or additional parameter
doTheThing().then(finalCallback);
Along with Promise we get some really useful utility functions, such as Promise.all, which allows you to wait until all the promises have succeeded and then handle the results (that is always a pain with callbacks).
There are still some weird gotchas, for instance Promise.race – you might assume that returns the first promise to finish, and it does, but it also calls off the entire race if any of the promises fail before there's a winner (I find it helps to think of it as a car race that gets red-flagged if anyone crashes, rather than athletics where someone trips and everyone else finishes). It's also still easy to fail to catch exceptions, as they're just passed along the chain until a then with an error function or a catch is reached.
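To make that race behaviour concrete, here's a tiny sketch (the promises and delays are made up purely for illustration; logError is the same logger used earlier):
// One promise that will eventually succeed, one that fails first
var slowSuccess = new Promise(function (resolve) {
    setTimeout(function () { resolve('winner'); }, 200);
});
var fastFailure = new Promise(function (resolve, reject) {
    setTimeout(function () { reject(Error('crash')); }, 100);
});
Promise.race([slowSuccess, fastFailure])
    .then(function (result) {
        // Never reached - the rejection red-flagged the race first
    })
    .catch(function (err) {
        logError(err); // 'crash'
    });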
There's a new fetch API that returns a Promise – really we should use that instead. The syntax is a little different from our XMLHttpRequest function, as fetch makes reading the response body a promise too, which is better for large responses and streams. It's really worthy of its own blog post, but it's easy to drop in to replace XMLHttpRequest.
function ajaxPromise(url) {
return fetch(url, { method: 'GET' }).then(function(r) {
if (r.ok)
return r.json();
throw new Error(r.statusText);
});
}
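Because it keeps the same signature and still resolves with the parsed JSON, the calling code doesn't need to change – for example:
// Same call shape as the XMLHttpRequest version
ajaxPromise('yourApi/getsomething')
    .then(function (resultData) {
        // Do something on success
    })
    .catch(logError);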
Most new APIs are using promise-based implementations (like fetch, the Cache API, Battery Status API, Bluetooth API, and so on). Older ones still use callbacks, but (as seen above) it's fairly simple to convert a callback into a promise. We saw earlier the fun of callbacks in IndexedDB, and others have already done the hard work of wrapping all that in promises, such as idb.
OK, so promises are great, but you have lots of callback-based async actions, and they work fine. Why change to the new pattern? Let's look at what's coming next…
async/await
Coming in ES2017 are the async and await language keywords. They are already supported in Chrome, Firefox support is coming in v52 (due out in March), and support is coming soon in Edge and Safari. They work in Node.js 8.0.0 (and >= 7.3.0 with --harmony-async-await). If you need to run on anything other than those, both Babel and TypeScript can transpile them down for older JS versions.
These are syntactic sugar for Promise functions that make asynchronous actions easy (if you've used them in .NET, where they are syntactic sugar for Task, this should look very similar). Anything you write with Promise today is going to work with async/await as soon as the browsers/stack you use supports them.
async marks a function as asynchronous – it will return a Promise that resolves with whatever you return, and rejects if an error is thrown. If async functions are nested, an await only applies to the closest enclosing one.
await tells the function to return the Promise here, and continue the rest of this function in the .then() (when the Promise resolves). You can await any function that's marked as async or that returns a Promise (in fact, you almost always should await anything that you can). Multiple (or branched) awaits can appear in a single async function, in a similar way to how promises can be chained.
So how does our AJAX example look in ES2017?
// The async keyword tells JS that this is going to be awaited
async function doTheThing() {
try {
// The await keyword says: come back and resume when .then happens
// This is similar to return ajaxPromise('yourApi/getlist').then(function(resultData){...
const resultData = await ajaxPromise('yourApi/getlist');
// The async keyword can also be on anon functions, but we don't here
// because ajaxPromise also returns a promise that we'll await below
const itemPromises = resultData.items.map(itemid => ajaxPromise('yourApi/getitem/' + itemid));
// Now wait for all the item promises to finish
// This is similar to return Promise.all(itemPromises).then(function(retrievedItems){...
const retrievedItems = await Promise.all(itemPromises);
// We can just return the final instance - async will 'wrap it' in a Promise for us
return retrievedItems.sort(comparer);
}
catch (err) {
// A regular try catch
logError(err);
}
}
// Promise syntax still works...
doTheThing().then(finalCallback);
// Or, in another async function...
finalCallback(await doTheThing());
This isn't really less code than with promises (yet), but it's a lot clearer. It looks a lot like synchronous code would; we can even use try-catch for error handling. It's easier to write, it's easier to read, and it's easier to see what fires when and whether exceptions are caught.
Let's have a go with IndexedDB again, and let's use idb to promisify it:
async function getIndexedDBStore(storeName, write) {
// idb changes the upgrade signature, for this demo I've just put it inline
const db = await idb.open('my-cache', 1, db => db.createObjectStore('results', { keyPath: 'id'}));
const transaction = await db.transaction(storeName, write ? 'readwrite' : 'readonly');
const objectStore = transaction.objectStore(storeName);
return objectStore;
}
async function getCache(id) {
const objectStore = await getIndexedDBStore('results', false);
const item = await objectStore.get(id);
return item;
}
async function setCache(item) {
const objectStore = await getIndexedDBStore('results', true);
await objectStore.put(item);
return;
}
OK, that was a bit easier. How about plugging it in?
async function doTheThing() {
try {
const resultData = await ajaxPromise('yourApi/getlist');
// We convert an array of IDs into an array of promises waiting for AJAX data
const itemPromises = resultData.items.map(async itemid => {
// Try for the IndexedDB cache
// This await continues the map loop, but not the parent doTheThing - only the closest async applies
let item = await getCache(itemid);
if(item)
return item; // We have it cached
// Call the service
// This await resumes the map loop
item = await ajaxPromise('yourApi/getitem/' + itemid);
// Fire and forget to save the item
setCache(item);
return item;
});
// resultData.items.map doesn't await anything, so the result of each anon async function is a promise
// We don't want to sort until the loop has finished, so we still need Promise.all
const retrievedItems = await Promise.all(itemPromises);
return retrievedItems.sort(comparer);
}
catch (err) {
logError(err);
}
}
That was a lot easier.
So what's the gotcha? There's got to be some downsides, right? Well, once you go async you never go back – functions that call async code tend to need to be async too. Although the async code looks synchronous, it's really just a wrapper around promises, and underneath it all we're still just managing all those callbacks.
When a function calls an async function it can:
- Fire and forget, like setCache above – probably best avoided.
- Become async too and call the nested function with await.
- Use .then() on the returned Promise (without becoming async too).
That last case is the messiest, but we've already seen it with the map loop – if you don't await an async function then you get a promise:
async function getThing() {
const thing = await ajaxThing();
return thing;
}
// Then
let t = await getThing(); // t will be the thing, once getThing() has finished
let p = getThing(); // p will immediately be a promise, that resolves with the thing once getThing() has finished
// Either of these will get us the thing
p.then(t => { /* We have t */ });
t = await p;
As long as you understand the underlying promises you shouldn't have too many problems with this; just be careful with try-catch, as that only works with await – if you have a promise you'll need a .catch(handleErr) in the chain. In our example the fire-and-forget setCache method is outside of the try-catch because it is not awaited – when that method is called the asynchronous process starts, but nothing tracks what happens to it. If we change that line to await setCache(item); then errors will be caught, but the Promise.all will take slightly longer.
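A minimal sketch of that difference, using a hypothetical failingSave() as a stand-in for setCache:
// A stand-in for setCache that always fails (purely for illustration)
function failingSave() {
    return Promise.reject(Error('save failed'));
}
async function demo() {
    try {
        failingSave();       // Fire and forget: this rejection is NOT caught below
        await failingSave(); // Awaited: this rejection lands in the catch block
    }
    catch (err) {
        logError(err); // Only sees the awaited failure
    }
}
// The un-awaited promise needs its own handler if we care about its errors
failingSave().catch(logError);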
async/await is an awesome feature in JS, not because it gives us something we couldn't do before, but because it makes it so much easier to do. If you have just one or two asynchronous actions it's probably not going to make a difference, but the chances are that you don't. The modern JS developer will have asynchronous REST services, asynchronous local storage, service workers (with asynchronous fetching and caching), and who knows how many new asynchronous APIs? async/await is the key to making these easy to code.