JavaScript Asynchronous Programming Tips, Tricks, and Gotchas
Why asynchronous programming is necessary in JavaScript, how it works, and the common coding errors that can catch anyone out.
Table of contents
- Synchronous vs asynchronous code
- Why asynchronous code is necessary
- Tip 1: remember to return after executing a callback
- Tip 2: ensure functions are 100% synchronous or 100% asynchronous
- Tip 3: switch to Promises
- Tip 4: use async/await
- Tip 5: run Promises in parallel when possible
- Tip 6: avoid using asynchronous functions in synchronous loops
- Conclusion
Asynchronous concepts appear in other languages, but it's impossible to avoid them in JavaScript. JavaScript runs applications on a single-threaded, non-blocking I/O event loop.
To explain the jargon, imagine you're running a restaurant on your own. You take the first order, cook the food, and serve it to the customer. The whole process takes one hour.
You're operating on a single processing thread and you cannot take more orders or prepare other dishes during that time.
You now hire a chef to make your restaurant more efficient. You take the first order and pass it to the chef. The chef will alert you when the dish is ready so you can then serve it to the customer.
While you're waiting, you can take further orders and pass them to the chef. The chef may still take one hour to prepare an individual dish, but they can work on other dishes in parallel while food is cooking.
You're still operating on a single processing thread, but the chef handles all food preparation. You're free to do other tasks.
JavaScript applications run on a single processing thread and execute a single command at a time. To make this more efficient, the OS handles input/output operations such as HTTP requests, file reading, or database updates. The application doesn't wait for the operation to finish: it asks the engine to execute a callback function when the OS completes the task. This callback receives success or error data as necessary.
Synchronous vs asynchronous code
Consider the following PHP code to write text to a file:
<?php
echo 'saving file', PHP_EOL;
$result = file_put_contents('file.txt', 'content');
if ($result !== false) echo 'file saved', PHP_EOL;
echo 'complete', PHP_EOL;
?>
When run, the code outputs:
saving file
file saved
complete
The PHP interpreter executes the file_put_contents() statement and waits for completion before progressing to the next command.
Now consider similar JavaScript (Node.js) code:
import { writeFile } from 'node:fs';
console.log('saving file');
writeFile('file.txt', 'content', 'utf8', err => {
  if (!err) console.log('file saved');
});
console.log('complete');
The code outputs:
saving file
complete
file saved
Processing completes before writing the file!
The fourth argument passed to writeFile() is an anonymous ES6 callback function with a single err parameter. The callback runs once the file saves or fails to save. This takes a few milliseconds, but it's an OS I/O operation which runs in the background, so the JavaScript interpreter is free to run more code. It progresses to the next line and outputs complete.
Note: it's standard practice to pass an error as the first argument to a callback function. If no error occurs, the argument should be a falsy value such as null or undefined.
Why asynchronous code is necessary
JavaScript's non-blocking I/O event loop avoids problems when running code on a single thread:
- Browser JavaScript does not need to wait for a user to click a button -- the browser raises an event which calls a function when a click occurs.
- Browser JavaScript does not need to wait for a response to a Fetch request -- the browser raises an event which calls a function when the server returns data.
- Node.js JavaScript does not need to wait for the result of a database query -- the runtime raises an event which calls a function when data's returned.
JavaScript applications would grind to a halt if the engine handled input/output operations synchronously.
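As a minimal browser sketch (the button ID is hypothetical), an event handler is registered and the script continues immediately -- nothing blocks while waiting for the click:
// hypothetical button ID -- the handler is registered, then the script carries on
document.getElementById('load').addEventListener('click', () => {
  // runs only when the browser raises a click event
  console.log('button clicked');
});
console.log('listener registered -- the thread is not blocked');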
You may have written asynchronous event-handling functions in client-side JavaScript. These should run fast and pages do not remain open for long. An asynchronous bug could cause issues for an individual user but a page reload would fix it.
Asynchronous code bugs in server-side or complex client-side applications can be fatal. A small error can cascade into a series of memory leaks that crash the application.
Asynchronous programming is one of the primary causes of developer confusion when migrating to JavaScript from other languages. It can appear complex but the following tips will help you avoid common problems.
Tip 1: remember to return after executing a callback
The following pause() function waits for a set number of milliseconds before executing a callback function:
// pause for ms milliseconds
function pause(ms, callback) {
  ms = parseFloat(ms);
  // invalid ms value?
  if (!ms || ms < 1 || ms > 5000) {
    const err = new RangeError('Invalid ms value');
    callback( err, ms );
  }
  // wait ms before callback
  setTimeout( callback, ms, null, ms );
}

(() => {
  console.log('starting');
  // pause for 500 ms
  pause(500, (err, ms) => {
    if (err) console.log(err);
    else console.log(`paused for ${ ms }ms`);
  });
})();
The syntax looks correct and it runs as expected, with the paused message appearing half a second after starting:
starting
paused for 500ms
Now try passing an invalid ms argument to pause() such as 0. The output:
starting
RangeError: Invalid ms value
paused for 0ms
It raises the error, but the setTimeout also executes and the program outputs paused for 0ms. The callback function executes twice because pause() does not terminate when an error occurs.
It's important to remember that executing a callback does not end function execution. A return statement can solve the problem, e.g.
// invalid ms value?
if (!ms || ms < 1 || ms > 5000) {
  const err = new RangeError('Invalid ms value');
  callback( err, ms );
  return;
}
Or you could execute the callback in the return statement:
// invalid ms value?
if (!ms || ms < 1 || ms > 5000) {
  const err = new RangeError('Invalid ms value');
  return callback( err, ms );
}
The application executes the callback once and the output is correct:
starting
RangeError: Invalid ms value
Our callback code works. Or does it?...
Tip 2: ensure functions are 100% synchronous or 100% asynchronous
The fixed code above looks correct and could pass automated testing but there's a subtler issue: the callback runs immediately when an error occurs. At that point, the function is no longer asynchronous — it's synchronous. This can cause memory leaks in larger, long-running applications which eventually crash with "memory overflow" errors that are difficult to debug.
JavaScript functions must be 100% synchronous or 100% asynchronous. There should not be any path through an asynchronous function which immediately executes a callback on the current iteration of the event loop.
The solution is to ensure all callbacks run after a delay. Another setTimeout can raise the error after one millisecond:
// pause for ms milliseconds
function pause(ms, callback) {
  ms = parseFloat(ms);
  // invalid ms value?
  if (!ms || ms < 1 || ms > 5000) {
    const err = new RangeError('Invalid ms value');
    setTimeout( callback, 1, err, ms );
    return;
  }
  // wait ms before callback
  setTimeout( callback, ms, null, ms );
}
Note: Node.js offers setImmediate(), which calls a function during the next iteration of the event loop. You may have also used process.nextTick(), which works similarly but executes the callback before the end of the current iteration of the event loop. (This can cause a never-ending event loop if nextTick() is recursively called.)
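A minimal Node.js sketch of the scheduling difference (the numbered messages show the output order):
// setImmediate() runs on a following iteration of the event loop;
// process.nextTick() runs before the event loop continues
setImmediate( () => console.log('3: setImmediate') );
process.nextTick( () => console.log('2: nextTick') );
console.log('1: synchronous code runs first');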
Tip 3: switch to Promises
Callback-based code becomes increasingly difficult to maintain when you want to make a series of asynchronous function calls. It can lead to deeply nested callback hell:
asyncFn1(err => {
  console.log('asyncFn1 complete');
  asyncFn2(err => {
    console.log('asyncFn2 complete');
    asyncFn3(err => {
      console.log('asyncFn3 complete');
    });
  });
});
You'll find ways to flatten this structure in ECMAScript 5, but ES6/2015 introduced Promises. Promises are syntactical sugar: callbacks are still used below the surface. An asynchronous function must return a Promise object, constructed with a function that receives two parameters:
- resolve: a function run when processing successfully completes, and
- reject: a function run when an error occurs.
An alternative to the pause() function which returns a Promise:
// pause for ms milliseconds
function pausePromise(ms) {
  ms = parseFloat(ms);
  return new Promise((resolve, reject) => {
    if (!ms || ms < 1 || ms > 5000) {
      reject( new RangeError('Invalid ms value') );
    }
    else {
      setTimeout( resolve, ms, ms );
    }
  });
}
Note: Node.js provides util.promisify(). You can pass it a callback-based function (one whose callback receives an error as the first argument) and it returns a Promise-based alternative.
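A brief sketch, assuming the callback-based pause() function from Tip 2 is in scope:
import { promisify } from 'node:util';

// create a Promise-based version of the callback-based pause()
const pausePromisified = promisify(pause);

pausePromisified(500)
  .then( ms => console.log(`paused for ${ ms }ms`) )
  .catch( err => console.log(err) );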
Anything that returns a Promise can run a:
- then() method. It's passed a function with a single argument containing the result of the previous resolve().
- catch() method. It's passed a function with a single argument containing the result of the previous reject().
- finally() method. A function called at the end of processing regardless of success or failure.
Example code to call pausePromise():
pausePromise(500)
  .then(ms => console.log(`paused ${ ms }ms`) )
  .catch(err => console.log( err ) )
  .finally( () => console.log('complete') );
A .then() function can return another Promise or a value (which JavaScript wraps in a resolved Promise), so you can chain sequential asynchronous functions using a flatter and easier-to-read syntax:
pausePromise(100)
  .then(ms => {
    console.log(`paused ${ ms }ms`);
    return pausePromise(200);
  })
  .then(ms => {
    console.log(`paused ${ ms }ms`);
    return pausePromise(300);
  })
  .then(ms => {
    console.log(`paused ${ ms }ms`);
  })
  .catch(err => {
    console.log( err );
  });
Tip 4: use async/await
Flat Promise chains can still be confusing and it's easy to miss brackets. Note also that the whole Promise chain is asynchronous: any function using Promises should return its own Promise ... or run a callback function to confuse future you!
ES2017 introduced async and await, which allow you to use Promise-based functions with a clearer syntax. The chain above rewritten to use await:
try {
  const p1 = await pausePromise(100);
  console.log(`paused ${ p1 }ms`);
  const p2 = await pausePromise(200);
  console.log(`paused ${ p2 }ms`);
  const p3 = await pausePromise(300);
  console.log(`paused ${ p3 }ms`);
}
catch(err) {
  console.log(err);
}
An await keyword before any Promise-based asynchronous function makes the JavaScript engine wait until it's resolved or rejected. The resulting code looks much like a series of synchronous calls.
Any function using await must be declared with the async keyword to indicate it's asynchronous and turn it into a Promise-based function, e.g.
async function pauseSeries() {
  try {
    const p1 = await pausePromise(100);
    console.log(`paused ${ p1 }ms`);
    const p2 = await pausePromise(200);
    console.log(`paused ${ p2 }ms`);
    const p3 = await pausePromise(300);
    console.log(`paused ${ p3 }ms`);
  }
  catch(err) {
    console.log(err);
  }
}
async/await is great, but there are times you cannot depend on it...
Tip 5: run Promises in parallel when possible
The asynchronous examples above execute each function in series one after the other. This is necessary when the input for one call depends on the result of the previous call.
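For example (the endpoints here are hypothetical), a second request may require data returned by the first, so it cannot start any earlier:
// the second fetch depends on data from the first (hypothetical endpoints)
const userRes = await fetch('/api/user/1');
const user = await userRes.json();

// this URL cannot be built until user.id is known
const postsRes = await fetch(`/api/posts?author=${ user.id }`);
const posts = await postsRes.json();
console.log(`fetched ${ posts.length } posts`);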
You'll also encounter situations when one or more asynchronous functions are not related to others, such as fetch requests to unconnected endpoints:
try {
  const fetch1 = await fetch('/f1');
  console.log(`fetch1 status ${ fetch1.status }`);
  const fetch2 = await fetch('/f2');
  console.log(`fetch2 status ${ fetch2.status }`);
  const fetch3 = await fetch('/f3');
  console.log(`fetch3 status ${ fetch3.status }`);
}
catch(err) {
  console.log(err);
}
Running these in series is inefficient: it's faster to run them in parallel using Promise.all(). The method takes an array of Promises, runs each in parallel, and returns a new outer Promise where resolve() returns an array of output values in the same order:
Promise.all([
  fetch('/f1'),
  fetch('/f2'),
  fetch('/f3')
])
  .then(result => {
    console.log(`fetch1 status ${ result[0].status }`);
    console.log(`fetch2 status ${ result[1].status }`);
    console.log(`fetch3 status ${ result[2].status }`);
  })
  .catch(err => {
    console.log( err );
  });
The .catch() triggers as soon as any single Promise reject() runs. The remaining Promises are not cancelled, but their results are ignored.
The code runs as fast as the slowest Promise. There's no await syntax that runs Promises in parallel by itself, but you can await the result of Promise.all(), and since async functions return a Promise you can use them in the array.
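For example, the parallel fetches above could await Promise.all() directly (same assumed endpoints):
try {
  // all three requests start immediately and run in parallel
  const [ f1, f2, f3 ] = await Promise.all([
    fetch('/f1'),
    fetch('/f2'),
    fetch('/f3')
  ]);
  console.log(`fetch1 status ${ f1.status }`);
  console.log(`fetch2 status ${ f2.status }`);
  console.log(`fetch3 status ${ f3.status }`);
}
catch(err) {
  console.log(err);
}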
Similar Promise methods include:
- Promise.allSettled(): runs all Promises in the array and waits until every one has resolved or rejected. Each item in the returned array is an object with a .status property (either 'fulfilled' or 'rejected') and either a .value property (the resolved value) or a .reason property (the rejection error).
- Promise.any(): runs all Promises in the array but resolves as soon as the first Promise resolves; the others are ignored. It returns a single value.
- Promise.race(): runs all Promises in the array but resolves or rejects as soon as the first Promise resolves or rejects; the others are ignored. It returns a single value.
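A brief sketch of Promise.allSettled() showing the shape of each returned object (same assumed endpoints as above):
const results = await Promise.allSettled([
  fetch('/f1'),
  fetch('/f2'),
  fetch('/f3')
]);

results.forEach((result, i) => {
  // each result has a .status of 'fulfilled' or 'rejected'
  if (result.status === 'fulfilled') {
    console.log(`fetch${ i + 1 } status ${ result.value.status }`);
  }
  else {
    console.log(`fetch${ i + 1 } failed: ${ result.reason }`);
  }
});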
Tip 6: avoid using asynchronous functions in synchronous loops
The following code uses the Array.forEach() method to pause three times and add each result to totalWaited:
const pause = [100, 200, 300];
let totalWaited = 0;

pause.forEach(async p => {
  const ms = await pausePromise(p);
  console.log(`paused ${ ms }ms`);
  totalWaited += ms;
});

console.log(`total time waited: ${ totalWaited }ms`);
The result is not what you'd expect: the loop ends before the Promises resolve:
total time waited: 0ms
paused 100ms
paused 200ms
paused 300ms
This occurs because forEach() expects a synchronous function. It executes asynchronous functions but will not wait until they resolve or reject. The loop itself is synchronous: the Promises run in parallel and it's not possible to pass the result of one as an argument to the next. Promise.all() would be a better option.
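A sketch of the same totals calculated with Promise.all() and Array.map() (assuming pausePromise() from Tip 3):
const pause = [100, 200, 300];

// start all pauses in parallel and wait for every result
const results = await Promise.all( pause.map( p => pausePromise(p) ) );

results.forEach( ms => console.log(`paused ${ ms }ms`) );

const totalWaited = results.reduce( (sum, ms) => sum + ms, 0 );
console.log(`total time waited: ${ totalWaited }ms`);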
Methods including map() and reduce() are also synchronous and exhibit the same behavior.
There are clever ways to solve the problem, but the simplest option is a standard for loop which will await between each iteration:
const pause = [100, 200, 300];
let totalWaited = 0;

for (let p = 0; p < pause.length; p++) {
  const ms = await pausePromise( pause[p] );
  console.log(`paused ${ ms }ms`);
  totalWaited += ms;
}

console.log(`total time waited: ${ totalWaited }ms`);
The output:
paused 100ms
paused 200ms
paused 300ms
total time waited: 600ms
Conclusion
Asynchronous programming takes time to understand and can catch the most experienced JavaScript developers. I suspect it's the primary cause of unstable Node.js applications which crash with strange "Out of memory" errors. Key tips:
- use Promises with async and await when possible
- ensure functions are 100% asynchronous even when triggering a parameter error
- keep your code as simple as possible.
See also How to use client and server-side web workers which describes how to write long-running synchronous JavaScript functions which do not block the event loop.