Understanding the Event Loop, the Task Queue, and the Microtask Queue in JavaScript

By Luis Paredes · Published Jun 4, 2023

When working with JavaScript, one of the most confusing concepts is the event loop and how it interacts with the internal data structures involved in JavaScript's execution process.

In this article, we'll examine the concepts necessary to fully grok the mechanism and see how this knowledge can help us avoid common pitfalls and get to the root cause of certain bugs more efficiently.

Let's get started!

Concurrency in JavaScript

Because of the single-threaded nature of JavaScript, the language runtime cannot run several processes in parallel [1]; instead, it relies on a clever concurrency model to handle several operations in a non-blocking manner.

This model relies on the use of several internal queues that hold deferred items for execution after all the synchronous code has been handled.

Let's take a look at the most important queues, how they behave, as well as other important internal data structures to understand how it all fits together.

JavaScript internal data structures

The call stack

The call stack is an essential concept in JavaScript's execution model. It serves as a memory structure that keeps track of function calls during code execution. Whenever a function is invoked, a new frame is created and added to the top of the call stack. This frame contains information about the function, including its arguments and local variables.

As the function executes, it may call other functions, which in turn create new frames and are stacked on top of the previous ones. This stacking behavior forms a hierarchical structure, resembling a stack of plates. When a function finishes executing, its frame is popped off the call stack, and the control returns to the function that invoked it.

The call stack operates on a Last-In-First-Out (LIFO) principle. The most recently added function is always at the top of the stack and is the one currently executing. This sequential behavior ensures that functions are executed in the order they are called, preventing concurrent execution and guaranteeing the integrity of the program's state.

Understanding the call stack is crucial for debugging JavaScript code. When an error occurs, the call stack provides valuable information about the chain of function calls that led to the error, enabling developers to identify the source of the problem. By examining the call stack, developers can trace the execution flow and understand how functions interact with each other, aiding in the resolution of errors and the overall comprehension of program behavior.
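
For example, in the following snippet, when c() throws, the error's stack trace mirrors the frames that were on the call stack at that moment (c on top of b, on top of a), which is exactly the information you read when tracking down a bug:

function a() {
  b();
}

function b() {
  c();
}

function c() {
  // at this point the call stack contains: c -> b -> a -> (global)
  throw new Error("inspect the stack trace");
}

try {
  a();
} catch (err) {
  console.log(err.stack); // lists c, b and a, from top to bottom
}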

Task queue

The task queue, also known as the macrotask queue (to differentiate it from the microtask queue) or callback queue, is another crucial component in JavaScript's event-driven architecture. It is responsible for storing some of the tasks that are executed asynchronously, ensuring non-blocking behavior and efficient handling of operations that may take longer to complete.

When an asynchronous operation, such as a DOM event, a setTimeout timer, or an AJAX request, fires or completes, the corresponding callback is added to the task queue as a task.

The task queue follows a First-In-First-Out (FIFO) order, meaning that tasks are executed in the order they were added to the queue. This ensures that tasks are processed sequentially, maintaining the integrity of the program's state and avoiding race conditions.

The task queue plays a vital role in managing asynchronous operations in JavaScript. By deferring the callbacks of time-consuming operations to the task queue, JavaScript can continue executing other code without waiting for each individual operation to complete. This allows for smoother user experiences, as the browser remains responsive even while long-running operations are in flight.

It's worth noting that while the task queue is commonly referred to as the macrotask queue, it is also sometimes called the callback queue. This is because many asynchronous operations in JavaScript involve providing a callback function that will be executed when the task is ready to be processed. Hence, the task queue can be seen as a queue of callbacks waiting to be executed.
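
To see this in action, even a timeout with a 0ms delay does not run immediately; its callback sits in the task queue until the call stack is empty:

// The callback is enqueued in the task queue and only runs once
// all of the synchronous code below has finished executing.
setTimeout(() => console.log("from the task queue"), 0);

for (let i = 0; i < 1e8; i++) {
  // simulate a long-running synchronous computation
}

console.log("synchronous code done");

// Output:
// synchronous code done
// from the task queue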

Microtask queue

The microtask queue is a critical part of JavaScript's asynchronous mechanism, working in tandem with the task queue. It is responsible for handling microtasks, which typically include promise callbacks and MutationObserver callbacks.

Microtasks take precedence over tasks during execution. This means that when the call stack is empty and JavaScript is ready to process new work, it first checks the microtask queue. If there are any microtasks waiting to be executed, they take precedence and are processed before moving on to the next task in the task queue.

One important aspect to note is that even if new microtasks are added to the microtask queue while it is being processed, these newly added microtasks will still be executed before moving on to the next queue. This ensures that any subsequent microtasks generated during the execution of existing microtasks are given a chance to be processed without delay.

The microtask queue is especially useful for handling asynchronous operations that need to be executed immediately after the current task completes. For example, when a promise is resolved or rejected, the associated callbacks are scheduled as microtasks. This allows for precise control and sequencing of operations, ensuring that dependent actions occur promptly after a promise settles.
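
Both behaviors can be observed in a small snippet: the promise callback runs before a pending setTimeout callback, and a microtask enqueued from within another microtask still runs before the task queue is touched:

setTimeout(() => console.log("task"), 0);

Promise.resolve().then(() => {
  console.log("microtask 1");
  // Enqueued while the microtask queue is being drained, yet it still
  // runs before the setTimeout callback.
  queueMicrotask(() => console.log("microtask 2"));
});

console.log("sync");

// Output:
// sync
// microtask 1
// microtask 2
// task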

Other internal data structures

Depending on the context of execution (Node or the browser), there may be other queues involved in the concurrency mechanism of JavaScript; however, the three data structures we covered in this section are present regardless of the execution context.

In the next section, we'll see how the event loop makes use of these data structures.

The event loop

In JavaScript, the runtime model is managed through an event loop, which oversees the execution of code, handles events, and deals with queued sub-tasks.

The mechanism got its name because its implementation usually resembles the following pseudo-code:

while (queue.waitForMessage()) {
  queue.processNextMessage();
}

When taking into account the internal data structures explained in the previous section, the whole mechanism behaves like this:

  • The event loop continuously checks the state of the call stack, microtask queue, and task queue.
  • If the call stack is empty:
    • Check the microtask queue.
      • If there are microtasks present:
        • Take the next microtask from the microtask queue.
        • Execute the associated callback.
        • Repeat until all microtasks are processed.
    • Once the microtask queue is empty:
      • Move on to the task queue (macrotask queue).
        • Take the oldest task from the task queue.
        • Execute the task's associated code.
      • After that task finishes, drain the microtask queue again before picking up the next task.
  • Once all microtasks and tasks are processed, return to the start of the event loop and repeat the process.
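
Expressed as slightly more detailed pseudo-code (a simplified sketch; real engines also interleave rendering and other host-specific queues), one iteration of the loop looks roughly like this:

while (eventLoopIsRunning) {
  // 1. Run the oldest task from the task queue, if there is one.
  //    (Running the initial script also counts as a task.)
  if (!taskQueue.isEmpty()) {
    runToCompletion(taskQueue.dequeue());
  }

  // 2. Drain the microtask queue completely, including any microtasks
  //    that are enqueued while it is being drained.
  while (!microtaskQueue.isEmpty()) {
    runToCompletion(microtaskQueue.dequeue());
  }

  // 3. In the browser, rendering may happen here before the next iteration.
}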

Let's take a look at the mechanism using an example:

console.log(1);

setTimeout(() => console.log(2), 1000);

Promise.resolve().then(() => console.log(3));

setTimeout(() => console.log(4), 0);

console.log(5);

Given the previous snippet, what will be printed to the console and why? Take a moment to analyze the execution order based on the event loop mechanism before looking at the explanation below.

Explanation:

  1. The first statement to be processed will be console.log(1). As it is a synchronous statement, it will be added directly to the call stack and the frame will be processed right away, hence 1 will be printed to the console.
  2. Next, we have setTimeout(() => console.log(2), 1000). The setTimeout call itself gets processed right away, but its callback (() => console.log(2)) is deferred: it will run at least 1000ms (1s) after the setTimeout was executed, and only after all the synchronous code has run and the microtasks have been handled, since the callback gets enqueued in the task queue.
  3. Then, we get to Promise.resolve().then(() => console.log(3)), which, similarly to the previous line, gets executed right away, but the then() callback gets deferred (this is a microtask that will be picked up as soon as all the synchronous code has been processed).
  4. setTimeout(() => console.log(4), 0) gets processed just like the first setTimeout; notice that its deferral time is significantly smaller than the previous one (0ms vs 1000ms).
  5. Now we execute the final statement console.log(5), which happens to be synchronous and as a result, we get 5 logged to the console.
  6. After all of the synchronous code has been handled, it's time to take a look at the queues. The first one to be processed is the microtask queue, and since there's a then() callback there, that callback (() => console.log(3)) will be executed and, as a result, 3 will be logged to the console.
  7. Lastly, the event loop moves to the task queue because all of the messages in the microtask queue have been processed. If both timeouts had the same deferral time (or a sufficiently small difference [2]), the callback of the first setTimeout (() => console.log(2)) would get enqueued first and hence be processed first. However, since the difference in deferral times is big enough, the second callback (() => console.log(4)) gets enqueued first and is processed first, so 4 gets printed before 2.
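
Putting it all together, the snippet prints:

1
5
3
4
2 (roughly one second later)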

Gotchas

As a programmer, I'm sure you're familiar with the need to prevent infinite loops when working with constructs such as for and while, as well as when working with recursive functions.

Now that you understand the basics of the event loop in JavaScript, there's another pattern you need to watch out for: infinite microtask loops.

If we were to create and invoke a function like this:

function loop() {
  setTimeout(loop, 0);
}

loop();

JavaScript wouldn't have any issues processing the infinite execution because:

  • any synchronous code would be processed before processing the callbacks in the task queue
  • any microtasks would be processed before processing the messages in the task queue
  • messages in other internal higher-priority queues (rendering-related queues in the browser, for example) would be processed first
  • even though each invocation enqueues a new callback, the callback will be processed in the next cycle of the event loop

However, if we were to change the asynchronous behavior to something like this:

function loop() {
  Promise.resolve().then(loop);
}

loop();

we would be blocking the execution of the program because this time we're dealing with microtasks, and the event loop won't let go of the microtask queue until there are no messages left to process (which will never happen because each message adds a new one).

Of course, we won't be implementing something literally like the function above; however, we can fall into the pattern without realizing it, so keep this in mind when dealing with asynchronous code in JavaScript.
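
As a purely hypothetical illustration of how the pattern can sneak in, consider an async polling loop that only ever awaits promises that settle immediately: each await just bounces through the microtask queue, so the timer below never gets a chance to fire.

// Each `await` on an already-settled promise schedules a microtask,
// never a task, so the microtask queue never empties and the
// setTimeout callback is starved.
async function pollForever(queue) {
  while (true) {
    const item = queue.shift();
    await Promise.resolve(item); // settles immediately
  }
}

setTimeout(() => console.log("I never get to run"), 0);
pollForever([]);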

Other resources

If you'd like to dig deeper into the topic, here are a couple of talks I recommend to gain a deeper understanding of the event loop in the browser context:

Conclusion

By now, hopefully you have a solid grasp of the essentials of the event loop mechanism and its associated data structures.

By comprehending how the event loop interacts with these data structures, developers can avoid common pitfalls, debug errors more efficiently, and ensure the proper execution order of tasks and microtasks.

Footnotes

[1] Unless configuring a Node cluster or using Web workers.

[2] If we were to change 1000 to 1 in the example, Node would change the execution order because the difference in deferral times is not big enough to prevent the first setTimeout callback from getting into the task queue before the other one. However, when executed in the browser, the 1ms difference is enough to make the first timeout callback end up being executed after the second one. The reason is that the browser handles more queues (related to DOM events and rendering) that modify the actual deferral time (remember that the timeout value only guarantees a minimum deferral time, so the actual deferral time can be much higher).