JavaScript

Async/Await

12 Questions

The async keyword before a function declaration guarantees it always returns a Promise: non-Promise return values (including primitives) are auto-wrapped as if by Promise.resolve(), and thrown errors become rejections of that Promise. It also enables the await keyword inside the function body; outside async functions (and module top level), await as an operator is a syntax error.
async function greet() {
  return "Hello"; // auto-wrapped: Promise.resolve("Hello")
}
greet().then(msg => console.log(msg)); // "Hello"

// Throwing produces a rejected promise
async function fail() {
  throw new Error("Oops");
}
fail().catch(err => console.error(err.message)); // "Oops"

// Arrow function form
const fetchData = async () => {
  const res = await fetch("/api/data");
  return res.json();
};

Why it matters: Understanding the async/Promise relationship prevents bugs where developers treat the return value of an async function as the resolved data rather than a Promise.

Real applications: Every data-fetching function in React, Next.js, and Express uses async functions. getServerSideProps in Next.js must be async to fetch data server-side before rendering.

Common mistakes: Forgetting that async functions always return a Promise — const val = asyncFn() gives a pending Promise, not the resolved value. Always await or chain .then() to access the actual result.

await pauses execution of the enclosing async function until the Promise resolves or rejects, then either returns the resolved value or re-throws the rejection as an exception, enabling normal try/catch handling. It can only be used inside async functions or at the top level of ES modules.
async function fetchUser(id) {
  const res = await fetch(`/api/users/${id}`); // pauses here
  const user = await res.json();               // pauses here
  return user;
}

// await unwraps any thenable
async function example() {
  const val = await Promise.resolve(42);
  console.log(val); // 42 (not a Promise)

  // await on a non-Promise leaves the value unchanged
  // (it is still wrapped in a resolved Promise, deferring to the microtask queue)
  const str = await "hello";
  console.log(str); // "hello"
}

Why it matters: await makes asynchronous code read linearly, eliminating deeply nested .then() chains and making complex async workflows tractable for large teams.

Real applications: Stripe payment processing: const charge = await stripe.charges.create(params). Prisma ORM: const user = await prisma.user.findUnique({ where: { id } }).

Common mistakes: Using await inside a forEach callback does nothing useful because forEach ignores returned Promises. Use for...of for sequential awaiting, or await Promise.all(items.map(asyncFn)) for parallel execution.

try/catch is the primary error-handling mechanism for async/await — when an awaited Promise rejects, the rejection becomes a thrown exception caught by the nearest catch block. You can also chain .catch() on the returned Promise, or adopt a safe-wrapper pattern that returns [data, error] tuples to avoid nested try/catch blocks.
async function loadUser(id) {
  try {
    const res = await fetch(`/api/users/${id}`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return await res.json();
  } catch (err) {
    console.error("Failed:", err.message);
    return null;
  }
}

// Safe-wrapper pattern — eliminates nested try/catch
async function safe(promise) {
  try { return [await promise, null]; }
  catch (e) { return [null, e]; }
}
const [user, err] = await safe(fetchUser(1));
if (err) return handleError(err);

Why it matters: Unhandled async rejections crash Node.js processes (since v15) and produce errors in browsers. Proper error handling at every async boundary is non-negotiable in production code.

Real applications: Express.js route handlers need try/catch for every async operation. Many teams use wrapper utilities like express-async-errors or the Go-style [data, error] pattern for cleaner error flow.

Common mistakes: Writing return someAsyncFn() inside a try/catch without await — the rejection escapes the catch block entirely. Always use return await inside try blocks when you need local error handling.

Consecutive await expressions execute sequentially — each waits for the previous to complete before starting, even when the operations are independent. Use Promise.all() to start all independent operations concurrently and await their combined resolution, cutting total wait time to the duration of the slowest single operation.
// Sequential — total time = A + B (slow for independent tasks)
async function sequential() {
  const user = await fetchUser(1);     // waits 200ms
  const profile = await fetchProfile(1); // then waits 150ms
  // Total: ~350ms
}

// Parallel with Promise.all — total time = max(A, B)
async function parallel() {
  const [user, profile] = await Promise.all([
    fetchUser(1),    // starts immediately
    fetchProfile(1)  // starts immediately
  ]);
  // Total: ~200ms
}

// Start promises first, then await: same timing as Promise.all, though Promise.all handles multiple rejections more safely
const userP = fetchUser(1);
const profileP = fetchProfile(1);
const [user, profile] = [await userP, await profileP];

Why it matters: Accidentally serializing independent async operations is one of the most common performance bugs in Node.js backends and React data-fetching — it multiplies latency unnecessarily.

Real applications: Next.js server components use Promise.all to fetch multiple data sources in parallel. GraphQL resolvers parallelize field resolution for unrelated data sources.

Common mistakes: Writing const a = await fetchA(); const b = await fetchB(); when the fetches are independent — if each takes 300ms, you waste 300ms. Use Promise.all when operations do not depend on each other's results.

An async IIFE (Immediately Invoked Function Expression) is an async function that is immediately invoked in the same expression, creating a self-contained async scope. It is the classic workaround for using await at module scope in CommonJS environments, older Node.js versions, and non-module browser scripts where top-level await is unavailable.
(async () => {
  try {
    const config = await fetch("/api/config").then(r => r.json());
    await initApp(config);
    console.log("App ready");
  } catch (err) {
    console.error("Startup failed:", err);
  }
})();

// Bootstrap server (CommonJS Node.js pattern)
(async () => {
  await connectDatabase();
  await loadCache();
  app.listen(3000, () => console.log("Server ready"));
})();

// Async IIFE in legacy script tag
(async () => {
  const data = await fetchInitialData();
  renderUI(data);
})();

Why it matters: Before top-level await (ES2022), async IIFEs were the only way to use await at module scope. They remain common in CommonJS backends and legacy browser scripts without a bundler.

Real applications: Express.js server startup scripts use async IIFEs to connect to databases before starting the HTTP server. CLI tools and build scripts use them for top-level initialization.

Common mistakes: Forgetting to add .catch() or a try/catch to the IIFE. In Node.js 15+ an unhandled rejection crashes the process; older versions only log a warning, so the failure is easy to miss. Always wrap the body in try/catch or append .catch(console.error) to the IIFE.
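
A minimal sketch of the .catch() variant, reusing the hypothetical initApp and /api/config from the example above:
(async () => {
  const config = await fetch("/api/config").then(r => r.json());
  await initApp(config); // initApp is the hypothetical startup routine from above
  console.log("App ready");
})().catch(err => {
  console.error("Startup failed:", err); // rejection handled, no silent failure
});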

Top-level await allows using await outside of async functions at the module top level in ES modules (.mjs files or type="module" scripts). The module pauses evaluation — and all importing modules wait — until the awaited Promise settles, enabling clean sequential initialization without wrapping code in async IIFEs.
// config.mjs — top-level await works here
const res = await fetch("/api/config");
export const config = await res.json();
// Importing modules automatically wait for config

// Dynamic conditional import
const locale = navigator.language.startsWith("de") ? "de" : "en";
const { strings } = await import(`./i18n/${locale}.js`);

// Fallback pattern
let db;
try {
  db = await connectPrimary();
} catch {
  db = await connectFallback();
}
export { db };

Why it matters: Top-level await eliminates async IIFEs in module-based projects, making initialization code more readable. It enables lazy loading of locale files, configs, and conditionally required modules without boilerplate.

Real applications: Vite and Webpack 5 support top-level await in builds. It is used in Deno (supported since v1) for database initialization, configuration loading, and dynamic localization.

Common mistakes: Using slow top-level awaits in files imported by many modules — the delay propagates up the entire module graph. Keep top-level awaits fast; move slow initialization to lazy-loaded modules instead.
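
One way to apply the fix, as a sketch: instead of a slow top-level await that blocks every importer, export a function that starts the work on first use (connectAnalytics is a hypothetical slow initializer).
// analytics.mjs
// export const client = await connectAnalytics(); // slow: blocks the whole import graph

let clientPromise;
export function getAnalyticsClient() {
  // start the slow connection only when first requested, then reuse it
  clientPromise ??= connectAnalytics(); // connectAnalytics is hypothetical
  return clientPromise;
}
// Callers: const client = await getAnalyticsClient();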

Promisification wraps callback-based APIs in a new Promise() constructor, resolving with the success value or rejecting with the error. Node.js provides util.promisify() for automatic conversion of functions that follow the error-first callback convention ((err, result) => ...), and modern Node.js exposes native promise APIs in fs/promises, dns/promises, and other sub-namespaces.
// Manual promisification for any callback style
function readFileAsync(path) {
  return new Promise((resolve, reject) => {
    fs.readFile(path, "utf8", (err, data) => {
      if (err) reject(err);
      else resolve(data);
    });
  });
}
const content = await readFileAsync("./data.txt");

// util.promisify (error-first callbacks only)
const { promisify } = require("util");
const execAsync = promisify(require("child_process").exec);
const { stdout } = await execAsync("ls -la");

// Modern Node.js — built-in promise APIs (preferred)
import { readFile } from "fs/promises";
const data = await readFile("./data.txt", "utf8");

Why it matters: Millions of npm packages use the callback pattern. Promisification is the bridge between the legacy callback ecosystem and modern async/await code, essential when integrating older libraries.

Real applications: Wrapping legacy Redis or MySQL clients, converting child_process.exec (promisified as shown above), and integrating AWS SDK v2 callbacks with async/await code.

Common mistakes: util.promisify() only works with the strict error-first convention (err, result). Functions with different callback signatures (like setTimeout or EventEmitter listeners) require manual wrapping with new Promise().
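
For instance, a minimal sketch of manually wrapping setTimeout as the common sleep helper, since its callback receives no (err, result) arguments:
// sleep: resolve after ms milliseconds, no value needed
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
await sleep(500); // pause any async function (or module top level) for 500ms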

The three most impactful async/await mistakes are: missing await (operating on a pending Promise instead of resolved data), await inside forEach (which runs all callbacks independently instead of sequentially), and unnecessarily sequential awaits for independent operations that should run in parallel with Promise.all().
// Mistake 1: forgetting await
async function bad() {
  const data = fetch("/api"); // data is Promise, not response!
  console.log(data); // Promise { <pending> }
}

// Mistake 2: await in forEach (callbacks fire concurrently)
items.forEach(async (item) => {
  await process(item); // iterations overlap, not sequential
});
// Fix: for...of for sequential
for (const item of items) await process(item);
// Fix: Promise.all for parallel
await Promise.all(items.map(process));

// Mistake 3: redundant return await (outside try/catch)
async function bad2() {
  return await getValue(); // redundant
}
// REQUIRED inside try/catch — removing await escapes the catch
async function correct() {
  try {
    return await riskyOp(); // needed to catch errors here!
  } catch(e) { return null; }
}

Why it matters: These mistakes cause subtle production bugs — the code appears to work locally but produces race conditions, incorrect ordering, or unhandled rejections under real traffic.

Real applications: The await-in-forEach bug frequently surfaces in database seeding scripts and batch processing jobs where each item must be processed sequentially before proceeding to the next.

Common mistakes: The ESLint rule no-await-in-loop flags all awaits in loops as potential performance issues, but sometimes sequential processing is intentional — configure the rule carefully or disable it with a comment for intentional sequential loops.
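
A sketch of marking an intentionally sequential loop; runMigration and migrations are hypothetical names, and the directive comment is standard ESLint syntax:
for (const migration of migrations) {
  // eslint-disable-next-line no-await-in-loop
  await runMigration(migration); // each migration must finish before the next starts
}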

for await...of consumes async iterables — objects that produce Promises on each next() call. Each iteration awaits the resolved value before continuing, making it ideal for processing async generators, Node.js Readable streams, and paginated API responses incrementally without loading all data into memory.
// Async generator yielding paginated data
async function* paginate(url) {
  let page = 1, hasMore = true;
  while (hasMore) {
    const res = await fetch(`${url}?page=${page++}`);
    const data = await res.json();
    hasMore = data.hasMore;
    yield data.items;
  }
}

async function processAll() {
  for await (const items of paginate("/api/users")) {
    items.forEach(processItem); // process one page at a time
  }
}

// Node.js Readable stream — constant memory
import { createReadStream } from "fs";
const stream = createReadStream("large.csv", "utf8");
for await (const chunk of stream) {
  processChunk(chunk);
}

Why it matters: Streaming with for await...of uses constant memory regardless of data size, unlike Promise.all which loads everything at once. Critical for large files, database cursors, and paginated APIs.

Real applications: AWS S3 streaming downloads, MongoDB cursor iteration (for await (const doc of cursor)), processing large CSV/NDJSON files in Node.js, and consuming Server-Sent Events streams in browsers.

Common mistakes: When iterating an array of Promises, all Promises start immediately (not lazily) — for await...of only controls processing order, not execution order. Use async generators for true lazy/on-demand execution.
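
A sketch of the contrast, with fetchPage and handle as hypothetical helpers:
// Eager: all three requests start immediately; the loop only orders the handling
const promises = [fetchPage(1), fetchPage(2), fetchPage(3)];
for await (const page of promises) handle(page);

// Lazy: each request starts only when the loop asks for the next value
async function* pages() {
  for (let i = 1; i <= 3; i++) yield await fetchPage(i);
}
for await (const page of pages()) handle(page);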

When an awaited Promise rejects, async/await converts the rejection into a thrown exception that propagates up through the async call chain — just like synchronous errors propagate up the call stack. If no try/catch catches it anywhere in the chain, the outermost async function's returned Promise rejects, producing an unhandled rejection.
async function step1() { throw new Error("step1 failed"); }
async function step2() { await step1(); }  // propagates up
async function step3() { await step2(); }  // propagates up

// Catch at the top level
step3().catch(err => console.error(err.message)); // "step1 failed"

// Catch at any level for recovery
async function withRecovery() {
  try {
    await step2();
  } catch(err) {
    console.error("Caught:", err.message);
    return "fallback";
  }
}

// Unhandled rejection — crashes Node.js 15+
async function uncaught() { throw new Error("oops"); }
uncaught(); // no .catch() = UnhandledPromiseRejection!

Why it matters: In Node.js 15+, unhandled rejections terminate the process. In browsers they fire unhandledrejection events. Every async entry point must have at least a top-level catch handler.
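
As a last-resort safety net (not a substitute for try/catch at each async boundary), both runtimes expose a global hook; a minimal sketch:
// Node.js: registering a listener also suppresses the default crash
process.on("unhandledRejection", (reason) => {
  console.error("Unhandled rejection:", reason);
});

// Browser
window.addEventListener("unhandledrejection", (event) => {
  console.error("Unhandled rejection:", event.reason);
});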

Real applications: Express.js middleware chains, Next.js API routes, and AWS Lambda functions all rely on error propagation reaching the framework's global error handler to send proper HTTP error responses.

Common mistakes: Adding async to event listeners without a try/catch: button.addEventListener('click', async () => { await riskyOp(); }) — rejections silently become unhandled. Always wrap async event handlers in try/catch.
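
A minimal sketch of the fix, with showErrorToast as a hypothetical UI helper:
button.addEventListener("click", async () => {
  try {
    await riskyOp();
  } catch (err) {
    showErrorToast(err.message); // handled here, never an unhandled rejection
  }
});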

Async operations can be processed sequentially (one at a time with for...of + await), in parallel (all at once with Promise.all), or in controlled batches (limited concurrency with chunked Promise.all). Batching is the most robust production pattern — it prevents rate limiting, memory exhaustion, and connection pool saturation when processing large sets.
// Sequential — ordered, safe, slowest
async function sequential(items) {
  for (const item of items) await processItem(item);
}

// Parallel — fastest, can overwhelm downstream APIs
async function parallel(items) {
  return Promise.all(items.map(processItem));
}

// Batched — recommended for production (e.g., 5 at a time)
async function batched(items, size = 5) {
  const results = [];
  for (let i = 0; i < items.length; i += size) {
    const chunk = items.slice(i, i + size);
    results.push(...await Promise.all(chunk.map(processItem)));
  }
  return results;
}

// p-limit: elegant concurrency control
import pLimit from "p-limit";
const limit = pLimit(5);
await Promise.all(items.map(item => limit(() => processItem(item))));

Why it matters: Running 10,000 API calls simultaneously with Promise.all triggers rate limits and exhausts connection pools. Batching to 5–10 concurrent requests is the standard production approach.

Real applications: GitHub Actions batch test parallelism, Stripe batch billing processes charges in controlled batches to respect rate limits, Shopify bulk product import scripts use batched async processing.

Common mistakes: Choosing between all-sequential and all-parallel without considering the batching middle ground. Most bulk operations in production need batching — use the p-limit npm package for flexible concurrency control without reinventing the logic.

async/await is syntactic sugar over Promises — async functions compile to Promise chains internally. The key difference is style: Promises use .then()/.catch() method chaining, while async/await uses familiar imperative style with try/catch. Both are equally capable; async/await is generally more readable for sequential logic and conditional branching.
// Promise chain
function getUser(id) {
  return fetch(`/api/users/${id}`)
    .then(res => res.json())
    .then(user => fetch(`/api/orders/${user.id}`))
    .then(res => res.json())
    .catch(err => { console.error(err); return null; });
}

// Equivalent async/await — cleaner conditional logic
async function getUser(id) {
  try {
    const res = await fetch(`/api/users/${id}`);
    const user = await res.json();
    const orderRes = await fetch(`/api/orders/${user.id}`);
    return await orderRes.json();
  } catch(err) {
    console.error(err); return null;
  }
}

// Both are fully interoperable
const [a, b] = await Promise.all([fetchA(), fetchB()]);

Why it matters: Interviewers ask candidates to rewrite Promise chains as async/await and vice versa, testing deep understanding of the underlying mechanism rather than just surface-level syntax familiarity.

Real applications: Axios and fetch return Promises — you can use either style to consume them. Promise.all(), Promise.race(), and Promise.allSettled() always require the Promise API regardless of which style you prefer.

Common mistakes: Believing that async/await and Promises are completely different mechanisms — an async function always returns a Promise, so you can freely chain .then() on it or use it with Promise.all().