Node.js

Modules System

15 Questions

CommonJS uses require() to load modules and module.exports to expose functionality, treating each file as a separate module with its own scope to prevent variable collisions. Modules are wrapped internally in a function that provides exports, require, module, __filename, and __dirname — this is the module wrapper function. They are loaded synchronously and cached after the first require() call, so subsequent imports return the same cached instance without re-executing the module.
// math.js
module.exports.add = (a, b) => a + b;
module.exports.subtract = (a, b) => a - b;

// app.js
const { add, subtract } = require('./math');
console.log(add(2, 3)); // 5

// Verify caching — same object reference
const m1 = require('./math');
const m2 = require('./math');
console.log(m1 === m2); // true

Why it matters: CommonJS is still the dominant module system in Node.js backends. Understanding how it works is essential for debugging module-related errors, understanding singleton patterns, and migrating to ES Modules effectively.

Real applications: Every Express, Fastify, and NestJS application traditionally uses CommonJS. The singleton behavior means a shared database connection module returns the same connection pool instance when required from multiple files.

Common mistakes: Developers confuse exports (a shorthand reference) with module.exports (the actual export object), or reassign exports = {} directly, which breaks the reference and leaves the module exporting an empty object.

exports is a shorthand reference to module.exports — both initially point to the same empty object. However, reassigning exports directly breaks this reference and does not affect what the module actually exports — only module.exports matters at the end. This is one of the most common sources of confusion and silent bugs for developers new to the CommonJS module system.
// Works — adding properties to the shared object
exports.greet = () => 'Hello';
exports.farewell = () => 'Bye';

// BROKEN — reassignment breaks the reference to module.exports
exports = { greet: () => 'Hello' }; // rebinds the local variable; exports nothing!

// Correct — reassign module.exports for single/complete export
module.exports = { greet: () => 'Hello' };
module.exports = class MyService {};  // class export

Why it matters: This subtle distinction causes silent bugs — the module appears to export something but the consumer receives an empty object. It's a classic interview trap that tests genuine CommonJS understanding versus surface-level familiarity.

Real applications: Service classes and singleton instances are exported using module.exports = new MyService(). Using exports = ... accidentally here means every consumer gets an empty object and service calls fail silently.

Common mistakes: Mixing both styles in the same file: adding properties with exports.foo = ... and later assigning module.exports = {...}, which discards every previously added property because it replaces the entire export object.

ES Modules use import/export syntax and are the official standard for JavaScript module management in both browsers and Node.js. Enable them by setting "type": "module" in package.json or using the .mjs file extension. Key advantages over CommonJS include static analysis for tree-shaking, top-level await support, and live bindings that reflect the current exported value rather than a copied snapshot.
// utils.mjs
export const greet = (name) => `Hello, ${name}`;
export default class Logger {
  log(msg) { console.log(msg); }
}

// app.mjs
import Logger, { greet } from './utils.mjs';
const logger = new Logger();
logger.log(greet('World'));

// Top-level await (only in ES Modules)
const config = await fetch('https://example.com/config').then(r => r.json());

Why it matters: The JavaScript ecosystem is migrating to ES Modules. New frameworks and tools default to ESM, so understanding it is essential for modern Node.js development and for writing libraries that work in both Node.js and browsers.

Real applications: Modern TypeScript projects, Vite-based backends, and Deno applications use ES Modules natively. Publishing an npm package with dual CJS/ESM support requires understanding both systems and conditional exports.

Common mistakes: Forgetting that ESM requires explicit file extensions (e.g., import './utils.js', not import './utils'), and that __dirname is unavailable; you must derive it from import.meta.url using fileURLToPath from node:url and path.dirname.

When you call require('module'), Node.js resolves the module using a specific search order — first built-in modules (always win), then file/directory paths for relative imports, then crawling up node_modules directories toward the filesystem root. For file modules, Node.js tries appending .js, .json, and .node extensions in order, then checks index.js as a fallback for directories. Understanding this algorithm is essential for debugging the common "Cannot find module" error.
require('./myModule');
// Tries: ./myModule → ./myModule.js → ./myModule.json → ./myModule.node → ./myModule/ (package.json "main", then index.js)

require('express');
// Tries: built-in? No → ./node_modules/express → ../node_modules/express → ...

// Check resolution without loading
const path = require.resolve('express');
console.log(path); // /project/node_modules/express/index.js

Why it matters: Module resolution bugs — wrong relative paths, missing files, conflicting package versions — are common in large projects. Understanding the algorithm helps diagnose "Cannot find module" errors quickly instead of guessing.

Real applications: Monorepo setups with workspaces rely on node_modules hoisting behavior determined by this algorithm. Tools like webpack and ts-node extend it to support TypeScript paths and path aliases.

Common mistakes: Forgetting the leading ./ for local files — require('utils') searches node_modules while require('./utils') looks in the current directory. This typo causes "module not found" errors that are hard to spot.

Node.js caches modules in require.cache after the first require() call, keyed by the resolved absolute file path. Subsequent calls return the cached exports object without re-executing module code, meaning all consumers share the same instance — enabling the singleton pattern naturally. The cache can be manually cleared by deleting entries from require.cache, though this is rarely recommended.
// counter.js
let count = 0;
module.exports = { increment: () => ++count, getCount: () => count };

// app.js — both require the same cached instance
const c1 = require('./counter');
const c2 = require('./counter');
c1.increment();
console.log(c2.getCount()); // 1 — same singleton instance

// Inspect the cache
console.log(Object.keys(require.cache)); // all loaded module paths

Why it matters: Module caching enables the singleton pattern in Node.js — a database connection created once in a module is shared across the entire app. Understanding caching prevents unexpected behavior when multiple files import the same module.

Real applications: Database connection pools, configuration objects, and logger instances are conventionally defined in modules so they are cached and shared — ensuring only one pool is created regardless of how many controllers require it.

Common mistakes: Clearing module cache in production to "reload" configuration — this can cause memory leaks and inconsistent state where different parts of the app hold different module instances simultaneously.

Circular dependencies occur when module A requires module B and module B requires module A, creating a cycle. Node.js handles this gracefully by returning a partially loaded export instead of throwing an error — the module required second only sees the exports defined before the circular require statement. This can lead to subtle bugs where properties are undefined on the imported object.
// a.js
exports.loaded = false;
const b = require('./b');
exports.loaded = true;
console.log('b.done:', b.done); // true (b finished loading)

// b.js
const a = require('./a');
exports.done = true;
console.log('a.loaded:', a.loaded); // false — partial A export!
// a.loaded is false because b required a before a finished

Why it matters: Circular dependencies are a silent killer in large Node.js projects. They cause properties to be undefined at runtime in ways that are hard to trace, especially when the circular path spans many files.

Real applications: In large MVC applications, controllers importing models that import services that import controllers creates circular dependency chains. Refactoring shared logic into a standalone utility module breaks the cycle.

Common mistakes: Ignoring eslint-plugin warnings about circular imports and only discovering the issue at runtime when a method is called on undefined. Tools like madge can visualize and detect circular dependencies in your codebase.

Creating an npm package involves initializing a project with npm init, writing the module code with a proper entry point, and publishing to the npm registry with npm publish. Each package needs a unique name, valid package.json with main and files fields, and follows semantic versioning (semver) to communicate the nature of changes. Scoped packages (@scope/pkg) are useful for organization or private namespaced packages.
npm init -y                      # Create package.json
# Write your module code...
npm login                        # Authenticate with npm

npm publish                      # Publish to registry
npm publish --access public      # For scoped packages

npm version patch                # 1.0.0 → 1.0.1 (bug fix)
npm version minor                # 1.0.1 → 1.1.0 (new feature)
npm version major                # 1.1.0 → 2.0.0 (breaking change)

Why it matters: Understanding the npm publishing workflow is important for open-source contributors and for teams building shared internal libraries. Semantic versioning knowledge prevents breaking consumer applications unexpectedly.

Real applications: Companies like Airbnb, Facebook, and Google publish open-source utility packages on npm (e.g., eslint-config-airbnb). Internal platform teams publish private scoped packages to a private registry for shared components.

Common mistakes: Publishing without setting the files whitelist in package.json, accidentally including node_modules, test files, or secrets in the published bundle. Always run npm pack first to inspect what will be published.

dependencies are packages required at runtime in production (like Express, Mongoose), while devDependencies are only needed during development (like Jest, ESLint, TypeScript). This separation allows production deployments to install only what the app needs to run, reducing image size and attack surface. There is also peerDependencies for plugins that require the host project to provide a specific package version.
npm install express            # → dependencies (needed in production)
npm install jest --save-dev    # → devDependencies (test only)
npm install typescript --save-dev  # → devDependencies (build tool)

// package.json
{
  "dependencies": { "express": "^4.18.0", "mongoose": "^8.0.0" },
  "devDependencies": { "jest": "^29.0.0", "typescript": "^5.0.0" }
}

Why it matters: Incorrectly categorizing dependencies adds unnecessary packages to production deployments, increasing container image size, startup time, and the npm audit attack surface. This is often caught in production optimization reviews.

Real applications: Docker production images use npm ci --omit=dev (the modern replacement for --only=production) to install only runtime dependencies, keeping images lean. A miscategorized webpack (dev tool) in dependencies bloats every production container unnecessarily.

Common mistakes: Putting build tools like TypeScript, webpack, and Babel in dependencies instead of devDependencies — they're only needed during the build, not at runtime. This is a common mistake caught in code reviews and security audits.

Built-in (core) modules ship with Node.js and require no installation — they are always prioritized over npm packages with the same name and are loaded by name without a path prefix. Node.js 16+ also supports the node: prefix (e.g., require('node:fs')) for explicit, unambiguous core module imports. These built-ins cover file system, networking, cryptography, streaming, OS interaction, and more.
const fs = require('node:fs');       // File system operations
const path = require('node:path');   // Path manipulation
const http = require('node:http');   // HTTP server/client
const os = require('node:os');       // OS info (CPUs, memory)
const crypto = require('node:crypto'); // Hashing, encryption
const events = require('node:events'); // EventEmitter base class
const stream = require('node:stream'); // Stream base classes
const url = require('node:url');     // URL parsing/formatting

Why it matters: Knowing the built-in module catalog prevents reinventing the wheel by installing npm packages for functionality Node.js already provides. It also helps you write faster code by avoiding the overhead of third-party dependencies.

Real applications: The crypto module is used for generating secure tokens and hashing passwords, child_process for spawning shell commands, and cluster for multi-process scaling — all without external packages.

Common mistakes: Installing npm packages for functionality already in Node.js core — like using uuid when crypto.randomUUID() is available in Node.js 14.17+, or using path-browserify when node:path handles the same use case.

package-lock.json locks the exact dependency tree so every install produces identical node_modules across all environments, machines, and CI runs. It records the precise version, download URL, and integrity hash of every installed package and its transitive dependencies. This resolves the "works on my machine" problem caused by semver ranges like ^4.18.0 installing different patch versions in different environments.
// package.json — allows a range of versions
"express": "^4.18.0"  // installs latest 4.x

// package-lock.json — locks the exact resolved version
"express": {
  "version": "4.18.2",
  "resolved": "https://registry.npmjs.org/express/-/express-4.18.2.tgz",
  "integrity": "sha512-..."
}

// Use npm ci (not npm install) in CI/CD for deterministic installs
// npm ci reads only package-lock.json, fails if it doesn't exist

Why it matters: Inconsistent dependencies between development and production cause hard-to-reproduce bugs. Interviewers ask this to verify you understand reproducible builds — a core DevOps and engineering discipline.

Real applications: CI/CD pipelines use npm ci which requires and respects package-lock.json for deterministic installs. Without it, a new patch version of a dependency could silently break production builds.

Common mistakes: Adding package-lock.json to .gitignore thinking it's a generated file that shouldn't be committed — it should always be committed. Deleting and regenerating it loses integrity hashes, reducing supply-chain security.

Before executing module code, Node.js wraps it in a module wrapper function that provides five module-scoped variables: exports, require, module, __filename, and __dirname. This wrapper is why top-level variables in a module are not global — they're scoped to the wrapper function. Understanding the wrapper demystifies why these variables exist without any explicit import.
// Node.js internally wraps your module code like this:
(function(exports, require, module, __filename, __dirname) {
  // Your module code runs here — fully scoped
  const myVar = 'scoped to this module only';
  module.exports = { myVar };
  // exports is initially the same as module.exports
});

// Verify by checking wrapper source
const { wrapper } = require('module');
console.log(wrapper[0]); // '(function(exports, require, module, ...'

Why it matters: This is a deep internals question that separates candidates who understand Node.js at a fundamental level. It explains why require and __dirname appear to be "magic" global variables in every module.

Real applications: Understanding the wrapper is why you can always use require without importing it, and why declaring a variable with var at the top level of a module doesn't pollute the global scope unlike browser scripts.

Common mistakes: Assuming top-level var declarations in Node.js modules create global variables (they don't — they're trapped in the wrapper function), unlike browser scripts where var outside a function does create globals on window.
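The scoping claim is easy to check: run a file containing a top-level var and confirm that nothing leaks onto the global object:

```javascript
// Top-level var in a CommonJS module stays inside the wrapper function
var topLevel = 'module-scoped';

console.log(typeof topLevel);  // 'string': visible within this module
console.log(global.topLevel);  // undefined: never became a global
```

In a browser, the equivalent top-level var in a classic script would create window.topLevel.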

Dynamic imports use the import() function to load modules at runtime, returning a promise that resolves to the module namespace object. Unlike static import statements, dynamic imports can be used conditionally and inside functions, and they work in both CommonJS and ES Module contexts. They are essential for lazy-loading heavy modules to improve startup time and for conditional dependency loading.
// Dynamic import in CommonJS context
async function loadModule() {
  const { readFile } = await import('node:fs/promises');
  return readFile('config.json', 'utf8');
}

// Conditional dynamic import (top-level await requires an ESM context)
const dbDriver = process.env.DB === 'mongo'
  ? await import('./drivers/mongo.js')
  : await import('./drivers/postgres.js');

// Lazy load heavy module only when needed
app.get('/report', async (req, res) => {
  const { generatePDF } = await import('./pdf-generator.js');
  res.send(await generatePDF(req.query));
});

Why it matters: Dynamic imports are the modern way to implement code splitting, feature flags, and optional dependency loading without bundler overhead. They're increasingly important in serverless environments where cold start time matters.

Real applications: Serverless functions use dynamic imports to lazily load heavy SDKs (like AWS SDK or Stripe) only on the paths that need them, dramatically reducing cold start times for functions that don't use those libraries.

Common mistakes: Using dynamic imports for every module "to be safe", which can actually hurt performance by deferring module evaluation to runtime instead of the startup phase. Only use dynamic imports when there's a real benefit to deferring load.

Subpath exports (the exports field in package.json) control which modules consumers can import from your package, effectively creating a defined public API and hiding internal implementation files. This replaces the older main field with fine-grained access control and supports conditional exports for providing separate CJS and ESM entry points from the same package.
// package.json — control public API
{
  "name": "my-lib",
  "exports": {
    ".": {
      "import": "./src/index.mjs",   // ESM entry
      "require": "./src/index.cjs"   // CJS entry
    },
    "./utils": "./src/utils.js",
    "./types": "./src/types.d.ts"
  }
}

// Consumers can import:
import { main } from 'my-lib';          // ✓ maps to index.mjs
import { helper } from 'my-lib/utils'; // ✓ maps to utils.js
import { secret } from 'my-lib/internal'; // ✗ not exposed

Why it matters: Package exports are now the standard for publishing dual CJS/ESM packages. Without them, consumers hit import resolution errors or accidentally import internal files that aren't part of your public API.

Real applications: Major libraries like Lodash, date-fns, and Vite plugins use conditional exports to support both CommonJS and ESM consumers from the same package, ensuring compatibility across the entire Node.js ecosystem.

Common mistakes: Defining exports without including "." (the package root) which prevents consumers from doing import pkg from 'my-lib'. Also forgetting that once exports is defined, all other paths are automatically blocked.

The node: protocol prefix explicitly identifies a module as a Node.js built-in, preventing potential confusion with npm packages of the same name. Introduced in Node.js 14.18 and recommended from Node.js 16+, it guarantees you're importing the core module even if an npm package with the same name is installed. Some newer built-in modules like node:test are only accessible via the node: prefix.
// Without prefix (ambiguous — could be shadowed by npm package)
const fs = require('fs');

// With node: prefix (explicit and safe — always the built-in)
const fs = require('node:fs');
const { readFile } = require('node:fs/promises');
const path = require('node:path');
const { randomUUID } = require('node:crypto');

// ES Module style with node: prefix
import { createServer } from 'node:http';
import { join, resolve } from 'node:path';

Why it matters: The node: prefix is a security and clarity best practice. Without it, a malicious or mistakenly installed npm package named fs could shadow the built-in module, intercepting file system operations.

Real applications: Security-conscious codebases and internal style guides at companies like Cloudflare mandate the node: prefix for all built-in module imports to prevent dependency confusion attacks.

Common mistakes: Forgetting to use node: prefix when importing newer built-ins like node:test (the built-in test runner, Node.js 18+) — without the prefix it tries to find an npm package named "test" and fails.

require.resolve() returns the resolved absolute file path of a module without loading or executing it, while require() fully loads and executes the module. This makes require.resolve() useful for checking module existence, finding package locations, passing paths to other tools, or implementing optional dependency patterns. It follows the same resolution algorithm as require() and throws MODULE_NOT_FOUND if the module cannot be found.
// Returns the full path without executing the module
const expressPath = require.resolve('express');
console.log(expressPath); // '/project/node_modules/express/index.js'

// Check if an optional module is installed
function hasModule(name) {
  try {
    require.resolve(name);
    return true;
  } catch {
    return false;
  }
}

if (hasModule('sharp')) {
  const sharp = require('sharp'); // only require if available
}

Why it matters: require.resolve() enables safe optional dependency patterns — a library can check if an optional peer dependency is installed before using it, providing fallback behavior without crashing when the peer isn't present.

Real applications: Test runners and build tools use require.resolve() to locate configuration files, plugin paths, and executable binaries relative to the project root rather than hardcoding paths that vary between installations.

Common mistakes: Using require.resolve() expecting it to load the module — it only resolves the path. Also catching the wrong error type when checking existence; it throws a plain Error with code: 'MODULE_NOT_FOUND', not a specific error class.