This is the most famous JavaScript closure pitfall. When using var in a for loop, only ONE variable i is created for the entire loop — all closures capture the same reference.
By the time the setTimeout callbacks execute, the loop has already finished and i equals 3 (the exit condition).
Two clean fixes exist: use let (creates a new binding per iteration) or use an IIFE to capture the current value in a new scope.
// With var — all callbacks share one 'i'
for (var i = 0; i < 3; i++) {
setTimeout(function() { console.log(i); }, 0);
}
// Output: 3, 3, 3 (all three print 3)
// Fix 1: let (new binding per iteration)
for (let i = 0; i < 3; i++) {
setTimeout(() => console.log(i), 0); // 0, 1, 2 ✓
}
// Fix 2: IIFE to capture current i
for (var i = 0; i < 3; i++) {
((n) => setTimeout(() => console.log(n), 0))(i); // 0, 1, 2 ✓
}
Root cause: var is function-scoped, not block-scoped — the loop body doesn't create a new scope.
This is one of the main reasons let was introduced in ES6 — to enable proper block scoping in loops.
JavaScript's hoisting mechanism moves var declarations to the top of their containing function (or global scope), but NOT the assignments. The variable exists from the start of the scope but holds undefined until the assignment line is reached.
let and const are also hoisted but remain in a Temporal Dead Zone (TDZ) — accessing them before declaration throws a ReferenceError.
Function declarations are hoisted completely (including the body), while function expressions follow var/let/const hoisting rules.
// var: hoisted, initialized as undefined
console.log(x); // undefined (no error)
var x = 5;
console.log(x); // 5
// let: hoisted but in TDZ
console.log(y); // ReferenceError: Cannot access 'y' before initialization
let y = 10;
// Function declaration: fully hoisted
greet(); // "Hello!" (works!)
function greet() { console.log("Hello!"); }
TDZ (Temporal Dead Zone): the period between entering scope and the variable's declaration where let/const throw if accessed.
Avoid relying on hoisting — declare variables at the top of their scope for clarity and predictability.
JavaScript is single-threaded with an event loop. The call stack must be completely empty before any callback from the task queue can run.
setTimeout(fn, 0) means "schedule fn as soon as possible" but it STILL goes through the macro-task queue and can never run before all synchronous code finishes.
This is why heavy synchronous computation in Node.js blocks all I/O — the event loop can't process callbacks mid-computation.
console.log('A'); // 1st: synchronous
setTimeout(() => console.log('B'), 0); // scheduled to task queue
console.log('C'); // 2nd: still synchronous
// Output: A, C, B
// Even with delay 0, B runs last:
// Call stack: [A, C] -> empty -> event loop -> B
// Browsers clamp nested setTimeouts (nesting depth > 5) to a ~4ms minimum; Node clamps a 0 delay to 1ms
Zero delay does NOT mean immediate execution — it means "queue this to run after the current call stack clears".
The actual minimum delay for nested setTimeouts in browsers is 4ms (per the HTML spec), not truly zero.
The event loop has TWO queues: the microtask queue (Promises, MutationObserver, queueMicrotask) and the macrotask queue (setTimeout, setInterval, I/O). Microtasks always have priority.
After each macrotask, ALL pending microtasks are processed before the next macrotask. This means a chain of Promise .then() calls all execute before any setTimeout callback.
Understanding this ordering prevents subtle bugs in async code and explains why certain patterns work or don't work.
setTimeout(() => console.log('4 timeout'), 0);
Promise.resolve().then(() => console.log('2 promise1'));
Promise.resolve().then(() => console.log('3 promise2'));
console.log('1 sync');
// Output order:
// 1 sync (call stack)
// 2 promise1 (microtask queue, empties before macrotask)
// 3 promise2 (still in microtask queue)
// 4 timeout (macrotask — runs after microtasks are drained)
Microtask queue drains completely after each synchronous block and after each macrotask, before the next macrotask starts.
If microtasks keep adding more microtasks, the macrotask queue (setTimeout) can starve indefinitely.
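A bounded sketch of this starvation pattern (hypothetical `chain` helper): each microtask schedules another one, and the event loop drains every one of them before the zero-delay timeout ever fires.

```javascript
// Each .then() queues another microtask; the event loop drains
// all 1000 of them before the setTimeout macrotask can run.
let microtasks = 0;
function chain() {
  if (++microtasks < 1000) Promise.resolve().then(chain);
}
setTimeout(() => console.log('timeout after', microtasks, 'microtasks'), 0);
Promise.resolve().then(chain);
// logs: "timeout after 1000 microtasks"
```

Replace the `< 1000` bound with an unconditional re-queue and the timeout never runs — true starvation.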
typeof null === "object" is one of JavaScript's oldest bugs. In the original JavaScript implementation, values were stored with type tags, and the null pointer (0x00) was mistakenly identified as an object.
The fix was proposed (returning "null") but rejected because it would break millions of existing websites. So it remains for backward compatibility.
To correctly check for null, always use strict equality === null instead of typeof.
console.log(typeof null); // "object" — BUG in JS!
console.log(typeof undefined); // "undefined"
console.log(null === undefined); // false (different types)
console.log(null == undefined); // true (loose equality quirk)
// Correct null check:
const val = null;
if (val === null) console.log('it is null'); // correct
if (!val) console.log('falsy'); // also catches 0, "", false
Never use typeof x === "null" — that's wrong. Use x === null for null checks.
Both null and undefined are falsy values and loosely equal to each other, but strictly unequal.
JavaScript's loose equality == applies Abstract Equality Comparison which coerces operands to compatible types before comparing. The rules are complex and non-intuitive.
Key coercion rules: if one side is a number and the other is a string, the string is converted to a number. Boolean values are converted to numbers (true→1, false→0) first. null == undefined is a special case that returns true.
Always use === (strict equality) in production code — it never coerces types and is predictable.
console.log(0 == ''); // true ('' converts to 0)
console.log(0 == '0'); // true ('0' converts to 0)
console.log('' == '0'); // false (both strings, different)
console.log(false == '0'); // true (false→0, '0'→0)
console.log(null == undefined); // true (special rule only)
console.log(null == 0); // false (null only == undefined)
console.log([] == false); // true ([].toString()=''→0, false→0)
The null == undefined special case: they are only loosely equal to each other, not to 0, false, or ''.
The == coercion rules follow the ECMAScript spec's Abstract Equality Comparison algorithm — memorizing all cases is impractical. Use === always.
NaN (Not a Number) is the only value in JavaScript — and indeed in IEEE 754 floating point — that is NOT equal to itself. This is specified in the standard to allow detecting "computation failed" results.
This means x !== x is ONLY true when x is NaN — a useful check before Number.isNaN was available.
Use Number.isNaN() over the older global isNaN() — the global version coerces strings to numbers first, causing false positives.
console.log(NaN === NaN); // false (NaN ≠ NaN by spec!)
console.log(NaN == NaN); // false (even loose equality)
console.log(Number.isNaN(NaN)); // true (correct)
console.log(Number.isNaN('hello')); // false (no coercion)
console.log(isNaN('hello')); // true (coerces: BAD!)
// Old trick before Number.isNaN:
const isNaNTrick = x => x !== x; // true only for NaN
console.log(isNaNTrick(NaN)); // true
Number.isNaN is the reliable check: it only returns true for actual NaN values, not string coercions.
NaN arises from: 0/0, Math.sqrt(-1), parseInt('abc'), NaN + 5 — any operation with NaN propagates NaN.
The delete operator removes properties from objects. It returns true on success (or for non-existent properties), and false only for non-configurable properties.
Variables declared with var, let, const, or function names CANNOT be deleted — delete returns false (in sloppy mode) and the variable remains; in strict mode it is a SyntaxError.
Only undeclared properties set directly on the global object (implicit globals) CAN be deleted.
// Cannot delete declared variables
var a = 1;
console.log(delete a); // false (a is a non-configurable binding)
console.log(a); // 1 (not deleted)
// CAN delete object properties
const obj = { x: 10, y: 20 };
console.log(delete obj.x); // true
console.log(obj.x); // undefined (deleted)
console.log(obj); // {y: 20}
// Cannot delete non-configurable properties:
delete Math.PI; // false (non-configurable, strict mode throws)
delete returns false when it cannot remove a binding or non-configurable property, yet true for properties that never existed — so it's not reliable as an existence check.
After deletion, accessing the property returns undefined. The property key is completely removed (unlike setting to undefined).
The comma operator (not to be confused with commas in arrays or function parameters) evaluates each operand from LEFT to RIGHT, and returns the value of the LAST operand. All side effects still occur.
It has the lowest precedence of all operators. Parentheses are often required to distinguish it from array/param commas.
Practical use: for loops with multiple update expressions, or obscure code golf patterns. Rarely used in modern code.
let x = (1, 2, 3);
console.log(x); // 3 (last value returned)
let y = (console.log('a'), console.log('b'), 42);
// logs: 'a' then 'b' (side effects happen)
console.log(y); // 42
// Common use: for loop with multiple increments
for (let i = 0, j = 10; i < 5; i++, j--) {
console.log(i, j); // 0,10 1,9 2,8 3,7 4,6
}
The comma operator evaluates ALL expressions (side effects happen) but discards all values except the last.
The for loop update expression i++, j-- is the most common legitimate use of the comma operator.
JavaScript has two increment operators: pre-increment ++x increments FIRST then returns the new value; post-increment x++ returns the CURRENT value THEN increments.
The same distinction applies to decrement: --x (pre) vs x-- (post).
Pre vs post increment is a common source of subtle bugs, especially in complex expressions and loop conditions.
let a = 5;
console.log(a++); // 5 (returns 5 first, a becomes 6)
console.log(++a); // 7 (a becomes 7 first, then returns 7)
console.log(a); // 7
// Tricky: let x = 5; let y = x++ + ++x; // y = 12, x ends at 7
// Walkthrough: x++ returns 5 (x becomes 6), then ++x makes x = 7 and returns 7, so y = 5 + 7 = 12
Key rule: ++x = increment first, use incremented value. x++ = use current value, increment after.
Avoid mixing pre/post increment in complex expressions — it makes code hard to reason about.
Function declarations are fully hoisted — both the declaration AND the definition move to the top of their scope, so they can be called before they appear in code.
Function expressions (const/let/var) are NOT fully hoisted — only the variable declaration is hoisted, not the assignment.
This difference is crucial: calling an expression variable before assignment throws a TypeError or ReferenceError.
// Function declaration: hoisted fully
sayHi(); // "Hi!" — works before declaration
function sayHi() { console.log("Hi!"); }
// Function expression: NOT hoisted
greet(); // TypeError: greet is not a function
var greet = function() { console.log("Hello!"); };
// const/let: ReferenceError (temporal dead zone)
// hi(); // ReferenceError
// const hi = () => "hi";
Function declarations are hoisted completely (both name and body); for a var function expression, only the var name is hoisted, initialized as undefined.
Prefer const arrow functions — predictable behavior with no hoisting surprises.
Using typeof on an undeclared variable does NOT throw a ReferenceError — it returns the string "undefined". This is unique to typeof.
This safe behavior makes typeof useful for checking if optional dependencies or globals exist before using them.
All other operations on undeclared variables DO throw ReferenceError.
console.log(typeof undeclaredVar); // "undefined" (no error!)
console.log(typeof null); // "object" (known JS bug)
console.log(typeof undefined); // "undefined"
console.log(typeof 42); // "number"
console.log(typeof "hi"); // "string"
console.log(typeof true); // "boolean"
console.log(typeof function(){}); // "function"
console.log(typeof {}); // "object"
console.log(typeof []); // "object" (arrays too!)
console.log(typeof Symbol()); // "symbol"
Arrays and null both return "object" — use Array.isArray() and strict equality for those instead.
For robust type checking use Object.prototype.toString.call(val) which distinguishes all types.
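For instance, a small sketch of that technique (hypothetical `tag` helper):

```javascript
// Object.prototype.toString yields a distinct tag per built-in type,
// including the cases where typeof is ambiguous (arrays, null):
const tag = v => Object.prototype.toString.call(v);
console.log(tag([]));         // "[object Array]"
console.log(tag(null));       // "[object Null]"
console.log(tag(new Date())); // "[object Date]"
console.log(tag(/abc/));      // "[object RegExp]"
```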
The spread operator ... creates a shallow copy of arrays/objects. For primitives it expands them as individual arguments or array elements.
Shallow copy means nested objects/arrays are still shared by reference — modifying them affects both copies.
For a deep copy, use structuredClone(), JSON.parse(JSON.stringify()), or recursive cloning.
// Array spread: shallow copy
const a = [1,2,3], b = [...a];
b.push(4);
console.log(a); // [1,2,3] — unaffected
// Object spread: shallow copy
const obj = {x:1, nested:{y:2}};
const copy = {...obj};
copy.nested.y = 99;
console.log(obj.nested.y); // 99 (MUTATED — shallow!)
// Function arguments
Math.max(...[3,1,4,1,5]); // 5 (same as Math.max(3,1,4,1,5))
Shallow copy: top-level properties are duplicated, but nested objects share the same reference.
Spread is NOT deep cloning — this trips up many developers when they modify "copies" and see the original change.
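When a deep copy is actually needed, structuredClone (available in modern browsers and Node 17+) copies nested structures recursively; a quick sketch:

```javascript
const original = { x: 1, nested: { y: 2 } };
const deep = structuredClone(original); // recursive copy, no shared references
deep.nested.y = 99;
console.log(original.nested.y); // 2 — the original is untouched
console.log(deep.nested.y);     // 99
```

Contrast with the spread example above, where mutating the "copy" changed the original.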
Promise .then() returns a new Promise. Returning a value from .then() wraps it in a resolved Promise automatically. Throwing inside .then() rejects the chain.
Each .then() receives the return value of the previous .then(), enabling sequential async operations without nesting.
Understanding Promise chaining is essential for writing clean asynchronous JavaScript.
Promise.resolve(1)
.then(x => x + 1) // receives 1, returns 2
.then(x => x * 2) // receives 2, returns 4
.then(x => console.log(x)); // outputs: 4
// Short-circuit on rejection:
Promise.resolve(1)
.then(x => { throw new Error('fail'); })
.then(x => console.log('skipped')) // skipped
.catch(e => console.log(e.message)); // "fail"
Each .then() receives the return value of the previous handler. Returning undefined gives the next handler undefined.
A rejection skips all .then() handlers until the nearest .catch(). After .catch(), the chain continues as resolved.
The value of this in JavaScript depends on HOW a function is called, not where it's defined — except for arrow functions which inherit this lexically.
Arrow functions do NOT have their own this. They capture this from the surrounding scope at definition time.
Understanding this context is one of the most important JavaScript concepts for writing correct OOP code.
const obj = {
name: 'Alice',
regular: function() { return this.name; }, // obj.name = 'Alice'
arrow: () => this.name, // 'this' is the enclosing scope, not obj
};
console.log(obj.regular()); // "Alice"
console.log(obj.arrow()); // undefined — outer 'this' has no .name here
// Losing context:
const fn = obj.regular;
fn(); // lost context: 'this' is the global object (sloppy mode) or undefined (strict mode — throws TypeError)
fn.call(obj); // "Alice" (explicit this via call)
Arrow function this: fixed at definition time (lexical). Regular function this: determined at call time.
Use .bind(this), .call(this), or arrow functions to control what this evaluates to.
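A small sketch of .bind fixing a lost context (hypothetical counter object):

```javascript
const counter = {
  count: 0,
  increment() { this.count++; } // relies on call-time 'this'
};
const loose = counter.increment;               // detached: 'this' would be wrong
const bound = counter.increment.bind(counter); // 'this' permanently locked to counter
bound();
bound();
console.log(counter.count); // 2
```

.bind returns a new function; the original method is unchanged, and the bound copy ignores however it is later called.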
JavaScript's && and || do NOT just return true/false — they return the actual OPERAND VALUES that determined the result (short-circuit evaluation).
&& returns the first FALSY value, or the LAST value if all are truthy. || returns the first TRUTHY value, or the LAST value if all are falsy.
This behavior is heavily used in JSX default props, conditional rendering, and default parameter patterns.
console.log(1 && 2 && 3); // 3 (all truthy, last value)
console.log(1 && 0 && 3); // 0 (first falsy)
console.log(0 || '' || 'hi'); // "hi" (first truthy)
console.log(0 || '' || false);// false (last value, all falsy)
// Practical usage:
let name = user && user.name; // safe property access
let val = input || 'default'; // default value fallback
&& short-circuits on falsy — second operand is NOT evaluated if first is falsy.
|| short-circuits on truthy. Use ?? (nullish coalescing) to only default on null/undefined (not 0 or '').
Both Object.assign(target, ...sources) and the spread operator {...obj1, ...obj2} merge objects shallowly with later properties overwriting earlier ones.
Key difference: Object.assign MUTATES the target object. Spread creates a new object. Both are shallow.
For React state updates and immutable patterns, always prefer spread to avoid accidentally mutating existing objects.
const a = {x: 1, y: 2};
const b = {y: 3, z: 4};
// Spread (new object, a unchanged)
const c = {...a, ...b};
console.log(c); // {x:1, y:3, z:4}
console.log(a); // {x:1, y:2} unchanged
// Object.assign (mutates first arg!)
const d = Object.assign({}, a, b);
console.log(d); // {x:1, y:3, z:4}
Object.assign(a, b); // MUTATES a!
console.log(a); // {x:1, y:3, z:4}
Prefer spread for most cases since it doesn't mutate existing objects.
Both are shallow — nested objects share references. For deep merge use libraries like Lodash _.merge().
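A naive recursive deep-merge sketch (illustrative only — unlike Lodash's _.merge it handles neither arrays, nor cycles, nor prototype safety):

```javascript
function deepMerge(target, source) {
  for (const key of Object.keys(source)) {
    const s = source[key], t = target[key];
    if (s && typeof s === 'object' && t && typeof t === 'object') {
      deepMerge(t, s);   // both sides are objects: recurse
    } else {
      target[key] = s;   // leaf (or type mismatch): source wins
    }
  }
  return target;
}
console.log(deepMerge({ a: { x: 1 } }, { a: { y: 2 } })); // { a: { x: 1, y: 2 } }
```

Note it mutates `target`, like Object.assign — pass `{}` as the first argument for an immutable-style merge.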
Array destructuring extracts values by position. Default values activate when the extracted value is undefined. Rest collects remaining items.
Destructuring is cleaner than manual index access and allows renaming, defaults, and selective extraction in one step.
Be careful: positions beyond the array's length destructure to undefined — there is no out-of-bounds error.
const [a, b, c = 10] = [1, 2];
console.log(a, b, c); // 1 2 10 (default c=10 since undefined)
const [x, , y] = [1, 2, 3]; // skip index 1
console.log(x, y); // 1 3
const [first, ...rest] = [1, 2, 3, 4];
console.log(first); // 1
console.log(rest); // [2, 3, 4]
// Swap without temp variable:
let p = 5, q = 10;
[p, q] = [q, p];
console.log(p, q); // 10 5
Default activates only for undefined — not for null, 0, or false values (which are valid falsy values).
The swap pattern [a, b] = [b, a] is idiomatic ES6 and avoids needing a temporary variable.
Arrow functions with a single expression (no curly braces) implicitly return that expression. Adding curly braces requires an explicit return statement.
This concise syntax is common in map/filter/reduce callbacks but can be confused with object literal shorthand.
A common mistake: using {} thinking it's an implicit return of an object, but it's treated as a function body.
const double = x => x * 2; // implicit return
console.log(double(5)); // 10
const add = (a, b) => a + b; // implicit return
console.log(add(3, 4)); // 7
// Return object literal: must wrap in ()
const makeObj = x => ({ value: x }); // OK
const buggy = x => { value: x }; // undefined! (block body, no return)
console.log(makeObj(5)); // {value: 5}
console.log(buggy(5)); // undefined
Rule: omit braces = implicit return. With braces = function body requiring explicit return.
Wrap object literals in parentheses ( {} ) to distinguish them from function bodies when using implicit return.
The nullish coalescing operator ?? returns the right-hand side only when the left is null or undefined — NOT for other falsy values like 0, '', or false.
Logical OR || returns the right-hand side for ANY falsy value, which causes unintended behavior when 0 or empty string are valid user inputs.
?? was introduced in ES2020 specifically to handle the "0 is a valid value" scenario that || gets wrong.
const a = 0 || 'default'; // "default" (wrong! 0 is valid)
const b = 0 ?? 'default'; // 0 (correct! 0 is not null/undefined)
const c = '' || 'fallback'; // "fallback" (bad for empty strings)
const d = '' ?? 'fallback'; // '' (empty string preserved)
const e = null ?? 'use me'; // "use me" (expected)
const f = undefined ?? 'use me'; // "use me" (expected)
Use ?? when 0, false, or empty string are valid values that should not trigger the fallback.
Combine with optional chaining: user?.name ?? 'Guest' safely handles undefined users.
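Put together (hypothetical user/settings objects):

```javascript
const loggedIn = { name: 'Ada' };
const anonymous = undefined;
console.log(loggedIn?.name ?? 'Guest');  // "Ada"
console.log(anonymous?.name ?? 'Guest'); // "Guest" (?. short-circuits to undefined)

const settings = { fontSize: 0 };        // 0 is a deliberate user choice
console.log(settings.fontSize ?? 16);    // 0 — ?? keeps the valid falsy value
console.log(settings.fontSize || 16);    // 16 — || wrongly discards 0
```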
Generator functions return an iterator. They pause execution at each yield keyword and resume when .next() is called, remembering their state between calls.
Each .next() call returns an object { value, done }. When done is true, the generator is exhausted.
Generators enable lazy sequence generation and are useful for infinite sequences, async control flow, and pipelines.
function* counter() {
yield 1;
yield 2;
yield 3;
}
const gen = counter();
console.log(gen.next()); // {value: 1, done: false}
console.log(gen.next()); // {value: 2, done: false}
console.log(gen.next()); // {value: 3, done: false}
console.log(gen.next()); // {value: undefined, done: true}
// for...of consumes generators automatically:
for (const x of counter()) console.log(x); // 1, 2, 3
Generators are lazy — values are computed on demand, not all at once. Perfect for infinite sequences.
Can pass values INTO generators via gen.next(value) — the value becomes the result of the previous yield expression.
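A sketch of that two-way communication (hypothetical echo generator):

```javascript
function* echo() {
  const a = yield 'first'; // 'a' receives the argument of the 2nd next()
  const b = yield a + '!'; // 'b' receives the argument of the 3rd next()
  return b;
}
const g = echo();
console.log(g.next());      // { value: 'first', done: false }
console.log(g.next('hi'));  // { value: 'hi!', done: false }
console.log(g.next('bye')); // { value: 'bye', done: true }
```

The first next() only starts the generator — any argument to it is discarded, since no yield is waiting yet.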
The JavaScript event loop has two queues: the microtask queue (Promises, queueMicrotask) and the macrotask queue (setTimeout, setInterval). Microtasks always drain completely before any macrotask runs.
Order of execution: synchronous code → microtasks → next macrotask → microtasks → next macrotask...
This ordering is critical for understanding how async code interleaves with synchronous code.
console.log('1 sync');
setTimeout(() => console.log('4 timeout'), 0);
Promise.resolve().then(() => console.log('2 promise'));
queueMicrotask(() => console.log('3 microtask'));
console.log('1b sync');
// Output order:
// 1 sync
// 1b sync
// 2 promise
// 3 microtask
// 4 timeout
Priority: Call stack > Microtasks (Promises) > Macrotasks (setTimeout). Microtasks drain before any macrotask runs.
Even setTimeout(fn, 0) runs AFTER all pending microtasks — "zero delay" just means "add to macrotask queue as soon as possible".
The in operator checks if a property exists anywhere in the prototype chain. hasOwnProperty checks ONLY the object's own properties, not inherited ones.
This distinction matters when working with objects that inherit from prototypes or when iterating with for...in loops.
Always use hasOwnProperty in for...in loops to avoid accidentally processing prototype properties.
function Animal(name) { this.name = name; }
Animal.prototype.type = 'animal';
const dog = new Animal('Rex');
console.log('name' in dog); // true (own)
console.log('type' in dog); // true (prototype)
console.log(dog.hasOwnProperty('name')); // true
console.log(dog.hasOwnProperty('type')); // false!
// for...in includes prototype properties:
for (let key in dog) {
if (dog.hasOwnProperty(key)) console.log(key); // only "name"
}
Rule of thumb: use hasOwnProperty in for...in loops to skip inherited properties.
Modern alternative: Object.keys(obj) and Object.hasOwn(obj, key) only return/check own properties.
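Object.hasOwn (ES2022) also works where the hasOwnProperty method can't — e.g. on objects with a null prototype; a quick sketch:

```javascript
const bare = Object.create(null); // no prototype, so no hasOwnProperty method
bare.name = 'Rex';
// bare.hasOwnProperty('name');   // would throw TypeError
console.log(Object.hasOwn(bare, 'name')); // true
console.log(Object.hasOwn(bare, 'type')); // false
```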
A Symbol is a primitive data type introduced in ES6. Every Symbol value created with Symbol() is guaranteed to be unique — even two Symbols with the same description are not equal.
Symbols are used as unique property keys to avoid name collisions between libraries or to create "private-like" properties on objects.
Symbol keys are skipped by for...in and Object.keys() — they're intentionally "hidden" unless you use Object.getOwnPropertySymbols().
const s1 = Symbol('id');
const s2 = Symbol('id');
console.log(s1 === s2); // false! Each symbol is unique
const obj = { [s1]: 123 };
console.log(obj[s1]); // 123 (symbol as key)
console.log(Object.keys(obj)); // [] (symbols hidden!)
// Well-known symbols:
class Collection {
[Symbol.iterator]() { /* makes object iterable */ }
}
Symbol uniqueness is guaranteed — Symbol('x') !== Symbol('x') always. No equality without the reference.
Used for object property keys to prevent accidental property collisions in large codebases or when extending others' objects.
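Fleshing out the Symbol.iterator idea above, a minimal iterable class (hypothetical Range):

```javascript
class Range {
  constructor(from, to) { this.from = from; this.to = to; }
  *[Symbol.iterator]() { // a generator method serves as the iterator
    for (let i = this.from; i <= this.to; i++) yield i;
  }
}
console.log([...new Range(1, 4)]);               // [1, 2, 3, 4]
for (const n of new Range(1, 3)) console.log(n); // 1, 2, 3
```

Defining the well-known Symbol.iterator key is what lets spread, for...of, and destructuring all consume the object.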