var a = 1; // function-scoped, hoisted
let b = 2; // block-scoped, TDZ applies
const c = 3; // block-scoped, cannot reassign
if (true) {
var x = 10; // accessible outside block
let y = 20; // NOT accessible outside block
}
console.log(x); // 10
console.log(y); // ReferenceError
Use const by default for values that won't be reassigned. Use let only when you need to reassign the variable later.
Avoid var in modern code because its function scoping and hoisting behavior leads to confusing bugs.
Why it matters: var/let/const scoping rules are foundational JavaScript knowledge and a near-universal interview topic. Understanding block scope vs function scope, temporal dead zone, and the const non-mutability-of-binding distinction demonstrates solid JavaScript foundations.
Real applications: Using const by default for all values (enforces immutable bindings), let in for loops and variables that need reassignment, avoiding var in all new code, and understanding legacy codebases that use var extensively. React hooks use const almost exclusively for state and effect variables.
Common mistakes: Thinking const means immutable (it prevents reassignment, not mutation of the value), using var in loops and getting closure bugs (the classic setTimeout in a loop gotcha), not knowing let/const are in the temporal dead zone before their declaration (accessing them before declaration throws ReferenceError, unlike var which returns undefined).
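The classic loop-closure gotcha mentioned above can be sketched minimally. This version collects callbacks in an array instead of using setTimeout so the results are easy to inspect; the behavior is the same.

```javascript
// With var, all three callbacks close over the SAME function-scoped `i`.
const varCallbacks = [];
for (var i = 0; i < 3; i++) {
  varCallbacks.push(() => i);
}
// By the time we call them, the shared `i` has reached 3:
const varResults = varCallbacks.map(fn => fn()); // [3, 3, 3]

// With let, each iteration gets its OWN block-scoped binding.
const letCallbacks = [];
for (let j = 0; j < 3; j++) {
  letCallbacks.push(() => j);
}
const letResults = letCallbacks.map(fn => fn()); // [0, 1, 2]
```

Replacing the array pushes with setTimeout calls reproduces the original setTimeout-in-a-loop bug: the var version logs 3 three times.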
typeof identifies each primitive:
typeof "hello" // "string"
typeof 42 // "number"
typeof true // "boolean"
typeof undefined // "undefined"
typeof null // "object" (historical bug)
typeof Symbol() // "symbol"
typeof 10n // "bigint"
Note that typeof null returns "object" — this is a well-known bug from the first version of JavaScript that was never fixed for backwards compatibility.
Primitives are automatically wrapped in objects temporarily when you call methods on them (like "hello".toUpperCase()), a process called autoboxing.
Why it matters: Primitive types form the foundation of all JavaScript data. Knowing which are primitives (immutable, pass by value) vs objects (mutable, pass by reference) explains many behaviors: why strings are immutable, why you can call methods on numbers, and why typeof null is 'object' (historical bug).
Real applications: Understanding why string methods return new strings (not mutating), knowing that Symbol ensures uniqueness across the codebase, using BigInt for precise financial calculations, and understanding how null/undefined propagate through optional chains and default values.
Common mistakes: Thinking strings are mutable (they're immutable — string methods always return new strings), not knowing typeof null === 'object' (historical bug, use val === null check instead), and confusing autoboxing (temporary wrapper allows methods) with actually converting a primitive to an object (new String() creates an object permanently).
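String immutability is easy to demonstrate directly — a quick sketch of the point above that string methods always return new strings:

```javascript
const original = "hello";
const upper = original.toUpperCase();

console.log(upper);    // "HELLO" — a brand-new string
console.log(original); // "hello" — the original is unchanged

// Index assignment on a string has no effect (and throws in strict mode):
const s = "abc";
// s[0] = "z"; // silently fails in sloppy mode; s stays "abc"
```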
The typeof operator returns a string indicating the type of its operand. It is safe to use on undeclared variables — it returns "undefined" instead of throwing a ReferenceError.
The most well-known quirk is that typeof null returns "object" instead of "null". This is a bug from JavaScript's first implementation that was never fixed.
Another quirk is that typeof NaN returns "number" even though NaN stands for "Not a Number".
Here are all the possible return values of typeof:
typeof undefined // "undefined"
typeof null // "object" (bug)
typeof NaN // "number"
typeof function(){} // "function"
typeof [] // "object"
typeof undeclared // "undefined" (no error)
For reliable type checking, use Object.prototype.toString.call(value) which returns strings like "[object Array]" and "[object Null]".
Use Array.isArray() to check for arrays since typeof [] returns "object".
Why it matters: typeof has several well-known quirks that trip up developers. Knowing them is essential for writing reliable type-checking code and is a frequent interview topic. The typeof null === 'object' bug is particularly famous and appears in many interviews.
Real applications: Type guards in utility functions, argument validation in library code, distinguishing between different kinds of objects (array vs map vs plain object), feature detection before using newer APIs, and debugging unexpected values at system boundaries.
Common mistakes: Using typeof to check for arrays (returns 'object' — use Array.isArray()), relying on typeof null === 'null' (it returns 'object'), not knowing typeof undeclaredVariable doesn't throw (returns 'undefined' — useful for feature detection), and using typeof for instanceof checks (typeof only works for primitives and functions, not specific class instances).
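The feature-detection property mentioned above — typeof not throwing on undeclared names — can be shown side by side with a direct reference, which does throw:

```javascript
// Safe: typeof on an undeclared identifier returns "undefined", no error.
if (typeof structuredClone === "function") {
  // safe to use the newer API here
}

// Unsafe: referencing an undeclared name directly throws ReferenceError.
let threw = false;
try {
  someUndeclaredName; // ReferenceError
} catch (e) {
  threw = e instanceof ReferenceError;
}
console.log(threw); // true

console.log(typeof someUndeclaredName); // "undefined" — no error
```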
null and undefined are loosely equal (==) but not strictly equal (===) because they are different types.
Here is how they compare:
let a;
console.log(a); // undefined
let b = null;
console.log(b); // null
console.log(null == undefined); // true
console.log(null === undefined); // false
console.log(typeof null); // "object"
console.log(typeof undefined); // "undefined"
Use null when you want to explicitly clear a value or indicate "no data." Let JavaScript assign undefined naturally — don't manually set variables to undefined.
In APIs and database responses, null usually means the field exists but holds no value, while an absent (undefined) field means it was never set.
Why it matters: null vs undefined is one of the most fundamental distinctions in JavaScript. Many bugs arise from not knowing when each is used (undefined = not present/initialized, null = intentionally absent) and from using == null (catches both) vs === null (only null).
Real applications: API response parsing where null = field exists but has no value, function parameters where undefined means "use default" but null means "explicitly no value", database NULL representation in API responses, optional chaining returning undefined for missing properties, and React props defaulting when undefined but not when null.
Common mistakes: Using == null for checking both null and undefined (sometimes intended, more often confusing), returning null from functions that should return undefined (JavaScript functions return undefined by default), not distinguishing between "property doesn't exist" (undefined) and "property explicitly cleared" (null), and thinking null == undefined is false (it's true — loose equality).
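The intentional == null pattern described above can be wrapped in a small helper (isMissing is a hypothetical name for illustration) to show exactly what it matches:

```javascript
// `== null` matches BOTH null and undefined, but no other falsy value.
function isMissing(value) {
  return value == null; // true only for null and undefined
}

console.log(isMissing(null));      // true
console.log(isMissing(undefined)); // true
console.log(isMissing(0));         // false — 0 is a real value
console.log(isMissing(""));        // false — empty string is a real value
console.log(isMissing(false));     // false
```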
NaN is the only JavaScript value that is not equal to itself: both NaN === NaN and NaN == NaN return false.
Always use Number.isNaN() for reliable checking. The global isNaN() is unreliable because it coerces its argument to a number first.
Here is the difference between the two checking methods:
console.log(typeof NaN); // "number"
console.log(NaN === NaN); // false
console.log(NaN == NaN); // false
console.log(isNaN("hello")); // true (coerces string)
console.log(Number.isNaN("hello")); // false (strict check)
console.log(Number.isNaN(NaN)); // true
The global isNaN("hello") returns true because it first converts "hello" to a number (which gives NaN), then checks if the result is NaN.
Number.isNaN() does not coerce — it only returns true if the actual value passed to it is NaN.
Why it matters: NaN is the only value in JavaScript not equal to itself (NaN !== NaN). This makes it impossible to detect with ===, requiring a special function. The global isNaN() coerces its argument, leading to false positives. Number.isNaN() is the safe alternative.
Real applications: Validating numeric inputs from user forms, checking whether a parsed number is valid, handling arithmetic overflow guard logic, detecting when parseFloat fails to parse (returns NaN), and safely detecting division by zero results (0/0 returns NaN).
Common mistakes: Using the global isNaN() which coerces its argument (isNaN('hello') === true even though it's not NaN, just unconvertible), using === NaN to check for NaN (always false), not realizing typeof NaN === 'number' (NaN is of type number, just a special invalid one), and propagating NaN through calculations without checking (NaN + anything = NaN).
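Two further checks worth knowing alongside Number.isNaN(): Object.is() treats NaN as equal to itself, and a self-comparison works because NaN is the only value failing x === x (isReallyNaN is a hypothetical helper name):

```javascript
console.log(Object.is(NaN, NaN)); // true — unlike === which gives false

function isReallyNaN(x) {
  return x !== x; // only NaN is not equal to itself
}

console.log(isReallyNaN(NaN));     // true
console.log(isReallyNaN("hello")); // false — no coercion, unlike global isNaN
```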
BigInt values are created with an n suffix on an integer literal (10n) or by calling BigInt(). BigInts cannot be mixed with regular numbers in arithmetic operations without explicit conversion.
BigInt is useful for cryptography, working with large database IDs, and financial calculations that need exact precision.
Here is how BigInt works:
const big = 9007199254740993n;
const also = BigInt("9007199254740993");
console.log(big + 1n); // 9007199254740994n
// console.log(big + 1); // TypeError: Cannot mix BigInt and other types
console.log(big + BigInt(1)); // 9007199254740994n
console.log(typeof big); // "bigint"
BigInt cannot be used with Math methods and cannot represent decimal numbers — it supports integers only.
When comparing, BigInt and Number can be loosely equal (1n == 1 is true) but not strictly equal (1n === 1 is false).
Why it matters: Floating-point numbers have precision limits — Number.MAX_SAFE_INTEGER is 2^53-1. Beyond this, regular numbers lose precision, which can cause subtle financial or cryptographic bugs. BigInt solves this for integer arithmetic.
Real applications: Financial systems handling large sums in cents, cryptographic operations requiring exact 64-bit integer math, database ID handling (Postgres IDs can exceed JavaScript's safe integer range), blockchain transaction amount processing, and astronomical or scientific computing requiring arbitrary precision.
Common mistakes: Mixing BigInt and Number in arithmetic without explicit conversion (TypeError), using BigInt where floating-point is needed (BigInt only supports integers), not knowing JSON doesn't serialize BigInt (throws TypeError on JSON.stringify), and converting BigInt to Number for display without checking if the value fits in safe integer range.
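The JSON.stringify pitfall mentioned above can be worked around with a replacer that serializes BigInts as strings — a common sketch, not the only option:

```javascript
const data = { id: 9007199254740993n };

// JSON.stringify throws on BigInt by default.
let failed = false;
try {
  JSON.stringify(data); // TypeError
} catch (e) {
  failed = true;
}

// Workaround: convert BigInt values to strings via a replacer function.
const json = JSON.stringify(data, (key, value) =>
  typeof value === "bigint" ? value.toString() : value
);
console.log(json); // '{"id":"9007199254740993"}'
```

The consumer must know which fields to parse back with BigInt(), since the type information is lost in the string form.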
Symbols are guaranteed-unique primitive values. Symbol.for() creates shared symbols that can be looked up globally by a key string, unlike regular Symbol() calls, which always produce unique values.
Here is how symbols work as unique identifiers:
const s1 = Symbol("id");
const s2 = Symbol("id");
console.log(s1 === s2); // false (always unique)
const obj = { [s1]: "value" };
console.log(obj[s1]); // "value"
// Global registry
const g1 = Symbol.for("app.id");
const g2 = Symbol.for("app.id");
console.log(g1 === g2); // true
Symbol properties are not enumerable by default — they don't show up in for...in loops or Object.keys(). Use Object.getOwnPropertySymbols() to list them.
JavaScript has built-in well-known symbols like Symbol.iterator, Symbol.toPrimitive, and Symbol.hasInstance that customize object behavior.
Why it matters: Symbols solve the property naming collision problem for library and framework code. They guarantee uniqueness, can't be accidentally accessed by string-based reflection, and the well-known symbols provide standard hooks to customize core JavaScript behavior.
Real applications: Using Symbol.iterator to make custom objects iterable in for...of loops, Symbol() as semi-private object keys in library APIs, Symbol.for() for cross-module shared symbols (like Redux action type constants), Symbol.toPrimitive for custom primitive coercion, and Symbol.hasInstance to customize instanceof behavior.
Common mistakes: Thinking Symbol.for('key') === Symbol('key') (false — Symbol() is always unique, Symbol.for() is global registry), not knowing symbols are invisible to Object.keys() and JSON.stringify(), using symbols for truly private data (they're discoverable via Object.getOwnPropertySymbols()), and not understanding that well-known symbols (Symbol.iterator) replace old string-based interfaces with standard hooks.
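The Symbol.iterator hook mentioned above can be sketched with a minimal custom iterable — a plain object that works in for...of and spread:

```javascript
// A range object made iterable via the well-known Symbol.iterator hook.
const range = {
  from: 1,
  to: 3,
  [Symbol.iterator]() {
    let current = this.from;
    const last = this.to;
    // Return an iterator: an object with a next() method.
    return {
      next: () =>
        current <= last
          ? { value: current++, done: false }
          : { value: undefined, done: true },
    };
  },
};

console.log([...range]); // [1, 2, 3]
```

for...of, spread, and Array.from all consume the same protocol, so implementing Symbol.iterator once makes the object work with all of them.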
Use typeof for primitives, Array.isArray() for arrays, instanceof for class instances, and Object.prototype.toString.call() for the most reliable check across all types.
The typeof operator fails for null (returns "object") and arrays (returns "object"). The instanceof operator checks the prototype chain.
Object.prototype.toString.call() is the gold standard because it returns a unique string for every type including null, arrays, dates, and regex.
Here are the different type-checking approaches:
typeof "str" // "string"
Array.isArray([1, 2]) // true
new Date() instanceof Date // true
// Most reliable approach
Object.prototype.toString.call(null) // "[object Null]"
Object.prototype.toString.call([]) // "[object Array]"
Object.prototype.toString.call(new Date()) // "[object Date]"
The instanceof operator does not work across different execution contexts (e.g., iframes) because each context has its own constructor.
For production code, prefer Array.isArray() for arrays and typeof for primitives — only use toString.call() when you need to distinguish between complex types.
Why it matters: Each type-checking method has specific use cases and limitations. Interviews frequently include questions about type checking because the right answer depends on what you're checking. Knowing the toolkit prevents both missed checks and over-engineering.
Real applications: Argument validation in utility functions, building type narrowing guards in TypeScript-style JavaScript, runtime schema validation at API boundaries, distinguishing arrays from plain objects in serialization code, and building debug/inspection utilities that need to identify all value types.
Common mistakes: Using instanceof for cross-iframe type checks (different prototype chains), using typeof [] (returns 'object', not 'array'), trusting constructor.name (minification renames it), and writing fragile type checks that miss edge cases like null being typeof 'object' or function being both typeof 'function' and instanceof Object.
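The toString.call() approach above can be packaged into a small helper (getType is a hypothetical name for illustration) that sidesteps the typeof and instanceof pitfalls:

```javascript
// Extract the type tag: "[object Array]" -> "array", "[object Null]" -> "null".
function getType(value) {
  return Object.prototype.toString.call(value).slice(8, -1).toLowerCase();
}

console.log(getType(null));       // "null" — unlike typeof's "object"
console.log(getType([]));         // "array" — unlike typeof's "object"
console.log(getType(new Date())); // "date"
console.log(getType(/x/));        // "regexp"
console.log(getType("hi"));       // "string"
```

Note that the tag for user-defined class instances is just "object", so this complements instanceof rather than replacing it.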
const prevents reassignment of the variable binding, not mutation of the value it holds. When you use const with an object, the variable always points to the same object, but the object's properties can be changed freely.
The same applies to arrays — you can push, pop, and modify elements of a const array, but you cannot assign a new array to the variable.
To make an object truly immutable, use Object.freeze(). Note that freeze is shallow — nested objects can still be modified.
Here is const behavior with objects and arrays:
const obj = { name: "Alice" };
obj.name = "Bob"; // Allowed — mutating the object
// obj = {}; // TypeError — reassigning the binding
const arr = [1, 2, 3];
arr.push(4); // Allowed
// arr = []; // TypeError
const frozen = Object.freeze({ x: 1 });
frozen.x = 2; // Silently fails (strict mode: TypeError)
console.log(frozen.x); // 1
For deep immutability, you need to recursively freeze all nested objects, or use libraries like Immer or Immutable.js.
In interviews, explain that const protects the binding (the arrow from variable name to memory location), not the contents of that memory.
Why it matters: This is one of the most commonly misunderstood JavaScript features. Many developers assume const means immutable, leading to bugs when const object properties are unexpectedly modified. Understanding binding vs value mutability is a core concept.
Real applications: Using const for object references that shouldn't be rebound (even if properties change), using Object.freeze() when you want true shallow immutability, understanding why React's useState hook setter replaces instead of mutates state, and explaining why spreading an object into a new const doesn't prevent modification of the spread copy.
Common mistakes: Thinking const obj = {} prevents mutation of obj's properties (it doesn't — only prevents reassignment), using const for a value you're about to mutate (works but misleads), not using Object.freeze() when you actually need immutability, and confusing const with readonly in TypeScript (similar concept, works at compile time only).
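The recursive freeze mentioned above can be sketched as a minimal deepFreeze helper (a hypothetical name; production code might prefer a library like Immer):

```javascript
// Recursively freeze nested objects, then freeze the parent.
function deepFreeze(obj) {
  for (const value of Object.values(obj)) {
    if (typeof value === "object" && value !== null && !Object.isFrozen(value)) {
      deepFreeze(value); // freeze children first
    }
  }
  return Object.freeze(obj);
}

const config = deepFreeze({ api: { url: "https://example.com", retries: 3 } });

try {
  config.api.retries = 99; // no-op in sloppy mode, TypeError in strict mode
} catch (e) { /* ignore */ }

console.log(config.api.retries); // 3 — nested object is frozen too
```

This sketch doesn't handle cycles; a production version would track visited objects to avoid infinite recursion.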
Type coercion happens implicitly (through operators and comparisons) or explicitly through the conversion functions (Number(), String(), Boolean()).
The + operator prefers string concatenation when one operand is a string. Other arithmetic operators (-, *, /) always convert to numbers.
The == operator performs type coercion before comparing, while === does not — this is why strict equality is always recommended.
Here are examples of both implicit and explicit coercion:
// Implicit coercion
"5" + 3 // "53" (string concat)
"5" - 3 // 2 (numeric)
true + 1 // 2
"" == false // true
// Explicit conversion
Number("42") // 42
String(123) // "123"
Boolean(0) // false
Boolean("hi") // true
parseInt("10px") // 10
There are only a handful of falsy values in JavaScript: false, 0 (and -0), 0n, "" (empty string), null, undefined, and NaN. Everything else is truthy.
parseInt() and parseFloat() parse from left to right and stop at the first non-numeric character, making them more lenient than Number().
Why it matters: Type coercion is one of JavaScript's most notorious features. It enables some convenient shorthand but also causes subtle bugs. Understanding when implicit coercion happens (==, +, if conditions, template literals) prevents unexpected behaviors that are hard to debug.
Real applications: Intentional coercion with !! for boolean conversion, Number() for strict numeric parsing, String() for safe toString conversion, parseInt with explicit radix for parsing user input, and understanding why API responses that stringify numbers need explicit Number() conversion.
Common mistakes: Not specifying radix in parseInt() (parseInt('010') was octal in old browsers), using + to concatenate a number to a string and getting concatenation instead of addition ('5' + 3 === '53'), not knowing Number('') === 0 (empty string converts to 0 not NaN), and using == for null checks where the coercion behavior is intentional but confusing to readers.
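The explicit-conversion idioms above, side by side — including the Number() vs parseInt() leniency difference and the empty-string surprise:

```javascript
console.log(!!"hi");               // true — double negation to boolean
console.log(Number("42"));         // 42  — strict: whole string must be numeric
console.log(Number("42px"));       // NaN — Number() rejects trailing junk
console.log(parseInt("42px", 10)); // 42  — parseInt stops at first non-digit
console.log(parseInt("10", 2));    // 2   — explicit radix: "10" in binary
console.log(Number(""));           // 0   — surprising: empty string becomes 0
```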
// Pass by value (primitives)
let a = 10;
function change(x) { x = 20; }
change(a);
console.log(a); // 10 — unchanged
// Pass by reference (objects)
let obj = { name: "Alice" };
function modify(o) { o.name = "Bob"; }
modify(obj);
console.log(obj.name); // "Bob" — changed!
// Reassigning doesn't affect original
function replace(o) { o = { name: "Charlie" }; }
replace(obj);
console.log(obj.name); // "Bob" — still Bob
When you reassign a parameter inside a function, you are only changing the local reference — the original variable still points to the same object.
To prevent mutations, create a copy of the object using spread syntax ({...obj}) or structuredClone() for deep copies.
Why it matters: Pass by value vs pass by reference is the most fundamental concept for understanding why primitives are safe to share but objects are not. Every mutation bug in React state, Redux reducers, or shared function arguments traces back to accidentally sharing a reference.
Real applications: React state updates requiring new object references to trigger re-renders, Redux reducers creating new state objects, function parameters that should not modify caller's data, caching patterns where you store copies not references, and detecting changes via reference equality (===) in memoization.
Common mistakes: Passing an object to a function and being surprised it was mutated (functions receive the reference, not a copy), using === to compare objects by content (compares references — use JSON or deep equal), not knowing that const for objects doesn't prevent mutation (only prevents reassignment), and shallow copying and then still seeing mutations in nested objects.
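The defensive-copy pattern described above, sketched minimally (rename is a hypothetical function name): a function returns a modified copy instead of mutating its argument.

```javascript
// Copy-then-modify: the caller's object is never touched.
function rename(user) {
  return { ...user, name: "Bob" }; // new object with one field overridden
}

const original = { name: "Alice" };
const updated = rename(original);

console.log(original.name);        // "Alice" — unchanged
console.log(updated.name);         // "Bob"
console.log(original === updated); // false — different references
```

Note that spread is shallow; if user had nested objects, those would still be shared, and structuredClone() would be needed for full independence.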
false, 0 (and -0), "" (empty string), null, undefined, and NaN. Everything else is truthy.
Surprisingly, empty arrays [], empty objects {}, and the string "0" are all truthy — this catches many beginners off guard.
Here is how truthy and falsy values behave:
// Falsy values
Boolean(false) // false
Boolean(0) // false
Boolean("") // false
Boolean(null) // false
Boolean(undefined) // false
Boolean(NaN) // false
// Truthy values (surprising ones)
Boolean([]) // true — empty array is truthy!
Boolean({}) // true — empty object is truthy!
Boolean("0") // true — non-empty string
Boolean("false") // true — non-empty string
Boolean(-1) // true — non-zero number
Boolean(Infinity) // true
Truthy/falsy values are used in if conditions, logical operators (&&, ||), and the ternary operator.
Use double negation (!!value) as a shorthand to convert any value to its boolean equivalent.
Why it matters: Truthy/falsy values are used everywhere in JavaScript for conditional checks. Knowing the complete list of falsy values (false, 0, -0, 0n, '', null, undefined, NaN) prevents bugs from unexpected falsy behavior and is a frequent interview topic.
Real applications: Conditional rendering in React JSX ({items.length > 0 && <List/>}), guard clauses at function start (if (!user) return), default value patterns with || operator, form validation (empty string is falsy), and filtering arrays of mixed values with Boolean as predicate: array.filter(Boolean).
Common mistakes: Using 0 as a valid value in a truthy check (0 is falsy, use explicit !== undefined), JSX rendering bug with {count && <Component/>} when count === 0 (renders "0" — use ternary instead), empty string being falsy when you want to distinguish "empty" from "missing", and NaN being falsy (arithmetic failures silently pass falsy checks).
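The filter(Boolean) idiom mentioned above strips all falsy values from a mixed array in one pass, because Boolean acts as the predicate:

```javascript
const mixed = [0, 1, "", "hi", null, undefined, NaN, false, [], {}];
const truthy = mixed.filter(Boolean);

console.log(truthy); // [1, "hi", [], {}] — empty array and object survive!
```

Note that the empty array and empty object make it through — they are truthy, exactly the surprise called out above.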
"" == false is true because both coerce to zero.
In professional JavaScript development, always use === unless you specifically need type coercion (common only for null checks).
Here are some surprising comparisons:
// Loose equality (==) — with coercion
"5" == 5 // true (string coerced to number)
"" == false // true (both become 0)
null == undefined // true (special rule)
0 == false // true (false becomes 0)
"" == 0 // true (both become 0)
// Strict equality (===) — no coercion
"5" === 5 // false (different types)
"" === false // false
null === undefined // false
0 === false // false
// The only useful == check
value == null // true for null AND undefined
The null == undefined check is the one case where loose equality is actually useful — it lets you check for both null and undefined in a single comparison.
ESLint's eqeqeq rule enforces strict equality throughout your codebase and is enabled in most professional configurations.
Why it matters: Abstract equality (==) performs type coercion according to a complex algorithm causing counterintuitive results. Understanding when == is explicitly useful (== null checks both null and undefined) vs when it's dangerous ('' == 0 is true) is important for reading legacy code and writing solid comparisons.
Real applications: Using === for all comparisons in new code, the intentional == null pattern to check for both null and undefined in one comparison, understanding legacy code that relies on == coercion, configuring ESLint to enforce strict equality, and TypeScript's strict mode which also catches type mismatch comparisons at compile time.
Common mistakes: Using == when comparing user input strings to numbers ('5' == 5 is true but may lead to bugs), not knowing null == undefined but null !== undefined, relying on implicit coercion chains that are hard to reason about, and being tripped up by 0 == '' (true) or false == '0' (true) in conditional logic.
const original = { name: "Alice", address: { city: "NYC" } };
// Shallow copy methods
const shallow1 = { ...original };
const shallow2 = Object.assign({}, original);
shallow1.name = "Bob";
console.log(original.name); // "Alice" — primitive copied
shallow1.address.city = "LA";
console.log(original.address.city); // "LA" — nested object shared!
// Deep copy methods
const deep1 = structuredClone(original); // Modern (recommended)
const deep2 = JSON.parse(JSON.stringify(original)); // Classic
deep1.address.city = "Chicago";
console.log(original.address.city); // "LA" — independent copy
structuredClone() is the modern recommended approach — it handles circular references, Dates, Maps, Sets, and more. Available in all modern browsers and Node.js 17+.
JSON.parse(JSON.stringify()) fails with functions, undefined, Dates (converted to strings), RegExp, and circular references.
Why it matters: Shallow copy vs deep copy is a critical distinction for state management. Shallow copies share nested references (mutation propagates), deep copies are independent. Choosing incorrectly causes either unnecessary memory use or accidental mutation bugs.
Real applications: Redux reducers updating nested state (shallow copy one level, replace nested where changed), React setState with deeply nested objects, cloning template objects for configuration, duplicating complex nested data models for undo/redo stacks, and sharing state snapshots between components.
Common mistakes: Using spread or Object.assign() for nested objects and thinking they're independent copies (they share nested references), using JSON round-trip for deep copy when the object has Dates, functions, or undefined values (silently loses them), and not using structuredClone() which is the modern, spec-compliant deep copy that handles most types correctly.
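The reducer-style nested update described above, as a minimal sketch: shallow-copy each level you change and reuse the unchanged branches.

```javascript
const state = {
  user: { name: "Alice", address: { city: "NYC" } },
  settings: { theme: "dark" },
};

// Copy every level on the path to the change; leave other branches alone.
const next = {
  ...state,
  user: {
    ...state.user,
    address: { ...state.user.address, city: "LA" },
  },
};

console.log(next.user.address.city);           // "LA"
console.log(state.user.address.city);          // "NYC" — original untouched
console.log(next.settings === state.settings); // true — unchanged branch reused
```

Reusing unchanged branches is what makes reference-equality checks (and React memoization) cheap: only the modified path gets new objects.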
Optional chaining (?.) safely accesses deeply nested properties without throwing an error if an intermediate property is null or undefined. It short-circuits and returns undefined instead of throwing a TypeError.
Before optional chaining, you had to write verbose checks like user && user.address && user.address.city. Now you can write user?.address?.city.
It works with property access, array indexing, and function calls. Combined with nullish coalescing (??), it provides elegant default values.
Here is optional chaining in action:
const user = { name: "Alice", address: null };
// Without optional chaining — verbose and error-prone
const city1 = user && user.address && user.address.city;
// With optional chaining — clean and safe
const city2 = user?.address?.city; // undefined (no error)
// With arrays
const arr = null;
console.log(arr?.[0]); // undefined
// With function calls
const obj = {};
console.log(obj.someMethod?.()); // undefined
// Combined with nullish coalescing for defaults
const city = user?.address?.city ?? "Unknown";
console.log(city); // "Unknown"
Optional chaining only checks for null and undefined — it does NOT short-circuit for other falsy values like 0, "", or false.
Don't overuse optional chaining on every property — if a value should always exist, let it throw so you can catch the bug early.
Why it matters: Optional chaining was one of the most anticipated JavaScript features. It eliminates verbose null guard code and prevents the classic "Cannot read property of undefined" error. It's now fundamental to accessing deeply nested API response data.
Real applications: Accessing nested API response properties (response?.data?.user?.name), checking event targets (?. before accessing event.target.value), calling optional callback props in React, accessing optional chained methods (?.), and safely reading nested config objects where some levels may not exist.
Common mistakes: Overusing ?. everywhere making it impossible to detect when required data is actually missing (should throw), not knowing ?. short-circuits (returns undefined, doesn't continue the chain), confusing ?. with ?? (chaining vs nullish coalescing), and using optional chaining on function calls (?.(args)) when the function itself should always exist.
The nullish coalescing operator (??) returns the right-hand operand only when the left-hand side is null or undefined. The logical OR (||) returns the right-hand side for any falsy value.
This is important because || treats 0, "", and false as falsy and replaces them with the default — which is often not what you want.
Use ?? when you want to provide a default only for null/undefined, and || when you want to replace all falsy values.
Here is the key difference:
// || replaces ALL falsy values
0 || 10 // 10 (0 is falsy)
"" || "default" // "default" (empty string is falsy)
false || true // true (false is falsy)
// ?? replaces ONLY null and undefined
0 ?? 10 // 0 (0 is not null/undefined)
"" ?? "default" // "" (empty string is not null/undefined)
false ?? true // false (false is not null/undefined)
null ?? 10 // 10
undefined ?? 10 // 10
// Practical example
function getConfig(options) {
const timeout = options.timeout ?? 3000; // 0 is valid
const retries = options.retries ?? 3; // 0 means no retries
}
Use ?? for configuration values where 0, empty string, or false are meaningful values that should not be replaced.
You cannot mix ?? with && or || without parentheses — JavaScript requires explicit grouping to avoid ambiguity.
Why it matters: ?? vs || is a common interview question about null/undefined handling. The key insight is that 0, '', and false are valid intentional values that should not trigger || fallback but should trigger React prop defaults and function parameter defaults correctly.
Real applications: Configuration objects where 0 timeout or 0 retries are valid values (not defaults), React component prop defaults where 0 or false should be preserved, database-driven config values where empty string is valid, and TypeScript optional parameter patterns where undefined triggers default but null means "explicitly none".
Common mistakes: Using || for default values when 0 or false are valid (|| treats them as falsy, replaces with default), not knowing ??= assignment shorthand (assigns only if current value is null/undefined), and combining ?? without parentheses with && or || (syntax error — JS requires explicit grouping for clarity).
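The ??= assignment shorthand mentioned above assigns only when the current value is null or undefined, leaving 0, "", and false alone:

```javascript
const opts = { timeout: 0, retries: undefined };

opts.timeout ??= 3000; // 0 is a real value — kept
opts.retries ??= 3;    // undefined — replaced with the default

console.log(opts.timeout); // 0
console.log(opts.retries); // 3
```

Its siblings ||= and &&= follow the same pattern for falsy and truthy values respectively.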
// Object.preventExtensions — no new properties
const obj1 = { a: 1 };
Object.preventExtensions(obj1);
obj1.a = 2; // Allowed — modify existing
delete obj1.a; // Allowed — delete existing
obj1.b = 3; // Silently fails (strict: TypeError)
// Object.seal — no add/delete, can modify
const obj2 = { a: 1 };
Object.seal(obj2);
obj2.a = 2; // Allowed — modify existing
delete obj2.a; // Silently fails
obj2.b = 3; // Silently fails
// Object.freeze — no changes at all
const obj3 = { a: 1 };
Object.freeze(obj3);
obj3.a = 2; // Silently fails
delete obj3.a; // Silently fails
obj3.b = 3; // Silently fails
// Check status
Object.isExtensible(obj1); // false
Object.isSealed(obj2); // true
Object.isFrozen(obj3); // true
In strict mode, all silently failing operations throw TypeError instead, making it easier to catch mistakes.
For truly immutable data structures, consider libraries like Immer or the structuredClone() + freeze pattern.
Why it matters: Object.freeze() is the standard way to create immutable configuration objects at runtime. Understanding the immutability hierarchy (preventExtensions < seal < freeze) and their shallow nature is important for both practical use and interviews.
Real applications: Freezing configuration constants that should never change, preventing mutation of shared theme objects in design systems, creating truly immutable value objects in domain-driven design, read-only data models for display components, and understanding Immutable.js/Immer under the hood.
Common mistakes: Thinking freeze() deep-freezes nested objects (it's shallow — nested objects can still mutate), expecting mutations to throw in sloppy mode (they silently fail — only strict mode throws), using Object.seal() when you mean Object.freeze() (seal still allows value changes), and not checking with Object.isFrozen() if you need to conditionally freeze.
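The strict-mode difference noted above can be demonstrated with a strict-mode function: the silent failure becomes a TypeError.

```javascript
const result = (() => {
  "use strict"; // strict mode scoped to this function
  const frozen = Object.freeze({ a: 1 });
  try {
    frozen.a = 2; // throws in strict mode instead of failing silently
    return "no error";
  } catch (e) {
    return e instanceof TypeError ? "TypeError" : "other";
  }
})();

console.log(result); // "TypeError"
```

ES modules and class bodies are always strict, so in modern module code these mutations throw by default.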
"hello".toUpperCase(), JavaScript temporarily wraps it in a wrapper object (like new String("hello")), calls the method, and then discards the wrapper. This is called autoboxing.
You can also explicitly create wrapper objects using new String(), new Number(), or new Boolean(), but this is strongly discouraged because they create objects, not primitives.
Wrapper objects behave differently from primitives in comparisons — new String("hello") === "hello" is false because one is an object and the other is a primitive.
Here is how autoboxing and wrapper objects work:
// Autoboxing — JS temporarily wraps primitives
"hello".toUpperCase(); // JavaScript does: new String("hello").toUpperCase()
(42).toFixed(2); // JavaScript does: new Number(42).toFixed(2)
// DON'T use constructor wrappers
const strObj = new String("hello");
const strPrim = "hello";
console.log(typeof strObj); // "object"
console.log(typeof strPrim); // "string"
console.log(strObj === strPrim); // false!
// Wrapper objects are truthy!
const boolObj = new Boolean(false);
if (boolObj) {
console.log("This runs!"); // Because objects are truthy
}
Never use new String(), new Number(), or new Boolean() — use the primitive values directly or the conversion functions without new.
The difference between String("42") (conversion function, returns primitive) and new String("42") (constructor, returns object) is critical.
Why it matters: Autoboxing is the invisible mechanism that lets you call .length, .toUpperCase(), and other methods on string literals. Without it, you'd need to use new String('hello') everywhere. Understanding this also explains why typeof new String('x') === 'object', not 'string'.
Real applications: Understanding why strings have methods (.split, .trim, .includes) even though they're primitives, explaining why String('x') !== new String('x') in strict equality, avoiding the new String()/Number()/Boolean() anti-pattern (returns objects, not primitives), and understanding TypeScript's string vs String type distinction.
Common mistakes: Using new String(), new Number(), or new Boolean() as constructors (creates objects not primitives — causes typeof confusion and broken equality checks), expecting primitive method calls to mutate the primitive (they can't — strings are immutable), and not knowing that new Boolean(false) is truthy (it's an object, objects are truthy).
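The immutability point above can be shown directly: primitive method calls return new values and never change the original (a minimal sketch):

```javascript
// String methods return new strings; the original primitive never changes
const original = "hello";
const upper = original.toUpperCase();
console.log(original); // "hello", unchanged
console.log(upper);    // "HELLO"

// Conversion functions WITHOUT new return primitives, not wrapper objects
console.log(typeof String(42));     // "string"
console.log(typeof Number("42"));   // "number"
console.log(typeof new String(42)); // "object", the anti-pattern
```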
``) instead of quotes and support string interpolation with ${expression}, multi-line strings, and tagged templates.
Interpolation can contain any JavaScript expression — variables, function calls, arithmetic, and even ternary operators.
Tagged templates allow you to process template literals with a function, giving you access to the string parts and interpolated values separately. This is used in libraries like styled-components and GraphQL.
Here is how template literals and tagged templates work:
const name = "Alice";
const age = 30;
// String interpolation
const greeting = `Hello, ${name}! You are ${age} years old.`;
// Multi-line strings
const html = `
  <div>
    <h1>${name}</h1>
    <p>Age: ${age}</p>
  </div>
`;
// Expressions inside interpolation
const msg = `Status: ${age >= 18 ? "adult" : "minor"}`;
// Tagged template
function highlight(strings, ...values) {
  // Check the index, not truthiness: values like 0 or "" would otherwise be dropped
  return strings.reduce((result, str, i) =>
    result + str + (i < values.length ? String(values[i]) : ""), "");
}
const output = highlight`Name: ${name}, Age: ${age}`;
// "Name: Alice, Age: 30"
Tagged templates receive the string parts as an array and the interpolated values as separate arguments, allowing custom string processing.
The String.raw built-in tag returns the raw string without processing escape sequences — useful for regex patterns and file paths.
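The String.raw behavior described above can be sketched as follows (the path and regex are illustrative):

```javascript
// String.raw leaves backslash escape sequences unprocessed
const winPath = String.raw`C:\temp\new_folder`;
console.log(winPath); // "C:\temp\new_folder", with \t and \n NOT interpreted

// A regular template literal interprets the escapes
const cooked = `C:\temp\new_folder`;
console.log(cooked.includes("\t")); // true: \t became a tab character

// Handy for regex sources with many backslashes
const digits = new RegExp(String.raw`\d+`);
console.log(digits.test("abc123")); // true
```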
Why it matters: Template literals are the modern standard for string interpolation and multiline strings. Tagged templates go further, enabling DSLs like GraphQL queries, SQL builders, styled-components CSS-in-JS, and i18n systems. Understanding both forms is important for modern JavaScript.
Real applications: String interpolation in all modern JS code, multiline HTML or SQL strings, styled-components CSS template literals, GraphQL query definitions, i18n translation tags, and String.raw for regex or Windows file path patterns where backslashes shouldn't be escape sequences.
Common mistakes: Not knowing tagged template literals can process the template as a custom string (they receive raw parts + values as separate arguments), using concatenation instead of template literals for complex strings, and not knowing String.raw is the only built-in tag (other useful ones come from libraries like gql, css, html, sql).
=== returns false — they are different objects in memory.
Here is how value and reference types differ:
// Value types — independent copies
let a = 10;
let b = a;
b = 20;
console.log(a); // 10 — unchanged
// Reference types — shared reference
let obj1 = { x: 1 };
let obj2 = obj1;
obj2.x = 2;
console.log(obj1.x); // 2 — changed!
// Object comparison
const arr1 = [1, 2, 3];
const arr2 = [1, 2, 3];
console.log(arr1 === arr2); // false — different references
console.log(arr1 === arr1); // true — same reference
// To compare by content, use JSON or deep comparison
console.log(JSON.stringify(arr1) === JSON.stringify(arr2)); // true
For deep content comparison, use JSON.stringify() for simple cases (note it is sensitive to property order and throws on circular references) or a library like Lodash's isEqual() for complex objects, including those with circular references.
Understanding value vs reference types explains many common bugs — especially when passing objects to functions or storing them in arrays.
Why it matters: Value type vs reference type is the conceptual foundation of JavaScript's memory model. It explains object mutation bugs, why React needs immutable state updates (reference equality for change detection), and why you need to copy objects before modifying them safely.
Real applications: React useState needing new object references to trigger updates, Redux requiring pure functions that return new state objects, debugging why a function unexpectedly modified the caller's object, building memoization caches that compare by reference, and understanding why WeakMap/WeakSet use object references as keys (weak references to objects).
Common mistakes: Modifying an array or object parameter inside a function and being surprised the caller sees the change, using === to compare objects by content (compares reference identity), not creating copies when needed (use spread, slice, or structuredClone), and confusing "passing by reference" semantics — JavaScript actually passes object references by value (you can't redirect the caller's variable, only mutate the pointed-to object).
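The copying techniques mentioned above (spread vs structuredClone) can be sketched like this; structuredClone requires Node 17+ or a modern browser, and the data is illustrative:

```javascript
// Shallow copy: spread copies top-level properties only
const user = { name: "Alice", prefs: { theme: "dark" } };
const shallow = { ...user };
shallow.prefs.theme = "light"; // nested object is still shared!
console.log(user.prefs.theme); // "light"

// Deep copy: structuredClone duplicates nested objects too
const user2 = { name: "Bob", prefs: { theme: "dark" } };
const deep = structuredClone(user2);
deep.prefs.theme = "light";
console.log(user2.prefs.theme); // "dark", the copy is independent
```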
toString() method throws on null and undefined.
Here are all the common conversion methods:
// To Number
Number("42") // 42
Number("") // 0
Number(true) // 1
Number(null) // 0
Number(undefined) // NaN
Number("hello") // NaN
parseInt("42px") // 42 — stops at non-numeric
parseFloat("3.14") // 3.14
+"42" // 42 — unary plus shorthand
// To String
String(42) // "42"
String(null) // "null"
String(undefined) // "undefined"
(42).toString() // "42"
42 + "" // "42" — concat shorthand
(255).toString(16) // "ff" — base conversion
// To Boolean
Boolean(0) // false
Boolean("") // false
Boolean(null) // false
Boolean("hello") // true
Boolean([]) // true — empty array is truthy!
!!42 // true — double negation shorthand
The unary plus (+value) is the shortest way to convert to a number, and double negation (!!value) is the shortest way to convert to a boolean.
Be careful with Number("") returning 0 and Number(null) returning 0 — these can be unexpected in practice.
Why it matters: Explicit type conversion is essential when working with user input, API responses, URL parameters, and form data — all of which are strings by default. Knowing which conversion function to use and what edge cases exist prevents subtle parsing bugs.
Real applications: Parsing URL query parameters (always strings) to numbers, converting form input strings to typed values, normalizing API response data from JSON where all values may be strings, converting boolean flags from localStorage strings, and building type-safe data transformation layers for external data sources.
Common mistakes: Not specifying radix in parseInt (parseInt('010') differed between engines in older code), using + unary to convert to number (works but is less readable), Number('') returning 0 instead of the expected NaN ('' looks like "nothing" but converts to 0), and Boolean('false') returning true (non-empty string is truthy — use val === 'true' for string boolean parsing).
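The parseInt radix and string-boolean pitfalls listed above can be sketched as (the localStorage-style value is illustrative):

```javascript
// Always pass the radix to parseInt for predictable parsing
console.log(parseInt("010", 10)); // 10, decimal
console.log(parseInt("ff", 16));  // 255, hexadecimal

// Boolean("false") is true: any non-empty string is truthy
console.log(Boolean("false")); // true

// Parse string booleans (e.g. values read back from localStorage) by comparison instead
const flag = "false";
const enabled = flag === "true";
console.log(enabled); // false
```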