false whenever types differ. The only widely accepted use of == is value == null, which conveniently matches both null and undefined in one check.
// == coerces types before comparing
1 == '1' // true ('1' coerced to 1)
true == 1 // true (true coerced to 1)
null == undefined // true (special spec rule)
'' == false // true (both become 0)
[] == false // true ([] → '' → 0, false → 0)
// === compares type + value — no coercion
1 === '1' // false (number vs string)
null === undefined // false (different types)
// Only valid == use: covers null + undefined together
if (value == null) { }
// Same as: value === null || value === undefined
// Use === everywhere else
if (count === 0) { }
if (typeof x === 'string') { }
Why it matters: Loose vs. strict equality comes up in nearly every JavaScript interview; the Airbnb ESLint config (eqeqeq rule), widely used in professional React and Angular projects, bans == except for null checks.
Real applications: ESLint's eqeqeq rule and TypeScript's strict mode enforce strict equality at the tooling level; Google's JavaScript style guide and Airbnb's config discourage == throughout production code.
Common mistakes: Writing if (x == null) without realizing it also matches undefined — sometimes intentional, but developers often overlook this and let unexpected undefined values silently pass null guards.
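A minimal sketch of this behavior — describe is a hypothetical helper showing that a value == null guard catches undefined as well:

```javascript
// Hypothetical helper: == null matches both null and undefined,
// but nothing else (0 and '' pass through).
function describe(value) {
  if (value == null) return 'missing'; // catches null AND undefined
  return `present: ${typeof value}`;
}

describe(null); // 'missing'
describe(undefined); // 'missing' — also caught, intended or not
describe(0); // 'present: number' (0 is falsy but not nullish)
```

If only null should match, write value === null explicitly.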
if, &&, ||) silently coerce values to boolean using truthy/falsy rules. For objects, JavaScript uses the ToPrimitive algorithm, checking Symbol.toPrimitive, then valueOf(), then toString().
// + prefers strings — concatenation over addition
'5' + 3 // "53" (3 coerced to string)
'5' + true // "5true"
1 + 2 + '3' // "33" (left-to-right: 3 then "33")
// Other arithmetic always coerces to numbers
'6' - 2 // 4 ('6' → 6)
null + 5 // 5 (null → 0)
undefined + 1 // NaN (undefined → NaN)
true + true // 2 (1 + 1)
// Boolean coercion in conditions
if ('') { } // falsy — skipped
if ([]) { } // truthy — runs (empty array is truthy!)
// Unary + forces numeric coercion
+'42' // 42
+true // 1
+null // 0
+'' // 0
Why it matters: Failing to predict implicit coercion is a top cause of subtle production bugs in JavaScript; many technical interviews include at least one output-prediction coercion question.
Real applications: Node.js req.query values are always strings, so req.query.page + 1 concatenates instead of adding; React conditional renders like {count && <Component />} render a visible "0" when count is 0.
Common mistakes: Writing {count && <Component />} in React when count can be 0 — the number 0 is falsy but JSX renders it as the character "0"; use {count > 0 && <Component />} instead.
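The gotcha can be reproduced in plain JavaScript, since && simply returns one of its operands; the string 'ItemList' is a stand-in for a rendered component:

```javascript
// && returns the left operand when it is falsy — 0 leaks through
const render = (count) => count && 'ItemList'; // mimics {count && <ItemList />}
render(3); // 'ItemList'
render(0); // 0 — JSX renders this as the character "0"

// Comparing explicitly yields false, which JSX renders as nothing
const safeRender = (count) => count > 0 && 'ItemList';
safeRender(0); // false
```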
null and undefined, unlike .toString() which throws a TypeError on those values. parseInt() and parseFloat() are more permissive than Number() — they stop at the first non-numeric character and return a partial result. Always pass a radix to parseInt() to avoid octal misinterpretation in older engines.
// To String — String() never throws
String(123) // "123"
String(null) // "null" (safe)
String(undefined) // "undefined" (safe)
// null.toString() // TypeError! (unsafe)
// To Number
Number('42') // 42
Number('') // 0
Number(null) // 0
Number(undefined) // NaN (not 0!)
Number('12px') // NaN (rejects partial strings)
// parseInt is more lenient (parses until non-digit)
parseInt('12px') // 12
parseInt('3.5rem') // 3 (integer only)
parseInt('0x1F', 16) // 31 (hex with radix)
// To Boolean
Boolean(0) // false
Boolean([]) // true (empty array is truthy!)
Boolean(null) // false
Boolean('false') // true (non-empty string)
// Shorthands
+value // same as Number(value)
!!value // same as Boolean(value)
'' + value // same as String(value)
Why it matters: Explicit conversion communicates intent clearly and prevents coercion surprises; TypeScript and most linting configs require explicit conversion over implicit, making this a signal of production code quality.
Real applications: D3.js uses unary +d.value to coerce CSV strings to numbers when parsing datasets; Lodash's internal _.toString() wraps String() to safely handle all value types including null.
Common mistakes: Forgetting the radix argument in parseInt() — parseInt('08') returns 0 in older JavaScript engines that treat the leading zero as an octal prefix; always write parseInt('08', 10).
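A quick sketch of why the explicit radix matters — the same digit string parses to different numbers under different bases:

```javascript
// Explicit radix removes any ambiguity about the base
parseInt('101', 10); // 101 (decimal)
parseInt('101', 2); // 5 (binary)
parseInt('ff', 16); // 255 (hex)
parseInt('08', 10); // 8 (no octal misreading, even in old engines)
```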
false in boolean context: false, 0, -0, 0n (BigInt zero), "" (empty string), null, undefined, and NaN. Every other value — including empty arrays [] and empty objects {} — is truthy. This distinction is fundamental because virtually every JavaScript conditional, short-circuit expression, and filter operation relies on it.
// The complete list of falsy values
Boolean(false) // false
Boolean(0) // false
Boolean(-0) // false
Boolean(0n) // false (BigInt zero)
Boolean('') // false (empty string)
Boolean(null) // false
Boolean(undefined) // false
Boolean(NaN) // false
// Surprising truthy values
Boolean([]) // true! (empty array)
Boolean({}) // true! (empty object)
Boolean('0') // true! (non-empty string)
Boolean('false') // true! (non-empty string)
Boolean(-1) // true (non-zero number)
// Practical: filter out falsy values from array
const values = [0, 1, '', 'hello', null, true, undefined];
values.filter(Boolean); // [1, 'hello', true]
// Correct empty array / object checks
arr.length === 0 // correct
Object.keys(obj).length === 0 // correct
!arr // WRONG — [] is truthy, ![] is false
Why it matters: Every React conditional render and Node.js guard clause relies on truthy/falsy evaluation; misunderstanding these leads to incorrect empty-check guards and broken UI rendering.
Real applications: Vue.js v-if and Angular's *ngIf evaluate truthiness of template expressions; React uses short-circuit && rendering where a falsy left operand suppresses the component.
Common mistakes: Using if (!arr) to check for an empty array — [] is truthy so ![] is always false; always check arr.length === 0 explicitly.
true or false using the 8-falsy-value rule, and the shorthand !! (double negation) produces the same result. A key distinction separates || and ??: the || operator falls through on any falsy value while ?? (nullish coalescing) only falls through on null or undefined, making them non-interchangeable when 0, "", or false are valid data values.
// Boolean() and !! are equivalent
Boolean('') // false
!!'' // false
Boolean([]) // true
!![] // true
// filter(Boolean) removes falsy values — elegant pattern
[0, 1, '', 'ok', null, true, undefined].filter(Boolean);
// > [1, 'ok', true]
// Gotcha: Boolean OBJECTS are truthy even when wrapping false
const b = new Boolean(false);
!!b // true (it's an object!)
b.valueOf() // false (primitive value)
// || uses falsy — 0 and '' trigger fallback
const port = 0;
port || 3000 // 3000 (0 is falsy — BAD if 0 is valid)
// ?? uses nullish only — 0 and '' are preserved
port ?? 3000 // 0 (0 is not null/undefined — CORRECT)
const label = '';
label || 'Unnamed' // "Unnamed" (may not be intended)
label ?? 'Unnamed' // "" (empty string preserved)
Why it matters: The || vs ?? distinction is a modern JavaScript interview signal; confusing them causes config-driven bugs where valid falsy values like port 0 or empty-string labels get silently replaced with defaults.
Real applications: Next.js API routes and Express.js middleware use ?? for environment variable defaults to correctly preserve numeric port 0; React component props use ?? to distinguish "no value provided" from "empty string provided".
Common mistakes: Using process.env.PORT || 3000 in Node.js — if PORT is "0", the string "0" is falsy and the app defaults to 3000. Note that Number(process.env.PORT) ?? 3000 is not a fix either: Number(undefined) is NaN, which is not nullish, so the default never applies; check for existence explicitly before converting.
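One safe pattern, sketched as a hypothetical getPort helper: treat absence and garbage as "use the default", but preserve the valid falsy value 0.

```javascript
// Hypothetical helper: env mimics process.env (all values are strings)
function getPort(env) {
  const raw = env.PORT;
  if (raw == null || raw === '') return 3000; // truly absent → default
  const n = Number(raw);
  return Number.isNaN(n) ? 3000 : n; // unparseable → default
}

getPort({}); // 3000 (PORT unset)
getPort({ PORT: '0' }); // 0 (valid falsy value preserved)
getPort({ PORT: '8080' }); // 8080
```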
null becomes 0, undefined becomes NaN, booleans become 0/1, numeric strings become their value, and non-numeric strings become NaN. parseInt() and parseFloat() are more permissive — they stop at the first non-numeric character and return a partial result. Arrays converted via Number() use toString() first: [] becomes "" which becomes 0, while [1,2] becomes "1,2" which becomes NaN.
// String inputs
Number('42') // 42
Number('3.14') // 3.14
Number('') // 0 (empty string is 0)
Number(' ') // 0 (whitespace-only is 0)
Number('12px') // NaN (any non-numeric chars → NaN)
// Special values
Number(null) // 0 (null → 0)
Number(undefined) // NaN (undefined → NaN, NOT 0!)
Number(true) // 1
Number(false) // 0
// parseInt is more lenient
parseInt('12px') // 12 (stops at 'p')
Number('12px') // NaN
// Arrays go through toString() first
Number([]) // 0 ('' → 0)
Number([7]) // 7 ('7' → 7)
Number([1,2]) // NaN ('1,2' → NaN)
// Unary + is shorthand for Number()
+'42' // 42
+null // 0
+'' // 0
Why it matters: Form input values and URL query params are always strings; failing to convert them before arithmetic produces silent NaN values that propagate through calculations without any thrown error.
Real applications: React controlled form inputs always deliver string values to handlers; Angular reactive forms require explicit +value or Number() conversion before numeric calculations in validators or computed fields.
Common mistakes: Expecting Number(undefined) to return 0 like Number(null) — undefined yields NaN, which then propagates silently through all subsequent arithmetic and corrupts computed results.
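One way to contain the problem, sketched as a hypothetical toNumberOr helper: convert at the boundary and substitute a fallback before NaN can propagate:

```javascript
// Hypothetical helper: convert once, reject NaN immediately
const toNumberOr = (value, fallback) => {
  const n = Number(value);
  return Number.isNaN(n) ? fallback : n;
};

toNumberOr('42', 0); // 42
toNumberOr(undefined, 0); // 0 (Number(undefined) would be NaN)
toNumberOr('12px', 0); // 0 (Number rejects partial strings)
```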
null and undefined safely — returning the strings "null" and "undefined" — while calling .toString() directly on those values throws a TypeError. Template literals and the + operator with a string operand both trigger implicit string conversion via the same ToPrimitive algorithm. Objects default to "[object Object]" unless they implement a custom toString() or Symbol.toPrimitive method.
// String() handles all values safely
String(123) // "123"
String(true) // "true"
String(null) // "null" (no error)
String(undefined) // "undefined" (no error)
String(NaN) // "NaN"
String([1,2,3]) // "1,2,3"
String({}) // "[object Object]"
// .toString() throws on null/undefined
// null.toString() // TypeError!
// undefined.toString() // TypeError!
// Template literals coerce implicitly
`value: ${null}` // "value: null"
`score: ${NaN}` // "score: NaN"
// Customize with toString() method
class Temperature {
constructor(c) { this.c = c; }
toString() { return `${this.c}°C`; }
}
String(new Temperature(100)); // "100°C"
`${new Temperature(0)}` // "0°C"
Why it matters: Knowing exactly what String() produces prevents "[object Object]" appearing in UI output or logs; JSON.stringify silently drops undefined values, so string conversion behavior directly affects API payloads.
Real applications: React accidentally renders "[object Object]" when an object is passed as JSX children without stringification; Lodash's _.toString() wraps String() to safely handle all value types including null.
Common mistakes: Using string concatenation (value + '') on plain objects without implementing toString(), producing the useless "[object Object]" in logs, error messages, and rendered UI text.
<, >, <=, >=) convert both operands to numbers unless both are strings, in which case they compare lexicographically by Unicode code point. null and undefined have asymmetric behavior: null >= 0 is true (null converts to 0) but null == 0 is false (null only equals undefined under ==). NaN always returns false for every relational comparison, including comparing NaN to itself.
// Numeric comparison after coercion
'5' > 3 // true ('5' > 5)
'10' > '9' // false! (lexicographic: '1' < '9')
'b' > 'a' // true (Unicode code points)
// null comparison paradox (classic interview trap)
null > 0 // false (null → 0, and 0 > 0 is false)
null == 0 // false (null only equals undefined via ==)
null >= 0 // true (null → 0, and 0 >= 0 is true)
null < 1 // true (null → 0, and 0 < 1 is true)
// undefined always gives NaN for relational ops
undefined > 0 // false (undefined → NaN; NaN > 0 is false)
undefined == 0 // false (undefined only equals null)
// NaN — always false for all comparisons
NaN > 0 // false
NaN < 0 // false
NaN === NaN // false
NaN >= NaN // false
Why it matters: The null >= 0 true but null == 0 false paradox is a classic senior interview question; real forms with optional numeric fields can silently pass null through relational inequality guards.
Real applications: Lodash adds explicit null guards before comparisons in sort and clamp functions; Express.js route handlers must validate query parameters as numeric strings before using them in range comparisons.
Common mistakes: Expecting "10" > "9" to be true — string comparison is lexicographic, so "10" < "9" because the character "1" has a lower Unicode code point than "9"; always parse to numbers before comparing user input.
false to 0 (boolean to number), then converts [] through ToPrimitive to "", which converts to 0 — making both sides 0 == 0, which is true. Crucially, [] is still truthy in boolean context (if ([]) runs), revealing an asymmetry between if (x) semantics and x == false semantics. Using === instead eliminates all this complexity.
// Step-by-step: [] == false
// 1. false → 0 (boolean → number)
// 2. [] == 0
// 3. [].toString() → "" (object → primitive)
// 4. "" == 0
// 5. Number("") → 0
// 6. 0 == 0 → true!
[] == false // true (both become 0)
[] == true // false (0 != 1)
// But [] is TRUTHY in boolean context!
if ([]) { console.log('runs!'); } // runs!
Boolean([]) // true
// More surprising cases
[] == ![] // true (![] is false, so [] == false; both → 0)
'' == false // true (both → 0)
' ' == 0 // true (whitespace string → 0)
// === eliminates all ambiguity
[] === false // false
'' === false // false
0 === false // false
Why it matters: This is a canonical JavaScript interview "gotcha" that tests understanding of the difference between boolean coercion (truthy/falsy) and abstract equality (==); it is the top reason experienced teams enforce strict equality via ESLint.
Real applications: ESLint's eqeqeq rule and Airbnb's config were created partly because of abstract equality footguns that caused hard-to-diagnose bugs in codebases at Facebook, Airbnb, and Google.
Common mistakes: Assuming that if x is truthy then x == true must be true — any non-1 truthy value (arrays, objects, numbers other than 1) will fail this equality check because == compares numeric values after coercion.
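A short sketch of the asymmetry — a value can pass a truthiness check yet fail == true, because == compares numeric values after coercion:

```javascript
const arr = []; // truthy object
Boolean(arr); // true — passes if (arr)
arr == true; // false — [] → '' → 0, but true → 1

const n = 2; // truthy number
Boolean(n); // true
n == true; // false — 2 is not 1
```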
NaN === NaN is false. It propagates silently through all subsequent arithmetic. Use Number.isNaN() for reliable detection; the legacy global isNaN() coerces its argument first, producing false positives for strings and undefined.
// NaN is not equal to itself
NaN === NaN // false (unique in JavaScript)
NaN !== NaN // true (only value where x !== x)
// Number.isNaN() — reliable, no coercion
Number.isNaN(NaN); // true
Number.isNaN('hello'); // false (correct)
Number.isNaN(undefined); // false (correct)
// AVOID: legacy isNaN() coerces first
isNaN('hello'); // true (misleading!)
isNaN(undefined); // true (misleading!)
// NaN sources
0 / 0; parseInt('abc'); Math.sqrt(-1); undefined + 1;
// NaN propagates
NaN + 5 // NaN
NaN * 10 // NaN
NaN > 0 // false
Why it matters: NaN propagation is a silent production bug — an invalid calculation upstream silently corrupts all downstream values. Knowing that Number.isNaN() is the safe check (not the legacy isNaN() which coerces first) is a direct interview differentiator.
Real applications: Form numeric validation, coordinate calculations, financial computations, and any code that processes user-supplied numbers must guard against NaN. React's rendering logic and comparison utils often encounter NaN from uninitialized state.
Common mistakes: Using isNaN('text') and getting true (string is coerced to NaN first — use Number.isNaN), checking value === NaN (always false — use Number.isNaN), and not handling NaN in Math.min()/Math.max() calls (they return NaN if any argument is NaN).
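A sketch of the Math.min()/Math.max() pitfall and one guard — filter NaN out with Number.isNaN before aggregating:

```javascript
// Math.max returns NaN if any argument is NaN — filter first
const readings = [3, NaN, 7, NaN, 1];
Math.max(...readings); // NaN — one bad value poisons the result

const clean = readings.filter((v) => !Number.isNaN(v));
Math.max(...clean); // 7
```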
NaN === NaN is false but Object.is(NaN, NaN) is true, and +0 === -0 is true but Object.is(+0, -0) is false. These distinctions matter because Array.prototype.includes() uses the closely related SameValueZero algorithm — NaN-equal like Object.is, but merging +0 and -0 like === — which is why [NaN].includes(NaN) returns true while [NaN].indexOf(NaN) returns -1.
// Object.is() vs === — same in most cases
Object.is(1, 1) // true
Object.is('hi', 'hi') // true
Object.is(null, null) // true
// Difference 1: NaN comparison
NaN === NaN // false (confusing!)
Object.is(NaN, NaN) // true (correct)
// Difference 2: signed zero
+0 === -0 // true (=== merges them)
Object.is(+0, -0) // false (distinguishes them)
// When -0 appears in practice
-1 * 0 // -0
Math.round(-0.1) // -0
// includes() uses SameValueZero — NaN-safe, unlike indexOf
[NaN].includes(NaN) // true
[NaN].indexOf(NaN) // -1 (uses === internally)
// Polyfill
Object.is = (x, y) =>
(x === y) ? (x !== 0 || 1/x === 1/y) : (x !== x && y !== y);
Why it matters: React's useState and useMemo hooks use Object.is for change detection — understanding this is key when debugging "why isn't my component re-rendering" issues involving NaN or -0 state values.
Real applications: React's reconciliation algorithm and Redux Toolkit use Object.is when comparing immutable state slices to determine if a reducer produced a meaningful change worth re-rendering.
Common mistakes: Using arr.indexOf(NaN) expecting to find a NaN value — it always returns -1 because it uses ===; use arr.includes(NaN) or arr.findIndex(x => Number.isNaN(x)) instead.
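Strictly, includes() uses the SameValueZero comparison: it agrees with Object.is on NaN but with === on signed zeros. A quick sketch:

```javascript
// SameValueZero: NaN equals NaN, but +0 and -0 are merged
[NaN].includes(NaN); // true (NaN found)
[0].includes(-0); // true (signed zeros merged, unlike Object.is)
Object.is(0, -0); // false (Object.is keeps them apart)
[NaN].findIndex(Number.isNaN); // 0 (explicit NaN search also works)
```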
valueOf() or toString(). Left-to-right evaluation means 1 + 2 + '3' gives "33" while '1' + 2 + 3 gives "123" — the first operand's type determines the chain behavior.
// Number + Number = addition
5 + 3 // 8
// String + anything = concatenation
'5' + 3 // "53"
3 + '5' // "35"
'' + null // "null"
'' + undefined // "undefined"
// Left-to-right evaluation order is critical
1 + 2 + '3' // "33" (3 then "33")
'1' + 2 + 3 // "123" ("12" then "123")
// Object ToPrimitive via toString()
[] + [] // "" (both > "")
[] + {} // "[object Object]" ([] > "", {} > "[object Object]")
{} + [] // 0 (in console: {} = empty block, +[] = 0)
// Unary + for explicit number conversion
+'5' // 5
+true // 1
+'' // 0
+[] // 0
// Date objects prefer toString with binary +
const d = new Date();
d + 1 // date string + "1" (string concat)
+d // milliseconds (unary forces number hint)
Why it matters: Accidentally concatenating instead of adding is one of the most common JavaScript bugs, especially with server data that arrives as strings — and it fails silently with no thrown error.
Real applications: Node.js Express handlers receive numeric query params as strings, so total + req.query.amount concatenates; React form onChange always provides a string value from inputs, requiring explicit Number(e.target.value) before arithmetic.
Common mistakes: Writing price + shipping in Node.js when both arrive as query-param strings — the result is string concatenation, not addition; always convert with Number() or parseFloat() before summing numeric values.
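A minimal sketch of the boundary fix — query mimics Express's req.query, where every value arrives as a string:

```javascript
const query = { price: '19.25', shipping: '4.75' }; // hypothetical req.query
query.price + query.shipping; // '19.254.75' — string concatenation

// Convert once at the boundary, then do arithmetic
const total = Number(query.price) + Number(query.shipping);
total; // 24 (19.25 and 4.75 are exact binary fractions)
```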
null and undefined as fallback triggers. All three use short-circuit evaluation — the right operand is never evaluated when the result is already determined.
// || returns first truthy (or last value)
0 || 'default' // "default" (0 is falsy)
'' || 'fallback' // "fallback"
'hello' || 'world' // "hello" (first is truthy)
null || undefined || 'end' // "end"
// && returns first falsy (or last value)
1 && 'hello' // "hello" (1 is truthy, returns last)
0 && 'hello' // 0 (short-circuits at 0)
'a' && 'b' && 'c' // "c" (all truthy)
// ?? returns first non-nullish (0 and '' pass through)
0 ?? 'default' // 0 (0 is not null/undefined)
'' ?? 'fallback' // "" (empty string is not nullish)
null ?? 'default' // "default"
undefined ?? 42 // 42
// || vs ?? — critical difference with valid falsy values
const port = 0;
port || 3000 // 3000 (WRONG if 0 is valid)
port ?? 3000 // 0 (CORRECT)
// Short-circuit — right side never evaluates
false && sideEffect() // sideEffect never runs
true || sideEffect() // sideEffect never runs
Why it matters: The || vs ?? distinction is a modern JavaScript interview signal; confusing them produces config-driven bugs where valid falsy values like port 0 or empty string names are silently replaced with defaults.
Real applications: Next.js uses ?? for SSR environment defaults; React hooks that consume context use ?? to distinguish a missing context value (undefined) from a valid falsy value like false or 0.
Common mistakes: Using process.env.PORT || 3000 in Node.js — if PORT is "0", the string "0" is falsy and the app binds to port 3000 instead; check for existence first, e.g. process.env.PORT != null ? Number(process.env.PORT) : 3000.
"number", "string", or "default" — to determine the conversion context. When defined, it takes complete precedence over both valueOf() and toString(). JavaScript's built-in Date objects use it internally, which explains why +new Date() returns milliseconds while String(new Date()) returns a human-readable string.
const money = {
amount: 100,
currency: 'USD',
[Symbol.toPrimitive](hint) {
if (hint === 'number') return this.amount;
if (hint === 'string') return `${this.amount} ${this.currency}`;
return this.amount; // 'default' hint (== and +)
}
};
+money // 100 (numeric hint)
money * 2 // 200 (numeric hint)
String(money) // "100 USD" (string hint)
money + 50 // 150 (default hint)
money == 100 // true (default hint)
// Without Symbol.toPrimitive, engine falls back to:
// numeric hint: valueOf() then toString()
// string hint: toString() then valueOf()
const obj = {
valueOf() { return 42; },
toString() { return 'forty-two'; }
};
+obj // 42 (valueOf for numeric)
String(obj) // "forty-two" (toString for string)
// Date uses Symbol.toPrimitive internally
+new Date() // e.g. 1712188800000 (ms since epoch)
String(new Date()) // e.g. "Thu Apr 04 2024 ..."
Why it matters: Understanding Symbol.toPrimitive explains why +new Date() and String(new Date()) produce different types — a classic interview question about built-in objects and their coercion behavior.
Real applications: Date libraries such as Moment.js implement custom valueOf() so datetime objects work naturally in arithmetic; value-object classes (money, durations, units) can implement Symbol.toPrimitive so they convert predictably in templates and logs.
Common mistakes: Defining only valueOf() expecting it to also control string conversion — for string hints, JavaScript uses toString() first; implement Symbol.toPrimitive to explicitly handle all three hint types in one place.
const user = { profile: { name: 'Alice' } };
// Optional chaining returns undefined for missing paths
user.address?.city; // undefined (no address property)
user.profile?.name; // "Alice"
user.getAddress?.(); // undefined (method doesn't exist)
// Coercion of undefined result
user.address?.city + ''; // "undefined" (string concat)
+user.address?.city; // NaN (Number(undefined) = NaN)
// Safe pattern: optional chaining + nullish coalescing
const city = user.address?.city ?? 'Unknown'; // "Unknown"
const age = user.profile?.age ?? 0; // 0
// Works with arrays
const arr = null;
arr?.[0]; // undefined (not TypeError)
arr?.length; // undefined
// Nested optional chaining
const value = obj?.a?.b?.c ?? 'default';
// Short-circuiting — no further evaluation
null?.prop.subprop; // undefined (does not try to access .subprop)
// Cannot use for assignment
// user?.name = 'Bob'; // SyntaxError
The combination of ?. and ?? is the modern null-safety pattern. Before ES2020, you wrote user && user.profile && user.profile.name || 'default'; now it's simply user?.profile?.name ?? 'default'.
Why it matters: Optional chaining with nullish coalescing is now standard in all modern JS/TS codebases. Understanding that ?. returns exactly undefined (not false or null) prevents wrong default-value guards with || — use ?? instead.
Real applications: React component props with optional nested data (props.user?.address?.city ?? 'N/A'), API response access where not all fields are guaranteed, and Angular template expressions with optional service data all use this pattern daily.
Common mistakes: Using || instead of ?? after ?. (replaces valid values of 0, false, ''), trying to assign through optional chaining (obj?.prop = val is a SyntaxError), and chaining ?. on a value that cannot be nullish (adds unnecessary noise — TypeScript will warn about this).
// Classic trick questions
true + true + true // 3 (1 + 1 + 1)
true - true // 0 (1 - 1)
[] + [] // "" (both toString to "")
[] + {} // "[object Object]"
{} + [] // 0 (in console: {} is empty block, +[] = 0)
!!"false" // true (non-empty string is truthy)
!!"" // false
!!0 // false
!!null // false
!!undefined // false
!!NaN // false
// More tricky coercions
'2' + 1 // "21"
'2' - 1 // 1
null + 1 // 1 (null becomes 0)
undefined + 1 // NaN
'5' + - + - + - + 3 // "5-3" (the unary chain has three minuses, so it evaluates to -3; then "5" + -3 concatenates)
// Type of results
typeof NaN // "number" (NaN is technically a number)
typeof null // "object" (historical bug)
typeof undefined // "undefined"
typeof [] // "object" (arrays are objects)
// Equality tricks
false == '0' // true (false → 0, '0' → 0)
false == '' // true (false → 0, '' → 0)
'' == 0 // true
'0' == '' // false (both strings, not equal)
The best strategy: memorize the 8 falsy values, remember + prefers strings while other operators prefer numbers, and walk through each conversion step methodically. In real code, use explicit conversion and strict equality — never rely on implicit coercion.
Why it matters: Coercion trick questions are a staple of JavaScript technical screens because they reveal whether candidates truly understand the language's type system or just its surface syntax. Walking through the rules step by step is what interviewers want to see.
Real applications: While you'd never write [] + {} in production, the underlying rules — ToPrimitive, numeric coercion, string concatenation precedence — affect real code whenever you mix user input strings with arithmetic, conditionals with non-boolean values, or JSON data with expected types.
Common mistakes: Memorizing specific outputs without understanding the underlying algorithm (you'll fail variations), forgetting that {} + [] parses as an empty block + unary +[] in statement context (giving 0) vs object-plus-array in expression context (giving "[object Object]"), and not knowing that typeof null === 'object' is a legacy bug, not a feature.