Logical Reasoning

Sorting & Searching Logic

24 Questions

Sorting approach: sort descending (O(n log n)) and return index k-1. Min-heap of size k approach: maintain a heap of the k largest seen so far — root is always the kth largest. O(n log k) time.

QuickSelect (partition-based): like QuickSort but recurse only on the side containing the kth position. Average O(n), worst O(n²). Most efficient on average.

For streaming data where you can't store all elements, the min-heap of size k is the standard approach.
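The streaming approach sketched below keeps a size-k min-heap; the MinHeap class is an illustrative helper (JavaScript has no built-in heap):

```javascript
// Min-heap of size k: root is always the kth largest seen so far.
class MinHeap {
  constructor() { this.a = []; }
  get size() { return this.a.length; }
  peek() { return this.a[0]; }
  push(v) {
    const a = this.a;
    a.push(v);
    let i = a.length - 1;
    while (i > 0) {              // sift up
      const p = (i - 1) >> 1;
      if (a[p] <= a[i]) break;
      [a[p], a[i]] = [a[i], a[p]];
      i = p;
    }
  }
  pop() {
    const a = this.a, top = a[0], last = a.pop();
    if (a.length) {
      a[0] = last;
      let i = 0;
      while (true) {             // sift down
        let s = i; const l = 2 * i + 1, r = 2 * i + 2;
        if (l < a.length && a[l] < a[s]) s = l;
        if (r < a.length && a[r] < a[s]) s = r;
        if (s === i) break;
        [a[i], a[s]] = [a[s], a[i]];
        i = s;
      }
    }
    return top;
  }
}

// Process a stream one element at a time: O(log k) per element.
function kthLargestStream(stream, k) {
  const heap = new MinHeap();
  for (const x of stream) {
    heap.push(x);
    if (heap.size > k) heap.pop(); // evict the smallest of the k+1
  }
  return heap.peek(); // kth largest seen so far
}
// kthLargestStream([3,2,1,5,6,4], 2) => 5
```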

// Sorting approach: O(n log n)
function kthLargest(arr, k) {
  arr.sort((a, b) => b - a);
  return arr[k - 1];
}

// QuickSelect: O(n) average
function kthLargestQS(arr, k) {
  const target = arr.length - k; // kth largest = (n-k)th smallest
  function select(lo, hi) {
    const pivot = arr[hi];
    let i = lo;
    for (let j = lo; j < hi; j++)
      if (arr[j] <= pivot) { [arr[i], arr[j]] = [arr[j], arr[i]]; i++; } // swap, then advance i
    [arr[i], arr[hi]] = [arr[hi], arr[i]];
    if (i === target) return arr[i];
    return i < target ? select(i + 1, hi) : select(lo, i - 1);
  }
  return select(0, arr.length - 1);
}
// kthLargest([3,2,1,5,6,4], 2) => 5

QuickSelect partition: exactly like QuickSort's, but recurse only on the side containing the target index. Average O(n) time.

Built-in sort is simplest and usually fast enough. Use QuickSelect when performance is critical and the extra O(log n) factor from sorting matters.

Two-pointer technique: maintain a pointer for each sorted array. Compare the current elements, push the smaller one to the result, advance that pointer. Append any remaining elements at the end.

O(m+n) time and space — optimal since every element must be examined and placed at least once.

This merge step is the core of Merge Sort and also used in merging k sorted lists (use a min-heap for k-way merge).

function mergeSorted(a, b) {
  let res = [], i = 0, j = 0;
  while (i < a.length && j < b.length)
    res.push(a[i] <= b[j] ? a[i++] : b[j++]);
  return [...res, ...a.slice(i), ...b.slice(j)];
}
// mergeSorted([1,3,5],[2,4,6]) => [1,2,3,4,5,6]
// mergeSorted([1,2],[3,4,5]) => [1,2,3,4,5]

Append remaining elements with spread — when one array is exhausted, the remaining other array is already sorted and can be appended directly.

Merging two sorted arrays with O(n) extra space is straightforward. A true in-place merge with O(1) extra space is complex (requires block-swap or the gap method).

Binary search cuts the search space in half with each comparison. Requires a sorted array. Use lo=0, hi=n-1, mid=(lo+hi)/2. If arr[mid]===target return mid; if too small, search right; if too large, search left.

O(log n) time — each iteration halves the range. Searching 1 billion elements requires at most 30 comparisons!

Common pitfall: integer overflow in mid calculation. Use lo + Math.floor((hi-lo)/2) instead of Math.floor((lo+hi)/2) in languages with integer overflow.

function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = lo + Math.floor((hi - lo) / 2);
    if (arr[mid] === target) return mid;
    arr[mid] < target ? lo = mid + 1 : hi = mid - 1;
  }
  return -1; // not found
}
// binarySearch([1,3,5,7,9,11], 7) => 3 (index)
// binarySearch([1,3,5,7,9,11], 6) => -1

Loop condition is lo <= hi (not lo < hi) — ensures we check the remaining single element when lo === hi.

Template: when looking for the first/last position, use lo < hi with hi=mid or lo=mid+1 patterns (lower/upper bound variants).
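The lower-bound variant as a sketch (the name lowerBound is illustrative):

```javascript
// First index i with arr[i] >= target (insertion point); arr.length if none.
function lowerBound(arr, target) {
  let lo = 0, hi = arr.length; // hi is exclusive in this template
  while (lo < hi) {
    const mid = lo + Math.floor((hi - lo) / 2);
    if (arr[mid] < target) lo = mid + 1; // answer is strictly to the right
    else hi = mid;                        // mid could be the answer, keep it
  }
  return lo;
}
// lowerBound([1,3,3,5], 3) => 1  (first 3)
// lowerBound([1,3,3,5], 4) => 3  (insertion point before 5)
```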

A peak element is greater than both its neighbors (boundary elements are considered greater than their imaginary out-of-bounds neighbors). At least one peak always exists.

Linear scan O(n): find first element where arr[i] > arr[i+1]. Binary search O(log n): if arr[mid] < arr[mid+1], peak must be in right half (ascending side); otherwise left half.

This works because if the slope is positive (arr[mid] < arr[mid+1]), a peak MUST exist in the right portion — it can't slope up forever.

function findPeak(arr) {
  let lo = 0, hi = arr.length - 1;
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    arr[mid] < arr[mid + 1] ? lo = mid + 1 : hi = mid;
  }
  return lo; // returns INDEX of peak, arr[lo] is peak value
}
// findPeak([1,2,3,1]) => 2 (index, value 3)
// findPeak([3,4,3,2,1]) => 1 (index, value 4)

Binary search guarantee: going toward the higher neighbor always leads to a peak — if arr[mid] < arr[mid+1], right half MUST have a peak.

For 2D matrix peak: find peak in middle column, then compare with neighbors to decide which row to recurse on — O(n log m).
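A sketch of the column-based 2D peak search (findPeak2D is an illustrative name; assumes a non-empty matrix):

```javascript
// 2D peak: cell >= its 4 neighbors. Binary search over columns: O(n log m).
function findPeak2D(mat) {
  let lo = 0, hi = mat[0].length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    // Row index of the maximum in column mid
    let best = 0;
    for (let r = 1; r < mat.length; r++)
      if (mat[r][mid] > mat[best][mid]) best = r;
    const left = mid > 0 ? mat[best][mid - 1] : -Infinity;
    const right = mid < mat[0].length - 1 ? mat[best][mid + 1] : -Infinity;
    if (mat[best][mid] >= left && mat[best][mid] >= right)
      return [best, mid]; // column max beating both side neighbors is a 2D peak
    if (left > mat[best][mid]) hi = mid - 1; // climb toward the larger side
    else lo = mid + 1;
  }
}
// findPeak2D([[10,20,15],[21,30,14],[7,16,32]]) => [1,1] (value 30)
```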

A rotated sorted array has one sorted half and one that wraps around. Modified binary search: first identify which half is sorted, then determine if the target lies within that sorted half.

If arr[lo] <= arr[mid]: left half is sorted. Check if target is in [arr[lo], arr[mid]). Otherwise search right. Reverse logic for right half.

Handles rotation by checking which half is linearly sorted before deciding the search direction.

function searchRotated(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return mid;
    if (arr[lo] <= arr[mid]) { // left half is sorted
      target >= arr[lo] && target < arr[mid] ? hi = mid - 1 : lo = mid + 1;
    } else {                   // right half is sorted
      target > arr[mid] && target <= arr[hi] ? lo = mid + 1 : hi = mid - 1;
    }
  }
  return -1;
}
// searchRotated([4,5,6,7,0,1,2], 0) => 4

Identify the sorted half first: compare arr[lo] with arr[mid]. If arr[lo] ≤ arr[mid], left half is sorted. Then check if target lies in that sorted range.

For duplicates: arr[lo] === arr[mid] is ambiguous (both halves could be sorted). Must advance lo++ — worst case degrades to O(n).
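A sketch of the duplicate-tolerant variant (returns a boolean, since index guarantees are lost):

```javascript
// Rotated search with duplicates: O(log n) average, O(n) worst case.
function searchRotatedDup(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return true;
    if (arr[lo] === arr[mid] && arr[mid] === arr[hi]) {
      lo++; hi--; // ambiguous: can't tell which half is sorted, shrink both ends
    } else if (arr[lo] <= arr[mid]) { // left half sorted
      target >= arr[lo] && target < arr[mid] ? hi = mid - 1 : lo = mid + 1;
    } else {                          // right half sorted
      target > arr[mid] && target <= arr[hi] ? lo = mid + 1 : hi = mid - 1;
    }
  }
  return false;
}
// searchRotatedDup([2,5,6,0,0,1,2], 0) => true
// searchRotatedDup([2,5,6,0,0,1,2], 3) => false
```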

Use binary search to find the leftmost (first) position of target, then the rightmost (last) position. Count = last - first + 1. Two binary searches = O(2 log n) = O(log n).

Left bound: when arr[mid] === target, move hi to mid-1 to search for earlier occurrence. Right bound: when arr[mid] === target, move lo to mid+1.

Linear scan O(n) is simpler but binary search is O(log n) — crucial for large sorted datasets.

function countOccurrences(arr, target) {
  function bound(isFirst) {
    let lo = 0, hi = arr.length - 1, res = -1;
    while (lo <= hi) {
      const mid = Math.floor((lo + hi) / 2);
      if (arr[mid] === target) {
        res = mid;
        isFirst ? hi = mid - 1 : lo = mid + 1; // push left or right
      } else arr[mid] < target ? lo = mid + 1 : hi = mid - 1;
    }
    return res;
  }
  const f = bound(true), l = bound(false);
  return f === -1 ? 0 : l - f + 1;
}
// countOccurrences([1,2,2,2,3,4], 2) => 3 (indices 1,2,3)

Left bound: when equal, record and continue LEFT (hi=mid-1). Right bound: when equal, record and continue RIGHT (lo=mid+1).

The same bound functions also solve "first occurrence" and "last occurrence" queries — building blocks for many binary search problems.

Binary search in range [0, n]: find the largest integer mid such that mid*mid <= n. Track the answer in a variable and continue searching right until the condition fails.

This is the "find the last valid answer" binary search template — update the answer when condition holds, then continue optimizing.

Newton's method converges even faster (quadratic convergence) but binary search is simpler and guaranteed O(log n).

function mySqrt(n) {
  if (n < 2) return n;
  let lo = 1, hi = Math.floor(n / 2), ans = 1;
  while (lo <= hi) {
    const mid = lo + Math.floor((hi - lo) / 2);
    if (mid * mid <= n) { ans = mid; lo = mid + 1; } // record and search right
    else hi = mid - 1;
  }
  return ans;
}
// mySqrt(16) => 4, mySqrt(17) => 4 (floor), mySqrt(0) => 0

Cap the search range at n/2: floor(sqrt(n)) <= floor(n/2) for n >= 2, so hi = Math.floor(n/2) is a tighter bound that halves the initial search space.

For exact square root check: verify ans*ans === n after finding ans. Use BigInt for large numbers to avoid floating-point issues.
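For comparison, the integer form of Newton's iteration mentioned above (a sketch):

```javascript
// Newton's method for floor(sqrt(n)): x' = floor((x + n/x) / 2).
// Converges quadratically; the loop stops once x*x no longer overshoots n.
function sqrtNewton(n) {
  if (n < 2) return n;
  let x = n;
  while (x * x > n) x = Math.floor((x + n / x) / 2);
  return x;
}
// sqrtNewton(16) => 4, sqrtNewton(17) => 4
```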

Binary search: if version mid is bad, the first bad version is at mid or earlier (search left); if good, it's after mid (search right). When lo === hi, we've found it.

Using lo < hi (not lo <= hi) with hi=mid when bad prevents overshooting the answer. When the loop exits, lo === hi === first bad version.

This minimizes API calls — log n calls instead of n calls for linear scan. Real-world analogy: git bisect for finding a breaking commit.

function firstBadVersion(n, isBad) {
  let lo = 1, hi = n;
  while (lo < hi) {
    const mid = lo + Math.floor((hi - lo) / 2);
    isBad(mid) ? hi = mid : lo = mid + 1;
    // bad: first bad is at mid or left (hi=mid)
    // good: first bad is after mid (lo=mid+1)
  }
  return lo; // lo === hi is the answer
}
// versions [G,G,B,B,B], n=5 => firstBadVersion(5) => 3

hi=mid (not mid-1) when mid is bad — we can't exclude mid as a candidate. Loop exits when lo===hi, which is the first bad version.

This pattern (search for leftmost valid position) is also used for: first time something is true in an ordered predicate, lower bound insertion point.

The minimum element is the only element that is smaller than its previous element (the rotation point). Binary search: compare arr[mid] with arr[hi] to determine which half contains the minimum.

If arr[mid] > arr[hi]: right half has the wrap-around, minimum is in right half (lo = mid+1). Otherwise minimum is in left half including mid (hi = mid).

This runs in O(log n). For a sorted (non-rotated) array, arr[mid] <= arr[hi] always holds, so hi keeps shrinking until lo === hi === 0, the correct minimum.

function findMin(arr) {
  let lo = 0, hi = arr.length - 1;
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    arr[mid] > arr[hi] ? lo = mid + 1 : hi = mid;
    // arr[mid] > arr[hi]: min is in right half (rotation in right)
    // arr[mid] <= arr[hi]: min is in left half including mid
  }
  return arr[lo]; // or return lo for index
}
// findMin([3,4,5,1,2]) => 1
// findMin([4,5,6,7,0,1,2]) => 0

Compare with hi, not lo — comparing arr[mid] with arr[hi] reliably identifies which side has the rotation wrap-around.

If duplicates present (arr[mid] === arr[hi]): can't determine which side, must do hi-- and try again — degrades to O(n) worst case.
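The duplicate-tolerant findMin as a sketch:

```javascript
// Min of a rotated sorted array WITH duplicates: O(log n) average, O(n) worst.
function findMinDup(arr) {
  let lo = 0, hi = arr.length - 1;
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] > arr[hi]) lo = mid + 1;   // min strictly right of mid
    else if (arr[mid] < arr[hi]) hi = mid;  // min at mid or to its left
    else hi--; // arr[mid] === arr[hi]: safe to drop arr[hi], a duplicate remains
  }
  return arr[lo];
}
// findMinDup([2,2,2,0,1]) => 0
// findMinDup([1,1,1,1])   => 1
```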

Binary search on the smaller array to find the correct partition point where all left elements are ≤ all right elements across both arrays.

For each partition i in array A, the partition j in B is determined automatically: j = (m+n+1)/2 - i. Check cross-conditions: lA ≤ rB and lB ≤ rA.

This achieves O(log(min(m,n))) — much better than merging both arrays and finding the middle O(m+n).

function findMedian(a, b) {
  if (a.length > b.length) [a, b] = [b, a]; // binary search on smaller
  let m = a.length, n = b.length, lo = 0, hi = m;
  while (lo <= hi) {
    let i = Math.floor((lo + hi) / 2);
    let j = Math.floor((m + n + 1) / 2) - i;
    let lA = i === 0 ? -Infinity : a[i-1], rA = i === m ? Infinity : a[i];
    let lB = j === 0 ? -Infinity : b[j-1], rB = j === n ? Infinity : b[j];
    if (lA <= rB && lB <= rA) {
      let total = m + n;
      return total % 2 ? Math.max(lA, lB) : (Math.max(lA, lB) + Math.min(rA, rB)) / 2;
    }
    lA > rB ? hi = i - 1 : lo = i + 1;
  }
}
// findMedian([1,3],[2]) => 2
// findMedian([1,2],[3,4]) => 2.5

Correct partition check: lA ≤ rB AND lB ≤ rA ensures the partition is valid. Use ±Infinity for edge partitions.

The "combined left half" max is Math.max(lA, lB), the "right half" min is Math.min(rA, rB) — median of even-length total is their average.

An inversion is a pair (i, j) where i < j but arr[i] > arr[j]. Brute force O(n²): check all pairs. Efficient O(n log n): count inversions during merge sort — whenever a right element merges before the remaining left elements, it forms one inversion with each of them.

Inversion count measures how "unsorted" an array is. Zero inversions = sorted, n(n-1)/2 inversions = reverse sorted.

The merge-sort approach is elegant: while merging, if right[j] < left[i], all remaining left elements (left.length - i of them) form inversions with right[j].

function countInversions(arr) {
  let count = 0;
  function mergeSort(a) {
    if (a.length <= 1) return a;
    const mid = Math.floor(a.length / 2);
    const left = mergeSort(a.slice(0, mid));
    const right = mergeSort(a.slice(mid));
    const merged = [];
    let i = 0, j = 0;
    while (i < left.length && j < right.length) {
      if (left[i] <= right[j]) merged.push(left[i++]);
      else { merged.push(right[j++]); count += left.length - i; } // inversions!
    }
    return [...merged, ...left.slice(i), ...right.slice(j)];
  }
  mergeSort(arr);
  return count;
}
// countInversions([2,4,1,3,5]) => 3 (pairs: (2,1),(4,1),(4,3))

count += left.length - i — when right[j] wins, ALL remaining left elements are larger than it (since left is sorted), forming (left.length - i) inversions at once.

Used in competitive programming, measuring similarity between rankings, and detecting nearly-sorted arrays for adaptive sorting.

Pick a pivot, partition the array so all elements smaller than pivot go left and larger go right. Recursively sort each partition. The partitioning step is the heart of QuickSort.

Average O(n log n), worst case O(n²) when pivot is always min/max (e.g., sorted array with first-element pivot). Randomized pivot selection avoids worst case in practice.

QuickSort is typically faster than MergeSort in practice due to better cache performance and lower constant factors.

function quickSort(arr, lo = 0, hi = arr.length - 1) {
  if (lo >= hi) return;
  // Lomuto partition: last element as pivot
  let pivot = arr[hi], i = lo;
  for (let j = lo; j < hi; j++)
    if (arr[j] <= pivot) { [arr[i], arr[j]] = [arr[j], arr[i]]; i++; } // swap, then advance i
  [arr[i], arr[hi]] = [arr[hi], arr[i]]; // place pivot
  quickSort(arr, lo, i - 1);
  quickSort(arr, i + 1, hi);
}
// quickSort([3,1,4,1,5,9,2,6]) sorts in place => [1,1,2,3,4,5,6,9]

Randomized pivot: swap a random element to position hi before partitioning to avoid worst-case O(n²) on sorted input.

3-way partition (Dutch National Flag) handles duplicates efficiently — equal elements are placed in their final position during partition, not re-sorted.
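A sketch of QuickSort using the 3-way Dutch National Flag partition (quickSort3 is an illustrative name):

```javascript
// 3-way QuickSort: elements equal to the pivot settle in place once.
function quickSort3(arr, lo = 0, hi = arr.length - 1) {
  if (lo >= hi) return arr;
  const pivot = arr[lo + Math.floor(Math.random() * (hi - lo + 1))]; // randomized
  let lt = lo, i = lo, gt = hi; // arr[lo..lt-1] < pivot, arr[gt+1..hi] > pivot
  while (i <= gt) {
    if (arr[i] < pivot) { [arr[lt], arr[i]] = [arr[i], arr[lt]]; lt++; i++; }
    else if (arr[i] > pivot) { [arr[i], arr[gt]] = [arr[gt], arr[i]]; gt--; }
    else i++; // equal to pivot: already in the middle band
  }
  quickSort3(arr, lo, lt - 1);  // recurse on strictly-smaller band
  quickSort3(arr, gt + 1, hi);  // recurse on strictly-larger band
  return arr;
}
// quickSort3([3,1,3,3,2]) => [1,2,3,3,3]
```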

Two-pointer approach on a SORTED array: start with left=0, right=n-1. If sum matches, done. If sum < target, increment left. If sum > target, decrement right.

This works because the array is sorted — moving left pointer increases sum, moving right decreases it. O(n) after sorting, O(n log n) overall.

HashMap approach: O(n) time, O(n) space — check if (target - num) exists in map for each element.

// Two-pointer O(n log n) time, O(1) space
function twoSum(arr, target) {
  arr.sort((a, b) => a - b);
  let lo = 0, hi = arr.length - 1;
  while (lo < hi) {
    const sum = arr[lo] + arr[hi];
    if (sum === target) return [arr[lo], arr[hi]];
    sum < target ? lo++ : hi--;
  }
  return null;
}
// twoSum([2,7,11,15], 9) => [2,7]

// HashMap O(n) time, O(n) space
function twoSumMap(arr, target) {
  const map = new Map();
  for (let i = 0; i < arr.length; i++) {
    const complement = target - arr[i];
    if (map.has(complement)) return [map.get(complement), i]; // return indices
    map.set(arr[i], i);
  }
}

Two-pointer requires a sorted array. HashMap approach works on unsorted arrays and returns original indices.

For all pairs (not just first match), use nested loops or collect all solutions during the sweep.

Counting sort works for non-negative integers in a known range [0, k]. Count occurrences of each value, convert counts to positions, then output in sorted order. O(n + k) time and space.

Non-comparison-based — can beat the O(n log n) lower bound for comparison-based sorts when the value range k is small relative to n.

Stable version: accumulate prefix sums for counts, iterate input backwards to place elements — preserves relative order of equal elements.

function countingSort(arr, maxVal) {
  const count = Array(maxVal + 1).fill(0);
  for (const n of arr) count[n]++;
  // Prefix sum (stable sort)
  for (let i = 1; i <= maxVal; i++) count[i] += count[i-1];
  const output = Array(arr.length);
  for (let i = arr.length - 1; i >= 0; i--) {
    output[--count[arr[i]]] = arr[i];
  }
  return output;
}
// countingSort([4,2,2,8,3,3,1], 8) => [1,2,2,3,3,4,8]

Best when k = O(n): if the max value is much larger than n, the count array wastes memory and time.

Radix sort uses counting sort as a subroutine on each digit, achieving O(d*(n+k)) for d-digit numbers — excellent for large integer arrays.
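A sketch of LSD radix sort built on the stable counting pass (non-negative integers assumed):

```javascript
// LSD radix sort, base 10: one stable counting sort per digit — O(d*(n+10)).
function radixSort(arr) {
  if (arr.length === 0) return arr;
  const max = Math.max(...arr);
  for (let exp = 1; Math.floor(max / exp) > 0; exp *= 10) {
    const count = Array(10).fill(0);
    for (const x of arr) count[Math.floor(x / exp) % 10]++;
    for (let d = 1; d < 10; d++) count[d] += count[d - 1]; // prefix sums
    const output = Array(arr.length);
    for (let i = arr.length - 1; i >= 0; i--) {  // backwards => stable
      const d = Math.floor(arr[i] / exp) % 10;
      output[--count[d]] = arr[i];
    }
    arr = output; // digits processed least-significant first
  }
  return arr;
}
// radixSort([170,45,75,90,802,24,2,66]) => [2,24,45,66,75,90,170,802]
```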

For each cell, start DFS/backtracking if the first character matches. At each step, try all 4 directions. Mark cells as visited and unmark on backtrack.

This is a grid backtracking problem — exhaustive search with pruning (stop if current path can't possibly form the word).

Time O(m*n*3^L) where L is word length — after the first cell, at most 3 unvisited directions remain at each step (the cell we came from is marked). Slow for long words, but pruning on character mismatch makes it fast in practice.

function wordSearch(board, word) {
  const m = board.length, n = board[0].length;
  function dfs(r, c, k) {
    if (k === word.length) return true;
    if (r < 0 || r >= m || c < 0 || c >= n) return false;
    if (board[r][c] !== word[k]) return false;
    const tmp = board[r][c];
    board[r][c] = '#'; // mark visited
    const found = dfs(r+1,c,k+1)||dfs(r-1,c,k+1)||dfs(r,c+1,k+1)||dfs(r,c-1,k+1);
    board[r][c] = tmp; // unmark (backtrack)
    return found;
  }
  for (let r = 0; r < m; r++)
    for (let c = 0; c < n; c++)
      if (dfs(r, c, 0)) return true;
  return false;
}
// board=[["A","B"],["C","D"]] word="AB" => true

Temporarily mark with '#' to indicate visited — restore the original character on backtrack to allow other paths to use that cell.

Optimization: check if the word's character frequency exceeds the board's to prune early before even starting DFS.
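The frequency pre-check can be sketched as a standalone guard (canPossiblyContain is an illustrative name):

```javascript
// Prune: if any character is rarer on the board than in the word,
// no path can exist — skip the DFS entirely.
function canPossiblyContain(board, word) {
  const freq = new Map();
  for (const row of board)
    for (const ch of row) freq.set(ch, (freq.get(ch) || 0) + 1);
  for (const ch of word) {
    const left = (freq.get(ch) || 0) - 1; // consume one occurrence
    if (left < 0) return false;
    freq.set(ch, left);
  }
  return true;
}
// canPossiblyContain([["A","B"],["C","D"]], "ABBA") => false (only one A, one B)
```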

Add all numbers to a Set. For each number that is the START of a sequence (num-1 not in set), count how long the consecutive run extends. O(n) time, O(n) space.

The key insight: only start counting from numbers where num-1 is NOT in the set, avoiding redundant counting from mid-sequence.

Sorting approach is O(n log n). The HashSet approach achieves O(n) by eliminating the need to sort.

function longestConsecutive(nums) {
  const set = new Set(nums);
  let maxLen = 0;
  for (const num of set) {
    if (!set.has(num - 1)) { // start of sequence
      let current = num, len = 1;
      while (set.has(current + 1)) { current++; len++; }
      maxLen = Math.max(maxLen, len);
    }
  }
  return maxLen;
}
// longestConsecutive([100,4,200,1,3,2]) => 4 (sequence: 1,2,3,4)

Only start from sequence beginnings — each element is then visited at most twice, keeping the total O(n) despite the inner while loop.

Classic problem showing that a hash set can replace sorting for specific types of order-related queries.

In-order traversal of a BST produces elements in sorted order (ascending). The kth element visited in in-order is the kth smallest.

Iterative in-order with early termination: traverse left subtree first, then root (decrement counter), then right. Stop when counter reaches 0.

If k is frequently queried, augment BST nodes with subtree size to find kth element in O(log n) without traversal.

function kthSmallest(root, k) {
  let count = 0, result = null;
  function inorder(node) {
    if (!node || result !== null) return;
    inorder(node.left);
    if (++count === k) { result = node.val; return; }
    inorder(node.right);
  }
  inorder(root);
  return result;
}
// BST: [3,1,4,null,2] => kthSmallest(root, 2) => 2
// In-order: 1,2,3,4 -> 2nd = 2

In-order BST traversal = sorted order — kth element = kth smallest. Early return once found avoids unnecessary traversal.

Augmented BST nodes with left-subtree size enable O(log n) kth-smallest queries, useful when the tree is queried many times.
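A sketch of the size-augmented query, assuming each node carries a size field (count of nodes in its subtree) maintained on insert — the node shape here is illustrative:

```javascript
// kth smallest via subtree sizes: walk down by counts, no traversal needed.
function kthSmallestAug(node, k) {
  while (node) {
    const leftSize = node.left ? node.left.size : 0;
    if (k === leftSize + 1) return node.val;       // current node is the kth
    if (k <= leftSize) node = node.left;            // kth lies in left subtree
    else { k -= leftSize + 1; node = node.right; }  // skip left subtree + root
  }
  return null;
}
// With sizes maintained on insert, each query is O(height): O(log n) if balanced.
```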

Two-pointer approach: start at beginning of both arrays. Advance the pointer of the smaller element. When equal, record the intersection element and advance both.

O(m+n) time, O(1) space (excluding output). This is optimal since you must examine all elements at least once.

For unsorted arrays: use a Set. Insert all elements of arr1 into set, then filter arr2 elements that are in the set.
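The Set-based version for unsorted input, as a sketch (returns unique common elements):

```javascript
// Unsorted intersection of unique elements: O(m+n) time, O(m) space.
function intersectionUnsorted(a, b) {
  const setA = new Set(a);
  const seen = new Set();  // dedupe the output
  const result = [];
  for (const x of b) {
    if (setA.has(x) && !seen.has(x)) {
      result.push(x);
      seen.add(x);
    }
  }
  return result;
}
// intersectionUnsorted([4,9,5],[9,4,9,8,4]) => [9,4]
```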

function intersection(a, b) {
  const result = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {
      if (!result.length || result[result.length-1] !== a[i])
        result.push(a[i]); // avoid duplicates
      i++; j++;
    } else if (a[i] < b[j]) i++;
    else j++;
  }
  return result;
}
// intersection([1,2,2,3,5],[2,2,4,5]) => [2,5]

Two-pointer works only on sorted arrays — both pointers advance together when equal, independently when unequal.

For multiple arrays: intersect pairs sequentially. First intersect A∩B, then result∩C, etc. Result size shrinks with each step.

Mathematical approach: expected sum = n*(n+1)/2. Subtract actual sum from expected. The difference is the missing number. O(n) time, O(1) space.

XOR approach: XOR all numbers 1 to n, then XOR all array elements. Duplicate XOR-pairs cancel out leaving the missing number.

Binary search approach works on sorted arrays only: check if arr[mid] = mid+1 (1-indexed). If mismatch, missing is in left half; else right half.

// Sum formula: O(n) time, O(1) space
function missingNumber(arr, n) {
  const expected = (n * (n + 1)) / 2;
  return expected - arr.reduce((sum, x) => sum + x, 0);
}
// missingNumber([3,1,2,5], 5) => 4

// XOR approach: works without overflow risk
function missingXOR(arr, n) {
  let xor = 0;
  for (let i = 1; i <= n; i++) xor ^= i;
  for (const num of arr) xor ^= num;
  return xor;
}
// Each pair of equal numbers XOR to 0, leaving only the missing one

Sum formula can overflow for very large n in languages with fixed-width integers (JavaScript numbers stay exact up to 2^53, so the formula is usually fine; use BigInt beyond that).

XOR approach is elegant and handles overflow-safe operations — pairs cancel: x^x=0, 0^x=x, so the unpaired missing number remains.
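The binary-search variant described above for sorted input, as a sketch (assumes values 1..n with exactly one missing):

```javascript
// Sorted [1..n] minus one value: find the first index where arr[i] !== i+1.
function missingSorted(arr) {
  let lo = 0, hi = arr.length - 1, ans = arr.length + 1; // default: n+1 missing
  while (lo <= hi) {
    const mid = lo + Math.floor((hi - lo) / 2);
    if (arr[mid] === mid + 1) lo = mid + 1;  // prefix intact: missing is right
    else { ans = mid + 1; hi = mid - 1; }    // mismatch: missing at mid or left
  }
  return ans;
}
// missingSorted([1,2,3,5]) => 4
// missingSorted([2,3,4,5]) => 1
```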

Monotonic stack approach: traverse right-to-left, maintaining a stack of "candidates for next greater". For each element, pop all elements smaller than it. The new stack top (if any) is its next greater element.

Stack always holds elements in decreasing order — the monotone invariant ensures O(n) total operations (each element pushed and popped at most once).

If no greater element exists to the right, next greater = -1.

function nextGreater(arr) {
  const result = Array(arr.length).fill(-1);
  const stack = []; // stores indices
  for (let i = arr.length - 1; i >= 0; i--) {
    while (stack.length && arr[stack[stack.length-1]] <= arr[i])
      stack.pop(); // remove smaller elements
    if (stack.length) result[i] = arr[stack[stack.length-1]];
    stack.push(i);
  }
  return result;
}
// nextGreater([2,1,2,4,3]) => [4,2,4,-1,-1]

Monotonic stack: stack maintains decreasing order. Each element is pushed/popped once — total O(n) despite the nested while loop.

Circular array variant: traverse 2n indices (mod n) to handle wrapping — the same candidate can appear as next greater after wrapping around.
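The circular variant can be sketched with a doubled index sweep:

```javascript
// Circular next greater: scan indices 2n-1..0, using i % n for wrap-around.
function nextGreaterCircular(arr) {
  const n = arr.length;
  const result = Array(n).fill(-1);
  const stack = []; // indices, values decreasing from the top
  for (let i = 2 * n - 1; i >= 0; i--) {
    const idx = i % n;
    while (stack.length && arr[stack[stack.length - 1]] <= arr[idx])
      stack.pop();
    // Only record answers on the second (real) pass over the array
    if (i < n && stack.length) result[idx] = arr[stack[stack.length - 1]];
    stack.push(idx);
  }
  return result;
}
// nextGreaterCircular([1,2,1]) => [2,-1,2]
```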

Three-pointer approach: lo=0 (boundary of 0s), mid=0 (current element), hi=n-1 (boundary of 2s). Walk mid pointer forward, swapping 0s to lo and 2s to hi.

When arr[mid]=0: swap with lo and advance both. When arr[mid]=2: swap with hi and decrement hi (don't advance mid, re-check). When arr[mid]=1: just advance mid.

One pass, O(n) time, O(1) space. Generalizes to sorting k distinct values but requires k-1 passes.

function sortColors(arr) {
  let lo = 0, mid = 0, hi = arr.length - 1;
  while (mid <= hi) {
    if (arr[mid] === 0) {
      [arr[lo], arr[mid]] = [arr[mid], arr[lo]];
      lo++; mid++;
    } else if (arr[mid] === 2) {
      [arr[mid], arr[hi]] = [arr[hi], arr[mid]];
      hi--; // don't advance mid, re-examine swapped element
    } else mid++;
  }
}
// sortColors([2,0,2,1,1,0]) => [0,0,1,1,2,2]

Don't advance mid when swapping with hi — the element that moved to arr[mid] from the right was unseen; it might be 0, 1, or 2.

Named after Dijkstra's description using Dutch flag colors (red, white, blue = 0, 1, 2). Foundation for 3-way QuickSort.

Sort arrival and departure times separately. Use two pointers to simulate time progression. When the next event is an arrival, platforms needed increases; when departure, it decreases.

Track the maximum simultaneous overlap across all intervals — this is the minimum platforms required.

Classic "meeting rooms" / "interval scheduling" problem. Same logic applies to scheduling problems in OS and calendar apps.

function minPlatforms(arrivals, departures) {
  arrivals.sort((a, b) => a - b);
  departures.sort((a, b) => a - b);
  let platforms = 0, maxPlatforms = 0;
  let i = 0, j = 0;
  while (i < arrivals.length) {
    if (arrivals[i] <= departures[j]) {
      platforms++; i++;          // new train arrives
    } else {
      platforms--; j++;          // a train departs
    }
    maxPlatforms = Math.max(maxPlatforms, platforms);
  }
  return maxPlatforms;
}
// arrivals=[9,9,10,11] departures=[10,12,15,11] => 3

Key insight: sort arrivals and departures independently — treat them as separate event streams merged in time order.

If arrivals[i] === departures[j], count the arrival first (a train departing at time t still occupies its platform at instant t) — the comparison must be arrivals[i] <= departures[j], which yields 3 for the example above.

Sort the array. For each element (fix it as the first element), use two pointers on the remaining elements to find pairs that sum to its negative.

Skip duplicates to avoid duplicate triplets in the result. If arr[i] > 0, break early — no three positive numbers sum to zero.

O(n²) after sorting. Better than naive O(n³) brute force. This is a direct extension of the 2Sum two-pointer approach.

function threeSum(nums) {
  nums.sort((a, b) => a - b);
  const result = [];
  for (let i = 0; i < nums.length - 2; i++) {
    if (nums[i] > 0) break;           // no valid triplet possible
    if (i > 0 && nums[i] === nums[i-1]) continue; // skip duplicate firsts
    let lo = i + 1, hi = nums.length - 1;
    while (lo < hi) {
      const sum = nums[i] + nums[lo] + nums[hi];
      if (sum === 0) {
        result.push([nums[i], nums[lo++], nums[hi--]]);
        while (lo < hi && nums[lo] === nums[lo-1]) lo++; // skip duplicates
        while (lo < hi && nums[hi] === nums[hi+1]) hi--;
      } else sum < 0 ? lo++ : hi--;
    }
  }
  return result;
}
// threeSum([-1,0,1,2,-1,-4]) => [[-1,-1,2],[-1,0,1]]

Skip duplicate outer elements with i > 0 && nums[i] === nums[i-1]. Skip inner duplicates after finding a valid triplet.

4Sum follows the same pattern: fix two elements with nested loops, then use two pointers for the remaining two — O(n³) overall.

Place each positive integer in its "correct" index (value v at index v-1) using index-as-hash trick. Then scan for the first index where arr[i] !== i+1: that's the missing positive.

This cyclic sort / index-placement approach is O(n) time, O(1) extra space — elegant because the array itself serves as the hash map.

The answer is always in range [1, n+1] for an n-element array — so we only need to look at indices 0 to n-1.

function firstMissingPositive(nums) {
  const n = nums.length;
  // Phase 1: place each value v at index v-1 if 1 <= v <= n
  for (let i = 0; i < n; i++) {
    while (nums[i] > 0 && nums[i] <= n && nums[nums[i]-1] !== nums[i]) {
      const v = nums[i]; // capture first: the swap rewrites nums[i] mid-destructuring
      [nums[i], nums[v - 1]] = [nums[v - 1], nums[i]];
    }
  }
  // Phase 2: find first mismatch
  for (let i = 0; i < n; i++)
    if (nums[i] !== i + 1) return i + 1;
  return n + 1; // all 1..n present
}
// firstMissingPositive([3,4,-1,1]) => 2
// firstMissingPositive([1,2,0])     => 3

Cyclic sort pattern: while nums[i] != i+1 and the target slot holds a different value, keep swapping. Integers outside [1,n] are ignored.

The while loop may look O(n²) but each element is placed in its correct position at most once — total O(n) swaps across all iterations.