React.lazy is the built-in API for component-level code splitting — it wraps a dynamic import() and tells React to download that component's code only when it's first rendered. Before the chunk loads, React pauses rendering and shows the nearest Suspense fallback. This splits your bundle along component boundaries so the initial download includes only the code needed for the first view. Use it primarily for route-level components and large feature modules.
const LazyProfile = React.lazy(() => import('./Profile'));

function App() {
  return (
    <Suspense fallback={<Loading />}>
      <LazyProfile />
    </Suspense>
  );
}
Why it matters: Code splitting reduces the initial bundle size — users on slow connections load the app faster because they only download what they need first.
Real applications: Lazy load route-level components so the home page bundle doesn't include the admin dashboard code users may never visit.
Common mistakes: Not wrapping the lazy component in Suspense — React throws an error if a lazy component has no Suspense boundary above it.
Suspense is a boundary component that catches the suspended state from lazy or async children and displays a fallback UI while they load. When a child throws a Promise (the mechanism behind lazy loading and data fetching), Suspense intercepts it and shows its fallback prop until the Promise resolves. Multiple Suspense boundaries can be nested to provide separate loading states for different sections of the page. Place Suspense close to the component that suspends for the smoothest user experience.
Why it matters: Without Suspense, lazy components throw an error on first render — Suspense turns that into a graceful loading state.
Real applications: Show a skeleton screen while a heavy chart component loads, instead of a blank space or an error.
Common mistakes: Putting the Suspense boundary too high — a single fallback for the whole page is jarring; wrap smaller sections for smoother loading.
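The throw-a-Promise mechanism described above can be modeled in a few lines of plain JavaScript. This is a simplified sketch for intuition only; React's real scheduler is far more involved, and createResource and renderWithBoundary are illustrative names, not React APIs.

```javascript
// A resource throws its pending Promise when read before it resolves;
// a boundary catches the Promise, shows a fallback, waits, and retries.
function createResource(promise) {
  let status = 'pending';
  let result;
  promise.then(
    (value) => { status = 'done'; result = value; },
    (error) => { status = 'error'; result = error; }
  );
  return {
    read() {
      if (status === 'pending') throw promise; // "suspend"
      if (status === 'error') throw result;
      return result;
    },
  };
}

// A toy boundary: try to render; if a Promise is thrown, show the
// fallback, await it, then render again with the data now ready.
async function renderWithBoundary(render, fallback) {
  try {
    return render();
  } catch (thrown) {
    if (thrown instanceof Promise) {
      console.log(fallback); // what Suspense shows meanwhile
      await thrown;
      return render();      // retry after the Promise resolves
    }
    throw thrown; // real errors pass through (to an error boundary)
  }
}

const user = createResource(Promise.resolve('Ada'));
renderWithBoundary(() => `Hello, ${user.read()}`, 'Loading...')
  .then((ui) => console.log(ui)); // "Loading..." then "Hello, Ada"
```

Note how errors are rethrown rather than handled: in real React that is exactly why a separate error boundary is needed above the Suspense boundary.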
Code splitting breaks your JavaScript bundle into smaller chunks that load on demand rather than all upfront. Users download only the code needed for the current view, reducing initial load time and Time to Interactive — especially impactful on mobile or slow connections. Each lazy-loaded route or component becomes a separate cached network request. The bundler (Vite or Webpack) handles the chunk boundaries automatically when you use dynamic imports.
Why it matters: A smaller initial bundle means the app becomes interactive faster, especially on mobile or slower networks.
Real applications: A settings page or admin panel that most users never visit should be a separate chunk — no point making everyone download it.
Common mistakes: Splitting too aggressively — tiny components aren't worth splitting; focus on route-level or large feature-level splits.
Starting with React 18, Suspense can handle not just lazy loading but also data fetching when used with a compatible library like TanStack Query, Relay, or the Next.js App Router. The library throws a Promise when data isn't cached; Suspense catches it and shows the fallback until the Promise resolves. This enables declarative loading states at the boundary level — no manual isLoading checks scattered through component code. Plain fetch in useEffect does not integrate with Suspense.
Why it matters: Suspense for data fetching lets you declare loading states at the component boundary level instead of scattering loading checks throughout the component.
Real applications: In Next.js App Router, server components and use() with Suspense give you declarative loading states for data-driven pages.
Common mistakes: Using Suspense for data fetching without a supporting framework or library — plain fetch in useEffect doesn't integrate with Suspense by default.
Route-level code splitting is the highest-impact place to apply lazy loading — each route's component becomes a separate chunk that downloads only when the user navigates to that route. Combine React.lazy with React Router's Routes and a single Suspense boundary to get automatic per-route splitting. The initial bundle shrinks to only the home route code and every subsequent navigation loads its chunk on demand. This pattern alone can cut initial bundle size by 50% or more in large apps.
const Home = React.lazy(() => import('./pages/Home'));
const About = React.lazy(() => import('./pages/About'));

function App() {
  return (
    <Suspense fallback={<Spinner />}>
      <Routes>
        <Route path="/" element={<Home />} />
        <Route path="/about" element={<About />} />
      </Routes>
    </Suspense>
  );
}
Why it matters: Route-level splitting is the highest-impact place to code split — each route loads only its own code when the user navigates to it.
Real applications: In a React Router app, wrap all routes in a single Suspense with a page-level skeleton so each route chunk is fetched on demand.
Common mistakes: Forgetting to handle the loading state — without a Suspense fallback, navigating to a lazy route shows a blank screen while the chunk downloads.
startTransition is a React 18 API that marks a state update as non-urgent, allowing React to interrupt the expensive re-render if a higher-priority update (like a user keystroke) arrives. Without transitions, a heavy re-render caused by typing blocks the UI thread, making inputs feel sluggish. Wrap only the expensive derived state update in startTransition — never the urgent input value itself, or the input will feel delayed.
import { startTransition } from 'react';

startTransition(() => {
  setSearchResults(filteredData); // Non-urgent update
});
Why it matters: Without transitions, heavy state updates block the UI — typing in a search box feels sluggish because every keystroke triggers an expensive re-render.
Real applications: Filtering a large list while the user types — mark the filter update as a transition so the input stays responsive.
Common mistakes: Wrapping urgent updates (like the input value itself) in startTransition — the input will feel laggy; only wrap the expensive derived update.
useTransition is the hook version of startTransition that additionally returns an isPending flag, letting you show a visual indicator while the non-urgent state update is in progress. This keeps the current UI visible during the transition instead of showing a blank state — users see the stale content while the new render prepares in the background. It gives a significantly smoother experience than replacing content with a spinner during every filter or navigation.
const [isPending, startTransition] = useTransition();

const handleChange = (e) => {
  setInput(e.target.value); // Urgent
  startTransition(() => {
    setFilteredList(filter(e.target.value)); // Non-urgent
  });
};

// Keep the old results visible; render the indicator alongside them
return (
  <>
    {isPending && <Spinner />}
    <List items={filteredList} />
  </>
);
Why it matters: useTransition gives you an isPending flag so you can show a subtle loading indicator without hiding the current content.
Real applications: A search results list that shows a spinner next to the input while the transition is pending, keeping the old results visible until new ones are ready.
Common mistakes: Using useTransition for async operations like fetch — it only handles synchronous state updates; pair it with Suspense for async data.
Nested Suspense boundaries let different sections of a page load independently, each showing its own fallback without affecting siblings. The inner boundary's fallback appears only for its suspended children while components outside continue rendering normally. This enables progressive content reveal — the page header can appear immediately while the main content and sidebar each show their own skeletons. Avoid nesting too aggressively; too many simultaneous loading states create a visual flickering effect.
<Suspense fallback={<PageSkeleton />}>
  <Header />
  <Suspense fallback={<PostSkeleton />}>
    <Posts />
  </Suspense>
</Suspense>
Why it matters: Nested boundaries let different sections load independently — the header can appear while the posts are still loading.
Real applications: A complex dashboard where the navigation loads immediately, then each widget shows its own skeleton while fetching its data.
Common mistakes: Over-nesting Suspense boundaries — too many independent loading states at once creates a "flickering" feel; group related sections together.
Suspense handles only the loading state — when a lazy import or data fetch fails (network error, chunk download failure), Suspense has no mechanism to show an error fallback. For error handling, wrap the Suspense boundary inside an ErrorBoundary that catches the rejection and renders an error UI. In production, always pair ErrorBoundary > Suspense > LazyComponent so both loading and error states are handled gracefully.
<ErrorBoundary fallback={<Error />}>
  <Suspense fallback={<Loading />}>
    <DataComponent />
  </Suspense>
</ErrorBoundary>
Why it matters: Suspense handles the loading state but can't catch errors — you need an error boundary above it to handle failed loads gracefully.
Real applications: Wrap every Suspense boundary in an ErrorBoundary in production so a failed chunk download shows "Something went wrong" instead of a crash.
Common mistakes: Using Suspense without an ErrorBoundary — if the lazy import fails (network error), the app crashes with no fallback UI.
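The examples above use an ErrorBoundary component without defining one. A minimal sketch follows; error boundaries must be class components (React 18 offers no hook equivalent). To keep this snippet dependency-free, a tiny Component stub stands in for the real base class; in an app you would write class ErrorBoundary extends React.Component instead.

```javascript
// Stand-in for React.Component so the sketch runs without React.
class Component {
  constructor(props) { this.props = props; }
}

class ErrorBoundary extends Component {
  constructor(props) {
    super(props);
    this.state = { hasError: false };
  }

  // React calls this during render when any descendant throws,
  // including a lazy() chunk that fails to download. The returned
  // object is merged into state (real React applies it via setState).
  static getDerivedStateFromError() {
    return { hasError: true };
  }

  componentDidCatch(error, info) {
    // Side-effect slot: report the error to a logging service.
    console.error('Caught by boundary:', error);
  }

  render() {
    // After an error, show the fallback instead of the broken subtree.
    return this.state.hasError ? this.props.fallback : this.props.children;
  }
}
```

In production you would typically also offer a retry button that resets hasError, since a failed chunk download often succeeds on a second attempt.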
React.lazy requires the dynamic import() to resolve a module with a default export — it cannot directly lazy load named exports. This is a common obstacle because many utility or UI modules use named exports. One fix is a simple wrapper file that re-exports the named export as the default. Mapping the namespace inline with .then(m => ({ default: m.NamedExport })) also works; the wrapper file is often considered the cleaner pattern because the mapping isn't repeated at every call site.
// MathUtils.js exports { Calculator }
// Calculator.js (wrapper)
export { Calculator as default } from './MathUtils';
const Calculator = React.lazy(() => import('./Calculator'));
Why it matters: Knowing this limitation prevents a confusing error when you try to lazy load a named export directly — the workaround is a tiny re-export file.
Real applications: A utility module that exports many components — create a default-export wrapper for each one you want to lazy load.
Common mistakes: Writing React.lazy(() => import('./utils').then(m => m.Calculator)) — this fails because lazy expects the resolved value to look like { default: Component }; return .then(m => ({ default: m.Calculator })) instead, or use a default-export wrapper file.
Preloading hides the latency of code splitting by starting a chunk download before the user clicks — on hover or keyboard focus over the link. Calling import('./Dashboard') without rendering the result starts the network request and the browser caches the module. When React.lazy later requests the same module, it's already in the module cache and renders instantly. Preload only the most likely next navigation — preloading everything defeats the purpose of splitting.
const LazyDashboard = React.lazy(() => import('./Dashboard'));

// Preload function — starts the download without rendering
const preloadDashboard = () => import('./Dashboard');

function Nav() {
  return (
    <Link
      to="/dashboard"
      onMouseEnter={preloadDashboard} // preload on hover
      onFocus={preloadDashboard}      // preload on keyboard focus
    >
      Dashboard
    </Link>
  );
}

Calling import('./Dashboard') starts the network request. When React.lazy later needs the component, it is already cached by the browser.
Why it matters: Preloading hides the network latency of code splitting — the chunk is ready by the time the user clicks, so navigation feels instant.
Real applications: Preload the dashboard component on hover over the "Dashboard" nav link — by the time the user clicks, the chunk is loaded.
Common mistakes: Preloading everything eagerly — this defeats the purpose of code splitting; only preload components the user is very likely to visit next.
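The claim that a preloaded module is served from the module cache is easy to verify: calling import() twice with the same specifier resolves to the same cached namespace object, and only the first call does any work. A Node built-in stands in for the app chunk here so the snippet runs outside a bundler.

```javascript
// Dynamic import() resolves from the module cache after the first
// call, which is why hover-preloading makes the later React.lazy
// request for the same module effectively instant.
async function main() {
  const first = await import('node:path');  // triggers the actual load
  const second = await import('node:path'); // served from the cache
  console.log(first === second);            // true: same namespace object
}
main();
```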
useDeferredValue accepts a value and returns a deferred copy that intentionally lags behind during rapid updates — React processes urgent re-renders with the current value first, then updates the deferred copy at lower priority. This is React's built-in alternative to manual debouncing for derived computations: the input responds immediately while the expensive result list catches up. Use it for values you receive but don't control (like a prop); use useTransition for state updates you control directly.
import { useDeferredValue } from 'react';

function SearchResults({ query }) {
  const deferredQuery = useDeferredValue(query);
  // deferredQuery lags behind query during fast typing
  return (
    <ul>
      {filterItems(allItems, deferredQuery).map(item => (
        <li key={item.id}>{item.name}</li>
      ))}
    </ul>
  );
}

useDeferredValue is similar to debouncing but React-aware — it keeps the old value visible while the new one renders, preventing blank flashes.
Why it matters: It prevents the UI from going blank while computing expensive derived content — users see stale data instead of nothing.
Real applications: A filtered list where the filter computation is expensive — defer the filtered value so the input stays snappy while the list catches up.
Common mistakes: Confusing it with useTransition — useDeferredValue defers a value you don't control (like a prop); use useTransition when you control the state update.
With React 18 and Suspense-compatible data libraries (TanStack Query, Relay, SWR), components can suspend during render — the library throws a Promise when data isn't cached, Suspense catches it and shows the fallback, then re-renders when data arrives. This makes component code dramatically cleaner: no isLoading flags, no conditional rendering branches for loading states. The Next.js App Router uses this pattern natively with async server components and Suspense boundaries for each data-driven section.
// With TanStack Query (suspense mode)
function UserProfile({ userId }) {
  // Throws a promise if data isn't ready — Suspense catches it
  const { data } = useSuspenseQuery({
    queryKey: ['user', userId],
    queryFn: () => fetchUser(userId),
  });
  return <h1>{data.name}</h1>;
}

// Parent wraps with Suspense
<Suspense fallback={<Skeleton />}>
  <UserProfile userId={1} />
</Suspense>

The component code stays clean — no manual loading checks. Suspense handles all loading states declaratively at the boundary level.
Why it matters: Declarative loading states mean less boilerplate — no isLoading flags scattered through component code.
Real applications: Next.js App Router server components with async/await and Suspense boundaries for each data-driven section of a page.
Common mistakes: Expecting Suspense-based data fetching to work in React 17 or earlier — Suspense for data is a React 18+ feature, and the use() hook specifically requires React 19 (the Next.js App Router shipped it earlier via React canary builds).
Vendor chunk splitting separates large third-party libraries like chart.js, lodash, or d3 into their own chunks that load independently from application code. Since vendor libraries change rarely, browsers can cache them across deployments — users don't re-download React or chart.js every time you ship a feature update. Analyze your bundle first with vite-bundle-visualizer or webpack-bundle-analyzer to find which libraries are actually worth splitting before optimizing.
// Dynamic import — creates a separate chunk for the library
async function loadChart() {
  const { Chart } = await import('chart.js');
  return Chart;
}

// React.lazy with a heavy component that imports a large library
const HeavyChart = React.lazy(() => import('./ChartComponent'));
// ChartComponent imports chart.js — bundled into its own chunk

// Vite config — manual chunk splitting
export default defineConfig({
  build: {
    rollupOptions: {
      output: { manualChunks: { vendor: ['react', 'react-dom'] } },
    },
  },
});

Analyze your bundle with vite-bundle-visualizer or source-map-explorer to identify large dependencies worth splitting.
Why it matters: Vendor libraries like chart.js or moment can be megabytes — splitting them means they're loaded only when needed and cached separately from app code.
Real applications: A charting page loads the chart.js chunk only when visited; it gets cached by the browser so subsequent visits are instant even after app code changes.
Common mistakes: Not analyzing the bundle before optimizing — split based on real size data, not guesses; sometimes a "large" library is already tree-shaken to nothing.