I Cut My Site's Load Time from 15s to 3s with Surgical Suspense Boundaries

I built jobs.ibbe.in as a hiring site. At some point I had to sit down and face the reality that the site was embarrassingly slow. We're talking 15+ seconds to get anything on screen. On a good connection. That's the kind of load time that makes people close the tab before your content even exists.

I fixed it. The site now loads meaningful content in under 3 seconds. Here's exactly what I did and why it worked.

The actual problem, in plain terms

Every page on the site was a single async component. That means when someone visited the homepage, Next.js would start the request, go fetch data from Supabase, wait for every query to finish, and only then begin sending any HTML to the browser. The user sat staring at a blank screen the entire time.

The worst part is that most of what would eventually land on screen had absolutely nothing to do with data. The section headings, the decorative backgrounds, the category pills, the newsletter signup form — all of that is static. It's the same for every visitor. And yet it was blocked, sitting behind database queries it had zero relationship with.

This is the core mistake: treating a whole page like one big async operation when only a small slice of it actually needs data.
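
To make that concrete, the old pages were shaped roughly like this. This is a simplified reconstruction, not my actual code; the helper and component names are placeholders:

// The old shape: the whole page is one async component, so nothing streams
export default async function HomePage() {
  // Every query has to finish before a single byte of HTML is sent
  const featuredJobs = await getFeaturedJobs()
  const categories = await getCategories()

  return (
    <main>
      {/* Static heading and badge, but they still wait on the queries above */}
      <h2>Available Positions</h2>
      <span className="badge">Live Roles</span>
      <JobCardsGrid jobs={featuredJobs} />
      <CategoryPills categories={categories} />
    </main>
  )
}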

React Suspense and streaming - what they actually do

Before getting into the changes, here's the mental model you need.

When Next.js renders a page using React Server Components, it has the ability to stream HTML to the browser in chunks. Instead of waiting for everything before sending anything, it can send the parts it already knows, then fill in the gaps as data becomes available.

<Suspense> is the mechanism that makes this work. You wrap a component in <Suspense fallback={<YourSkeleton />}>, and React treats that boundary as "figure this out later." Everything outside the Suspense boundary renders and streams immediately. The skeleton shows up as a placeholder. When the data comes back, the real component swaps in.

The key insight is this: only the component that actually fetches data belongs inside Suspense. Static siblings should live outside it, in what I call the shell, so they render instantly regardless of what the database is doing.

Phase 1: The foundation

Before the surgical audit, I set up the infrastructure that made all of this possible.

I created lib/queries.ts with 13 cached query functions. Each one uses Next.js's unstable_cache with a 60-second revalidation window, and each one selects only the columns it actually needs from Supabase rather than doing a SELECT *. This alone reduced query payload size significantly.

import { unstable_cache } from 'next/cache'
import { createClient } from '@/lib/supabase/static'
 
export const getFeaturedArticles = unstable_cache(
  async () => {
    const supabase = createClient()
    const { data } = await supabase
      .from('articles')
      .select('id, title, slug, excerpt, cover_image, category, published_at')
      .eq('featured', true)
      .order('published_at', { ascending: false })
      .limit(6)
    return data ?? []
  },
  ['featured-articles'],
  { revalidate: 60 }
)

I also added skeleton components in components/skeletons.tsx so that when a Suspense boundary is waiting, it shows a realistic placeholder instead of nothing. And I configured the client router cache in next.config.ts:

import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  experimental: {
    staleTimes: {
      dynamic: 60,
      static: 300,
    },
  },
}

export default nextConfig

This tells Next.js to keep prefetched pages in the browser's memory longer before re-fetching them. The dynamic number covers pages with server data; the static number covers fully static pages. Navigating back to a page you've already visited is instant, with no extra round trip.
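
As for the skeletons mentioned above: they're plain presentational markup with no data dependency, so they cost nothing to render in the shell. A minimal sketch of the kind of component that lives in components/skeletons.tsx (the markup and class names here are illustrative, not copied from my repo):

// components/skeletons.tsx
export function JobCardsGridSkeleton() {
  return (
    <div className="grid gap-4 md:grid-cols-3">
      {Array.from({ length: 6 }).map((_, i) => (
        // Pulsing grey boxes roughly the size and shape of a real job card
        <div key={i} className="h-40 animate-pulse rounded-lg bg-gray-200" />
      ))}
    </div>
  )
}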

Phase 2: The surgical audit

With the foundation in place, I went through every public-facing page and asked the same question: what on this page is static, and what genuinely requires a database call? Then I moved everything static outside of Suspense.

The homepage

Before the fix, the "Available Positions" section was one big async component inside a single Suspense. The section heading ("Available Positions," the "Live Roles" badge) was waiting behind a database query for the job cards.

After the fix:

// app/page.tsx
 
export default function HomePage() {
  return (
    <main>
      {/* This renders immediately — no data dependency */}
      <section>
        <h2>Available Positions</h2>
        <span className="badge">Live Roles</span>
 
        {/* Only this waits for the database */}
        <Suspense fallback={<JobCardsGridSkeleton />}>
          <FeaturedJobCards />
        </Suspense>
      </section>
    </main>
  )
}

The heading appears instantly. The job cards stream in when the query resolves. Users know what they're looking at before the data arrives.
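
FeaturedJobCards itself is the part that does the waiting: an async server component that calls one of the cached queries and renders the result. Roughly like this, where getFeaturedJobs and JobCard are placeholders for whatever the real query function and card component are called:

// components/featured-job-cards.tsx
import { getFeaturedJobs } from '@/lib/queries'

export async function FeaturedJobCards() {
  // This await is what the Suspense boundary in app/page.tsx is waiting on
  const jobs = await getFeaturedJobs()

  return (
    <div className="grid gap-4 md:grid-cols-3">
      {jobs.map(job => (
        <JobCard key={job.id} job={job} />
      ))}
    </div>
  )
}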

The stories listing page - the biggest impact

This was the worst offender. The entire page — hero section, category pills, featured article, article grid, newsletter form — was wrapped in a single Suspense. Nothing appeared until every article query finished.

After the audit, I split it into five independent pieces:

// app/stories/page.tsx
 
export default function StoriesPage() {
  return (
    <main>
      {/* Instant — decorative, title, CTA buttons, no data */}
      <HeroSection />
 
      {/* Instant — rendered from a hardcoded CATEGORIES array */}
      <CategoriesSection />
 
      {/* Streams in independently */}
      <Suspense fallback={<FeaturedArticleSkeleton />}>
        <FeaturedArticle />
      </Suspense>
 
      {/* Streams in independently */}
      <Suspense fallback={<StoriesGridSkeleton />}>
        <ArticlesGrid />
      </Suspense>
 
      {/* Instant — static form, no data */}
      <NewsletterCTA />
    </main>
  )
}

One specific thing I had to remove was a dynamic article count badge in the hero ("X articles published"). It sounds like a minor feature, but it was the only data dependency in the hero section, which meant the entire hero was blocked behind a database count query. Removing it let the hero render instantly. The tradeoff is obvious and worth it.

The category pills are worth explaining. The <CategoriesSection> component previously fetched categories from the database. I replaced it with a hardcoded CATEGORIES array imported from a constants file. Categories change maybe once every few months. Hitting the database for them on every page load was unnecessary. Static data belongs in static config.
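
The constants file is nothing fancy. Something along these lines, where the category names and fields are made up for illustration; the real file just mirrors whatever categories the site actually has:

// lib/constants.ts
export const CATEGORIES = {
  'career-advice': {
    title: 'Career Advice',
    description: 'Practical guidance for every stage of the job hunt.',
    icon: 'compass',
  },
  'hiring-insights': {
    title: 'Hiring Insights',
    description: 'What recruiters and hiring managers actually look for.',
    icon: 'briefcase',
  },
  // ...edited by hand on the rare occasion a category changes
} as const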

The story category page

Each category page had a similar structure problem. The hero (icon, title, description, decorative wave separator) was inside an async function that also fetched article data. The hero doesn't need data — it's built from a static CATEGORIES config object keyed by slug.

// app/stories/category/[slug]/page.tsx
 
export default function CategoryPage({ params }) {
  const category = CATEGORIES[params.slug]
 
  return (
    <main>
      {/* Instant — from static config, zero DB */}
      <CategoryHero
        icon={category.icon}
        title={category.title}
        description={category.description}
      />
 
      {/* Instant — static string */}
      <h3>Latest in {category.title}</h3>
 
      {/* Tiny — just an article count */}
      <Suspense fallback={<CategoryStatsSkeleton />}>
        <CategoryStats slug={params.slug} />
      </Suspense>
 
      {/* Main content */}
      <Suspense fallback={<CategoryArticlesSkeleton />}>
        <CategoryArticlesGrid slug={params.slug} />
      </Suspense>
    </main>
  )
}
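
CategoryStats is deliberately tiny. It exists only so that the one piece of dynamic data in that header area, the article count, streams in on its own instead of blocking anything else. A sketch of what such a component might look like; in practice the count query could just as well live in lib/queries.ts behind unstable_cache like the others:

// components/category-stats.tsx
import { createClient } from '@/lib/supabase/static'

export async function CategoryStats({ slug }: { slug: string }) {
  const supabase = createClient()

  // head: true asks Supabase for the count only; no row data comes back
  const { count } = await supabase
    .from('articles')
    .select('*', { count: 'exact', head: true })
    .eq('category', slug)

  return <p>{count ?? 0} articles in this category</p>
}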

The story detail page

Individual story pages were already fast because of generateStaticParams and ISR (Incremental Static Regeneration). The article content itself is pre-built at deploy time and served as static HTML — no database query on the critical path.

The only remaining issue was that "More Stories" (related articles at the bottom) was being fetched in the same async function as the main article, making them sequential. I extracted it into its own component with its own Suspense boundary:

// app/stories/[slug]/page.tsx
 
export default async function StoryPage({ params }) {
  // This is a static page — content is pre-rendered, instant
  const article = await getArticle(params.slug)
 
  return (
    <article>
      <ArticleContent article={article} />
 
      {/* Loads in background while user reads */}
      <Suspense fallback={<RelatedArticlesSkeleton />}>
        <RelatedArticlesSection currentSlug={params.slug} category={article.category} />
      </Suspense>
    </article>
  )
}

The user gets the article immediately. Related stories load in the background while they read. By the time they finish the article, the related section has almost certainly already populated.
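
The extracted component is just another async server component behind its own boundary, so its query no longer sits in the critical path of the article itself. Roughly like this; the query shape and markup are illustrative:

// components/related-articles-section.tsx
import { createClient } from '@/lib/supabase/static'

export async function RelatedArticlesSection({
  currentSlug,
  category,
}: {
  currentSlug: string
  category: string
}) {
  const supabase = createClient()

  // Same category as the article being read, excluding the article itself
  const { data } = await supabase
    .from('articles')
    .select('id, title, slug, excerpt, cover_image')
    .eq('category', category)
    .neq('slug', currentSlug)
    .order('published_at', { ascending: false })
    .limit(3)

  return (
    <section>
      <h2>More Stories</h2>
      {(data ?? []).map(article => (
        <a key={article.id} href={`/stories/${article.slug}`}>
          {article.title}
        </a>
      ))}
    </section>
  )
}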

The RLS bug that was causing 404s

In Phase 1, I had switched the job detail page to use a static Supabase client (initialized with just the anon key) to avoid creating a new server client on every request. The intention was good — the static client is more efficient for public data.

The problem is that the jobs table has Row Level Security enabled in Supabase. RLS policies evaluate whether the requesting user has permission to read a row. The static client has no auth context, which means it looks like an anonymous public request. If the RLS policy requires authentication to read job rows, the static client gets back null, the page calls notFound(), and the user sees a 404.

The fix was straightforward: revert the job detail page to the server client (which uses cookies and has the auth context), while keeping the static client for generateStaticParams where only slugs are needed at build time and RLS is irrelevant.

// app/jobs/[slug]/page.tsx
 
import { createServerClient } from '@/lib/supabase/server'  // has auth context
import { createStaticClient } from '@/lib/supabase/static'  // anon only
 
// Build time — only needs slugs, no RLS concern
export async function generateStaticParams() {
  const supabase = createStaticClient()
  const { data } = await supabase.from('jobs').select('slug')
  return data?.map(job => ({ slug: job.slug })) ?? []
}
 
// Runtime — needs auth context for RLS
export default async function JobPage({ params }) {
  const supabase = createServerClient()
  const { data: job } = await supabase
    .from('jobs')
    .select('*')
    .eq('slug', params.slug)
    .single()
 
  if (!job) notFound()
  // ...
}

What I deliberately left alone

The jobs listing and category pages run a JobSearch client component that receives all its data as props. Since it's a client component handling search, filtering, and state, splitting it further would require a completely different architecture. The performance is acceptable as-is, and a refactor there is a separate project.

Auth-gated pages (login, signup, candidate profile, application status) were also untouched. Auth checks must complete before rendering because the decision of what to show depends on who you are. Streaming HTML before knowing the user's auth state would either expose content to wrong users or require client-side corrections after the fact. The current pattern is correct for those pages.

The net result

15 seconds to under 3 seconds. The experience went from staring at a blank screen for an uncomfortable stretch to a site that feels fast. Content appears progressively: shell first, then data as it arrives, in the right order, with skeletons holding space in between.

The principle that made all of this possible is simple. Only put components inside Suspense if they actually need to wait for data. Everything else belongs in the shell and should be on screen immediately. When you audit a slow page with that question in mind, you'll find that most of what's blocking the render has no business doing so.