# Next.js On-Demand Rendering
Point bext at a Next.js `app/` directory and start serving — no `next build`, no `.next/` directory. Pages compile on first request and are cached. Like PHP, but with React.
## Quick Start
```sh
# In your Next.js project directory:
bext dev .
```
```toml
# bext.config.toml
[server]
app_dir = "."

[framework]
type = "nextjs"

[nextjs]
on_demand = true
```
That's it. Visit http://localhost:3000/ and bext will:
1. Scan `app/` for `page.tsx` + `layout.tsx` files
2. Compile the requested page on demand (warm bun worker, ~15ms)
3. Render via V8 — `renderToString` for sync pages, `renderToReadableStream` + bext's custom event loop driver for streaming
4. Cache both the compiled bundle and the warm V8 page context (the next call to the same route skips bundle eval entirely, ~5ms)
5. Serve subsequent requests from cache
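The scan in step 1 amounts to a recursive walk that treats every directory containing a `page.tsx` as a route. A minimal sketch — the function name and exact rules here are illustrative, not bext's internals:

```typescript
import { readdirSync, statSync } from "node:fs";
import { join } from "node:path";

/** Recursively collect app-router routes: every directory containing a
 *  page.tsx becomes a route, with [param] segments kept verbatim.
 *  (Hypothetical helper, sketching the scan described above.) */
export function scanAppDir(appDir: string, prefix = ""): string[] {
  const routes: string[] = [];
  for (const entry of readdirSync(appDir)) {
    const full = join(appDir, entry);
    if (statSync(full).isDirectory()) {
      // Route groups like (marketing) do not affect the URL.
      const segment =
        entry.startsWith("(") && entry.endsWith(")") ? "" : `/${entry}`;
      routes.push(...scanAppDir(full, prefix + segment));
    } else if (entry === "page.tsx") {
      routes.push(prefix === "" ? "/" : prefix);
    }
  }
  return routes.sort();
}
```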
## How It Works
```
GET /blog/hello-world
        │
        ▼
Route scanner matches: app/blog/[slug]/page.tsx
        │
        ▼
ISR cache check → miss (first request)
        │
        ▼
Bundle cache check → miss (never compiled)
        │
        ▼
Generate wrapper entry:
  import Layout from "app/layout.tsx"
  import Page from "app/blog/[slug]/page.tsx"
  renderToReadableStream()
        │
        ▼
Warm bun worker → compiled JS (~15ms)
        │
        ▼
V8 streaming render → HTML (~3ms warm, ~30ms cold)
        │
        ▼
Cache: bundle (by route + mtime), warm V8 context (by bundle hash), HTML (ISR with TTL)
        │
        ▼
Return HTML
```
Next request: `GET /blog/hello-world` → ISR cache hit → served in sub-millisecond time.
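The "route scanner matches" step above boils down to segment-by-segment comparison with param extraction. A sketch with hypothetical names (bext's real matcher is internal):

```typescript
/** Match a URL against an app-router pattern like /blog/[slug] or
 *  /docs/[...path], returning extracted params or null on no match.
 *  (Illustrative sketch, not bext's actual matcher.) */
export function matchRoute(
  pattern: string,
  url: string,
): Record<string, string> | null {
  const pat = pattern.split("/").filter(Boolean);
  const seg = url.split("/").filter(Boolean);
  const params: Record<string, string> = {};
  for (let i = 0; i < pat.length; i++) {
    const p = pat[i];
    if (p.startsWith("[...") && p.endsWith("]")) {
      // Catch-all: capture every remaining segment.
      params[p.slice(4, -1)] = seg.slice(i).join("/");
      return params;
    }
    if (p.startsWith("[") && p.endsWith("]")) {
      if (seg[i] === undefined) return null;
      params[p.slice(1, -1)] = seg[i]; // dynamic segment
    } else if (p !== seg[i]) {
      return null; // static segment mismatch
    }
  }
  return pat.length === seg.length ? params : null;
}
```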
## Configuration
```toml
[nextjs]
compat = "full"          # Real layout.tsx, metadata, full Next.js semantics
on_demand = true         # Enable on-demand compilation
bundler = "bun"          # bun (warm worker pool) or turbopack (in-process Rust)
bundle_cache_max = 500   # Max cached compiled bundles

[render]
streaming = true                # React 19 renderToReadableStream + bext event loop
# rsc = true                    # Experimental RSC Flight payload endpoint
# streaming_timeout_ms = 5000   # Wall-clock cap for a single render
# streaming_stall_ms = 250      # Bail out if no chunk for this long

[nextjs.aliases]         # Import path aliases
"@/*" = "src/*"
```
`compat = "full"` means bext understands real `layout.tsx`, `metadata` exports, async server components, and the rest of the App Router contract. Path aliases are read from your `tsconfig.json` automatically.
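For instance, a `tsconfig.json` of the following typical shape (your paths will differ) yields the same alias as the `[nextjs.aliases]` example above:

```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  }
}
```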
## What's Supported
| Feature | Status | Notes |
|---|---|---|
| Static pages (`page.tsx`) | ✅ | Compiled + cached on first request |
| Layouts (`layout.tsx`) | ✅ | Nested composition (root → leaf), per-layout-chain shell cache |
| Dynamic routes (`[slug]`) | ✅ | Params extracted from URL |
| Catch-all (`[...path]`) | ✅ | All remaining segments captured |
| Route groups (`(group)`) | ✅ | No URL impact, organizational only |
| `generateStaticParams` | ✅ | Used for pre-rendering hints |
| Static metadata (`export const metadata`) | ✅ | Detected at scan time |
| API routes (`route.ts`) | ✅ | GET/POST/etc., dynamic params via second arg |
| Async Server Components | ✅ | Native — `streaming = true` uses `renderToReadableStream` + bext's V8 event loop |
| `<Suspense>` boundaries | ✅ | Streaming SSR pumps async children to completion |
| `use(promise)` hook | ✅ | Driven by the streaming render loop |
| `redirect()` / `notFound()` | ✅ | Translated to HTTP 307 / 404 via `Error.prototype.digest` capture |
| `next/headers` / `next/cookies` | ✅ | Read from the live request |
| `'use client'` components | ✅ | Server-rendered to HTML, client bundle hydrates them |
| RSC Flight payload (experimental) | ⚙️ | `[render] rsc = true` exposes `/__bext/rsc/<path>` returning a real Flight payload via `react-server-dom-parcel` |
| Server Actions (`'use server'`) | 🔜 | Wired through `react-server-dom-parcel`'s `decodeReply`/`decodeAction` — in progress |
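As a concrete instance of the API-route row above, here is a minimal route handler with a dynamic segment. This is standard Next.js App Router code (Next 15 delivers `params` as a promise), not bext-specific; the path and response fields are illustrative:

```typescript
// app/api/post/[slug]/route.ts (illustrative path)
export async function GET(
  _request: Request,
  { params }: { params: Promise<{ slug?: string }> },
) {
  // The dynamic segment arrives via the second argument, as the table notes.
  const { slug } = await params;
  return Response.json({ ok: true, slug: slug ?? null });
}
```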
## Next.js API Shims
bext's V8 `require` shim handles the common Next.js server APIs out of the box, with no install step. `redirect()` and `notFound()` set their `digest` field, and bext installs an `Error.prototype.digest` setter in the SSR wrapper that captures the URL/code and translates the thrown error into the right HTTP response.
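A minimal sketch of that digest-capture trick. The digest string format follows Next.js conventions; `captured` and `toResponse` are illustrative names, not bext's API:

```typescript
let captured: string | undefined;

// Prototype setter: observe the digest the moment Next.js assigns it,
// then restore normal data-property semantics on the instance itself.
Object.defineProperty(Error.prototype, "digest", {
  configurable: true,
  set(value: string) {
    captured = value;
    Object.defineProperty(this, "digest", {
      value,
      writable: true,
      configurable: true,
    });
  },
});

// What next/navigation's redirect() effectively does under the hood:
function redirect(url: string): never {
  const err = new Error("NEXT_REDIRECT");
  (err as { digest?: string }).digest = `NEXT_REDIRECT;replace;${url};307;`;
  throw err;
}

// Translate a captured digest into the HTTP response the server returns.
function toResponse(digest: string): { status: number; location?: string } {
  if (digest === "NEXT_NOT_FOUND") return { status: 404 };
  if (digest.startsWith("NEXT_REDIRECT")) {
    const [, , url, status] = digest.split(";");
    return { status: Number(status), location: url };
  }
  return { status: 500 }; // not a control-flow error: genuine failure
}
```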
## Async Server Components, Suspense, and `use()`
Async Server Components, `<Suspense>` with async children, and the `use(promise)` hook all work natively when `[render] streaming = true` (the default for new projects):
```tsx
// app/blog/[slug]/page.tsx
export default async function Page({ params }) {
  const { slug } = await params;
  const post = await fetchPost(slug);
  return <article>{post.title}</article>;
}
```
V8 has no native event loop, so bext implements one in Rust: `collect_async_render_result` pumps `perform_microtask_checkpoint()` until React's `renderToReadableStream` reports done, with stall detection (`streaming_stall_ms`) and a wall-clock cap (`streaming_timeout_ms`). Reading progress through raw V8 object access avoids per-tick JS compile/eval overhead, so a warm streaming render lands at ~1ms of eval-thread time.
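bext's driver lives in Rust against V8 directly, but the consumer logic maps onto something like this TypeScript analogy, where `stallMs` and `timeoutMs` mirror `streaming_stall_ms` and `streaming_timeout_ms`. A sketch, not the real implementation:

```typescript
/** Drain a streaming render with a per-chunk stall limit and an overall
 *  wall-clock cap (illustrative analogy of bext's Rust driver). */
export async function drainWithLimits(
  stream: ReadableStream<Uint8Array>,
  stallMs: number,
  timeoutMs: number,
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let html = "";
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const remaining = deadline - Date.now();
    if (remaining <= 0) throw new Error("render timeout"); // wall-clock cap
    let timer: ReturnType<typeof setTimeout> | undefined;
    try {
      // Race the next chunk against the stall budget.
      const next = await Promise.race([
        reader.read(),
        new Promise<never>((_, reject) => {
          timer = setTimeout(
            () => reject(new Error("render stalled")),
            Math.min(stallMs, remaining),
          );
        }),
      ]);
      if (next.done) break; // renderToReadableStream finished
      html += decoder.decode(next.value, { stream: true });
    } finally {
      clearTimeout(timer);
    }
  }
  return html + decoder.decode();
}
```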
If you set `streaming = false`, bext falls back to React's synchronous `renderToString`. That path doesn't support async server components, Suspense boundaries, or `use(promise)`, but it skips the event loop entirely — useful for projects that have no async server-side fetches.
## React Server Components (experimental)
Set `[render] rsc = true` and bext exposes `/__bext/rsc/<path>` for any matched route. The endpoint returns a real React Flight payload produced by `react-server-dom-parcel/server.edge`, ready to be consumed by `createFromFetch` on the client.
Under the hood, bext compiles a separate RSC bundle per route through the warm bun worker:
- Bun is invoked with `--conditions=react-server`, so React resolves to its server-only build
- The bundle inlines that server-only React (instead of leaving it external) — required so the Flight bundle is a self-contained classic script V8 can run, and so its React internals don't collide with the regular React used by the SSR HTML bundle
- The compiled bundle is evaluated in a separate V8 page context from the SSR HTML bundle. Both the SSR context and the RSC context are cached independently, so the next request to the same route reuses both
- A Flight payload for a small route lands in ~10ms warm, ~50ms cold (compile + eval combined)
The Flight endpoint is currently localhost-only and additive — it does not yet replace the SSR HTML response. The client manifest, hydration bootstrap, and 'use server' wiring are the remaining pieces before RSC is fully end-to-end.
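Client-side consumption would look roughly like this. The `/__bext/rsc` prefix comes from above; `flightUrl` is a hypothetical helper, and the `createFromFetch` call is commented out because it needs a running bext server:

```typescript
/** Build the Flight endpoint URL for a matched route path
 *  (hypothetical helper; only the /__bext/rsc prefix is from the docs). */
export function flightUrl(routePath: string): string {
  return "/__bext/rsc" + (routePath.startsWith("/") ? routePath : `/${routePath}`);
}

// In a client component (not runnable here — requires a bext server):
// import { createFromFetch } from "react-server-dom-parcel/client";
// const tree = await createFromFetch(fetch(flightUrl("/blog/hello-world")));
```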
## Bundle Cache
Compiled bundles are cached per route:
| Property | Value |
|---|---|
| Key | `route:/blog/:slug` |
| Invalidation | `page.tsx` mtime changes, or any `layout.tsx` mtime changes |
| Max entries | 500 (configurable via `bundle_cache_max`) |
| Eviction | LRU when full |
When you edit a file:
- File watcher detects the mtime change
- Bundle cache entry invalidated for affected routes
- ISR cache entries cleared for those routes
- Next request recompiles on demand (~15ms)
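The invalidation rules above can be sketched as an mtime-checked LRU keyed by route. bext's cache is implemented in Rust; these names and shapes are assumptions:

```typescript
/** Illustrative mtime-keyed LRU bundle cache (not bext's real internals). */
export class BundleCache {
  private entries = new Map<string, { mtimeMs: number; bundle: string }>();
  constructor(private max = 500) {}

  get(route: string, mtimeMs: number): string | undefined {
    const hit = this.entries.get(route);
    if (!hit || hit.mtimeMs !== mtimeMs) return undefined; // stale → recompile
    // Re-insert to mark as most recently used.
    this.entries.delete(route);
    this.entries.set(route, hit);
    return hit.bundle;
  }

  set(route: string, mtimeMs: number, bundle: string): void {
    this.entries.delete(route);
    this.entries.set(route, { mtimeMs, bundle });
    if (this.entries.size > this.max) {
      // Map preserves insertion order, so the first key is least recent.
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }
}
```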
## Performance
Validated end-to-end via the SSR harness in `harnesses/ssr-dev/` (33 base tests + 10 streaming-specific tests against a real Next.js fixture app):
| Scenario | Time |
|---|---|
| Warm bun compile (single page + layout chain) | ~13–15ms |
| V8 cold render (bundle eval + first call) | ~30ms |
| V8 warm render (cached page context) | ~3–5ms |
| Streaming render — async server component + Suspense | ~5ms |
| RSC Flight payload (cold) | ~50ms |
| RSC Flight payload (warm) | ~10ms |
| ISR cache hit | sub-millisecond |
| File change → invalidate → recompile | ~15ms |
See bext.dev/benchmarks for the full numbers and how they were measured.
## Example: nextjs-blog
The `examples/nextjs-blog` directory contains a complete Next.js 15 app:
```
examples/nextjs-blog/
  bext.config.toml
  package.json
  app/
    layout.tsx            # Root layout with nav + footer
    page.tsx              # Home page with post list
    blog/
      [slug]/
        page.tsx          # Dynamic blog post with async params
    api/
      revalidate/
        route.ts          # ISR revalidation endpoint
```
Run it:
```sh
cd examples/nextjs-blog
bun install
bext dev .
```
Visit http://localhost:3060/ — home page renders instantly. Click a blog post — compiles on first visit, cached after.
## Comparison with `next build`
| | `next build` + `next start` | bext on-demand |
|---|---|---|
| Build step | Required (~30–60s) | None |
| First page load | Pre-rendered | ~20ms compile |
| Subsequent loads | Instant | Instant (ISR cache) |
| File change | Rebuild required | Auto-recompile (~15ms) |
| Memory | Full `.next/` directory | Per-route cache |
| Cold start | Load all routes | Load nothing, compile on demand |
bext on-demand is ideal for development and content sites where instant feedback matters more than zero-ms first paint.