V8 Render Engine
bext uses V8 (via rusty_v8) for all server-side rendering. The render pool's public API is engine-agnostic — it historically also supported WebKit/JavaScriptCore, but that backend was removed because V8's heap snapshots and TurboFan JIT delivered better cold-start and steady-state performance in every real deployment.
Building
cargo build -p bext-server --release \
--no-default-features \
--features v8,tls,nginx-compat,php,route-css
The v8 feature is part of the default feature set. --no-default-features above just drops feature baggage we don't need; see Build Flags for the full list.
Configuring
[render]
workers = 4 # Dedicated V8 worker threads (default: 4)
bundle_path = "dist/ssr-bundle.js"
There is no engine field — V8 is the only option. Existing configs that set engine = "v8" still load cleanly (the unknown field is ignored).
Three-Layer Rendering Pipeline
V8 on-demand rendering uses a three-layer architecture that keeps per-route compile and eval cost small:
Layer 1: React Base (747 KB, built once per process)
→ React + ReactDOM + jsx-runtime as globalThis globals
→ Disk-cached at .bext/v8-react-base.js (survives restarts)
Layer 2: Site Shell (31 KB for simple sites, cached per-site)
→ Root layout + Nav + Footer + shared UI components
→ Disk-cached at .bext/ssr-cache/__shell.js
→ Rebuilt when layout files change
Layer 3: Page Delta (27-51 KB per route, cached per-route)
→ Just the page component (React externalized)
→ Disk-cached at .bext/ssr-cache/{route}.js
→ Rebuilt when page source changes
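The layering amounts to evaluating three scripts, in order, into one shared context. A minimal simulation of that composition (the layer contents below are made-up stand-ins; bext's real layers are the cached bundles described above, evaluated inside a V8 context):

```javascript
// Evaluate each layer's script against one shared "globalThis" object,
// so later layers see the globals defined by earlier ones.
function renderContext(layers) {
  const g = {}; // stands in for a fresh V8 context's globalThis
  for (const src of layers) {
    new Function("globalThis", src)(g); // eval layer with `g` as globalThis
  }
  return g;
}

// Layer 1 defines React-like globals, layer 2 the shell, layer 3 the page.
const g = renderContext([
  "globalThis.React = { version: '19' };",                                 // base
  "globalThis.Shell = { layout: 'root' };",                                // shell
  "globalThis.Page = () => 'rendered with ' + globalThis.React.version;",  // delta
]);
```

Because layer 1 and layer 2 are cached, only the small layer-3 delta changes from route to route.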
Performance
| Metric | Without layering | With layering |
|---|---|---|
| Route bundle size | ~800 KB | 27-51 KB (96% smaller) |
| Bun compile time | 30ms | 9-14ms |
| V8 eval (first context) | 100ms | 32ms |
| V8 eval (subsequent) | 100ms | 7ms |
| Response cache hit | 6ms | 6ms |
Cache Hierarchy
Bundles are cached at three levels:
1. Memory — DashMap, instant lookup, cleared on source change
2. Disk — .bext/ssr-cache/, survives restarts, validated by file mtime
3. Compile — Bun subprocess, only runs on full cache miss
After a restart with warm disk cache, routes load without any Bun compilation.
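The lookup order can be sketched as follows — a hedged model with hypothetical `readDisk` and `compile` callbacks standing in for bext's real disk cache and Bun subprocess (the real memory level is a Rust DashMap):

```javascript
// Three-level bundle cache: memory → disk → compile.
// Entries are validated against the source mtime at every level.
function makeBundleCache(readDisk, compile) {
  const memory = new Map(); // level 1: in-process map
  return function get(route, sourceMtime) {
    const hit = memory.get(route);
    if (hit && hit.mtime === sourceMtime) return hit.code;   // memory hit
    const disk = readDisk(route);                            // level 2: disk
    if (disk && disk.mtime === sourceMtime) {
      memory.set(route, disk);                               // promote to memory
      return disk.code;
    }
    const code = compile(route);                             // level 3: full miss
    memory.set(route, { code, mtime: sourceMtime });
    return code;
  };
}
```

After a restart the memory map is empty but the disk level is warm, so `compile` is never invoked for unchanged routes.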
Configuration
The on-demand pipeline is configurable via bext.config.toml:
[nextjs]
on_demand = true
cache_ttl = 60 # Response cache TTL in seconds (default: 60)
cache_swr = 300 # Stale-while-revalidate window in seconds (default: 300)
max_shell_size = 524288 # Max shell bundle size in bytes (default: 512KB)
[render]
streaming = true # Use renderToReadableStream + bext event loop
# streaming_timeout_ms = 5000 # Wall-clock cap for one render
# streaming_stall_ms = 250 # Bail out if no chunk for this long
# rsc = true # Expose /__bext/rsc/<path> Flight endpoint
Sites with shells larger than max_shell_size (e.g., full-stack apps with Prisma/tRPC) automatically fall back to full-bundle compilation.
Streaming SSR
When streaming = true, bext drives React 19's renderToReadableStream from react-dom/server directly inside the V8 isolate. V8 has no built-in event loop, so bext implements one in Rust:
- collect_async_render_result pumps perform_microtask_checkpoint() until React reports done
- Progress is read via raw V8 object access, so each tick costs almost nothing
- A stalled stream (no new chunk for streaming_stall_ms) bails out early
- A wall-clock cap (streaming_timeout_ms, default 5000ms) bounds the worst case
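The stall and wall-clock rules can be modeled outside V8 as a driver over an async chunk source — a simplified sketch of the bail-out logic only, not the Rust microtask pump itself:

```javascript
// Drain an async iterator of chunks, bailing out on stall or wall-clock timeout.
async function drain(chunks, { stallMs = 250, timeoutMs = 5000 } = {}) {
  const out = [];
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const remaining = deadline - Date.now();
    if (remaining <= 0) return { out, reason: "timeout" }; // wall-clock cap
    const next = chunks.next();
    const bound = Math.min(stallMs, remaining);
    const winner = await Promise.race([
      next,
      new Promise((r) => setTimeout(() => r("stalled"), bound)),
    ]);
    if (winner === "stalled")
      return { out, reason: remaining <= stallMs ? "timeout" : "stall" };
    if (winner.done) return { out, reason: "done" };
    out.push(winner.value);
  }
}
```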
Streaming mode unlocks:
- async function Page() server components
- `<Suspense>` boundaries with async children
- The use(promise) hook
- await fetch(...) and other server-only async work
When streaming = false, bext deletes globalThis.__ReactDOMServer.renderToReadableStream from the page context so the bundle wrapper falls through to React's synchronous renderToString. That path can't await, but it skips the event loop entirely — useful for fully-sync pages.
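The fall-through can be illustrated with a hedged sketch of such a wrapper (the shape of `__ReactDOMServer` and the wrapper itself are assumptions modeled on the description above, not bext's actual bundle code):

```javascript
// Pick streaming rendering when the host exposes it, else fall through to sync.
function render(page, host) {
  const streaming = host.__ReactDOMServer?.renderToReadableStream;
  if (typeof streaming === "function") {
    return { mode: "streaming", result: streaming(page) };
  }
  // Host deleted renderToReadableStream → synchronous path, no event loop.
  return { mode: "sync", result: host.__ReactDOMServer.renderToString(page) };
}
```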
React Server Components (experimental)
Set [render] rsc = true and bext exposes /__bext/rsc/<path> for any matched route. The endpoint returns a real React Flight payload via react-server-dom-parcel/server.edge.
The RSC pipeline uses a separate bundle and a separate V8 page context from the SSR HTML pipeline:
1. The bun warm worker compiles a per-route RSC entry with --conditions=react-server so React resolves to its server-only build
2. The worker bundles that server-only React inline, skipping the usual react-external rule. Externalizing it would leave import.meta.require("react") references that V8's classic-script mode can't parse, and would force the runtime to fall back to the SSR bundle's regular React, which is incompatible with react-server-dom-parcel's server-only code paths
3. The bundle is evaluated in a fresh V8 context, cached separately from the SSR context (the page-context cache key is namespaced as page:rsc:... vs page:ssr:...)
4. renderToReadableStream from react-server-dom-parcel/server.edge produces the Flight payload, which collect_async_rsc_result drains via the same event-loop driver as streaming SSR
A small route's Flight payload lands in ~10ms warm, ~50ms cold (compile + eval combined). The endpoint is currently localhost-only and additive — it does not yet replace the SSR HTML response.
Response Headers
V8-rendered pages include:
| Header | Value |
|---|---|
| x-bext-mode | on-demand |
| x-bext-cache | hit, stale, or miss |
| cache-control | public, max-age={ttl}, stale-while-revalidate={swr} |
| etag | FNV hash of response body |
| x-content-type-options | nosniff |
| x-frame-options | SAMEORIGIN |
| referrer-policy | strict-origin-when-cross-origin |
| link | Preload hints extracted from page HTML |
| content-encoding | gzip (when client accepts) |
| vary | Accept-Encoding |
ETag enables 304 Not Modified responses — zero transfer on repeat visits.
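As an illustration, here is a 32-bit FNV-1a hash and the conditional-request check. The exact FNV variant bext uses (width, FNV-1 vs FNV-1a) is not specified above, so treat the parameters as an assumption:

```javascript
// 32-bit FNV-1a over a string body, formatted as a quoted ETag.
function fnv1a(body) {
  let h = 0x811c9dc5;                    // FNV-1a 32-bit offset basis
  for (let i = 0; i < body.length; i++) {
    h ^= body.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;  // FNV 32-bit prime, wrap to u32
  }
  return '"' + h.toString(16).padStart(8, "0") + '"';
}

// 304 decision: a matching If-None-Match means a zero-byte response body.
function status(ifNoneMatch, body) {
  return ifNoneMatch === fnv1a(body) ? 304 : 200;
}
```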
Architecture
Dedicated Eval Thread
V8 evaluation runs on a single dedicated OS thread with a 32MB stack:
1. Single long-lived isolate — never dropped (avoids rusty_v8 v130 SEGV)
2. Fresh context per request — each render gets a clean scope
3. Channel-based dispatch — callers send work via mpsc::channel
4. 30-second timeout — prevents hanging renders
5. Periodic GC — low_memory_notification() every 50 renders
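Points 3 and 4 can be modeled in miniature: jobs are serialized as if sent over a single-consumer channel, each bounded by a timeout. This is a behavioral sketch only; the real worker is an OS thread receiving work over a Rust mpsc channel:

```javascript
// Serialize jobs like a single-consumer channel; bound each with a timeout.
function makeEvalThread(timeoutMs = 30_000) {
  let tail = Promise.resolve(); // chain enforces strict FIFO execution
  return function dispatch(job) {
    const run = tail.then(() =>
      Promise.race([
        Promise.resolve().then(job),
        new Promise((_, rej) =>
          setTimeout(() => rej(new Error("render timed out")), timeoutMs)),
      ]),
    );
    tail = run.catch(() => {}); // keep the queue alive after a failure
    return run;
  };
}
```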
Cached mtime
Source directory mtime is cached with a 1-second TTL to avoid filesystem walks on every request (53+ files for a typical site).
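The TTL pattern looks like this — a generic sketch with the clock injected for testability (the real code caches the result of a Rust directory walk):

```javascript
// Cache an expensive probe for ttlMs, re-running it only when the cache is stale.
function cachedMtime(probe, ttlMs = 1000, now = Date.now) {
  let value;
  let fetchedAt = -Infinity;
  return () => {
    if (now() - fetchedAt >= ttlMs) {
      value = probe();     // e.g. walk the source tree for the newest mtime
      fetchedAt = now();
    }
    return value;
  };
}
```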
Bridge Functions
V8 exposes native IO bridge functions to bundle code:
| Function | Purpose |
|---|---|
| __readFile(path) | Read files from disk |
| __readDir(path) | List directory contents |
| __httpFetch(url, opts) | HTTP requests via ureq |
| __dbQuery(db, sql, params) | SQLite queries |
| __env(key) | Environment variable access |
| __log(level, msg) | Structured logging |
Polyfills
Bare V8 isolates have no Node.js or Bun APIs. bext provides polyfills for:
- process.env, process.versions.node, process.cwd()
- console.log/warn/error/info
- TextEncoder / TextDecoder
- setTimeout / clearTimeout
- queueMicrotask
- window, document, navigator, location (minimal SSR stubs)
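A minimal sketch of installing such stubs on a bare global object (illustrative only; the stub values below are placeholders and bext's real polyfills are more complete):

```javascript
// Install minimal Node-ish stubs on a bare global object `g`.
function installPolyfills(g, env = {}) {
  g.process = {
    env,
    versions: { node: "20.0.0" }, // version string is a stand-in
    cwd: () => "/",
  };
  g.console = g.console ?? { log() {}, warn() {}, error() {}, info() {} };
  g.queueMicrotask = g.queueMicrotask ?? ((fn) => Promise.resolve().then(fn));
  g.window = g;                          // minimal SSR stubs
  g.document = { head: null, body: null };
  g.navigator = { userAgent: "bext-ssr" };
  return g;
}
```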
Troubleshooting
Empty <head> in rendered HTML
React 19's renderToString may drop <head> children in V8. bext works around this by extracting head content from layout.tsx server-side and injecting it into the HTML if <head></head> is empty.
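The workaround amounts to a string patch on the rendered HTML — a simplified sketch (the real head extraction from layout.tsx happens server-side and is not shown):

```javascript
// If the rendered HTML has an empty <head>, inject the extracted head content.
function injectHead(html, extractedHead) {
  if (/<head>\s*<\/head>/.test(html)) {
    return html.replace(/<head>\s*<\/head>/, `<head>${extractedHead}</head>`);
  }
  return html; // head already populated; leave it alone
}
```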
Shell too large
If you see "Shell too large, using full bundle" in logs, the site's layout chain includes heavy dependencies (Prisma, tRPC, etc.). The shell exceeds max_shell_size and falls back to full-bundle compilation. This is expected for full-stack apps.
To increase the threshold:
[nextjs]
max_shell_size = 1048576 # 1MB
Memory usage
Monitor V8 memory and render performance:
bext-server diagnose --pass <admin-password>
bext-server diagnose --pass <admin-password> --format json
Checking eval timing
V8 logs per-request timing breakdown:
V8 eval timing: polyfills_ms=0, base_shell_ms=1, bundle_ms=0, render_ms=5, total_ms=7
- base_shell_ms: Time to eval React base + site shell (1ms after first context, 21ms first time)
- bundle_ms: Time to eval page delta (0ms for 27-51KB bundles)
- render_ms: React renderToString time (5-10ms)