- deception or bluffing games that react to pulse changes during key moments
- stress or arousal feedback in training and social experiences
- breathing and relaxation flows that show physiological response over time
- biofeedback-oriented health or wellness apps that want camera-based pulse signals without extra hardware
## Start With The Fastest Path

If you want a working reference app before integrating manually, scaffold the rPPG demo first.
## Recommended Entry Point

`@elata-biosciences/rppg-web` provides `createRppgSession()` as the recommended
browser integration API.
Start there unless you intentionally need lower-level sample ingestion or custom
runtime orchestration.
## How Apps Usually Use rPPG

- Ask for camera access and attach the stream to a `video` element.
- Start `createRppgSession()` on that video.
- Read metrics and diagnostics from the session.
- Show pulse-related feedback, readiness UI, or monitoring outputs in the app.
## Minimal Camera → BPM Loop

The shortest path from camera to BPM readings is to start a session directly on
the video element and poll its metrics. `faceMesh: "off"` uses the full video
frame as the ROI — good enough for a face filling most of the frame. Switch to
`faceMesh: "mediapipe"` for a face-crop ROI when the user might move around.
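The minimal loop might look like the following sketch. The `video`, `backend`, and `faceMesh` options and the `getMetrics()` call are taken from this guide; the `metrics.bpm` field name and whether `createRppgSession()` returns a promise are assumptions — check the package types.

```typescript
import { createRppgSession } from "@elata-biosciences/rppg-web";

// Acquire the camera and attach the stream to a <video> element.
const video = document.querySelector("video")!;
video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
await video.play();

// Start a session on that video. faceMesh: "off" uses the full frame as the ROI.
const session = await createRppgSession({ video, backend: "auto", faceMesh: "off" });

// Poll metrics; BPM stays null during warmup or if the WASM backend never loaded.
const timer = setInterval(() => {
  const metrics = session.getMetrics();
  if (metrics.bpm != null) {
    console.log(`BPM: ${metrics.bpm.toFixed(1)}`); // assumed field name
  }
}, 1000);

// During cleanup: clearInterval(timer); await session.stop();
```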
If you need a single boolean for UI gating, use `createRppgAppAdapter().canPublish`
instead of polling `getMetrics()` directly — it handles the backend check,
confidence threshold, and warmup window in one place.
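A sketch of that gating pattern. `createRppgAppAdapter()` and `canPublish` are the names used above; passing the session into the adapter, and the button wiring, are illustrative assumptions.

```typescript
import { createRppgSession, createRppgAppAdapter } from "@elata-biosciences/rppg-web";

const video = document.querySelector("video")!;
const publishButton = document.querySelector<HTMLButtonElement>("#publish")!;

const session = await createRppgSession({ video, backend: "auto" });

// Assumed construction: the adapter wraps an existing session.
const adapter = createRppgAppAdapter(session);

// One boolean covering the backend check, confidence threshold, and warmup window.
publishButton.disabled = !adapter.canPublish;
```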
## Minimal Integration

### Typical Flow

- Acquire a camera stream and attach it to a `video` element.
- Call `createRppgSession({ video, backend: "auto" })`.
- Read metrics from `session.getMetrics()`.
- Surface diagnostics through `onDiagnostics` or `session.getDiagnostics()`.
- Stop the session during cleanup with `await session.stop()`.
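The steps above can be sketched as one lifecycle. The `onDiagnostics` callback shape is an assumption based on the name used in this guide; `getDiagnostics()` is the pull-based alternative.

```typescript
import { createRppgSession } from "@elata-biosciences/rppg-web";

async function runRppg(video: HTMLVideoElement) {
  // Acquire a camera stream and attach it to the video element.
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  const session = await createRppgSession({
    video,
    backend: "auto",
    // Assumed callback form; session.getDiagnostics() also works on demand.
    onDiagnostics: (d: unknown) => console.debug("rppg diagnostics", d),
  });

  const timer = setInterval(() => {
    console.log(session.getMetrics());
  }, 1000);

  // Return a cleanup function: stop polling, stop the session, release the camera.
  return async () => {
    clearInterval(timer);
    await session.stop();
    (video.srcObject as MediaStream | null)?.getTracks().forEach((t) => t.stop());
  };
}
```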
## Managed Restart Flow

If your app wants the SDK to own restart timing after terminal processor
failures, use `createManagedRppgSession()`.
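A sketch of what that might look like. Only the function name and the `video`/`backend` options come from this guide; any restart-tuning options the managed variant accepts are not specified here.

```typescript
import { createManagedRppgSession } from "@elata-biosciences/rppg-web";

const video = document.querySelector("video")!;

// The managed variant owns restart timing after terminal processor failures,
// so the app does not have to detect a failed state and recreate sessions itself.
const managed = await createManagedRppgSession({ video, backend: "auto" });

console.log(managed.getMetrics());
```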
## Public Diagnostics And Debug Data

Useful public APIs for app-side diagnostics:

- `session.getDiagnostics()`
- `session.getTraceSnapshot()`
- `computeTraceWaveformDebug()`
- `normalizeRppgError()`
- `createRppgAppAdapter()`
- `createRppgAppMonitor()`
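For example, a catch block might normalize errors before logging. Whether `normalizeRppgError()` accepts arbitrary thrown values, and the shape it returns, are assumptions here.

```typescript
import { normalizeRppgError } from "@elata-biosciences/rppg-web";

try {
  // ... session work ...
} catch (err) {
  // Normalize SDK errors into a consistent shape before reporting.
  const normalized = normalizeRppgError(err);
  console.error("rppg failure", normalized);
}
```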
## Vite Config

### WASM asset placement

The default session loader fetches WASM files from `/pkg/rppg_wasm.js` and
`/pkg/rppg_wasm_bg.wasm` at runtime. In a Vite app, place those files under
`public/pkg/` so they are served at that path. The files live in
`packages/rppg-web/pkg/` after running
`pnpm --dir packages/rppg-web run build:wasm`, or in
`node_modules/@elata-biosciences/rppg-web/pkg/` after an npm install. Copy or
symlink that directory into your `public/` folder.
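For the npm-install case, the copy step can be as simple as the following, using the paths named above:

```shell
# Copy the packaged WASM assets so Vite serves them at /pkg/ in dev and build.
mkdir -p public
cp -r node_modules/@elata-biosciences/rppg-web/pkg public/pkg
```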
Alternatively, use the import-based options below to let Vite manage the asset
URLs instead of serving them from `public/`.
### Dynamic import restriction

Vite 7 blocks `import(url)` for files served from `/public`. Two approaches
work:

**Option A — vite-plugin-wasm (recommended for new projects).**

**Option B — `?url` imports.** Vite resolves `?url` imports to fingerprinted
asset URLs at build time, bypassing the public directory restriction entirely.
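A minimal Option A config, using the real `vite-plugin-wasm` package. Whether this SDK's loader picks up bundler-managed WASM automatically is not confirmed in this guide; treat this as a starting point.

```typescript
// vite.config.ts — Option A sketch.
import { defineConfig } from "vite";
import wasm from "vite-plugin-wasm";

export default defineConfig({
  plugins: [wasm()],
});
```

For Option B, an import such as `import wasmUrl from "@elata-biosciences/rppg-web/pkg/rppg_wasm_bg.wasm?url"` yields a fingerprinted URL at build time; how to hand that URL to the session loader depends on loader options this guide does not specify.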
## Common Gotchas

- BPM is always null but status is `running` — this looks like warmup but could
  mean WASM didn't load. Check `session.backendMode` first: if it is
  `"unavailable"`, the WASM assets are not being served at `/pkg/`. Fix the
  asset placement and the metrics will flow. See the Vite Config section above.
- If `session.backendMode` is `unavailable`, your app is probably not serving
  the packaged `pkg/` assets correctly.
- If `session.state.status` is `failed`, recreate the session instead of
  continuing to use the same poisoned processor.
- If camera access fails, verify that the page can call `getUserMedia`.
- If you are just evaluating the SDK, the scaffolded demo is much faster than
  wiring the whole camera pipeline yourself.
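The failed-status and backend-unavailable gotchas can be handled in the polling loop. The `session.state.status` and `session.backendMode` fields come from the list above; the recovery wrapper itself is illustrative.

```typescript
import { createRppgSession } from "@elata-biosciences/rppg-web";

async function pollWithRecovery(video: HTMLVideoElement) {
  let session = await createRppgSession({ video, backend: "auto" });

  setInterval(async () => {
    if (session.state.status === "failed") {
      // A failed processor is poisoned: stop it and start fresh.
      await session.stop();
      session = await createRppgSession({ video, backend: "auto" });
      return;
    }
    if (session.backendMode === "unavailable") {
      // WASM assets are not being served at /pkg/ — see the Vite Config section.
      console.warn("rppg backend unavailable: check /pkg/ asset placement");
      return;
    }
    console.log(session.getMetrics());
  }, 1000);
}
```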
## Next Steps
- Package reference: rppg-web
- Compatibility details: Compatibility
- Troubleshooting: Troubleshooting