What is rPPG?

Remote photoplethysmography (rPPG) extracts heart rate from subtle color changes in facial skin captured by a standard camera. No contact sensor is needed; a standard webcam suffices.
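The core idea can be illustrated with a toy estimator: average skin-pixel intensity sampled once per frame forms a time series, and the heart rate appears as its dominant periodicity. A minimal sketch for illustration only (the package's processor combines more robust spectral and autocorrelation estimates, per the spectralBpm and acfBpm metrics below):

```typescript
// Estimate the dominant period of an intensity time series via a naive
// autocorrelation search, returning beats per minute.
function estimateBpm(samples: number[], sampleRate: number): number {
  const n = samples.length;
  const mean = samples.reduce((a, b) => a + b, 0) / n;
  const x = samples.map((v) => v - mean);
  // Search only lags corresponding to a physiological 40-180 BPM range.
  const minLag = Math.floor((60 / 180) * sampleRate);
  const maxLag = Math.ceil((60 / 40) * sampleRate);
  let bestLag = minLag;
  let bestCorr = -Infinity;
  for (let lag = minLag; lag <= maxLag && lag < n; lag++) {
    let corr = 0;
    for (let i = 0; i + lag < n; i++) corr += x[i] * x[i + lag];
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  return (60 * sampleRate) / bestLag;
}
```

For a clean 1.2 Hz pulse sampled at 30 fps, this recovers roughly 72 BPM; real video adds motion and lighting noise, which is what the package's quality and confidence metrics account for.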

When To Use This Package

Use @elata-biosciences/rppg-web when your app needs:
  • camera-based biosignal processing in the browser
  • a higher-level rPPG session API instead of raw WASM orchestration
  • diagnostics and app-facing helpers around rPPG sessions

Installation

pnpm add @elata-biosciences/rppg-web

Requirements: Node.js 20+ and a browser with camera access and WebAssembly support.

Quick Start

For most browser apps, start with createRppgSession(). It handles WASM initialization, frame capture, ROI orchestration, diagnostics, and cleanup.

import { createRppgSession } from "@elata-biosciences/rppg-web";

const session = await createRppgSession({
  video: videoEl,
  sampleRate: 30,
  backend: "auto",
  faceMesh: "off",
  onDiagnostics: (diagnostics) => {
    console.log(diagnostics.state.status, diagnostics.faceTrackingMode);
    console.log(diagnostics.framesSeen, diagnostics.totalSamplesReceived);
  },
});

const metrics = session.getMetrics();
console.log("BPM:", metrics.bpm);

// Stop and release camera on cleanup
await session.stop();

Use createManagedRppgSession() if you also want automatic restart after terminal processor failures.

Diagnostics Guidance

Every session diagnostics payload includes state, issue codes, sampling stats, and processor failure information.
  • If diagnostics.state.status becomes "failed", treat the underlying processor failure as terminal and recreate the session.
  • If backend: "auto" falls back to an unavailable backend mode, diagnostics report that state instead of failing silently.
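The failure guidance above can be sketched as a small predicate. Only the state.status field comes from the diagnostics shape shown earlier; the interface and helper name here are illustrative, not part of the package:

```typescript
// Minimal shape of the diagnostics payload needed for failure handling
// (illustrative subset; the real payload also carries issue codes,
// sampling stats, and processor failure information).
interface SessionDiagnostics {
  state: { status: string };
}

// A "failed" status means the underlying processor is terminal: the
// session cannot recover, so the caller should recreate it.
function shouldRecreateSession(diagnostics: SessionDiagnostics): boolean {
  return diagnostics.state.status === "failed";
}
```

In an onDiagnostics callback, this predicate would gate tearing down the current session and calling createRppgSession() again (or you can let createManagedRppgSession() handle the restart for you).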

Advanced: RppgProcessor

For custom orchestration, drop down to RppgProcessor directly. Only use this when you need something createRppgSession() does not provide. If you are debugging, compare against createRppgSession() first.

import { RppgProcessor } from "@elata-biosciences/rppg-web";

const processor = new RppgProcessor("wasm", 30); // backend, sampleRate

// Feed green-channel intensity from face ROI each frame
processor.pushSample(performance.now(), greenIntensity);

const metrics = processor.getMetrics();
console.log("BPM:", metrics.bpm);

Constructor

new RppgProcessor(backend: Backend, sampleRate: number, windowSeconds?: number)
Parameter      Type               Description
backend        "wasm" | "native"  Processing backend
sampleRate     number             Expected frames per second (e.g., 30)
windowSeconds  number             Analysis window length (default: 10)

Pushing Samples

processor.pushSample(timestampMs, intensity);           // green channel only
processor.pushSampleRgb(timestampMs, r, g, b, skinRatio?);           // full RGB, optional skin-pixel ratio
processor.pushSampleRgbMeta(timestampMs, r, g, b, skinRatio?, motion?, clipRatio?);  // plus motion and clipping metadata
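Producing the intensity argument typically means averaging the green channel over the face ROI each frame. A sketch assuming the pixels come from a canvas getImageData call (RGBA byte order); meanGreen is an illustrative helper, not part of the package:

```typescript
// Mean green-channel intensity over an RGBA pixel buffer, such as the
// data field of an ImageData returned by
// CanvasRenderingContext2D.getImageData on a face ROI.
function meanGreen(rgba: Uint8ClampedArray): number {
  let sum = 0;
  // RGBA layout: the green byte of each pixel sits at offset 1, stride 4.
  for (let i = 1; i < rgba.length; i += 4) sum += rgba[i];
  return sum / (rgba.length / 4);
}

// Per frame, assuming ctx has the face ROI drawn into it:
//   const { data } = ctx.getImageData(0, 0, roiWidth, roiHeight);
//   processor.pushSample(performance.now(), meanGreen(data));
```

Using pushSampleRgb instead would mean computing the same per-channel means for red and blue as well.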

Metrics

Field        Type    Description
bpm          number  Estimated heart rate in BPM
quality      number  Signal quality (0 to 1)
spectralBpm  number  Spectral analysis estimate
acfBpm       number  Autocorrelation estimate
confidence   number  Overall confidence

Key Exports

  • createRppgSession
  • createManagedRppgSession
  • RppgProcessor
  • loadWasmBackend
  • createRppgAppAdapter
  • createRppgAppMonitor
  • normalizeRppgError
  • computeTraceWaveformDebug
  • ensureVideoPlaying

Next

rPPG Existing App Tutorial

Step-by-step integration guide

Frame Sources

MediaPipe face detection and camera capture

Calibration

Muse fusion and calibration models

Camera Integration Guide

End-to-end rPPG setup