Documentation Index

Fetch the complete documentation index at: https://docs.elata.bio/llms.txt

Use this file to discover all available pages before exploring further.

Use this page if you want the package model for browser EEG rather than a full integration walkthrough. For numbered steps in an existing app, see Add EEG To An Existing Browser App; for the scaffold path, see Build Your First Elata App. If you want a running app without manual wiring, scaffold with create-elata-demo first.
EEG is optional in the usual Elata journey: camera rPPG is the primary browser integration for many apps. Add eeg-web when you need brain-signal analysis, and add eeg-web-ble alongside it when you need Web Bluetooth for a Muse-compatible headset.

EEG measures electrical activity from the brain using a wearable headset. In browser apps, developers usually stream headset samples, compute features such as band powers, and use those features to drive feedback, dashboards, adaptive UX, or interactive experiences. Concrete app examples include:
  • meditation or breath-training apps that adapt guidance to focus or calmness
  • neurofeedback experiences that reward attention or steadiness
  • simulated athletic-performance or training apps that react to mental state
  • games or creative tools that change difficulty, pacing, or effects based on live EEG features

Install

pnpm add @elata-biosciences/eeg-web

What eeg-web Gives You

@elata-biosciences/eeg-web provides:
  • browser-side EEG WASM initialization
  • signal-processing and model exports such as band_powers
  • shared types and contracts used by higher-level browser integrations
This package does not handle Bluetooth device connection by itself. Add @elata-biosciences/eeg-web-ble if you also need browser BLE transport.

Minimal Integration

import { initEegWasm, band_powers } from "@elata-biosciences/eeg-web";

// Initialize the packaged WASM runtime once, before any analysis calls.
await initEegWasm();

// band_powers() requires a single-channel Float32Array (see Common Gotchas).
const eegData = new Float32Array([0, 1, 0, -1]);
const powers = band_powers(eegData, 256);

console.log("alpha", powers.alpha);

Typical Integration Flow

  1. Initialize the packaged WASM runtime with initEegWasm().
  2. Pass browser-side EEG sample buffers into the exported analysis functions.
  3. If you later need live device transport, combine this package with eeg-web-ble.

When To Use The EEG Template Instead

Prefer the scaffolded eeg-demo template when you want:
  • a known-good Vite setup
  • a reference for how bundled WASM assets should be served
  • a synthetic-data app that runs without hardware

Common Gotchas

  • band_powers() takes a single-channel Float32Array, not number[]. Convert with new Float32Array(samples[channelIdx]) before passing. Passing a plain array will cause a WASM runtime error. For a Muse headband (4 channels: TP9, AF7, AF8, TP10), use a frontal channel (AF7 = index 1, or AF8 = index 2) for cognitive state features, or average across channels.
  • WasmCalmnessModel.process() expects interleaved samples: [s0_ch0, s0_ch1, s1_ch0, s1_ch1, ...]. The frame.eeg.samples layout is per-channel. Convert before passing.
  • WasmCalmnessModel needs channelCount at construction, but channelCount only arrives on the first frame. Construct the model inside your first-frame handler.
  • If initEegWasm() fails, your app may not be loading the packaged wasm/ assets correctly.
  • If you need a live headset connection, eeg-web alone is not enough; add @elata-biosciences/eeg-web-ble for the browser BLE transport.
  • If you are only evaluating the SDK, the scaffolded app is faster than manual setup.
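The two layout conversions above can be sketched in plain TypeScript. This is a minimal sketch, not package code: the number[][] shape for per-channel samples (indexed [channel][sample]) is an assumption for illustration, based on the per-channel frame.eeg.samples layout described above.

```typescript
// Per-channel samples, e.g. from a 4-channel Muse frame.
// The number[][] shape is an assumption for illustration.
type ChannelSamples = number[][]; // [channel][sample]

// Extract one channel as the Float32Array that band_powers() requires.
function channelToFloat32(samples: ChannelSamples, channelIdx: number): Float32Array {
  return new Float32Array(samples[channelIdx]);
}

// Average all channels sample-by-sample into a single Float32Array.
function averageChannels(samples: ChannelSamples): Float32Array {
  const n = samples[0].length;
  const out = new Float32Array(n);
  for (const channel of samples) {
    for (let i = 0; i < n; i++) out[i] += channel[i];
  }
  for (let i = 0; i < n; i++) out[i] /= samples.length;
  return out;
}

// Flatten per-channel samples into the interleaved layout described above:
// [s0_ch0, s0_ch1, s1_ch0, s1_ch1, ...].
function interleave(samples: ChannelSamples): Float32Array {
  const channels = samples.length;
  const n = samples[0].length;
  const out = new Float32Array(channels * n);
  for (let c = 0; c < channels; c++) {
    for (let i = 0; i < n; i++) out[i * channels + c] = samples[c][i];
  }
  return out;
}
```

For a Muse frame, channelToFloat32(samples, 1) would select AF7 for band_powers(), and interleave(samples) would prepare the buffer for a model that expects interleaved input.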
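The channelCount gotcha above suggests a lazy-construction pattern: build the model inside the first-frame handler. This sketch uses a self-contained stand-in class and a hypothetical frame shape so it runs on its own; in a real app you would construct WasmCalmnessModel from the package instead.

```typescript
// Stand-in for the model so this sketch is self-contained; the real
// WasmCalmnessModel comes from @elata-biosciences/eeg-web.
class StubCalmnessModel {
  constructor(public channelCount: number) {}
  process(interleaved: Float32Array): number {
    // Placeholder score: the real model computes calmness from EEG.
    return interleaved.length / this.channelCount;
  }
}

// Hypothetical frame shape for illustration; samples are already
// interleaved here for brevity.
interface EegFrame {
  channelCount: number;
  samples: Float32Array;
}

let model: StubCalmnessModel | null = null;

// Construct the model lazily inside the frame handler, because
// channelCount only arrives with the first frame.
function onFrame(frame: EegFrame): number {
  if (model === null) {
    model = new StubCalmnessModel(frame.channelCount);
  }
  return model.process(frame.samples);
}
```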

Next

Add EEG To An Existing App

Step-by-step integration tutorial

Web Bluetooth

Connect a Muse headset

eeg-web Reference

Package API and exports

Troubleshooting

Common failures and fixes