EEG measures electrical activity from the brain using a wearable headset. In browser apps, EEG integration typically means streaming samples from a headset, computing features such as band powers, and using those features to drive feedback, dashboards, adaptive UX, or interactive experiences. Concrete app examples include:
  • meditation or breath-training apps that adapt guidance to focus or calmness
  • neurofeedback experiences that reward attention or steadiness
  • simulated athletic-performance or training apps that react to mental state
  • games or creative tools that change difficulty, pacing, or effects based on live EEG features

Start With The Right Path

If you want a known-good reference app first, scaffold the EEG demo:
npm create @elata-biosciences/elata-demo my-app -- --template eeg-web-demo
cd my-app
pnpm install
pnpm run dev
Use this guide when you want to integrate EEG APIs into an existing app.

Install

pnpm add @elata-biosciences/eeg-web
# or with npm:
npm install @elata-biosciences/eeg-web

What eeg-web Gives You

@elata-biosciences/eeg-web provides:
  • browser-side EEG WASM initialization
  • signal-processing and model exports such as band_powers
  • shared types and contracts used by higher-level browser integrations
This package does not handle Bluetooth device connection by itself. Add @elata-biosciences/eeg-web-ble if you also need browser BLE transport.

How Apps Usually Use EEG

  1. Connect to a compatible headset or load sample buffers.
  2. Initialize the WASM runtime.
  3. Run feature extraction or analysis on incoming samples.
  4. Map those results into app state, scoring, content adaptation, or user feedback.
That “map those results” step is where the product behavior lives. For example, an app might slow a meditation guide when focus drops, increase challenge in a training simulation when sustained attention improves, or visualize calm vs. activation during a neurofeedback session.
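The mapping step can be sketched as a small pure function over extracted features. The alpha/beta ratio heuristic, thresholds, and function names below are illustrative assumptions, not part of the eeg-web API:

```typescript
// Sketch: map band powers into a 0–1 "focus" score for app feedback.
// The heuristic and thresholds are assumptions for illustration only.
interface BandPowers {
  alpha: number;
  beta: number;
}

function focusScore(powers: BandPowers): number {
  // Treat higher beta relative to alpha as higher engagement.
  const total = powers.alpha + powers.beta;
  if (total === 0) return 0;
  const score = powers.beta / total;
  return Math.min(1, Math.max(0, score));
}

// Example: drive a meditation guide's pacing from the score.
function guidancePace(score: number): "slow" | "normal" {
  return score < 0.4 ? "slow" : "normal";
}
```

In a real app this function would run on each analysis result and feed UI state, scoring, or content selection.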

Minimal Integration

import { initEegWasm, band_powers } from "@elata-biosciences/eeg-web";

await initEegWasm();

const eegData = new Float32Array([0, 1, 0, -1]);
const powers = band_powers(eegData, 256);

console.log("alpha", powers.alpha);
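Absolute band-power values vary across users, headsets, and sessions, so a common practice is to normalize them into relative proportions before mapping them to app behavior. This helper is a sketch of that practice, not an eeg-web export; the band keys mirror the result shape used above:

```typescript
// Sketch: convert absolute band powers into relative (proportional) values,
// which are easier to compare across sessions and users.
type Bands = Record<string, number>;

function relativeBandPowers(powers: Bands): Bands {
  const total = Object.values(powers).reduce((sum, p) => sum + p, 0);
  if (total === 0) return powers; // avoid dividing by zero on empty input
  return Object.fromEntries(
    Object.entries(powers).map(([band, p]) => [band, p / total])
  );
}
```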

Typical Flow

  1. Initialize the packaged WASM runtime with initEegWasm().
  2. Pass browser-side EEG sample buffers into the exported analysis functions.
  3. Add eeg-web-ble later if you need live headset transport.

Vite Config

Two approaches work; use whichever fits your setup.

Option A — plugins (recommended for new projects)
npm install -D vite-plugin-wasm vite-plugin-top-level-await
import { defineConfig } from "vite";
import wasm from "vite-plugin-wasm";
import topLevelAwait from "vite-plugin-top-level-await";

export default defineConfig({
  plugins: [wasm(), topLevelAwait()],
});
import { initEegWasm } from "@elata-biosciences/eeg-web";
await initEegWasm(); // no argument needed
Option B — explicit URL import (no extra plugins)

If you prefer not to add plugins, pass the WASM asset URL directly:
import wasmUrl from "@elata-biosciences/eeg-web/wasm/eeg_wasm_bg.wasm?url";
import { initEegWasm } from "@elata-biosciences/eeg-web";

await initEegWasm(wasmUrl);
Option B works because initEegWasm accepts a URL and fetches the WASM binary itself, bypassing Vite's ESM handling of .wasm imports entirely.

Common Gotchas

  • band_powers() takes a single-channel Float32Array — not number[]. Convert with new Float32Array(samples[channelIdx]) before passing. Passing a plain array will cause a WASM runtime error. For a Muse headband (4 channels: TP9, AF7, AF8, TP10), use a frontal channel (AF7 = index 1, or AF8 = index 2) for cognitive state features, or average across channels.
  • WasmCalmnessModel.process() expects interleaved samples: [s0_ch0, s0_ch1, s1_ch0, s1_ch1, ...]. The frame.eeg.samples layout is per-channel — convert before passing.
  • WasmCalmnessModel needs channelCount at construction, but channelCount only arrives on the first frame. Construct the model inside your first-frame handler.
  • If initEegWasm() fails, your app may not be serving the packaged wasm/ assets correctly. See Vite config above.
  • If you need a live headset connection, eeg-web alone is not enough — add @elata-biosciences/eeg-web-ble for browser BLE transport.
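The layout conversions described above can be sketched as small helpers. This assumes frame samples arrive as a per-channel array of arrays (samples[channel][sampleIndex]); the helper names are illustrative, not package exports:

```typescript
// band_powers() needs a single-channel Float32Array, not a plain number[].
function channelAsFloat32(samples: number[][], channelIdx: number): Float32Array {
  return new Float32Array(samples[channelIdx]);
}

// Alternative: average across all channels into one Float32Array.
function averageChannels(samples: number[][]): Float32Array {
  const n = samples[0].length;
  const out = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    let sum = 0;
    for (const channel of samples) sum += channel[i];
    out[i] = sum / samples.length;
  }
  return out;
}

// WasmCalmnessModel.process() expects interleaved samples:
// [s0_ch0, s0_ch1, s1_ch0, s1_ch1, ...] — convert per-channel data first.
function interleave(samples: number[][]): Float32Array {
  const channels = samples.length;
  const n = samples[0].length;
  const out = new Float32Array(channels * n);
  for (let i = 0; i < n; i++)
    for (let c = 0; c < channels; c++) out[i * channels + c] = samples[c][i];
  return out;
}
```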

Next Steps