Documentation Index
Fetch the complete documentation index at: https://docs.elata.bio/llms.txt
Use this file to discover all available pages before exploring further.
Use this guide when you want the full browser EEG flow in one place. You will initialize the EEG WASM layer, connect over Web Bluetooth, stream frames, and turn those frames into app-ready metrics.
What this guide covers
This guide shows how to combine @elata-biosciences/eeg-web and @elata-biosciences/eeg-web-ble into a single browser flow.
By the end, you will be able to:
- initialize the EEG WASM runtime
- connect to a Muse-compatible headband from a browser
- receive normalized HeadbandFrameV1 frames
- compute real-time band powers from incoming EEG samples
- stop cleanly and recover from disconnects
Before you start
You need:
- @elata-biosciences/eeg-web
- @elata-biosciences/eeg-web-ble
- Chrome or Edge with Web Bluetooth support
- a secure context (https:// or localhost)
- a Muse 2 or Muse S headband
Choose the right starting point
Start from a working demo
Scaffold an EEG BLE demo if you want the fastest path to a reference app.
EEG Web getting started
Learn the browser EEG processing package before adding device transport.
EEG Web BLE getting started
Review the transport package, lifecycle methods, and platform constraints.
Live stream tutorial
Follow the step-by-step version if you want a tutorial before this end-to-end guide.
Install the packages
Install both packages with pnpm or npm.
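The original install commands are not preserved here; the standard forms, using the package names from this guide, are:

```shell
# with pnpm
pnpm add @elata-biosciences/eeg-web @elata-biosciences/eeg-web-ble

# or with npm
npm install @elata-biosciences/eeg-web @elata-biosciences/eeg-web-ble
```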
@elata-biosciences/eeg-web-ble depends on @elata-biosciences/eeg-web for shared frame types and the EEG WASM layer.
Integration flow
Initialize EEG WASM
Initialize the EEG WASM module before you create a transport or call analysis helpers.
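As a minimal sketch of this ordering rule: initEegWasm below is a hypothetical stand-in for the package's WASM initializer, since the real export name and signature are not shown in this guide.

```typescript
// Hypothetical stand-in for the WASM initializer exported by
// @elata-biosciences/eeg-web; the real name and signature may differ.
let wasmReady = false;

async function initEegWasm(): Promise<void> {
  // The real package would fetch, compile, and instantiate the WASM module here.
  wasmReady = true;
}

async function bootstrap(): Promise<void> {
  // Initialize WASM first, before creating a transport or calling analysis helpers.
  await initEegWasm();
  console.log("EEG WASM ready:", wasmReady);
}

bootstrap();
```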
Create a BLE transport
Include an Athena decoder up front so both classic and Athena-compatible headbands work through the same setup.
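A sketch of that setup, with local stand-ins: BleTransport and the Athena decoder are named in this guide, but the constructor option shape below is an assumption, not the package's real API.

```typescript
// Hypothetical shapes; the real BleTransport options may differ.
interface FrameDecoder {
  name: string;
}

// Stand-in for the package's Athena decoder.
class AthenaDecoder implements FrameDecoder {
  name = "athena";
}

// Stand-in for BleTransport from @elata-biosciences/eeg-web-ble.
class BleTransport {
  constructor(readonly options: { decoders: FrameDecoder[] }) {}
}

// Register the Athena decoder up front so classic and Athena-compatible
// headbands both work through the same transport setup.
const transport = new BleTransport({ decoders: [new AthenaDecoder()] });
console.log(transport.options.decoders.map((d) => d.name).join(",")); // → athena
```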
Connect from a user action and start streaming
Browser BLE flows usually need a click or another user gesture to open the device picker. In simpler app flows, startStreaming() is often the safest default because it combines connect() and start() in one call.
Full example
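The original full example is not reproduced here. The sketch below reconstructs the flow under stated assumptions: onFrame, startStreaming(), stop(), disconnect(), and HeadbandFrameV1 are named in this guide, but the exact signatures and the frame's field names are hypothetical, and the fake transport only exists to make the sketch self-contained.

```typescript
// Frame shape is an assumption; only the type name comes from this guide.
interface HeadbandFrameV1 {
  samples: Float64Array;
}

// Interface mirroring the lifecycle methods described in this guide.
interface Transport {
  onFrame(cb: (frame: HeadbandFrameV1) => void): void;
  startStreaming(): Promise<void>; // connect() + start() in one call
  stop(): Promise<void>;
  disconnect(): Promise<void>;
}

async function runEegFlow(transport: Transport): Promise<number> {
  let frames = 0;
  transport.onFrame(() => {
    frames += 1; // in a real app: push samples into analysis, update state
  });
  // In a browser, call this from inside a click handler (user gesture).
  await transport.startStreaming();
  return frames;
}

// Fake transport so the sketch runs without hardware; a real app would use
// BleTransport from @elata-biosciences/eeg-web-ble instead.
function fakeTransport(): Transport {
  let cb: ((f: HeadbandFrameV1) => void) | null = null;
  return {
    onFrame: (fn) => {
      cb = fn;
    },
    startStreaming: async () => {
      for (let i = 0; i < 3; i++) cb?.({ samples: new Float64Array(12) });
    },
    stop: async () => {},
    disconnect: async () => {},
  };
}

runEegFlow(fakeTransport()).then((n) => console.log(`received ${n} frames`)); // → received 3 frames
```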
Clean shutdown
Stop the stream and release the Bluetooth session when the user leaves the view or closes the page.
Handle reconnection
If a disconnect is recoverable, try to reconnect and restart the stream.
Understand the transport lifecycle
| Method | What it does |
|---|---|
| startStreaming() | Connects and starts the stream in one call |
| connect() | Opens the Bluetooth picker and prepares the session |
| start() | Begins the EEG stream and triggers onFrame |
| stop() | Stops the stream but keeps the Bluetooth session open |
| disconnect() | Releases the Bluetooth session |
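The shutdown and reconnection patterns can be sketched against these lifecycle methods alone; the mock transport below is a stand-in so the helpers can run without hardware, and everything except the method names is an assumption.

```typescript
// Interface built only from the lifecycle methods in the table above.
interface EegTransport {
  connect(): Promise<void>;    // opens the Bluetooth picker, prepares the session
  start(): Promise<void>;      // begins the stream, triggers onFrame
  stop(): Promise<void>;       // stops the stream, keeps the session open
  disconnect(): Promise<void>; // releases the Bluetooth session
}

// Clean shutdown: stop the stream, then release the session.
async function shutdown(t: EegTransport): Promise<void> {
  await t.stop();
  await t.disconnect();
}

// Recoverable disconnect: reconnect, then restart the stream.
async function reconnect(t: EegTransport): Promise<void> {
  await t.connect();
  await t.start();
}

// Minimal mock that records call order.
const calls: string[] = [];
const mock: EegTransport = {
  connect: async () => { calls.push("connect"); },
  start: async () => { calls.push("start"); },
  stop: async () => { calls.push("stop"); },
  disconnect: async () => { calls.push("disconnect"); },
};

shutdown(mock)
  .then(() => reconnect(mock))
  .then(() => console.log(calls.join(" -> "))); // → stop -> disconnect -> connect -> start
```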
Device notes
- Classic Muse: works with the standard browser BLE flow.
- Athena firmware: supported when the transport is set up with an Athena decoder.
Architecture at a glance
The browser EEG BLE flow usually looks like this:
- initialize @elata-biosciences/eeg-web
- create a BleTransport
- connect from a user gesture
- receive HeadbandFrameV1 frames
- compute app metrics such as band powers, scores, or state transitions
- render charts, scores, or adaptive UI in your app
Integration tips
- Buffer frames before analysis for more stable results
- Use Float64Array when calling band_powers
- Keep BLE connection logic behind explicit user actions
- Move frame handling into app state instead of leaving it in console.log
- Start with a demo app if you need a working reference project first
- Test with synthetic data first during development if no headband is available
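The buffering and Float64Array tips above can be sketched as a small sliding window; the window math assumes 256 Hz and a 2-second window, and the real band_powers helper is not called here, only the Float64Array it would receive is prepared.

```typescript
// Sliding window of EEG samples for band-power analysis.
// At 256 Hz, a 2-second window is 512 samples.
const SAMPLE_RATE = 256;
const WINDOW_SECONDS = 2;
const WINDOW_SIZE = SAMPLE_RATE * WINDOW_SECONDS; // 512

class SampleWindow {
  private samples: number[] = [];

  push(chunk: ArrayLike<number>): void {
    for (let i = 0; i < chunk.length; i++) this.samples.push(chunk[i]);
    // Keep only the most recent WINDOW_SIZE samples.
    if (this.samples.length > WINDOW_SIZE) {
      this.samples = this.samples.slice(-WINDOW_SIZE);
    }
  }

  get full(): boolean {
    return this.samples.length >= WINDOW_SIZE;
  }

  // band_powers expects a Float64Array, so convert at the boundary.
  asFloat64(): Float64Array {
    return Float64Array.from(this.samples);
  }
}

const win = new SampleWindow();
// Simulate five incoming frames of 128 samples each (640 total, trimmed to 512).
for (let i = 0; i < 5; i++) win.push(new Float64Array(128));
console.log(win.full, win.asFloat64().length); // → true 512
```

Once `win.full` is true, passing `win.asFloat64()` to band_powers gives it the 1-to-2-second buffer it works best with.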
band_powers is most useful with roughly 1 to 2 seconds of data. At 256 Hz, that is about 256 to 512 samples.
Where to go next
Muse device details
Review device behavior, protocol details, and compatibility notes.
EEG Web BLE getting started
See the transport package API and platform caveats in more detail.
Stream EEG over Web Bluetooth
Follow the tutorial version of this workflow with app-oriented steps.
Add EEG to an existing browser app
Integrate browser EEG processing into an app before adding BLE transport.
Headband transport
Learn the frame schema and transport boundary used by EEG flows.
rPPG camera integration
Pair this EEG workflow with the browser camera pipeline when you need multimodal biometrics.