About Us
We started with a vision: To become the trusted runtime layer for neural and biometric data — ensuring that every signal, from EEG to eye-tracking, is governed, consent-bound, time-locked, and provable before it powers research, therapies, or applications.
Neural data today ranges from the chaos of clinical EEG to consumer headsets leaking raw brain data. That’s why Oyster was built: to deliver a governed runtime where every signal is consent-bound, time-locked, and anchored with proofs. Our team blends neurotech research, clinical study operations, and AI infrastructure to make the most sensitive data safe, reproducible, and verifiable.
We Envision a Future Where
Raw brain signals never leak ungoverned to apps, clouds, or intermediaries.
Every session carries a Sync Certificate, proving timing quality and reproducibility.
Every export includes tamper-evident proofs, so regulators, labs, and users can verify provenance offline.
Developers, labs, and OEMs build freely on device-agnostic infrastructure, without being locked into hardware or cloud silos.
Compliance with neuroprivacy laws (GDPR, HIPAA, the EU AI Act, California neuroprivacy legislation) is the default, not an afterthought.