eeg-notebooks: a Python library for mobile EEG experiments and analysis

Project info


Title: eeg-notebooks: a Python library for mobile EEG experiments and analysis

Project lead:

Project collaborators: eeg-notebooks is one of the many fruits of the NeuroTechX global neurotechnology hacker community. There’s a long list of past and present contributors to the project. Currently active contributors include John Griffiths, Jadin Tredup, Morgan Hough, Cristiano Micheli, Kyle Mathewson, and several others.

Please add your name to the list here if you’re interested in contributing to eeg-notebooks during the Brainhack Global 2020 sprint, and/or thereafter.

Registered Brainhack Global 2020 Event: The main BHG2020 eegnb sprint will happen during Brainhack Ontario (2nd-4th Dec).

…but we are expecting to have contributors from around the world in multiple time zones during this time :)

Project Description:

eeg-notebooks is a Python library that allows you to run cognitive neuroscience experiments with a simple mobile EEG device and a laptop computer. We like to think and talk about this as “democratizing the cognitive neuroscience experiment”, because it makes possible for a few hundred bucks what has traditionally only been achievable in dedicated research labs, using equipment with four-figure (and higher) price tags.


Launch an experiment and the device selection prompt with the stand-alone command-line interface:

eegnb runexp -ip 

Run a visual N170 (faces vs. houses) RSVP EEG experiment via the Python API:

from eegnb.experiments.visual_n170 import n170   # choose and import the experiment to run from a wide selection
from eegnb.devices.eeg import EEG                # import the EEG class, which handles...
eeg_device = EEG(device='muse2016')              # ...connecting to and streaming from your device
n170.present(duration=300, eeg=eeg_device,       # launch the experiment (psychopy visual stimulus presentation)...
             save_fn='my170data.csv')            # ...and write EEG data and experiment triggers to file
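Once a session has been recorded, the saved CSV can be sanity-checked with nothing more than the standard library. A minimal sketch follows; note that the column names used here (`timestamps`, `TP9`, ..., `stim`) are assumptions for illustration, not a documented eeg-notebooks schema, so check the header of your own file first:

```python
import csv
import io

# Stand-in for a few rows of a recorded session file; real files are
# written by n170.present(save_fn=...). Column names are hypothetical.
csv_text = """timestamps,TP9,AF7,AF8,TP10,stim
0.000,1.2,0.4,0.5,1.1,0
0.004,1.3,0.5,0.6,1.0,1
0.008,1.1,0.3,0.4,1.2,0
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))
events = [r for r in rows if r["stim"] != "0"]   # rows where a stimulus marker fired
print(len(rows), "samples,", len(events), "stimulus event(s)")
```

To run the same check on a real recording, replace the `io.StringIO(...)` wrapper with `open('my170data.csv')`.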

For more examples, check out the Sphinx examples gallery.

Data to use:

  1. The eeg-notebooks repo includes example datasets for several EEG experiments. We are very interested in contributions presenting compelling new analyses of these data in the examples documentation.

  2. Collect your own data! We are compiling a large database of user-contributed EEG data, collected using the eeg-notebooks visual and auditory experiments.

Link to project repository/sources:

Docs page:

Goals for Brainhack Global 2020:

At BHG 2020 we are primarily looking for contributions in the following areas:

We’re also all ears for any cool new additional functionality ideas.

Important point: You don’t need an EEG device available to join us and make contributions.

For example, there’s plenty of important work to be done that is pure MNE, scikit-learn, and/or PsychoPy.

You can contribute to data collection and device testing if you have access to one or more of the following: Muse 2016, Muse 2, Muse S, OpenBCI Ganglion, OpenBCI Cyton, g.tec Unicorn, BrainBit, Neurosity Notion 1, or Neurosity Notion 2. See the eeg-notebooks documentation for more info.

Good first issues:

  1. Test the installation, and report any bugs and observations for improving the docs (more here)

  2. Run the visual N170 experiment and compute ERPs (more here)

See more good first issues here.
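Good first issue 2 boils down to epoching the recording around stimulus triggers and averaging trials per condition. A real analysis would use MNE’s Epochs/Evoked machinery; the sketch below just shows the underlying idea on synthetic single-channel data, with every number made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 256                        # Hz; a plausible mobile-EEG sampling rate
n_trials, epoch_len = 30, sfreq    # thirty 1-second epochs per condition
t = np.arange(epoch_len) / sfreq   # time axis in seconds

# Synthetic trials: noise, plus a negative deflection near 170 ms for the
# "faces" condition only (mimicking the N170 component).
n170_bump = -2.0 * np.exp(-((t - 0.170) ** 2) / (2 * 0.02 ** 2))
faces = rng.normal(0, 1, (n_trials, epoch_len)) + n170_bump
houses = rng.normal(0, 1, (n_trials, epoch_len))

# The ERP is simply the average over trials within each condition.
erp_faces = faces.mean(axis=0)
erp_houses = houses.mean(axis=0)

# The faces-minus-houses difference wave should dip near 170 ms.
peak_idx = int(np.argmin(erp_faces - erp_houses))
print(f"difference-wave minimum near {t[peak_idx] * 1000:.0f} ms")
```

With real eeg-notebooks data you would epoch around the recorded trigger timestamps instead of simulating trials, and average each channel separately.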




Tools/Software/Methods to Use:

Useful, depending on what you do:

See the installation instructions. Please work through these prior to 2nd Dec, and post any issues you have getting set up on the GitHub repo issues page.

Communication channels:

Project labels

Project Submission

Submission checklist

Once the issue is submitted, please check off the items in this list as you add them under ‘Additional project info’.

Optionally, you can also include information about:

We would like you to think about how you will credit and onboard new members to your project. If you’d like to share your thoughts with future project participants, you can include information about:
