kylemath / eegedu

Interactive Brain Playground - Browser based tutorials on EEG with webbluetooth and muse

Home Page: http://eegedu.com

License: MIT License

HTML 0.73% JavaScript 99.27%
eeg openbci ecg muse js data-visualization data-analysis plotting react-native bci

eegedu's Introduction

EEGEdu


Interactive Brain Playground Logo

EEGEdu is an Interactive Brain Playground.

EEGEdu is served live at https://eegedu.com/, directly from the software in this repository. So, all you need to do to try the system out is head to EEGEdu.

EEGEdu is designed as an interactive educational website for learning and teaching about working with electroencephalogram (EEG) data. It is a teaching tool that allows students to interact with their own brain waves.

EEGEdu has been inspired by multiple works that came before. It is inspired by EEG101, but EEGEdu is web-based. Being web-based allows students to interact with EEG brain data without having to install any software. Others have used Neurotech EEG-notebooks in Python for data collection and analysis with muse-lsl. These tools support the Interaxon MUSE headset but require a Bluetooth Low Energy (BLE) dongle to work with common operating systems (e.g., Windows or Mac OSX). They also require editing pygatt code to connect to Muse headsets. Thus, previous software is cumbersome and serves as a barrier to entry for many students learning about EEG. EEGEdu aims to provide students with an accessible introduction to working with their own brain waves.

EEGEdu Curriculum

EEGEdu provides a step-by-step, incremental tutorial for students to interact with EEG-based brain signals. We break down the curriculum into the following lessons:

  1. Connect + hardware, Biophysics + signal viewing
  2. Heart rate time domain data
  3. Heart rate frequency domain -beats per minute
  4. Raw Data + artifacts + blinks + record
  5. Frequency domain explanation + Raw spectra + record
  6. Frequency bands + record
  7. Spectrogram
  8. Neurofeedback p5js demos
  9. Eyes closed eyes open experiment
  10. SSVEP experiment
  11. BCI trainer

Installation for Development

If you are interested in developing EEGEdu, here are some instructions to get the software running on your system. Note: Currently EEGEdu development requires a Mac OSX operating system.

To start, you will need to install Homebrew and yarn. These are easy to install with the following Terminal / bash commands:

## Install homebrew
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

## Install yarn
# NOTE: this will also install Node.js if it is not already installed.
brew install yarn 

# NOTE: Node.js must be version 10.x for Muse interaction

# Thus, if you are getting version issues, install n, with the following command:
# sudo npm install -g n

# Then, you can switch to version 10.x with the following command:
# sudo n 10.16.0

Then, in Terminal/bash, clone this Git repository and change directory into the newly cloned folder:

git clone https://github.com/kylemath/EEGEdu
cd EEGEdu

Then, you can install the required yarn packages for EEGEdu:

yarn install

Local Development Environment

Then, you can run the Local Development Environment of EEGEdu:

yarn start dev

If it is working correctly, the EEGEdu application will automatically open in a browser window at http://localhost:3000.

Local Production Environment

To start the Local Production Environment, you can use the following commands:

yarn cache clean
yarn run build
serve -s build

Local Testing of Changes

  1. Install any new packages yarn install
  2. Start the Local Development Environment yarn start dev
  3. Look for errors in terminal log
  4. Open a browser to http://localhost:3000
  5. Open Javascript console
  6. Look for errors in console
  7. Connect Mock data stream by clicking Connect button
  8. Run through the checkFunction below with Mock data.
  9. Disconnect Mock data stream
  10. Turn on Interaxon Muse headband
  11. Connect Muse data stream
  12. Repeat the checkFunction below with Muse data.
# Pseudocode for checking EEGEdu functionality
checkFunction = {
    view raw data
    change sliders
    make sure the data changes and no errors appear
    click Spectra
    move sliders
    make sure the plot updates
    click Bands
    move sliders
    make sure the plot updates
}

Deployment

EEGEdu is running on Firebase, and deployment happens automatically using GitHub post-commit hooks, or Actions, as they are commonly called. You can see how the application is built and deployed by inspecting the workflow.

Currently this automatic deployment is not working, so we deploy to Firebase manually:

First, install the Firebase deployment tools:

# Note: Homebrew should not be run with sudo; only one of these is needed
brew install firebase-cli
yarn global add firebase-tools

The first deployment requires login and initialization once:

firebase login

A browser window will open; log in to the Google account authorized for Firebase deployment:

firebase init
  • options: Hosting Sites only
  • public directory: build
  • single-page app: No
  • Overwrite - No
  • Overwrite - No

Then, deployment to Firebase happens with the following commands:

# clean the local cache to ensure most recent version is served
yarn cache clean

# build the latest version of the site
yarn run build

# deploy the latest version to firebase
firebase deploy

References and Related Tools

Contributing

The guide for contributors can be found here. It covers everything you need to know to start contributing to EEGEdu.

Credits

EEGEdu - An Interactive Electrophysiology Tutorial with the Interaxon Muse brought to you by Mathewson Sons. A KyKorKey Production.

License

EEGEdu is licensed under The MIT License (MIT)

eegedu's People

Contributors

buyuk-dev · keyfer · korymath · kylemath


eegedu's Issues

Manual / Automated Testing

@kylemath noted some testing paradigms in the Slack channel for how to properly test a PR / new feature. We should add some of those details more formally to the README to make sure that at least we are all on the same page ...

Module: Frequency Bands

After viewing the spectra, people should pick certain frequency bands and view them over time with visual/auditory feedback.

Page navigation

Big flow questions:
how does the page navigation work?
if you connect with Web Bluetooth, can you move to a new page
and stay connected?

or do the parts of the lesson all happen in one page, scrolling, ...

Updates and cleanup for Public Launch Milestone.

There are a few changes which should be made before this repository goes public:

  • Update the README.md - Done with #94
  • Update the CONTRIBUTING.md - Done with #95
  • (@kylemath -- you'll need to do this as the repo owner) -- Add short description and topics to the GitHub Repo.
  • Will we be making the repository public? YES. I do not think we need to, but there may be some benefit of doing so?
  • Clean up stale branches that are not works in progress.
  • Add logo, screenshots to the README.md
  • Remove stale code and references
  • Do we want to write up a blog post about the software?
  • Do we want to write a tutorial on how it might be used?
  • Will we be pushing it to folks on NeuroTechX?
  • Do you want to post it on HackerNews / Reddit / Twitter additionally?
  • Attribution for images on the website itself should be included as captions on the images.
  • Do we want a Google form to collect user feedback of EEGEdu? Something which will help us collect comments in a centralized place rather than a collection of emails/tweets/face to face comments?

Spectra plotting

I think this one is ready to go right now, want to add a record button

Add additional Mock data stream parameters

Currently we are outputting a mock data stream which is a sequence of random numbers. We may want to control the data stream by outputting a sine wave, or a wave corrupted by NaN values, to see how the various modules react to these modifications.
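A minimal sketch of what such a configurable mock generator could look like (the function and parameter names here are illustrative assumptions, not the actual EEGEdu mock-stream API):

```javascript
// Sketch: configurable mock EEG sample generator. Produces a sine wave
// at a given frequency, optionally corrupted by NaN values, instead of
// the current sequence of random numbers.
function makeMockSample({ freq = 10, srate = 256, amplitude = 50, nanRate = 0 } = {}) {
  let n = 0;
  return function nextSample() {
    const t = n++ / srate; // seconds since the stream started
    if (Math.random() < nanRate) return NaN; // simulate a corrupted sample
    return amplitude * Math.sin(2 * Math.PI * freq * t);
  };
}
```

A module could then be fed `makeMockSample({ nanRate: 0.01 })` to see how its plots behave when roughly 1% of samples are NaN.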

Deploy to Google App Engine

This software should be running on a live website front end.
This would allow users to test the system's performance without having to deal with installing Node.js packages and the like.

Expose classifier output from Module 12. Predict brain states.

Is your feature request related to a problem? Please describe.
It would be helpful to expose the classifier decision in a variety of ways and means so that the EEGEdu prediction backend could be used as a controller for other output tasks.

Describe the solution you'd like
add some sort of simple interface to collect short (1s?) measurements for user-defined states. The experimenter could add as many states as desired, and for each state, store as many EEG snippets as wanted. Then at test time let the user try to invoke one of those states (using a nearest neighbour classifier) and expose the output of the classifier as a MIDI message. This way the experimenter could try to train a brain-wave based musical instrument, in the spirit of Rebecca Fiebrink's work, or a controller.

Describe alternatives you've considered
Currently there is the ability to predict brain states, but the output is limited to the visual playground. One potential interface-centric library for making such MIDI interaction straightforward would be MIDI.js
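The nearest-neighbour step described above could be sketched as follows (this is an illustrative sketch, not EEGEdu's actual prediction backend; the data shapes are assumptions):

```javascript
// Euclidean distance between two equal-length feature vectors
function distance(a, b) {
  return Math.sqrt(a.reduce((sum, ai, i) => sum + (ai - b[i]) ** 2, 0));
}

// Nearest-neighbour classification: each stored example is a feature
// vector labelled with a user-defined state; return the label of the
// closest example to the incoming sample.
function classify(examples, sample) {
  let best = null;
  let bestDist = Infinity;
  for (const { features, label } of examples) {
    const d = distance(features, sample);
    if (d < bestDist) {
      bestDist = d;
      best = label;
    }
  }
  return best; // this label could then be mapped to a MIDI message downstream
}
```

The returned label is where the MIDI mapping would hook in, e.g. assigning each user-defined state its own note number.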

Module: Spectrogram

an image plot of the 2d frequency by time spectrogram, scrolling over the screen

Module: Heart rate

Module to show the time-domain data and, hopefully, simultaneously the frequency-domain data while someone holds two electrodes of the Muse with both hands to see their ECG. Peak picking could be used on the time-domain data to estimate the period and instantaneous heart rate. This can be compared to the frequency-domain estimation of heart rate.

currently working on the frequency domain here: https://github.com/kylemath/EEGEdu/tree/heartrate
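The time-domain peak-picking estimate could be sketched like this (an illustrative helper, not code from the heartrate branch; the threshold-based peak criterion is an assumption):

```javascript
// Sketch: estimate heart rate by peak picking in the time domain.
// Finds local maxima above a threshold, then converts the mean
// peak-to-peak interval (in samples) into beats per minute.
function estimateBPM(samples, srate, threshold) {
  const peaks = [];
  for (let i = 1; i < samples.length - 1; i++) {
    if (
      samples[i] > threshold &&
      samples[i] > samples[i - 1] &&
      samples[i] >= samples[i + 1]
    ) {
      peaks.push(i);
    }
  }
  if (peaks.length < 2) return null; // not enough beats to estimate

  const intervals = [];
  for (let i = 1; i < peaks.length; i++) intervals.push(peaks[i] - peaks[i - 1]);
  const meanInterval = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  return (60 * srate) / meanInterval; // samples-per-beat -> beats-per-minute
}
```

This instantaneous estimate could then be displayed next to the frequency-domain (spectral peak) estimate for comparison.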

The slider's last value is not used to update the charts, only the value before it; callback race problem

Describe the bug
When adjusting sliders, the data shows the second-to-last value selected, instead of the last

To Reproduce
Go to raw data and adjust the sliders.

Expected behavior
The charts should update to the newest settings.

I think it is a race problem where the callback is not complete at setting the new settings before the pipe gets recreated with the old settings, similar to this problem: https://stackoverflow.com/questions/30550314/react-input-range-slider-not-propagating-last-value-to-its-owner-in-chrome

  function handleIntervalRangeSliderChange(value) {
    setSettings(prevState => ({...prevState, interval: value}));
    resetPipeSetup();
  }

So resetPipeSetup() runs before setSettings is done. I tried something like the following, but the useState setter does not take a second callback argument:

  function handleIntervalRangeSliderChange(value) {
    setSettings(prevState => ({...prevState, interval: value}), function () {
      resetPipeSetup();
    });
  }
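One common fix for this hooks pattern (a sketch, not committed EEGEdu code) is to move the reset out of the handler and into a `useEffect` keyed on the setting, so it runs only after React has committed the new value:

```javascript
// Handler only updates state; no reset here
function handleIntervalRangeSliderChange(value) {
  setSettings(prevState => ({ ...prevState, interval: value }));
}

// Runs after any render in which settings.interval changed, so the
// pipe is always rebuilt with the latest (not second-to-last) value.
useEffect(() => {
  resetPipeSetup();
}, [settings.interval]);
```

This sidesteps the race entirely, because the effect cannot fire before the state update it depends on has landed.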

What's the goal?

What are the lessons we're trying to teach? What are the applications for this?

Disconnect data stream (muse or mock) button refreshes page.

This is a little disingenuous. The disconnect button just refreshes the page, which does disconnect the data stream, but it also destroys any work the user might have done in the sliders/textbox.

I would recommend removing the button; the user can choose to refresh as desired. As it stands, they might develop with mock data, then disconnect the mock data to switch to the Muse, and lose their cool animation.

Change settings of pipes live on the fly

We want to change a few settings on the fly in the browser so people can see the effects of different filtering settings, different fft settings, etc.

So two easy ones are:
can we toggle the filter in Raw on vs. off?
can we change the filter cutoffs with inputs from the user?
can we change the fft: bins: 256 number in the Spectra?

relevant closed issue from pipes:
neurosity/eeg-pipes#3

Flexibility to plot 1,2,3,4 or all 5 Channels

There is an extra channel people can get by plugging an electrode into the auxiliary port, with the metal of the electrode attached to a certain pin of the port. We will want to include an option to use that auxiliary channel.

Also, we will want the ability to plot just a single channel for some of these; not sure of the right UX for selection and will think about this more.

from: https://github.com/urish/muse-js/blob/master/README.md
Auxiliary Electrode
The Muse 2016 EEG headset contains four electrodes, and you can connect an additional auxiliary electrode through the Micro USB port. By default, muse-js does not read data from the auxiliary electrode channel. You can change this behavior and enable the auxiliary electrode by setting the enableAux property to true, just before calling the connect method:

async function main() {
  let client = new MuseClient();
  client.enableAux = true;
  await client.connect();
}

Module: BCI Trainer

Present two training sessions where individual does behaviour A in training session 1, and behaviour B in training session 2, then live classification can begin to classify into A or B.
Buttons for train A, train B, run training, run classification. Similar to Gene Kogan webcam examples, or eeg101.

Could also do continuous instead of discrete prediction with training along various parts of a continuous slider as input.

Rename from MuseFFT to EEGEdu

Want to change all references to MuseFFT to be EEGEdu, including the main .js file names, the tab names, and the class name in the tabs

the FFT is just a subset of the things our EEGEdu does

MuseFFTSpectra
MuseFFTRaw
export class MuseFFTRaw extends Component {

becomes

EEGEduSpectra
EEGEduRaw
export class EEGEduRaw extends Component {

Module: Data Analysis

Now that we are saving .csv data files of raw data with markers, perhaps we could make analysis modules to load that data, change settings, and look at results in real time

Is your feature request related to a problem? Please describe.
data files are made with no analysis code

Describe the solution you'd like
another module to load data and analyze it

Describe alternatives you've considered
analyzing it offline, less fun

Module: Evoked Experiment

Classic ERP experiment

Either auditory or visual stimuli

~700 ms apart with random jitter
2 types
red vs blue
high vs low tone

we collect 30-50 trials, 20% of which are one condition

we save a snip of data locked to the onset for each presentation
200 ms before and 1000 ms after

we append this to an array 1200 ms by number of trials, one for each condition

we subtract the 200 ms baseline average from the whole trial for each channel and trial

we average over the trials for each channel, average over hemispheres and get a single time series for each condition

plot this and save it
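The baseline-correction and averaging steps above can be sketched as (illustrative helpers, not EEGEdu source; assumes one trial = one array of samples for a single channel):

```javascript
// Subtract the mean of the pre-stimulus baseline period from the
// whole trial (the "200 ms before" portion of each epoch).
function baselineCorrect(trial, baselineLength) {
  const baseline =
    trial.slice(0, baselineLength).reduce((a, b) => a + b, 0) / baselineLength;
  return trial.map(v => v - baseline);
}

// Baseline-correct each trial, then average across trials sample-by-
// sample to get a single ERP time series for one condition.
function averageTrials(trials, baselineLength) {
  const corrected = trials.map(t => baselineCorrect(t, baselineLength));
  return corrected[0].map(
    (_, i) => corrected.reduce((sum, t) => sum + t[i], 0) / corrected.length
  );
}
```

Running this once per condition (and averaging the resulting series over hemispheres) yields the per-condition traces to plot and save.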

Feat: add support for NotionJS

I’d love to be able to use my Notion for this! NotionJS Docs

Are the maintainers open to that? It shouldn't be too bad, seeing as no Bluetooth drivers are needed. The only big difference is that you need authorization to access the device.

Module oddball exp: no markers saved

Describe the bug
The saved csv file does not contain any marker (stimulus or response). Tried on Windows tablet and Android Smartphone

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'ERP experiment'
  2. Start exp
  3. Touch on red circles
  4. Close window after csv is saved

Desktop (please complete the following information):

  • OS: Win 10
  • Browser [chrome]
  • Version [e.g. 22]

Smartphone (please complete the following information):

  • Device: [Huawei P10 lite]
  • OS: [Android 8]
  • Browser [chrome]
  • Version [e.g. 22]

Add tests for MuseFFT.js

We should start adding tests for the functions we are adding to the code base. This will help us make sure we are unit testing consistently along the way.

// Function to count by n to something

Reset cache when visiting page

Currently older versions are loading on locally served and deployed versions. @korymath suggested two fixes; I implemented this in the index.html file, which doesn't fix the problem:

    <meta http-equiv="Pragma" content="no-cache" />
    <meta http-equiv="Expires" content="0" />

@korymath also suggested doing something else in the .js file, not sure how
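One common .js-side approach (an assumption about what the suggestion may have been, not a confirmed fix) is to append a version query string to asset URLs, so each deploy's assets look like new URLs and a stale cached copy can never be served:

```javascript
// Sketch: cache-busting by query string. APP_VERSION is illustrative
// and would be bumped (or generated from the build hash) on each deploy.
const APP_VERSION = "2024-01-01";

function bustCache(url) {
  const sep = url.includes("?") ? "&" : "?";
  return `${url}${sep}v=${APP_VERSION}`;
}
```

If the app also registers a service worker (as create-react-app builds do by default), making sure it is unregistered would be another avenue worth checking.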

Firebase Analytics and Performance

Is your feature request related to a problem? Please describe.
Currently EEGEdu is not measuring the performance of the app using the full extent of Firebase tools.

Solution Path

  • Install Firebase Analytics
  • Set up the analytics hooks to ensure things are called appropriately
  • Measure performance with Firebase Performance Testing
  • Continue Locust testing for load measuring
  • Measure first-input-delay

ssvep rhythm test

We need better control over the p5.js animation frequency.
One fix is to add:

p.frameRate(60)

Another is to use Date.now() instead of counting draw cycles.
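The wall-clock approach could look like this (a sketch with illustrative names; the 7.5 Hz value is only an example frequency):

```javascript
// Derive the stimulus on/off state from the wall clock rather than
// from counting draw() cycles, so the flicker frequency stays correct
// even when the browser drops or delays frames.
function flickerState(nowMs, freqHz) {
  const period = 1000 / freqHz;          // ms per full on/off cycle
  return (nowMs % period) < period / 2;  // true = "on" half of the cycle
}

// Inside the p5 sketch's draw():
//   p.background(flickerState(Date.now(), 7.5) ? 255 : 0);
```

Because the state is a pure function of the timestamp, two draw calls a long time apart still land on the correct phase of the flicker.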

Minimum Viable Product

Let's start with:

  • single page web app,
  • automatically deploy to heroku with continuous integration testing,
  • use npm for the server
  • write the logs for the user

Basic functionality is covered in #7

Add Google analytics

For future grant applications, etc., we want to start tracking the number of visitors and their locations, so setting up analytics soon would be good.

Feature requests from Jon Kuziek power muse user

Jonathan Kuziek 3:27 PM
Hmmm well not really a setting but maybe have way to show the different plots we've been making for the stroke study? Brain symmetry, background spectra, spectra - background, lateralised plots (Tp9 and Tp10 on same plot), frequency ratios (delta+theta/alpha+beta). Would be interesting to see these in real time and then have a summary report after the 3 minutes.

Also I suppose a way to indicate impedance/variability in the data would be nice without having to look at the raw data (maybe a number to indicate variability like muselsl?).

Ummm, I suppose another thing that would be useful would be a way to customise the AUX channel so that you can tell the app where on the brain (or body) the AUX channel is connected, and it could process the data differently based on that?

Migrate from Cloud Build / App Engine to Firebase workflow action

  • Once we have merged PR:#24, we will want to add a Firebase workflow action to automatically build and deploy the new version to Firebase. This will require us to set the Github Secret with the firebase command interface TOKEN.

References: here, here, and the YAML file here

Only the steps under: "Uninstalling the Google Cloud Build app" -- let me know if you are able to do this.

ML5 Prediction Logic for Buttons.

As introduced in #102

TODO: introduce logic so the predict button is only active after x samples of each, and the samples buttons are inactive during the prediction, and add a button to stop prediction?
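The enabling/disabling logic in that TODO could be sketched as a pure function (names and the minimum-sample threshold here are illustrative assumptions, not the actual component's state shape):

```javascript
// Sketch: derive button states from sample counts and prediction status.
const MIN_SAMPLES = 10; // illustrative threshold ("x samples of each")

function buttonStates({ samplesA, samplesB, predicting }) {
  return {
    // Predict is only active once both classes have enough samples
    predictEnabled:
      !predicting && samplesA >= MIN_SAMPLES && samplesB >= MIN_SAMPLES,
    // Sample-collection buttons are inactive while prediction runs
    sampleEnabled: !predicting,
    // A stop button is only meaningful during prediction
    stopEnabled: predicting,
  };
}
```

Keeping this as a pure function of state would also make it trivial to unit test alongside the component.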

Unify all css in index.html

All CSS rules should be listed in public/index.html. We do not need a separate set of CSS files included from the src/ directory.
