As an addendum to this, I also noticed that bad-by-dropout channels aren't added to the initial set of "bad" channels during re-referencing along with bad-by-NaN and bad-by-flat channels:
(See EEG-Clean-Tools/PrepPipeline/utilities/robustReference.m, lines 37 to 42 at commit 3ed337e.)
Since the initial average referencing would add signal to the previously flat "dropout" regions, wouldn't excluding those channels from the first pre-reference noisy-detection pass prevent them from being detected later, except perhaps indirectly by the bad-by-correlation or bad-by-RANSAC detectors?
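For concreteness, here is a minimal Python sketch of the asymmetry being described. All function names and thresholds here are hypothetical, not the actual PREP or PyPREP API: the initial seed contains NaN and flat channels but not dropout channels, even though a dropout detector would flag them.

```python
import numpy as np

# Illustrative sketch only -- function names and thresholds are hypothetical,
# not the actual PREP/PyPREP implementation.

def find_bad_by_nan(signals):
    """Channels containing any NaN samples."""
    return {ch for ch, sig in signals.items() if np.isnan(sig).any()}

def find_bad_by_flat(signals, tol=1e-15):
    """Channels that are flat over the entire recording."""
    return {ch for ch, sig in signals.items() if np.nanstd(sig) < tol}

def find_bad_by_dropout(signals, win=100, tol=1e-15, frac=0.25):
    """Channels that are flat in at least `frac` of fixed-length windows."""
    bad = set()
    for ch, sig in signals.items():
        windows = [sig[i:i + win] for i in range(0, len(sig) - win + 1, win)]
        n_flat = sum(np.nanstd(w) < tol for w in windows)
        if n_flat / len(windows) >= frac:
            bad.add(ch)
    return bad

def initial_bad_set(signals):
    """Seed for the robust-reference loop: NaN and flat channels only.
    Dropout channels are deliberately left out, mirroring the behavior
    questioned above."""
    return find_bad_by_nan(signals) | find_bad_by_flat(signals)
```

A channel that is flat for, say, half the recording is caught by `find_bad_by_dropout` but not by `initial_bad_set`; once the first average reference adds signal to its flat windows, even the dropout detector may no longer flag it.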
from eeg-clean-tools.
Oh, and related to these: in the same updateBadChannels code above, bad-by-SNR channels aren't added to ref.all on each loop either.

On its face this wouldn't be an issue, since bad-by-SNR channels are just channels bad by both HF noise and low correlation. A potentially unexpected consequence, however, is that bad-by-SNR channels are added to the initial referenceOut.badChannels state along with the bad-by-NaN and bad-by-flat channels, but are then immediately removed from the set of all bad channels on the first pass of updateBadChannels.
In one of our test files there's a channel that's initially bad-by-SNR that's no longer bad by any metric after initial referencing, so it ends up being considered good enough to use to estimate the signals of bad channels during interpolation while still being considered too bad to include in the average reference calculation! I'm not sure what the correct behavior here should be, but this seems unintentional.
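A minimal sketch of this bookkeeping may make the sequence clearer. None of these names are the real PREP/PyPREP identifiers; the point is only that the SNR channels make it into the seed, but the per-iteration rebuild never re-adds them.

```python
# Hypothetical sketch; none of these names are actual PREP/PyPREP identifiers.

def bad_by_snr(bad_by_hf_noise, bad_by_low_correlation):
    """A 'bad-by-SNR' channel is bad by BOTH high-frequency noise
    and low correlation."""
    return set(bad_by_hf_noise) & set(bad_by_low_correlation)

def seed_bad_channels(nan_chs, flat_chs, hf_chs, corr_chs):
    """Initial bad set: NaN, flat, and SNR channels."""
    return set(nan_chs) | set(flat_chs) | bad_by_snr(hf_chs, corr_chs)

def update_bad_channels(nan_chs, flat_chs, per_iteration_detections):
    """Per-iteration rebuild of the set of all bad channels.  Because
    bad-by-SNR results are never folded back in, a channel seeded only
    via SNR silently leaves the bad set on the first pass if no other
    detector flags it after referencing."""
    all_bad = set(nan_chs) | set(flat_chs)
    for detected in per_iteration_detections:
        all_bad |= set(detected)
    return all_bad
```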
The updateBadChannels problem mentioned at the beginning of this thread is an issue and will be fixed in version 0.56.0. (Thanks for finding it.)
I don't think the second issue mentioned here is an actual bug but a "feature". Robust referencing is a "chicken-and-egg" situation. You can't remove additive noise without a good "true" average reference, but you can't get a "true" average reference without removing additive noise. The problem is how to get the process started.
The second thing to notice here is that identifying "bad" channels is not exactly the same thing as identifying the channels needed to compute a good reference. The phenomenon you mentioned is one of the things that Prep is trying to address --- not rejecting channels that shouldn't necessarily be rejected.
The initial pass only rejects channels that are "globally" bad --- that is, channels with globally large amplitudes, poor overall correlation, or signals containing NaNs or zeros. Dropout is a "low amplitude in a sufficient number of windows" feature, and it can be sensitive to the removal of additive noise. RANSAC is also very unstable in the presence of noise, so it is not used in the initial setup either.
On each iteration, a completely new set of bad channels is computed once the process has started. It is not uncommon for channels that were initially bad to be "okay" once the robust reference is subtracted out.
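The iteration being described can be sketched as a small fixed-point loop. This is a toy illustration under loud assumptions: real PREP interpolates bad channels before averaging and recomputes a full battery of detectors, whereas this sketch simply drops bad channels and uses a single robust amplitude test.

```python
import numpy as np

# Toy illustration only -- real PREP interpolates bad channels before
# averaging and uses a full detector battery.  The point shown here is
# that the bad set is rebuilt from scratch on every pass until it
# stabilizes, so an initially-bad channel can turn out "okay" once the
# robust reference is subtracted, and vice versa.

def initial_unusable(data):
    """Seed: only 'globally' unusable channels (NaNs or a flat signal)."""
    return {ch for ch, s in data.items()
            if np.isnan(s).any() or np.std(s) == 0}

def detect_noisy(data, z_thresh=10.0):
    """Toy 'bad-by-deviation': channels whose typical amplitude is a
    robust outlier relative to the other channels."""
    amps = {ch: np.median(np.abs(s)) for ch, s in data.items()}
    med = np.median(list(amps.values()))
    mad = np.median([abs(a - med) for a in amps.values()]) + 1e-12
    return {ch for ch, a in amps.items() if (a - med) / mad > z_thresh}

def robust_reference(data, max_iter=10):
    bad = initial_unusable(data)
    reference = None
    for _ in range(max_iter):
        good = [ch for ch in data if ch not in bad]
        reference = np.mean([data[ch] for ch in good], axis=0)
        referenced = {ch: s - reference for ch, s in data.items()}
        # A completely new bad set is computed on each iteration.
        new_bad = initial_unusable(data) | detect_noisy(referenced)
        if new_bad == bad:
            break  # bad set is stable -> reference has converged
        bad = new_bad
    return reference, bad
```

In this toy, a high-amplitude channel is excluded after the first pass, and the final reference is the mean of the remaining channels.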
> The second thing to notice here is that identifying "bad" channels is not exactly the same thing as identifying the channels needed to compute a good reference. The phenomenon you mentioned is one of the things that Prep is trying to address --- not rejecting channels that shouldn't necessarily be rejected.
If this is in reference to my 3rd post about bad-by-SNR, I think I should clarify: what we observed was PREP rejecting a channel that shouldn't necessarily be rejected, since the 'bad-by-SNR' channel was no longer bad by any metric after the initial re-reference, but was still interpolated at the end because it was included in the list of "unusable" channels (which get interpolated no matter what). If the channel remains "good" after initial average referencing, and it's not inherently unusable in the same way that flat and NaN-containing channels are, wouldn't we want to retain it during final interpolation?
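The inconsistency described can be reduced to two set operations. This is a hypothetical sketch (the names are illustrative, not actual PREP fields), showing how a channel seeded into the "unusable" list can simultaneously be used to estimate other channels' signals and be interpolated itself.

```python
# Hypothetical sketch; names are illustrative, not actual PREP fields.

def final_interpolation_set(unusable, bad_after_referencing):
    """'Unusable' channels are interpolated unconditionally, even if every
    detector considers them clean after the initial average reference."""
    return set(unusable) | set(bad_after_referencing)

def estimation_set(all_channels, bad_after_referencing):
    """Channels used to estimate the signals of bad channels exclude only
    the currently-bad ones -- the 'unusable' seed is not consulted here."""
    return set(all_channels) - set(bad_after_referencing)
```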
> The initial pass only rejects channels that are "globally" bad --- that is, channels with globally large amplitudes, poor overall correlation, or signals containing NaNs or zeros.
When we reimplemented this code for PyPREP, we found that the MATLAB PREP code seems to exclude only bad-by-NaN, bad-by-flat-signal, and bad-by-SNR channels (i.e., channels bad by both high-frequency noise ratio and by having more than 1% low-correlation windows). Were bad-by-deviation channels meant to be included in that initial set as well?