
mhhennig / hs2


Software for high density electrophysiology

License: GNU General Public License v3.0

C++ 1.69% Python 5.05% Jupyter Notebook 93.10% Cython 0.16%
spike-sorting spike-detection electrophysiology multi-electrode-array neural-recording

hs2's People

Contributors

alejoe91, bptgrm, colehurwitz, frozenblit, jamesjun, janohorvath, lkct, martinosorb, mhhennig


hs2's Issues

Trouble running HS on SpikeInterface

Hi!
I am having some difficulty getting HS to run through SpikeInterface. This is the error I am getting:

The extractor was not dumpable
Error running herdingspikes
Traceback (most recent call last):
File "C:\Users\nasya\anaconda3\envs\herdingspikes\lib\site-packages\spikeinterface\sorters\basesorter.py", line 200, in run_from_folder
SorterClass._run_from_folder(output_folder, sorter_params, verbose)
File "C:\Users\nasya\anaconda3\envs\herdingspikes\lib\site-packages\spikeinterface\sorters\herdingspikes\herdingspikes.py", line 170, in _run_from_folder
recording = st.normalize_by_quantile(
File "C:\Users\nasya\anaconda3\envs\herdingspikes\lib\site-packages\spikeinterface\toolkit\preprocessing\normalize_scale.py", line 196, in normalize_by_quantile
return NormalizeByQuantileRecording(*args, **kwargs)
File "C:\Users\nasya\anaconda3\envs\herdingspikes\lib\site-packages\spikeinterface\toolkit\preprocessing\normalize_scale.py", line 60, in __init__
random_data = get_random_data_chunks(recording, **random_chunk_kwargs)
File "C:\Users\nasya\anaconda3\envs\herdingspikes\lib\site-packages\spikeinterface\toolkit\utils.py", line 36, in get_random_data_chunks
for segment_index in range(recording.get_num_segments()):
AttributeError: 'NoneType' object has no attribute 'get_num_segments'

Here is how I am running it:

sorting_HS = ss.run_herdingspikes(recording, output_folder='results_HS', filter=False, verbose=True)

Whenever I pre-process the recording first and then run the sorter, the kernel dies.

recording_f = st.bandpass_filter(recording, freq_min=100, freq_max=3000)
recording = recording_f.save(format='binary')

I have only taken a small chunk of my data (not even 1 GB) just to try to get it working. I would be very thankful for any hints!

`probe.py` path does not exist error when `RecordingExtractor` is called from SpikeToolkit

The probes and probe_info directories do not exist, and the create_probe_files function throws an error when it tries to write to them. I suggest the following changes to the probe.py functions.

OLD:

def in_probes_dir(file):
    return os.path.join(this_file_path, 'probes', file)

def in_probe_info_dir(file):
    return os.path.join(this_file_path, 'probe_info', file)


NEW (directory created if it does not exist):

def in_probes_dir(file):
    str_dir = os.path.join(this_file_path, 'probes')
    if not os.path.isdir(str_dir):
        os.mkdir(str_dir)
    return os.path.join(str_dir, file)

def in_probe_info_dir(file):
    str_dir = os.path.join(this_file_path, 'probe_info')
    if not os.path.isdir(str_dir):
        os.mkdir(str_dir)
    return os.path.join(str_dir, file)
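For what it's worth, a slightly more compact variant of the same fix uses os.makedirs(exist_ok=True), which also avoids a race when two processes create the directory at once. Here tempfile.mkdtemp() stands in for the package's this_file_path so the sketch is self-contained:

```python
import os
import tempfile

# stand-in for probe.py's this_file_path (an assumption for this sketch)
this_file_path = tempfile.mkdtemp()

def in_probes_dir(file):
    str_dir = os.path.join(this_file_path, 'probes')
    os.makedirs(str_dir, exist_ok=True)  # creates the directory only if missing
    return os.path.join(str_dir, file)

def in_probe_info_dir(file):
    str_dir = os.path.join(this_file_path, 'probe_info')
    os.makedirs(str_dir, exist_ok=True)
    return os.path.join(str_dir, file)
```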

reading multiple files

Sorting requires reading multiple files with detected spikes (same preparation, different raw data segments). This should be done using self.shapecache for shapes, and using the pandas tables for all operations. Needs to keep track of start/end of each file to allow storing in different files afterwards.
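A minimal sketch of the bookkeeping this needs, assuming the per-file spikes already live in pandas tables (the column names below are made up): concatenate them while recording each file's [start, end) row range so results can be split back out per file later.

```python
import pandas as pd

def concat_spike_files(tables):
    """Concatenate per-file spike tables, tracking each file's row range."""
    ranges = []
    rows = 0
    for t in tables:
        ranges.append((rows, rows + len(t)))
        rows += len(t)
    return pd.concat(tables, ignore_index=True), ranges
```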

Save the version info in output files

The final hdf5 file should contain information about the version of HS2 that was used to generate the file.
One option could be the git commit hash. Would this be accessible from a pip install? Alternatively we should start tracking/attaching version numbers.
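Regarding the pip-install question: the git hash is normally not available from a wheel, but the installed package version is, via importlib.metadata, and could then be written as an HDF5 attribute (the attribute name hs2_version is an assumption):

```python
from importlib.metadata import PackageNotFoundError, version

def hs2_version():
    """Best-effort version string for the installed herdingspikes package."""
    try:
        return version("herdingspikes")
    except PackageNotFoundError:
        return "unknown"  # e.g. running from an un-installed source checkout

# when writing the output file (sketch; h5py assumed):
#     h5file.attrs["hs2_version"] = hs2_version()
```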

Come up with better localization algorithm

Current Algorithm

Pros - Easy to implement, works pretty well.

Cons - Subtracting the median biases the result (we need a better way to deal with outliers), and channels at the edge of the probe are not localized correctly.

What are your guys' thoughts on this matter?
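To make the bias concrete, here is a toy centre-of-mass localiser in the spirit of the description above (a sketch, not HS2's actual code): after median subtraction and clipping, channels with sub-median signal get zero weight, so a spike near the probe edge, with fewer neighbours on one side, is pulled inwards.

```python
import numpy as np

def com_localise(positions, amplitudes):
    """Toy centre-of-mass localisation with median subtraction (sketch)."""
    amps = np.asarray(amplitudes, dtype=float)
    amps = amps - np.median(amps)   # the outlier handling that biases results
    amps = np.clip(amps, 0, None)   # negative weights make no sense for a CoM
    pos = np.asarray(positions, dtype=float)
    if amps.sum() == 0:
        return pos.mean(axis=0)     # degenerate case: fall back to the centroid
    return (pos * amps[:, None]).sum(axis=0) / amps.sum()
```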

Installation Issue (Windows)

Hello,

I am trying to install HS2 on Windows, and I receive the following error when I run 'setup.py install':

(C:\Users\HS2env) C:\Windows\System32\HS2>python setup.py install
C:\Users\HS2env\lib\distutils\dist.py:261: UserWarning: Unknown distribution option: 'long_description_content_type'
warnings.warn(msg)
running install
running bdist_egg
running egg_info
writing herdingspikes.egg-info\PKG-INFO
writing dependency_links to herdingspikes.egg-info\dependency_links.txt
writing requirements to herdingspikes.egg-info\requires.txt
writing top-level names to herdingspikes.egg-info\top_level.txt
reading manifest file 'herdingspikes.egg-info\SOURCES.txt'
writing manifest file 'herdingspikes.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_py
running build_ext
building 'herdingspikes.detection_localisation.detect' extension
cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MT -Iherdingspikes/detection_localisation -IC:\Users\HS2env\lib\site-packages\numpy\core\include -IC:\Users\HS2env\include -IC:\Users\HS2env\include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" /EHsc /Tpherdingspikes/detection_localisation/detect.cpp /Fobuild\temp.win-amd64-3.6\Release\herdingspikes/detection_localisation/detect.obj -std=c++11 -O3
error: command 'cl.exe' failed: No such file or directory

Any idea what could be the cause/solution? I cloned this repo directly using git, but I also can't find 'cl.exe' anywhere. Thank you!

Build fails on Windows 10

Hi,
I'm not sure if this project is still maintained, but I can't find a solution anywhere else.

I'm trying to build from source, based on the master branch and following the instructions.

I have a conda environment with Python 3.7.16, Cython 0.29.33 and numpy 1.17.4 (originally 1.21).

Here's the full trace:

Traceback (most recent call last):
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 156, in save_modules
    yield saved
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 198, in setup_context
    yield
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 259, in run_setup
    _execfile(setup_script, ns)
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 46, in _execfile
    exec(code, globals, locals)
  File "C:\Users\MEA\AppData\Local\Temp\easy_install-ebcfop83\pandas-1.5.3\setup.py", line 664, in <module>
  File "C:\Users\MEA\AppData\Local\Temp\easy_install-ebcfop83\pandas-1.5.3\setup.py", line 424, in maybe_cythonize
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\Cython\Build\Dependencies.py", line 1089, in cythonize
    nthreads, initializer=_init_multiprocessing_helper)
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\pool.py", line 176, in __init__
    self._repopulate_pool()
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\pool.py", line 241, in _repopulate_pool
    w.start()
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\multiprocessing\popen_spawn_win32.py", line 68, in __init__
    with open(wfd, 'wb', closefd=True) as to_child:
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 454, in _open
    if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path):
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\setuptools\sandbox.py", line 465, in _ok
    realpath = os.path.normcase(os.path.realpath(path))
  File "C:\Users\MEA\miniconda3\envs\hs2source\lib\ntpath.py", line 527, in abspath
    return normpath(_getfullpathname(path))
TypeError: _getfullpathname: path should be string, bytes or os.PathLike, not int

This always happens as pandas is compiling:
Compiling pandas\io/sas/sas.pyx because it changed.

setuptools installs version 1.5.3:

Installed c:\users\mea\miniconda3\envs\hs2source\lib\site-packages\herdingspikes-0.3.102-py3.7-win-amd64.egg
Processing dependencies for herdingspikes==0.3.102
Searching for pandas
Reading https://pypi.org/simple/pandas/
C:\Users\MEA\miniconda3\envs\hs2source\lib\site-packages\pkg_resources\__init__.py:126: PkgResourcesDeprecationWarning:  is an invalid version and will not be supported in a future release
  PkgResourcesDeprecationWarning,
Downloading https://files.pythonhosted.org/packages/74/ee/146cab1ff6d575b54ace8a6a5994048380dc94879b0125b25e62edcb9e52/pandas-1.5.3.tar.gz#sha256=74a3fd7e5a7ec052f183273dc7b0acd3a863edf7520f5d3a1765c04ffdb3b0b1
Best match: pandas 1.5.3
Processing pandas-1.5.3.tar.gz

The only link which mentions that exact error is this, which recommends downgrading numpy. I progressively went down to version 1.17.4, to no avail.

Could anyone point me to specific versions of Python, numpy and pandas with which the build succeeds?

bug with the 0.3.7 version

Hello,
I have an error when running Herdingspikes on spikeInterface.

Code

sorting_HS = ss.run_herdingspikes(recording_loaded, output_folder='results_HS', 
                                  filter=False, verbose=True)
print('Found', len(sorting_HS.get_unit_ids()), 'units')

Error

herdingspikes version<0.3.99 uses the OLD spikeextractors with NewToOldRecording.
Consider updating herdingspikes (pip install herdingspikes>=0.3.99)
# Generating new position and neighbor files from data file
# Not Masking any Channels
# Sampling rate: 30000
# Localization On
# Number of recorded channels: 32
# Analysing frames: 4428544; Seconds: 147.61813333333333
# Frames before spike in cutout: 9
# Frames after spike in cutout: 54
# tcuts: 39 84
# tInc: 100000
# Analysing frames from -39 to 100084  (0.0%)
# Analysing frames from 99961 to 200084  (2.3%)
# ... (intermediate 100000-frame chunks elided) ...
# Analysing frames from 4399961 to 4428544  (99.4%)
# Detection completed, time taken: 0:00:00.740824
# Time per frame: 0:00:00.000167
# Time per sample: 0:00:00.000005
Loaded 0 spikes.
Error running herdingspikes
Traceback (most recent call last):
  File "/Users/ismail/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/basesorter.py", line 200, in run_from_folder
    SorterClass._run_from_folder(output_folder, sorter_params, verbose)
  File "/Users/ismail/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/herdingspikes/herdingspikes.py", line 226, in _run_from_folder
    uids = C.spikes.cl.unique()
  File "/Users/ismail/opt/anaconda3/envs/si090/lib/python3.8/site-packages/pandas/core/generic.py", line 5583, in __getattr__
    return object.__getattribute__(self, name)
AttributeError: 'DataFrame' object has no attribute 'cl'

/Users/ismail/opt/anaconda3/envs/si090/lib/python3.8/site-packages/herdingspikes/hs2.py:161: UserWarning: Loading an empty file results_HS/HS2_detected.bin . This usually happens when no spikes weredetected due to the detection parameters being set too strictly
  warnings.warn(
---------------------------------------------------------------------------
SpikeSortingError                         Traceback (most recent call last)
Input In [74], in <cell line: 2>()
      1 # run spike sorting on entire recording
----> 2 sorting_HS = ss.run_herdingspikes(recording_loaded, output_folder='results_HS', 
      3                                   filter=False, verbose=True)
      4 print('Found', len(sorting_HS.get_unit_ids()), 'units')

File ~/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/runsorter.py:468, in run_herdingspikes(*args, **kwargs)
    467 def run_herdingspikes(*args, **kwargs):
--> 468     return run_sorter('herdingspikes', *args, **kwargs)

File ~/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/runsorter.py:67, in run_sorter(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, docker_image, singularity_image, with_output, **sorter_params)
     61 if singularity_image is not None:
     62     return run_sorter_container(sorter_name, recording, 'singularity', singularity_image,
     63                                 output_folder=output_folder,
     64                                 remove_existing_folder=remove_existing_folder,
     65                                 delete_output_folder=delete_output_folder, verbose=verbose,
     66                                 raise_error=raise_error, with_output=with_output, **sorter_params)
---> 67 return run_sorter_local(sorter_name, recording, output_folder=output_folder,
     68                         remove_existing_folder=remove_existing_folder,
     69                         delete_output_folder=delete_output_folder,
     70                         verbose=verbose, raise_error=raise_error, with_output=with_output,
     71                         **sorter_params)

File ~/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/runsorter.py:89, in run_sorter_local(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, with_output, **sorter_params)
     86 SorterClass.set_params_to_folder(
     87     recording, output_folder, sorter_params, verbose)
     88 SorterClass.setup_recording(recording, output_folder, verbose=verbose)
---> 89 SorterClass.run_from_folder(output_folder, raise_error, verbose)
     90 if with_output:
     91     sorting = SorterClass.get_result_from_folder(output_folder)

File ~/opt/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/basesorter.py:236, in BaseSorter.run_from_folder(cls, output_folder, raise_error, verbose)
    234 if has_error and raise_error:
    235     print(log['error_trace'])
--> 236     raise SpikeSortingError(
    237         f"Spike sorting failed. You can inspect the runtime trace in {output_folder}/spikeinterface_log.json")
    239 return run_time

SpikeSortingError: Spike sorting failed. You can inspect the runtime trace in results_HS/spikeinterface_log.json

Thank you for your help
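For what it's worth, the failure follows directly from the "Loaded 0 spikes." line: with an empty detection file the clustering step never adds the cl (cluster label) column, so C.spikes.cl raises AttributeError. A defensive sketch of the check the wrapper could make (the column name is taken from the traceback; the rest is assumption):

```python
import pandas as pd

def unit_ids_from_spikes(spikes):
    """Return sorted cluster ids, or [] when nothing was detected/clustered."""
    if spikes.empty or "cl" not in spikes.columns:
        return []
    return sorted(spikes["cl"].unique())
```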

theoretical foundations for clustering

We're using features and locations as if they had the same dimensions. Investigate how different modalities can be clustered in a more principled way.
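A common pragmatic stop-gap, pending something more principled, is to rescale one modality before stacking, so that neither dominates the Euclidean metric; CombinedClustering's alpha parameter appears to play this role for the feature dimensions (that reading is an assumption). A sketch:

```python
import numpy as np

def stack_modalities(locations, features, alpha=0.3):
    """Stack spatial coordinates with alpha-scaled feature dimensions."""
    locations = np.asarray(locations, dtype=float)
    features = np.asarray(features, dtype=float)
    return np.hstack([locations, alpha * features])
```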

clustering using DBSCAN

Just tried to run the clustering with the DBSCAN algorithm. However, I get the following error:

Clustering...
Using 248746 out of 248746 spikes...

TypeError Traceback (most recent call last)
in
----> 1 C.CombinedClustering(alpha=0.4, clustering_algorithm=DBSCAN,cluster_subset=nr_spikes,min_samples=5, eps=0.2, n_jobs=-1)

~\Anaconda3\envs\spikesorting_20\lib\site-packages\herdingspikes\hs2.py in CombinedClustering(self, alpha, clustering_algorithm, cluster_subset, **kwargs)
484 )
485 )
--> 486 clusterer.fit(fourvec[inds])
487 self.NClusters = len(np.unique(clusterer.labels_))
488 print("Number of estimated units:", self.NClusters)

TypeError: fit() missing 1 required positional argument: 'X'
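"fit() missing 1 required positional argument: 'X'" is the classic symptom of fit being called on a class rather than an instance, which would happen if clustering_algorithm=DBSCAN is forwarded uninstantiated (whether CombinedClustering instantiates it internally is an assumption; if it does not, passing DBSCAN(eps=0.2, min_samples=5) instead may help). A minimal reproduction without sklearn:

```python
class Clusterer:                 # stands in for sklearn.cluster.DBSCAN
    def fit(self, X):
        self.labels_ = [0] * len(X)
        return self

def run_clustering(clustering_algorithm, data):
    # fails with the TypeError above when given the class, not an instance
    return clustering_algorithm.fit(data)

ok = run_clustering(Clusterer(), [[0], [1]])       # instance: works
try:
    run_clustering(Clusterer, [[0], [1]])          # class: TypeError about 'X'
    raised = False
except TypeError:
    raised = True
```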

HS on Spike Interface

Hello,
I want to run HS2 on spike interface but receive the following error:

Traceback (most recent call last):

  File "C:\Users\Analysis\Documents\Code\SI_094\test_095.py", line 312, in <module>
    sorting_HS = ss.run_sorter('herdingspikes', recording2Sort, output_folder=r'C:\Users\Analysis\Documents\Code\SpikeInterface95\results_HS', verbose=True)
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\spikeinterface\sorters\runsorter.py", line 142, in run_sorter
    return run_sorter_local(**common_kwargs)
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\spikeinterface\sorters\runsorter.py", line 157, in run_sorter_local
    output_folder = SorterClass.initialize_folder(
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\spikeinterface\sorters\basesorter.py", line 101, in initialize_folder
    if not cls.is_installed():
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\spikeinterface\sorters\herdingspikes\herdingspikes.py", line 128, in is_installed
    import herdingspikes as hs
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\herdingspikes\__init__.py", line 4, in <module>
    from herdingspikes.hs2 import HSDetection, HSClustering
  File "C:\Users\Analysis\anaconda3\envs\si_env_094\lib\site-packages\herdingspikes\hs2.py", line 7, in <module>
    from .detection_localisation.detect import detectData
  File "herdingspikes/detection_localisation/detect.pyx", line 1, in init herdingspikes.detection_localisation.detect
    # distutils: language = c++
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

Can you help with this?
Thanks,
Zoe

HerdingSpikes run through SpikeInterface wrongly scales waveforms

As originally pointed out in this issue, running HerdingSpikes through SpikeInterface and then accessing the resulting HSDetection object produces incorrect plots from PlotTracesChannels.

Turns out the localization is fine (assuming the correct electrode width and pitch were passed during probe initialization), it's the trace itself that is wrong.

In hs2.py, on line 316:

plt.plot(
    pos[event.ch][0] + trange_bluered,
    pos[event.ch][1] + event.Shape * scale,
    "r"
)

where pos[event.ch][1] is the y-coordinate of the spike and event.Shape is, I assume, the waveform as a list of deviations from the spike's Amplitude. This is unlike the rest of the waveform plots:

plt.plot(
    pos[n][0] + trange_bluered,
    pos[n][1] + data[start_bluered : start_bluered + cutlen, n] * scale,
    col
)

where the actual raw data is plotted.

This works fine when running only through HerdingSpikes, where the Shape list on my own data typically looks something like:
[ 3 -2 -2 2 15 -9 -13 8 -5 11 2 12 3 2 1 -5].
However, through SpikeInterface, it's closer to this:
[2094 2088 2022 2010 2032 2028 2041 2030 2032 2022 2031 2031 2032 2028 2027 2022 2027 2026 2043 2024], which shifts the waveform up.

From my very empirical attempt at matching the waveform to the raw trace, it seems there's a shift of about +2300 µV in the Shape values. I tried digging into the code to find where that list is populated, but I'm not familiar with C.

My first assumption would be that the problem comes from using RecordingExtractor instead of BioCam for the probe, but as pointed out before, the raw data seems just fine. Do you have any idea what could be the cause?

On a related note, I'm also not sure what the probe parameters inner_radius and neighbor_radius refer to, or why they differ so much between probe object types. Is it because SpikeInterface uses microns for coordinates instead of "pixels" as BioCam does?
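The Shape values sitting around ~2048 look like raw unsigned ADC counts at mid-rail rather than zero-centred voltages, which would explain a constant upward shift (this is a guess; the offset and gain below are illustrative, not BioCam's actual calibration):

```python
import numpy as np

def adc_to_uv(shape, offset=2048.0, gain=1.0):
    """Re-centre raw ADC counts around zero and scale to microvolts (sketch)."""
    return (np.asarray(shape, dtype=float) - offset) * gain
```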

Spikes are sometimes localised to negative coordinates

Samples of spikes localised to negative coordinates before 82b4c4b (notice the timestamps):

Amplitude                                              Shape           ch        t            x            y  
100072     705024  [-37, -29, -54, -18, -24, -68, -14, -14, -129,...   95  2399999 -2147483.648 -2147483.648  
100073     633099  [-99, -69, -93, -32, -4, -5, -29, -13, -18, -1...  108  2399998 -2147483.648 -2147483.648  
100075     756112  [-56, -75, -35, -68, 16, -47, -3, -9, 1, -110,...    1  2399999 -2147483.648 -2147483.648  
100076     580824  [-46, -70, -72, -33, 26, -79, -61, 15, -11, -9...   30  2399999 -2147483.648 -2147483.648  
100077     601208  [-84, -108, -49, -62, -32, -100, -77, -73, -94...   70  2399999 -2147483.648 -2147483.648  
100080     752827  [-119, -142, -91, -77, -40, -44, -13, -35, -10...  100  2399999 -2147483.648 -2147483.648  
100081     747350  [-79, -64, -47, -18, -1, -19, -34, 5, -11, -14...  116  2399999 -2147483.648 -2147483.648  
100083     835248  [-85, -115, -93, -32, 58, -41, -3, -34, -50, -...  125  2399999 -2147483.648 -2147483.648  
150249    1077209  [-286, -287, -222, -218, -162, -204, -185, -25...  110  3899997 -2147483.648 -2147483.648  
150257    1041999  [-202, -200, -140, -211, -186, -288, -263, -26...  126  3899998 -2147483.648 -2147483.648  

Sample of mis-localised spikes after 82b4c4b (potentially a different issue? timestamps arbitrary):

78         615339  [-17, -11, 53, 40, -53, -106, -98, -91, -143, ...   70    9271 -14.485    4.541  
86         950424  [-354, -373, -352, -362, -219, -129, -141, -29...   70    9324  -3.277   25.383  
450       1664220  [-101, -106, -225, -210, -211, -328, -444, -40...   70   13849  -7.899   21.679  
487        916563  [147, 184, 273, 358, 362, 227, 31, -80, -210, ...   64   13989  -0.097   16.366  
1054      1166622  [205, 167, 179, 243, 190, 120, 120, 66, -77, -...    0   30765  -0.026    2.774  
1104       753527  [320, 427, 368, 287, 368, 406, 274, 209, 105, ...  124   30920  -0.060   29.969  
1498       845013  [-83, 3, -7, -28, -85, -61, -186, -214, -235, ...   70   36121  -4.142   19.210  
1574      1194098  [-31, -80, -138, -227, -169, -209, -199, -295,...    0   36614   1.697   -0.708  
1605       857456  [247, 327, 315, 355, 315, 235, 168, 102, 12, -...   64   36799  -1.464   17.022  
2232       910911  [-26, -55, -160, -183, -221, -224, -302, -366,...    0   54761   2.042   -0.292  
2323      1118249  [118, 117, 75, 47, 13, -52, -146, -219, -215, ...    0   55424   0.988   -0.155  
2417      1238955  [102, 148, 231, 303, 223, 173, 0, -235, -332, ...   70   56915  -0.143   18.201  
2476       931762  [-159, -98, -47, -9, -75, 59, 84, 11, -173, -2...    0   57265  -7.302   11.531  
2503       600371  [-68, -81, -123, -109, -171, -147, -197, -157,...    0   57625   0.246   -3.603 
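Notably, -2147483.648 is exactly INT32_MIN / 1000, which suggests (an assumption) an integer "localisation failed" sentinel leaking through the fixed-point scaling into the coordinates. A sketch for filtering such rows out:

```python
import pandas as pd

SENTINEL = -(2 ** 31) / 1000.0   # -2147483.648, presumed failure marker

def drop_unlocalised(spikes):
    """Drop rows whose coordinates carry the presumed sentinel value."""
    bad = (spikes["x"] == SENTINEL) | (spikes["y"] == SENTINEL)
    return spikes[~bad].reset_index(drop=True)
```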

Review example notebooks

  • Where possible, add a wget command that downloads a publicly available dataset for analysis.
  • Find reasonable values for the clustering parameters, so that the result actually looks usable. At the moment, in all the examples, the result that gets plotted is a big red blob of 1 cluster, or similar.

Can't run herdingspikes

Hello,

I have been testing SpikeInterface using the tutorials on GitHub. I tried to sort the dataset provided in the tutorial with herdingspikes, but I get the following error from this code:

sorting_HS = ss.run_herdingspikes(recording_loaded, output_folder='results_HS', filter=False, verbose=True)
print('Found', len(sorting_HS.get_unit_ids()), 'units')

Generating new position and neighbor files from data file
Not Masking any Channels
Sampling rate: 20000
Localization On
Number of recorded channels: 16
Few recording channels: not subtracing mean from activity
Analysing frames: 6000000; Seconds: 300.0
Frames before spike in cutout: 6
Frames after spike in cutout: 36
tcuts: 26 56
tInc: 100000
Detection completed, time taken: 0:00:00.854102
Time per frame: 0:00:00.000142
Time per sample: 0:00:00.000009
Loaded 0 spikes.
Error running herdingspikes
Traceback (most recent call last):
File "/home/brontodegus/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/basesorter.py", line 200, in run_from_folder
SorterClass._run_from_folder(output_folder, sorter_params, verbose)
File "/home/brontodegus/anaconda3/envs/si090/lib/python3.8/site-packages/spikeinterface/sorters/herdingspikes/herdingspikes.py", line 226, in _run_from_folder
uids = C.spikes.cl.unique()
File "/home/brontodegus/anaconda3/envs/si090/lib/python3.8/site-packages/pandas/core/generic.py", line 5575, in __getattr__
return object.__getattribute__(self, name)
AttributeError: 'DataFrame' object has no attribute 'cl'

Also, I tried changing to filter=True, but I still get the same error.

Thanks for your help.

create_probe_files leading to write OSError

I'm working on creating a docker image of HS2 to run with spikeinterface in docker and singularity mode.
Here is PR with the dockerfile: https://github.com/SpikeInterface/spikeinterface-dockerfiles/pull/42/files

The image generated with Dockerfile from the above PR works when running sorter in docker mode, but when I try to run from singularity I'm getting this error:

Traceback (most recent call last):
  File "/home/yu/spikeinterface/spikeinterface/sorters/basesorter.py", line 225, in run_from_folder
    SorterClass._run_from_folder(output_folder, sorter_params, verbose)
  File "/home/yu/spikeinterface/spikeinterface/sorters/herdingspikes/herdingspikes.py", line 183, in _run_from_folder
    Probe = hs.probe.RecordingExtractor(
  File "/usr/local/lib/python3.8/site-packages/herdingspikes/probe.py", line 305, in __init__
    create_probe_files(
  File "/usr/local/lib/python3.8/site-packages/herdingspikes/probe.py", line 27, in create_probe_files
    with open(pos_file, "w") as f:
OSError: [Errno 30] Read-only file system: '/usr/local/lib/python3.8/site-packages/herdingspikes/probe_info/positions_spikeextractor'

The error occurs in this line, which tries to write to a read-only path (in the Singularity context).

version 0.3.99 fails at importing probe utils

@mhhennig

With the latest changes you made to probe.py, herdingspikes fails at import:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-b7cd068ffc71> in <module>
----> 1 import herdingspikes

~/Documents/Codes/spike_sorting/sorters/HS2/herdingspikes/__init__.py in <module>
----> 1 from herdingspikes import probe
      2 
      3 # from herdingspikes.parameter_optimisation import OptimiseParameters
      4 from herdingspikes.hs2 import HSDetection, HSClustering
      5 

~/Documents/Codes/spike_sorting/sorters/HS2/herdingspikes/probe.py in <module>
      3 import json
      4 from matplotlib import pyplot as plt
----> 5 from .probe_functions.readUtils import read_flat, readSiNAPS_S1Probe
      6 from .probe_functions.readUtils import openHDF5file, getHDF5params
      7 from .probe_functions.readUtils import readHDF5t_100, readHDF5t_101

ImportError: cannot import name 'readSiNAPS_S1Probe' from 'herdingspikes.probe_functions.readUtils' (/Users/abuccino/Documents/Codes/spike_sorting/sorters/HS2/herdingspikes/probe_functions/readUtils.py)

Save more relevant information in the .bin files

These files should additionally contain:

  • an identifier at the start, could be 4 bytes, e.g. 'HS2a' for version 'a' (in case we make changes later)
  • followed by one integer for the length of a cut-out
  • for each spike, the true peak amplitude (i.e. the minimum signal)

The latter is quite expensive to compute later on.
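A sketch of what the proposed header could look like, using the struct module (the field order, endianness and int32 type are assumptions for illustration):

```python
import io
import struct

MAGIC = b"HS2a"   # 4-byte identifier, version 'a'

def write_header(f, cutout_length):
    f.write(MAGIC)
    f.write(struct.pack("<i", cutout_length))   # little-endian int32

def read_header(f):
    magic = f.read(4)
    if magic != MAGIC:
        raise ValueError("not an HS2 spike file: %r" % magic)
    (cutout_length,) = struct.unpack("<i", f.read(4))
    return cutout_length
```

Per-spike records carrying the true peak amplitude would then follow the header.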

Version 0.3.100 fails at import

Hi guys, the new version fails at import in the spikeinterface tests with:

import herdingspikes as hs
../../../test_env/lib/python3.8/site-packages/herdingspikes/__init__.py:4: in <module>
    from herdingspikes.hs2 import HSDetection, HSClustering
../../../test_env/lib/python3.8/site-packages/herdingspikes/hs2.py:7: in <module>
    from .detection_localisation.detect import detectData
__init__.pxd:199: in init herdingspikes.detection_localisation.detect
    ???
E   ValueError: numpy.ndarray has the wrong size, try recompiling. Expected 88, got 96

You can check the full log here

Probe should provide units of voltage.

In a similar way to what is done with the sampling rate, the Probe object should tell us what the units of voltage are, so that we can plot in a meaningful way.

bug with HS 0.3.7

Hi Matthias.
to run a benchmark I was trying to run HS on the same dataset as last year for the paper:

I have a bug.

Here the log file from spikeinterface:

{
    "sorter_name": "herdingspikes",
    "sorter_version": "0.3.7+git.45665a2b6438",
    "datetime": "2021-09-10T13:34:01.439598",
    "runtime_trace": [],
    "error": true,
    "error_trace": "Traceback (most recent call last):\n  File \"/home/samuel.garcia/Documents/SpikeInterface/spikeinterface/spikeinterface/sorters/basesorter.py\", line 198, in run_from_folder\n    SorterClass._run_from_folder(output_folder, sorter_params, verbose)\n  File \"/home/samuel.garcia/Documents/SpikeInterface/spikeinterface/spikeinterface/sorters/herdingspikes/herdingspikes.py\", line 212, in _run_from_folder\n    uids = C.spikes.cl.unique()\n  File \"/home/samuel.garcia/.virtualenvs/py38/lib/python3.8/site-packages/pandas/core/generic.py\", line 5465, in __getattr__\n    return object.__getattribute__(self, name)\nAttributeError: 'DataFrame' object has no attribute 'cl'\n",
    "run_time": null
}

The column cl seems to be missing from the dataframe C.spikes...

Speed up post-clustering processing

Building the table of clusters, after clustering, takes a long time, often longer than the actual clustering. This seems unnecessary and needs checking. It's the last bit in the CombinedClustering method.
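If the slow part is a Python loop over cluster ids (an assumption about what the method does), a single pandas groupby usually builds the whole table in one vectorised pass; a sketch with made-up column names:

```python
import pandas as pd

def cluster_table(spikes):
    """Per-cluster summary built in one groupby pass (sketch)."""
    return (spikes.groupby("cl")
                  .agg(Size=("cl", "size"),
                       ctr_x=("x", "mean"),
                       ctr_y=("y", "mean"))
                  .reset_index())
```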

DBSCAN problems

DBSCAN has a different output from MeanShift, there are NaNs and means of empty slices happening because of it.
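One likely source of those NaNs (an assumption): DBSCAN labels noise points -1, whereas MeanShift assigns every point to a cluster, so per-cluster reductions over raw DBSCAN labels can hit empty or meaningless slices. Masking noise first avoids that:

```python
import numpy as np

def cluster_means(labels, values):
    """Per-cluster means, ignoring DBSCAN's noise label (-1)."""
    labels = np.asarray(labels)
    values = np.asarray(values, dtype=float)
    keep = labels >= 0                      # drop noise points
    labels, values = labels[keep], values[keep]
    return {int(l): float(values[labels == l].mean()) for l in np.unique(labels)}
```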

Run `HS2` without probe file

Hello,

I am trying to sort data from Behnke-Fried contacts. However, the exact, or even approximate, geometry of the contacts cannot be determined, so I am unable to run HS2. Is there a workaround for this?
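There is no documented probe-free mode that I am aware of; a workaround some users try is to fabricate a placeholder geometry in which channels are spaced much farther apart than any plausible neighbour radius, which effectively disables cross-channel localisation. A sketch (the spacing value and layout are assumptions, not HS2 defaults):

```python
import numpy as np

n_channels = 8
spacing = 1000.0  # far larger than any plausible neighbor_radius

# Placeholder linear layout: channel i sits at (i * spacing, 0), so no two
# channels are ever treated as neighbours.
positions = np.column_stack([np.arange(n_channels) * spacing,
                             np.zeros(n_channels)])
print(positions.shape)  # → (8, 2)
```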

Version in __version__

Hi Cole and Matthias.
Could you add a __version__ attribute to the herdingspikes module, so the version can be introspected from spikesorters?

We need to report version for benchmark.

At the moment

 spikesorters.print_sorter_version()

gives

herdingspikes: unknown
ironclust: 5.0.8
kilosort: unknown
kilosort2: unknown
klusta: 3.0.16
mountainsort4: unknown
spykingcircus: 0.9.0
tridesclous: 1.4.2

Cheers

Sam
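The "unknown" entries are consistent with spikesorters falling back when a module-level `__version__` is absent; a sketch of that introspection pattern (using a stub module, since this is not the actual spikesorters code):

```python
import types

# Stand-in for a package that ships without a __version__ attribute.
fake_module = types.ModuleType("herdingspikes_stub")
print(getattr(fake_module, "__version__", "unknown"))  # → unknown

# Once the package defines __version__ (e.g. in its __init__.py),
# the same lookup reports a real version.
fake_module.__version__ = "0.3.7"
print(getattr(fake_module, "__version__", "unknown"))  # → 0.3.7
```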

Inconsistent way of passing probe geometry

Currently, the probe geometry is passed to herdingspikes through a NeuralProbe object. However, the positions and neighbors contained in this object don't get passed to the C code, which still reads whatever text files are available in the directory. This is confusing.
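One way to remove the text-file dependency would be to derive the neighbour lists directly from the positions already held by the NeuralProbe object, using a distance threshold. A hedged sketch (positions and radius are illustrative; this is not the existing HS2 code path):

```python
import numpy as np

positions = np.array([[0, 0], [10, 0], [20, 0], [100, 0]], dtype=float)
neighbor_radius = 15.0

# Pairwise distances between all channels, then threshold to get each
# channel's neighbour list (a channel is its own neighbour here).
d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
neighbors = [np.flatnonzero(row <= neighbor_radius) for row in d]
max_neighbors = max(len(n) for n in neighbors)
print(neighbors, max_neighbors)
```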

Localization difference between BioCam and RecordingExtractor objects

Hi,
I'm trying to run Herding Spikes on an object that is then compatible with SpikeInterface and/or Neo objects.
Ideally, I would be able to have the flexibility of using standalone Herding Spikes while still being able to interact with a SpikeInterface pipeline.

My issue was initially detailed here.

Essentially, running HSDetection on the same file with the same parameters results in different outputs depending on whether I use HS2's BioCam object or SI's read_biocam and then feed it through hs.probe.RecordingExtractor, as illustrated here.

It seems to boil down to BioCam and RecordingExtractor having different default parameters for inner_radius, neighbor_radius and masked_channels, and using different files for position and neighbor coordinates.

As detailed here, the neighbormatrix_spikeextractor generated when using RecordingExtractor is very different from the default neighbormatrix_biocam used with BioCam, and as a result the max_neighbors property is different.

From what I've been able to understand, it might be caused by how channel positions are obtained here.
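The reported difference in max_neighbors is consistent with the two code paths simply using different default radii: the same channel positions yield different neighbour matrices as the radius changes. A small demonstration on a hypothetical 2x2 grid (both radius values are illustrative, not the BioCam or RecordingExtractor defaults):

```python
import numpy as np

# 2x2 grid with 42 um pitch (diagonal distance ~59.4 um).
positions = np.array([[0, 0], [42, 0], [0, 42], [42, 42]], dtype=float)
d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)

# A radius between the pitch and the diagonal excludes diagonal neighbours;
# a larger radius includes them, changing max_neighbors.
for radius in (50.0, 80.0):
    counts = (d <= radius).sum(axis=1)
    print(radius, counts.max())
```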

Issue using tutorial jupyter notebook (localised_spikes_clustered-biocam)

Hello,

I am trying to play around with herding spikes installed in a conda environment alongside spike interface. But when I try to run the first cell of the localised_spikes_clustered-biocam.ipynb file, I get the following error:

Downloading data file - around 8GB, so this may take a while, patience...
---------------------------------------------------------------------------
HTTPError                                 Traceback (most recent call last)
Cell In[3], line 10
      8 handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
      9 opener = urllib.request.build_opener(handler)
---> 10 opener.open(file_url)
     11 urllib.request.install_opener(opener)
     12 with urllib.request.urlopen(file_url) as response, open('biocam_data.brw', 'wb') as out_file:

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:525, in OpenerDirector.open(self, fullurl, data, timeout)
    523 for processor in self.process_response.get(protocol, []):
    524     meth = getattr(processor, meth_name)
--> 525     response = meth(req, response)
    527 return response

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:634, in HTTPErrorProcessor.http_response(self, request, response)
    631 # According to RFC 2616, "2xx" code indicates that the client's
    632 # request was successfully received, understood, and accepted.
    633 if not (200 <= code < 300):
--> 634     response = self.parent.error(
    635         'http', request, response, code, msg, hdrs)
    637 return response

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:557, in OpenerDirector.error(self, proto, *args)
    555     http_err = 0
    556 args = (dict, proto, meth_name) + args
--> 557 result = self._call_chain(*args)
    558 if result:
    559     return result

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:496, in OpenerDirector._call_chain(self, chain, kind, meth_name, *args)
    494 for handler in handlers:
    495     func = getattr(handler, meth_name)
--> 496     result = func(*args)
    497     if result is not None:
    498         return result

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:1056, in HTTPBasicAuthHandler.http_error_401(self, req, fp, code, msg, headers)
   1054 def http_error_401(self, req, fp, code, msg, headers):
   1055     url = req.full_url
-> 1056     response = self.http_error_auth_reqed('www-authenticate',
   1057                                       url, req, headers)
   1058     return response

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:1005, in AbstractBasicAuthHandler.http_error_auth_reqed(self, authreq, host, req, headers)
    999             continue
   1001         if realm is not None:
   1002             # Use the first matching Basic challenge.
   1003             # Ignore following challenges even if they use the Basic
   1004             # scheme.
-> 1005             return self.retry_http_basic_auth(host, req, realm)
   1007 if unsupported is not None:
   1008     raise ValueError("AbstractBasicAuthHandler does not "
   1009                      "support the following scheme: %r"
   1010                      % (scheme,))

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:1020, in AbstractBasicAuthHandler.retry_http_basic_auth(self, host, req, realm)
   1018         return None
   1019     req.add_unredirected_header(self.auth_header, auth)
-> 1020     return self.parent.open(req, timeout=req.timeout)
   1021 else:
   1022     return None

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:525, in OpenerDirector.open(self, fullurl, data, timeout)
    523 for processor in self.process_response.get(protocol, []):
    524     meth = getattr(processor, meth_name)
--> 525     response = meth(req, response)
    527 return response

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:634, in HTTPErrorProcessor.http_response(self, request, response)
    631 # According to RFC 2616, "2xx" code indicates that the client's
    632 # request was successfully received, understood, and accepted.
    633 if not (200 <= code < 300):
--> 634     response = self.parent.error(
    635         'http', request, response, code, msg, hdrs)
    637 return response

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:563, in OpenerDirector.error(self, proto, *args)
    561 if http_err:
    562     args = (dict, 'default', 'http_error_default') + orig_args
--> 563     return self._call_chain(*args)

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:496, in OpenerDirector._call_chain(self, chain, kind, meth_name, *args)
    494 for handler in handlers:
    495     func = getattr(handler, meth_name)
--> 496     result = func(*args)
    497     if result is not None:
    498         return result

File D:\Anaconda\envs\spikeInterface\lib\urllib\request.py:643, in HTTPDefaultErrorHandler.http_error_default(self, req, fp, code, msg, hdrs)
    642 def http_error_default(self, req, fp, code, msg, hdrs):
--> 643     raise HTTPError(req.full_url, code, msg, hdrs, fp)

HTTPError: HTTP Error 401: Unauthorized

Any help would be greatly appreciated!
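HTTP 401 here means the server rejected the credentials baked into the notebook, so the likely fix is updated credentials or a new download URL from the data hosts rather than a code change. For completeness, a sketch of the download step with the auth failure surfaced explicitly (URL and credentials below are placeholders):

```python
import urllib.request
import urllib.error

file_url = "https://example.com/biocam_data.brw"  # placeholder URL

# Same basic-auth setup as the notebook, but with the 401 turned into a
# readable message instead of a raw traceback.
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, file_url, "USER", "PASS")  # placeholders
opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(password_mgr))

def fetch(url, opener):
    try:
        with opener.open(url) as response:
            return response.read()
    except urllib.error.HTTPError as e:
        if e.code == 401:
            raise RuntimeError(
                "credentials rejected - check with the data hosts") from e
        raise
```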
