ssloxford / seeing-red

Using PPG Obtained via Smartphone Cameras for Authentication

License: Other

Dockerfile 0.36% Python 49.78% Swift 49.86%
biometrics biometrics-systems research security systems-security

seeing-red's Introduction

Seeing Red: PPG Biometrics Using Smartphone Cameras

This repository contains the code for the paper "Seeing Red: PPG Biometrics Using Smartphone Cameras", published in the Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) at the 15th IEEE Computer Society Workshop on Biometrics. This work is a collaboration between Giulio Lovisotto, Henry Turner, Simon Eberz and Ivan Martinovic from the System Security Lab at the University of Oxford.

Idea

In this work we investigated the use of photoplethysmography (PPG) for authentication. An individual's PPG signal can be extracted by taking a video with a smartphone camera while the user places their finger over the lens. The blood flowing through the finger changes the reflective properties of the skin, which appears as subtle changes in the video's color.

We collected PPG signals from 15 participants over several sessions (6-11 per participant); in each session the participant placed their finger on the camera while a 30-second video was recorded. We extract the raw value of the LUMA component of each video frame to obtain the underlying PPG signal. The signals are then preprocessed with a set of filters to remove trends and high-frequency components, and then each individual heartbeat is separated with a custom algorithm.
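The extraction idea can be illustrated with a minimal sketch: average the luma of each frame to get one PPG sample per frame. The Rec. 601 RGB weights and the toy frame source below are illustrative assumptions; the repository's actual implementation reads the LUMA component from the encoded video in its signal-processing scripts.

```python
import numpy as np

# Rec. 601 luma weights for R, G, B (an assumption for this sketch; the
# pipeline in the paper uses the video's LUMA component directly).
LUMA_WEIGHTS = np.array([0.299, 0.587, 0.114])

def frame_luma(frame: np.ndarray) -> float:
    """Mean luma of an HxWx3 RGB frame: one PPG sample per frame."""
    return float((frame @ LUMA_WEIGHTS).mean())

def ppg_from_frames(frames) -> np.ndarray:
    """Stack per-frame luma values into a raw PPG signal."""
    return np.array([frame_luma(f) for f in frames])

# Toy "video" of two solid-color frames: a darker red one and a white one.
red = np.full((4, 4, 3), [255.0, 0.0, 0.0])
white = np.full((4, 4, 3), [255.0, 255.0, 255.0])
signal = ppg_from_frames([red, white])
print(signal)  # brighter frames yield larger luma values
```

Subtle periodic variations in this per-frame luma series are what the filtering and beat-separation steps then isolate.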

We designed a set of features that capture the distinctiveness of each individual's PPG signal, and we evaluated the authentication performance with a set of experiments (see Reproduce Evaluation).

See the conference presentation slides

Dataset

The dataset used for this paper has been published on ORA and can be freely downloaded. It contains a set of videos for the 14 participants who consented to their data being shared (ethics approval number SSD/CUREC1A CS_C1A_19_032). Each video is a 30-second recording taken while the participant kept their index finger on the smartphone camera; see a preview here. The dataset was collected using a custom-built app on an iPhone X; the iOS application source code is available in this repository.

Reproduce Evaluation

The code runs inside a Docker container and requires docker and docker-compose to be installed on your system.

You might be able to make this work in a generic Python/Anaconda environment with some effort.

To reproduce the evaluation, follow these steps:

  1. Read the paper - this is the only way you will understand what you are doing
  2. Clone this repository
  3. Download the dataset used in the paper, unzip the archive and place the downloaded videos folder in seeing-red/data/
  4. Build and start the container by running docker-compose up -d
  5. Attach to the container with docker attach seeingred_er
  6. In the container, cd /home/code and run the entire signal analysis pipeline with python signal_run_all.py

Results will be produced in several subfolders in seeing-red/data/.

Read EER Results

Resulting Equal Error Rates (EERs) are produced by three functions defined in classify.py and saved in subfolders under seeing-red/data/results/<expid>.

  • exp1 produces the results used in paper Section 5.1: Multi-class Case, saved in data/results/exp1/all.npy and data/results/exp1/all-fta.npy
  • exp3 produces the results used in paper Section 5.2: One-class Case, saved in data/results/exp3/all.npy and data/results/exp3/all-fta.npy
  • exp4 produces the results used in paper Section 5.3: One-class Cross-Session, saved in data/results/exp4/all.npy and data/results/exp4/all-fta.npy

N.B.: paper Section 5.4: EER User Distribution re-uses the results from exp3 and exp4.

A file results/<expid>/all.npy is a multidimensional numpy array containing EER measurements; each array dimension is described by the descr.json file contained in the same folder.

For example, if you load the result file for exp1 and its description file, you can read results this way:

import numpy as np
import json
# load the file
eers = np.load("/home/data/results/exp1/all.npy")  
# load the description for the result file
descr = json.load(open("/home/data/results/exp1/descr.json"))  

# "header" in descr describes the dimensions of the eers array
# the number of dimensions of eers should match the length of the header
assert len(descr["header"]) == len(eers.shape)

# ["fold", "clf", "window_size", "user"]
print(descr["header"])  
# should be (2, 3, 5, 14) for exp1
print(eers.shape)  

# let's print an EER for a specific instance
# select one index across each dimension
fold_index = 0
# one of ["SVM", "GBT", "RFC"]
clf_index = descr["clf"].index("SVM")  
# one of [1, 2, 5, 10, 20]
aws_index = descr["window_size"].index(5)  
usr_index = 3 
print("The EER measured for fold %d, classifier %s, aggregation window size of %d and user %d is %.4f" % (
          fold_index, descr["clf"][clf_index], descr["window_size"][aws_index], usr_index, eers[fold_index, clf_index, aws_index, usr_index]))

In the paper, to get an EER for a (classifier, aggregation window size) pair, we take the average across folds and across users:

## let's take "SVM" and aggregation window size of 5
# load the result file and its description (as in the previous snippet)
eers = np.load("/home/data/results/exp1/all.npy")
descr = json.load(open("/home/data/results/exp1/descr.json"))
# one of ["SVM", "GBT", "RFC"]
chosen_clf = "SVM"  
# one of [1, 2, 5, 10, 20]
chosen_aws = 5  
clf_index = descr["clf"].index(chosen_clf)
aws_index = descr["window_size"].index(chosen_aws)
eers = eers[:, clf_index, aws_index, :]
# average across folds first, then take the mean and std across users
eers_mean = eers.mean(axis=0).mean(axis=-1)  
eers_std = eers.mean(axis=0).std(axis=-1)
print("The average EER measured for exp1 using %s and aggregation window size of %d is %.4f with standard deviation of %.4f" % (
           chosen_clf, chosen_aws, eers_mean, eers_std))

Citation

If you use this repository please cite the paper as follows:

@INPROCEEDINGS{9150630,
  author={G. {Lovisotto} and H. {Turner} and S. {Eberz} and I. {Martinovic}},
  booktitle={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)}, 
  title={Seeing Red: PPG Biometrics Using Smartphone Cameras}, 
  year={2020},
  volume={},
  number={},
  pages={3565-3574},
  doi={10.1109/CVPRW50498.2020.00417}}

Acknowledgements

This work was generously supported by a grant from Mastercard and by the Engineering and Physical Sciences Research Council [grant numbers EP/N509711/1, EP/P00881X/1].

seeing-red's People

Contributors

giuliolovisotto, hcmturner


seeing-red's Issues

Error while reproducing.

I am trying to run the Dockerfile in VS Code on my MacBook Pro (M2) running macOS Sonoma, but I keep getting an error while the build tries to install numpy, scipy, etc. I am attaching the exact errors below.

=> ERROR [seeingred 12/16] RUN pip3 install numpy==1.15.4 scipy==1.4.0 pandas==1.0.4 matplotlib Pillow scikit-image scikit-video scikit-learn==0.23.2

[seeingred 12/16] RUN pip3 install numpy==1.15.4 scipy==1.4.0 pandas==1.0.4 matplotlib Pillow scikit-image scikit-video scikit-learn==0.23.2:
0.943 Collecting numpy==1.15.4
1.017 Downloading numpy-1.15.4.zip (4.5 MB)
1.343 Preparing metadata (setup.py): started
1.455 Preparing metadata (setup.py): finished with status 'done'
1.645 Collecting scipy==1.4.0
1.661 Downloading scipy-1.4.0.tar.gz (24.6 MB)
3.148 Installing build dependencies: started
52.69 Installing build dependencies: finished with status 'done'
52.70 Getting requirements to build wheel: started
52.83 Getting requirements to build wheel: finished with status 'done'
52.83 Preparing metadata (pyproject.toml): started
62.84 Preparing metadata (pyproject.toml): finished with status 'done'
63.06 Collecting pandas==1.0.4
63.07 Downloading pandas-1.0.4.tar.gz (5.0 MB)
63.42 Installing build dependencies: started
65.82 Installing build dependencies: finished with status 'done'
65.82 Getting requirements to build wheel: started
82.27 Getting requirements to build wheel: finished with status 'error'
82.27 ERROR: Command errored out with exit status 1:
82.27 command: /usr/bin/python3 /usr/local/lib/python3.6/dist-packages/pip/_vendor/pep517/in_process/_in_process.py get_requires_for_build_wheel /tmp/tmp_1khl465
82.27 cwd: /tmp/pip-install-mt7vuz7o/pandas_6f945dfbd8d44c48b1ba91d02eacfa0e
82.27 Complete output (301 lines):
82.27 warning: pandas/_libs/algos_take_helper.pxi:51:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:178:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:305:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:432:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:559:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:686:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:813:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:940:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1067:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1194:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1321:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1448:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1575:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1702:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1829:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:1956:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:2083:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:2210:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos_take_helper.pxi:2337:4: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See cython/cython#4310
82.27 warning: pandas/_libs/algos.pyx:117:42: Unknown type declaration 'bint' in annotation, ignoring
82.27 warning: pandas/_libs/groupby.pyx:1104:26: Unreachable code
82.27 performance hint: pandas/_libs/hashing.pyx:106:5: Exception check on 'u32to8_le' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashing.pyx:124:5: Exception check on '_sipround' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashing.pyx:168:21: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashing.pyx:179:17: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashing.pyx:185:17: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:22:5: Exception check on 'append_data_float64' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:31:5: Exception check on 'append_data_int64' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:44:5: Exception check on 'append_data_string' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:57:5: Exception check on 'append_data_uint64' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:545:39: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:658:39: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:852:38: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:962:38: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:1156:37: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable_class_helper.pxi:1269:37: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/hashtable.pyx:166:33: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/parsers.pyx:1614:5: Exception check on '_to_fw_string_nogil' will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/parsers.pyx:1609:27: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27 2. Use an 'int' return type on the function to allow an error code to be returned.
82.27 performance hint: pandas/_libs/parsers.pyx:1702:42: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27
82.27 performance hint: pandas/_libs/parsers.pyx:1726:38: Exception check will always require the GIL to be acquired. Possible solutions:
82.27 1. Declare the function as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
82.27
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 object cached_typ = None
82.27
82.27 arr = self.arr
82.27 chunk = self.dummy
82.27 dummy_buf = chunk.data
82.27 chunk.data = arr.data
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:101:13: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 # On the first pass, we check the output shape to see
82.27 # if this looks like a reduction.
82.27 _check_result_array(res, len(self.dummy))
82.27
82.27 PyArray_SETITEM(result, PyArray_ITER_DATA(it), res)
82.27 chunk.data = chunk.data + self.increment
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:140:21: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 PyArray_SETITEM(result, PyArray_ITER_DATA(it), res)
82.27 chunk.data = chunk.data + self.increment
82.27 PyArray_ITER_NEXT(it)
82.27 finally:
82.27 # so we don't free the wrong memory
82.27 chunk.data = dummy_buf
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:144:17: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 PyArray_SETITEM(result, PyArray_ITER_DATA(it), res)
82.27 chunk.data = chunk.data + self.increment
82.27 PyArray_ITER_NEXT(it)
82.27 finally:
82.27 # so we don't free the wrong memory
82.27 chunk.data = dummy_buf
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:144:17: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27
82.27 self.orig_data = self.buf.data
82.27 self.orig_len = self.buf.shape[0]
82.27 self.orig_stride = self.buf.strides[0]
82.27
82.27 self.buf.data = self.values.data
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:437:16: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27
82.27 self.buf.data = self.values.data
82.27 self.buf.strides[0] = self.stride
82.27
82.27 cdef advance(self, Py_ssize_t k):
82.27 self.buf.data = <char*>self.buf.data + self.stride * k
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:441:16: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27
82.27 cdef move(self, int start, int end):
82.27 """
82.27 For slicing
82.27 """
82.27 self.buf.data = self.values.data + self.stride * start
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:447:16: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 self.buf.shape[0] = length
82.27
82.27 cdef reset(self):
82.27
82.27 self.buf.shape[0] = self.orig_len
82.27 self.buf.data = self.orig_data
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:456:16: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 # move blocks
82.27 for i in range(self.nblocks):
82.27 arr = self.blocks[i]
82.27
82.27 # axis=1 is the frame's axis=0
82.27 arr.data = self.base_ptrs[i] + arr.strides[1] * start
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:573:15: Assignment to a read-only property
82.27
82.27 Error compiling Cython file:
82.27 ------------------------------------------------------------
82.27 ...
82.27 # reset blocks
82.27 for i in range(self.nblocks):
82.27 arr = self.blocks[i]
82.27
82.27 # axis=1 is the frame's axis=0
82.27 arr.data = self.base_ptrs[i]
82.27 ^
82.27 ------------------------------------------------------------
82.27
82.27 pandas/_libs/reduction.pyx:592:15: Assignment to a read-only property
82.27 Compiling pandas/_libs/algos.pyx because it changed.
82.27 Compiling pandas/_libs/groupby.pyx because it changed.
82.27 Compiling pandas/_libs/hashing.pyx because it changed.
82.27 Compiling pandas/_libs/hashtable.pyx because it changed.
82.27 Compiling pandas/_libs/index.pyx because it changed.
82.27 Compiling pandas/_libs/indexing.pyx because it changed.
82.27 Compiling pandas/_libs/internals.pyx because it changed.
82.27 Compiling pandas/_libs/interval.pyx because it changed.
82.27 Compiling pandas/_libs/join.pyx because it changed.
82.27 Compiling pandas/_libs/lib.pyx because it changed.
82.27 Compiling pandas/_libs/missing.pyx because it changed.
82.27 Compiling pandas/_libs/parsers.pyx because it changed.
82.27 Compiling pandas/_libs/reduction.pyx because it changed.
82.27 Compiling pandas/_libs/ops.pyx because it changed.
82.27 Compiling pandas/_libs/ops_dispatch.pyx because it changed.
82.27 Compiling pandas/_libs/properties.pyx because it changed.
82.27 Compiling pandas/_libs/reshape.pyx because it changed.
82.27 Compiling pandas/_libs/sparse.pyx because it changed.
82.27 Compiling pandas/_libs/tslib.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/c_timestamp.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/ccalendar.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/conversion.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/fields.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/frequencies.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/nattype.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/np_datetime.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/offsets.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/parsing.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/period.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/resolution.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/strptime.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/timedeltas.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/timestamps.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/timezones.pyx because it changed.
82.27 Compiling pandas/_libs/tslibs/tzconversion.pyx because it changed.
82.27 Compiling pandas/_libs/testing.pyx because it changed.
82.27 Compiling pandas/_libs/window/aggregations.pyx because it changed.
82.27 Compiling pandas/_libs/window/indexers.pyx because it changed.
82.27 Compiling pandas/_libs/writers.pyx because it changed.
82.27 Compiling pandas/io/sas/sas.pyx because it changed.
82.27 [ 1/40] Cythonizing pandas/_libs/algos.pyx
82.27 [ 2/40] Cythonizing pandas/_libs/groupby.pyx
82.27 [ 3/40] Cythonizing pandas/_libs/hashing.pyx
82.27 [ 4/40] Cythonizing pandas/_libs/hashtable.pyx
82.27 [ 5/40] Cythonizing pandas/_libs/index.pyx
82.27 [ 6/40] Cythonizing pandas/_libs/indexing.pyx
82.27 [ 7/40] Cythonizing pandas/_libs/internals.pyx
82.27 [ 8/40] Cythonizing pandas/_libs/interval.pyx
82.27 [ 9/40] Cythonizing pandas/_libs/join.pyx
82.27 [10/40] Cythonizing pandas/_libs/lib.pyx
82.27 [11/40] Cythonizing pandas/_libs/missing.pyx
82.27 [12/40] Cythonizing pandas/_libs/ops.pyx
82.27 [13/40] Cythonizing pandas/_libs/ops_dispatch.pyx
82.27 [14/40] Cythonizing pandas/_libs/parsers.pyx
82.27 [15/40] Cythonizing pandas/_libs/properties.pyx
82.27 [16/40] Cythonizing pandas/_libs/reduction.pyx
82.27 Traceback (most recent call last):
82.27 File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in
82.27 main()
82.27 File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
82.27 json_out['return_val'] = hook(**hook_input['kwargs'])
82.27 File "/usr/local/lib/python3.6/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
82.27 return hook(config_settings)
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 163, in get_requires_for_build_wheel
82.27 config_settings, requirements=['wheel'])
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 143, in _get_build_requires
82.27 self.run_setup()
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 268, in run_setup
82.27 self).run_setup(setup_script=setup_script)
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 158, in run_setup
82.27 exec(compile(code, file, 'exec'), locals())
82.27 File "setup.py", line 757, in
82.27 ext_modules=maybe_cythonize(extensions, compiler_directives=directives),
82.27 File "setup.py", line 542, in maybe_cythonize
82.27 return cythonize(extensions, *args, **kwargs)
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/Cython/Build/Dependencies.py", line 1154, in cythonize
82.27 cythonize_one(*args)
82.27 File "/tmp/pip-build-env-fu_79r0m/overlay/lib/python3.6/site-packages/Cython/Build/Dependencies.py", line 1321, in cythonize_one
82.27 raise CompileError(None, pyx_file)
82.27 Cython.Compiler.Errors.CompileError: pandas/_libs/reduction.pyx
82.27 ----------------------------------------
82.27 WARNING: Discarding https://files.pythonhosted.org/packages/53/87/6438c197fc70ca6b3056cfb60b3dfedca25bedb631bce1f72d6a10502d15/pandas-1.0.4.tar.gz#sha256=b35d625282baa7b51e82e52622c300a1ca9f786711b2af7cbe64f1e6831f4126 (from https://pypi.org/simple/pandas/) (requires-python:>=3.6.1). Command errored out with exit status 1: /usr/bin/python3 /usr/local/lib/python3.6/dist-packages/pip/_vendor/pep517/in_process/_in_process.py get_requires_for_build_wheel /tmp/tmp_1khl465 Check the logs for full command output.
82.27 ERROR: Could not find a version that satisfies the requirement pandas==1.0.4 (from versions: 0.1, 0.2, 0.3.0, 0.4.0, 0.4.1, 0.4.2, 0.4.3, 0.5.0, 0.6.0, 0.6.1, 0.7.0, 0.7.1, 0.7.2, 0.7.3, 0.8.0, 0.8.1, 0.9.0, 0.9.1, 0.10.0, 0.10.1, 0.11.0, 0.12.0, 0.13.0, 0.13.1, 0.14.0, 0.14.1, 0.15.0, 0.15.1, 0.15.2, 0.16.0, 0.16.1, 0.16.2, 0.17.0, 0.17.1, 0.18.0, 0.18.1, 0.19.0, 0.19.1, 0.19.2, 0.20.0, 0.20.1, 0.20.2, 0.20.3, 0.21.0, 0.21.1, 0.22.0, 0.23.0, 0.23.1, 0.23.2, 0.23.3, 0.23.4, 0.24.0, 0.24.1, 0.24.2, 0.25.0, 0.25.1, 0.25.2, 0.25.3, 1.0.0, 1.0.1, 1.0.2, 1.0.3, 1.0.4, 1.0.5, 1.1.0, 1.1.1, 1.1.2, 1.1.3, 1.1.4, 1.1.5)
82.27 ERROR: No matching distribution found for pandas==1.0.4


failed to solve: process "/bin/sh -c pip3 install numpy==1.15.4 scipy==1.4.0 pandas==1.0.4 matplotlib Pillow scikit-image scikit-video scikit-learn==0.23.2" did not complete successfully: exit code: 1

How to define `max_bpm` if `frame_rate = 30`?

Hi there, thank you for your great work. I've learnt a lot so far.

By default, my iPhone records the finger video at 30 fps. If `frame_rate` is changed from 240 to 30, what should `max_bpm` be: 30, or still 120 as the paper suggests? Thank you.
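
A hedged note on the relationship, assuming `max_bpm` only bounds the expected heart rate during beat separation rather than anything tied to the camera: the frame rate sets the sampling rate of the PPG signal, and by the Nyquist criterion even 30 fps can represent any physiological heart rate, so a cap like 120 bpm remains sensible. A minimal check (the helper name is hypothetical):

```python
# Hypothetical helper: the highest heart rate a given frame rate can
# faithfully represent is the Nyquist frequency (frame_rate / 2), in bpm.
def nyquist_max_bpm(frame_rate_hz):
    return frame_rate_hz / 2.0 * 60.0

# Even at 30 fps the Nyquist limit is 900 bpm, far above any real heart
# rate, so max_bpm need not shrink along with the frame rate.
assert nyquist_max_bpm(30) == 900.0
assert nyquist_max_bpm(240) == 7200.0
```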

Error reproducing

Hi!

I went through the Reproduce Evaluation steps on a MacBook Pro running Big Sur, and I am currently getting this error at step 6.

Traceback (most recent call last):
  File "signal_extractor.py", line 241, in <module>
    for file in sorted(user_files))
  File "/usr/local/lib/python3.6/dist-packages/joblib/parallel.py", line 1054, in __call__
    self.retrieve()
  File "/usr/local/lib/python3.6/dist-packages/joblib/parallel.py", line 933, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/usr/local/lib/python3.6/dist-packages/joblib/_parallel_backends.py", line 542, in wrap_future_result
    return future.result(timeout=timeout)
  File "/usr/lib/python3.6/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/usr/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
joblib.externals.loky.process_executor.TerminatedWorkerError: A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker.

Any advice would be appreciated. The closest related thread I found is scikit-learn-contrib/skope-rules#18. `args.number_of_cpus` is 2.

Edit: when I remove the Parallel call on line 239 of signal_extractor.py and run the loop sequentially, the issue disappears. Probably something to look into.
Thanks!
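
A hedged sketch of the sequential fallback described above, assuming the crash comes from joblib's loky worker processes being killed on this macOS setup rather than from the extraction code itself. With `n_jobs=1`, joblib runs every task in the parent process and never spawns the workers that get terminated:

```python
# Minimal joblib example (stand-in task, not the repository's extraction
# function): n_jobs=1 executes sequentially in the current process, which
# sidesteps loky worker segfaults at the cost of parallel speedup.
from joblib import Parallel, delayed

def square(x):
    return x * x

results = Parallel(n_jobs=1)(delayed(square)(i) for i in range(5))
assert results == [0, 1, 4, 9, 16]
```

Switching the backend (`Parallel(n_jobs=2, backend="threading")`) is a middle ground that keeps some concurrency without separate processes.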

IndexError: index 209 is out of bounds for axis 0 with size 200 - feature_extractor.py

Hi,
I am testing the code with some PPG signals from an oximeter, saved as a CSV in the "extracted" folder. I ran, in order: "signal_preprocessor.py", "signal_beat_separation.py", "signal_fiducial_points_detection.py", "fta_average_beat.py", "signal_beat_fta.py",

but when "feature_extractor.py" tries to extract the features, I get the following error:

root@baa26fd16149:/home/code# python feature_extractor.py
001 (1/9), 00101.json, f_cheating
001 (1/9), 00101.json, f_widths
001 (1/9), 00101.json, f_fft
Traceback (most recent call last):
  File "feature_extractor.py", line 309, in <module>
    **fextractor["params"])
  File "feature_extractor.py", line 190, in f_fiducial_points
    dychrotic_notch_value = y[dychrotic_notch_index]
IndexError: index 209 is out of bounds for axis 0 with size 200

I tried changing the array size to something greater than 209, but the problem persists. Can someone help me?

Thanks.
Sincerely
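
A hedged sketch of a guard for this crash, under the assumption that the notch search returns an index past the end of the fixed-length resampled beat when the input signal's sampling rate differs from the 240 fps videos the pipeline expects. Clamping the index avoids the `IndexError`, though resampling the oximeter signal to the expected rate before running the pipeline is the more principled fix:

```python
import numpy as np

# Stand-in for a resampled beat of fixed length 200 (values are illustrative).
y = np.linspace(1.0, 0.0, 200)

# Stand-in for an index computed past the end of the beat, as in the report.
dychrotic_notch_index = 209

# Clamp the index into the valid range before reading the value.
dychrotic_notch_index = min(dychrotic_notch_index, len(y) - 1)
dychrotic_notch_value = y[dychrotic_notch_index]
assert dychrotic_notch_index == 199
```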

Results

To inspect the classification results, how should I use results/all.npy?
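
A minimal loading sketch, assuming `all.npy` stores a pickled Python object (such as a dict of per-experiment scores) rather than a plain array, so `allow_pickle=True` is needed. The file below is a stand-in created just to illustrate the pattern; substitute the real `results/all.npy` path:

```python
import os
import tempfile

import numpy as np

# Create a stand-in results file (np.save pickles non-array objects).
demo = {"eer": 0.08}
path = os.path.join(tempfile.mkdtemp(), "all.npy")
np.save(path, demo)

# Loading: allow_pickle is required for object payloads; .item() unwraps
# the 0-d object array back into the original dict.
results = np.load(path, allow_pickle=True).item()
assert results["eer"] == 0.08
```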
