csu-hmc / gaitanalysistoolkit

Tools for the Cleveland State Human Motion and Control Lab

Home Page: http://hmc.csuohio.edu

License: Other

Python 85.67% MATLAB 13.55% Shell 0.78%
biomechanics control gait python

gaitanalysistoolkit's People

Contributors

campanawanna, moorepants, movermeyer, skhnat, spinningplates, tvdbogert


gaitanalysistoolkit's Issues

Loading a compensation trial doesn't really work

If you try to load a compensation trial, there are some issues.

  1. If "stationary-platform" is false, then the compensation_needed() function returns true, but there is no compensation data available to run the compensation.
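A hedged sketch of a guard that would make this failure explicit instead of letting the compensation run without data (the function name comes from the description above; the metadata keys are assumptions):

```python
def compensation_needed(meta):
    """Return True only when compensation is both requested and possible.

    Sketch: `meta` is assumed to be the parsed trial metadata dict with a
    'stationary-platform' flag and an optional 'compensation-file' entry.
    """
    moving = not meta.get('stationary-platform', True)
    has_data = meta.get('compensation-file') is not None
    if moving and not has_data:
        # Fail loudly rather than trying to compensate without data.
        raise ValueError('Platform was moving but no compensation trial '
                         'was provided in the trial metadata.')
    return moving and has_data
```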

Using sparse_a in solve fails

result = solver.solve(sparse_a=True)

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-44-c96f1dfa9907> in <module>()
----> 1 result = solver.solve(sparse_a=True)

/home/moorepants/src/Gait-Analysis-Toolkit/gaitanalysis/controlid.pyc in solve(self, sparse_a, gain_omission_matrix)
    749         x, variance, covariance = self.least_squares(A, b)
    750 
--> 751         deconstructed_solution = self.deconstruct_solution(x, covariance)
    752 
    753         gain_matrices = deconstructed_solution[0]

/home/moorepants/src/Gait-Analysis-Toolkit/gaitanalysis/controlid.pyc in deconstruct_solution(self, x, covariance)
    299         control_vectors_variance = np.zeros((self.n, self.q))
    300 
--> 301         parameter_variance = np.diag(covariance)
    302 
    303         for i in range(self.n):

/home/moorepants/envs/walk/local/lib/python2.7/site-packages/numpy/lib/twodim_base.pyc in diag(v, k)
    287         return v.diagonal(k)
    288     else:
--> 289         raise ValueError("Input must be 1- or 2-d.")
    290 
    291 def diagflat(v, k=0):

ValueError: Input must be 1- or 2-d.
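The traceback suggests that when `sparse_a=True` the covariance returned by the least squares step is not a plain 2-d ndarray (np.diag rejects it with "Input must be 1- or 2-d."). A possible fix, assuming the covariance comes back as a SciPy sparse matrix, is to densify before taking the diagonal:

```python
import numpy as np
import scipy.sparse as sps

def parameter_variance(covariance):
    """Extract the diagonal of a covariance matrix that may be sparse.

    Sketch of a possible fix: np.diag chokes on scipy sparse matrices,
    so convert to a dense array first.
    """
    if sps.issparse(covariance):
        covariance = covariance.toarray()
    return np.diag(np.asarray(covariance))
```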

Move all filtering to one place in the code.

We can wrap Ton's filter that can handle varying sample rates and NaN values and use that as the universal filter.

Note that it is typical to compute inverse dynamics before filtering and filter the results, but we are not doing that currently.

You can filter before or after other linear operations (like differentiation) and get the same effect. But the order matters for non-linear operations, like non-linear inverse dynamics.
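Since low-pass filtering and differentiation are both linear time-invariant operations, their order doesn't matter away from the signal edges. A small sketch demonstrating the claim with SciPy (the filter choice and test signal are illustrative, not what the toolkit uses):

```python
import numpy as np
from scipy.signal import butter, filtfilt

t = np.linspace(0.0, 10.0, 1000)
# Slow "movement" component plus a high-frequency component.
x = np.sin(2.0 * np.pi * 0.5 * t) + 0.1 * np.sin(2.0 * np.pi * 20.0 * t)

b, a = butter(2, 0.1)  # illustrative 2nd-order zero-lag low-pass

# Differentiate then filter, versus filter then differentiate.
diff_then_filt = filtfilt(b, a, np.gradient(x, t))
filt_then_diff = np.gradient(filtfilt(b, a, x), t)

# Away from the edges the two orderings agree.
interior_error = np.max(np.abs(diff_then_filt[100:-100] -
                               filt_then_diff[100:-100]))
```

The same swap is not valid across a non-linear step like non-linear inverse dynamics, which is why the ordering above matters.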

Account for force plate drift on long trials

We record the forces at the beginning and end of a trial with no load on the treadmill and no treadmill motion. These force measurements can be used to account for any drift. We need to look into that.
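One simple way to use those two unloaded measurements is to assume the drift is linear in time and subtract an interpolated offset; a sketch (function and argument names are assumptions):

```python
import numpy as np

def remove_linear_drift(force, time, start_offset, end_offset):
    """Subtract a linearly interpolated offset from a force signal.

    Sketch: `start_offset` and `end_offset` are the mean unloaded force
    readings recorded just before and just after the trial.
    """
    drift = np.interp(time, [time[0], time[-1]], [start_offset, end_offset])
    return force - drift
```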

Rotation method for motek data

It would be nice to have a method in DFlowData that outputs all the data expressed in a different coordinate frame. You could pass in a rotation matrix, and all of the data with xyz coordinates from the mocap system would be output in a different global reference frame.
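A sketch of what such a method might do, assuming marker coordinate columns follow a `<marker>.PosX/.PosY/.PosZ` naming scheme (the column names here are hypothetical, not the actual D-Flow labels):

```python
import numpy as np
import pandas as pd

def rotate_markers(df, rotation):
    """Re-express all marker coordinate columns in a rotated global frame.

    Sketch: `rotation` is a 3x3 rotation matrix applied to each marker's
    (x, y, z) triple; non-coordinate columns are left untouched.
    """
    out = df.copy()
    bases = {c.rsplit('.', 1)[0] for c in df.columns
             if c.endswith(('.PosX', '.PosY', '.PosZ'))}
    for base in bases:
        cols = [base + '.Pos' + axis for axis in 'XYZ']
        # Row vectors times R^T is the same as R times column vectors.
        out[cols] = df[cols].values @ rotation.T
    return out
```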

no spaces in directory or filenames

Sandy,

I think it will be best if we don't use spaces in directory or file names. There are sometimes cross-platform issues, and life will likely be easier if we avoid them.

You can change your directories and have git track it using the git mv command:

git mv Inertial\ Compensation inertial_compensation

Speed

Here is a profile of essentially running DFlowData.clean_data() plus all the methods in WalkingData (including the inverse dynamics computations).

         7641430 function calls (7640347 primitive calls) in 181.271 seconds

   Ordered by: internal time

   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
   320218  165.994    0.001  165.994    0.001 {method 'readline' of 'file' objects}
        3    2.709    0.903    2.709    0.903 {method 'read' of 'pandas.parser.TextReader' objects}
    36844    1.017    0.000    1.017    0.000 {pandas.lib.maybe_convert_objects}
       19    0.879    0.046  168.193    8.852 session.py:494(evaluate)
    36839    0.816    0.000    4.204    0.000 series.py:3129(interpolate)
    37612    0.574    0.000    0.643    0.000 common.py:134(_isnull_ndarraylike)
      120    0.530    0.004    0.532    0.004 fitpack2.py:441(__init__)
    41185    0.385    0.000    0.385    0.000 {method 'reduce' of 'numpy.ufunc' objects}
   320229    0.331    0.000    0.331    0.000 {_codecs.utf_8_decode}
   320226    0.284    0.000    0.777    0.000 {method 'decode' of 'str' objects}
   268534    0.267    0.000    0.485    0.000 {method 'view' of 'numpy.ndarray' objects}
    37998    0.258    0.000    0.851    0.000 series.py:430(__new__)
    37612    0.238    0.000    0.988    0.000 common.py:64(_isnull_new)
951420/951418    0.226    0.000    0.226    0.000 {isinstance}
    38160    0.208    0.000    0.209    0.000 {method 'copy' of 'numpy.ndarray' objects}
   111799    0.208    0.000    0.426    0.000 index.py:332(__getitem__)
    36642    0.203    0.000    1.341    0.000 frame.py:1928(_ixs)
   320110    0.185    0.000    0.297    0.000 __init__.py:1343(isEnabledFor)
    98213    0.169    0.000    0.169    0.000 {numpy.core.multiarray.array}
   320223    0.162    0.000    0.493    0.000 utf_8.py:15(decode)
   320110    0.154    0.000    0.451    0.000 __init__.py:1128(debug)
   151206    0.133    0.000    0.193    0.000 numeric.py:1810(isscalar)
    38175    0.132    0.000    0.298    0.000 series.py:3299(_sanitize_array)
      801    0.130    0.000    0.130    0.000 {pandas.algos.take_2d_axis0_float64_float64}
164141/164139    0.127    0.000    0.129    0.000 {getattr}
        1    0.119    0.119    0.277    0.277 __init__.py:20(<module>)
     1246    0.117    0.000    0.117    0.000 {numpy.core.multiarray.concatenate}
    36839    0.112    0.000    4.317    0.000 frame.py:4389(<lambda>)
   320110    0.112    0.000    0.112    0.000 __init__.py:1329(getEffectiveLevel)
      197    0.110    0.001    6.391    0.032 frame.py:4433(_apply_standard)
    36642    0.098    0.000    0.647    0.000 internals.py:1681(iget)
    76780    0.095    0.000    0.108    0.000 index.py:318(__contains__)
    75404    0.092    0.000    0.213    0.000 series.py:532(__array_finalize__)

I think using readline to pull the events from the record files is a major slowdown. Also, all the calls to session.py are oct2py calls, which are slow. I believe the slow parts are the inverse dynamics real-time filter and the soder.m file.

Event extraction can remove the timestamp column

    # TODO : The time stamp is removed from dflow_data.data here, so the
    # following command fails because there is no timestamp column, i.e.
    # the class is mutable. This is problematic if you want to extract
    # more than one event.

Try extracting two events in a row from DFlowData and setting the index_col flag to the same column on both.
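Until the mutation is fixed, one workaround is to do the extraction on a copy so the timestamp column survives repeated calls. A sketch of the non-mutating behavior this issue asks for (all names here are illustrative, not the actual DFlowData API):

```python
import pandas as pd

def extract_event(data, event_slice, index_col=None):
    """Return event data without mutating the source data frame.

    Sketch: working on a copy means a second extraction with the same
    index_col still finds the column in `data`.
    """
    event = data.iloc[event_slice].copy()
    if index_col is not None:
        # set_index returns a new frame; `data` keeps its column.
        event = event.set_index(index_col)
    return event
```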

License

We should include a license with the code so it is clear to people how they can use it. I've been using http://unlicense.org/ lately, which makes it public domain (no restrictions). If you all desire something more restrictive (require citation, non-commercial, etc.) we should discuss that. Right now the UNLICENSE is in the directory, as that is what I had in DynamicistToolKit.

Realtime filter in leg2d seems to be rather slow

octave:8> profile on
octave:9> leg2d(rand(5000, 1), rand(5000, 12), rand(5000, 3), options);
Marker 1: 0 samples are missing, longest gap is 0 samples.
Marker 2: 0 samples are missing, longest gap is 0 samples.
Marker 3: 0 samples are missing, longest gap is 0 samples.
Marker 4: 0 samples are missing, longest gap is 0 samples.
Marker 5: 0 samples are missing, longest gap is 0 samples.
Marker 6: 0 samples are missing, longest gap is 0 samples.
octave:10> profile off
octave:11> T = profile('info');
octave:12> profshow(T)
   #                  Function Attr     Time (s)        Calls
-------------------------------------------------------------
  26                  rtfilter             3.560        70000
  25 myfiltfilt>rtfilter_batch             2.287           14
   3                  binary *             0.306       947201
  12                  binary +             0.138       315762
  10                     zeros             0.112        69898
  24                  binary /             0.104       315715
  29                        pi             0.074        70162
   5                      size             0.072        70001
  11                  binary -             0.065       175555
  30                  binary ^             0.060       175388
  27                   isempty             0.046        70000
  28                 binary <=             0.042        70018
  31                      sqrt             0.022        35081
   2                     leg2d             0.005            1
  33                    lookup             0.004           18
   1                      rand             0.004            3
  32                  interp1q             0.003           18
  18                myfiltfilt             0.001            7
  19                    flipud             0.001           21
  44                    unwrap             0.001            4
octave:13> 

GRF Plot is no longer the same

The do_plot flag in grf_landmarks no longer calls show() and it now plots two windows instead of subplots. The number of steps is missing from the axes titles.

@campanawanna Let's agree on some of these changes before merging. You must have added this after I reviewed the PR for your acceleration stuff.

Add in compensation for Mx moments due to belt acceleration.

Now that we have a stationary V-gait we do not need the inertial compensation with the accelerometers (although it would still help, once we get better accelerometers).

But we do need to do the compensation for the belt/roller acceleration. Sandy has found a linear relationship between Mx (Nm) and the acceleration (m s^-2). Theoretically, no other force plate signals should be affected by belt acceleration. Sandy can probably confirm that with her data.

I would suggest doing a 6 Hz zero-lag filter on the belt speed signal (make sure it's exactly the same filter that is used for all other signals), then a 3-point zero-lag derivative estimation (xdot_i = (x_(i+1) - x_(i-1)) / (t_(i+1) - t_(i-1))) to get the acceleration. Multiply by the correction coefficient, subtract the result from Mx, and that's it.

If you use filtered velocity, you should correct after filtering the force plate signals, otherwise the correction signal gets filtered twice. Since all these processing steps are linear, I suppose you can use unfiltered velocity and correct before filtering the force plate signals. The order does not matter. The only thing is, if you look at a corrected but unfiltered Mx signal, it may look much noisier than the original Mx signal because it has correction related to a possibly noisy belt acceleration signal. At the end it's all the same, it only matters if you want to inspect these intermediate results.
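The steps above can be sketched as follows. The filter order and function names are assumptions, and `coefficient` stands in for the correction coefficient Sandy identified:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def compensate_mx(mx, belt_speed, time, coefficient, cutoff=6.0):
    """Remove the belt-acceleration artifact from the Mx moment.

    Sketch of the recipe described above: zero-lag low-pass the belt
    speed at `cutoff` Hz, estimate acceleration with a central
    difference, scale by the identified coefficient, and subtract.
    """
    sample_rate = 1.0 / np.mean(np.diff(time))
    b, a = butter(2, cutoff / (0.5 * sample_rate))  # order is assumed
    speed = filtfilt(b, a, belt_speed)  # zero-lag (forward-backward)
    accel = np.gradient(speed, time)  # 3-point central difference inside
    return mx - coefficient * accel
```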

Do not use a time vector (floats) as the index in the walking data.

I think it was a poor choice to use a time vector of floats for the index in the walking classes. It may be possible to use pandas' datetime index classes, but accuracy issues arise because they can only resolve to nanoseconds (based on integers from the Epoch). I got the float indices to work for the WalkingData class and the SimpleController class, but I should probably revert to an integer index and move the time stamps into a data column. This doesn't work as cleanly as one would hope, but there are fundamental issues with using floating point indices.
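The proposed reversion is mechanically simple; a sketch, assuming the float index holds the time stamps and `'TimeStamp'` is the desired column name (the name is an assumption):

```python
import pandas as pd

def use_integer_index(df, time_col='TimeStamp'):
    """Move a float time index into a data column and use integer rows.

    Sketch: avoids exact-float lookups on the index, which is one of
    the fundamental problems with floating point indices.
    """
    out = df.reset_index()  # integer RangeIndex, old index becomes a column
    return out.rename(columns={'index': time_col})
```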

split_at(num_samples=None) will give too few samples for real data

If num_samples=None in the split_at method, then the shortest step (fewest number of samples) is used for the output number of samples. This works fine if all steps are similar length but if one step has incorrect heelstrike times then you can get steps with too few data points.

Delays in markers and accelerometers

There have been measurable delays when comparing accelerometer readings to the corresponding force plate readings, and also when comparing marker readings to force plate readings. We need to ensure that these are taken care of in the D-Flow/Cortex software pipeline and, if not, incorporate these delays into our processing pipeline.

Add method that subtracts a constant (or linear) offset from the forces and moments

This is an issue with the D-Flow data, so ideally this would be a method on DFlowData but it will likely require the identification of steps so that you can isolate the stance phase of each step to find a mean offset for each.

Should this happen before or after inertial compensation? I guess it has to happen after inertial compensation because the forces and moments will not be zero.
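The core of such a method is small; a sketch, where `unloaded_mask` would come from the step identification described above (names are assumptions):

```python
import numpy as np

def subtract_offset(force, unloaded_mask):
    """Remove a constant offset estimated from unloaded samples.

    Sketch: `unloaded_mask` selects samples where the plate should read
    zero (identified from the steps); their mean is the offset.
    """
    return force - np.mean(force[unloaded_mask])
```

A per-step linear offset would follow the same pattern with a first-order fit instead of a mean.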

Inertial compensation and force reexpression is slow

Add a delayed controller.

If the delay is specified we can solve a linear system such as:

m(t) = m0(t) + K(t) * [s0(t) - s(t)] + Kd * [s0(t - tau) - s(t - tau)]
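Building the regressor for the delayed term only requires shifting the error signal; a minimal sketch, assuming tau is an integer number of samples (names are illustrative):

```python
import numpy as np

def delayed_errors(error, lag):
    """Stack the instantaneous and delayed error regressors.

    Sketch: `error` is s0(t) - s(t) sampled at a fixed rate, `lag` is
    tau in samples. The first `lag` rows are dropped so the columns
    for K and Kd align in time.
    """
    e_now = error[lag:]
    e_delayed = error[:-lag] if lag > 0 else error
    return np.column_stack([e_now, e_delayed])
```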

First release (coincides with journal pub submission around July 1)

@tvdbogert @spinningplates @campanawanna

I'm going to shoot for submitting a paper on what's been done so far around July 1st, 2014. Both the data and the source code (including this repository) for processing it will be released alongside the paper publication. This means that a bit of work is needed to get this repository in shape for its initial release, so I'm going to introduce some standards for code quality. These are open for discussion now if you have any concerns. This article, http://dx.doi.org/10.1371/journal.pbio.1001745, is a good starting point to read about reproducible scientific software.

So here are the standards I'd like us to meet for the publication:

  • All code in the repository must be accompanied by unit tests. We should shoot for 100% test coverage.
  • The unit tests must be automated and run on the TravisCI virtual machine.
  • All code in the repo must have an Apache 2.0 compatible license.
  • The final results must be able to be produced without proprietary software (at least as far back in the pipeline as the D-Flow data input).
  • All functions and classes must have documentation strings.
  • Changes after official releases must have deprecation warnings.
  • Code in the Github master branch should never be broken or untested.
  • All software dependencies and installation instructions must be specified in the documentation.
  • Software must at least run on Linux (TravisCI is Ubuntu 12.04 LTS).

Optional, but nice:

Notes:

  • None of the current Octave code in the repository has unit tests, and it doesn't run on Travis. See issue: #7
  • The Python code is at about 71% unit test coverage. It shouldn't be too hard to bring that up to 100%.
  • I'll be merging some of the functionality developed in my "scratch" repo, see: https://github.com/moorepants/walking-sys-id
  • Let me know how you'd like to be attributed.
  • I may create a virtual machine that people can download and run to avoid installation problems.

I've created a milestone to track the issues that must be fixed before the first release.

https://github.com/csu-hmc/Gait-Analysis-Toolkit/issues/milestones

Improve gain plots.

  • When certain gains are not in the model, leave blank spaces in the matrix instead of plotting zeros.
  • Plot the derivative gains on a different scale, they are much smaller than the proportional gains.
  • Indicate units on the vertical axis.
  • Add one column to the matrix to show the m0(t) term.
  • Combine results from left and right into one graph, blue and red curves. Rows would then be hip moment, knee moment, ankle moment. Columns would be labeled "same side" and "other side" instead of "left" and "right". This is more compact and will make it easier to compare left and right. If a 3 x 13 matrix does not look good, maybe use 3 columns and 13 rows?

WalkingData could be a set of functions instead of a class.

I'm not sure the WalkingData class is that useful in itself. We could do:

data_frame = gait.inverse_dynamics_2D(data_frame)
data_frame = gait.grf_landmarks(data_frame, left, right)
data_panel = gait.split_at(data_frame, ...)

We basically have that now, but don't need the first arg because that is self in the class.

Consistency on modifying attributes

Right now the classes have a weird combination of methods that modify attributes of the object in place and methods that don't modify attributes but return results, which other methods then use to modify attributes.

This inconsistency is tied around thinking about object immutability. Some objects work nicely as immutable, i.e. their methods always return a new modified object. But mutable objects let you modify the object in place.

pandas data frames are not immutable, so you can change the underlying data, but it seems like most of their methods return a new data frame as opposed to modifying it in place. So it is a combination. I'm really not sure what the best OO design paradigm is for this stuff.

D-Flow 3.16.2rc4+ outputs missing markers as a string of zeros

FYI, the release candidate version of D-Flow that we are required to use due to the new stabilization struts has backward-incompatible outputs. Missing markers used to be indicated by holding the previous marker value constant. See: http://gait-analysis-toolkit.readthedocs.org/en/latest/motek.html#missing-values. After failing to process the data from the most recent trials, I tracked down the reason our software no longer gives correct answers. It seems that the latest D-Flow (3.16.2rc4) now outputs a string representing zero, '0.000000', in the text file for missing markers. This does not work with the current version of Gait-Analysis-Toolkit and some refactoring is now needed.
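A sketch of the refactoring this implies: treat exact zeros in a marker coordinate as missing before the existing gap-filling runs. This assumes a real marker never sits exactly at the mocap origin, which should hold in practice:

```python
import numpy as np
import pandas as pd

def zeros_to_nan(marker_series, atol=1e-12):
    """Convert D-Flow 3.16.2rc4+ missing-marker zeros to NaN.

    Sketch: the new D-Flow writes '0.000000' for missing markers, so
    values within `atol` of zero are masked as missing (NaN) so the
    existing interpolation/gap handling can take over.
    """
    s = marker_series.astype(float)
    return s.mask(np.abs(s) <= atol)
```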

Octave backslash fails with ATLAS LAPACK

When running inertial_compensation.m on the compensation data needed for run T006 the backslash operator fails to solve the linear least squares here:

https://github.com/csu-hmc/Gait-Analysis-Toolkit/blob/master/Octave-Matlab-Codes/Inertial-Compensation/inertial_compensation.m#L83

This failure gives this error:

octave:7> new_forces = inertial_compensation(calib_forces, calib_accel, markers, forces, accel);
 ** On entry to DLASD4 parameter number -1 had an illegal value
error: inertial_compensation: exception encountered in Fortran subroutine dgelsd_
error: called from:
error:   /home/moorepants/src/Gait-Analysis-Toolkit/Octave-Matlab-Codes/Inertial-Compensation/inertial_compensation.m at line 88, column 3

This error only occurs if the ATLAS LAPACK version is enabled and does not occur when the normal LAPACK is enabled. It would be nice to output the matrices at line 88 and then just try the backslash with ATLAS LAPACK and normal LAPACK.

Subclass DFlowData for loading compensation data

Loading compensation data is very similar to loading a regular trial. It may make sense to subclass DFlowData and put all the associated compensation computations with that class instead of having them buried in DFlowData. This would also allow the missing marker identification issue in #69 to be addressed for the compensation files.
