Comments (16)
All the tests in this repository are currently non-Python invocations, dependent on two executables: pynml and jnml (Java). The question is how we should approach this:
- Wrap the non-Python test calls in Python (this may not work, because the jnml tests actually open a Java GUI window on which an image is drawn.)
- Write a consistent testing interface that works with Python doctest as well as the non-Python tools.
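One way to wrap a non-Python invocation so a Python test runner can exercise it is to build the argument list separately from the subprocess call; the tool and file names below are placeholders for illustration, and whether a -nogui flag is accepted depends on the tool:

```python
import subprocess

def build_args(tool, lems_file, nogui=True):
    """Build the command line for a jnml/pynml-style invocation."""
    args = [tool, lems_file]
    if nogui:
        # Suppress GUI windows so the call can run headless.
        args.append('-nogui')
    return args

def run_tool(tool, lems_file, nogui=True):
    """Run the tool and return its exit code (requires it on PATH)."""
    return subprocess.call(build_args(tool, lems_file, nogui))
```

Separating argument construction from execution lets the argument logic be unit-tested without the Java tooling installed.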
from muscle_model.
The installation instructions in both muscle_model/README.md and muscle_model/NeuroML2/README.md for the binaries needed to run the repository's examples have now been completed and verified on my Mac OS X Yosemite machine.
Points of note:
- On Yosemite (need to test this on other platforms), the instruction "pip install lxml" should be "STATIC_DEPS=true pip install lxml"
- My prior comments on testing infrastructure via Python doctest still stand. In particular, muscle_model/NeuroML2/analyse_k_fast.sh will generate 6 (?) different images for its single experiment. This raises the question of how we can automate checking that each of these images is "correct" with respect to the experiment parameters encapsulated by the script.
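If automated image checking were attempted, one cheap approach is a tolerance-based comparison against previously approved reference images. This is only a sketch, assuming the images can be loaded as NumPy float arrays scaled to [0, 1] (e.g. via Pillow or matplotlib.image); the tolerance value is arbitrary:

```python
import numpy as np

def images_match(img_a, img_b, tolerance=0.05):
    """Return True if two images (float arrays in [0, 1]) agree to
    within `tolerance` root-mean-square error."""
    img_a = np.asarray(img_a, dtype=float)
    img_b = np.asarray(img_b, dtype=float)
    if img_a.shape != img_b.shape:
        # Differently sized images cannot be compared pixel-wise.
        return False
    rmse = np.sqrt(np.mean((img_a - img_b) ** 2))
    return rmse < tolerance
```

This sidesteps judging scientific "correctness" directly; it only detects drift from a reference plot that a human approved once.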
@cheelee
Thanks for that; I've also noted some installation issues with lxml, and as this subproject matures, a unified installation script will become more necessary.
#29 deals with section 2 of the README, and I believe @net239 took a shot at this once. This could certainly be used/adapted to fit into a unified script as well.
As for the doctest issue, what you say is true. Perhaps we could test that the script has an exit code of 0? I think comparing the plots would be unnecessarily difficult at this stage.
@cheelee Relative to the discussion we're having over here, if we change those jnml calls to pynml, we can do pynml -nogui and the command will not produce GUI windows, but will still run.
I think we can then do something like

    import subprocess

    def a_test():
        exit_code = subprocess.call(['pynml', '-nogui', 'LEMS_Figure2A.xml'])
        assert exit_code == 0
We just have to be sure that the arguments in that call are updated dynamically from the README, rather than hard-coded into the testing script.
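A sketch of pulling the commands out of the README instead of hard-coding them; the regex and the README path are assumptions about how the commands appear there and would need adjusting to the actual README contents:

```python
import re
import subprocess

# Assumed pattern: commands in the README look like "jnml LEMS_Foo.xml"
# or "pynml LEMS_Foo.xml".
COMMAND_RE = re.compile(r'\b(?:jnml|pynml)\s+(LEMS_\S+\.xml)')

def commands_from_readme(text):
    """Extract LEMS files mentioned in jnml/pynml commands and return
    headless pynml invocations for each of them."""
    return [['pynml', '-nogui', lems] for lems in COMMAND_RE.findall(text)]

def run_readme_tests(readme_path='README.md'):
    """Run every command found in the README and fail on nonzero exit."""
    with open(readme_path) as f:
        for args in commands_from_readme(f.read()):
            assert subprocess.call(args) == 0, ' '.join(args)
```

The extraction step is pure string processing, so it can be tested without pynml installed.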
There is also this module which is more fine-grained than what I am suggesting above.
What do you think of these approaches, and would you be comfortable taking on either?
@travs Sure thing. I should be able to try some of that out and see what I find before the Hackathon Sunday.
Do have a look at the testing that's already being run with OMV.
This framework installs jNeuroML on Travis and runs the tests in .test.* files. For example, .testA.omt runs LEMS_Figure2A.xml with jnml; .test.validate.omt validates the NML files; and .test.nm.omt runs LEMS_NeuronMuscle.xml and checks that the recorded voltage traces have spike times as laid out in .test.nm.mep.
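For reference, an .omt file is a small YAML document along these lines; the exact schema should be checked against the OMV documentation, and the target and engine values below are taken from the tests described above:

```yaml
# Minimal OMV test: run a LEMS file with the jNeuroML engine and check
# only that it completes. Spike-time checks like those in .test.nm.omt
# would additionally reference a .mep file with the expected values.
target: LEMS_Figure2A.xml
engine: jNeuroML
```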
@pgleeson Cool, I'll check this out! My current script encapsulates the java command in a Python script, but it is a little clunky, and the generated output includes timing information from the subprocess call, which makes it hard to validate by output comparison alone.
Nymeria:NeuroML2 cheelee$ cat pynml_test.py
import subprocess
import sys

var_command = 'pynml'
var_gui = '-nogui'
var_input = ''

def a_test(avar_command, avar_gui, avar_input):
    if avar_gui == '':
        exit_code = subprocess.call([avar_command, avar_input])
    else:
        exit_code = subprocess.call([avar_command, avar_gui, avar_input])
    assert exit_code == 0
    return

if len(sys.argv) < 2:
    print 'Usage: ' + sys.argv[0] + ' <input file> [ withgui ] [ <alt exec tool> ]\n'
    sys.exit(-1)
else:
    var_input = str(sys.argv[1])
    if len(sys.argv) > 2:
        var_gui = ''
    if len(sys.argv) > 3:
        var_command = str(sys.argv[3])

a_test(var_command, var_gui, var_input)
@cheelee Let's chat about this one at the hackathon tomorrow!
Hey sure thing! I had forgotten about this! This is definitely hackathon material. Sorry about that!
Hi @cheelee and @travs -- we were having a chat with @brijeshm39 and @VahidGh today about next steps on the muscle model after integrating #55 and this issue came up as a logical next step. If you guys have any desire to keep moving on this, reply here. Otherwise we're going to see what we can do next to move this forward. Thanks!
Hi @slarson I'm still interested in contributing to the project again, but I'm also knee-deep in trying to get my divorce settled for good and in interviewing for a teaching position with SUNY. So where I am concerned, you guys please go ahead and move forward on this issue, and I'll keep my eyes on it and try to keep up. Thanks!
I'm so sorry about all of this, but my ability to focus has been severely limited over the last few months dealing with these issues and the associated bouts of depression that I have to fight through at the same time.
@cheelee I am interested in working on the issue.
@souravsingh You are most welcome to! I have lost track of the context of this issue after having left it alone for a while, but I can try and see if I can package this in a way that will help you get started.
Are you already a contributor? If not, you can get in touch by filling out the form here - http://docs.openworm.org/en/0.9/#contributing-to-openworm and we can help you get hooked up. Thanks!
Thanks @cheelee. I have filled out the form, but at the end I was asked to schedule a meeting with @slarson and I couldn't find a day for the meeting.
@souravsingh Not a problem! I'm in touch with him regularly, we'll work something out and then get back to you real soon! Stay tuned and thanks!
@souravsingh The scripts in the NeuroML2 folder are well tested at the moment with OMV tests and additional scripts being run in the .travis.yml file as part of the NON_OMV_TESTS.
Note that many of the other directories contain obsolete code (look at READMEs), being kept for information purposes only.
To finish this off, what's required is to add some tests for the BoyleCohen2008 code. This would be:
- An OMV-based test that runs some of the Matlab code (e.g. vclamp.m) using Octave. See here for an example. Note that a test on spike times (mep file) is not required.
- Code in the NON_OMV_TESTS section of .travis.yml to make the C code in https://github.com/openworm/muscle_model/tree/master/BoyleCohen2008. Running this may take too long...
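A possible shape for the second item, as a fragment of .travis.yml; the NON_OMV_TESTS environment-variable guard and the presence of a Makefile in BoyleCohen2008 are assumptions to verify against the repository:

```yaml
# Hypothetical NON_OMV_TESTS step: build (but do not run) the C code,
# since running the full simulation may take too long for CI.
script:
  - if [ -n "$NON_OMV_TESTS" ]; then make -C BoyleCohen2008; fi
```

Building without running still catches compiler errors and bit-rot in the C sources at low CI cost.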