b612-asteroid-institute / adam_home

ADAM python client and notebooks

License: MIT License

Languages: Python 64.73%, Shell 0.10%, Makefile 0.11%, Batchfile 0.14%, Jupyter Notebook 34.93%

Topics: client, adam, python, jupyter-notebooks

adam_home's People

Contributors

astrogatorjohn, carise, emmieking, hankg, hoodedcrow, jp-c3, katkiker, laurahlark, lowellrh, mjuric, moeyensj, rhiannonlynne, samotiwala, vvittaldev, zomglings


adam_home's Issues

Non-default adam config setup

I have modified the adam_config.py file so that users can supply a non-default adam location. I couldn't find a way to pass a parameter into the import, so I ended up creating a new function that accepts the parameter and returns the paths. The default import still works, but doing it this way leads to some redundant code. I'm not sure this is the best way to do it; let me know what you think and whether you know of a cleaner approach.
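
A minimal sketch of what this could look like, assuming hypothetical function and key names (the actual adam_config.py may differ):

import os
from os.path import expanduser


def setup_paths(adam_home=None):
    """Return the standard ADAM directory paths.

    If adam_home is not given, fall back to the default location in the
    user's home directory; otherwise use the supplied path as-is.
    Function and key names here are illustrative, not the real adam_config API.
    """
    if adam_home is None:
        adam_home = os.path.join(expanduser("~"), "adam_home")
    return {
        "adam_home": adam_home,
        "config": os.path.join(adam_home, "config"),
        "data": os.path.join(adam_home, "data"),
    }


# Default behaviour is unchanged; a non-default location is an explicit argument.
default_paths = setup_paths()
custom_paths = setup_paths("/projects/scenario1/adam_home")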

Add Database Ingest Function for STK Outputs: include Vector and IntervalList Tables

Currently, the workflow from LSST telescope pointings to STK outputs involves creating a reduced OpSim pointing text file from the OpSim summary database while simultaneously creating the Interval and Vector files for STK. For the impact probability / warning time analysis, we need efficient access to telescope pointing information alongside the STK access outputs in order to calculate asteroid observability. I'll add a function to the stk subpackage in adam that reads STK outputs into a database together with the telescope pointing information. Ideally this should be as decoupled from LSST-specific database jargon as possible, so I'll add a mapping dictionary to a configuration class in a separate study-specific repo (a rough sketch follows below).

This is of interest to: @AstrogatorMike, @brycebolin and @edlu123
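
A rough sketch of the ingest idea (file formats, column names, and table names here are assumptions for illustration; the real stk subpackage function would parse actual STK report files):

import sqlite3

import pandas as pd

# Hypothetical mapping from survey-specific (e.g. OpSim) column names to the
# neutral names used in the ingest database; kept in a study-specific config.
POINTING_COLUMN_MAP = {
    "observationStartMJD": "obs_mjd",
    "fieldRA": "ra_deg",
    "fieldDec": "dec_deg",
}


def ingest_stk_outputs(pointing_csv, access_csv, db_path="stk_outputs.db"):
    """Load telescope pointings and an STK access report into one SQLite database."""
    pointings = pd.read_csv(pointing_csv).rename(columns=POINTING_COLUMN_MAP)
    accesses = pd.read_csv(access_csv)
    with sqlite3.connect(db_path) as con:
        pointings.to_sql("pointings", con, if_exists="replace", index=False)
        accesses.to_sql("accesses", con, if_exists="replace", index=False)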

Expose the enableLogCloseApproaches setting in the ADAM Python client

We have a setting for this in the backend, but it is not currently exposed in the Python code. enableLogCloseApproaches lets the user turn close-approach logging on or off when running propagations (right now just for Monte Carlo propagations).
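
A minimal sketch of how the flag might be threaded through to the request payload (class, attribute, and method names are illustrative, not the actual adam client API):

class MonteCarloPropagationConfig:
    """Illustrative settings container; not the real adam API."""

    def __init__(self, enable_log_close_approaches=False):
        self.enable_log_close_approaches = enable_log_close_approaches

    def to_request_payload(self):
        # The backend setting is camelCase; expose a snake_case Python
        # attribute and translate it here.
        return {"enableLogCloseApproaches": self.enable_log_close_approaches}


payload = MonteCarloPropagationConfig(enable_log_close_approaches=True).to_request_payload()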

Add stk package

Update the stk python package from the hack days and add it to this repository.

Documentation Standard? (sphinx-style)

We briefly discussed this during the hack session but didn't decide on anything. Are we interested in using Sphinx-style documentation hosted by a service such as Read the Docs? If so, it would be good to agree on a comment / docstring / documentation standard and start using it from now on, to make things easier for us later.

Here is an astropy example, going from high-level documentation all the way down to source code:
http://docs.astropy.org/en/stable/
http://docs.astropy.org/en/stable/coordinates/index.html
http://docs.astropy.org/en/stable/api/astropy.coordinates.SkyCoord.html#astropy.coordinates.SkyCoord.to_string
http://docs.astropy.org/en/stable/_modules/astropy/coordinates/sky_coordinate.html#SkyCoord.to_string
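
If we go this route, one option is a numpydoc-style docstring (the style astropy uses), which Sphinx can render directly; the function below is only an illustration of the format, not part of the adam API:

def propagate_state(state_vector, epoch, duration_days):
    """Propagate a Cartesian state vector forward in time.

    Parameters
    ----------
    state_vector : list of float
        Position (km) and velocity (km/s) as [x, y, z, vx, vy, vz].
    epoch : str
        Start epoch in ISO 8601 format, e.g. '2017-10-04T00:00:00Z'.
    duration_days : float
        Propagation duration in days.

    Returns
    -------
    list of float
        The propagated state vector, same units and ordering.
    """
    ...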

Integration tests take a long time to run

Tests are taking a very long time to run.

Seeing database errors:

com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after connection closed

New encrypted test_config.json

I have updated the unit tests in the integration tests folder to use the new adam configuration. I named the file "test_adam_config.json" so the old file won't get picked up. I believe this file is generated with OpenSSL: .travis.yml decrypts an encrypted file stored in the repo. Should the new file be generated with the same env/token as the previous version (test_config.json)? If so, I don't have that information. Who do you think is the best person to contact?

Revise demo notebooks to use new 'standard' adam path.

(1) delta_v_demo.ipynb:
insert cells 1 and 2, update cell 4 to use the new config_file path
cell 9 -> AssertionError:
Not equal to tolerance rtol=0, atol=0.001

(2) many_batch_run.ipynb:
insert cells 1 and 2, update cell 4 to use the new config_file path
cell 7 -> Encountered error 503 calling get on /batch/d1bd1fb9-2e89-4672-b0fb-8e8e6d78d224/1?token=redacted: {'error': {'errors': [{'domain': 'global', 'reason': 'backendError', 'message': 'org.apache.shiro.session.ExpiredSessionException: Session with id [f3396f5e-b714-43d6-a9e1-9add8f62cd7e] has expired. Last access time: 12/3/18 7:02 PM. Current time: 12/3/18 7:36 PM. Session timeout is set to 1800 seconds (30 minutes)'}], 'code': 503, 'message': 'org.apache.shiro.session.ExpiredSessionException: Session with id [f3396f5e-b714-43d6-a9e1-9add8f62cd7e] has expired. Last access time: 12/3/18 7:02 PM. Current time: 12/3/18 7:36 PM. Session timeout is set to 1800 seconds (30 minutes)'}}

(3) create_workspace.ipynb - I did not alter this notebook because of the note: "Can only be run by someone with sufficient access."

(4) permission_management_demo.ipynb - I did not alter this notebook because I don't think I have sufficient access to manage permissions.

(5) projects_demo.ipynb - insert cells 1 and 2, update cell 4, No Errors

(6) single_run_demo.ipynb - insert cells 1,2,3,5 and update cell 6, No Errors

Error in create_workspace.ipynb notebook

When I run the first block, the notebook complains that it can't find adam to import.

I had to add the following code above the first block to make the notebook work:

import sys
from os.path import expanduser

# Default adam location - change this if you keep adam_home somewhere else,
# for example: adam_home_defined = expanduser("~") + "/your_project_name/scenario1/adam_home"
adam_home_defined = expanduser("~") + "/Development/adam_home"  # default location

# Add adam path to system in order to import adam
sys.path.append(adam_home_defined)
# import adam and adam modules required
import adam

errors in Travis tests of new build

Looks like errors in small_batches and hyper_cube.

Emmie and I noticed this when submitting adam_config.py, but it doesn't seem like that could have caused these errors.

ADAM_Travis_error_log.txt

=================================== FAILURES ===================================
_______________________ BatchRunnerTest.test_small_batch _______________________
self = <adam.tests.integration_tests.batch_runner_test.BatchRunnerTest testMethod=test_small_batch>

...

The command "py.test adam --cov=adam --ignore=adam/tests/integration_tests/hypercube_test.py" exited with 1.

Set up continuous deployment to anaconda.org

Any built master should be uploaded to anaconda.org for end-user ease of install (and to avoid divergence between what's available there and what's on master).

The assumption is that (a) master always builds and (b) after v1.0 we'll become careful about API compatibility testing.

This should be straightforward to do by adding a conda build CI step followed by an anaconda upload. Auth tokens can be handled with standard GitHub Actions secrets.

Fix test: integration_tests/reference_frame_test.py:test_sun_ememe

integration_tests/reference_frame_test.py:test_sun_ememe is currently failing.

The test appears to propagate a sun-centered trajectory in the EMEME2000 reference frame and expects the end state to be in ICRF.

From the test:

        # The output state is expected to be in ICRF.
        expected_end_state = [73978163.61069362, -121822760.05571477, -52811158.83249758,
                              31.71000343989318, 29.9657246374751, .6754531613947713]
        # These values are in EMEME. The resulting ephemeris is not expected to match these values.
        # expected_end_state = [73978158.47632701, -132777272.5255892, 5015.073123970032,
        #                       31.710003506237434, 27.761693311026138, -11.299967713192564]

The actual end state is the commented-out expected_end_state, i.e. it is in EMEME. I'm not sure when this test last passed. I also ran the propagation/OPM parameters through single_run_demo and got the same end state (the EMEME one).
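
A small diagnostic that could make the frame mismatch explicit in the test, using only the two candidate end states quoted above (the helper name and tolerance are illustrative):

import numpy as np

# End states quoted from the test: ICRF (currently expected) vs. EMEME (commented out).
EXPECTED_ICRF = [73978163.61069362, -121822760.05571477, -52811158.83249758,
                 31.71000343989318, 29.9657246374751, 0.6754531613947713]
EXPECTED_EMEME = [73978158.47632701, -132777272.5255892, 5015.073123970032,
                  31.710003506237434, 27.761693311026138, -11.299967713192564]


def matching_frame(actual_end_state, atol=1e-3):
    """Report which candidate frame the actual propagation output matches."""
    if np.allclose(actual_end_state, EXPECTED_ICRF, atol=atol):
        return "ICRF"
    if np.allclose(actual_end_state, EXPECTED_EMEME, atol=atol):
        return "EMEME"
    return "neither"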

Tests are failing due to ADAM not being able to parse OPM

I'm creating this issue here for now; I'm not sure whether the error happens because we're passing the wrong data to the backend or for some other reason.

I noticed this error in the logs:

java.io.InvalidClassException: org.b612foundation.adam.opm.OrbitParameterMessage; local class incompatible: stream classdesc serialVersionUID = <something>, local class serialVersionUID = <something else>
... stack trace continues ...
at org.b612foundation.adam.datamodel.DatabaseHelper.ungzipObject(DatabaseHelper.java:165)
...
at org.b612foundation.adam.storage.PropagationParametersDbStoreHelper.getFieldValuesFromResults(PropagationParametersDbStoreHelper.java:37)
...

But the error was reported in AppEngine as org.apache.shiro.authc.IncorrectCredentialsException (which is a separate issue with our error handling).

The error occurred when I submitted a commit to this repo and the tests were running. (run)

The error is coming from one of the tests that runs the BatchPropagation (/adam/v1/adam_object/runnable_state/single/BatchPropagation).

Revise ReadMe

The Testing Installation section returns errors if the user does not have test_config.json. I have attached a doc explaining user installation; should we attach it to the README in the empty section?
defined_ADAMinstallInstruct.docx

Hypercube: should all parts of batch have same ephem after running the job?

I was playing with the single run demo and uncommented three parameters in OpmParams to run the hypercube:

opm_params = OpmParams({
    'epoch': '2017-10-04T00:00:00Z',
    'state_vector': state_vec,

    'covariance': covariance,   # object covariance
    'perturbation': 3,          # sigma perturbation on state vector
    'hypercube': 'FACES',       # hypercube propagation type
})

When I printed out each batch part's ephemeris, every part appeared to have the same ephemeris. We should figure out whether this is intentional and correct.
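
One way to check this quickly is to hash each part's ephemeris text and count the unique digests; the accessor names in the commented lines are placeholders for however the batch results are actually retrieved in the adam client:

import hashlib


def unique_ephemeris_digests(ephemeris_strings):
    """Return the set of SHA-256 digests over each part's ephemeris text.

    If the hypercube faces really produced distinct trajectories, we would
    expect more than one unique digest here.
    """
    return {hashlib.sha256(e.encode("utf-8")).hexdigest() for e in ephemeris_strings}


# Placeholder usage; 'parts' and 'get_ephemeris()' are assumptions, not the real API:
# digests = unique_ephemeris_digests(part.get_ephemeris() for part in parts)
# print("unique ephemerides:", len(digests))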

Create a notebook using 2018VP1

Create a Jupyter notebook that takes data for 2018 VP1 and runs it through ADAM. This will be one of many regression tests we hope to run against ADAM.

Convert Monte Carlo Examples down to one notebook

  • The notebook should show each of the major representations: Keplerian sigma, Cartesian sigma, Keplerian covariance, Cartesian covariance, and include a note about mean vs. true anomaly.
  • The notebook should select the representation via a parameter rather than commented-out code, so that users know the values in the code are still valid/legitimate (see the sketch after this list).
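
A sketch of the parameter-driven selection (keys, builder names, and the placeholder sigma values are illustrative only):

def build_uncertainty_params(representation, values):
    """Map a representation name to the corresponding parameter block."""
    builders = {
        "keplerian_sigma": lambda v: {"keplerian_sigma": v},
        "cartesian_sigma": lambda v: {"cartesian_sigma": v},
        "keplerian_covariance": lambda v: {"keplerian_covariance": v},
        "cartesian_covariance": lambda v: {"cartesian_covariance": v},
    }
    try:
        return builders[representation](values)
    except KeyError:
        raise ValueError("Unknown representation: {!r}".format(representation))


# Placeholder values, for illustration only.
params = build_uncertainty_params("cartesian_sigma", [1.0, 1.0, 1.0, 0.01, 0.01, 0.01])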

define adam_home option

I modified the adam_config.py module to use a single function that takes an adam_home path instead of assuming adam_home is located in the user's home directory.

Update README

Update README to reflect repo changes: add badges for license, Travis, Coveralls, etc.
