maserlib / expres
Exoplanetary and Planetary Radio Emission Simulator
License: MIT License
The latest readthedocs build failed (https://readthedocs.org/projects/expres/builds/12829644/); its log (https://readthedocs.org/api/v2/build/12829644.txt) contains:
Collecting sphinxcontrib-bibtex
Downloading sphinxcontrib_bibtex-2.1.4-py3-none-any.whl (17 kB)
...
...
Traceback (most recent call last):
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/cmd/build.py", line 279, in build_main
args.tags, args.verbosity, args.jobs, args.keep_going)
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/application.py", line 276, in __init__
self._init_env(freshenv)
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/application.py", line 312, in _init_env
self.env.setup(self)
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/environment/__init__.py", line 216, in setup
for domain in app.registry.create_domains(self):
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/registry.py", line 166, in create_domains
domain = DomainClass(env)
File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinxcontrib/bibtex/domain.py", line 263, in __init__
"You must configure the bibtex_bibfiles setting")
sphinx.errors.ExtensionError: You must configure the bibtex_bibfiles setting
Extension error:
You must configure the bibtex_bibfiles setting
Note that incompatible changes were introduced in v2.0.0 (back in December of last year):
https://sphinxcontrib-bibtex.readthedocs.io/en/2.0.0/changes.html
To fix the issue, either update the documentation code, or pin sphinxcontrib-bibtex to an older version (i.e. add something like sphinxcontrib-bibtex<2.0.0 to your docs/requirements.txt).
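If updating the documentation code instead of pinning, sphinxcontrib-bibtex >= 2.0 requires declaring the bibliography files explicitly in conf.py. A minimal sketch, assuming a placeholder file name refs.bib (the repository's actual .bib file should be used):

```python
# docs/conf.py (sketch): sphinxcontrib-bibtex >= 2.0 requires the
# bibtex_bibfiles setting listing the .bib files explicitly.
# "refs.bib" is a placeholder, not the actual file in this repo.
extensions = [
    "sphinxcontrib.bibtex",
]
bibtex_bibfiles = ["refs.bib"]
```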
The Calc_lag routine has been modified according to commit e587b56.
The documentation of the develop branch needs to be updated.
Call to WebGeoCalc to obtain ephemeris.
We can use what B. Cecconi has written (see here).
Care is needed in choosing which service (OBSPM, NASA or ESA) to use: test if the OBSPM service is available, else use NASA or ESA depending on the observer.
Add a warning in the IDL log: if the NASA, ESA or JAXA WGC service is used, say "careful, you could be banned if you send too many requests".
Optionally add a message if an error occurs, saying "You were banned".
Care is also needed in choosing which kernel to use (e.g., for Jupiter: archive or latest, depending on the date of the simulation).
The ExPRES-v1.0 JSON schema (https://voparis-ns.pages.obspm.fr/maser/expres/v1.0/schema) has to be updated using the Advanced Users' Guide (https://github.com/maserlib/ExPRES/blob/master/docs/usage/advanced.rst)
etc...
To be consistent with Miriade, use:
Target: JUPITER
Observer: EARTH ("to observer")
Coordinate type: Planetocentric (or "latitudinal" in the webservice)
We then get a longitude which must be corrected as follows:
CML = 180 - lon
NB: this is not what has been done up to now...
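A minimal sketch of the correction above (function name hypothetical), wrapping the result into the usual [0, 360) degree range:

```python
def miriade_lon_to_cml(lon_deg: float) -> float:
    """Correct a Miriade planetocentric longitude into a CML value.

    Implements the correction noted above (CML = 180 - lon),
    wrapped into the [0, 360) degree range.
    """
    return (180.0 - lon_deg) % 360.0
```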
To make the documentation more user friendly, we could add a "copy button" to the code blocks. To do so, we can use the Sphinx extension sphinx_copybutton, which needs to be added to the extensions list (https://github.com/maserlib/ExPRES/blob/master/docs/conf.py).
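A sketch of the change in conf.py (the other entries in the list are illustrative, not the repository's actual extensions):

```python
# docs/conf.py (sketch): add sphinx_copybutton to the extensions list.
# The other entries below are illustrative placeholders.
extensions = [
    "sphinx.ext.autodoc",
    "sphinxcontrib.bibtex",
    "sphinx_copybutton",  # adds a "copy" button to every code block
]
```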
Currently, the VESPA_target_name and VESPA_target_class attributes are text strings with a # separator.
Example:
VESPA_target_name = "Jupiter#Io"
VESPA_target_class = "planet#satellite"
We should transform this into a list of terms:
VESPA_target_name = ["Jupiter", "Io"]
VESPA_target_class = ["planet", "satellite"]
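The conversion from the current format to the proposed one is a simple split on the separator; a sketch (function name hypothetical):

```python
def split_vespa_attribute(value: str) -> list:
    """Split a '#'-separated VESPA attribute string into a list of terms."""
    return value.split("#")
```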
Compute the magnetic field model JRM09 with the new current sheet model of Connerney et al. (2020).
For visualisation purposes it would be good to be able to get the location of all radio sources in the file (without masking the invisible ones), so that we can display both invisible and visible sources on a plot.
Add a check on the Parent entry of Observer: it must be the same as the first body and the Parent entry in Source.
In the archive, the version tag of the CDF and the JSON configuration are not consistent:
expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v01.json
expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v11.cdf
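A small consistency check could be added to the archive pipeline; a sketch (function name hypothetical) that extracts the vXX tag so the JSON and CDF versions can be compared:

```python
import re

def version_tag(filename: str) -> str:
    """Extract the vXX version tag from an ExPRES product file name."""
    match = re.search(r"_v(\d+)\.(?:json|cdf)$", filename)
    return match.group(1) if match else ""
```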
I'm trying to set up an input file with multiple longitude auroral sources:
"SOURCE":
[
{"ON":true,
"NAME":"Source1",
"PARENT":"Jupiter",
"TYPE":"",
"LG_MIN":0,
"LG_MAX":350,
"LG_NBR":36,
"LAT":10,
"SUB":0,
"AURORA_ALT":0.009091926738619804,
"SAT":"",
"NORTH":true,
"SOUTH":false,
"WIDTH":1,
"CURRENT":"Transient (Alfvenic)",
"CONSTANT":0.0,
"ACCEL":10,
"TEMP":0,
"TEMPH":0,
"REFRACTION":false
},
This setup fails in m_cdf.pro
It would be interesting to be able to choose a value for θ that varies with frequency.
That is, on the same basis as θ = constant, but with different constants over the frequency range (chosen by the observer? based on a model?).
File https://github.com/maserlib/ExPRES/blob/develop/src/build_serpe_skt.pro
Variable / UCD
time.epoch
em.freq
meta.id;em.freq
meta.id;src
meta.id;pos.bodyrc.lat
pos.cartesian;src
phys.count
phys.count
pos.angDistance
pos.posAng
pos.bodyrc.lon;obs.observer
pos.bodyrc.lat;obs.observer
pos.distance;obs.observer
pos.bodyrc.lon;obs.observer
pos.cartesian;src
pos.bodyrc.lon;src
stat.max;em.freq;src
stat.max;em.freq;src
em.freq;phys.density;phys.electron
em.freq;phys.magField;phys.electron
Any reason why the output Provenance VOTable is empty?
As discussed initially in #18, it would be interesting to set up a metadata dictionary defining concepts related to the radio sources properties.
Line 89 (develop branch) of file read_ephem_obs.pro is:
time.maxi=(aj2-aj1)*24.*60.+1
Variables aj1 and aj2 are in decimal year-day format (YYYYDDD.dddddd, with YYYY the year, DDD the day of year, and dddddd the decimal day).
In the context of the code, time.maxi should be equal to the duration between aj1 and aj2, in minutes.
The proposed computation is broken if a year boundary is crossed between aj1 and aj2. This should be fixed.
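The year-boundary bug can be avoided by converting each value to a calendar date before differencing. A Python sketch of the intended computation (the original IDL line also adds +1 for an inclusive sample count, omitted here):

```python
from datetime import datetime, timedelta

def yd_to_datetime(ajd: float) -> datetime:
    """Convert a decimal year-day value (YYYYDDD.dddddd) to a datetime."""
    yyyyddd = int(ajd)
    year, doy = divmod(yyyyddd, 1000)
    frac = ajd - yyyyddd
    return datetime(year, 1, 1) + timedelta(days=doy - 1 + frac)

def duration_minutes(aj1: float, aj2: float) -> float:
    """Duration between aj1 and aj2 in minutes, robust across year boundaries."""
    return (yd_to_datetime(aj2) - yd_to_datetime(aj1)).total_seconds() / 60.0
```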
There is a CDF file which can't be opened with PyCDF. I'll check for other similar errors. We should re-run this file.
>>> from spacepy import pycdf
>>> c = pycdf.CDF('/volumes/kronos/serpe/data/earth/2004/05/expres_earth_jupiter_io_jrm09_lossc-wid1deg_3kev_20040516_v11.cdf')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 1718, in __init__
self._open(True if readonly is None else readonly)
File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 1899, in _open
lib.call(const.OPEN_, const.CDF_, self.pathname, ctypes.byref(self._handle))
File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 633, in call
*(args + (const.NULL_, ))
File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 585, in check_status
raise CDFError(status)
spacepy.pycdf.CDFError: CHECKSUM_ERROR: The data integrity verification through the checksum failed.
>>>
I get an error in fz_spdyn line 213:
for j=0l,parameters.time.n_step-1l do image(j,*)=interpol(float(image(j,*)),f,u)
Error message:
% INTERPOL: V and X arrays must have same # of elements
% Execution halted at: FZ_SPDYN 213
Input files (10 sources are configured)
While running the taplint epn_core validator, it appears that the VESPA_measurement_type global attribute must be updated to a valid value.
The current value is em.radio;meta.modelled, and both words can only be used as qualifiers (secondary terms). We have to select a primary term, like phot.flux.density (which is not so well adapted), phys.polarization, pos, src...
Maybe a good pick would be pos;src;em.radio;meta.modelled, which translates into:
Position and coordinates / Observed source viewed on the sky / Radio part of the spectrum / Quantity was produced by a model
(using the explain tool of http://cdsweb.u-strasbg.fr/UCD/tools.htx)
Add the new lead angle models that L. Lamy determined from Bonfond et al. (2017) and Hinton et al. (2019), once he has provided them.
Add an entry in the JSON to automatically compress the CDF file in ExPRES.
Dear developers,
The Src_ID_Label field in the ExPRES CDF is filled from an IDL array with string sizes equal to the maximal string length in the original array. This results in trailing blank spaces at the end of some values, which is not optimal for reading.
For instance:
['0d-30R NORTH ' '119d-30R NORTH' '239d-30R NORTH' '359d-30R NORTH' '0d-30R SOUTH ' '119d-30R SOUTH' '239d-30R SOUTH' '359d-30R SOUTH' 'Io NORTH ' 'Io SOUTH ']
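Until the writer is fixed, a reader-side workaround is simply to strip the trailing blanks (the labels below are shortened from the example above):

```python
# Labels padded to a fixed width by the IDL string array (example values)
labels = ['0d-30R NORTH ', '119d-30R NORTH', 'Io NORTH      ']
# Strip the trailing blanks on read
cleaned = [label.rstrip() for label in labels]
```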
Cheers!
The last submitted job is pending, and a message stating "Can't Start Job 1b515b: Internal Server Error" appears momentarily when the job is submitted.
It would be interesting to be able to choose a value of the electron energy that varies with frequency, either based on a model (empirical, see the Hess paper?) or chosen by the observer.
We need to be able to use the WebGeoCalc webservice to retrieve ephemeris data.
In Python this is very easy, so I propose to write a Python module to access WebGeoCalc, and use the IDL spawn command to call the script.
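A sketch of what the Python side could look like, here just building a WebGeoCalc-style calculation payload. The parameter names follow the WebGeoCalc API style but are assumptions to be checked against the actual API documentation; the kernel-set id and function name are hypothetical.

```python
# Hypothetical sketch of a WebGeoCalc request payload builder.
# Parameter names should be verified against the WebGeoCalc API docs.
def build_state_vector_request(target, observer, frame, times):
    return {
        "kernels": 5,  # hypothetical kernel-set id
        "calculationType": "STATE_VECTOR",
        "target": target,
        "observer": observer,
        "referenceFrame": frame,
        "times": times,
    }

payload = build_state_vector_request(
    "JUPITER", "EARTH", "IAU_JUPITER", ["2020-09-29T00:00:00"]
)
```

The IDL side would then call the script via something like spawn, 'python get_ephem.py ...' (script name hypothetical), reading back the result from a file or stdout.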
What can we do with the 10-minute limitation of the 'Run-on-demand' using free access on UWS ?
I tried to run ExPRES-dev with the attached files. The input file includes auroral sources and satellite-controlled sources. I get an error, since the ephemeris file does not include those of Io, Europa and Ganymede.
It would be good to update the documentation with a description of the CDF file output, explaining each variable of the file and telling which option shall be used to activate/deactivate its output.
It would be good to have a test configuration file with a custom frequency list, for the testing suite.
Modifications from ExPRES 1.0 to 1.1:
It seems that the source name must be of the form "SourceXXX", where XXX is a number (or a string that can be interpreted as a float).
Source1 is valid
Source1.1 is valid
Source01 is valid
Source10 is valid
Source_10 is invalid
Source_N10d is invalid
This can be included into the JSON schema validation.
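A sketch of a simplified pattern matching the examples above (it accepts digits with an optional decimal part, not every possible float spelling); the same regular expression could go into the JSON schema via the "pattern" keyword:

```python
import re

# Simplified rule: the part after "Source" must be digits with an
# optional decimal part. Usable as a JSON schema "pattern" value.
SOURCE_NAME = re.compile(r"^Source\d+(\.\d+)?$")

def is_valid_source_name(name: str) -> bool:
    return bool(SOURCE_NAME.match(name))
```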
if [SIMU][NAME] == "config_parameters_set01.yml": change the file names as follows:
expres_observertargetdate_AFT_magXX_srcXX_outXX_vXX.cdf
where magXX, srcXX and outXX are the parameter set configurations for the magnetic field, the source and the outputs,
e.g. earth_jupiter_io_20200929_mag01_src01_out01_v1.0.cdf
else: keep the previous file name, e.g. expres_earth_jupiter_io_jrm09_lossc-wid1deg_3kev_20200929_v01.cdf?
or only expres_earth_jupiter_io_jrm09_20200929_v01.cdf?
if [SIMU][NAME] == "config_parameters_set01.yml":
YAML file example:
header:
  paramset_id: 01
src_params:
  source_energy: 3 kev
  source_origin: Io
  source_lag_model: Hess2008
  source_aurora_alt: AURORA_ALT
  ...
mfl_params:
  mfl_model: JRM09
  cs_model: CAN81
  density_model:
output_params:
  THETA: true
  FP: true
  FC: true
  AZIMUTH: false
  OBSLATITUDE: true
  SRCLONGITUDE: false
  SRCFREQMAX: true
  OBSDISTANCE: true
  OBSLOCALTIME: false
  CML: true
  SRCPOS: false
Add the possibility of adding a ±δ° offset to the lead angle value given by the chosen model.
Write a function that calls different MIRIADE ephemerides and checks the results.
Then automate the call to this function so that the check runs weekly.
UWS : add the possibility to retrieve a .sav file.
I tried to run a simulation with ExPRES-dev in voparis-uws-maser.obspm.fr. And I get the following error:
'Cannot create Job : internal server error'
Any idea why ?
We need to automatically create a provenance file that will contain all data provenance (including, e.g., kernel names).
This file needs to be created by UWS while running the "archive" simulations.
I tried to launch a run with an input ephemeris file on the dev branch. I got the following IDL error in stderr:
mv: 'Auroral_Galileo_C30.json' and 'Auroral_Galileo_C30.json' are the same file
% READ_SAVE_JSON: Something wrong happened with JSON file... Aborting.
% Stop encountered: READ_SAVE_JSON 1156 /obs/serpe/code/ExPRES/src/read_save.pro