expres's Issues

readthedocs build failing due to breaking change in sphinxcontrib-bibtex

The latest readthedocs build failed (https://readthedocs.org/projects/expres/builds/12829644/); its log (https://readthedocs.org/api/v2/build/12829644.txt) contains:

Collecting sphinxcontrib-bibtex
  Downloading sphinxcontrib_bibtex-2.1.4-py3-none-any.whl (17 kB)
...
...
Traceback (most recent call last):
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/cmd/build.py", line 279, in build_main
    args.tags, args.verbosity, args.jobs, args.keep_going)
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/application.py", line 276, in __init__
    self._init_env(freshenv)
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/application.py", line 312, in _init_env
    self.env.setup(self)
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/environment/__init__.py", line 216, in setup
    for domain in app.registry.create_domains(self):
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinx/registry.py", line 166, in create_domains
    domain = DomainClass(env)
  File "/home/docs/checkouts/readthedocs.org/user_builds/expres/envs/latest/lib/python3.6/site-packages/sphinxcontrib/bibtex/domain.py", line 263, in __init__
    "You must configure the bibtex_bibfiles setting")
sphinx.errors.ExtensionError: You must configure the bibtex_bibfiles setting

Extension error:
You must configure the bibtex_bibfiles setting

Note that an incompatible change was introduced in v2.0.0 (back in December of last year):
https://sphinxcontrib-bibtex.readthedocs.io/en/2.0.0/changes.html

To fix the issue, either update the documentation code or pin sphinxcontrib-bibtex to an older version (i.e. add something like sphinxcontrib-bibtex<2.0.0 to your docs/requirements.txt).
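For the first option, sphinxcontrib-bibtex ≥ 2.0 requires the bibtex_bibfiles setting in conf.py. A minimal sketch of the required fragment (the refs.bib filename is a placeholder, not the project's actual bibliography file):

```python
# docs/conf.py (fragment) -- required by sphinxcontrib-bibtex >= 2.0
extensions = ["sphinxcontrib.bibtex"]

# List of .bib files, relative to the source directory.
# "refs.bib" is a placeholder; substitute the project's actual file.
bibtex_bibfiles = ["refs.bib"]
```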

calc_lag update

The calc_lag routine has been modified in commit e587b56.

The documentation of the develop branch needs to be updated accordingly.

Call to Webgeocalc for ephemeris

Call to WebGeoCalc to obtain ephemeris.

We can use what B. Cecconi has written (see here).

We need to be careful about which service (OBSPM, NASA, or ESA) to use: test whether the OBSPM service is available, else use NASA or ESA depending on the observer.

Add a warning in the IDL log: if the NASA, ESA, or JAXA WGC service is used, say "Careful: you could be banned if you send too many requests."
Possibly also add a message if an error occurs, saying "You were banned."

We also need to be careful about which kernels to use (e.g., for Jupiter, archive or latest, depending on the date of the simulation).
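The service-selection logic described above could look like the following sketch (the observer-to-service preference rule and the service names are assumptions for illustration, not the actual ExPRES logic, which is IDL):

```python
def choose_wgc_service(available, observer):
    """Pick a WebGeoCalc service: prefer OBSPM when it is reachable,
    otherwise fall back to NASA/ESA/JAXA depending on the observer.

    `available` is the set of services that responded to a health check;
    the ESA-for-JUICE preference is a hypothetical example.
    """
    if "OBSPM" in available:
        return "OBSPM"
    preferred = "ESA" if observer == "JUICE" else "NASA"
    order = [preferred] + [s for s in ("NASA", "ESA", "JAXA") if s != preferred]
    for svc in order:
        if svc in available:
            # The warning requested in the issue above.
            print("Careful: you could be banned if you send too many requests")
            return svc
    raise RuntimeError("No WebGeoCalc service available")
```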

The ExPRES-v1.0 JSON schema needs to be updated

The ExPRES-v1.0 JSON schema (https://voparis-ns.pages.obspm.fr/maser/expres/v1.0/schema) has to be updated using the Advanced Users' Guide (https://github.com/maserlib/ExPRES/blob/master/docs/usage/advanced.rst):

  • "BODY": {
    "description": "Configuration of the Natural Bodies of the Simulation Run",
    "ORB_PER": {
    "description": "???", --> The orbital period according to Kepler's third law at 1 radius (in minutes)
    "type": "number"
    },
    "INIT_AX": {
    "description": "???", --> The reference longitude (in degrees)
    "type": "number"

etc...

Check WebGeoCalc configuration for ExPRES ephemeris

To be consistent with Miriade:

  • Target = JUPITER
  • Observer = EARTH
  • Light Time Correction = to observer
  • Frame type = Planetocentric (or latitudinal in webservice)

We then get a longitude which must be corrected as follows:
CML = 180 - lon

NB: this is not what has been done up to now...
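The proposed correction is a one-liner; a Python sketch (wrapping the result into [0, 360) is an assumption, since the issue only gives CML = 180 - lon):

```python
def cml_from_wgc_longitude(lon):
    """Convert the WebGeoCalc planetocentric longitude (degrees) to CML,
    wrapped into [0, 360)."""
    return (180.0 - lon) % 360.0
```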

Fix VESPA_target_* global attribute

Currently, the VESPA_target_name and VESPA_target_class attributes are a text string, with # separator.

Example:

VESPA_target_name = "Jupiter#Io"
VESPA_target_class = "planet#satellite"

We should transform this into a list of terms:

VESPA_target_name = ["Jupiter", "Io"]
VESPA_target_class = ["planet", "satellite"]
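The conversion itself is trivial; a Python sketch of the intended transformation (ExPRES itself writes these attributes from IDL):

```python
def split_vespa_attribute(value):
    """Convert the legacy '#'-separated string into a list of terms."""
    return value.split("#")
```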

Observer Parent name

Add a check on the PARENT entry of Observer: it must be the same as the first body and as the PARENT entry in Source.
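A sketch of the proposed consistency check (the JSON paths — OBSERVER/PARENT, the first BODY's NAME, the first SOURCE's PARENT — are assumptions based on the config excerpts elsewhere in this tracker):

```python
def check_observer_parent(config):
    """Raise if the Observer PARENT does not match the first body
    and the Source PARENT entry."""
    parent = config["OBSERVER"]["PARENT"]
    first_body = config["BODY"][0]["NAME"]
    source_parent = config["SOURCE"][0]["PARENT"]
    if not (parent == first_body == source_parent):
        raise ValueError(
            f"OBSERVER PARENT {parent!r} must match the first BODY "
            f"{first_body!r} and the SOURCE PARENT {source_parent!r}"
        )
```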

Ephemeris issue

How do we have to select the ephemeris?

  1. We need to take the ephemeris corrected for the observer distance, to obtain the correct longitude of the active magnetic field line at the moment of emission (ephemerides from Horizons are OK; ephemerides from UIowa are not corrected for the light travel time).
  2. Then we need to delay the observation of the emission by the light travel time of the signal from the emitting body to the observer.
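Step 2 above amounts to adding the one-way light travel time; a minimal Python sketch (names are illustrative, not ExPRES code):

```python
C_KM_S = 299_792.458  # speed of light, km/s

def observed_time(emission_time_s, observer_distance_km):
    """Delay the emission time (seconds) by the light travel time
    from the emitting body to the observer (step 2 above)."""
    return emission_time_s + observer_distance_km / C_KM_S
```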

check JSON file name in archive pipeline

In the archive, the version tag of the CDF and the JSON configuration are not consistent:

expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v01.json
expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v11.cdf
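The pipeline could assert consistency before archiving; a sketch that extracts the _vNN suffix (the pattern is taken from the two file names above):

```python
import re

def version_tag(filename):
    """Return the NN part of a trailing _vNN.json / _vNN.cdf, or None."""
    m = re.search(r"_v(\d+)\.(?:json|cdf)$", filename)
    return m.group(1) if m else None

json_v = version_tag("expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v01.json")
cdf_v = version_tag("expres_juno_jupiter_io_jrm09_lossc-wid1deg_3kev_20210930_v11.cdf")
assert json_v != cdf_v  # reproduces the reported inconsistency
```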

Multiple sources within a Source element

I'm trying to set up an input file with multiple auroral sources at different longitudes:

"SOURCE":
	[
		{"ON":true,
		"NAME":"Source1",
		"PARENT":"Jupiter",
		"TYPE":"",
		"LG_MIN":0,
		"LG_MAX":350,
		"LG_NBR":36,
		"LAT":10,
		"SUB":0,
		"AURORA_ALT":0.009091926738619804,
		"SAT":"",
		"NORTH":true,
		"SOUTH":false,
		"WIDTH":1,
		"CURRENT":"Transient (Aflvenic)",
		"CONSTANT":0.0,
		"ACCEL":10,
		"TEMP":0,
		"TEMPH":0,
		"REFRACTION":false
		},

This setup fails in m_cdf.pro

check UCD of all Variable in cdf skeleton

File https://github.com/maserlib/ExPRES/blob/develop/src/build_serpe_skt.pro
Variable / UCD

  • Epoch / time.epoch
  • Frequency / em.freq
  • Freq_Label / meta.id;em.freq
  • Src_ID_Label / meta.id;src
  • Hemisphere_ID_Label / meta.id;pos.bodyrc.lat
  • Src_Pos_Coord / pos.cartesian;src
  • Polarization / phys.count
  • VisibleSources / phys.count
  • Theta / pos.angDistance
  • Azimuth / pos.posAng
  • CML / pos.bodyrc.lon;obs.observer
  • ObsLatitude / pos.bodyrc.lat;obs.observer
  • ObsDistance / pos.distance;obs.observer
  • ObsLocalTime / pos.bodyrc.lon;obs.observer
  • SrcPosition / pos.cartesian;src
  • SrcLongitude / pos.bodyrc.lon;src
  • SrcFreqMax / stat.max;em.freq;src
  • SrcFreqMaxCMI / stat.max;em.freq;src
  • FP / em.freq;phys.density;phys.electron
  • FC / em.freq;phys.magField;phys.electron

Bug in read_ephem_obs.pro

Line 89 (develop branch) of file read_ephem_obs.pro is:

time.maxi=(aj2-aj1)*24.*60.+1

Variables aj1 and aj2 are in decimal year-day format (YYYYDDD.dddddd, with YYYY the year, DDD the day of year, and dddddd the fractional day).

In the context of the code, time.maxi should be equal to the duration between aj1 and aj2, in minutes.

The proposed computation is broken if a new year starts between aj1 and aj2. This should be fixed.
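A year-boundary-safe version converts the YYYYDDD.dddddd values to dates first; a Python sketch of the idea (the actual fix in read_ephem_obs.pro would be the analogous IDL):

```python
from datetime import datetime, timedelta

def yearday_to_datetime(aj):
    """Convert YYYYDDD.dddddd (year, day of year, fractional day)."""
    year = int(aj) // 1000
    doy = aj - year * 1000  # day of year, with fractional part
    return datetime(year, 1, 1) + timedelta(days=doy - 1)

def duration_minutes(aj1, aj2):
    """Duration between aj1 and aj2 in minutes, plus one (as for time.maxi)."""
    delta = yearday_to_datetime(aj2) - yearday_to_datetime(aj1)
    return delta.total_seconds() / 60.0 + 1

# The naive (aj2 - aj1) * 24. * 60. + 1 breaks across a year boundary
# (e.g. from 2020366.0 to 2021001.0); the date-based version does not.
```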

Error in archived CDF file

There is a CDF file which can't be opened with PyCDF. I'll check for other similar errors. We should re-run the simulation that produced this file.

>>> from spacepy import pycdf
>>> c = pycdf.CDF('/volumes/kronos/serpe/data/earth/2004/05/expres_earth_jupiter_io_jrm09_lossc-wid1deg_3kev_20040516_v11.cdf')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 1718, in __init__
    self._open(True if readonly is None else readonly)
  File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 1899, in _open
    lib.call(const.OPEN_, const.CDF_, self.pathname, ctypes.byref(self._handle))
  File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 633, in call
    *(args + (const.NULL_, ))
  File "/obs/rpws/anaconda3/envs/cdf/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 585, in check_status
    raise CDFError(status)
spacepy.pycdf.CDFError: CHECKSUM_ERROR: The data integrity verification through the checksum failed.
>>> 

error in fz_spdyn

I get an error in fz_spdyn, line 213:

for j=0l,parameters.time.n_step-1l do image(j,*)=interpol(float(image(j,*)),f,u)

Error message:

% INTERPOL: V and X arrays must have same # of elements
% Execution halted at: FZ_SPDYN          213

Input files (10 sources are configured)

VESPA validator error

While running the taplint epn_core validator, it appears that the VESPA_measurement_type global attribute must be updated to a valid value.

The current value is em.radio;meta.modelled and both words can only be used as qualifiers (secondary terms). We have to select a primary term, like phot.flux.density (which is not so well adapted), phys.polarization, pos, src...

Maybe a good pick would be pos;src;em.radio;meta.modelled, which translates into:

Position and coordinates / Observed source viewed on the sky / Radio part of the spectrum / Quantity was produced by a model

(using the explain tool of http://cdsweb.u-strasbg.fr/UCD/tools.htx)

Add new lead angle models

Add the new lead angle models that L. Lamy determined from Bonfond et al. (2017) and Hinton et al. (2019), once he has provided them.

CDF compression

Add an entry in the JSON to automatically compress the CDF file in ExPRES.

Inconsistent Src_ID_Label

Dear developers,

The Src_ID_Label field in the ExPRES CDF is filled from an IDL table with string sizes equal to the maximal string length in the original array. This results in trailing blank spaces at the end of some values, which is not optimal for reading.

For instance:
['0d-30R NORTH ' '119d-30R NORTH' '239d-30R NORTH' '359d-30R NORTH' '0d-30R SOUTH ' '119d-30R SOUTH' '239d-30R SOUTH' '359d-30R SOUTH' 'Io NORTH ' 'Io SOUTH ']
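A simple clean-up on the writing (or reading) side would strip the padding; a Python sketch:

```python
def strip_src_id_labels(labels):
    """Remove the trailing blanks introduced by IDL fixed-width
    string arrays (all elements padded to the longest value)."""
    return [s.rstrip() for s in labels]
```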

Cheers!

ExPRES User Guide

What can we do about the 10-minute limitation of the 'Run-on-demand' free access on UWS?

Description of the CDF file output

It would be good to update the documentation with a description of the CDF file output, explaining each variable of the file and saying which option shall be used to activate/deactivate its output.

New release ExPRES 1.1

Modifications from ExPRES 1.0 to 1.1:

  • The NAME entry now automatically takes the name of the JSON file; the OUT entry is given by the user in the config.init file
  • Changing the polarization output in the CDF file (LH/RH separate and sum of the visible sources)
  • Adding reading of user-customized ephemeris files in CSV format
  • Modifying the call to MFL (lsh vs. msh, depending on whether the planet is Jupiter or not)
  • Adding a user-customized frequency list option
  • Adding the CDF output parameter VisibleSources (SRCVIS in the JSON file), which allows obtaining the polarization and the number of visible sources for each hemisphere

Name of "source" item

It seems that the source name must be of the form "SourceXXX", where XXX is a number (or a string that can be interpreted as a float).

  • Source1 is valid
  • Source1.1 is valid
  • Source01 is valid
  • Source10 is valid
  • Source_10 is invalid
  • Source_N10d is invalid

This can be included into the JSON schema validation.
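A sketch of what the schema rule implies, as a regular expression (the exact pattern is an assumption inferred from the valid/invalid examples above):

```python
import re

# "Source" followed by digits, optionally with a decimal part --
# matches Source1, Source1.1, Source01, Source10;
# rejects Source_10, Source_N10d.
SOURCE_NAME = re.compile(r"^Source\d+(\.\d+)?$")

def is_valid_source_name(name):
    return bool(SOURCE_NAME.match(name))
```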

.YML configuration file

  • if [SIMU][NAME] == "config_parameters_set01.yml": change the file names as follows:
    expres_observertargetdate_AFT_magXX_srcXX_outXX_vXX.cdf
    with magXX_srcXX_outXX the parameters set configuration for the magnetic field, the source and the outputs,
    e.g. earth_jupiter_io_20200929_mag01_src01_out01_v1.0.cdf

  • else: keep the previous file name, e.g. expres_earth_jupiter_io_jrm09_lossc-wid1deg_3kev_20200929_v01.cdf?
    or only expres_earth_jupiter_io_jrm09_20200929_v01.cdf?

  • if [SIMU][NAME] == "config_parameters_set01.yml":

    • user-defined parameters ([SPDYN][CDF], [BODY][main_body][MAG], [SOURCE][X][WIDTH, CURRENT, ACCEL], ...) will not be taken into account
    • these parameters will be defined from those in the YAML file.

YAML file example:
[header]
- paramset_id: 01
[src_params]
- source_energy: 3 kev
- source_origin: Io
- source_lag_model: Hess2008
- source_aurora_alt: AURORA_ALT
- ...
[mfl_params]
- mfl_model: JRM09
- cs_model: CAN81
- density_model
[output_params]
- THETA: true
- FP: true
- FC: true
- AZIMUTH: false
- OBSLATITUDE: true
- SRCLONGITUDE: false
- SRCFREQMAX: true
- OBSDISTANCE: true
- OBSLOCALTIME: false
- CML: true
- SRCPOS: false

ExPRES Provenance file

Need to automatically create a provenance file that will contain all data provenance (including, e.g., kernel names).
This file needs to be created by UWS while running the "archive" simulations.

Error in dev branch: JSON input file already exists

I tried to launch a run with an input Ephem file on the dev branch. I got the following IDL error in stderr:

mv: 'Auroral_Galileo_C30.json' and 'Auroral_Galileo_C30.json' are the same file
% READ_SAVE_JSON: Something wrong happened with JSON file... Aborting.
% Stop encountered: READ_SAVE_JSON   1156 /obs/serpe/code/ExPRES/src/read_save.pro
