iris-grib's Introduction

GRIB (GRIdded Binary) interface for Iris

โš™๏ธ CI ci-manifest ci-tests pre-commit
๐Ÿ’ฌ Community Contributor Covenant GH Discussions
๐Ÿ“– Documentation rtd
๐Ÿ“ˆ Health codecov
โœจ Meta Ruff NEP29 license - bds-3-clause conda platform
๐Ÿ“ฆ Package conda-forge pypi pypi - python version
๐Ÿงฐ Repo commits-since contributors release

For documentation see the latest developer version or the most recent released stable version.


iris-grib's People

Contributors

ajdawson, bblay, bjlittle, corinnebosley, dependabot[bot], djkirkham, dpeterk, esadek-mo, esc24, hgwright, jamesp, kaedonkers, kwilliams-mo, lbdreyer, m1dr, marqh, niallrobinson, pelson, pp-mo, pre-commit-ci[bot], qulogic, rhattersley, s-boardman, scitools-ci[bot], shoyer, stephenworsley, tkknight, tpowellmeto, trexfeathers, tv3141


iris-grib's Issues

Why Iris-grib Windows support through conda?

Iris-grib is available for Windows systems, but it relies on gribapi and, sooner or later, ecCodes. However, neither of those packages appears to be available for a Windows environment with Python support. Am I missing something, or is there a reason why you can install this package into a Windows environment even though it requires something that you cannot?

Installing iris-grib free version and iris=1.13 in the same environment

Hey guys @bjlittle @corinnebosley here's one for you after the bank holiday weekend 😁

Installing iris-grib and iris=1.13 in a conda environment is not working unless iris-grib is restricted to an old version. See e.g.:

---
name: cdds_free
channels:
  - conda-forge

dependencies:
  - iris=1.13
  - iris-grib

This will install iris=1.13 and iris-grib=0.12 and the env will be created no problemo, but the env is not functional, since iris-grib wants iris._lazy_data, which is available only in iris>=2.

Just a heads up: you guys need to tweak the conda dependencies file for iris-grib a bit 🍺
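A sketch of the workaround the report implies: pin iris-grib to a release whose conda metadata actually permits iris 1.13. The version number below is hypothetical and would need checking against the conda-forge metadata:

```yaml
---
name: cdds_free
channels:
  - conda-forge

dependencies:
  - iris=1.13
  # hypothetical pin: choose a release that declares iris<2 in its
  # run requirements (iris-grib=0.12 requires iris>=2)
  - iris-grib=0.9
```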

Mercator grid not supported?

Hi there,
I got a TranslationError reporting that mercator is an unhandled grid type.
Does this mean that the Mercator grid in general is not supported by iris-grib?
Thanks!

> File "iris-plot.py", line 21, in plotit
>     cubelist = iris.load(gribpath)
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/__init__.py", line 341, in load
>     return _load_collection(uris, constraints, callback).merged().cubes()
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/__init__.py", line 311, in _load_collection
>     result = iris.cube._CubeFilterCollection.from_cubes(cubes, constraints)
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/cube.py", line 144, in from_cubes
>     for cube in cubes:
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/__init__.py", line 298, in _generate_cubes
>     for cube in iris.io.load_files(part_names, callback, constraints):
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/io/__init__.py", line 201, in load_files
>     for cube in handling_format_spec.handler(fnames, callback):
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/fileformats/rules.py", line 1047, in load_cubes
>     user_callback_wrapper=loadcubes_user_callback_wrapper):
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/fileformats/rules.py", line 960, in _load_pairs_from_fields_and_filenames
>     for field, filename in fields_and_filenames:
>   File "/home/miniconda2/lib/python2.7/site-packages/iris/fileformats/rules.py", line 1024, in _generate_all_fields_and_filenames
>     filename, **loader.field_generator_kwargs):
>   File "/home/miniconda2/lib/python2.7/site-packages/iris_grib/__init__.py", line 700, in _load_generate
>     message = GribWrapper(message_id, grib_fh=grib_fh)
>   File "/home/miniconda2/lib/python2.7/site-packages/iris_grib/__init__.py", line 170, in __init__
>     self._compute_extra_keys()
>   File "/home/miniconda2/lib/python2.7/site-packages/iris_grib/__init__.py", line 460, in _compute_extra_keys
>     raise TranslationError("unhandled grid type: {}".format(gridType))
> iris.exceptions.TranslationError: unhandled grid type: mercator

KeyError: "'editionNumber' not defined in section 0" on CFSR grib files

I'm trying to read Climate Forecast System Reanalysis (CFSR) grib files and getting the error KeyError: "'editionNumber' not defined in section 0". I'm guessing it is a grib definition problem. When using grib_dump I get the following:

***** FILE: /Volumes/RV_Passport/CFSR/vertical_wind/197901010000_vertical_wind_pgbh.grb2 
ECCODES ERROR   :  grib_parser: syntax error at line 18 of /Users/raulvalenzuela/miniconda2/share/grib_api/definitions/grib2/section.0.def
ECCODES ERROR   :  ecCodes Version: 2.8.0
#==============   MESSAGE 1 ( length=-9999 )               ==============
ECCODES ERROR   :  Key/value not found

Is there any way I can fix this?

FF - GRIB2 conversion failure (Iris test data file)

Trying to convert the n48-multi-field FF from the Iris test data repo fails:

>>> cubes = iris.load('n48-multi-field')
>>> iris.save(cubes, test.grib2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'test' is not defined
>>> iris.save(cubes, 'test.grib2')
GRIB_API ERROR   :  Key "forecastTime": Trying to encode a negative value of -3 for key of type unsigned

Traceback (most recent call last):
  ..
  File "/.../gribapi/__init__.py", line 143, in GRIB_CHECK
    raise GribInternalError(errid)
gribapi.GribInternalError: Encoding invalid

Grid Type 3.12 Transverse Mercator not loading

I am trying to load a grib2 file and am getting the error 'iris.exceptions.TranslationError: Grid definition section 3 contains unsupported source of grid definition [12]'.

The grib header for record 1 is:

GRIB {
  # Meteorological products (grib2/tables/4/0.0.table)
  discipline = 0;
  editionNumber = 2;
  # U.K. Met Office - Exeter (common/c-11.table)
  centre = 74;
  subCentre = 0;
  # Start of forecast (grib2/tables/4/1.2.table)
  significanceOfReferenceTime = 1;
  dataDate = 20190225;
  dataTime = 1200;
  # Operational products (grib2/tables/4/1.3.table)
  productionStatusOfProcessedData = 0;
  # Analysis and forecast products (grib2/tables/4/1.4.table)
  typeOfProcessedData = 2;
  numberOfDataPoints = 385792;
  # There is no appended list (grib2/tables/4/3.11.table)
  interpretationOfNumberOfPoints = 0;
  # Unknown code table entry (grib2/tables/4/3.1.table)
  gridDefinitionTemplateNumber = 12;
  # Earth assumed oblate spheroid with major and minor axes specified by data producer (grib2/tables/4/3.2.table)
  shapeOfTheEarth = 3;
  Ni = 548;
  Nj = 704;
  latitudeOfReferencePointInDegrees = 4.9e-05;
  longitudeOfReferencePointInDegrees = -2e-06;
  XRInMetres = 400000;
  YRInMetres = -100000;
  iScansNegatively = 0;
  jScansPositively = 1;
  jPointsAreConsecutive = 0;
  alternativeRowScanning = 0;
  DiInMetres = 2000;
  DjInMetres = 2000;
  X1InGridLengths = -238000;
  Y1InGridLengths = 1.222e+06;
  X2InGridLengths = 856000;
  Y2InGridLengths = -184000;
  gridType = transverse_mercator;
  NV = 0;
  # Analysis or forecast at a horizontal level or in a horizontal layer at a point in time (grib2/tables/4/4.0.table)
  productDefinitionTemplateNumber = 0;
  # Temperature (grib2/tables/4/4.1.0.table)
  parameterCategory = 0;
  # Temperature  (K)  (grib2/tables/4/4.2.0.0.table)
  parameterNumber = 0;
  #-READ ONLY- parameterUnits = K;
  #-READ ONLY- parameterName = Temperature ;
  # Analysis (grib2/tables/4/4.3.table)
  typeOfGeneratingProcess = 0;
  generatingProcessIdentifier = 1;
  # Minute (grib2/tables/4/4.4.table)
  indicatorOfUnitOfTimeRange = 0;
  # Hour (stepUnits.table)
  stepUnits = 1;
  forecastTime = 0;
  stepRange = 0;
  # Specified height level above ground  (m)  (grib2/tables/4/4.5.table)
  typeOfFirstFixedSurface = 103;
  #-READ ONLY- unitsOfFirstFixedSurface = m;
  scaleFactorOfFirstFixedSurface = 0;
  scaledValueOfFirstFixedSurface = 1;
  # Missing (grib2/tables/4/4.5.table)
  typeOfSecondFixedSurface = 255;
  #-READ ONLY- unitsOfSecondFixedSurface = unknown;
  #-READ ONLY- nameOfSecondFixedSurface = Missing;
  scaleFactorOfSecondFixedSurface = MISSING;
  scaledValueOfSecondFixedSurface = MISSING;
  level = 1;
  shortNameECMF = t;
  shortName = t;
  nameECMF = Temperature;
  name = Temperature;
  cfNameECMF = air_temperature;
  cfName = air_temperature;
  cfVarNameECMF = t;
  cfVarName = t;
  #-READ ONLY- modelName = unknown;
  numberOfValues = 385792;
  packingType = grid_simple;

Provide format-specific phenomenon identity for load and save

Summary: a solution like the 'STASH' attribute, to carry grib-specific parameter identity.

At present, if you want to save Iris data for which no known parameter code translation exists, it just gets a 0, so you can't save multiple such things to one file.

Likewise on load, an unknown parameter code just results in an 'unknown' or anonymous cube
-- and maybe we can even wind up merging things that should be kept separate ??

I'm imagining a specially recognised cube attribute, along the lines of 'STASH' for PP+FF encoding.
It needs to encode: (edition, discipline, parameterCategory, parameterNumber). We can probably handle GRIB1 in that form somehow; not clear on detail yet.
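As a sketch only (none of these names exist in iris-grib): such an attribute could be a small value type that round-trips through a string form, much as 'STASH' does for PP/FF. The G2Param name and its string encoding here are invented for illustration:

```python
from collections import namedtuple


class G2Param(namedtuple("G2Param", "edition discipline category number")):
    """Hypothetical GRIB parameter-identity attribute:
    (edition, discipline, parameterCategory, parameterNumber)."""

    __slots__ = ()

    def __str__(self):
        # Made-up string encoding, e.g. "GRIB2:d000c019n000".
        return "GRIB{}:d{:03d}c{:03d}n{:03d}".format(
            self.edition, self.discipline, self.category, self.number)

    @classmethod
    def from_string(cls, text):
        # Parse the string form back into its four integer fields.
        edition = int(text[4:text.index(":")])
        disc = int(text[text.index("d") + 1:text.index("c")])
        cat = int(text[text.index("c") + 1:text.index("n")])
        num = int(text[text.index("n") + 1:])
        return cls(edition, disc, cat, num)
```

Such a value could then be attached as a cube attribute on load and consulted on save, so that two cubes with untranslatable but distinct codes stay distinct.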

Remove deprecations

I.e. where it currently states ".. deprecated:: 1.6" or whatever, we should change it to the relevant iris-grib version.

No deprecations!

Python3 problems

I'm having a problem importing iris_grib in Python3:

>>> import iris_grib
[...]
No module named 'gribapi_swig'

However, installing grib_api with --enable-python didn't complain ...

Any ideas what might be the issue?

two phenomena with the same Grib2 parameter

Hi there,

While trying to handle each Grib2 message with iris-grib, I found two messages with an identical Grib2 parameter but describing different phenomena. How could I handle this in terms of putting the G2Param codes in _grib_cf_map.py?

I attached the details of these two message below:

  1. Variable "Visibility_height_above_ground"

float Visibility_height_above_ground(time=1, height_above_ground1=1, lat=769, lon=1024);
:long_name = "Visibility @ Specified height level above ground";
:units = "m";
:description = "Visibility";
:missing_value = NaNf; // float
:grid_mapping = "LatLon_Projection";
:coordinates = "reftime time height_above_ground1 lat lon ";
:Grib_Variable_Id = "VAR_0-19-0_L103";
:Grib2_Parameter = 0, 19, 0; // int
:Grib2_Parameter_Discipline = "Meteorological products";
:Grib2_Parameter_Category = "Physical atmospheric properties";
:Grib2_Parameter_Name = "Visibility";
:Grib2_Level_Type = "Specified height level above ground";
:Grib2_Generating_Process_Type = "Analysis";

  2. Variable "Visibility_height_above_ground_probability_below_5000"

float Visibility_height_above_ground_probability_below_5000(time=1, height_above_ground1=1, lat=769, lon=1024);
:long_name = "Probability Visibility below_5000 m @ Specified height level above ground";
:units = "%";
:description = "Visibility";
:missing_value = NaNf; // float
:grid_mapping = "LatLon_Projection";
:coordinates = "reftime time height_above_ground1 lat lon ";
:Grib_Variable_Id = "VAR_0-19-0_L103_Prob_below_5000";
:Grib2_Parameter = 0, 19, 0; // int
:Grib2_Parameter_Discipline = "Meteorological products";
:Grib2_Parameter_Category = "Physical atmospheric properties";
:Grib2_Parameter_Name = "Visibility";
:Grib2_Level_Type = "Specified height level above ground";
:Grib2_Probability_Type = 0; // int
:Grib2_Probability_Name = "below_5000";
:Grib2_Generating_Process_Type = "Analysis";

Thanks!

Regards
Haipeng

Tables incomplete?

Hi,
we are working a lot with different NWP models, and I was hoping iris-grib would be the glue to join the NetCDF and GRIB worlds. I already have some homebrew classes to deal with this mess, but of course it would be much more attractive to leverage the power of iris for this job. However, after installing from conda

ecmwf_grib                1.23.1                        0    conda-forge
iris                      1.13.0                   py27_1    conda-forge
iris-grib                 0.10.1                   py27_0    conda-forge
python-ecmwf_grib         1.23.1                   py27_0    conda-forge

I still get a lot of unknown fields for most of our GRIB data (examples for GFS-4, ECMWF and Environment Canada GDPS appended). These fields check out fine when I run e.g. grib_ls on the command line, and grib_info also points to the exact conda environment I run the Python code from:

grib_api Version 1.23.1
Default definition files path is used: /user/martin/.conda/envs/dev/share/grib_api/definitions

Can anyone explain this effect? I'm quite motivated to help getting this thing to fly, but I would appreciate someone pointing me in the right direction first.

Another (probably related?) issue is that for e.g. the GFS-4 in my example, I get like 10 cubes of y_wind having the exact same metadata. In reality, they differ in levelType. Can I somehow tell the load function to attach the levelType (and/or any other parameters I'm interested in) to the cube as a meta-parameter? Or is the philosophy here to fiddle with the GribMessages on a low level and then call load_pairs_from_fields?

iris_grib_issues.txt

Best Regards,
Martin

Port remaining grib tests from Iris + remove circularity

See Iris#2860 for remaining tests to be ported from Iris to Iris-grib.

This comment describes the principles wanted, which should deliver a more sensible Iris with no remaining dependencies on iris-grib (that is, not even in the testing).

In addition, this will be a good time to review the iris-grib codebase and weed out any remaining dependencies on private Iris API.

UnsatisfiableError when installing iris-grib with conda

Greetings,

Versions

  • Operation system : WSL Ubuntu v18.04 on Windows 10
  • Anaconda : 2019-03
  • Conda : 4.7.5
  • Python : 3.7.3
  • Iris : 2.2.0

Code

Input
conda install -c conda-forge iris-grib

Traceback

Collecting package metadata (current_repodata.json): done
Solving environment: failed
Collecting package metadata (repodata.json): done
Solving environment: failed

UnsatisfiableError: The following specifications were found to be incompatible with each other:

  - pkgs/main/linux-64::anaconda==2019.03=py37_0 -> importlib_metadata==0.8=py37_0
  - pkgs/main/linux-64::anaconda==2019.03=py37_0 -> pango==1.42.4=h049681c_0
  - pkgs/main/linux-64::importlib_metadata==0.8=py37_0
  - pkgs/main/linux-64::pango==1.42.4=h049681c_0
  - pkgs/main/linux-64::path.py==11.5.0=py37_0 -> importlib_metadata[version='>=0.5']

What could possibly explain this predicament? Similarly, the exact same error occurs when conda installing eccodes.

Thank you!

Rewrite save rules in terms of GribMessage

All the grib2 load rules in iris_grib._load_convert are written in terms of the sectionally-structured GribMessage object.
But all the save rules (iris_grib._save_rules) use the gribapi (and/or ecCodes) directly instead.

Technical Debt summary:

  • fix:

    • extend GribMessage to add 'create from template' and 'write to file'.
    • rewrite existing save rules in terms of GribMessage, and rearrange sectionally (like load rules)
    • costs of change:
      • different from load rules means more to learn, hard to read
      • testing is especially more complicated
  • costs of the legacy support:

    • code is rather exposed to changes in the ECMWF APIs
    • confusion from the different types of "message" in our docs
      • for devs especially, in our testing code
    • existing save rules are less tidy, as you don't get the benefit of the sectional division of the GribMessage object:
      • unlike the load rules, where appropriate data sections are passed, which is neat + clean
      • this also complicates testing, with elaborate schemes to mock out gribapi operations

To fix:

  • make it possible to create a GribMessage from a gribapi template, and save it to a file.
  • rework the existing rules to work on GribMessages instead of a gribapi handle.
  • refactor the tests likewise (including various test classes mocking out message access)

Enable web documentation

Following #55, we need to enable docs building on this repo.

I think that means a bit more work with readthedocs, to sort out a scitools organisation account and possibly to get docs built for tagged release versions.
It will then maintain docs for different versions with a dropdown control (and a 'latest' concept), which seems to be a key benefit of readthedocs if you care about that.

Fix level indexing in hybrid coords

Revert #106!
I think this change was a misunderstanding: the test data does not show clearly that these were the wrong numbers, but other (real) data encountered since does.

(N.B. the actual Grib spec is really vague on this, hence the interpretation depended on what I found in actual files!)

Statistical processing period being set wrong on save?

I think iris-grib might be setting the value of the lengthOfTimeRange key incorrectly when saving cubes with a statistic (cell method) over time. According to the GRIB spec, the definition of this key (as used in PDT4.8, PDT4.10 and PDT4.11) is:

Length of the time range over which statistical processing is done, in units defined by the previous octet

To me, this sounds like the interval of the cell method. For example, in the following case, I'd have expected this key to be set to one hour:

Cell methods:
    mean: time (1 hour)

Instead, the value of this key is currently being set to the difference between the lower and upper bounds of the cube's time coordinate:

time_range_in_hours = end_hours - start_hours
integer_hours = int(time_range_in_hours)
if integer_hours != time_range_in_hours:
    msg = ('Truncating floating point lengthOfTimeRange {} to '
           'integer value {}')
    warnings.warn(msg.format(time_range_in_hours, integer_hours))
gribapi.grib_set(grib, "lengthOfTimeRange", integer_hours)

I'd appreciate others' input here: is iris-grib doing the right thing currently, or do we need to correct the above code?
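A minimal numeric illustration of the two readings, using invented example values (a field whose time bounds span 24 hours of hourly means):

```python
from datetime import datetime

# Hypothetical example: cell method "mean: time (1 hour)" over a 24 h window.
cell_method_interval_hours = 1           # the cell-method interval
lower_bound = datetime(2020, 1, 1, 0)    # lower time bound
upper_bound = datetime(2020, 1, 2, 0)    # upper time bound

# What the current save code writes: the difference of the time bounds.
current = (upper_bound - lower_bound).total_seconds() / 3600   # 24.0

# What the reading of the spec above would suggest: the cell-method interval.
suggested = cell_method_interval_hours                         # 1
```

The two values only coincide when the statistic covers exactly one sampling interval, which is presumably why the discrepancy goes unnoticed in many files.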

Fix for latest gribapi

Version 1.16 is available, but currently causes us some failures.

Eventually we'll presumably want to throw this away and replace it with ecCodes, but it isn't clear when that will be appropriate: see #58

Different types of first and second fixed surface: Why error?

Hi, I tried to load a grib2 containing 'Low Cloud Cover (800 hPa - Soil)' from opendata.dwd.de.

This raises a TranslationError: Product definition section 4 has different types of first and second fixed surface.

I don't see why this should be a problem, can anyone clear things up? Thanks!

grib_dump shows

#-Unknown code table entry ()
typeOfFirstFixedSurface = 100;
#-READ ONLY- unitsOfFirstFixedSurface = 100;
#-READ ONLY- nameOfFirstFixedSurface = 100;
scaleFactorOfFirstFixedSurface = 0;
scaledValueOfFirstFixedSurface = 80000;
#-Unknown code table entry ()
typeOfSecondFixedSurface = 1;
#-READ ONLY- unitsOfSecondFixedSurface = 1;
#-READ ONLY- nameOfSecondFixedSurface = 1;
scaleFactorOfSecondFixedSurface = 0;
scaledValueOfSecondFixedSurface = 0;
topLevel = 800;
bottomLevel = 0;
shortNameECMF = ccl;
shortName = CLCL;
nameECMF = Cloud cover;
name = Cloud Cover (800 hPa - Soil);

gribapi or grib_api?

I have compiled and installed GRIB API, but the Python package installed is named grib_api, while iris-grib looks for gribapi. What am I supposed to do?

Add support for ProductDefinitionTemplate 4.60

Product definition template 4.60 is used by many of the datasets in the ECMWF MARS system, including the S2S and Copernicus seasonal datasets, which are growing increasingly important. Therefore support for it in iris-grib is becoming more necessary.

Details can be found in the WMO documentation:
Product definition template 4.60 - individual ensemble reforecast, control and perturbed, at a horizontal level or in a horizontal layer at a point in time

Dask worker fails to open badly translated grib file with Iris

First, the traceback:

  File "/data/pelson/conda/envs/data-transform/lib/python2.7/site-packages/iris_grib/_load_convert.py", line 1634, in generating_process
    if options.warn_on_unsupported:
AttributeError: 'thread._local' object has no attribute 'warn_on_unsupported'

There is some weird interaction between the pickled function and the way the function is called with dask. Essentially, I think iris-grib is relying on the _load_convert having been run, but that isn't necessarily the case when pickling/unpickling the function (or at least, the local it is referencing isn't the one that the thread has run the module on). This is a somewhat woolly description 😜.

Rather than depending on the module having been run, we either need to validate that the threading.local has been correctly configured, or just ditch the complexity. I don't currently see the use case for having warn_on_unsupported defined on a per-thread basis. The implementation of this was originally done in SciTools/iris@0248bf3, and for all of those tests, I believe a module-level constant would suffice.
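The failure mode can be reproduced without iris-grib at all: an attribute set on a threading.local in one thread is simply absent on any other thread, which is exactly what a dask worker thread sees here.

```python
import threading

# An options object like the one in _load_convert: the attribute is set on
# the importing (main) thread only.
options = threading.local()
options.warn_on_unsupported = False

seen = {}

def worker():
    # A fresh thread gets a fresh, empty threading.local namespace.
    seen["has_attr"] = hasattr(options, "warn_on_unsupported")

t = threading.Thread(target=worker)
t.start()
t.join()

assert seen["has_attr"] is False   # the worker thread cannot see the attribute
```

This is why accessing options.warn_on_unsupported on the worker raises AttributeError, and why a module-level constant (or a getattr with a default) would sidestep the problem.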

Time averaged fields parsed incorrectly for ARPEGE

Hi,

I am currently working with data from the ARPEGE model (downloadable from here: https://donneespubliques.meteofrance.fr/?fond=produit&id_produit=130&id_rubrique=51).

Several of their fields are cumulative or otherwise time-averaged, but iris-grib does not seem to be able to interpret them correctly. I don't know whether the files are not written according to the standard, or if it's a case that just hasn't occurred yet, but the following patch seems to do the trick:

--- a/iris_grib/_load_convert.py
+++ b/iris_grib/_load_convert.py
@@ -1661,6 +1661,11 @@ def statistical_forecast_period_coord(section, frt_coord):
 
     # Get the period end time (as a timedelta relative to the frt).
     end_time_delta = end_time - frt_point
+    if (end_time_delta.total_seconds() <= 0) and ('lengthOfTimeRange' in section.keys()):
+        # apparently the end time is not set correctly, replace it
+        # with a different key (needed e.g. for ARPEGE model)
+        time_range_seconds = forecast_units.convert(section['lengthOfTimeRange'], 'seconds')
+        end_time_delta = timedelta(seconds=time_range_seconds) + start_time_delta
 
     # Get the middle of the period (as a timedelta relative to the frt).
     # Note: timedelta division in 2.7 is odd. Even though we request integer

BTW, there are also a number of parameters which iris-grib does not recognize. I manually hacked the GRIB2_TO_CF list and the standard names check to make it work, but maybe someone who knows how the translation lists are generated could implement this properly?

Add `product_definition_template_5`

hi there,

While working with GRIB files, I found the need to add product_definition_template_5. Partly this would repeat product_definition_template_9, but it would still be valuable to add, in my view. My contribution would be as follows, with the option to optimise the repeated part (it works for my local variables):

def product_definition_section(section, metadata, discipline, tablesVersion,
                               rt_coord):
    ...
    # add template 5
    elif template == 5:
        probability = product_definition_template_5(section, metadata,
                                                    rt_coord)
    ...

def product_definition_template_5(section, metadata, rt_coord):
    """
    Translate keys from the provided grib message based on Product
    Definition Template 4.5.
    Template 4.5 is used to represent probability forecasts at a horizontal
    level or in a horizontal layer at a point in time.
    """
    # gribapi.grib_set(grib, "productDefinitionTemplateNumber", 5)
    product_definition_template_0(section, metadata, rt_coord)

    # NOTE: we currently don't record the nature of the underlying statistic,
    # as we don't have an agreed way of representing that in CF.

    # Return a probability object to control the production of a probability
    # result.  This is done once the underlying phenomenon type is determined,
    # in 'translate_phenomenon'.
    probability_typecode = section['probabilityType']

    if probability_typecode == 0:
        # Type is "probability of event below lower limit".
        threshold_value = section['scaledValueOfLowerLimit']
        if threshold_value == _MDI:
            msg = 'Product definition section 4 has missing ' \
                'scaled value of lower limit'
            raise TranslationError(msg)
        threshold_scaling = section['scaleFactorOfLowerLimit']
        if threshold_scaling == _MDI:
            msg = 'Product definition section 4 has missing ' \
                'scale factor of lower limit'
            raise TranslationError(msg)
        # Encode threshold information.
        threshold = unscale(threshold_value, threshold_scaling)
        probability_type = Probability('below_threshold', threshold)

    elif probability_typecode == 1:
        # Type is "probability of event above upper limit".
        threshold_value = section['scaledValueOfUpperLimit']
        if threshold_value == _MDI:
            msg = 'Product definition section 4 has missing ' \
                'scaled value of upper limit'
            raise TranslationError(msg)
        threshold_scaling = section['scaleFactorOfUpperLimit']
        if threshold_scaling == _MDI:
            msg = 'Product definition section 4 has missing ' \
                'scale factor of upper limit'
            raise TranslationError(msg)
        # Encode threshold information.
        threshold = unscale(threshold_value, threshold_scaling)
        probability_type = Probability('above_threshold', threshold)
        # Note that GRIB provides separate "above lower threshold" and "above
        # upper threshold" probability types.  This naming style doesn't
        # recognise that distinction.  For now, assume this is not important.
    else:
        msg = ('Product definition section 4 contains an unsupported '
               'probability type [{}]'.format(probability_typecode))
        raise TranslationError(msg)

    return probability_type

Could you guys please review and test the integration etc. and see if this is worth adding?

Cheers
Haipeng

KeyError: "'codedValues' not defined in section 7" when loading a possibly corrupted file

Hi all,
I got a KeyError: "'codedValues' not defined in section 7" when loading a grib2. I suspect it is raised because the file is somehow corrupt, because loading works on some files and not on others.

I uploaded a sample file here, which raises the error for me: http://homepage.univie.ac.at/a1254888/ICON_EU_single_level_elements_CLCM_2018050512.grib2
It originates from a grib2 from opendata.dwd.de, which has been loaded with Iris, reduced in size (area), and saved. (The error is raised on about 10-20% of all files.)

It is enough to do
cube = iris.load('ICON_EU_single_level_elements_CLCM_2018050512.grib2')[0]
cube.data
and the error is raised.

How should I load many files, not knowing which (if any) are corrupted?
For now I am iterating over all the files and loading each inside a try-block. This does not seem to be a lot slower on ~300 files.

Would be great if you could share your comments on the problem.
Thanks!
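The try-block approach described above can be sketched generically. The load_safely name is invented for illustration; the loader argument would be e.g. iris.load, passed in so the sketch stays library-agnostic:

```python
def load_safely(paths, loader):
    """Load each path with `loader`, skipping files that raise
    (e.g. KeyError: "'codedValues' not defined in section 7"),
    and report the failures alongside the successful results."""
    cubes, failures = [], []
    for path in paths:
        try:
            cubes.extend(loader(path))
        except Exception as exc:
            # Broad catch on purpose: a corrupt file can fail in many ways.
            failures.append((path, exc))
    return cubes, failures
```

Collecting the failures rather than silently dropping them makes it easy to report which files were corrupt afterwards.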

Bug with saving masked arrays (inconsistent fill value dtype)

When a cube with a masked array is saved, if the masked array does not have a fill value specified, it will use the numpy default value, which has dtype np.float64. This may be inconsistent with the dtype of the array itself, and when the array is filled (here) it will be filled with a value that has the dtype of the array (e.g. np.float32). This will then be inconsistent with the fill value described on this line, which will cause the data to be saved incorrectly.

Changing this line to the following seems to fix this problem:

fill_value = float(np.array(cube.data.fill_value, dtype=cube.data.dtype))
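The mismatch, and why the proposed fix works, can be demonstrated with plain numpy:

```python
import numpy as np

# A float32 masked array with one masked element.
data = np.ma.masked_array(np.arange(4, dtype=np.float32), mask=[0, 1, 0, 0])

# The numpy default fill value for floats is 1e20, a 64-bit float...
default_fill = np.ma.default_fill_value(data)

# ...but filling casts it to the array's own dtype (float32), so the value
# actually written into the data differs from the recorded fill value.
written = data.filled(default_fill)[1]
assert float(written) != float(default_fill)

# The proposed fix: round-trip the fill value through the data dtype first,
# so the recorded fill value matches what is actually in the filled array.
fill_value = float(np.array(default_fill, dtype=data.dtype))
assert fill_value == float(written)
```

With the round-tripped value, the fill value written to the GRIB bitmap metadata agrees exactly with the values present in the packed data.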

Iris-grib saver doesn't save attributes

The grib saver used by iris (iris-grib, grib_api) does not save any cube attributes, with the exception of 'centre = Met Office', which then cannot be loaded back in.

Outstanding technical debt

A grab-bag of known issues we'd like to get round to "some time"
PLEASE ADD ANY YOU LIKE...

  • #56 GRIB1 support is still using the old "non-strict" grib handler code
    • fix: rewrite grib1 loading using iris_grib.message.GribMessage instead
    • legacy cost: clarity of the main code structure; dread of maintaining this (though little need has arisen)
  • #119 all save rules are written in terms of the gribapi API, unlike the load rules
    • fix:
      • extend GribMessage to add 'create from template' and 'write to file'.
      • rewrite existing save rules in terms of GribMessage, and rearrange sectionally (like load rules)
    • legacy cost:
      • different from load rules means more to learn, hard to read
      • testing is especially more complicated

Unable to load ECMWF daily climatology grib1

From @cbridge2 on September 27, 2017 2:14

Hi Iris developers,
I tried to load an ECMWF daily climatology grib1 file using Iris 1.12.0:
clim = iris.load('era_clim_mslp_jan01.grib')

And got the following problem:

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/__init__.pyc in load(uris, constraints, callback)
    339 
    340     """
--> 341     return _load_collection(uris, constraints, callback).merged().cubes()
    342 
    343 

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/__init__.pyc in _load_collection(uris, constraints, callback)
    309     try:
    310         cubes = _generate_cubes(uris, callback, constraints)
--> 311         result = iris.cube._CubeFilterCollection.from_cubes(cubes, constraints)
    312     except EOFError as e:
    313         raise iris.exceptions.TranslationError(

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/cube.pyc in from_cubes(cubes, constraints)
    142         pairs = [_CubeFilter(constraint) for constraint in constraints]
    143         collection = _CubeFilterCollection(pairs)
--> 144         for cube in cubes:
    145             collection.add_cube(cube)
    146         return collection

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/__init__.pyc in _generate_cubes(uris, callback, constraints)
    296         if scheme == 'file':
    297             part_names = [x[1] for x in groups]
--> 298             for cube in iris.io.load_files(part_names, callback, constraints):
    299                 yield cube
    300         elif scheme in ['http', 'https']:

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/io/__init__.pyc in load_files(filenames, callback, constraints)
    199                 yield cube
    200         else:
--> 201             for cube in handling_format_spec.handler(fnames, callback):
    202                 yield cube
    203 

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/fileformats/rules.pyc in load_cubes(filenames, user_callback, loader, filter_function)
   1045             all_fields_and_filenames,
   1046             converter=loader.converter,
-> 1047             user_callback_wrapper=loadcubes_user_callback_wrapper):
   1048         yield cube
   1049 

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/fileformats/rules.pyc in _load_pairs_from_fields_and_filenames(fields_and_filenames, converter, user_callback_wrapper)
    958     concrete_reference_targets = {}
    959     results_needing_reference = []
--> 960     for field, filename in fields_and_filenames:
    961         # Convert the field to a Cube, passing down the 'converter' function.
    962         cube, factories, references = _make_cube(field, converter)

/projects/access/apps/pythonlib/iris/1.12.0/lib/python2.7/site-packages/Iris-1.12.0-py2.7-linux-x86_64.egg/iris/fileformats/rules.pyc in _generate_all_fields_and_filenames()
   1022         for filename in filenames:
   1023             for field in loader.field_generator(
-> 1024                     filename, **loader.field_generator_kwargs):
   1025                 # evaluate field against format specific desired attributes
   1026                 # load if no format specific desired attributes are violated

/projects/access/apps/pythonlib/iris_grib/0.9.1/lib/python2.7/site-packages/iris_grib-0.9.1-py2.7.egg/iris_grib/__init__.pyc in _load_generate(filename)
    698             message_id = message._raw_message._message_id
    699             grib_fh = message._file_ref.open_file
--> 700             message = GribWrapper(message_id, grib_fh=grib_fh)
    701         elif editionNumber != 2:
    702             emsg = 'GRIB edition {} is not supported by {!r}.'

/projects/access/apps/pythonlib/iris_grib/0.9.1/lib/python2.7/site-packages/iris_grib-0.9.1-py2.7.egg/iris_grib/__init__.pyc in __init__(self, grib_message, grib_fh)
    168         self.extra_keys = {}
    169         self._confirm_in_scope()
--> 170         self._compute_extra_keys()
    171 
    172         # Calculate the data payload shape.

/projects/access/apps/pythonlib/iris_grib/0.9.1/lib/python2.7/site-packages/iris_grib-0.9.1-py2.7.egg/iris_grib/__init__.pyc in _compute_extra_keys(self)
    315         self.extra_keys['_referenceDateTime'] = \
    316             datetime.datetime(int(self.year), int(self.month), int(self.day),
--> 317                               int(self.hour), int(self.minute))
    318 
    319         # forecast time with workarounds

ValueError: year is out of range

I have attached the offending file:
era_clim_mslp_jan01 grib

Inspecting the file with gribapi, there is no explicit mention of 'year', which may make sense for a climatology file but, I guess, could make Iris unhappy if it is expecting a date with a year. Any suggestions much appreciated. (For now I have read the data in using gribapi and attached it to a 'handmade' cube as the 'data' attribute.)
Cheers,
Chris

Copied from original issue: SciTools/iris#2768
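The failure can be reproduced without any GRIB file at all: `datetime.datetime` rejects years below `datetime.MINYEAR` (which is 1), and climatology fields can encode a reference year of 0. A minimal illustration (the helper name is ours, not iris-grib's):

```python
import datetime

# datetime.MINYEAR is 1, so a GRIB reference year of 0 (as some
# climatology fields encode) cannot be represented directly; this is
# exactly the ValueError raised in the traceback above.
def reference_datetime(year, month, day, hour, minute):
    try:
        return datetime.datetime(year, month, day, hour, minute)
    except ValueError:
        return None  # signal "no representable reference time"

print(reference_datetime(2018, 1, 1, 0, 0))  # a valid reference time
print(reference_datetime(0, 1, 1, 0, 0))     # None: "year 0 is out of range"
```

A loader that wants to tolerate such files would need to substitute a placeholder reference time (or skip the time coordinate) rather than constructing the datetime unconditionally.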

Simplify eccodes dependence in setup.py install_requires

Currently eccodes is only available on PyPI for windows: https://pypi.org/project/eccodes/#files

#141 amends the "install_requires" section of setup.py so that eccodes is only listed as a requirement when installing on Windows. This means that users on other platforms will not be able to install eccodes using pip; they will have to install it using conda instead.

However, this is a TEMPORARY AMENDMENT ONLY and must be reversed when ECMWF release eccodes on PyPI for all platforms.
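The platform-conditional dependency described above can be sketched as follows (illustrative only; the base requirement names are hypothetical and the real setup.py may differ):

```python
import sys

# Illustrative sketch of a platform-conditional requirement list:
# eccodes wheels are currently published on PyPI only for Windows, so
# it is pip-installed there and left to conda elsewhere.
def install_requires_for(platform):
    requires = ["numpy", "scitools-iris"]  # hypothetical base deps
    if platform.startswith("win"):
        requires.append("eccodes")
    return requires

print(install_requires_for(sys.platform))
```

Reversing the temporary amendment would then just mean moving "eccodes" back into the unconditional list.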

Fix GRIB1 support

The GRIB1 load translation code (here) is still using GribWrappers, which is the only thing preventing us from throwing away all the old loader code, including the GribWrapper class.

Technical Debt summary :

  • fix: rewrite grib1 loading using iris_grib.message.GribMessage instead
  • legacy cost: it obscures the main code structure and is dreaded to maintain (though little need has arisen)

[
In fact this is still the bulk of code in iris_grib.__init__.py, though no longer publicly exported.

  • globals CENTRE_TITLES, TIME_RANGE_INDICATORS, PROCESSING_TYPES, TIME_CODES_EDITION1 and unknown_string
  • classes GribDataProxy and GribWrapper
  • functions _longitude_is_cyclic and _message_values
    ]

There is still a slippery kludge that means when you load a GRIB1 message the 'field' argument in a callback becomes a GribWrapper instead of a GribMessage (here).

So, this is now the only thing that GribWrapper is still used for, and the only remaining occasion that a user might possibly be concerned about what a GribWrapper looks like.

We haven't addressed this yet because GRIB1 usage is very low.

We should rewrite the Grib1 loader to use the modern message object + throw away all the old GribWrapper-based processing.

Tasks

  1. Dragon Sub-Task 🦎 Type: Technical Debt
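The edition-dependent dispatch behind this kludge can be illustrated without iris at all. The classes below are stand-ins for iris_grib's GribWrapper (legacy, GRIB1 path) and iris_grib.message.GribMessage (modern, GRIB2 path); the dispatch mirrors the edition check described above:

```python
# Stand-in classes: the real ones live in iris_grib and
# iris_grib.message respectively.
class GribWrapper:
    pass

class GribMessage:
    pass

def field_for_edition(edition_number):
    # Edition 1 still takes the legacy GribWrapper route, so the
    # 'field' a user callback sees differs by edition.
    if edition_number == 1:
        return GribWrapper()
    if edition_number == 2:
        return GribMessage()
    raise ValueError(
        f"GRIB edition {edition_number} is not supported."
    )
```

The rewrite proposed above would collapse both branches onto GribMessage, so callbacks see one field type regardless of edition.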

Incomplete hybrid levels should still produce 3-dim cube

Hi, I have some apparently improperly defined hybrid level GRIB2 files here:

  # Hybrid level (grib2/tables/5/4.5.table)  
  typeOfFirstFixedSurface = 105;
  #-READ ONLY- unitsOfFirstFixedSurface = unknown;
  #-READ ONLY- nameOfFirstFixedSurface = Hybrid level;
  scaleFactorOfFirstFixedSurface = 0;
  scaledValueOfFirstFixedSurface = 1;
  # Missing (grib2/tables/5/4.5.table)  
  typeOfSecondFixedSurface = 255;
  #-READ ONLY- unitsOfSecondFixedSurface = unknown;
  #-READ ONLY- nameOfSecondFixedSurface = Missing;
  scaleFactorOfSecondFixedSurface = MISSING;
  scaledValueOfSecondFixedSurface = MISSING;

Obviously there is no way of correctly assigning pressure or anything to these surfaces, but it would still be very helpful if they could be parsed as a single cube with scaledValueOfFirstFixedSurface specifying the third dimension coordinate. Instead they come out as individual 2-dim cubes:

 air_pressure / (Pa)                 (projection_y_coordinate: 429; projection_x_coordinate: 499)
     Dimension coordinates:
          projection_y_coordinate                           x                             -
          projection_x_coordinate                           -                             x
     Scalar coordinates:
          forecast_period: 1 hours
          forecast_reference_time: 2018-08-31 23:00:00
          time: 2018-09-01 00:00:00

Is there a workaround for this?
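One possible workaround (an untested sketch, not an official recipe): read the raw level value off each message in an Iris load callback and attach it as an auxiliary coordinate, so the 2-dim cubes become mergeable into one 3-dim cube. The section/key access follows iris_grib.message.GribMessage's `sections` mapping, and the coordinate name `model_level_number` is our choice:

```python
def level_from_field(field):
    # GribMessage exposes product-definition keys via sections[4].
    return int(field.sections[4]["scaledValueOfFirstFixedSurface"])

def add_level_coord(cube, field, filename):
    # Import inside the callback so this sketch is importable
    # without iris installed.
    from iris.coords import AuxCoord
    cube.add_aux_coord(
        AuxCoord(level_from_field(field), long_name="model_level_number")
    )

# Usage (untested against real data):
#   cubes = iris.load(path, callback=add_level_coord)
#   cube = cubes.merge_cube()   # 3-dim, stacked along the new coord
```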

Unify supported templates across reading and writing

Currently we support fewer encodings for writing than for reading.
But I think that we can currently read anything that we can write (i.e. not obvious, but it happens to be so).

It could simplify understanding if we always provided both read and write for all templates we support.
-- that is, we'd add write support for everything we can read.
This should not be too hard: once you have knowledge of the encoding, the reverse should be easy. Whereas reading may have to reject odd extended options that we don't support, writing is not complicated by that.

This should cover grid as well as data definition templates (i.e. section 3 and section 4 templates).
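One way to keep read and write support in lockstep (purely a design sketch, not how iris_grib is organized today) is a single registry keyed by section and template number, so a template cannot gain a reader without at least an explicit placeholder for its writer:

```python
# Design sketch: register both directions together.
TEMPLATE_HANDLERS = {}

def register_template(section, number, reader, writer):
    # 'writer' may be None, but its absence is then visible in one place.
    TEMPLATE_HANDLERS[(section, number)] = (reader, writer)

def supported(section, number, direction):
    handlers = TEMPLATE_HANDLERS.get((section, number))
    if handlers is None:
        return False
    reader, writer = handlers
    return (reader if direction == "read" else writer) is not None
```

Auditing the read/write gap then reduces to listing the entries whose writer slot is still None.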

Public docs

The github docs link is not yet viable.
It would be nice to be able to link to this from Iris docs.

save_grib2 can't accept Cubelists

It seems save_grib2 can't accept Cubelists despite the documentation suggesting that it can. @pp-mo suggested this code may have been copied and modified from somewhere else, hence the mistake. He also suggested that it should be possible to save a Cubelist but that will require changes to the code.

For now I'd suggest correcting the docstring of save_grib2, and possibly changing the underlying code later.

It might be worth keeping an eye out for similar documentation mistakes elsewhere and seeing if there are other code changes needed.
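Until the code is changed, a thin wrapper can save a CubeList by saving each cube in turn, appending after the first. This assumes save_grib2 takes an `append` keyword; the `save` argument is injectable purely so the loop can be exercised without iris_grib installed:

```python
def save_cubelist_grib2(cubes, target, save=None):
    # 'save' defaults to iris_grib.save_grib2; it is a parameter only
    # so the append logic can be tested without iris_grib.
    if save is None:
        import iris_grib
        save = iris_grib.save_grib2
    for i, cube in enumerate(cubes):
        # The first cube creates/overwrites the file; the rest append.
        save(cube, target, append=(i > 0))
```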
