noaa-mdl / grib2io

Python interface to the NCEP G2C Library for reading and writing GRIB2 messages.

Home Page: https://noaa-mdl.github.io/grib2io

License: MIT License

Languages: Python 98.41%, Shell 1.53%, Dockerfile 0.06%
Topics: grib2, python, python3, grib2-decoder, grib2-encoder, weather, meteorology, data-science, numpy, weather-data

grib2io's People

Contributors

adamschnapp, aerorahul, akrherz, alcoat, bschwedler, dependabot[bot], dopplershift, eengl, ericengle-noaa, hhsprings, hugovk, jasonwashburn, jdkloe, jswhit, knedlsepp, kzokm, local-minimum, me-mark-o, mhagdorn, mromberg, ocefpaf, thorsteinssonh, timcera, timothycera-noaa


grib2io's Issues

Interpolation

Implement grid interpolation via the NCEPLIBS-ip library. This is a Fortran library so the appropriate interface will need to be created using f2py.

Also, the NCEPLIBS ip and sp libraries must be built as shared libraries.

Unable to write out values from URMA qpe01/06 GRIBs

When I read URMA QPE01 and QPE06 GRIBs using the latest version of grib2io on WCOSS2 (2.0.0b2), all of the values are 'nan'.

Acquire QPE06 data:

$ wget https://nomads.ncep.noaa.gov/pub/data/nccf/com/urma/prod/urma2p5.20230424/urma2p5.2023042418.pcp_06h.wexp.grb2

Set up environment:

$ module reset
$ module load intel/19.1.3.304
$ module load python/3.8.6
$ module load libjpeg
$ module use /lfs/h1/mdl/nbm/save/apps/modulefiles
$ module load grib2io/2.0.0b2

$ which python
/apps/spack/python/3.8.6/intel/19.1.3.304/pjn2nzkjvqgmjw4hmyz43v5x4jbxjzpk/bin/python

Python:

>>> import grib2io
>>> sourceGrib='urma2p5.2023042418.pcp_06h.wexp.grb2'
>>> grbs = grib2io.open(sourceGrib)                                                                                                                                                                                                                    

>>> grbs
mode = rb
name = urma2p5.2023042418.pcp_06h.wexp.grb2
messages = 1
current_message = 0
size = 719080
closed = False
variables = ('APCP',)
levels = ('surface',)

>>> grbs[0]
Section 0: discipline = 0 - Meteorological Products
Section 1: originatingCenter = 7 - US National Weather Service - NCEP (WMC)
Section 1: originatingSubCenter = 4 - Environmental Modeling Center
Section 1: masterTableInfo = 2 - Version Implemented on 4 November 2003
Section 1: localTableInfo = 1 - Number of local table version used.
Section 1: significanceOfReferenceTime = 0 - Analysis
Section 1: year = 2023
Section 1: month = 4
Section 1: day = 24
Section 1: hour = 12
Section 1: minute = 0
Section 1: second = 0
Section 1: refDate = 2023-04-24 12:00:00
Section 1: productionStatus = 0 - Operational Products
Section 1: typeOfData = 0 - Analysis Products
Section 3: interpretationOfListOfNumbers = 0 - There is no appended list
Section 3: gridDefinitionTemplateNumber = 30 - Lambert Conformal (Can be Secant, Tangent, Conical, or Bipolar)
Section 3: shapeOfEarth = 1 - Earth assumed spherical with radius specified (in m) by data producer
Section 3: earthRadius = 6371200.0
Section 3: earthMajorAxis = None
Section 3: earthMinorAxis = None
Section 3: resolutionAndComponentFlags = [0, 0, 1, 1, 1, 0, 0, 0]
Section 3: ny = 1597
Section 3: nx = 2345
Section 3: scanModeFlags = [0, 1, 0, 0, 0, 0, 0, 0]
Section 3: gridOrientation = 265.0
Section 3: gridlengthXDirection = 2539.703
Section 3: gridlengthYDirection = 2539.703
Section 3: latitudeFirstGridpoint = 19.228976
Section 3: latitudeSouthernPole = -90.0
Section 3: latitudeTrueScale = 25.0
Section 3: longitudeFirstGridpoint = 233.723448
Section 3: longitudeSouthernPole = 0.0
Section 3: projParameters = {'a': 6371200.0, 'b': 6371200.0, 'proj': 'lcc', 'lat_1': 25.0, 'lat_2': 25.0, 'lat_0': 25.0, 'lon_0': 265.0}
Section 3: projectionCenterFlag = 0
Section 3: standardLatitude1 = 25.0
Section 3: standardLatitude2 = 25.0
Section 4: productDefinitionTemplateNumber = 8 - Average, accumulation, extreme values or other statistically processed values at a horizontal level or in a horizontal layer in a continuous or non-continuous time interval. (see Template 4.8)
Section 4: fullName = Total Precipitation
Section 4: units = kg m-2
Section 4: shortName = APCP
Section 4: leadTime = 6:00:00
Section 4: unitOfFirstFixedSurface = unknown
Section 4: valueOfFirstFixedSurface = 0.0
Section 4: unitOfSecondFixedSurface = None
Section 4: valueOfSecondFixedSurface = 0.0
Section 4: validDate = 2023-04-24 18:00:00
Section 4: duration = 6:00:00
Section 4: level = surface
Section 4: backgroundGeneratingProcessIdentifier = 0
Section 4: dayOfEndOfTimePeriod = 24
Section 4: forecastTime = 0
Section 4: generatingProcess = 118 - UnRestricted Mesoscale Analysis (URMA)
Section 4: hourOfEndOfTimePeriod = 18
Section 4: hoursAfterDataCutoff = 0
Section 4: minuteOfEndOfTimePeriod = 0
Section 4: minutesAfterDataCutoff = 0
Section 4: monthOfEndOfTimePeriod = 4
Section 4: numberOfMissingValues = 255
Section 4: numberOfTimeRanges = 1
Section 4: parameterCategory = 1
Section 4: parameterNumber = 8
Section 4: scaleFactorOfFirstFixedSurface = 0
Section 4: scaleFactorOfSecondFixedSurface = 0
Section 4: scaledValueOfFirstFixedSurface = 0
Section 4: scaledValueOfSecondFixedSurface = 0
Section 4: secondOfEndOfTimePeriod = 0
Section 4: statisticalProcess = 1 - Accumulation
Section 4: timeIncrementOfSuccessiveFields = 0
Section 4: timeRangeOfStatisticalProcess = 6
Section 4: typeOfFirstFixedSurface = 1 - ['Ground or Water Surface', 'unknown']
Section 4: typeOfGeneratingProcess = 2 - Forecast
Section 4: typeOfSecondFixedSurface = 255 - ['Missing', 'unknown']
Section 4: typeOfTimeIncrementOfStatisticalProcess = 2 - Successive times processed have same start time of forecast, forecast time is incremented.
Section 4: unitOfTimeRange = 1 - Hour
Section 4: unitOfTimeRangeOfStatisticalProcess = 1 - Hour
Section 4: unitOfTimeRangeOfSuccessiveFields = 255 - Missing
Section 4: yearOfEndOfTimePeriod = 2023
Section 5: dataRepresentationTemplateNumber = 2 - Grid Point Data - Complex Packing (see Template 5.2)
Section 5: numberOfPackedValues = 1383199
Section 5: typeOfValues = 0 - Floating Point
Section 5: binScaleFactor = 5
Section 5: decScaleFactor = 4
Section 5: groupLengthIncrement = 1
Section 5: groupSplittingMethod = 1 - General Group Splitting
Section 5: lengthOfLastGroup = 658
Section 5: nBitsGroupWidth = 5
Section 5: nBitsPacking = 14
Section 5: nBitsScaledGroupLength = 10
Section 5: nGroups = 14496
Section 5: priMissingValue = None
Section 5: refGroupLength = 1
Section 5: refGroupWidth = 0
Section 5: refValue = 0.0
Section 5: secMissingValue = None
Section 5: typeOfMissingValueManagement = 0 - No explicit missing values included within the data values
Section 6: bitMapFlag = 0 - A bit map applies to this product and is specified in this section.

>>> grbs[0].data
array([[nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       ...,
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan]], dtype=float32)
>>> grbs[0].data.min()
nan

>>> grbs[0].data.max()                                                                                                                                                                                                                                  
nan

I can see via gdalinfo and QGIS that there are valid values in the GRIB file:

$ gdalinfo urma2p5.2023042418.pcp_06h.wexp.grb2 -mm | grep Min/Max
    Computed Min/Max=0.000,131.898

Consider adding Gitpod configuration to simplify development environment for contributors.

I've created a Gitpod configuration and custom image build, which has greatly simplified my development workflow for contributing to grib2io. Once you've created a free account (up to 50 hours per month) at gitpod.io, you can open any GitHub repository in a throwaway development environment simply by prepending the repository URL with https://gitpod.io/#. For example, to test my configuration you can go to https://gitpod.io/#https://github.com/jasonwashburn/grib2io/tree/add-gitpod-config.

You can also install their browser extension which adds a "Gitpod" button whenever you're viewing a GitHub repo. Clicking on it opens that repo up in a new workspace automatically, and you're ready to make changes, test and execute code.

There is sometimes a small delay the first time you load the environment while it builds a container image, but once you've added your repo as a project on Gitpod, it will attempt to pre-build a new image any time you make changes that affect the container image. When an image is already built, environments open within 5-10 seconds inside your browser in a web version of Visual Studio Code (which is surprisingly comparable to the desktop IDE). Alternatively, it will present you with the option to open it inside Visual Studio Code on your desktop as well.

One of the other great benefits is the ability to launch new instances from any branch, and even test PRs without having to pull or checkout anything.

Take a look, and if you're interested in adding it to the project I'll submit a PR.

Provide wgrib2-like string for level/layer information

Provide a string as a Grib2Message attribute that describes the level/layer information exactly as wgrib2 does. Here is some example output from wgrib2 (see the 5th colon-delimited column):

403:237545151:d=2020102600:TMP:2 m above ground:24 hour fcst:
404:238399796:d=2020102600:SPFH:2 m above ground:24 hour fcst:
405:239116747:d=2020102600:DPT:2 m above ground:24 hour fcst:
406:240018093:d=2020102600:RH:2 m above ground:24 hour fcst:
407:240795666:d=2020102600:APTMP:2 m above ground:24 hour fcst:
408:241347772:d=2020102600:TMAX:2 m above ground:18-24 hour max fcst:
409:242173333:d=2020102600:TMIN:2 m above ground:18-24 hour min fcst:
410.1:243016500:d=2020102600:UGRD:10 m above ground:24 hour fcst:
410.2:243016500:d=2020102600:VGRD:10 m above ground:24 hour fcst:

The wgrib2 source file Level.c provides a const char array of code numbers (as comments) matched with level/layer message strings. It should be fairly easy to convert this into a Python dictionary.
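
Until such an attribute exists, here is a minimal sketch of what the converted dictionary and lookup might look like. The codes shown are from WMO Code Table 4.5; the helper name, the format-string layout, and the Pa-to-mb conversion are illustrative assumptions, not grib2io API or wgrib2's actual table.

```python
# Hypothetical sketch of a level-string lookup built from wgrib2's Level.c.
# Keys are GRIB2 typeOfFirstFixedSurface codes; values are format strings
# taking the decoded surface value.
LEVEL_STRINGS = {
    1: "surface",
    100: "{val} mb",              # isobaric surface (value stored in Pa)
    103: "{val} m above ground",  # height above ground
    106: "{val} m below ground",
}

def wgrib2_level(type_of_surface, value):
    """Return a wgrib2-style level/layer string (sketch only)."""
    template = LEVEL_STRINGS.get(type_of_surface, "unknown level")
    if type_of_surface == 100:
        value = value / 100.0  # Pa -> mb, as wgrib2 prints
    return template.format(val=value)

print(wgrib2_level(103, 2))  # -> 2 m above ground
```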

Add coded data tables

GRIB2 Messages can contain data values that are an enumeration in which the values represent something specific. For example see NCEP GRIB2 Table 4.201 which defines coded values for Precipitation Type.

We need to add the following tables:

  • 4.201
  • 4.202
  • 4.203
  • 4.204
  • 4.205
  • 4.206
  • 4.207
  • 4.208
  • 4.209
  • 4.210
  • 4.211
  • 4.212
  • 4.213
  • 4.215
  • 4.216
  • 4.217
  • 4.218
  • 4.222
  • 4.223
  • 4.224
  • 4.225
  • 4.227
  • 4.243
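
For illustration, a hedged sketch of what one of these tables might look like as a Python dictionary. The entries are from NCEP GRIB2 Table 4.201 (Precipitation Type); the string-keyed layout and helper function are assumptions on my part, so verify against grib2io's existing table modules before adopting them.

```python
# Sketch of a coded data table: GRIB2 Table 4.201 - Precipitation Type.
table_4_201 = {
    '0': 'Reserved',
    '1': 'Rain',
    '2': 'Thunderstorm',
    '3': 'Freezing Rain',
    '4': 'Mixed/Ice',
    '5': 'Snow',
    '255': 'Missing',
}

def get_table_value(table, code):
    """Look up a coded value, falling back to 'Unknown' for codes not in the table."""
    return table.get(str(code), 'Unknown')

print(get_table_value(table_4_201, 1))  # -> Rain
```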

Use datetime objects for GRIB2 date/time metadata

Rather than using past practices of turning GRIB2 date/time metadata into numeric or string objects, let's leverage Python's built-in datetime module.

Represent the GRIB2 reference date and end of Overall Time Period (where available) as datetime.datetime objects. Date components and other time range metadata can be represented using datetime.time and datetime.timedelta objects.

Time-based arithmetic becomes much easier!
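
A minimal sketch of the proposal using only the standard library; the attribute names mirror those shown elsewhere on this page, and the values are illustrative.

```python
# Represent the reference date as datetime.datetime and the lead time /
# duration as datetime.timedelta, so the valid date falls out of ordinary
# arithmetic instead of string or integer manipulation.
import datetime

refDate = datetime.datetime(2023, 4, 24, 12)  # from Section 1
leadTime = datetime.timedelta(hours=6)        # from Section 4
duration = datetime.timedelta(hours=6)        # accumulation period

validDate = refDate + leadTime
print(validDate)             # 2023-04-24 18:00:00
print(validDate - duration)  # start of the accumulation window
```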

Ability to access/output GRIB message as raw bytes

It would be beneficial to add a method that returns the GRIB2 message as raw bytes, something akin to Grib2Message.to_bytes(). It should be an easy addition, as I am currently achieving the same functionality by accessing the ._msg attribute of Grib2Message directly; a method would be a cleaner implementation than calling a protected attribute. This is useful for writing the GRIB-formatted data to other mediums, for instance, directly to S3 from memory without the need to create temporary files.

If you're open to contributions I'm willing to submit a PR.
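
A minimal sketch of what such a method could look like. The to_bytes name is the issue's own suggestion and the _msg attribute follows the issue text; the stand-in class below is illustrative, not grib2io's actual Grib2Message.

```python
# Stand-in class illustrating the requested accessor.
class Grib2Message:
    def __init__(self, packed):
        self._msg = packed  # raw packed GRIB2 message, as described in the issue

    def to_bytes(self):
        """Return the packed GRIB2 message as raw bytes."""
        return bytes(self._msg)

msg = Grib2Message(b'GRIB...7777')
print(msg.to_bytes())  # -> b'GRIB...7777'
```

With such a method, the bytes could be handed straight to an object-store client (e.g. a boto3 put_object call) without a temporary file.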

Test suite improvements

Wondering if you'd be open to improving the test suite. I'm interested in developing an automated test suite with unit and integration tests, likely using Nox and Pytest; one that could be easily incorporated into both local development workflows and a build pipeline.

User documentation for the project

The official website for documentation of this project is https://noaa-mdl.github.io/grib2io. But the link is all about developer documentation. In addition, there is not even a single example given there.

Therefore, I propose to have user documentation for this package (similar to what pygrib or wrf-python have) and the dev documentation may be kept at the end of it. Ideally the user documentation should have a few sample data sets and example scripts (even with comments are fine) for some use cases.

Select GRIB2 messages by percentile

Provide mechanism for grib2io.open.select() method to select GRIB2 messages based on the percentile value in the product definition section for templates 4.6 and 4.10

Only read data section when calling data() method

When a Grib2Message object is created from an existing file, the entire packed GRIB2 message is read into Grib2Message._msg. To improve performance, when we create the Grib2Message object, we only want to read sections 0 through 6 and only read section 7 (Data Section) when Grib2Message.data() is called.

When data is finally read, the packed data section would be appended to the existing message data in Grib2Message._msg; alternatively, add a kwarg to data() to control whether the packed data is saved in the message object.
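
A sketch of the lazy-read idea under stated assumptions: the byte offset and size of Section 7 are assumed to be known from indexing the file, and the class and attribute names are illustrative, not grib2io internals.

```python
import io

class LazyMessage:
    """Reads Section 7 (Data Section) only when data() is called."""
    def __init__(self, fileobj, data_offset, data_size):
        self._file = fileobj
        self._offset = data_offset  # byte position of Section 7
        self._size = data_size
        self._packed_data = None    # not read yet

    def data(self, save=False):
        packed = self._packed_data
        if packed is None:
            self._file.seek(self._offset)
            packed = self._file.read(self._size)
            if save:  # kwarg from the issue: optionally keep packed bytes
                self._packed_data = packed
        return packed

# Demo with an in-memory "file": Section 7 pretends to live at offset 4, length 3.
f = io.BytesIO(b'0123456789')
msg = LazyMessage(f, 4, 3)
print(msg.data())  # -> b'456'
```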

pip install fails when setup.py is unable to find a library in DEFAULT_LIBRARIES

When trying to pip install, install fails with the following error:

Collecting grib2io
  Using cached grib2io-0.9.2.tar.gz (669 kB)
    ERROR: Command errored out with exit status 1:
     command: /workspace/venv/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-a9lbp5i8/grib2io/setup.py'"'"'; __file__='"'"'/tmp/pip-install-a9lbp5i8/grib2io/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-2oyh4krp
         cwd: /tmp/pip-install-a9lbp5i8/grib2io/
    Complete output (8 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-a9lbp5i8/grib2io/setup.py", line 184, in <module>
        libpath = os.path.dirname(os.path.realpath(find_library(lib)))
      File "/usr/lib64/python3.9/posixpath.py", line 390, in realpath
        filename = os.fspath(filename)
    TypeError: expected str, bytes or os.PathLike object, not NoneType
    Reading from setup.cfg...
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
 

The problem occurs when _find_library_linux() returns None. In my case it was because openjp2 wasn't found in libs.

os.path.realpath() expects a str or path, and cannot handle None as an input.

At first glance it looks like a possible fix would be to replace line 184 in setup.py with something like:

lib_path = find_library(lib)
if isinstance(lib_path, str):
    lib_dir_path = os.path.dirname(os.path.realpath(lib_path))
else:
    continue  # skip libraries that cannot be found

User control of GRIB2 decoding.

In grib2io, the overall workflow for reading GRIB2 messages is read, unpack, and decode. Here "unpack" means to unpack the GRIB2 sections into instance variables. This is fairly straightforward and performed by the underlying g2c code.

The process of "decode" is performed in the grib2io module and is where the various pieces of the GRIB2 sections are converted from the stored integer values into more descriptive metadata via lookup tables or converted to floating-point values (using scale factors and scaled values).

It is this process of "decode" that we want to offer control of to the user.
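
For reference, the scale-factor part of "decode" amounts to a one-liner; the function name here is illustrative.

```python
def decode_scaled(scaled_value, scale_factor):
    """Convert a GRIB2 scaled integer plus scale factor to a float:
    value = scaledValue / 10**scaleFactor."""
    return scaled_value / (10.0 ** scale_factor)

# e.g. scaledValue=85000 with scaleFactor=0 decodes to 85000.0 Pa (an 850 mb level),
# while scaledValue=15 with scaleFactor=1 decodes to 1.5.
print(decode_scaled(85000, 0))  # -> 85000.0
print(decode_scaled(15, 1))     # -> 1.5
```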

Missing attribute scaledValueOfSecondFixedSurface

When a GRIB2 message contains a missing 2nd fixed surface, the grib2io.Grib2Message object has no attribute scaledValueOfSecondFixedSurface.

It should always be present and set to None if missing.

Error decoding message with PDTN 4.5/9

grib2io cannot decode Grib2Messages with a Product Definition Template Number 5 or 9. These are probability products. Here is the error:

File ~/Documents/Projects/GitHub/grib2io/build/lib.macosx-12-x86_64-3.9/grib2io/_grib2io.py:910, in Grib2Message.decode(self)
    908     self.typeOfProbability = Grib2Metadata(self.productDefinitionTemplate[16],table='4.9')
    909     self.thresholdLowerLimit = self.productDefinitionTemplate[18]/(10.**self.productDefinitionTemplate[17])
--> 910     self.thresholdUpperLimit = self.productDefinitionTemplate[20]/(10.**self.productDefinitionTemplate[19])
    912 # Template 4.6 -
    913 elif self.productDefinitionTemplateNumber == 6:

ZeroDivisionError: float division by zero

make select() smarter

Currently, the select() method of the grib2io.open object does not skip GRIB2 messages that lack a requested attribute; when this happens, grib2io raises an AttributeError.
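
One possible approach, sketched with plain Python objects (the function mirrors select()'s keyword-filter style but is not grib2io code): compare attributes via getattr() with a sentinel default, so messages lacking an attribute are skipped rather than raising AttributeError.

```python
from types import SimpleNamespace

_SENTINEL = object()  # distinct from any real attribute value

def select(messages, **kwargs):
    """Yield messages whose attributes match all keyword filters,
    silently skipping messages that lack a filtered attribute."""
    for msg in messages:
        if all(getattr(msg, k, _SENTINEL) == v for k, v in kwargs.items()):
            yield msg

# The second message has no shortName; it is skipped instead of raising.
msgs = [SimpleNamespace(shortName='APCP'), SimpleNamespace()]
print(len(list(select(msgs, shortName='APCP'))))  # -> 1
```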

Hard-coded path in build script

I built the NCEPLIBS libraries (g2c, sp, ip) and am running grib2io version 2.0.0b2.
Install command: pip install grib2io==2.0.0b2

Found this error when installing:

      INFO: compile options: '-I/Users/ericengle/test/ip/4.0.0/include_4 
      -I/Users/ericengle/test/ip/4.0.0/include_4 -Ibuild/src.linux-x86_64-3.10/build/src.linux-x86_64-3.10  
      -I/home/mward/mambaforge-pypy3/envs/media/lib/python3.10/site-packages/numpy/core/include 
      -Ibuild/src.linux-x86_64-3.10/numpy/distutils/include -I/home/mward/mambaforge-pypy3/envs/media/include/python3.10 -c'
      INFO: extra f90 options: '-O3 -fopenmp'
      INFO: gfortran:f90: interpolate.f90
      f951: Warning: Nonexistent include directory ‘/Users/ericengle/test/ip/4.0.0/include_4’ [-Wmissing-include-dirs]
      f951: Warning: Nonexistent include directory ‘/Users/ericengle/test/ip/4.0.0/include_4’ [-Wmissing-include-dirs]
      interpolate.f90:5:5:

          5 | use ipolates_mod
            |     1
      Fatal Error: Cannot open module file ‘ipolates_mod.mod’ for reading at (1): No such file or directory
      compilation terminated.
      error: Command "/usr/bin/gfortran -Wall -g -fno-second-underscore -fPIC -O3 -funroll-loops 
      -I/Users/ericengle/test/ip/4.0.0/include_4 -I/Users/ericengle/test/ip/4.0.0/include_4 
      -Ibuild/src.linux-x86_64-3.10/build/src.linux-x86_64-3.10 -I/home/mward/mambaforge-pypy3/envs/media/lib/python3.10/site-packages/numpy/core/include 
      -Ibuild/src.linux-x86_64-3.10/numpy/distutils/include 
      -I/home/mward/mambaforge-pypy3/envs/media/include/python3.10 -c -c interpolate.f90 -o build/temp.linux-x86_64-cpython-310/interpolate.o -O3 -fopenmp" failed with exit status 1

Notice /Users/ericengle/test/ip/4.0.0/include_4 shows up in the include path. I think this should not be in the script, correct?
My include paths are in $HOME/.local/NCEPLIBS-ip/include but how do I pass these to the pip install?

Add support for product definition template 4.48

When trying to create plots of near-surface (8 m AGL) smoke/dust from a grib2 file, I get the following error message:

ValueError: Unsupported Product Definition Template Number - 4.48

Would it be possible to add support for this product definition template to grib2io?

Use descriptor classes for section templates and metadata.

This is a big request and will amount to a major code rewrite. There are a few goals here:

  • Dynamic creation of the Grib2Message object using templates for each grid definition, product definition, and data representation templates. Each supported template will have predefined named metadata attributes.
  • Each GRIB2 metadata attribute will be a descriptor class where the attribute does not live in the instance variable, but instead be generated from the unpacked section data. This will allow for the user to change an attribute and have that change be reflected in the section data where it originated from. For example, changing refDate will then change the year, month, day, hour, minute, and second slots in the section 1 list.

Although this will require more code, the new Grib2Message object is anticipated to be more memory efficient, improving grib2io's overall performance.
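
A toy version of the descriptor idea, assuming the unpacked section data is held in a plain list: reads and writes go straight to a slot in the section list, so editing the attribute edits the section data it came from. The class names and the section-1 index for year are illustrative, not the planned grib2io implementation.

```python
class SectionSlot:
    """Descriptor whose value lives in a section list, not in the instance."""
    def __init__(self, section, index):
        self.section, self.index = section, index

    def __get__(self, obj, objtype=None):
        return getattr(obj, self.section)[self.index]

    def __set__(self, obj, value):
        # Writing the attribute writes back into the originating section data.
        getattr(obj, self.section)[self.index] = value

class Grib2Message:
    year = SectionSlot('section1', 5)   # index 5 is illustrative
    month = SectionSlot('section1', 6)

    def __init__(self, section1):
        self.section1 = section1

msg = Grib2Message([7, 4, 2, 1, 0, 2023, 4])
print(msg.year)   # -> 2023
msg.year = 2024   # change is reflected in msg.section1
print(msg.section1[5])  # -> 2024
```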

Writing GRIB2 Messages Examples

Greetings,

I have used the init(), addgrid(), and addfield() methods to create a new GRIB2 message, supplying all of the constituent arguments required for each. However, my created message seems to be incomplete when I test decoding it. Is there a location that shows examples of generating GRIB2 products/data from scratch?

Thanks,
Chris

Grib file behavior not following documentation

I'm trying to learn the grib2io API version 0.8.0 on a CentOS7 WSL2, and I'm running into some issues.
I can open this grib file from NCEP no problem, but when I open the file with grib2io:

 `grbh = grb2.open(grbPath, mode='r')`

This gives me something I can loop over to get the list of attributes and names in each message, but each message comes back as a list and I cannot access the data.
I can access some header information via the grbh attributes: name, mode, variables (shortNames), size, messages (total), etc.
But as I said, when I access one of these records I keep getting a list with no way to access the data.
I'm not sure if I have to count through each of these roughly 74 list members to get the value.

The documentation is not helping me and I cannot find a good example. Is grib2io functioning as it should? If so, how do I get to the data?

Installing grib2io with conda is not working

Hello, I have been trying to install grib2io for the past few days on an anaconda environment using the command: conda install -c conda-forge grib2io on the terminal but it doesn't work. It always gets stuck here (for several hours):

Collecting package metadata (current_repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.
Solving environment: unsuccessful attempt using repodata from current_repodata.json, retrying with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.
Solving environment: /

I first installed it on the anaconda base and I haven't had any problems but I'm stuck here when I try to install it on a specific environment.
I don't know what is happening or why it is not working, but I need this library to work with GRIB2 files.

grib2io v2.0.0b1 - xarray backend on Hera?

We are interested in testing with the xarray backend for grib2io v2. We followed the suggested directions for testing on Hera as below. We did not expect the open_dataset call to succeed (these GRB2 output files include pressure coordinates, height coordinates, and "other" - reading them with xarray and the cfgrib engine, for example, is quite complicated). But it seems that xarray does not "see" the grib2io backend yet either. Please see below:

# module use /scratch1/NCEPDEV/mdl/apps/modulefiles
# module load python/3.10
# module list

Currently Loaded Modules:
  1) intel/2022.1.2   2) hpss/hpss   3) rocoto/1.3.1   4) ncview/2.1.7   5) sutils/default   6) python/3.10

# python3
Python 3.10.8 (main, Nov 24 2022, 14:13:03) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> 
>>> import grib2io
>>> import xarray as xr
>>> 
>>> print(grib2io.__version__)
2.0.0b1
>>>
>>> ds = xr.open_dataset('/scratch2/AOML/aoml-hafs1/Lew.Gramer/ocean/hafsv1_fnl.hycom_hfsb_RiC_0p20/coupled/grb2/2022092518/09l.2022092022092518.hfsb.storm.atm.f126.grb2',engine='grib2io')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/scratch1/NCEPDEV/mdl/apps/python/3.10/lib/python3.10/site-packages/xarray/backends/api.py", line 526, in open_dataset
    backend = plugins.get_backend(engine)
  File "/scratch1/NCEPDEV/mdl/apps/python/3.10/lib/python3.10/site-packages/xarray/backends/plugins.py", line 185, in get_backend
    raise ValueError(
ValueError: unrecognized engine grib2io must be one of: ['netcdf4', 'scipy', 'gini', 'store']

Have setup.py automatically search for g2c

Have the setup.py automatically search common paths on Linux, UNIX, and macOS systems for the NCEPLIBS-g2c library and include. This was done for grib2io-interp and that logic can be carried over here.
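
A hedged sketch of that search logic; the prefix list and library file names are typical guesses, not the grib2io-interp implementation.

```python
import os

# Common install prefixes on Linux, UNIX, and macOS (illustrative list).
COMMON_PREFIXES = ['/usr', '/usr/local', '/opt/local', '/opt/homebrew']
LIB_NAMES = ['libg2c.so', 'libg2c.dylib', 'libg2c.a']

def find_g2c(prefixes=COMMON_PREFIXES):
    """Return the first prefix containing the NCEPLIBS-g2c library, else None."""
    for prefix in prefixes:
        for libdir in ('lib', 'lib64'):
            for name in LIB_NAMES:
                if os.path.exists(os.path.join(prefix, libdir, name)):
                    return prefix
    return None
```

An environment-variable override (e.g. checking os.environ before the common prefixes) would let users with non-standard installs point setup.py at their own copy.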

pip install fail

#  grib2io-1.1.0.tar.gz
 pip install grib2io
  • g2clib.c:745:10: fatal error: grib2.h: No such file or directory
    745 | #include "grib2.h"
  • Linux
  • Versions:
    • python 3.11.1
    • cfgrib 0.9.10.3
    • eccodes 2.29.0

Ability to use external g2c

What would it take to use an external g2c instead of building it internally? Is there some feature missing from the main repo? Building with Spack it would be ideal to use an external g2c so that there's no possibility of duplicate versions.

Account for -127 in scale factor attributes

GRIB2 encoding documentation states that if a particular value in a section is to be set to missing, then all bits should be set to 1. For a value that is only 1 byte (octet), this is a value of 255 or 11111111.

On decode, if a negative number is allowed for a particular value, g2c will read nbits-1 for the value and then evaluate the most significant bit. If it is 1, then the decoded value is flipped to negative.

So if a GRIB2 attribute value is 255 (i.e. missing) and the attribute is specified to be 1 byte and allows negative values, then on decode, the 255 packed value becomes -127.
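
The decode behavior described above, reproduced in Python for a 1-byte field: the sign comes from the most significant bit and the magnitude from the remaining bits.

```python
def decode_signed(packed, nbits=8):
    """Decode a GRIB2 sign-magnitude integer; an all-bits-set byte (255)
    decodes to -127, the value the issue asks grib2io to treat as missing."""
    sign_bit = 1 << (nbits - 1)
    magnitude = packed & (sign_bit - 1)
    return -magnitude if packed & sign_bit else magnitude

print(decode_signed(255))  # -> -127
print(decode_signed(5))    # -> 5
```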

Allow for a 0th GRIB2 message

grib2io is the successor to ncepgrib2 which was developed alongside pygrib and therefore takes on the pygrib convention that there is no 0th GRIB2 message. As we refactor grib2io for v2.0, we should allow a 0th GRIB2 message.

Python (and C) indexing begins at 0, so when reading a file we read in units of bytes, starting at 0. In grib2io, we will use that same idea for reading from files in units of GRIB2 messages.

GOAL:

g = grib2io.open(<file>)
g.seek(0) # Puts the file pointer at the beginning of the file
g.read(1) # Reads the first message (i.e. the 0th message)
g.tell() # Should return 1
msg = g[0] # msg is the first (0th) message in the file

Fix NBM Wx Decode

grib2io is not properly decoding section 2 (Local Use Section) for MDL/NBM Wx grids.

A mechanism to indicate which attributes in sections 3, 4, and 5 are mutable and which ones are required.

It would be beneficial to the community if there were a mechanism to inform a user as to which attributes are required to be populated. Sections 0 and 1 are easy because they are static, but sections 3, 4, and 5 are dynamic according to their template numbers. And even within section 5, some of the metadata should not be edited by the user.

Let me think about an added feature re: this.

Originally posted by @EricEngle-NOAA in #66 (reply in thread)
