
Model data processing framework

Home Page: https://meteoswiss.github.io/meteodata-lab/

License: MIT License

Languages: Python 79.00%, NewLisp 18.89%, Shell 2.11%
Topics: numericalweatherpredictions

meteodata-lab's Introduction

meteodata-lab

A model data processing framework based on xarray.

DISCLAIMER

Warning

This project is BETA and will be experimental for the foreseeable future. Interfaces and functionality are likely to change, and the project itself may be scrapped. Do not use this software in any project or software that is operational.

Install

Once you have created or cloned this repository, make sure the installation works properly. Install the package dependencies with the provided script setup_env.sh. Check the available options with

tools/setup_env.sh -h

We distinguish between pinned installations, based on exported (reproducible) environments, and free installations, based on the top-level dependencies listed in requirements/requirements.yaml. If you start developing, you might want to do an unpinned installation and export the environment:

tools/setup_env.sh -u -e -n <package_env_name>

Hint: If you are the package administrator, it is a good idea to understand what this script does; everything it does can also be done manually with conda commands.

Hint: Use the flag -m to speed up the installation with mamba. Of course, you will have to install mamba first; we recommend installing it into your base environment with conda install -c conda-forge mamba. If you install mamba in another (possibly dedicated) environment, environments installed with mamba will be located in <miniconda_root_dir>/envs/mamba/envs, which is not very practical.

The package itself is installed with pip. For development, install in editable mode:

conda activate <package_env_name>
pip install --editable .

Warning: Make sure you use the right pip, i.e. the one from the installed conda environment (which pip should point to something like path/to/miniconda/envs/<package_env_name>/bin/pip).
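
A quick way to double-check from Python that the interpreter and pip belong to the intended environment (a minimal sketch; the paths in the comments are just what you should expect to see):

import shutil
import sys

print(sys.executable)       # expect something like .../envs/<package_env_name>/bin/python
print(shutil.which("pip"))  # expect the pip from the same environment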

Once your package is installed, run the tests by typing:

conda activate <package_env_name>
pytest

Credits

This package was created with copier and the MeteoSwiss-APN/mch-python-blueprint project template.

meteodata-lab's People

Contributors

cfkanesan, cosunae, petrabaumann, sfriedlizuehlke, twicki, victoria-cherkas


meteodata-lab's Issues

Define the method of how to check for the type of vertical coordinates

cfgrib derives the type of vertical coordinates from the parameter attribute GRIB2_typeOfLevel, and creates a dimension coordinate with the same name.

Thus, either the name of the vertical dimension, the name of the associated dimension coordinate, or the attribute GRIB2_typeOfLevel could be assessed to check the vertical coordinate type.

In the current code, a mixture of methods is used (e.g. in destagger.py and vertical_reduction.py).
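
For illustration, a minimal sketch of a single helper that could unify these checks, assuming a cfgrib-decoded xarray.DataArray; the attribute key GRIB_typeOfLevel (cfgrib's naming convention) and the set of known level types are assumptions:

    import xarray as xr

    # Level types expected in the data; this set is an assumption for illustration.
    KNOWN_LEVEL_TYPES = {"generalVerticalLayer", "isobaricInhPa", "surface", "heightAboveGround"}

    def get_vcoord_type(da: xr.DataArray) -> str | None:
        # Option 1: read the attribute written by the GRIB decoder.
        type_of_level = da.attrs.get("GRIB_typeOfLevel")
        if type_of_level is not None:
            return type_of_level
        # Options 2 and 3: cfgrib names the vertical dimension (and its dimension
        # coordinate) after typeOfLevel, so the dimension name carries the same info.
        for dim in da.dims:
            if dim in KNOWN_LEVEL_TYPES:
                return str(dim)
        return None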

fix metadata for vcoord_type

  • Icon Data Processing Incubator version: b890951

Description

While loading pressure levels, the metadata for vcoord_type is set to "isobaricInhPa"; it should instead follow the MARS levtype, in this case "pl".

What I Did

    # cosmo_data_dir is assumed to be a pathlib.Path pointing to the COSMO test data
    pdatafile = cosmo_data_dir / "COSMO-2E/1h/pl/000/lfff00000000p"

    reader = GribReader([pdatafile])
    ds = reader.load_cosmo_data(["FI"])
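
A minimal sketch of the requested change, mapping the GRIB typeOfLevel to the MARS levtype; only the "isobaricInhPa" -> "pl" entry comes from this issue, the other entries are assumptions for illustration:

    # Hypothetical mapping from GRIB typeOfLevel to MARS levtype.
    TYPE_OF_LEVEL_TO_LEVTYPE = {
        "isobaricInhPa": "pl",
        "generalVerticalLayer": "ml",
        "surface": "sfc",
    }

    def to_levtype(vcoord_type: str) -> str:
        # Fall back to the original value when no MARS equivalent is known.
        return TYPE_OF_LEVEL_TO_LEVTYPE.get(vcoord_type, vcoord_type)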

Precision of floats in xarray and fieldextra

Check for compatible representation of floats in xarray and fieldextra.

fieldextra widely uses single precision (REAL data types) in numerical computations.

Exceptions:

  • representation of horizontal coordinates
  • operations on horizontal coordinates, including transformations
  • time unit transformations
  • logarithm in fxtr_column
  • representation of scale factors of fixed surfaces when computing the surface values
  • NetCDF import
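
As a rough sketch of what such a compatibility check could look like (the helper name, the float32 cast and the tolerance are assumptions, chosen to mirror fieldextra's single-precision arithmetic):

    import numpy as np
    import xarray as xr

    def assert_close_to_fieldextra(result: xr.DataArray, reference: np.ndarray) -> None:
        # Cast both sides to float32 so the comparison reflects fieldextra's
        # single-precision (REAL) arithmetic rather than xarray's float64 default.
        np.testing.assert_allclose(
            np.asarray(result.values, dtype=np.float32),
            np.asarray(reference, dtype=np.float32),
            rtol=1e-5,  # on the order of float32 relative precision; an assumption
        )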

origin_z not properly set for HHL

  • Icon Data Processing Incubator version: 78adca6
  • Python version: 3.11.8
  • Operating System:

Description

When loading data from KENDA (archive) corresponding to COSMO-1 km, HHL comes with 61 levels, while all other model level variables have 60 levels.

The issue is that origin_z == 0.0 for all of them, so it does not distinguish variables at different vertically staggered positions.

What I Did

        # full_path is assumed to point to the KENDA GRIB file; the idpi package
        # (with its grib_decoder, data_source and metadata submodules) is assumed to be imported.
        ds = idpi.grib_decoder.load(
            idpi.data_source.DataSource(datafiles=[full_path]),
            {
                "param": [
                    "T",
                    "U_10M",
                    "V_10M",
                    "U",
                    "V",
                    "PS",
                    "T_2M",
                    "QV",
                    "TQV",
                    "PMSL",
                    "HHL",
                    "HSURF",
                    "PP",
                    "P0FL",
                    "P",
                ]
            },
        )
        idpi.metadata.set_origin_xy(ds, ref_param="P")
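
A minimal sketch of the behaviour one would expect instead, assuming the vertical dimension is named "z" and that a half-level offset of -0.5 is the appropriate value for vertically staggered fields such as HHL (both are assumptions here, not the library's implementation):

    # Fields on half levels (one level more than full-level fields, e.g. HHL)
    # should carry a non-zero vertical origin; everything else stays at 0.0.
    nlev_full = ds["T"].sizes["z"]  # number of full levels, taken from T
    for name, field in ds.items():
        if field.sizes.get("z", 0) == nlev_full + 1:
            field.attrs["origin_z"] = -0.5  # vertically staggered (half levels)
        else:
            field.attrs["origin_z"] = 0.0   # unstaggered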

coordinates lost after destagger

  • Icon Data Processing Incubator version:
  • Python version:
  • Operating System:

Description

I have a data array, var, with the following coordinates:

  * time        (time) int64 8B 0
  * z           (z) int64 480B 1 2 3 4 5 6 7 8 9 ... 52 53 54 55 56 57 58 59 60
    lat         (y, x) float64 2MB 42.18 42.18 42.18 42.19 ... 50.13 50.13 50.13
    lon         (y, x) float64 2MB 0.8064 0.8331 0.8598 ... 17.43 17.46 17.49
    valid_time  (time) datetime64[ns] 8B 2016-03-14T20:00:00
    ref_time    datetime64[ns] 8B 2016-03-14T20:00:00

After applying destagger(var, "x"), many coordinates are lost.

Coordinates:
  * time     (time) int64 8B 0
  * z        (z) int64 480B 1 2 3 4 5 6 7 8 9 10 ... 52 53 54 55 56 57 58 59 60

Shouldn't we retain whatever coordinates the input array brings?
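
As an illustration of the requested behaviour, a minimal workaround sketch that re-attaches the coordinates which do not depend on the destaggered dimension (the helper name is hypothetical, not part of the library):

    import xarray as xr

    def restore_coords(original: xr.DataArray, destaggered: xr.DataArray, dim: str) -> xr.DataArray:
        # Copy back any coordinate that destagger dropped but that does not depend
        # on the destaggered dimension, so it is still valid on the output grid.
        keep = {
            name: coord
            for name, coord in original.coords.items()
            if dim not in coord.dims and name not in destaggered.coords
        }
        return destaggered.assign_coords(keep)

    # e.g. var_destaggered = restore_coords(var, destagger(var, "x"), "x")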
