xcape's Introduction

xgcm: General Circulation Model Postprocessing with xarray

Binder Examples

Link Provider Description
Binder mybinder.org Basic self-contained example
PBinder Pangeo Binder More complex examples integrated with other Pangeo tools (dask, zarr, etc.)

Description

xgcm is a Python package for working with the datasets produced by numerical General Circulation Models (GCMs) and similar gridded datasets that are amenable to finite volume analysis. In these datasets, different variables are located at different positions with respect to a volume or area element (e.g., cell center, cell face, etc.). xgcm solves the problem of how to interpolate and difference these variables from one position to another.
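The grid-aware operations xgcm provides boil down to shifting values between grid positions. A minimal numpy sketch of the idea (illustrative only, not the xgcm API):

```python
import numpy as np

# Hypothetical 1-D grid: 5 cell centers and the 4 interior faces between them.
u_center = np.array([1.0, 2.0, 4.0, 8.0, 16.0])

# Center-to-face interpolation: average each pair of adjacent centers.
u_face = 0.5 * (u_center[:-1] + u_center[1:])

# Center-to-face difference: difference of adjacent centers.
du_face = u_center[1:] - u_center[:-1]

print(u_face)   # [ 1.5  3.   6.  12. ]
print(du_face)  # [1. 2. 4. 8.]
```

xgcm generalizes this pattern to multi-dimensional Arakawa grids with metadata describing which coordinate sits at which position.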

xgcm consumes and produces xarray data structures, which are coordinate and metadata-rich representations of multidimensional array data. xarray is ideal for analyzing GCM data in many ways, providing convenient indexing and grouping, coordinate-aware data transformations, and (via dask) parallel, out-of-core array computation. On top of this, xgcm adds an understanding of the finite volume Arakawa Grids commonly used in ocean and atmospheric models and differential and integral operators suited to these grids.

xgcm was motivated by the rapid growth in the numerical resolution of ocean, atmosphere, and climate models. While highly parallel supercomputers can now easily generate tera- and petascale datasets, common post-processing workflows struggle with these volumes. Furthermore, we believe that a flexible, evolving, open-source, python-based framework for GCM analysis will enhance the productivity of the field as a whole, accelerating the rate of discovery in climate science. xgcm is part of the Pangeo initiative.

Getting Started

To learn how to install and use xgcm for your dataset, visit the xgcm documentation.

xcape's People

Contributors

chiaral, ocefpaf, rabernat, xebadir

xcape's Issues

Improve documentation on order of dimensions and units

First, this is a super useful package, so thank you to the creators and contributors! Second, in attempting to use this package, I stumbled across two confusing items related to documentation:

  1. Order of dimensions within input arrays

In calc_cape, it is implied that the order of dimensions for 4-D arrays should be (z, y, x, t), coming from the docstring in line 149: When vertical_lev='model', p.shape = t.shape = (nlev, x, y, ...). However, in _reshape_inputs and _reshape_surface_inputs it's clear that the input arrays are expected to be in the order of (x, y, t, z) or (x, y, t), respectively, which turns out to be the correct order. The docstring for calc_cape should be updated to reflect this.
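Until the docstring is fixed, a defensive workaround is to reorder the axes explicitly before calling calc_cape. A numpy sketch (the variable names, sizes, and starting order are just for illustration):

```python
import numpy as np

# Hypothetical 4-D field stored as (z, y, x, t), as the docstring implies.
nz, ny, nx, nt = 10, 4, 5, 3
p = np.random.rand(nz, ny, nx, nt)

# Reorder to (x, y, t, z), the order _reshape_inputs actually expects:
# z -> axis 3, y -> axis 1, x -> axis 0, t -> axis 2.
p_xytz = np.moveaxis(p, [0, 1, 2, 3], [3, 1, 0, 2])

print(p_xytz.shape)  # (5, 4, 3, 10)
```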

  2. Units of input arrays

In calc_cape all temperature-related input arrays are specified to be in degC, but in cape within fortran.py, the docstring says everything should be in Kelvin. I believe degC is correct, based on the results I was able to get, but the documentation should be updated so the two files agree and are correct.

May have time to put in a PR on this at some point soon, but just wanted to flag here for now.

Add Support for a calculation of LCL

For the SB parcel one can just use the Stull (2000) approximation:
SBLCL= 125.*(((T[0,:,:]+T[1,:,:])/2)-Td[0,:,:])
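As a self-contained sketch, the same approximation wrapped in a hypothetical helper (not part of xcape; simplified here to a single surface level, with T and Td in the same units):

```python
import numpy as np

def stull_lcl(t_sfc, td_sfc):
    """Approximate LCL height (m) via Stull (2000):
    roughly 125 m per degree of surface dewpoint depression."""
    return 125.0 * (np.asarray(t_sfc) - np.asarray(td_sfc))

# A 10-degree dewpoint depression puts the LCL near 1250 m.
print(stull_lcl(30.0, 20.0))  # 1250.0
```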

For the ML case, you need to mix the parcel (e.g. take the mean of the layer characteristics):
MLLCL_50mb = np.zeros((ny,nx))
MLLCL_100mb = np.zeros((ny,nx))
for i in range(0,nx):
    for j in range(0,ny):
        P_in = P[:,j,i]
        T_in = T[:,j,i]
        Td_in = Td[:,j,i]
        MU = int(MUlvl[j,i])
        MLLCL_50mb[j,i], Q_ML50[j,i], avgt50[j,i], avgtd50[j,i], avgq50[j,i], avgp50[j,i] = MLLCL.getmllcl(p_in=P_in, t_in=T_in, td_in=Td_in, ml_depth=1000.0, nk=nz)
        MLLCL_100mb[j,i], Q_ML100[j,i], avgt100[j,i], avgtd100[j,i], avgq100[j,i], avgp100[j,i] = MLLCL.getmllcl(p_in=P_in, t_in=T_in, td_in=Td_in, ml_depth=1000.0, nk=nz)
        Q_MU[j,i] = Q[MU,j,i]
        H_MU[j,i] = H[MU,j,i]
        P_MU[j,i] = P[MU,j,i]
This was done for consistency and pure convenience, using a stripped-back version of George's code. In any case, you calculate the LCL using the same calculation as above, just on the mixed-layer properties.

There is also an analytical python/fortran solution by David Romps:

surfRH = calc.relative_humidity_from_dewpoint(T[0,:,:]*units.units('degC'),
                                              Td[0,:,:]*units.units('degC')).magnitude
ML50_RH = calc.relative_humidity_from_dewpoint(avgt50[:,:]*units.units('degK'),
                                               avgtd50[:,:]*units.units('degK')).magnitude
ML100_RH = calc.relative_humidity_from_dewpoint(avgt100[:,:]*units.units('degK'),
                                                avgtd100[:,:]*units.units('degK')).magnitude
#t0 = time.clock()
#for i in range(0,nx):
#    for j in range(0,ny):
#        exact_SBLCL[j,i] = lcl(P[0,j,i]*100,T[0,j,i]+273.15,rh=surfRH[j,i])
#print time.clock() - t0, "seconds full grid SBLCL Romps Py Calc time"
Pin = P[0,:,:]*100
Tin = T[0,:,:]+273.15
t0 = time.clock()
for i in range(0,nx):
    for j in range(0,ny):
        exact_SBLCL[j,i] = exactLCL.lcl_mod.lcl(Pin[j,i],Tin[j,i],surfRH[j,i])
        exact_MLLCL50[j,i] = exactLCL.lcl_mod.lcl(avgp50[j,i],avgt50[j,i],ML50_RH[j,i])
        exact_MLLCL100[j,i] = exactLCL.lcl_mod.lcl(avgp100[j,i],avgt100[j,i],ML100_RH[j,i])

I'll try to implement this, but the Stull approximation will do in a pinch.

Calculation of Composite Parameters

Highly specialized, and perhaps not as transferable (though not computationally intensive):
SHIP:

SHIP[:,:] = (CAPE_MU[:,:]*Q_MU[:,:]*LAPSE[:,:]*-T500[:,:]*S06[:,:])/42000000.
for i in range(0,nx):
    for j in range(0,ny):
        if CAPE_MU[j,i] < 1300.:
            SHIP[j,i] = SHIP[j,i]*(CAPE_MU[j,i]/1300.)
        if LAPSE[j,i] < 5.8:
            SHIP[j,i] = SHIP[j,i]*(LAPSE[j,i]/5.8)
        if FZL[j,i] < 2400.:
            SHIP[j,i] = SHIP[j,i]*(FZL[j,i]/2400.)
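The per-point correction loops can also be written vectorized with np.where; a sketch assuming all fields are same-shaped 2-D numpy arrays (the function name is hypothetical, not xcape API):

```python
import numpy as np

def apply_ship_corrections(ship, cape_mu, lapse, fzl):
    """Vectorized form of the SHIP correction loops: scale SHIP down
    wherever CAPE, mid-level lapse rate, or freezing level are small."""
    ship = np.where(cape_mu < 1300., ship * (cape_mu / 1300.), ship)
    ship = np.where(lapse < 5.8, ship * (lapse / 5.8), ship)
    ship = np.where(fzl < 2400., ship * (fzl / 2400.), ship)
    return ship

# One point triggering only the CAPE correction: 2.0 * (650/1300) = 1.0.
out = apply_ship_corrections(np.array([2.0]), np.array([650.]),
                             np.array([6.0]), np.array([3000.]))
print(out)  # [1.]
```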

STP and SCP:

        #Calculate Supercell/STP parameters
        # term1: cape term #
        term1 = CAPE_SB / 1500.
        # term2: LCL term #
        term2 = (2000. - SBLCL) / 1000.
        #set term2 values to 1 or 0 based on lcl height
        term2[SBLCL < 1000.] = 1.
        term2[SBLCL > 2000.] = 0.
        # term3: SRH term #
        term3 = srh1km / 150.
        # term4: 0-6km shear term # 
        term4 = S06 / 20.
        #cap term4 when 0-6km bwd > 30. m/s
        term4[S06 > 30.] = 1.5
        #set term4 to zero where 0-6km bwd < 12.5 m/s
        term4[S06 < 12.5] = 0.
        #cin mask
        term5 = np.fabs(CIN_SB)
        term5[np.fabs(CIN_SB)>125]=0.
        term5[np.fabs(CIN_SB)<=125]=1.
        # calc STP #
        STP = term1 * term2 * term3 * term4 * term5
        #calculate SCP#
        scp1 = CAPE_SB/1000.
        scp2 = srh3km/100.
        scp3 = S06/20.
        SCP = scp1*scp2*scp3*term5

EHI

        EHI3 = ((CAPE)*(SRH3))/(1.6*10**5)
        EHI1 = ((CAPE)*(SRH1))/(1.6*10**5)
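For reference, the same EHI formula as a standalone helper (a sketch; CAPE in J/kg, SRH in m2/s2):

```python
import numpy as np

def ehi(cape, srh):
    """Energy Helicity Index: CAPE * SRH / 1.6e5."""
    return (np.asarray(cape) * np.asarray(srh)) / 1.6e5

# 2000 J/kg of CAPE with 160 m2/s2 of SRH gives EHI = 2.
print(ehi(2000., 160.))  # 2.0
```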

SHERB/MOSHE (Requires Theta E calculation, and input of W):

        MOSH = (((LAPSE_3km-4.)**2)/4.)*((S15-8.)/10.)*((lsf+10.)/9.)
        MOSH[MOSH<0]=0.0

        #Calculate SHERB
        SHERB = ((S03/26.0)*(LAPSE_3km/5.2)*(LAPSE/5.6))

List of core functions to implement

Here are the core numerically complex functions that we want to write in fortran and wrap in python.

All functions consume 3D inputs

Model Level data

  • cape (returns 2D field)
  • standard height (returns 3D field)
  • Bunkers' storm motion (returns three 2D fields, with 2D fields for the U and V components)
  • srh (returns 2D field)

Pressure Level data

  • cape (returns 2D field)
  • standard height (returns 3D field)
  • Bunkers' storm motion (returns three 2D fields, with 2D fields for the U and V components)
  • srh (returns 2D field)

Add Support to Interpolate for Commonly Used Parameters

Many parameters are calculated by interpolating some variable against a reference coordinate (T, P, HAGL, H). We need to implement code that could wrap interpolation for:

  • 2-4 km AGL lapse rate (interpolate temperatures to height-AGL levels, difference them, and divide by the height difference)
  • 700-500 mb lapse rate (interpolate temperatures to pressure levels, difference them, and divide by the height difference)
  • 500 mb temperature in degC (interpolate temperature to the pressure level)
  • 0-6 km and 0-1 km shear (interpolate u, v to height AGL, then take the vector difference relative to the surface level)
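All of these share one pattern: linear interpolation of a column onto a target vertical coordinate. A minimal numpy sketch of that pattern (hypothetical helper, not the vinterp3d API; assumes the coordinate is monotonic):

```python
import numpy as np

def interp_to_level(var, coord, target):
    """Linearly interpolate a 1-D profile `var` to `target` along `coord`.
    np.interp requires increasing x, so a decreasing coordinate
    (e.g. pressure with height) is flipped first."""
    coord = np.asarray(coord, dtype=float)
    var = np.asarray(var, dtype=float)
    if coord[0] > coord[-1]:
        coord, var = coord[::-1], var[::-1]
    return np.interp(target, coord, var)

# Temperature (degC) on pressure levels (hPa), interpolated to 700 and 600 hPa.
p = np.array([1000., 850., 700., 500., 300.])
t = np.array([25., 15., 8., -10., -40.])
print(interp_to_level(t, p, 700.))  # 8.0
print(interp_to_level(t, p, 600.))  # -1.0
```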

Example Code for the old way:
LAPSE = np.zeros((ny,nx))
T500 = np.zeros((ny,nx))
LAPSE_3km = np.zeros((ny,nx))
LAPSE_24km = np.zeros((ny,nx))
THGZ = np.zeros((ny,nx))
for i in range(0,nx):
    for j in range(0,ny):
        if T[0,j,i] > 0.0:
            if P[0,j,i] >= 700.:
                T700 = vinterp3d.dinterp3dz(T[:,j,i], P[:,j,i], 700.)
                T500[j,i] = vinterp3d.dinterp3dz(T[:,j,i], P[:,j,i], 500.)
                Z700 = vinterp3d.dinterp3dz(H[:,j,i], P[:,j,i], 700.)
                Z500 = vinterp3d.dinterp3dz(H[:,j,i], P[:,j,i], 500.)
                LAPSE[j,i] = (T700 - T500[j,i]) / ((Z500 - Z700) / 1000.)
            T3km = vinterp3d.dinterp3dz(T[:,j,i], AGLH[:,j,i], 3000.)
            HT30 = vinterp3d.dinterp3dz(H[:,j,i], T[:,j,i], -30.)
            HT10 = vinterp3d.dinterp3dz(H[:,j,i], T[:,j,i], -10.)
            T4km = vinterp3d.dinterp3dz(T[:,j,i], AGLH[:,j,i], 4000.)
            T2km = vinterp3d.dinterp3dz(T[:,j,i], AGLH[:,j,i], 2000.)
            LAPSE_3km[j,i] = (T[0,j,i] - T3km) / 3.
            LAPSE_24km[j,i] = (T2km - T4km) / 2.
            THGZ[j,i] = HT30 - HT10

S06 = np.zeros((ny,nx))
u6km = vinterp3d.dinterp3dz(U[:,:,:], AGLH[:,:,:], 6000.)
v6km = vinterp3d.dinterp3dz(V[:,:,:], AGLH[:,:,:], 6000.)
S06[:,:] = np.sqrt((u6km[:,:] - U[1,:,:])**2 + (v6km[:,:] - V[1,:,:])**2)

Setup.py for more complex package

Brian Rose's climlab (https://github.com/brian-rose/climlab) can help us set up our setup.py.

Things I gather from chatting with him and browsing the repo:

  1. there is a hierarchy of setup.py files
  2. there are many subpackages, each with its own setup.py

I can try to set this up, but it would be a large reorganization of everything in the repo. What is the best way to proceed?

set up documentation building

Here are the steps.

  1. First, learn to build the documentation locally. It lives in the doc folder. We use sphinx. The command to build it is make html (from the doc folder). To preview your local docs, do cd _build/html; python -m http.server.
  2. Then configure readthedocs. You will need to create a .readthedocs.yml file. (example)

List of things to do for JOSS publication

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation in code
NOTES:
Let's not touch the .pyf files for now; those should be automatically generated, so I am not sure we should add text to them.
Be mindful of the space before the """. For consistency, use """ instead of ''' (although both work just fine).

  • copy documentation from bunkers_model_lev to bunkers_pressure_lev @chiaral #37
  • adapt documentation from bunkers_model_lev/pressure_lev for CAPE_CODE_model_lev/pressure_lev (for the outer loops routines @chiaral
  • SREH_model_lev/pressure_lev lack all documentation @chiaral #46
  • stdheight_2D_model_lev/pressure_lev lack all documentation @xebadir #43
  • cape_fortran.py lacks documentation @xebadir #42
  • eliminate cape_numba.py? @chiaral
  • Not sure we need to add text to duck_array_ops.py, @chiaral will check
    within core.py
    • _calc_cape_numpy: take out numba reference, add a few comments for better readability @chiaral #37
    • _calc_srh_gufunc: adapt _calc_cape_gufunc @chiaral #37
    • calc_srh: add a few comments for better readability @chiaral #37

Documentation

  • Create readthedocs docs #36 #38
  • update the readthedocs template to extend the copyright time range.
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or documented manual steps?
  • Are the tests described so that the functionality of the software can be verified? #41
  • Community guidelines: Are there clear guidelines for third parties wishing to
    • 1) Contribute to the software
    • 2) Report issues or problems with the software
    • 3) Seek support

Software paper

  • Create Paper folder/md/bib - #35
  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)?
  • Do references in the text use the proper citation syntax?
  • Check if compiles here

add xarray layer

It's great to see so much progress on xcape.

Part of our original vision was to have an xarray API, where you could pass xarray datasets / dataarrays and get them back.

Are you still interested in adding this type of interface?

Dealing with nans

Hi all!

Just looking around the code and am wondering how the package / the Bryan (2008) code deals with NaNs. Lots of CMIP data on pressure levels is NaN because of orography, so I am wondering whether this is taken care of in the code, or whether it is required to remove the NaNs by hand before running it through.

Looking forward to using xcape!

Cheers!
Andrew
