
eke_sst_trends's Introduction

Global changes in oceanic mesoscale currents over the satellite altimetry record


A repository containing material related to the manuscript

Martínez-Moreno, J., Hogg, A. McC., England, M. H., Constantinou, N. C., Kiss, A. E., and Morrison, A. K. Global changes in oceanic mesoscale currents over the satellite altimetry record. (submitted in Oct. 2020; preprint available at doi:10.21203/rs.3.rs-88932/v1)

that investigates the temporal evolution of oceanic surface eddy kinetic energy and sea surface temperature over the satellite record from global, geographical, and dynamical-region perspectives.

Analysed datasets include the AVISO+ SSH altimetry product and the NOAA Optimum Interpolation Sea Surface Temperature (OISST) dataset.

Python requirements:

Make sure you have the required dependencies installed (numpy, xarray, dask, cartopy, cmocean, and jupyterlab), using either pip:

pip install -r requirements.txt

or conda:

conda install -c conda-forge --file ./requirements.txt

Additionally, install xarrayMannKendall:

git clone https://github.com/josuemtzmo/xarrayMannKendall.git

and follow the installation instructions on the xarrayMannKendall GitHub page.
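
Once installed, trends are computed with the package's Mann_Kendall_test class. Below is a minimal sketch of its use; the random DataArray is a hypothetical stand-in for a real (time, lat, lon) field, and the constructor arguments shown are assumptions that may differ between versions, so consult the xarrayMannKendall README for the exact signature.

import numpy as np
import xarray as xr
from xarrayMannKendall import Mann_Kendall_test

# Hypothetical (time, lat, lon) field standing in for, e.g., an EKE time series.
data = xr.DataArray(
    np.random.rand(120, 36, 72),
    dims=["time", "lat", "lon"],
    coords={"time": np.arange(120),
            "lat": np.linspace(-87.5, 87.5, 36),
            "lon": np.linspace(2.5, 357.5, 72)},
)

# Constructor arguments are assumed; check the installed version's docs.
MK = Mann_Kendall_test(data, alpha=0.05,
                       coords_name={"time": "time", "x": "lon", "y": "lat"})
trends = MK.compute()  # Dataset with 'trend', 'signif', 'p' and 'std_error' variables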

Contents:

manuscript: folder containing the LaTeX source files and figures for the manuscript.

datasets: folder in which the NetCDF (.nc) output files are expected to be found. Download the NetCDF files from Zenodo (doi:10.5281/zenodo.3993824).

figures: folder with Jupyter notebooks that produce the main figures of the manuscript.

pre-processing: folder with scripts and instructions that reproduce the .nc files in datasets from the raw AVISO+ and OISST datasets.

trends: folder with Jupyter notebooks that compute the trends.

Datasets:

To generate all the pre-processed datasets of this repository, you need access to the AVISO+ altimetry and NOAA OISST datasets for the period Jan. 1993 - Mar. 2020. The generated pre-processed datasets reproduce the analysis and results presented in the manuscript. All the notebooks required to reproduce the pre-processed datasets from the raw AVISO+ and NOAA OISST datasets are in the pre-processing and trends folders. The notebooks within the pre-processing folder use the raw satellite output to produce some of the .nc files inside datasets; the notebooks within trends use the .nc files in datasets produced by pre-processing to output the *_trends.nc files inside datasets.

Execute the notebooks in the following order:

  1. ./pre-processing/AVISO+_to_EKE_timeseries.ipynb: generate the EKE field (lon, lat, time) from the AVISO+ geostrophic velocity anomalies (see the sketch after this list).

  2. ./pre-processing/OISST_to_SST_grad_timeseries.ipynb: generate the SST gradient field (lon, lat, time) from the NOAA OISST record.

  3. ./pre-processing/KE_anomaly_timeseries.ipynb: generate the KE anomaly field (lon, lat, time) from the AVISO+ geostrophic velocities.

  4. ./pre-processing/EKE_scale_decomposition.ipynb: use a 3° x 3° kernel to decompose the EKE field (lon, lat, time) into large-scale EKE and mesoscale EKE (see the sketch after this list).

  5. ./pre-processing/SST_gradient_scale_decomposition.ipynb: use a 3° x 3° kernel to decompose the SST gradient field (lon, lat, time) into large-scale SST gradients and mesoscale SST gradients (features smaller than 3° x 3°).

  6. Subsequently, reproduce the trends by executing the notebooks in the trends folder.

  7. Download the ocean basin mask into the datasets folder:

    cd datasets
    wget -O ocean_basins_and_dynamical_masks.nc https://zenodo.org/record/3993824/files/ocean_basins_and_dynamical_masks.nc?download=1
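
For orientation, here is a minimal sketch of the two core operations in steps 1, 4, and 5: computing EKE from the geostrophic velocity anomalies, EKE = (1/2)(u'² + v'²), and splitting a field into large-scale and mesoscale parts with a 3° x 3° running-mean kernel. The variable names ugosa and vgosa follow the AVISO+ convention, but the file name and the 12-point window (assuming a 1/4° grid) are illustrative assumptions; the notebooks are the authoritative reference.

import xarray as xr

# Illustrative file name; the real input is the AVISO+ gridded product.
ds = xr.open_dataset("aviso_geostrophic_anomalies.nc")

# Step 1: EKE from geostrophic velocity anomalies, EKE = 0.5 * (u'^2 + v'^2).
eke = 0.5 * (ds.ugosa**2 + ds.vgosa**2)
eke.name = "EKE"
eke.attrs["units"] = "m^2 s^-2"

# Steps 4-5: scale decomposition with a 3° x 3° kernel; on a 1/4° grid,
# 3° corresponds to a 12-point window (an assumed grid spacing).
window = 12
large_scale = eke.rolling(lat=window, lon=window, center=True, min_periods=1).mean()
mesoscale = eke - large_scale  # features smaller than 3° x 3°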
    

Optionally, if you do not have access to the AVISO+ or NOAA OISST datasets, you can download the pre-processed datasets from Zenodo (doi:10.5281/zenodo.3993824). To facilitate the download of all *.nc files, first install zenodo_get:

pip install zenodo-get

Then all the datasets can be downloaded via:

cd datasets
zenodo_get 10.5281/zenodo.3993824

WARNING: ~17 GB of disk space is required to download all the contents of the datasets folder.

Now you can reproduce all the analyses and figures of the manuscript; see the figures folder.

Authors:

Josué Martínez-Moreno, Andrew McC. Hogg, Matthew H. England, Navid C. Constantinou, Andrew E. Kiss, and Adele K. Morrison

Funding:

This study was supported by the ARC Centre of Excellence for Climate Extremes (CLEx) funded by the Australian Research Council, grant CE170100023. In addition:

  • J.M.‐M. was supported by funding from the Consejo Nacional de Ciencia y Tecnología (CONACYT), Mexico,
  • M.H.E. was supported by the Centre for Southern Hemisphere Oceans Research (CSHOR), a joint research centre between Qingdao National Laboratory for Marine Science and Technology (QNLM), Commonwealth Scientific and Industrial Research Organisation (CSIRO), University of New South Wales (UNSW), and the University of Tasmania (UTAS), and
  • A.K.M. was supported by the Australian Research Council DECRA Fellowship, grant DE170100184.

Citation:

Cite this repository as:

Josué Martínez Moreno, Andrew McC. Hogg, Matthew H. England, Navid C. Constantinou, Andrew E. Kiss, & Adele K. Morrison. (2021, January 23). josuemtzmo/EKE_SST_trends: EKE_SST_trends: Jupyter notebooks (Python) used to compute trends of Eddy kinetic energy and sea surface temperature (Version v0.1.0-alpha). Zenodo. http://doi.org/10.5281/zenodo.4458783

Software reference:

  • David Völgyes, & Rick Lupton. (2020, February 20). Zenodo_get: a downloader for Zenodo records (Version v1.3.0). Zenodo. http://doi.org/10.5281/zenodo.3676567

  • Josué Martínez Moreno, & Navid C. Constantinou. (2021, January 23). josuemtzmo/xarrayMannKendall: Mann Kendall significance test implemented in xarray. (Version v.1.0.1). Zenodo. http://doi.org/10.5281/zenodo.4458776


eke_sst_trends's Issues

File structures

Should the Jupyter notebooks (currently in the root directory) be inside a figures directory (as the README suggests)?

How to solve this error?

When executing the sst_trends.compute() command, it returns an error like the following. Do you have any suggestions?

2023-05-22 07:21:50,876 - distributed.protocol.pickle - ERROR - Failed to serialize <ToPickle: HighLevelGraph with 1 layers.
<dask.highlevelgraph.HighLevelGraph object at 0x7efda0d52230>
 0. 139627919285376
>.
Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 63, in dumps
    result = pickle.dumps(x, **dump_kwargs)
TypeError: cannot pickle '_thread.lock' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 68, in dumps
    pickler.dump(x)
TypeError: cannot pickle '_thread.lock' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 81, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/home/ubuntu/miniconda3/lib/python3.10/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/home/ubuntu/miniconda3/lib/python3.10/site-packages/cloudpickle/cloudpickle_fast.py", line 632, in dump
    * reducer_override has the priority over dispatch_table-registered
TypeError: cannot pickle '_thread.lock' object
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py:63, in dumps(x, buffer_callback, protocol)
     62 try:
---> 63     result = pickle.dumps(x, **dump_kwargs)
     64 except Exception:

TypeError: cannot pickle '_thread.lock' object

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py:68, in dumps(x, buffer_callback, protocol)
     67 buffers.clear()
---> 68 pickler.dump(x)
     69 result = f.getvalue()

TypeError: cannot pickle '_thread.lock' object

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/serialize.py:350, in serialize(x, serializers, on_error, context, iterate_collection)
    349 try:
--> 350     header, frames = dumps(x, context=context) if wants_context else dumps(x)
    351     header["serializer"] = name

File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/serialize.py:73, in pickle_dumps(x, context)
     71     writeable.append(not f.readonly)
---> 73 frames[0] = pickle.dumps(
     74     x,
     75     buffer_callback=buffer_callback,
     76     protocol=context.get("pickle-protocol", None) if context else None,
     77 )
     78 header = {
     79     "serializer": "pickle",
     80     "writeable": tuple(writeable),
     81 }

File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/pickle.py:81, in dumps(x, buffer_callback, protocol)
     80     buffers.clear()
---> 81     result = cloudpickle.dumps(x, **dump_kwargs)
     82 except Exception:

File ~/miniconda3/lib/python3.10/site-packages/cloudpickle/cloudpickle_fast.py:73, in dumps(obj, protocol, buffer_callback)
     70 cp = CloudPickler(
     71     file, protocol=protocol, buffer_callback=buffer_callback
     72 )
---> 73 cp.dump(obj)
     74 return file.getvalue()

File ~/miniconda3/lib/python3.10/site-packages/cloudpickle/cloudpickle_fast.py:632, in dump(self, obj)
    607 def reducer_override(self, obj):
    608     """Type-agnostic reducing callback for function and classes.
    609 
    610     For performance reasons, subclasses of the C _pickle.Pickler class
    611     cannot register custom reducers for functions and classes in the
    612     dispatch_table. Reducer for such types must instead implemented in
    613     the special reducer_override method.
    614 
    615     Note that method will be called for any object except a few
    616     builtin-types (int, lists, dicts etc.), which differs from reducers
    617     in the Pickler's dispatch_table, each of them being invoked for
    618     objects of a specific type only.
    619 
    620     This property comes in handy for classes: although most classes are
    621     instances of the ``type`` metaclass, some of them can be instances
    622     of other custom metaclasses (such as enum.EnumMeta for example). In
    623     particular, the metaclass will likely not be known in advance, and
    624     thus cannot be special-cased using an entry in the dispatch_table.
    625     reducer_override, among other things, allows us to register a
    626     reducer that will be called for any class, independently of its
    627     type.
    628 
    629 
    630     Notes:
    631 
--> 632     * reducer_override has the priority over dispatch_table-registered
    633     reducers.
    634     * reducer_override can be used to fix other limitations of
    635       cloudpickle for other types that suffered from type-specific
    636       reducers, such as Exceptions. See
    637       https://github.com/cloudpipe/cloudpickle/issues/248
    638     """
    639     if sys.version_info[:2] < (3, 7) and _is_parametrized_type_hint(obj):  # noqa  # pragma: no branch

TypeError: cannot pickle '_thread.lock' object

The above exception was the direct cause of the following exception:

TypeError                                 Traceback (most recent call last)
Cell In[34], line 1
----> 1 sst_trends.compute()

File ~/miniconda3/lib/python3.10/site-packages/xarrayMannKendall/decorators.py:91, in check_if_file_exists.<locals>.file_exists_inner(self, *args, **kwargs)
     89         return init(self, *args, **kwargs)
     90 else:
---> 91     return init(self, *args, **kwargs)

File ~/miniconda3/lib/python3.10/site-packages/xarrayMannKendall/xarrayMannKendall.py:210, in Mann_Kendall_test.compute(self, save, path, rewrite_trends, scheduler, progress_bar)
    208         MK_output = trend_method.compute(scheduler=scheduler)
    209 else:
--> 210     MK_output = trend_method.compute(scheduler=scheduler)
    212 if len(self.ordered_dims) == 2:
    213     ds = xr.Dataset({'trend': (['x'], MK_output[:,0]),
    214                  'signif': (['x'], MK_output[:,1]),
    215                  'p': (['x'], MK_output[:,2]),
    216                  'std_error': (['x'], MK_output[:,3])
    217                 },
    218                 coords={'x': (['x'], da2py(self.DataArray.x, False))})

File ~/miniconda3/lib/python3.10/site-packages/dask/base.py:314, in DaskMethodsMixin.compute(self, **kwargs)
    290 def compute(self, **kwargs):
    291     """Compute this dask collection
    292 
    293     This turns a lazy Dask collection into its in-memory equivalent.
   (...)
    312     dask.compute
    313     """
--> 314     (result,) = compute(self, traverse=False, **kwargs)
    315     return result

File ~/miniconda3/lib/python3.10/site-packages/dask/base.py:599, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)
    596     keys.append(x.__dask_keys__())
    597     postcomputes.append(x.__dask_postcompute__())
--> 599 results = schedule(dsk, keys, **kwargs)
    600 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])

File ~/miniconda3/lib/python3.10/site-packages/distributed/client.py:3204, in Client.get(self, dsk, keys, workers, allow_other_workers, resources, sync, asynchronous, direct, retries, priority, fifo_timeout, actors, **kwargs)
   3131 def get(
   3132     self,
   3133     dsk,
   (...)
   3145     **kwargs,
   3146 ):
   3147     """Compute dask graph
   3148 
   3149     Parameters
   (...)
   3202     Client.compute : Compute asynchronous collections
   3203     """
-> 3204     futures = self._graph_to_futures(
   3205         dsk,
   3206         keys=set(flatten([keys])),
   3207         workers=workers,
   3208         allow_other_workers=allow_other_workers,
   3209         resources=resources,
   3210         fifo_timeout=fifo_timeout,
   3211         retries=retries,
   3212         user_priority=priority,
   3213         actors=actors,
   3214     )
   3215     packed = pack_data(keys, futures)
   3216     if sync:

File ~/miniconda3/lib/python3.10/site-packages/distributed/client.py:3103, in Client._graph_to_futures(self, dsk, keys, workers, allow_other_workers, internal_priority, user_priority, resources, retries, fifo_timeout, actors)
   3100 from distributed.protocol import serialize
   3101 from distributed.protocol.serialize import ToPickle
-> 3103 header, frames = serialize(ToPickle(dsk), on_error="raise")
   3104 nbytes = len(header) + sum(map(len, frames))
   3105 if nbytes > 10_000_000:

File ~/miniconda3/lib/python3.10/site-packages/distributed/protocol/serialize.py:372, in serialize(x, serializers, on_error, context, iterate_collection)
    370     return {"serializer": "error"}, frames
    371 elif on_error == "raise":
--> 372     raise TypeError(msg, str(x)[:10000]) from exc
    373 else:  # pragma: nocover
    374     raise ValueError(f"{on_error=}; expected 'message' or 'raise'")

TypeError: ('Could not serialize object of type HighLevelGraph', '<ToPickle: HighLevelGraph with 1 layers.\n<dask.highlevelgraph.HighLevelGraph object at 0x7efda0d52230>\n 0. 139627919285376\n>')

requirements.txt requires some explanation?

What is the purpose of requirements.txt?
Is it just for users to read?
Or can it be used to install the required packages via pip?
If the latter, perhaps some explanation in the README is in order?

Questions about calculating EKE

Hi, Prof!

I am new to this, and I want to calculate EKE and KE from my own SLA field rather than from the AVISO products, so I have to calculate u and v myself. After reading some articles, I was stumped by a few questions about the formulas. I hope you can answer them.

  1. KE = u² + v²; EKE = (1/2)(u² + v²). Do u and v in these two formulas mean the same thing? If so, can I understand that KE is equal to 2 times EKE?

  2. Is the formula for ugosa in the AVISO product u = -(g/f) ∂η'/∂y?

Thanks for your reply!
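
For reference, the standard geostrophic relations are u = -(g/f) ∂η/∂y and v = (g/f) ∂η/∂x. A minimal sketch of computing geostrophic velocity anomalies from an SLA field follows; the file name, the variable name sla, and the regular lat-lon grid are illustrative assumptions:

import numpy as np
import xarray as xr

g = 9.81           # gravitational acceleration [m s^-2]
OMEGA = 7.2921e-5  # Earth's rotation rate [rad s^-1]
R = 6.371e6        # Earth's mean radius [m]

# Illustrative SLA dataset with dims (time, lat, lon); sla in metres.
sla = xr.open_dataset("my_sla.nc").sla

f = 2 * OMEGA * np.sin(np.deg2rad(sla.lat))  # Coriolis parameter [s^-1]
deg2m = R * np.pi / 180                      # metres per degree of latitude

# Coordinate-based derivatives (per degree), converted to per metre.
deta_dy = sla.differentiate("lat") / deg2m
deta_dx = sla.differentiate("lon") / (deg2m * np.cos(np.deg2rad(sla.lat)))

ugosa = -(g / f) * deta_dy  # zonal geostrophic velocity anomaly
vgosa = (g / f) * deta_dx   # meridional geostrophic velocity anomaly
# Note: f -> 0 at the equator, so mask low latitudes (e.g. |lat| < 5°).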

Suggestions for notebooks that reproduce figures

In the notebooks, e.g.,
https://github.com/josuemtzmo/EKE_SST_trends/blob/master/Kinetic_Energy_maps.ipynb
it would be nice to do the following:

  • when you load the xarray.Dataset, print it out so people can see its dimensions, the length of the time axis, and so on.
  • when you convert units, also change the xarray attributes to reflect that; see, e.g., trends = ( dataset_trends.trend * 10*365 ).sel(lat=slice(-60,60)) # Trends per day multiplied by 3650 days to report per decade.
  • by the way, the line above does two things: it converts units and makes a selection; this can confuse a novice user... Let's help the future generations of students who will struggle to reproduce Josué et al. 2021 :) (see the sketch after this list)
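
As a sketch of that last point, the one-liner could be split so each step is explicit and the metadata stays truthful (dataset_trends, its file name, and its original per-day units are assumed to match the notebook's):

import xarray as xr

# Hypothetical trends file standing in for the notebook's dataset_trends.
dataset_trends = xr.open_dataset("eke_trends.nc")

# Step 1: select the latitude band of interest.
trend_band = dataset_trends.trend.sel(lat=slice(-60, 60))

# Step 2: convert units from per day to per decade (10 years * 365 days).
trends = trend_band * 10 * 365
trends.attrs["units"] = "per decade"  # update the metadata to reflect the conversion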
