
omuse-geoscience / omuse


The Oceanographic Multi-purpose Software Environment: a package for multi-physics and multi-scale earth science simulations.

License: Apache License 2.0

Topics: oceanography, earth-science, python

omuse's Introduction

OMUSE

OMUSE stands for Oceanographic MUltipurpose Software Environment. It is a package to conduct numerical experiments in oceanography and other Earth sciences. Example OMUSE applications can be found in the examples repository.

Who is OMUSE for?

OMUSE aims to be usable by any researcher or student with a basic knowledge of the Python programming language.

What is this repository for?

This repository contains the source tree for OMUSE, including OMUSE-specific framework components and community codes.

How do I get set up?

While there are some packages available on PyPI, at the moment we recommend doing a pip developer install:

  • set up a Python environment, e.g. using virtualenv, and activate it.
  • in a suitable working directory, clone the AMUSE repository: git clone https://github.com/amusecode/amuse
  • go into the created directory: cd amuse
  • do the developer install from here: pip install -e .[MPI] (the [MPI] extra is optional).
  • going back to the working directory (cd ..), also clone the OMUSE repository: git clone https://github.com/omuse-geoscience/omuse
  • go into the source directory cd omuse and set the environment variable DOWNLOAD_CODES, e.g. export DOWNLOAD_CODES=latest.
  • now do pip install -e . from the root of the package
  • type python setup.py build_codes --inplace to build the codes.
  • the file build.log will report any errors in the build process.

This installs amuse-devel and omuse-devel. The community codes of OMUSE can be built manually by going into each directory:

  • src/omuse/community/adcirc
  • src/omuse/community/swan
  • etc

and typing make download first (needed for some codes) and then make.

OMUSE has been tested on macOS and Linux machines, with ifort and gfortran compilers, on desktop machines and on the Cartesius supercomputer.

In addition to the AMUSE dependencies, OMUSE needs or can use the following packages:

  • matplotlib and basemap
  • netCDF and netCDF-Fortran, with the Python bindings
  • GRIB_API

Documentation

Documentation can be found at https://omuse.readthedocs.io. In addition, the base AMUSE documentation can be consulted.

Reporting issues

Issues can be reported at the OMUSE issue tracker; framework issues should be reported at the AMUSE repository.

Contribution guidelines

Contributions are welcome. Note that most framework development happens at the AMUSE repository. A primer for writing code interfaces and other documentation can be found on the AMUSE website.

omuse's People

Contributors

adamcandy, benvanwerkhoven, esclapez, fjansson, goord, ipelupessy, mchertova, merijn, sbte


omuse's Issues

Loading DALES case fails

Loading the namelist of a predefined DALES case seems to be broken; we always end up with the default values. This can be tested simply by calling

d = Dales(case="bomex", workdir="bomex-perf", number_of_workers=1)
print(d.parameters_DOMAIN.xlat)

which should print 15 instead of 54 in the bomex case.
The problem seems to originate from the call in the Dales constructor

self.read_namelist_parameters(namelist)

which correctly fills the self._parameters dictionary, but not parameter groups like self.parameters_DOMAIN.
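
A minimal workaround sketch, assuming each entry in self._parameters carries a "group" key and that the group attributes follow the parameters_<GROUP> naming (both names are assumptions, not verified against the interface):

d = Dales(case="bomex", workdir="bomex-perf", number_of_workers=1)
# hypothetical workaround: push the parsed namelist values back into the parameter groups
for key, par in d._parameters.items():
    group = getattr(d, "parameters_" + par.get("group", ""), None)  # "group" key is an assumption
    if group is not None:
        setattr(group, par["name"], par["value"])
print(d.parameters_DOMAIN.xlat)  # should now print 15 for the bomex case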

POP needs 2 states for a restart

OMUSE only supports loading a single POP state when restarting from a previous run; this state is used for both oldtime and curtime. That seems wrong, because POP uses a leap-frog scheme, which needs two consecutive states to update any variable. The restart does work, but it takes some time before POP reaches a statistical steady state again, which is undesirable. Support should be added for writing and reading both consecutive states; this should be optional.
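
Purely as an illustration of what the optional two-state restart could look like from user code; the flag and call below are hypothetical, not existing OMUSE API:

p = POP(number_of_workers=4)
p.parameters.restart_two_states = True  # hypothetical option enabling two-state restarts
p.load_restart(oldtime="pop_old_state.nc", curtime="pop_cur_state.nc")  # hypothetical call loading both states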

examine SWAN parallel options

Currently SWAN uses the OpenMP parallelization option; MPI, and a separate option for MPI with unstructured meshes, are also possible. This issue is a reminder that this needs to be examined further.

ERA5 grid orientation

The grid orientation should be checked, as well as the offset; check the definition of the grid (i.e., are the values cell-centered?).

retire DOWNLOAD_CODES

It is usually best to assume DOWNLOAD_CODES=1.
For this to happen the makefiles must:

  • check whether src is already available
  • clean must remove build artefacts
  • distclean must remove the downloaded src code
  • special cases like DALES have extra options keyed on DOWNLOAD_CODES; that is fine, it just means that the default for DOWNLOAD_CODES changes from no to yes

Using flow_links after initialization of DFlowFM interface

Using flow_links directly after initialization gives an error; printing its shape yields (0,):
d=DFlowFM(ini_file="gtsm_coarse.mdu", coordinates="spherical")
print("flow links",d.flow_links.shape)

Adding a print statement for flow_nodes before using flow_links fixes that:
d=DFlowFM(ini_file="gtsm_coarse.mdu", coordinates="spherical")
print("flow nodes",d.flow_nodes.shape)
print("flow links",d.flow_links.shape)

Support for customising `build.sub_commands` directly from `distutils` has been deprecated and is about to be removed from `setuptools`

The following feature in setuptools has been deprecated for almost 2 years and is about to be removed:

https://github.com/pypa/setuptools/blob/1ed759173983656734c3606e9c97a348895e5e0c/setuptools/command/build.py#L13-L27

It might be a good idea to import build directly from setuptools for the following code:

build.sub_commands.insert(0, ('configure_codes', None))

(maybe in other places too)
(build is available directly from setuptools, starting with version v62.4.0)
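
A minimal sketch of the suggested change, with a fallback for older setuptools versions (the try/except compatibility pattern is a suggestion, not the repository's current code):

try:
    from setuptools.command.build import build  # available from setuptools v62.4.0
except ImportError:
    from distutils.command.build import build  # fallback for older setuptools

build.sub_commands.insert(0, ('configure_codes', None))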

delft3d out of tree compiles

  • it may be better to allow for out-of-tree installs of delft3d in the manner of iemic
  • it is a bit tricky with the hot patching, though

Interface of data and era5 modules

It would be good if the interfaces of the data and era5 modules resembled the other model interfaces more, to enable straightforward coupling of models with data.

Suggestions (please add):

  • They should generate their own flow links and nodes
  • Merge the data and era5 modules, providing an option either to download ERA5 data on the fly from CDS or to supply locally stored data files
  • Rename the data module to ExternalData
  • Try to retrieve units from the 'attributes' of the loaded netCDF file.

Hurricane model start time defined by input file

Currently the HollandHurricane model (src/omuse/ext/hurricane_models.py) always starts from the start (top line) of the hurricane track file provided at initialization. This results in unexpected/unwanted behavior when it is coupled to a different model that does involve an explicit start date: the hurricane model will then act as if the hurricane started at the start time of the other model (since model_time is 0). I suggest offering an option to select the start time, so that it does not depend on the provided hurricane track data.

This requires:

  • Adding "start_time" as an optional argument to the initialization function
  • Changing initialize_from_track to take this start date as t0 if it is provided (see the sketch below)
  • Raising an error when the provided start date is out of bounds of the provided track file times
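
A minimal sketch of the proposed change, assuming the track object exposes its timestamps as an ordered sequence (the times attribute and the t0 handling below are assumptions):

def initialize_from_track(self, track, start_time=None):
    times = track.times  # assumed attribute holding the track timestamps
    t0 = times[0] if start_time is None else start_time
    if not (times[0] <= t0 <= times[-1]):
        raise ValueError("start_time lies outside the provided track time range")
    self.t0 = t0  # hypothetical attribute used as the model start time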

DALES interface workdir and output redirection

It is currently not possible to redirect stdout and stderr to files in the model working directory: if workdir does not exist when the Dales object is created and a file in workdir is used for redirection, the code fails, because the directory does not exist yet when the output files are created.

The DALES interface behaves differently depending on whether workdir exists: if not, it assumes a fresh start and copies files from inputdir; if it exists, it assumes a restart and does not copy files. This means the user code cannot create the working directory in advance.

Proposed fix: in the DALES interface.py, move the directory logic before the call that creates the model, and add makedirs in between:

class Dales(CommonCode, CodeWithNamelistParameters):
    ...
    def __init__(self, **options):
        inputdir = None

        # Set the working directory
        self.workdir = "."
        if "workdir" in options:
            self.workdir = options["workdir"]
            if os.path.exists(self.workdir):  # an existing workdir signals a restart
                inputdir = self.workdir
            else:
                # create the work directory so that output files can be placed there
                os.makedirs(self.workdir)

        CodeWithNamelistParameters.__init__(self, namelist_parameters)
        CommonCode.__init__(self, DalesInterface(**options), **options)
        self.stopping_conditions = StoppingConditions(self)

Singularity fails to build

Following up https://bitbucket.org/omuse/omuse/issues/19/singularity-recipe-fails-to-build

Installation progresses, but fails later on:

Building wheels for collected packages: mpi4py
  Building wheel for mpi4py (setup.py) ... done
  Created wheel for mpi4py: filename=mpi4py-3.0.3-cp36-cp36m-linux_x86_64.whl size=2550550 sha256=31350a25826377bd13c2d97a9256868749d407fad9577284353a37da614c9652
  Stored in directory: /root/.cache/pip/wheels/d6/73/83/ad9dd3ebae512829ab3f21657f76403dc4aa6649e1118c9369
Successfully built mpi4py
Installing collected packages: wheel, pluggy, more-itertools, py, pytest, amuse-framework, mpi4py, omuse
  Attempting uninstall: wheel
    Found existing installation: wheel 0.31.1
    Uninstalling wheel-0.31.1:
      Successfully uninstalled wheel-0.31.1
  Running setup.py develop for omuse
Successfully installed amuse-framework-13.2.1 more-itertools-8.4.0 mpi4py-3.0.3 omuse pluggy-0.13.1 py-1.9.0 pytest-5.4.3 wheel-0.34.2
+ python3 setup.py build_code --code-name dales --inplace
running build_code
building libraries and community codes
build, for logging, see 'build.log'
[21:48:26] building dales
Community codes not built (because of errors):
================================================================================
 * dales
================================================================================
0 out of 1 codes built, 0 out of 0 libraries built
Traceback (most recent call last):
  File "setup.py", line 76, in <module>
    data_files=all_data_files,
  File "/usr/local/lib/python3.6/site-packages/setuptools/__init__.py", line 165, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib64/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib64/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/usr/lib64/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/opt/omuse/support/setup_codes.py", line 1217, in run
    BuildCodes.run(self)
  File "/opt/omuse/support/setup_codes.py", line 1096, in run
    raise Exception("No succesful builds detected. Aborting.")
Exception: No succesful builds detected. Aborting.
FATAL:   failed to execute %post proc: exit status 1
FATAL:   While performing build: while running engine: while running /usr/libexec/singularity/bin/starter: exit status 255

NOTE: Using git tag v1.2:

commit caa7673 (HEAD, tag: v1.2)
Merge: 71ad0cd 930d266
Author: ipelupessy [email protected]
Date: Mon Apr 6 12:11:36 2020 +0200

Merge branch 'master' of github.com:omuse-geoscience/omuse

DALES model cleanup_code() fails

Exception when calling cleanup_code() on a Dales object:

CodeException: Exception when calling function 'cleanup_code', of code 'DalesInterface', exception was 'lost connection to code'

Output from the dales model, in the terminal, when launched with redirection='none' to see the output:

 TOTAL wall time =    773.72791208199999     
*** The MPI_Comm_f2c() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.

So far I have tried this from a Jupyter notebook, with a single dales worker and using the sockets channel.

Dales notebook example fails on second run

This issue is copied from the OMUSE bitbucket repo, issue 21.

The example notebook src/omuse/community/dales/example/bubble-notebook.ipynb works once but fails on the second run, presumably because the output directory and name list file from the previous run exist.

d = Dales(workdir='bubble-nb', channel_type='sockets', number_of_workers=1)

Exception                                 Traceback (most recent call last)
<ipython-input-2-3a67f42204ee> in <module>
----> 1 d = Dales(workdir='bubble-nb', channel_type='sockets', number_of_workers=1)

/opt/omuse/src/omuse/community/dales/interface.py in __init__(self, **options)
    640         self.parameters.input_file = namelist
    641         if namelist is not None and os.path.isfile(namelist):
--> 642             self.read_namelist_parameters(namelist)
    643             self._nml_file = namelist
    644         else:

/usr/local/lib/python3.6/site-packages/amuse/support/parameter_tools.py in read_namelist_parameters(self, inputfile, add_missing_parameters)
    196
    197     def read_namelist_parameters(self, inputfile, add_missing_parameters=False):
--> 198         return self.read_parameters(inputfile,add_missing_parameters)
    199
    200     def output_format_value(self,value):

/usr/local/lib/python3.6/site-packages/amuse/support/parameter_tools.py in read_parameters(self, inputfile, add_missing_parameters)
    106                     name=self._parameters[key]["name"]
    107                     dtype=self._parameters[key]["dtype"]
--> 108                     val=self.interpret_value( rawval, dtype=dtype)
    109                     if is_quantity(self._parameters[key]["default"]):
    110                         self._parameters[key]["value"]=new_quantity(val, to_quantity(self._parameters[key]["default"]).unit)

/usr/local/lib/python3.6/site-packages/amuse/support/parameter_tools.py in interpret_value(self, value, dtype)
     92
     93     def interpret_value(self,value, dtype=None):
---> 94         raise Exception("not implemented")
     95
     96     def read_parameters(self, inputfile, add_missing_parameters=False):

Exception: not implemented

This used to work - and still does in my singularity images from 2019-09-19.

Add interface for custom netcdf forcing files

Implementation in #87


The ERA5 interface enables us to download ERA5 data as netCDF files and access them as a regular model grid inside OMUSE. However, more flexibility is required:

  • data can come from sources other than CDS
  • some custom preprocessed data might be used
  • different file types might be required as forcing

This is currently not possible in the ERA5 interface.


Proposal
Set up a new interface, perhaps called "data" or "forcing", which at least accepts netCDF grid files but can ultimately be extended to other file types. It should function in a similar way to, and can be based largely on, the ERA5 interface, and perhaps at some point replace it; it should, however, take a path to one or multiple netCDF files as an argument.
In the future we could even merge the new interface and the old ERA5 interface, with automatic downloading of the data from CDS as an option in the new interface.
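
A sketch of how such an interface might be used; the Forcing class, its files argument, and the u10 variable are all illustrative, not an existing OMUSE API:

f = Forcing(files=["era5_wind_2020.nc", "custom_pressure_2020.nc"])  # hypothetical class and argument
print(f.grid.u10)  # variables exposed as attributes of a regular OMUSE grid, as in the ERA5 module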

Loose ends from repository move

Please report all issues that arise from the move of the repository here.

Just open a new issue if you want to reopen one of the issues from the old bitbucket repository.

add topics

I suggest adding the topics ocean, oceanography, and earth-science in the About section.

Can't set parameters in I-EMIC from OMUSE

As the title says. I can set parameters at the start, I think, but not at any point after some computation has happened (I assume this means after commit_parameters is called, whenever that is). This makes sense, since setParameters is not called anywhere but in load_from_file. If I try to add it in a reasonable place, I run into one of two errors. Either I get the error that I can't set parameters that are not in "Starting Parameters", which makes sense, since we don't allow online modification of those parameters from the I-EMIC side (though they are present in the ParamSet). Or, if I try to make a new Teuchos::ParameterList and only put the modified parameter in there (the proposed way of doing things from the I-EMIC side), I get an error because the ParamSet uses getEntry to set a parameter (and obviously the new parameter is not present in an empty list).

iemic parallel output

For iemic runs with number_of_workers > 1, all logging output of each worker is redirected to stdout. This is mostly duplicate output, so either output only the root worker's messages, or redirect each worker to a separate file (or some combination of the two).

(This can/should be fixed in the iemic interface.cc.)

OMUSE doc reorganization

OMUSE docs need to be expanded and reorganized:

  • general framework documentation
  • utilities and extension doc
  • move dales and era5 to codes section
  • add to codes section
  • examples section?
  • make sure install instructions are correct
  • resolving common problems section

Can't build/install omuse as described in the README

$ python setup.py build_codes --inplace
running build_codes
building libraries and community codes
build, for logging, see 'build.log'
[12:03:38] building adcirc
[12:03:38] building cdo
[12:03:38] building dales
[12:03:39] building iemic
[12:03:43] building oifs
Username for 'https://git.ecmwf.int': 

I don't have a username for that. Maybe it should be made optional.

Discuss merging mosaic and master branch

I propose going back to a single main branch; we currently have two: master and mosaic.

Can we list potential reasons why this is not a good idea?

(While we are at it, we could also rename the master branch to main, as is the default these days.)

The I-EMIC interface uses globals

In some applications we want to have multiple instances of the same model running at the same time (for example, when doing sampling). The use of globals makes this impossible. It would be nice to get rid of this design flaw.

Note that for the Ocean class of I-EMIC running multiple instances at the same time is also impossible due to the Fortran backend, but we are hoping to fix that one day.

In any case, it is probably better and easier to fix this in OMUSE now than in a few years when the codebase has grown.

Py-eddy-tracker software

Hi,

I work on the evolution of py-eddy-tracker.
I discovered today that you use a previous version of this software.

If you want to update the eddy identification and tracking, you can find the documentation and example gallery here.

Antoine

i-emic interface dash problem

When prescribing forcing or other parameters via the i-emic interface, it is impossible to use dashes in a straightforward way: instead of
instance.ocean__THCM.Global_Grid-Size_l
one has to do:
setattr(instance.ocean__THCM, "Global_Grid-Size_l", 32)
That's confusing.

Use lat/lon coordinates for mapping/coupling models

Currently, when using the bilinear_2D_remapper (or any other mapper, for that matter) to map a dataset (either from the ERA5 module or the Data module) onto a model (in this case a Delft3DFM GTSM model), the lat/lon coordinates are not used to map one grid onto the other. For geospatial data and earth models this is unexpected/unwanted behavior, as you would expect a grid with coordinates from -180 to 180 longitude to be mapped properly onto a grid from 0 to 360 longitude.

This can be implemented either by:

  • using a default reference point to which all grids are converted (see the sketch after this list)
  • doing the alignment during the coupling/mapping, which would require a different implementation of the various mappers in amuse (perhaps they could check whether a lon/lat coordinate exists?)
  • an additional layer in omuse
  • a different implementation and concept of the omuse grid object as opposed to the amuse grid object
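
As a minimal sketch of the first option, longitudes could be normalized to a common reference before remapping (plain numpy, assuming longitudes in degrees):

import numpy as np

def normalize_lon(lon_degrees):
    # map longitudes from the [-180, 180) convention onto [0, 360)
    return np.mod(lon_degrees, 360.0)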

singularity image does not find worker code

When I follow the steps from https://omuse.readthedocs.io/en/latest/singularity.html#singularity-section without having omuse built locally, I get the error

Singularity omuse.img:/opt/omuse/src/omuse/community/dales/example> python bubble.py
Traceback (most recent call last):
  File "bubble.py", line 22, in <module>
    d = Dales(workdir='bubble', channel_type='sockets', number_of_workers=1)
  File "/home/gvdoord/Software/amuse/src/omuse/src/omuse/community/dales/interface.py", line 587, in __init__
    CommonCode.__init__(self, DalesInterface(**options), **options)
  File "/home/gvdoord/Software/amuse/src/omuse/src/omuse/community/dales/interface.py", line 34, in __init__
    CodeInterface.__init__(self, name_of_the_worker=self.name_of_the_worker(), **options)
  File "/home/gvdoord/.local/lib/python3.6/site-packages/amuse/rfi/core.py", line 752, in __init__
    self._start(name_of_the_worker = name_of_the_worker, **options)
  File "/home/gvdoord/.local/lib/python3.6/site-packages/amuse/rfi/core.py", line 771, in _start
    self.channel = self.channel_factory(name_of_the_worker, type(self), interpreter_executable = interpreter_executable, **options)
  File "/home/gvdoord/.local/lib/python3.6/site-packages/amuse/rfi/channel.py", line 1694, in __init__
    self.full_name_of_the_worker = self.get_full_name_of_the_worker(legacy_interface_type)
  File "/home/gvdoord/.local/lib/python3.6/site-packages/amuse/rfi/channel.py", line 692, in get_full_name_of_the_worker
    raise exceptions.CodeException("The worker application does not exist, it should be at: \n{0}".format('\n'.join(tried_workers)))
amuse.support.exceptions.CodeException: The worker application does not exist, it should be at: 
/home/gvdoord/Software/amuse/src/omuse/src/omuse/_workers/dales_worker
/home/gvdoord/.local/lib/python3.6/site-packages/amuse/_workers/dales_worker
/home/gvdoord/Software/amuse/src/omuse/src/omuse/community/dales/dales_worker
/home/gvdoord/.local/lib/python3.6/site-packages/amuse/rfi/dales_worker

Somehow it tries to find the worker under my home directory; am I missing something?

allow SWAN w/o MPI

By default SWAN is compiled with MPI enabled, leading to an error if no MPI compiler is present.
This is not necessary; it is possible to disable MPI when it is not available.

Running Delft3DFM model with multiple workers and ERA5 external forcing crashes

When running the Delft3DFM model with multiple workers, while using ERA5 netCDF files for global forcing through the forcing interface of Delft3DFM itself (instead of using the ERA5 interface in OMUSE), the run crashes with the following error message:

mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[53422,2],9]
  Exit code:    245

The problem seems to be that each Delft3DFM worker tries to load the same 10 GB netCDF file (as specified in the ExtForceFile entry, the file with the .ext extension) into memory at the same time. In the case of 32 workers, this would require 320 GB of memory.

This problem came up because we want to use different custom forcing files and not just the default ERA5 forcing.
