pism / pism


Repository for the Parallel Ice Sheet Model (PISM)

Home Page: https://pism.io/

License: GNU General Public License v3.0

Languages: C++ 78.07%, Python 15.18%, CMake 1.64%, TeX 1.56%, C 1.18%, SWIG 1.03%, Shell 0.99%, Fortran 0.23%, MATLAB 0.08%, Makefile 0.03%, Dockerfile 0.02%, Emacs Lisp 0.01%

Topics: scientific-computing, parallel, geophysics, c-plus-plus, ice-sheet, glacier, sea-level, numerical, petsc, python

pism's Introduction

PISM, a Parallel Ice Sheet Model


The Parallel Ice Sheet Model is an open source, parallel, high-resolution ice sheet model:

  • hierarchy of available stress balances
  • marine ice sheet physics, dynamic calving fronts
  • polythermal, enthalpy-based conservation of energy scheme
  • extensible coupling to atmospheric and ocean models
  • verification and validation tools
  • documentation for users and developers
  • uses MPI and PETSc for parallel simulations
  • reads and writes CF-compliant NetCDF files

PISM is jointly developed at the University of Alaska Fairbanks (UAF) and the Potsdam Institute for Climate Impact Research (PIK). UAF developers are based in the Glaciers Group at the Geophysical Institute.

Please see ACKNOWLEDGE.rst and doc/funding.csv for a list of grants supporting PISM development.

Homepage

http://www.pism.io/

Download and Install

See Installing PISM on pism.io.

Support

Please e-mail [email protected] with questions about PISM.

You can also join the PISM workspace on Slack.

Contributing

Want to contribute? Great! See Contributing to PISM.

pism's People

Contributors

aaschwanden, bueler, ckhroulev, damaxwell, deepsourcebot, delayedneutron, enricodeg, florianziemen, jedbrown, juseg, m-kreuzer, mankoff, mariazeitz, matthiasmengel, ronjareese, sebhinck, sschoell, talbrecht, tkleiner


pism's Issues

Clarify documentation

We should make it clear which variables in the -extra_vars list are 2D and which are 3D.

Create a document describing PISM's "boundary models"

In other words: make the "PISM User's Manual" shorter by moving the section "Setting up PISM boundary (atmosphere, surface and ocean) models" into a separate document. (Some of the text can be re-used, some will have to be re-written.)

add linear iteration count and max velocities to stdout

Currently our -verbose 2 stdout looks something like this:

 y  SSA:     3 outer iterations
v$Eh d (dt=1.98573 in 2 substeps; av dt_sub_mass_cont=0.99287)
S      3.97219:  3.62510    2.1352   1.52198036

The report from the stressbalance is

 y  SSA:     3 outer iterations

The "S"=summary line has

S      time       :  ivol             iarea       max_diffusivity

These are good but I would like to augment just a bit:

 y  SSA:     3 outer iterations  (with  K  av linear iterations)
S      time       :  ivol             iarea       max_diffusivity    max_2d_abs_vel   max_3d_abs_vel

Here K is from KSPGetIterationNumber(), i.e. as reported by -ksp_converged_reason. The last two entries on the summary line are the numbers controlling the CFL condition for the 2D mass continuity and 3D enthalpy jobs, respectively.
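
For reference, here is a minimal petsc4py sketch of where K comes from; PISM's C++ code would call KSPGetIterationNumber() directly, and the tridiagonal system below is only a stand-in for the SSA's linearized problem:

from petsc4py import PETSc

# Assemble a small tridiagonal SPD system (a stand-in for the real problem).
n = 10
A = PETSc.Mat().createAIJ([n, n])
A.setUp()
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()  # honors -ksp_type, -ksp_converged_reason, etc.
ksp.solve(b, x)

# This is the per-solve count to average over the outer iterations (the K above).
print("linear iterations:", ksp.getIterationNumber())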

return SSAFEM object to better regularization form: nuH -> nuH + ssa_eps

David: This change is already made in the SSAFD solver. That is, the SSAFD solver now has the regularization you would prefer, and had implemented for the SSAFEM solver, where epsilon_ssa is added to nuH instead of being used as a lower bound (which is less smooth). So now could you put it back the way you had it, so that epsilon_ssa is again used the same way in both solvers? I don't dare screw up the Jacobian code ...

time series should write global attributes

Unlike output files and extra files, scalar time series files are missing global attributes (e.g. title, institution, source, etc).

Is there a deeper reason, or am I missing something?

(tested with PISM development 07683de)

despite preliminary step, max_hor_vel still reported as zero

This is not a big issue, but an issue-reporting system should be used.

At the start of a run I get:

   doing preliminary step to fill diagnostic quantities ...
   running forward ...
   P         YEAR:     ivol     iarea     max_diff       max_hor_vel
   U        years 10^6_km^3 10^6_km^2     m^2 s^-1            m/year
   S      0.00000: 25.76450   13.7242   1.71849559           0.00000

I think that if we are doing the preliminary step, the max_hor_vel spot should be filled with a meaningful number. Or that spot should be filled with something like NONE.

Or do I misunderstand what the prelim step is doing?

Why do we want a value? Because that information would help users with an important concept: after a no-mass run, or a change in basal resistance or geometry, etc. (i.e. after any major switch of the model), there are really high velocities, and one should pay more attention to that (e.g. use -skip 1 and a short time step).

Feel free to close the issue as soon as you think it should be closed.

Make vel2tauc friendly to PISM workflow

Specification of vel2tauc.py:

(As revised following a November 4 2011 conversation with Ed)

The job of vel2tauc.py is to take a PISM model state, a set of 'observed ssa velocities', a scalar function vel_misfit_weight (the variable previously known as range_l2_weight), a previous guess for tauc, and produce an inferred value of tauc.

Variables for the model state are specified by either -i in_file or -a in_file; we'll call in_file the input file.

Output is stored in the output file. If -a in_file is specified, the input file is the output file. Otherwise, -o out_file may be used to indicate an output file. It's an error to use both -a and -o. If -i is used but not -o, then the output is placed in vel2tauc_input_file.
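
A minimal sketch of these file-resolution rules (function and variable names are illustrative, not vel2tauc.py's actual option handling):

def resolve_files(i_file=None, a_file=None, o_file=None):
    # -a and -o are mutually exclusive
    if a_file is not None and o_file is not None:
        raise ValueError("use either -a or -o, not both")
    if a_file is not None:
        # append mode: the input file is also the output file
        return a_file, a_file
    if i_file is None:
        raise ValueError("one of -i or -a is required")
    if o_file is not None:
        return i_file, o_file
    # -i without -o: derive the output name from the input name
    return i_file, "vel2tauc_" + i_file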

Variables for the 'observed ssa velocities' and misfit weight are specified in the inverse data file. It can be specified by -inv_data foo.nc, and is otherwise taken to be the input file.

The following variables are meaningful in the inverse data file:

  • u/v_surface_observed -- surface velocities from measurement
  • vel_surface_mask -- = 1 if velocity measurement is present
  • u/v_ssa_observed -- explained below
  • vel_misfit_weight -- a nodewise weight used in the L2 misfit functional
  • tauc_prior -- explained below

If the variables u/v_ssa_observed are present, these are taken as input for inversion. Otherwise, the variables u/v_surface_observed must be present, and a solution of the SIA, u/v_sia_observed is constructed using data from the input file. We set u/v_ssa_observed = u/v_surface_observed - u/v_sia_observed. One can imagine in the future perhaps creating u/v_sia_observed from geometry different from the geometry in the input file; hence the appendage observed here.

I will be experimenting with using a vel_misfit_weight that is not specified on a per-node basis but on a per-element basis. For now, I'll use vel_elementwise_misfit_weight for this quantity.

The inversion run requires a starting point for tauc. The default is to take this variable from the input file. It may be useful to be able to restart an inversion from an aborted attempt. During each iteration of the inversion, the current value of tauc produced will be saved in the output file as tauc_inv. If -inv_restart is specified, then the starting tauc is taken to be tauc_inv in the input file. An explicit value of tauc can be specified in the inverse data file as tauc_prior, and use of this starting point is indicated via -use_tauc_prior.
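
These rules amount to the following selection logic (a sketch; names are illustrative):

def initial_tauc_source(input_file, inv_data_file,
                        inv_restart=False, use_tauc_prior=False):
    # Return the (file, variable) pair supplying the starting tauc.
    if inv_restart:
        return (input_file, "tauc_inv")       # continue an aborted inversion
    if use_tauc_prior:
        return (inv_data_file, "tauc_prior")  # explicit starting point
    return (input_file, "tauc")               # default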

The final value of tauc computed from inversion is saved in the output file as tauc. The starting point for tauc is saved in the output file as tauc_prior. We save the data we used from the inverse data file to the output file, including u/v_sia_observed if they were generated. If not running in append mode, we also write all PISM model state variables required for the SSA/SIA to the output file.

At the end of a successful inversion run that started from a tauc and produced "output_file", we should be able to replicate exactly the same inversion with the flags -i/a output_file -use_tauc_prior. The output file will be managed in such a way that if there is a crash or an abort, the inversion can be continued from the output_file with -i/a output_file -inv_restart.

I envision that standard use will be

vel2tauc.py -a input-file.nc -inv_data vel-obs.nc [other flags]

with tauc from the input data being used as the initial guess and consequently saved as tauc_prior post-inversion.

After a successful inversion, the output file will contain:

tauc: the tauc generated from inversion
tauc_prior: the tauc starting point for the inversion
tauc_inv: the most recent iteration of tauc produced during an inversion run

SIGUSR1 not giving important diagnostic fields

Try

pisms -y 1e6 >> out.foo &
pkill -usr1 pisms
pkill -kill pisms      # to actually kill it

Note

$ pisms -pismversion
PISMS development 09f904a (simplified geometry mode)

I use SIGUSR1 to get csurf, cbase and several other fields to evaluate the state of the run. This is a change in behavior. The set of diagnostic fields might be the user's original choice of -o_size, or could just be medium.

add changed ocean load (both from topg effects and PISMOceanModel sea level) to bed deformation

Sea level is handled by PISMOceanModel. But changing the sea level and changing topg both mean changing the sea load on the crust. This needs to be thought through.
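
To make the dependence explicit: in an ice-free ocean cell the load per unit area on the crust is, as a sketch that ignores grounding-line details,

  p = \rho_sw g (z_sl - topg),   topg < z_sl,

so a change in either the sea level z_sl (owned by PISMOceanModel) or in topg changes the load that the bed deformation model should see.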

The full load change should be seen and handled by the "-bed_def lc" model, at least; it is not important enough to suggest modifications to the "-bed_def iso" equations, I think.

With this change it is almost certainly appropriate to raise "Z" (i.e. BedDeformLC.Z). Current default is 4; raise to 8.

inward scheme for calculation of principal strain rates not rotationally invariant in stable version

The inward scheme for the calculation of strain rates in stressbalance/SSA.cc in SSA::compute_principal_strain_rates() still asks for the ice thickness of the neighbors, but not for all four directions. In the dev version I committed a mask-based solution. Since eigencalving uses principal components of the strain rates in the penultimate grid cell upstream from the front (offset=2), this should not impact the results much.

create option -tauc_to_phi

This task was created based on a 4 Nov conversation with David.

The inversion procedure for the SSA will involve a Python script which uses David's siple package. I'm guessing it is called vel2tauc.py. It will allow a -a option to append the results of the inversion to a PISM NetCDF file.

There will generically be a PISM spinup run before inversion. To be useful, it generates a tauc field in its output file. The inversion will take that tauc as an initial guess. It will invert the SSA stress balance using surface velocities from another file (generically) and then generate a new tauc. The old tauc will be renamed as tauc_prior (I believe) in append mode.

The option -tauc_to_phi will be used in the PISM run after the inversion. It will (generically) identify the file from which PISM should get a tauc field. Thus the sequence looks something like this:

pismr -boot_file originaldata.nc -y 1e4 -ssa_sliding -topg_to_phi a,b,c,d,e \
   -o spinupresult.nc [other options] 

vel2tauc.py -a spinupresult.nc -inv_data vel-obs.nc [other options]

pismr -i spinupresult.nc -ssa_sliding -tauc_to_phi -y 100 [other options]

Note that if we see -tauc_to_phi foo.nc then foo.nc is the file from which to extract the field tauc in order to calculate the basal hydrology model state variable tillphi. But if we see -tauc_to_phi without a file name we may assume that the -i file has the tauc variable.

The PISMYieldStress object needs to have an invertible function which determines tauc given the state variable (e.g. tillphi), in order for an option like -tauc_to_phi to work. Both options -topg_to_phi and -tauc_to_phi will be handled by a PISMDefaultYieldStress object, and not IceModel.
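
A minimal sketch of such an invertible pair, assuming a Mohr-Coulomb-type relation tauc = c0 + tan(phi) N with effective pressure N (the exact form used by PISMDefaultYieldStress may differ):

import numpy as np

def tauc_from_phi(phi_degrees, N, c0=0.0):
    # Mohr-Coulomb-type yield stress from the till friction angle.
    return c0 + np.tan(np.radians(phi_degrees)) * N

def phi_from_tauc(tauc, N, c0=0.0):
    # The inverse map that -tauc_to_phi needs.
    return np.degrees(np.arctan((tauc - c0) / N))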

The inversion script vel2tauc.py does generically know about the power law used in the sliding model, and tauc means the coefficient in that power law. But it does not need to know about the state variable of the basal hydrology model (i.e. tillphi). A hydrology model can be arbitrarily complicated and can have a ridiculously big state. On the other hand, the current default hydrology is really preliminary and should change. Both of these are reasons why the inversion need not know about the hydrology state variable(s).

Add the "hours since the beginning of a run" variable to extra_files.

Currently, each time a record is written to an extra_file, we prepend a time-stamp string to the history attribute. Unfortunately, in some cases this attribute gets very long. Once it exceeds ~50K bytes, adding more records becomes very slow (because of NetCDF-3 file format limitations).

This can be resolved by moving this time-stamp from an attribute into a variable, as suggested by the issue title.
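
A sketch of the proposed change using netCDF4-python (PISM itself writes through the NetCDF C API; the variable name and the unlimited "time" dimension are illustrative assumptions):

import time
import netCDF4

def stamp_latest_record(filename, run_start):
    # Record the elapsed wall-clock hours for the record just written,
    # instead of growing the "history" attribute.
    nc = netCDF4.Dataset(filename, "a")
    if "run_hours" not in nc.variables:
        v = nc.createVariable("run_hours", "f8", ("time",))
        v.units = "hours since the beginning of the run"
    else:
        v = nc.variables["run_hours"]
    k = len(nc.dimensions["time"]) - 1  # index of the latest record
    v[k] = (time.time() - run_start) / 3600.0
    nc.close()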

Compilation fails on pacman

Version c3508da breaks on pacman during compilation/linking:

Linking CXX executable ../bedrough_test
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Op::Init(void (*)(void const*, void*, int, MPI::Datatype const&), bool)':
PISMNC3File.cc:(.text._ZN3MPI2Op4InitEPFvPKvPviRKNS_8DatatypeEEb[MPI::Op::Init(void (*)(void const*, void*, int, MPI::Datatype const&), bool)]+0x14): undefined reference to `ompi_mpi_cxx_op_intercept'
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Intracomm::Split(int, int) const':
PISMNC3File.cc:(.text._ZNK3MPI9Intracomm5SplitEii[MPI::Intracomm::Split(int, int) const]+0x34): undefined reference to `MPI::Comm::Comm()'
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Graphcomm::Clone() const':
PISMNC3File.cc:(.text._ZNK3MPI9Graphcomm5CloneEv[MPI::Graphcomm::Clone() const]+0x25): undefined reference to `MPI::Comm::Comm()'
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Intracomm::Create_graph(int, int const*, int const*, bool) const':
PISMNC3File.cc:(.text._ZNK3MPI9Intracomm12Create_graphEiPKiS2_b[MPI::Intracomm::Create_graph(int, int const*, int const*, bool) const]+0x2e): undefined reference to `MPI::Comm::Comm()'
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Cartcomm::Sub(bool const*)':
PISMNC3File.cc:(.text._ZN3MPI8Cartcomm3SubEPKb[MPI::Cartcomm::Sub(bool const*)]+0x76): undefined reference to `MPI::Comm::Comm()'
libpismutil.a(PISMNC3File.cc.o): In function `MPI::Intracomm::Create_cart(int, int const*, bool const*, bool) const':
PISMNC3File.cc:(.text._ZNK3MPI9Intracomm11Create_cartEiPKiPKbb[MPI::Intracomm::Create_cart(int, int const*, bool const*, bool) const]+0x8f): undefined reference to `MPI::Comm::Comm()'
libpismutil.a(PISMNC3File.cc.o):PISMNC3File.cc:(.text._ZNK3MPI8Cartcomm5CloneEv[MPI::Cartcomm::Clone() const]+0x25): more undefined references to `MPI::Comm::Comm()' follow
libpismutil.a(PISMNC3File.cc.o):(.data.rel.ro._ZTVN3MPI3WinE[vtable for MPI::Win]+0x48): undefined reference to `MPI::Win::Free()'
libpismutil.a(PISMNC3File.cc.o):(.data.rel.ro._ZTVN3MPI8DatatypeE[vtable for MPI::Datatype]+0x78): undefined reference to `MPI::Datatype::Free()'
collect2: ld returned 1 exit status
make[2]: *** [bedrough_test] Error 1
make[1]: *** [src/CMakeFiles/bedrough_test.dir/all] Error 2
make: *** [all] Error 2

Allow using NetCDF's parallel I/O capabilities

  1. This should noticeably improve PISM's performance in high-resolution runs (see the sketch below).
  2. It should be a run-time option.
  3. It should be easy if we switch PISM's output code and leave input code alone.
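
For illustration only, this is what parallel NetCDF I/O looks like through netCDF4-python (which assumes a parallel-enabled NetCDF/HDF5 build); PISM would do the equivalent through the NetCDF C API:

from mpi4py import MPI
import netCDF4

comm = MPI.COMM_WORLD
nc = netCDF4.Dataset("out.nc", "w", parallel=True,
                     comm=comm, info=MPI.Info())
nc.createDimension("x", 1024)
v = nc.createVariable("thk", "f8", ("x",))
v.set_collective(True)  # collective access usually scales better

# each rank writes its own chunk of the variable
chunk = 1024 // comm.size
start = comm.rank * chunk
v[start:start + chunk] = comm.rank
nc.close()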

Improve the code applying time-dependent climate data

We want to be able to

  1. use time-dependent climate data "straight" (-surface given and -atmosphere given).
  2. apply arbitrary climate inputs (coming from a sequence of PISM's "boundary models", say a constant precipitation map + temperature parameterization + scalar temperature offsets, put through a PDD model) as anomalies, relative to given time-independent artm and acab fields. This is the "flux correction" case. (Try -surface ...,as_anomaly.)
  3. apply arbitrary atmospheric inputs as anomalies relative to given time-independent "near-surface air temperature" and "precipitation" fields. This case is just like 2., but at a different (i.e. atmosphere) level.
  4. apply time-dependent climate anomalies (either "artm + acab" or the "air temperature + precipitation" pair) read from a file. This is similar to 2. and 3., but the user has anomalies already. (See -atmosphere ...,anomaly.)

Now we have code that does 1 and 2, plus a part of 4, but the interface is unintuitive. Moreover, the code itself can be improved (there is some avoidable code duplication).

Notes:

  • item 2 is the most important one right now.
  • One needs to worry about being able to re-start runs using these mechanisms.

Make PISM calendar-aware

Add support for Gregorian calendar computations. This will make PISM easier to use for 20th-century and prognostic runs.
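
For example, Gregorian rules as implemented by the cftime Python package; this is only a stand-in, since PISM's C++ code would need its own calendar support or a library such as calcalcs:

import cftime

units = "days since 1900-01-01"
cal = "gregorian"
t = cftime.date2num(cftime.datetime(1950, 2, 28, calendar=cal), units, cal)
print(cftime.num2date(t + 1, units, cal))  # 1950-03-01: no Feb 29 in 1950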

bootstrapping fails with 'cannot find bwat' in dev

Running dev branch, commit 5b67529

The error occurs when running examples/searise-greenland/. After preprocess.sh, I get this right at the start:

   $ ./spinup.sh
   #(spinup.sh)  bootstrapping plus short smoothing run (for 100a)
   PISMR development 5b67529 (basic evolution run mode)
   CONFIG OVERRIDES read from file 'searise_config.nc'.
     time t = 0.0000 years found; setting current year
     setting flow law to polythermal type ...
     ...  [SNIP]
     reading 2D model state variables by regridding ...
     FOUND  lon       / standard_name=longitude 
                      \ min,max =   -92.130,   10.399 (degree_east)
     FOUND  lat       / standard_name=latitude  
                      \ min,max =    58.629,   84.482 (degree_north)
     FOUND  thk       / standard_name=land_ice_thickness
                      \ min,max =     0.000, 3325.986 (m)
     FOUND  topg      / standard_name=bedrock_altitude
                      \ min,max = -4574.027, 2568.967 (m)
     PISM ERROR: cannot find bwat () in pism_Greenland_5km_v1.1.nc

Is it possible that this dates back a few commits to dcd3d3c? Do we have regression tests for the bootstrapping code where a variable is not found?

make stress balances own flow laws

I am posting this task to make sure that a reasonably high-priority idea does not get lost in recent excitement and accomplishments. One motivation is Tolly's question:

Are there different enhancement factors for SIA and SSA?

It is a hard question to answer.

At the technical level, I am proposing that each PISMStressBalance have an IceFlowLaw member. I think it would also be wise to have a prefix scheme, so that "-ssafem_flow_law_e 1.0 -siafd_flow_law_e 3.0" is allowed. (This example is not intended to illustrate my faith in enhancement factors, however!)

A benefit already noted by Constantine is that the Hybrid flow law will be simplified. Thus "-siafd_flow_law gk" will be allowed but "-ssafd_flow_law gk" will generate an error. (The error message could include David Goldsby's email address perhaps.)

Nothing in this task is imperative. There may be good reasons to do it an entirely different way.

Ensure that climate model components can save their products.

Once task #7 is completed, we need to ensure that climate models (-surface, -atmosphere, -ocean) can save fields they produce.

For example, when PISM is given the -atmosphere given,dTforcing,lapse_rate option (not likely, but it serves as an example), the lapse_rate object needs to be able to save the artm that was given to the surface model underneath.

When -atmosphere given,dTforcing is given, a different object (dTforcing) needs to save artm for it to have the same meaning of "the artm that was given to the surface model underneath", and so on.

improvements to conservation of energy numerical details

Moving from https://gna.org/task/index.php?7373.

All of the things on this list are incremental. They occur to me now because I am renovating the enthalpy model in more fundamental ways. None should change the behavior, but a couple should make things slightly more accurate, and they mostly imply greater code clarity.

  1. Instead of storage->fine-equally-spaced grid transfer and then BOMBPROOF finite differences, a decent finite element method on the unequally-spaced grid is probably a good idea. The robust stability properties of BOMBPROOF need to be carried over to the FEM, though.
  2. Currently the k=0 (base) level equation in enthSystemCtx is set by a call to setLevel0EqnThisColumn(). I think it would be just as clear an interface to have separate setDirichletBaseEqnThisColumn() and setNeumannBaseEqnThisColumn() calls instead, with the obvious meaning. Then the Neumann case could be implemented with greater accuracy, by the add-extra-point-at-minus-delta-z method.
  3. The current PISMBedThermalUnit scheme is equally-spaced in space, and I think that is fine. But it has a time-step restriction, and this can be removed by making the scheme into Crank-Nicolson, that is, trapezoid in time and implicit (see the sketch below). The diagnostic upward flux could be the average of the t=t_n and t=t_n+1 values, once the update() method is called for t_n to t_n+1.

Note that item 2 is implemented already (in April 2011).
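
For item 3, the trapezoid-in-time scheme for the bed heat equation T_t = D T_zz would read, as a sketch,

  (T_j^{n+1} - T_j^n) / \Delta t
      = (D / (2 \Delta z^2)) [ (T_{j+1}^{n+1} - 2 T_j^{n+1} + T_{j-1}^{n+1})
                             + (T_{j+1}^n - 2 T_j^n + T_{j-1}^n) ]

which is unconditionally stable; the diagnostic upward flux would then be the average of its t_n and t_{n+1} values, as described above.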

Initialization of the Lingle-Clark bed deformation model is broken

The Lingle-Clark (-bed_def lc) initialization-from-the-uplift code seems to be broken. This makes -bed_def lc runs not re-startable and results in rather nasty artifacts in wvelbase and wvelsurf.

Run the following script, then compare ex-1.nc and ex-2.nc. Look for jumps in topg and dbdt at around year 100.

#!/bin/bash -x

pisms -y 1000 -Mx 31 -My 31 -o in.nc

EX1_FILE="ex-1.nc"
EX2_FILE="ex-2.nc"
OPTS="-extra_vars topg,dbdt -extra_times 0:1:200 -bed_def lc"

rm -f $EX1_FILE $EX2_FILE ex-*.nc out-*.nc

# Run for 200 years straight
pismr -i in.nc $OPTS -extra_file $EX1_FILE -ys 0 -ye 200 -o out-200.nc

# Run for 100 years, then for 100 more years:
pismr -i in.nc $OPTS -extra_file $EX2_FILE -ys 0 -ye 100 -o out-100.nc

pismr -i out-100.nc $OPTS -extra_file $EX2_FILE -extra_append -ye 200 -o out-200-2.nc

First pass of make browser_base fails

The following fails:

$ make clean
$ make browser_base

produces:

Traceback (most recent call last):
  File "./config_doc.py", line 136, in
    print '%s' % docstring
UnicodeEncodeError: 'ascii' codec can't encode characters in position 69-71: ordinal not in range(128)
make[2]: *** [doc/browser/pism_config.txt] Error 1
make[1]: *** [doc/browser/CMakeFiles/browser_base.dir/all] Error 2
make: *** [all] Error 2

However, a second

$ make browser_base

succeeds.

I've seen this happening for quite a while now, but always forgot to post it as a bug (especially since a second 'make browser_base' solves the problem).
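
A sketch of a likely fix (the traceback indicates Python 2, where print falls back to the 'ascii' codec when stdout is a pipe; whether config_doc.py should encode exactly this way is an assumption):

# -*- coding: utf-8 -*-
import sys

docstring = u"text with non-ASCII characters, e.g. \u00b0C"
# Encode explicitly instead of relying on the implicit 'ascii' codec:
sys.stdout.write(docstring.encode('utf-8') + '\n')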

Update the Source Code Browser

And make sure that all the important code is documented. (This should be done once everything else is out of the way.)

temperature discontinuities when using the "-skip" option

We found unphysical discontinuities in the evolution of the temperature, bwat, SSA velocity, and ice thickness fields when using the "-skip" option.
We assume this order of causality because jumps to high values in the first three fields and to low values in ice thickness make sense physically. The problem occurs in Antarctica setups and in simplified circular setups with pik options.
It does not depend on the number of processors we use; it also occurs with only one.
Find the scripts for creating the simplified setup, the run script to reproduce the bug, and the output under
http://www.pik-potsdam.de/~mengel/pismtemperaturebug/
The first severe discontinuity occurs at year 2028, after a 100 kyr no-mass run, on the PIK cluster using 16 processors.
At this point, PISM tells us


S 102028.00000: 3.66912 2.3020 1.11816654 2716.91975

PISM WARNING: fully-liquified cells detected: volume liquified = 911209.091 km^3

[!CFL#=189865 (=130.84207576700% of 3D grid)] BULGE=1218 y SSA: 8 outer iterations, ~2.0 KSP iterations each
v$Eh c (dt=0.00000)
S 102028.00000: 3.66912 2.3020 1.11185973 64192521101745905664.00000


Note the extremely high value of the basal temperature at the center of the computational domain and the time step of zero.
The "fully-liquified" warning occurs most often, but not always.
The processor domains are apparent in the temperature and bwat fields when the bug occurs.
We do not find the problem when we leave out the "-skip" option.
We tried to reproduce the problem with fewer options from the pik zoo.
We do not find the discontinuities without the EigenCalving option. However, leaving out this option changes the geometry so strongly that the problem may be due to the specific geometry of the ice sheet and not to the EigenCalving code. Switching EigenCalving on in a run that equilibrated without EigenCalving does not show the bug. "-atmosphere pik" does not influence the occurrence.
We used the dev code at commit 5d1a9b9.

smallish rethinking of IceModel::velocitySIAStaggered() allows -pollarddeconto hybridization option

Original submission by Ed, https://gna.org/task/?6964:

Currently IceModel::velocitySIAStaggered() is effectively a map that takes the staggered grid values of the surface slope (=h_x[o][i][j] and h_y[o][i][j]) and computes the staggered grid values of the vertically-averaged SIA velocity (=uvbar[o][i][j]) and a depth-dependent factor that says how that velocity should be distributed through the column (=Istag3).

Proposed new understanding: The input to velocitySIAStaggered() should be the z-dependent value of the driving stress. Let's denote that input as sigmaxz and sigmayz (i.e. these could be the names of the arrays returned by GetArray on an IceModelVec3). In the usual SIA case,

sigmaxzyz[0][i][j][k] = - rho g (H - z[k]) h_x[0][i][j]   << DETAILS: CHECK THIS!
sigmaxzyz[1][i][j][k] = - rho g (H - z[k]) h_y[1][i][j]   << DETAILS: CHECK THIS!

But under the -pollarddeconto option:

sigmaxzyz[0][i][j][k] = - rho g (H - z[k]) h_x[0][i][j]  - ssalongx[0][i][j] (H - z[k]) / H
sigmaxzyz[1][i][j][k] = - rho g (H - z[k]) h_y[1][i][j]  - ssalongy[1][i][j] (H - z[k]) / H

Actually, I now see that the z dependence is still trivial and only an IceModelVec2 needs to come into velocitySIAstaggered(). I.e. the input to velocitySIAstaggered() can be thought of as

- rho g h_x[0][i][j]
- rho g h_y[1][i][j]

in the old case and

- rho g h_x[0][i][j] - ssalongx[0][i][j] / H
- rho g h_y[1][i][j] - ssalongx[0][i][j] / H

Thus the input to velocitySIAstaggered() is \partial\sigma_{xz}/\partial z, up to sign errors etc.
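
Spelled out, with the same sign caveat: in the SIA case

  \sigma_{xz}(z) = - \rho g (H - z) \partial h/\partial x,
  so  \partial\sigma_{xz}/\partial z = \rho g \partial h/\partial x,

which matches the old input up to the overall sign; the -pollarddeconto case adds the depth-independent term ssalongx/H in the same way.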

Because of the division by H, this term should/may need to be dropped at very small thicknesses. (Where the SSA solution was not so reliable anyway. Or maybe this is just an issue of extracting from the SSA solve the right way.)

Also this unifies with the hydrostatic-Blatter-Pattyn model; the input to velocitySIAstaggered() is now the same as the right-hand side of that model. Happy!

Yes, we need to extract more from the SSA solve. And we need to think about order of SIA and SSA calculations. Perhaps a velocityComputer and velocityVerticalModifier abstraction??

Note: On a temperature or age update step, IceModel::horizontalVelocitySIARegular() uses both of the outputs of velocitySIAStaggered(). On a mass-continuity step without temperature update, only the uvbar output is used. But this has nothing to do with the hybridization issue; the rethinking has to do only with how we understand the inputs to velocitySIAStaggered().

add mass_held_in_surface_layer() to PISMSurfaceModel and use it in stress balances

The idea is that a "good" PISMSurfaceModel might "hold" firn for centuries or millennia before "releasing" it as incompressible ice to the ice dynamics core of PISM (=IceModel).

The following is an excerpt from Ed's Jan 2011 visit to the Centre for Ice and Climate, and to Danish Climate Centre:

Namely we need to add one capability ... PISMSurfaceModel::mass_held_in_surface_layer(t_years, dt_years, IceModelVec2S &result). Frequently the "dt_years" is just ignored even by "good" firn models, and for the current couplers this would just return 0, I think. This method [would be] called from within SSAFD::compute_driving_stress() and other places where the thickness and surface slope related to the computation of stresses in the ice are needed. Also for determining flotation and the pressure melting temperature, I suppose. But we are not going to claim to compute the stress field in that layer, for example. That is, the surface model owns the mass of the (solid-or-liquid) water in its layer, e.g. for mass conservation purposes and the modeling of runoff and such, but we need the weight of that layer!
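
For instance, with m_s the mass per unit area (kg m^-2) returned by the proposed method, the overburden pressure used for flotation and the pressure-melting temperature would become, as a sketch,

  P = \rho_i g H + g m_s,

so the dynamics core still treats only H as incompressible ice, but feels the weight of the surface layer.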

Add the -list_diagnostics command-line option

Add an option that would make PISM print the list of diagnostics available in a particular run and stop. (What is on this list depends on other options and config. settings.)

All the code is there, I just need to add the option itself.
