cloudtomography / at3d
14 stars · 2 watchers · 9 forks · 121.95 MB

Retrieves 3D cloud properties from multi-angle images of reflected solar radiation

License: GNU General Public License v3.0

Python 13.51% Jupyter Notebook 62.05% Fortran 24.22% Shell 0.22%
python remote-sensing radiative-transfer radiative-transfer-models atmospheric-science cloud-microphysics satellite-imagery-analysis fortran


at3d's Issues

Segfault related to NPHI

When NPHI is a bit larger than 2 * NUM_MU_BINS there are segfaults when solution_iterations is called before the completion of one iteration. This was tested for NUM_MU_BINS = 8 and NUM_MU_BINS = 16 in 1D and 3D modes with adaptive grid splitting.

I am guessing it is due to a misspecification of the size of one of the working arrays in the discrete ordinates integration.

Tests fail because `np.int` and `np.float` were deprecated

Hello,

Running the tests in tests/ produces a few errors like the following:

======================================================================
ERROR: _ErrorHolder
----------------------------------------------------------------------
Traceback (most recent call last):
  File "at3d\AT3D\tests\test_derivatives.py", line 838, in setUpClass
    solvers, Sensordict,cloud_poly_tables,final_step,rte_grid = cloud_solar(mie_mono_table,ext,veff,reff,ssalb,solarmu,surfacealb,ground_temperature,
  File "at3d\AT3D\tests\test_derivatives.py", line 726, in cloud_solar
    Sensordict.get_measurements(solvers, maxiter=200, n_jobs=4, verbose=False)
  File "at3d\at3d\at3d\containers.py", line 226, in get_measurements
    keys, ray_start_end, pixel_start_end = at3d.parallel.subdivide_raytrace_jobs(rte_sensors, n_jobs)
  File "at3d\at3d\at3d\parallel.py", line 142, in subdivide_raytrace_jobs
    render_jobs[key] = max(np.ceil(render_job/ray_count * n_jobs * job_factor).astype(np.int), 1)
  File "miniconda3\envs\at3d\lib\site-packages\numpy\__init__.py", line 338, in __getattr__
    raise AttributeError(__former_attrs__[attr])
AttributeError: module 'numpy' has no attribute 'int'.
`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
    https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations. Did you mean: 'inf'?

======================================================================

This happens for both np.int and np.float, which have been deprecated since numpy 1.20.0 (see https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations ).

Replacing all references with Python's built-in int and float resolves the errors. However, since these builtins were previously aliased by np.int and np.float, a lower-level review of the dtypes may be needed to make sure the resulting precision is still numerically appropriate.
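For reference, a minimal sketch of the replacement on the failing line from parallel.py (the variable values below are made up to make the snippet runnable):

```python
import numpy as np

# Hypothetical values standing in for the real raytrace job bookkeeping.
render_job, ray_count, n_jobs, job_factor = 300, 1000, 4, 1.0

# Deprecated form (raises AttributeError on NumPy >= 1.24):
#   max(np.ceil(render_job / ray_count * n_jobs * job_factor).astype(np.int), 1)

# Fixed: use the builtin int, or an explicit width such as np.int64
# if a downstream interface expects a specific integer precision.
jobs = max(int(np.ceil(render_job / ray_count * n_jobs * job_factor)), 1)
print(jobs)  # → 2
```

Using `.astype(np.int64)` instead of `int(...)` would pin the width explicitly, which matters if the value is later passed to compiled (e.g. Fortran) code.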

I'm using conda 23.9.0, python 3.10.13, and numpy 1.26.2 (can't install version 1.21.2 from the requirements in python 3.10).

Thank you

About the question to create optical properties for ice particles

I wonder whether I can create an optical properties dataset for ice particles, which is not directly supported by at3d. Specifically, how can I create an xarray dataset for non-spherical particles when I have obtained the single scattering albedo and extinction data externally?
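As a rough sketch (the dimension and variable names below are illustrative, not at3d's required schema), externally computed single-scattering properties can be wrapped in an xarray Dataset indexed by wavelength and effective radius:

```python
import numpy as np
import xarray as xr

# Illustrative grid: 3 wavelengths x 5 effective radii (microns).
wavelength = np.array([0.66, 0.86, 2.13])
reff = np.linspace(10.0, 50.0, 5)

# Placeholder arrays standing in for externally computed data.
extinction = np.full((3, 5), 0.1)            # extinction data
ssalb = np.full((3, 5), 0.95)                # single scattering albedo

ice_properties = xr.Dataset(
    data_vars={
        "extinction": (("wavelength", "reff"), extinction),
        "ssalb": (("wavelength", "reff"), ssalb),
    },
    coords={"wavelength": wavelength, "reff": reff},
    attrs={"particle_type": "ice, non-spherical (externally computed)"},
)
```

The actual variable names, dimensions, and any required phase-function tables that at3d's solver expects would need to be checked against the datasets produced by its own Mie table utilities.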

Gradient bugs

Some gradient tests fail with NaNs appearing in the gradient. This seems to be linked to boundary columns. Use with caution until fixed.

Bug fixes in airmspi branch

Minor bug fixes for micro-physical optimization (should be integrated to main at some point): 5f813dd

mask.astype(np.bool) was moved into StateToGridMask.add_transform to more easily support transforms that are not set externally as a numpy array, e.g.:

state_to_grid = pyshdom.transforms.StateToGridMask()
state_to_grid.add_transform('cloud', 'density', mask.values, fill_value=0.0)
state_to_grid.add_transform('cloud', 'reff', mask.values, fill_value=10.0)

Jesse, I assigned you so you can verify these bug fixes.

Don't use boundary grid points for optimization

There are very small inconsistencies in the calculation of the sensitivity of the direct beam optical path to the extinction, typically in boundary voxels. Until this is resolved, I suggest not using these grid points for retrievals.

The errors include some negative values in the predicted sensitivity. Currently, the tests for the direct beam fail, but the accuracy loss is very small (maximum_error < 1e-4) against typical values two orders of magnitude larger. Derivative calculations within the volume should still meet the accuracy requirement specified in the tests.

Cannot compile the code on Ubuntu 18 or 16

Hi pyshdom developers,
I was unable to compile the code due to strange Fortran compilation errors:

src/polarized/shdomsub4.f:1339.22:
INTEGER DOEXACT(IDR)
Error: Variable 'idr' cannot appear in the expression at (1)

src/polarized/shdomsub4.f:2684.30:
REAL DLEG(NSTLEG,0:NLEG,DNUMPHASE)
Error: Variable 'dnumphase' cannot appear in the expression at (1)

src/polarized/shdomsub4.f:2683.31:
REAL LEGEN(NSTLEG,0:NLEG,NUMPHASE)
Error: Variable 'numphase' cannot appear in the expression at (1)

The code itself looks OK, so I suspect I may be using an incompatible gfortran version.
I followed every step in the readme.
It happens in Ubuntu 18.04.5 and Ubuntu 16.04.7.

Can you tell me which gfortran and Ubuntu versions (if relevant) you use?

Thanks

Import Error on Mac M1 (arm64)

AT3D installs but fails to import on a Mac with an M1 chip, with the error shown in the screenshot.

(screenshot of the import error, 2023-10-21)

Another user had a similar issue that was found to be that anaconda was not the arm64 version, but instead the x86_64 version. They suggested the following fix:

To find out what you have for python, I would try the following command:
(first activate the python environment that you are using, then type)

file -L $(which python3)

This should tell you what kind of binary anaconda is. If it is x86_64, then I’d recommend installing the arm64 version of anaconda.

I would delete whatever folder anaconda is installed in. Typically it is ~/opt/anaconda3. With anaconda, all the envs are contained within that directory. If you created any envs with pip, which puts them in the folder of whatever project you are working on, you'll have to remove them separately. Then reinstall the correct version of anaconda and recreate your envs from scratch.
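A quick way to check the same thing from inside Python (a sketch; on Apple Silicon a native interpreter reports "arm64", while an x86_64 Anaconda build running under Rosetta reports "x86_64"):

```python
import platform
import struct

# Architecture the running interpreter was built for.
arch = platform.machine()
# Pointer width confirms 32- vs 64-bit.
bits = struct.calcsize("P") * 8
print(f"python architecture: {arch} ({bits}-bit)")
```

If this prints "x86_64" on an M1 machine, the interpreter (and any compiled extensions built against it) will not match arm64 wheels, which is consistent with the import failure above.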
