cloudtomography / at3d
Retrieves 3D cloud properties from multi-angle images of reflected solar radiation
License: GNU General Public License v3.0
When NPHI is slightly larger than 2 * NUM_MU_BINS, segfaults occur when solution_iterations is called before one full iteration has completed. This was tested for NUM_MU_BINS = 8 and NUM_MU_BINS = 16, in both 1D and 3D modes with adaptive grid splitting.
My guess is that the size of some working array in the discrete ordinates integration is misspecified.
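Until the array sizing is tracked down, a caller-side guard can reject the problematic configuration up front. This is a hypothetical helper, not part of at3d; the threshold simply encodes the failure condition observed above:

```python
# Hypothetical guard encoding the observed failure condition:
# segfaults were seen when NPHI grows a bit beyond 2 * NUM_MU_BINS.
def check_angular_resolution(nphi, num_mu_bins):
    """Raise before entering the solver if the angular grid is suspect."""
    if nphi > 2 * num_mu_bins:
        raise ValueError(
            f"NPHI={nphi} exceeds 2*NUM_MU_BINS={2 * num_mu_bins}; "
            "this configuration has triggered segfaults in solution_iterations."
        )

check_angular_resolution(16, 8)  # fine: 16 == 2*8
```

A check like this only sidesteps the crash; the underlying Fortran array dimension still needs fixing.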
Hello,
Running the tests in tests/ produces a few errors like the following:
======================================================================
ERROR: _ErrorHolder
----------------------------------------------------------------------
Traceback (most recent call last):
File "at3d\AT3D\tests\test_derivatives.py", line 838, in setUpClass
solvers, Sensordict,cloud_poly_tables,final_step,rte_grid = cloud_solar(mie_mono_table,ext,veff,reff,ssalb,solarmu,surfacealb,ground_temperature,
File "at3d\AT3D\tests\test_derivatives.py", line 726, in cloud_solar
Sensordict.get_measurements(solvers, maxiter=200, n_jobs=4, verbose=False)
File "at3d\at3d\at3d\containers.py", line 226, in get_measurements
keys, ray_start_end, pixel_start_end = at3d.parallel.subdivide_raytrace_jobs(rte_sensors, n_jobs)
File "at3d\at3d\at3d\parallel.py", line 142, in subdivide_raytrace_jobs
render_jobs[key] = max(np.ceil(render_job/ray_count * n_jobs * job_factor).astype(np.int), 1)
File "miniconda3\envs\at3d\lib\site-packages\numpy\__init__.py", line 338, in __getattr__
raise AttributeError(__former_attrs__[attr])
AttributeError: module 'numpy' has no attribute 'int'.
`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations. Did you mean: 'inf'?
======================================================================
Both np.int and np.float have been deprecated since NumPy 1.20.0 (see https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations). Replacing all references with Python's built-in int and float resolves the errors. However, since the builtins are exactly what np.int and np.float used to alias, perhaps a lower-level review of the dtypes is needed to make sure they are numerically appropriate?
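The fix itself is mechanical; for example, the line from parallel.py quoted in the traceback just needs the deprecated alias replaced (shown here with made-up inputs; use np.int64/np.int32 instead of int if the downstream code depends on a fixed-width integer):

```python
import numpy as np

# Made-up inputs standing in for the values in subdivide_raytrace_jobs.
render_job, ray_count, n_jobs, job_factor = 100, 400, 4, 1.5

# Before (fails on NumPy >= 1.24): ....astype(np.int)
# After: cast with the builtin int (or np.int64 for an explicit width).
jobs = max(int(np.ceil(render_job / ray_count * n_jobs * job_factor)), 1)
print(jobs)
```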
I'm using conda 23.9.0, Python 3.10.13, and NumPy 1.26.2 (version 1.21.2 from the requirements cannot be installed under Python 3.10).
Thank you
Enhancement:
When working with real data, it would be useful to have a function that saves just the sensor list, without needing to create the solver list.
Bug:
I don't follow the logic here: there is an if/else condition whose two branches are identical.
https://github.com/CloudTomography/pyshdom/blob/7591db4bcac48205d19dd6b94d13fb09c88e7f3b/pyshdom/medium.py#L252
It crashed with density as one of the variables, because self._exact_size_distribution_parameters does not have a density key.
Is this intended behavior? Should only size distribution parameters and extinction be used?
I wonder whether I can create an optical properties dataset for ice particles, which at3d does not directly support. For example, how can I build an xarray dataset for non-spherical particles when I have obtained single-scattering albedo and extinction data externally?
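One possible starting point is to assemble the dataset by hand. The sketch below is only illustrative: the variable and dimension names (extinction, ssalb, legcoef, table_index) follow the general pattern of at3d's Mie tables, but they are assumptions that should be checked against your at3d version, and the values are placeholders:

```python
import numpy as np
import xarray as xr

# Assumed shapes: a tiny 2x2x2 spatial grid and a single phase function entry.
nleg = 32     # number of Legendre coefficients (assumption)
nstokes = 6   # polarized phase matrix elements (assumption)

ds = xr.Dataset(
    data_vars={
        # Externally obtained optical properties on the spatial grid.
        "extinction": (("x", "y", "z"), np.full((2, 2, 2), 10.0)),  # 1/km
        "ssalb": (("x", "y", "z"), np.full((2, 2, 2), 0.9)),
        # Legendre expansion of the phase matrix (all zeros as a placeholder).
        "legcoef": (
            ("stokes_index", "legendre_index", "table_index"),
            np.zeros((nstokes, nleg, 1)),
        ),
        # Every grid point references the single phase function entry.
        "table_index": (("x", "y", "z"), np.ones((2, 2, 2), dtype=int)),
    },
    coords={
        "x": np.arange(2) * 0.05,
        "y": np.arange(2) * 0.05,
        "z": np.arange(2) * 0.04,
    },
)
print(ds["ssalb"].mean().item())
```

Whether a solver accepts such a hand-built dataset depends on the exact schema at3d expects, so comparing against a dataset produced by its own Mie table machinery is the safest way to verify the structure.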
If this should be a warning, maybe we should add an argument to check_range, e.g. raise='exception', with the options 'exception'/'warning'?
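A hypothetical sketch of what that argument could look like (check_range here is a stand-in, not at3d's actual signature; the parameter is spelled raise_ because raise is a Python keyword):

```python
import warnings

# Illustrative only: a raise_ switch choosing between a hard error
# and a warning. Name and signature are assumptions, not at3d's API.
def check_range(name, value, low, high, raise_="exception"):
    if not (low <= value <= high):
        msg = f"{name}={value} outside [{low}, {high}]"
        if raise_ == "exception":
            raise ValueError(msg)
        elif raise_ == "warning":
            warnings.warn(msg)
        else:
            raise ValueError(f"unknown raise_ option: {raise_!r}")

check_range("reff", 5.0, 1.0, 25.0)                     # in range: no-op
check_range("reff", 50.0, 1.0, 25.0, raise_="warning")  # warns instead of raising
```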
maxnmicro is used as the shape for the optical property derivatives instead of deriv_max_num_micro; all the related shapes need to be fixed.
The issue can be sidestepped by using species that all have the same maxnmicro.
Occasional segmentation faults when calling core.optical_path; not sure why at this point. The issue may also affect space_carve and min_optical_path.
Some gradient tests fail with NaNs appearing in the gradient. This seems to be linked to boundary columns. Use with caution until fixed.
Minor bug fixes for the microphysical optimization (should be merged to main at some point): 5f813dd
mask.astype(np.bool) was moved to StateToGridMask.add_transform for easier support of transforms that are not set externally as a numpy array, e.g.:
state_to_grid = pyshdom.transforms.StateToGridMask()
state_to_grid.add_transform('cloud', 'density', mask.values, fill_value=0.0)
state_to_grid.add_transform('cloud', 'reff', mask.values, fill_value=10.0)
Jesse, I assigned you so you can verify these bug fixes.
There are very small inconsistencies in the calculation of the sensitivity of the direct beam optical path to the extinction, typically in boundary voxels. Until this is resolved, I suggest not using these grid points for retrievals.
The errors include some negative values in the predicted sensitivity. Currently, the tests for the direct beam fail, but the accuracy loss is very small (maximum_error < 1e-4, against typical values two orders of magnitude larger). Derivative calculations within the volume should still meet the accuracy requirement specified in the tests.
Hi dear pyshdom developers,
I haven't been able to compile the code, due to strange Fortran compilation errors:
src/polarized/shdomsub4.f:1339.22:
INTEGER DOEXACT(IDR)
Error: Variable 'idr' cannot appear in the expression at (1)
src/polarized/shdomsub4.f:2684.30:
REAL DLEG(NSTLEG,0:NLEG,DNUMPHASE)
Error: Variable 'dnumphase' cannot appear in the expression at (1)
src/polarized/shdomsub4.f:2683.31:
REAL LEGEN(NSTLEG,0:NLEG,NUMPHASE)
Error: Variable 'numphase' cannot appear in the expression at (1)
The code looks OK to me, so I guess I may be using a "bad" gfortran version.
I followed every step in the readme.
It happens on both Ubuntu 18.04.5 and Ubuntu 16.04.7.
Can you tell me which gfortran and Ubuntu versions (if relevant) you use?
Thanks
AT3D installs but fails to import on a Mac with an M1 chip, producing the error in the screenshot.
Another user had a similar issue that turned out to be that their Anaconda was the x86_64 version instead of the arm64 one. They suggested the following fix:
To find out what you have for Python, try the following command (first activate the Python environment you are using, then type):
file -L "$(which python3)"
This should tell you what kind of binary your Anaconda Python is. If it is x86_64, then I'd recommend installing the arm64 version of Anaconda.
I would delete whatever folder Anaconda is installed in; typically it is ~/opt/anaconda3. With Anaconda, all the envs are contained within that directory. If you created any envs with pip, which puts them in the folder of whatever project you are working on, you'll have to remove them separately. Then reinstall the correct version of Anaconda and recreate your envs from scratch.
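The architecture check above fits in two commands; uname -m additionally shows what the machine itself reports, so a mismatch is easy to spot:

```shell
# Print the architecture of the active python3 binary (x86_64 vs arm64).
file -L "$(which python3)"
# What the machine itself reports (arm64 on Apple Silicon).
uname -m
```

If the file output says x86_64 while uname -m says arm64, the interpreter is running under Rosetta and compiled extensions built natively will fail to import.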
An error in the subpixel ray averaging in util.f90 seems to cause segfaults. Likely linked to #7.