cblue-dev-team / cblue.github.io

cBLUE is a tool to calculate the total propagated uncertainty of bathymetric lidar data.

Home Page: https://noaa-rsd.github.io/cBLUE.github.io/

License: GNU Lesser General Public License v2.1

Python 69.55% Makefile 0.26% Batchfile 0.90% MATLAB 26.61% Jupyter Notebook 2.69%
bathymetric-lidar topobathymetric-lidar bathymetry lidar total-propagated-uncertainty tpu rsd noaa-rsd

cblue.github.io's Introduction

cBLUE


Documentation

To view the cBLUE documentation, including installation instructions, see the cBLUE User Manual.

Note for Contributors:

cBLUE uses pre-commit hooks for code standardization. To keep commits standard, please run the following commands before issuing a new PR:

  1. pip install pre-commit
  2. cd cBLUE.github.io
  3. pre-commit install

Note: the black hook will cause a commit to fail on the first attempt when it reformats non-compliant code. Simply re-stage and commit the reformatted files.

cblue.github.io's People

Contributors

austinzanderson, brian-r-calder, firateren, forkozi, fpcorcoran, kiefk, klangrjt, parrishosu


Forkers

kiefk

cblue.github.io's Issues

Allow for optional pre-processing of LAS tiles

Currently, the code assumes that pre-processing must be done every time the GUI is run (i.e., you cannot compute TPU unless you first run the pre-processing, even if the input files have not changed and the pre-processing has already been done). Since the pre-processing can take considerable time, it would be better to make this step optional.

So: add a check-box to the GUI that lets the user intentionally turn off the pre-processing requirement, or have the code check for the existence of pre-processed data and bypass the computation when it exists (assuming the user will delete those files separately if the pre-processing needs to be redone).
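
A minimal sketch of the second approach, assuming a hypothetical cached-output naming convention and a preprocessed_dir setting (neither exists in the current cBLUE code):

    from pathlib import Path

    def needs_preprocessing(las_tile: Path, preprocessed_dir: Path) -> bool:
        """Return True if no up-to-date pre-processed output exists for this tile.

        Hypothetical convention: pre-processing writes <tile stem>.trj.pkl into
        preprocessed_dir; rerun only if that file is missing or older than the tile.
        """
        cached = preprocessed_dir / f"{las_tile.stem}.trj.pkl"
        if not cached.exists():
            return True
        return cached.stat().st_mtime < las_tile.stat().st_mtime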

Please add script from Nick to repository

Rudy - per our conversation, so this doesn't get lost: please add Nick's script to the repository that searches for point source ID 0, and add instructions on how to use it and how to get rid of zero point source IDs (sort and use LAStools?) in case this happens to surveys again.

Many Thanks,
g

Exception in Tkinter callback (ValueError: zero-size array to reduction ...)

from email from Jaime Kum...

Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Python27\lib\lib-tk\Tkinter.py", line 1541, in call
return self.func(*args)
File "Gui.py", line 416, in tpuProcessCallback
self.sbets_df, las_to_process)
File "D:\00408518_data\NGS_TPU-master\calc_aerial_TPU.py", line 381, in main
D = merge(sbets_df.values, t, x, y, z) # D for data
File "D:\00408518_data\NGS_TPU-master\calc_aerial_TPU.py", line 59, in merge
max_dt = np.max(dt)
File "C:\Python27\lib\site-packages\numpy\core\fromnumeric.py", line 2320, in amax
out=out, **kwargs)
File "C:\Python27\lib\site-packages\numpy\core_methods.py", line 26, in _amax
return umr_maximum(a, axis, None, out, keepdims)
ValueError: zero-size array to reduction operation maximum which has no identity

[screenshot of the error attached]
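
The error means np.max was handed an empty array: no trajectory records overlapped the LAS tile's timestamps, so dt in merge() had zero elements. A sketch of a guard that fails with a clearer message (the variable and function names here are illustrative, not the actual merge() internals):

    import numpy as np

    def max_time_gap(dt: np.ndarray) -> float:
        """Largest trajectory/LAS time difference, with a clear error when
        the timestamps do not overlap at all (dt is empty)."""
        if dt.size == 0:
            raise ValueError(
                "No overlapping timestamps between the SBET trajectory and "
                "the LAS tile; check that the correct SBET files were loaded."
            )
        return float(np.max(dt))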

TPU Values for Topo Points (Class 2)

The THU and TVU values for a non-bathy point are not valid because the TPU includes the subaqueous component. To fix this, we should compute only the subaerial TPU for Class 2 (topo) points.
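
A sketch of that change, using hypothetical array names rather than the actual cBLUE variables: the subaqueous variance is added only for non-topo points.

    import numpy as np

    def combine_tvu(classification, subaerial_var, subaqueous_var):
        """Total vertical variance per point: every point gets the subaerial
        component; Class 2 (topo) points skip the subaqueous component."""
        is_topo = classification == 2
        total_var = subaerial_var + np.where(is_topo, 0.0, subaqueous_var)
        return np.sqrt(total_var)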

las.py | ValueError: no field of name raw_classification

Some LAS files fail during TPU processing because their point records do not have a "raw_classification" field.

The current workaround is to change line 104 of las.py from

    c = self.points_to_process['raw_classification']

to

    c = self.points_to_process['classification_byte']
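
A more robust alternative (a sketch, not the current las.py code) is to look up whichever classification dimension the file actually provides, assuming the point records are exposed as a NumPy structured array:

    import numpy as np

    def get_classification(points: np.ndarray) -> np.ndarray:
        """Return the classification field regardless of point-format naming.

        Older point formats expose 'raw_classification'; LAS 1.4 formats 6-10
        expose 'classification_byte' (or simply 'classification')."""
        for name in ("raw_classification", "classification_byte", "classification"):
            if name in points.dtype.names:
                return points[name]
        raise ValueError(f"No classification field found; fields are {points.dtype.names}")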

Flexibility in output file location

The output CSV/metadata files should be written to a directory that can be set in the GUI, so they don't have to be written back to the same location as the LAS source files. In many cases the LAS source files may be on a read-only and/or remote share; letting the user specify where to put the outputs would accommodate this and could improve I/O performance.
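
A sketch of how a user-selected output directory could be threaded through, assuming a hypothetical output_dir GUI field (not present in the current code):

    from pathlib import Path
    from typing import Optional

    def resolve_output_path(las_path: Path, output_dir: Optional[Path]) -> Path:
        """Write the TPU CSV next to the LAS tile by default, or into the
        user-selected directory when one was provided in the GUI."""
        target_dir = output_dir if output_dir is not None else las_path.parent
        target_dir.mkdir(parents=True, exist_ok=True)
        return target_dir / f"{las_path.stem}_TPU.csv"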

cBLUE | Process whole workflow with one button click

It would be nice in later versions to run the whole cBLUE process with the click of one button.

Currently, we process the SBET, and after the SBET is processed there is a pause until we push the Process TPU button.

If we could run the whole process from start to finish with one click, it would save some babysitting time waiting for the SBET to process, especially for projects with many missions.
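
A sketch of a single "Run All" callback that simply chains the two existing steps; the method names below are hypothetical stand-ins for the current GUI callbacks:

    def run_all_callback(self):
        """Hypothetical Tkinter callback: process the SBET trajectories and
        then start the TPU calculation without waiting for a second click."""
        self.process_sbets()   # stand-in for the current "Process SBETs" button
        self.process_tpu()     # stand-in for the current "Process TPU" button

For projects with many missions, this would ideally run in a background thread so the Tkinter GUI stays responsive.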

cBLUE v.2.2.1 | Exception

cBLUE v.2.2.1

Getting the following exception when trying to process the TPU

Loading trajectory files...
100% (12 of 12) |################################################################| Elapsed Time: 0:07:34 Time: 0:07:34
Calculating TPU (multi-processing)...
0%| | 0/2827 [00:00<?, ?it/s]Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Users\Jamie.Kum.conda\envs\cBLUE\lib\tkinter_init_.py", line 1705, in call
return self.func(args)
File "CBlueApp.py", line 472, in
b1 = ttk.Button(wseh, text='Ok', command=lambda: self.continue_with_tpu_calc(wseh, float(wseh_value.get())))
File "CBlueApp.py", line 453, in continue_with_tpu_calc
self.begin_tpu_calc()
File "CBlueApp.py", line 552, in begin_tpu_calc
p = tpu.run_tpu_multiprocess(num_las, sbet_las_tiles_generator())
File "E:\004327912_data\python_rsd\cBLUE\cBLUE-2.2.1-rc1\Tpu.py", line 354, in run_tpu_multiprocess
for _ in tqdm(p.imap(self.calc_tpu, sbet_las_generator), total=num_las, ascii=True):
File "C:\Users\Jamie.Kum.conda\envs\cBLUE\lib\site-packages\tqdm\std.py", line 1107, in iter
for obj in iterable:
File "C:\Users\Jamie.Kum.conda\envs\cBLUE\lib\site-packages\multiprocess\pool.py", line 748, in next
raise value
AttributeError: 'WindowsPath' object has no attribute 'split'
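
The last frame shows a pathlib.Path object reaching code that expects a plain string (only str has .split). A sketch of the two usual fixes, with an illustrative path:

    from pathlib import Path

    las_file = Path(r"E:\tiles\2021_414000e_4128500n.las")  # illustrative path

    # Fix 1: convert the Path to str before using string methods
    tile_name = str(las_file).split("\\")[-1]

    # Fix 2 (cleaner): use pathlib attributes instead of splitting strings
    tile_name = las_file.name    # "2021_414000e_4128500n.las"
    tile_stem = las_file.stem    # "2021_414000e_4128500n"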

Flexible placing of intermediate products

Since in many cases LAS source files may be on a read-only file system, it would be beneficial if the intermediate working directory could be somewhere other than the same directory as the LAS files. In addition, this would allow for the intermediate files to be written to a different disc channel (e.g., on another controller, or locally if the LAS source is remote), with performance benefits.

Attribute Error

Hi,

I am re-running cBLUE on FL1606 and FL1607, since they are the only two so far (other than FL1604, which was already submitted to MCD) with the point ID zero issue (TVU/THU tile issue), and I am seeing the following error:
[screenshot of the error attached]

For FL1606 I used the following las files: \ngs-s-rsd\Lidar_Proc\2017\FL1606-TB-N-880_DeSoto_to_BocaGrande_p\FL1606-TB-N-880_DeSoto_to_BocaGrande_02_p\06_RIEGL_PROC\04_EXPORT\Green\05_FL1606_DeSoto_Sarasota_02_g_gpsa_rf_ip_wsf_r_tm_qc_geocode\las_1.4_domain

SBET directory: \ngs-s-rsd\Lidar_Proc\2017\FL1606-TB-N-880_DeSoto_to_BocaGrande_p\tpu_sbet

Are these correct? I'm wondering if either of these is causing a problem.

Also, I used the following parameters from the .json file as Jamie suggested:
"water surface": "Model (ECKV spectrum)",
"wind": "Gentle Breeze (7-10 kts)",
"kd": "Moderate (0.18-0.25 m^-1)",
"VDatum region": "---No Region Specified---",
"water_surface_ellipsoid_height": -24.3

Thanks for any assistance,
g

Preserve pathnames in GUI

The user currently needs to reset the paths in the GUI on each run. It would be better to preserve these between runs and restore them on the next run, if available.
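
A sketch of persisting the last-used paths to a small settings file; the file name and keys here are hypothetical, not part of the current cBLUE configuration:

    import json
    from pathlib import Path

    SETTINGS_FILE = Path.home() / ".cblue_last_paths.json"  # hypothetical location

    def save_last_paths(paths: dict) -> None:
        """Persist the most recently used LAS, SBET, and output directories."""
        SETTINGS_FILE.write_text(json.dumps(paths, indent=2))

    def load_last_paths() -> dict:
        """Restore the saved paths, or return an empty dict on a first run."""
        if SETTINGS_FILE.exists():
            return json.loads(SETTINGS_FILE.read_text())
        return {}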

WARNING: Invalid body length for classification lookup, not parsing.

We get the following warning when running certain LAS files through cBLUE.

WARNING: Invalid body length for classification lookup, not parsing.
WARNING: Invalid body length for classification lookup, not parsing.

It does not seem to affect the output; it is just a warning.

Implementing additional lidar sensors

Implementing a sensor model in cBLUE consists of (1) defining a laser geolocation equation and (2) specifying supplemental sensor-specific parameters. Currently, cBLUE supports one generic sensor model, which was developed for use with the Riegl VQ-880-G lidar sensor. With slight modifications, the generic sensor model can be applied to other lidar sensors. Custom sensor models (i.e., those employing different laser geolocation equations) could also be created and integrated within cBLUE.

Using cBLUE’s Generic Sensor Model
The bulk of cBLUE’s generic sensor model is implemented in the SensorModel class, which defines the laser geolocation equation and the associated polynomial-surface error modeling. Assuming the same laser geolocation equation and polynomial-surface error modeling are to be used, the SensorModel class does not need to change to support another lidar sensor. The following supporting parameters, however, do need to be edited.

• Sensor-Model Parameter Uncertainties
The a priori uncertainties (standard deviations) of the parameters in the sensor model laser geolocation equation (a, b, and rho) are currently specified in the Merge class.
https://github.com/noaa-rsd/cBLUE.github.io/blob/b1eb18315113d62f7df829deec78230faa690a04/Merge.py#L44-L46

• Sub-aqueous TPU Look-up Tables (LUTs)
The sensor-specific lookup table used in computing the subaqueous component of the total propagated uncertainty is currently specified in the ControllerPanel class.
https://github.com/noaa-rsd/cBLUE.github.io/blob/b1eb18315113d62f7df829deec78230faa690a04/CBlueApp.py#L515

One refactoring strategy for generalizing the current generic sensor model implementation is to consolidate the supporting parameters within the SensorModel class and instantiate SensorModel objects with sensor-specific information read from a json file, per the GUI dropdown menu in the CBlueApp class.
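
A sketch of that strategy, with hypothetical JSON keys and a simplified constructor (the real SensorModel class and its parameters differ):

    import json

    class SensorModel:
        """Simplified stand-in: holds a sensor's a priori parameter
        uncertainties and the path to its subaqueous TPU look-up table."""

        def __init__(self, name, stddev_a, stddev_b, stddev_rho, subaqueous_lut):
            self.name = name
            self.stddev_a = stddev_a        # uncertainty of scan angle a
            self.stddev_b = stddev_b        # uncertainty of scan angle b
            self.stddev_rho = stddev_rho    # uncertainty of range rho
            self.subaqueous_lut = subaqueous_lut

        @classmethod
        def from_json(cls, json_path, sensor_name):
            """Build a model for the sensor chosen in the GUI dropdown."""
            with open(json_path) as f:
                cfg = json.load(f)[sensor_name]
            return cls(sensor_name, cfg["stddev_a"], cfg["stddev_b"],
                       cfg["stddev_rho"], cfg["subaqueous_lut"])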

Using a Customized Sensor Model
The existing cBLUE SensorModel class implements the sensor model described by Eren et al. (2019), which parameterizes the scan pattern using two angles, a and b, and includes polynomial-surface error modeling. Incorporating other sensor models (i.e., other laser geolocation equations) into cBLUE would require refactoring the SensorModel class and the associated Jacobian class, which provides the functionality to calculate and evaluate the Jacobian (i.e., the matrix of partial derivatives) required to calculate TPU.
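
One possible shape for such a refactoring (a sketch only, not the current class layout) is an abstract interface that pairs each laser geolocation equation with its own Jacobian:

    from abc import ABC, abstractmethod
    import numpy as np

    class BaseSensorModel(ABC):
        """Hypothetical interface: each sensor supplies its geolocation
        equation and the matching Jacobian used to propagate uncertainty."""

        @abstractmethod
        def geolocate(self, trajectory: np.ndarray, observations: np.ndarray) -> np.ndarray:
            """Return point coordinates from trajectory and laser observations."""

        @abstractmethod
        def jacobian(self, trajectory: np.ndarray, observations: np.ndarray) -> np.ndarray:
            """Return the partial derivatives of the geolocation equation with
            respect to its uncertain parameters."""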

TPU json file formatting

I ran into an issue with the formatting of the JSON files while developing a separate program that parses them.
When any of the statistics values for a flight line (min/max/mean/stddev) is greater than 9.999, the formatting for that line breaks.
See the snippet below for an example: there is no whitespace between the min and max values for total_thu.

"flight line stats (min max mean stddev)": {
"103 (6561/6561 points with TPU)": [
"total_thu: 0.51018.635 0.535 0.323",
"total_tvu: 0.166 9.037 0.185 0.299"
]
},
2021_414000e_4128500n.json
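
The run-together values suggest a fixed-width number format that overflows once a statistic reaches 10.0 or more. A sketch of a formatting fix, assuming the statistics are plain floats (the exact formatting code in cBLUE may differ):

    def format_stats(name, stats):
        """Join the four statistics with explicit single spaces so the string
        stays parseable regardless of how wide each number is."""
        return f"{name}: " + " ".join(f"{value:.3f}" for value in stats)

    # format_stats("total_thu", (0.510, 18.635, 0.535, 0.323))
    # -> "total_thu: 0.510 18.635 0.535 0.323"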

v2.2.2 Attribute error

Hi,

I am re-running cBLUE on FL1606 and FL1607, since they are the only two so far (other than FL1604, which was already submitted to MCD) with the point ID zero issue (TVU/THU tile issue), and I am seeing the following error:
[screenshot of the error attached]

For FL1606 I used the following las files: \ngs-s-rsd\Lidar_Proc\2017\FL1606-TB-N-880_DeSoto_to_BocaGrande_p\FL1606-TB-N-880_DeSoto_to_BocaGrande_02_p\06_RIEGL_PROC\04_EXPORT\Green\05_FL1606_DeSoto_Sarasota_02_g_gpsa_rf_ip_wsf_r_tm_qc_geocode\las_1.4_domain

Is this correct? I'm wondering if this is the problem.

I used the following parameters from the .json file as Jamie suggested:

TPU Calculation not running v2.2.0

When running cBLUE v2.2.0, the trajectory loads fine, but when we press the button to calculate TPU it finishes quickly with no files created.

[screenshot attached]

We also noticed that the cBLUE_version in the configuration JSON indicates v2.1.1, as do the splash screen and Help --> About.
