
Inundation Mapping: Flood Inundation Mapping for U.S. National Water Model

This repository includes flood inundation mapping software configured to work with the U.S. National Water Model operated and maintained by the National Oceanic and Atmospheric Administration (NOAA) National Water Center (NWC).

This software uses the Height Above Nearest Drainage (HAND) method to generate Relative Elevation Models (REMs), Synthetic Rating Curves (SRCs), and catchment grids. This repository also includes functionality to generate flood inundation maps (FIMs) and evaluate FIM accuracy.

For more information, see the Inundation Mapping Wiki.


FIM Version 4

Accessing Data through ESIP S3 Bucket

The latest nationally generated HAND data and a subset of the inputs can be found in an Amazon S3 bucket hosted by Earth Science Information Partners (ESIP). These data can be accessed using the AWS CLI tools. You will need permission from ESIP to access the data; please contact Carson Pruitt ([email protected]) or Fernando Salas ([email protected]) for assistance.

AWS Region: US East (N. Virginia) us-east-1

AWS Resource Name: arn:aws:s3:::noaa-nws-owp-fim

Configuring the AWS CLI

  1. Install AWS CLI tools

  2. Configure AWS CLI tools (see the sketch below)
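
A minimal sketch of the configuration step: run the interactive setup and enter the credentials issued to you by ESIP when prompted. Since the bucket lives in us-east-1, that is a sensible default region.

aws configure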

Accessing Data using the AWS CLI

This S3 bucket (s3://noaa-nws-owp-fim) is set up as a "Requester Pays" bucket; see the AWS documentation on Requester Pays for details. If you are using compute resources in the same region as the bucket, there is no cost.

Examples

List bucket folder structure:

aws s3 ls s3://noaa-nws-owp-fim/ --request-payer requester

Download a directory of outputs for a HUC8:

aws s3 cp --recursive s3://noaa-nws-owp-fim/hand_fim/outputs/fim_4_0_18_02/12090301 /your_local_folder_name/12090301 --request-payer requester

By adjusting the path, you can also download entire directories, such as the fim_4_0_18_0 folder. Note: there may be newer versions than fim_4_0_18_0, so it is recommended to adjust the command above for the latest version.
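
A hedged sketch of a whole-directory download (this is a large transfer, and the version folder shown is taken from the example above rather than guaranteed to be the latest):

aws s3 cp --recursive s3://noaa-nws-owp-fim/hand_fim/outputs/fim_4_0_18_02 /your_local_folder_name/fim_4_0_18_02 --request-payer requester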

Running the Code

Input Data

Input data can be found on the ESIP S3 Bucket (see "Accessing Data through ESIP S3 Bucket" section above). All necessary non-publicly available files are in this S3 bucket, as well as sample input data for HUCs 1204 and 1209.

Dependencies

Docker

Installation

  1. Install Docker: Docker
  2. Build the Docker image: docker build -f Dockerfile -t <image_name>:<tag> <path/to/repository> (see the example after this list)
  3. Create the fim group on the host machine:
    • Linux: groupadd -g 1370800178 fim
  4. Change the group ownership of the repository (this needs to be redone whenever new files are added to the repository):
    • Linux: chgrp -R fim <path/to/repository>
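
As a concrete sketch of step 2, with an illustrative image name, tag, and repository path:

docker build -f Dockerfile -t fim:latest /home/user/inundation-mapping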

Configuration

This software is configurable via parameters found in the config directory. Copy files before editing and remove the "template" pattern from the filename, as sketched below. Make sure to set the config folder's group to fim recursively using the chown command. Each development version includes a calibrated parameter set of Manning's n values.

  • params_template.env
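
A hedged sketch of this step; the target filename follows the remove-"template" convention described above but is otherwise an assumption:

cp config/params_template.env config/params.env
chown -R :fim config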

This software includes an optional calibration database tool. To use it, you will need to install the calibration database service; otherwise, disable it in the params_template.env file. See the calibration tool README for more details.

Produce HAND Hydrofabric

fim_pipeline.sh -u <huc8> -n <name_your_run>
  • There are a number of options and default values; for details, run fim_pipeline.sh -h.
  • Mandatory arguments:
    • -u can be a single HUC, a space-delimited series of HUCs in quotes, or a line-delimited (.lst) file. To run the entire domain of available data, use one of the /data/inputs/included_huc8.lst files or a HUC list file of your choice. Depending on the performance of your server, especially the number of CPU cores, running the full domain can take multiple days.
    • -n is the name of your run (alphanumeric only).
  • Outputs can be found under /data/outputs/<name_your_run>. An example invocation follows this list.
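
For example, a minimal sketch using the HUC8 from the S3 examples above, with an arbitrary run name; the second command shows the quoted, space-delimited multi-HUC form (the second HUC number is illustrative):

fim_pipeline.sh -u 12090301 -n my_test_run
fim_pipeline.sh -u "12090301 12090302" -n my_test_run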

Processing of HUCs in FIM4 comes in three pieces. You can run fim_pipeline.sh, which automatically runs all three major sections, or you can run each section independently if you like (a sketch of the manual sequence follows this list). The three sections are:

  • fim_pre_processing.sh : This section must be run first, as it creates the basic output folder for the run. It also creates a number of key files and folders for the next two sections.
  • fim_process_unit_wb.sh : This script processes exactly one HUC8 plus all of its related branches. While each invocation processes only one HUC, you can run the script multiple times, each with a different HUC (or overwriting a previous one). When you run fim_pipeline.sh with more than one HUC, supplied either by command-line arguments or via a HUC list, it automatically iterates, calling fim_process_unit_wb.sh for each HUC provided. Using fim_process_unit_wb.sh directly allows you to run or rerun a single HUC, or to run other HUCs at different times, on different days, or even in different Docker containers.
  • fim_post_processing.sh : This section takes all of the HUCs that have been processed, aggregates key information from each HUC directory, and looks for errors across all HUC folders. It also runs sub-steps over the whole group, such as USGS gauge processing, rating curve adjustments, and more. Naturally, this script can only be run (or rerun) after fim_pre_processing.sh and at least one run of fim_process_unit_wb.sh.
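
A hedged sketch of the manual sequence; the argument forms here are assumptions (fim_process_unit_wb.sh is assumed to take the run name and HUC positionally), so check each script's help output before running:

fim_pre_processing.sh -u 12090301 -n my_test_run
fim_process_unit_wb.sh my_test_run 12090301
fim_post_processing.sh -n my_test_run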

Running fim_pipeline.sh is quicker than running the three steps independently.

Testing in Other HUCs

To test in HUCs other than the provided ones, the following process can be followed to acquire and preprocess additional NHDPlus rasters and vectors. After these steps are complete, the "Produce HAND Hydrofabric" step can be run for the new HUCs.

/foss_fim/src/acquire_and_preprocess_inputs.py -u <huc4s_to_process>
Note: This tool is deprecated; updates are coming soon.
  • -u can be a single HUC4, a series of HUC4s (e.g. 1209 1210), or a path to a line-delimited file of HUC4s (an example follows this list).
  • Please run /foss_fim/src/acquire_and_preprocess_inputs.py --help for more information.
  • See the United States Geological Survey (USGS) National Hydrography Dataset Plus High Resolution (NHDPlusHR) site for more information.
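
For instance, a minimal sketch for a single HUC4, using one of the sample HUC4s mentioned above:

/foss_fim/src/acquire_and_preprocess_inputs.py -u 1209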

Reproject NHDPlus High-Res Rasters and Convert to Meters.

/foss_fim/src/preprocess_rasters.py
Note: This tool is deprecated; updates are coming soon.

Evaluating Inundation Map Performance

After fim_pipeline.sh completes (or the combination of the three major steps described above), you can evaluate the model's skill. The evaluation benchmark datasets are available through ESIP in the test_cases directory.

To evaluate model skill, run the following:

python /foss_fim/tools/synthesize_test_cases.py -c DEV -v <fim_run_name> -m <path/to/output/metrics.csv> -j [num_of_jobs (cores and/or procs)]

More information can be found by running:

python /foss_fim/tools/synthesize_test_cases.py --help
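
As a filled-in sketch of the evaluation command, with an illustrative run name, metrics path, and job count:

python /foss_fim/tools/synthesize_test_cases.py -c DEV -v my_test_run -m /data/outputs/my_test_run/metrics.csv -j 4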

Managing Dependencies

Dependencies are managed via Pipenv.

When you execute docker build from the Installation section above, all of the dependencies you need are included, among them the dependencies for working in JupyterLab for testing purposes.

Though it is rarely necessary, you may want to add more dependencies. To do so, follow these steps:

  • From inside your Docker container, run the following command:

    pipenv install <your package name> --dev

    The --dev flag adds a development dependency; omit it to add a production dependency.

    This will automatically update the Pipfile in the root of your Docker container directory. If the environment looks good after adding dependencies, lock it with:

    pipenv lock

    This will update the Pipfile.lock. Copy the updated Pipfile and Pipfile.lock into the source directory and include both in your Git commits. The Docker image installs the environment from the lock file.

Make sure you test the change thoroughly, including building new Docker images and verifying that the code continues to work.

If you are on a machine with a particularly slow internet connection, you may need to increase pipenv's timeout. To do this, simply add PIPENV_INSTALL_TIMEOUT=10000000 in front of any of your pipenv commands, as shown below.
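
For example, applied to the lock step from above:

PIPENV_INSTALL_TIMEOUT=10000000 pipenv lock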


Citing This Work

Please cite this work in your research and projects according to the CITATION.cff file found in the root of this repository.


Known Issues & Getting Help

Please see the issue tracker on GitHub and the Inundation Mapping Wiki for known issues and getting help.

Getting Involved

NOAA's National Water Center welcomes anyone to contribute to the Inundation Mapping repository to improve flood inundation mapping capabilities. Please contact Carson Pruitt ([email protected]) or Fernando Salas ([email protected]) to get started.

Open Source Licensing Info

  1. TERMS
  2. LICENSE

Credits and References

  1. Office of Water Prediction (OWP)
  2. National Flood Interoperability Experiment (NFIE)
  3. Garousi-Nejad, I., Tarboton, D. G., Aboutalebi, M., & Torres-Rua, A. (2019). Terrain analysis enhancements to the Height Above Nearest Drainage flood inundation mapping method. Water Resources Research, 55, 7983–8009.
  4. Zheng, X., D.G. Tarboton, D.R. Maidment, Y.Y. Liu, and P. Passalacqua. 2018. “River Channel Geometry and Rating Curve Estimation Using Height above the Nearest Drainage.” Journal of the American Water Resources Association 54 (4): 785–806.
  5. Liu, Y. Y., D. R. Maidment, D. G. Tarboton, X. Zheng and S. Wang, (2018), "A CyberGIS Integration and Computation Framework for High-Resolution Continental-Scale Flood Inundation Mapping," JAWRA Journal of the American Water Resources Association, 54(4): 770-784.
  6. Barnes, Richard. 2016. RichDEM: Terrain Analysis Software
  7. TauDEM
  8. Federal Emergency Management Agency (FEMA) Base Level Engineering (BLE)
  9. Verdin, James; Verdin, Kristine; Mathis, Melissa; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; and Gadain, Hussein, 2016, A software tool for rapid flood inundation mapping: U.S. Geological Survey Open-File Report 2016–1038, 26 p.
  10. United States Geological Survey (USGS) National Hydrography Dataset Plus High Resolution (NHDPlusHR)
  11. Esri Arc Hydro
