dtarb / taudem

Terrain Analysis Using Digital Elevation Models (TauDEM) software for hydrologic terrain analysis and channel network extraction.

Home Page: http://hydrology.usu.edu/taudem

License: Other

C 3.53% Makefile 0.94% C++ 81.32% CMake 0.58% Python 11.92% Inno Setup 1.45% Shell 0.25%

taudem's People

Contributors

alexbruy, brightfanfy, castronova, dtarb, hobu, jsta, nazmussazib, nickrobison, pkdash, r-barnes, rogerlew, xingzheng, yanliu-chn


taudem's Issues

Add support for tracking test suite dependencies with Git LFS

Currently, the TestSuite directory contains ~1 GB of data files:

$ du -sh TestSuite 
1.0G    TestSuite

It may be useful to track these files with Git LFS rather than storing them directly in the Git repository. The repository would still track pointers to the large test files, but the data itself would be kept out of the primary repository so that it isn't so heavyweight.
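For illustration, the tracking setup might look something like the following (a minimal sketch; the TestSuite/** pattern is only an example and would need to match the actual data files):

git lfs install                   # one-time setup; requires git-lfs to be installed
git lfs track "TestSuite/**"      # writes filter=lfs patterns into .gitattributes
git add .gitattributes

Existing history would still carry the old blobs, so a migration step (e.g. git lfs migrate) might also be needed; that part is beyond this sketch.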

Would you be willing to accept a change that attempts to add LFS support?

TauDEM logo in vector format

It would be good to have the TauDEM logo in a vector format, e.g. SVG, for use in documentation and in applications that call TauDEM tools.

Raster statistics appear to be copied and not updated

When running commands like DinfFlowDir, the raster statistics embedded in the input GeoTIFF files appear unchanged in the output files. For example, the statistics from the source DEM are shown by gdalinfo input.tif:

Driver: GTiff/GeoTIFF
Files: input.tif
Size is 2200, 3200
...
Band 1 Block=2200x128 Type=Float32, ColorInterp=Gray
  Min=144.480 Max=1108.735
  Minimum=144.480, Maximum=1108.735, Mean=466.574, StdDev=133.894
  NoData Value=-3.0000000054977558e+038
  Metadata:
    STATISTICS_MAXIMUM=1108.7349853516
    STATISTICS_MEAN=466.57443054098
    STATISTICS_MINIMUM=144.47999572754
    STATISTICS_STDDEV=133.89387030864

Within the binary GeoTIFF file, the metadata is stored at tag 42112 (see TIFFTAG_GDAL_METADATA)

<GDALMetadata>
  <Item name="STATISTICS_MAXIMUM" sample="0">1108.7349853516</Item>
  <Item name="STATISTICS_MEAN" sample="0">466.57443054098</Item>
  <Item name="STATISTICS_MINIMUM" sample="0">144.47999572754</Item>
  <Item name="STATISTICS_STDDEV" sample="0">133.89387030864</Item>
</GDALMetadata>

The problem is that the identical statistics also appear in the output files, for example the flow direction raster, which is misleading to tools that read the rasters. For example, ArcToolbox shows the flow direction raster as all black, since the values in the raster are irrelevant to the statistics copied from another raster.

I'm not sure what's going on exactly in TauDEM, except that it looks like this metadata is copied. I see tag 42112 in, for example, src/tifFile.cpp#L371, but I'm not sure how it is used.

The fix is to either update the metadata items with the correct statistics for the output rasters, or not copy the lines from tag 42112 that match "STATISTICS_*" (the latter strategy is what gdalwarp does, as shown in its source code). The statistics can be recalculated afterwards using gdalinfo -stats output.tif.
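As a rough illustration of the first option, the statistics could be recomputed and overwritten on each output raster after it is written. This is only a sketch using the GDAL C++ API, not TauDEM's tifFile.cpp code, and the standalone program form is my own invention:

// recompute_stats.cpp - sketch: recompute band statistics on a finished output raster
#include "gdal_priv.h"

int main(int argc, char **argv)
{
    if (argc < 2) return 1;
    GDALAllRegister();
    GDALDataset *ds = (GDALDataset *) GDALOpen(argv[1], GA_Update);
    if (ds == NULL) return 1;
    GDALRasterBand *band = ds->GetRasterBand(1);
    double minv, maxv, mean, stddev;
    // bApproxOK = FALSE forces an exact scan of the band values
    band->ComputeStatistics(FALSE, &minv, &maxv, &mean, &stddev, NULL, NULL);
    // overwrite whatever STATISTICS_* items were inherited from the input
    band->SetStatistics(minv, maxv, mean, stddev);
    GDALClose(ds);
    return 0;
}

Run as, e.g., recompute_stats output.tif after each TauDEM command; gdalinfo -stats output.tif as noted above achieves the same end from the command line.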

Question: Stream Network Missing Main Stem

I'm delineating watersheds and a stream network and having an odd problem. Although the wshed layer and stream network are being created, the main stem is missing. All the tributaries are there, but the main stem is not. Also, continuously along that missing channel the wsheds are missing: the grid has no-data cells in those areas. The script runs to completion with no errors or warnings.

Thanks for any help you can provide.

Ian
TauDEM 5.1.2 command line, mpiexec (Microsoft HPC Pack 2012 MS-MPI Redistributable Pack)
Windows 7, Intel i7, 16GB
Data source: USGS 1/3 arcsec DEM downloaded from TheNationalMap, projected to an Oregon LCC projection.
Area: ~1600 km2

Here is a small area that shows the problem (screenshot omitted).

Here is the script I am using:


REM @echo off
REM Get output directory
echo PWD is %cd%
set /p UserInputPath= "Enter output directory: "
set op=%cd%%UserInputPath%
md %op%
if ERRORLEVEL 1 EXIT /B
echo Output path is %op%

REM FILL
mpiexec -n 8 PitRemove mud.tif

REM FLOWDIR
mpiexec -n 8 D8Flowdir -p mudp.tif -sd8 %op%\mudsd8.tif -fel mudfel.tif
mpiexec -n 8 DinfFlowdir -ang %op%\mudang.tif -slp %op%\mudslp.tif -fel mudfel.tif

REM CONTRIB AREA
mpiexec -n 8 AreaD8 -p mudp.tif -ad8 %op%\mudad8.tif
mpiexec -n 8 AreaDinf -ang %op%\mudang.tif -sca %op%\mudsca.tif
REM mpiexec -n 8 Aread8 -p mudp.tif -o outlet.shp -ad8 %op%\mudad8o.tif
mpiexec -n 8 Aread8 -p mudp.tif -ad8 %op%\mudad8o.tif

REM GRIDNET
mpiexec -n 8 Gridnet -p mudp.tif -plen %op%\mudplen.tif -tlen %op%\mudtlen.tif -gord %op%\mudgord.tif

REM PEUKERDOUGLAS
mpiexec -n 8 PeukerDouglas -fel mudfel.tif -ss %op%\mudss.tif

REM PEUKERDOUGLAS STREAM DELINEATION
REM mpiexec -n 8 Aread8 -p mudp.tif -o mudoutlet2.shp -ad8 %op%\mudssa.tif -wg %op%\mudss.tif
mpiexec -n 8 Aread8 -p mudp.tif -ad8 %op%\mudssa.tif -wg %op%\mudss.tif
mpiexec -n 8 Dropanalysis -p mudp.tif -fel mudfel.tif -ad8 %op%\mudad8.tif -ssa %op%\mudssa.tif -drp %op%\muddrp.txt -o mudoutlet.shp -par 5 1500 16 0
mpiexec -n 8 Threshold -ssa %op%\mudssa.tif -src %op%\mudsrc.tif -thresh 300

REM STREAM NETWORK
mpiexec -n 8 Streamnet -fel mudfel.tif -p mudp.tif -ad8 %op%\mudad8.tif -src %op%\mudsrc.tif -ord %op%\mudord3.tif -tree %op%\mudtree.dat -coord %op%\mudcoord.dat -net %op%\mudnet.shp -w %op%\mudw.tif -o sws_outlets.shp


Compilation problem on Fedora

Hello,
I am trying to install TauDEM on a Fedora machine and ran into the problem below. I was following this tutorial, but I believe it is no longer up to date.
Could you provide some guidance for a standalone Linux installation?

$> CXX=mpicxx cmake -DCMAKE_INSTALL_PREFIX=/usr/local ..
-- The C compiler identification is GNU 4.9.2
-- The CXX compiler identification is GNU 4.9.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/lib64/openmpi/bin/mpicxx
-- Check for working CXX compiler: /usr/lib64/openmpi/bin/mpicxx -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Configuring done
CMake Error at CMakeLists.txt:65 (add_executable):
Cannot find source file:

shape/cell.cpp

Tried extensions .c .C .c++ .cc .cpp .cxx .m .M .mm .h .hh .h++ .hm .hpp
.hxx .in .txx

PS: With the fast development of Python modules dedicated to hydrological data, does anybody have plans to translate this set of tools into a Python module?

TauDEM 5.3 - Error connecting to the Service

Hello -

I just downloaded and installed the latest version of TauDEM (Nov 12, 2015), TauDEM530_prerelease.exe. I'm running Windows 7. In the command window I entered the following:

mpiexec -n 8 DinfFlowDir -fel "E:\ddrive\GIS\projects\DEM_Fil.tif" -ang "E:\ddrive\GIS\projects\DEM_Tauang.tif" -slp "E:\ddrive\GIS\projects\DEM_Tauslp.tif"

and received the following error messages:

Error connecting to the Service
[mpiexec@PCxxxxxxxxx] ..\hydra\utils\sock\sock.c (270): unable to connect from "PCxxxxxxxxx" to "PCxxxxxxxxx" (No error)
read from stdin failed, error 6.
[mpiexec@PCxxxxxxxxx] ..\hydra\tools\demux\demux_select.c (78): select error (No such file or directory)
[mpiexec@PCxxxxxxxxx] ..\hydra\pm\pmiserv\pmiserv_pmci.c (480): error waiting for event
[mpiexec@PCxxxxxxxxx] ..\hydra\ui\mpich\mpiexec.c (945): process manager error waiting for completion

Any ideas how I can resolve these errors?
Thank you.

ERROR 000732: Input Raster Dataset

I downloaded and installed TauDEM to examine the tool's capability to edit the DEM. I extracted and imported the elevation data into ArcMap and ran the 'Pit Remove' tool. When it finished running it showed this error message:

Process started:

Executing: Calculate Statistics

Failed script PitRemove...

Traceback (most recent call last):
File "C:\Program Files\TauDEM\TauDEM5Arc\PitRemove.py", line 63, in
arcpy.CalculateStatistics_management(outFile)
File "c:\program files (x86)\arcgis\desktop10.2\arcpy\arcpy\management.py", line 13825, in CalculateStatistics
raise e
ExecuteError: Failed to execute. Parameters are not valid.
ERROR 000732: Input Raster Dataset: Dataset D:\work\projects\test\Demo\cubdemfel.tif does not exist or is not supported
Failed to execute (CalculateStatistics).

Failed to execute (PitRemove).

I could not proceed further. Any help would be greatly appreciated.

Regards,
Sandhya

D8 produces potentially bad results in some cases

Hi,

While trying to figure out where the inconsistencies are coming from in my version of D8, I ran into some weird cases. I used an NED tile, which was converted to integer values and pitremove'd. Using normal float elevations usually doesn't produce too many differences between the algorithms, but the integer ones do.

For example, here's a small area with the differing cells in red (these are the output p rasters), followed by the results from TauDEM 5.3.3 and from my new algorithm (screenshots omitted).

As we can see, TauDEM seems to assign the same direction for all cells in some flats, while my version produces more natural looking results. @ahmetartu agrees that this looks odd and asked me to open this issue.

Are we missing something?

EPSG code not found for some data

A user experienced a problem with a shapefile that had the spatial reference "USA_Contiguous_Albers_Equal_Area_Conic". This was traced to lines 91-99 in ReadOutlets.cpp failing when they try to identify the EPSG code. This code needs to be fixed to be robust against EPSG codes not being identified. The principle I would like to implement is that we should check whether the projections match. If they do not match or are missing, we should issue a warning and keep going. We should be catching errors rather than crashing, as appears to be the case here.
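A sketch of that principle using the OGR API (illustrative only: the function name and messages are mine, and this is not the actual ReadOutlets.cpp change):

#include "ogr_spatialref.h"
#include <cstdio>

// Compare the outlet file's spatial reference with the raster's. Never abort:
// warn on a mismatch or on a missing/unidentified projection and keep going.
void checkProjections(const OGRSpatialReference *rasterSRS,
                      const OGRSpatialReference *outletSRS)
{
    if (rasterSRS == NULL || outletSRS == NULL) {
        fprintf(stderr, "Warning: a spatial reference is missing or could not be "
                        "identified; continuing without the projection check.\n");
        return;
    }
    if (!rasterSRS->IsSame(outletSRS)) {
        fprintf(stderr, "Warning: outlet and raster projections do not appear to match; "
                        "results may be wrong, but processing will continue.\n");
    }
}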

Add multifile .vrt writing capability

For very big DEMs it is unrealistic to have a single file holding hundreds of gigabytes of data. Writing multiple .tiff files and connecting them as a .vrt raster is the proposed solution. I suggest we adopt the convention that vrt rasters written by TauDEM are all .tiff (not compressed) and all in a single folder that also holds the XML .vrt file. The output command line parameter would then be foldername/filename.vrt (or foldername\filename.vrt on a PC). The code then needs to create the folder, each process writes its tiff files into the folder, and at the end the XML .vrt file gets created. By default each process should write one .tiff file. If we want multiple files per process, an option such as -mf 2 3 could, for example, put 2 rows and 3 columns of files per process. The tiff files should inherit their filename from the .vrt filename and be suffixed p[rc], with the row and column only being used when the -mf option is specified.

The strategy can be:
a. each process computes its top-left coordinate position using the cell size.
b. each process writes its own partition as .tiff files into the folder.
c. each process transmits the coordinates and file names to the master process.
d. the master process waits for the other processes to complete writing and then creates the .vrt file, which should be fast considering it is just an XML file (see the example below).
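For reference, the file the master process writes at the end is just a standard GDAL VRT. A minimal hand-written example with two made-up tiles (all sizes, offsets, georeferencing and file names here are illustrative only) might look like:

<VRTDataset rasterXSize="4000" rasterYSize="2000">
  <GeoTransform>303571.0, 50.0, 0.0, 1117618.0, 0.0, -50.0</GeoTransform>
  <VRTRasterBand dataType="Float32" band="1">
    <NoDataValue>-9999</NoDataValue>
    <SimpleSource>
      <SourceFilename relativeToVRT="1">demfel_p0.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <SrcRect xOff="0" yOff="0" xSize="4000" ySize="1000"/>
      <DstRect xOff="0" yOff="0" xSize="4000" ySize="1000"/>
    </SimpleSource>
    <SimpleSource>
      <SourceFilename relativeToVRT="1">demfel_p1.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <SrcRect xOff="0" yOff="0" xSize="4000" ySize="1000"/>
      <DstRect xOff="0" yOff="1000" xSize="4000" ySize="1000"/>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>

Each process only needs to report its tile's file name and row/column offset for the master to assemble this; gdalbuildvrt produces the same kind of file and could be used to cross-check the output.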

Error Move Outlets to Stream

Hi,
I am trying to use the Logan demo as explained in "Delineating a single watershed", and when doing the "Move Outlets to Streams" step I get an error referencing:

ERROR 000714: Error in script MoveOutletsToStreams.
Error in executing: cmd.exe /C C:\PROGRA1\TauDEM\TAUDEM2\MOVEOU~1.PY "logD8Flow.tif" "logD8Contibutesrc.tif" "Outlet_New" "200" "8" "C:\Users\graut\Desktop\CatchmentAnalysis\Examples\Logan\logOutletmv.shp"

Is there some resource where I can find possible errors explained? Every now and then single modules crash and I'm often not sure why (I always check for matching coordinate systems, and sometimes it was just because of the filename length).

No BIGTIFF support in tiff file creation because of geotiff compression support

Currently, TauDEM fails to create tiff files larger than 4 GB. As described on the GeoTIFF GDAL page, if compression is used in GeoTIFF creation, BIGTIFF must be passed explicitly in papszOptions to the GDALCreate function. The code in the tiffIO::write function in tiffIO.cpp to fix that issue should be:

// if the extension is tiff, set bigtiff to true
if (index == 0)
    papszOptions = CSLSetNameValue( papszOptions, "BIGTIFF", "YES");

where that code can be inserted after this line in tiffIO.cpp:

papszOptions = CSLSetNameValue( papszOptions, "COMPRESS", compression_meth[index]);

File locking failed in ADIOI_Set_lock(fd 4,cmd F_SETLKW/7,type F_RDLCK/0,whence 0) with return value FFFFFFFF and errno 26

Hi, I need help. I am getting an error when I try to run TauDEM 5.1 on an HPC cluster. See below for the command and the errors I am getting:

[sirdc@ln01 ~]$ mpiexec -n 4 /lustre/softs/taudem/taudem-5.1/build/pitremove -z ./dem -fel ./demfel
PitRemove version 5.1
File locking failed in ADIOI_Set_lock(fd 4,cmd F_SETLKW/7,type F_RDLCK/0,whence 0) with return value FFFFFFFF and errno 26.

  • If the file system is NFS, you need to use NFS version 3, ensure that the lockd daemon is running on all the machines, and mount the directory with the 'noac' option (no attribute caching).
  • If the file system is LUSTRE, ensure that the directory is mounted with the 'flock' option.
    ADIOI_Set_lock:: Function not implemented
    ADIOI_Set_lock:offset 0, length 2
    application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
    File locking failed in ADIOI_Set_lock(fd 4,cmd F_SETLKW/7,type F_RDLCK/0,whence 0) with return value FFFFFFFF and errno 26.
  • If the file system is NFS, you need to use NFS version 3, ensure that the lockd daemon is running on all the machines, and mount the directory with the 'noac' option (no attribute caching).
  • If the file system is LUSTRE, ensure that the directory is mounted with the 'flock' option.
    ADIOI_Set_lock:: Function not implemented
    ADIOI_Set_lock:offset 0, length 2
    application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
    File locking failed in ADIOI_Set_lock(fd 4,cmd F_SETLKW/7,type F_RDLCK/0,whence 0) with return value FFFFFFFF and errno 26.
  • If the file system is NFS, you need to use NFS version 3, ensure that the lockd daemon is running on all the machines, and mount the directory with the 'noac' option (no attribute caching).
  • If the file system is LUSTRE, ensure that the directory is mounted with the 'flock' option.
    ADIOI_Set_lock:: Function not implemented
    ADIOI_Set_lock:offset 0, lFile locking failed in ADIOI_Set_lock(fd 4,cmd F_SETLKW/7,type F_RDLCK/0,whence 0) with return value FFFFFFFF and errno 26.
  • If the file system is NFS, you need to use NFS version 3, ensure that the lockd daemon is running on all the machines, and mount the directory with the 'noac' option (no attribute caching).
  • If the file system is LUSTRE, ensure that the directory is mounted with the 'flock' option.
    ADIOI_Set_lock:: Function not implemented
    ADIOI_Set_lock:offset 0, length 2
    application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
    ength 2
    application called MPI_Abort(MPI_COMM_WORLD, 1) - process 3
    rank 2 in job 3 ln01_56636 caused collective abort of all ranks
    exit status of rank 2: return code 1
    rank 1 in job 3 ln01_56636 caused collective abort of all ranks
    exit status of rank 1: return code 1
    rank 0 in job 3 ln01_56636 caused collective abort of all ranks
    exit status of rank 0: return code 1

Compilation problem in Debian

Hi.

I get this when running make in Debian 8.4, please let me know how to fix this if there's a way:

mpic++ -g -c areadinfmn.cpp -o areadinfmn.o
In file included from areadinfmn.cpp:46:0:
commonLib.h:46:21: fatal error: ogr_api.h: No such file or directory
 #include "ogr_api.h"
                     ^
compilation terminated.
makefile:179: recipe for target 'areadinfmn.o' failed
make: *** [areadinfmn.o] Error 1

Thank you.

[Request] Port open file fixes to single-file version

Hi. On Mac OS X, I am seeing many processes of the single-file version (5.1.1) of TauDEM finishing with file handles left open, whereby the output file(s) become unreadable by other apps. It looks like this has been fixed in 4568661, and maybe other commits in this repository.

I would like to know whether these fixes are planned to be ported to the single-file version. Thanks.

Flow direction conditioning

Develop a new TauDEM "Flow direction conditioning" tool that soft-burns elevations by tracking down D8 flow directions to ensure there is no uphill elevation. This is to be part of a procedure for etching a vector stream network into a DEM for inundation mapping. It is step 6 in the following overall procedure.

  1. Inputs are a stream vector file (e.g. NHD or NHD High Res) and a DEM (denoted z).
  2. Convert the stream vectors to a raster that shares the same dimensions (columns, rows, cell size and edge coordinates) as the DEM, denoted srfv (stream raster from vector), with values 1 on stream and 0 off stream
  3. Burn srfv into z using the raster calculation zb = z - 100 * srfv (or any big number in place of 100)
  4. D8FlowDir with input zb and output flow directions p
  5. Mask the D8 flow directions to only have flow directions on streams. The raster calculation pm = p/srfv achieves this
  6. Develop a new TauDEM "Flow direction conditioning" tool that operates on pm and z and produces zsb (soft-burned elevations) by tracking down the D8 flow directions and ensuring there is no uphill elevation (see the sketch after this list).
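A rough serial sketch of what the step 6 tool could do (hypothetical code: it ignores no-data handling, MPI partitioning and file I/O, assumes the D8 direction grid contains no loops, and uses TauDEM's D8 codes 1-8 counter-clockwise from east):

#include <vector>
#include <algorithm>

// D8 direction codes 1..8 -> (row, col) offsets, 1 = east, counter-clockwise.
const int dRow[9] = {0, 0, -1, -1, -1, 0, 1, 1, 1};
const int dCol[9] = {0, 1, 1, 0, -1, -1, -1, 0, 1};

// pm: masked D8 directions (only defined on stream cells), z: original elevations.
// Returns zsb, the soft-burned surface: following the flow directions downstream
// from every stream cell, no cell is left higher than the cells upstream of it.
std::vector<std::vector<float> > flowDirCondition(
        const std::vector<std::vector<int> > &pm,
        const std::vector<std::vector<float> > &z)
{
    const int rows = (int)z.size(), cols = (int)z[0].size();
    std::vector<std::vector<float> > zsb = z;
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++) {
            if (pm[i][j] < 1 || pm[i][j] > 8) continue;     // not a stream cell
            int ci = i, cj = j;
            float runningMin = zsb[ci][cj];
            while (pm[ci][cj] >= 1 && pm[ci][cj] <= 8) {    // walk downstream
                int ni = ci + dRow[pm[ci][cj]];
                int nj = cj + dCol[pm[ci][cj]];
                if (ni < 0 || nj < 0 || ni >= rows || nj >= cols) break;
                // lower the downstream cell if the path would otherwise go uphill
                zsb[ni][nj] = std::min(zsb[ni][nj], runningMin);
                runningMin = zsb[ni][nj];
                ci = ni; cj = nj;
            }
        }
    return zsb;
}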

mpiexec can't find gdal111.dll - R

I am trying to run this command in R

system("mpiexec -n 8 pitremove -z logan.tif -fel loganfel.tif")

However I get an error message that "the program can't start because gdal111.dll is missing from your computer".

I have installed gdal as administrator and also have added the path to my system variables. I don't see what is missing. I have also installed Rmpi and ms-mpi.

I have installed all packages separately and just downloaded the commandline executables.

Make changes to Inno Setup script

The setup script should get the source files for the build from a directory outside the repo directory so that these source files do not become part of the source control.

Correct or gracefully handle case when user inputs an output file without extension

If an output is specified without an extension, the result when run from ArcGIS is shown in the attached screenshot.

The file loganfel.tif is written with the .tif extension appended by the C++ executable.
Options to fix include

  1. a warning message telling the user that the name was changed
  2. trapping for this in the Python wrapper or validation code and adding the extension there so that statistics can be computed on the result
  3. interpreting the folder as a container for .vrt rasters (but statistics may also not work on the folder)

Option 2 is preferred.

Install problem

I have a problem with the TauDEM installation even though I followed the instructions.
Everything seems to be alright until I add the toolbox to ArcMap. But when I open the TauDEM toolbox, the only function I can see is "Watershed Grid to Shapefile" under "Stream Network Analysis".

The same is true when I look at the toolbox with ArcCatalog.
I tried the installation on several independent machines and always experienced the same.

Can you give me a little bit of support with this issue?
Thank you in advance!

Update Stream Reach and Watershed documentation

A user wrote:

It appears that in the output from the Stream Reach And Watershed, the Straight line distance field name has changed from ‘Straight_L’ to ‘StraightL’ (see attached). The documentation does not reflect this change…it still refers to this field as ‘Straight_L’.


The documentation at http://hydrology.usu.edu/taudem/taudem5/help53/StreamReachAndWatershed.html needs to be cross-checked against the code and updated for all variables with _ in the name.

The code also needs to be checked, as in the image above Drop_ and Order_ appear to have a trailing _.

Add "Basins" function

A user requested a function that delineates all drainage basins. These are defined by identifying all pour points along the edge of the DEM and using these as outlet points to delineate the area draining to them.

My suggestion for implementation is to start with gagewatershed, but instead of looping over points in the shapefile to identify outlets in lines 161 to 174, loop over the grid of flow directions and identify edge cells: those where the flow direction of the downstream neighbor is undefined (off the edge or to an internal sink). Give each a unique number and queue it as the outlet of a basin (sketched below). Then the rest of the gagewatershed code should work to get the basins.
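As an illustration of the outlet-seeding part only (not the gagewatershed code itself; the function name and the -1 "no id" convention are made up), the scan over the flow direction grid could look like:

#include <vector>

// TauDEM D8 codes 1..8, counter-clockwise starting from east.
const int dRow8[9] = {0, 0, -1, -1, -1, 0, 1, 1, 1};
const int dCol8[9] = {0, 1, 1, 0, -1, -1, -1, 0, 1};

// p: D8 flow directions. A cell becomes a basin outlet if its downstream
// neighbor is off the grid, or if that neighbor has no defined direction
// (an internal sink). Each outlet gets a unique id; other cells stay -1.
std::vector<std::vector<int> > seedBasinOutlets(const std::vector<std::vector<int> > &p)
{
    const int rows = (int)p.size(), cols = (int)p[0].size();
    std::vector<std::vector<int> > id(rows, std::vector<int>(cols, -1));
    int nextId = 1;
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++) {
            int d = p[i][j];
            if (d < 1 || d > 8) continue;                  // cell itself has no direction
            int ni = i + dRow8[d], nj = j + dCol8[d];
            bool offGrid = (ni < 0 || nj < 0 || ni >= rows || nj >= cols);
            if (offGrid || p[ni][nj] < 1 || p[ni][nj] > 8)
                id[i][j] = nextId++;                       // queue as a basin outlet
        }
    return id;
}

The rest of gagewatershed would then label every remaining cell with the id of the outlet it drains to, as it already does for shapefile outlets.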

No data value handling on Dinf flow direction

Not sure if that is an issue so feel free to close if it is just the way it is.
I am computing TWI over a pretty large area using a 32-core machine with a ton of RAM. My DEMs are around 1.4 GB and I am working on SSD. When I compute the Dinf flow direction, I get 295392041 flats to resolve, which even with a 32-core machine takes more than 24 hours.
I have the feeling that the algorithm treats my no_data value as actual values because, if I crop my DEM but keep the same extent, I get even more flats to resolve.

Is there a preferred no_data value? I use -9999, as seen in the gdalinfo output below. Here is the gdalinfo output for one of my raster files, with my workflow at the bottom.
Am I doing something wrong?

My files are :

J$ gdalinfo /Users/julienschroder/Downloads/CTIAnalysis_RasterLayers_26OCT2015/AKPCTR_DEM.tif 
Driver: GTiff/GeoTIFF
Files: /Users/.../AKPCTR_DEM.tif
       /Users/.../AKPCTR_DEM.tif.ovr
       /Users/.../AKPCTR_DEM.tfw
       /Users/j/Users/.../AKPCTR_DEM.tif.aux.xml

Size is 20332, 18577
Coordinate System is:
PROJCS["NAD_1983_StatePlane_Alaska_1_FIPS_5001",
    GEOGCS["NAD83",
        DATUM["North_American_Datum_1983",
            SPHEROID["GRS 1980",6378137,298.2572221010002,
                AUTHORITY["EPSG","7019"]],
            AUTHORITY["EPSG","6269"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4269"]],
    PROJECTION["Hotine_Oblique_Mercator"],
    PARAMETER["latitude_of_center",57],
    PARAMETER["longitude_of_center",-133.6666666666667],
    PARAMETER["azimuth",-36.86989764583333],
    PARAMETER["rectified_grid_angle",0],
    PARAMETER["scale_factor",0.9999],
    PARAMETER["false_easting",5000000],
    PARAMETER["false_northing",-5000000],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (303571.061199599993415,1117618.294335020007566)
Pixel Size = (49.999999999999993,-50.000000000000000)
Metadata:
  AREA_OR_POINT=Area
  DataType=Generic
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (  303571.061, 1117618.294) (162d14'22.18"W, 23d20'23.00"N)
Lower Left  (  303571.061,  188768.294) (156d53'39.64"W, 19d 8'34.25"N)
Upper Right ( 1320171.061, 1117618.294) (157d10'48.89"W, 29d17'48.43"N)
Lower Right ( 1320171.061,  188768.294) (151d15'29.26"W, 24d29'35.05"N)
Center      (  811871.061,  653193.294) (156d54'44.66"W, 24d 6'27.45"N)
Band 1 Block=20332x128 Type=Float32, ColorInterp=Gray
  NoData Value=-9999
  Overviews: 10166x9289, 5083x4645, 2542x2323, 1271x1162, 636x581, 318x291, 159x146

And my workflow looks like this:


    # some setup
    import os, glob, rasterio
    import numpy as np
    #set path for Taudem EXE
    os.chdir('/home/UA/truc/TauDEM/')

    # Compute pit filling Taudem algorithm
    os.system('mpirun -n 32 pitremove -z /workspace/Shared/Users/truc/CTI_ref/DEM_cropped.tif -fel /fast_scratch/truc/CTI_processing_cropped/AKPCTR_DEMfel.tif')

    # Mask the glaciers
    mask_rasters('/fast_scratch/truc/CTI_processing_cropped/AKPCTR_DEMfel.tif', ['/workspace/Shared/Users/truc/CTI_ref/RGI_v5_AKPCTR_NewExtent.tif', '/workspace/Shared/Users/truc/CTI_ref/Seam3000_NewExtent.tif'],1,-9999,'/fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam.tif')

    #Run the flow direction taudem algorithm 
    os.system('mpirun -n 32 dinfflowdir -fel /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam.tif -slp /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_slp.tif -ang /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_ang.tif')

    #Reclass the slope raster from 0 to 0.0001
    slope_reclasser('/fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_slp.tif','/fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_No0slp.tif')

    #Run the contributing area taudem algorithm
    os.system('mpirun -n 32 areadinf -ang /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_ang.tif -sca /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_sca.tif')

    #Run the actual TWI calculation
    os.system('mpirun -n 32 twi -sca /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_sca.tif -slp /fast_scratch/truc/CTI_processing_cropped/AKPCTR_NoIceSeam_No0Slp.tif -twi /fast_scratch/truc/CTI_processing_cropped/CTI_NoIceSeam_1.tif')

    #Clip the CTI raster by land mask
    mask_rasters('/fast_scratch/truc/CTI_processing_cropped/CTI_NoIceSeam_1.tif', '/fast_scratch/truc/CTI_processing_cropped/AKPCTR_WS_Outline_4kmBuff.tif',1,-9999,'/fast_scratch/truc/CTI_processing_cropped/CTI_NoIceSeam.tif')

Correct specific catchment area scaling for geographic coordinates

Areadinf has areares = areares + tempdxc;
This is incorrect when the cell dx and dy are different, as for geographic coordinates.

Suggested fix: replace the value added with 1 or dx * dy based on one of the options below. Configure the input so a user can indicate the desired output units from:

  1. cell count (dimensionless)

  2. specific catchment area in length units at cell center

  3. Area in length units squared

    For 1 and 3 the values added are 1 and dx * dy respectively. For 2 use the calculation:

(A - (dx * dy)/2)/(d cos theta)

d is the length of the cell side that flow crosses and theta is the angular offset of the flow direction from the cardinal direction of that side. This centers the area calculation at the midpoint of the cell and then infers the contour width as d cos theta, which can be derived by approximating the flow as widths on a plane.
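Written out (my notation, with A the total contributing area draining to the cell), the option 2 value for the specific catchment area a is:

a = \frac{A - \frac{dx \, dy}{2}}{d \cos\theta}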

Problem in ArcGIS toolbox

I got error/warning messages while running the "MoveOutletsToStreams" and "ConnectDown" functions from the ArcGIS toolbox. However, I did not get any error message when running from the command line. After rigorous checking, I found that those functions create "data sources" (e.g. an output shapefile) even though those functions already have the data sources. I have fixed and tested those functions both in ArcGIS and on the command line.

Can't run PitRemove

Are the latest sources actually broken?

I've tried to compile everything, but I am getting errors in the tiffIO constructor. It is called using a filename, but the tiffIO constructor actually expects a directory:

flood.cpp:73
...
tiffIO dem(demfile,FLOAT_TYPE);
...

tiffIO:61
...
tiffIO::tiffIO(char *dirname, DATA_TYPE newtype){
...

Correct "int" type in D8 for large DEM

Hi,

I had issues with the application of the D8FlowDir function for a large DEM (more than 7 GB).
It seems that some flat areas are not resolved when they are located in a column with index larger than 32,767.
In fact, the "short" type is used for two variables in the function setFlow2 of the file d8.cpp.
Changing the type of the two variables "in" and "jn" to "int" solved my issues.

missing gdal111.dll and broken page for downloading

I did a complete Windows install (Version 5.3.4), and I keep getting this error message:

"The program can't start because gdal111.dll is missing from your computer. Try reinstalling the program to fix this problem."

I used the recommended Complete Installer. When I went to download the GDAL 111 individually, it took me to a broken page.

I've reinstalled with no luck. Tried using both the command prompt and the ArcGIS toolbox. Any thoughts?
image

image

ORDINAL NOT FOUND Error (PitRemove.exe)

Thank you for the good software you provide.
I have a question while using it.
TauDEM installs and operates without problems on most computers.
On some computers, however, when I execute PitRemove.exe in CMD.EXE,
an error appears: "ORDINAL NOT FOUND. THE ORDINAL [xxxx] COULD NOT BE LOCATED IN THE DYNAMIC LINK LIBRARY LIBEAY32.DLL".

The path of OS is like this:
Path= C:\Program Files\Microsoft HPC Pack 2012\Bin;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0;C:\GDAL;C:\Program Files\GDAL;C:\Program Files\TauDEM\TauDEM5Exe

When I execute "Copy C:\Program Files\GDAL\*.DLL C:\Program Files\TauDEM\TauDEM5Exe\" in CMD.EXE, this PitRemove.exe error disappears.
Could you let me know why this error occurs on some computers?

Best Regards,
YJ Won

Improper Layer name in Linux

The current code extracts an improper layer name from the shapefile, which creates a problem while reading and writing shapefiles on Linux and results in an error in the Rapid Watershed Delineation function. I have fixed this problem and tested it on both Windows and Linux.

Visual Studio: Updated Microsoft MPI SDK Install locations

Microsoft has updated its installation locations for the MPI SDK.

Suggest using $(MPI)\Include and $(MPI)\Lib to replace instances of 'C:\Program Files\Microsoft HPC Pack 2012\Inc' and 'C:\Program Files\Microsoft HPC Pack 2012\Lib\amd64'

Users can set their install locations via the MPI environment variable.
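For example, the affected .vcxproj entries might then read something like this (illustrative snippet only, not the actual project files):

<ItemDefinitionGroup>
  <ClCompile>
    <AdditionalIncludeDirectories>$(MPI)\Include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
  </ClCompile>
  <Link>
    <AdditionalLibraryDirectories>$(MPI)\Lib;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
  </Link>
</ItemDefinitionGroup>

with each user pointing the MPI environment variable at their own SDK install location.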

If you agree, I can create a pull request.

Cheers

Switch to GDAL

It would be great to use GDAL in TauDEM to read/write both rasters and vectors. This would allow supporting not only TIFF rasters but also many other formats. Also, with GDAL we could get rid of having two versions, single-file and multi-file, as GDAL provides so-called "virtual" rasters, which wrap multiple files into a single one.

Correct command line help information for Threshold function

At present the usage with specific file names says

threshold -fel <fel file> -ss <ss file> [-thresh <threshold value>] [-mask <mask file>]

It should say

threshold -ssa <ssa file> -src <src file> [-thresh <threshold value>] [-mask <mask file>]

At the same time it would be good to check the command line help information for other functions.

Could you tell me which pit filling method is used for the PitRemove algorithm in TauDEM?

A user wrote: "I just read the 2009 paper by Wallis, et al. Hydrologic Terrain Processing Using Parallel Computing that suggested the Planchon & Darboux (2001) has been implemented in TauDEM."

The Pitremove function of TauDEM actually uses a modified version of the Planchon & Darboux (PD) algorithm. Differences only affect computational efficiency and the results should be identical. This has been extensively tested. Differences are:

  1. PD was applied sequentially to the entire domain. TauDEM applies pit filling in parallel to the domain partitioned into stripes and at the end of each iteration exchanges information across the borders of stripes.
  2. PD uses eight scan directions. TauDEM limits scan directions to two.
  3. PD visits each grid cell on each iterative scan across the DEM. TauDEM uses a stack to only visit unresolved grid cells on each iterative scan. Unresolved grid cells are placed on a stack on the first scan, then removed from the first stack on each subsequent scan and placed on a second stack. Stacks are then switched. This limits the scanning to two directions rather than eight (point 2 above), but was found to result in a speedup of a factor of 2 for small datasets and 4.3 for a modestly large 1.5 GB dataset in comparison to the eight-direction full-grid scanning. The benefits of the stack thus seem to outweigh the inefficiency of fewer scan directions.
  4. The PD paper uses a recursive dry upward function to improve efficiency. This has not been attempted in the parallel TauDEM implementation because recursive methods use the system stack to expand system memory, posing a challenge for memory management in large data computations. They also pose a challenge for a domain partitioned parallel approach as cross partition calls are less predictable. They are also hard to implement on a stack as they would require additional code to track the stack position of each grid cell and to handle changing the order in which grid cells on the stack are processed.
  5. PD includes an option to avoid flats by setting a minimal small elevation difference between cells for a cell to "drain". TauDEM does not include such a parameter at present (this may be added as an option in the future), and pits are filled to the point where they are level with their pour point.

Differences between the original PD algorithm and the TauDEM implementation are further discussed in Yildirim et al. (2015). Note however that the version of TauDEM distributed from this repository does not use the Yildirim et al. virtual tile approach used for shared memory parallelism. This version uses MPI and distributed memory, which applies to both multi-core and parallel cluster computers. Code for the Yildirim virtual tile approach is on Bitbucket.
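For readers unfamiliar with the approach, here is a serial, single-partition sketch of the stack-based filling idea (an illustration only, not TauDEM's code, which is MPI-parallel, works on striped partitions and exchanges border rows between processes):

#include <vector>
#include <limits>
#include <utility>

// z: DEM elevations, overwritten in place with the pit-filled surface.
void fillPitsSketch(std::vector<std::vector<double> > &z)
{
    const int rows = (int)z.size(), cols = (int)z[0].size();
    const double BIG = std::numeric_limits<double>::max();

    // The filled surface starts at the original elevation on the border (water
    // can always drain off the edge) and "infinitely" high everywhere else.
    std::vector<std::vector<double> > fel(rows, std::vector<double>(cols, BIG));
    std::vector<std::pair<int,int> > todo, next;
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++) {
            if (i == 0 || j == 0 || i == rows - 1 || j == cols - 1)
                fel[i][j] = z[i][j];
            else
                todo.push_back(std::make_pair(i, j));   // unresolved interior cell
        }

    const int di[8] = {-1, -1, -1, 0, 0, 1, 1, 1};
    const int dj[8] = {-1, 0, 1, -1, 1, -1, 0, 1};

    // Sweep the stack of unresolved cells until a full sweep changes nothing.
    bool changed = true;
    while (changed) {
        changed = false;
        next.clear();
        for (size_t k = 0; k < todo.size(); k++) {
            int i = todo[k].first, j = todo[k].second;
            bool resolved = false;
            for (int n = 0; n < 8; n++) {
                double nfel = fel[i + di[n]][j + dj[n]];
                if (z[i][j] >= nfel) {          // can drain at its own elevation
                    fel[i][j] = z[i][j];
                    resolved = true; changed = true;
                    break;
                }
                if (nfel < fel[i][j]) {         // can at least drop to the neighbor level
                    fel[i][j] = nfel;
                    changed = true;
                }
            }
            if (!resolved) next.push_back(todo[k]);   // revisit on the next sweep
        }
        todo.swap(next);
    }
    z = fel;   // pits are now level with their pour points
}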
