
ISCE2


This is the Interferometric Synthetic Aperture Radar Scientific Computing Environment (ISCE). Its initial development was funded by NASA's Earth Science Technology Office (ESTO) under the Advanced Information Systems Technology (AIST) 2008 program, and it is currently funded under the NASA-ISRO SAR (NISAR) project.

THIS IS RESEARCH CODE PROVIDED TO YOU "AS IS" WITH NO WARRANTIES OF CORRECTNESS. USE AT YOUR OWN RISK.

This software is open source under the terms of the Apache License. Its export classification is 'EAR99 NLR', which entails some restrictions and responsibilities. Please read the accompanying LICENSE.txt and LICENSE-2.0 files.

ISCE is a framework designed for processing Interferometric Synthetic Aperture Radar (InSAR) data. Its framework aspects are designed as a general software development framework, so it may also be useful for building other types of software packages. In its InSAR aspect, ISCE supports data from many spaceborne satellites and one airborne platform, and we continue to increase the number of sensors supported. At this time the supported sensors are: ALOS, ALOS2, COSMO_SKYMED, ENVISAT, ERS, KOMPSAT5, LUTAN1, RADARSAT1, RADARSAT2, RISAT1, Sentinel1, TERRASARX, UAVSAR and SAOCOM1A.

Contents

  1. Software Dependencies
  2. Building ISCE
  3. Running ISCE
  4. Input Files
  5. Component Configurability
  6. User community Forums

1. Software Dependencies

Basic:

  • gcc >= 4.8 (with C++11 support)
  • fftw >= 3.2.2 (with single precision support)
  • Python >= 3.5 (3.6 preferred)
  • scons >= 2.0.1
  • curl - for automatic DEM downloads
  • GDAL and its Python bindings >= 2.2

Optional:

For a few sensor types:

  • hdf5 >= 1.8.5 and h5py >= 1.3.1 - for COSMO-SkyMed, Kompsat5, and 'Generic' sensor

For mdx (image visualization tool) options:

  • Motif libraries and include files
  • ImageMagick - for mdx production of kml files (advanced feature)
  • grace - for mdx production of color table and line plots (advanced feature)

For the "unwrap 2 stage" option:

RelaxIV and PULP are required. If you want to try the unwrap 2 stage option, here is how to get these packages:

  • RelaxIV (a minimum cost flow relaxation algorithm coded in C++ by Antonio Frangioni and Claudio Gentile at the University of Pisa, based on the Fortran code developed by Dimitri Bertsekas while at MIT) is available at https://github.com/frangio68/Min-Cost-Flow-Class. The RelaxIV files should be placed in the directory: 'contrib/UnwrapComp/src/RelaxIV' so that ISCE will compile it properly.

  • PULP: Use easy_install or pip to install it, or else clone it from https://github.com/coin-or/pulp. Make sure the path to the installed pulp.py is on your PYTHONPATH environment variable (this should be the case if you use easy_install or pip).
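
A quick way to confirm that pulp is importable from the Python you will use with ISCE:

> python3
>>> import pulp
>>> pulp.__file__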

For splitSpectrum and GPU modules:

  • cython3 - must have an executable named cython3 (use a symbolic link)
  • cuda - for GPUtopozero and GPUgeo2rdr
  • opencv - for split spectrum

With Anaconda

The conda requirements file is shown below:

cython
gdal
git
h5py
libgdal
pytest
numpy
fftw
scipy
basemap
scons
opencv

With the above contents in a text file named "requirements.txt", run:

> conda install --yes --file requirements.txt

Ensure that you create a symbolic link named cython3 (pointing to the cython executable) in the Anaconda bin directory.
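
If you prefer to create that link programmatically, here is a minimal sketch (it assumes the conda environment's cython executable is on your PATH; adjust as needed):

import os
import shutil

# Create a 'cython3' alias next to the conda-provided 'cython' executable.
cython = shutil.which("cython")
if cython is None:
    raise RuntimeError("cython not found on PATH")
link = os.path.join(os.path.dirname(cython), "cython3")
if not os.path.exists(link):
    os.symlink(cython, link)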

With Macports

The following ports (assuming gcc7 and python36) are needed on OSX

gcc7
openmotif
python36
fftw-3 +gcc7 
fftw-3-single +gcc7
xorg-libXt +flat_namespace
git 
hdf5 +gcc7 
h5utils
netcdf +gcc7
netcdf-cxx
netcdf-fortran
postgresql95
postgresql95-server
proj
cairo
scons
opencv +python36
ImageMagick
gdal +expat +geos +hdf5 +netcdf +postgresql95 +sqlite3
py36-numpy +gcc7 +openblas
py36-scipy +gcc7 +openblas
py36-matplotlib +cairo +tkinter
py36-matplotlib-basemap
py36-h5py
py36-gdal

Python3 Convention

We follow the convention of most package managers in using the executable 'python3' for Python 3.x and 'python' for Python 2.x. This makes it easy to turn Python code into executable commands that know which version of Python they should invoke, by naming the appropriate version at the top of the executable file (as in #!/usr/bin/env python3 or #!/usr/bin/env python). Unfortunately, not all package managers (such as MacPorts) follow this convention. If you use a package manager that does not create the 'python3' executable automatically, you should place a soft link named 'python3' on your PATH. Then you will be able to execute an ISCE application such as stripmapApp.py as "> stripmapApp.py" rather than as "> /path-to-Python3/python stripmapApp.py".
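
A quick, generic check (nothing ISCE-specific) of which interpreter the 'python3' command resolves to:

> python3
>>> import sys
>>> sys.executable
>>> sys.version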


2. Building ISCE

SCons (recommended)

Configuration control

SCons requires that configuration information be present in a directory specified by the environment variable SCONS_CONFIG_DIR. First, create a build configuration file called SConfigISCE and place it in your chosen SCONS_CONFIG_DIR. The SConfigISCE file should contain the following information; note that the #-symbol denotes a comment, and the comments do not need to be present in the SConfigISCE file:

NOTE: Locations vary from system to system, so make sure to use the appropriate location. The ones listed here are just for illustration.

# The directory in which ISCE will be built
PRJ_SCONS_BUILD = $ISCE_BUILD_ROOT/isce

# The directory into which ISCE will be installed
PRJ_SCONS_INSTALL = $ISCE_INSTALL_ROOT/isce

# The location of libraries, such as libstdc++, libfftw3 (for most systems
# it's /usr/lib and/or /usr/local/lib/ and/or /opt/local/lib)
LIBPATH = $YOUR_LIB_LOCATION_HOME/lib64 $YOUR_LIB_LOCATION_HOME/lib

# The location of Python.h. If you have multiple installations of python
# make sure that it points to the right one
CPPPATH = $YOUR_PYTHON_INSTALLATION_LOCATION/include/python3.xm $YOUR_PYTHON_INSTALLATION_LOCATION/lib/python3.x/site-packages/numpy/core/include

# The location of fftw3.h (most likely something like /usr/include,
# /usr/local/include, or /opt/local/include)
FORTRANPATH =  $YOUR_FFTW3_INSTALLATION_LOCATION/include

# The location of your Fortran compiler. If not specified it will use the system one
FORTRAN = $YOUR_COMPILER_LOCATION/bin/gfortran

# The location of your C compiler. If not specified it will use the system one
CC = $YOUR_COMPILER_LOCATION/bin/gcc

# The location of your C++ compiler. If not specified it will use the system one
CXX = $YOUR_COMPILER_LOCATION/bin/g++

#libraries needed for mdx display utility
MOTIFLIBPATH = /opt/local/lib       # path to libXm.dylib
X11LIBPATH = /opt/local/lib         # path to libXt.dylib
MOTIFINCPATH = /opt/local/include   # path to location of the Xm
                                    # directory with various include files (.h)
X11INCPATH = /opt/local/include     # path to location of the X11 directory
                                    # with various include files

#Explicitly enable cuda if needed
ENABLE_CUDA = True
CUDA_TOOLKIT_PATH = $YOUR_CUDA_INSTALLATION  #/usr/local/cuda

In the above listing of the SConfigISCE file, ISCE_BUILD_ROOT and ISCE_INSTALL_ROOT may be actual environment variables that you create, or you can replace them with the actual paths you choose to use for the build files and the install files. Also, the lower-case spelling of 'isce' in these paths matters: it is the case-sensitive package name that Python code uses when importing isce.

Install ISCE

cd isce
scons install

For a verbose install run:

scons -Q install

The scons command also allows you to explicitly specify the name of the configuration file. To use an alternative file (say SConfigISCE_NEW), which must still be located in the same SCONS_CONFIG_DIR, run

scons install --setupfile=SConfigISCE_NEW

This will build the necessary components and install them into the location specified in the configuration file as PRJ_SCONS_INSTALL.

Note about compiling ISCE after an unsuccessful build.

When building ISCE, scons will check the list of header files and libraries that ISCE requires. Scons will cache the results of this dependency checking. So, if you try to build ISCE and scons tells you that you are missing headers or libraries, then you should remove the cached files before trying to build ISCE again after installing the missing headers and libraries. The cached files are config.log, .sconfig.dblite, and the files in directory .sconf_temp. You should run the following command while in the top directory of the ISCE source (the directory containing the SConstruct file):

> rm -rf config.log .sconfig.dblite .sconf_temp .sconsign.dblite

and then try "scons install" again.

The same also applies for rebuilding with SCons after updating the code, e.g. via a git pull. If you encounter issues after such a change, it's recommended to remove the cache files and build directory and do a fresh rebuild.

CMake (experimental)

Make sure you have the following prerequisites:

  • CMake ≥ 3.13
  • GCC ≥ 4.8 (with C++11 support)
  • Python ≥ 3.5
  • Cython
  • FFTW 3
  • GDAL

git clone https://github.com/isce-framework/isce2
cd isce2
mkdir build
cd build
cmake .. -DCMAKE_INSTALL_PREFIX=/my/isce/install/location
make install

Additional cmake configuration options

CMake uses CMAKE_PREFIX_PATH as a global prefix for finding packages, which can come in handy when using e.g. Anaconda:

cmake [...] -DCMAKE_PREFIX_PATH=$CONDA_PREFIX

On macOS, cmake will also look for systemwide "frameworks", which is usually not what you want when using Conda or Macports.

cmake [...] -DCMAKE_FIND_FRAMEWORK=NEVER

For packagers, the PYTHON_MODULE_DIR option can be used to specify ISCE2's package installation location relative to the installation prefix:

cmake [...] -DPYTHON_MODULE_DIR=lib/python3.8m/site-packages

Setup Your Environment

Once everything is installed, you will need to set the following environment variables to run the programs included in ISCE ($ISCE_INSTALL_ROOT may be an environment variable you created above or else replace it with the actual path to where you installed ISCE):

export PYTHONPATH=$ISCE_INSTALL_ROOT:$PYTHONPATH

and to put the executable commands in the ISCE applications directory on your PATH for convenience,

export ISCE_HOME=$ISCE_INSTALL_ROOT/isce
export PATH=$ISCE_HOME/applications:$PATH

An optional environment variable is $ISCEDB. This variable points to a directory in which you may place xml files containing global preferences. More information on this directory and the files that you might place there is given below in the Component Configurability section. For now you can ignore this environment variable.

To test your installation and your environment, do the following:

> python3
>>> import isce
>>> isce.version.release_version

3. Running ISCE

Running ISCE from the command line

Copy the example xml files located in the example directory in the ISCE source tree to a working directory and modify them to point to your own data. Run them using the command:

> $ISCE_HOME/applications/stripmapApp.py isceInputFile.xml

or (with $ISCE_HOME/applications on your PATH) simply,

> stripmapApp.py isceInputFile.xml

The name of the input file on the command line is arbitrary. ISCE also looks for appropriately named input files in the local directory.

You can also ask ISCE for help from the command line:

> stripmapApp.py --help

This will tell you the basic command and the options for the input file. Example input files are also given in the 'examples/input_files' directory.

As explained in the Component Configurability section below, it is also possible to run stripmapApp.py without giving an input file on the command line. ISCE will automatically find configuration files for applications and components if they are named appropriately.

Running ISCE in the Python interpreter

It is also possible to run ISCE from within the Python interpreter. If you have an input file named insarInputs.xml you can do the following:

%> python3
>>> import isce
>>> from stripmapApp import Insar
>>> a = Insar(name="stripmapApp", cmdline="insarInputs.xml")
>>> a.configure()
>>> a.run()

(As explained in the Component Configurability section below, if the file insarInputs.xml were named stripmapApp.xml or insar.xml, then the 'cmdline' input on the line creating 'a' would not be necessary. The file 'stripmapApp.xml' would be loaded automatically because when 'a' is created above it is given the name 'stripmapApp'. A file named 'insar.xml' would also be loaded automatically if it exists because the code defining stripmapApp.py gives all instances of it the 'family' name 'insar'. See the Component Configurability section below for details.)

Running ISCE with steps

Another way to run ISCE is the following:

stripmapApp.py insar.xml --steps

This will run stripmapApp.py from beginning to end as is done without the --steps option, but with the added feature that the workflow state is stored in files after each step of the processing using Python's pickle module. This method of running stripmapApp.py is only a little slower and uses extra disk space to store the pickle files, but it provides some advantages for debugging and for stopping and restarting a workflow at any predetermined point in the flow.

The full options for running stripmapApp.py with steps are the following:

stripmapApp.py insar.xml [--steps] [--start=<s>] [--end=<s>] [--dostep=<s>]

where "<s>" is the name of a step. To see the full ordered list of steps the user can issue the following command:

stripmapApp.py insar.xml --steps --help

The --steps option was explained above. The --start and --end options can be used together to process a range of steps. The --dostep option is used to process a single step.

The --start and --dostep options, of course, require that the steps preceding the starting step have been run previously, because the state of the workflow at the beginning of the first step to be run must have been stored by a previous run.

An example for using steps might be to execute the end-to-end workflow with --steps to store the state of the workflow after every step as in,

stripmapApp.py insar.xml --steps

Then rerun some of the steps (perhaps you made a code modification for one of the steps and want to test it without starting from the beginning) as in

stripmapApp.py insar.xml --start=<step-name1> --end=<step-name2>

or to rerun a single step as in

stripmapApp.py insar.xml --dostep=<step-name>

Running stripmapApp.py with --steps also enables one to enter the Python interpreter after a run, load the state of the workflow at any stage, introspect the objects in the flow, and experiment with them. For example:

%> python3
>>> import isce
>>> f = open("PICKLE/formslc", "rb")
>>> import pickle
>>> a = pickle.load(f)
>>> o = a.getReferenceOrbit()
>>> t, x, p, off = o._unpackOrbit()
>>> print(t)
>>> print(x)

Someone familiar with the inner workings of ISCE can exploit this mode of interacting with the pickled objects to learn much about the workflow state and also to edit the state and observe its effect on a subsequent run with --dostep or --start.
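
For example, here is a minimal sketch of editing a stored state and writing it back (the attribute name is hypothetical; inspect the loaded object, e.g. with dir(a), to find the real ones, and whether a given edit is safe depends on the component):

%> python3
>>> import isce
>>> import pickle
>>> f = open("PICKLE/formslc", "rb")
>>> a = pickle.load(f)
>>> f.close()
>>> a.someParameter = 1        # hypothetical edit; use dir(a) to see the actual attributes
>>> f = open("PICKLE/formslc", "wb")
>>> pickle.dump(a, f)
>>> f.close()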

Notes on Digital Elevation Models

  • ISCE will automatically download SRTM Digital Elevation Models when you run an application that requires a DEM. For this to work, follow the instructions below:
  1. You will need a username and password from urs.earthdata.nasa.gov, and you need to add the LP DAAC applications to your account.

    a. If you don't already have an earthdata username and password, you can set them at https://urs.earthdata.nasa.gov/

    b. If you already have an earthdata account, please ensure that you add the LP DAAC applications to your account:
       - Log in to earthdata at https://urs.earthdata.nasa.gov/home
       - Click on "My Applications" in your profile
       - Click on "Add More Applications"
       - Search for "LP DAAC"
       - Select "LP DAAC Data Pool" and "LP DAAC OpenDAP" and approve

  2. Create a file named .netrc with the following 3 lines:

machine urs.earthdata.nasa.gov
    login your_earthdata_login_name
    password your_earthdata_password
  3. Set permissions to prevent others from viewing your credentials:
> chmod go-rwx .netrc
  • When you run applications that require a DEM, such as stripmapApp.py, if a DEM component is provided but the DEM is referenced to the EGM96 geoid (which is the case for SRTM DEMs), it will be converted to use the WGS84 ellipsoid as its reference. A new DEM file with the suffix wgs84 will be created.

  • If no DEM component is specified as an input, an SRTM DEM referenced to EGM96 will be automatically downloaded (provided you followed the preceding instructions to register at earthdata) and then converted to WGS84.

  • If you define an environment variable named DEMDB to contain the path to a directory, then ISCE applications will download the DEM (and water body mask files) into the directory indicated by DEMDB. ISCE applications will also look for DEMs in the DEMDB directory and the local processing directory before downloading a new DEM. This prevents ISCE from downloading multiple copies of a DEM when you work with data in different subdirectories that cover similar geographic locations.

4. Input Files

Input files are structured 'xml' documents. This section will briefly introduce their structure using a special case appropriate for processing ALOS data. Examples for the other sensor types can be found in the directory 'examples/input_files'.

The basic (ALOS) input file looks like this (indentation is optional):

stripmapApp.xml (Option 1)

<stripmapApp>
<component name="stripmapApp">
    <property name="sensor name">ALOS</property>
    <component name="Reference">
        <property name="IMAGEFILE">
            /a/b/c/20070215/IMG-HH-ALPSRP056480670-H1.0__A
        </property>
        <property name="LEADERFILE">
            /a/b/c/20070215/LED-ALPSRP056480670-H1.0__A
        </property>
        <property name="OUTPUT">20070215</property>
    </component>
    <component name="Secondary">
        <property name="IMAGEFILE">
            /a/b/c/20061231/IMG-HH-ALPSRP049770670-H1.0__A
        </property>
        <property name="LEADERFILE">
            /a/b/c/20061231/LED-ALPSRP049770670-H1.0__A
        </property>
        <property name="OUTPUT">20061231</property>
    </component>
</component>
</stripmapApp>

The data are enclosed between an opening tag and a closing tag. The <stripmapApp> tag is closed by the </stripmapApp> tag, for example. This outer tag is necessary, but its name has no significance; you can give it any name you like. The other tags, however, need to have the names shown above. There are 'property' and 'component' tags shown in this example.

The component tags have names that match a Component name in the ISCE code. The component tag named 'stripmapApp' refers to the configuration information for the Application (which is a Component) named "stripmapApp". Components contain properties and other components that are configurable. The property tags give the values of a single variable in the ISCE code. One of the properties defined in stripmapApp.py is the "sensor name" property. In the above example it is given the value ALOS. In order to run stripmapApp.py two images need to be specified. These are defined as components named 'Reference' and 'Secondary'. These components have properties named 'IMAGEFILE', 'LEADERFILE', and 'OUTPUT' with the values given in the above example.

NOTE: the capitalization of the property and component names is not important. You could enter 'imagefile' instead of 'IMAGEFILE', for example, and it would work correctly. Also, extra spaces in names that include spaces, such as "sensor name", do not matter.

There is a lot of flexibility provided by ISCE when constructing these input files through the use of "catalog" tags and "constant" tags.

A "catalog" tag can be used to indicate that the contents that would normally be found between an opening ad closing "component" tag are defined in another xml file. For example, the stripmapApp.xml file shown above could have been split between three files as follows:

stripmapApp.xml (Option 2)

<stripmapApp>
    <component name="insar">
        <property  name="Sensor name">ALOS</property>
        <component name="reference">
            <catalog>20070215.xml</catalog>
        </component>
        <component name="secondary">
            <catalog>20061231.xml</catalog>
        </component>
    </component>
</stripmapApp>

20070215.xml

<component name="Reference">
    <property name="IMAGEFILE">
        /a/b/c/20070215/IMG-HH-ALPSRP056480670-H1.0__A
    </property>
    <property name="LEADERFILE">
        /a/b/c/20070215/LED-ALPSRP056480670-H1.0__A
    </property>
    <property name="OUTPUT">20070215 </property>
</component>

20061231.xml

<component name="Secondary">
    <property name="IMAGEFILE">
        /a/b/c/20061231/IMG-HH-ALPSRP049770670-H1.0__A
    </property>
    <property name="LEADERFILE">
        /a/b/c/20061231/LED-ALPSRP049770670-H1.0__A
    </property>
    <property name="OUTPUT">20061231</property>
</component>

rtcApp.xml

The inputs are Sentinel GRD zipfiles

<rtcApp>
    <constant name="dir">/Users/data/sentinel1 </constant>
    <component name="rtcApp">
        <property name="sensor name">sentinel1</property>
        <property name="posting">100</property>
        <property name="polarizations">[VV, VH]</property>
        <property name="epsg id">32618</property>
        <property name="geocode spacing">100</property>
        <property name="geocode interpolation method">bilinear</property>
        <property name="apply thermal noise correction">True</property>
        <component name="reference">
        <property name="safe">$dir$/rtcApp/data/S1A_IW_GRDH_1SDV_20181221T225104_20181221T225129_025130_02C664_B46C.zip</property>
        <property name="orbit directory">$dir$/orbits</property>
        <property name="output directory">$dir$/rtcApp/output</property>
        </component>
    </component>
</rtcApp>

5. Component Configurability

In the examples above for running stripmapApp.py (from the command line and from within the Python interpreter), the input data were entered by giving the name of an 'xml' file on the command line. The ISCE framework parses that 'xml' file to assign values to the configurable variables in the ISCE Application stripmapApp.py. The Application executes several steps in its workflow. Each of those steps is handled by a Component that is also configurable from input data. Each component may be configured independently from user input using appropriately named and placed xml files. This section explains how to name these xml files and where to place them.

Component Names: Family and Instance

Each configurable component has two "names" associated with it. These names are used in locating possible configuration xml files for those components. The first name associated with a configurable component is its "family" name. For stripmapApp.py, the family name is "insar". Inside the stripmapApp.py file an Application is created from a base class named Insar. That base class defines the family name "insar" that is given to every instance created from it. The particular instance that is created in the file stripmapApp.py is given the 'instance name' 'stripmapApp'. If you look in the file near the bottom you will see the line,

insar = Insar(name="stripmapApp")

This line creates an instance of the class Insar (that is given the family name 'insar' elsewhere in the file) and gives it the instance name "stripmapApp".

Other applications could be created that make several different instances of Insar. Each instance would have the family name "insar" and would be given a unique instance name. This is possible for every component. In the above example xml files, instances named "Reference" and "Secondary" of a family named "alos" are created.

Component Configuration Files: Locations, Names, Priorities

The ISCE framework looks for xml configuration files when configuring every Component in its flow in 3 different places with different priorities. The configuration sequence loads configuration parameters found in these xml files from lowest to highest priority, overwriting any parameters already defined as it moves up the priority sequence. This layered approach has a couple of advantages. It allows the user to define common parameters for all instances in one file while defining specific instance parameters in files named for those specific instances. It also allows global preferences to be set in a special directory that will apply unless the user overrides them with a higher priority xml file.

The priority sequence has two layers. The first layer is the location of the xml file, and the second is the name of the file. Within each of the 3 location priorities indicated below, the filename priority goes from 'family name' to 'instance name'. That is, within a given location priority level, a file named after the 'family name' is loaded first, and then a file with the 'instance name' is loaded next, overwriting any property values read from the 'family name' file.

The priority sequence for location is as follows:

(1) The highest priority location is on the command line. On the command line the filename can be anything you choose. Configuration parameters can also be entered directly on the command line as in the following example:

> stripmapApp.py insar.reference.output=reference_c.raw

This example indicates that the variable named 'output' of the Component named 'reference', belonging to the Component (or Application) named 'insar', will be given the value "reference_c.raw".

The priority sequence on the command line goes from lowest priority on the left to highest priority on the right. So, if we use the command line,

> stripmapApp.py myInputFile.xml insar.reference.output=reference_c.raw

where the myInputFile.xml file also gives a value for the insar reference output file as reference_d.raw, then the one defined on the right will win, i.e., reference_c.raw.

(2) The next priority location is the local directory in which stripmapApp.py is executed. Any xml file placed in this directory named according to either the family name or the instance name for any configurable component in ISCE will be read while configuring the component.

(3) If you define an environment variable named ISCEDB pointing to a directory, you can place xml files with family names or instance names in that directory, and they will be read when configuring Configurable Components. The files placed in the ISCEDB directory have the lowest priority when configuring properties of the Components. They can be used to define global settings that will apply unless the xml files in the local directory or the command line override those preferences.

Component Configuration Structure

A component configuration file uses the same structure of 'component' and 'property' tags as the input files described above, but the component tag has to have the family name of the Component/Application. In the above examples you see that the outermost component tag has the name "insar", which is the family name of the class Insar, of which stripmapApp is an instance.

Component Configuration Help

At this time there is limited information about component configurability through the command

> stripmapApp.py --help

Future deliveries will improve this situation. In the meantime we describe here how to discover from the code which Components and parameters are configurable. One note of caution is that it is possible for a parameter to appear to be configurable from user input when the particular flow will not allow this degree of freedom. Experience and evolving documentation will be of use in determining these cases.

Here is how to find out whether a component is configurable, what its configurable parameters are, what "name" to use in the xml file, and what name to give the xml file.

Let's take as an example Ampcor.py, which is in components/mroipac/ampcor.

Open it in an editor and search for the string "class Ampcor". It is on line 263. You will see that it inherits from Component. This is the minimum requirement for it to be a configurable component.

Now look above that line and you will see several variable names being set equal to a call to Component.Parameter. These declarations define these variables as configurable parameters. They are entered in the "parameter_list" starting on line 268. That is the method by which these Parameters are made configurable parameters of the Ampcor component.

Each of the parameters defines the "public_name", which is the "name" that you would enter in the xml file. For instance if you want to set the gross offset in range, which is defined starting on line 88 in the variable ACROSS_GROSS_OFFSET, then you would use an xml tag like the following (assuming you have determined that the gross offset in range is about 150 pixels):

<property name="ACROSS_GROSS_OFFSET">150</property>

Now, to determine what to call the xml file and what "name" to use in the component tag: a configurable component has a "family" name and an instance "name". It is registered as having these names by calling the Component.__init__ constructor, which is done on line 806. On that line you will see that the call to __init__ passes 'family=self.__class__.family' and 'name=name' to the Component constructor (super class of Ampcor). The family name is given as "ampcor" on line 265. The instance name is passed as the value of 'name=name' from whatever program created it. Ampcor is created in components/isceobj/StripmapProc/runRefineSecondaryTiming.py, where it is given the name 'reference_offset1' on line 35. If you are setting a parameter that should be the same for all uses of Ampcor, then you can use the family name 'ampcor' for the name of the xml file, as 'ampcor.xml'. It is more likely that you will want to use the instance name, 'reference_offset1.xml'. Use the family name 'ampcor' for the component tag 'name'.

Example for SLC matching use of Ampcor:

Filename: reference_offset1.xml:

<dummy>
<component name="ampcor">
    <property name="ACROSS_GROSS_OFFSET">150</property>
</component>
</dummy>

6. User community forums

Read helpful information and participate in discussion with the user/developer community on GitHub Discussions:

https://github.com/isce-framework/isce2/discussions


isce2's Issues

consider removing large docs/ folder to separate repo

First of all, thanks so much for open-sourcing ISCE! It's excellent software and it's great that more people now have access to it. Hopefully the move to github facilitates more community contributions.

The tutorial material under the docs/ directory is fantastic, but the file sizes are large and it would be great to separate it from the source code (in particular, it would speed up cloning and building packages). Currently, cloning the repo locally results in a 287 MB checkout because the folder is also tracked in the hidden .git/ folder.

An easy solution is to move the docs to another repository (e.g. isce-framework/isce2-docs). I've confirmed that you can remove these files and commit history with the following commands:

cp -r docs/ ../isce2-docs
git filter-branch --tree-filter 'rm -rf docs' --prune-empty HEAD
git for-each-ref --format="%(refname)" refs/original/ | xargs -n 1 git update-ref -d
echo docs/ >> .gitignore
git add .gitignore
git commit -m 'Removing docs/ from git history'
rm -Rf .git/logs .git/refs/original
git gc --prune=all --aggressive
git push origin master --force
git pull
rm -Rf .git/logs .git/refs/original
git gc --prune=all --aggressive

The repo is then reduced to 29 MB. Note that with the above solution, people who have already forked this repository will have to re-fork to stay in sync, since the commit history changes.

Issue in ROIPAC Ampcor

I have been running ROIPAC Ampcor with a square search window (equal width and height) on a pair of ALOS PALSAR scenes over the Amery Ice Shelf, Antarctica (46-day separation). The output offset fields in range and azimuth appear to be crowded with -10000 values, especially on the ice shelf (see below). In particular, ROIPAC Ampcor prints the message "BAD Match level 1" to the screen.

I played around with the Fortran code, and it seems that setting r_snrth to zero and r_covth to 100000 (impossible values) in Ampcor.F resolves the issue. I am not a Fortran expert, but it seems that ROIPAC Ampcor performs internal filtering of the offsets based on default thresholds for the SNR (0.001) and covariance (1000), which cannot be set by the user in stripmapApp.xml.

The issue does not recur when running ROIPAC Ampcor with rectangular window sizes.

Other Ampcor implementations that do not perform any internal offset thresholding (e.g. the Ampcor in isceobj/util/denseoffsets) seem not to have this issue when forced to run with the same set of input parameters as ROIPAC (see below).

[Figures: ROIPAC offset fields and comparison with other Ampcor implementations]

stripmapStack incompatible with GDAL 3.1.2?

I recently built ISCE2 (with a git pull this week from the main branch) using an Anaconda3 environment. I note that it has GDAL version 3.1.2.

I set up a stripmapStack run and started the run_1_reference script generated by the stackStripMap.py command. It seems to run the topo processing to generate all the full-resolution geometry files in merged/geom_reference, but then the multilooking step fails. I wonder if this new version of GDAL is not compatible. Folks found an issue in ARIA-tools where versions of GDAL greater than 3.0.4 cause errors (aria-tools/ARIA-tools#232).

...
GDAL close: /u/sar-r0/fielding/Calif/SF_Bay/UAVSAR/Stacks/Haywrd_15302_03/s1/merged/geom_reference/hgt.rdr.vrt
API close:  /u/sar-r0/fielding/Calif/SF_Bay/UAVSAR/Stacks/Haywrd_15302_03/s1/merged/geom_reference/simamp.rdr
Warning 1: Geotransform matrix has non rotational terms
ERROR 4: /u/sar-r0/fielding/Calif/SF_Bay/UAVSAR/Stacks/Haywrd_15302_03/s1/geom_reference/hgt.rdr.vrt: No such file or directory
C pointer already created. Finalize and recreate if image dimensions changed.
C pointer already created. Finalize and recreate if image dimensions changed.
--------------------------------------------------
generate multilooked geometry files with alks=16 and rlks=4 using gdal.Translate() ...
multilook /u/sar-r0/fielding/Calif/SF_Bay/UAVSAR/Stacks/Haywrd_15302_03/s1/merged/geom_reference/hgt.rdr
.vrt
/u/sar-r0/fielding/Calif/SF_Bay/UAVSAR/Stacks/Haywrd_15302_03/s1/geom_reference/hgt.rdr
Traceback (most recent call last):
  File "/u/pez0/fielding/tools/ISCE2_test/isce2/contrib/stack/stripmapStack/stripmapWrapper.py", line 155, in <module>
    main(args.start,args.end)
  File "/u/pez0/fielding/tools/ISCE2_test/isce2/contrib/stack/stripmapStack/stripmapWrapper.py", line 146, in main
    cfgParser.runCmd()
  File "/u/pez0/fielding/tools/ISCE2_test/isce2/contrib/stack/stripmapStack/stripmapWrapper.py", line 51, in runCmd
    func_modules.main(self.funcParams[section])
  File "/u/pez0/fielding/tools/ISCE2_test/isce2/contrib/stack/stripmapStack/topo.py", line 525, in main
    runMultilook(in_dir=info.outdir, out_dir=out_dir, alks=inps.alks, rlks=inps.rlks)
  File "/u/pez0/fielding/tools/ISCE2_test/isce2/contrib/stack/stripmapStack/topo.py", line 423, in runMultilook
    gdal2isce_xml(out_file+'.vrt')
  File "/u/pez0/fielding/tools/ISCE2_test/install/isce/applications/gdal2isce_xml.py", line 69, in gdal2isce_xml
    width = ds.RasterXSize
AttributeError: 'NoneType' object has no attribute 'RasterXSize'
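
For what it's worth, the final failure above is just gdal.Open() returning None for a .vrt path that does not exist. A small standalone check like the following (a sketch; the path is illustrative, adapt it to the file reported in the log) can confirm whether GDAL itself can open the file outside of ISCE:

from osgeo import gdal

vrt = "merged/geom_reference/hgt.rdr.vrt"   # illustrative path; use the one from the log
ds = gdal.Open(vrt, gdal.GA_ReadOnly)
if ds is None:
    # This is the condition that surfaces as the AttributeError in gdal2isce_xml.py
    raise RuntimeError("GDAL could not open " + vrt)
print(ds.RasterXSize, ds.RasterYSize)
ds = None  # release the dataset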

runDispersive segmentation fault

I am running stripmapApp.py on a pair of UAVSAR NISAR sample products over the San Andreas Valley, CA, USA.
The processing chain runs smoothly until the ionospheric phase estimation step, where it fails trying to apply the low-pass Gaussian filter.
In particular, I am getting a "Segmentation Fault" when trying to execute cv2.filter2D. I have checked the input data and the kernel, and they look OK considering the pair of SLCs being processed. I suspect the issue is related to opencv.

I am using the latest ISCE2 conda installation and opencv 3.4.7
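
As a way to narrow this down, here is a standalone sketch (not the ISCE code path, just an equivalent low-pass Gaussian filtering call on synthetic data) that can show whether cv2.filter2D itself crashes in a given environment:

import numpy as np
import cv2

# Synthetic single-band float32 image and a separable Gaussian kernel,
# roughly mimicking the low-pass filtering step.
data = np.random.rand(512, 512).astype(np.float32)
g = cv2.getGaussianKernel(ksize=101, sigma=25)
kernel = (g @ g.T).astype(np.float32)

filtered = cv2.filter2D(data, -1, kernel)
print(filtered.shape, filtered.dtype)

If this minimal call also segfaults, the problem is most likely in the opencv build rather than in ISCE.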

Thanks a lot for your help!

Feature: Change detection in Polarimetric SAR Images #25

Hello,

@h-sdl and I have implemented a small pipeline for change detection in polarimetric SAR images in Python, based on [1].

In short, we can detect changes between two SAR images and classify the changes into subclasses:

Example 1
Example 2

The best would be to know whether such a feature would be interesting to include in PyRAT.
We are not very familiar with this library, so this would need discussion.

See our repository of the implementation: https://gitlab.com/jjerphan/rsd-project

See this original proposal of the method:
[1] D. Pirrone, "Advanced Methods for Change Detection in Multi-polarization and Very-High Resolution Multitemporal SAR Images," PhD thesis, International Doctorate School in Information and Communication Technologies, University of Trento, January 2019.

See the slides of the presentation of this method we made (in French): https://cloud.mines-paristech.fr/index.php/s/aXhZ2o5BM8fIBR0

topsApp geocoding fails for files without "merged/" in the file names in the .xml file

I am trying to geocode a new file with topsApp.py that I created by applying a mask to the unwrapped phase file. I added it to the "geocode list" property in my topsApp.xml file, and the run sees that the list is different from the original InsarProc list:

2019-09-30 18:34:29,713 - isce.insar - WARNING - Some filenames in insarApp.geocode_list configuration are different from those in InsarProc. Using names given to insarApp.
insarApp.geocode_list = ['merged/filt_topophase.unw', 'merged/filt_topophase_msk.unw', 'merged/filt_topophase.unw.conncomp']
InsarProc.geocode_list = ['merged/phsig.cor', 'merged/topophase.cor', 'merged/filt_topophase.unw', 'merged/los.rdr', 'merged/topophase.flat', 'merged/filt_topophase.flat', 'merged/filt_topophase_2stage.unw']

It properly geocodes the 'merged/filt_topophase.unw' file that was on the InsarProc list, but then when it gets to the new file, it looks in the wrong place, ignoring the 'merged/' part of the name and can't find the file.

GDAL close: /u/pez-z2/fielding/Calif/Ridgecrest/topo/NASADEM/demLat_n33_n37_Lon_w120_w114.dem.wgs84.vrt
GDAL close: merged/filt_topophase.unw.vrt
API close:  merged/dem.crop
API close:  merged/filt_topophase.unw.geo
API open (R): merged/filt_topophase.unw.geo
API close:  merged/filt_topophase.unw.geo
Writing geotrans to VRT for merged/filt_topophase.unw.geo
Output:  filt_topophase_msk.unw.geo
API open (WR): merged/dem.crop
API open (WR): filt_topophase_msk.unw.geo
GDAL open (R): filt_topophase_msk.unw.vrt
ERROR 4: No such file or directory
Error. Cannot open the file filt_topophase_msk.unw.vrt in read mode.
Error in file /u/pez0/fielding/tools/ISCE2_test/build/isce/components/iscesys/ImageApi/InterleavedAccessor/src/GDALAccessor.cpp at line 77 Exiting

Do I need to move the new file that I created to the main directory instead of putting it in the 'merged' directory?

please update SConstruct and contrib/mdx/src/SConscript

hi,

As was noticed already 2 years ago, there are some issues with 'checking for libraries/headers' when using Anaconda's GDAL within ISCE's SCons configuration.

I have finally, once again, found the answer when installing ISCE on two servers; the instructions found at http://earthdef.caltech.edu/boards/4/topics/1245 helped:

in file SConstruct:

Change:

env.PrependUnique(LIBS=['gdal'])

To:

#Resolve circular dependencies for gdal provided by Anaconda-Python...
env.PrependUnique(LIBS=['gdal', 'kea', 'hdf5', 'hdf5_hl', 'hdf5_cpp'])
# Include the following search path when linking against the conda environment, e.g. isce3.
env.AppendUnique(LINKFLAGS=['-Wl,-rpath,CONDA_ENV_LIB_DIRECTORY'])
# Include system libraries
env.AppendUnique(LINKFLAGS=['-Wl,-rpath,/usr/lib64'])

Note: Replace "CONDA_ENV_LIB_DIRECTORY" with the correct library path
(but maybe this one is not necessary?)

In file ./contrib/mdx/src/SConscript:
Change:
for i in range(envmdx['LIBS'].count('hdf5')): envmdx['LIBS'].remove('hdf5')

To:

for i in range(envmdx['LIBS'].count('hdf5')): envmdx['LIBS'].remove('hdf5')
# Remove additional libraries required by Anaconda-Python
for i in range(envmdx['LIBS'].count('kea')): envmdx['LIBS'].remove('kea')
for i in range(envmdx['LIBS'].count('hdf5_cpp')): envmdx['LIBS'].remove('hdf5_cpp')
for i in range(envmdx['LIBS'].count('hdf5_hl')): envmdx['LIBS'].remove('hdf5_hl')

fixImageXml.py conflict

@yunjunz and @CunrenLiang

There is a potential conflict introduced by the two latest PRs in the default settings for fixImageXml.py.

The default value for using fullpath has always been "False", and that has been changed to "True". @yunjunz - do you see issues with flipping this back to the original behavior?

Build Error: Arguments of iand have different kind type parameters

Error from scons install when building ALOS_pre_process/readOrbitPulse.f

[ isce2-master]# /usr/bin/gfortran -o /opt/isce/components/isceobj/Sensor/src/ALOS_pre_process/readOrbitPulse.o -c -O2 -Wall -ffixed-line-length-none -fno-second-underscore -fPIC -fno-range-check -m64 -I/usr/include -I/opt/isce_build/mods -J/opt/isce_build/mods /opt/isce/components/isceobj/Sensor/src/ALOS_pre_process/readOrbitPulse.f
f951: Warning: Nonconforming tab character in column 1 of line 109 [-Wtabs]
/opt/isce/components/isceobj/Sensor/src/ALOS_pre_process/readOrbitPulse.f:110:41:

110 | $ iand(indata(38),255)*256+iand(indata(37),255)
| 1
Error: Arguments of 'iand' have different kind type parameters at (1)
/opt/isce/components/isceobj/Sensor/src/ALOS_pre_process/readOrbitPulse.f:112:41:

112 | $ iand(indata(42),255)*256+iand(indata(41),255)
| 1
Error: Arguments of 'iand' have different kind type parameters at (1)
/opt/isce/components/isceobj/Sensor/src/ALOS_pre_process/readOrbitPulse.f:114:41:

114 | $ iand(indata(46),255)*256+iand(indata(45),255)
| 1
Error: Arguments of 'iand' have different kind type parameters at (1)

[ isce2-master]# /usr/bin/gfortran --version
GNU Fortran (GCC) 9.2.0

Burst inconsistency when unpacking scenes across dates of stack

Hi ISCE developers,

Recently, we processed a coregistered SLC stack over cities in Europe and found that, at times, the bursts processed across dates in a stack may be inconsistent. This seems to be an edge case that occurs specifically when we have a mix of S1A and S1B acquisitions in the stack.

This is an example of a stack we created over Bucharest:
Start Date: 20200503
End Date: 20200527
Master Date: 20200527

stackSentinel.py command:

stackSentinel.py -s data -d ./dem/demLat_N44_N45_Lon_E025_E027.dem.wgs84 -a .AuxDir/ -o ./orbits -b '44.3112 44.5513 25.9251 26.2722' -W slc -m 20200527 -C geometry -n '2'

SLCs used in ./data folder:

S1A_IW_SLC__1SDV_20200503T160840_20200503T160907_032403_03C07F_86EB.zip
S1A_IW_SLC__1SDV_20200503T160905_20200503T160932_032403_03C07F_AA8C.zip
S1B_IW_SLC__1SDV_20200509T160757_20200509T160824_021507_028D58_29EF.zip
S1B_IW_SLC__1SDV_20200509T160821_20200509T160848_021507_028D58_FEDB.zip
S1A_IW_SLC__1SDV_20200515T160841_20200515T160908_032578_03C5FB_8471.zip
S1A_IW_SLC__1SDV_20200515T160905_20200515T160932_032578_03C5FB_6A9E.zip
S1B_IW_SLC__1SDV_20200521T160757_20200521T160824_021682_029289_17E0.zip
S1B_IW_SLC__1SDV_20200521T160822_20200521T160849_021682_029289_4958.zip
S1A_IW_SLC__1SDV_20200527T160841_20200527T160908_032753_03CB48_F2D5.zip
S1A_IW_SLC__1SDV_20200527T160906_20200527T160933_032753_03CB48_A2F1.zip

In the ./slaves folder:

|-- 20200503
|   |-- IW2
|   |   |-- burst_01.slc
|   |   |-- burst_01.slc.vrt
|   |   |-- burst_01.slc.xml
|   |   |-- burst_02.slc
|   |   |-- burst_02.slc.vrt
|   |   |-- burst_02.slc.xml
|   |   |-- burst_03.slc
|   |   |-- burst_03.slc.vrt
|   |   |-- burst_03.slc.xml
|   |   |-- burst_04.slc
|   |   |-- burst_04.slc.vrt
|   |   `-- burst_04.slc.xml
|   `-- IW2.xml
|-- 20200509
|   |-- IW2
|   |   |-- burst_01.slc
|   |   |-- burst_01.slc.vrt
|   |   |-- burst_01.slc.xml
|   |   |-- burst_02.slc
|   |   |-- burst_02.slc.vrt
|   |   |-- burst_02.slc.xml
|   |   |-- burst_03.slc
|   |   |-- burst_03.slc.vrt
|   |   `-- burst_03.slc.xml
|   `-- IW2.xml
|-- 20200515
|   |-- IW2
|   |   |-- burst_01.slc
|   |   |-- burst_01.slc.vrt
|   |   |-- burst_01.slc.xml
|   |   |-- burst_02.slc
|   |   |-- burst_02.slc.vrt
|   |   |-- burst_02.slc.xml
|   |   |-- burst_03.slc
|   |   |-- burst_03.slc.vrt
|   |   |-- burst_03.slc.xml
|   |   |-- burst_04.slc
|   |   |-- burst_04.slc.vrt
|   |   `-- burst_04.slc.xml
|   `-- IW2.xml

In the ./master folder:

master
|-- IW2
|   |-- burst_01.slc
|   |-- burst_01.slc.vrt
|   |-- burst_01.slc.xml
|   |-- burst_02.slc
|   |-- burst_02.slc.vrt
|   |-- burst_02.slc.xml
|   |-- burst_03.slc
|   |-- burst_03.slc.vrt
|   |-- burst_03.slc.xml
|   |-- burst_04.slc
|   |-- burst_04.slc.vrt
|   `-- burst_04.slc.xml
`-- IW2.xml

As you can see, the number of bursts extracted over IW2 alternates between 3 and 4, depending on whether the acquisition is S1A (4 bursts) or S1B (3 bursts). This inconsistency causes an issue when we start pairing our dates for calculations (e.g. coherence):
[Screenshot: example of the pairing issue]

A deeper look shows that this burst inconsistency occurs because ISCE does an approximate computation of the burst boundary (in Sentinel1.py and BurstSLC.py) to decide whether a burst should be processed. This approximated boundary is not precise (it has a large buffer) and is inconsistent between S1A and S1B:

[Screenshot: approximated burst boundaries and the bbox; the colored boxes are described below]

  • Magenta box
    • approximated burst boundaries computed by BurstSLC.py for 20200527's burst 4 (B4) captured by S1A during unpacking.
  • Cyan box
    • approximated burst boundaries computed by BurstSLC.py for 20200509's burst 4 (B4) captured by S1B during unpacking.
  • White box
    • bbox defined in stackSentinel.py.
  • Lower right figure
    • The magenta box for B4 in 20200527 overlaps the white box, but the cyan box for 20200509 does not. Hence B4 is included for unpacking for 20200527 (master) but not 20200509.
  • Of the 70 cities we processed around the world, we see this in 7 cities (5/7 in Europe).
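
Purely as an illustration (this is not ISCE's actual burst-selection code), the decision described above reduces to a rectangle-overlap test between the approximated burst footprint and the user bbox, so a small difference in the approximated boundary between S1A and S1B is enough to flip the result:

def overlaps(box1, box2):
    """Axis-aligned overlap test for boxes given as (south, north, west, east)."""
    s1, n1, w1, e1 = box1
    s2, n2, w2, e2 = box2
    return (s1 <= n2) and (s2 <= n1) and (w1 <= e2) and (w2 <= e1)

user_bbox = (44.3112, 44.5513, 25.9251, 26.2722)    # the bbox from the stackSentinel.py call above

# Hypothetical approximated footprints for the same burst (numbers are made up):
burst_s1a = (44.50, 44.70, 25.80, 26.30)    # overlaps the bbox -> burst is kept
burst_s1b = (44.56, 44.76, 25.80, 26.30)    # slightly shifted -> no overlap, burst is dropped

print(overlaps(burst_s1a, user_bbox))   # True
print(overlaps(burst_s1b, user_bbox))   # False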

We see this effect on our fully geocoded paired products if there is:

  1. Mix of S1A and S1B acquisitions in stack
  2. Any of the slave dates has fewer bursts than the master date

Despite this, the ISCE stack proc. algorithm to filter bursts ensures that the bursts extracted cover 100% of the bbox defined in stackSentinel.py. So, if we just limit our geocoding region to only cover the area defined by the bbox of stackSentinel.py, the area with this issue will be cropped out.

Some users, however, would like to make use of all the data that was processed by the stack processor, hence I am raising this issue here for reference. Hopefully it can be addressed when unpacking the data at run_1 and run_2.

Attached is the KMZ for the above example of burst boundaries (from plotBursts.py), approximated burst boundaries, and bbox defined.

bucharest.kmz.zip

dem.py ERROR can not create the .vrt file

(isce) rsgis@rsgis10:/mnt/sar/quanzhou/dem$ dem.py -a stitch -b 24 25 117 119 -s 1 -m rsc -r -c -n xxx -w xxx -u https://e4ftl01.cr.usgs.gov/MEASURES/SRTMGL1.003/2000.02.11/

This is the Open Source version of ISCE. Some of the workflows depend on a separate licensed package. To obtain the licensed package, please make a request for ISCE through the website: https://download.jpl.nasa.gov/ops/request/index.cfm. Alternatively, if you are a member, or can become a member of WinSAR you may be able to obtain access to a version of the licensed sofware at https://winsar.unavco.org/software/isce

API open (R): ./demLat_N24_N25_Lon_E117_E119.dem
API close: ./demLat_N24_N25_Lon_E117_E119.dem
GDAL open (R): ./demLat_N24_N25_Lon_E117_E119.dem.vrt
ERROR 4: ./demLat_N24_N25_Lon_E117_E119.dem.vrt: No such file or directory
Error. Cannot open the file ./demLat_N24_N25_Lon_E117_E119.dem.vrt in read mode.
Error in file build/isce/components/iscesys/ImageApi/InterleavedAccessor/src/GDALAccessor.cpp at line 77 Exiting

How can I fix this?

Version 2.4.0 clean build fails in autorift

I removed my entire isce2 directory and cloned the new version 2.4.0 to make sure I had a clean copy. I also removed my build and install directories. I then tried to build with scons build and it failed in the autorift code:

Install file: "contrib/geo_autoRIFT/autoRIFT/autoRIFT_ISCE.py" as "/Users/fielding/tools/ISCE2_latest/install/isce/components/contrib/geo_autoRIFT/autoRIFT/autoRIFT_ISCE.py"
Install file: "contrib/geo_autoRIFT/autoRIFT/include/autoriftcoremodule.h" as "/Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/include/autoriftcoremodule.h"
/opt/local/bin/g++ -o /Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/bindings/autoriftcoremodule.os -c -O2 -Wall -fPIC -m64 -fPIC -DNEEDS_F77_TRANSLATION -DF77EXTERNS_LOWERCASE_TRAILINGBAR -I/opt/local/Library/Frameworks/Python.framework/Versions/3.7/include/python3.7m -I/opt/local/include -I/opt/local/include/openmpi-mp -I/Users/fielding/tools/ISCE2_latest/build/components/iscesys/ImageApi/include -I/Users/fielding/tools/ISCE2_latest/build/components/iscesys/ImageApi/DataCaster/include -I/Users/fielding/tools/ISCE2_latest/build/components/isceobj/LineAccessor/include -I/Users/fielding/tools/ISCE2_latest/build/components/iscesys/StdOE/include -I/Users/fielding/tools/ISCE2_latest/build/components/isceobj/Util/include -I/Users/fielding/tools/ISCE2_latest/build/components/isceobj/Util/Library/include -I/Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/include /Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/bindings/autoriftcoremodule.cpp
/Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/bindings/autoriftcoremodule.cpp:42:10: fatal error: numpy/arrayobject.h: No such file or directory
   42 | #include "numpy/arrayobject.h"
      |          ^~~~~~~~~~~~~~~~~~~~~
compilation terminated.
scons: *** [/Users/fielding/tools/ISCE2_latest/build/components/contrib/geo_autoRIFT/autoRIFT/bindings/autoriftcoremodule.os] Error 1
scons: done reading SConscript files.
scons: Building targets ...
scons: *** Do not know how to make File target `build' (/Users/fielding/tools/ISCE2_latest/isce2/build).  Stop.
scons: building terminated because of errors.

This is on MacOS with NumPy installed by MacPorts py37-numpy @1.19.1_0+gfortran+openblas (active).

installing issue

Hello all! I'm trying to install isce2 following the cmake method and came across an error at 95%. Does anyone know what might be wrong? I have Ubuntu 20.04 and Anaconda. Thanks!

[Screenshot of the build error]

resolve linker warnings for libuuid version

Proposed Improvement

Use @piyushrpt's method for resolving linker issues with anaconda vs. system libuuid (https://github.com/piyushrpt/oldLinuxSetup/blob/master/issues.md):

  1. Find path to libuuid used by libXm and libSM:
    > ldd /lib64/libSM.so | grep uuid
    > ls -ltr /lib64/libuuid.so.1
    
  2. Link libuuid from anaconda to this library
    > cd /home/agram/anaconda3/lib
    > unlink libuuid.so     (should be a link to libuuid.so.1.0.0)
    > unlink libuuid.so.1   (should be a link to libuuid.so.1.0.0)
    > ln -s /lib64/libuuid.so.1.3.0 libuuid.so
    > ln -s /lib64/libuuid.so.1.3.0 libuuid.so.1
    

Current Implementation

Existence of additional linker flags in SConfigISCE:

# additional linker flags
LINKFLAGS = -luuid

error: no match for 'operator+='

Possibly also due to a newer compiler.

In file included from /opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:14:
/usr/local/include/c++/9.2.0/complex: In instantiation of 'std::complex& std::complex::operator+=(const std::complex<_Tp>&) [with _Tp = char]':
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:253:25: required from 'int takeLookscpx(DataAccessor*, DataAccessor*, int, int) [with T = char]'
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:107:49: required from here
/usr/local/include/c++/9.2.0/complex:1327:13: error: no match for 'operator+=' (operand types are 'std::complex::_ComplexT' {aka 'complex double'} and 'std::complex')
1327 | _M_value += __z.__rep();
/usr/local/include/c++/9.2.0/complex: In instantiation of 'std::complex& std::complex::operator+=(const std::complex<_Tp>&) [with _Tp = short int]':
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:253:25: required from 'int takeLookscpx(DataAccessor*, DataAccessor*, int, int) [with T = short int]'
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:111:51: required from here
/usr/local/include/c++/9.2.0/complex:1327:13: error: no match for 'operator+=' (operand types are 'std::complex::_ComplexT' {aka 'complex double'} and 'std::complex')
/usr/local/include/c++/9.2.0/complex: In instantiation of 'std::complex& std::complex::operator+=(const std::complex<_Tp>&) [with _Tp = int]':
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:253:25: required from 'int takeLookscpx(DataAccessor*, DataAccessor*, int, int) [with T = int]'
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:115:49: required from here
/usr/local/include/c++/9.2.0/complex:1327:13: error: no match for 'operator+=' (operand types are 'std::complex::_ComplexT' {aka 'complex double'} and 'std::complex')
/usr/local/include/c++/9.2.0/complex: In instantiation of 'std::complex& std::complex::operator+=(const std::complex<_Tp>&) [with _Tp = long int]':
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:253:25: required from 'int takeLookscpx(DataAccessor*, DataAccessor*, int, int) [with T = long int]'
/opt/isce_build/components/mroipac/looks/bindings/looksmodule.cpp:119:50: required from here
/usr/local/include/c++/9.2.0/complex:1327:13: error: no match for 'operator+=' (operand types are 'std::complex::_ComplexT' {aka 'complex double'} and 'std::complex')
scons: *** [/opt/isce_build/components/mroipac/looks/bindings/looksmodule.os] Error 1
scons: done reading SConscript files.

Baseline grid bug - TOPS

I have documented a bug whereby there is an abrupt sign-change when generating perpendicular baseline grids. I have generated these products using the "baselineGrid.py" script packaged through stackSentinel, which is essentially a wrapper of this ISCE script "components/zerodop/baseline/Baseline.py". I have attached "baselineGrid.py" as a reference here within a zip file.

The master scene used was "S1B_IW_SLC__1SDV_20181210T000734_20181210T000752_013972_019ECE_FB9D.zip", and the slave scenes used were "S1B_IW_SLC__1SDV_20171203T000703_20171203T000730_008547_00F2A8_F803.zip, S1B_IW_SLC__1SDV_20171203T000727_20171203T000745_008547_00F2A8_DCE6.zip"

Within this zip file the extracted baseline grids may also be found, but I will illustrate the problem below:
[Screenshot: perpendicular baseline grid showing the abrupt sign change]

I made some print statements to track changes in the variables involved in the estimate of the direction (i.e. "direction = np.sign(np.dot( np.cross(targxyz-mxyz, sxyz-mxyz), mvel))"), and here is an example of two sets of variables corresponding to adjacent pixels along range for which there was a sign flip:

"
############################################################################################
direction -1.0
targxyz [ 13843.73542219 5020808.95559147 3920247.75736679]
mxyz [-431237.83432596 5613389.08465543 4277645.57748178]
sxyz [-431234.7701128 5613379.85014268 4277646.91694975]
mvel [ 1404.18442248 4599.24232307 -5877.04480614]
Bperp[jj] -6.0630693
############################################################################################
direction 1.0
targxyz [ 26673.98226566 5019284.63081567 3922120.34709414]
mxyz [-431237.83432596 5613389.08465543 4277645.57748178]
sxyz [-431234.80521498 5613379.73516128 4277647.06387625]
mvel [ 1404.18442248 4599.24232307 -5877.04480614]
Bperp[jj] 6.2536173
"

After looking carefully across this sign flip, my impression is that the field should be smoothly varying; i.e., if I wrote out the absolute value of all pixels, there would no longer be an abrupt jump. Thus, I believe there could be an issue in the baseline equation, but I haven't been able to pinpoint a bug.
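For context, here is a minimal sketch of the sign computation instrumented above, evaluated on the values from the first printout (an illustration of the quoted formula, not the actual Baseline.py code):

import numpy as np

# Values copied from the first printout above (where direction came out as -1.0).
targxyz = np.array([13843.73542219, 5020808.95559147, 3920247.75736679])
mxyz = np.array([-431237.83432596, 5613389.08465543, 4277645.57748178])
sxyz = np.array([-431234.7701128, 5613379.85014268, 4277646.91694975])
mvel = np.array([1404.18442248, 4599.24232307, -5877.04480614])

# Sign of the perpendicular baseline, exactly as in the quoted expression.
direction = np.sign(np.dot(np.cross(targxyz - mxyz, sxyz - mxyz), mvel))
print(direction)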
[Screenshot: baseline grid illustrating the same field after the sign flip]

Please let me know if any additional information or clarification is needed to help diagnose the problem.
bPerp_bug_track121.zip

mdx from cmake pipeline results in segfault with conda compilers

I was looking into replacing scons with cmake for the conda recipes. Note that there were no compilation failures.

  1. scons and cmake are using different sources. CMakeLists.txt should be updated to use mdx_subs.F instead of mdx.F. Even with that change, there are some issues.
  2. The cmake-generated mdx appears to segfault in calls to "mdxsub"
    • All the files are compiled straight into an executable here
    • Tried replacing the conda compilers with system compilers, but the problem persisted
    • I suspect the culprit is one of these three:
      - there were always bugs in mdx and conda's optimization flags cause the error to surface
      - the sequence of compilation/linking somehow matters in this case
      - additional args might be needed when linking in one go
  3. The scons-generated mdx seems to work
    • Files are compiled into a static lib first and then linked into an executable

I tried for a fair bit to track down the issue by matching compiler and linker flags, but that didn't make a difference. Somehow, compiling everything straight into the executable rather than generating a static lib first seems to produce this behavior. Attaching the logs here in case something stands out immediately that I missed.
scons.log
cmake.log

Multiprocessing and Python 3.8

On macOS, Python 3.8 changed the default multiprocessing start method from "fork" to "spawn" for safety reasons. The different ampcor worker processes don't get the complete context to generate image accessors correctly when using denseampcor/denseoffsets. This can be overcome by adding

mp.set_start_method("fork")

before using any multiprocessing features. This should only be done for macOS + Python 3.8.
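A minimal sketch of how such a guard might look (the platform/version check and its placement are illustrative assumptions, not the exact change made in ISCE):

import multiprocessing as mp
import platform
import sys

# Only force "fork" on macOS with Python >= 3.8, where the default start method
# changed to "spawn"; elsewhere the default already behaves as before.
if platform.system() == 'Darwin' and sys.version_info >= (3, 8):
    mp.set_start_method('fork', force=True)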

not well-formed (invalid token): line 3, column 13

Completed Parsing the Configuration file
Functions to be executed:
['Function-1']
Running: Sentinel1_TOPS
['--dirname', '/Volumes/DATA1/JBT-A/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.zip', '--swaths', '1', '--orbitdir', '/Volumes/MSWAP/S1_orbit', '--outdir', '/Volumes/DATA/jbta/isce/secondarys/20190420', '--auxdir', '/Volumes/DATA/jbta/aux', '--bbox', '11.47 11.72 42.30 42.60', '--pol', 'vv']
Input XML files: ['/vsizip//Volumes/DATA1/JBT-A/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.zip/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.SAFE/annotation/s1b-iw1-slc-vv-20190420t151751-20190420t151816-015892-01dda3-004.xml']
Input TIFF files: ['/vsizip//Volumes/DATA1/JBT-A/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.zip/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.SAFE/measurement/s1b-iw1-slc-vv-20190420t151751-20190420t151816-015892-01dda3-004.tiff']
Manifest files: ['/vsizip//Volumes/DATA1/JBT-A/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.zip/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.SAFE/manifest.safe']
MANS: /Volumes/DATA1/JBT-A/S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.zip S1B_IW_SLC__1SDV_20190420T151749_20190420T151817_015892_01DDA3_A970.SAFE/manifest.safe
Setting IPF version to : 002.91
not well-formed (invalid token): line 3, column 13

ISCE UPDATE

How do I update my ISCE installation on Ubuntu? Do I need to rebuild all the files to install it?

no phase info

Hello,
I use isce2 to generate an interferogram, but the result seems to have no phase information (checked with "mdx.py filt_topophase.flat").
Would you mind helping me see where it goes wrong?
I have used several pairs of Sentinel-1 scenes, and they all give the same result.
The attachments are the xml file, the phase image, and the mag image.
I want to know whether I have not installed it correctly or something else is wrong.


rtcApp.py on isce/isce2 dockers fails with importerror since tag 20200726

I hope this is also the place to file issues with the docker images provided on docker hub (https://hub.docker.com/r/isce/isce2/tags); if not, please let me know.

The problem is that the image with tag 20200719 is the last one for which rtcApp.py works for me. All the later ones fail with an ImportError, caused by missing .so files:

Traceback (most recent call last):
  File "/opt/isce2/isce/applications/rtcApp.py", line 229, in <module>
    class GRDSAR(Application):
  File "/opt/isce2/isce/applications/rtcApp.py", line 423, in GRDSAR
    @use_api
  File "/opt/isce2/isce/components/isceobj/Util/decorators.py", line 282, in use_api
    from iscesys.ImageApi.DataAccessorPy import DataAccessor
  File "/opt/isce2/isce/components/iscesys/ImageApi/DataAccessorPy.py", line 31, in <module>
    from iscesys.ImageApi import DataAccessor as DA
ImportError: libhdf5.so.101: cannot open shared object file: No such file or directory

or (on 2.4.0):

Traceback (most recent call last):
  File "/opt/isce2/isce/applications/rtcApp.py", line 229, in <module>
    class GRDSAR(Application):
  File "/opt/isce2/isce/applications/rtcApp.py", line 423, in GRDSAR
    @use_api
  File "/opt/isce2/isce/components/isceobj/Util/decorators.py", line 282, in use_api
    from iscesys.ImageApi.DataAccessorPy import DataAccessor
  File "/opt/isce2/isce/components/iscesys/ImageApi/DataAccessorPy.py", line 31, in <module>
    from iscesys.ImageApi import DataAccessor as DA
ImportError: libgdal.so.20: cannot open shared object file: No such file or directory

useGPU in topsApp

I'm trying to see where I enable the useGPU functionality. I've modified the SConfigISCE file to set ENABLE_CUDA=True (and I saw it building via nvcc) and I've modified the topsApp.xml as below to add the useGPU flag. Is there something else that I am missing?

<topsApp>
  <component name="topsinsar">
    <property name="useGPU">True</property>
    <property name='sensorname'>SENTINEL1</property>
    <component name='master'>
      <property name='safe'>['S1B_IW_SLC__1SDV_20190126T141243_20190126T141315_014666_01B566_ACAF.zip']</property>
      <property name='output directory'>masterdir</property>
    </component>
    <component name='slave'>
      <property name='safe'>['S1B_IW_SLC__1SDV_20190207T141243_20190207T141314_014841_01BB23_EBE6.zip']</property>
      <property name='output directory'>slavedir</property>
    </component>
  </component>
</topsApp>

run_3_geo2rdr_coarseResamp error (stackStripMap.py)

Hi,
Does anyone know a solution for this issue?
The SLC data are present as 20170630.slc, 20170630.slc.vrt, 20170630.slc.xml and data.db in each directory.
But after starting the 3rd stage, I have only the data.db file in the /data/insar/Proc/YLPG/proc/coregSLC/Coarse/20170630/referenceShelve and secondaryShelve folders.
I checked the permissions and they are OK.

Running: resampleSlc
['--reference', '/data/insar/Proc/YL/SLC/20170629', '--secondary', '/data/insar/Proc/YL/SLC/20170630', '--coreg', '/data/insar/Proc/YL/proc/coregSLC/Coarse/20170630', '--offsets', '/data/insar/Proc/YL/proc/offsets/20170630']
cp /data/insar/Proc/YL/SLC/20170630/data* /data/insar/Proc/YL/proc/coregSLC/Coarse/20170630/secondaryShelve
Traceback (most recent call last):
  File "/opt/isce2/isce/contrib/stack/stripmapStack/stripmapWrapper.py", line 155, in <module>
    main(args.start,args.end)
  File "/opt/isce2/isce/contrib/stack/stripmapStack/stripmapWrapper.py", line 146, in main
    cfgParser.runCmd()
  File "/opt/isce2/isce/contrib/stack/stripmapStack/stripmapWrapper.py", line 51, in runCmd
    func_modules.main(self.funcParams[section])
  File "/opt/isce2/isce/contrib/stack/stripmapStack/resampleSlc.py", line 219, in main
    reference = reference)
  File "/opt/isce2/isce/components/isceobj/Util/decorators.py", line 290, in use_api_decorator
    ret = func(*args,**kwargs)
  File "/opt/isce2/isce/contrib/stack/stripmapStack/resampleSlc.py", line 106, in resampSecondary
    inimg.load(burst.getImage().filename + '.xml')
  File "/opt/isce2/isce/components/iscesys/Component/Configurable.py", line 1407, in load
    tmpProp, tmpFact, tmpMisc = FP.parse(filename)
  File "/opt/isce2/isce/components/iscesys/Parsers/XmlParser.py", line 41, in parse
    root = ET.parse(filename)
  File "/usr/lib/python3.6/xml/etree/ElementTree.py", line 1196, in parse
    tree.parse(source, parser)
  File "/usr/lib/python3.6/xml/etree/ElementTree.py", line 586, in parse
    source = open(source, "rb")
FileNotFoundError: [Errno 2] No such file or directory: './20170630/20170630.slc.xml'

unify ISCE installation location in docker image to mirror conda package installation

Use cases:

  1. User wants to run the latest official release of ISCE
    • conda
      • install conda (on bare-metal or within a VM or docker container)
      • run conda install -c conda-forge isce2
    • docker
      • docker pull isce/isce2:latest
  2. User wants to run the develop branch (all approved PR's) of ISCE
    • conda
      • install conda (on bare-metal or within a VM or docker container)
      • checkout develop branch from github and build manually
    • docker
      • docker pull isce/isce2:develop

In all cases, user-developed code and scripts should not have to distinguish the import locations of ISCE, nor have if-else case statements to set the proper environment variables.

To do:

  • create a develop branch in this repo and set that as the default branch (similar to ISCE3)
  • update current CircleCI workflows for PR processing and nightly builds to use this develop branch
    • this will tag the deployed dockerhub images as isce/isce2:develop and isce/isce2:YYYYMMDD, respectively
  • update Dockerfile to build and install ISCE under the conda installation directory as if it was installed via conda install -c conda-forge isce2
  • create new workflow that triggers off the creation of a new release tag from the master branch
    • this will tag the deployed dockerhub images as isce/isce2:<github_tag> (e.g. isce/isce2:2.3.1) and isce/isce2:latest

dem.py generate .WGS84 file ERROR

(isce) rsgis@rsgis10:/mnt/sar/quanzhou/dem$ dem.py -a stitch -b 24 25 117 119 -s 1 -m xml -r -c -n xxx -w xxx -u https://e4ftl01.cr.usgs.gov/MEASURES/SRTMGL1.003/2000.02.11/

This is the Open Source version of ISCE. Some of the workflows depend on a separate licensed package. To obtain the licensed package, please make a request for ISCE through the website: https://download.jpl.nasa.gov/ops/request/index.cfm. Alternatively, if you are a member, or can become a member of WinSAR you may be able to obtain access to a version of the licensed sofware at https://winsar.unavco.org/software/isce

API open (R): ./demLat_N24_N25_Lon_E117_E119.dem
API close: ./demLat_N24_N25_Lon_E117_E119.dem
Writing geotrans to VRT for ./demLat_N24_N25_Lon_E117_E119.dem
GDAL open (R): ./demLat_N24_N25_Lon_E117_E119.dem.vrt
API open (WR): demLat_N24_N25_Lon_E117_E119.dem.wgs84
At line 114 of file build/isce/components/contrib/demUtils/correct_geoid_i2_srtm/src/correct_geoid_i2_srtm.f
Fortran runtime error: End of record

mdx compiled with -fopenmp segfaults

Followup from #155. Compiling mdx with -fopenmp (e.g. via export FFLAGS=-fopenmp before running cmake) appears successful, but produces an executable that segfaults immediately when attempting to view any image.

topsStack bug: occasionally relativeOrbitNumber to track number conversion wrong

Occasionally, in topsStack, the relativeOrbitNumber is not properly converted to a track number in the /master/IW1.xml file (in about 10-20% of cases). Here is an example for descending track 87 covering Kilauea volcano.

Here is the *.SAFE file for 2018-05-05 showing the correct relativeOrbitNumber of 87:

grep OrbitNumber S1B_IW_SLC__1SDV_20180505T161525_20180505T161552_010788_013B7D_25D2.SAFE/*
S1B_IW_SLC__1SDV_20180505T161525_20180505T161552_010788_013B7D_25D2.SAFE/manifest.safe:            <safe:relativeOrbitNumber type="start">87</safe:relativeOrbitNumber>
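
For reference, a sketch of the conventional relative-orbit (track) calculation from the absolute orbit number in the granule name; the offsets 73 (S1A) / 27 (S1B) and the 175-orbit repeat cycle are the commonly quoted values and are an assumption here, not taken from the ISCE source:

def relative_orbit(absolute_orbit, platform):
    # Conventional Sentinel-1 formula: subtract a per-platform offset and wrap
    # around the 175-orbit repeat cycle.
    offset = 73 if platform == 'S1A' else 27
    return (absolute_orbit - offset) % 175 + 1

# The granule above (S1B, absolute orbit 010788) gives the expected track 87.
print(relative_orbit(10788, 'S1B'))  # 87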

However, the master/IW1.xml file shows a track number of 41:

/scratch/05861/tg851601/KilaueaSenDT87[1104] head -1422 master/IW1.xml | tail -20
                <property name="sensingstop">
                    <value>2018-05-05 16:15:32.858289</value>
                    <doc>UTC time corresponding to last line of burst SLC</doc>
                </property>
                <property name="startingrange">
                    <value>799103.829224447</value>
                    <doc>Slant range to first pixel in m</doc>
                </property>
                <property name="swathnumber">
                    <value>1</value>
                    <doc>Swath number for bookkeeping</doc>
                </property>
                <property name="terrainheight">
                    <value>0.0</value>
                    <doc>Average terrain height used for focusing</doc>
                </property>
                <property name="tracknumber">
                    <value>41</value>
                    <doc>Track number for bookkeeping</doc>
                </property>

Weirdly, some slave granules don't show the correct track number, whereas others do (the first one below shows 41 and the second correctly shows 87):

//login4/scratch/05861/tg851601/KilaueaSenDT87[1056] head -1422  slaves/20180517/IW1.xml | tail -20
                <property name="sensingstop">
                    <value>2018-05-17 16:15:33.369091</value>
                    <doc>UTC time corresponding to last line of burst SLC</doc>
                </property>
                <property name="startingrange">
                    <value>799103.829224447</value>
                    <doc>Slant range to first pixel in m</doc>
                </property>
                <property name="swathnumber">
                    <value>1</value>
                    <doc>Swath number for bookkeeping</doc>
                </property>
                <property name="terrainheight">
                    <value>0.0</value>
                    <doc>Average terrain height used for focusing</doc>
                </property>
                <property name="tracknumber">
                    <value>41</value>
                    <doc>Track number for bookkeeping</doc>
                </property>
//login4/scratch/05861/tg851601/KilaueaSenDT87[1057] head -1422  slaves/20190331/IW1.xml | tail -20
                <property name="sensingstop">
                    <value>2019-03-31 16:16:19.877707</value>
                    <doc>UTC time corresponding to last line of burst SLC</doc>
                </property>
                <property name="startingrange">
                    <value>799103.9365244998</value>
                    <doc>Slant range to first pixel in m</doc>
                </property>
                <property name="swathnumber">
                    <value>1</value>
                    <doc>Swath number for bookkeeping</doc>
                </property>
                <property name="terrainheight">
                    <value>0.0</value>
                    <doc>Average terrain height used for focusing</doc>
                </property>
                <property name="tracknumber">
                    <value>87</value>
                    <doc>Track number for bookkeeping</doc>
                </property>

If you want to download example data using ssara:

ssara_federated_query-cj.py --platform=SENTINEL-1A,SENTINEL-1B --relativeOrbit=87 --intersectsWith='Polygon((-155.80 19.10, -155.80 19.90, -154.50 19.90, -154.50 19.10, -155.80 19.10))' -s=2018-05-05 --parallel=50 --print --download

The topsStack workflow is initiated using:

stackSentinel.py --slc_directory /scratch/05861/tg851601/KilaueaSenDT87/SLC --orbit_directory /work/05861/tg851601/stampede2/insarlab/S1orbits --aux_directory /work/05861/tg851601/stampede2/insarlab/S1aux --working_directory /scratch/05861/tg851601/KilaueaSenDT87 --dem DEM/dem.dem.wgs84 --num_connections 5 --num_overlap_connections 3 --swath_num '1 2' --bbox '19.1 19.9 -155.8 -154.5' --azimuth_looks 7 --range_looks 19 --filter_strength 0.2 --esd_coherence_threshold 0.85 --snr_misreg_threshold 10 --unw_method snaphu --polarization vv --coregistration NESD --workflow interferogram

prepareUAVSAR_coregStack does not handle multiple polarizations

A few UAVSAR lines have been processed with all four polarizations (HH, HV, VH, VV), including eelriv_06508_03. The preparation script does not detect that more than one polarization is present and simply overwrites the output ".slc" files with one of the polarizations (I think the VV).

Best workaround is to only download one polarization before running the script.
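
A sketch of the kind of polarization check the preparation script could perform before writing the ".slc" outputs (the file-name matching below assumes the UAVSAR convention of embedding HH/HV/VH/VV in the product name and is illustrative only):

import glob
import re

# Group the downloaded UAVSAR products by polarization so that multiple
# polarizations are detected instead of silently overwriting one another.
pols = {}
for path in glob.glob('eelriv_06508_03*'):
    match = re.search(r'(HH|HV|VH|VV)', path)
    if match:
        pols.setdefault(match.group(1), []).append(path)

if len(pols) > 1:
    print('Multiple polarizations found:', sorted(pols))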

/.../orbit2sch.abi3.so: undefined symbol: orbit2sch_

/home/ops/ariamh/interferogram/sentinel/get_union_bbox.sh
encounters "undefined symbol: orbit2sch_". Any ideas?

Traceback (most recent call last):
  File "/home/ops/ariamh/interferogram/sentinel/create_standard_product_s1.py", line 1860, in <module>
    status = main()
  File "/home/ops/ariamh/interferogram/sentinel/create_standard_product_s1.py", line 1046, in main
    match_pol), shell=True)
  File "/opt/conda/lib/python3.7/subprocess.py", line 363, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '/home/ops/ariamh/interferogram/sentinel/get_union_bbox.sh -o bbox.json *.SAFE/annotation/s1?-iw?-slc-vv-*.xml' returned non-zero exit status 1.

----- errors|exception found in log -----

ImportError: /opt/isce2/isce/components/stdproc/orbit/orbit2sch.abi3.so: undefined symbol: orbit2sch_
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '/home/ops/ariamh/interferogram/sentinel/get_union_bbox.sh -o bbox.json *.SAFE/annotation/s1?-iw?-slc-vv-*.xml' returned non-zero exit status 1.

scons issue with python 2 and 3

Hi ISCE team,

First off, thanks for developing this neat package. I look forward to using it.

When I go to install isce2 using scons this is what I get:

Will-Kochtitzkys-Mac:isce2 willkochtitzky$ scons install
scons: Reading SConscript files ...
Building with scons from python2
Checking for C header file Python.h... yes
Checking for C header file fftw3.h... yes
Checking for C header file hdf5.h... yes
Checking for C header file X11/Xlib.h... yes
Checking for C header file Xm/Xm.h... yes
Checking for C header file omp.h... yes
Checking for C library hdf5... yes
Checking for C library fftw3f... yes
Checking for C library Xm... yes
Checking for C library Xt... yes
Checking for F include fftw3 ... yes
GDAL version: 3.0.3
 
Checking for C++ header file gdal_priv.h... yes
Checking for C library gdal... yes
Scons appears to find everything needed for installation
Checking whether cython3 program exists.../Users/willkochtitzky/miniconda3/bin/cython3
User did not request CUDA support. Add ENABLE_CUDA = True to SConfigISCE to enable CUDA support
TypeError: makedirs() got an unexpected keyword argument 'exist_ok':
  File "/Users/willkochtitzky/bin/ISCE_SRC/isce2/SConstruct", line 219:
    os.makedirs(inst, exist_ok=True)

It seems like I am having a scons versioning issue with python 2 and 3. I have attached my config.log file as well.
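
For context, os.makedirs() only gained the exist_ok keyword in Python 3.2, so the call in SConstruct fails whenever scons is run by Python 2. A version-agnostic sketch of the same behavior (an illustration only, not the actual SConstruct code):

import errno
import os

def makedirs_compat(path):
    # Equivalent of os.makedirs(path, exist_ok=True) that also works on Python 2,
    # where the exist_ok keyword does not exist.
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise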

I don't understand what is happening here but when I enter: /usr/bin/env python3 $(which scons)

This is what I get:

(base) Will-Kochtitzkys-Mac:isce2 willkochtitzky$ /usr/bin/env python3 $(which scons)
scons: Reading SConscript files ...
Building with scons from python3
Checking for C header file Python.h... yes
Checking for C header file fftw3.h... yes
Checking for C header file hdf5.h... yes
Checking for C header file X11/Xlib.h... yes
Checking for C header file Xm/Xm.h... yes
Checking for C header file omp.h... yes
Checking for C library hdf5... yes
Checking for C library fftw3f... yes
Checking for C library Xm... yes
Checking for C library Xt... yes
Checking for F include fftw3 ... yes
GDAL version: 3.0.3

Checking for C++ header file gdal_priv.h... yes
Checking for C library gdal... yes
Scons appears to find everything needed for installation
Checking whether cython3 program exists.../Users/willkochtitzky/miniconda3/bin/cython3
User did not request CUDA support. Add ENABLE_CUDA = True to SConfigISCE to enable CUDA support
Building with scons from python2
Checking for F include fftw3 ... yes
GDAL version: 3.0.3

Scons appears to find everything needed for installation
User did not request CUDA support. Add ENABLE_CUDA = True to SConfigISCE to enable CUDA support
TypeError: makedirs() got an unexpected keyword argument 'exist_ok':
  File "/Users/willkochtitzky/bin/ISCE_SRC/isce2/SConstruct", line 219:
    os.makedirs(inst, exist_ok=True)
scons: done reading SConscript files.
scons: Building targets ...
scons: `.' is up to date.
scons: done building targets.

This seems like an improvement in that I am initially running in Python 3, but then I get kicked back to Python 2.

I have no clue how to fix this, but would be very appreciative of any advice!

Thanks!
Will Kochtitzky
University of Ottawa

config.log

Baseline bug - stripmap workflows

An earlier issue ticket (#137) addressed baseline grid errors in Sentinel-1 TOPS workflows. A similar fix should be carried over to the stripmap and potentially ScanSAR workflows. Below is the suggested solution by @piyushrpt:

Looks like the scenes are from different datasets than the ones above, so I am not able to reproduce the plots above.

The equations currently don't account for the along-track shift. Bpar and Bperp are 2D concepts, while the imaging geometry is in 3D. There are two solutions:

Solution 1 (preferred)

This keeps the implementation general and should work in both zero-doppler and native-doppler systems (if doppler is provided as an additional parameter to rdr2geo and geo2rdr in the future).

Replace sxyz with

mvelunit = mvel / np.linalg.norm(mvel)
sxyz = sxyz - np.dot(sxyz - mxyz, mvelunit) * mvelunit

This ensures the baseline has no along-track component, reducing it to the 2D diagrams we see in papers.

Solution 2

This only works for zero doppler geometry since the doppler curve reduces to a plane.

You can just use a single slave orbit position for the entire range line, found by locating the closest point between the two orbits:

mllh = refElp.xyz_to_llh(mxyz)
stime, srng = sOrb.geo2rdr(mllh)
ssv = sOrb.interpolate(stime, method='hermite')
sxyz = np.array(ssv.GetPosition())

This way all three points (mxyz, sxyz, and the target) lie on the zero-doppler plane, reducing the problem to a 2D representation. You can reuse the same sxyz for all slant ranges.

A quick check with the annotation files that you sent shows that both agree to sub-mm level for zero doppler. If you can verify that the fix works with the troublesome data, that would be great.

Originally posted by @piyushrpt in #137 (comment)

topsStack processing metadata changed?

I have a stack of Sentinel-1 data that I was processing with the topsStack workflow a few months ago, and I tried to update it today with some new data. In the meantime, I had also updated my version of ISCE2 to v2.3.3 and, I think, applied some additional updates last month.

I am getting an error at the baseline step:

Running: computeBaseline
['--baseline_file', '/u/sar-r2/fielding/Puerto_Rico/S1AB/A135/seismic/baselines/20191223_20200316/20191223_20200316.txt', '--master', '/u/sar-r2/fielding/Puerto_Rico/S1AB/A135/seismic/master/', '--slave', '/u/sar-r2/fielding/Puerto_Rico/S1AB/A135/seismic/slaves/20200316']
2020-05-06 19:50:54,870 - isce.burstslc - ERROR - Error. The attribute corresponding to the key "azimuthprocessingbandwidth" is not present in the object "<class 'isceobj.Sensor.TOPS.BurstSLC.BurstSLC'>".
Possible causes are the definition in the xml file of such attribute that is no longer defined
in the object "<class 'isceobj.Sensor.TOPS.BurstSLC.BurstSLC'>" or a spelling error
Completed Parsing the Configuration file
Functions to be executed:
['Function-1']

Has the metadata for the TOPS processing changed in the last few months?

Do I have to discard the whole stack and restart processing from the beginning?

Codec error on some Sentinel SAFE zip files?

Hi,

I am very new to exploring Sentinel-1A data using topsApp.py. Thanks for sharing this great tool.

On some data files I have an encoding issue shortly after starting topsApp. It happens for all swaths in both files.

Could not extract swath 3 from ['/media/storage/sentinel-1a/S1A_IW_SLC__1SDV_20201124T191556_20201124T191623_035394_0422CA_4005.zip']
Generated error:  'utf-8' codec can't decode byte 0x9a in position 14: invalid start byte
2020-12-24 09:23:29,066 - isce.topsinsar.runPreprocessor - INFO - 

I am running Ubuntu 20.04 in Australia. I explored adjusting the LC_ALL environment variable regarding locale settings but couldn't get it to work.
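
One way to narrow this down might be a small script that reports which member of the SAFE zip contains bytes that are not valid UTF-8 (a diagnostic sketch only; which files to scan is an assumption):

import zipfile

# Scan the XML/manifest members of a SAFE zip for bytes that fail UTF-8 decoding.
SAFE_ZIP = 'S1A_IW_SLC__1SDV_20201124T191556_20201124T191623_035394_0422CA_4005.zip'

with zipfile.ZipFile(SAFE_ZIP) as z:
    for name in z.namelist():
        if name.endswith(('.xml', '.safe')):
            try:
                z.read(name).decode('utf-8')
            except UnicodeDecodeError as err:
                print(name, err)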

I cannot open these files from relative orbit 147

S1A_IW_SLC__1SDV_20200901T191556_20200901T191623_034169_03F81C_6DF7.zip
S1A_IW_SLC__1SDV_20201124T191556_20201124T191623_035394_0422CA_4005.zip
S1A_IW_SLC__1SDV_20201218T191556_20201218T191623_035744_042ED2_5F7A.zip

I also tried unzipping the files. They open fine in the ESA SNAP software.

Interestingly, I can open and process files from relative orbit 9, so I don't think the problem aligns with a particular satellite orbit track:

S1A_IW_SLC__1SDV_20200401T083922_20200401T083950_031931_03AFD3_801F.zip
S1A_IW_SLC__1SDV_20200904T083930_20200904T083958_034206_03F960_D105.zip
S1A_IW_SLC__1SDV_20201127T083930_20201127T083958_035431_04241A_E226.zip
S1A_IW_SLC__1SDV_20201209T083930_20201209T083958_035606_042A28_11C3.zip

Any advice is welcome.

Thanks.

rtcApp.py is incomplete

@MIRSL - looks like the rtcApp.py PR is incomplete. The geocode step seems to be missing and the example input file seems to have errors in it.

Could you follow up and complete the missing steps since you already have it working?

Docker fail to build image

Hi, I'm trying to build an image using the file docker/Dockerfile, and it fails with the following error:

+ sudo /opt/conda/bin/conda install --yes gdal h5py libgdal pytest numpy fftw scipy hdf4 hdf5 netcdf4
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
                                                                                                                                          
Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versions

Package _libgcc_mutex conflicts for:
fftw -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
numpy -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
hdf5 -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
python=3.8 -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
gdal -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
netcdf4 -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
h5py -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
libgdal -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]
hdf4 -> libgcc-ng[version='>=7.2.0'] -> _libgcc_mutex=[build=main]
scipy -> libgcc-ng[version='>=7.3.0'] -> _libgcc_mutex=[build=main]

Package libgcc-ng conflicts for:
numpy -> libopenblas[version='>=0.3.2,<0.3.3.0a0'] -> libgcc-ng[version='>=8.2.0']
python=3.8 -> libgcc-ng[version='>=7.3.0']
hdf4 -> libgcc-ng[version='>=7.2.0']
hdf4 -> zlib[version='>=1.2.11,<1.3.0a0'] -> libgcc-ng[version='>=7.3.0']
fftw -> libgcc-ng[version='>=7.2.0|>=7.3.0']
libgdal -> libgcc-ng[version='>=7.2.0|>=7.3.0']
netcdf4 -> libgcc-ng[version='>=7.2.0|>=7.3.0']
pytest -> python[version='>=3.6,<3.7.0a0'] -> libgcc-ng[version='>=7.2.0|>=7.3.0']
numpy -> libgcc-ng[version='>=7.2.0|>=7.3.0']
h5py -> libgcc-ng[version='>=7.2.0|>=7.3.0']
python=3.8 -> zlib[version='>=1.2.11,<1.3.0a0'] -> libgcc-ng[version='>=7.2.0']
scipy -> libopenblas[version='>=0.3.2,<0.3.3.0a0'] -> libgcc-ng[version='>=8.2.0']
hdf5 -> libgcc-ng[version='>=7.2.0|>=7.3.0']
scipy -> libgcc-ng[version='>=7.2.0|>=7.3.0']
gdal -> libgcc-ng[version='>=7.2.0|>=7.3.0']

Package hdf4 conflicts for:
gdal -> libgdal==3.0.2=h27ab9cc_0 -> hdf4[version='>=4.2.13,<4.2.14.0a0']
libgdal -> hdf4[version='>=4.2.13,<4.2.14.0a0']
netcdf4 -> libnetcdf[version='>=4.7.3,<5.0a0'] -> hdf4[version='>=4.2.13,<4.2.14.0a0']
hdf4

Package numpy conflicts for:
numpy
scipy -> numpy[version='>=1.11.3,<2.0a0|>=1.14.6,<2.0a0|>=1.15.1,<2.0a0|>=1.9.3,<2.0a0']
h5py -> numpy[version='>=1.11.3,<2.0a0|>=1.14.6,<2.0a0']
gdal -> numpy[version='>=1.11.3,<2.0a0|>=1.9.3,<2.0a0']
netcdf4 -> numpy[version='>=1.11.3,<2.0a0|>=1.14.6,<2.0a0|>=1.9.3,<2.0a0|>=1.14.0,<2.0a0']

Package libgdal conflicts for:
gdal -> libgdal[version='2.2.4|2.2.4|2.3.0|2.3.2|2.3.2|2.3.3|3.0.2|>=2.2.2,<2.3.0a0',build='hc8d23f9_1|heea9cce_1|hda2fc8e_0|h27ab9cc_0|h2e7e64b_0|h9d4a965_0|heea9cce_1']
libgdal

Package libedit conflicts for:
python=3.8 -> sqlite[version='>=3.32.3,<4.0a0'] -> libedit[version='>=3.1.20181209,<3.2.0a0|>=3.1.20191231,<3.2.0a0']
libgdal -> sqlite[version='>=3.30.1,<4.0a0'] -> libedit[version='>=3.1.20170329,<3.2.0a0|>=3.1.20181209,<3.2.0a0|>=3.1.20191231,<3.2.0a0']

Package libgfortran-ng conflicts for:
scipy -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
netcdf4 -> hdf5[version='>=1.10.4,<1.10.5.0a0'] -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
scipy -> libopenblas[version='>=0.3.2,<0.3.3.0a0'] -> libgfortran-ng[version='>=8,<9.0a0']
libgdal -> cfitsio[version='>=3.470,<3.471.0a0'] -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
h5py -> hdf5[version='>=1.10.6,<1.10.7.0a0'] -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
numpy -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
numpy -> libopenblas[version='>=0.3.2,<0.3.3.0a0'] -> libgfortran-ng[version='>=8,<9.0a0']
gdal -> numpy[version='>=1.11.3,<2.0a0'] -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']
hdf5 -> libgfortran-ng[version='>=7,<8.0a0|>=7.2.0,<8.0a0']

Package hdf5 conflicts for:
hdf5
libgdal -> libnetcdf[version='>=4.6.1,<4.7.0a0'] -> hdf5[version='>=1.8.20,<1.9.0a0']
gdal -> libgdal==3.0.2=h27ab9cc_0 -> hdf5[version='>=1.10.1,<1.10.2.0a0|>=1.10.2,<1.10.3.0a0|>=1.10.4,<1.10.5.0a0|>=1.8.18,<1.8.19.0a0']
libgdal -> hdf5[version='>=1.10.1,<1.10.2.0a0|>=1.10.2,<1.10.3.0a0|>=1.10.4,<1.10.5.0a0|>=1.8.18,<1.8.19.0a0']
netcdf4 -> hdf5[version='>=1.10.1,<1.10.2.0a0|>=1.10.2,<1.10.3.0a0|>=1.10.4,<1.10.5.0a0|>=1.8.20,<1.9.0a0|>=1.8.18,<1.8.19.0a0']
h5py -> hdf5[version='>=1.10.1,<1.10.2.0a0|>=1.10.2,<1.10.3.0a0|>=1.10.4,<1.10.5.0a0|>=1.10.6,<1.10.7.0a0|>=1.8.20,<1.9.0a0|>=1.8.18,<1.8.19.0a0']

Package libcurl conflicts for:
libgdal -> libcurl[version='>=7.60.0,<8.0a0|>=7.61.1,<8.0a0|>=7.63.0,<8.0a0|>=7.65.3,<8.0a0']
libgdal -> cfitsio[version='>=3.470,<3.471.0a0'] -> libcurl[version='7.57.0|7.58.0|7.59.0|7.60.0|7.61.0|7.61.1|7.62.0|7.63.0|7.63.0|7.64.0|7.64.1|7.65.2|7.65.3|7.67.0|7.68.0|7.69.1|7.71.0|7.71.1|>=7.58.0,<8.0a0|>=7.59.0,<8.0a0|>=7.61.0,<8.0a0|>=7.69.1,<8.0a0|>=7.71.1,<8.0a0',build='h1ad7b7a_0|h20c2e04_0|h20c2e04_0|h20c2e04_0|h20c2e04_1000|h20c2e04_2|h20c2e04_0|h20c2e04_0|h20c2e04_0|h20c2e04_0|h20c2e04_0|h20c2e04_1|h20c2e04_0|h20c2e04_0|h1ad7b7a_0|h1ad7b7a_0|h1ad7b7a_0|h1ad7b7a_0']

Package zlib conflicts for:
netcdf4 -> hdf5[version='>=1.10.4,<1.10.5.0a0'] -> zlib[version='1.2.*|>=1.2.11,<1.3.0a0']
scipy -> python[version='>=3.6,<3.7.0a0'] -> zlib[version='>=1.2.11,<1.3.0a0']
numpy -> python[version='>=3.7,<3.8.0a0'] -> zlib[version='>=1.2.11,<1.3.0a0']
python=3.8 -> zlib[version='>=1.2.11,<1.3.0a0']
h5py -> hdf5[version='>=1.10.6,<1.10.7.0a0'] -> zlib[version='1.2.*|>=1.2.11,<1.3.0a0']
libgdal -> hdf5[version='>=1.8.18,<1.8.19.0a0'] -> zlib=1.2
libgdal -> zlib[version='>=1.2.11,<1.3.0a0']
pytest -> python[version='>=3.6,<3.7.0a0'] -> zlib[version='>=1.2.11,<1.3.0a0']
hdf4 -> zlib[version='>=1.2.11,<1.3.0a0']
gdal -> libgdal==3.0.2=h27ab9cc_0 -> zlib[version='>=1.2.11,<1.3.0a0']
hdf5 -> zlib[version='1.2.*|>=1.2.11,<1.3.0a0']

Package ncurses conflicts for:
pytest -> python[version='>=3.6,<3.7.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']
h5py -> python[version='>=3.8,<3.9.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']
libgdal -> sqlite[version='>=3.30.1,<4.0a0'] -> ncurses[version='>=6.2,<7.0a0']
netcdf4 -> python[version='>=3.8,<3.9.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']
numpy -> python[version='>=3.7,<3.8.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']
python=3.8 -> ncurses[version='>=6.1,<7.0a0|>=6.2,<7.0a0']
scipy -> python[version='>=3.6,<3.7.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']
python=3.8 -> readline[version='>=7.0,<8.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0']
gdal -> python[version='>=2.7,<2.8.0a0'] -> ncurses[version='6.0.*|>=6.0,<7.0a0|>=6.1,<7.0a0|>=6.2,<7.0a0']

Package six conflicts for:
h5py -> six
pytest -> more-itertools[version='>=4.0.0'] -> six[version='>=1.0.0,<2.0.0']
scipy -> mkl-service[version='>=2,<3.0a0'] -> six
numpy -> mkl-service[version='>=2,<3.0a0'] -> six
h5py -> unittest2 -> six[version='>=1.4']
pytest -> six[version='>=1.10.0']

Package setuptools conflicts for:
netcdf4 -> setuptools
pytest -> setuptools[version='>=40.0']
python=3.8 -> pip -> setuptools

Package certifi conflicts for:
pytest -> setuptools[version='>=40.0'] -> certifi[version='>=2016.09|>=2016.9.26']
netcdf4 -> setuptools -> certifi[version='>=2016.09|>=2016.9.26']

Package bzip2 conflicts for:
libgdal -> cfitsio[version='>=3.470,<3.471.0a0'] -> bzip2[version='>=1.0.6,<2.0a0|>=1.0.8,<2.0a0']
netcdf4 -> libnetcdf[version='>=4.7.3,<5.0a0'] -> bzip2[version='>=1.0.6,<2.0a0|>=1.0.8,<2.0a0']

Package libnetcdf conflicts for:
netcdf4 -> libnetcdf[version='>=4.4.1.1,<4.4.2.0a0|>=4.5.0,<5.0a0|>=4.6.1,<4.7.0a0|>=4.7.3,<5.0a0']
libgdal -> libnetcdf[version='>=4.4.1.1,<4.4.2.0a0|>=4.6.1,<4.7.0a0']
gdal -> libgdal==3.0.2=h27ab9cc_0 -> libnetcdf[version='>=4.4.1.1,<4.4.2.0a0|>=4.6.1,<4.7.0a0']

Package intel-openmp conflicts for:
scipy -> mkl[version='>=2019.4,<2021.0a0'] -> intel-openmp
numpy -> mkl[version='>=2019.4,<2021.0a0'] -> intel-openmp

Removing intermediate container d1b07f376eba
The command '/bin/sh -c set -ex  && sudo /opt/conda/bin/conda install --yes       gdal       h5py       libgdal       pytest       numpy       fftw       scipy       hdf4       hdf5       netcdf4  && sudo yum update -y  && sudo yum install -y uuid-devel x11-devel motif-devel gcc-gfortran  && cd /opt/conda/lib  && sudo unlink libuuid.so  && sudo unlink libuuid.so.1  && sudo ln -s /lib64/libuuid.so.1.3.0 libuuid.so  && sudo ln -s /lib64/libuuid.so.1.3.0 libuuid.so.1  && cd /lib64  && ( test -f libgfortran.so || sudo ln -sv libgfortran.so.*.* libgfortran.so )  && sudo yum install -y /tmp/isce-2.3-1.x86_64.rpm  && sudo yum clean all  && sudo rm -rf /var/cache/yum  && sudo rm /tmp/isce-2.3-1.x86_64.rpm' returned a non-zero code: 1



bug in topsStack logging/logging.conf parameters: too small filesize

For large data sets and when running ~100 jobs in parallel, the logging error below occasionally appears (it may also say isce.log.3, etc.). Allowing for a larger file size in isce/defaults/logging/logging.conf fixes the problem (I used args=('isce.log','a',1000048576,5)).
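
For reference, those positional args map onto Python's RotatingFileHandler roughly as follows (a sketch of what the logging.conf entry configures, not the ISCE code itself):

import logging.handlers

# args=('isce.log', 'a', 1000048576, 5) means: append to isce.log, roll over at
# roughly 1 GB, and keep at most 5 backup files (isce.log.1 ... isce.log.5).
handler = logging.handlers.RotatingFileHandler(
    'isce.log', mode='a', maxBytes=1000048576, backupCount=5)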

I don't understand the purpose of the size restriction, which is why I am not opening a PR and am hoping you can fix it. If you are not sure, please just put in a big number.

I see a lot of Deprecation Warnings (see below). That could contribute to the file size issue.
Thank you

Log errors:

cat run_7_pairs_misreg_0_36.e
--- Logging error ---
Traceback (most recent call last):
  File "/work/05861/tg851601/stampede2/test/dev2/rsmas_insar/3rdparty/miniconda3/lib/python3.7/logging/handlers.py", line 70, in emit
    self.doRollover()
  File "/work/05861/tg851601/stampede2/test/dev2/rsmas_insar/3rdparty/miniconda3/lib/python3.7/logging/handlers.py", line 166, in doRollover
    os.remove(dfn)
FileNotFoundError: [Errno 2] No such file or directory: '/scratch/05861/tg851601/HanumangarhSenDT34/run_files/isce.log.5'

Deprecation warning message:

cat run_7_pairs_misreg_0_34.e
/work/05861/tg851601/stampede2/test/dev2/rsmas_insar/sources/isceStack/isce2/contrib/stack/topsStack/estimateAzimuthMisreg.py:144: VisibleDeprecationWarning: Passing `normed=True` on non-uniform bins has always been broken, and computes neither the probability density function nor the probability mass function. The result is only correct if the bins are uniform, when density=True will produce the same result anyway. The argument will be removed in a future version of numpy.
  hist, bins = np.histogram(val, 50, normed=1)
/work/05861/tg851601/stampede2/test/dev2/rsmas_insar/sources/isceStack/isce2/contrib/stack/topsStack/estimateRangeMisreg.py:208: VisibleDeprecationWarning: Passing `normed=True` on non-uniform bins has always been broken, and computes neither the probability density function nor the probability mass function. The result is only correct if the bins are uniform, when density=True will produce the same result anyway. The argument will be removed in a future version of numpy.
  hist, bins = np.histogram(val, 50, normed=1)
