planetlabs / notebooks

interactive notebooks from Planet Engineering

Home Page: https://developers.planet.com/

License: Apache License 2.0

Languages: Jupyter Notebook 99.96%, Python 0.03%, Dockerfile 0.01%, HTML 0.01%, JavaScript 0.01%
Topics: jupyter-notebooks, satellite-imagery, remote-sensing, python, data-analysis, api

notebooks's Introduction

Planet Interactive Guides

In this repository, you'll find a collection of Jupyter notebooks from the software developers, data scientists, and developer advocates at Planet. These interactive, open-source (Apache 2.0) guides are designed to help you work with our APIs and tools, explore Planet data, and learn how to extract information from our massive archive of high-cadence satellite imagery. We hope these guides will inspire you to ask interesting questions of Planet data. Need help? Found a bug? Please file an issue and we'll get back to you.

Install and use these notebooks

System Prerequisites

Docker is required to run these notebooks as described below.

NOTE: After installing Docker, Windows users should install the WSL2 backend when prompted.

Clone or update repo:

If you've never cloned the Planet notebooks repo, run the following:

git clone https://github.com/planetlabs/notebooks.git
cd notebooks

If you have previously cloned the Planet notebooks repo, make sure to pull any changes that have been made since you last interacted with it:

cd notebooks
git pull

Authentication

Access your Planet API Key in Python

Requests to Planet's APIs must be authenticated with a valid Planet API key.

You can export your API Key as an environment variable on your system:

export PL_API_KEY="YOUR-API-KEY"

If you wish to make your API key persistent (so PL_API_KEY is set in every session), add this export command to your ~/.bashrc or ~/.zshrc file. If you are using our Docker environment, as described below, the variable is already set for you.
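For example, on Linux or macOS you can append the export to your shell profile and reload it (adjust the file name if you use zsh):

# Persist the key for future shell sessions (replace with your actual key)
echo 'export PL_API_KEY="YOUR-API-KEY"' >> ~/.bashrc
source ~/.bashrc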

In Python, we set up an API Key variable, PLANET_API_KEY, from an environment variable to use with our API requests:

# Import the os module in order to access environment variables
import os

# Set up the API Key from the `PL_API_KEY` environment variable
PLANET_API_KEY = os.getenv('PL_API_KEY')

Now, your Planet API Key is stored in the variable PLANET_API_KEY and is ready to use in your Python code.
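As a quick sanity check, you can use the key with HTTP basic authentication against the Data API base URL, the same pattern used by the Data API example later in this document. A minimal sketch, assuming the requests library is installed:

# Verify the key: Planet's Data API accepts the API key as the
# username in HTTP basic auth, with an empty password
import os
import requests

PLANET_API_KEY = os.getenv('PL_API_KEY')
res = requests.get("https://api.planet.com/data/v1",
                   auth=(PLANET_API_KEY, ""))
print(res.status_code)  # 200 indicates the key was accepted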

Run Planet Notebooks in Docker

Planet Notebooks rely on a complex stack of technologies that are not always easy to install and properly configure. To ease this complexity, we provide a Docker container for running the notebooks on Docker-compatible systems. To install Docker, please see Docker's documentation for your operating system.

Download prebuilt Docker image (recommended)

The Docker image for these notebooks is hosted in the planetlabs/notebooks repo on DockerHub. To download and prepare the image for use, run:

cd notebooks
docker pull planetlabs/notebooks
docker tag planetlabs/notebooks planet-notebooks

# If you get errors running the above, you might have to add sudo to the beginning:
#sudo docker pull planetlabs/notebooks
#sudo docker tag planetlabs/notebooks planet-notebooks

If you want to re-build the Docker image yourself, this is documented below in the "Appendix: Build the Docker image" section.

Run the container

To run the container (after building or downloading it), add your Planet API key below and issue the following command from the git repository root directory:

docker run -it --rm -p 8888:8888 -v $PWD:/home/jovyan/work -e PL_API_KEY='[YOUR-API-KEY]' planet-notebooks

# If you get a permissions error running the above, you should add sudo to the front:
# sudo docker run -it --rm -p 8888:8888 -v $PWD:/home/jovyan/work -e PL_API_KEY='[YOUR-API-KEY]' planet-notebooks
# Windows users run: winpty docker run -it --rm -p 8888:8888 -v "/$PWD":/home/jovyan/work -e PL_API_KEY='[YOUR-API-KEY]' planet-notebooks

This does several things:

  1. Maps the Docker container's 8888 port to your system's 8888 port, making the container available to your host system's web browser.

  2. Maps a host system path $PWD to the docker container's working directory. This ensures that the notebooks you create, edit, and save are available on your host system under the jupyter-notebooks sub-directory and are not destroyed when you exit the container. This also allows for running tests in the tests sub-directory.

  3. Ensures that the /home/jovyan/work directory in the Docker container, which contains the notebooks, is accessible to the Jupyter notebook server.

  4. Starts the Jupyter notebook server in an interactive terminal, accessible at http://localhost:8888.

  5. Sets an environment variable with your unique Planet API key for authenticating against the API.

  6. Includes the --rm option to clean up the notebook after you exit the process.

Open Jupyter notebooks

Once the Docker container is running, the CLI output will display a URL that you will use to access Jupyter notebooks with your browser.

http://localhost:8888/?token=<UNIQUE-TOKEN>

NOTE: This security token will change every time you start your Docker container.

Repository Organization

jupyter-notebooks

  • exploring_planet_data: working with our various image products, e.g. how to use the UDM mask or deliver imagery to our GEE integration
  • Search, activate, and download with the Data API
  • Ordering, delivery, and tools with the Orders API
  • Process Planet data
  • Analyze and visualize Planet data

Soon we hope to add notebooks from the researchers, technologists, geographers, and entrepreneurs who are already using Planet data to ask interesting and innovative questions about our changing Earth. If you're working with our imagery and have a notebook (or just an idea for a notebook) that you'd like to share, please file an issue and let us know.

Appendix: Build the Docker image

This section documents how to build the Docker image yourself, rather than downloading the recommended pre-built image. This is useful if, for example, you are a developer adding dependencies or a new Jupyter notebook to this repo.

First, build the Docker image. Note that this only has to be done once. After checking out the repository, run:

cd planet-notebook-docker
docker build --rm -t planet-notebooks .
cd ..

This will build and install the Docker image on your system, making it available to run. This may take some time (from 10 minutes to an hour) depending on your network connection and how long Anaconda takes to configure its environment.
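To confirm the freshly built image works, you can run a quick import check inside it; this mirrors the one-off commands used in the issues below and assumes the image tag from the build step:

docker run -it --rm planet-notebooks python -c "import rasterio"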

notebooks's People

Contributors

angaither, bradneuberg, cameronbronstein, charliemoriarty, danabauer, digitaltopo, dorothyron, ebonnecab, emma-steuer, henryw95, jakedahn, joferkington, jreiberkyle, juleeatplanet, keith-plant, kevinlacaille, kscottz, lobugliop, mattferraro, matthew-ballard, mikhailklassen, mkshah605, nik-bladey, olepinard, pl-kevinwurster, rpitonak, sarasafavi, strixcuriosus, tbarsballe, thetechie


notebooks's Issues

Wrong output raster file defined

I've just followed the steps in your script to calculate top-of-atmosphere reflectance (toar_planetscope.ipynb). It's really easy to follow. Thank you for providing this!

I think I've found a typo and figured I'd make you aware of it.
In [6] is the section where you set the characteristics of the output file to uint16. Then you rescale the reflectance bands from float so that they can be saved as uint16.
However, in the call to save the raster file, you pass the reflectance bands rather than the rescaled bands as output.
If I understand this correctly, this tries to save the float values as uint16, resulting in an output file that contains pretty much only 0 values.

I believe the section
with rasterio.open('data/reflectance.tif', 'w', **kwargs) as dst:
    dst.write_band(1, band_blue_reflectance.astype(rasterio.uint16))
    dst.write_band(2, band_green_reflectance.astype(rasterio.uint16))
    dst.write_band(3, band_red_reflectance.astype(rasterio.uint16))
    dst.write_band(4, band_nir_reflectance.astype(rasterio.uint16))

would have to be changed into:

with rasterio.open('data/reflectance.tif', 'w', **kwargs) as dst:
    dst.write_band(1, blue_ref_scaled.astype(rasterio.uint16))
    dst.write_band(2, green_ref_scaled.astype(rasterio.uint16))
    dst.write_band(3, red_ref_scaled.astype(rasterio.uint16))
    dst.write_band(4, nir_ref_scaled.astype(rasterio.uint16))
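For context, the rescaling step referenced above presumably multiplies the 0-1 float reflectance by a fixed factor before the uint16 cast; a minimal sketch, where the factor of 10,000 is an assumption (check the notebook for the actual value):

# Hypothetical sketch of the rescaling the notebook performs
scale = 10000  # assumed scale factor, not confirmed from the notebook
blue_ref_scaled = band_blue_reflectance * scale
green_ref_scaled = band_green_reflectance * scale
red_ref_scaled = band_red_reflectance * scale
nir_ref_scaled = band_nir_reflectance * scale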

Break out utilities from large notebooks into their own notebooks

Some large notebooks have multiple utilities that would be useful in their own notebooks:

  • visualizing and converting UDMs to masks (see the sketch after this issue)
  • visualizing RGB and NIR bands of analytic scenes

Break these out into their own notebooks and then reference in the source notebooks.
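For the UDM utility mentioned above, a minimal sketch of the conversion, assuming the common convention that a UDM value of 0 marks a clear pixel; the file path and band index are illustrative:

# Hypothetical sketch: read a UDM band and build a boolean mask of usable pixels
import rasterio

with rasterio.open('data/udm.tif') as src:  # illustrative path
    udm = src.read(1)                       # UDM is a single-band raster
mask = udm == 0                             # assumption: 0 means clear/usable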

Docker Build Failure

The docker build is failing on the conda install:

Sending build context to Docker daemon   5.12kB
Step 1/12 : FROM jupyter/minimal-notebook:2c80cf3537ca
 ---> db464e6587fb
Step 2/12 : RUN conda install -y -c conda-forge gdal=2.4.0
 ---> Running in 2a389bdcb711
Fetching package metadata ...An unexpected error has occurred.
Please consider posting the following information to the
conda GitHub issue tracker at:

    https://github.com/conda/conda/issues



Current conda install:

               platform : linux-64
          conda version : 4.3.29
       conda is private : False
      conda-env version : 4.3.29
    conda-build version : not installed
         python version : 3.6.3.final.0
       requests version : 2.18.4
       root environment : /opt/conda  (writable)
    default environment : /opt/conda
       envs directories : /opt/conda/envs
                          /home/jovyan/.conda/envs
          package cache : /opt/conda/pkgs
                          /home/jovyan/.conda/pkgs
           channel URLs : https://conda.anaconda.org/conda-forge/linux-64
                          https://conda.anaconda.org/conda-forge/noarch
                          https://repo.continuum.io/pkgs/main/linux-64
                          https://repo.continuum.io/pkgs/main/noarch
                          https://repo.continuum.io/pkgs/free/linux-64
                          https://repo.continuum.io/pkgs/free/noarch
                          https://repo.continuum.io/pkgs/r/linux-64
                          https://repo.continuum.io/pkgs/r/noarch
                          https://repo.continuum.io/pkgs/pro/linux-64
                          https://repo.continuum.io/pkgs/pro/noarch
            config file : /opt/conda/.condarc
             netrc file : None
           offline mode : False
             user-agent : conda/4.3.29 requests/2.18.4 CPython/3.6.3 Linux/4.9.125-linuxkit debian/stretch/sid glibc/2.23
                UID:GID : 1000:100

`$ /opt/conda/bin/conda install -y -c conda-forge gdal=2.4.0`




    Traceback (most recent call last):
      File "/opt/conda/lib/python3.6/site-packages/conda/exceptions.py", line 640, in conda_exception_handler
        return_value = func(*args, **kwargs)
      File "/opt/conda/lib/python3.6/site-packages/conda/cli/main.py", line 140, in _main
        exit_code = args.func(args, p)
      File "/opt/conda/lib/python3.6/site-packages/conda/cli/main_install.py", line 80, in execute
        install(args, parser, 'install')
      File "/opt/conda/lib/python3.6/site-packages/conda/cli/install.py", line 231, in install
        unknown=index_args['unknown'], prefix=prefix)
      File "/opt/conda/lib/python3.6/site-packages/conda/core/index.py", line 101, in get_index
        index = fetch_index(channel_priority_map, use_cache=use_cache)
      File "/opt/conda/lib/python3.6/site-packages/conda/core/index.py", line 120, in fetch_index
        repodatas = collect_all_repodata(use_cache, tasks)
      File "/opt/conda/lib/python3.6/site-packages/conda/core/repodata.py", line 75, in collect_all_repodata
        repodatas = _collect_repodatas_serial(use_cache, tasks)
      File "/opt/conda/lib/python3.6/site-packages/conda/core/repodata.py", line 485, in _collect_repodatas_serial
        for url, schan, pri in tasks]
      File "/opt/conda/lib/python3.6/site-packages/conda/core/repodata.py", line 485, in <listcomp>
        for url, schan, pri in tasks]
      File "/opt/conda/lib/python3.6/site-packages/conda/core/repodata.py", line 115, in func
        res = f(*args, **kwargs)
      File "/opt/conda/lib/python3.6/site-packages/conda/core/repodata.py", line 473, in fetch_repodata
        with open(cache_path, 'w') as fo:
    FileNotFoundError: [Errno 2] No such file or directory: '/opt/conda/pkgs/cache/497deca9.json'

The command '/bin/sh -c conda install -y -c conda-forge gdal=2.4.0' returned a non-zero code: 1

This looks closely related to the recent gdal version issue mentioned in #71, which inspired the upgrade of gdal in #72.

we have an environment.yml file and requirements.txt file

We have an environment.yml file and requirements.txt file in this repo.

This can be confusing: users wonder which file to use to set up their environment, and we have two places to maintain dependencies.

@sarasafavi and @digitaltopo I am curious:

  • what need is the environment.yml file filling?
  • is there any opportunity to merge the two files?
  • if we keep environment.yml, is there any way to reduce it to actual high-level dependencies and remove version pinning? Not as convenient, but easier to maintain.

Thanks!

localhost refused to connect

All steps in the installation process up to and including 'Build the docker image' were successful. However, I'm now running into this issue at the 'run the container' step.

I ran the following command in the command prompt:
docker run -it --rm -v "//c/Users/Craig D/Code/planet/notebooks/jupyter-notebooks:/home/jovyan/work" -e PL_API_KEY='[MY-API-KEY]' planet-notebooks
I did not include the -p 8888:8888 flag mentioned in the README because the port was already mapped when I ran the docker run command originally; running the command again with -p gives the error 'port already allocated'.

(Screenshot of the CLI output attached.)
Once I open the localhost URL, the page simply says 'site can't be reached - localhost refused to connect' instead of the Jupyter Lab notebooks I expect to see.

My system is Windows 10 and I'm using Docker Toolbox, Docker version 18.03.0-ce, build 0520e24302.
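One likely cause, offered as a guess rather than a confirmed diagnosis: Docker Toolbox runs containers inside a VirtualBox VM, so the notebook server is not reachable on localhost. The VM's address (often 192.168.99.100) can be found with:

docker-machine ip default

The notebooks would then be at http://<that-ip>:8888, with the -p 8888:8888 mapping still in effect.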

ndvi_planetscope requests API key

Running the Docker container, the notebook ndvi_planetscope.ipynb requests my API key at the line !planet data download --item-type PSScene4Band --dest data --asset-type analytic,analytic_xml --string-in id 20160831_180302_0e26. It appears that the environment variable PL_API_KEY is not being set correctly. I tried setting the key within the notebook using %env PLANET_API_KEY='my_key' but this also results in the error Error: InvalidAPIKey: {"message": "Please enter your API key.", "errors": []}

The following was successful: !planet --api-key my_key data download --item-type PSScene4Band --dest data --asset-type analytic,analytic_xml --string-in id 20160831_180302_0e26

Note: you also don't set %matplotlib inline in this notebook.
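One possible culprit, offered as a guess: the planet CLI reads the PL_API_KEY environment variable (not PLANET_API_KEY), and IPython's %env keeps the quotes as part of the value, so the unquoted form targeting the right variable may behave differently:

# Hypothetical fix inside the notebook: set the variable the CLI actually reads,
# without quotes (IPython's %env does not strip them)
%env PL_API_KEY=my_key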

Invalid shape: Hole is not within polygon

Both the crop-classification/classify-cart-l8-ps.ipynb and crop-classification/classify-cart.ipynb notebooks throw an error when trying to download the Planet data in the 4th cell, which runs planet data download:

paging: False            pending: 0                                             

activating: 0            complete: 0              elapsed: 1                    
paging: False            pending: 0                                             

Error: BadQuery: {"field": {}, "general": [{"message": "invalid_shape_exception: Invalid shape: Hole is not within polygon"}]}

I'm running the Jupyter notebook directly from the Docker container.
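A common workaround for this class of GeoJSON error, offered as a sketch rather than a confirmed fix: repair the AOI geometry with shapely's buffer(0) trick before submitting the query. Here geojson_geometry stands in for the AOI dict used by the notebook:

# Hypothetical sketch: clean an invalid polygon before querying the Data API
from shapely.geometry import shape, mapping

geom = shape(geojson_geometry)  # geojson_geometry: the AOI GeoJSON dict
fixed = geom.buffer(0)          # buffer(0) often repairs invalid rings/holes
clean_geojson = mapping(fixed)  # back to a GeoJSON-style dict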

Quick search is not working

Hi everyone,
I recently had an issue trying to follow the step-by-step procedure from the API Introduction (tutorials).
I can't get past the Quick Search section because my API_KEY is apparently not working.
I got this response:

{
"message": "Please enter your API key, or email and password.",
"errors": []
}
Everything was fine for the previous sections, but from here (the most important one, I think) the API rejects my requests with that message.

Sorry if my question is too basic; I tried looking for the answer on other blogs and trying their solutions, without success.
Thanks in advance for your reply!
Mishel.

Query for `data/v1/item-types/:itemType/items/:itemId/assets` returns an empty response for the item

This bug is related to the bug I posted here: kscottz/PythonFromSpace#1

I am trying to gather assets for my items, but my queries for assets always return an empty object. Any suggestions on how I can gather these assets?

For example, I can look up an item with the following query: https://api.planet.com/data/v1/item-types/PSScene3Band/items/20170520_181725_1044/ but I am getting an empty object when I try to get the assets for that item, here:
https://api.planet.com/data/v1/item-types/PSScene3Band/items/20170520_181725_1044/assets/

Here is a more thorough example:

# Import helper modules:
import os
import json
import requests

# Setup the API Key from the `PL_API_KEY` environment variable
PLANET_API_KEY = os.getenv('PL_API_KEY')
print("PLANET_API_KEY:", PLANET_API_KEY)

# Helper function to print formatted JSON using the json module
def p(data):
    print(json.dumps(data, indent=2))


# Our First Request

# Setup Planet Data API base URL
# url from the example:
# URL = "https://api.planet.com/data/v1"
# url for the 'items' api returns the item in the response:
# URL = "https://api.planet.com/data/v1/item-types/PSScene3Band/items/20170520_181725_1044"

# TODO: why does querying for that item's asset give an empty response?
URL = "https://api.planet.com/data/v1/item-types/PSScene3Band/items/20170520_181725_1044/assets/"

# Setup the session
session = requests.Session()

# Authenticate
session.auth = (PLANET_API_KEY, "")

# Make a GET request to the Planet Data API
res = session.get(URL)
print("res.status_code", res.status_code)
print("res.text:", res.text)
print(res.json())
p(res.json())

Perhaps this is a bug in the API, or maybe I am missing something? Any tips would be helpful.

docker build failure: gdal missing

I'm seeing a new failure when trying to build the notebook image:

docker build --rm -t planet-notebooks .

Sending build context to Docker daemon   5.12kB
Step 1/12 : FROM jupyter/minimal-notebook:2c80cf3537ca
2c80cf3537ca: Pulling from jupyter/minimal-notebook
e0a742c2abfd: Pull complete
486cb8339a27: Pull complete
dc6f0d824617: Pull complete
4f7a5649a30e: Pull complete
672363445ad2: Pull complete
ecdd51c923e7: Pull complete
42885501cf6c: Pull complete
a91169574a99: Pull complete
4d0f6517ea26: Pull complete
95394e9265ac: Pull complete
8227c59e3779: Pull complete
074b7bf56d53: Pull complete
7acd5e85ad59: Pull complete
7f12c3d0ff9e: Pull complete
c6c3afa6f981: Pull complete
84c4870ea598: Pull complete
9f71a0e80d07: Pull complete
501394cd98d6: Pull complete
206ef30745dc: Pull complete
Digest: sha256:5fa4d62f2cf2ea7e17790ab9d5628d75fda4151b18d5dc47545cb34b0b07c2a2
Status: Downloaded newer image for jupyter/minimal-notebook:2c80cf3537ca
 ---> db464e6587fb
Step 2/12 : RUN conda install -y -c conda-forge gdal=2.4.0
 ---> Running in 28b4d36e8130
Fetching package metadata .............

PackageNotFoundError: Packages missing in current channels:

  - gdal 2.4.0*

We have searched for the packages in the following channels:

  - https://conda.anaconda.org/conda-forge/linux-64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.continuum.io/pkgs/main/linux-64
  - https://repo.continuum.io/pkgs/main/noarch
  - https://repo.continuum.io/pkgs/free/linux-64
  - https://repo.continuum.io/pkgs/free/noarch
  - https://repo.continuum.io/pkgs/r/linux-64
  - https://repo.continuum.io/pkgs/r/noarch
  - https://repo.continuum.io/pkgs/pro/linux-64
  - https://repo.continuum.io/pkgs/pro/noarch

I've tried the suggestion of removing the base jupyter/minimal-notebook image mentioned in #73 to no avail.

docker image gdal install error

When the docker image is built, the following error occurs:

Step 2/12 : RUN conda install -y -c conda-forge gdal=2.3.1
 ---> Running in fc89ad22750d
Fetching package metadata .............

PackageNotFoundError: Packages missing in current channels:
            
  - gdal 2.3.1*

It looks like the gdal version needs to be updated

test static notebook generation

It would be beneficial to casual notebook readers if a static representation of the notebooks was available. Test generating static representations of notebooks in this repo.

Thoughts:

  • utilize a script to build static representation
  • consider Pelican
  • consider creating a dev Docker image that installs Pelican
  • generate static representation of notebooks on a branch

host docker image on dockerhub

To allow users to get up and running with the notebooks as quickly as possible, remove the requirement to build their own images by hosting a Docker image on DockerHub.

fiona 1.8.0 bug breaks `datasets-identify.ipynb`

In datasets-identify.ipynb, we use fiona to read the shapefile coordinate reference system. The version of fiona associated with the version of rasterio we are using, 1.8.0, has a bug where it is unable to open the EPSG support file gcs.csv. This issue is documented here. The bug will be fixed in 1.8.1. Wait for 1.8.1 to go live and pin the installed fiona to that version.
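Once 1.8.1 is live, the pin might look like the following in the image's setup (a sketch, not the repo's actual pin):

pip install fiona==1.8.1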

ndvi_planetscope notebook unexpected NDVI values

Not sure this is the right place, but I have a query regarding one of the notebooks, ndvi_planetscope.
I ran the notebook for a scene I was interested in:
Scene id: '20181002_045628_0f1a'
item_type = 'PSScene4Band'
asset_type = 'analytic_sr'
Everything runs OK; however, the resulting NDVI I get is completely unexpected, largely in the range of 0.6-0.8. A Sentinel-2 image (TOA, not SR) for the same date shows much smaller NDVI values. Any idea what the issue might be, or whether there is one?

fix opencv import

In the latest notebook image, attempting to import opencv results in an error.

$>docker run -it --rm planet-notebooks python -c "import cv2"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: libGL.so.1: cannot open shared object file: No such file or directory

new build of docker image throws error on rasterio import

I am running Docker for Mac, which no longer provides docker-machine, the tool I was previously using to set up my machine and build the Docker images. When I rebuilt the Docker image using Docker for Mac's default virtualization, I ran into the following error on rasterio import:

>$docker run -it planet-notebooks python -c "import rasterio"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/conda/lib/python3.7/site-packages/rasterio/__init__.py", line 22, in <module>
    from rasterio._base import gdal_version
ImportError: libpoppler.so.76: cannot open shared object file: No such file or directory

Make notebooks available in Docker env

I've successfully run the Docker container but discovered that the example notebooks are not in the environment by default, so I am uploading them manually. It would be good if these were available by default.

Error 400 when try to run crop-temporal.ipynb

https://github.com/planetlabs/notebooks/blob/master/jupyter-notebooks/temporal-analysis/crop-temporal.ipynb

I ran into difficulty running cell In [23]:

ERROR 11: HTTP response code: 400

The step is crucial since I have to do some downloading using the API (my token works in other cases).

Any idea?

We use gdalwarp with the -crop_to_cutline argument to download only the AOI portion of the COG:

def _gdalwarp(input_filename, output_filename, options, verbose=False):
    commands = (['gdalwarp'] + options
                + ['-overwrite', input_filename, output_filename])
    if verbose:
        print(' '.join(commands))
    subprocess.check_call(commands)

def download_scene_aoi(download_url, output_filename, geojson_filename, verbose=False):
    vsicurl_url = '/vsicurl/' + download_url
    options = [
        '-cutline', geojson_filename,
        '-crop_to_cutline',
    ]
    _gdalwarp(vsicurl_url, output_filename, options, verbose=verbose)

%time download_scene_aoi(download_url, output_file, geojson_filename, verbose=True)

I also tried running the gdal command line in bash; same error:

$ gdalwarp -cutline data/87/aoi.geojson -crop_to_cutline -overwrite /vsicurl/https://api.planet.com/data/v1/download?token=eyJhbGciOiJIUzUxMiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJKTEgzM1VsU2h4aGJXMUw5VWxwZUlHOGFWNXoyOEtmZVdTUE04Zk4wTStaWmsrRTNITFlRb3BJaTk2SklUT05EMTY1ME51ajl4RkhKM3FzbXZSbUZ1Zz09IiwiaXRlbV90eXBlX2lkIjoiUFNTY2VuZTRCYW5kIiwidG9rZW5fdHlwZSI6InR5cGVkLWl0ZW0iLCJleHAiOjE1MzAwNTQyMzMsIml0ZW1faWQiOiIyMDE3MDgxOF8xOTA2MjlfMTA1MCIsImFzc2V0X3R5cGUiOiJhbmFseXRpY19zciJ9.FkxnWYQgyXfnmkQlgacssXuVvVuUe9RQuo1xkxYNcXkE64TGy4P7Z6OqWGeARp1mXHK8ENyEODGwEJtXFLSaHw data/87/20170818_190629_1050.tif

clip-api-demo JSONDecode Error

Jen, I was looking at the clip-api-demo notebook with a colleague here and we were able to get it working. However, there is a short section of code that I ended up surrounding in a try-except block, since I was getting a JSONDecodeError that was causing Python to crash. The original line was:

if check_state_request.json()['state'] == 'succeeded':

# If clipping process succeeded, we are done
try:
    if check_state_request.json()['state'] == 'succeeded':
        clip_download_url = check_state_request.json()['_links']['results'][0]
        clip_succeeded = True
        print("Clip of scene succeeded and is ready to download")

    # Still activating. Wait 1 second and check again.
    else:
        print("...Still waiting for clipping to complete...")
        time.sleep(1)
except Exception as e:
    print('Exception! {}'.format(e))
    print("...Still waiting for clipping to complete...")
    time.sleep(1)

Just curious if you had seen that before or maybe I'm just lucky!

Current issue with docker build - gdal version

I was following the directions from the previous page to build the Docker image and ran into an issue at step 3/24 (the GDAL install step). It looks like something might have changed with the gdal version (2.1.3) in the expected channel.

(Screenshot of the build error attached.)

ship_detector import error

Using the docker container, in the 01_ship_detector.ipynb notebook I receive the following error:

ImportError                               Traceback (most recent call last)
<ipython-input-4-5c0ac2810ed5> in <module>()
      1 import json
----> 2 from osgeo import gdal, osr
      3 import numpy
      4 from skimage.segmentation import felzenszwalb
      5 from skimage.segmentation import mark_boundaries

/opt/conda/envs/python2/lib/python2.7/site-packages/osgeo/__init__.py in <module>()
     19                 fp.close()
     20             return _mod
---> 21     _gdal = swig_import_helper()
     22     del swig_import_helper
     23 else:

/opt/conda/envs/python2/lib/python2.7/site-packages/osgeo/__init__.py in swig_import_helper()
     15         if fp is not None:
     16             try:
---> 17                 _mod = imp.load_module('_gdal', fp, pathname, description)
     18             finally:
     19                 fp.close()

ImportError: libjson-c.so.2: cannot open shared object file: No such file or directory

gdalwarp vsicurl not working in current docker image

In the crop-temporal notebook, in cell 23, gdalwarp vsicurl is called to download a portion of a geotiff. In the current docker image, this process fails with the following message:

ERROR 1: PROJ: proj_create_from_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_create_from_wkt: Cannot find proj.db
ERROR 1: PROJ: pj_obj_create: Cannot find proj.db
ERROR 1: PROJ: createGeodeticReferenceFrame: Cannot find proj.db
ERROR 1: PROJ: proj_as_wkt: Cannot find proj.db
ERROR 1: PROJ: createGeodeticReferenceFrame: Cannot find proj.db
ERROR 1: PROJ: pj_obj_create: Cannot find proj.db
ERROR 1: PROJ: proj_as_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_create_from_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_create_from_wkt: Cannot find proj.db
ERROR 1: PROJ: pj_obj_create: Cannot find proj.db
ERROR 1: PROJ: proj_as_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_as_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_create_from_wkt: Cannot find proj.db
ERROR 1: PROJ: proj_create_from_database: Cannot find proj.db
ERROR 1: Cannot compute bounding box of cutline. Cannot find source SRS

The currently installed version of gdal is 3.0.1. In the past, the version was pinned to 2.4.0.

This may be related to the switch to GDAL 3 (ref and possible solutions: PDAL/PDAL#2544).
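A possible workaround, offered as an assumption based on the linked PDAL discussion: point PROJ at its resource directory inside the container (the path below is a guess for this conda-based image):

# Hypothetical workaround: tell PROJ where its data files live
docker run -it --rm -e PROJ_LIB=/opt/conda/share/proj planet-notebooks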
