
Comments (12)

jreiberkyle commented on June 4, 2024

Okay, it appears that trying to overwrite the original kernel doesn't work. I tried activating the base environment before installing gdal (and therefore proj) and installing a new kernel that activates the base environment, but the PROJ_LIB variable is still not set. I am going to give up on the approach conda suggests and just store the PROJ_LIB variable in the Dockerfile. What a mess.

Dockerfile:

# 2019-12-02 build
FROM jupyter/minimal-notebook:7a0c7325e470

# activate so we can store some vars in here
RUN conda init bash

# new shells activate base
RUN echo "conda activate base" >> ~/.bashrc

SHELL ["/bin/bash", "-c"]
RUN source ~/.bashrc && conda install -y -c conda-forge gdal && \
    python -m ipykernel install --user && \
    fix-permissions $CONDA_DIR && \
    fix-permissions /home/$NB_USER

WORKDIR work


jreiberkyle commented on June 4, 2024

It looks like the issue is that the environment variable PROJ_LIB is not being set. This appears to be because we are not creating/activating a conda environment, just using the base environment (ref), even though all of the jupyter-notebook docker images install packages into the conda base environment (e.g. the datascience notebook Dockerfile). So, the next thing to try is creating/activating a conda environment in the Docker image. This is a little tricky, but this article demonstrates how.
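In Dockerfile terms (via RUN lines), the approach boils down to roughly these shell steps; this is only a sketch, assuming the environment.yml names the environment notebooks, as in the later attempts:

# create a named environment from environment.yml and make new login shells activate it
conda env create --quiet -f environment.yml
conda init bash
echo "conda activate notebooks" >> ~/.bashrc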


jreiberkyle commented on June 4, 2024

Based on the PROJ 6.3.0 FAQ, this is a common issue.

Updated the Dockerfile to use the latest build of jupyter/minimal-notebook. Same failure when using gdalwarp in the crop-temporal notebook.

Trying to reduce this to a minimal test case, but getting surprising results.

The rasterio test succeeds; this has been known to fail (rasterio/rasterio#1850):

from rasterio import crs
a = crs.CRS.from_epsg(3005)
a.is_epsg_code

Using gdalwarp in classify-cart-l8-ps actually fails when run as a bash command:

gdalwarp -te 475500.0 4727500.0 500500.0 4752500.0 -ts 8000 8000 \
  -r near -overwrite \
  data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF \
  data/cart/210879_1558814_2016-07-25_0e16/L8/LC80260302016245LGN00_BQA.TIF
ERROR 1: PROJ: pj_obj_create: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: createGeodeticReferenceFrame: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_as_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: createGeodeticReferenceFrame: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: pj_obj_create: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_as_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_create_from_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_create_from_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: pj_obj_create: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_as_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_as_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_create_from_wkt: Open of /opt/conda/share/proj failed
ERROR 1: PROJ: proj_create_from_database: Open of /opt/conda/share/proj failed
Creating output file that is 8000P x 8000L.
Processing data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF [1/1] : 0...10...20...30...40...50...60...70...80...90...100 - done.


jreiberkyle commented on June 4, 2024

I have tried a few things with no luck so far.

This works when I run the docker image interactively using the bash shell, but not when I run it as a notebook:

FROM jupyter/minimal-notebook:7a0c7325e470

ARG conda_env=notebooks
COPY environment.yml .
RUN conda env create --quiet -f environment.yml && \
    conda clean --all -f -y

RUN conda init bash
RUN echo "conda activate ${conda_env}" >> ~/.bashrc

WORKDIR work

This actually activates the conda environment in the notebooks, and the necessary environment variable is set, but the code fails with: ERROR 1: PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed

FROM jupyter/minimal-notebook:7a0c7325e470

ARG conda_env=notebooks
COPY environment.yml .
RUN conda env create --quiet -f environment.yml && \
    conda clean --all -f -y

ENV PATH $CONDA_DIR/envs/${conda_env}/bin:$PATH
ENV CONDA_DEFAULT_ENV ${conda_env}

# this appears to do nothing but was worth a try in case it was a permissions issue
RUN fix-permissions $CONDA_DIR
WORKDIR work
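A quick way to check what the notebook process itself sees (generic shell checks, not from the original thread) is to run these from a notebook cell with the ! prefix, or from a notebook terminal:

# is the variable set in the kernel's environment at all?
echo "PROJ_LIB=${PROJ_LIB}"
# does the proj data directory exist, and can the notebook user read proj.db?
ls -ld /opt/conda/envs/notebooks/share/proj
ls -l /opt/conda/envs/notebooks/share/proj/proj.db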

Asking for help on the Jupyter Discourse site: post


jreiberkyle commented on June 4, 2024

The current access issue seems to be specific to notebooks.

Running the code below works when running the docker image in bash:

gdalwarp --debug on -te 475500.0 4727500.0 500500.0 4752500.0 -ts 8000 8000   \
-r near -overwrite   data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF  \
data/test/bash.TIF
GDAL: GDALOpen(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x55e4c3f53340) succeeds as GTiff.
GDAL: GDALOpen(data/test/bash.TIF, this=0x55e4c3f65be0) succeeds as GTiff.
GDAL: GDALClose(data/test/bash.TIF, this=0x55e4c3f65be0)
GDAL: Using GTiff driver
GDAL: Computing area of interest: -93.9157, 42.0991, -91.0002, 44.2434
Creating output file that is 8000P x 8000L.
GDAL: QuietDelete(data/test/bash.TIF) invoking Delete()
GDAL: GDALOpen(data/test/bash.TIF, this=0x55e4c3f6b300) succeeds as GTiff.
GDAL: GDALDefaultOverviews::OverviewScan()
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALClose(data/test/bash.TIF, this=0x55e4c3f6b300)
GDAL: GDALDriver::Create(GTiff,data/test/bash.TIF,8000,8000,1,UInt16,(nil))
Processing data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF [1/1] : 0WARP: Copying metadata from first source to destination dataset
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALDefaultOverviews::OverviewScan()
GDALWARP: Defining SKIP_NOSOURCE=YES
GDAL: GDAL_CACHEMAX = 599 MB
GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,4893,834x418 Dst=0,0,8000x4000
...10...20...30...40...50GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,5310,834x418 Dst=0,4000,8000x4000
...60...70...80...90...100 - done.
GDAL: Flushing dirty blocks: 0...10...20...30...40...50...60...70...80...90...100 - done.
GDAL: GDALClose(data/test/bash.TIF, this=0x55e4c3f67d40)
GDAL: GDALClose(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x55e4c3f53340)

Running the above code gives this failure when run as a bash command in the notebook:

GDAL: GDALOpen(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x555f71c843f0) succeeds as GTiff.
GDAL: GDALOpen(data/test/notebook.TIF, this=0x555f71c96c90) succeeds as GTiff.
GDAL: GDALClose(data/test/notebook.TIF, this=0x555f71c96c90)
GDAL: Using GTiff driver
PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: createGeodeticReferenceFrame: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: createGeodeticReferenceFrame: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create_from_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create_from_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create_from_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_proj_string: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_proj_string: Open of /opt/conda/envs/notebooks/share/proj failed
ERROR 1: PROJ: proj_create_from_database: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create_operation_factory_context: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed
GDAL: Computing area of interest: -93.9157, 42.0991, -91.0002, 44.2434
Creating output file that is 8000P x 8000L.
GDAL: QuietDelete(data/test/notebook.TIF) invoking Delete()
GDAL: GDALOpen(data/test/notebook.TIF, this=0x555f71c876a0) succeeds as GTiff.
GDAL: GDALDefaultOverviews::OverviewScan()
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALClose(data/test/notebook.TIF, this=0x555f71c876a0)
GDAL: GDALDriver::Create(GTiff,data/test/notebook.TIF,8000,8000,1,UInt16,(nil))
PROJ: proj_create_from_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
Processing data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF [1/1] : 0WARP: Copying metadata from first source to destination dataset
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALDefaultOverviews::OverviewScan()
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
GDALWARP: Defining SKIP_NOSOURCE=YES
GDAL: GDAL_CACHEMAX = 599 MB
GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,4893,834x418 Dst=0,0,8000x4000
...10...20...30...40...50GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,5310,834x418 Dst=0,4000,8000x4000
...60...70...80...90...100 - done.
GDAL: Flushing dirty blocks: 0PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_create_from_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
PROJ: proj_as_wkt: Open of /opt/conda/envs/notebooks/share/proj failed
...10...20...30...40...50...60...70...80...90...100 - done.
GDAL: GDALClose(data/test/notebook.TIF, this=0x555f71ca3280)
GDAL: GDALClose(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x555f71c843f0)


jreiberkyle commented on June 4, 2024

Trying to fix permissions the way they are fixed in the jupyter docker stacks isn't helping.

Tried:

# 2019-12-02 build
FROM jupyter/minimal-notebook:7a0c7325e470

ARG conda_env=notebooks

# try this to see if additional permissions fix helps notebooks
ADD environment.yml /tmp/environment.yml
RUN conda env create --quiet -f /tmp/environment.yml && \
    conda clean --all -f -y && \
    fix-permissions $CONDA_DIR && \
    fix-permissions /home/$NB_USER

RUN conda init bash
RUN echo "conda activate ${conda_env}" >> ~/.bashrc
ENV PATH $CONDA_DIR/envs/${conda_env}/bin:$PATH
ENV CONDA_DEFAULT_ENV ${conda_env}

WORKDIR work

Running in the notebook, I get the same access issues as above.


jreiberkyle commented on June 4, 2024

OK, I don't know if this is related to my permissions issue, but I just figured out that creating and activating a new environment with conda env create --quiet -f /tmp/environment.yml loses all of the conda packages that came with the base image, because those packages were installed into the base conda environment. I am going to try activating the base conda environment instead of making a new environment. This isn't recommended by conda, but it's what the jupyter notebook docker stacks are doing.
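If installing into the base environment does work, the install step should reduce to a single command (a sketch mirroring what the docker-stacks images do):

# install gdal directly into the base environment that ships with the image
conda install -y -n base -c conda-forge gdal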


jreiberkyle commented on June 4, 2024

So, just to document before we go in another direction (using the base environment): I don't think the error Open of /opt/conda/envs/notebooks/share/proj failed was caused by a permissions issue. I ran fix-permissions on /opt/conda/envs/notebooks/share/proj and the directories above it, and nothing changed; ls -l showed that the notebook user and group can read and write the proj.db file inside that folder.
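For reference, the check described above amounts to something like this, run as the notebook user inside the container (paths as used in this thread):

# re-run the docker-stacks permissions helper on the proj data directory, then inspect ownership and modes
fix-permissions /opt/conda/envs/notebooks/share/proj
ls -l /opt/conda/envs/notebooks/share/proj/proj.db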


jreiberkyle commented on June 4, 2024

Trying to install gdal in the root environment gives a different error.

FROM jupyter/minimal-notebook:7a0c7325e470
RUN conda install --quiet -y -c conda-forge gdal 
WORKDIR work

This installs gdal version 3.0.2 (the same version installed when I use a separate environment, where everything works).

When I run the container in bash and try gdalinfo, I get:

gdalinfo: error while loading shared libraries: libpoppler.so.76: cannot open shared object file: No such file or directory

For some reason, removing the --quiet flag fixed this. I built the container with the --no-cache flag so it shouldn't be a docker build issue. I'm going to avoid the --quiet flag from now on.
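As an aside, a generic way to track down this kind of missing-library error (not something from the original thread) is to list the unresolved shared-library dependencies directly:

# show which of gdalinfo's shared libraries cannot be resolved
ldd "$(which gdalinfo)" | grep "not found"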

So, now that gdal is installed properly in the base environment, I get the following when running from the shell:

gdalwarp --debug on -te 475500.0 4727500.0 500500.0 4752500.0 -ts 8000 8000   \
 -r near -overwrite   data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF  \
 data/test/bash.TIF
...
PROJ: pj_obj_create: Open of /opt/conda/share/proj failed
PROJ: createGeodeticReferenceFrame: Open of /opt/conda/share/proj failed
...

So it knows where proj.db is (it is in that folder), but it can't access it.

Someone else has had this issue and just set the PROJ_LIB and GDAL_DATA environment variables: conda/conda#9152
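As a sketch, that workaround is just two exports. PROJ_LIB matches the path in the errors above; the GDAL_DATA path is an assumption based on the usual conda-forge layout:

# point PROJ and GDAL at the data directories that ship with the conda packages
export PROJ_LIB=/opt/conda/share/proj
export GDAL_DATA=/opt/conda/share/gdal   # assumed location; verify it exists in the image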


jreiberkyle commented on June 4, 2024

Following advice from conda-forge/geopandas-feedstock#63 (comment), I set PROJ_LIB=/opt/conda/share/proj in the bash shell and get no errors (this is without activating a conda env):

gdalwarp --debug on -te 475500.0 4727500.0 500500.0 4752500.0 -ts 8000 8000   \
 -r near -overwrite   data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF  \
 data/test/bash.TIF
GDAL: GDALOpen(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x55a3465f03e0) succeeds as GTiff.
GDAL: GDALOpen(data/test/bash.TIF, this=0x55a346602c80) succeeds as GTiff.
GDAL: GDALClose(data/test/bash.TIF, this=0x55a346602c80)
GDAL: Using GTiff driver
GDAL: Computing area of interest: -93.9157, 42.0991, -91.0002, 44.2434
Creating output file that is 8000P x 8000L.
GDAL: QuietDelete(data/test/bash.TIF) invoking Delete()
GDAL: GDALOpen(data/test/bash.TIF, this=0x55a3466082f0) succeeds as GTiff.
GDAL: GDALDefaultOverviews::OverviewScan()
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALClose(data/test/bash.TIF, this=0x55a3466082f0)
GDAL: GDALDriver::Create(GTiff,data/test/bash.TIF,8000,8000,1,UInt16,(nil))
Processing data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF [1/1] : 0WARP: Copying metadata from first source to destination dataset
MDReaderPleiades: Not a Pleiades product
MDReaderPleiades: Not a Pleiades product
GDAL: GDALDefaultOverviews::OverviewScan()
GDALWARP: Defining SKIP_NOSOURCE=YES
GDAL: GDAL_CACHEMAX = 599 MB
GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,4893,834x418 Dst=0,0,8000x4000
...10...20...30...40...50GDAL: GDALWarpKernel()::GWKNearestNoMasksOrDstDensityOnlyShort() Src=1620,5310,834x418 Dst=0,4000,8000x4000
...60...70...80...90...100 - done.
GDAL: Flushing dirty blocks: 0...10...20...30...40...50...60...70...80...90...100 - done.
GDAL: GDALClose(data/test/bash.TIF, this=0x55a3466082f0)
GDAL: GDALClose(data/cart/LC80260302016245LGN00/LC80260302016245LGN00_BQA.TIF, this=0x55a3465f03e0)


jreiberkyle commented on June 4, 2024

OK, just to benchmark: we can get gdal and proj working again using this Dockerfile:

FROM jupyter/minimal-notebook:7a0c7325e470

RUN conda install -y -c conda-forge gdal
ENV PROJ_LIB=/opt/conda/share/proj

WORKDIR work

Is this good enough? Should we just add the environment variable and get on with our lives? At this point I am inclined to say yes.

But I will try one more thing: recreating the notebook kernel with the environment activated, following https://github.com/jupyter/docker-stacks/blob/master/docs/using/recipes.md#add-a-python-3x-environment but overwriting the original kernel.

Why? Because I have come this far and have it working in the shell; I just need Jupyter to play along.
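Roughly, that recipe comes down to activating the environment and then re-registering the kernel with that environment's interpreter; a sketch of the overwrite variant, assuming ipykernel is available in the activated environment:

# with the desired environment activated in the current shell,
# re-register the notebook user's default python3 kernel from it
python -m ipykernel install --user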


iflament commented on June 4, 2024

After 2 days of working through gdal errors in my Dockerfile, I found your post! Thank you for documenting this!

