
Repository for automatic classification and labeling of Urban PointClouds using data fusion and region growing techniques.

License: GNU General Public License v3.0

Topics: data-fusion, classification, labelling, point-cloud, topographic-maps, 3d-data, lidar-point-cloud, semantic-segmentation, gis, segmentation

urban_pointcloud_processing's Introduction

Urban PointCloud Processing

This repository contains methods for the automatic classification and labeling of Urban PointClouds using data fusion. The methods can serve as inspiration, or can be applied as-is under some specific assumptions:

  1. Usage in the Netherlands (the "Rijksdriehoek" coordinate system);
  2. Point clouds in LAS format and tiled following specific rules; and
  3. Fusion with AHN and BGT public data sources.

Example notebooks are provided to demonstrate the tools.

Example: automatic labeling of a point cloud.

Example: automatic labeling of ground, road, buildings, cars, trees, street lights, traffic signs, city benches, and rubbish bins.

Project Goal

The goal of this project is to automatically locate and classify various assets such as trees, street lights, traffic signs, and other street furniture in street level point clouds. A typical approach would be to build and train a machine learning classifier, but this requires a rich labeled dataset to train on. One of the main challenges in working with 3D point cloud data is that, in contrast to 2D computer vision, no general-purpose training sets are available. Moreover, the sparsity and non-uniform density of typical point clouds make transferring results from one task to another difficult.

However, since we are working with urban street level data, we do have access to a large number of public datasets and registries that we can use to start labeling and create an initial training set. This repository contains several data fusion methods that combine public datasets such as elevation data, building footprints, and topographic registries to automatically label point clouds.
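
To give a flavour of this kind of fusion, the sketch below labels points that fall inside building footprint polygons. This is an illustration only, not the repository's actual fuser API; the function name, the label value, and the input formats are assumptions, and the footprints are assumed to be in the same (Rijksdriehoek) coordinates as the point cloud.

    # Illustrative data-fusion sketch: label points whose (x, y) position falls
    # inside a building footprint polygon (e.g. taken from a topographic registry).
    import numpy as np
    from shapely.geometry import Point, Polygon

    def label_building_points(points_xy, footprints, building_label=2):
        # `building_label=2` is an arbitrary example value, not the project's label map.
        labels = np.zeros(len(points_xy), dtype=np.int64)  # 0 = unlabelled
        polygons = [Polygon(coords) for coords in footprints]
        for i, (x, y) in enumerate(points_xy):
            if any(poly.contains(Point(x, y)) for poly in polygons):
                labels[i] = building_label
        return labels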

We also provide some post-processing methods that further fine-tune the labels. For example, we use region growing to extend the facade of buildings to include protruding elements such as balconies and canopies that are not included in the building footprint.
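
The sketch below illustrates the region-growing idea in its simplest form: unlabelled points that lie close to already-labelled points of a target class are absorbed iteratively. Again, this is a simplified illustration under assumed data structures (NumPy arrays, with 0 meaning "unlabelled"), not the implementation used in this repository.

    # Simplified region growing: repeatedly add unlabelled points that lie within
    # `radius` metres of a point already carrying the target label.
    import numpy as np
    from scipy.spatial import cKDTree

    def grow_region(points, labels, target_label, radius=0.5, max_iter=50):
        for _ in range(max_iter):
            seeds = np.where(labels == target_label)[0]
            unlabelled = np.where(labels == 0)[0]  # 0 is assumed to mean "unlabelled"
            if len(seeds) == 0 or len(unlabelled) == 0:
                break
            # Distance from each unlabelled point to its nearest labelled seed.
            dist, _ = cKDTree(points[seeds]).query(points[unlabelled], k=1)
            newly = unlabelled[dist <= radius]
            if len(newly) == 0:
                break
            labels[newly] = target_label
        return labels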

For a quick dive into this repository, take a look at our complete solution notebook.


Folder Structure


Installation

This code has been tested with Python >= 3.8 on Linux and macOS.

  1. To use this code in development mode, simply clone the repository and install the dependencies.

    # Clone the repository
    git clone https://github.com/Amsterdam-AI-Team/Urban_PointCloud_Processing.git
    
    # Install dependencies
    cd Urban_PointCloud_Processing
    python -m pip install -r requirements.txt

    or, with Conda:

    conda env create -f environment.yml
  2. Alternatively, the code can be installed as a Python package from source:

    # Install the latest release as Wheel
    python -m pip install https://github.com/Amsterdam-AI-Team/Urban_PointCloud_Processing/releases/download/v0.3/upcp-0.3-py3-none-any.whl
    
    # Alternatively, install the latest version from source
    python -m pip install git+https://github.com/Amsterdam-AI-Team/Urban_PointCloud_Processing.git#egg=upcp
    
    # Or, after making changes in the code
    cd Urban_PointCloud_Processing
    python -m pip install .

    If you use the latter and want your code changes to take effect without re-installing the package, use the --editable flag for pip.
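
    For example, run the following from the repository root for an editable install:

    python -m pip install --editable .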

  3. Additionally, install cccorelib and pycc from the CloudCompare-PythonPlugin project by following the summary instructions below; for more details and Windows instructions, see their GitHub page. Please note that these two packages are not available on the Python Package Index (PyPI).

    Building these packages requires Qt.

    git clone https://github.com/tmontaigu/CloudCompare-PythonPlugin.git
    cd CloudCompare-PythonPlugin
    pip install --upgrade pip  # Requires version >= 21.1
    # For macOS
    export CMAKE_PREFIX_PATH=/usr/local/opt/qt@5
    pip install wrapper/cccorelib
    pip install wrapper/pycc

Usage

We provide tutorial notebooks that demonstrate how the tools can be used.

For visualisation of the resulting labelled point clouds we suggest CloudCompare. Simply open the labelled .laz in CloudCompare, select the cloud, and set Colors to the custom Scalar Field named label. For best results, use our custom "AMS" color scale.
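
As a quick sanity check outside CloudCompare, the written file can also be inspected with laspy (version 2.x). The snippet below assumes the labels are stored in an extra dimension called label, as described above, that a .laz backend such as lazrs is installed, and that the file name is just a placeholder.

    # Count how many points were assigned to each label in the output file.
    import numpy as np
    import laspy

    las = laspy.read("labelled_tile.laz")  # placeholder file name
    values, counts = np.unique(np.asarray(las.label), return_counts=True)
    for value, count in zip(values, counts):
        print(f"label {value}: {count} points")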


Citing our work

If you use (parts of) this repository in your work, please cite our paper:

@article{bloembergen2021automatic,
  title={Automatic labelling of urban point clouds using data fusion},
  author={Bloembergen, Daan and Eijgenstein, Chris},
  journal={arXiv preprint arXiv:2108.13757},
  year={2021}
}

Acknowledgements

This repository was created by Amsterdam Intelligence for the City of Amsterdam.

We owe special thanks to Dr. Sander Oude-Elberink for ideas and brainstorming regarding data fusion with AHN and BGT data.

urban_pointcloud_processing's People

Contributors

chrise96, daanbl, fallbosk


urban_pointcloud_processing's Issues

Rewrite all code to use geodata directly.

  • Store all geodata as .gpkg instead of .csv.
  • Modify all code to work with this data directly.
  • Will save a whole bunch of conversions between csv, arrays, and shapely polygons.

Need help setting up project and running notebooks

I have cloned the repository using PowerShell and installed the required dependencies by running the following commands:

git clone https://github.com/Amsterdam-AI-Team/Urban_PointCloud_Processing.git
cd Urban_PointCloud_Processing
python -m pip install -r requirements.txt

However, I am not familiar with how to proceed further. I am trying to use the notebooks provided in the Usage directory to perform point cloud processing, but I am not sure how to run the code.

For example, the first line of the code in the notebook is import set_path. I am not sure how to set the path to the project directory.

Could someone please provide some guidance on how to set up and run the project, and how to use the notebooks provided in the Usage directory?

Thank you.
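
A hedged note on the import set_path question: helpers like this typically just put the repository's source directory on sys.path so that the notebooks can import the package without installing it. A hypothetical equivalent is shown below; the relative path is an assumption and may differ from the actual repository layout, and installing the package (step 2 of the installation instructions) avoids the need for it altogether.

    # Hypothetical stand-in for a notebook helper such as set_path.py: make the
    # repository's source code importable from the notebook's working directory.
    import os
    import sys

    # Assumed layout: notebooks sit one level below the repository root, with the
    # package source in a sibling "src" directory. Adjust to the actual layout.
    sys.path.insert(0, os.path.abspath(os.path.join(os.getcwd(), "..", "src")))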

Can you provide a Docker?

I'm a new student, and I have tried for a week to run this code, but the kernel keeps crashing. It's too difficult for me to build the C++ environment with CMake. Could you please help me with a Docker image?

About Datasets

How can I get a BGT-like dataset for my own country? I'm using OpenStreetMap, but I don't know how to get polygon data like the BGT data.

cannot import cccorelib

I have already installed the CloudCompare alpha release and the cccorelib installer directly on Windows. However, it still cannot import cccorelib. Has anyone had the same problem?

Find a way to deal with bridges.

Bridges are a special class in AHN data, and as such are not seen as ground. This causes them to not be labelled in the point cloud. We should find a way to deal with this.

Cannot install "cccorelib"

I tried "pip install cccorelib" but it showed me "ERROR: Could not find a version that satisfies the requirement cccorelib
ERROR: No matching distribution found for cccorelib".
How can I install this module? Please give me a hint. Thank you very much!

How to segment the road pointcloud?

Hi, I am a beginner in Python and segmentation. I was wondering how you define a road using elevation, like region growing? Here is the function I am confused about: 'fusion.BGTRoadFuser(Labels.ROAD, bgt_reader=bgt_road_reader)'.

Thank you so much!

Can't import 'cccorelib'

I'm running region growing in a Jupyter notebook and it gives the error 'No module named cccorelib'. I tried pip, but it didn't work.

Interpolate AHN ground estimate?

Interpolating the full AHN ground estimate tiles to fill missing data would potentially solve some issues.

Considerations:

  • This is always an approximation.
  • This will fail for bridges.
  • This might fail for quay walls / other water bodies.
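
A minimal sketch of the kind of gap-filling this would involve, assuming the AHN ground estimate is held as a 2D grid with NaN for missing cells (this is an illustration of the idea, not the repository's AHN reader):

    # Fill NaN cells of a 2D ground-elevation grid by interpolating from known cells.
    import numpy as np
    from scipy.interpolate import griddata

    def fill_missing(grid):
        rows, cols = np.indices(grid.shape)
        known = ~np.isnan(grid)
        return griddata(
            points=np.column_stack((rows[known], cols[known])),
            values=grid[known],
            xi=(rows, cols),
            method="linear",  # cells outside the convex hull of known data stay NaN
        )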

CableFuser throws an error when caching is enabled.

This is due to the fact that a subset of points is queried for AHN interpolation, while the interpolator expects the full set for caching purposes.

A temporary fix is to disable caching in the pipeline:
pipeline.Pipeline(processors=..., ahn_reader=..., caching=False)

AttributeError: module 'pycc' has no attribute 'PointCoordinateType'

I just fixed the "laspy" issues, but now when I run the "pipeline.process_file(in_file, out_file)" block again, it shows me

AttributeError: module 'pycc' has no attribute 'PointCoordinateType'

My pycc version is 2.0.0

What should I do? Please help me~ Thank you!!

module 'laspy' has no attribute 'read'

I just ran the complete solution, and when I run the "pipeline.process_file(in_file, out_file)" block, it shows me "module 'laspy' has no attribute 'read'".

What did I do wrong? What should I do?
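
A hedged note on this error: laspy.read was introduced in laspy 2.x, so this message usually means an older 1.x release is installed. Upgrading laspy generally resolves it (the lazrs or laszip extra may additionally be needed for .laz support):

    python -m pip install --upgrade laspy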

Kernel dies when running the pipeline

The log is:
INFO - Processing file ../datasets/pointcloud/filtered_2386_9702.laz.
INFO - Caching ground_surface for tile 2386_9702.
INFO - AHN [npz/ground] fuser (label=1, Ground).
INFO - Processor finished in 0.02s, 0 points labelled.
INFO - Noise filter (label=99, Noise).

What are the possible problems? Can you give me some tips so that I can debug?

can't import cccorelib

I'm trying to run the attached region growing algorithm in a Jupyter notebook, but it keeps saying "No module named 'cccorelib'" and pip can't install it!
