B-Seg

Code for our SIGGRAPH'2023 paper: "UrbanBIS: a Large-scale Benchmark for Fine-grained Urban Building Instance Segmentation"

Pipeline Image

Installation

Requirements

  • Python 3.6.0 or above
  • PyTorch 1.2.0 or above
  • CUDA 10.0 or above
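
Once PyTorch is installed, these versions can be checked with a minimal Python sketch (not part of the repository):

import torch

print("PyTorch version:", torch.__version__)        # should be 1.2.0 or above
print("CUDA version (build):", torch.version.cuda)  # should be 10.0 or above
print("CUDA available:", torch.cuda.is_available())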

Virtual Environment

conda create -n bseg python==3.6
conda activate bseg

Install B-Seg

(1) Clone from the repository.

git clone --recursive https://github.com/fullcyxuc/B-Seg.git
cd B-Seg

(2) Install the dependent libraries.

pip install -r requirements.txt
conda install -c bioconda google-sparsehash 

(3) For sparse convolution, we use the spconv implementation, as PointGroup does. The spconv repository is downloaded recursively in step (1). We use spconv version 1.0.

Note: spconv/spconv/functional.py has been modified to make grad_output contiguous. Make sure you use the modified spconv, not the original one.
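
For context, the change amounts to calling .contiguous() on grad_output inside the backward pass of a custom autograd function. The sketch below is purely illustrative (class name, shapes, and operations are placeholders, not the actual spconv code); it only shows where such a call sits:

import torch

class IllustrativeFunction(torch.autograd.Function):
    # Placeholder forward/backward -- not the actual sparse convolution.
    @staticmethod
    def forward(ctx, features, weight):
        ctx.save_for_backward(features, weight)
        return features.matmul(weight)

    @staticmethod
    def backward(ctx, grad_output):
        features, weight = ctx.saved_tensors
        # The modification referred to above: force a contiguous memory
        # layout before the gradient is handed on (e.g., to CUDA kernels).
        grad_output = grad_output.contiguous()
        grad_features = grad_output.matmul(weight.t())
        grad_weight = features.t().matmul(grad_output)
        return grad_features, grad_weight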

  • First, make sure spconv is present under the lib directory (it is fetched by the recursive clone in step (1); otherwise, download it and put it there).

  • To compile spconv, first install the dependent libraries.

conda install libboost
conda install -c daleydeng gcc-5 # gcc 5.4 is needed for spconv

Add the $INCLUDE_PATH$ that contains Boost to lib/spconv/CMakeLists.txt. (Not necessary if it can be found automatically.)

include_directories($INCLUDE_PATH$)

  • Compile the spconv library.

cd lib/spconv
python setup.py bdist_wheel

  • Run cd dist and use pip to install the generated .whl file.
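
The exact wheel filename depends on your Python version and the build, so the wildcard below is only illustrative:

cd dist
pip install spconv*.whl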

(4) We also use other CUDA and C++ extensions (pointgroup_ops, pcdet_ops), which are placed under lib. To compile them:

cd lib/**  # (** refers to a specific extension, e.g., pointgroup_ops or pcdet_ops)
python setup.py develop
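
To check that an extension built and installed correctly, try importing it. The module names below are assumptions based on the directory names and may differ in the actual repository:

import importlib

for name in ("pointgroup_ops", "pcdet_ops"):  # assumed module names
    try:
        importlib.import_module(name)
        print(name, "imported OK")
    except ImportError as err:
        print(name, "not importable:", err)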

Data Preparation

(1) Download the UrbanBIS training set and test set for building instance segmentation.

(2) Put the data in the corresponding folders, which are organized as follows.

B-Seg
├── dataset
│   ├── UrbanBIS
│   │   ├── original
│   │   │   ├── Qingdao
│   │   │   │   ├── train
│   │   │   │   │   ├── Areax.txt 
│   │   │   │   ├── test
│   │   │   │   │   ├── Areax.txt 
│   │   │   │   ├── val
│   │   │   │   │   ├── Areax.txt 
│   │   │   ├── Wuhu
│   │   │   │   ├── train
│   │   │   │   │   ├── Areax.txt 
│   │   │   │   ├── test
│   │   │   │   │   ├── Areax.txt 
│   │   │   │   ├── val
│   │   │   │   │   ├── Areax.txt 
...

(3) Preprocess the data and generate the _inst_nostuff.pth block files for building instance segmentation.

cd dataset/UrbanBIS
python prepare_data_inst_instance_UrbanBIS.py

This creates a processed folder under the UrbanBIS folder, which contains the files for training and testing, organized as follows:

B-Seg
├── dataset
│   ├── UrbanBIS
│   │   ├── original
│   │   ├── processed
│   │   │   ├── Qingdao
│   │   │   │   ├── train
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── test_w_label
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── test_w_label_gt
│   │   │   │   │   ├── X.txt 
│   │   │   │   ├── val
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── val_gt
│   │   │   │   │   ├── X.txt 
│   │   │   ├── Wuhu
│   │   │   │   ├── train
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── test_w_label
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── test_w_label_gt
│   │   │   │   │   ├── X.txt 
│   │   │   │   ├── val
│   │   │   │   │   ├── X.pth or X.txt 
│   │   │   │   ├── val_gt
│   │   │   │   │   ├── X.txt 
...

By default, only the Qingdao city scene is processed; this can be changed at line 177 of prepare_data_inst_instance_UrbanBIS.py.
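
Each generated block can be inspected with torch.load. The layout noted in the comment below follows the PointGroup-style preprocessing convention and is an assumption; the exact contents written by prepare_data_inst_instance_UrbanBIS.py may differ:

import glob
import torch

# Path layout follows the processed tree shown above.
files = sorted(glob.glob("dataset/UrbanBIS/processed/Qingdao/train/*.pth"))
if files:
    data = torch.load(files[0])
    # Assumed layout: coords (N, 3), colors (N, 3), semantic labels (N,), instance labels (N,)
    print(files[0], type(data))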

Training

CUDA_VISIBLE_DEVICES=0 python train.py --config config/BSeg_default_urbanbis.yaml

Inference and Evaluation

For evaluation, set eval to True in the config file, and set split to val for the validation set or to test_w_label for the labeled test set.
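
For example, the relevant entries in config/BSeg_default_urbanbis.yaml would look like the following (key names as described above; their exact placement in the file may differ):

eval: True
split: val  # or test_w_label for the labeled test set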

CUDA_VISIBLE_DEVICES=0 python test.py --config config/BSeg_default_urbanbis.yaml

Acknowledgement

This repo is built upon several other repositories, e.g., PointGroup, HAIS, DyCo3D, SoftGroup, DKNet, SparseConvNet, spconv, IA-SSD, and STPLS3D.
