
lingxiaoli94 / spfn


Source code for "Supervised Fitting of Geometric Primitives to 3D Point Clouds" [CVPR 2019].

Home Page: https://arxiv.org/abs/1811.08988

License: MIT License

Python 78.93% C++ 15.21% Shell 0.74% Cuda 5.12%

spfn's Introduction

Supervised Fitting of Geometric Primitives to 3D Point Clouds

Lingxiao Li*, Minhyuk Sung*, Anastasia Dubrovina, Li Yi, Leonidas Guibas [arXiv]

CVPR 2019 (Oral Presentation)

(* indicates equal contribution)

Citation

@misc{1811.08988,
  Author = {Lingxiao Li and Minhyuk Sung and Anastasia Dubrovina and Li Yi and Leonidas Guibas},
  Title = {Supervised Fitting of Geometric Primitives to 3D Point Clouds},
  Year = {2018},
  Eprint = {arXiv:1811.08988},
}

Introduction

Fitting geometric primitives to 3D point cloud data bridges a gap between low-level digitized 3D data and high-level structural information on the underlying 3D shapes. As such, it enables many downstream applications in 3D data processing. For a long time, RANSAC-based methods have been the gold standard for such primitive fitting problems, but they require careful per-input parameter tuning and thus do not scale well for large datasets with diverse shapes. In this work, we introduce Supervised Primitive Fitting Network (SPFN), an end-to-end neural network that can robustly detect a varying number of primitives at different scales without any user control. The network is supervised using ground truth primitive surfaces and primitive membership for the input points. Instead of directly predicting the primitives, our architecture first predicts per-point properties and then uses a differentiable model estimation module to compute the primitive type and parameters. We evaluate our approach on a novel benchmark of ANSI 3D mechanical component models and demonstrate a significant improvement over both the state-of-the-art RANSAC-based methods and the direct neural prediction.

Requirements

The code has been tested with TensorFlow 1.10 (GPU version) and Python 3.6.6. It also depends on the following Python libraries: numpy (tested with 1.14.5), scipy (tested with 1.1.0), pandas (tested with 0.23.4), PyYAML (tested with 3.13), and h5py (tested with 2.8.0).

Usage

Compiling PointNet++

To use PointNet++, run each of the compile scripts under pointnet_plusplus/utils/tf_ops to compile the CUDA code. Before compiling, edit the path variables in the scripts under pointnet_plusplus/utils/tf_ops so that they point to the correct CUDA/TensorFlow directories.

Dataset

First, while in the project folder, download the processed ANSI 3D dataset of mechanical components (point clouds with 8k points; 9.4GB zip file, 12GB after unzipping):

wget --no-check-certificate https://shapenet.cs.stanford.edu/media/minhyuk/spfn/data/spfn_traceparts.zip
unzip spfn_traceparts.zip

The original CAD data is kindly provided by Traceparts. The provided dataset has been processed to extract primitive surface information and point samples on each surface as well as on the whole shape.
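The per-shape files are stored as HDF5 (the issues further down this page reference groups such as gt_normals and gt_labels). A minimal sketch for inspecting one file's contents with h5py; the path below is a placeholder, so point it at any .h5 file from the unzipped archive:

import h5py

def print_h5_contents(path):
    # Walk the file and print every dataset it contains, with shape and dtype.
    with h5py.File(path, 'r') as f:
        def visitor(name, obj):
            if isinstance(obj, h5py.Dataset):
                print(f'{name}: shape={obj.shape}, dtype={obj.dtype}')
        f.visititems(visitor)

print_h5_contents('spfn_traceparts/example_shape.h5')  # placeholder path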

[NEW] High-Res Dataset

We also provide high-resolution point clouds for the same dataset, with 131,072 (128k) points per object (121GB tar file):

wget --no-check-certificate https://shapenet.cs.stanford.edu/media/minhyuk/spfn/data/spfn_traceparts_high_res.tar

Training

Train SPFN with our default configuration by:

mkdir experiments && cd experiments
cp ../default_configs/network_config.yml .
python3 ../spfn/train.py network_config.yml

Note that the script train.py takes a YAML configuration file network_config.yml that contains GPU settings, the data source, neural network parameters, training hyperparameters, and I/O parameters. Simply copy the default YAML configuration file and change parameters to your needs. During training, three folders will be created/updated. In their default locations, results/model stores the TensorFlow model, results/log holds log files (created by tf.summary.FileWriter), and results/val_pred contains predictions on the validation dataset at various training steps.

Test

At test time, run train.py with the --test flag to run the network on the test dataset specified by test_data_file in the YAML configuration:

python3 ../spfn/train.py network_config.yml --test 

As a shortcut, and also to test the network with only input points and no other supervision, run train.py with --test_pc_in=tmp.xyz and --test_h5_out=tmp.h5 in addition to the --test flag, where tmp.xyz is assumed to be a point cloud file with one point x y z on each line:

python3 ../spfn/train.py ../default_configs/network_config.yml --test --test_pc_in=tmp.xyz --test_h5_out=tmp.h5
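As a sketch, such an input file can be produced from an (N, 3) NumPy array of points; the array below is random placeholder data:

import numpy as np

points = np.random.rand(8192, 3)           # placeholder; replace with your own (N, 3) point cloud
np.savetxt('tmp.xyz', points, fmt='%.6f')  # writes one "x y z" triple per line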

The predictions are stored in HDF5 format. Each HDF5 prediction file contains per-point normal prediction, per-point membership prediction, per-point type prediction, and estimated parameters for each primitive.
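A minimal sketch for loading one prediction file with h5py, using the top-level keys reported in the issue discussion further down this page (the file name is a placeholder):

import h5py

with h5py.File('tmp.h5', 'r') as f:
    normals = f['normal_per_point'][:]        # per-point normal prediction, (N, 3)
    membership = f['instance_per_point'][:]   # per-point membership scores, one column per primitive
    types = f['type_per_point'][:]            # per-point primitive type prediction
    parameters = {name: f['parameters'][name][:] for name in f['parameters']}

print(normals.shape, membership.shape, types.shape, sorted(parameters))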

Evaluation

For evaluating the predictions made by SPFN and other approaches (as in the ablation studies and RANSAC variants in the paper), we use a unified evaluation pipeline to compute the various metrics proposed in the paper. While in the experiments folder, first copy the default evaluation configuration file:

cp ../default_configs/eval_config.yml .

This file contains pointers to the data source and the prediction directory. Modify prediction_dir to point to a directory containing HDF5 predictions (in the same format as SPFN predictions). Run the evaluation by

python3 ../spfn/eval.py eval_config.yml

This will generate a directory (by default results/eval_bundle) containing one bundle file for every shape. Each bundle file contains evaluation results following the metrics proposed in the paper (mean IoU, type accuracy, various coverage numbers, etc.), in addition to the input data and the prediction (hence the name "bundle"). These bundle files can then be used for visualization and downstream analysis.

Acknowledgement

Code in the pointnet_plusplus folder is borrowed from PointNet++, with slight modifications to support dynamic input sizes and extra parallel fully connected layers at the end.

License

This code is released under the MIT License. Refer to LICENSE for details.

spfn's People

Contributors

lingxiaoli94, mhsung


spfn's Issues

Broken link to the data

Hi @csimstu2,

When I try to download the provided dataset, I get a 404 error, so the link to the data seems to be broken. Could you check it, please?

All points belong to the same primitive

Hi,

I followed your work and trained the network. Unfortunately, when I check the validation and test datasets, all points are predicted to belong to the same primitive.

Could you provide some well-trained model weights?

Thank you so much!

Detect only plane

Hello,

Thanks for your excellent work.

Is there any way to force the model to detect only plane primitives?

Thanks in advance.

Best.

Mulin

Tips/Code for visualization

Thank you for providing this code. I have been able to successfully train and test the network. I was wondering if you could provide some tips/sample code on visualizing the output predictions.

How to use my own data to test?

Hi there.
Thank you for providing this code.
I am trying to use my own point cloud data to test this network, but I have run into some problems, such as the 'gt_normals', 'gt_labels' and 'noisy_points' groups in the .h5 file.
How can I generate those groups myself?
By the way, I converted my point cloud from .txt to .h5, copied 'gt_normals' and 'gt_labels' from one of your test datasets, and then fed this .h5 into the network for testing; the output is as follows:
"
Traceback (most recent call last):
File "../spfn/train.py", line 91, in
save_dir=conf.get_test_prediction_dir(),
File "../spfn/lib/network.py", line 237, in predict_and_save
for batch in dset.create_iterator():
File "../spfn/lib/dataset.py", line 155, in next
data.append(self.fetch_data_at_index(self.current + i))
File "../spfn/lib/dataset.py", line 125, in fetch_data_at_index
assert data is not None # assume data are all clean
AssertionError
"
I would appreciate any advice or information. Thanks 👍

Visualization

How can I visualize the .h5 files and get images like the ones in the paper?

How to decide the boundary of primitives?

Hello, I notice that all primitives have boundary parameters, e.g., a plane is bounded by x_range and y_range, a cylinder is bounded by height, etc.

However, I couldn't find how these parameters are calculated in the codebase. Can anyone point me in the right direction?

Thanks in advance!
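One plausible way to obtain such bounds from the points assigned to a plane is to project them onto two in-plane axes and take the min/max along each; this is only an assumption about the construction, not the repository's code:

import numpy as np

def plane_extent(points, n):
    # Hypothetical helper (not from this codebase): given the member points of
    # a plane and its normal n, build two in-plane axes, project the points
    # onto them, and return (x_range, y_range) as min/max pairs.
    n = n / np.linalg.norm(n)
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(n, u)
    xs = points @ u
    ys = points @ v
    return (xs.min(), xs.max()), (ys.min(), ys.max())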

Can I generate point cloud and normals from stl file?

I'm trying to generate point cloud data from a mesh such as an .stl file, but I failed to create the ground-truth normals for the sampled points. Is there any way to generate a point cloud with normals from an STL file using 3D tools like MeshLab? Thanks.
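One possible route, outside this repository, is to sample the mesh surface with the trimesh library and take each sample's face normal as its normal; a minimal sketch, where the file name and sample count are placeholders:

import trimesh

mesh = trimesh.load('part.stl')                               # placeholder STL file
points, face_idx = trimesh.sample.sample_surface(mesh, 8192)  # uniform surface samples
normals = mesh.face_normals[face_idx]                         # normal of the face each sample lies on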

Interpreting HDF5 files output

Hello, thank you for sharing your code.
I succeeded in training the network, and now when I check the results files I'm wondering about their meaning.
For example:
Reading file: results/test_pred/reducing_outlet_tee_sch_20_dn_24_x_22.h5
Keys in the HDF5 file: ['instance_per_point', 'normal_per_point', 'parameters', 'type_per_point']
Data in 'instance_per_point': [[0.04146668 0.04270491 0.0424811 ... 0.04199342 0.04334399 0.04160568]
[0.04181283 0.0418735 0.04237555 ... 0.04092691 0.04334719 0.04153264]
[0.04049535 0.04221334 0.04246197 ... 0.04132846 0.04314893 0.04170037]
...
[0.04147572 0.04272255 0.04257024 ... 0.04186079 0.0433732 0.04154029]
[0.04129084 0.04159909 0.0416345 ... 0.04079393 0.04304048 0.04178598]
[0.04142863 0.04228378 0.04211678 ... 0.03941492 0.04331646 0.04165341]]
Data in 'normal_per_point': [[-0.16674297 0.6498785 -0.7415219 ]
[-0.6984148 0.7140314 0.04874534]
[-0.03995237 0.1555926 -0.987013 ]
...
[-0.21273135 0.6225238 -0.7531332 ]
[-0.3132458 0.3457177 -0.884509 ]
[-0.8076426 0.4083275 -0.42541987]]
Group 'parameters' contains:
Data in 'parameters/cone_apex': [[-0.68249977 0.2683862 0.24202625]
[-0.683581 0.26436955 0.24337825]
[-0.68229187 0.265686 0.24405871]
[-0.68866557 0.2640691 0.2451576 ]
[-0.6903612 0.26529744 0.24397635]
[-0.6806098 0.26292068 0.24722433]
[-0.68791765 0.26528564 0.24524073]
[-0.6839899 0.26504272 0.24282333]
[-0.6808714 0.2710494 0.2393507 ]
[-0.67881876 0.2677244 0.24303956]
[-0.68873703 0.2672725 0.24337135]
[-0.6871602 0.26956183 0.23944698]
[-0.68566525 0.26917693 0.24429011]
[-0.6842599 0.26475802 0.24496672]
[-0.68275535 0.26878864 0.23655924]
[-0.68363965 0.26302138 0.24211153]
[-0.67980057 0.26306492 0.24363805]
[-0.68206114 0.26226032 0.24465162]
[-0.6887125 0.26928684 0.24353853]
[-0.687524 0.26925272 0.24260522]
[-0.68189293 0.26568702 0.2410939 ]
[-0.67795604 0.2659719 0.2417794 ]
[-0.6819265 0.26931733 0.24228878]
[-0.6825126 0.26816985 0.24228932]]
Data in 'parameters/cone_axis': [[ 0.67268926 -0.13504499 0.727497 ]
[ 0.67214715 -0.13330843 0.728318 ]
[ 0.6723382 -0.13541871 0.72775203]
[ 0.67138946 -0.13331147 0.729016 ]
[ 0.67106825 -0.13150176 0.7296401 ]
[ 0.67428064 -0.1328572 0.72642595]
[ 0.6728075 -0.13434722 0.72751695]
[ 0.6709063 -0.1325869 0.72959256]
[ 0.6715145 -0.13535106 0.7285248 ]
[ 0.6730632 -0.13571179 0.727027 ]
[ 0.6709665 -0.1342867 0.72922635]
[ 0.6710383 -0.13450007 0.72912073]
[ 0.673337 -0.1370484 0.7265226 ]
[ 0.67257 -0.13489676 0.72763485]
[ 0.67024326 -0.13181952 0.7303409 ]
[ 0.67061096 -0.13245825 0.72988755]
[ 0.6720134 -0.13251215 0.72858655]
[ 0.6714473 -0.13302794 0.7290144 ]
[ 0.6725806 -0.13470566 0.72766036]
[ 0.6719631 -0.13605991 0.7279788 ]
[ 0.6717268 -0.13470827 0.72844815]
[ 0.6723414 -0.13405232 0.72800213]
[ 0.6720696 -0.1365872 0.7277817 ]
[ 0.67176944 -0.1344259 0.7284611 ]]
Data in 'parameters/cone_half_angle': [1.03905 1.0379328 1.0372236 1.0402502 1.0403289 1.0363117 1.038464
1.0391703 1.0400497 1.0377322 1.0402452 1.0407808 1.0379105 1.0383459
1.0417931 1.0397389 1.0384817 1.038821 1.0388099 1.0403212 1.0398713
1.0380976 1.0387175 1.039127 ]
Data in 'parameters/cylinder_axis': [[ 0.68718046 0.71948516 -0.10061884]
[ 0.68765205 0.7191879 -0.09951618]
[ 0.68482655 0.7222519 -0.09677138]
[ 0.6873515 0.71951336 -0.09923869]
[ 0.6901509 0.7160029 -0.10503148]
[ 0.6913279 0.7142434 -0.10918733]
[ 0.6876821 0.7184218 -0.10470645]
[ 0.68906516 0.7178702 -0.09925459]
[ 0.68845195 0.7183915 -0.09973835]
[ 0.6872032 0.7196677 -0.0991475 ]
[ 0.6851335 0.72187716 -0.09739257]
[ 0.6873199 0.71935284 -0.10061287]
[ 0.68407977 0.72253704 -0.09987635]
[ 0.68763757 0.7188192 -0.10224283]
[ 0.694219 0.711446 -0.1091079 ]
[ 0.6881904 0.7186158 -0.09992576]
[ 0.6899453 0.71664363 -0.10196752]
[ 0.6903345 0.7164836 -0.1004462 ]
[ 0.68543 0.72084236 -0.10282055]
[ 0.684875 0.7219858 -0.09840129]
[ 0.6871282 0.7197285 -0.09922622]
[ 0.6903027 0.7163488 -0.10161982]
[ 0.6845313 0.72262704 -0.0960576 ]
[ 0.6866276 0.7202805 -0.09868377]]
Data in 'parameters/cylinder_center': [[-0.01060657 0.01127663 0.00819669]
[-0.00858989 0.00932445 0.00803067]
[-0.00912228 0.00969322 0.0077892 ]
[-0.0079653 0.00880875 0.00869678]
[-0.0078293 0.00878472 0.00844016]
[-0.00642065 0.0077186 0.00983801]
[-0.0107281 0.01148327 0.00833096]
[-0.00629287 0.00713723 0.00793322]
[-0.00936395 0.01003724 0.00766043]
[-0.00888856 0.00966695 0.00856039]
[-0.00970314 0.0102592 0.00778228]
[-0.01131018 0.01178 0.00695998]
[-0.01314948 0.01351666 0.00771949]
[-0.00792195 0.0088279 0.00878533]
[-0.00734683 0.0083034 0.00739739]
[-0.00542111 0.00632546 0.00815424]
[-0.00508226 0.00620564 0.00922599]
[-0.00745827 0.00836 0.0083737 ]
[-0.01197881 0.01246879 0.00756071]
[-0.00978743 0.01041962 0.0083297 ]
[-0.00798881 0.00880275 0.00852853]
[-0.00720033 0.00815473 0.00857338]
[-0.0095138 0.01005245 0.0078253 ]
[-0.00861483 0.00933804 0.00821648]]
Data in 'parameters/cylinder_radius_squared': [0.5770874 0.5791853 0.57965344 0.5797129 0.57892674 0.5789346
0.5777656 0.58067375 0.5764967 0.57835305 0.5790267 0.5772051
0.57700723 0.5787317 0.574821 0.5798143 0.5783358 0.5766971
0.5789064 0.580007 0.5774035 0.5769484 0.578994 0.5784347 ]
Data in 'parameters/plane_c': [ 7.0171322e-05 4.0577562e-04 1.5225034e-03 -2.1876341e-03
-3.2952118e-03 1.1653228e-02 2.4652968e-03 -1.9900431e-03
-2.8912134e-03 3.4797313e-03 -5.3713522e-03 -6.3767219e-03
1.2009876e-03 3.7553997e-03 4.3897619e-03 -1.6744898e-03
2.7777476e-03 1.2472745e-03 -9.9430908e-04 -2.2567972e-03
-1.2048875e-03 2.9806879e-03 -8.6502335e-04 -1.2321911e-03]
Data in 'parameters/plane_n': [[ 4.6418980e-03 -2.8651118e-02 9.9957871e-01]
[-2.0068292e-02 -2.3908934e-02 9.9951267e-01]
[-8.4241172e-03 -3.7782114e-02 9.9925047e-01]
[-2.8569916e-02 2.4665041e-02 9.9928755e-01]
[-2.1805611e-02 6.6515505e-02 9.9754721e-01]
[ 2.9405151e-02 1.6061900e-02 9.9943858e-01]
[ 4.6240520e-03 3.7014209e-02 9.9930412e-01]
[-1.3362234e-02 -4.5255660e-03 9.9990058e-01]
[ 2.0265048e-02 -1.0983745e-02 9.9973422e-01]
[ 1.4029609e-02 -4.2116344e-02 9.9901426e-01]
[-2.9682709e-02 3.0225457e-03 9.9955487e-01]
[-5.4899980e-03 -9.5130055e-04 9.9998450e-01]
[ 1.1733224e-03 -3.8165756e-02 9.9927074e-01]
[ 8.8828115e-04 -3.0822961e-02 9.9952453e-01]
[-2.9844727e-02 -4.7172908e-02 -9.9844080e-01]
[-1.6053585e-02 -4.7872870e-04 9.9987102e-01]
[ 3.1866154e-03 -4.1732118e-03 9.9998623e-01]
[-3.0898275e-02 -4.2306520e-02 9.9862677e-01]
[ 5.0779297e-03 3.9054554e-02 9.9922413e-01]
[-1.1279652e-04 -3.3993043e-03 9.9999422e-01]
[ 3.7299341e-03 -2.7705431e-02 9.9960905e-01]
[ 8.1685921e-03 -4.9467377e-02 9.9874228e-01]
[ 1.4505754e-03 -4.8959538e-02 9.9879968e-01]
[-4.8622852e-03 -3.1589009e-02 9.9948907e-01]]
Data in 'parameters/sphere_center': [[-0.00503756 0.02726883 0.00643293]
[-0.0025967 0.02723723 0.0057612 ]
[-0.00361849 0.02623498 0.00557375]
[-0.00290233 0.02661383 0.00690671]
[-0.00344405 0.02624249 0.00633732]
[-0.00336611 0.02519865 0.005232 ]
[-0.00565094 0.02645118 0.00612139]
[-0.00146938 0.02681303 0.00592509]
[-0.00414805 0.02675812 0.00653567]
[-0.00351483 0.02656311 0.00633274]
[-0.0045257 0.02668802 0.00643197]
[-0.00596962 0.0269479 0.00616396]
[-0.00733277 0.02667976 0.00596084]
[-0.00473065 0.02532309 0.0062288 ]
[-0.00471777 0.02592422 0.00636153]
[-0.00191117 0.02564437 0.00654969]
[-0.00192464 0.02549449 0.00580772]
[-0.00258446 0.02701984 0.00615514]
[-0.00634171 0.02680117 0.00610351]
[-0.00563655 0.02578138 0.00652786]
[-0.00421188 0.02581028 0.00707097]
[-0.00282517 0.02628773 0.00611088]
[-0.00442278 0.02621978 0.00647386]
[-0.00428585 0.02643985 0.00593286]]
Data in 'parameters/sphere_radius_squared': [0.8855629 0.8859482 0.886977 0.8867259 0.8855147 0.8881748
0.8869213 0.885528 0.8860611 0.88703626 0.885167 0.8853015
0.8869937 0.8880144 0.8858609 0.8863709 0.88572514 0.88486266
0.88554233 0.8879343 0.8871139 0.88619304 0.88766515 0.8860579 ]
Data in 'type_per_point': [1 3 1 ... 1 2 2]

So, if you have a prediction per point, and parameters for each primitive, how do you know which points belong to the same primitive? I don't think there is one primitive per point; somehow the points must be grouped.
Thank you in advance
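One way to get a hard grouping is to assign every point to the primitive with the highest membership score in instance_per_point and collect the point indices per primitive; this is an assumption about post-processing rather than the authors' pipeline, and the file name is taken from the dump above:

import h5py
import numpy as np

with h5py.File('results/test_pred/reducing_outlet_tee_sch_20_dn_24_x_22.h5', 'r') as f:
    membership = f['instance_per_point'][:]    # (N, K) soft membership scores

labels = np.argmax(membership, axis=1)          # one primitive index per point
for k in np.unique(labels):
    print('primitive', k, ':', np.sum(labels == k), 'points')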

Labelling procedure and tools

First of all, I would like to thank you for publishing this repository; it has been really useful for us. After analysing the dataset used to train the models, we would like to know what the labelling procedure was and what tools you used to carry out the labelling process. We would be grateful if you could clarify this critical aspect. Thank you very much in advance.

Google Colab

Is it possible to run it in Google Colab? If yes, could you provide the required steps, please?

ANSI 3D dataset with meshes?

Hi,

Thanks for this amazing work and for making the code and data public! I've downloaded the pre-processed ANSI 3D dataset; however, it seems the meshes are not included. Is there a place where I can download the original meshes?

Best regards,

Daxuan
