prbonn / deep-point-map-compression
License: MIT License
Hello @louis-wiesmann. Is there any way to retain the class labels of the points after decompression by the decoder?
Thanks in advance.
Aakash
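The decoder itself does not appear to output labels, so one common workaround (a sketch, not part of depoco) is to transfer each decompressed point's label from its nearest neighbor in the labeled input map:

```python
import numpy as np

def transfer_labels(src_points, src_labels, dst_points):
    """Assign each decompressed point the label of its nearest input point.

    src_points: (N, 3) labeled input map, src_labels: (N,) class ids
    dst_points: (M, 3) decoder output
    Brute-force nearest neighbor for clarity; use a KD-tree for large maps.
    """
    # (M, N) pairwise squared distances between output and input points
    d2 = ((dst_points[:, None, :] - src_points[None, :, :]) ** 2).sum(-1)
    return src_labels[d2.argmin(axis=1)]
```

Note that this only approximates the original labels; decompressed points that moved far from their source will pick up the label of whatever is now closest.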
Hello,
I'd like to access the dataset link, but it seems to have expired. Could you please provide a new one?
Thank you!
Segmentation fault
Hello @louis-wiesmann .
After training the model with the default params (epochs=15), I am using the trained model to test the results on a sample point cloud.
The input cloud I used for testing, viewed from the side, looks like this:
On the other hand, the output obtained from the decoder looks like this:
As is clearly visible, the output from the decoder has a somewhat layered structure. Is there any way to minimize this / correct this?
Some other info:
max_nr_points (set in the yaml file) = 60,000
Also, I am getting some big clusters in the outputs from the decoder, as can be seen in the sample outputs below:
| Input Cloud | Output Cloud |
| --- | --- |
Can you please tell me if there is any parameter that can be tweaked in the YAML config file to get rid of this layered clustering, or is there any other way to minimize or eliminate it?
NOTE: I have already seen the other issue on clustering, but in my case the clusters seem to be large.
Thanks
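Since the issue above asks for a way to suppress large spurious clusters, one post-processing option (not a depoco parameter, just a standard technique) is statistical outlier removal: drop points whose mean distance to their k nearest neighbors is unusually large. A minimal numpy sketch:

```python
import numpy as np

def statistical_outlier_removal(points, k=8, std_ratio=2.0):
    """Drop points whose mean k-NN distance is unusually large.

    points: (N, 3) array. Brute-force O(N^2) distances for clarity;
    use a KD-tree for real map sizes (tens of thousands of points).
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # skip self-distance (column 0)
    thresh = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= thresh]
```

This removes isolated blobs and floaters but will not fix the layered structure itself, which comes from the decoder's upsampling rather than from outliers.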
Hey Louis, I am confused by the result: the point cloud after compression and decompression is translated relative to the original one. I think the size is similar; the only problem is that the position differs, and the height is also offset from the original by some amount. Any idea about this? I attached an image showing the issue. Possibly because of this class?
```python
import numpy as np

class Normalizer():
    def __init__(self, data, dif=None):
        # Per-axis bounds of the data the normalizer was fitted on
        self.min = np.amin(data, axis=0, keepdims=True)
        self.max = np.amax(data, axis=0, keepdims=True)
        if dif is None:
            self.dif = self.max - self.min
        else:
            self.dif = dif

    def getScale(self):
        return self.dif

    def normalize(self, points):
        # Map points into the unit cube anchored at self.min
        return (points - self.min) / self.dif

    def recover(self, norm_points):
        # Inverse of normalize(); exact only with the same min/dif
        return (norm_points * self.dif) + self.min
```
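The offset described above is exactly what happens if `recover()` is called with `min`/`dif` values fitted on different data than `normalize()` was. A minimal sketch reproducing the shift (the class is repeated here so the example is self-contained):

```python
import numpy as np

class Normalizer:
    # Same class as in the snippet above, condensed.
    def __init__(self, data, dif=None):
        self.min = np.amin(data, axis=0, keepdims=True)
        self.max = np.amax(data, axis=0, keepdims=True)
        self.dif = self.max - self.min if dif is None else dif

    def normalize(self, points):
        return (points - self.min) / self.dif

    def recover(self, norm_points):
        return norm_points * self.dif + self.min

cloud = np.array([[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]])
n_a = Normalizer(cloud)                        # fitted on the cloud itself
n_b = Normalizer(cloud + [0.0, 0.0, 5.0])      # fitted on a shifted cloud

round_trip = n_a.recover(n_a.normalize(cloud))  # exact round trip
shifted = n_b.recover(n_a.normalize(cloud))     # off by 5 in z
```

So if the decompression path recovers with a normalizer from a different submap (or a default one), the output will be translated by the difference of the two minima, which matches the height offset in the screenshot.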
Thanks for your work @louis-wiesmann. I noticed that your paper mentions a generalization test on the nuScenes dataset, but I did not find the corresponding part in the code. Could you provide this part? Thanks a lot.
Hello.
I am trying to run the code on Google Colab. While trying to install the third-party dependency octree_handler (following the instructions given in the Dockerfile) via the command:
cd depoco/submodules/octree_handler && pip3 install -U .
...I get the following error:
Processing /content/git-clone/submodules/octree_handler
DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
Building wheels for collected packages: octree-handler
Building wheel for octree-handler (setup.py) ... error
ERROR: Failed building wheel for octree-handler
Running setup.py clean for octree-handler
Failed to build octree-handler
Installing collected packages: octree-handler
Running setup.py install for octree-handler ... error
ERROR: Command errored out with exit status 1: /usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-7l37y9ju/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-7l37y9ju/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-qzusj2ky/install-record.txt --single-version-externally-managed --compile --install-headers /usr/local/include/python3.7/octree-handler Check the logs for full command output.
First, thank you for your great work. I have a question about using your model on my dataset.
My training, validation, and test sets are three .pkl files of shape [B, N, 3], already normalized. Could you tell me how to transform my data into a form that can be trained on? Thank you so much.
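The exact file layout depoco's loader expects isn't shown in this thread, so as a hedged starting point, one could split such a pickled [B, N, 3] array into one point-cloud file per sample (the per-file .npy layout here is an assumption; check depoco's dataset loader for the real format):

```python
import pickle
from pathlib import Path

import numpy as np

def split_pkl(pkl_path, out_dir):
    """Split a pickled [B, N, 3] array into one [N, 3] .npy file per sample.

    Assumption: the target pipeline can read one cloud per file; adapt the
    extension and dtype to whatever the actual loader expects.
    """
    with open(pkl_path, "rb") as f:
        clouds = np.asarray(pickle.load(f))   # shape (B, N, 3)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, cloud in enumerate(clouds):
        np.save(out / f"{i:06d}.npy", cloud.astype(np.float32))
    return len(clouds)
```

If the data is already normalized, also make sure the normalization parameters are kept around, or the recovered clouds will be shifted/scaled (see the Normalizer discussion above).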
Hello @louis-wiesmann
When I install the 3rd-party dependencies, I meet this problem:
ERROR [12/13] RUN cd depoco/submodules/ChamferDistancePytorch/chamfer3D/ && pip3 install -U . 2>/dev/null
I guess the issue is the setup process for chamfer-3D; the details are:
Building wheels for collected packages: chamfer-3D
#0 1.824 Building wheel for chamfer-3D (setup.py): started
#0 2.682 Building wheel for chamfer-3D (setup.py): finished with status 'error'
#0 2.682 Running setup.py clean for chamfer-3D
#0 3.509 Failed to build chamfer-3D
#0 4.069 Installing collected packages: chamfer-3D
#0 4.069 Running setup.py install for chamfer-3D: started
#0 4.940 Running setup.py install for chamfer-3D: finished with status 'error'
Is there any way to solve this problem?
thanks in advance
As I can see, the upsampling result of the encoded point cloud is somewhat noisy; e.g., the ground points should lie neatly on the ground, but in fact they shift a lot and look noisy. I want to know whether you have tried adding a symmetric point-to-plane distance to the loss? (It seems it would help these points lie on the same ground plane.)
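To make the suggestion concrete, the quantity being proposed can be sketched as a point-to-plane residual: fit a plane through the (near-planar) ground points and penalize the distance to it. This numpy version is a non-differentiable sketch only; a training loss would need a torch implementation and a way to select ground points:

```python
import numpy as np

def plane_distance_loss(points):
    """Mean absolute point-to-plane distance for a set of near-planar points.

    Fits the least-squares plane through the points (via SVD of the
    centered cloud) and returns the mean |distance| to it. Zero for a
    perfectly flat cloud, positive for the noisy layered outputs
    described above.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Plane normal = right-singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return np.abs(centered @ normal).mean()
```

Whether this actually flattens the decoder output would have to be tested; it is only a regularizer on planar regions, not a fix for the upsampling itself.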
When I install the 3rd-party dependencies, I meet a problem running:
deep-point-map-compression/submodules/octree_handler$ pip3 install -U .
The error is:
/home/bianjiang/CVPR/deep-point-map-compression/submodules/octree-handler/src/OctreeHandler.cpp: In member function ‘Eigen::MatrixXf Octree::computeEigenvaluesNormal(const float&)’:
/home/bianjiang/CVPR/deep-point-map-compression/submodules/octree-handler/src/OctreeHandler.cpp:181:12: error: ‘cout’ is not a member of ‘std’
181 | std::cout << "sing: " << singularv
| ^~~~
/home/bianjiang/CVPR/deep-point-map-compression/submodules/octree-handler/src/OctreeHandler.cpp:9:1: note: ‘std::cout’ is defined in header ‘&lt;iostream&gt;’; did you forget to ‘#include &lt;iostream&gt;’?
    8 | #include &lt;eigen3/Eigen/Dense&gt;
  +++ |+#include &lt;iostream&gt;
    9 | #include
make[2]: *** [src/CMakeFiles/octree_handler.dir/build.make:63:src/CMakeFiles/octree_handler.dir/OctreeHandler.cpp.o] error 1
make[1]: *** [CMakeFiles/Makefile2:94:src/CMakeFiles/octree_handler.dir/all] error 2
make: *** [Makefile:84:all] error 2
I solved this problem by adding #include &lt;iostream&gt; to
deep-point-map-compression/submodules/octree-handler/src/OctreeHandler.h
So if anyone meets the same problem, this method may work.
Can we try this without GPU?