hassony2 / obman

[cvpr19] Hands+Objects synthetic dataset, instructions to download and code to load the dataset

Home Page: https://hassony2.github.io/obman.html


obman's Introduction

Learning Joint Reconstruction of Hands and Manipulated Objects - ObMan dataset

Yana Hasson, Gül Varol, Dimitris Tzionas, Igor Kalevatykh, Michael J. Black, Ivan Laptev, Cordelia Schmid, CVPR 2019

Download required files

Download dataset images and data

  • Request the dataset on the ObMan webpage. Note that the data falls under the license linked on that page
  • unzip obman.zip to /path/to/obman
  • Your dataset structure should look like
obman/
  test/
    rgb/
    rgb_obj/
    meta/
    ...
  val/
    rgb/
    rgb_obj/
    meta/
    ...
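Each sample's annotations are stored as a pickled dictionary in the meta/ folder. A minimal sketch for inspecting one (the file name below is hypothetical; list the meta/ directory for real ones):

import pickle

# Hypothetical sample name: pick any .pkl file from /path/to/obman/test/meta/
with open('/path/to/obman/test/meta/00000000.pkl', 'rb') as f:
    meta = pickle.load(f)  # dictionary of per-sample annotations
print(sorted(meta.keys()))  # list the available annotation fields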

Download object meshes

Download the ShapeNetCore.v2 object meshes (available from the ShapeNet website) and extract them to /path/to/ShapeNetCore.v2; the loading command below expects this path.

Download code

git clone https://github.com/hassony2/obman

cd obman

Load samples

python readataset.py --root /path/to/obman --shapenet_root /path/to/ShapeNetCore.v2 --split test --viz

Options you might be interested in:

  • --segment keeps only the foreground
  • --mini_factor 0.01 loads only 1% of the data (to speed up loading)
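For example, to quickly visualize a foreground-only 1% subset of the test split, combining the flags above:

python readataset.py --root /path/to/obman --shapenet_root /path/to/ShapeNetCore.v2 --split test --viz --segment --mini_factor 0.01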

Preprocess shapenet objects for training

Sample points on the external surface of the object:

python shapenet_samplepoints.py
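shapenet_samplepoints.py samples points by casting rays at each mesh and retries up to a fixed number of attempts per model (see the "Exceeded 10 attempts" traceback in the issues below). As a rough standalone sketch, not the repo's exact method, uniform area-weighted surface sampling with trimesh looks like this (the mesh path and point count are placeholders):

import numpy as np
import trimesh

# Placeholder path to any ShapeNet mesh; force='mesh' flattens multi-part scenes
mesh = trimesh.load('/path/to/ShapeNetCore.v2/.../model_normalized.obj', force='mesh')
# Uniformly sample points on the surface, weighted by triangle area
points, face_idx = trimesh.sample.sample_surface(mesh, 600)
np.save('surface_points.npy', points)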

Visualizations

Hand and object meshes in camera coordinates

[image: hand and object meshes rendered in camera coordinates]

Projected in pixel space

Hand vertices in blue, object vertices in red.

[image: hand vertices (blue) and object vertices (red) projected onto the image]
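The projection above follows the standard pinhole camera model. A minimal sketch, assuming vertices already in camera coordinates and a 3x3 intrinsics matrix (the exact annotation field names may differ):

import numpy as np

def project(verts_cam, cam_intr):
    # verts_cam: (N, 3) vertices in camera coordinates
    # cam_intr: (3, 3) pinhole intrinsics matrix
    hom_2d = verts_cam.dot(cam_intr.T)  # apply intrinsics
    return hom_2d[:, :2] / hom_2d[:, 2:]  # perspective divide -> (N, 2) pixel coordinates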

Citations

If you find this dataset useful for your research, consider citing:

@INPROCEEDINGS{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}


obman's Issues

The dataset

Hi, I registered yesterday and only received an automated email telling me to wait for you to contact me.
When will you contact me?
Best,
Kaijie

Email no reply

Hello hassony2, great dataset. I sent the request email about four days ago, but nobody has replied.

raise Exception('Exceeded {} attempts'.format(interrupt))

Hello Yana,

First of all thank you for the great work!

I have been trying to understand the obman code these days. When I run python shapenet_samplepoints.py, I get the following warning message:
Traceback (most recent call last):
File "shapenet_samplepoints.py", line 50, in create_ray_samples
points = sample_mesh(mesh, min_hits=min_hits)
File "/home/haomeng/PycharmProjects/3DHand/obman/obman/samplemesh.py", line 80, in sample_mesh
raise Exception('Exceeded {} attempts'.format(interrupt))
Exception: Exceeded 10 attempts

It looks like the code just keeps going despite this message. Does it matter? What does it actually mean?

Format of the meta (target) files

Hello Yana,

First of all thank you for the great work!

I would like to use the ObMan dataset to train my own deep learning model for a different application, and to do so I am trying to understand the format of the labels (meta data).
I have unpickled a meta file (.pkl) and I am trying to understand the resulting dictionary.

For example, coords3D is associated with 21 3D points. I think these are keypoints of the object bounding box, but I am not sure; which keypoints are they? I have the same doubt about other entries of the dictionary.

Another example is the entry hand_pose: what exactly do the 45 numbers associated with it correspond to?

Is this information available somewhere?
If not, could you be so kind as to explain the format of each target (entry in the meta file)?

Let me know! :)

How to convert the MANO hand model to the format accepted by GraspIt?

Hi, Yana,
Your work on learning joint reconstruction of hands and manipulated objects is very interesting to me.
I want to replicate the process you used to create the ObMan dataset, but I ran into a problem: I don't know how to convert the MANO hand model with 16x3 DoFs to a rigid robot hand model in the XML format accepted by the GraspIt software. Could you please provide more detail, or a script, for the conversion steps?
Secondly, I found a 16-DoF human hand model in the GraspIt code; is that the rigid hand model you converted to? If so, could you explain the relation between the two representations (axis-angle in MANO and DH parameters in the XML file)?
I would be very grateful if you could spare some time to solve my puzzle. Thanks!

About 'modelnet_meshes_***' in samplemesh.py

First, thank you for the chance to study this code.

I ran shapenet_cache.py and shapenet_samplepoints.py in order to train traineval.py in obman_train.
But when running shapenet_samplepoints.py, I saw this error:

"
Loaded /home/bella/obman_train/datasymlinks/ShapeNetCore.v2/03593526/10af6bdfd126209faaf0ad030fc37d94/models/model_normalized.pkl
Traceback (most recent call last):
File "shapenet_samplepoints.py", line 49, in create_ray_samples
points = sample_mesh(mesh, min_hits=min_hits)
File "/home/bella/obman_dir/obman/samplemesh.py", line 80, in sample_mesh
raise Exception('Exceeded {} attempts'.format(interrupt))
Exception: Exceeded 10 attempts
"

shapenet_samplepoints.py stopped at 1007/3036.
To solve this problem, I looked at samplemesh.py, but I don't understand this "modelnet_meshes_***" part:

"
model_path = ('/sequoia/data2/yhasson/datasets/'
'modelnet_meshes_2018_10_02/glass_000001702_12.obj')
model_path = ('/sequoia/data2/yhasson/datasets/'
'modelnet_meshes_2018_10_02/bottle_000001910_21.obj')
" in samplemesh.py.

I don't know how to solve this problem.
Thank you for reading my question :)

Transparency

Hi Yana,

I just had a simple question. Does the ObMan dataset include transparent objects?

Thanks.

About loss graph

Hello,

Thank you for your nice work!

I tried to implement my method based on your project.
However, when I train my network, the training loss curve does not look good.

I compare the estimated hand & object with the GT data in millimeters.
The Chamfer distance loss and the other losses are too large to ever get close to zero:
the total loss plateaus around 1e+5 and does not decrease any further.

So, can you share your loss curve?
I would like to see how your network trains on the ObMan dataset.

Thank you.
TY.

Dataset download request

Sorry to interrupt, but I handed in the request on 29th June, received the automated email, and haven't been contacted yet about access to the ObMan dataset. I would greatly appreciate it if you could check whether my request has been received. Many thanks!

Approval to download

Hi Yana,

Hope you are doing well. I've sent an approval request for the ObMan dataset download from the website. I am not sure if you received it, as I haven't received any email since. I would be grateful if you could look into this.

Train meta file length vs Shapenet_grasps data length

Hi, thank you for the work.

I have some questions about the data.

First, shapenet_grasps seems to have been created from the ObMan train set. What code should I refer to in order to create shapenet_grasps for the test or validation sets?

Second, the total number of meta files is 141550, but shapenet_grasps has 125215 entries. What explains the difference between the two?

Thank you!

Hand grasp without object

Hello!

First of all, congratulations on this interesting project. I am a student doing my Master's thesis on conditional GANs, and I find this dataset very interesting.

I am wondering if there is any way to load an image sample and erase the object, keeping the hand grasping just as if it were still holding it. That would give me pairs of images, one with and one without the object.

Thank you!
