
Learning Joint Reconstruction of Hands and Manipulated Objects - Demo, Training Code and Models

Yana Hasson, Gül Varol, Dimitris Tzionas, Igor Kalevatykh, Michael J. Black, Ivan Laptev, Cordelia Schmid, CVPR 2019

This code generates synthetic images of hands holding objects, as in the ObMan dataset.

In addition, hands-only images can be generated, with hand poses sampled randomly from the MANO hand pose space.

Examples of rendered images:

[hands+objects example rendering]  [hands-only example rendering]

Rendering generates:

  • RGB images
  • 3D ground truth for the hand and objects
  • depth maps
  • segmentation maps
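
For reference, here is a minimal sketch of how one might load a generated frame with OpenCV. The folder and file names below are assumptions and may differ between versions of the code; the segmentation-read pattern matches the line of blender_hands.py quoted in the issues below.

import os
import cv2  # opencv-python

results_root = "datageneration/tmp"  # hypothetical default, see the launch commands below
frame_idx = 0

# Assumed layout: one subfolder per modality, 8-digit frame names.
rgb = cv2.imread(os.path.join(results_root, "rgb", "{:08d}.jpg".format(frame_idx)))
depth = cv2.imread(os.path.join(results_root, "depth", "{:08d}.png".format(frame_idx)))
# Segmentation maps are stored as images; the labels sit in the first channel.
segm = cv2.imread(os.path.join(results_root, "segm", "{:08d}.png".format(frame_idx)))[:, :, 0]
print(rgb.shape, depth.shape, segm.shape)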

For additional information about the project, see the CVPR 2019 paper and the project page.

Installation

Set up Blender

  • Download Blender 2.78c (for instance wget https://download.blender.org/release/Blender2.78/blender-2.78c-linux-glibc219-x86_64.tar.bz2)
  • Untar it: tar -xvf blender-2.78c-linux-glibc219-x86_64.tar.bz2
  • Download get-pip.py: wget https://bootstrap.pypa.io/get-pip.py (note: since January 2021 this script requires Python >= 3.6; for Blender's bundled Python 3.5, use wget https://bootstrap.pypa.io/3.4/get-pip.py instead, see the get-pip issue below)
  • Try blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m get-pip.py
    • If this fails, try:
      • Install pip: path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/lib/python3.5/ensurepip
      • Update pip: path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip3 install --upgrade pip
  • Install dependencies
    • path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip install -r requirements.txt
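
To confirm that the dependencies landed in Blender's bundled Python rather than a system Python, a small hypothetical check script can help. Run it with path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m check_env.py; the module names below are assumptions, adjust them to match requirements.txt.

# check_env.py -- hypothetical sanity check for Blender's bundled Python
import importlib
import sys

print("Python:", sys.version)
for name in ("pip", "numpy", "cv2"):  # assumed dependencies
    try:
        mod = importlib.import_module(name)
        print(name, "OK", getattr(mod, "__version__", ""))
    except ImportError as exc:
        print(name, "MISSING:", exc)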

Clone repository

git clone https://github.com/hassony2/obman_render
cd obman_render

Download data dependencies

Download hand and object pickle data-structures

Download SURREAL assets

  • Go to SURREAL dataset request page
  • Create an account, and receive an email with a username and password for data download
  • Download SURREAL data dependencies using the following commands
cd download
sh download_smpl_data.sh ../assets username password
cd ..
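
Once the download finishes, a quick check that the archive is usable; the path below assumes the directory tree shown further down this README.

import numpy as np

# smpl_data.npz bundles the SMPL/SURREAL assets used during rendering.
smpl_data = np.load("assets/SURREAL/smpl_data/smpl_data.npz")
print(smpl_data.files[:10])  # list a few of the stored arrays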

Download MANO model

  • Go to MANO website
  • Create an account by clicking Sign Up and provide your information
  • Download Models and Code (the downloaded file should have the format mano_v*_*.zip). Note that all code and data from this download falls under the MANO license.
  • Unzip the file: unzip mano_v*_*.zip
  • Set the environment variable: export MANO_LOCATION=/path/to/mano_v*_*

Modify the MANO code to be Python 3 compatible

  • Remove print 'FINITO' at the end of webuser/smpl_handpca_wrapper.py (line 144)
-    print 'FINITO'
  • Replace import cPickle as pickle by import pickle
    • at the top of webuser/smpl_handpca_wrapper.py (line 23)
    • at the top of webuser/serialization.py (line 30)
-    import cPickle as pickle
+    import pickle
  • Fix the pickle encoding
    • in webuser/smpl_handpca_wrapper.py (line 74)
-    smpl_data = pickle.load(open(fname_or_dict))
+    smpl_data = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
    • in webuser/serialization.py (line 90)
-    dd = pickle.load(open(fname_or_dict))
+    dd = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
  • Fix the model paths in webuser/smpl_handpca_wrapper.py (lines 81-84)
-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v*_*/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v*_*/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')

At the time of writing, the MANO version is 1.2, so use:

-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v1_2/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')
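
As a quick sanity check that the patched loading works, here is a minimal sketch, assuming MANO_LOCATION was exported as above and that the MANO code's dependencies (e.g. chumpy, which the pickles reference) are importable:

import os
import pickle

mano_root = os.environ["MANO_LOCATION"]  # e.g. /path/to/mano_v1_2
with open(os.path.join(mano_root, "models", "MANO_RIGHT.pkl"), "rb") as f:
    hand_r = pickle.load(f, encoding="latin1")
# Expect model fields such as 'hands_components' and 'hands_mean'.
print(sorted(hand_r.keys()))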

Download SMPL model

  • Go to SMPL website
  • Create an account by clicking Sign Up and provide your information
  • Download and unzip 'SMPL for Python users', then copy the models folder to assets/models. Note that all code and data from this download falls under the SMPL license.

OPTIONAL: Download the LSUN dataset (to generate images on LSUN backgrounds)

Download LSUN dataset following the instructions.

OPTIONAL: Download the ImageNet dataset (to generate images on ImageNet backgrounds)

  • Download the original images from the ImageNet website

Download body+hand textures and grasp information

  • Request data on the ObMan webpage

  • Download grasp and texture zips

You should receive two links that will allow you to download bodywithands.zip and shapenet_grasps.zip.

  • Unzip the texture zip
cd assets/textures
mv path/to/downloaded/bodywithands.zip .
unzip bodywithands.zip
cd ../..
  • Unzip the grasp information
cd assets/grasps
mv path/to/downloaded/shapenet_grasps.zip .
unzip shapenet_grasps.zip
cd ../..
  • Your structure should look like this:
obman_render/
  assets/
    models/
      SMPLH_female.pkl
      basicModel_f_lbs_10_207_0_v1.0.2.fbx
      basicModel_m_lbs_10_207_0_v1.0.2.fbx
      ...
    grasps/
      shapenet_grasps/
      shapenet_grasps_splits.csv
    SURREAL/
      smpl_data/
        smpl_data.npz
    ...
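
A small optional sketch to verify this layout before launching; the paths simply mirror the tree above:

import os

expected = [
    "assets/models/SMPLH_female.pkl",
    "assets/grasps/shapenet_grasps_splits.csv",
    "assets/SURREAL/smpl_data/smpl_data.npz",
]
for rel in expected:
    print("ok     " if os.path.exists(rel) else "MISSING", rel)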

Launch!

Minimal version on white background

Hands only

path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'

Grasping objects

path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'

Full version with image backgrounds

Hands only

path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'

Grasping objects

path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'
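
The string after -- is a JSON-encoded configuration (the scripts appear to use the sacred package, hence the _sacred suffix and the need to install sacred in Blender's Python, as one issue below notes). If you script many runs, a hypothetical wrapper can build the argument with json.dumps instead of hand-editing the string:

# launch_render.py -- hypothetical convenience wrapper around the calls above
import json
import subprocess

blender = "path/to/blender"
script = "blender_hands_sacred.py"  # or blender_grasps_sacred.py
cfg = {
    "frame_nb": 10,
    "frame_start": 0,
    "results_root": "datageneration/tmp",
    "background_datasets": ["white"],
}
subprocess.run(
    [blender, "-noaudio", "-t", "1", "-P", script, "--", json.dumps(cfg)],
    check=True,
)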

Citations

If you find this code useful for your research, consider citing:

  • the publication this code was developed for
@INPROCEEDINGS{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}
  • the publication it builds upon, for synthetic data generation of humans
@INPROCEEDINGS{varol17_surreal,  
  title     = {Learning from Synthetic Humans},  
  author    = {Varol, G{\"u}l and Romero, Javier and Martin, Xavier and Mahmood, Naureen and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},  
  booktitle = {CVPR},  
  year      = {2017}  
}
  • the publication describing the hand model used (MANO):
@article{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  publisher = {ACM},
  month = nov,
  year = {2017},
  url = {http://doi.acm.org/10.1145/3130800.3130883},
  month_numeric = {11}
}


obman_render's Issues

rendering fails: TypeError: 'NoneType' object is not subscriptable

Thanks for making the obman_render code available.
I am trying to render multiple views of ObMan. I installed everything based on your well-written instructions.

But when I launch Blender using the commands provided on the GitHub page, I get an error at this line:

File "./blender_hands.py", line 223, in run
segm_img = cv2.imread(tmp_segm_path)[:, :, 0]
TypeError: 'NoneType' object is not subscriptable

The same error occurs for blender_grasps.py.

I would like to know if you have also encountered this error at any point.
It would be a great help if you could offer suggestions or hints for solving it; even after spending more than two days I could not find the cause (maybe because I am new to Blender).

Thanks in advance!
Hoping to hear from you.
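
cv2.imread returns None when it cannot read a file, so the traceback above means the temporary segmentation image was never written, for example because the render step itself failed. A minimal sketch of a more explicit guard around the failing line (tmp_segm_path is a placeholder here):

import os
import cv2

tmp_segm_path = "path/to/tmp_segm.png"  # placeholder
segm = cv2.imread(tmp_segm_path)
if segm is None:
    # Fail loudly with the actual path instead of a NoneType error.
    raise IOError("Could not read segmentation image {} (exists: {})".format(
        tmp_segm_path, os.path.exists(tmp_segm_path)))
segm_img = segm[:, :, 0]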

I received different models from 'SMPL for Python Users'

When I go to the SMPL website, I have the option to download either:

Download version 1.0.0 for Python 2.7 (10 shape PCs)
or
Download version 1.1.0 for Python 2.7 (all 300 shape PCs)

In both cases, I receive .pkl files (not .fbx like it says in the instructions):

models/basicModel_f_lbs_10_207_0_v1.1.0.pkl
models/basicModel_m_lbs_10_207_0_v1.1.0.pkl

I also don't receive 'SMPLH_female.pkl'.

Strange hand pose

Hi,

I'm generating grasps for my object in GraspIt!:

[screenshot: grasp in GraspIt!]

However, the pinky and ring finger are not rendered correctly in Blender:

[screenshot: Blender rendering with malformed pinky and ring finger]

This seems to always happen, but I have no idea why.

Another example:

[screenshot: GraspIt! grasp]

[screenshot: Blender rendering]

The standard MANO model's mesh gets transformed in this function:

mesh_manip.alter_mesh(mano_obj, mano_model.r.tolist())

But I can't identify a bug. Any ideas what this could be?

Best

Weird hand pose

Hi Yana,

I am trying to reproduce your results of synthetic data generation, but the renderings I obtain depict very unnatural hand poses such as this one:
[rendered image with unnatural hand pose]

After the renderings are finished, I can see in the Blender viewer that there is a MANO hand which is not rendered, but which is in a more natural pose:
[Blender viewport screenshot]

After following the instructions in the README, here is the list of modifications I had to make in order to run the program:

  • Install "sacred" package in the Python distribution of Blender
  • Copy the required model files into the assets/models folder
  • Modify assets/textures/shapenet/textures_{train/test/val}.txt and replace "/sequoia/data2/dataset/shapenet/ShapeNetCore.v2" by the path of my own ShapeNet repository for every line.

Do you have an idea of what may cause this behavior?

In any case, thanks for sharing your research code. The Readme is very clear and detailed compared to many projects on GitHub.

Best,
Romain

Question about some variables

Hi, thanks for your nice work. While reading the demo I had some questions.
In the function randomized_verts you set center_idx to 20 or 45. What does this mean? In other words, how do I get the index of the center, for example if I want to center on the head?

'Unable to open a display' while running the first example

Hello!

Thank you for this efficient tool for rendering hand shapes and grasps.

Unfortunately, I am facing an issue while running the first example:

[terminal screenshot: 'Unable to open a display' error]

Could you please give some advice on how I can launch the tool from a remote server's terminal?
Is it even possible to use it without the display (in a "headless" mode)?

Thank you in advance!

How to debug a Blender script?

Thank you for your great work.
I am using this repository a lot. However, when I change some code to make new datasets, I have difficulty debugging the Python scripts. I tried to debug the code with the vscode-blender add-on, but this needs lots of fixes.
I would appreciate it if you could tell me how to debug the code (even just a keyword is fine).
Thank you!

Use smplx instead of smplh

Hi, thank you for the great work.
I need to use SMPL-X (roughly SMPL-H plus facial expressions) to include facial expressions in the output. How can I approach that?
Thanks in advance!

Problems with asset/models files

Hello.
First, thanks for the nice work!
I'm following your README instructions to make a new synthetic dataset like ObMan.
But while doing so, I got some problems.
When I download the SMPL model (from 'SMPL for Python users'), I can only get the model files (ending in .pkl) and the code providing functions to load/save SMPL files. I cannot download the files that are listed in the README:

basicModel_f_lbs_10_207_0_v1.0.2.fbx
basicModel_m_lbs_10_207_0_v1.0.2.fbx

So when I run the 'Launch: minimal version on white background, hands only' task,
I get this error:
FileNotFoundError: [Errno 2] No such file or directory: 'assets/models/basicModel_f_lbs_10_207_0_v1.0.2.fbx'

Can you suggest any solutions or share the files?

Thanks!

get-pip syntax error

Hi :) Awesome repo !!

Just a heads up that get-pip.py was updated on Jan 23rd, 2021. The new version uses f-strings and is incompatible with Blender's Python 3.5; it will throw a syntax error.

You can revert to the old get-pip.py by replacing the link in the README:

 wget https://bootstrap.pypa.io/3.4/get-pip.py

Cheers,
Thibault

Render hands from terminal

Hello,

I am trying to generate hands without the objects from a terminal. When I execute blender_hands_sacred.py, it raises an error:

Unable to open a display
Aborted (core dumped)

Is there any way to generate the images without rendering them? I am trying to just generate hands in different poses (without objects).

Cheers

problem with 'obj_texture'

I read the .pkl metadata file of the ObMan release dataset and looked at the texture image at the 'obj_texture' path, but found that it does not correspond to the texture of the rendered image in rgb_obj.

I would like to know how I can obtain the correct texture images for the dataset samples. Thanks.

[attached texture image]
