gentlesjan / affordpose

AffordPose: A Large-scale Dataset of Hand-Object Interactions with Affordance-driven Hand Pose (ICCV 2023)

Home Page: https://affordpose.github.io/

License: GNU General Public License v3.0

Python 100.00%
computer-graphics computer-vision dataset graspit hand-object-interaction mano

affordpose's Introduction



AffordPose: A Large-scale Dataset of Hand-Object Interactions with Affordance-driven Hand Pose

ICCV, 2023
Juntao Jian1 · Xiuping Liu1 · Manyi Li2 · Ruizhen Hu3 · Jian Liu4

1 Dalian University of Technology    2 Shandong University
3 Shenzhen University    4 Tsinghua University
Corresponding author

Paper PDF · ArXiv PDF · Project Page · Youtube Video



Download Datasets

  1. Download the AffordPose dataset from the AffordPose Project Page. You can download specific categories or all the data, according to your needs. The data are saved under the path AffordPose/Object_class/Object_id/affordance/xxx.json, which looks like:

     .
     └── AffordPose
         ├── bottle
         │   ├── 3415
         │   │   ├── 3415_Twist
         │   │   │   ├── 1.json
         │   │   │   ├── ...
         │   │   │   └── 28.json
         │   │   │
         │   │   └── 3415_Wrap-grasp
         │   │       ├── 1.json
         │   │       ├── ...
         │   │       └── 28.json
         │   │
         │   └── ...
         │
         └── ...
    
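Given this layout, each interaction file is identified by its object class, object id, and affordance. A minimal sketch of recovering those components from a file path (the `parse_interaction_path` helper is ours, not part of the dataset tooling):

```python
from pathlib import Path

def parse_interaction_path(json_path):
    """Split an AffordPose file path into its components.

    Expects .../AffordPose/<object_class>/<object_id>/<object_id>_<affordance>/<n>.json
    """
    p = Path(json_path)
    affordance_dir = p.parent.name                # e.g. "3415_Twist"
    object_id = p.parent.parent.name              # e.g. "3415"
    object_class = p.parent.parent.parent.name    # e.g. "bottle"
    affordance = affordance_dir[len(object_id) + 1:]  # strip the "<id>_" prefix
    return object_class, object_id, affordance, p.name

# Example:
# parse_interaction_path("AffordPose/bottle/3415/3415_Twist/1.json")
# -> ("bottle", "3415", "Twist", "1.json")
```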
  2. The structure of the xxx.json file is as follows:

     .
     ├── xxx.json
         ├── rhand_mesh            # the hand mesh
         ├── dofs                  # the joint configurations of the hand
         ├── rhand_trans           # the translation of the palm
         ├── rhand_quat            # the rotation of the palm
         ├── object_mesh           # the object mesh; its vertices are annotated with affordance labels
         ├── trans_obj             # with the default value: (0,0,0)
         ├── quat_obj              # with the default value: (1,0,0,0)
         ├── afford_name           # the object affordance corresponding to the interaction
         └── class_name            # the object class
    
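Each record can be read with the standard json module. A minimal sketch, assuming exactly the keys listed above (the `load_interaction` and `summarize` names are ours):

```python
import json

def load_interaction(json_path):
    """Load one AffordPose interaction record as a plain dict."""
    with open(json_path) as f:
        return json.load(f)

def summarize(data):
    """Pick out a few of the fields listed above for a quick look."""
    return (data["class_name"], data["afford_name"],
            data["rhand_trans"], data["rhand_quat"])
```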

Data visualization

  • If you want to visualize the hand mesh, a feasible way is to save the value of "rhand_mesh" from the xxx.json file as an xxx.obj file and open it in MeshLab; the same applies to the object mesh.

  • The hand model we use follows the obman dataset, which ports the MANO hand model to the GraspIt! simulator.

  • We used GraspIt! to collect the xxx.xml data and ran ManoHand_xml2mesh.py to obtain the hand mesh in 'mm'. Please note that you cannot obtain a correct hand mesh in 'm' by simply changing the 'scale' parameter in this Python file.

    $ python ./ManoHand_xml2mesh.py --xml_path PATH_TO_DATA.xml --mesh_path PATH_TO_SAVE_DATA.obj --part_path DIRPATH_TO_SAVE_HAND_PARTS
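Saving a mesh field to an .obj file can be sketched as below. This is our own helper, and it assumes the JSON stores the mesh either as ready-made OBJ text or as a list of OBJ lines — adjust if your files differ:

```python
import json

def as_obj_text(value):
    """Normalize a mesh field to OBJ text (accepts a string or a list of lines)."""
    return value if isinstance(value, str) else "\n".join(value)

def export_mesh(json_path, key, obj_path):
    """Save e.g. 'rhand_mesh' or 'object_mesh' from a record as an .obj file."""
    with open(json_path) as f:
        data = json.load(f)
    with open(obj_path, "w") as f:
        f.write(as_obj_text(data[key]))
```

The resulting .obj file can then be opened directly in MeshLab, as described above.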

Citation

If you find the AffordPose dataset useful for your research, please consider citing us:

@InProceedings{Jian_2023_ICCV,
  author    = {Jian, Juntao and Liu, Xiuping and Li, Manyi and Hu, Ruizhen and Liu, Jian},
  title     = {AffordPose: A Large-Scale Dataset of Hand-Object Interactions with Affordance-Driven Hand Pose},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {14713-14724}
}


affordpose's Issues

Question about the affordance label

Hi,

Thanks for your great dataset. May I ask how I can find the affordance label? Every object looks the same inside MeshLab regardless of the affordance, and the object_mesh vertices appear to carry no affordance annotation. E.g., jar 4118 looks like this:

image

I noticed every vertex is followed by a 5 or a 7. Does this mean the affordance label?

Best,
Hui

Hand Model

Dear Authors,

Thanks for releasing the dataset!

What is the hand model used? Could you please provide more details? It does not look like a realistic hand such as MANO.
Is this hand model replaceable by the MANO hand model?

What is the reason for not using the MANO hand model?

Thanks in Advance!

ETA?

Hi!

Do you have an ETA on the release of your dataset? I'd like to use it for my research :)

Thanks!

dataloader?

Thanks for the wonderful dataset!

Is there any formatted dataloader available for the dataset? Thanks!
