
De-rendering 3D Objects in the Wild

Paper | Video | Project Page

This is the official implementation for the CVPR 2022 paper:

De-rendering 3D Objects in the Wild

Felix Wimbauer¹,², Shangzhe Wu¹ and Christian Rupprecht¹
¹Visual Geometry Group, University of Oxford, ²Technical University of Munich

CVPR 2022 (arXiv)

(Video: thumbnail.mp4)

A method for de-rendering a 3D object from a single image into shape, material, and lighting, trained in a weakly-supervised fashion relying only on rough shape estimates.

📋 Abstract

With increasing focus on augmented and virtual reality applications (XR) comes the demand for algorithms that can lift objects from images and videos into representations that are suitable for a wide variety of related 3D tasks. Large-scale deployment of XR devices and applications means that we cannot solely rely on supervised learning, as collecting and annotating data for the unlimited variety of objects in the real world is infeasible. We present a weakly supervised method that is able to decompose a single image of an object into shape (depth and normals), material (albedo, reflectivity and shininess) and global lighting parameters. For training, the method only relies on a rough initial shape estimate of the training objects to bootstrap the learning process. This shape supervision can come for example from a pretrained depth network or - more generically - from a traditional structure-from-motion pipeline. In our experiments, we show that the method can successfully de-render 2D images into a decomposed 3D representation and generalizes to unseen object categories. Since in-the-wild evaluation is difficult due to the lack of ground truth data, we also introduce a photo-realistic synthetic test set that allows for quantitative evaluation.
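To make the decomposition concrete, the following sketch shows the kinds of per-pixel factors that are predicted and how they could be recombined under a simple Blinn-Phong-style shading model. This is an illustrative sketch only, not the exact image formation model used in the paper; the names (Derendering, shade) and the lighting parametrization are assumptions for illustration.

# Illustrative sketch only (not the exact model from the paper):
# recombining de-rendered factors under a simple Blinn-Phong shading model.
from dataclasses import dataclass

import numpy as np


@dataclass
class Derendering:              # hypothetical container for the factors
    depth: np.ndarray           # (H, W)    shape: per-pixel depth
    normals: np.ndarray         # (H, W, 3) shape: unit surface normals
    albedo: np.ndarray          # (H, W, 3) material: diffuse color
    reflectivity: np.ndarray    # (H, W)    material: specular strength
    shininess: float            # material: specular exponent
    light_dir: np.ndarray       # (3,)      lighting: global light direction
    ambient: float              # lighting: global ambient intensity


def shade(d: Derendering, view_dir=np.array([0.0, 0.0, 1.0])):
    """Recompose an image from the de-rendered factors."""
    l = d.light_dir / np.linalg.norm(d.light_dir)
    diffuse = np.clip(d.normals @ l, 0.0, None)                # (H, W)
    h = (l + view_dir) / np.linalg.norm(l + view_dir)          # half vector
    specular = np.clip(d.normals @ h, 0.0, None) ** d.shininess
    img = d.albedo * (d.ambient + diffuse)[..., None] \
        + (d.reflectivity * specular)[..., None]
    return np.clip(img, 0.0, 1.0)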

@InProceedings{wimbauer2022rendering,
  title={De-rendering 3D Objects in the Wild},
  author={Wimbauer, Felix and Wu, Shangzhe and Rupprecht, Christian},
  booktitle={CVPR},
  year={2022}
}

๐Ÿ—๏ธ๏ธ Setup

🐍 Python Environment

We use Conda to manage our Python environment:

conda env create -f environment.yml

Then, activate the conda environment:

conda activate derender3d

📸 Checkpoints

We provide download links for pretrained models for CelebA-HQ and Co3D. Models will be stored under results/models, the same location where training checkpoints are stored.

setup/download_model.sh {celebahq|co3d}
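Once downloaded, a checkpoint can be inspected with plain PyTorch. The snippet below is a minimal sketch; the exact file name under results/models is an assumption and depends on what the download script fetches.

# Minimal sketch for inspecting a downloaded checkpoint with plain PyTorch.
# The path is an assumption; adjust it to the file that
# setup/download_model.sh actually places under results/models.
import torch

ckpt = torch.load('results/models/co3d/checkpoint.pth', map_location='cpu')
# Checkpoints are typically dicts of state_dicts and metadata.
for key, value in ckpt.items():
    print(key, type(value).__name__)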

💾 Processed Datasets

We provide download links for the processed Co3D dataset. For CelebA-HQ, the licensing is unclear, which is why we can only provide instructions to reproduce the dataset. Datasets will be stored under datasets. If you prefer another storage location, you can create soft-links to the respective locations in the datasets folder (see the sketch below).

setup/download_processed_co3d.sh
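If the data lives elsewhere, a soft-link keeps the expected layout. A minimal sketch, with a hypothetical external storage path:

# Minimal sketch: link an external storage location into the expected
# datasets/ layout. '/mnt/storage/co3d' is a hypothetical path; adjust it.
import os

src = '/mnt/storage/co3d'   # where the data actually lives (assumption)
dst = 'datasets/co3d'       # where the code expects to find it

os.makedirs('datasets', exist_ok=True)
if not os.path.exists(dst):
    os.symlink(src, dst)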

🎤 Demo

Coming Soon

For now, please have a look at the scripts directory, which provides many useful code snippets for data inspection, image generation, relighting videos, and consistency videos.
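To illustrate what relighting with de-rendered factors amounts to, the sketch below sweeps the global light direction and re-shades each frame. It is a toy Lambertian example with stand-in factors, not the repository's relighting script.

# Toy relighting sweep over the global light direction.
# Not the repository's relighting script; the factors are stand-ins and
# the shading is a simple Lambertian model.
import numpy as np

H, W = 64, 64
normals = np.dstack([np.zeros((H, W)), np.zeros((H, W)), np.ones((H, W))])
albedo = np.full((H, W, 3), 0.5)
ambient = 0.2

frames = []
for angle in np.linspace(-np.pi / 3, np.pi / 3, 30):
    light = np.array([np.sin(angle), 0.0, np.cos(angle)])  # light direction
    diffuse = np.clip(normals @ light, 0.0, None)          # (H, W)
    frames.append(np.clip(albedo * (ambient + diffuse)[..., None], 0.0, 1.0))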

๐Ÿ‹๏ธ Training

We provide experiment configurations under experiments/release to reproduce the results we reported in the paper. To perform training, run the following commands:

CelebA-HQ

python run.py --config experiments/release/celebahq.yml --num_workers 8 --gpu 0
python run.py --config experiments/release/celebahq_nr.yml --num_workers 8 --gpu 0

Co3D

python run.py --config experiments/release/co3d.yml --num_workers 8 --gpu 0

📊 Evaluation

To recalculate the numbers we report in the paper, please run the scripts/eval_cosy.py script. This requires you to set up the Co3D checkpoint and the COSy dataset, as explained above.

python scripts/eval_cosy.py

Manual Dataset Creation

Coming Soon

TODO

  • Check reproducibility
  • Refactor and clean up code
  • Create download scripts for data and trained models
  • Check conda environment
  • Write detailed ReadMe
  • Create demo
  • Create fork for Unsup3D with data setup scripts (for CelebA-HQ)
  • Create fork for Co3D with data setup scripts

Acknowledgements

This repository is largely based on the Unsup3D repository by Shangzhe Wu.
