Texture Mixer

  • Official TensorFlow implementation of our CVPR'19 paper 'Texture Mixer: A Network for Controllable Synthesis and Interpolation of Texture', covering controllable texture interpolation and several applications.
  • Contact: Ning Yu (ningyu AT umd DOT edu)

Texture Interpolation 128x1024 (more results are shown in the paper)

Texture Dissolve 1024x1024

Texture Brush 512x2048

Animal hybridization

Prerequisites

  • Linux
  • NVIDIA GPU + CUDA + CuDNN
  • Python 3.6
  • tensorflow-gpu
  • Other Python dependencies: numpy, scipy, moviepy, Pillow, skimage, lmdb, opencv-python, cryptography, h5py, six
  • Clone the official VGG repository into the current directory.
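As a convenience before installing anything (this helper is not part of the original repo), a short Python check can report which of the listed dependencies are missing without actually importing heavyweight packages like TensorFlow:

```python
# Dependency check: reports which required packages are not installed.
# Uses find_spec so nothing heavyweight is imported.
import importlib.util

# Import names for the packages listed above (Pillow -> PIL, opencv-python -> cv2).
REQUIRED = ["numpy", "scipy", "moviepy", "PIL", "skimage",
            "lmdb", "cv2", "cryptography", "h5py", "six"]

def missing_packages(names=REQUIRED):
    """Return the subset of `names` whose import spec cannot be found."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    gaps = missing_packages()
    print("missing:", gaps if gaps else "none")
```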

Datasets: Animal Texture, Earth Texture, Plant Texture

  • Raw training and testing images of the earth texture are downloaded from Flickr under Creative Commons or public domain licenses. They are saved at datasets/earth_texture/.
  • Raw training and testing images of animal texture or plant texture are copyrighted by Adobe and can be purchased from Adobe Stock. We list the searchable image IDs at:
    • datasets/animal_texture/train_AdobeStock_ID_list.txt
    • datasets/animal_texture/test_AdobeStock_ID_list.txt
    • datasets/plant_texture/train_AdobeStock_ID_list.txt
    • datasets/plant_texture/test_AdobeStock_ID_list.txt
  • Given raw images, run, e.g., the following command for data augmentation.
    python3 data_augmentation.py --iPath earth_texture/test_resize512/ --oPath earth_texture_test_aug/ --num_aug 10000
    
  • Then follow the official Progressive GAN repository "Preparing datasets for training" Section for dataset preparation. Use the create_from_images option in dataset_tool.py. The prepared data enables efficient streaming.
  • For convenience, the prepared testing dataset of earth texture can be downloaded here. Unzip and put under datasets/.
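To illustrate what the augmentation step does, here is a minimal crop-and-flip sketch in NumPy. This is only a conceptual illustration; the actual data_augmentation.py in the repo may apply different or additional operations:

```python
import numpy as np

def augment_once(img, crop=128, rng=None):
    """Randomly crop a square patch from `img` and flip it half the time.
    Conceptual sketch of texture data augmentation, not the repo's exact code."""
    rng = rng if rng is not None else np.random.default_rng()
    h, w = img.shape[:2]
    y = int(rng.integers(0, h - crop + 1))
    x = int(rng.integers(0, w - crop + 1))
    patch = img[y:y + crop, x:x + crop]
    if rng.random() < 0.5:
        patch = patch[:, ::-1]  # horizontal flip
    return patch

demo = np.zeros((512, 512, 3), dtype=np.uint8)
print(augment_once(demo).shape)  # (128, 128, 3)
```

Running this many times per source image (e.g. --num_aug 10000) yields the enlarged augmented set used for training and testing.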

Pre-Trained Models

Training

After data preparation, run, e.g.,

python3 run.py \
--app train \
--train_dir earth_texture_train_aug_with_labels/ \
--val_dir earth_texture_test_aug_with_labels/ \
--out_dir models/earth_texture/ \
--num_gpus 8

where

  • train_dir: Directory of the prepared training dataset (produced by the dataset preparation steps above), which the code can stream efficiently.
  • val_dir: Directory of the prepared validation dataset, in the same format.
  • num_gpus: The number of GPUs to train with. Options: {1, 2, 4, 8}. With 8 NVIDIA GeForce GTX 1080 Ti GPUs, we suggest training for 3 days.

Applications

Texture Interpolation

Run, e.g.,

python3 run.py \
--app interpolation \
--model_path models/animal_texture/network-final.pkl \
--imageL_path examples/animal_texture/1000_F_99107656_XvbvoVVRintE5tmuh1MkdXqs8rkzoahB_NW_aug00000094.png \
--imageR_path examples/animal_texture/1000_F_109464954_aBfyWSbdZt5PNpUo7hOqDRPWmmvQj3v9_NW_aug00000092.png \
--out_dir results/animal_texture/horizontal_interpolation/

where

  • imageL_path: The left-hand side image for horizontal interpolation.
  • imageR_path: The right-hand side image for horizontal interpolation.
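Conceptually, horizontal interpolation blends the latent representations of the two input textures across spatial positions, left to right. A minimal NumPy sketch of the blending weights (the real model decodes the blended latents back into texture, which this sketch omits):

```python
import numpy as np

def horizontal_interpolation(z_left, z_right, steps):
    """Linearly blend two latent codes over `steps` positions from left to right.
    Conceptual sketch only; the actual network interpolates learned texture
    latents and decodes them into images."""
    ws = np.linspace(0.0, 1.0, steps)[:, None]        # blend weights, 0 -> 1
    return (1.0 - ws) * z_left[None, :] + ws * z_right[None, :]

zL, zR = np.zeros(4), np.ones(4)
grid = horizontal_interpolation(zL, zR, 5)
print(grid[0], grid[-1])  # endpoints reproduce zL and zR exactly
```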

Texture Dissolve

Run, e.g.,

python3 run.py \
--app dissolve \
--model_path models/animal_texture/network-final.pkl \
--imageStartUL_path examples/animal_texture/1000_F_23745067_w0GkcAIQG2C4hxOelI1aQYZglEXggGRS_NW_aug00000000.png \
--imageStartUR_path examples/animal_texture/1000_F_44327602_E6wl8FNihQON8c704fE5DEY2LOJPQQ1V_NW_aug00000018.png \
--imageStartBL_path examples/animal_texture/1000_F_66218716_rxcsXWQzYpIWVB8a09ZcuNzE7qAJ3HEk_NW_aug00000098.png \
--imageStartBR_path examples/animal_texture/1000_F_40846588_APKTS3BpiRvR1nvUx0FRa7qjjR788zt8_NW_aug00000001.png \
--imageEndUL_path examples/animal_texture/1000_F_40300952_3dgaCtcLCrhzU0r6HEfnwr7nujDWXbSQ_NW_aug00000029.png \
--imageEndUR_path examples/animal_texture/1000_F_44119648_3vskjRwVc4NVT1Pf0l3RlvIFUemo8TM1_NW_aug00000001.png \
--imageEndBL_path examples/animal_texture/1000_F_70708482_2N8lknTCJg2Q8JQeomFYaFttxId9rulj_NW_aug00000070.png \
--imageEndBR_path examples/animal_texture/1000_F_79496236_8mxTSHy5OilHnJaAxWcw2dwC9SoBLmDK_NW_aug00000057.png \
--out_dir results/animal_texture/dissolve/

where

  • imageStartUL_path: The upper-left corner image in the starting frame.
  • imageStartUR_path: The upper-right corner image in the starting frame.
  • imageStartBL_path: The bottom-left corner image in the starting frame.
  • imageStartBR_path: The bottom-right corner image in the starting frame.
  • imageEndUL_path: The upper-left corner image in the ending frame.
  • imageEndUR_path: The upper-right corner image in the ending frame.
  • imageEndBL_path: The bottom-left corner image in the ending frame.
  • imageEndBR_path: The bottom-right corner image in the ending frame.
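The dissolve layout blends four corner textures across each frame (bilinearly in space) and transitions from the starting corner set to the ending corner set over time. The spatial part can be sketched as plain bilinear blending of latent codes; again, decoding the blended latents into texture is the network's job and is omitted here:

```python
import numpy as np

def bilinear_blend(ul, ur, bl, br, u, v):
    """Blend four corner latent codes at fractional position (u, v),
    u increasing rightward and v increasing downward.
    Conceptual sketch of the per-frame spatial blend in a dissolve."""
    top = (1 - u) * ul + u * ur
    bot = (1 - u) * bl + u * br
    return (1 - v) * top + v * bot

corners = [np.array([c], dtype=float) for c in (0.0, 1.0, 2.0, 3.0)]
center = bilinear_blend(*corners, u=0.5, v=0.5)
print(center)  # midpoint of all four corners: [1.5]
```

The temporal transition is then a second linear blend, from the Start corners to the End corners, applied frame by frame.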

Texture Brush

Run, e.g.,

python3 run.py \
--app brush \
--model_path models/animal_texture/network-final.pkl \
--imageBgUL_path examples/animal_texture/1000_F_23745067_w0GkcAIQG2C4hxOelI1aQYZglEXggGRS_NW_aug00000000.png \
--imageBgUR_path examples/animal_texture/1000_F_44327602_E6wl8FNihQON8c704fE5DEY2LOJPQQ1V_NW_aug00000018.png \
--imageBgBL_path examples/animal_texture/1000_F_66218716_rxcsXWQzYpIWVB8a09ZcuNzE7qAJ3HEk_NW_aug00000098.png \
--imageBgBR_path examples/animal_texture/1000_F_40846588_APKTS3BpiRvR1nvUx0FRa7qjjR788zt8_NW_aug00000001.png \
--imageFgUL_path examples/animal_texture/1000_F_40300952_3dgaCtcLCrhzU0r6HEfnwr7nujDWXbSQ_NW_aug00000029.png \
--imageFgUR_path examples/animal_texture/1000_F_44119648_3vskjRwVc4NVT1Pf0l3RlvIFUemo8TM1_NW_aug00000001.png \
--imageFgBL_path examples/animal_texture/1000_F_70708482_2N8lknTCJg2Q8JQeomFYaFttxId9rulj_NW_aug00000070.png \
--imageFgBR_path examples/animal_texture/1000_F_79496236_8mxTSHy5OilHnJaAxWcw2dwC9SoBLmDK_NW_aug00000057.png \
--stroke1_path stroke_fig/C_skeleton.png \
--stroke2_path stroke_fig/V_skeleton.png \
--stroke3_path stroke_fig/P_skeleton.png \
--stroke4_path stroke_fig/R_skeleton.png \
--out_dir results/animal_texture/brush/

where

  • imageBgUL_path: The upper-left corner image for the background canvas.
  • imageBgUR_path: The upper-right corner image for the background canvas.
  • imageBgBL_path: The bottom-left corner image for the background canvas.
  • imageBgBR_path: The bottom-right corner image for the background canvas.
  • imageFgUL_path: The upper-left corner image for the foreground palette.
  • imageFgUR_path: The upper-right corner image for the foreground palette.
  • imageFgBL_path: The bottom-left corner image for the foreground palette.
  • imageFgBR_path: The bottom-right corner image for the foreground palette.
  • stroke1_path: The trajectory image for the 1st stroke. The stroke pattern is sampled from the (3/8, 3/8) portion of the foreground palette.
  • stroke2_path: The trajectory image for the 2nd stroke. The stroke pattern is sampled from the (3/8, 7/8) portion of the foreground palette.
  • stroke3_path: The trajectory image for the 3rd stroke. The stroke pattern is sampled from the (7/8, 3/8) portion of the foreground palette.
  • stroke4_path: The trajectory image for the 4th stroke. The stroke pattern is sampled from the (7/8, 7/8) portion of the foreground palette.
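The four fractional coordinates place one sample point in each quadrant of the foreground palette. A small helper (hypothetical, for illustration; the repo's exact sampling semantics may differ) converts them to pixel positions for a given palette size:

```python
def palette_center(frac_x, frac_y, size=512):
    """Map fractional palette coordinates like (3/8, 3/8) to pixel positions
    in a size x size foreground palette. Illustrative only; the repo may
    interpret these coordinates differently."""
    return (round(frac_x * size), round(frac_y * size))

for f in [(3/8, 3/8), (3/8, 7/8), (7/8, 3/8), (7/8, 7/8)]:
    print(f, "->", palette_center(*f))  # one point per palette quadrant
```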

Animal hybridization

Run, e.g.,

python3 run.py \
--app hybridization \
--model_path models/animal_texture/network-final.pkl \
--source_dir hybridization_fig/leoraffe/ \
--out_dir results/animal_texture/hybridization/leoraffe/

where

  • source_dir: The directory containing the hole region to be interpolated, two known source texture images adjacent to the hole, and their global Content-Aware Fill results from Photoshop.

After that, post-process the output hybridization image with Auto-Blend Layers against the original image (with hole) in Photoshop to achieve the demo quality shown above.

Citation

@inproceedings{yu2019texture,
    author = {Yu, Ning and Barnes, Connelly and Shechtman, Eli and Amirghodsi, Sohrab and Lukáč, Michal},
    title = {Texture Mixer: A Network for Controllable Synthesis and Interpolation of Texture},
    booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    year = {2019}
}

Acknowledgement

  • This research is supported by Adobe Research Funding.
  • We acknowledge the Maryland Advanced Research Computing Center for providing computing resources.
  • We thank the photographers for licensing their photos under Creative Commons or public domain.
  • We are grateful to the Progressive GAN repository, from which our code benefits substantially.
