hzzone / adatrans
Adaptive Nonlinear Latent Transformation for Conditional Face Editing (ICCV 2023)
Hello, thanks for your interesting work!
I have a question about the strength of an edit (for example, in the teaser figure, the "old" attribute can be edited to a specific age: 10 years old, 70+ years old, etc.).
As far as I understand, we need to manually edit the facial attribute vector. For example, given an initial image that I want to edit, I would then have to edit the corresponding entry of the facial attribute vector.
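To make my question concrete, here is a minimal sketch of what I mean by "manually editing the attribute vector". The attribute names, their ordering, and the value range are my assumptions for illustration, not taken from the repo:

```python
import numpy as np

# Assumed attribute ordering for this sketch (hypothetical, not from the repo).
ATTR_NAMES = ["young", "smile", "eyeglasses"]

def edit_attribute(attrs: np.ndarray, name: str, value: float) -> np.ndarray:
    """Return a copy of `attrs` with one attribute set to a target value."""
    edited = attrs.copy()
    edited[ATTR_NAMES.index(name)] = value
    return edited

source = np.array([0.9, 0.1, 0.0])             # e.g. a young, non-smiling face
target = edit_attribute(source, "young", 0.0)  # push the edit toward "old"
```

Is this the intended workflow, and if so, how does the target value map to a specific age such as 70+?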
Looking forward to your response. Thank you!
The `det_face` is always `None`. Do you have any suggestions?
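For anyone hitting the same issue, here is a small guard I added while debugging (a sketch only; `detect_face` is a stand-in for whatever detector the repo actually calls):

```python
# Hedged sketch: wrap the detector call so a silent None becomes a clear error.
def safe_detect(img, detect_face):
    det = detect_face(img)
    if det is None:
        # Common causes I have seen elsewhere: image too small, BGR vs. RGB
        # channel order, or a detection threshold that is too strict.
        raise RuntimeError("No face detected; check image size, channel "
                           "order (RGB vs BGR), and the detector threshold.")
    return det
```

This does not fix the root cause, but it made it easier for me to tell which inputs were failing.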
P.S.: The download link https://sota.nizhib.ai/pytorch-insightface/iresnet34-5b0d0e90.pth for the pre-trained IResNet model is invalid, returning an HTTP 404 (Not Found) error. Could you please provide an alternative way to obtain the IResNet model? I really appreciate your help and look forward to your reply!
Hello, thank you very much for providing the pre-trained models, which are very helpful.
I have a few additional questions. Firstly, I am unsure of the meaning of this sentence in section 4.2.1 of the paper: "Note that we train all attributes presented in Fig. 4 in a unified model". Aren't binary attributes supposed to be trained separately from one-hot conditions like hair color or style? Could you please give me some hints about how you conduct the training when performing the experiment?
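To clarify what confuses me, here is my current guess of what "a unified model" could mean: binary attributes and one-hot groups (e.g. hair color) concatenated into a single condition vector. The attribute names and group contents below are my assumptions, not the paper's:

```python
import numpy as np

# Sketch of my guess: one condition vector = binary attributes + one-hot groups.
def build_condition(binary: dict, hair_color: str) -> np.ndarray:
    binary_part = np.array([binary["smile"], binary["eyeglasses"]], dtype=np.float32)
    colors = ["black", "blond", "brown"]          # assumed one-hot group
    one_hot = np.zeros(len(colors), dtype=np.float32)
    one_hot[colors.index(hair_color)] = 1.0
    return np.concatenate([binary_part, one_hot])

cond = build_condition({"smile": 1, "eyeglasses": 0}, "blond")
# cond has length 5: two binary slots followed by the three-way one-hot.
```

Is this roughly how the unified training conditions the model, or are the binary and one-hot attributes handled differently?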
Secondly, I can't find the code for some experiments mentioned in the paper, such as "Editing with 128 labeled samples only" and "nonlinear editing with a fixed step size at each step". Could you please provide the code or give me some guidance on the code implementation for these specific experiments?
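For the second experiment, here is how I currently imagine "nonlinear editing with a fixed step size" (a rough sketch under my own assumptions; `T` is just a placeholder transformation, not the actual AdaTrans network):

```python
import numpy as np

# Sketch: apply a nonlinear transformation T repeatedly, but rescale each
# predicted offset to a constant norm so every step has the same length.
def edit_fixed_steps(w, T, n_steps=5, step_size=0.1):
    for _ in range(n_steps):
        delta = T(w) - w                          # predicted (nonlinear) offset
        norm = np.linalg.norm(delta)
        if norm > 0:
            delta = delta * (step_size / norm)    # fix the step length
        w = w + delta
    return w
```

Is this close to what the paper does, or is the step size fixed in some other space?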
Thank you very much, and have a great day :)
My E-mail: [email protected]
Hello, thank you so much for your previous help; it gave me a deeper understanding of the paper.
I'm confused by these two sentences in section 4.1 of the paper:
"For the fixed attribute classifier, we trained the last linear layer of ResNet-50 [15] on the binary attributes of the CelebA dataset [28] and the discrete age labels of FFHQ from [30]. We train another classifier from scratch for better classification performance and obtain the attribute labels of the FFHQ dataset".
1. Does that mean the weights of the classifier used to generate attribute labels are different from the one computing the Lmi loss?
2. Which classifier does the provided "Train the classifier" code aim to train? And could you please give me some guidance on the code implementation for training the other classifier?
3. In the "Editing with limited labeled data" experiment, are the attribute labels for the FFHQ dataset also generated by a classifier trained on only 128 samples, or by a classifier trained normally on the CelebA dataset?
Thank you very much, and have a great day :)
My E-mail: [email protected]