Comments (2)
Hi!
I wonder what the reason could be: batch size, number of training epochs, or something else?
What dataset do you use? None of the above parameters alone should break training that severely.
The configs in this repository contain the actual parameter values we used to train the models for the paper, so please refer to them for the exact values.
The training takes about 12 hrs
12 hours per epoch or per full training? If you mean 12h per full training, this is too little.
We trained models for the paper for 1M iterations on 3xV100 with the total batch size of 30. Training of a single model on Places should take approximately 1 week. Big Lama Fourier trained for approximately two weeks. Celeba is much smaller and less diverse than Places and the convergence on Celeba is much faster, approximately 1-2 days.
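As a sanity check on the schedule described above, the figures in the comment (1M iterations at a total batch size of 30) can be combined into a rough sample count. This is just back-of-the-envelope arithmetic using the numbers quoted in the thread, not values from any config file:

```python
# Rough check of the training schedule quoted above:
# 1M iterations with a total batch size of 30.
iterations = 1_000_000
total_batch_size = 30

# Each iteration processes one batch, so the total number of
# (image, mask) samples seen during training is:
samples_seen = iterations * total_batch_size
print(samples_seen)  # 30000000
```

Spread over roughly one week of training, that works out to on the order of 50 samples per second across the three GPUs.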
If I set the batch size to 10, the training time on lama-fourier will be too long.
Please be aware that with DDP, data.batch_size sets the batch size per GPU (not the total batch size). If you mean that 10 is the total batch size, then it is too small. We found that quality degrades when BS < 20. But a small batch size alone should not break training that severely.
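The per-GPU vs. total batch-size distinction under DDP can be sketched as follows. The helper function here is purely illustrative (it is not part of the lama codebase); it just makes explicit that each DDP process loads its own batches, so the effective batch size is the per-GPU value times the number of processes:

```python
# Illustration (not repo code) of the DDP batch-size semantics
# described above: data.batch_size is the batch size *per GPU*,
# so the effective batch size scales with the number of processes.
def effective_batch_size(per_gpu_batch_size: int, num_gpus: int) -> int:
    """Total samples processed per optimizer step under DDP."""
    return per_gpu_batch_size * num_gpus

# Setup reported for the paper: batch size 10 per GPU on 3x V100.
total = effective_batch_size(10, 3)
print(total)  # 30

# The reply above warns that quality degrades below a total of 20.
assert total >= 20
```

So setting data.batch_size to 10 on a single GPU gives a total batch size of 10, below the recommended minimum, while the same value on three GPUs reproduces the paper's total of 30.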
from lama.
Thanks for your reply. I found that I had not updated the codebase (the old code had some errors in preparing the training data), so the training data was insufficient. I am redoing everything now.
I also found the config files for your pre-trained models, and those are very helpful.
Thanks again.
Related Issues (20)
- A simple ckpt to pt model convertor
- Repeated Refinement?
- Error finetuning the big-lama-with-discr model HOT 7
- Data set training problem HOT 1
- After executing the training command, it has been stuck at this point without any progress in the training. HOT 1
- Inpaint a NEW thing? HOT 3
- Refinement with Multiple Images
- How to draw a loss function curve
- Dataset is empty if configuring img_suffix: .jpg in default.yaml
- ONNX Model done HOT 10
- Output Error: No inpainted in the output_dir HOT 1
- Can't install at image in Docker
- The completion effect is not good? HOT 1
- Does Llama support inpainting a given image onto the original image as opposed to just removing from the original image
- Can't find dataloader , no outputs HOT 1
- Is there a way to measure the quality of the result?
- nccl time out HOT 1
- Supplementary materials link does not work HOT 1
- Post-processing for LaMa Model Output to Increase Quality
- The link to download the CelebA-HQ dataset is not working