Comments (4)
Hello!
Sorry for the late answer!
Suppose we have a training dataset of 10000 images, 2 GPUs, and batch_size=5 images per GPU (in the .yaml config we define batch_size per GPU). Then:
total_batch_size == 10 (2 GPUs x 5 imgs)
train_batches == 1000 (10000 imgs / total_batch_size)
limit_train_batches <= 1000
val_check_interval <= 1000
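The arithmetic above can be sketched as follows (a minimal illustration; the function and variable names are my own, not identifiers from the LaMa codebase, and I assume the dataloader keeps the last partial batch, i.e. drop_last=False, so the count rounds up):

```python
import math

def max_train_batches(num_images, num_gpus, batch_size_per_gpu):
    # Images consumed per optimizer step across all GPUs.
    total_batch_size = num_gpus * batch_size_per_gpu
    # Round up: a final partial batch still counts as one batch.
    return math.ceil(num_images / total_batch_size)

# The example from the comment: 10000 images, 2 GPUs, 5 images per GPU.
print(max_train_batches(10000, 2, 5))  # -> 1000
# Both limit_train_batches and val_check_interval must be <= this value.
```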
from lama.
Hi!
Thank you for your appreciation of our work!
[2021-12-07 11:28:26,947][main][CRITICAL] - Training failed due to `val_check_interval` (25000) must be less than or equal to the number of the training batches (3006). If you want to disable validation set `limit_val_batches` to 0.0 instead.
With the current batch_size and 12022 training images, the training procedure produces 3006 training batches, while the config asks it to perform validation every 25000 batches within an epoch.
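For reference, the 3006 figure is consistent with an effective (total) batch size of 4; the exact GPU/batch split behind that number is my assumption, not something stated in the thread (it could be, e.g., 2 GPUs x 2 images or 1 GPU x 4 images), and the count rounds the last partial batch up:

```python
import math

# Hypothetical check: 3006 batches from 12022 images implies an
# effective batch size of 4 images per step (assumed, not stated).
num_images = 12022
total_batch_size = 4
print(math.ceil(num_images / total_batch_size))  # -> 3006
```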
The simplest way to fix this error is to open
lama/configs/training/trainer/any_gpu_large_ssim_ddp_final.yaml
and edit the following parameters:
# @package _group_
kwargs:
...
limit_train_batches: 3006 # <- was 25000
val_check_interval: 3006 # or less, before it was ${trainer.kwargs.limit_train_batches}
Let us know if it worked out!
Thank you for the quick response!!
I edited these parameters and ran it again.
# @package _group_
kwargs:
...
limit_train_batches: 3006
val_check_interval: 3006
But I got the following error.
ValueError: `val_check_interval` (3006) must be less than or equal to the number of the training batches (1203). If you want to disable validation set `limit_val_batches` to 0.0 instead.
Therefore, I edited it to val_check_interval: 1203
and it worked fine.
I wonder whether these parameters are appropriate. How can I calculate the appropriate limit_train_batches and val_check_interval?
I got it! Thank you!!