Comments (8)
I changed all the GroupNorm layers in the official structure back to BatchNorm, which made it possible to get over 92.2% around epoch 135. I only tested 150 epochs due to equipment limits and got a best result of 92.5%. I guess it would reach around 93.3% with more than 200 epochs.
Based on my tests last week, swapping in an updated ReLU variant would also add a small improvement, around 0.5%.
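The GroupNorm-to-BatchNorm swap could be sketched as a recursive module rewrite in PyTorch; the helper name and the assumption that every GroupNorm sits on 4D conv features are mine, not from the commenter:

```python
import torch.nn as nn

def groupnorm_to_batchnorm(module: nn.Module) -> nn.Module:
    """Recursively replace every GroupNorm with a BatchNorm2d over the
    same number of channels.

    Sketch only: assumes the normalization layers act on 4D conv
    feature maps, as in a CIFAR-10 ODE-Net."""
    for name, child in module.named_children():
        if isinstance(child, nn.GroupNorm):
            setattr(module, name, nn.BatchNorm2d(child.num_channels))
        else:
            groupnorm_to_batchnorm(child)
    return module
```

One would call this on the model once, before training, e.g. `model = groupnorm_to_batchnorm(model)`.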
from neural-ode-features.
And I just tested 256 filters with a high tol setting for 100 epochs and got a best accuracy of 92.23% at a time cost of 70 minutes. It seems a high tol can speed things up a lot; based on my reading it should hurt accuracy more, but in my tests the negative effect was not that obvious. Now we can get 92.23% in around one hour rather than 6 hours :).
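The speed/accuracy trade-off described here is a general property of adaptive ODE solvers such as dopri5: a looser tolerance lets the solver take larger steps, so it evaluates the dynamics function fewer times. A minimal illustration with SciPy's RK45 (standing in for the project's actual solver) on a toy ODE:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy ODE dy/dt = -y with exact solution y(t) = exp(-t).
def f(t, y):
    return -y

# Same integration, two tolerance settings. A looser tol lets the
# adaptive solver take bigger steps, so f is called fewer times
# (nfev), at the cost of a larger integration error.
loose = solve_ivp(f, (0.0, 10.0), [1.0], method="RK45", rtol=1e-1, atol=1e-1)
tight = solve_ivp(f, (0.0, 10.0), [1.0], method="RK45", rtol=1e-6, atol=1e-6)

print("loose nfev:", loose.nfev, "tight nfev:", tight.nfev)
```

The same mechanism is why a high `tol` shortens each ODE-Net forward/backward pass.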
from neural-ode-features.
I was using tol decay to train faster: basically, start with a high tol setting and then decrease it as the epochs go on. That can reduce training time a lot (depending on the decay rate); hope this idea helps.
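The tol-decay idea above can be sketched as a simple per-epoch schedule; the exponential form and the specific numbers are my own illustrative choices, not the commenter's exact recipe:

```python
def tol_schedule(epoch, tol_start=1e-1, tol_end=1e-3, decay=0.97):
    """Start with a loose solver tolerance for cheap early epochs and
    tighten it geometrically, never going below tol_end."""
    return max(tol_end, tol_start * decay ** epoch)

# In the training loop one would re-configure the ODE solver each
# epoch, e.g. passing rtol=tol, atol=tol to torchdiffeq's odeint.
for epoch in range(0, 200, 50):
    print(epoch, tol_schedule(epoch))
```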
from neural-ode-features.
I managed to obtain 92.2% accuracy on CIFAR-10 with the same net structure proposed by the authors (two residual blocks followed by an ODE block) and more weights (I used 256-filter convolutions in the ODE block).
You should be able to reproduce it by passing the following parameters to train.py and letting the network train for around 150-200 epochs.
dataset | augmentation | model | downsample | filters | dropout | batch_size | batch_accumulation | optim | lr | lrschedule | lrcycle | patience | wd | method | tol | adjoint | seed |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
cifar10 | crop+jitter+flip+norm | odenet | residual | 256 | 0.5 | 128 | 1 | sgd | 0.1 | plateau | 0 | 15 | 0.0001 | dopri5 | 0.001 | True | 23 |
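Assembled into a command line, the row above might look like the following; the flag names are guesses from the table's column headers and may not match train.py's actual CLI:

```shell
# Hypothetical train.py invocation built from the parameter table.
# Flag names are assumed from the column headers, not verified
# against the actual script.
python train.py \
  --dataset cifar10 \
  --augmentation crop+jitter+flip+norm \
  --model odenet \
  --downsample residual \
  --filters 256 \
  --dropout 0.5 \
  --batch_size 128 \
  --batch_accumulation 1 \
  --optim sgd \
  --lr 0.1 \
  --lrschedule plateau \
  --lrcycle 0 \
  --patience 15 \
  --wd 0.0001 \
  --method dopri5 \
  --tol 0.001 \
  --adjoint True \
  --seed 23
```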
I haven't explored many configurations yet, especially larger models, which quickly become very slow to train. If you manage to get something better, please let me know; I'm willing to report the best results in the project's README.
from neural-ode-features.
Thanks a lot for your reply. I will comment here as soon as possible if I find anything that pushes the result over 92.2%.
from neural-ode-features.
With a smaller number of filters and a higher tol setting, it achieved 91.23% at around epoch 110, and each epoch took only around 35-40% of the time. I guess that could be reduced further.
256 filters really cost a lot.
from neural-ode-features.
That's nice. Could you share the exact parameters? I hope to be able to replicate them and get some insight into the training process of ODE-Nets.
from neural-ode-features.
Sure, here is the document. Under your 92.2 setting it cost my computer 11-12 hours (300 epochs).
Using the following settings, it costs around 4 hours and is likely to reach 92% (not sure; I only got 91.3% at 170-180 epochs and did not finish all 300).
tol is a hyperparameter that strongly influences both speed and accuracy; it might help with your training time issues.
I personally think my equipment is limiting the improvement from this parameter; I hope you can get a better result with it.
from neural-ode-features.