Academic project on optimizing neural network hyperparameters.
This project was done for the INF8225 artificial intelligence course at Polytechnique Montréal.
Check out our report.
Our experiments would not have been possible without these great tools:
- Theano: symbolic computation library
- TensorFlow: tensor computation and machine learning library
- Keras: deep learning library
- NOMAD and OPAL: software for blackbox optimization (see the sketch after this list)
- Spearmint: software for Bayesian optimization
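To make the setup concrete, here is a minimal sketch (not the project's actual code) of the kind of black-box objective these optimizers work on: a function that takes hyperparameters, trains a small Keras network on MNIST, and returns the validation error to be minimized. The names `objective`, `learning_rate`, and `n_hidden`, as well as the architecture and epoch count, are illustrative assumptions.

```python
# Minimal sketch of a hyperparameter black-box objective, assuming the Keras 2 API.
# NOMAD/OPAL or Spearmint would repeatedly call `objective` with candidate
# hyperparameter values and keep the best result; all names are illustrative.
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from keras.utils import to_categorical


def objective(learning_rate, n_hidden):
    """Train a small MLP on MNIST; return the validation error (to minimize)."""
    (x_train, y_train), (x_val, y_val) = mnist.load_data()
    x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
    x_val = x_val.reshape(-1, 784).astype("float32") / 255.0
    y_train = to_categorical(y_train, 10)
    y_val = to_categorical(y_val, 10)

    model = Sequential([
        Dense(n_hidden, activation="relu", input_shape=(784,)),
        Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=SGD(lr=learning_rate),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
    _, val_accuracy = model.evaluate(x_val, y_val, verbose=0)
    return 1.0 - val_accuracy


# Example: a single evaluation, as an optimizer would perform it.
if __name__ == "__main__":
    print(objective(learning_rate=0.1, n_hidden=128))
```

Grid or random search would call the same function on a fixed set of candidate points; mesh adaptive direct search (NOMAD/OPAL) and Bayesian optimization (Spearmint) differ only in how they choose the next candidates to evaluate.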
LeCun, Y., Cortes, C., & Burges, C. J. (1998). The MNIST database of handwritten digits.
Audet, C., & Dennis Jr., J. E. (2006). Mesh adaptive direct search algorithms for constrained optimization. SIAM Journal on Optimization, 17(1), 188-217.
Audet, C., Dang, C. K., & Orban, D. (2011). Algorithmic parameter optimization of the DFO method with the OPAL framework. In Software Automatic Tuning (pp. 255-274). Springer New York.
Smithson, S. C., Yang, G., Gross, W. J., & Meyer, B. H. (2016). Neural networks designing neural networks: Multi-objective hyper-parameter optimization. arXiv preprint arXiv:1611.02120.
Snoek, J., Larochelle, H., & Adams, R. P. (2012). Practical Bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems (pp. 2951-2959).
Snoek, J., Rippel, O., Swersky, K., Kiros, R., Satish, N., Sundaram, N., ... & Adams, R. P. (2015, February). Scalable Bayesian optimization using deep neural networks. In ICML (pp. 2171-2180).