We present Extremum Seeking Gradient (ESG), a gradient-descent method designed for analog neural hardware. The method takes inspiration from extremum seeking in control theory, where it is commonly used to optimize dynamic, unknown objective functions. In this work, we adapt these control principles to training analog neural networks, where traditional backpropagation (BP) cannot be employed.

Authors:
- David Prichen
- Or Dicker
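A minimal sketch of the extremum-seeking idea behind ESG (the project code itself is planned in Julia under code/julia; this Python version is illustrative only, and the function names and parameters are assumptions, not the project's API): a sinusoidal dither probes the objective, and demodulating the measured values against the dither recovers a gradient estimate from function evaluations alone, with no backpropagation.

```python
import math

def es_gradient(f, theta, a=0.05, n_samples=32):
    """Zeroth-order gradient estimate: probe f along a sinusoidal
    dither and demodulate the response against the dither signal."""
    g = 0.0
    for j in range(n_samples):
        s = math.sin(2 * math.pi * j / n_samples)
        g += f(theta + a * s) * s
    # For small dither amplitude a, (2/a) * mean(f * dither)
    # approximates df/dtheta (exact for quadratic f).
    return (2.0 / a) * g / n_samples

def es_descent(f, theta, lr=0.1, steps=100):
    """Gradient descent driven purely by extremum-seeking estimates."""
    for _ in range(steps):
        theta -= lr * es_gradient(f, theta)
    return theta

# Minimize (theta - 2)^2 without ever computing an analytic gradient;
# theta_star converges to 2.0.
theta_star = es_descent(lambda th: (th - 2.0) ** 2, theta=-1.0)
```

Because the scheme only ever evaluates the objective, it applies even when the network's internals are physical analog components whose gradients are inaccessible.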
Outline:
- Abstract
- Introduction
- Methods
- Results
- Conclusion
experiments/
- Polynomial fit
- Spiral fit (RNN)
- MNIST
- [-] Small LLM? (MeZO)
Presentation:
- about 5 slides
Julia implementation under code/julia