
Evolving Deep Learning Optimizers [R]

We present a genetic algorithm framework for automatically discovering deep learning optimization algorithms.

Our approach encodes optimizers as genomes that specify combinations of primitive update terms (gradient, momentum, RMS normalization, Adam-style adaptive terms, and sign-based updates) along with hyperparameters and scheduling options.
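The post does not show the encoding itself, but a minimal sketch of such a genome, assuming hypothetical field and primitive names not given in the original, might look like this:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical names for the five primitive update terms the abstract lists;
# the authors' exact encoding is not shown in the post.
PRIMITIVES = ["gradient", "momentum", "rms_norm", "adam", "sign"]

@dataclass
class OptimizerGenome:
    terms: List[str] = field(default_factory=lambda: ["gradient"])
    weights: List[float] = field(default_factory=lambda: [1.0])
    beta1: float = 0.9            # first-moment (momentum) coefficient
    beta2: float = 0.999          # second-moment coefficient
    lr: float = 1e-3
    bias_correction: bool = True  # the evolved optimizer turns this off
    warmup: bool = False          # learning rate warmup flag
    cosine_decay: bool = False    # cosine decay scheduling flag
```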

Through evolutionary search over 50 generations with a population of 50 individuals, evaluated across multiple vision tasks, we discover an evolved optimizer that outperforms Adam by 2.6% in aggregate fitness and achieves a 7.7% relative improvement on CIFAR-10.
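The abstract does not describe the search operators, so the loop below is a generic elitist GA sketch in which the caller-supplied `fitness`, `crossover`, and `mutate` callables stand in for the authors' actual operators and multi-task vision evaluation:

```python
import random

def evolve(population, fitness, crossover, mutate,
           generations=50, elite_frac=0.2):
    """Generic elitist GA loop; operators are caller-supplied placeholders."""
    for _ in range(generations):
        # Rank genomes by aggregate fitness across the benchmark tasks.
        ranked = sorted(population, key=fitness, reverse=True)
        elites = ranked[: max(2, int(elite_frac * len(ranked)))]
        # Refill the population with mutated children of elite parents.
        children = []
        while len(elites) + len(children) < len(population):
            a, b = random.sample(elites, 2)
            children.append(mutate(crossover(a, b)))
        population = elites + children
    return max(population, key=fitness)
```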

The evolved optimizer combines sign-based gradient terms with adaptive moment estimation, uses lower momentum coefficients than Adam (β₁ = 0.86, β₂ = 0.94), and notably disables bias correction while enabling learning rate warmup and cosine decay.
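Taking those components literally, one plausible reading of the per-step update is sketched below in NumPy; the equal-weight mix of the sign and adaptive terms and the exact warmup/decay schedule are assumptions, not the published rule:

```python
import math
import numpy as np

def evolved_step(param, grad, m, v, t, lr=1e-3,
                 beta1=0.86, beta2=0.94, eps=1e-8,
                 warmup_steps=500, total_steps=50_000):
    # Adam-style moment estimates with the evolved, lower betas.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction is disabled in the evolved optimizer, so m and v
    # are used directly rather than divided by (1 - beta**t).
    adaptive = m / (np.sqrt(v) + eps)
    # Assumed equal-weight mix of the sign-based and adaptive terms.
    update = 0.5 * np.sign(grad) + 0.5 * adaptive
    # Linear warmup followed by cosine decay of the learning rate.
    if t < warmup_steps:
        scale = t / warmup_steps
    else:
        progress = (t - warmup_steps) / max(1, total_steps - warmup_steps)
        scale = 0.5 * (1 + math.cos(math.pi * progress))
    return param - lr * scale * update, m, v
```

One consequence worth noting: without bias correction, the moment estimates start biased toward zero, a regime the warmup phase plausibly compensates for.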

Our results demonstrate that evolutionary search can discover competitive optimization algorithms and reveal design principles that differ from hand-crafted optimizers.

submitted by /u/EducationalCicada


Tagged with

#genetic algorithm
#deep learning
#optimization algorithms
#optimizers
#evolutionary search
#gradient
#sign-based updates
#primitive update terms
#adaptive moment estimation
#momentum
#RMS normalization
#hyperparameters
#CIFAR-10