
Alexa Research Paper Shows Genetic Algorithms Offer Best Solution for Neural Network Optimization


Amazon's Alexa Science researchers published a paper providing a theoretical basis for neural-network optimization. While showing that finding an exact solution is computationally intractable, the paper provides a formulation, the Approximate Architecture Search Problem (a-ASP), that can be solved with genetic algorithms.

In a recent blog post describing the work, research engineer Adrian de Wynter cast the problem of choosing a neural-network architecture as an exercise in function approximation; in this formulation, the function is the "true" mapping of input data to outputs, and the approximation is the learned neural-network model. Network architectures are often chosen based on intuition or trial-and-error, but de Wynter claims this "arbitrary selection of a neural architecture is unlikely to provide the best solution." Instead, given a set of neural-network components, such as convolutional or max-pooling layers, an automated optimal architecture search would find the combination of these components that approximates the function with minimal error, and de Wynter's work provides "theoretical guarantees on the accuracy of its computations." He shows that the general architecture search problem (ASP) is intractable; that is, no algorithm can be guaranteed to solve it in polynomial time. He thus proposes a "relaxed" formulation of the problem, approximate-ASP (a-ASP), which can be solved in polynomial time using co-evolutionary genetic algorithms.
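Stated informally, the search problem minimizes approximation error over bounded-length combinations of components. The notation below is illustrative, not the paper's own:

\[
A^{*} \;=\; \arg\min_{A \,\in\, \mathcal{A}_k(S)} \; \mathbb{E}_{x \sim \mathcal{D}}\big[\, \ell\big(f(x),\, A(x)\big) \,\big]
\]

Here S is the available component set, \(\mathcal{A}_k(S)\) the architectures built from at most k components of S, f the true input-to-output mapping, \(\mathcal{D}\) the input distribution, and \(\ell\) an error measure. ASP asks for the exact minimizer; loosely speaking, a-ASP relaxes this to finding a good approximation of the minimizer within bounded resources.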

Automatic optimization of machine-learning systems is an active research area. Many of the major cloud platforms offer AutoML systems, and there are several open-source options. Most AutoML solutions address all parts of the ML pipeline, including data cleanup and hyper-parameter optimization as well as model selection. By contrast, de Wynter's research focuses specifically on selecting the best neural-network model structure. While some researchers have tackled this problem using techniques such as Bayesian optimization, de Wynter's paper claims genetic algorithms "perform better than others in a generalized setting."

A genetic algorithm is an optimization technique based on the concepts of biological evolution: "survival of the fittest." Each potential solution to a problem has a fitness score, indicating how well it solves the problem, and a genetic representation. The key idea is that a solution must be represented in a way that allows for random mutations as well as crossover with other solutions. The genetic algorithm runs for several generations, trying out candidate solutions, applying mutations and crossover, and keeping the fittest results. In de Wynter's formulation, the genetic algorithm searches over combinations of neural-network components, such as convolutional layers, drawn from a set of components that is shown to be computationally equivalent to a Turing machine. The algorithm must find a sequence of these components that produces a network that best approximates the desired mapping of input data to outputs, subject to a constraint on maximum sequence length.
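The following Python sketch illustrates these mechanics on sequences of layer names. It is a toy example, not de Wynter's co-evolutionary algorithm: the component names, constants, and the stand-in fitness function are invented for illustration, where a real search would train and evaluate each candidate network.

import random

COMPONENTS = ["conv3x3", "conv5x5", "maxpool", "dense", "relu"]
MAX_LENGTH = 8          # constraint on maximum sequence length
POPULATION_SIZE = 20
GENERATIONS = 50
MUTATION_RATE = 0.2

def random_architecture():
    """A candidate solution: a random sequence of components."""
    length = random.randint(1, MAX_LENGTH)
    return [random.choice(COMPONENTS) for _ in range(length)]

def fitness(arch):
    """Stand-in for training a network and measuring approximation error.
    Here we simply reward similarity to a hypothetical target pattern so
    the example runs instantly."""
    target = ["conv3x3", "relu", "maxpool", "dense"]
    return sum(a == b for a, b in zip(arch, target)) - abs(len(arch) - len(target))

def mutate(arch):
    """Randomly swap one component for another."""
    arch = list(arch)
    if arch and random.random() < MUTATION_RATE:
        arch[random.randrange(len(arch))] = random.choice(COMPONENTS)
    return arch

def crossover(a, b):
    """Splice a prefix of one parent onto a suffix of the other."""
    cut = random.randint(1, min(len(a), len(b)))
    return (a[:cut] + b[cut:])[:MAX_LENGTH]

population = [random_architecture() for _ in range(POPULATION_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[: POPULATION_SIZE // 2]   # keep the fittest half
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POPULATION_SIZE - len(survivors))]
    population = survivors + children

print("best architecture:", max(population, key=fitness))

In a real architecture search, the fitness evaluation is the expensive step, since each candidate network must be trained and validated; this is why the choice of search strategy and resource bounds matters so much.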

Other research teams have applied genetic or evolutionary algorithms to the problem of optimizing deep-learning systems. Last year Google open-sourced AdaNet, a TensorFlow-based framework for evolutionary AutoML. More recently, Uber open-sourced EvoGrad, a PyTorch library for evolutionary algorithms that treats the population as an abstract probability distribution. According to de Wynter,

[M]any researchers have come to the conclusion that co-evolutionary algorithms provide the best way to build machine learning systems. But the function-approximation framework from my paper helps provide a more secure theoretical foundation for their intuition.
