The record-breaking algorithm, dubbed AlphaTensor, is a spinoff of AlphaZero, which famously trounced human players in chess and Go. “Algorithms have been used throughout the world’s civilizations to perform fundamental operations for thousands of years,” wrote co-authors Drs. Matej Balog and Alhussein Fawzi at DeepMind.
“However, discovering algorithms is highly challenging.” AlphaTensor blazes a trail to a new world where AI designs programs that outperform anything humans engineer, while simultaneously improving its own machine “brain.”
“This work pushes into uncharted territory by using AI for an optimization problem that people have worked on for decades; the solutions that it finds can be immediately developed to improve computational run times,” said Dr. Federico Levi, a senior editor at Nature, which published the study.
Enter the Matrix Multiplication
The problem AlphaTensor confronts is matrix multiplication. Matrix multiplication takes two grids of numbers and multiplies one by the other. Each entry of the new grid is built by multiplying pairs of numbers, one from a row of the first grid and one from a column of the second, and adding the results together.
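To make that concrete, here is a minimal sketch of the schoolbook procedure in Python; the function name and the tiny 2x2 example are illustrative choices, not anything from the study.

```python
def matmul(A, B):
    """Schoolbook matrix multiplication: C[i][k] = sum over j of A[i][j] * B[j][k]."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):           # each row of the first grid
        for k in range(p):       # each column of the second grid
            for j in range(m):   # multiply the pairs and add them up
                C[i][k] += A[i][j] * B[j][k]
    return C

# Two 2x2 grids take 2 * 2 * 2 = 8 scalar multiplications this way.
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```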
Some experts estimate there are more ways to solve matrix multiplication than the number of atoms in the universe.
The Strassen algorithm has reigned as the most efficient approach for over 50 years.
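Strassen's insight, sketched below for the 2x2 case, spends a few extra additions to get away with seven multiplications instead of eight; applied recursively to blocks of bigger matrices, those savings compound. This is the textbook formulation, not code from the paper.

```python
def strassen_2x2(A, B):
    """Strassen's method for a 2x2 product: 7 multiplications instead of 8."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # The result is recombined purely with additions and subtractions.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```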
What if there are even more efficient methods? “Nobody knows the best algorithm for solving it,” Dr. François Le Gall at Nagoya University in Japan, who was not involved in the work, told MIT Technology Review.
“It’s one of the biggest open problems in computer science.”
AI Chasing Algorithms
If human intuition is faltering, why not tap into a mechanical mind? In the new study, the DeepMind team turned matrix multiplication into a game.
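Roughly speaking, the board of that game is a 3D tensor recording which products of entries feed into which results, and every way of breaking that tensor into rank-one pieces is a valid multiplication algorithm: the fewer the pieces, the fewer the multiplications. The sketch below only illustrates that correspondence for the 2x2 case, using the obvious eight-piece decomposition rather than anything AlphaTensor discovered.

```python
import numpy as np
from itertools import product

n = 2  # 2x2 matrices; the real search also targets larger sizes

# T[a, b, c] = 1 exactly when (A entry a) * (B entry b) contributes to (C entry c),
# with entries flattened row by row. This tensor encodes matrix multiplication itself.
T = np.zeros((n * n, n * n, n * n), dtype=int)
for i, j, k in product(range(n), repeat=3):
    T[i * n + j, j * n + k, i * n + k] = 1

# The schoolbook algorithm corresponds to the obvious eight-piece decomposition:
# one rank-one piece (u, v, w) per scalar multiplication A[i][j] * B[j][k].
recon = np.zeros_like(T)
pieces = 0
for i, j, k in product(range(n), repeat=3):
    u = np.eye(n * n, dtype=int)[i * n + j]  # reads A[i][j]
    v = np.eye(n * n, dtype=int)[j * n + k]  # reads B[j][k]
    w = np.eye(n * n, dtype=int)[i * n + k]  # adds the product into C[i][k]
    recon += np.einsum('a,b,c->abc', u, v, w)
    pieces += 1

assert np.array_equal(recon, T)
print(f"Rebuilt the 2x2 matmul tensor from {pieces} rank-one pieces.")
# Strassen's algorithm is a seven-piece decomposition of this same tensor;
# AlphaTensor's game is to find decompositions with as few pieces as possible.
```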
The team also showed the algorithm samples of successful games, like teaching a child the opening moves of chess.
The algorithm rapidly rediscovered Strassen’s original hack, but then surpassed all solutions previously devised by the human mind.
“In fact, AlphaTensor typically discovers thousands of algorithms for each size of matrix,” the team said.
Computer chips are often designed to optimize different computations (GPUs for graphics, for example, or AI chips for machine learning), and matching an algorithm with the best-suited hardware increases efficiency.
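That hardware-specific tuning ultimately comes down to measuring wall-clock time on the target chip. The tiny harness below shows the general idea of such a measurement on whatever machine happens to run it; the sizes and trial count are arbitrary, and it has nothing to do with DeepMind's actual benchmarking setup.

```python
import time
import numpy as np

def average_runtime(n, trials=10):
    """Average wall-clock time of an n x n matrix product on this machine."""
    A = np.random.rand(n, n).astype(np.float32)
    B = np.random.rand(n, n).astype(np.float32)
    A @ B  # warm-up run so one-time setup costs aren't measured
    start = time.perf_counter()
    for _ in range(trials):
        A @ B
    return (time.perf_counter() - start) / trials

# The same operation, timed at a few arbitrary sizes; the numbers depend
# entirely on the hardware and libraries this happens to run on.
for n in (512, 1024, 2048):
    print(f"{n} x {n}: {average_runtime(n) * 1000:.2f} ms per multiply")
```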
Here, the team used AlphaTensor to find algorithms for two popular chips in machine learning: the NVIDIA V100 GPU and Google TPU. Altogether, the AI-developed algorithms boosted computational speed by up to 20 percent.
“A boost in performance would improve a lot of applications.”
The Mind of an AI
Despite AlphaTensor trouncing the latest human record for matrix multiplication, the DeepMind team can’t yet explain why.
Evolving algorithms also doesn’t have to be man versus machine. While AlphaTensor is a stepping stone towards faster algorithms, even faster ones could exist.
“Because it needs to restrict its search to algorithms of a specific form, it could miss other types of algorithms that might be more efficient,” Balog and Fawzi wrote.
With a wealth of algorithms at their disposal, scientists can begin dissecting them for clues to what made AlphaTensor’s solutions tick, paving the way for the next breakthrough.