An artificial agent that combines machine learning techniques with “biologically inspired” mechanisms can learn how to play 49 classic arcade video games when given only minimal background information.

This discovery paves the way to building artificial intelligence systems that excel at learning a variety of challenging tasks from scratch, including QA testing.

“Reinforcement learning” describes how artificial agents learn by interacting with an environment, selecting actions that maximise some notion of cumulative reward. However, applying reinforcement learning in complex, real-world-like situations has proved difficult, and its applicability has tended to be limited to domains in which useful features can be handcrafted.
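
The core idea can be sketched with tabular Q-learning, the classic reinforcement learning algorithm that DQN builds on. This is a minimal illustrative example on a made-up toy corridor environment, not code from the research: the agent learns, from reward alone, that stepping right is the optimal action in every state.

```python
import random

# Toy 1-D corridor (hypothetical example): positions 0..4,
# reward +1 for reaching the terminal position 4.
N_STATES = 5
ACTIONS = [-1, +1]            # step left or step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Q[state][action_index] estimates the expected discounted reward.
Q = [[0.0, 0.0] for _ in range(N_STATES)]

random.seed(0)
for _ in range(500):          # training episodes
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit, occasionally explore.
        if random.random() < EPSILON:
            a = random.randrange(2)
        else:
            a = max(range(2), key=lambda i: Q[s][i])
        s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # Q-learning update: nudge Q(s, a) toward the reward plus
        # the discounted value of the best next action.
        target = r + GAMMA * max(Q[s2])
        Q[s][a] += ALPHA * (target - Q[s][a])
        s = s2

# The greedy policy learned for each non-terminal state (1 = right).
policy = [max(range(2), key=lambda i: Q[s][i]) for s in range(N_STATES - 1)]
print(policy)
```

In a table this small the method works well; the difficulty the paragraph above describes is that real environments, such as a screen of raw pixels, have far too many states to enumerate, which is where deep networks come in.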

Demis Hassabis, Vlad Mnih, Koray Kavukcuoglu, David Silver and colleagues developed a novel artificial agent called a deep Q-network (DQN), which combines reinforcement learning with a class of artificial neural network known as a deep neural network. They tested the DQN on 49 classic Atari 2600 games, including Space Invaders and Breakout.
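
Two ingredients widely credited with making DQN's training stable are experience replay (learning from randomly sampled past transitions) and a periodically frozen “target network” used to compute the learning target. The sketch below is an assumed simplification, not DeepMind's code: it uses a linear model over a small synthetic state instead of a deep network, and random made-up transitions instead of Atari frames, purely to show the shape of the update.

```python
import numpy as np

# Illustrative sketch of DQN-style updates (not the paper's code):
# target y = r + gamma * max_a' Q_target(s', a'), with Q_target a
# frozen copy of the online weights, synced every few steps.
rng = np.random.default_rng(0)
N_ACTIONS, STATE_DIM, GAMMA, LR = 3, 4, 0.99, 0.01

W = rng.normal(size=(N_ACTIONS, STATE_DIM)) * 0.1  # online Q "network"
W_target = W.copy()                                # frozen target copy

# Experience replay buffer of (state, action, reward, next_state),
# filled here with synthetic random transitions for illustration.
replay = [(rng.normal(size=STATE_DIM), int(rng.integers(N_ACTIONS)),
           float(rng.normal()), rng.normal(size=STATE_DIM))
          for _ in range(100)]

for step in range(200):
    s, a, r, s2 = replay[rng.integers(len(replay))]  # sample a past transition
    y = r + GAMMA * (W_target @ s2).max()            # target uses frozen weights
    td_error = y - (W @ s)[a]
    W[a] += LR * td_error * s                        # gradient step on (y - Q(s, a))^2
    if step % 50 == 0:
        W_target = W.copy()                          # periodically sync the target
```

Sampling old transitions at random breaks the correlation between consecutive frames, and holding the target weights fixed stops the network from chasing a target that moves with every update.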

The agent was given only the screen pixels and the game score as input, yet it performed at a level comparable to that of a professional human games tester, achieving more than 75 per cent of the human score on more than half of the games. The DQN also outperformed the best existing reinforcement learning agents on 43 of the games.

The games at which the DQN excelled were highly varied in nature, from side-scrolling shooters to boxing and 3D car-racing games, showing that a single architecture can successfully learn effective strategies in a range of different environments with only minimal prior knowledge.

The work highlights how state-of-the-art machine learning techniques can be combined with biologically inspired mechanisms to create agents capable of learning to master a diverse array of challenging tasks. Human games testers, meanwhile, can count themselves lucky that they still have the more diverse skill set. At the very least, we can all look forward to a reliable co-op buddy in the future.