Researchers have developed an AI that can learn how a video game operates just by watching two minutes of gameplay.
The new system can replicate the ‘game engine,’ which dictates everything from character movement to rendering graphics, creating a cloned version that is indistinguishable from the original when played.
According to the team behind the new system, this approach could be used to radically speed up game development.
The Georgia Tech team's AI can learn how a video game operates just by watching two minutes of gameplay. On the right, the AI replicates Mega Man in the 'Bomberman' stage. There were some failures, including a point at which the character disappears. The original is shown on the left
The new technique works with games in which the action primarily happens on-screen, according to the team.
‘The technique relies on a relatively simple search algorithm that searches through possible sets of rules that can best predict a set of frame transitions,’ says Mark Riedl, associate professor of Interactive Computing and co-investigator on the project.
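The paper's actual algorithm is not public in this article, but the idea Riedl describes can be sketched in miniature: represent each observed frame as a set of facts, treat candidate rules as possible frame-to-frame transformations, and search for the smallest rule set that best predicts the observed transitions. Everything below (the frame encoding, the candidate rules, the scoring function) is a hypothetical illustration, not the team's implementation.

```python
# Hypothetical sketch of searching rule sets that best predict frame
# transitions. Frames are simplified to a dict recording the player's
# x position; real frames would encode sprites, positions and animations.
from itertools import combinations

# Toy observation: four frames in which the player moves steadily right.
frames = [{"x": 0}, {"x": 1}, {"x": 2}, {"x": 3}]

# Candidate rules: each maps a current frame to a predicted next frame.
candidate_rules = {
    "move_right": lambda f: {"x": f["x"] + 1},
    "stand_still": lambda f: dict(f),
    "move_left": lambda f: {"x": f["x"] - 1},
}

def score(rule_names):
    """Count transitions correctly predicted by at least one rule in the set."""
    correct = 0
    for cur, nxt in zip(frames, frames[1:]):
        if any(candidate_rules[name](cur) == nxt for name in rule_names):
            correct += 1
    return correct

# Exhaustively search all non-empty subsets of rules, preferring the
# highest prediction score and, on ties, the smaller rule set.
best = max(
    (set(names)
     for r in range(1, len(candidate_rules) + 1)
     for names in combinations(candidate_rules, r)),
    key=lambda s: (score(s), -len(s)),
)
print(best)  # {'move_right'} best explains the observed transitions
```

A real system would search over far richer rules and thousands of frames, but the principle is the same: the learned 'engine' is just the rule set whose predicted next frames match the video.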
While it may work for games like Super Mario Bros and Mega Man, the researchers say more complex games, such as Clash of Clans, could pose a challenge, as they feature off-screen action.
In order to make the training ‘as difficult as possible,’ the researchers at Georgia Institute of Technology trained their AI on a single ‘speedrunner’ video, in which a player heads straight for a goal.
This allowed the AI to create an accurate predictive model for the 2D style of the game.
After watching just two minutes of gameplay, the researchers found the AI was able to build its own game engine.
And this replicated engine was very similar to the original game.
‘Our AI creates the predictive model without ever accessing the game’s code, and makes significantly more accurate future event predictions.’