Remember AlphaGo, the Google artificial intelligence (AI) that schooled world champion Lee Sedol at Go, a game AI supposedly could never master? Well, fresh from humbling the world's top-ranked Go player, the team behind AlphaGo, Google's DeepMind division, has now set its sights on an all-new challenge - Blizzard's best-selling real-time strategy game StarCraft II!
The announcement of the partnership, made at BlizzCon 2016, Blizzard Entertainment's annual fan convention, stated that DeepMind would work with Blizzard to release StarCraft II as an AI research environment.
Using video games to teach AI
AI companies don't develop algorithms for games just to make professional gamers feel like they've wasted their lives. They do it to teach AI to solve complex problems without being explicitly programmed how. Games offer the perfect environment for that, with the added advantage of built-in scores that give the AI a clear measure of its success.
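The score-as-feedback idea is the heart of reinforcement learning: the agent acts, the game hands back a score, and the agent gradually favours the actions that earned the most. A minimal sketch of that loop, using a made-up one-dimensional "game" and tabular Q-learning (everything here is illustrative; it is not DeepMind's actual code):

```python
import random

random.seed(42)  # reproducible toy run

class ToyGame:
    """A made-up 1-D 'game': move left/right on a 5-cell line; reach cell 4 to win."""
    def __init__(self):
        self.pos = 0

    def step(self, action):            # action: 0 = left, 1 = right
        self.pos = max(0, min(4, self.pos + (1 if action == 1 else -1)))
        done = self.pos == 4
        reward = 1.0 if done else 0.0  # the game's score is the ONLY feedback
        return self.pos, reward, done

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.2):
    """Tabular Q-learning: learn which action earns the most score in each state."""
    q = [[0.0, 0.0] for _ in range(5)]     # one (left, right) value pair per cell
    for _ in range(episodes):
        game, state, done = ToyGame(), 0, False
        while not done:
            # explore occasionally; otherwise take the action with the best learned value
            if random.random() < epsilon:
                action = random.randrange(2)
            else:
                action = 0 if q[state][0] >= q[state][1] else 1
            state_next, reward, done = game.step(action)
            # nudge the estimate toward (reward now + discounted best future score)
            q[state][action] += alpha * (reward + gamma * max(q[state_next]) - q[state][action])
            state = state_next
    return q

q = train()
# After training, the cell next to the goal should clearly prefer "right".
```

No one tells the agent the rules or the winning strategy; the score alone shapes its behaviour, which is exactly why games with explicit scores make such convenient AI testbeds.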
That is what made Go such an appealing prospect for DeepMind. After all, Go was considered the Holy Grail of AI, with more possible board positions than there are atoms in the universe. In fact, when DeepMind began developing AlphaGo to tackle the game's complexities, many believed that an AI capable of beating a human professional was still at least a decade away.
Still, AlphaGo managed to beat the European Go champion, Fan Hui, and then world champion Lee Sedol in quick succession.
With that sort of track record, one would think that StarCraft II wouldn't be hard for DeepMind to crack, especially since this won't be DeepMind's first attempt at beating a video game. Their algorithms have already conquered a number of 2D Atari games and mastered TORCS, a 3D car racing simulator.
StarCraft, though, is an entirely different prospect. The series, arguably one of the world's most competitive eSports, has endured for close to 20 years largely thanks to its constant refinement and depth. If Go was a challenge for DeepMind, StarCraft II will truly put their AI through its paces.
The Starcraft challenge
"StarCraft is an interesting testing environment for current AI research because it provides a useful bridge to the messiness of the real-world," Oriol Vinyals, a DeepMind research scientist, told The Verge. Vinyals would know this better than almost anyone, having once been Spain's top-ranked StarCraft player.
The complexity of StarCraft comes from the level of planning required. Players not only need to hone their battle tactics; they must also manage resources efficiently, scout smartly, and plan their building placement.
Also, unlike Go, where the entire board is visible, in StarCraft II only the portions of the map in the vicinity of a player's units and structures are visible. This "fog of war" makes the game far more complex. As Demis Hassabis, one of DeepMind's co-founders, said, "The thing about Go is obviously you can see everything on the board, so that makes it slightly easier for computers." For this reason, StarCraft is also a test of memory.
The fact that the game unfolds in real time, rather than in turns as in Go or chess, also makes the challenge that much more intense for the AI, as moves have to be computed rapidly.
The version of the game the AI plays, though, will be markedly different from what human players see. The build being developed by Blizzard and DeepMind will be visually stripped down to better suit machine learning systems. Blizzard will also release replays of past StarCraft games so that the AI has a data set to learn from.
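To give a feel for what "stripped down visually" could mean, here is a hypothetical sketch of a machine-readable observation: instead of rendered pixels, the agent sees stacked 2D feature layers (unit type, ownership, visibility) on a small grid, with fog of war hiding anything outside the visible cells. All names, grid sizes, and values below are invented for illustration; they are not the actual Blizzard/DeepMind interface:

```python
MAP = 8  # illustrative map size; the real game is vastly larger

def empty_layer():
    return [[0] * MAP for _ in range(MAP)]

def make_observation(units, visible_cells):
    """Build stacked feature layers from a (hypothetical) game state.

    units: list of (x, y, owner, unit_type) tuples
    visible_cells: set of (x, y) cells currently revealed to the player
    """
    layers = {
        "unit_type": empty_layer(),
        "owner": empty_layer(),
        "visibility": empty_layer(),
    }
    for x, y in visible_cells:
        layers["visibility"][y][x] = 1
    for x, y, owner, unit_type in units:
        if (x, y) in visible_cells:   # fog of war: hidden units are simply omitted
            layers["unit_type"][y][x] = unit_type
            layers["owner"][y][x] = owner
    return layers

obs = make_observation(
    units=[(1, 1, 1, 5), (6, 6, 2, 3)],   # one friendly unit, one enemy unit
    visible_cells={(x, y) for x in range(4) for y in range(4)},
)
# The enemy unit at (6, 6) sits outside the visible area, so it never
# appears in the observation - the AI must remember or scout for it.
```

A representation like this gives a learning system clean numeric inputs while preserving the strategic constraints of the real game, notably the partial visibility that makes StarCraft a test of memory.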
Neither Blizzard nor DeepMind was willing to take a punt on how soon the world's best StarCraft players will be shown up by an AI. Given what happened with AlphaGo, though, we imagine the StarCraft elite are already looking over their shoulders.