
Google’s DeepMind lab set out to test its artificial intelligence against professional gamers, with the ultimate goal of sharpening its AI algorithms’ decision-making. The experiment ended with DeepMind’s AI beating professional StarCraft II players in almost all of the matches they played.
AI has already proven it can beat humans at games like chess, but having an algorithm play a complex computer game such as StarCraft II is a new research challenge. Unlike a chess board, the game map is not fully visible, so the AI cannot track all of its opponent’s movements while deciding on its next move. It must also make those decisions quickly, in real time. Nevertheless, Google’s DeepMind developed algorithms that proved effective against human players, and the company described the outcomes of its tests on its website.
DeepMind’s AI, dubbed “AlphaStar,” didn’t have much trouble taking on professional StarCraft II players. It beat Dario “TLO” Wünsch and then tested its skills against MaNa. Most of the matches were played last month at DeepMind’s London headquarters, but the final and most important match against MaNa was streamed yesterday. It was the only match in which a human player defeated DeepMind’s AI.
The commentators of the matches described the AlphaStar algorithm’s play as “phenomenal” and “superhuman.” StarCraft II players begin a match on opposite sides of the map and build a base, which they use to train an army before invading their opponent’s territory.
While the professional StarCraft II players built more powerful armies, AlphaStar outmaneuvered them in close combat. In one of the matches, it used a fast-moving unit called the Stalker to defeat MaNa.
According to The Verge, experts have begun discussing the games and arguing about

Article from: Danica Simic