It’s been impossible to miss the latest collision of AI and mainstream culture.
The cycle started in earnest last year with the release of OpenAI’s DALL-E 2, a machine learning algorithm that concocts photorealistic images from text prompts. The hype ramped up even further with the company’s release of ChatGPT in November. But things really went off the rails last week, when Microsoft—a big investor in OpenAI with nearly unfettered access to its algorithms—blended a specialized version of ChatGPT into its Bing search engine in the form of a chatbot.
While the capability of these algorithms is undoubtedly advancing quickly, the recent leaps have been as much about access as raw ability: average people can now use these tools themselves.
Now, another AI is getting a mainstream release: Sony just announced its superhuman AI driver, GT Sophy, is officially joining Gran Turismo.
As of today, any player who downloads the latest update can race GT Sophy in the new “Gran Turismo Sophy Race Together” mode. The mode, which Sony says is a special event, will be available through the end of March. Players can compete with Sophy over four races, each more difficult than the last, or go head-to-head with the same car and settings to see how fast the AI can go.
GT Sophy first made headlines last year when the team published a paper in Nature outlining how the algorithm beat top Gran Turismo players. By pushing its car right to the physical limits while also observing racing etiquette—reckless drivers are penalized—the algorithm outpaced human players by seconds in a competition usually decided by milliseconds.
The plan was always to incorporate GT Sophy into the game. But the latest announcement should be viewed as a first step. Much like other recent AI deployments, Sony will seek feedback from players to improve the AI further.
While the recent round of AI releases has been in the category of generative AI—models trained to produce text or images after poring over billions of examples scraped from the internet—GT Sophy was trained a little differently.
Like other game-playing algorithms, it uses deep reinforcement learning: the algorithm observes the conditions of its environment, plays millions of rounds, and scores its own performance, improving by trial and error as it learns which actions pay off. At least 10 times a second, GT Sophy takes stock of its position relative to other cars on the track and the various forces acting on its own car, then makes split-second decisions based on that data.
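The Nature paper doesn't ship Gran Turismo's code, but the trial-and-error loop it describes is easy to sketch. Below is a toy illustration in Python using tabular Q-learning on a made-up brake-for-the-corner problem; GT Sophy's actual system relies on deep neural networks and a far richer view of the track, so the states, actions, rewards, and numbers here are purely illustrative assumptions.

```python
import random

# Toy sketch of a reinforcement learning loop. This is NOT Sony's code:
# GT Sophy uses deep neural networks, while this example uses simple
# tabular Q-learning on an invented "corner" problem.

SEGMENTS = 5                       # track positions approaching a corner
ACTIONS = ["brake", "coast", "accelerate"]

def step(segment, speed, action):
    """Advance one segment; return (new_speed, reward, crashed)."""
    if action == "accelerate":
        speed += 1
    elif action == "brake":
        speed = max(0, speed - 1)
    # The corner is at the last segment; carrying too much speed = crash.
    if segment == SEGMENTS - 1 and speed > 2:
        return speed, -10.0, True      # big penalty for running off track
    return speed, float(speed), False  # otherwise, reward going fast

# Q-table: estimated future reward for each (segment, speed, action).
Q = {}
alpha, gamma, epsilon = 0.1, 0.9, 0.1

for episode in range(10_000):      # "plays millions of rounds," scaled down
    speed = 0
    for segment in range(SEGMENTS):
        state = (segment, speed)
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
        speed, reward, crashed = step(segment, speed, action)
        next_state = (segment + 1, speed)
        best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
        # Core update: nudge the estimate toward reward + discounted future value.
        old = Q.get((state, action), 0.0)
        Q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
        if crashed:
            break

# Replay the learned policy once, greedily, to see what it discovered.
speed = 0
for segment in range(SEGMENTS):
    state = (segment, speed)
    action = max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
    print(segment, action, speed)
    speed, _, _ = step(segment, speed, action)
```

Even in this stripped-down form, the policy should settle on the basic racing lesson: accelerate early, then shed speed before the corner arrives.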
The approach has yielded algorithms that beat humans at games like Go, StarCraft, Stratego, and Diplomacy.
One way these algorithms beat humans is by developing surprising strategies. DeepMind’s AlphaGo famously outfoxed world champion Go player Lee Sedol with a move no human would make. At first dubbed a mistake, it later proved a turning point in the match.
Kazunori Yamauchi, the creator and CEO of Gran Turismo, said last year that GT Sophy gains a speed advantage thanks, in part, to an aggressive strategy for driving through curves. Whereas drivers often brake into a curve and accelerate out of it, GT Sophy brakes during the curve, shifting the load from two to three tires. “We notice that, actually, top drivers such as [Formula One champions] Lewis Hamilton or Max Verstappen actually are doing that, using three tires, going fast in and fast out, all these things that we thought were unique to GT Sophy,” he said.
By racing the AI, Sony hopes players can improve their own skills. The top chess players, for example, aren't computers or humans alone, but the two playing together.
“From the beginning, Gran Turismo Sophy was always about more than just being superhuman; we aspired to create an AI agent that would enhance the experience of players of all levels, and to make this experience available to everyone,” said Michael Spranger, Chief Operating Officer, Sony AI.
Algorithms like GT Sophy may eventually have real-world applications in robotics and self-driving cars. But for now, it’ll be more about the fun of getting schooled by an AI driver.
Image Credit: Sony