EA Says AI Is ‘the Very Core’ of Its Business: What Does That Mean?

On Tuesday, Electronic Arts (EA) held its annual Investor Day — a three-hour presentation meant to give investors a closer look at the company’s direction and its promises to make them money in the coming year. But you’d be forgiven for thinking it was some sort of AI tech conference, given how effusively the technology was brought up and touted as a key component of EA’s future.

While we’ve known for a while that EA and a number of other gaming companies are experimenting with and investing heavily in AI, it was nonetheless a bit overwhelming just how often AI came up throughout the entire presentation. It was mentioned in just about every single segment in some capacity, had its own dedicated segment near the end, and was described during CEO Andrew Wilson’s introductory speech as “the very core of our business” — a rather shocking new mission statement for a company that ostensibly makes and publishes video games.

We watched the entirety of the three-hour Investor Day presentation, and did our best to round up all the AI “highlights” in an effort to paint a picture of what, exactly, EA is doing with AI and what we can expect to see in the coming years if its investments and interest in the tech pan out.

Core of the Business

The first mention of AI in the presentation took place right at the top, in Andrew Wilson’s opening speech. In addition to referring to AI as “the very core of our business” and “not merely a buzzword,” he announced that EA apparently has over 100 active “novel AI projects” in the works right now, ranging from the practical to the very experimental. Wilson divvied these up into three categories: efficiency, expansion, and transformation.

Andrew Wilson at EA’s Investor Day 2024

Wilson described the “efficiency” projects as not just cost-saving, but about doing things faster, cheaper, and at “higher quality.” Specifically, he cited College Football 25, saying the developers could not have made the game’s 150 different stadiums and over 11,000 player likenesses without AI.

For expansion, Wilson said he believes AI can give developers “richer colors” to paint “more brilliant worlds” and make characters with “more depth and intelligence,” while offering “more authenticity and deeper immersion” in the company’s sports games. And for transformation, he described looking into the future and finding entirely new kinds of experiences that don’t currently exist in games, especially around user-generated content.

Chief strategy officer Mihir Vaidya went into more depth about what the “transformation” element will mean for EA in a later section, but Wilson’s opener made it clear that he’s more than bullish on the technology.

AI Examples

In the talks that followed, a number of EA leads highlighted ways in which EA is already working with AI tech in its existing games. Laura Miele, president of EA entertainment, technology, and central development, talked about something called The Sims Hub, the first AI features coming to The Sims universe. EA plans to release a platform with “supercharged discovery tools” that use AI to help players find user-generated content more easily. She showed off an AI-powered photo search feature that lets users drop in photos of real-life houses and find user-generated houses that look similar. Miele also highlighted how the AI can be used for character creation, with users able to drop in an image of a celebrity or a person in a certain outfit and generate a Sim that matches. Miele says The Sims Hub will be released “soon.”

Laura Miele speaking about The Sims Hub at EA’s Investor Day 2024

On the internal tech side, Miele talked about EA’s asset library, which she described as “like the Smithsonian of game assets.” Essentially, EA has a massive database of assets from all of its games and work behind the scenes over the years, and it’s using it to train its machine learning capabilities and large language models. Those capabilities are then being used by the company’s “SEED” innovation lab, aka “Search for Extraordinary Experiences Division”, for things like EA’s “Script to Scene” tool.

Script to Scene lets developers “create characters, direct performances, and define worlds all from text.” Miele shows an example on screen, prompting an AI chat assistant to “build me a Parisian-style residential building.” She then asks to make it taller, changes it to a modern high rise, and expands it into a larger neighborhood. With Script to Scene, Miele claims EA developers could eventually make an entire scene in a game using simple text prompts.

Laura Miele explains Script to Scene at EA’s Investor Day 2024

After Miele, president of EA Sports Cam Weber took the stage to talk about, well, EA Sports. He showed off the already announced FC IQ, which uses “tactical AI” and real-world data to more accurately simulate how players and teams play together in EA Sports FC 25. And he highlighted Wilson’s prior statements about using AI in College Football 25, noting that the stadium creator AI tools in particular reduced creation time “by about 70%” and allowed developers to focus on building the “pageantry” and unique traditions of each school instead. “The investment in these tools and tech will benefit the rest of our portfolio in the months and years ahead,” he said.

And finally, chief experience officer David Tinson briefly talked about an early prototype of a predictive simulation tool EA is working on. He claims the tool will combine EA game data, AI, and IQ ratings to let users run more accurate simulations and answer questions like which team would win a given match, who would have won a hypothetical matchup, and which team is the best overall.

Cardboard boxes and AI soccer stars

If all that somehow wasn’t enough AI chatter for you, chief strategy officer Mihir Vaidya took the stage next to talk about AI and nothing but. He opened by comparing the technology to the advent of makeup tutorial videos and cat videos, which he says people initially dismissed as trivial or niche but which are now ubiquitous and wildly popular. He says EA’s experimental AI efforts should be viewed the same way as “early YouTube videos,” and that while what he shows might feel rudimentary, it will naturally get better as AI improves.

Vaidya was specifically brought onstage to talk about the “transformation” portion of AI that Wilson mentioned earlier. He says the experiences he shows onstage are “not intended to replace AAA games, but instead unlock new and adjacent categories that add as opposed to take away from the existing gaming market.”

Those “new and adjacent categories” Vaidya wanted to show off largely seem to involve apps of some sort that let people use AI to shuffle around EA’s proprietary assets and spit out rough minigames. One example he shows involves two people asking an AI to “make a maze out of cardboard boxes.” They then ask the AI to make it more complex, then multi-level. Then they ask the AI to “make two characters with weapons,” selecting from a gallery of existing EA characters before settling on two that purport to be community-designed, then equipping them with guns from a library of weapons. They then select from a handful of game modes and start chasing each other around the cardboard maze. The video ends with one of them asking the AI to “make it more epic,” resulting in a giant cardboard box pyramid seemingly appearing in real time to the astonishment of the two players.

A screenshot of an AI experiment at EA’s Investor Day 2024

In a second demonstration, Vaidya set out to show how AI can be used to create “more believable characters” that players care about even more. Unfortunately, we didn’t get to see much of that in action, as Vaidya encourages investors to check out the demo at the investor event after the presentation. But we do see a few seconds of what he’s talking about: an AI version of soccer star Jude Bellingham is apparently available to answer questions posed by investors, using AI to simulate his likeness, voice, and likely responses. Vaidya demonstrates by asking him what it was like to play at the Bernabéu “in front of millions of screaming fans.” Bellingham briefly explains the indescribable thrill of the experience in a flat monotone, expressionless.

Finally, Vaidya demonstrates how EA wants to use AI for “social ecosystems”, specifically something codenamed Project AIR. Project AIR seems to be a way to use short text prompts to generate characters, have text-based interactions with them, and then share those conversations with friends. In his example, he creates a “legendary investor” character using the prompt, “A high-stakes VC who swims in the deep waters of innovation.”

He then decides the “game” will be to pitch business ideas to him. In an interface that looks suspiciously like TikTok, the user pitches “self-tying shoes” only to be slapped down. He then invites a friend to help him pitch, but, at a loss for ideas on how to make self-tying shoes more interesting, he uses an AI co-pilot to write the pitch for him, which ultimately succeeds.

A screenshot from an AI experiment at EA’s Investor Day 2024

What does it all mean?

That’s a lot of noise about AI, an almost astonishing amount even from a company we knew was pushing the tech hard. And it’s a lot of noise specifically about generative AI. Artificial intelligence, broadly, has been used in games for decades. But generative AI, which is involved in most of the things EA shared yesterday, is different. Generative AI effectively spits out brand new images, text, sound, or other content based on the data it’s fed, which has led to numerous ethical questions regarding its use. EA has managed to answer some of those effectively. For instance, EA is training its AI on its own proprietary material, so there’s seemingly no concern about it stealing copyrighted work (we’ve reached out to EA for comment).

But other concerns remain. There’s the environmental impact, for one, which we’ve also asked EA about. And then there are the issues around using personal likenesses. EA says Jude Bellingham agreed to let EA train an AI on his likeness and voice for the model we saw yesterday, but will EA ensure it has permission from every single individual it uses in the future? What about voice actors for beloved characters, who are at this very moment on strike against companies including EA over these exact protections? We’ve asked EA for comment on all of this, too.

On the game development side, how does this implementation of AI impact individual creatives at the company? It’s easy to say that tools such as Script to Scene are intended to free up developers to work on other things. But it’s a practical reality that the games industry has seen two years of unprecedented layoffs just as AI is beginning to creep into the mainstream, and EA has been a part of that. There is no guarantee that this tech won’t eventually be used to replace developers outright. Developers have said over and over that they are rightfully nervous about this possibility, but neither EA nor its investors seem especially interested in addressing it. Nor do they seem to see much difference between the intentional, creative work of designers assembling a thoughtful map for a shooter and anyone at all prompting an AI to spit out a random array of cardboard boxes. It’s all content to be sold in the end.

As some have pointed out online, EA is no stranger to pushing hard on new tech only to back off the second the wind changes. But this feels different. EA leadership made it abundantly clear at the Investor Day that the company is already very, very deep in on AI, even if the experiments themselves are still in their infancy. Investors might be kept happy by these experiments, but perhaps fans of EA’s 40-year history as a video game company ought to be asking why AI, not games, has suddenly become the “core” of its business.

Rebekah Valentine is a senior reporter for IGN. Got a story tip? Send it to [email protected].