Axios AI+

May 27, 2025
I hope you had a meaningful Memorial Day weekend. Personally, I got some much-needed rest. Today's AI+ is 1,221 words, a 4.5-minute read.
1 big thing: Americans want AI to slow down
While the tech industry floors the pedal on AI, the U.S. public would be happy to hit the brakes.
Stunning stat: More than three-quarters of Americans (77%) want companies to develop AI slowly and get it right the first time, even if that delays breakthroughs, the 2025 Axios Harris Poll 100 found.
- Only 23% of Americans want companies to develop AI quickly to speed breakthroughs, even at the price of mistakes along the way.
Why it matters: CEOs, investors and tech companies have pushed the narrative of a do-or-die AI race — but most people would rather get AI right than get it first.
Between the lines: This finding is consistent across generational lines, but the margins vary.
- 91% of boomers and 77% of Gen X favor slower AI.
- That number drops to 63% for millennials — but rises again to 74% for Gen Z, the youngest and most "digital native."
The big picture: The notion of an AI "race" has shaped the new technology's development at every level.
- The leading "frontier developers" of AI — notably OpenAI, Google and Anthropic, with xAI and Meta also in the game — believe they are racing toward "artificial general intelligence," or AGI, a level of AI that surpasses human capabilities.
- AI makers broadly see themselves in a race with one another to build faster, more reliable and more efficient models.
- Every other company sees itself in a race to put AI to work in different industries.
- Nations — chiefly the U.S. and China — also imagine a race for global AI dominance.
Yes, but: All this racing has spurred investment and development, but the public hasn't yet bought into the narrative.
- For many users, AI remains a solution in search of a problem.
- Others doubt AI's world-changing promises and instead see an innovation that will kill jobs and flood the world with bad information.
Our thought bubble: Sometimes public awareness of new technology simply lags reality, and AI boosters believe that's the case here.
- But the public has also now had many chances to learn from each new wave of the digital revolution of the last half-century.
- The Axios Harris Poll respondents may simply have drawn an important lesson from bitter experience with the rise of smartphones and social media over the last two decades.
That lesson is ... business-model-driven mistakes in the early phase of technology adoption are almost impossible to correct once a new platform's cement hardens.
- Moving more slowly in the early days may be the only way not to lock in choices that we regret.
2. AI cheating surge
Use of generative AI to cheat is rampant in high schools and colleges, but there's no clear consensus on how to fight back.
Fighting back means sussing out when students are secretly using AI, and resisting the temptation for teachers and professors to overuse it themselves.
- Any assignment "that you take home and have time to play around with, there's going to be doubt hanging over it," says Stephen Cicirelli, an English professor at Saint Peter's University in Jersey City, New Jersey.
- Cicirelli captured the zeitgeist with a viral post on X (14.7 million views) about a student who got caught submitting an AI-written paper, then apologized with an email that itself appeared to be written by ChatGPT.
By the numbers: Use is ubiquitous in college. A survey of college students taken in January 2023, just two months after ChatGPT's launch, found that some 90% had already used it on assignments, New York Magazine reports.
- One in four 13- to 17-year-olds say they use ChatGPT for help with schoolwork, according to a recent Pew survey. That's double the rate in 2023.
Zoom in: The proliferation of AI-assisted schoolwork is worrying academic leaders.
- 66% think generative AI will cut into students' attention spans, according to a survey of university presidents and other top administrators by the American Association of Colleges & Universities and Elon University's Imagining the Digital Future Center.
- 56% say their schools aren't ready to prepare students for the AI era.
Between the lines: The rise of AI is causing unforeseen headaches.
- Teachers run assignments through detectors, which often don't get it right. Students who didn't use AI have had to appeal to their schools or submit proof of their process to avoid getting zeroes, the New York Times reports.
- Instructors are getting caught leaning on ChatGPT, too. One Northeastern senior demanded tuition reimbursement after discovering her professor had used AI to prep lecture notes and slides, according to the Times.
The other side: As much as they're struggling to wrangle AI use, many educators believe it has the potential to help students and that schools should be teaching them how to use it.
- American University's business school is launching an AI institute for just that purpose.
- "When 18-year-olds show up here as first-years, we ask them, 'How many of your high school teachers told you not to use AI?' And most of them raise their hand," David Marchick, the dean of American University's Kogod School of Business, told Axios. "We say, 'Here, you're using AI, starting today.'"
3. Sam Altman cast as the new Steve Jobs
OpenAI's latest power move in the AI race is to cast CEO Sam Altman as a Steve Jobs for the new era.
Why it matters: Jobs remains Silicon Valley's most revered founder, and since his 2011 death no industry figure has been able to match his success at product innovation, strategy and marketing.
Driving the news: This week OpenAI nabbed Jony Ive, the design guru who closely collaborated with Jobs to shape iconic devices like the iPhone and the iPod, to oversee a big new bet on AI hardware.
- OpenAI's promotional materials paired Altman and Ive in a video that strongly implies Altman's team-up with the Apple veteran makes him Jobs' natural successor.
- Altman has even invoked Jobs directly, saying the Apple founder would be "damn proud" of Ive's move, per Bloomberg's Mark Gurman.
Reality check: Jobs devoted his life to Apple and was fiercely protective of the company. At the very least he would have regretted Ive's decision to pursue his next ambitious goal outside Apple. More likely, he'd have seen it as a betrayal.
Zoom out: Every Silicon Valley founder wants to be Steve Jobs at some point, and for many industry insiders, Altman's launch of ChatGPT, which sparked the generative-AI wave, qualifies as a Jobs-like leap.
- Altman shares with Jobs a penchant for vast visionary schemes and a "reality distortion field" that persuades listeners those schemes could come true.
But there are plenty of ways in which the Altman-Jobs comparison falls short.
- Jobs was a control freak who obsessed over details and held projects back until they were well-tested.
- Altman takes more of a Zuckerberg-style "move fast and break things" approach. OpenAI ships products to the public early so users can try them out and show developers what to fix.
4. Training data
- Trump's vision of an American manufacturing renaissance depends on lots of foreign robots. (Axios)
- Last week's flood of announcements from OpenAI, Google and Anthropic shows how the future of AI is shaping up. (Axios)
- A romance author accidentally left an AI prompt in her published book. (404 Media)
5. + This
For the past several years, Germany has had a twice-a-year rave train that offers a nonstop dance party from Nuremberg to Würzburg and back, complete with bars and DJs.
Thanks to Scott Rosenberg and Megan Morrone for editing this newsletter and Matt Piper for copy editing.
Sign up for Axios AI+