Stop Talking About AI. Start Talking About What It Actually Does.
AI and public trust are at a crossroads. It's time to stop debating what AI might do and start focusing on what it actually does, for real people, in real work, on a real Tuesday afternoon.

A new Quinnipiac University poll of roughly 1,400 Americans just dropped, and the numbers aren't great for the AI industry.
Fifty-five percent of respondents say AI will do more harm than good, up 11 points from last year. Roughly 76% say they trust it "hardly ever" or "only some of the time." And 62% say they aren't even excited about it.
I've been covering technology for more than 10 years. I've watched the internet go from a curiosity to infrastructure. I've watched cloud computing move from skepticism to standard practice. And I've interviewed thousands of executives who were convinced their technology would change everything.
Here's what I've learned: the technology rarely changes everything. But it usually changes something. And that something — when it works — makes work a little easier, a little faster, or a little more human.
AI is no different. But right now, we're losing that story.
We've created a monster narrative.
The problem isn't the technology. The problem is how we're talking about it.
We've spent the last two years debating whether AI will take your job, crash the economy, or render entire industries obsolete. We've published viral worst-case scenarios. We've watched companies announce layoffs and blame AI — sometimes before the technology was even ready to do the work. We've flooded the conversation with conflicting data that says AI is automating everything and simultaneously making workloads heavier.
No wonder people are confused. No wonder 55% expect more harm than good.
When you tell someone, "This thing is coming for you," their natural reaction isn't curiosity. It's resistance. That's not a technology problem. That's a human problem. And it's entirely predictable.
The better question is always: what does it actually do?
I've spent the last two years covering AI at conferences, in executive boardrooms, and across dozens of interviews with the people building and deploying these tools. And the most compelling stories I've heard have nothing to do with AI replacing anyone.
They sound more like this:
A nurse who spends three fewer hours a week on paperwork and two more hours with patients.
A developer who stops context-switching between documentation and code.
A content strategist who uses AI to do first-pass research so she can focus on the thinking that only she can do.
Those stories don't make headlines. But they're real. And they're happening now.
Technology has always been about this.
The spreadsheet didn't replace the accountant. It eliminated the tedious part of the job and made accountants more valuable. The internet didn't replace journalists. It changed what journalism needed to be and gave writers a global audience.
Every major technology shift has followed the same arc. First comes fear. Then comes adaptation. Then comes a new normal that most people can't imagine living without.
AI is in the fear phase right now. And the industry — including the companies building these tools — is making it worse by leaning into the hype instead of the humanity.
What needs to change.
The companies building AI need to stop leading with capability and start leading with context. Don't tell me your model can do X billion things. Tell me how it helped a small business owner write her first marketing email. Tell me how it helped a first-generation college student prep for a job interview. Tell me how it helped a doctor catch something he might have missed.
And we — the journalists, the analysts, the communicators — need to hold that same standard. Stop writing about AI as an abstraction. Write about what it does for a person on a Tuesday afternoon.
That's where the real story is.
Here's my take going into the rest of 2026.
I'm not going to stop covering AI. But I am going to keep shifting how I cover it — away from the technology itself and toward what the technology makes possible.
Because at the end of the day, nobody actually cares about AI. They care about their job, their time, their family, and their future. If a tool helps with any of those things, it's worth talking about. If it doesn't, no amount of hype will save it.
The poll numbers will turn around when the stories do.
And the stories will turn around when we start telling the right ones.