We surveyed and listened to AI operators attending Section's AI:ROI Spring 2026 Conference. Their responses tell a story about an adoption gap between AI enthusiasm and results that's defining the next era of work. Here's what we found.
Where Enterprise AI Actually Stands
The weird thing about enterprise AI right now is that suddenly everybody's using it, but almost nobody can prove it's working.
Deloitte's latest State of AI report found that two-thirds of companies report productivity gains from AI - but only 20% of organizations using it daily have actually grown revenue from it. Meanwhile, PwC found that 56% of CEOs haven't realized either revenue or cost benefits - but they plan to keep spending across the board.
That tension showed up clearly when 17,000 people registered for our Section Spring AI:ROI conference - the same week the Citrini report raised the prospect of AI-driven unemployment above 10%, and days after Block cut 40% of its workforce. People weren't there for hype - they wanted clarity.
So we surveyed the audience on where their organizations actually stand with AI, and we analyzed the live chat - 3,743 messages over six hours - to see what individuals are asking about. Many organizations are still early, and measurement is a big gap.
Who is leading AI adoption?
In our survey, nearly seven in ten said they're personally involved in leading AI transformation at their organization in some way.

But there's an open question in most organizations right now about who actually owns AI transformation: 72% of CEOs say they're the main decision-maker on AI, yet 71% of CFOs believe they're in charge of AI decisions, and 77% of CIOs think they are. And a BearingPoint study found that middle managers are the ones actually bridging the gap between leadership vision and execution on the ground.
One conference attendee observed that "CIOs are not doing IT. They are transforming the whole organization." Another noted Zapier CEO Wade Foster’s evolving approach: no single AI owner in 2023, then appointing a non-technical one in 2025 - suggesting the role is still being defined in real time.
Most organizations are in the middle
McKinsey's latest survey found 88% of organizations use AI in at least one business function. But regular weekly usage - the kind that shows up in how people actually work - is lower. Worklytics puts leading companies at around 50% weekly adoption, with wide variation by industry. OpenAI reported that weekly messages in ChatGPT Enterprise increased roughly 8x over the past year, but that's usage volume, not breadth.

We asked our participants how often their organization is using AI. The distribution was bimodal. 57% said at least 30% of their organization uses AI weekly, and 39% put that number above 50%. But 43% were still below 30%.
The chat reflected this. As one attendee put it: "Many companies provide access ONLY. They do not train internally on how employees can use AI for their specific position." When Wade Foster described calling a company-wide "code red" hackathon that jumped adoption from 10% to 50% overnight at Zapier, the audience didn't ask how he built the tools - they asked how he got people to change.
One attendee captured the tension: "Wade's style of leadership is inclusive and he has a lot of faith in his employees - possibly because he knows them personally. How do we create a culture like this in larger organizations?"
AI is starting to hit revenue

A recent IBM CEO survey found only about 25% of AI initiatives have delivered the expected ROI, even as CEOs remain committed to continued AI investment. And research from Google suggests that AI productivity gains are roughly double for organizations that are deploying agents.
74% of respondents in our audience said AI is in at least some revenue-generating workflows - sales, pipeline, customer success, content-driven lead gen. But only 26% had it in more than a quarter of those workflows.
One conference participant asked, "If increased productivity is your ROI metric, what are you doing with the extra time? What do you now expect of your employees?" Another pushed back more directly: "Adoption is not a guarantee of value. The fact that people are spending time with a tool is not a guarantee that it is helping them do better work."
What 412 questions tell us about what is really worrying people

The survey told us what organizations are doing - the chat told us what individuals are worried about. So we extracted every substantive query from the live chat and categorized them by topic. Among the topic-specific queries, several categories dominated.
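The conference's actual classification pipeline isn't described beyond "automated text classification," but a minimal keyword-matching sketch shows the general shape of categorizing chat messages into topics. The topic names and keyword lists here are illustrative assumptions, not the taxonomy Section used.

```python
from collections import Counter

# Hypothetical keyword map - illustrative only, not Section's actual taxonomy.
TOPIC_KEYWORDS = {
    "jobs": ["job", "hire", "workforce", "layoff", "entry-level"],
    "roi": ["roi", "measure", "metric", "baseline", "value"],
    "tools": ["platform", "vendor", "tool", "stack", "lock-in"],
}

def classify(message: str) -> str:
    """Assign a message to the first topic whose keywords it mentions."""
    text = message.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return topic
    return "other"

def topic_shares(messages: list[str]) -> dict[str, float]:
    """Fraction of messages per topic, e.g. {'jobs': 0.23, ...}."""
    counts = Counter(classify(m) for m in messages)
    total = len(messages)
    return {topic: counts[topic] / total for topic in counts}
```

In practice a production pipeline would more likely use an embedding model or an LLM labeler rather than keyword lists, but the output - per-topic shares like the 23% jobs figure - takes the same form.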
The tools question is louder than it looks
Tools and platform choices were among the most-discussed topics in the chat (17% of all questions). Dozens of messages debated platform quality, vendor lock-in, and whether security and governance justified sticking with an incumbent stack.
Behind the vendor debate sat a more structural question - many organizations are still choosing tools before they've defined what they're trying to change. One participant described deploying "hundreds of GPTs or agents" that "actually made things harder for employees to find what they needed."
Another drew the distinction sharply: "I've been very clear in differentiating 'AI adoption' (logging in, using it a bit) from 'AI integration' (learning how to spot when and where to bring AI capabilities into your work)."
By the final session, one attendee captured where some of the audience still sat: "We are talking largely about agents here today, but the majority of the world is trying to figure out how to use generative AI." The tools conversation kept circling back to the same place as the rest of the conference - the gap isn't which platform to pick, it's how to make any of them stick.
The platform anxiety in the chat mirrors a broader industry pattern. Enterprises are expected to spend more on AI, but analysts are now predicting consolidation - organizations are moving from experimenting with dozens of tools to picking a smaller set of winners.
Meanwhile, shadow AI is running ahead of policy - multiple surveys have found that 50-80% of employees use AI tools their employers haven't approved. One attendee in our chat flagged exactly this: "If you limit access to AI as a large firm, you are ensuring people put your data into public ChatGPT." The pattern is consistent - when organizations don't move fast enough on tool strategy, employees make the choice for them.
Nobody knows how to measure this
Measuring ROI was a top practical concern - 21% of all questions - and it spiked in exactly the sessions where you'd expect measurement to matter most. During PK Kota's deep dive on UKG's pilot-to-production process, a third of questions were about measurement. During Mark Mahaney's market analysis, nearly a third asked about ROI proof. Even during the governance session with Cisco's DJ Sampath, participants pivoted to measurement faster than security.
As the Futurum Group reports, AI ROI metrics are shifting from productivity proxies to P&L accountability - revenue growth, profitability - but most organizations aren't ready for that yet. This is consistent with the survey's revenue findings. 74% had AI in revenue workflows, but most were in the shallow end - under 10% of workflows.
One reason they can't go deeper: they can't make the business case for expanding what they don't yet operationalize and measure. One attendee asked bluntly: "Has anyone defined AI ROI? Is it the investment money, time, or patience?"
Another pushed back on the entire framing: "I'm not convinced ROI is the right metric. Nobody cares about the ROI of coffee… we just know it makes people productive." A third shifted the burden: "The ROI is falling on the employees to sell the AI to prove the value for the huge amounts of capex spent on data centers. They did the business case backwards."
And the workforce question isn't going away
Jobs and workforce impact was the largest question category - 23% - and it showed up in almost every session regardless of topic.
One attendee asked: "Doesn't the RPE of AI startups suggest that companies need fewer people for the same valuations - and thus, fewer jobs?" Another cut through the euphemisms: "By 'lower level tasks' do you not mean 'entry level jobs'?" And a third captured the generational anxiety: "What happens to the void left when AI takes over entry level jobs - and older experienced workers leave the market?"
When Matt Lyteson walked through IBM's AI First Triangle - individual, team, org-wide automation - 37% of the questions were about workforce, not workflows. One attendee wrote: "It would be very difficult for college grads to get a foot in. All the initial level jobs like making presentations, generating and analyzing reports are now done by AI." Others asked who bears responsibility when roles get eliminated, and how fresh grads build expertise when the training-ground work is automated.
The concern is legitimate. Entry-level job postings dropped 35% between January 2023 and mid-2025, with tech graduate roles falling even faster. IBM just announced it's tripling Gen Z entry-level hiring - but only after rewriting the roles entirely around AI fluency. Researchers are calling it a potential "seniority cliff" - a generation that never grapples with the low-level problems that build deep intuition.
What to do about it
Put together, the data from Section’s Spring AI:ROI conference attendees points to three concrete opportunities that organizations can act on - and none of these require betting on which AI vendor wins or predicting where the technology goes next.
First, consolidate your tool strategy. The chat data showed anxiety about platform choices - and most of it was about organizations choosing tools before defining what they're trying to change. Shadow AI is already running rampant at most companies. Organizations that are getting ahead of this are thinking strategically about which platforms make sense for their business, then training people on those specifically.
Second, define what you're measuring before you scale. Most organizations have AI in some revenue workflows but can't make the case to expand because they never established baselines. Pick one workflow, measure it for 30 days without AI, then measure it for 30 days with AI. That gives you a real number to take to leadership - not a productivity proxy.
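The baseline-then-AI comparison above reduces to a simple percent-uplift calculation. This sketch is illustrative - the workflow, metric, and numbers are assumptions, not conference data.

```python
# Hypothetical sketch of the 30-day baseline comparison described above.
# Metric values and workflow are illustrative assumptions.

def percent_uplift(baseline: float, with_ai: float) -> float:
    """Percent change of the with-AI period over the baseline period."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return (with_ai - baseline) / baseline * 100

# e.g. qualified leads produced per rep in each 30-day window (assumed)
baseline_leads_per_rep = 40   # 30 days without AI
with_ai_leads_per_rep = 52    # 30 days with AI

uplift = percent_uplift(baseline_leads_per_rep, with_ai_leads_per_rep)
print(f"Uplift: {uplift:.1f}%")  # → Uplift: 30.0%
```

The point is the shape of the argument, not the arithmetic: a before/after number on one named workflow is concrete in a way that a tool-usage proxy never is.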
And third, start redesigning entry-level roles now. The data on entry-level job postings is already moving. Organizations that wait until they've automated the training-ground work to figure out how new employees will build expertise will eventually face a talent pipeline problem that's much harder to fix after the fact.
This analysis is based on data from Section's AI:ROI Spring 2026 conference (March 5, 2026). 17,000 registrants, 3,743 analyzed chat messages, 412 substantive queries extracted (after filtering logistics, noise, and non-substantive messages). Includes explicit questions and implicit queries (challenges, concerns, and information-seeking statements), 103 survey respondents. Chat analysis performed using automated text classification. Speakers included leaders from IBM, Evercore, Northwell Health, Cisco, UKG, Napster, Salesforce, Zapier, Siemens, and OpenAI.