Eighty-two percent of enterprise organizations have discovered previously unknown AI agents running in their IT infrastructure over the past year. That figure comes from a new Cloud Security Alliance survey, “Autonomous but Not Controlled: AI Agent Incidents Now Common in Enterprises,” published April 21 and commissioned by Token Security.

The gap between perceived and actual visibility is stark. Sixty-eight percent of respondents reported high confidence in their ability to see which agents are running, yet 41% said they discovered unknown agents multiple times during the same year.

Incidents Are Already Happening

Nearly two in three organizations (65%) experienced at least one security incident related to AI agents in the past 12 months, according to the CSA report. The consequences broke down as follows (respondents could report more than one): 61% of incidents resulted in data exposure, 43% caused operational disruption, 41% triggered unintended actions in business processes, 35% led to financial losses, and 31% produced delays in customer-facing or internal services.

“AI agent security and governance encompass an interconnected system spanning visibility, lifecycle management, policy, and monitoring,” said Hillary Baron, AVP of Research at the Cloud Security Alliance. “As agents gain greater autonomy, governance must evolve into a more unified, operational model that can sustain control at scale.”

The most common places where shadow agents were discovered: internal automation and scripting environments (51%), LLM platforms including custom tools, assistants, and plugins (47%), SaaS tools with built-in automation (40%), and developer-created workflows (40%).

The Retirement Debt Problem

Only 21% of organizations have formal processes for decommissioning AI agents. The rest leave agents running after their intended purpose is served, often retaining permissions, credentials, and operational hooks. The CSA report calls this accumulation “retirement debt,” a liability that grows silently until it becomes structural exposure.

Infosecurity Magazine reported that these forgotten agents represent a growing attack surface. Agents that persist with active credentials but no oversight can leak data, execute unintended actions, or be exploited by attackers who discover them before the organization’s own security team does.
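A formal decommissioning process is, at its core, lifecycle bookkeeping: knowing when each agent last did useful work and whether its credentials are still live. The report doesn't prescribe an implementation, but the kind of stale-agent audit it implies could be sketched like this (all names and the 90-day threshold are illustrative assumptions, not from the CSA survey):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AgentRecord:
    """Hypothetical inventory entry for a deployed AI agent."""
    name: str
    last_active: datetime
    credentials_revoked: bool = False

def find_retirement_debt(agents, max_idle_days=90, now=None):
    """Flag agents idle past the threshold that still hold live credentials.

    These are the 'retirement debt' cases: agents whose purpose is served
    but whose permissions and credentials were never cleaned up.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_idle_days)
    return [
        agent for agent in agents
        if agent.last_active < cutoff and not agent.credentials_revoked
    ]
```

In practice the inventory itself is the hard part; per the survey, 51% of shadow agents surfaced in internal automation environments that no central system was tracking at all.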

“AI agents are outpacing the identity systems meant to secure and control them, and it’s already showing up in unknown agents and real incidents in the enterprise,” said Itamar Apelblat, CEO of Token Security, in the CSA press release. “These agents are not just another workload. They are a new type of identity and legacy controls don’t work.”

How Enterprises Are Responding

The survey reveals an industry in transition between human oversight and autonomous operation. Fifty-three percent of organizations run agents autonomously for low-risk tasks while requiring human review for higher-risk actions. Twenty-four percent still use human-in-the-loop models for most tasks. Only 13% report fully autonomous agent deployments.

When agents exceed their defined scope, responses vary: 38% require human approval, 24% log the action for review, and only 11% automatically block it.
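Those three response modes amount to a policy hook that fires whenever an agent steps outside its defined scope. A minimal sketch of such a dispatcher, assuming a simple audit-log interface (the names are illustrative, not from the report):

```python
from enum import Enum

class ScopePolicy(Enum):
    """The three out-of-scope responses reported in the survey."""
    REQUIRE_APPROVAL = "approve"   # 38% of organizations
    LOG_FOR_REVIEW = "log"         # 24%
    AUTO_BLOCK = "block"           # 11%

def handle_out_of_scope(action: str, policy: ScopePolicy, audit_log: list) -> bool:
    """Return True if the action may proceed now, False if it is held or blocked."""
    if policy is ScopePolicy.AUTO_BLOCK:
        audit_log.append(f"BLOCKED: {action}")
        return False
    if policy is ScopePolicy.REQUIRE_APPROVAL:
        audit_log.append(f"PENDING APPROVAL: {action}")
        return False  # held until a human signs off
    # Log-only mode: the action proceeds and is reviewed after the fact.
    audit_log.append(f"LOGGED: {action}")
    return True
```

The survey numbers suggest most organizations sit in the first two branches; only the 11% in auto-block mode stop an out-of-scope action without any human in the loop.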

Looking ahead, 79% of respondents said context-aware controls will be important or very important over the next two years. Sixty-six percent reported they already have guardrails in place for defining agent boundaries.

From Technical Oversight to Business Risk

The CSA’s data reframes agent governance from a technical concern to a core business risk, as Biometric Update noted. With 61% of agent-related incidents resulting in data exposure and 35% causing financial losses, the operational impact is measurable.

The report’s central finding is a confidence gap: organizations believe they have visibility, but the numbers say otherwise. For teams deploying agents at scale, the practical question is whether existing identity and access management systems can handle a class of actors that create themselves, persist beyond their intended lifespan, and retain the permissions of the humans who deployed them.