AI systems are now making real-time financial decisions across retail, environmental commodity markets, and legal services — approving refunds, pricing carbon credits, and determining litigation funding — with no human reviewing the outcome before money moves, according to an analysis published by PYMNTS.

The shift from “AI advises a human who decides” to “AI decides and the money moves” is already in production across multiple industries. The question it raises for builders deploying agentic systems in financial contexts is direct: when the agent gets it wrong at scale, who absorbs the loss?

Retail: Automated Refunds at $849.9 Billion Scale

Retail platforms now use AI to determine whether a return is accepted, partially refunded, or rejected. Many issue refunds before a returned item reaches the warehouse, according to PYMNTS. Total retail returns are projected to reach $849.9 billion in 2025, with an estimated 19.3% of online sales coming back, per the National Retail Federation.

The attack surface is widening from both sides. Fraudsters are deploying AI to generate fake damage photos, fabricated receipts, and false documentation to claim refunds, according to Modern Retail. Boll & Branch CEO Scott Tannen reported catching AI-generated photos of “damaged” sheets that carried AI watermarks and didn’t match how cotton actually frays. When his team asked the customer to verify the damage over FaceTime, the customer never responded.

Palo Alto Networks’ Unit 42 research division flagged the same dynamic in a report published last week: stores whose loosely implemented refund systems have earned a reputation for easy payouts face increased risk from automated fraud scripts. “2026 will be the year of this great divergence” between AI-powered attackers and defenders, the report states.

At the volume retailers process, a few percentage points of model error move billions of dollars in the wrong direction.
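The arithmetic behind that claim can be sketched with the NRF projection already cited; the 2% error rate below is a hypothetical for illustration, not a sourced figure:

```python
# Back-of-envelope: cost of refund-model error at projected 2025 return volume.
# The $849.9B figure is the NRF projection cited above; the error rate is hypothetical.
total_returns = 849.9e9      # projected 2025 U.S. retail returns, in dollars
model_error_rate = 0.02      # hypothetical: 2% of refund decisions are wrong

misdirected = total_returns * model_error_rate
print(f"${misdirected / 1e9:.1f}B moved in the wrong direction")  # → $17.0B
```

Even at a 2% error rate, roughly $17 billion lands on the wrong side of the ledger.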

Carbon and Energy Markets: AI-Governed Pricing on $1 Trillion of Trades

In environmental commodity markets, AI-powered platforms already govern real-time pricing and automated settlement of carbon credits and renewable energy certificates, PYMNTS reports. Xpansiv’s CBL exchange, described as the world’s largest spot marketplace for environmental commodities, runs matched transactions on a same-day settlement cycle integrated with 17 global registries, deploying AI across price discovery, risk management, and settlement forecasting.

The global carbon market exceeded $1 trillion in 2025, according to data cited by Carbon Credits. When AI-driven pricing infrastructure misprices a credit or miscalculates a settlement, the error flows directly into the books of the financial institutions, corporate buyers, and fund managers that trade on it.
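One defensive pattern for builders here is a sanity band: accept an AI-derived price only if it sits near an independent reference, and hold everything else for review. A minimal sketch, with all names, prices, and the 5% tolerance chosen for illustration:

```python
# Illustrative sanity check for AI-derived prices before settlement.
# The tolerance and reference source are assumptions, not from the article.
def within_band(model_price: float, reference_price: float, tolerance: float = 0.05) -> bool:
    """Accept the model price only if it is within ±tolerance of an
    independent reference (e.g. the last cleared trade); otherwise
    the transaction is held for human review instead of settling."""
    return abs(model_price - reference_price) <= tolerance * reference_price

print(within_band(14.80, 15.00))  # within 5% of reference → True
print(within_band(9.00, 15.00))   # 40% off reference → False
```

The check does not make the model more accurate; it bounds how far a mispriced credit can travel before a person sees it.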

Legal Services: AI-Gated Litigation Funding on a Path to $50 Billion

Law firms, insurers, and litigation finance platforms now use AI to assess case outcomes, model settlement ranges, and determine whether to fund a claim at all. By the end of 2026, litigation intelligence is projected to operate as continuous, predictive infrastructure that determines whether to file, where to file, when to settle, and for how much, in near real time, according to the National Law Review.

The global litigation funding market could grow to nearly $50 billion by the mid-2030s, per Bloomberg Law. When AI governs whether a claim receives funding and at what valuation, it controls the timing and size of financial transfers — performing the same function as a credit committee at a bank.

The Governance Gap

The common thread across retail, carbon markets, and legal services is not sector-specific disruption. These AI systems are functioning as de facto financial actors: controlling the timing, amount, and direction of money movement without holding a banking license or operating under banking oversight.

For builders deploying agents that touch financial decisions, the implication is concrete. The efficiency that makes AI-driven financial execution attractive — speed, scale, no human bottleneck — is the same mechanism that amplifies damage when something goes wrong. An error propagates across thousands or millions of transactions before anyone detects the problem.
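A common mitigation is to cap the agent's autonomous authority and route outliers to a human, so a runaway error is bounded instead of compounding. A minimal sketch of such a guardrail for a refund flow; the class, thresholds, and action names are all illustrative:

```python
from dataclasses import dataclass

# Illustrative spend guardrail for an agentic refund flow.
# All names and thresholds are hypothetical, not from the article.
AUTO_APPROVE_LIMIT = 200.0      # agent may move money alone below this amount
DAILY_AUTO_BUDGET = 50_000.0    # circuit breaker on total autonomous spend

@dataclass
class Decision:
    amount: float
    action: str  # "auto_refund" or "human_review"

class RefundGuardrail:
    def __init__(self) -> None:
        self.spent_today = 0.0

    def route(self, amount: float) -> Decision:
        # Escalate large refunds, and escalate everything once the daily
        # budget trips, so thousands of bad calls cannot drain accounts
        # faster than anyone notices.
        if amount > AUTO_APPROVE_LIMIT or self.spent_today + amount > DAILY_AUTO_BUDGET:
            return Decision(amount, "human_review")
        self.spent_today += amount
        return Decision(amount, "auto_refund")

g = RefundGuardrail()
print(g.route(49.99).action)    # small refund → auto_refund
print(g.route(999.00).action)   # large refund → human_review
```

The budget cap matters as much as the per-transaction limit: it converts "error propagates across millions of transactions" into "error stops at a fixed dollar amount."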

PYMNTS Intelligence data shows 54% of U.S. adults now use AI for personal tasks, with the average user relying on two to three different tools. Every additional consumer touchpoint where AI can trigger or influence a financial outcome expands the surface area for autonomous financial errors at scale.

No major regulatory framework currently governs AI systems that execute financial decisions outside of licensed banking institutions. The systems are already running. The oversight is not.