Adobe announced the Firefly AI Assistant on Wednesday, a cross-application AI agent that takes natural language instructions and orchestrates workflows across Photoshop, Premiere, Lightroom, Express, Illustrator, and Firefly. The assistant enters public beta in the coming weeks. Adobe has not specified whether it will be priced separately from Firefly’s existing credit-based subscription tiers.

From Project Moonlight to Public Beta

Adobe first previewed the concept in October 2025 under the codename “Project Moonlight,” according to TechCrunch. At that stage, the assistant could tap into Acrobat, Photoshop, and Express. The production version expands that scope to the full Creative Cloud catalog.

The assistant works through text prompts, buttons, and sliders. Users describe what they want; the agent suggests actions, orchestrates work across apps, executes workflows, and leaves room for human intervention at any point. It also adapts contextually: if you are editing a product photo set in a forest, the assistant surfaces a slider to adjust the density of trees and foliage rather than requiring manual layer edits across multiple tools.

Skills: Multi-Step Workflow Automations

The most significant feature for production creative teams is the “Skills” system. Skills are multi-step workflow automations that handle an entire creative task end to end. Adobe’s launch example is a “social media assets” skill that takes a source image and automatically adapts it across platforms by cropping, expanding, optimizing file sizes, and storing outputs. No manual switching between apps required.

This is the agentic pattern applied to creative software: define the outcome, let the agent handle tool selection and execution across the suite.
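Adobe has not published a Skills format or API, but the shape of the feature is familiar from other agentic systems. The sketch below is purely illustrative: the step names, app identifiers, and parameters are hypothetical placeholders, not Adobe's actual interfaces, and it exists only to show what a declarative multi-step skill like "social media assets" could look like.

```python
# Illustrative sketch only: Adobe has not published a public Skills API,
# so the apps, actions, and parameters below are hypothetical.
from dataclasses import dataclass


@dataclass
class Step:
    app: str      # which Creative Cloud app the agent drives
    action: str   # operation the agent performs in that app
    params: dict  # parameters the agent fills in (or asks the user for)


# A "social media assets" skill as a declarative, multi-step workflow:
# one source image in, platform-ready variants out, no manual app switching.
SOCIAL_MEDIA_ASSETS = [
    Step(app="photoshop", action="crop", params={"aspect": "1:1"}),
    Step(app="photoshop", action="generative_expand", params={"aspect": "9:16"}),
    Step(app="express", action="resize_export", params={"targets": ["instagram", "x", "linkedin"]}),
    Step(app="express", action="optimize_file_size", params={"max_kb": 500}),
    Step(app="assets", action="store_outputs", params={"folder": "campaign/q3"}),
]


def run_skill(steps: list[Step], source_image: str) -> list[str]:
    """Walk the steps in order; a real agent would pause for human review between them."""
    outputs = [source_image]
    for step in steps:
        # A real implementation would dispatch to each app's automation layer here.
        print(f"[{step.app}] {step.action} {step.params}")
    return outputs
```

The point of the pattern is that the workflow is data, not clicks: the agent can pick the skill, fill in the parameters, and hand control back to the user at any step.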

The Competitive Landscape

Adobe is not operating in a vacuum. Canva and Figma are both building agentic workflows into their platforms. Anthropic’s upcoming Opus 4.7, reported to include a design tool for websites and presentations, has already caused stock declines at Figma and Wix this week. Adobe’s approach differs from all three: rather than building a new AI-native design tool, it is layering an agentic orchestration interface on top of two decades of existing Creative Cloud integrations.

Alexandru Costin, Adobe’s vice president of AI and innovation for creativity and productivity, framed the bet explicitly to TechCrunch: “We have the opportunity with the Firefly AI assistant and with agentic experiences to remove some of the friction in learning this large catalog of tools we have and bring all of that value to our customers at their fingertips.”

That “large catalog” complexity has defined Adobe for years. The Firefly AI Assistant is Adobe’s argument that cross-app orchestration is a better moat than any single product.

The Agent-as-Interface Pattern

For teams building enterprise software agents, Adobe’s architecture is a reference implementation of a specific pattern: using an AI agent as the universal interface to a complex, multi-product software ecosystem. The same logic applies to any incumbent SaaS vendor with a large product catalog: natural-language orchestration, reusable skills, cross-app workflows, and preference learning over time. The creative software industry is now a three-way race to prove which version of that pattern wins.
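Stripped to its essentials, the pattern is a small orchestration loop. The sketch below is a hypothetical, minimal version under stated assumptions: the ProductAdapter and AgentOrchestrator classes, product names, and keyword-based skill routing are placeholders for any multi-product catalog, not anything Adobe has described, and a production system would use an LLM where the keyword match stands in.

```python
# Hypothetical sketch of the agent-as-interface pattern: a natural-language
# request is routed to a registered skill, which drives multiple products.
from typing import Callable


class ProductAdapter:
    """Thin wrapper around one product's automation API (e.g. Photoshop, Express)."""

    def __init__(self, name: str):
        self.name = name

    def execute(self, action: str, **params) -> str:
        # In a real system this would call the product's scripting/automation layer.
        return f"{self.name}:{action}({params})"


class AgentOrchestrator:
    """Routes user intent to reusable skills and runs them across product adapters."""

    def __init__(self, adapters: dict[str, ProductAdapter]):
        self.adapters = adapters
        self.skills: dict[str, Callable] = {}
        self.preferences: dict[str, str] = {}  # learned defaults, e.g. export sizes

    def register_skill(self, name: str, fn: Callable) -> None:
        self.skills[name] = fn

    def handle(self, request: str) -> list[str]:
        # A production agent would use an LLM to map the request to a skill and
        # extract parameters; simple keyword matching stands in for that here.
        for name, fn in self.skills.items():
            if name in request.lower():
                return fn(self.adapters, self.preferences)
        return ["No matching skill; fall back to step-by-step suggestions."]


def social_media_assets(adapters: dict[str, ProductAdapter], prefs: dict) -> list[str]:
    """One reusable skill spanning two products, end to end."""
    return [
        adapters["photoshop"].execute("crop", aspect="1:1"),
        adapters["express"].execute("resize_export", targets=["instagram", "x"]),
    ]


orchestrator = AgentOrchestrator({
    "photoshop": ProductAdapter("photoshop"),
    "express": ProductAdapter("express"),
})
orchestrator.register_skill("social media assets", social_media_assets)
print(orchestrator.handle("Turn this hero image into social media assets"))
```

Whoever owns the adapters owns the moat: the agent layer is replaceable, but the deep hooks into each product are not.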

Adobe also announced updates to the standalone Firefly tool, including speech noise reduction, reverb and music adjustment, a color adjustment tool, stock library integration, and the addition of the third-party Kling 3.0 and Kling 3.0 Omni AI models, according to TechCrunch.