OpenAI made a big bet Monday: ChatGPT isn’t just a chatbot anymore—it’s becoming the operating system for how we interact with apps. At DevDay 2025 in San Francisco, CEO Sam Altman unveiled changes that could fundamentally reshape how 800 million weekly users access digital services.
The centerpiece? Apps SDK, which lets you control Spotify, Zillow, Canva, and Figma directly inside your ChatGPT conversation. No more tab-switching hell. You can create playlists, search for apartments, design graphics, and book travel without ever leaving the chat window. Each service requires just a one-time login; after that, natural-language commands handle everything else.
“We hope this is going to be a big step in helping developers quickly scale their products,” Altman said during his keynote.

Building an Ecosystem Beyond Simple Chat
This represents OpenAI’s most ambitious attempt yet to transform ChatGPT from an AI assistant into what company leadership calls an “operating system for reasoning.” That’s not just marketing speak—the new Apps SDK, built on the open Model Context Protocol, lets developers create fully interactive applications that respond to conversational requests and include dynamic interfaces.
The live demos showed what this actually looks like. Users asked ChatGPT to generate marketing materials through Canva, then seamlessly switched to searching Pittsburgh real estate with interactive Zillow maps. Launch partners include Booking.com, Expedia, Coursera, and Figma, with DoorDash, Uber, Target, and OpenTable joining later this year.
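To make the mechanics concrete, here is a minimal, standard-library-only sketch of the pattern the Apps SDK's Model Context Protocol foundation relies on: a service registers named "tools," and the model invokes them with JSON arguments derived from the conversation. The function name, schema, and stub data below are illustrative assumptions, not OpenAI's or Zillow's actual API.

```python
# Conceptual sketch of MCP-style tool registration and dispatch.
# All names and fields here are hypothetical stand-ins.
import json
from typing import Callable

TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register a function so the model can invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def search_listings(city: str, max_price: int) -> list[dict]:
    # A real integration would query a listings service; this returns stub data.
    return [{"city": city, "price": max_price - 50_000, "beds": 2}]

def dispatch(call_json: str) -> str:
    """Route a model-issued tool call (name + arguments) to its handler."""
    call = json.loads(call_json)
    result = TOOLS[call["name"]](**call["arguments"])
    return json.dumps(result)

print(dispatch(
    '{"name": "search_listings",'
    ' "arguments": {"city": "Pittsburgh", "max_price": 400000}}'
))
```

The key design point is that the service never parses natural language itself: the model translates "find me a two-bedroom in Pittsburgh under $400k" into a structured tool call, and the integration only handles typed arguments.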
OpenAI also released AgentKit—a comprehensive toolkit for building AI agents that includes visual workflow design tools, embeddable chat interfaces, and performance testing capabilities. During the presentation, engineer Christina Huang built a working AI workflow in under eight minutes, demonstrating just how fast developers can now create these integrated experiences.
The AMD Deal That Sent Stocks Soaring
The market impact was immediate and dramatic. Advanced Micro Devices shares jumped over 30% after OpenAI announced a multi-year chip supply agreement worth tens of billions of dollars. OpenAI committed to using 6 gigawatts of AMD Instinct GPUs, with options to acquire up to 10% of the chipmaker’s stock through performance-based warrants.
This partnership reflects OpenAI’s strategy to diversify chip supply beyond its existing $100 billion arrangement with Nvidia, according to industry analysts. Initial deployment of 1 gigawatt using AMD’s forthcoming MI450 series begins in the second half of 2026.
“We view this agreement as defining not only for AMD, but for the entire industry dynamics,” AMD Executive Vice President Forrest Norrod told Reuters.
The diversification makes sense. Relying on a single chip supplier for infrastructure powering 800 million weekly users creates massive risk. If Nvidia can’t deliver, or prices spike, or supply constraints emerge, OpenAI needs alternatives. AMD just became that alternative in a very big way.
What Else Launched at DevDay
Beyond the app platform announcement, OpenAI rolled out several other significant updates:
GPT-5 Pro API: Now generally available for high-precision tasks in finance, legal, and healthcare sectors. This matters for enterprises that need accuracy above all else.
gpt-realtime-mini voice model: 70% cheaper than OpenAI's previous advanced voice offering, potentially making voice AI economically viable for applications that couldn't justify earlier pricing.
Sora 2 API preview: OpenAI’s new video generation model becomes accessible to developers, though still in preview mode.
The Financial Reality Behind the Growth
Here’s the uncomfortable truth: despite explosive growth to 800 million weekly users and projected $13 billion annual revenue for 2025, OpenAI reported net losses of $13.5 billion in the first half of 2025 alone, with $2.5 billion in cash outflows. That cash outflow alone works out to a burn rate of more than $13.8 million per day.
Those numbers illustrate the brutal economics of competing in AI infrastructure. Building and maintaining the computational power to serve 800 million users while training progressively larger models requires staggering capital that revenue doesn’t yet cover. OpenAI is essentially in a race to achieve profitability before funding runs out or investor patience expires.
What This Platform Strategy Actually Means
The Apps SDK launch positions ChatGPT as something between a web browser, an operating system, and an app store. Instead of visiting individual websites or opening separate apps, users accomplish tasks through conversational interfaces that coordinate multiple services behind the scenes.
This threatens existing platforms in subtle but significant ways. If users start booking travel through ChatGPT instead of visiting Expedia directly, who owns the customer relationship? When someone creates designs through ChatGPT’s Canva integration rather than opening Canva’s website, whose brand gets reinforced?
App developers face a tricky calculation: integrate with ChatGPT to access 800 million users, but potentially cede control over user experience and data. Don’t integrate, and risk becoming invisible as competitors embrace the platform.
The Model Context Protocol foundation suggests OpenAI wants this to be an open ecosystem rather than a walled garden. Developers can build integrations without being locked into OpenAI-specific technologies. Whether this openness persists as the platform matures remains to be seen—open platforms have a way of becoming less open once they dominate markets.

AgentKit Makes Building Accessible
The visual workflow tools in AgentKit matter because they lower barriers to creating AI agents. Instead of requiring deep technical expertise, developers can design agent behaviors through visual interfaces, then test and deploy quickly. Christina Huang’s eight-minute live build demonstrated this accessibility—though of course, production-ready agents will require more than eight minutes of work.
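The workflows those visual tools produce boil down to a familiar structure: an ordered pipeline of steps where each node's output feeds the next. The sketch below illustrates that shape with plain Python; the node names, routing table, and ticket-triage scenario are invented for illustration and do not reflect AgentKit's actual format.

```python
# Stdlib-only sketch of a pipeline-style agent workflow:
# each step transforms the payload and passes it onward.
# Step names and routing logic are hypothetical.
from typing import Callable

def classify(ticket: str) -> str:
    """Toy intent classifier; a real agent would call a model here."""
    return "refund" if "refund" in ticket.lower() else "general"

def route(intent: str) -> str:
    """Map the classified intent to a downstream handler."""
    return {"refund": "billing-agent", "general": "support-agent"}[intent]

WORKFLOW: list[Callable] = [classify, route]

def run(workflow: list[Callable], payload):
    """Execute each step in order, piping output to the next step."""
    for step in workflow:
        payload = step(payload)
    return payload

print(run(WORKFLOW, "I want a refund for my order"))  # billing-agent
```

A visual builder's contribution is letting developers assemble and rearrange such graphs without writing the plumbing code, which is why an eight-minute demo is plausible even though production agents need far more hardening.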
This democratization of AI agent development could accelerate innovation, letting smaller companies and individual developers create sophisticated integrations that previously required large engineering teams. Or it could flood the ecosystem with half-baked agents that don’t work reliably, creating quality control challenges OpenAI will need to address.
Where This Goes Next
OpenAI clearly envisions ChatGPT becoming the primary interface for digital tasks—the place you go first, with apps and services accessible through conversation rather than requiring separate visits. Whether users embrace this model or prefer maintaining direct relationships with individual services will determine the strategy’s success.
The financial losses underscore how expensive this vision is to build. $13.5 billion in losses during six months suggests OpenAI is betting billions on this platform transition paying off before the money runs out. The AMD partnership helps by diversifying chip supply and potentially reducing costs, but the fundamental economics require either massive revenue growth or substantial cost reductions.
For now, ChatGPT users get a genuinely useful upgrade—integrated app access that eliminates friction from common tasks. Developers get new distribution opportunities with 800 million potential users. And OpenAI gets closer to its vision of ChatGPT as an operating system for the AI era.
Whether that vision becomes reality or joins the long list of ambitious tech platform plays that didn’t pan out will depend on execution quality, user adoption rates, and whether OpenAI can turn those 800 million users into sustainable revenue before the cash runs out.