When a close partner starts signing deals with your biggest rival, the standard response is usually a vague statement of support. Microsoft chose a different approach after OpenAI announced a new collaboration with Amazon. The tech giant released a detailed breakdown of its contract terms, effectively reminding the market—and perhaps OpenAI—that while the startup can make new friends, it still lives in Microsoft’s house. The clarification draws a bright line around what OpenAI can actually share with other tech giants and what remains strictly Microsoft property.
Key Takeaways
- Microsoft remains the exclusive cloud provider for all stateless OpenAI APIs.
- Microsoft maintains an exclusive license to intellectual property across all OpenAI models and products.
- OpenAI first-party products, including Frontier, will continue to be hosted on Azure.
The core of this update is a reassertion of dominance. While OpenAI is expanding its list of partners to include Amazon, the underlying mechanics of its business remain tethered to Microsoft’s Azure cloud. Microsoft confirmed that its revenue-sharing agreement covers these new partnerships, meaning it gets a financial cut even when OpenAI works with other providers.
More importantly, Microsoft clarified that it retains an exclusive license to the intellectual property across OpenAI’s models. The partnership was originally designed to allow OpenAI to find “additional compute” elsewhere as it scales, but the primary infrastructure and the most critical product pipelines stay put.
The big deal
This clarification matters because it reveals the actual power dynamic in the AI industry. For the last year, observers have wondered whether OpenAI was drifting away from Microsoft toward true independence. This document suggests the drift happens at the end of a leash: Microsoft has locked down the most valuable parts of the relationship (the intellectual property and the hosting of core products) while allowing OpenAI to seek outside help for raw computing power.
For enterprise customers, this signals stability. If you use OpenAI models through Azure, your service isn’t going anywhere. For competitors like Amazon, it implies a limit to how deep their integration with OpenAI can go. They can partner on compute, but they cannot simply host the full OpenAI stack independently of Microsoft’s terms.
How it works
The technical enforcement of this contract relies on where the software actually runs. Microsoft states it is the exclusive host for “stateless APIs.”
Think of this like a famous chef opening a pop-up restaurant inside a hotel. The hotel (Amazon) provides the tables, the electricity, and the customers, but the food must still be cooked in the chef’s original central kitchen (Microsoft Azure) and trucked over. The hotel gets to put the chef’s name on the door, but they don’t get the recipe or the ability to cook the meal themselves.
In this case, even if a customer accesses OpenAI models through a third-party collaboration, the actual “stateless” processing for those API calls routes back to Azure infrastructure. The data travels, but the core processing engine does not move.
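The routing described above can be sketched in a few lines. This is a conceptual illustration only, not real infrastructure or a real SDK: every name here is hypothetical. The point is simply that a partner's endpoint is a front door, while the inference itself runs on the same Azure-hosted backend either way.

```python
# Conceptual sketch of the routing claim: no matter which "door" a
# request enters through, the core processing happens in one place.
# All function and backend names here are hypothetical.

AZURE_BACKEND = "azure-inference"  # the single place the model runs

def azure_inference(prompt: str) -> str:
    """The core processing engine; in this sketch it lives only on Azure."""
    return f"[{AZURE_BACKEND}] processed: {prompt}"

def partner_endpoint(prompt: str) -> str:
    # A partner adds its own entry point, billing, and branding,
    # but the stateless API call itself is forwarded to Azure.
    return azure_inference(prompt)

def azure_endpoint(prompt: str) -> str:
    # Going through Azure directly hits the same engine.
    return azure_inference(prompt)

# Same engine regardless of the entry point.
print(partner_endpoint("hello"))
print(azure_endpoint("hello"))
```

The extra hop through a partner's front door is also why the latency question raised later in this piece matters: the data travels further, even though the engine stays put.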
The catch
The main limitation here is the definition of “stateless.”
Stateless API: A software interface that treats every request as a new, isolated event, without remembering previous interactions.
Microsoft specifies exclusivity over stateless APIs, but the text does not explicitly detail the rules for “stateful” operations or other types of compute. This leaves a small gray area regarding what exactly OpenAI can build on other clouds.
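To make the "stateless" distinction concrete, here is a minimal sketch. The server function below is hypothetical (it does not call any real API), but it mirrors how stateless chat endpoints generally behave: the server remembers nothing between calls, so the client must resend the full conversation history with every request.

```python
# Illustrative sketch of a stateless endpoint. The server keeps no
# memory between calls; the reply depends only on the payload of the
# current request. This is a toy stand-in, not a real SDK.

def stateless_completion(messages: list[dict]) -> str:
    """Treats each request as an isolated event: the 'model' sees only
    what arrives in this single call, never earlier interactions."""
    return f"(reply based on {len(messages)} message(s))"

# To continue a conversation, the client accumulates history locally
# and ships the whole thing back every time.
history = [{"role": "user", "content": "What is Azure?"}]
print(stateless_completion(history))  # server sees 1 message

history.append({"role": "assistant", "content": "A cloud platform."})
history.append({"role": "user", "content": "Who hosts the APIs?"})
print(stateless_completion(history))  # server sees all 3 messages
```

A "stateful" operation, by contrast, would keep that history (or other session data) on the server side between calls, which is precisely the category the text leaves undefined.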
There is also a financial catch for OpenAI’s new partners. Since the revenue share agreement remains unchanged, a portion of the money generated from OpenAI’s deals with other cloud providers likely flows back to Microsoft. This creates a scenario where Microsoft benefits financially even when its competitors succeed in signing deals with OpenAI.
What to watch
The text explicitly mentions the “Stargate” project, a large-scale infrastructure initiative, as an example of where OpenAI has flexibility to find compute elsewhere. This suggests the sheer size of future models requires more power than one company can easily provide.
- The Frontier launch: Microsoft confirmed that OpenAI’s next major model, Frontier, will be hosted on Azure. Watch to see if this launch happens exclusively on Azure first before trickling down to other partners.
- Performance latency: If API calls from Amazon or other partners have to route back to Azure for processing, watch for differences in speed compared to using Azure directly.
- If you are a developer: You can purchase API access from Microsoft or from OpenAI directly, but the backend infrastructure is the same either way.