
Reasoning reimagined: Introducing Phi-4-mini-flash-reasoning
Unlock faster, more efficient reasoning with Phi-4-mini-flash-reasoning, optimized for edge, mobile, and real-time applications.