OVERVIEW
Redefining what’s possible with small language models (SLMs)
- Maximize AI capabilities, lower resource use, and ensure cost-effective generative AI deployments across your applications.
- Accelerate response times in real-time interactions, autonomous systems, apps requiring low latency, and other critical scenarios.
- Run Phi in the cloud, at the edge, or on device, resulting in greater deployment and operation flexibility.
- Phi models were developed in accordance with Microsoft AI principles: accountability, transparency, fairness, reliability and safety, privacy and security, and inclusiveness.
USE CASES
Use Phi for generative AI applications
Local deployments
Operate effectively in offline environments where data privacy is paramount or connectivity is limited.
Accurate and relevant answers
Generate more coherent, accurate, and contextually relevant outputs with an expanded context window.
Latency-bound scenarios
Deploy at the edge to deliver faster responses.
Cost-constrained tasks
Use Phi for simple tasks to reduce resource requirements and lower costs without compromising performance.
Customization and precision
Boost performance by fine-tuning the models with domain specific data.
SECURITY
Built-in security and compliance
Microsoft has committed to investing USD 20 billion in cybersecurity over five years.
We employ more than 8,500 security and threat intelligence experts across 77 countries.
Azure has one of the largest compliance certification portfolios in the industry.
PRICING
Phi models
Phi
Phi models are free to deploy for real-time inference through the Azure AI model catalog. They’re also available on Hugging Face and Ollama.
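When running Phi weights pulled from Hugging Face or Ollama, prompts are expected in a chat-style template. Below is a minimal sketch of building such a prompt, assuming the Phi-3-instruct special-token format (`<|system|>`, `<|user|>`, `<|end|>`, `<|assistant|>`); verify the exact template against the model card for your checkpoint:

```python
def build_phi3_prompt(user_message: str, system_message: str = "") -> str:
    """Format a single-turn prompt in the Phi-3-instruct chat template.
    The token format here is an assumption; check the model card."""
    parts = []
    if system_message:
        parts.append(f"<|system|>\n{system_message}<|end|>")
    parts.append(f"<|user|>\n{user_message}<|end|>")
    parts.append("<|assistant|>")  # the model generates its reply after this tag
    return "\n".join(parts)

prompt = build_phi3_prompt(
    "Summarize this ticket in one sentence.",
    system_message="You are a concise support assistant.",
)
print(prompt)
```

In practice, when using the Hugging Face `transformers` library, the tokenizer’s `apply_chat_template` method builds this string for you from the template shipped with the model, which avoids hard-coding the tokens shown here.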
Phi in Models as a Service (MaaS)
Phi models are available with pay-as-you-go billing via inference APIs.
RELATED PRODUCTS
Our products work better together
Use Phi models with other Azure AI products to build advanced and comprehensive solutions.
RESOURCES
Get started with Phi today
FAQ
Frequently asked questions
- What are the different Phi models?
  - Phi-1: For Python coding
  - Phi-1.5: For reasoning and understanding
  - Phi-2: For language comprehension
  - Phi-3: For language understanding and reasoning tasks; it also performs well on coding benchmarks
- The models from Phi-3 onward are all designed for production use cases and have been through rigorous safety post-training.
- Phi is available across regions where Azure AI Studio is available.
- Phi models are available through Models as a Service (MaaS).
- Phi can be fine-tuned using the Azure Machine Learning SDK; a sample notebook is available. No-code fine-tuning is also available in Azure Machine Learning and Azure AI Studio.
- Phi-mini was trained and optimized for English, and its capabilities in other languages are limited. We encourage you to use Microsoft Translator to translate prompts and responses for the best results.
- Phi-3’s training data has an October 2023 cutoff.
Get started
Account signup
Get started with a free account
Start with USD 200 Azure credit.
Get started with pay-as-you-go pricing
There’s no upfront commitment, and you can cancel anytime.