
Phi open models

A family of powerful, small language models (SLMs) with groundbreaking performance at low cost and low latency
Smaller, less compute-intensive models for generative AI solutions.
OVERVIEW

Redefining what’s possible with SLMs

  • Maximize AI capabilities, lower resource use, and ensure cost-effective generative AI deployments across your applications.
  • Accelerate response times in real-time interactions, autonomous systems, apps requiring low latency, and other critical scenarios.
  • Run Phi in the cloud, at the edge, or on device, resulting in greater deployment and operation flexibility.
  • Phi models were developed in accordance with Microsoft AI principles: accountability, transparency, fairness, reliability and safety, privacy and security, and inclusiveness.
USE CASES

Use Phi for generative AI applications

Local deployments

Operate effectively in offline environments where data privacy is paramount or connectivity is limited.

Accurate and relevant answers

Generate more coherent, accurate, and contextually relevant outputs with an expanded context window.

Latency-bound scenarios

Deploy at the edge to deliver faster responses.

Cost-constrained tasks

Use Phi for simple tasks to reduce resource requirements and lower costs without compromising performance.

Customization and precision

Boost performance by fine-tuning the models with domain-specific data.
SECURITY

Built-in security and compliance  

Microsoft has committed to investing USD 20 billion in cybersecurity over five years.
We employ more than 8,500 security and threat intelligence experts across 77 countries. 
Azure has one of the largest compliance certification portfolios in the industry. 
PRICING

Phi models

Phi

Phi models are available for free for real-time deployment through the Azure AI model catalog. They’re also available on Hugging Face and Ollama.
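For local deployment, the Ollama route mentioned above can be sketched with a few lines of Python against Ollama's local REST API. The model tag `phi3` and the default port are assumptions here; check `ollama list` and the Ollama API documentation for the exact tag and endpoint, and pull the model first with `ollama pull phi3`.

```python
import json
import urllib.request

# Ollama's default local endpoint (an assumption; adjust if you changed the port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(prompt: str, model: str = "phi3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    The model tag "phi3" is an assumption; pull it first with `ollama pull phi3`.
    stream=False asks for a single JSON response instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate_locally(prompt: str, model: str = "phi3") -> str:
    """POST a prompt to a locally running Ollama server and return the text."""
    body = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Because everything runs on localhost, no data leaves the machine, which matches the offline and data-privacy use case described earlier.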

Phi in models as a service (MaaS)

Phi models are available with pay-as-you-go billing via inference APIs.
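As a rough sketch of what calling a pay-as-you-go Phi endpoint looks like: the endpoint URL, request path, and auth header below are placeholders, and the body follows the common OpenAI-style chat-completions schema that serverless deployments typically expose — verify the exact schema and URL in your deployment's details in Azure AI Studio.

```python
import json
import urllib.request


def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completions body.

    The schema is an assumption; check your endpoint's API reference.
    """
    return {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,
    }


def call_phi(endpoint: str, api_key: str, prompt: str) -> str:
    """POST a chat request to a serverless Phi deployment.

    `endpoint`, the "/v1/chat/completions" path, and bearer auth are
    placeholders for your deployment's actual values.
    """
    body = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url=endpoint.rstrip("/") + "/v1/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

With pay-as-you-go billing, each such request is metered by tokens consumed, so there is no infrastructure to provision or manage.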
FAQ

Frequently asked questions

  • Which Phi models are available, and what is each optimized for?
    • Phi-1: For Python coding
    • Phi-1.5: For reasoning and understanding
    • Phi-2: For language comprehension
    • Phi-3: For language understanding and reasoning tasks

    *Phi-3 also performs well on coding benchmarks.

  • Are Phi models ready for production use? Yes, the models from Phi-3 onwards are all designed for production use cases and have been through rigorous safety post-training.
  • Where can I access Phi? Phi is available through the Azure AI Studio model catalog, Hugging Face, and Ollama.
  • In which regions is Phi available? Phi is available across regions where Azure AI Studio is available.
  • Is Phi offered with pay-as-you-go billing? Yes, Phi models are available through MaaS.
  • Can I fine-tune Phi? Phi can be fine-tuned using the Azure Machine Learning SDK. Here is a sample notebook. No-code fine-tuning is available in Azure Machine Learning and Azure AI Studio.
  • What languages does Phi support? Phi-mini was trained and optimized for English, and its capabilities in other languages are limited. We encourage you to use Microsoft Translator to translate prompts and responses for the best results.
  • What is the training data cutoff? October 2023.

Get started

Account signup

Get started with a free account

Start with a USD 200 Azure credit.
Account signup

Get started with pay-as-you-go pricing

There’s no upfront commitment—cancel anytime.