Microsoft now allows developers to fine-tune the Phi-3-mini and Phi-3-medium models

In April this year, Microsoft first announced the Phi-3 family of small language models (SLMs), which offer strong performance at low cost and low latency.

Phi-3-mini is a 3.8B-parameter language model available in two context-length variants, 4K and 128K tokens. Phi-3-medium is a 14B-parameter language model available in the same two context-length variants.

Today, Microsoft officially announced that both Phi-3-mini and Phi-3-medium can be fine-tuned on Azure. This lets developers adapt the base models to a variety of use cases; for example, a developer can fine-tune Phi-3-medium to power a chat application with a specific response style, as sketched below. Some organizations have already started using Phi-3 models in practical AI applications.
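
As an illustration of what a style-specific fine-tuning dataset might look like, the sketch below builds a small JSONL training file in the chat-conversation format commonly used for fine-tuning chat models; the exact field names and file requirements should be verified against the Azure AI Studio documentation, and the example conversations are hypothetical.

```python
import json

# Hypothetical training examples that teach a terse, bullet-point response style.
examples = [
    {
        "messages": [
            {"role": "system", "content": "Answer in at most three short bullet points."},
            {"role": "user", "content": "How do I reset my account password?"},
            {"role": "assistant", "content": "- Open Settings > Account\n- Select 'Reset password'\n- Follow the link sent to your email"},
        ]
    },
    {
        "messages": [
            {"role": "system", "content": "Answer in at most three short bullet points."},
            {"role": "user", "content": "What does the 128K context variant change?"},
            {"role": "assistant", "content": "- Accepts much longer inputs\n- Useful for long documents and transcripts\n- Same base model otherwise"},
        ]
    },
]

# Write one JSON object per line (JSONL), the usual upload format for fine-tuning data.
with open("phi3_style_train.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example, ensure_ascii=False) + "\n")
```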

Microsoft also announced the general availability of Models-as-a-Service (serverless endpoints). As expected, the Phi-3-small model is now available as a serverless endpoint, allowing anyone to quickly develop AI applications without worrying about the underlying infrastructure. Phi-3-vision, a multimodal model available through the Azure AI model catalog, will also soon be offered via a serverless endpoint.
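
As a minimal sketch of calling such a serverless endpoint, the snippet below uses the azure-ai-inference Python package for chat completions; the endpoint URL and API key are placeholders for a hypothetical Phi-3-small deployment created from the Azure AI model catalog.

```python
# pip install azure-ai-inference
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder values: use the endpoint URL and key shown for your own deployment.
endpoint = "https://<your-phi3-small-deployment>.models.ai.azure.com"
key = "<your-api-key>"

client = ChatCompletionsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

# Send a chat request to the serverless endpoint; no infrastructure to manage.
response = client.complete(
    messages=[
        SystemMessage(content="You are a concise technical assistant."),
        UserMessage(content="Explain in two sentences what a serverless model endpoint is."),
    ],
    temperature=0.7,
    max_tokens=200,
)

print(response.choices[0].message.content)
```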

Last month, Microsoft updated the Phi-3-mini model with significant improvements. According to industry benchmarks, Phi-3-mini-4K now scores 35.8 (previously 21.7) with the June 2024 update, and Phi-3-mini-128K scores 37.6 (previously 25.7).

Microsoft also highlighted the new models just made available on Azure:

  1. OpenAI GPT-4o mini
  2. Mistral Large 2
  3. Meta Llama 3.1 family of models
  4. Cohere Rerank

The newly added Azure AI Content Safety feature set includes prompt shields and detection capabilities that are enabled by default for Azure OpenAI Service. Developers can also use these features as content filters for any foundation model, including Phi-3, Llama, Mistral, and many other options.
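
As one hedged example of applying such filtering programmatically, the sketch below screens a user prompt with the standalone azure-ai-contentsafety Python SDK before it is forwarded to a model; the endpoint, key, and severity threshold are placeholders, and this illustrates the general idea of a content filter rather than the exact built-in filtering applied to Azure-hosted models.

```python
# pip install azure-ai-contentsafety
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

# Placeholder values for a hypothetical Azure AI Content Safety resource.
endpoint = "https://<your-content-safety-resource>.cognitiveservices.azure.com"
key = "<your-api-key>"

client = ContentSafetyClient(endpoint, AzureKeyCredential(key))

user_prompt = "Example user input to screen before sending it to Phi-3."
result = client.analyze_text(AnalyzeTextOptions(text=user_prompt))

# Each category (hate, self-harm, sexual, violence) returns a severity score;
# block the request if any category exceeds the threshold your application allows.
blocked = any(item.severity and item.severity >= 2 for item in result.categories_analysis)
print("blocked" if blocked else "allowed")
```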

With these updates and expansions, Microsoft is clearly demonstrating its commitment to advancing AI capabilities on Azure. By continuously investing in making advanced AI models available on Azure and providing accessible tools for fine-tuning and deployment, Microsoft is enabling developers to easily build their own efficient AI solutions.
