
# Using Phi-3 in Azure AI Studio

With the development of generative AI, we need a unified platform to manage different LLMs and SLMs, integrate enterprise data, run fine-tuning/RAG operations, and evaluate business scenarios after LLM and SLM integration, so that intelligent generative AI applications can be implemented effectively. Azure AI Studio is an enterprise-grade platform for building generative AI applications.


With Azure AI Studio, you can evaluate large language model (LLM) responses and orchestrate prompt application components with prompt flow for better performance. The platform facilitates scalability for transforming proof of concepts into full-fledged production with ease. Continuous monitoring and refinement support long-term success.

We can quickly deploy the Phi-3 model in Azure AI Studio through a few simple steps, and then use Azure AI Studio for Phi-3 related work such as Playground/Chat, fine-tuning, and evaluation.

## 1. Preparation

### Azure AI Studio Starter

This is a Bicep template that deploys everything you need to get started with Azure AI Studio. It includes an AI Hub with dependent resources, an AI project, AI Services, and an online endpoint.

#### Quick Use

If you already have the Azure Developer CLI installed on your machine, using this template is as simple as running this command in a new directory:

```shell
azd init -t azd-aistudio-starter
```

Or, if you are using the azd VS Code extension, you can paste this URL into the VS Code command terminal:

```
azd-aistudio-starter
```

### Manual Creation

Create Azure AI Studio on the Azure Portal.


After naming the studio and setting the region, you can create it.


After successful creation, you can access the studio you created through ai.azure.com.


A single AI Studio instance can host multiple projects. Create a project in AI Studio to prepare for deployment.


## 2. Deploy the Phi-3 model in Azure AI Studio

Click the project's Explore option to enter the Model Catalog, then select Phi-3.


Select Phi-3-mini-4k-instruct


Click 'Deploy' to deploy the Phi-3-mini-4k-instruct model

> **Note**: You can select the compute resources when deploying.

## 3. Chat with Phi-3 in the Azure AI Studio Playground

Go to the deployment page, select Playground, and chat with Phi-3 in Azure AI Studio.


## 4. Deploying the Model from Azure AI Studio

To deploy a model from the Azure Model Catalog, you can follow these steps:

- Sign in to Azure AI Studio.
- Choose the model you want to deploy from the Azure AI Studio model catalog.
- On the model's Details page, select Deploy and then select Serverless API with Azure AI Content Safety.
- Select the project in which you want to deploy your models. To use the Serverless API offering, your workspace must belong to the East US 2 or Sweden Central region. You can customize the deployment name.
- In the deployment wizard, select Pricing and terms to learn about the pricing and terms of use.
- Select Deploy. Wait until the deployment is ready and you're redirected to the Deployments page.
- Select Open in playground to start interacting with the model.
- You can return to the Deployments page, select the deployment, and note the endpoint's Target URL and Secret Key, which you can use to call the deployment and generate completions.
- You can always find the endpoint's details, URL, and access keys by navigating to the Build tab and selecting Deployments from the Components section.
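Once the deployment is ready, the Target URL and Secret Key from the Deployments page can be used to call the model from code. Below is a minimal Python sketch using only the standard library; it assumes the serverless endpoint exposes an OpenAI-compatible chat completions route, and `TARGET_URL`, `API_KEY`, and the exact route are placeholders you must replace with your own deployment's values.

```python
# Minimal sketch of calling a serverless Phi-3 deployment.
# TARGET_URL and API_KEY are placeholders taken from the Deployments page;
# the payload assumes an OpenAI-compatible chat completions schema.
import json
import urllib.request

TARGET_URL = "https://<your-deployment>.<region>.models.ai.azure.com/v1/chat/completions"
API_KEY = "<your-secret-key>"

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble a single-turn chat completion request."""
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "temperature": 0.7,
    }
    return urllib.request.Request(
        TARGET_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

def send(req: urllib.request.Request) -> str:
    """Send the request and return the model's reply text."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Calling `send(build_request("What is Phi-3?"))` would then return the model's reply, provided the URL, key, and schema match your deployment.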

> **Note**: Your account must have the Azure AI Developer role on the resource group to perform these steps.

## 5. Using the Phi-3 API in Azure AI Studio

You can send a GET request with Postman to https://{Your project name}.region.inference.ml.azure.com/swagger.json, together with your key, to learn about the provided interfaces.
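The same discovery step can be done in code. The fragment below is only a sketch: it fetches swagger.json with Python's standard library and lists the routes it declares. The URL pattern, key, and Bearer authorization scheme are assumptions to adjust to your own deployment.

```python
# Sketch: fetch a deployment's swagger.json and list its declared routes.
# SWAGGER_URL and API_KEY are placeholders; the Bearer scheme is an assumption.
import json
import urllib.request

SWAGGER_URL = "https://<your-project-name>.<region>.inference.ml.azure.com/swagger.json"
API_KEY = "<your-secret-key>"

def list_paths(spec: dict) -> list:
    """Return the API routes declared in an OpenAPI/Swagger document."""
    return sorted(spec.get("paths", {}).keys())

def fetch_spec() -> dict:
    """Download and parse the Swagger document for this deployment."""
    req = urllib.request.Request(
        SWAGGER_URL, headers={"Authorization": f"Bearer {API_KEY}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With real credentials, `list_paths(fetch_spec())` would show the routes the deployment provides, such as the score API.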


For example, accessing the score API:


From the Postman result you can conveniently see the request parameters as well as the response parameters.
