Merge pull request #2188 from MicrosoftDocs/main
1/7/2025 PM Publish
Taojunshen authored Jan 7, 2025
2 parents 8b21d47 + 910ada7 commit 058aad5
Showing 10 changed files with 719 additions and 481 deletions.
16 changes: 8 additions & 8 deletions articles/ai-services/agents/concepts/model-region-support.md
@@ -7,7 +7,7 @@ author: aahill
ms.author: aahi
ms.service: azure-ai-agent-service
ms.topic: conceptual
-ms.date: 12/11/2024
+ms.date: 01/07/2025
ms.custom: azure-ai-agents
---

@@ -19,13 +19,13 @@ Agents are powered by a diverse set of models with different capabilities and pr

Azure AI Agent Service supports the same models as the chat completions API in Azure OpenAI, in the following regions.

-| **Region** | **gpt-4o**, **2024-05-13** | **gpt-4o**, **2024-08-06** | **gpt-4o-mini**, **2024-07-18** | **gpt-4**, **0613** | **gpt-4**, **1106-Preview** | **gpt-4**, **0125-Preview** | **gpt-4**, **vision-preview** | **gpt-4**, **turbo-2024-04-09** | **gpt-4-32k**, **0613** | **gpt-35-turbo**, **0613** | **gpt-35-turbo**, **1106** | **gpt-35-turbo**, **0125** | **gpt-35-turbo-16k**, **0613** |
-|:--------------|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
-| eastus | ✅ | ✅ | ✅ | - | - | ✅ | - | ✅ | - | ✅ | - | ✅ | ✅ |
-| francecentral | - | - | - | ✅ | ✅ | - | - | - | ✅ | ✅ | ✅ | - | ✅ |
-| japaneast | - | - | - | - | - | - | ✅ | - | - | ✅ | - | ✅ | ✅ |
-| uksouth | - | - | - | - | ✅ | ✅ | - | - | - | ✅ | ✅ | ✅ | ✅ |
-| westus | ✅ | ✅ | ✅ | - | ✅ | - | ✅ | ✅ | - | - | ✅ | ✅ | - |
+| **Region** | **gpt-4o**, **2024-05-13** | **gpt-4o**, **2024-08-06** | **gpt-4o-mini**, **2024-07-18** | **gpt-4**, **0613** | **gpt-4**, **1106-Preview** | **gpt-4**, **0125-Preview** | **gpt-4**, **turbo-2024-04-09** | **gpt-4-32k**, **0613** | **gpt-35-turbo**, **0613** | **gpt-35-turbo**, **1106** | **gpt-35-turbo**, **0125** | **gpt-35-turbo-16k**, **0613** |
+|:--------------|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+| eastus | ✅ | ✅ | ✅ | - | - | ✅ | ✅ | - | ✅ | - | ✅ | ✅ |
+| francecentral | - | - | - | ✅ | ✅ | - | - | ✅ | ✅ | ✅ | - | ✅ |
+| japaneast | - | - | - | - | - | - | - | - | ✅ | - | ✅ | ✅ |
+| uksouth | - | - | - | - | ✅ | ✅ | - | - | ✅ | ✅ | ✅ | ✅ |
+| westus | ✅ | ✅ | ✅ | - | ✅ | - | ✅ | - | - | ✅ | ✅ | - |


## More models
25 changes: 11 additions & 14 deletions articles/ai-services/openai/how-to/assistants-logic-apps.md
@@ -14,9 +14,6 @@ recommendations: false

# Call Azure Logic apps as functions using Azure OpenAI Assistants

-> [!NOTE]
-> This functionality is currently only available in Azure OpenAI Studio.
[Azure Logic Apps](https://azure.microsoft.com/products/logic-apps) is an integration platform in Azure that allows you to build applications and automation workflows with low-code tools, enabling developer productivity and faster time to market. By using the visual designer and selecting from hundreds of prebuilt connectors, you can quickly build a workflow that integrates and manages your apps, data, services, and systems.

Azure Logic Apps is fully managed by Microsoft Azure, which frees you from worrying about hosting, scaling, managing, monitoring, and maintaining solutions built with these services. When you use these capabilities to create [serverless](/azure/logic-apps/logic-apps-overview) apps and solutions, you can just focus on the business logic and functionality. These services automatically scale to meet your needs, make automation workflows faster, and help you build robust cloud apps using little to no code.
@@ -31,7 +28,7 @@ The Assistants playground enumerates and lists all the workflows in your subscri
* [Request trigger](/azure/connectors/connectors-native-reqres?tabs=consumption): Function calling requires a REST-based API. Logic Apps with a request trigger provides a REST endpoint. Therefore only workflows with a request trigger are supported for function calling.
* Schema: The workflows you want to use for function calling should have a JSON schema describing the inputs and expected outputs. Using Logic Apps, you can provide this schema in the trigger, and it is automatically imported as a function definition.

-If you already have workflows with above three requirements, you should be able to use them in Azure OpenAI Studio and invoke them via user prompts.
+If you already have workflows with the above three requirements, you should be able to use them in Azure AI Foundry and invoke them via user prompts.
If you do not have existing workflows, you can follow the steps in this article to create them. There are two primary steps:
1. [Create a Logic App on Azure portal](#create-logic-apps-workflows-for-function-calling).
2. [Import your Logic Apps workflows as a function in the Assistants Playground](#import-your-logic-apps-workflows-as-functions).
@@ -50,7 +47,7 @@ Here are the steps to create a new Logic Apps workflow for function calling.
1. After Azure successfully deploys your logic app resource, select **Go to resource**. Or, find and select your logic app resource by typing the name in the Azure search box.
1. Open the Logic Apps workflow in the designer. Select **Development Tools** > **Logic app designer**. This opens your empty workflow in the designer. Alternatively, select **Blank Logic App** from the templates.
1. Now you're ready to add one more step in the workflow. A workflow always starts with a single trigger, which specifies the condition to meet before running any subsequent actions in the workflow.
-1. Your workflow is required to have a Request trigger to generate a REST endpoint, and a response action to return the response to Azure OpenAI Studio when this workflow is invoked.
+1. Your workflow is required to have a Request trigger to generate a REST endpoint, and a response action to return the response to Azure AI Foundry when this workflow is invoked.
1. Add a trigger [(Request)](/azure/connectors/connectors-native-reqres?tabs=consumption)

Select **Add a trigger** and then search for request trigger. Select the **When a HTTP request is received** operation.
@@ -61,7 +58,7 @@ Here are the steps to create a new Logic Apps workflow for function calling.

:::image type="content" source="..\media\how-to\assistants\logic-apps\create-logic-app-2.png" alt-text="A screenshot showing the option to provide a JSON schema." lightbox="..\media\how-to\assistants\logic-apps\create-logic-app-2.png":::

-Here is an example of the request schema. You can add a description for your workflow in the comment box. This is imported by Azure OpenAI Studio as the function description.
+Here is an example of the request schema. You can add a description for your workflow in the comment box. This is imported by Azure AI Foundry as the function description.

:::image type="content" source="..\media\how-to\assistants\logic-apps\create-logic-app-3.png" alt-text="A screenshot showing an example request schema." lightbox="..\media\how-to\assistants\logic-apps\create-logic-app-3.png":::
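As an illustration, a request schema of the kind shown in the screenshot might look like the following sketch. The workflow purpose and property names here are hypothetical, not taken from the article:

```python
import json

# Hypothetical request schema for a weather-lookup workflow.
# The "description" fields double as the documentation the Assistant sees
# for each input when the workflow is imported as a function.
request_schema = {
    "type": "object",
    "properties": {
        "location": {
            "type": "string",
            "description": "City name to look up, for example 'Seattle'",
        },
    },
    "required": ["location"],
}

print(json.dumps(request_schema, indent=2))
```

The description you add in the trigger's comment box describes the workflow as a whole; the per-property descriptions above describe each input.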

@@ -77,22 +74,22 @@ Here are the steps to create a new Logic Apps workflow for function calling.

:::image type="content" source="..\media\how-to\assistants\logic-apps\create-logic-app-6.png" alt-text="A screenshot showing the location property." lightbox="..\media\how-to\assistants\logic-apps\create-logic-app-6.png":::

-1. Configure the [response](/azure/connectors/connectors-native-reqres#add-a-response-action). The workflow needs to return the response back to Azure OpenAI Studio. This is done using Response action.
+1. Configure the [response](/azure/connectors/connectors-native-reqres#add-a-response-action). The workflow needs to return the response to Azure AI Foundry. This is done using the Response action.

:::image type="content" source="..\media\how-to\assistants\logic-apps\create-logic-app-7.png" alt-text="A screenshot showing the response action." lightbox="..\media\how-to\assistants\logic-apps\create-logic-app-7.png":::

In the response action, you can pick the output from any of the prior steps. You can optionally also provide a JSON schema if you want to return the output in a specific format.

:::image type="content" source="..\media\how-to\assistants\logic-apps\create-logic-app-7.png" alt-text="A screenshot showing the comment box to specify a JSON schema." lightbox="..\media\how-to\assistants\logic-apps\create-logic-app-7.png":::

-1. The workflow is now ready. In Azure OpenAI Studio, you can import this function using the **Add function** feature in the Assistants playground.
+1. The workflow is now ready. In Azure AI Foundry, you can import this function using the **Add function** feature in the Assistants playground.


## Import your Logic Apps workflows as functions

-Here are the steps to import your Logic Apps workflows as function in the Assistants playground in Azure OpenAI Studio:
+Here are the steps to import your Logic Apps workflows as functions in the Assistants playground in Azure AI Foundry:

-1. In Azure OpenAI Studio, select **Assistants**. Select an existing Assistant or create a new one. After you have configured the assistant with a name and instructions, you are ready to add a function. Select **+ Add function**.
+1. In Azure AI Foundry, select **Playgrounds** from the left navigation menu, and then **Assistants playground**. Select an existing Assistant or create a new one. After you have configured the assistant with a name and instructions, you are ready to add a function. Select **+ Add function**.

:::image type="content" source="..\media\how-to\assistants\logic-apps\assistants-playground-add-function.png" alt-text="A screenshot showing the Assistant playground with the add function button." lightbox="..\media\how-to\assistants\logic-apps\assistants-playground-add-function.png":::

@@ -123,11 +120,11 @@ You can confirm the invocation by looking at the logs as well as your [workflow

Azure Logic Apps has connectors to hundreds of line-of-business (LOB) applications and databases, including but not limited to SAP, Salesforce, Oracle, SQL, and more. You can also connect to SaaS applications or your in-house applications hosted in virtual networks. These out-of-the-box connectors provide operations to send and receive data in multiple formats. By combining these capabilities with Azure OpenAI assistants, you can quickly bring your data into intelligent insights powered by Azure OpenAI.

-**What happens when a Logic Apps is imported in Azure OpenAI Studio and invoked**
+**What happens when a Logic App is imported in Azure AI Foundry and invoked**

-The Logic Apps swagger file is used to populate function definitions. Azure Logic App publishes an OpenAPI 2.0 definition (swagger) for workflows with a request trigger based on [annotations on the workflow](/rest/api/logic/workflows/list-swagger). Users are able to modify the content of this swagger by updating their workflow. Azure OpenAI Studio uses this to generate the function definitions that the Assistant requires.
+The Logic Apps swagger file is used to populate function definitions. Azure Logic Apps publishes an OpenAPI 2.0 definition (swagger) for workflows with a request trigger based on [annotations on the workflow](/rest/api/logic/workflows/list-swagger). Users can modify the content of this swagger by updating their workflow. Azure AI Foundry uses this to generate the function definitions that the Assistant requires.
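The swagger-to-function-definition mapping described above can be sketched roughly as follows. This is illustrative pseudologic, not the service's actual code; the operation fields are assumed from the general OpenAPI 2.0 shape, and the sample workflow is hypothetical:

```python
import json

def function_definition_from_swagger(path: str, operation: dict) -> dict:
    """Sketch: map one swagger operation (a request-trigger workflow's
    invoke endpoint) to an OpenAI-style function definition."""
    # The request trigger's JSON schema lives on the body parameter.
    body_param = next(
        (p for p in operation.get("parameters", []) if p.get("in") == "body"),
        {},
    )
    return {
        # Fall back to a name derived from the path if operationId is absent.
        "name": operation.get("operationId", path.strip("/").replace("/", "_")),
        # The workflow comment/description becomes the function description.
        "description": operation.get("description", ""),
        "parameters": body_param.get("schema", {"type": "object", "properties": {}}),
    }

# Hypothetical swagger fragment for a request-trigger workflow.
swagger_op = {
    "operationId": "GetWeather",
    "description": "Gets the current weather for a location.",
    "parameters": [
        {
            "in": "body",
            "name": "body",
            "schema": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    ],
}

fn = function_definition_from_swagger("/triggers/manual/run", swagger_op)
print(json.dumps(fn, indent=2))
```

Note how the trigger's schema becomes the function's `parameters` and the workflow description becomes the function `description`, which is why providing both in the trigger matters.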

-**How does authentication from Azure OpenAI Studio to Logic Apps work?**
+**How does authentication from Azure AI Foundry to Logic Apps work?**

Logic Apps supports two primary types of authentication to invoke a request trigger.

@@ -139,7 +136,7 @@ Logic Apps supports two primary types of authentications to invoke a request tri

Logic Apps also supports authenticating trigger invocations with Microsoft Entra ID OAuth, where you can specify authentication policies to be used when validating OAuth tokens. For more information, see the [Logic Apps documentation](/azure/logic-apps/logic-apps-securing-a-logic-app#generate-shared-access-signatures-sas).

-When Azure OpenAI Assistants require invoking a Logic App as part of function calling, Azure OpenAI Studio will retrieve the callback URL with the SAS to invoke the workflow.
+When Azure OpenAI Assistants require invoking a Logic App as part of function calling, Azure AI Foundry will retrieve the callback URL with the SAS to invoke the workflow.
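For illustration, the callback URL carries the SAS as query parameters. The sketch below pulls them apart; the host, workflow ID, and signature in this URL are entirely fabricated:

```python
from urllib.parse import urlparse, parse_qs

# Fabricated callback URL in the general shape Logic Apps uses for
# request triggers (do not treat any part of it as a real endpoint).
callback_url = (
    "https://prod-00.eastus.logic.azure.com/workflows/abc123/triggers/manual/"
    "paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun"
    "&sv=1.0&sig=fake-signature"
)

query = parse_qs(urlparse(callback_url).query)
# The SAS pieces: sp (permitted path), sv (version), sig (signature).
print(sorted(query))
```

Because the SAS lives in the query string, anyone holding the full callback URL can invoke the workflow, which is why these URLs should be treated as secrets.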

## See also

2 changes: 1 addition & 1 deletion articles/ai-services/openai/how-to/prompt-caching.md
@@ -25,7 +25,7 @@ Currently only the following models support prompt caching with Azure OpenAI:
- `o1-2024-12-17`
- `o1-preview-2024-09-12`
- `o1-mini-2024-09-12`
-- `gpt-4o-2024-05-13`
+- `gpt-4o-2024-11-20`
- `gpt-4o-2024-08-06`
- `gpt-4o-mini-2024-07-18`
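Caching depends on repeated requests sharing an identical prompt prefix (the commonly documented minimum is 1,024 tokens, treated here as an assumption), so stable content such as system instructions and tool definitions should come first, with variable content last. A rough sketch of the idea, approximating tokens with whitespace-separated words:

```python
def shared_prefix_words(a: str, b: str) -> int:
    """Count how many leading words two prompts share verbatim.

    This is only a word-level approximation of the token-level prefix
    matching the service actually performs.
    """
    count = 0
    for wa, wb in zip(a.split(), b.split()):
        if wa != wb:
            break
        count += 1
    return count

# Stable prefix reused verbatim across requests (system prompt, tools, etc.).
system = "You are a helpful assistant. " * 100
q1 = system + "Question: what is prompt caching?"
q2 = system + "Question: which models support it?"

print(shared_prefix_words(q1, q2))
```

The larger the identical prefix relative to the whole prompt, the more of each request can be served from the cache.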

2 changes: 1 addition & 1 deletion articles/ai-studio/how-to/configure-managed-network.md
@@ -786,7 +786,7 @@ The hosts in this section are used to install Visual Studio Code packages to est
| `code.visualstudio.com` | Required to download and install VS Code desktop. This host isn't required for VS Code Web. |
| `update.code.visualstudio.com`<br>`*.vo.msecnd.net` | Used to retrieve VS Code server bits that are installed on the compute instance through a setup script. |
| `marketplace.visualstudio.com`<br>`vscode.blob.core.windows.net`<br>`*.gallerycdn.vsassets.io` | Required to download and install VS Code extensions. These hosts enable the remote connection to compute instances. For more information, see [Get started with Azure AI Foundry projects in VS Code](./develop/vscode.md). |
-| `https://github.com/microsoft/vscode-tools-for-ai/tree/master/azureml_remote_websocket_server/*` | Used to retrieve websocket server bits that are installed on the compute instance. The websocket server is used to transmit requests from Visual Studio Code client (desktop application) to Visual Studio Code server running on the compute instance. |
+| `https://github.com/microsoft/vscode-tools-for-ai/tree/master/azureml_remote_websocket_server/*`<br>`raw.githubusercontent.com` | Used to retrieve websocket server bits that are installed on the compute instance. The websocket server is used to transmit requests from Visual Studio Code client (desktop application) to Visual Studio Code server running on the compute instance. |
| `vscode.download.prss.microsoft.com` | Used for Visual Studio Code download CDN |
#### Ports
@@ -45,15 +45,6 @@ The Azure Machine Learning workspace uses a __managed identity__ to communicate
Once a workspace is created with SAI identity type, it can be updated to SAI+UAI, but not back from SAI+UAI to SAI. You may assign multiple user-assigned identities to the same workspace.


-## Azure Container Registry and identity types
-
-This table lists the support matrix when authenticating to __Azure Container Registry__, depending on the authentication method and the __Azure Container Registry's__ [public network access configuration](/azure/container-registry/container-registry-access-selected-networks).
-
-| Authentication method | Public network access</br>disabled | Azure Container Registry</br>Public network access enabled |
-| ---- | :----: | :----: |
-| Admin user |||
-| Workspace system-assigned managed identity |||

## User-assigned managed identity

### Workspace
4 changes: 2 additions & 2 deletions articles/machine-learning/how-to-manage-models.md
@@ -8,7 +8,7 @@ ms.author: larryfr
ms.reviewer: kritifaujdar
ms.service: azure-machine-learning
ms.subservice: mlops
-ms.date: 06/12/2024
+ms.date: 01/07/2025
ms.topic: how-to
ms.custom: cli-v2, sdk-v2, devx-track-azurecli, update-code2, devx-track-python
---
@@ -253,7 +253,7 @@ If your model data comes from a job output, you have two options for specifying
path="runs:/my_run_0000000000/model/",
name="my-registered-model",
description="Model created from run.",
-type=ModelType.MLFLOW_MODEL
+type=ModelType.MLFLOW
)

ml_client.models.create_or_update(run_model)