diff --git a/articles/ai-services/agents/concepts/tracing.md b/articles/ai-services/agents/concepts/tracing.md index 75ba6131dd4..73a7aa5f771 100644 --- a/articles/ai-services/agents/concepts/tracing.md +++ b/articles/ai-services/agents/concepts/tracing.md @@ -25,9 +25,9 @@ Tracing solves this by allowing you to clearly see the inputs and outputs of eac Tracing lets you analyze your agent's performance and behavior by using OpenTelemetry and adding an Application Insights resource to your Azure AI Foundry project. -To add an Application Insights resource, navigate to the **Tracing** tab in the [AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one. +To add an Application Insights resource, navigate to the **Tracing** tab in the [Azure AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one. -:::image type="content" source="../media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the AI Foundry portal." lightbox="../media/ai-foundry-tracing.png"::: +:::image type="content" source="../media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the Azure AI Foundry portal." lightbox="../media/ai-foundry-tracing.png"::: Once created, you can get an Application Insights connection string, configure your agents, and observe the full execution path of your agent through Azure Monitor. Typically you want to enable tracing before you create an agent. @@ -46,7 +46,7 @@ You will also need an exporter to send results to your observability backend. Yo pip install opentelemetry-exporter-otlp ``` -Once you have the packages installed, you can use one the following Python samples to implement tracing with your agents. Samples that use console tracing display the results locally in the console. 
Samples that use Azure Monitor send the traces to the Azure Monitor in the [AI Foundry portal](https://ai.azure.com/), in the **Tracing** tab in the left navigation menu for the portal. +Once you have the packages installed, you can use one of the following Python samples to implement tracing with your agents. Samples that use console tracing display the results locally in the console. Samples that use Azure Monitor send the traces to Azure Monitor in the [Azure AI Foundry portal](https://ai.azure.com/), in the **Tracing** tab in the left navigation menu for the portal. > [!NOTE] > There is a known bug in the agents tracing functionality. The bug will cause the agent's function tool call related info (function names and parameter values, which could contain sensitive information) to be included in the traces even when content recording is not enabled. diff --git a/articles/ai-services/agents/how-to/tools/code-interpreter.md b/articles/ai-services/agents/how-to/tools/code-interpreter.md index b93609b6c43..ee843a6a80a 100644 --- a/articles/ai-services/agents/how-to/tools/code-interpreter.md +++ b/articles/ai-services/agents/how-to/tools/code-interpreter.md @@ -49,7 +49,7 @@ from azure.ai.projects.models import FilePurpose from azure.identity import DefaultAzureCredential from pathlib import Path -# Create an Azure AI Client from a connection string, copied from your AI Foundry project. +# Create an Azure AI Client from a connection string, copied from your Azure AI Foundry project.
# At the moment, it should be in the format ";;;" # Customer needs to login to Azure subscription via Azure CLI and set the environment variables project_client = AIProjectClient.from_connection_string( diff --git a/articles/ai-services/agents/includes/connection-string-portal.md b/articles/ai-services/agents/includes/connection-string-portal.md index 87cb5a91bda..486d3d8b605 100644 --- a/articles/ai-services/agents/includes/connection-string-portal.md +++ b/articles/ai-services/agents/includes/connection-string-portal.md @@ -8,5 +8,5 @@ ms.date: 12/11/2024 --- > [!TIP] -> You can also find your connection string in the **overview** for your project in the [AI Foundry portal](https://ai.azure.com/), under **Project details** > **Project connection string**. +> You can also find your connection string in the **overview** for your project in the [Azure AI Foundry portal](https://ai.azure.com/), under **Project details** > **Project connection string**. > :::image type="content" source="../media/quickstart/portal-connection-string.png" alt-text="A screenshot showing the connection string in the Azure AI Foundry portal." lightbox="../media/quickstart/portal-connection-string.png"::: diff --git a/articles/ai-services/includes/quickstarts/ai-studio-prerequisites.md b/articles/ai-services/includes/quickstarts/ai-studio-prerequisites.md index 175969c435b..debd5787cd9 100644 --- a/articles/ai-services/includes/quickstarts/ai-studio-prerequisites.md +++ b/articles/ai-services/includes/quickstarts/ai-studio-prerequisites.md @@ -11,4 +11,4 @@ ms.author: eur > [!div class="checklist"] > - Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services). -> - Some Azure AI services features are free to try in the Azure AI Foundry portal. For access to all capabilities described in this article, you need to [connect AI services in AI Foundry](../../../ai-studio/ai-services/how-to/connect-ai-services.md). 
+> - Some Azure AI services features are free to try in the Azure AI Foundry portal. For access to all capabilities described in this article, you need to [connect AI services in Azure AI Foundry](../../../ai-studio/ai-services/how-to/connect-ai-services.md). diff --git a/articles/ai-services/language-service/includes/use-language-studio.md b/articles/ai-services/language-service/includes/use-language-studio.md index b78577e6a62..3d4f90d38ae 100644 --- a/articles/ai-services/language-service/includes/use-language-studio.md +++ b/articles/ai-services/language-service/includes/use-language-studio.md @@ -11,4 +11,4 @@ ms.custom: include, ignite-2024 --- > [!TIP] -> You can use [**AI Foundry**](../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. +> You can use [**Azure AI Foundry**](../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. diff --git a/articles/ai-services/language-service/personally-identifiable-information/includes/use-language-studio.md b/articles/ai-services/language-service/personally-identifiable-information/includes/use-language-studio.md index 41baceee202..43f526d69d6 100644 --- a/articles/ai-services/language-service/personally-identifiable-information/includes/use-language-studio.md +++ b/articles/ai-services/language-service/personally-identifiable-information/includes/use-language-studio.md @@ -9,4 +9,4 @@ ms.custom: include, ignite-2024 --- > [!TIP] -> You can use [**AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. +> You can use [**Azure AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. 
diff --git a/articles/ai-services/language-service/personally-identifiable-information/overview.md b/articles/ai-services/language-service/personally-identifiable-information/overview.md index 54f53c84b95..88c41b2e2c7 100644 --- a/articles/ai-services/language-service/personally-identifiable-information/overview.md +++ b/articles/ai-services/language-service/personally-identifiable-information/overview.md @@ -25,7 +25,7 @@ The Conversational PII detection models (both version `2024-11-01-preview` and ` As of June 2024, we now provide General Availability support for the Conversational PII service (English-language only). Customers can now redact transcripts, chats, and other text written in a conversational style (i.e. text with “um”s, “ah”s, multiple speakers, and the spelling out of words for more clarity) with better confidence in AI quality, Azure SLA support and production environment support, and enterprise-grade security in mind. > [!TIP] -> Try out PII detection [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) +> Try out PII detection [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) * [**Quickstarts**](quickstart.md) are getting-started instructions to guide you through making requests to the service. * [**How-to guides**](how-to-call.md) contain instructions for using the service in more specific or customized ways. 
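The PII overview hunk above points readers at trying detection in the portal; for readers calling the service directly, a minimal sketch of the synchronous Language REST request body follows. The `PiiEntityRecognition` task kind follows the public analyze-text shape, but the sample text is invented and the exact field set should be verified against your API version:

```python
import json

# Sketch of the request body for the Language service's synchronous
# analyze-text endpoint: POST {endpoint}/language/:analyze-text?api-version=...
# The task kind and field names follow the public PiiEntityRecognition shape;
# confirm them against your API version before relying on this.
payload = {
    "kind": "PiiEntityRecognition",
    "parameters": {"modelVersion": "latest"},
    "analysisInput": {
        "documents": [
            {"id": "1", "language": "en",
             "text": "Call Jane Doe at 555-0100 about the invoice."}
        ]
    },
}

# Serialized body, sent with your Language resource key in the request headers.
body = json.dumps(payload)
print(json.loads(body)["kind"])
```

The response is expected to echo each document with a list of detected entities and a redacted form of the text; sending the request requires the endpoint and key of a Language or Azure AI Foundry resource, which are intentionally not shown here.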
diff --git a/articles/ai-services/language-service/summarization/includes/use-language-studio.md b/articles/ai-services/language-service/summarization/includes/use-language-studio.md index f95f9a2a0a9..6ff54f18b54 100644 --- a/articles/ai-services/language-service/summarization/includes/use-language-studio.md +++ b/articles/ai-services/language-service/summarization/includes/use-language-studio.md @@ -11,4 +11,4 @@ ms.custom: include, build-2024, ignite-2024 --- > [!TIP] -> You can use [**AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. +> You can use [**Azure AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code. diff --git a/articles/ai-services/language-service/summarization/overview.md b/articles/ai-services/language-service/summarization/overview.md index fb0be863c7d..e7d3cea445b 100644 --- a/articles/ai-services/language-service/summarization/overview.md +++ b/articles/ai-services/language-service/summarization/overview.md @@ -22,7 +22,7 @@ Use this article to learn more about this feature, and how to use it in your app Out of the box, the service provides summarization solutions for three types of genre, plain texts, conversations, and native documents. Text summarization only accepts plain text blocks, and conversation summarization accept conversational input, including various speech audio signals in order for the model to effectively segment and summarize, and native document can directly summarize for documents in their native formats, such as Words, PDF, etc. > [!TIP] -> Try out Summarization [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service. 
+> Try out Summarization [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service. # [Text summarization](#tab/text-summarization) diff --git a/articles/ai-services/language-service/text-analytics-for-health/overview.md b/articles/ai-services/language-service/text-analytics-for-health/overview.md index 8e9fc0002e0..b615dcf49d1 100644 --- a/articles/ai-services/language-service/text-analytics-for-health/overview.md +++ b/articles/ai-services/language-service/text-analytics-for-health/overview.md @@ -19,7 +19,7 @@ ms.custom: language-service-health, ignite-2024 Text Analytics for health is one of the prebuilt features offered by [Azure AI Language](../overview.md). It is a cloud-based API service that applies machine-learning intelligence to extract and label relevant medical information from a variety of unstructured texts such as doctor's notes, discharge summaries, clinical documents, and electronic health records. > [!TIP] -> Try out Text Analytics for health [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service. +> Try out Text Analytics for health [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service. This documentation contains the following types of articles: * The [**quickstart article**](quickstart.md) provides a short tutorial that guides you with making your first request to the service. 
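The summarization overview hunk above distinguishes text, conversation, and native-document genres. A sketch of an asynchronous text (extractive) summarization job body, under the documented analyze-text jobs shape, looks like the following; the `sentenceCount` value and sample text are illustrative assumptions:

```python
import json

# Sketch of an asynchronous summarization job body:
# POST {endpoint}/language/analyze-text/jobs?api-version=...
# "ExtractiveSummarization" follows the documented task kinds; confirm
# parameter names against your API version.
job = {
    "displayName": "docs-sample",
    "analysisInput": {
        "documents": [
            {"id": "1", "language": "en",
             "text": "Azure AI Language offers summarization. It supports plain "
                     "text, conversations, and native documents."}
        ]
    },
    "tasks": [
        {"kind": "ExtractiveSummarization",
         "parameters": {"sentenceCount": 1}}
    ],
}

body = json.dumps(job)
print(json.loads(body)["tasks"][0]["kind"])
```

Because the jobs endpoint is asynchronous, the service returns a job URL to poll for results rather than the summary itself.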
diff --git a/articles/ai-services/openai/assistants-quickstart.md b/articles/ai-services/openai/assistants-quickstart.md index 7c75d7a33a4..b1cdda0e964 100644 --- a/articles/ai-services/openai/assistants-quickstart.md +++ b/articles/ai-services/openai/assistants-quickstart.md @@ -20,7 +20,7 @@ Azure OpenAI Assistants (Preview) allows you to create AI assistants tailored to ::: zone pivot="ai-foundry-portal" -[!INCLUDE [AI Foundry portal](includes/assistants-ai-studio.md)] +[!INCLUDE [Azure AI Foundry portal](includes/assistants-ai-studio.md)] ::: zone-end diff --git a/articles/ai-services/openai/concepts/provisioned-throughput.md b/articles/ai-services/openai/concepts/provisioned-throughput.md index 579c1678c9d..fb25f67808f 100644 --- a/articles/ai-services/openai/concepts/provisioned-throughput.md +++ b/articles/ai-services/openai/concepts/provisioned-throughput.md @@ -54,7 +54,7 @@ To help with simplifying the sizing effort, the following table outlines the TPM |Max Output TPM per PTU| 833|12,333| |Latency Target Value |25 Tokens Per Second|33 Tokens Per Second| -For a full list see the [AOAI Foundry calculator](https://oai.azure.com/portal/calculator). +For a full list see the [AOAI in Azure AI Foundry calculator](https://oai.azure.com/portal/calculator). > [!NOTE] @@ -116,7 +116,7 @@ Azure OpenAI is a highly sought-after service where customer demand might exceed #### Regional capacity guidance -To find the capacity needed for their deployments, use the capacity API or the AI Foundry deployment experience to provide real-time information on capacity availability. +To find the capacity needed for their deployments, use the capacity API or the Azure AI Foundry deployment experience to provide real-time information on capacity availability. In Azure AI Foundry, the deployment experience identifies when a region lacks the capacity needed to deploy the model. This looks at the desired model, version and number of PTUs. 
If capacity is unavailable, the experience directs users to select an alternative region. @@ -128,7 +128,7 @@ If an acceptable region isn't available to support the desired model, version and - Attempt the deployment with a smaller number of PTUs. - Attempt the deployment at a different time. Capacity availability changes dynamically based on customer demand and more capacity might become available later. -- Ensure that quota is available in all acceptable regions. The [model capacities API](/rest/api/aiservices/accountmanagement/model-capacities/list?view=rest-aiservices-accountmanagement-2024-04-01-preview&tabs=HTTP&preserve-view=true) and AI Foundry experience consider quota availability in returning alternative regions for creating a deployment. +- Ensure that quota is available in all acceptable regions. The [model capacities API](/rest/api/aiservices/accountmanagement/model-capacities/list?view=rest-aiservices-accountmanagement-2024-04-01-preview&tabs=HTTP&preserve-view=true) and Azure AI Foundry experience consider quota availability in returning alternative regions for creating a deployment. ### Determining the number of PTUs needed for a workload diff --git a/articles/ai-services/openai/concepts/safety-system-message-templates.md b/articles/ai-services/openai/concepts/safety-system-message-templates.md index 858d8f9219d..c74b7bad7d1 100644 --- a/articles/ai-services/openai/concepts/safety-system-message-templates.md +++ b/articles/ai-services/openai/concepts/safety-system-message-templates.md @@ -35,7 +35,7 @@ Below are examples of recommended system message components you can include to p The following steps show how to leverage safety system messages in Azure AI Foundry portal. 1. Go to Azure AI Foundry and navigate to Azure OpenAI and the Chat playground.
- :::image type="content" source="../media/navigate-chat-playground.PNG" alt-text="Screenshot of the AI Foundry portal selection."::: + :::image type="content" source="../media/navigate-chat-playground.PNG" alt-text="Screenshot of the Azure AI Foundry portal selection."::: 1. Navigate to the default safety system messages integrated in the studio. :::image type="content" source="../media/navigate-system-message.PNG" alt-text="Screenshot of the system message navigation."::: 1. Select the system message(s) that are applicable to your scenario. diff --git a/articles/ai-services/openai/gpt-v-quickstart.md b/articles/ai-services/openai/gpt-v-quickstart.md index cb627a90b7a..bf182ccf030 100644 --- a/articles/ai-services/openai/gpt-v-quickstart.md +++ b/articles/ai-services/openai/gpt-v-quickstart.md @@ -24,7 +24,7 @@ Get started using GPT-4 Turbo with images with the Azure OpenAI Service. ::: zone pivot="ai-foundry-portal" -[!INCLUDE [AI Foundry portal quickstart](includes/gpt-v-studio.md)] +[!INCLUDE [Azure AI Foundry portal quickstart](includes/gpt-v-studio.md)] ::: zone-end diff --git a/articles/ai-services/openai/how-to/batch.md b/articles/ai-services/openai/how-to/batch.md index 570783aebbf..d798887f954 100644 --- a/articles/ai-services/openai/how-to/batch.md +++ b/articles/ai-services/openai/how-to/batch.md @@ -82,7 +82,7 @@ The following aren't currently supported: ### Global batch deployment -In the AI Foundry portal the deployment type will appear as `Global-Batch`. +In the Azure AI Foundry portal the deployment type will appear as `Global-Batch`. :::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure AI Foundry portal with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png"::: @@ -91,7 +91,7 @@ In the AI Foundry portal the deployment type will appear as `Global-Batch`. 
::: zone pivot="ai-foundry-portal" -[!INCLUDE [AI Foundry portal](../includes/batch/batch-studio.md)] +[!INCLUDE [Azure AI Foundry portal](../includes/batch/batch-studio.md)] ::: zone-end @@ -154,7 +154,7 @@ Yes. Similar to other deployment types, you can create content filters and assoc ### Can I request additional quota? -Yes, from the quota page in the AI Foundry portal. Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#global-batch-quota). +Yes, from the quota page in the Azure AI Foundry portal. Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#global-batch-quota). ### What happens if the API doesn't complete my request within the 24 hour time frame? diff --git a/articles/ai-services/openai/how-to/on-your-data-configuration.md b/articles/ai-services/openai/how-to/on-your-data-configuration.md index 163eb2bf48d..503b94a6e81 100644 --- a/articles/ai-services/openai/how-to/on-your-data-configuration.md +++ b/articles/ai-services/openai/how-to/on-your-data-configuration.md @@ -271,7 +271,7 @@ So far you have already setup each resource work independently. Next you need to | `Storage Blob Data Contributor` | Azure OpenAI | Storage Account | Reads from the input container, and writes the preprocessed result to the output container. | | `Cognitive Services OpenAI Contributor` | Azure AI Search | Azure OpenAI | Custom skill. | | `Storage Blob Data Reader` | Azure AI Search | Storage Account | Reads document blobs and chunk blobs. | -| `Reader` | Azure AI Foundry Project | Azure Storage Private Endpoints (Blob & File) | Read search indexes created in blob storage within an AI Foundry Project. | +| `Reader` | Azure AI Foundry Project | Azure Storage Private Endpoints (Blob & File) | Read search indexes created in blob storage within an Azure AI Foundry Project. | | `Cognitive Services OpenAI User` | Web app | Azure OpenAI | Inference. 
| diff --git a/articles/ai-services/openai/how-to/realtime-audio.md b/articles/ai-services/openai/how-to/realtime-audio.md index dc932988544..6c7e8b9ab24 100644 --- a/articles/ai-services/openai/how-to/realtime-audio.md +++ b/articles/ai-services/openai/how-to/realtime-audio.md @@ -42,7 +42,7 @@ Before you can use GPT-4o real-time audio, you need: - An Azure subscription - Create one for free. - An Azure OpenAI resource created in a [supported region](#supported-models). For more information, see [Create a resource and deploy a model with Azure OpenAI](create-resource.md). -- You need a deployment of the `gpt-4o-realtime-preview` model in a supported region as described in the [supported models](#supported-models) section. You can deploy the model from the [Azure AI Foundry portal model catalog](../../../ai-studio/how-to/model-catalog-overview.md) or from your project in AI Foundry portal. +- You need a deployment of the `gpt-4o-realtime-preview` model in a supported region as described in the [supported models](#supported-models) section. You can deploy the model from the [Azure AI Foundry portal model catalog](../../../ai-studio/how-to/model-catalog-overview.md) or from your project in Azure AI Foundry portal. Here are some of the ways you can get started with the GPT-4o Realtime API for speech and audio: - For steps to deploy and use the `gpt-4o-realtime-preview` model, see [the real-time audio quickstart](../realtime-audio-quickstart.md). 
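The realtime-audio hunk above walks through deploying `gpt-4o-realtime-preview`; once connected to the Realtime API over a WebSocket, configuration is exchanged as JSON events. A sketch of a `session.update` event follows — the event type matches the documented realtime event naming, but the session fields shown are illustrative assumptions, not the full schema:

```python
import json

# Sketch of a session.update event for the GPT-4o Realtime API.
# The "type" value follows the documented realtime event naming; the
# session fields here are illustrative assumptions, not the full schema.
event = {
    "type": "session.update",
    "session": {
        "voice": "alloy",
        "instructions": "You are a helpful assistant. Answer briefly.",
    },
}

# This JSON string is what would be sent over the WebSocket connection.
wire_frame = json.dumps(event)
print(json.loads(wire_frame)["type"])
```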
diff --git a/articles/ai-services/openai/how-to/use-web-app.md b/articles/ai-services/openai/how-to/use-web-app.md index 942d8f4ae11..e95419967f0 100644 --- a/articles/ai-services/openai/how-to/use-web-app.md +++ b/articles/ai-services/openai/how-to/use-web-app.md @@ -166,7 +166,7 @@ This can be accomplished using the Advanced edit or simple Edit options as previ ### Using Azure AI Foundry -Follow [this tutorial on integrating Azure AI Search with AI Foundry](/azure/ai-studio/tutorials/deploy-chat-web-app#add-your-data-and-try-the-chat-model-again) and redeploy your application. +Follow [this tutorial on integrating Azure AI Search with Azure AI Foundry](/azure/ai-studio/tutorials/deploy-chat-web-app#add-your-data-and-try-the-chat-model-again) and redeploy your application. ### Using Azure OpenAI Studio diff --git a/articles/ai-services/openai/how-to/weights-and-biases-integration.md b/articles/ai-services/openai/how-to/weights-and-biases-integration.md index 77a181495ee..7372626ed2c 100644 --- a/articles/ai-services/openai/how-to/weights-and-biases-integration.md +++ b/articles/ai-services/openai/how-to/weights-and-biases-integration.md @@ -87,7 +87,7 @@ Give your Azure OpenAI resource the **Key Vault Secrets Officer** role. ## Link Weights & Biases with Azure OpenAI -1. Navigate to the [AI Foundry portal](https://ai.azure.com) and select your Azure OpenAI fine-tuning resource. +1. Navigate to the [Azure AI Foundry portal](https://ai.azure.com) and select your Azure OpenAI fine-tuning resource. :::image type="content" source="../media/how-to/weights-and-biases/manage-integrations.png" alt-text="Screenshot of the manage integrations button." 
lightbox="../media/how-to/weights-and-biases/manage-integrations.png"::: diff --git a/articles/ai-services/openai/includes/assistants-ai-studio.md b/articles/ai-services/openai/includes/assistants-ai-studio.md index f89de09adfb..a303751d9be 100644 --- a/articles/ai-services/openai/includes/assistants-ai-studio.md +++ b/articles/ai-services/openai/includes/assistants-ai-studio.md @@ -1,7 +1,7 @@ --- -title: Quickstart - getting started with Azure OpenAI assistants (preview) in AI Foundry portal +title: Quickstart - getting started with Azure OpenAI assistants (preview) in Azure AI Foundry portal titleSuffix: Azure OpenAI -description: Walkthrough on how to get started with Azure OpenAI assistants with new features like code interpreter in AI Foundry portal (Preview). +description: Walkthrough on how to get started with Azure OpenAI assistants with new features like code interpreter in Azure AI Foundry portal (Preview). manager: nitinme ms.service: azure-ai-studio ms.custom: diff --git a/articles/ai-services/openai/includes/assistants-studio.md b/articles/ai-services/openai/includes/assistants-studio.md index a91bf91d5be..be4cb473230 100644 --- a/articles/ai-services/openai/includes/assistants-studio.md +++ b/articles/ai-services/openai/includes/assistants-studio.md @@ -41,7 +41,7 @@ Use the **Assistant setup** pane to create a new AI assistant or to select an ex | **Deployment** | This is where you set which model deployment to use with your assistant. | | **Functions**| Create custom function definitions for the models to formulate API calls and structure data outputs based on your specifications | | **Code interpreter** | Code interpreter provides access to a sandboxed Python environment that can be used to allow the model to test and execute code. | -| **Files** | You can upload up to 20 files, with a max file size of 512 MB to use with tools. You can upload up to 10,000 files using [AI Foundry portal](../assistants-quickstart.md?pivots=ai-foundry-portal). 
| +| **Files** | You can upload up to 20 files, with a max file size of 512 MB to use with tools. You can upload up to 10,000 files using [Azure AI Foundry portal](../assistants-quickstart.md?pivots=ai-foundry-portal). | ### Tools diff --git a/articles/ai-services/openai/includes/batch/batch-studio.md b/articles/ai-services/openai/includes/batch/batch-studio.md index e289bb30f80..d2981c04938 100644 --- a/articles/ai-services/openai/includes/batch/batch-studio.md +++ b/articles/ai-services/openai/includes/batch/batch-studio.md @@ -70,7 +70,7 @@ For this article, we'll create a file named `test.jsonl` and will copy the conte Once your input file is prepared, you first need to upload the file to then be able to kick off a batch job. File upload can be done both programmatically or via the Studio. -1. Sign in to [AI Foundry portal](https://ai.azure.com). +1. Sign in to [Azure AI Foundry portal](https://ai.azure.com). 2. Select the Azure OpenAI resource where you have a global batch model deployment available. 3. Select **Batch jobs** > **+Create batch jobs**. diff --git a/articles/ai-services/openai/includes/connect-your-data-studio.md b/articles/ai-services/openai/includes/connect-your-data-studio.md index e9bb8dd1aa2..15e9e74d27b 100644 --- a/articles/ai-services/openai/includes/connect-your-data-studio.md +++ b/articles/ai-services/openai/includes/connect-your-data-studio.md @@ -19,7 +19,7 @@ recommendations: false Navigate to [Azure AI Foundry](https://ai.azure.com/) and sign-in with credentials that have access to your Azure OpenAI resource. -1. You can either [create an AI Foundry project](../../../ai-studio/how-to/create-projects.md) by clicking **Create project**, or continue directly by clicking the button on the **Focused on Azure OpenAI Service** tile. +1. 
You can either [create an Azure AI Foundry project](../../../ai-studio/how-to/create-projects.md) by clicking **Create project**, or continue directly by clicking the button on the **Focused on Azure OpenAI Service** tile. :::image type="content" source="../media/use-your-data/ai-studio-homepage.png" alt-text="A screenshot of the Azure AI Foundry portal landing page." lightbox="../media/use-your-data/ai-studio-homepage.png"::: @@ -27,7 +27,7 @@ Navigate to [Azure AI Foundry](https://ai.azure.com/) and sign-in with credentia 1. In the **Chat playground**, Select **Add your data** and then **Add a data source** - :::image type="content" source="../media/use-your-data/chat-playground.png" alt-text="A screenshot of the chat playground in AI Foundry." lightbox="../media/use-your-data/chat-playground.png"::: + :::image type="content" source="../media/use-your-data/chat-playground.png" alt-text="A screenshot of the chat playground in Azure AI Foundry." lightbox="../media/use-your-data/chat-playground.png"::: 1. In the pane that appears, select **Upload files (preview)** under **Select data source**. Azure OpenAI needs both a storage resource and a search resource to access and index your data. diff --git a/articles/ai-services/openai/includes/fine-tune-models.md b/articles/ai-services/openai/includes/fine-tune-models.md index 472b5e316aa..754ba4a44f4 100644 --- a/articles/ai-services/openai/includes/fine-tune-models.md +++ b/articles/ai-services/openai/includes/fine-tune-models.md @@ -13,7 +13,7 @@ manager: nitinme > [!NOTE] > `gpt-35-turbo` - Fine-tuning of this model is limited to a subset of regions, and isn't available in every region the base model is available. > -> The supported regions for fine-tuning might vary if you use Azure OpenAI models in an AI Foundry project versus outside a project. +> The supported regions for fine-tuning might vary if you use Azure OpenAI models in an Azure AI Foundry project versus outside a project. 
| Model ID | Fine-tuning regions | Max request (tokens) | Training Data (up to) | | --- | --- | :---: | :---: | diff --git a/articles/ai-services/openai/includes/fine-tuning-openai-in-ai-studio.md b/articles/ai-services/openai/includes/fine-tuning-openai-in-ai-studio.md index 458a7495045..74ba8f63f8f 100644 --- a/articles/ai-services/openai/includes/fine-tuning-openai-in-ai-studio.md +++ b/articles/ai-services/openai/includes/fine-tuning-openai-in-ai-studio.md @@ -20,7 +20,7 @@ ms.custom: include, build-2024 - An [Azure AI project](../../../ai-studio/how-to/create-projects.md) in Azure AI Foundry portal. - An [Azure OpenAI connection](/azure/ai-studio/how-to/connections-add?tabs=azure-openai#connection-details) to a resource in a [region where fine-tuning is supported](/azure/ai-services/openai/concepts/models#fine-tuning-models). > [!NOTE] - > The supported regions might vary if you use Azure OpenAI models in an AI Foundry project versus outside a project. + > The supported regions might vary if you use Azure OpenAI models in an Azure AI Foundry project versus outside a project. - Fine-tuning access requires **Cognitive Services OpenAI Contributor** role on the Azure OpenAI resource. - If you don't already have access to view quota and deploy models in Azure AI Foundry portal you need [more permissions](../how-to/role-based-access-control.md). diff --git a/articles/ai-services/openai/includes/gpt-4-turbo.md b/articles/ai-services/openai/includes/gpt-4-turbo.md index 3c81d192c83..0e12ad186c9 100644 --- a/articles/ai-services/openai/includes/gpt-4-turbo.md +++ b/articles/ai-services/openai/includes/gpt-4-turbo.md @@ -36,4 +36,4 @@ This is the replacement for the following preview models: ### Deploying GPT-4 Turbo with Vision GA -To deploy the GA model from the AI Foundry portal, select `GPT-4` and then choose the `turbo-2024-04-09` version from the dropdown menu. 
The default quota for the `gpt-4-turbo-2024-04-09` model will be the same as current quota for GPT-4-Turbo. See the [regional quota limits.](../quotas-limits.md) +To deploy the GA model from the Azure AI Foundry portal, select `GPT-4` and then choose the `turbo-2024-04-09` version from the dropdown menu. The default quota for the `gpt-4-turbo-2024-04-09` model will be the same as current quota for GPT-4-Turbo. See the [regional quota limits.](../quotas-limits.md) diff --git a/articles/ai-services/openai/realtime-audio-quickstart.md b/articles/ai-services/openai/realtime-audio-quickstart.md index 0982a7c546b..5b3ce3e5bb8 100644 --- a/articles/ai-services/openai/realtime-audio-quickstart.md +++ b/articles/ai-services/openai/realtime-audio-quickstart.md @@ -46,14 +46,14 @@ Support for the Realtime API was first added in API version `2024-10-01-preview` Before you can use GPT-4o real-time audio, you need a deployment of the `gpt-4o-realtime-preview` model in a supported region as described in the [supported models](#supported-models) section. -1. Go to the [AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource (with or without model deployments.) +1. Go to the [Azure AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource (with or without model deployments.) 1. Select the **Real-time audio** playground from under **Resource playground** in the left pane. 1. Select **+ Create a deployment** to open the deployment window. 1. Search for and select the `gpt-4o-realtime-preview` model and then select **Confirm**. 1. In the deployment wizard, make sure to select the `2024-10-01` model version. 1. Follow the wizard to deploy the model. 
-Now that you have a deployment of the `gpt-4o-realtime-preview` model, you can interact with it in real time in the AI Foundry portal **Real-time audio** playground or Realtime API. +Now that you have a deployment of the `gpt-4o-realtime-preview` model, you can interact with it in real time in the Azure AI Foundry portal **Real-time audio** playground or through the Realtime API. ## Use the GPT-4o real-time audio @@ -64,7 +64,7 @@ Now that you have a deployment of the `gpt-4o-realtime-preview` model, you can i To chat with your deployed `gpt-4o-realtime-preview` model in the [Azure AI Foundry](https://ai.azure.com) **Real-time audio** playground, follow these steps: -1. the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in AI Foundry portal. Make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource and the deployed `gpt-4o-realtime-preview` model. +1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in the Azure AI Foundry portal. Make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource and the deployed `gpt-4o-realtime-preview` model. 1. Select the **Real-time audio** playground from under **Resource playground** in the left pane. 1. Select your deployed `gpt-4o-realtime-preview` model from the **Deployment** dropdown. 1. Select **Enable microphone** to allow the browser to access your microphone. If you already granted permission, you can skip this step. diff --git a/articles/ai-services/openai/whats-new.md b/articles/ai-services/openai/whats-new.md index d1dae45f70e..1a592041118 100644 --- a/articles/ai-services/openai/whats-new.md +++ b/articles/ai-services/openai/whats-new.md @@ -145,7 +145,7 @@ Global batch now supports GPT-4o (2024-08-06).
See the [global batch getting sta ### Azure OpenAI Studio UX updates -On September 19, when you access the [Azure OpenAI Studio](https://oai.azure.com/) you'll begin to no longer see the legacy AI Foundry portal by default. If needed you'll still be able to go back to the previous experience by using the **Switch to the old look** toggle in the top bar of the UI for the next couple of weeks. If you switch back to legacy AI Foundry portal, it helps if you fill out the feedback form to let us know why. We're actively monitoring this feedback to improve the new experience. +On September 19, when you access [Azure OpenAI Studio](https://oai.azure.com/), you'll no longer see the legacy Azure AI Foundry portal by default. If needed, you'll still be able to go back to the previous experience by using the **Switch to the old look** toggle in the top bar of the UI for the next couple of weeks. If you switch back to the legacy Azure AI Foundry portal, it helps if you fill out the feedback form to let us know why. We're actively monitoring this feedback to improve the new experience. ### GPT-4o 2024-08-06 provisioned deployments @@ -188,7 +188,7 @@ OpenAI has incorporated additional safety measures into the `o1` models, includi ### Availability -The `o1-preview` and `o1-mini` are available in the East US2 region for limited access through the [AI Foundry portal](https://ai.azure.com) early access playground. Data processing for the `o1` models might occur in a different region than where they are available for use. +The `o1-preview` and `o1-mini` models are available in the East US2 region for limited access through the [Azure AI Foundry portal](https://ai.azure.com) early access playground. Data processing for the `o1` models might occur in a different region than where they are available for use.
To try the `o1-preview` and `o1-mini` models in the early access playground **registration is required, and access will be granted based on Microsoft’s eligibility criteria.** @@ -244,9 +244,9 @@ On August 6, 2024, OpenAI [announced](https://openai.com/index/introducing-struc * An enhanced ability to support complex structured outputs. * Max output tokens have been increased from 4,096 to 16,384. -Azure customers can test out GPT-4o `2024-08-06` today in the new AI Foundry early access playground (preview). +Azure customers can test out GPT-4o `2024-08-06` today in the new Azure AI Foundry early access playground (preview). -Unlike the previous early access playground, the AI Foundry portal early access playground (preview) doesn't require you to have a resource in a specific region. +Unlike the previous early access playground, the Azure AI Foundry portal early access playground (preview) doesn't require you to have a resource in a specific region. > [!NOTE] > Prompts and completions made through the early access playground (preview) might be processed in any Azure OpenAI region, and are currently subject to a 10 request per minute per Azure subscription limit. This limit might change in the future. diff --git a/articles/ai-services/speech-service/custom-speech-ai-foundry-portal.md b/articles/ai-services/speech-service/custom-speech-ai-foundry-portal.md index d30d4a6d035..e078dcdab99 100644 --- a/articles/ai-services/speech-service/custom-speech-ai-foundry-portal.md +++ b/articles/ai-services/speech-service/custom-speech-ai-foundry-portal.md @@ -18,17 +18,17 @@ ms.author: eur With custom speech, you can enhance speech recognition accuracy for your applications by using a custom model for real-time speech to text, speech translation, and batch transcription. 
The base model, trained with Microsoft-owned data, handles common spoken language well, but a custom model can improve domain-specific vocabulary and audio conditions by providing text and audio data for training. Additionally, you can train the model with structured text for custom pronunciations, display text formatting, and profanity filtering. > [!TIP] -> You can bring your custom speech models from [Speech Studio](https://speech.microsoft.com) to the [Azure AI Foundry portal](https://ai.azure.com). In AI Foundry, you can pick up where you left off by connecting to your existing Speech resource. For more information about connecting to an existing Speech resource, see [Connect to an existing Speech resource](../../ai-studio/ai-services/how-to/connect-ai-services.md#connect-azure-ai-services-after-you-create-a-project). +> You can bring your custom speech models from [Speech Studio](https://speech.microsoft.com) to the [Azure AI Foundry portal](https://ai.azure.com). In Azure AI Foundry, you can pick up where you left off by connecting to your existing Speech resource. For more information about connecting to an existing Speech resource, see [Connect to an existing Speech resource](../../ai-studio/ai-services/how-to/connect-ai-services.md#connect-azure-ai-services-after-you-create-a-project). -In [AI Foundry portal](https://ai.azure.com), you create a custom speech model by fine-tuning an Azure AI Speech base model with your own data. You can upload your data, test and train a custom model, compare accuracy between models, and deploy a model to a custom endpoint. +In [Azure AI Foundry portal](https://ai.azure.com), you create a custom speech model by fine-tuning an Azure AI Speech base model with your own data. You can upload your data, test and train a custom model, compare accuracy between models, and deploy a model to a custom endpoint. -This article shows you how to use fine-tuning in AI Foundry to create a custom speech model. 
For more information about custom speech, see the [custom speech overview](./custom-speech-overview.md) documentation. +This article shows you how to use fine-tuning in Azure AI Foundry to create a custom speech model. For more information about custom speech, see the [custom speech overview](./custom-speech-overview.md) documentation. ## Start fine-tuning a model with your data -In AI Foundry, you can fine-tune some Azure AI services models. For example, you can fine-tune a model for custom speech. Each custom model is specific to a [locale](language-support.md?tabs=stt). For example, you might fine-tune a model for English in the United States. +In Azure AI Foundry, you can fine-tune some Azure AI services models. For example, you can fine-tune a model for custom speech. Each custom model is specific to a [locale](language-support.md?tabs=stt). For example, you might fine-tune a model for English in the United States. -1. Go to your AI Foundry project. If you need to create a project, see [Create an AI Foundry project](../../ai-studio/how-to/create-projects.md). +1. Go to your Azure AI Foundry project. If you need to create a project, see [Create an Azure AI Foundry project](../../ai-studio/how-to/create-projects.md). 1. Select **Fine-tuning** from the left pane. 1. Select **AI Service fine-tuning** > **+ Fine-tune**. @@ -40,7 +40,7 @@ In AI Foundry, you can fine-tune some Azure AI services models. For example, you :::image type="content" source="./media/ai-studio/custom-speech/new-fine-tune-select-connection.png" alt-text="Screenshot of the page to select the connected service resource that you want to use for fine-tuning." 
lightbox="./media/ai-studio/custom-speech/new-fine-tune-select-connection.png"::: In this example, we can choose from the following options: - - **AI Service**: The Azure AI Services multi-service resource that [came with the AI Foundry project](../../ai-studio/ai-services/how-to/connect-ai-services.md#connect-azure-ai-services-when-you-create-a-project-for-the-first-time). + - **AI Service**: The Azure AI Services multi-service resource that [came with the Azure AI Foundry project](../../ai-studio/ai-services/how-to/connect-ai-services.md#connect-azure-ai-services-when-you-create-a-project-for-the-first-time). - **Speech Service**: An Azure AI Speech resource that was [connected after the project was created](../../ai-studio/ai-services/how-to/connect-ai-services.md#connect-azure-ai-services-after-you-create-a-project). 1. Enter a name and description for the fine-tuning job. Then select **Next**. diff --git a/articles/ai-services/speech-service/get-started-speech-to-text.md b/articles/ai-services/speech-service/get-started-speech-to-text.md index 16491e0edbc..ad84f4924d4 100644 --- a/articles/ai-services/speech-service/get-started-speech-to-text.md +++ b/articles/ai-services/speech-service/get-started-speech-to-text.md @@ -17,7 +17,7 @@ zone_pivot_groups: programming-languages-speech-services-studio # Quickstart: Recognize and convert speech to text ::: zone pivot="ai-studio" -[!INCLUDE [AI Foundry include](includes/quickstarts/speech-to-text-basics/ai-studio.md)] +[!INCLUDE [Azure AI Foundry include](includes/quickstarts/speech-to-text-basics/ai-studio.md)] ::: zone-end ::: zone pivot="programming-language-csharp" diff --git a/articles/ai-services/speech-service/includes/quickstarts/speech-to-text-basics/ai-studio.md b/articles/ai-services/speech-service/includes/quickstarts/speech-to-text-basics/ai-studio.md index 644e9a1ae2c..6ae065912d6 100644 --- a/articles/ai-services/speech-service/includes/quickstarts/speech-to-text-basics/ai-studio.md +++ 
b/articles/ai-services/speech-service/includes/quickstarts/speech-to-text-basics/ai-studio.md @@ -19,7 +19,7 @@ In this quickstart, you try real-time speech to text in [Azure AI Foundry](https ## Try real-time speech to text -1. Go to your AI Foundry project. If you need to create a project, see [Create an AI Foundry project](../../../../../ai-studio/how-to/create-projects.md). +1. Go to your Azure AI Foundry project. If you need to create a project, see [Create an Azure AI Foundry project](../../../../../ai-studio/how-to/create-projects.md). 1. Select **Playgrounds** from the left pane and then select a playground to use. In this example, select **Try the Speech playground**. :::image type="content" source="../../../../../ai-studio/media/ai-services/playgrounds/azure-ai-services-playgrounds.png" alt-text="Screenshot of the project level playgrounds that you can use." lightbox="../../../../../ai-studio/media/ai-services/playgrounds/azure-ai-services-playgrounds.png"::: diff --git a/articles/ai-services/speech-service/pronunciation-assessment-tool.md b/articles/ai-services/speech-service/pronunciation-assessment-tool.md index dc402cb556b..47ebe8519ed 100644 --- a/articles/ai-services/speech-service/pronunciation-assessment-tool.md +++ b/articles/ai-services/speech-service/pronunciation-assessment-tool.md @@ -22,7 +22,7 @@ Pronunciation assessment uses the speech to text capability to provide subjectiv > [!NOTE] > For information about availability of pronunciation assessment, see [supported languages](language-support.md?tabs=pronunciation-assessment) and [available regions](regions.md#speech-service). -This article describes how to use the pronunciation assessment tool without writing any code through the [AI Foundry portal](https://ai.azure.com/explore/aiservices/speech/pronunciationassessment). For information about how to integrate pronunciation assessment in your speech applications, see [How to use pronunciation assessment](how-to-pronunciation-assessment.md). 
+This article describes how to use the pronunciation assessment tool without writing any code through the [Azure AI Foundry portal](https://ai.azure.com/explore/aiservices/speech/pronunciationassessment). For information about how to integrate pronunciation assessment in your speech applications, see [How to use pronunciation assessment](how-to-pronunciation-assessment.md). ## Reading and speaking scenarios @@ -35,9 +35,9 @@ For pronunciation assessment, there are two scenarios: Reading and Speaking. Follow these steps to assess your pronunciation of the reference text: -1. Go to **Pronunciation assessment** in the [AI Foundry portal](https://ai.azure.com/explore/aiservices/speech). +1. Go to **Pronunciation assessment** in the [Azure AI Foundry portal](https://ai.azure.com/explore/aiservices/speech). - :::image type="content" source="media/pronunciation-assessment/pronunciation-assessment-select.png" alt-text="Screenshot of how to go to Pronunciation assessment in AI Foundry."::: + :::image type="content" source="media/pronunciation-assessment/pronunciation-assessment-select.png" alt-text="Screenshot of how to go to Pronunciation assessment in Azure AI Foundry."::: 1. On the Reading tab, choose a supported [language](language-support.md?tabs=pronunciation-assessment) that you want to evaluate the pronunciation. @@ -57,7 +57,7 @@ Follow these steps to assess your pronunciation of the reference text: If you want to conduct an unscripted assessment, select the Speaking tab. This feature allows you to conduct unscripted assessment without providing reference text in advance. Here's how to proceed: -1. Go to **Pronunciation assessment** in the [AI Foundry portal](https://ai.azure.com/explore/aiservices/speech). +1. Go to **Pronunciation assessment** in the [Azure AI Foundry portal](https://ai.azure.com/explore/aiservices/speech). 1. 
On the Speaking tab, choose a supported [language](language-support.md?tabs=pronunciation-assessment) that you want to evaluate the pronunciation. @@ -243,9 +243,9 @@ Pronunciation assessment provides various assessment results in different granul - Syllable-level accuracy scores are currently available via the [JSON file](?tabs=json#pronunciation-assessment-results) or [Speech SDK](how-to-pronunciation-assessment.md). - At the phoneme level, pronunciation assessment provides accuracy scores of each phoneme, helping learners to better understand the pronunciation details of their speech. -In addition to the baseline scores of accuracy, fluency, and completeness, the pronunciation assessment feature in the AI Foundry includes more comprehensive scores to provide detailed feedback on various aspects of speech performance and understanding. The enhanced scores are as follows: Prosody score, Vocabulary score, Grammar score, and Topic score. These scores offer valuable insights into speech prosody, vocabulary usage, grammar correctness, and topic understanding. +In addition to the baseline scores of accuracy, fluency, and completeness, the pronunciation assessment feature in the Azure AI Foundry portal includes more comprehensive scores to provide detailed feedback on various aspects of speech performance and understanding. The enhanced scores are as follows: Prosody score, Vocabulary score, Grammar score, and Topic score. These scores offer valuable insights into speech prosody, vocabulary usage, grammar correctness, and topic understanding.
-:::image type="content" source="media/pronunciation-assessment/speaking-score.png" alt-text="Screenshot of overall pronunciation score and overall content score in AI Foundry."::: +:::image type="content" source="media/pronunciation-assessment/speaking-score.png" alt-text="Screenshot of overall pronunciation score and overall content score in Azure AI Foundry."::: At the bottom of the Assessment result, two overall scores are displayed: Pronunciation score and Content score. In the Reading tab, you find the Pronunciation score displayed. In the Speaking tab, both the Pronunciation score and the Content score are displayed. @@ -268,9 +268,9 @@ These overall scores offer a comprehensive assessment of both pronunciation and ## Assessment scores in streaming mode -Pronunciation assessment supports uninterrupted streaming mode. The AI Foundry demo allows for up to 60 minutes of recording in streaming mode for evaluation. As long as you don't press the stop recording button, the evaluation process doesn't finish and you can pause and resume evaluation conveniently. +Pronunciation assessment supports uninterrupted streaming mode. The Azure AI Foundry demo allows for up to 60 minutes of recording in streaming mode for evaluation. As long as you don't press the stop recording button, the evaluation process doesn't finish and you can pause and resume evaluation conveniently. -Pronunciation assessment evaluates several aspects of pronunciation. At the bottom of **Assessment result**, you can see **Pronunciation score** as aggregated overall score, which includes 4 sub aspects: **Accuracy score**, **Fluency score**, **Completeness score**, and **Prosody score**. 
In streaming mode, since the **Accuracy score**, **Fluency score**, and **Prosody score** will vary over time throughout the recording process, we demonstrate an approach in AI Foundry to display approximate overall score incrementally before the end of the evaluation, which weighted only with Accuracy score, Fluency score, and Prosody score. The **Completeness score** is only calculated at the end of the evaluation after you press the stop button, so the final pronunciation overall score is aggregated from **Accuracy score**, **Fluency score**, **Completeness score**, and **Prosody score** with weight. +Pronunciation assessment evaluates several aspects of pronunciation. At the bottom of **Assessment result**, you can see the **Pronunciation score** as an aggregated overall score that includes four sub-aspects: **Accuracy score**, **Fluency score**, **Completeness score**, and **Prosody score**. In streaming mode, since the **Accuracy score**, **Fluency score**, and **Prosody score** vary over time throughout the recording process, Azure AI Foundry displays an approximate overall score incrementally before the end of the evaluation, weighted using only the Accuracy, Fluency, and Prosody scores. The **Completeness score** is only calculated at the end of the evaluation after you press the stop button, so the final overall pronunciation score is a weighted aggregate of the **Accuracy score**, **Fluency score**, **Completeness score**, and **Prosody score**. Refer to the demo examples below for the whole process of evaluating pronunciation in streaming mode.
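The streaming aggregation described above can be sketched as follows. The equal weighting is an illustrative assumption, not the service's published formula; the point is only that the Completeness score joins the aggregate after recording stops:

```python
def overall_score(accuracy, fluency, prosody, completeness=None):
    """Aggregate sub-scores into an approximate overall pronunciation score.

    While streaming, completeness isn't known yet, so only the first three
    scores contribute; once recording stops, completeness joins the average.
    Equal weights are an assumption for illustration only.
    """
    scores = [accuracy, fluency, prosody]
    if completeness is not None:
        scores.append(completeness)
    return sum(scores) / len(scores)


interim = overall_score(92, 85, 78)        # shown incrementally during recording
final = overall_score(92, 85, 78, 100)     # after you press the stop button
```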
diff --git a/articles/ai-services/speech-service/toc.yml b/articles/ai-services/speech-service/toc.yml index 3a802e9a9f6..9cfbcd66781 100644 --- a/articles/ai-services/speech-service/toc.yml +++ b/articles/ai-services/speech-service/toc.yml @@ -289,7 +289,7 @@ items: href: language-learning-overview.md - name: Pronunciation Assessment with speech to text items: - - name: Reading and speaking assessment in AI Foundry + - name: Reading and speaking assessment in Azure AI Foundry href: pronunciation-assessment-tool.md displayName: pronounce, learn language, assess pron - name: Interactive language learning with pronunciation assessment diff --git a/articles/ai-studio/ai-services/content-safety-overview.md b/articles/ai-studio/ai-services/content-safety-overview.md index 3e20190d80e..f4894db479a 100644 --- a/articles/ai-studio/ai-services/content-safety-overview.md +++ b/articles/ai-studio/ai-services/content-safety-overview.md @@ -14,7 +14,7 @@ author: PatrickFarley # Content Safety in Azure AI Foundry portal -Azure AI Content Safety is an AI service that detects harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety includes various APIs that allow you to detect and prevent the output of harmful content. The interactive Content Safety **try out** page in AI Foundry portal allows you to view, explore, and try out sample code for detecting harmful content across different modalities. +Azure AI Content Safety is an AI service that detects harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety includes various APIs that allow you to detect and prevent the output of harmful content. The interactive Content Safety **try out** page in Azure AI Foundry portal allows you to view, explore, and try out sample code for detecting harmful content across different modalities. 
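Beyond the **try out** page, the same detection capabilities are exposed as REST APIs. As a hedged sketch (the endpoint value is a placeholder and the API version is an assumption to verify against the Content Safety reference), a text-analysis request targets a URL of this shape:

```python
def text_analyze_url(endpoint: str, api_version: str = "2023-10-01") -> str:
    """Build the Content Safety text-analysis endpoint URL.

    `endpoint` is a placeholder for your resource endpoint; the
    api-version default is an assumption, not taken from this article.
    """
    return (f"{endpoint.rstrip('/')}/contentsafety/text:analyze"
            f"?api-version={api_version}")


# Hypothetical resource endpoint; the request body carries the text to screen.
url = text_analyze_url("https://contoso.cognitiveservices.azure.com/")
payload = {"text": "Sample user input to screen."}
```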
## Features diff --git a/articles/ai-studio/ai-services/how-to/connect-ai-services.md b/articles/ai-studio/ai-services/how-to/connect-ai-services.md index 2a5e097fc06..0e63fd1c2f5 100644 --- a/articles/ai-studio/ai-services/how-to/connect-ai-services.md +++ b/articles/ai-studio/ai-services/how-to/connect-ai-services.md @@ -1,7 +1,7 @@ --- -title: How to use Azure AI services in AI Foundry portal +title: How to use Azure AI services in Azure AI Foundry portal titleSuffix: Azure AI Foundry -description: Learn how to use Azure AI services in AI Foundry portal. You can use existing Azure AI services resources in AI Foundry portal by creating a connection to the resource. +description: Learn how to use Azure AI services in Azure AI Foundry portal. You can use existing Azure AI services resources in Azure AI Foundry portal by creating a connection to the resource. manager: nitinme ms.service: azure-ai-studio ms.custom: @@ -15,20 +15,20 @@ ms.author: eur author: eric-urban --- -# How to use Azure AI services in AI Foundry portal +# How to use Azure AI services in Azure AI Foundry portal You might have existing resources for Azure AI services that you used in the old studios such as Azure OpenAI Studio or Speech Studio. You can pick up where you left off by using your existing resources in the [Azure AI Foundry portal](https://ai.azure.com). -This article describes how to use new or existing Azure AI services resources in an AI Foundry project. +This article describes how to use new or existing Azure AI services resources in an Azure AI Foundry project. ## Usage scenarios -Depending on the AI service and model you want to use, you can use them in AI Foundry portal via: -- [Bring your existing Azure AI services resources](#bring-your-existing-azure-ai-services-resources-into-a-project) into a project. You can use your existing Azure AI services resources in an AI Foundry project by creating a connection to the resource. 
+Depending on the AI service and model you want to use, you can use them in Azure AI Foundry portal via: +- [Bring your existing Azure AI services resources](#bring-your-existing-azure-ai-services-resources-into-a-project) into a project. You can use your existing Azure AI services resources in an Azure AI Foundry project by creating a connection to the resource. - The [model catalog](#discover-azure-ai-models-in-the-model-catalog). You don't need a project to browse and discover Azure AI models. Some of the Azure AI services are available for you to try via the model catalog without a project. Some Azure AI services require a project to use in the playgrounds. - The [project-level playgrounds](#try-azure-ai-services-in-the-project-level-playgrounds). You need a project to try Azure AI services such as Azure AI Speech and Azure AI Language. - [Azure AI Services demo pages](#try-out-azure-ai-services-demos). You can browse Azure AI services capabilities and step through the demos. You can try some limited demos for free without a project. -- [Fine-tune](#fine-tune-azure-ai-services-models) models. You can fine-tune a subset of Azure AI services models in AI Foundry portal. +- [Fine-tune](#fine-tune-azure-ai-services-models) models. You can fine-tune a subset of Azure AI services models in Azure AI Foundry portal. - [Deploy](#deploy-models-to-production) models. You can deploy base models and fine-tuned models to production. Most Azure AI services models are already deployed and ready to use. ## Bring your existing Azure AI services resources into a project @@ -44,14 +44,14 @@ When you create a project for the first time, you also create a hub. When you cr :::image type="content" source="../../media/how-to/projects/projects-create-resource.png" alt-text="Screenshot of the create resource page within the create project dialog." 
lightbox="../../media/how-to/projects/projects-create-resource.png"::: -For more details about creating a project, see the [create an AI Foundry project](../../how-to/create-projects.md) how-to guide or the [create a project and use the chat playground](../../quickstarts/get-started-playground.md) quickstart. +For more details about creating a project, see the [create an Azure AI Foundry project](../../how-to/create-projects.md) how-to guide or the [create a project and use the chat playground](../../quickstarts/get-started-playground.md) quickstart. ### Connect Azure AI services after you create a project -To use your existing Azure AI services resources (such as Azure AI Speech) in an AI Foundry project, you need to create a connection to the resource. +To use your existing Azure AI services resources (such as Azure AI Speech) in an Azure AI Foundry project, you need to create a connection to the resource. -1. Create an AI Foundry project. For detailed instructions, see [Create an AI Foundry project](../../how-to/create-projects.md). -1. Go to your AI Foundry project. +1. Create an Azure AI Foundry project. For detailed instructions, see [Create an Azure AI Foundry project](../../how-to/create-projects.md). +1. Go to your Azure AI Foundry project. 1. Select **Management center** from the left pane. 1. Select **Connected resources** (under **Project**) from the left pane. 1. Select **+ New connection**. @@ -72,12 +72,12 @@ To use your existing Azure AI services resources (such as Azure AI Speech) in an You can discover Azure AI models in the model catalog without a project. Some Azure AI services are available for you to try via the model catalog without a project. -1. Go to the [AI Foundry home page](https://ai.azure.com). +1. Go to the [Azure AI Foundry home page](https://ai.azure.com). 1. Select the tile that says **Model catalog and benchmarks**. 
:::image type="content" source="../../media/explore/ai-studio-home-model-catalog.png" alt-text="Screenshot of the home page in Azure AI Foundry portal with the option to select the model catalog tile." lightbox="../../media/explore/ai-studio-home-model-catalog.png"::: - If you don't see this tile, you can also go directly to the [Azure AI model catalog page](https://ai.azure.com/explore/models) in AI Foundry portal. + If you don't see this tile, you can also go directly to the [Azure AI model catalog page](https://ai.azure.com/explore/models) in Azure AI Foundry portal. 1. From the **Collections** dropdown, select **Microsoft**. Search for Azure AI services models by entering **azure-ai** in the search box. @@ -89,7 +89,7 @@ You can discover Azure AI models in the model catalog without a project. Some Az In the project-level playgrounds, you can try Azure AI services such as Azure AI Speech and Azure AI Language. -1. Go to your AI Foundry project. If you need to create a project, see [Create an AI Foundry project](../../how-to/create-projects.md). +1. Go to your Azure AI Foundry project. If you need to create a project, see [Create an Azure AI Foundry project](../../how-to/create-projects.md). 1. Select **Playgrounds** from the left pane and then select a playground to use. In this example, select **Try the Speech playground**. :::image type="content" source="../../media/ai-services/playgrounds/azure-ai-services-playgrounds.png" alt-text="Screenshot of the project level playgrounds that you can use." lightbox="../../media/ai-services/playgrounds/azure-ai-services-playgrounds.png"::: @@ -106,12 +106,12 @@ If you have other connected resources, you can use them in the corresponding pla You can browse Azure AI services capabilities and step through the demos. You can try some limited demos for free without a project. -1. 
Go to the [AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure AI services resource. +1. Go to the [Azure AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure AI services resource. 1. Find the tile that says **Explore Azure AI Services** and select **Try now**. :::image type="content" source="../../media/explore/home-ai-services.png" alt-text="Screenshot of the home page in Azure AI Foundry portal with the option to select Azure AI Services." lightbox="../../media/explore/home-ai-services.png"::: - If you don't see this tile, you can also go directly to the [Azure AI Services page](https://ai.azure.com/explore/aiservices) in AI Foundry portal. + If you don't see this tile, you can also go directly to the [Azure AI Services page](https://ai.azure.com/explore/aiservices) in Azure AI Foundry portal. 1. You should see tiles for Azure AI services that you can try. Select a tile to get to the demo page for that service. For example, select **Language + Translator**. @@ -121,9 +121,9 @@ The presentation and flow of the demo pages might vary depending on the service. ## Fine-tune Azure AI services models -In AI Foundry portal, you can fine-tune some Azure AI services models. For example, you can fine-tune a model for custom speech. +In Azure AI Foundry portal, you can fine-tune some Azure AI services models. For example, you can fine-tune a model for custom speech. -1. Go to your AI Foundry project. If you need to create a project, see [Create an AI Foundry project](../../how-to/create-projects.md). +1. Go to your Azure AI Foundry project. If you need to create a project, see [Create an Azure AI Foundry project](../../how-to/create-projects.md). 1. Select **Fine-tuning** from the left pane. 1. Select **AI Service fine-tuning**. @@ -136,7 +136,7 @@ In AI Foundry portal, you can fine-tune some Azure AI services models. 
For examp Once you have a project, several Azure AI services models are already deployed and ready to use. -1. Go to your AI Foundry project. +1. Go to your Azure AI Foundry project. 1. Select **Management center** from the left pane. 1. Select **Models + endpoints** (under **Project**) from the left pane. 1. Select the **Service deployments** tab to view the list of Azure AI services models that are already deployed. diff --git a/articles/ai-studio/ai-services/how-to/connect-azure-openai.md b/articles/ai-studio/ai-services/how-to/connect-azure-openai.md index a300b53ab4c..baa4828ecef 100644 --- a/articles/ai-studio/ai-services/how-to/connect-azure-openai.md +++ b/articles/ai-studio/ai-services/how-to/connect-azure-openai.md @@ -1,7 +1,7 @@ --- -title: How to use Azure OpenAI Service in AI Foundry portal +title: How to use Azure OpenAI Service in Azure AI Foundry portal titleSuffix: Azure AI Foundry -description: Learn how to use Azure OpenAI Service in AI Foundry portal. +description: Learn how to use Azure OpenAI Service in Azure AI Foundry portal. manager: nitinme ms.service: azure-ai-studio ms.custom: @@ -15,28 +15,28 @@ ms.author: eur author: eric-urban --- -# How to use Azure OpenAI Service in AI Foundry portal +# How to use Azure OpenAI Service in Azure AI Foundry portal -You might have existing Azure OpenAI Service resources and model deployments that you created using the old Azure OpenAI Studio or via code. You can pick up where you left off by using your existing resources in AI Foundry portal. +You might have existing Azure OpenAI Service resources and model deployments that you created using the old Azure OpenAI Studio or via code. You can pick up where you left off by using your existing resources in Azure AI Foundry portal. This article describes how to: - Use Azure OpenAI Service models outside of a project. -- Use Azure OpenAI Service models and an AI Foundry project. +- Use Azure OpenAI Service models and an Azure AI Foundry project. 
> [!TIP] -> You can use Azure OpenAI Service in AI Foundry portal without creating a project or a connection. When you're working with the models and deployments, we recommend that you work outside of a project. Eventually, you want to work in a project for tasks such as managing connections, permissions, and deploying the models to production. +> You can use Azure OpenAI Service in Azure AI Foundry portal without creating a project or a connection. When you're working with only the Azure OpenAI models and deployments, we recommend that you work outside of a project. Eventually, you'll want to work in a project for tasks such as managing connections, permissions, and deploying the models to production. ## Use Azure OpenAI models outside of a project -You can use your existing Azure OpenAI model deployments in AI Foundry portal outside of a project. Start here if you previously deployed models using the old Azure OpenAI Studio or via the Azure OpenAI Service SDKs and APIs. +You can use your existing Azure OpenAI model deployments in Azure AI Foundry portal outside of a project. Start here if you previously deployed models using the old Azure OpenAI Studio or via the Azure OpenAI Service SDKs and APIs. To use Azure OpenAI Service outside of a project, follow these steps: -1. Go to the [AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource. +1. Go to the [Azure AI Foundry home page](https://ai.azure.com) and make sure you're signed in with the Azure subscription that has your Azure OpenAI Service resource. 1. Find the tile that says **Focused on Azure OpenAI Service?** and select **Let's go**. :::image type="content" source="../../media/azure-openai-in-ai-studio/home-page.png" alt-text="Screenshot of the home page in Azure AI Foundry portal with the option to select Azure OpenAI Service."
lightbox="../../media/azure-openai-in-ai-studio/home-page.png"::: - If you don't see this tile, you can also go directly to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in AI Foundry portal. + If you don't see this tile, you can also go directly to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in Azure AI Foundry portal. 1. You should see your existing Azure OpenAI Service resources. In this example, the Azure OpenAI Service resource `contoso-azure-openai-eastus` is selected. @@ -48,7 +48,7 @@ If you create more Azure OpenAI Service resources later (such as via the Azure p ## Use Azure OpenAI Service in a project -You might eventually want to use a project for tasks such as managing connections, permissions, and deploying models to production. You can use your existing Azure OpenAI Service resources in an AI Foundry project. +You might eventually want to use a project for tasks such as managing connections, permissions, and deploying models to production. You can use your existing Azure OpenAI Service resources in an Azure AI Foundry project. Let's look at two ways to connect Azure OpenAI Service resources to a project: @@ -61,13 +61,13 @@ When you create a project for the first time, you also create a hub. When you cr :::image type="content" source="../../media/how-to/projects/projects-create-resource.png" alt-text="Screenshot of the create resource page within the create project dialog." lightbox="../../media/how-to/projects/projects-create-resource.png"::: -For more details about creating a project, see the [create an AI Foundry project](../../how-to/create-projects.md) how-to guide or the [create a project and use the chat playground](../../quickstarts/get-started-playground.md) quickstart. 
+For more details about creating a project, see the [create an Azure AI Foundry project](../../how-to/create-projects.md) how-to guide or the [create a project and use the chat playground](../../quickstarts/get-started-playground.md) quickstart. ### Connect Azure OpenAI Service after you create a project If you already have a project and you want to connect your existing Azure OpenAI Service resources, follow these steps: -1. Go to your AI Foundry project. +1. Go to your Azure AI Foundry project. 1. Select **Management center** from the left pane. 1. Select **Connected resources** (under **Project**) from the left pane. 1. Select **+ New connection**. @@ -91,7 +91,7 @@ You can try Azure OpenAI models in the Azure OpenAI Service playgrounds outside > [!TIP] > You can also try Azure OpenAI models in the project-level playgrounds. However, while you're only working with the Azure OpenAI Service models, we recommend working outside of a project. -1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in AI Foundry portal. +1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in Azure AI Foundry portal. 1. Select a playground from under **Resource playground** in the left pane. :::image type="content" source="../../media/ai-services/playgrounds/azure-openai-studio-playgrounds.png" alt-text="Screenshot of the playgrounds that you can select to use Azure OpenAI Service." lightbox="../../media/ai-services/playgrounds/azure-openai-studio-playgrounds.png"::: @@ -106,9 +106,9 @@ Each playground has different model requirements and capabilities. The supported ## Fine-tune Azure OpenAI models -In AI Foundry portal, you can fine-tune several Azure OpenAI models. The purpose is typically to improve model performance on specific tasks or to introduce information that wasn't well represented when you originally trained the base model. +In Azure AI Foundry portal, you can fine-tune several Azure OpenAI models. 
The purpose is typically to improve model performance on specific tasks or to introduce information that wasn't well represented when the base model was originally trained. -1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in AI Foundry portal to fine-tune Azure OpenAI models. +1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in Azure AI Foundry portal to fine-tune Azure OpenAI models. 1. Select **Fine-tuning** from the left pane. :::image type="content" source="../../media/ai-services/fine-tune-azure-openai.png" alt-text="Screenshot of the page to select fine-tuning of Azure OpenAI Service models." lightbox="../../media/ai-services/fine-tune-azure-openai.png"::: @@ -117,16 +117,16 @@ In AI Foundry portal, you can fine-tune several Azure OpenAI models. The purpose 1. Follow the [detailed how to guide](../../../ai-services/openai/how-to/fine-tuning.md?context=/azure/ai-studio/context/context) to fine-tune the model. For more information about fine-tuning Azure AI models, see: -- [Overview of fine-tuning in AI Foundry portal](../../concepts/fine-tuning-overview.md) +- [Overview of fine-tuning in Azure AI Foundry portal](../../concepts/fine-tuning-overview.md) - [How to fine-tune Azure OpenAI models](../../../ai-services/openai/how-to/fine-tuning.md?context=/azure/ai-studio/context/context) - [Azure OpenAI models that are available for fine-tuning](../../../ai-services/openai/concepts/models.md?context=/azure/ai-studio/context/context) ## Deploy models to production -You can deploy Azure OpenAI base models and fine-tuned models to production via the AI Foundry portal. +You can deploy Azure OpenAI base models and fine-tuned models to production via the Azure AI Foundry portal. -1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in AI Foundry portal. +1. Go to the [Azure OpenAI Service page](https://ai.azure.com/resource/overview) in Azure AI Foundry portal. 1.
Select **Deployments** from the left pane. :::image type="content" source="../../media/ai-services/endpoint/models-endpoints-azure-openai-deployments.png" alt-text="Screenshot of the models and endpoints page to view and create Azure OpenAI Service deployments." lightbox="../../media/ai-services/endpoint/models-endpoints-azure-openai-deployments.png"::: @@ -145,5 +145,5 @@ At some point, you want to develop apps with code. Here are some developer resou ## Related content -- [Azure OpenAI in AI Foundry portal](../../azure-openai-in-ai-studio.md) +- [Azure OpenAI in Azure AI Foundry portal](../../azure-openai-in-ai-studio.md) - [Use Azure AI services resources](./connect-ai-services.md) diff --git a/articles/ai-studio/ai-services/how-to/quickstart-github-models.md b/articles/ai-studio/ai-services/how-to/quickstart-github-models.md index b1d4132925b..6350d269aea 100644 --- a/articles/ai-studio/ai-services/how-to/quickstart-github-models.md +++ b/articles/ai-studio/ai-services/how-to/quickstart-github-models.md @@ -46,7 +46,7 @@ To obtain the key and endpoint: 1. Once you've signed in to your Azure account, you're taken to [Azure AI Foundry](https://ai.azure.com). -1. At the top of the page, select **Go to your GitHub AI resource** to go to Azure AI Foundry / Github](https://ai.azure.com/github). It might take one or two minutes to load your initial model details in AI Foundry portal. +1. At the top of the page, select **Go to your GitHub AI resource** to go to [Azure AI Foundry / GitHub](https://ai.azure.com/github). It might take one or two minutes to load your initial model details in Azure AI Foundry portal. 1. The page is loaded with your model's details. Select the **Create a Deployment** button to deploy the model to your account.
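Once you have the key and endpoint for your deployment, a first request can be sketched as follows. This is a minimal sketch assuming an Azure OpenAI-style chat completions route; the endpoint, deployment name, and API version below are placeholders to replace with your own values.

```python
import json

# Placeholder values -- copy the real endpoint and key from your deployment's
# details page. The deployment name and API version here are assumptions.
endpoint = "https://contoso.openai.azure.com"
deployment = "gpt-4o-mini"
api_version = "2024-06-01"

# Chat completions route for a named deployment (Azure OpenAI-style endpoint).
url = f"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"

# Request body: a list of role/content messages, serialized as JSON.
payload = json.dumps({
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "max_tokens": 64,
})

print(url)
```

The `url` and `payload` can then be sent with any HTTP client, passing your key in the `api-key` request header.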
diff --git a/articles/ai-studio/azure-openai-in-ai-studio.md b/articles/ai-studio/azure-openai-in-ai-studio.md index 5e725745f98..07f55536bff 100644 --- a/articles/ai-studio/azure-openai-in-ai-studio.md +++ b/articles/ai-studio/azure-openai-in-ai-studio.md @@ -53,7 +53,7 @@ Use the left navigation area to perform your tasks with Azure OpenAI models: While the previous sections show how to focus on just the Azure OpenAI Service, you can also incorporate other AI services and models from various providers in Azure AI Foundry portal. You can access the Azure OpenAI Service in two ways: * When you focus on just the Azure OpenAI Service, as described in the previous sections, you don't use a project. -* Azure AI Foundry portal uses a project to organize your work and save state while building customized AI apps. When you work in a project, you can connect to the service. For more information, see [How to use Azure OpenAI Service in AI Foundry portal](ai-services/how-to/connect-azure-openai.md#project). +* Azure AI Foundry portal uses a project to organize your work and save state while building customized AI apps. When you work in a project, you can connect to the service. For more information, see [How to use Azure OpenAI Service in Azure AI Foundry portal](ai-services/how-to/connect-azure-openai.md#project). When you create a project, you can try other models and tools along with Azure OpenAI. For example, the **Model catalog** in a project contains many more models than just Azure OpenAI models. Inside a project, you'll have access to features that are common across all AI services and models. @@ -77,15 +77,15 @@ Pay attention to the top left corner of the screen to see which context you are * When you are in the Azure AI Foundry portal landing page, with choices of where to go next, you see **Azure AI Foundry**. 
- :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-no-project.png" alt-text="Screenshot shows top left corner of screen for AI Foundry without a project."::: + :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-no-project.png" alt-text="Screenshot shows top left corner of screen for Azure AI Foundry without a project."::: * When you are in a project, you see **Azure AI Foundry / project name**. The project name allows you to switch between projects. - :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-project.png" alt-text="Screenshot shows top left corner of screen for AI Foundry with a project."::: + :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-project.png" alt-text="Screenshot shows top left corner of screen for Azure AI Foundry with a project."::: * When you're working with Azure OpenAI outside of a project, you see **Azure AI Foundry | Azure OpenAI / resource name**. The resource name allows you to switch between Azure OpenAI resources. - :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-azure-openai.png" alt-text="Screenshot shows top left corner of screen for AI Foundry when using Azure OpenAI without a project."::: + :::image type="content" source="media/azure-openai-in-ai-studio/ai-studio-azure-openai.png" alt-text="Screenshot shows top left corner of screen for Azure AI Foundry when using Azure OpenAI without a project."::: Use the **Azure AI Foundry** breadcrumb to navigate back to the Azure AI Foundry portal home page. 
diff --git a/articles/ai-studio/breadcrumb/toc.yml b/articles/ai-studio/breadcrumb/toc.yml index 44c96228a2f..9aedb327069 100644 --- a/articles/ai-studio/breadcrumb/toc.yml +++ b/articles/ai-studio/breadcrumb/toc.yml @@ -2,6 +2,6 @@ tocHref: /azure/ topicHref: /azure/index items: - - name: AI Foundry + - name: Azure AI Foundry tocHref: /azure/ai-studio/ topicHref: /azure/ai-studio/index diff --git a/articles/ai-studio/concepts/ai-resources.md b/articles/ai-studio/concepts/ai-resources.md index 8a78fc61c6f..829e681771e 100644 --- a/articles/ai-studio/concepts/ai-resources.md +++ b/articles/ai-studio/concepts/ai-resources.md @@ -1,7 +1,7 @@ --- title: Manage, collaborate, and organize with hubs titleSuffix: Azure AI Foundry -description: This article introduces concepts about Azure AI Foundry hubs for your AI Foundry projects. +description: This article introduces concepts about Azure AI Foundry hubs for your Azure AI Foundry projects. manager: scottpolly ms.service: azure-ai-studio ms.custom: @@ -18,7 +18,7 @@ author: Blackmist # Manage, collaborate, and organize with hubs -Hubs are the primary top-level Azure resource for AI Foundry and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help. +Hubs are the primary top-level Azure resource for Azure AI Foundry and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help. Project workspaces that are created using a hub inherit the same security settings and shared resource access. Teams can create project workspaces as needed to organize their work, isolate data, and/or restrict access. 
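As the architecture article in this same change notes, hubs and projects are both `Microsoft.MachineLearningServices/workspaces` resources under the hood, distinguished by their `kind`. The following sketch shows how their Azure Resource Manager IDs are formed, which is handy when scripting against the REST APIs or Azure cost reporting; the subscription, resource group, and workspace names are illustrative.

```python
def workspace_resource_id(subscription_id: str, resource_group: str, workspace_name: str) -> str:
    """Build the ARM resource ID for an Azure AI Foundry hub or project.

    Both hubs and projects use the Microsoft.MachineLearningServices/workspaces
    resource type; the resource's `kind` ('Hub' or 'Project') tells them apart.
    """
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.MachineLearningServices/workspaces/{workspace_name}"
    )

hub_id = workspace_resource_id(
    "00000000-0000-0000-0000-000000000000", "rg-contoso-ai", "contoso-hub"
)
print(hub_id)
```

The same ID shape works for a project created from the hub; only the workspace name changes.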
@@ -36,7 +36,7 @@ Get started by [creating your first hub in Azure AI Foundry portal](../how-to/cr Often, projects in a business domain require access to the same company resources such as vector indices, model endpoints, or repos. As a team lead, you can preconfigure connectivity with these resources within a hub, so developers can access them from any new project workspace without delay on IT. -[Connections](connections.md) let you access objects in AI Foundry portal that are managed outside of your hub. For example, uploaded data on an Azure storage account, or model deployments on an existing Azure OpenAI resource. A connection can be shared with every project or made accessible to one specific project. Connections can be configured with key-based access or Microsoft Entra ID to authorize access to users on the connected resource. Plus, as an administrator, you can track, audit, and manage connections across projects using your hub. +[Connections](connections.md) let you access objects in Azure AI Foundry portal that are managed outside of your hub. For example, uploaded data on an Azure storage account, or model deployments on an existing Azure OpenAI resource. A connection can be shared with every project or made accessible to one specific project. Connections can be configured with key-based access or Microsoft Entra ID to authorize access to users on the connected resource. Plus, as an administrator, you can track, audit, and manage connections across projects using your hub. ## Shared Azure resources and configurations @@ -44,18 +44,18 @@ Various management concepts are available on hubs to support team leads and admi * **Security configuration** including public network access, [virtual networking](#virtual-networking), customer-managed key encryption, and privileged access to whom can create projects for customization. Security settings configured on the hub automatically pass down to each project. 
A managed virtual network is shared between all projects that share the same hub. * **Connections** are named and authenticated references to Azure and non-Azure resources like data storage providers. Use a connection as a means for making an external resource available to a group of developers without having to expose its stored credential to an individual. -* **Compute and quota allocation** is managed as shared capacity for all projects in AI Foundry portal that share the same hub. This quota includes compute instance as managed cloud-based workstation for an individual. The same user can use a compute instance across projects. +* **Compute and quota allocation** is managed as shared capacity for all projects in Azure AI Foundry portal that share the same hub. This quota includes a compute instance as a managed cloud-based workstation for an individual. The same user can use a compute instance across projects. * **AI services access keys** to endpoints for prebuilt AI models are managed on the hub scope. Use these endpoints to access foundation models from Azure OpenAI, Speech, Vision, and Content Safety with one [API key](#azure-ai-services-api-access-keys) * **Policy** enforced in Azure on the hub scope applies to all projects managed under it. -* **Dependent Azure resources** are set up once per hub and associated projects and used to store artifacts you generate while working in AI Foundry portal such as logs or when uploading data. For more information, see [Azure AI dependencies](#azure-ai-dependencies). +* **Dependent Azure resources** are set up once per hub and associated projects and used to store artifacts you generate while working in Azure AI Foundry portal such as logs or when uploading data. For more information, see [Azure AI dependencies](#azure-ai-dependencies). ## Organize work in projects for customization -A hub provides the hosting environment for [projects](../how-to/create-projects.md) in AI Foundry portal.
A project is an organizational container that has tools for AI customization and orchestration. It lets you organize your work, save state across different tools like prompt flow, and collaborate with others. For example, you can share uploaded files and connections to data sources. +A hub provides the hosting environment for [projects](../how-to/create-projects.md) in Azure AI Foundry portal. A project is an organizational container that has tools for AI customization and orchestration. It lets you organize your work, save state across different tools like prompt flow, and collaborate with others. For example, you can share uploaded files and connections to data sources. Multiple projects can use a hub, and multiple users can use a project. A project also helps you keep track of billing, and manage access and provides data isolation. Every project uses dedicated storage containers to let you upload files and share it with only other project members when using the 'data' experiences. -Projects let you create and group reusable components that can be used across tools in AI Foundry portal: +Projects let you create and group reusable components that can be used across tools in Azure AI Foundry portal: | Asset | Description | | --- | --- | @@ -72,11 +72,11 @@ Projects also have specific settings that only hold for that project: | Prompt flow runtime | Prompt flow is a feature that can be used to generate, customize, or run a flow. To use prompt flow, you need to create a runtime on top of a compute instance. | > [!NOTE] -> In AI Foundry portal you can also manage language and notification settings that apply to all projects that you can access regardless of the hub or project. +> In Azure AI Foundry portal you can also manage language and notification settings that apply to all projects that you can access regardless of the hub or project. 
## Azure AI services API access keys -The hub allows you to set up connections to existing Azure OpenAI or Azure AI Services resource types, which can be used to host model deployments. You can access these model deployments from connected resources in AI Foundry portal. Keys to connected resources can be listed from the AI Foundry portal or Azure portal. For more information, see [Find Azure AI Foundry resources in the Azure portal](#find-azure-ai-foundry-resources-in-the-azure-portal). +The hub allows you to set up connections to existing Azure OpenAI or Azure AI Services resource types, which can be used to host model deployments. You can access these model deployments from connected resources in Azure AI Foundry portal. Keys to connected resources can be listed from the Azure AI Foundry portal or Azure portal. For more information, see [Find Azure AI Foundry resources in the Azure portal](#find-azure-ai-foundry-resources-in-the-azure-portal). ### Virtual networking @@ -97,11 +97,11 @@ Connections can be set up as shared with all projects in the same hub, or create ## Azure AI dependencies -Azure AI Foundry layers on top of existing Azure services including Azure AI and Azure Machine Learning services. While it might not be visible on the display names in Azure portal, AI Foundry, or when using the SDK or CLI, some of these architectural details become apparent when you work with the Azure REST APIs, use Azure cost reporting, or use infrastructure-as-code templates such as Azure Bicep or Azure Resource Manager. From an Azure Resource Provider perspective, Azure AI Foundry resource types map to the following resource provider kinds: +Azure AI Foundry layers on top of existing Azure services including Azure AI and Azure Machine Learning services. 
While it might not be visible on the display names in Azure portal, Azure AI Foundry, or when using the SDK or CLI, some of these architectural details become apparent when you work with the Azure REST APIs, use Azure cost reporting, or use infrastructure-as-code templates such as Azure Bicep or Azure Resource Manager. From an Azure Resource Provider perspective, Azure AI Foundry resource types map to the following resource provider kinds: [!INCLUDE [Resource provider kinds](../includes/resource-provider-kinds.md)] -When you create a new hub, a set of dependent Azure resources are required to store data that you upload or get generated when working in AI Foundry portal. If not provided by you, and required, these resources are automatically created. +When you create a new hub, a set of dependent Azure resources is required to store data that you upload or that's generated while you work in Azure AI Foundry portal. If you don't provide a required resource, it's created automatically. [!INCLUDE [Dependent Azure resources](../includes/dependent-resources.md)] @@ -124,7 +124,7 @@ In the Azure portal, you can find resources that correspond to your project in A 1. In [Azure AI Foundry](https://ai.azure.com), go to a project and select **Management center** to view your project resources. 1. From the management center, select the overview for either your hub or project and then select the link to **Manage in Azure portal**. - :::image type="content" source="../media/concepts/azureai-project-view-ai-studio.png" alt-text="Screenshot of the AI Foundry project overview page with links to the Azure portal." lightbox="../media/concepts/azureai-project-view-ai-studio.png"::: + :::image type="content" source="../media/concepts/azureai-project-view-ai-studio.png" alt-text="Screenshot of the Azure AI Foundry project overview page with links to the Azure portal."
lightbox="../media/concepts/azureai-project-view-ai-studio.png"::: ## Next steps diff --git a/articles/ai-studio/concepts/architecture.md b/articles/ai-studio/concepts/architecture.md index c6d1a256129..c02a1df3d35 100644 --- a/articles/ai-studio/concepts/architecture.md +++ b/articles/ai-studio/concepts/architecture.md @@ -16,15 +16,15 @@ author: Blackmist # Azure AI Foundry architecture -AI Foundry provides a unified experience for AI developers and data scientists to build, evaluate, and deploy AI models through a web portal, SDK, or CLI. AI Foundry is built on capabilities and services provided by other Azure services. +Azure AI Foundry provides a unified experience for AI developers and data scientists to build, evaluate, and deploy AI models through a web portal, SDK, or CLI. Azure AI Foundry is built on capabilities and services provided by other Azure services. [!INCLUDE [new-name](../includes/new-name.md)] :::image type="content" source="../media/concepts/ai-studio-architecture.png" alt-text="Diagram of the high-level architecture of Azure AI Foundry." lightbox="../media/concepts/ai-studio-architecture.png"::: -At the top level, AI Foundry provides access to the following resources: +At the top level, Azure AI Foundry provides access to the following resources: - + - **Azure OpenAI**: Provides access to the latest Open AI models. You can create secure deployments, try playgrounds, fine tune models, content filters, and batch jobs. The Azure OpenAI resource provider is `Microsoft.CognitiveServices/account` and the kind of resource is `OpenAI`. You can also connect to Azure OpenAI by using a kind of `AIServices`, which also includes other [Azure AI services](/azure/ai-services/what-are-ai-services). @@ -32,10 +32,10 @@ At the top level, AI Foundry provides access to the following resources: For more information, visit [Azure OpenAI in Azure AI Foundry portal](../azure-openai-in-ai-studio.md). 
-- **Management center**: The management center streamlines governance and management of AI Foundry resources such as hubs, projects, connected resources, and deployments. +- **Management center**: The management center streamlines governance and management of Azure AI Foundry resources such as hubs, projects, connected resources, and deployments. For more information, visit [Management center](management-center.md). -- **AI Foundry hub**: The hub is the top-level resource in AI Foundry portal, and is based on the Azure Machine Learning service. The Azure resource provider for a hub is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Hub`. It provides the following features: +- **Azure AI Foundry hub**: The hub is the top-level resource in Azure AI Foundry portal, and is based on the Azure Machine Learning service. The Azure resource provider for a hub is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Hub`. It provides the following features: - Security configuration including a managed network that spans projects and model endpoints. - Compute resources for interactive development, fine-tuning, open source, and serverless model deployments. - Connections to other Azure services such as Azure OpenAI, Azure AI services, and Azure AI Search. Hub-scoped connections are shared with projects created from the hub. @@ -43,14 +43,14 @@ At the top level, AI Foundry provides access to the following resources: - An associated Azure storage account for data upload and artifact storage. For more information, visit [Hubs and projects overview](ai-resources.md). -- **AI Foundry project**: A project is a child resource of the hub. The Azure resource provider for a project is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Project`. The project provides the following features: +- **Azure AI Foundry project**: A project is a child resource of the hub. 
The Azure resource provider for a project is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Project`. The project provides the following features: - Access to development tools for building and customizing AI applications. - Reusable components including datasets, models, and indexes. - An isolated container to upload data to (within the storage inherited from the hub). - Project-scoped connections. For example, project members might need private access to data stored in an Azure Storage account without giving that same access to other projects. - Open source model deployments from catalog and fine-tuned model endpoints. - :::image type="content" source="../media/concepts/resource-provider-connected-resources.svg" alt-text="Diagram of the relationship between AI Foundry resources." ::: + :::image type="content" source="../media/concepts/resource-provider-connected-resources.svg" alt-text="Diagram of the relationship between Azure AI Foundry resources." ::: For more information, visit [Hubs and projects overview](ai-resources.md). @@ -67,7 +67,7 @@ Azure AI Foundry is built on the Azure Machine Learning resource provider, and t When you create a new hub, a set of dependent Azure resources are required to store data, get access to models, and provide compute resources for AI customization. The following table lists the dependent Azure resources and their resource providers: > [!TIP] -> If you don't provide a dependent resource when creating a hub, and it's a required dependency, AI Foundry creates the resource for you. +> If you don't provide a dependent resource when creating a hub, and it's a required dependency, Azure AI Foundry creates the resource for you. 
[!INCLUDE [Dependent Azure resources](../includes/dependent-resources.md)] @@ -94,9 +94,9 @@ Hubs provide a central way for a team to govern security, connectivity, and comp Often, projects in a business domain require access to the same company resources such as vector indices, model endpoints, or repos. As a team lead, you can preconfigure connectivity with these resources within a hub, so developers can access them from any new project workspace without delay on IT. -[Connections](connections.md) let you access objects in AI Foundry that are managed outside of your hub. For example, uploaded data on an Azure storage account, or model deployments on an existing Azure OpenAI resource. A connection can be shared with every project or made accessible to one specific project. Connections can be configured to use key-based access or Microsoft Entra ID passthrough to authorize access to users on the connected resource. As an administrator, you can track, audit, and manage connections across the organization from a single view in AI Foundry. +[Connections](connections.md) let you access objects in Azure AI Foundry that are managed outside of your hub. For example, uploaded data on an Azure storage account, or model deployments on an existing Azure OpenAI resource. A connection can be shared with every project or made accessible to one specific project. Connections can be configured to use key-based access or Microsoft Entra ID passthrough to authorize access to users on the connected resource. As an administrator, you can track, audit, and manage connections across the organization from a single view in Azure AI Foundry. -:::image type="content" source="../media/concepts/connected-resources-spog.png" alt-text="Screenshot of AI Foundry showing an audit view of all connected resources across a hub and its projects." 
::: +:::image type="content" source="../media/concepts/connected-resources-spog.png" alt-text="Screenshot of Azure AI Foundry showing an audit view of all connected resources across a hub and its projects." ::: ### Organize for your team's needs @@ -108,7 +108,7 @@ If you require isolation between dev, test, and production as part of your LLMOp Azure AI services including Azure OpenAI provide control plane endpoints for operations such as listing model deployments. These endpoints are secured using a separate Azure role-based access control (RBAC) configuration than the one used for a hub. -To reduce the complexity of Azure RBAC management, AI Foundry provides a *control plane proxy* that allows you to perform operations on connected Azure AI services and Azure OpenAI resources. Performing operations on these resources through the control plane proxy only requires Azure RBAC permissions on the hub. The Azure AI Foundry service then performs the call to the Azure AI services or Azure OpenAI control plane endpoint on your behalf. +To reduce the complexity of Azure RBAC management, Azure AI Foundry provides a *control plane proxy* that allows you to perform operations on connected Azure AI services and Azure OpenAI resources. Performing operations on these resources through the control plane proxy only requires Azure RBAC permissions on the hub. The Azure AI Foundry service then performs the call to the Azure AI services or Azure OpenAI control plane endpoint on your behalf. For more information, see [Role-based access control in Azure AI Foundry portal](rbac-ai-studio.md). @@ -178,6 +178,6 @@ For more information on price and quota, use the following articles: Create a hub using one of the following methods: -- [Azure AI Foundry portal](../how-to/create-azure-ai-resource.md#create-a-hub-in-ai-foundry-portal): Create a hub for getting started. 
+- [Azure AI Foundry portal](../how-to/create-azure-ai-resource.md#create-a-hub-in-azure-ai-foundry-portal): Create a hub for getting started. - [Azure portal](../how-to/create-secure-ai-hub.md): Create a hub with your own networking. - [Bicep template](../how-to/create-azure-ai-hub-template.md). diff --git a/articles/ai-studio/concepts/concept-model-distillation.md b/articles/ai-studio/concepts/concept-model-distillation.md index ba8174b938b..2098d757fba 100644 --- a/articles/ai-studio/concepts/concept-model-distillation.md +++ b/articles/ai-studio/concepts/concept-model-distillation.md @@ -1,5 +1,5 @@ --- -title: Distillation in AI Foundry portal (preview) +title: Distillation in Azure AI Foundry portal (preview) titleSuffix: Azure AI Foundry description: Learn how to do distillation in Azure AI Foundry portal. manager: scottpolly @@ -33,7 +33,7 @@ The main steps in knowledge distillation are: ## Sample notebook -Distillation in AI Foundry portal is currently only available through a notebook experience. You can use the [sample notebook](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/system/distillation) to see how to perform distillation. Model distillation is available for Microsoft models and a selection of OSS (open-source software) models available in the model catalog. In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model. +Distillation in Azure AI Foundry portal is currently only available through a notebook experience. You can use the [sample notebook](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/system/distillation) to see how to perform distillation. Model distillation is available for Microsoft models and a selection of OSS (open-source software) models available in the model catalog. 
In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model. diff --git a/articles/ai-studio/concepts/concept-synthetic-data.md b/articles/ai-studio/concepts/concept-synthetic-data.md index 17aa17ad501..41654c504bb 100644 --- a/articles/ai-studio/concepts/concept-synthetic-data.md +++ b/articles/ai-studio/concepts/concept-synthetic-data.md @@ -1,5 +1,5 @@ --- -title: Synthetic data generation in AI Foundry portal +title: Synthetic data generation in Azure AI Foundry portal titleSuffix: Azure AI Foundry description: Learn how to generate a synthetic dataset in Azure AI Foundry portal. manager: scottpolly diff --git a/articles/ai-studio/concepts/connections.md b/articles/ai-studio/concepts/connections.md index 24b1eed90cf..55877131814 100644 --- a/articles/ai-studio/concepts/connections.md +++ b/articles/ai-studio/concepts/connections.md @@ -17,7 +17,7 @@ author: sdgilley # Connections in Azure AI Foundry portal -Connections in Azure AI Foundry portal are a way to authenticate and consume both Microsoft and non-Microsoft resources within your AI Foundry projects. For example, connections can be used for prompt flow, training data, and deployments. [Connections can be created](../how-to/connections-add.md) exclusively for one project or shared with all projects in the same hub. +Connections in Azure AI Foundry portal are a way to authenticate and consume both Microsoft and non-Microsoft resources within your Azure AI Foundry projects. For example, connections can be used for prompt flow, training data, and deployments. [Connections can be created](../how-to/connections-add.md) exclusively for one project or shared with all projects in the same hub. 
## Connections to Azure AI services

@@ -46,7 +46,7 @@ A data connection offers these benefits:

- A common, easy-to-use API that interacts with different storage types including Microsoft OneLake, Azure Blob, and Azure Data Lake Gen2.
- Easier discovery of useful connections in team operations.
-For credential-based access (service principal/SAS/key), AI Foundry connection secures credential information. This way, you won't need to place that information in your scripts.
+For credential-based access (service principal/SAS/key), an Azure AI Foundry connection secures credential information. This way, you won't need to place that information in your scripts.

When you create a connection with an existing Azure storage account, you can choose between two different authentication methods:
diff --git a/articles/ai-studio/concepts/encryption-keys-portal.md b/articles/ai-studio/concepts/encryption-keys-portal.md
index 4335f846baf..165bc6d897e 100644
--- a/articles/ai-studio/concepts/encryption-keys-portal.md
+++ b/articles/ai-studio/concepts/encryption-keys-portal.md
@@ -39,8 +39,8 @@ The following data is stored on the managed resources.
|Service|What it's used for|Example|
|-----|-----|-----|
|Azure Cosmos DB|Stores metadata for your Azure AI projects and tools|Index names, tags; Flow creation timestamps; deployment tags; evaluation metrics|
-|Azure AI Search|Stores indices that are used to help query your AI Foundry content.|An index based off your model deployment names|
-|Azure Storage Account|Stores instructions for how customization tasks are orchestrated|JSON representation of flows you create in AI Foundry portal|
+|Azure AI Search|Stores indices that are used to help query your Azure AI Foundry content.|An index based off your model deployment names|
+|Azure Storage Account|Stores instructions for how customization tasks are orchestrated|JSON representation of flows you create in Azure AI Foundry portal|

>[!IMPORTANT]
> Azure AI Foundry uses Azure compute that is managed in the Microsoft subscription, for example when you fine-tune models or build flows. Its disks are encrypted with Microsoft-managed keys. Compute is ephemeral, meaning after a task is completed the virtual machine is deprovisioned, and the OS disk is deleted. Compute instance machines used for 'Code' experiences are persistent. Azure Disk Encryption isn't supported for the OS disk.
diff --git a/articles/ai-studio/concepts/management-center.md b/articles/ai-studio/concepts/management-center.md
index f5946573dc4..f4bc77bfb82 100644
--- a/articles/ai-studio/concepts/management-center.md
+++ b/articles/ai-studio/concepts/management-center.md
@@ -24,7 +24,7 @@ You can use the management center to create and configure hubs and projects with

:::image type="content" source="../media/management-center/manage-hub-project.png" alt-text="Screenshot of the all resources, hub, and project sections of the management studio selected."
lightbox="../media/management-center/manage-hub-project.png"::: -For more information, see the articles on creating a [hub](../how-to/create-azure-ai-resource.md#create-a-hub-in-ai-foundry-portal) and [project](../how-to/create-projects.md). +For more information, see the articles on creating a [hub](../how-to/create-azure-ai-resource.md#create-a-hub-in-azure-ai-foundry-portal) and [project](../how-to/create-projects.md). ## Manage resource utilization @@ -40,7 +40,7 @@ Assign roles, manage users, and ensure that all settings comply with organizatio :::image type="content" source="../media/management-center/user-management.png" alt-text="Screenshot of the user management section of the management center." lightbox="../media/management-center/user-management.png"::: -For more information, see [Role-based access control](rbac-ai-studio.md#assigning-roles-in-ai-foundry-portal). +For more information, see [Role-based access control](rbac-ai-studio.md#assigning-roles-in-azure-ai-foundry-portal). ## Related content diff --git a/articles/ai-studio/concepts/model-lifecycle-retirement.md b/articles/ai-studio/concepts/model-lifecycle-retirement.md index ae647efbf49..f8d7067883a 100644 --- a/articles/ai-studio/concepts/model-lifecycle-retirement.md +++ b/articles/ai-studio/concepts/model-lifecycle-retirement.md @@ -72,4 +72,4 @@ Models labeled _Retired_ are no longer available for use. 
You can't create new d ## Related content - [Model catalog and collections in Azure AI Foundry portal](../how-to/model-catalog-overview.md) -- [Data, privacy, and security for use of models through the model catalog in AI Foundry portal](../how-to/concept-data-privacy.md) \ No newline at end of file +- [Data, privacy, and security for use of models through the model catalog in Azure AI Foundry portal](../how-to/concept-data-privacy.md) \ No newline at end of file diff --git a/articles/ai-studio/concepts/rbac-ai-studio.md b/articles/ai-studio/concepts/rbac-ai-studio.md index 5ea31add997..5e42e85ec5c 100644 --- a/articles/ai-studio/concepts/rbac-ai-studio.md +++ b/articles/ai-studio/concepts/rbac-ai-studio.md @@ -22,17 +22,17 @@ In this article, you learn how to manage access (authorization) to an Azure AI F > [!WARNING] > Applying some roles might limit UI functionality in Azure AI Foundry portal for other users. For example, if a user's role does not have the ability to create a compute instance, the option to create a compute instance will not be available in studio. This behavior is expected, and prevents the user from attempting operations that would return an access denied error. -## AI Foundry hub vs project +## Azure AI Foundry hub vs project In the Azure AI Foundry portal, there are two levels of access: the hub and the project. The hub is home to the infrastructure (including virtual network setup, customer-managed keys, managed identities, and policies) and where you configure your Azure AI services. Hub access can allow you to modify the infrastructure, create new hubs, and create projects. Projects are a subset of the hub that act as workspaces that allow you to build and deploy AI systems. Within a project you can develop flows, deploy models, and manage project assets. Project access lets you develop AI end-to-end while taking advantage of the infrastructure setup on the hub. 
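The hub and project roles discussed in this article are ordinary Azure RBAC role definitions, whose permissions are action strings that may contain wildcards (for example, `Microsoft.Authorization/*/read`). A simplified sketch of how such a pattern matches a requested action — illustrative only, since real Azure evaluation also involves scopes, `NotActions`, and deny assignments:

```python
from fnmatch import fnmatchcase

def action_allowed(requested: str, actions: list[str]) -> bool:
    # An action is granted when it matches any wildcard pattern in the
    # role's Actions list. (Simplified sketch: ignores NotActions,
    # scope, and deny assignments.)
    return any(fnmatchcase(requested, pattern) for pattern in actions)

# The two permissions autoassigned alongside project access:
granted = ["Microsoft.Authorization/*/read", "Microsoft.Resources/deployments/*"]

print(action_allowed("Microsoft.Authorization/roleAssignments/read", granted))  # True
print(action_allowed("Microsoft.Resources/deployments/write", granted))         # True
print(action_allowed("Microsoft.Storage/storageAccounts/write", granted))       # False
```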
-:::image type="content" source="../media/concepts/resource-provider-connected-resources.svg" alt-text="Diagram of the relationship between AI Foundry resources."::: +:::image type="content" source="../media/concepts/resource-provider-connected-resources.svg" alt-text="Diagram of the relationship between Azure AI Foundry resources."::: One of the key benefits of the hub and project relationship is that developers can create their own projects that inherit the hub security settings. You might also have developers who are contributors to a project, and can't create new projects. ## Default roles for the hub -The AI Foundry hub has built-in roles that are available by default. +The Azure AI Foundry hub has built-in roles that are available by default. Here's a table of the built-in roles and their permissions for the hub: @@ -92,7 +92,7 @@ If the built-in Azure AI Developer role doesn't meet your needs, you can create ## Default roles for projects -Projects in AI Foundry portal have built-in roles that are available by default. +Projects in Azure AI Foundry portal have built-in roles that are available by default. Here's a table of the built-in roles and their permissions for the project: @@ -104,7 +104,7 @@ Here's a table of the built-in roles and their permissions for the project: | Azure AI Inference Deployment Operator | Perform all actions required to create a resource deployment within a resource group. | | Reader | Read only access to the project. | -When a user is granted access to a project (for example, through the AI Foundry portal permission management), two more roles are automatically assigned to the user. The first role is Reader on the hub. The second role is the Inference Deployment Operator role, which allows the user to create deployments on the resource group that the project is in. This role is composed of these two permissions: ```"Microsoft.Authorization/*/read"``` and ```"Microsoft.Resources/deployments/*"```. 
+When a user is granted access to a project (for example, through the Azure AI Foundry portal permission management), two more roles are automatically assigned to the user. The first role is Reader on the hub. The second role is the Inference Deployment Operator role, which allows the user to create deployments on the resource group that the project is in. This role is composed of these two permissions: ```"Microsoft.Authorization/*/read"``` and ```"Microsoft.Resources/deployments/*"```. In order to complete end-to-end AI development and deployment, users only need these two autoassigned roles and either the Contributor or Azure AI Developer role on a project. @@ -229,7 +229,7 @@ For example, if you're trying to consume a new Blob storage, you need to ensure ## Manage access with roles -If you're an owner of a hub, you can add and remove roles for AI Foundry. Go to the **Home** page in [Azure AI Foundry](https://ai.azure.com) and select your hub. Then select **Users** to add and remove users for the hub. You can also manage permissions from the Azure portal under **Access Control (IAM)** or through the Azure CLI. For example, use the [Azure CLI](/cli/azure/) to assign the Azure AI Developer role to "joe@contoso.com" for resource group "this-rg" with the following command: +If you're an owner of a hub, you can add and remove roles for Azure AI Foundry. Go to the **Home** page in [Azure AI Foundry](https://ai.azure.com) and select your hub. Then select **Users** to add and remove users for the hub. You can also manage permissions from the Azure portal under **Access Control (IAM)** or through the Azure CLI. 
For example, use the [Azure CLI](/cli/azure/) to assign the Azure AI Developer role to "joe@contoso.com" for resource group "this-rg" with the following command: ```azurecli-interactive az role assignment create --role "Azure AI Developer" --assignee "joe@contoso.com" --resource-group this-rg @@ -237,18 +237,18 @@ az role assignment create --role "Azure AI Developer" --assignee "joe@contoso.co ## Create custom roles -If the built-in roles are insufficient, you can create custom roles. Custom roles might have the read, write, delete, and compute resource permissions in that AI Foundry. You can make the role available at a specific project level, a specific resource group level, or a specific subscription level. +If the built-in roles are insufficient, you can create custom roles. Custom roles might have the read, write, delete, and compute resource permissions in that Azure AI Foundry. You can make the role available at a specific project level, a specific resource group level, or a specific subscription level. > [!NOTE] > You must be an owner of the resource at that level to create custom roles within that resource. -The following JSON example defines a custom AI Foundry developer role at the subscription level: +The following JSON example defines a custom Azure AI Foundry developer role at the subscription level: ```json { "properties": { - "roleName": "AI Foundry Developer", - "description": "Custom role for AI Foundry. At subscription level", + "roleName": "Azure AI Foundry Developer", + "description": "Custom role for Azure AI Foundry. At subscription level", "assignableScopes": [ "/subscriptions/" ], @@ -299,7 +299,7 @@ For steps on creating a custom role, use one of the following articles: For more information on creating custom roles in general, visit the [Azure custom roles](/azure/role-based-access-control/custom-roles) article. 
-## Assigning roles in AI Foundry portal +## Assigning roles in Azure AI Foundry portal You can add users and assign roles directly from Azure AI Foundry at either the hub or project level. In the [management center](management-center.md), select **Users** in either the hub or project section, then select **New user** to add a user. @@ -316,7 +316,7 @@ You are then prompted to enter the user information and select a built-in role. When configuring a hub to use a customer-managed key (CMK), an Azure Key Vault is used to store the key. The user or service principal used to create the workspace must have owner or contributor access to the key vault. -If your AI Foundry hub is configured with a **user-assigned managed identity**, the identity must be granted the following roles. These roles allow the managed identity to create the Azure Storage, Azure Cosmos DB, and Azure Search resources used when using a customer-managed key: +If your Azure AI Foundry hub is configured with a **user-assigned managed identity**, the identity must be granted the following roles. These roles allow the managed identity to create the Azure Storage, Azure Cosmos DB, and Azure Search resources used when using a customer-managed key: - `Microsoft.Storage/storageAccounts/write` - `Microsoft.Search/searchServices/write` @@ -368,8 +368,8 @@ An Azure Container Registry instance is an optional dependency for Azure AI Foun | Authentication method | Public network access
disabled | Azure Container Registry
Public network access enabled | | ---- | :----: | :----: | | Admin user | ✓ | ✓ | -| AI Foundry hub system-assigned managed identity | ✓ | ✓ | -| AI Foundry hub user-assigned managed identity
with the **ACRPull** role assigned to the identity | | ✓ | +| Azure AI Foundry hub system-assigned managed identity | ✓ | ✓ | +| Azure AI Foundry hub user-assigned managed identity
with the **ACRPull** role assigned to the identity | | ✓ | A system-assigned managed identity is automatically assigned to the correct roles when the hub is created. If you're using a user-assigned managed identity, you must assign the **ACRPull** role to the identity. diff --git a/articles/ai-studio/concepts/trace.md b/articles/ai-studio/concepts/trace.md index 2377174fc14..13c1dd392b5 100644 --- a/articles/ai-studio/concepts/trace.md +++ b/articles/ai-studio/concepts/trace.md @@ -65,9 +65,9 @@ Trace visualization refers to the graphical representation of trace data. Azure ## Enable tracing -In order to enable tracing, you need to add an Application Insights resource to your Azure AI Foundry project. To add an Application Insights resource, navigate to the **Tracing** tab in the [AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one. +In order to enable tracing, you need to add an Application Insights resource to your Azure AI Foundry project. To add an Application Insights resource, navigate to the **Tracing** tab in the [Azure AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one. -:::image type="content" source="../../ai-services/agents/media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the AI Foundry portal." lightbox="../../ai-services/agents/media/ai-foundry-tracing.png"::: +:::image type="content" source="../../ai-services/agents/media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the Azure AI Foundry portal." 
lightbox="../../ai-services/agents/media/ai-foundry-tracing.png"::: ## Conclusion diff --git a/articles/ai-studio/concepts/vulnerability-management.md b/articles/ai-studio/concepts/vulnerability-management.md index c3a7b21ff1b..c4f27d15994 100644 --- a/articles/ai-studio/concepts/vulnerability-management.md +++ b/articles/ai-studio/concepts/vulnerability-management.md @@ -58,7 +58,7 @@ Although Microsoft patches base images with each release, whether you use the la By default, dependencies are layered on top of base images when you're building an image. After you install more dependencies on top of the Microsoft-provided images, vulnerability management becomes your responsibility. -Associated with your AI Foundry hub is an Azure Container Registry instance that functions as a cache for container images. Any image that materializes is pushed to the container registry. The workspace uses it when deployment is triggered for the corresponding environment. +Associated with your Azure AI Foundry hub is an Azure Container Registry instance that functions as a cache for container images. Any image that materializes is pushed to the container registry. The workspace uses it when deployment is triggered for the corresponding environment. The hub doesn't delete any image from your container registry. You're responsible for evaluating the need for an image over time. To monitor and maintain environment hygiene, you can use [Microsoft Defender for Container Registry](/azure/defender-for-cloud/defender-for-container-registries-usage) to help scan your images for vulnerabilities. To automate your processes based on triggers from Microsoft Defender, see [Automate remediation responses](/azure/defender-for-cloud/workflow-automation). 
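Because the hub never deletes cached images, periodically reviewing the registry is your responsibility, whether manual or scripted. A hypothetical retention sweep — the policy, image names, and dates here are made up for illustration:

```python
from datetime import date, timedelta

def stale_images(last_pulled: dict[str, date], today: date,
                 max_age_days: int = 90) -> list[str]:
    # Flag images that haven't been pulled recently as candidates for
    # review. (Hypothetical policy, not an Azure feature.)
    cutoff = today - timedelta(days=max_age_days)
    return sorted(tag for tag, pulled in last_pulled.items() if pulled < cutoff)

images = {
    "env-prompt-flow:20": date(2024, 11, 2),
    "env-prompt-flow:19": date(2024, 5, 14),
    "env-finetune:3": date(2024, 1, 30),
}
print(stale_images(images, today=date(2024, 12, 1)))
# ['env-finetune:3', 'env-prompt-flow:19']
```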
diff --git a/articles/ai-studio/how-to/access-on-premises-resources.md b/articles/ai-studio/how-to/access-on-premises-resources.md index 281f64bf21d..197bef79b2e 100644 --- a/articles/ai-studio/how-to/access-on-premises-resources.md +++ b/articles/ai-studio/how-to/access-on-premises-resources.md @@ -18,7 +18,7 @@ To access your non-Azure resources located in a different virtual network or loc Azure Application Gateway is a load balancer that makes routing decisions based on the URL of an HTTPS request. Azure Machine Learning supports using an application gateway to securely communicate with non-Azure resources. For more on Application Gateway, see [What is Azure Application Gateway](/azure/application-gateway/overview). -To access on-premises or custom virtual network resources from the managed virtual network, you configure an Application Gateway on your Azure virtual network. The application gateway is used for inbound access to the AI Foundry portal's hub. Once configured, you then create a private endpoint from the Azure AI Foundry hub's managed virtual network to the Application Gateway. With the private endpoint, the full end to end path is secured and not routed through the Internet. +To access on-premises or custom virtual network resources from the managed virtual network, you configure an Application Gateway on your Azure virtual network. The application gateway is used for inbound access to the Azure AI Foundry portal's hub. Once configured, you then create a private endpoint from the Azure AI Foundry hub's managed virtual network to the Application Gateway. With the private endpoint, the full end to end path is secured and not routed through the Internet. :::image type="content" source="../media/how-to/network/ai-studio-app-gateway.png" alt-text="Diagram of a managed network using Application Gateway to communicate with on-premises resources." 
lightbox="../media/how-to/network/ai-studio-app-gateway.png"::: diff --git a/articles/ai-studio/how-to/benchmark-model-in-catalog.md b/articles/ai-studio/how-to/benchmark-model-in-catalog.md index a9dd4057689..fd5b3289384 100644 --- a/articles/ai-studio/how-to/benchmark-model-in-catalog.md +++ b/articles/ai-studio/how-to/benchmark-model-in-catalog.md @@ -28,7 +28,7 @@ In this article, you learn to compare benchmarks across models and datasets, usi ## Access model benchmarks through the model catalog -Azure AI supports model benchmarking for select models that are popular and most frequently used. Follow these steps to use detailed benchmarking results to compare and select models directly from the AI Foundry model catalog: +Azure AI supports model benchmarking for select models that are popular and most frequently used. Follow these steps to use detailed benchmarking results to compare and select models directly from the Azure AI Foundry model catalog: [!INCLUDE [open-catalog](../includes/open-catalog.md)] @@ -61,7 +61,7 @@ When you're in the "Benchmarks" tab for a specific model, you can gather extensi :::image type="content" source="../media/how-to/model-benchmarks/gpt4o-benchmark-tab-expand.png" alt-text="Screenshot showing benchmarks tab for gpt-4o." lightbox="../media/how-to/model-benchmarks/gpt4o-benchmark-tab-expand.png"::: -By default, AI Foundry displays an average index across various metrics and datasets to provide a high-level overview of model performance. +By default, Azure AI Foundry displays an average index across various metrics and datasets to provide a high-level overview of model performance. 
To access benchmark results for a specific metric and dataset: diff --git a/articles/ai-studio/how-to/concept-data-privacy.md b/articles/ai-studio/how-to/concept-data-privacy.md index 39744e11d5c..17746cb4ce8 100644 --- a/articles/ai-studio/how-to/concept-data-privacy.md +++ b/articles/ai-studio/how-to/concept-data-privacy.md @@ -1,5 +1,5 @@ --- -title: Data, privacy, and security for use of models through the model catalog in AI Foundry portal +title: Data, privacy, and security for use of models through the model catalog in Azure AI Foundry portal titleSuffix: Azure AI Foundry description: Get details about how data that customers provide is processed, used, and stored when a user deploys a model from the model catalog. manager: scottpolly @@ -12,7 +12,7 @@ ms.author: scottpolly author: s-polly #Customer intent: As a data scientist, I want to learn about data privacy and security for use of models in the model catalog. --- -# Data, privacy, and security for use of models through the model catalog in AI Foundry portal +# Data, privacy, and security for use of models through the model catalog in Azure AI Foundry portal [!INCLUDE [feature-preview](../includes/feature-preview.md)] diff --git a/articles/ai-studio/how-to/configure-managed-network.md b/articles/ai-studio/how-to/configure-managed-network.md index 395368b00ef..876a5ab165e 100644 --- a/articles/ai-studio/how-to/configure-managed-network.md +++ b/articles/ai-studio/how-to/configure-managed-network.md @@ -821,7 +821,7 @@ pytorch.org Private endpoints are currently supported for the following Azure services: -* AI Foundry hub +* Azure AI Foundry hub * Azure AI Search * Azure AI services * Azure API Management @@ -902,4 +902,4 @@ The hub managed virtual network feature is free. 
However, you're charged for the
## Related content

-- [Create AI Foundry hub and project using the SDK](./develop/create-hub-project-sdk.md)
+- [Create Azure AI Foundry hub and project using the SDK](./develop/create-hub-project-sdk.md)
diff --git a/articles/ai-studio/how-to/configure-private-link.md b/articles/ai-studio/how-to/configure-private-link.md
index e7f4c2bc2f8..710a4488931 100644
--- a/articles/ai-studio/how-to/configure-private-link.md
+++ b/articles/ai-studio/how-to/configure-private-link.md
@@ -17,7 +17,7 @@ author: Blackmist

We have two network isolation aspects. One is the network isolation to access an Azure AI Foundry hub. Another is the network isolation of computing resources in your hub and projects such as compute instances, serverless, and managed online endpoints. This article explains the former, highlighted in the diagram. You can use private link to establish the private connection to your hub and its default resources. This article is for Azure AI Foundry (hub and projects). For information on Azure AI services, see the [Azure AI services documentation](/azure/ai-services/cognitive-services-virtual-networks).

-:::image type="content" source="../media/how-to/network/azure-ai-network-inbound.svg" alt-text="Diagram of AI Foundry hub network isolation." lightbox="../media/how-to/network/azure-ai-network-inbound.png":::
+:::image type="content" source="../media/how-to/network/azure-ai-network-inbound.svg" alt-text="Diagram of Azure AI Foundry hub network isolation." lightbox="../media/how-to/network/azure-ai-network-inbound.png":::

You get several hub default resources in your resource group. You need to configure the following network isolation configurations.
diff --git a/articles/ai-studio/how-to/connections-add.md b/articles/ai-studio/how-to/connections-add.md index 6fc5153ad11..5035dd5b7d2 100644 --- a/articles/ai-studio/how-to/connections-add.md +++ b/articles/ai-studio/how-to/connections-add.md @@ -80,7 +80,7 @@ To create an outbound private endpoint rule to the data source, use the followin 1. Select __Networking__, then __Workspace managed outbound access__. 1. To add an outbound rule, select __Add user-defined outbound rules__. From the __Workspace outbound rules__ sidebar, provide the following information: - - __Rule name__: A name for the rule. The name must be unique for the AI Foundry hub. + - __Rule name__: A name for the rule. The name must be unique for the Azure AI Foundry hub. - __Destination type__: Private Endpoint. - __Subscription__: The subscription that contains the Azure resource you want to connect to. - __Resource type__: `Microsoft.Storage/storageAccounts`. This resource provider is used for Azure Storage, Azure Data Lake Storage Gen2, and Microsoft OneLake. diff --git a/articles/ai-studio/how-to/costs-plan-manage.md b/articles/ai-studio/how-to/costs-plan-manage.md index 6f2dd14175b..b1cc3b7980c 100644 --- a/articles/ai-studio/how-to/costs-plan-manage.md +++ b/articles/ai-studio/how-to/costs-plan-manage.md @@ -100,7 +100,7 @@ When you use cost analysis, you view hub costs in graphs and tables for differen You can get to cost analysis from the [Azure portal](https://portal.azure.com). You can also get to cost analysis from the [Azure AI Foundry](https://ai.azure.com). > [!IMPORTANT] -> Your AI Foundry project costs are only a subset of your overall application or solution costs. You need to monitor costs for all Azure resources used in your application or solution. For more information, see [Azure AI Foundry hubs](../concepts/ai-resources.md). +> Your Azure AI Foundry project costs are only a subset of your overall application or solution costs. 
You need to monitor costs for all Azure resources used in your application or solution. For more information, see [Azure AI Foundry hubs](../concepts/ai-resources.md). For the examples in this section, assume that all Azure AI Foundry resources are in the same resource group. But you can have resources in different resource groups. For example, your Azure AI Search resource might be in a different resource group than your project. diff --git a/articles/ai-studio/how-to/create-azure-ai-resource.md b/articles/ai-studio/how-to/create-azure-ai-resource.md index 4b33b9c35cf..a94874967b4 100644 --- a/articles/ai-studio/how-to/create-azure-ai-resource.md +++ b/articles/ai-studio/how-to/create-azure-ai-resource.md @@ -1,7 +1,7 @@ --- title: How to create and manage an Azure AI Foundry hub titleSuffix: Azure AI Foundry -description: Learn how to create and manage an Azure AI Foundry hub from the Azure portal or from the AI Foundry portal. Your developers can then create projects from the hub. +description: Learn how to create and manage an Azure AI Foundry hub from the Azure portal or from the Azure AI Foundry portal. Your developers can then create projects from the hub. manager: scottpolly ms.service: azure-ai-studio ms.custom: @@ -18,24 +18,24 @@ author: Blackmist # How to create and manage an Azure AI Foundry hub -In AI Foundry portal, hubs provide the environment for a team to collaborate and organize work, and help you as a team lead or IT admin centrally set up security settings and govern usage and spend. You can create and manage a hub from the Azure portal or from the AI Foundry portal, and then your developers can create projects from the hub. +In Azure AI Foundry portal, hubs provide the environment for a team to collaborate and organize work, and help you as a team lead or IT admin centrally set up security settings and govern usage and spend. 
You can create and manage a hub from the Azure portal or from the Azure AI Foundry portal, and then your developers can create projects from the hub. -In this article, you learn how to create and manage a hub in AI Foundry portal with the default settings so you can get started quickly. Do you need to customize security or the dependent resources of your hub? Then use [Azure portal](create-secure-ai-hub.md) or [template options](create-azure-ai-hub-template.md). +In this article, you learn how to create and manage a hub in Azure AI Foundry portal with the default settings so you can get started quickly. Do you need to customize security or the dependent resources of your hub? Then use [Azure portal](create-secure-ai-hub.md) or [template options](create-azure-ai-hub-template.md). > [!TIP] -> If you're an individual developer and not an admin, dev lead, or part of a larger effort that requires a hub, you can create a project directly from the AI Foundry portal without creating a hub first. For more information, see [Create a project](create-projects.md). +> If you're an individual developer and not an admin, dev lead, or part of a larger effort that requires a hub, you can create a project directly from the Azure AI Foundry portal without creating a hub first. For more information, see [Create a project](create-projects.md). > > If you're an admin or dev lead and would like to create your Azure AI Foundry hub using a template, see the articles on using [Bicep](create-azure-ai-hub-template.md) or [Terraform](create-hub-terraform.md). -## Create a hub in AI Foundry portal +## Create a hub in Azure AI Foundry portal -To create a new hub, you need either the Owner or Contributor role on the resource group or on an existing hub. If you're unable to create a hub due to permissions, reach out to your administrator. If your organization is using [Azure Policy](/azure/governance/policy/overview), don't create the resource in AI Foundry portal. 
Create the hub [in the Azure portal](#create-a-secure-hub-in-the-azure-portal) instead. +To create a new hub, you need either the Owner or Contributor role on the resource group or on an existing hub. If you're unable to create a hub due to permissions, reach out to your administrator. If your organization is using [Azure Policy](/azure/governance/policy/overview), don't create the resource in Azure AI Foundry portal. Create the hub [in the Azure portal](#create-a-secure-hub-in-the-azure-portal) instead. [!INCLUDE [Create Azure AI Foundry hub](../includes/create-hub.md)] ## Create a secure hub in the Azure portal -If your organization is using [Azure Policy](/azure/governance/policy/overview), set up a hub that meets your organization's requirements instead of using AI Foundry for resource creation. +If your organization is using [Azure Policy](/azure/governance/policy/overview), set up a hub that meets your organization's requirements instead of using Azure AI Foundry for resource creation. 1. From the Azure portal, search for `Azure AI Foundry` and create a new hub by selecting **+ New Azure AI hub** 1. Enter your hub name, subscription, resource group, and location details. @@ -152,7 +152,7 @@ az ml workspace update -n "myexamplehub" -g "{MY_RESOURCE_GROUP}" -a "APPLICATIO ### Choose how credentials are stored -Select scenarios in AI Foundry portal store credentials on your behalf. For example when you create a connection in AI Foundry portal to access an Azure Storage account with stored account key, access Azure Container Registry with admin password, or when you create a compute instance with enabled SSH keys. No credentials are stored with connections when you choose Microsoft Entra ID identity-based authentication. +Select scenarios in Azure AI Foundry portal store credentials on your behalf. 
For example, when you create a connection in Azure AI Foundry portal to access an Azure Storage account with a stored account key, access Azure Container Registry with an admin password, or when you create a compute instance with SSH keys enabled. No credentials are stored with connections when you choose Microsoft Entra ID identity-based authentication. You can choose where credentials are stored: diff --git a/articles/ai-studio/how-to/create-hub-terraform.md b/articles/ai-studio/how-to/create-hub-terraform.md index bc815dc2865..5241a9bb4ba 100644 --- a/articles/ai-studio/how-to/create-hub-terraform.md +++ b/articles/ai-studio/how-to/create-hub-terraform.md @@ -27,8 +27,8 @@ In this article, you use Terraform to create an Azure AI Foundry hub, a project, > * Set up a storage account > * Establish a key vault > * Configure AI services -> * Build an AI Foundry hub -> * Develop an AI Foundry project +> * Build an Azure AI Foundry hub +> * Develop an Azure AI Foundry project > * Establish an AI services connection ## Prerequisites diff --git a/articles/ai-studio/how-to/create-projects.md b/articles/ai-studio/how-to/create-projects.md index 8e7a7ad634a..b426b659488 100644 --- a/articles/ai-studio/how-to/create-projects.md +++ b/articles/ai-studio/how-to/create-projects.md @@ -34,7 +34,7 @@ For more information about the projects and hubs model, see [Azure AI Foundry hu Use the following tabs to select the method you plan to use to create a project: -# [AI Foundry portal](#tab/ai-studio) +# [Azure AI Foundry portal](#tab/ai-studio) [!INCLUDE [Create Azure AI Foundry project](../includes/create-projects.md)] @@ -85,11 +85,11 @@ The code in this section assumes you have an existing hub. If you don't have a ## View project settings -# [AI Foundry portal](#tab/ai-studio) +# [Azure AI Foundry portal](#tab/ai-studio) On the project **Overview** page you can find information about the project.
-:::image type="content" source="../media/how-to/projects/project-settings.png" alt-text="Screenshot of an AI Foundry project settings page." lightbox = "../media/how-to/projects/project-settings.png"::: +:::image type="content" source="../media/how-to/projects/project-settings.png" alt-text="Screenshot of an Azure AI Foundry project settings page." lightbox = "../media/how-to/projects/project-settings.png"::: - Name: The name of the project appears in the top left corner. You can rename the project using the edit tool. - Subscription: The subscription that hosts the hub that hosts the project. @@ -133,7 +133,7 @@ In addition, a number of resources are only accessible by users in your project | workspacefilestore | {project-GUID}-code | Hosts files created on your compute and using prompt flow | > [!NOTE] -> Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when a first user accesses AI Foundry over a private network connection. [Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-configurations-on-connecting-to-storage) +> Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when the first user accesses Azure AI Foundry over a private network connection.
[Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-configurations-on-connecting-to-storage) ## Related content diff --git a/articles/ai-studio/how-to/create-secure-ai-hub.md b/articles/ai-studio/how-to/create-secure-ai-hub.md index e5da89cfb4b..ed60c8b423b 100644 --- a/articles/ai-studio/how-to/create-secure-ai-hub.md +++ b/articles/ai-studio/how-to/create-secure-ai-hub.md @@ -38,7 +38,7 @@ You can secure your Azure AI Foundry hub, projects, and managed resources in a m :::image type="content" source="../media/how-to/network/ai-hub-resources.png" alt-text="Screenshot of the Create a hub with the option to set resource information." lightbox="../media/how-to/network/ai-hub-resources.png"::: -1. Select **Next: Networking** to configure the managed virtual network that AI Foundry uses to secure its hub and projects. +1. Select **Next: Networking** to configure the managed virtual network that Azure AI Foundry uses to secure its hub and projects. 1. Select **Private with Internet Outbound**, which allows compute resources to access the public internet for resources such as Python packages. diff --git a/articles/ai-studio/how-to/data-add.md b/articles/ai-studio/how-to/data-add.md index 3c478f3258a..e4becb99df3 100644 --- a/articles/ai-studio/how-to/data-add.md +++ b/articles/ai-studio/how-to/data-add.md @@ -34,11 +34,11 @@ Data can help when you need these capabilities: To create and work with data, you need: - An Azure subscription. If you don't have one, create a [free account](https://azure.microsoft.com/free/). -- An [AI Foundry project](../how-to/create-projects.md). +- An [Azure AI Foundry project](../how-to/create-projects.md). ## Create data -When you create your data, you need to set the data type. AI Foundry supports these data types: +When you create your data, you need to set the data type. 
Azure AI Foundry supports these data types: |Type |**Canonical Scenarios**| |---------|---------| @@ -119,9 +119,9 @@ A Folder (`uri_folder`) data source type points to a *folder* on a storage resou ### Delete data > [!IMPORTANT] -> Data deletion is not supported. Data is immutable in AI Foundry portal. Once you create a data version, it can't be modified or deleted. This immutability provides a level of protection when working in a team that creates production workloads. +> Data deletion is not supported. Data is immutable in Azure AI Foundry portal. Once you create a data version, it can't be modified or deleted. This immutability provides a level of protection when working in a team that creates production workloads. -If AI Foundry allowed data deletion, it would have the following adverse effects: +If Azure AI Foundry allowed data deletion, it would have the following adverse effects: - Production jobs that consume data that is later deleted would fail. - Machine learning experiment reproduction would become more difficult. - Job lineage would break, because it would become impossible to view the deleted data version. @@ -182,7 +182,7 @@ You can add tags to existing data. You can browse the folder structure and preview the file in the Data details page. We support data preview for the following types: - Data file types that are supported via preview API: ".tsv", ".csv", ".parquet", ".jsonl". -- Other file types, AI Foundry portal attempts to preview the file in the browser natively. The supported file types might depend on the browser itself. +- For other file types, Azure AI Foundry portal attempts to preview the file in the browser natively. The supported file types might depend on the browser itself. Normally for images, these file image types are supported: ".png", ".jpg", ".gif". Normally, these file types are supported: ".ipynb", ".py", ".yml", ".html".
## Next steps diff --git a/articles/ai-studio/how-to/data-image-add.md b/articles/ai-studio/how-to/data-image-add.md index cb3b46a7388..21bbe7808a5 100644 --- a/articles/ai-studio/how-to/data-image-add.md +++ b/articles/ai-studio/how-to/data-image-add.md @@ -28,7 +28,7 @@ Use this article to learn how to provide your own image data for GPT-4 Turbo wit - An Azure OpenAI resource with the GPT-4 Turbo with Vision model deployed. For more information about model deployment, see the [resource deployment guide](../../ai-services/openai/how-to/create-resource.md). - Be sure that you're assigned at least the [Cognitive Services Contributor role](../../ai-services/openai/how-to/role-based-access-control.md#cognitive-services-contributor) for the Azure OpenAI resource. - An Azure AI Search resource. See [create an Azure AI Search service in the portal](/azure/search/search-create-service-portal). If you don't have an Azure AI Search resource, you're prompted to create one when you add your data source later in this guide. -- An [AI Foundry hub](../how-to/create-azure-ai-resource.md) with your Azure OpenAI resource and Azure AI Search resource added as connections. +- An [Azure AI Foundry hub](../how-to/create-azure-ai-resource.md) with your Azure OpenAI resource and Azure AI Search resource added as connections. ## Deploy a GPT-4 Turbo with Vision model diff --git a/articles/ai-studio/how-to/deploy-models-cohere-rerank.md b/articles/ai-studio/how-to/deploy-models-cohere-rerank.md index 885212222c2..d09c8081c74 100644 --- a/articles/ai-studio/how-to/deploy-models-cohere-rerank.md +++ b/articles/ai-studio/how-to/deploy-models-cohere-rerank.md @@ -83,7 +83,7 @@ To create a deployment: 4. Select the model card of the model you want to deploy. In this article, you select **Cohere-rerank-v3-english** to open the Model Details page. 1. Select **Deploy** to open a serverless API deployment window for the model. -1. 
Alternatively, you can initiate a deployment from your project in the AI Foundry portal as follows: +1. Alternatively, you can initiate a deployment from your project in the Azure AI Foundry portal as follows: 1. From the left sidebar of your project, select **Models + Endpoints**. 1. Select **+ Deploy model** > **Deploy base model**. diff --git a/articles/ai-studio/how-to/deploy-models-jamba.md b/articles/ai-studio/how-to/deploy-models-jamba.md index 56fc0666700..79fd0a6c6e3 100644 --- a/articles/ai-studio/how-to/deploy-models-jamba.md +++ b/articles/ai-studio/how-to/deploy-models-jamba.md @@ -61,7 +61,7 @@ To get started with Jamba 1.5 mini deployed as a serverless API, explore our int - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure subscription. Alternatively, your account can be assigned a custom role that has the following permissions: - - On the Azure subscription—to subscribe the AI Foundry project to the Azure Marketplace offering, once for each project, per offering: + - On the Azure subscription—to subscribe the Azure AI Foundry project to the Azure Marketplace offering, once for each project, per offering: - `Microsoft.MarketplaceOrdering/agreements/offers/plans/read` - `Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action` - `Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read` @@ -72,7 +72,7 @@ To get started with Jamba 1.5 mini deployed as a serverless API, explore our int - `Microsoft.SaaS/resources/read` - `Microsoft.SaaS/resources/write` - - On the AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): + - On the Azure AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): - 
`Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*` - `Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*` @@ -89,7 +89,7 @@ These steps demonstrate the deployment of `AI21 Jamba 1.5 Large` or `AI21 Jamba 1. Select **Deploy** to open a serverless API deployment window for the model. -1. Alternatively, you can initiate a deployment by starting from the **Models + endpoints** page in AI Foundry portal. +1. Alternatively, you can initiate a deployment by starting from the **Models + endpoints** page in Azure AI Foundry portal. 1. From the left navigation pane of your project, select **My assets** > **Models + endpoints**. 1. Select **+ Deploy model** > **Deploy base model**. diff --git a/articles/ai-studio/how-to/deploy-models-managed.md b/articles/ai-studio/how-to/deploy-models-managed.md index 6e8141f8c48..d8cecb4abf5 100644 --- a/articles/ai-studio/how-to/deploy-models-managed.md +++ b/articles/ai-studio/how-to/deploy-models-managed.md @@ -1,6 +1,6 @@ --- title: How to deploy and inference a managed compute deployment with code -titleSuffix: AI Foundry +titleSuffix: Azure AI Foundry description: Learn how to deploy and inference a managed compute deployment with code. manager: scottpolly ms.service: azure-ai-studio @@ -16,7 +16,7 @@ author: msakande # How to deploy and inference a managed compute deployment with code -the AI Foundry portal [model catalog](../how-to/model-catalog-overview.md) offers over 1,600 models, and the most common way to deploy these models is to use the managed compute deployment option, which is also sometimes referred to as a managed online deployment. +The Azure AI Foundry portal [model catalog](../how-to/model-catalog-overview.md) offers over 1,600 models, and the most common way to deploy these models is to use the managed compute deployment option, which is also sometimes referred to as a managed online deployment.
Deployment of a large language model (LLM) makes it available for use in a website, an application, or other production environment. Deployment typically involves hosting the model on a server or in the cloud and creating an API or other interface for users to interact with the model. You can invoke the deployment for real-time inference of generative AI applications such as chat and copilot. @@ -48,7 +48,7 @@ pip install azure-ai-ml pip install azure-identity ``` -Use this code to authenticate with Azure Machine Learning and create a client object. Replace the placeholders with your subscription ID, resource group name, and AI Foundry project name. +Use this code to authenticate with Azure Machine Learning and create a client object. Replace the placeholders with your subscription ID, resource group name, and Azure AI Foundry project name. ```python from azure.ai.ml import MLClient @@ -153,11 +153,11 @@ print(json.dumps(response_json, indent=2)) ## Delete the deployment endpoint -To delete deployments in AI Foundry portal, select the **Delete** button on the top panel of the deployment details page. +To delete deployments in Azure AI Foundry portal, select the **Delete** button on the top panel of the deployment details page. ## Quota considerations -To deploy and perform inferencing with real-time endpoints, you consume Virtual Machine (VM) core quota that is assigned to your subscription on a per-region basis. When you sign up for AI Foundry, you receive a default VM quota for several VM families available in the region. You can continue to create deployments until you reach your quota limit. Once that happens, you can request for a quota increase. +To deploy and perform inferencing with real-time endpoints, you consume Virtual Machine (VM) core quota that is assigned to your subscription on a per-region basis. When you sign up for Azure AI Foundry, you receive a default VM quota for several VM families available in the region. 
You can continue to create deployments until you reach your quota limit. Once that happens, you can request a quota increase. ## Next steps diff --git a/articles/ai-studio/how-to/deploy-models-openai.md b/articles/ai-studio/how-to/deploy-models-openai.md index 1a870df345c..432cd51267a 100644 --- a/articles/ai-studio/how-to/deploy-models-openai.md +++ b/articles/ai-studio/how-to/deploy-models-openai.md @@ -34,7 +34,7 @@ To modify and interact with an Azure OpenAI model in the [Azure AI Foundry](http ## Deploy an Azure OpenAI model from the model catalog -Follow the steps below to deploy an Azure OpenAI model such as `gpt-4o-mini` to a real-time endpoint from the AI Foundry portal [model catalog](./model-catalog-overview.md): +Follow the steps below to deploy an Azure OpenAI model such as `gpt-4o-mini` to a real-time endpoint from the Azure AI Foundry portal [model catalog](./model-catalog-overview.md): [!INCLUDE [open-catalog](../includes/open-catalog.md)] @@ -52,9 +52,9 @@ Follow the steps below to deploy an Azure OpenAI model such as `gpt-4o-mini` to ## Deploy an Azure OpenAI model from your project -Alternatively, you can initiate deployment by starting from your project in AI Foundry portal. +Alternatively, you can initiate deployment by starting from your project in Azure AI Foundry portal. -1. Go to your project in AI Foundry portal. +1. Go to your project in Azure AI Foundry portal. 1. From the left sidebar of your project, go to **My assets** > **Models + endpoints**. 1. Select **+ Deploy model** > **Deploy base model**. 1. In the **Collections** filter, select **Azure OpenAI**.
diff --git a/articles/ai-studio/how-to/deploy-models-serverless-connect.md b/articles/ai-studio/how-to/deploy-models-serverless-connect.md index 749ba008679..69d32a80339 100644 --- a/articles/ai-studio/how-to/deploy-models-serverless-connect.md +++ b/articles/ai-studio/how-to/deploy-models-serverless-connect.md @@ -41,7 +41,7 @@ The need to consume a serverless API endpoint in a different project or hub than - You need to install the following software to work with Azure AI Foundry: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) You can use any compatible web browser to navigate [Azure AI Foundry](https://ai.azure.com). @@ -88,7 +88,7 @@ Follow these steps to create a connection: 1. Connect to the project or hub where the endpoint is deployed: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) Go to [Azure AI Foundry](https://ai.azure.com) and navigate to the project where the endpoint you want to connect to is deployed. @@ -116,9 +116,9 @@ Follow these steps to create a connection: 1. Get the endpoint's URL and credentials for the endpoint you want to connect to. In this example, you get the details for an endpoint name **meta-llama3-8b-qwerty**. - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) - 1. From the left sidebar of your project in AI Foundry portal, go to **My assets** > **Models + endpoints** to see the list of deployments in the project. + 1. From the left sidebar of your project in Azure AI Foundry portal, go to **My assets** > **Models + endpoints** to see the list of deployments in the project. 1. Select the deployment you want to connect to. @@ -141,7 +141,7 @@ Follow these steps to create a connection: 1. 
Now, connect to the project or hub **where you want to create the connection**: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) Go to the project where the connection needs to be created to. @@ -169,9 +169,9 @@ Follow these steps to create a connection: 1. Create the connection in the project: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) - 1. From your project in AI Foundry portal, go to the bottom part of the left sidebar and select **Management center**. + 1. From your project in Azure AI Foundry portal, go to the bottom part of the left sidebar and select **Management center**. 1. From the left sidebar of the management center, select **Connected resources**. @@ -218,7 +218,7 @@ Follow these steps to create a connection: 1. To validate that the connection is working: - 1. Return to your project in AI Foundry portal. + 1. Return to your project in Azure AI Foundry portal. 1. From the left sidebar of your project, go to **Build and customize** > **Prompt flow**. diff --git a/articles/ai-studio/how-to/deploy-models-serverless.md b/articles/ai-studio/how-to/deploy-models-serverless.md index 85fe0c2bb50..7b79a4148c5 100644 --- a/articles/ai-studio/how-to/deploy-models-serverless.md +++ b/articles/ai-studio/how-to/deploy-models-serverless.md @@ -35,7 +35,7 @@ This article uses a Meta Llama model deployment for illustration. However, you c - You need to install the following software to work with Azure AI Foundry: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) You can use any compatible web browser to navigate [Azure AI Foundry](https://ai.azure.com). @@ -132,7 +132,7 @@ Serverless API endpoints can deploy both Microsoft and non-Microsoft offered mod 1. Create the model's marketplace subscription. When you create a subscription, you accept the terms and conditions associated with the model offer. 
- # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) 1. On the model's **Details** page, select **Deploy**. A **Deployment options** window opens up, giving you the choice between serverless API deployment and deployment using a managed compute. @@ -259,7 +259,7 @@ Serverless API endpoints can deploy both Microsoft and non-Microsoft offered mod 1. At any point, you can see the model offers to which your project is currently subscribed: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) 1. Go to the [Azure portal](https://portal.azure.com). @@ -314,7 +314,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**. 1. Create the serverless endpoint - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) 1. To deploy a Microsoft model that doesn't require subscribing to a model offering: 1. Select **Deploy** and then select **Serverless API with Azure AI Content Safety (preview)** to open the deployment wizard. @@ -466,7 +466,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**. 1. At any point, you can see the endpoints deployed to your project: - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) 1. Go to your project. @@ -515,7 +515,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**. 1. The created endpoint uses key authentication for authorization. Use the following steps to get the keys associated with a given endpoint. - # [AI Foundry portal](#tab/azure-ai-studio) + # [Azure AI Foundry portal](#tab/azure-ai-studio) You can select the deployment, and note the endpoint's _Target URI_ and _Key_. Use them to call the deployment and generate predictions. 
@@ -559,7 +559,7 @@ Read more about the [capabilities of this API](../reference/reference-model-infe ## Network isolation -Endpoints for models deployed as Serverless APIs follow the public network access (PNA) flag setting of the AI Foundry portal Hub that has the project in which the deployment exists. To secure your MaaS endpoint, disable the PNA flag on your AI Foundry Hub. You can secure inbound communication from a client to your endpoint by using a private endpoint for the hub. +Endpoints for models deployed as Serverless APIs follow the public network access (PNA) flag setting of the Azure AI Foundry hub that contains the project in which the deployment exists. To secure your MaaS endpoint, disable the PNA flag on your Azure AI Foundry hub. You can secure inbound communication from a client to your endpoint by using a private endpoint for the hub. To set the PNA flag for the Azure AI Foundry hub: @@ -573,7 +573,7 @@ To set the PNA flag for the Azure AI Foundry hub: You can delete model subscriptions and endpoints. Deleting a model subscription makes any associated endpoint become *Unhealthy* and unusable. -# [AI Foundry portal](#tab/azure-ai-studio) +# [Azure AI Foundry portal](#tab/azure-ai-studio) To delete a serverless API endpoint: diff --git a/articles/ai-studio/how-to/deploy-models-timegen-1.md b/articles/ai-studio/how-to/deploy-models-timegen-1.md index fd65d3fe6e2..a6c97928461 100644 --- a/articles/ai-studio/how-to/deploy-models-timegen-1.md +++ b/articles/ai-studio/how-to/deploy-models-timegen-1.md @@ -84,7 +84,7 @@ These steps demonstrate the deployment of TimeGEN-1. To create a deployment: 4. Select the model card of the model you want to deploy. In this article, you select **TimeGEN-1** to open the Model Details page. 1. Select **Deploy** to open a serverless API deployment window for the model. -1. Alternatively, you can initiate a deployment from your project in the AI Foundry portal as follows: +1.
Alternatively, you can initiate a deployment from your project in the Azure AI Foundry portal as follows: 1. From the left sidebar of your project, select **Models + Endpoints**. 1. Select **+ Deploy model** > **Deploy base model**. diff --git a/articles/ai-studio/how-to/develop/connections-add-sdk.md b/articles/ai-studio/how-to/develop/connections-add-sdk.md index 4744dac7d36..1f054a33576 100644 --- a/articles/ai-studio/how-to/develop/connections-add-sdk.md +++ b/articles/ai-studio/how-to/develop/connections-add-sdk.md @@ -1,5 +1,5 @@ --- -title: How to add a new connection in AI Foundry portal using the Azure Machine Learning SDK +title: How to add a new connection in Azure AI Foundry portal using the Azure Machine Learning SDK titleSuffix: Azure AI Foundry description: This article provides instructions on how to add connections to other resources using the Azure Machine Learning SDK. manager: scottpolly @@ -25,7 +25,7 @@ Connections are a way to authenticate and consume both Microsoft and other resou ## Prerequisites - An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure AI Foundry](https://azure.microsoft.com/free/) today. -- An Azure AI Foundry hub. For information on creating a hub, see [Create AI Foundry resources with the SDK](./create-hub-project-sdk.md). +- An Azure AI Foundry hub. For information on creating a hub, see [Create Azure AI Foundry resources with the SDK](./create-hub-project-sdk.md). - A resource to create a connection to. For example, an AI Services resource. The examples in this article use placeholders that you must replace with your own values when running the code. 
## Set up your environment diff --git a/articles/ai-studio/how-to/develop/create-hub-project-sdk.md b/articles/ai-studio/how-to/develop/create-hub-project-sdk.md index 9b513bbb32a..b02a3e87a11 100644 --- a/articles/ai-studio/how-to/develop/create-hub-project-sdk.md +++ b/articles/ai-studio/how-to/develop/create-hub-project-sdk.md @@ -1,7 +1,7 @@ --- title: How to create a hub using the Azure Machine Learning SDK/CLI titleSuffix: Azure AI Foundry -description: This article provides instructions on how to create an AI Foundry hub using the Azure Machine Learning SDK and Azure CLI extension. +description: This article provides instructions on how to create an Azure AI Foundry hub using the Azure Machine Learning SDK and Azure CLI extension. manager: scottpolly ms.service: azure-ai-studio ms.custom: build-2024, devx-track-azurecli @@ -16,7 +16,7 @@ author: sdgilley [!INCLUDE [feature-preview](../../includes/feature-preview.md)] -In this article, you learn how to create the following AI Foundry resources using the Azure Machine Learning SDK and Azure CLI (with machine learning extension): +In this article, you learn how to create the following Azure AI Foundry resources using the Azure Machine Learning SDK and Azure CLI (with machine learning extension): - An Azure AI Foundry hub - An Azure AI Services connection @@ -46,7 +46,7 @@ Use the following tabs to select whether you're using the Python SDK or Azure CL --- -## Create the AI Foundry hub and AI Services connection +## Create the Azure AI Foundry hub and AI Services connection Use the following examples to create a new hub. Replace example string values with your own values: @@ -127,7 +127,7 @@ You can use either an API key or credential-less YAML configuration file. For mo --- -## Create an AI Foundry hub using existing dependency resources +## Create an Azure AI Foundry hub using existing dependency resources You can also create a hub using existing resources such as Azure Storage and Azure Key Vault. 
In the following examples, replace the example string values with your own values: diff --git a/articles/ai-studio/how-to/develop/index-build-consume-sdk.md b/articles/ai-studio/how-to/develop/index-build-consume-sdk.md index ade126d2ac3..5e2aedeb58c 100644 --- a/articles/ai-studio/how-to/develop/index-build-consume-sdk.md +++ b/articles/ai-studio/how-to/develop/index-build-consume-sdk.md @@ -23,12 +23,12 @@ In this article, you learn how to create an index and consume it from code. To c You must have: -- An [AI Foundry hub](../../how-to/create-azure-ai-resource.md) and [project](../../how-to/create-projects.md). +- An [Azure AI Foundry hub](../../how-to/create-azure-ai-resource.md) and [project](../../how-to/create-projects.md). - An [Azure AI Search service connection](../../how-to/connections-add.md#create-a-new-connection) to index the sample product and customer data. If you don't have an Azure AI Search service, you can create one from the [Azure portal](https://portal.azure.com/) or see the instructions [here](/azure/search/search-create-service-portal). - Models for embedding: - You can use an ada-002 embedding model from Azure OpenAI. The instructions to deploy can be found [here](../deploy-models-openai.md). - - OR you can use any another embedding model deployed in your AI Foundry project. In this example we use Cohere multi-lingual embedding. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md). + - OR you can use any other embedding model deployed in your Azure AI Foundry project. In this example, we use Cohere multi-lingual embedding. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md). ## Build and consume an index locally @@ -88,9 +88,9 @@ The above code builds an index locally. It uses environment variables to get the AI Search service and also to connect to the Azure OpenAI embedding model.
-### Build an index locally using other embedding models deployed in your AI Foundry project +### Build an index locally using other embedding models deployed in your Azure AI Foundry project -To create an index that uses an embedding model deployed in your AI Foundry project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group` and `workspace` refers to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found in the AI Foundry project settings page. +To create an index that uses an embedding model deployed in your Azure AI Foundry project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group` and `workspace` refers to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found in the Azure AI Foundry project settings page. ```python from promptflow.rag.config import ConnectionConfig @@ -142,14 +142,14 @@ retriever.get_relevant_documents("") retriever=get_langchain_retriever_from_index(local_index_cohere) retriever.get_relevant_documents("") ``` -### Registering the index in your AI Foundry project (Optional) +### Registering the index in your Azure AI Foundry project (Optional) -Optionally, you can register the index in your AI Foundry project so that you or others who have access to your project can use it from the cloud. Before proceeding [install the required packages](#required-packages-for-remote-index-operations) for remote operations. +Optionally, you can register the index in your Azure AI Foundry project so that you or others who have access to your project can use it from the cloud. Before proceeding [install the required packages](#required-packages-for-remote-index-operations) for remote operations. 
#### Connect to the project ```python -# connect to the AI Foundry project +# connect to the Azure AI Foundry project from azure.identity import DefaultAzureCredential from azure.ai.ml import MLClient @@ -185,9 +185,9 @@ client.indexes.create_or_update( > [!NOTE] > Environment variables are intended for convenience in a local environment. However, if you register a local index created using environment variables, the index may not function as expected because secrets from environment variables won't be transferred to the cloud index. To address this issue, you can use a `ConnectionConfig` or `connection_id` to create a local index before registering. -## Build an index (remotely) in your AI Foundry project +## Build an index (remotely) in your Azure AI Foundry project -We build an index in the cloud in your AI Foundry project. +We build an index in the cloud in your Azure AI Foundry project. ### Required packages for remote index operations @@ -197,12 +197,12 @@ Install the following packages required for remote index creation. pip install azure-ai-ml promptflow-rag langchain langchain-openai ``` -### Connect to the AI Foundry project +### Connect to the Azure AI Foundry project To get started, we connect to the project. The `subscription`, `resource_group` and `workspace` in the code below refers to the project you want to connect to. ```python -# connect to the AI Foundry project +# connect to the Azure AI Foundry project from azure.identity import DefaultAzureCredential from azure.ai.ml import MLClient @@ -245,7 +245,7 @@ embeddings_model_config = IndexModelConfiguration.from_connection( deployment_name="text-embedding-ada-002") ``` -You can connect to embedding model deployed in your AI Foundry project (non Azure OpenAI models) using the serverless connection. +You can connect to an embedding model deployed in your Azure AI Foundry project (non-Azure OpenAI models) using the serverless connection.
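The `subscription`, `resource_group`, `workspace`, and `connection_name` fields together identify one connection resource in the project. As a hedged sketch of what those four values resolve to (the ARM-style id format shown is an assumption based on common Azure Machine Learning resource ids, not an official contract — the class below is an illustrative stand-in, not the SDK's `ConnectionConfig`):

```python
from dataclasses import dataclass

@dataclass
class ProjectConnection:
    """Illustrative stand-in for the four fields a ConnectionConfig carries."""
    subscription: str
    resource_group: str
    workspace: str
    connection_name: str

    def connection_id(self) -> str:
        # Compose the ARM-style resource id the four fields point at.
        return (
            f"/subscriptions/{self.subscription}"
            f"/resourceGroups/{self.resource_group}"
            f"/providers/Microsoft.MachineLearningServices"
            f"/workspaces/{self.workspace}"
            f"/connections/{self.connection_name}"
        )

conn = ProjectConnection("sub-id", "my-rg", "my-project", "my-cohere-embed")
print(conn.connection_id())
```

Seeing the fields as one composed resource id also explains the note above: a `connection_id` is portable to the cloud, while values read from local environment variables are not.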
```python from azure.ai.ml.entities import IndexModelConfiguration @@ -392,6 +392,6 @@ print(result["answer"]) ## Related content -- [Create and consume an index from the AI Foundry portal UI](../index-add.md) +- [Create and consume an index from the Azure AI Foundry portal UI](../index-add.md) - [Get started building a chat app using the prompt flow SDK](../../quickstarts/get-started-code.md) - [Work with projects in VS Code](vscode.md) \ No newline at end of file diff --git a/articles/ai-studio/how-to/develop/sdk-overview.md b/articles/ai-studio/how-to/develop/sdk-overview.md index 3e5ee914c10..62109e0166a 100644 --- a/articles/ai-studio/how-to/develop/sdk-overview.md +++ b/articles/ai-studio/how-to/develop/sdk-overview.md @@ -24,7 +24,7 @@ The Azure AI Foundry SDK is a comprehensive toolchain designed to simplify the d - Easily combine together models, data, and AI services to build AI-powered applications - Evaluate, debug, and improve application quality & safety across development, testing, and production environments -The AI Foundry SDK is a set of packages and services designed to work together. You can use the [Azure AI Projects client library](/python/api/overview/azure/ai-projects-readme) to easily use multiple services through a single project client and connection string. You can also use services and SDKs on their own and connect directly to your services. +The Azure AI Foundry SDK is a set of packages and services designed to work together. You can use the [Azure AI Projects client library](/python/api/overview/azure/ai-projects-readme) to easily use multiple services through a single project client and connection string. You can also use services and SDKs on their own and connect directly to your services. 
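The "single project client and connection string" idea above means one opaque value carries everything needed to locate the project. For illustration only — the semicolon-delimited `host;subscription;resource-group;project` shape below is an assumption, so check your project's **Overview** page for the authoritative value — splitting such a string might look like:

```python
from typing import NamedTuple

class ProjectId(NamedTuple):
    host: str
    subscription_id: str
    resource_group: str
    project_name: str

def parse_connection_string(value: str) -> ProjectId:
    """Split a semicolon-delimited project connection string into its parts."""
    parts = value.split(";")
    if len(parts) != 4:
        raise ValueError("Expected 'host;subscription;resource-group;project'.")
    return ProjectId(*parts)

parsed = parse_connection_string(
    "eastus2.api.azureml.ms;00000000-0000-0000-0000-000000000000;my-rg;my-project"
)
print(parsed.project_name)
```

In practice you would hand the whole string to the Azure AI Projects client library rather than parse it yourself; the sketch only shows why one value is enough to route calls to the right project.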
If you want to jump right in and start building an app, check out: @@ -173,7 +173,7 @@ If you have existing code that uses the OpenAI SDK, you can use the project clie ::: zone-end -If you’re already using the [Azure OpenAI SDK](../../../ai-services/openai/chatgpt-quickstart.md) directly against the Azure OpenAI Service, the project provides a convenient way to use Azure OpenAI Service capabilities alongside the rest of the AI Foundry capabilities. +If you’re already using the [Azure OpenAI SDK](../../../ai-services/openai/chatgpt-quickstart.md) directly against the Azure OpenAI Service, the project provides a convenient way to use Azure OpenAI Service capabilities alongside the rest of the Azure AI Foundry capabilities. ## Azure AI model inference service diff --git a/articles/ai-studio/how-to/develop/simulator-interaction-data.md b/articles/ai-studio/how-to/develop/simulator-interaction-data.md index 4e5539d3c34..804a721e087 100644 --- a/articles/ai-studio/how-to/develop/simulator-interaction-data.md +++ b/articles/ai-studio/how-to/develop/simulator-interaction-data.md @@ -307,7 +307,7 @@ Augment and accelerate your red-teaming operation by using Azure AI Foundry safe from azure.ai.evaluation.simulator import AdversarialSimulator ``` -The adversarial simulator works by setting up a service-hosted GPT large language model to simulate an adversarial user and interact with your application. An AI Foundry project is required to run the adversarial simulator: +The adversarial simulator works by setting up a service-hosted GPT large language model to simulate an adversarial user and interact with your application. 
An Azure AI Foundry project is required to run the adversarial simulator: ```python from azure.identity import DefaultAzureCredential diff --git a/articles/ai-studio/how-to/develop/trace-local-sdk.md b/articles/ai-studio/how-to/develop/trace-local-sdk.md index 2b7e550d4d7..e91dcdfe9a4 100644 --- a/articles/ai-studio/how-to/develop/trace-local-sdk.md +++ b/articles/ai-studio/how-to/develop/trace-local-sdk.md @@ -26,7 +26,7 @@ In this article you'll learn how to trace your application with Azure AI Inferen - An [Azure Subscription](https://azure.microsoft.com/). - An Azure AI project, see [Create a project in Azure AI Foundry portal](../create-projects.md). -- An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through AI Foundry. +- An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry. - If using Python, you need Python 3.8 or later installed, including pip. - If using JavaScript, the supported environments are LTS versions of Node.js. @@ -209,7 +209,7 @@ To trace your own custom functions, you can leverage OpenTelemetry, you'll need ## Attach User feedback to traces -To attach user feedback to traces and visualize them in AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application enabling tracing and logging user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can use view and manage these traces in AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications. 
+To attach user feedback to traces and visualize them in Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications. ## Related content diff --git a/articles/ai-studio/how-to/develop/trace-production-sdk.md b/articles/ai-studio/how-to/develop/trace-production-sdk.md index d44df30a96e..ca0b32f4711 100644 --- a/articles/ai-studio/how-to/develop/trace-production-sdk.md +++ b/articles/ai-studio/how-to/develop/trace-production-sdk.md @@ -27,7 +27,7 @@ In this article, you learn to enable tracing, collect aggregated metrics, and co ## Prerequisites - The Azure CLI and the Azure Machine Learning extension to the Azure CLI. -- An AI Foundry project. If you don't already have a project, you can [create one here](../../how-to/create-projects.md). +- An Azure AI Foundry project. If you don't already have a project, you can [create one here](../../how-to/create-projects.md). - An Application Insights. If you don't already have an Application Insights resource, you can [create one here](/azure/azure-monitor/app/create-workspace-resource). - Azure role-based access controls are used to grant access to operations in Azure Machine Learning. To perform the steps in this article, you must have **Owner** or **Contributor** permissions on the selected resource group. For more information, see [Role-based access control in Azure AI Foundry portal](../../concepts/rbac-ai-studio.md).
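The correlation described above — joining a feedback event to its chat request trace through the response ID — can be sketched with plain dictionaries. The field names below are illustrative only, not OpenTelemetry's actual semantic-convention attribute names:

```python
# Chat request traces keyed by the response id each one produced.
chat_traces = {
    "resp-001": {"prompt": "What is tracing?", "latency_ms": 412},
    "resp-002": {"prompt": "Summarize this doc.", "latency_ms": 980},
}

def attach_feedback(traces, response_id, score, comment=""):
    """Record user feedback on the trace that produced response_id."""
    trace = traces.get(response_id)
    if trace is None:
        raise KeyError(f"No trace found for response id {response_id!r}")
    trace.setdefault("feedback", []).append({"score": score, "comment": comment})
    return trace

attach_feedback(chat_traces, "resp-001", score=1, comment="Helpful answer")
print(chat_traces["resp-001"]["feedback"])
```

In a real instrumented application the join key would be the response ID recorded as a span attribute, so the portal can surface the feedback alongside the request's latency and content data.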
@@ -42,7 +42,7 @@ You can also [deploy to other platforms, such as Docker container, Kubernetes cl ## Enable trace and collect system metrics for your deployment -If you're using AI Foundry portal to deploy, then you can turn-on **Application Insights diagnostics** in **Advanced settings** > **Deployment** step in the deployment wizard, in which way the tracing data and system metrics are collected to the project linked to Application Insights. +If you're using Azure AI Foundry portal to deploy, you can turn on **Application Insights diagnostics** in the **Advanced settings** > **Deployment** step of the deployment wizard, so that tracing data and system metrics are collected to the project linked to Application Insights. If you're using SDK or CLI, you can by adding a property `app_insights_enabled: true` in the deployment yaml file that collects data to the project linked to application insights. diff --git a/articles/ai-studio/how-to/develop/visualize-traces.md b/articles/ai-studio/how-to/develop/visualize-traces.md index 003d77bdd87..ae61de68c66 100644 --- a/articles/ai-studio/how-to/develop/visualize-traces.md +++ b/articles/ai-studio/how-to/develop/visualize-traces.md @@ -54,7 +54,7 @@ os.environ['AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED'] = 'true' application_insights_connection_string = project.telemetry.get_connection_string() if not application_insights_connection_string: print("Application Insights was not enabled for this project.") - print("Enable it via the 'Tracing' tab in your AI Foundry project page.") + print("Enable it via the 'Tracing' tab in your Azure AI Foundry project page.") exit() configure_azure_monitor(connection_string=application_insights_connection_string) diff --git a/articles/ai-studio/how-to/develop/vscode.md b/articles/ai-studio/how-to/develop/vscode.md index c67e7e15458..da731b83fce 100644 --- a/articles/ai-studio/how-to/develop/vscode.md +++ b/articles/ai-studio/how-to/develop/vscode.md @@ -46,7 +46,7 @@ Azure AI
Foundry supports developing in VS Code - Desktop and Web. In each scena Our prebuilt development environments are based on a docker container that has Azure AI SDKs, the prompt flow SDK, and other tools. The environment is configured to run VS Code remotely inside of the container. The container is defined in a similar way to [this Dockerfile](https://github.com/Azure-Samples/aistudio-python-quickstart-sample/blob/main/.devcontainer/Dockerfile), and is based on [Microsoft's Python 3.10 Development Container Image](https://mcr.microsoft.com/product/devcontainers/python/about). -Your file explorer is opened to the specific project directory you launched from in AI Foundry portal. +Your file explorer is opened to the specific project directory you launched from in Azure AI Foundry portal. The container is configured with the Azure AI folder hierarchy (`afh` directory), which is designed to orient you within your current development context, and help you work with your code, data, and shared files most efficiently. This `afh` directory houses your Azure AI Foundry projects, and each project has a dedicated project directory that includes `code`, `data`, and `shared` folders. diff --git a/articles/ai-studio/how-to/disable-local-auth.md b/articles/ai-studio/how-to/disable-local-auth.md index e3aa68a8f77..6f139e7d55b 100644 --- a/articles/ai-studio/how-to/disable-local-auth.md +++ b/articles/ai-studio/how-to/disable-local-auth.md @@ -226,7 +226,7 @@ If you have an existing Azure AI Foundry hub, use the steps in this section to u # [Azure portal](#tab/portal) -1. Go to the Azure portal and select the __AI Foundry hub__. +1. Go to the Azure portal and select the __Azure AI Foundry hub__. 1. From the left menu, select **Properties**. From the bottom of the page, set __Storage account access type__ to __Identity-based__. Select __Save__ from the top of the page to save the configuration. 
:::image type="content" source="../media/disable-local-auth/update-existing-hub-identity-based-access.png" alt-text="Screenshot showing selection of Identity-based access." lightbox="../media/disable-local-auth/update-existing-hub-identity-based-access.png"::: diff --git a/articles/ai-studio/how-to/disaster-recovery.md b/articles/ai-studio/how-to/disaster-recovery.md index a87e269e2cf..57ba9fb3aff 100644 --- a/articles/ai-studio/how-to/disaster-recovery.md +++ b/articles/ai-studio/how-to/disaster-recovery.md @@ -84,8 +84,8 @@ Azure AI Foundry builds on top of other services. Some services can be configure | Azure service | Geo-replicated by | Configuration | | ----- | ----- | ----- | -| AI Foundry hub and projects | You | Create a hub/projects in the selected regions. | -| AI Foundry compute | You | Create the compute resources in the selected regions. For compute resources that can dynamically scale, make sure that both regions provide sufficient compute quota for your needs. | +| Azure AI Foundry hub and projects | You | Create a hub/projects in the selected regions. | +| Azure AI Foundry compute | You | Create the compute resources in the selected regions. For compute resources that can dynamically scale, make sure that both regions provide sufficient compute quota for your needs. | | Key Vault | Microsoft | Use the same Key Vault instance with the Azure AI Foundry hub and resources in both regions. Key Vault automatically fails over to a secondary region. For more information, see [Azure Key Vault availability and redundancy](/azure/key-vault/general/disaster-recovery-guidance).| | Storage Account | You | Azure Machine Learning doesn't support __default storage-account__ failover using geo-redundant storage (GRS), geo-zone-redundant storage (GZRS), read-access geo-redundant storage (RA-GRS), or read-access geo-zone-redundant storage (RA-GZRS). Configure a storage account according to your needs and then use it for your hub. 
All subsequent projects use the hub's storage account. For more information, see [Azure Storage redundancy](/azure/storage/common/storage-redundancy). | | Container Registry | Microsoft | Configure the Container Registry instance to geo-replicate registries to the paired region for Azure AI Foundry. Use the same instance for both hub instances. For more information, see [Geo-replication in Azure Container Registry](/azure/container-registry/container-registry-geo-replication). | @@ -120,7 +120,7 @@ For any hubs that are essential to business continuity, deploy resources in two In the scenario in which you're connecting with data to customize your AI application, typically your datasets could be used in Azure AI but also outside of Azure AI. Dataset volume could be quite large, so for it might be good practice to keep this data in a separate storage account. Evaluate what data replication strategy makes most sense for your use case. -In AI Foundry portal, make a connection to your data. If you have multiple AI Foundry instances in different regions, you might still point to the same storage account because connections work across regions. +In Azure AI Foundry portal, make a connection to your data. If you have multiple Azure AI Foundry instances in different regions, you might still point to the same storage account because connections work across regions. ## Initiate a failover diff --git a/articles/ai-studio/how-to/evaluate-generative-ai-app.md b/articles/ai-studio/how-to/evaluate-generative-ai-app.md index 7a52dde8db3..b130cadf24c 100644 --- a/articles/ai-studio/how-to/evaluate-generative-ai-app.md +++ b/articles/ai-studio/how-to/evaluate-generative-ai-app.md @@ -16,7 +16,7 @@ author: lgayhardt To thoroughly assess the performance of your generative AI models and applications when applied to a substantial dataset, you can initiate an evaluation process. 
During this evaluation, your model or application is tested with the given dataset, and its performance will be quantitatively measured with both mathematical based metrics and AI-assisted metrics. This evaluation run provides you with comprehensive insights into the application's capabilities and limitations. -To carry out this evaluation, you can utilize the evaluation functionality in Azure AI Foundry portal, a comprehensive platform that offers tools and features for assessing the performance and safety of your generative AI model. In AI Foundry portal, you're able to log, view, and analyze detailed evaluation metrics. +To carry out this evaluation, you can utilize the evaluation functionality in Azure AI Foundry portal, a comprehensive platform that offers tools and features for assessing the performance and safety of your generative AI model. In Azure AI Foundry portal, you're able to log, view, and analyze detailed evaluation metrics. In this article, you learn to create an evaluation run against model, a test dataset or a flow with built-in evaluation metrics from Azure AI Foundry UI. For greater flexibility, you can establish a custom evaluation flow and employ the **custom evaluation** feature. Alternatively, if your objective is solely to conduct a batch run without any evaluation, you can also utilize the custom evaluation feature. @@ -29,7 +29,7 @@ To run an evaluation with AI-assisted metrics, you need to have the following re ## Create an evaluation with built-in evaluation metrics -An evaluation run allows you to generate metric outputs for each data row in your test dataset. You can choose one or more evaluation metrics to assess the output from different aspects. You can create an evaluation run from the evaluation, model catalog or prompt flow pages in AI Foundry portal. Then an evaluation creation wizard appears to guide you through the process of setting up an evaluation run. 
+An evaluation run allows you to generate metric outputs for each data row in your test dataset. You can choose one or more evaluation metrics to assess the output from different aspects. You can create an evaluation run from the evaluation, model catalog or prompt flow pages in Azure AI Foundry portal. Then an evaluation creation wizard appears to guide you through the process of setting up an evaluation run. ### From the evaluate page @@ -235,7 +235,7 @@ The evaluator library is a centralized place that allows you to see the details The evaluator library also enables version management. You can compare different versions of your work, restore previous versions if needed, and collaborate with others more easily. -To use the evaluator library in AI Foundry portal, go to your project's **Evaluation** page and select the **Evaluator library** tab. +To use the evaluator library in Azure AI Foundry portal, go to your project's **Evaluation** page and select the **Evaluator library** tab. :::image type="content" source="../media/evaluations/evaluate/evaluator-library-list.png" alt-text="Screenshot of the page to select evaluators from the evaluator library." lightbox="../media/evaluations/evaluate/evaluator-library-list.png"::: diff --git a/articles/ai-studio/how-to/evaluate-results.md b/articles/ai-studio/how-to/evaluate-results.md index b9b8ddf74b3..d8dd969a2be 100644 --- a/articles/ai-studio/how-to/evaluate-results.md +++ b/articles/ai-studio/how-to/evaluate-results.md @@ -17,7 +17,7 @@ author: lgayhardt # How to view evaluation results in Azure AI Foundry portal -The Azure AI Foundry portal evaluation page is a versatile hub that not only allows you to visualize and assess your results but also serves as a control center for optimizing, troubleshooting, and selecting the ideal AI model for your deployment needs. It's a one-stop solution for data-driven decision-making and performance enhancement in your AI Foundry projects. 
You can seamlessly access and interpret the results from various sources, including your flow, the playground quick test session, evaluation submission UI, and SDK. This flexibility ensures that you can interact with your results in a way that best suits your workflow and preferences. +The Azure AI Foundry portal evaluation page is a versatile hub that not only allows you to visualize and assess your results but also serves as a control center for optimizing, troubleshooting, and selecting the ideal AI model for your deployment needs. It's a one-stop solution for data-driven decision-making and performance enhancement in your Azure AI Foundry projects. You can seamlessly access and interpret the results from various sources, including your flow, the playground quick test session, evaluation submission UI, and SDK. This flexibility ensures that you can interact with your results in a way that best suits your workflow and preferences. Once you've visualized your evaluation results, you can dive into a thorough examination. This includes the ability to not only view individual results but also to compare these results across multiple evaluation runs. By doing so, you can identify trends, patterns, and discrepancies, gaining invaluable insights into the performance of your AI system under various conditions. diff --git a/articles/ai-studio/how-to/fine-tune-model-llama.md b/articles/ai-studio/how-to/fine-tune-model-llama.md index 3c0ef1024a0..96f6f14bd8d 100644 --- a/articles/ai-studio/how-to/fine-tune-model-llama.md +++ b/articles/ai-studio/how-to/fine-tune-model-llama.md @@ -66,7 +66,7 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West - An [Azure AI Foundry project](../how-to/create-projects.md) in Azure AI Foundry portal. - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. 
To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure subscription. Alternatively, your account can be assigned a custom role that has the following permissions: - - On the Azure subscription—to subscribe the AI Foundry project to the Azure Marketplace offering, once for each project, per offering: + - On the Azure subscription—to subscribe the Azure AI Foundry project to the Azure Marketplace offering, once for each project, per offering: - `Microsoft.MarketplaceOrdering/agreements/offers/plans/read` - `Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action` - `Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read` @@ -77,7 +77,7 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West - `Microsoft.SaaS/resources/read` - `Microsoft.SaaS/resources/write` - - On the AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): + - On the Azure AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): - `Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*` - `Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*` @@ -87,15 +87,15 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West # [Meta Llama 2](#tab/llama-two) An Azure subscription with a valid payment method. Free or trial Azure subscriptions won't work. If you don't have an Azure subscription, create a [paid Azure account](https://azure.microsoft.com/pricing/purchase-options/pay-as-you-go) to begin. -- An [AI Foundry hub](../how-to/create-azure-ai-resource.md). +- An [Azure AI Foundry hub](../how-to/create-azure-ai-resource.md). > [!IMPORTANT] > For Meta Llama 2 models, the pay-as-you-go model fine-tune offering is only available with hubs created in the **West US 3** region. 
-- An [AI Foundry project](../how-to/create-projects.md) in Azure AI Foundry portal. +- An [Azure AI Foundry project](../how-to/create-projects.md) in Azure AI Foundry portal. - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure subscription. Alternatively, your account can be assigned a custom role that has the following permissions: - - On the Azure subscription—to subscribe the AI Foundry project to the Azure Marketplace offering, once for each project, per offering: + - On the Azure subscription—to subscribe the Azure AI Foundry project to the Azure Marketplace offering, once for each project, per offering: - `Microsoft.MarketplaceOrdering/agreements/offers/plans/read` - `Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action` - `Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read` @@ -106,7 +106,7 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West - `Microsoft.SaaS/resources/read` - `Microsoft.SaaS/resources/write` - - On the AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): + - On the Azure AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): - `Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*` - `Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*` diff --git a/articles/ai-studio/how-to/fine-tune-models-tsuzumi.md b/articles/ai-studio/how-to/fine-tune-models-tsuzumi.md index 8288ec13ec0..1c893529b3a 100644 --- a/articles/ai-studio/how-to/fine-tune-models-tsuzumi.md +++ b/articles/ai-studio/how-to/fine-tune-models-tsuzumi.md @@ -38,7 +38,7 @@ In this article, you learn how to fine-tune an NTTDATA tsuzumi-7b model in [Azur - An [Azure AI Foundry 
project](../how-to/create-projects.md) in Azure AI Foundry portal. - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure subscription. Alternatively, your account can be assigned a custom role that has the following permissions: - - On the Azure subscription—to subscribe the AI Foundry project to the Azure Marketplace offering, once for each project, per offering: + - On the Azure subscription—to subscribe the Azure AI Foundry project to the Azure Marketplace offering, once for each project, per offering: - `Microsoft.MarketplaceOrdering/agreements/offers/plans/read` - `Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action` - `Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read` @@ -49,7 +49,7 @@ In this article, you learn how to fine-tune an NTTDATA tsuzumi-7b model in [Azur - `Microsoft.SaaS/resources/read` - `Microsoft.SaaS/resources/write` - - On the AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): + - On the Azure AI Foundry project—to deploy endpoints (the Azure AI Developer role contains these permissions already): - `Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*` - `Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*` diff --git a/articles/ai-studio/how-to/fine-tune-phi-3.md b/articles/ai-studio/how-to/fine-tune-phi-3.md index 0e3fcdeb5ed..5f0e42e0f86 100644 --- a/articles/ai-studio/how-to/fine-tune-phi-3.md +++ b/articles/ai-studio/how-to/fine-tune-phi-3.md @@ -64,12 +64,12 @@ The models underwent a rigorous enhancement process, incorporating both supervis ### Prerequisites - An Azure subscription. 
If you don't have an Azure subscription, create a [paid Azure account](https://azure.microsoft.com/pricing/purchase-options/pay-as-you-go) to begin. -- An [AI Foundry hub](../how-to/create-azure-ai-resource.md). +- An [Azure AI Foundry hub](../how-to/create-azure-ai-resource.md). > [!IMPORTANT] > For Phi-3 family models, the pay-as-you-go model fine-tune offering is only available with hubs created in **East US 2** regions. -- An [AI Foundry project](../how-to/create-projects.md). +- An [Azure AI Foundry project](../how-to/create-projects.md). - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __Azure AI Developer role__ on the resource group. For more information on permissions, see [Role-based access control in Azure AI Foundry portal](../concepts/rbac-ai-studio.md). diff --git a/articles/ai-studio/how-to/flow-deploy.md b/articles/ai-studio/how-to/flow-deploy.md index 256dbf465c5..8c4ff278383 100644 --- a/articles/ai-studio/how-to/flow-deploy.md +++ b/articles/ai-studio/how-to/flow-deploy.md @@ -123,7 +123,7 @@ The authentication method for the endpoint. Key-based authentication provides a #### Identity type -The endpoint needs to access Azure resources such as the Azure Container Registry or your AI Foundry hub connections for inferencing. You can allow the endpoint permission to access Azure resources via giving permission to its managed identity. +The endpoint needs to access Azure resources such as the Azure Container Registry or your Azure AI Foundry hub connections for inferencing. You can allow the endpoint to access Azure resources by granting permission to its managed identity. System-assigned identity will be autocreated after your endpoint is created, while user-assigned identity is created by user.
[Learn more about managed identities.](/azure/active-directory/managed-identities-azure-resources/overview) @@ -139,10 +139,10 @@ If you created the associated endpoint with **User Assigned Identity**, the user |Scope|Role|Why it's needed| |---|---|---| -|AI Foundry project|**Azure Machine Learning Workspace Connection Secrets Reader** role **OR** a customized role with `Microsoft.MachineLearningServices/workspaces/connections/listsecrets/action` | Get project connections| -|AI Foundry project container registry |**ACR pull** |Pull container image | -|AI Foundry project default storage| **Storage Blob Data Reader**| Load model from storage | -|AI Foundry project|**Workspace metrics writer**| After you deploy then endpoint, if you want to monitor the endpoint related metrics like CPU/GPU/Disk/Memory utilization, you need to give this permission to the identity.

Optional| +|Azure AI Foundry project|**Azure Machine Learning Workspace Connection Secrets Reader** role **OR** a customized role with `Microsoft.MachineLearningServices/workspaces/connections/listsecrets/action` | Get project connections| +|Azure AI Foundry project container registry |**ACR pull** |Pull container image | +|Azure AI Foundry project default storage| **Storage Blob Data Reader**| Load model from storage | +|Azure AI Foundry project|**Workspace metrics writer**| After you deploy the endpoint, if you want to monitor endpoint-related metrics like CPU/GPU/Disk/Memory utilization, you need to give this permission to the identity.

Optional| See detailed guidance about how to grant permissions to the endpoint identity in [Grant permissions to the endpoint](#grant-permissions-to-the-endpoint). diff --git a/articles/ai-studio/how-to/healthcare-ai/deploy-cxrreportgen.md b/articles/ai-studio/how-to/healthcare-ai/deploy-cxrreportgen.md index 2c165de874a..cde9e7f3751 100644 --- a/articles/ai-studio/how-to/healthcare-ai/deploy-cxrreportgen.md +++ b/articles/ai-studio/how-to/healthcare-ai/deploy-cxrreportgen.md @@ -1,5 +1,5 @@ --- -title: How to deploy and use CXRReportGen healthcare AI model with AI Foundry +title: How to deploy and use CXRReportGen healthcare AI model with Azure AI Foundry titleSuffix: Azure AI Foundry description: Learn how to use CXRReportGen Healthcare AI Model with Azure AI Foundry. ms.service: azure-ai-studio diff --git a/articles/ai-studio/how-to/healthcare-ai/deploy-medimageinsight.md b/articles/ai-studio/how-to/healthcare-ai/deploy-medimageinsight.md index e7976cf9ac5..6fff6f2bce7 100644 --- a/articles/ai-studio/how-to/healthcare-ai/deploy-medimageinsight.md +++ b/articles/ai-studio/how-to/healthcare-ai/deploy-medimageinsight.md @@ -1,5 +1,5 @@ --- -title: How to deploy and use MedImageInsight healthcare AI model with AI Foundry +title: How to deploy and use MedImageInsight healthcare AI model with Azure AI Foundry titleSuffix: Azure AI Foundry description: Learn how to use MedImageInsight Healthcare AI Model with Azure AI Foundry. 
ms.service: azure-ai-studio diff --git a/articles/ai-studio/how-to/healthcare-ai/deploy-medimageparse.md b/articles/ai-studio/how-to/healthcare-ai/deploy-medimageparse.md index 7d7c902b61b..21dbd8e27d6 100644 --- a/articles/ai-studio/how-to/healthcare-ai/deploy-medimageparse.md +++ b/articles/ai-studio/how-to/healthcare-ai/deploy-medimageparse.md @@ -1,5 +1,5 @@ --- -title: How to deploy and use MedImageParse healthcare AI model with AI Foundry +title: How to deploy and use MedImageParse healthcare AI model with Azure AI Foundry titleSuffix: Azure AI Foundry description: Learn how to use MedImageParse Healthcare AI Model with Azure AI Foundry. ms.service: azure-ai-studio diff --git a/articles/ai-studio/how-to/healthcare-ai/healthcare-ai-models.md b/articles/ai-studio/how-to/healthcare-ai/healthcare-ai-models.md index 51101e07243..f4e233af7fb 100644 --- a/articles/ai-studio/how-to/healthcare-ai/healthcare-ai-models.md +++ b/articles/ai-studio/how-to/healthcare-ai/healthcare-ai-models.md @@ -1,5 +1,5 @@ --- -title: Foundation models for healthcare in AI Foundry portal +title: Foundation models for healthcare in Azure AI Foundry portal titleSuffix: Azure AI Foundry description: Learn about AI models that are applicable to the health and life science industry. ms.service: azure-ai-studio diff --git a/articles/ai-studio/how-to/model-catalog-overview.md b/articles/ai-studio/how-to/model-catalog-overview.md index 58c21e99e2f..4555b2d580d 100644 --- a/articles/ai-studio/how-to/model-catalog-overview.md +++ b/articles/ai-studio/how-to/model-catalog-overview.md @@ -165,21 +165,21 @@ Pay-per-token billing is available only to users whose Azure subscription belong ### Network isolation for models deployed via serverless APIs -Managed computes for models deployed as serverless APIs follow the public network access flag setting of the AI Foundry hub that has the project in which the deployment exists. 
To help secure your managed compute, disable the public network access flag on your AI Foundry hub. You can help secure inbound communication from a client to your managed compute by using a private endpoint for the hub. +Managed computes for models deployed as serverless APIs follow the public network access flag setting of the Azure AI Foundry hub that has the project in which the deployment exists. To help secure your managed compute, disable the public network access flag on your Azure AI Foundry hub. You can help secure inbound communication from a client to your managed compute by using a private endpoint for the hub. -To set the public network access flag for the AI Foundry hub: +To set the public network access flag for the Azure AI Foundry hub: * Go to the [Azure portal](https://ms.portal.azure.com/). -* Search for the resource group to which the hub belongs, and select your AI Foundry hub from the resources listed for this resource group. +* Search for the resource group to which the hub belongs, and select your Azure AI Foundry hub from the resources listed for this resource group. * On the hub overview page, on the left pane, go to **Settings** > **Networking**. * On the **Public access** tab, you can configure settings for the public network access flag. * Save your changes. Your changes might take up to five minutes to propagate. #### Limitations -* If you have an AI Foundry hub with a managed compute created before July 11, 2024, managed computes added to projects in this hub won't follow the networking configuration of the hub. Instead, you need to create a new managed compute for the hub and create new serverless API deployments in the project so that the new deployments can follow the hub's networking configuration. +* If you have an Azure AI Foundry hub with a managed compute created before July 11, 2024, managed computes added to projects in this hub won't follow the networking configuration of the hub. 
Instead, you need to create a new managed compute for the hub and create new serverless API deployments in the project so that the new deployments can follow the hub's networking configuration. -* If you have an AI Foundry hub with MaaS deployments created before July 11, 2024, and you enable a managed compute on this hub, the existing MaaS deployments won't follow the hub's networking configuration. For serverless API deployments in the hub to follow the hub's networking configuration, you need to create the deployments again. +* If you have an Azure AI Foundry hub with MaaS deployments created before July 11, 2024, and you enable a managed compute on this hub, the existing MaaS deployments won't follow the hub's networking configuration. For serverless API deployments in the hub to follow the hub's networking configuration, you need to create the deployments again. * Currently, [Azure OpenAI On Your Data](/azure/ai-services/openai/concepts/use-your-data) support isn't available for MaaS deployments in private hubs, because private hubs have the public network access flag disabled. 
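The **Public access** steps above can also be scripted. A minimal sketch of the properties patch behind the portal toggle, assuming the hub exposes the standard Azure Machine Learning workspace property `publicNetworkAccess` (verify against your hub's resource JSON; the CLI command in the comment and the resource names are placeholders):

```python
# Sketch of the setting the portal's Public access tab writes for a hub.
# Assumes the standard ARM workspace property "publicNetworkAccess".
def public_network_access_patch(allow_public: bool) -> dict:
    """Build a properties patch that enables or disables public network
    access for an Azure AI Foundry hub."""
    value = "Enabled" if allow_public else "Disabled"
    return {"properties": {"publicNetworkAccess": value}}

# To lock down the hub, apply the Disabled patch, for example with a CLI
# call roughly like (names are placeholders):
#   az ml workspace update -n <hub-name> -g <resource-group> --public-network-access Disabled
print(public_network_access_patch(False))
# {'properties': {'publicNetworkAccess': 'Disabled'}}
```

As with the portal route, changes applied this way can take a few minutes to propagate.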
diff --git a/articles/ai-studio/how-to/monitor-quality-safety.md b/articles/ai-studio/how-to/monitor-quality-safety.md index 301f2a68713..0abacc21cb9 100644 --- a/articles/ai-studio/how-to/monitor-quality-safety.md +++ b/articles/ai-studio/how-to/monitor-quality-safety.md @@ -206,7 +206,7 @@ credential = DefaultAzureCredential() # Update your azure resources details subscription_id = "INSERT YOUR SUBSCRIPTION ID" resource_group = "INSERT YOUR RESOURCE GROUP NAME" -project_name = "INSERT YOUR PROJECT NAME" # This is the same as your AI Foundry project name +project_name = "INSERT YOUR PROJECT NAME" # This is the same as your Azure AI Foundry project name endpoint_name = "INSERT YOUR ENDPOINT NAME" # This is your deployment name without the suffix (e.g., deployment is "contoso-chatbot-1", endpoint is "contoso-chatbot") deployment_name = "INSERT YOUR DEPLOYMENT NAME" aoai_deployment_name ="INSERT YOUR AOAI DEPLOYMENT NAME" @@ -373,7 +373,7 @@ credential = DefaultAzureCredential() # Update your azure resources details subscription_id = "INSERT YOUR SUBSCRIPTION ID" resource_group = "INSERT YOUR RESOURCE GROUP NAME" -project_name = "INSERT YOUR PROJECT NAME" # This is the same as your AI Foundry project name +project_name = "INSERT YOUR PROJECT NAME" # This is the same as your Azure AI Foundry project name endpoint_name = "INSERT YOUR ENDPOINT NAME" # This is your deployment name without the suffix (e.g., deployment is "contoso-chatbot-1", endpoint is "contoso-chatbot") deployment_name = "INSERT YOUR DEPLOYMENT NAME" @@ -450,7 +450,7 @@ credential = DefaultAzureCredential() # Update your azure resources details subscription_id = "INSERT YOUR SUBSCRIPTION ID" resource_group = "INSERT YOUR RESOURCE GROUP NAME" -project_name = "INSERT YOUR PROJECT NAME" # This is the same as your AI Foundry project name +project_name = "INSERT YOUR PROJECT NAME" # This is the same as your Azure AI Foundry project name endpoint_name = "INSERT YOUR ENDPOINT NAME" # This is your deployment 
name without the suffix (e.g., deployment is "contoso-chatbot-1", endpoint is "contoso-chatbot") deployment_name = "INSERT YOUR DEPLOYMENT NAME" aoai_deployment_name ="INSERT YOUR AOAI DEPLOYMENT NAME" @@ -535,7 +535,7 @@ model_monitor = MonitorSchedule( ml_client.schedules.begin_create_or_update(model_monitor) ``` -After you create your monitor from the SDK, you can [consume the monitoring results](#consume-monitoring-results) in AI Foundry portal. +After you create your monitor from the SDK, you can [consume the monitoring results](#consume-monitoring-results) in Azure AI Foundry portal. ## Related content diff --git a/articles/ai-studio/how-to/online-evaluation.md b/articles/ai-studio/how-to/online-evaluation.md index b646ae0ce04..e6017c53ff6 100644 --- a/articles/ai-studio/how-to/online-evaluation.md +++ b/articles/ai-studio/how-to/online-evaluation.md @@ -228,7 +228,7 @@ app_insights_config = ApplicationInsightsConfiguration( deployment_name = "gpt-4" api_version = "2024-08-01-preview" -# This is your AOAI connection name, which can be found in your AI Foundry project under the 'Models + Endpoints' tab +# This is your AOAI connection name, which can be found in your Azure AI Foundry project under the 'Models + Endpoints' tab default_connection = project_client.connections._get_connection( "aoai_connection_name" ) @@ -245,7 +245,7 @@ Next, configure the evaluators you wish to use: ```python # RelevanceEvaluator -# id for each evaluator can be found in your AI Foundry registry - please see documentation for more information +# id for each evaluator can be found in your Azure AI Foundry registry - please see documentation for more information # init_params is the configuration for the model to use to perform the evaluation # data_mapping is used to map the output columns of your query to the names required by the evaluator relevance_evaluator_config = EvaluatorConfiguration( diff --git a/articles/ai-studio/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool.md 
b/articles/ai-studio/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool.md index 552e60bb2cf..887f6adf2ab 100644 --- a/articles/ai-studio/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool.md +++ b/articles/ai-studio/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool.md @@ -22,7 +22,7 @@ The prompt flow Azure OpenAI GPT-4 Turbo with Vision tool enables you to use you ## Prerequisites - An Azure subscription. You can create one for free. -- An [AI Foundry hub](../../how-to/create-azure-ai-resource.md) with a GPT-4 Turbo with Vision model deployed in [one of the regions that support GPT-4 Turbo with Vision](../../../ai-services/openai/concepts/models.md#model-summary-table-and-region-availability). When you deploy from your project's **Deployments** page, select `gpt-4` as the model name and `vision-preview` as the model version. +- An [Azure AI Foundry hub](../../how-to/create-azure-ai-resource.md) with a GPT-4 Turbo with Vision model deployed in [one of the regions that support GPT-4 Turbo with Vision](../../../ai-services/openai/concepts/models.md#model-summary-table-and-region-availability). When you deploy from your project's **Deployments** page, select `gpt-4` as the model name and `vision-preview` as the model version. ## Build with the Azure OpenAI GPT-4 Turbo with Vision tool diff --git a/articles/ai-studio/how-to/prompt-flow-tools/serp-api-tool.md b/articles/ai-studio/how-to/prompt-flow-tools/serp-api-tool.md index 0b719b1ba02..e336e1aa051 100644 --- a/articles/ai-studio/how-to/prompt-flow-tools/serp-api-tool.md +++ b/articles/ai-studio/how-to/prompt-flow-tools/serp-api-tool.md @@ -37,7 +37,7 @@ To create a Serp connection: - `azureml.flow.module`: `promptflow.connections` - `api_key`: Your Serp API key. You must select the **is secret** checkbox to keep the API key secure. 
- :::image type="content" source="../../media/prompt-flow/serp-custom-connection-keys.png" alt-text="Screenshot that shows adding extra information to a custom connection in AI Foundry portal." lightbox = "../../media/prompt-flow/serp-custom-connection-keys.png"::: + :::image type="content" source="../../media/prompt-flow/serp-custom-connection-keys.png" alt-text="Screenshot that shows adding extra information to a custom connection in Azure AI Foundry portal." lightbox = "../../media/prompt-flow/serp-custom-connection-keys.png"::: The connection is the model used to establish connections with the Serp API. Get your API key from the Serp API account dashboard. diff --git a/articles/ai-studio/how-to/prompt-flow-troubleshoot.md b/articles/ai-studio/how-to/prompt-flow-troubleshoot.md index 1a2d2ce5c28..b4dbc4912c1 100644 --- a/articles/ai-studio/how-to/prompt-flow-troubleshoot.md +++ b/articles/ai-studio/how-to/prompt-flow-troubleshoot.md @@ -96,7 +96,7 @@ If you regenerate your Azure OpenAI key and manually update the connection used This is because the connections used in the endpoints/deployments won't be automatically updated. Any change for key or secrets in deployments should be done by manual update, which aims to avoid impacting online production deployment due to unintentional offline operation. -- If the endpoint was deployed in the AI Foundry portal, you can just redeploy the flow to the existing endpoint using the same deployment name. +- If the endpoint was deployed in the Azure AI Foundry portal, you can just redeploy the flow to the existing endpoint using the same deployment name. - If the endpoint was deployed using SDK or CLI, you need to make some modification to the deployment definition such as adding a dummy environment variable, and then use `az ml online-deployment update` to update your deployment. 
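The dummy-environment-variable trick described above can be sketched as follows. The helper is illustrative only; the commented SDK calls assume the `azure-ai-ml` package, and the endpoint and deployment names are hypothetical:

```python
# Sketch: force a deployment update so regenerated connection secrets are
# picked up, by adding a throwaway environment variable to the deployment
# definition before updating it.
import time

def with_refresh_marker(env_vars: dict) -> dict:
    """Return a copy of the deployment's environment variables with a
    throwaway marker that makes the update a real change."""
    updated = dict(env_vars)
    updated["SECRET_REFRESH_MARKER"] = str(int(time.time()))
    return updated

# Against a live deployment (not run here) this would look roughly like:
#   deployment = ml_client.online_deployments.get(
#       name="contoso-chatbot-1", endpoint_name="contoso-chatbot")
#   deployment.environment_variables = with_refresh_marker(
#       deployment.environment_variables or {})
#   ml_client.online_deployments.begin_create_or_update(deployment).result()
```

This is the programmatic equivalent of editing the deployment definition and then running `az ml online-deployment update`.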
### Vulnerability issues in prompt flow deployments diff --git a/articles/ai-studio/how-to/secure-data-playground.md b/articles/ai-studio/how-to/secure-data-playground.md index 006be315708..0779c1e0b97 100644 --- a/articles/ai-studio/how-to/secure-data-playground.md +++ b/articles/ai-studio/how-to/secure-data-playground.md @@ -18,7 +18,7 @@ zone_pivot_groups: azure-ai-studio-sdk-cli Use this article to learn how to securely use Azure AI Foundry's playground chat on your data. The following sections provide our recommended configuration to protect your data and resources by using Microsoft Entra ID role-based access control, a managed network, and private endpoints. We recommend disabling public network access for Azure OpenAI resources, Azure AI Search resources, and storage accounts. Using selected networks with IP rules isn't supported because the services' IP addresses are dynamic. > [!NOTE] -> AI Foundry's managed virtual network settings apply only to AI Foundry's managed compute resources, not platform as a service (PaaS) services like Azure OpenAI or Azure AI Search. When using PaaS services, there is no data exfiltration risk because the services are managed by Microsoft. +> Azure AI Foundry's managed virtual network settings apply only to Azure AI Foundry's managed compute resources, not platform as a service (PaaS) services like Azure OpenAI or Azure AI Search. When using PaaS services, there is no data exfiltration risk because the services are managed by Microsoft. The following table summarizes the changes made in this article: @@ -31,13 +31,13 @@ The following table summarizes the changes made in this article: ## Prerequisites -Ensure that the AI Foundry hub is deployed with the __Identity-based access__ setting for the Storage account. This configuration is required for the correct access control and security of your AI Foundry Hub. 
You can verify this configuration using one of the following methods: +Ensure that the Azure AI Foundry hub is deployed with the __Identity-based access__ setting for the Storage account. This configuration is required for the correct access control and security of your Azure AI Foundry Hub. You can verify this configuration using one of the following methods: - In the Azure portal, select the hub and then select __Settings__, __Properties__, and __Options__. At the bottom of the page, verify that __Storage account access type__ is set to __Identity-based access__. - If deploying using Azure Resource Manager or Bicep templates, include the `systemDatastoresAuthMode: 'identity'` property in your deployment template. - You must be familiar with using Microsoft Entra ID role-based access control to assign roles to resources and users. For more information, visit the [Role-based access control](/azure/role-based-access-control/overview) article. -## Configure Network Isolated AI Foundry Hub +## Configure Network Isolated Azure AI Foundry Hub If you're __creating a new Azure AI Foundry hub__, use one of the following documents to create a hub with network isolation: @@ -214,9 +214,9 @@ For more information on assigning roles, see [Tutorial: Grant a user access to r | Azure Storage Account | Storage File Data Privileged Contributor | Developer's Microsoft Entra ID | Needed to Access File Share in Storage for Promptflow data. | | The resource group or Azure subscription where the developer need to deploy the web app to | Contributor | Developer's Microsoft Entra ID | Deploy web app to the developer's Azure subscription. | -## Use your data in AI Foundry portal +## Use your data in Azure AI Foundry portal -Now, the data you add to AI Foundry is secured to the isolated network provided by your Azure AI Foundry hub and project. For an example of using data, visit the [build a question and answer copilot](../tutorials/deploy-copilot-ai-studio.md) tutorial. 
+Now, the data you add to Azure AI Foundry is secured to the isolated network provided by your Azure AI Foundry hub and project. For an example of using data, visit the [build a question and answer copilot](../tutorials/deploy-copilot-ai-studio.md) tutorial. ## Deploy web apps diff --git a/articles/ai-studio/how-to/troubleshoot-deploy-and-monitor.md b/articles/ai-studio/how-to/troubleshoot-deploy-and-monitor.md index 0015479448a..78bad368d26 100644 --- a/articles/ai-studio/how-to/troubleshoot-deploy-and-monitor.md +++ b/articles/ai-studio/how-to/troubleshoot-deploy-and-monitor.md @@ -59,7 +59,7 @@ To fix this error, take the following steps to manually assign the ML Data scien 1. Select your endpoint's name. 1. Select **Select**. 1. Select **Review + Assign**. -1. Return to your project in AI Foundry portal and select **Deployments** from the left navigation menu. +1. Return to your project in Azure AI Foundry portal and select **Deployments** from the left navigation menu. 1. Select your deployment. 1. Test the prompt flow deployment. diff --git a/articles/ai-studio/how-to/troubleshoot-secure-connection-project.md b/articles/ai-studio/how-to/troubleshoot-secure-connection-project.md index 2c191f27f76..da6e0e5094b 100644 --- a/articles/ai-studio/how-to/troubleshoot-secure-connection-project.md +++ b/articles/ai-studio/how-to/troubleshoot-secure-connection-project.md @@ -131,6 +131,6 @@ Try the following steps to troubleshoot: 1. In Azure Portal, check the network settings of the storage account that is associated to your hub. * If public network access is set to __Enabled from selected virtual networks and IP addresses__, ensure the correct IP address ranges are added to access your storage account. * If public network access is set to __Disabled__, ensure you have a private endpoint configured from your Azure virtual network to your storage account with Target sub-resource as blob. 
In addition, you must grant the [Reader](/azure/role-based-access-control/built-in-roles#reader) role for the storage account private endpoint to the managed identity. -2. In Azure Portal, navigate to your AI Foundry hub. Ensure the managed virtual network is provisioned and the outbound private endpoint to blob storage is Active. For more on provisioning the managed virtual network, see [How to configure a managed network for Azure AI Foundry hubs](configure-managed-network.md). -3. Navigate to AI Foundry > your project > project settings. +2. In Azure Portal, navigate to your Azure AI Foundry hub. Ensure the managed virtual network is provisioned and the outbound private endpoint to blob storage is Active. For more on provisioning the managed virtual network, see [How to configure a managed network for Azure AI Foundry hubs](configure-managed-network.md). +3. Navigate to Azure AI Foundry > your project > project settings. 4. Refresh the page. A number of connections should be created including 'workspaceblobstore'. diff --git a/articles/ai-studio/how-to/use-blocklists.md b/articles/ai-studio/how-to/use-blocklists.md index e283aea0aa6..eb7f9fb5b2b 100644 --- a/articles/ai-studio/how-to/use-blocklists.md +++ b/articles/ai-studio/how-to/use-blocklists.md @@ -1,5 +1,5 @@ --- -title: Use blocklists in AI Foundry portal +title: Use blocklists in Azure AI Foundry portal titleSuffix: Azure AI Foundry description: Learn how to create custom blocklists in Azure AI Foundry portal as part of your content filtering configurations. 
manager: nitinme diff --git a/articles/ai-studio/includes/create-env-file-tutorial.md b/articles/ai-studio/includes/create-env-file-tutorial.md index 405a60dcd2a..c1cfb7864e1 100644 --- a/articles/ai-studio/includes/create-env-file-tutorial.md +++ b/articles/ai-studio/includes/create-env-file-tutorial.md @@ -23,7 +23,7 @@ CHAT_MODEL="gpt-4o-mini" EVALUATION_MODEL="gpt-4o-mini" ``` -* Find your connection string in the Azure AI Foundry project you created in the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. Copy the connection string and paste it into the `.env` file. +* Find your connection string in the Azure AI Foundry project you created in the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. Copy the connection string and paste it into the `.env` file. :::image type="content" source="../media/quickstarts/azure-ai-sdk/connection-string.png" alt-text="Screenshot shows the overview page of a project and the location of the connection string."::: diff --git a/articles/ai-studio/includes/create-env-file.md b/articles/ai-studio/includes/create-env-file.md index 5ba936435fa..8f37956a94e 100644 --- a/articles/ai-studio/includes/create-env-file.md +++ b/articles/ai-studio/includes/create-env-file.md @@ -18,7 +18,7 @@ Create a `.env` file, and paste the following code: PROJECT_CONNECTION_STRING= ``` -You find your connection string in the Azure AI Foundry project you created in the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. Copy the connection string and paste it into the `.env` file. 
+You find your connection string in the Azure AI Foundry project you created in the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. Copy the connection string and paste it into the `.env` file. :::image type="content" source="../media/quickstarts/azure-ai-sdk/connection-string.png" alt-text="Screenshot shows the overview page of a project and the location of the connection string."::: diff --git a/articles/ai-studio/index.yml b/articles/ai-studio/index.yml index 9b773c7f127..c8f5952bcde 100644 --- a/articles/ai-studio/index.yml +++ b/articles/ai-studio/index.yml @@ -66,7 +66,7 @@ landingContent: links: - text: Get started with the Azure AI SDKs url: how-to/develop/sdk-overview.md - - text: Work with AI Foundry projects in VS Code + - text: Work with Azure AI Foundry projects in VS Code url: how-to/develop/vscode.md - linkListType: tutorial diff --git a/articles/ai-studio/quickstarts/get-started-code.md b/articles/ai-studio/quickstarts/get-started-code.md index dc9126c4131..3ad5c0edbb1 100644 --- a/articles/ai-studio/quickstarts/get-started-code.md +++ b/articles/ai-studio/quickstarts/get-started-code.md @@ -20,7 +20,7 @@ In this quickstart, we walk you through setting up your local development enviro ## Prerequisites -* Before you can follow this quickstart, complete the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md) to deploy a **gpt-4o-mini** model into a project. +* Before you can follow this quickstart, complete the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md) to deploy a **gpt-4o-mini** model into a project. ## Install the Azure CLI and sign in @@ -48,7 +48,7 @@ Create a file named **chat.py**. Copy and paste the following code into it. Your project connection string is required to call the Azure OpenAI service from your code. 
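For context, the project connection string is a single semicolon-delimited value. A sketch of splitting it into its parts, assuming the `<host>;<subscription_id>;<resource_group>;<project_name>` layout (verify against the string you copied from the **Overview** page):

```python
# Sketch: split a project connection string into its component parts.
# Assumes the semicolon-delimited layout noted above.
def parse_project_connection_string(conn_str: str) -> dict:
    host, subscription_id, resource_group, project_name = conn_str.strip().split(";")
    return {
        "host": host,
        "subscription_id": subscription_id,
        "resource_group": resource_group,
        "project_name": project_name,
    }

# Example with placeholder values:
parts = parse_project_connection_string(
    "eastus2.api.azureml.ms;00000000-0000-0000-0000-000000000000;my-rg;my-project"
)
print(parts["project_name"])  # my-project
```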
-Find your connection string in the Azure AI Foundry project you created in the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. +Find your connection string in the Azure AI Foundry project you created in the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md). Open the project, then find the connection string on the **Overview** page. :::image type="content" source="../media/quickstarts/azure-ai-sdk/connection-string.png" alt-text="Screenshot shows the overview page of a project and the location of the connection string."::: diff --git a/articles/ai-studio/quickstarts/hear-speak-playground.md b/articles/ai-studio/quickstarts/hear-speak-playground.md index 2aa958190cd..4ae96ce5f13 100644 --- a/articles/ai-studio/quickstarts/hear-speak-playground.md +++ b/articles/ai-studio/quickstarts/hear-speak-playground.md @@ -15,7 +15,7 @@ ms.author: eur author: eric-urban --- -# Quickstart: Hear and speak with chat models in the AI Foundry portal chat playground +# Quickstart: Hear and speak with chat models in the Azure AI Foundry portal chat playground In the chat playground in Azure AI Foundry portal, you can use speech to text and text to speech features to interact with chat models. You can try the same model that you use for text-based chat in a speech-based chat. It's just another way to interact with the model. @@ -24,12 +24,12 @@ In this quickstart, you use Azure OpenAI Service and Azure AI Speech to: - Speak to the assistant via speech to text. - Hear the assistant's response via text to speech. -The speech to text and text to speech features can be used together or separately in the AI Foundry portal chat playground. You can use the playground to test your chat model before deploying it. +The speech to text and text to speech features can be used together or separately in the Azure AI Foundry portal chat playground. 
You can use the playground to test your chat model before deploying it. ## Prerequisites - An Azure subscription - Create one for free. -- An [AI Foundry project](../how-to/create-projects.md). +- An [Azure AI Foundry project](../how-to/create-projects.md). - A deployed [Azure OpenAI](../how-to/deploy-models-openai.md) chat model. This guide is tested with a `gpt-4o-mini` model. ## Configure the chat playground @@ -37,7 +37,7 @@ The speech to text and text to speech features can be used together or separatel Before you can start a chat session, you need to configure the chat playground to use the speech to text and text to speech features. 1. Sign in to [Azure AI Foundry](https://ai.azure.com). -1. Go to your AI Foundry project. If you need to create a project, see [Create an AI Foundry project](../how-to/create-projects.md). +1. Go to your Azure AI Foundry project. If you need to create a project, see [Create an Azure AI Foundry project](../how-to/create-projects.md). 1. Select **Playgrounds** from the left pane and then select a playground to use. In this example, select **Try the chat playground**. 1. Select your deployed chat model from the **Deployment** dropdown. diff --git a/articles/ai-studio/quickstarts/multimodal-vision.md b/articles/ai-studio/quickstarts/multimodal-vision.md index 06e03ecdb79..8ccf2cfeec8 100644 --- a/articles/ai-studio/quickstarts/multimodal-vision.md +++ b/articles/ai-studio/quickstarts/multimodal-vision.md @@ -31,7 +31,7 @@ Extra usage fees might apply when using GPT-4 Turbo with Vision and Azure AI Vis - An Azure subscription - Create one for free. - Once you have your Azure subscription, create an Azure OpenAI resource . -- An [AI Foundry hub](../how-to/create-azure-ai-resource.md) with your Azure OpenAI resource added as a connection. +- An [Azure AI Foundry hub](../how-to/create-azure-ai-resource.md) with your Azure OpenAI resource added as a connection. 
## Prepare your media diff --git a/articles/ai-studio/reference/region-support.md b/articles/ai-studio/reference/region-support.md index 16d42c47113..21bcc87a858 100644 --- a/articles/ai-studio/reference/region-support.md +++ b/articles/ai-studio/reference/region-support.md @@ -53,7 +53,7 @@ Azure AI Foundry is currently not available in Azure Government regions or air-g For information on the availability of Azure OpenAI models, see [Azure OpenAI Model summary table and region availability](../../ai-services/openai/concepts/models.md#model-summary-table-and-region-availability). > [!NOTE] -> Some models might not be available within the AI Foundry model catalog. +> Some models might not be available within the Azure AI Foundry model catalog. For more information, see [Azure OpenAI quotas and limits](/azure/ai-services/openai/quotas-limits). diff --git a/articles/ai-studio/tutorials/copilot-sdk-create-resources.md b/articles/ai-studio/tutorials/copilot-sdk-create-resources.md index cb7695c23da..77fcf3bfa48 100644 --- a/articles/ai-studio/tutorials/copilot-sdk-create-resources.md +++ b/articles/ai-studio/tutorials/copilot-sdk-create-resources.md @@ -55,7 +55,7 @@ To create a project in [Azure AI Foundry](https://ai.azure.com), follow these st You need two models to build a RAG-based chat app: an Azure OpenAI chat model (`gpt-4o-mini`) and an Azure OpenAI embedding model (`text-embedding-ada-002`). Deploy these models in your Azure AI Foundry project, using this set of steps for each model. -These steps deploy a model to a real-time endpoint from the AI Foundry portal [model catalog](../how-to/model-catalog-overview.md): +These steps deploy a model to a real-time endpoint from the Azure AI Foundry portal [model catalog](../how-to/model-catalog-overview.md): 1. On the left navigation pane, select **Model catalog**. 1. Select the **gpt-4o-mini** model from the list of models. You can use the search bar to find it. 
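Because the RAG tutorial needs both deployments in place before you continue, a small sanity check can help. The deployment names below are the tutorial's defaults (adjust if you renamed them), and the commented listing step is a rough sketch, not a confirmed API:

```python
# Sketch: verify that both model deployments the tutorial relies on exist.
REQUIRED_DEPLOYMENTS = {"gpt-4o-mini", "text-embedding-ada-002"}

def missing_deployments(existing: set) -> set:
    """Return the required deployments that haven't been created yet."""
    return REQUIRED_DEPLOYMENTS - set(existing)

# Against a live project you would gather `existing` from the deployment
# names shown on the project's Models + endpoints page, then:
print(missing_deployments({"gpt-4o-mini"}))  # {'text-embedding-ada-002'}
```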
diff --git a/articles/ai-studio/tutorials/copilot-sdk-evaluate.md b/articles/ai-studio/tutorials/copilot-sdk-evaluate.md index b683a5e5c9e..ce4e6f02659 100644 --- a/articles/ai-studio/tutorials/copilot-sdk-evaluate.md +++ b/articles/ai-studio/tutorials/copilot-sdk-evaluate.md @@ -72,7 +72,7 @@ The script also logs the evaluation results to the cloud project so that you can :::code language="python" source="~/azureai-samples-main/scenarios/rag/custom-rag-app/evaluate.py" id="evaluate_wrapper"::: -1. Finally, add code to run the evaluation, view the results locally, and gives you a link to the evaluation results in AI Foundry portal: +1. Finally, add code to run the evaluation, view the results locally, and get a link to the evaluation results in Azure AI Foundry portal: :::code language="python" source="~/azureai-samples-main/scenarios/rag/custom-rag-app/evaluate.py" id="run_evaluation"::: @@ -139,12 +139,12 @@ If you weren't able to increase the tokens per minute limit for your model, you 12 Sorry, I only can answer queries related to ou... ... 12 [13 rows x 8 columns] -('View evaluation results in AI Foundry portal: ' +('View evaluation results in Azure AI Foundry portal: ' 'https://xxxxxxxxxxxxxxxxxxxxxxx') ``` -### View evaluation results in AI Foundry portal +### View evaluation results in Azure AI Foundry portal Once the evaluation run completes, follow the link to view the evaluation results on the **Evaluation** page in the Azure AI Foundry portal. @@ -154,7 +154,7 @@ You can also look at the individual rows and see metric scores per row, and view :::image type="content" source="../media/tutorials/develop-rag-copilot-sdk/eval-studio-rows.png" alt-text="Screenshot shows rows of evaluation results in Azure AI Foundry portal."::: -For more information about evaluation results in AI Foundry portal, see [How to view evaluation results in AI Foundry portal](../how-to/evaluate-results.md).
+For more information about evaluation results in Azure AI Foundry portal, see [How to view evaluation results in Azure AI Foundry portal](../how-to/evaluate-results.md). ## Iterate and improve diff --git a/articles/ai-studio/tutorials/deploy-chat-web-app.md b/articles/ai-studio/tutorials/deploy-chat-web-app.md index 84972f4c3d3..ea2f3546a6a 100644 --- a/articles/ai-studio/tutorials/deploy-chat-web-app.md +++ b/articles/ai-studio/tutorials/deploy-chat-web-app.md @@ -20,7 +20,7 @@ author: sdgilley [!INCLUDE [feature-preview](../includes/feature-preview.md)] -In this article, you deploy an enterprise chat web app that uses your own data with a large language model in AI Foundry portal. +In this article, you deploy an enterprise chat web app that uses your own data with a large language model in Azure AI Foundry portal. Your data source is used to help ground the model with specific data. Grounding means that the model uses your data to help it understand the context of your question. You're not changing the deployed model itself. Your data is stored separately and securely in your original data source @@ -34,7 +34,7 @@ The steps in this tutorial are: ## Prerequisites - An Azure subscription - Create one for free. -- A [deployed Azure OpenAI](../how-to/deploy-models-openai.md) chat model. Complete the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md) to create this resource if you haven't already. +- A [deployed Azure OpenAI](../how-to/deploy-models-openai.md) chat model. Complete the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md) to create this resource if you haven't already. - An [Azure AI Search service connection](../how-to/connections-add.md#create-a-new-connection) to index the sample product data. 
@@ -44,7 +44,7 @@ The steps in this tutorial are: ## Add your data and try the chat model again -In the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md) (that's a prerequisite for this tutorial), observe how your model responds without your data. Now you add your data to the model to help it answer questions about your products. +In the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md) (that's a prerequisite for this tutorial), observe how your model responds without your data. Now you add your data to the model to help it answer questions about your products. [!INCLUDE [Chat with your data](../includes/chat-with-data.md)] @@ -54,7 +54,7 @@ Once you're satisfied with the experience in Azure AI Foundry portal, you can de ### Find your resource group in the Azure portal -In this tutorial, your web app is deployed to the same resource group as your [AI Foundry hub](../how-to/create-secure-ai-hub.md). Later you configure authentication for the web app in the Azure portal. +In this tutorial, your web app is deployed to the same resource group as your [Azure AI Foundry hub](../how-to/create-secure-ai-hub.md). Later you configure authentication for the web app in the Azure portal. Follow these steps to navigate from Azure AI Foundry to your resource group in the Azure portal: @@ -78,7 +78,7 @@ To deploy the web app: 1. Complete the steps in the previous section to [add your data](#add-your-data-and-try-the-chat-model-again) to the playground. > [!NOTE] - > You can deploy a web app with or without your own data, but at least you need a deployed model as described in the [AI Foundry playground quickstart](../quickstarts/get-started-playground.md). + > You can deploy a web app with or without your own data, but you at least need a deployed model as described in the [Azure AI Foundry playground quickstart](../quickstarts/get-started-playground.md). 1. Select **Deploy > ...as a web app**.
diff --git a/articles/ai-studio/tutorials/screen-reader.md b/articles/ai-studio/tutorials/screen-reader.md index 711823a2c60..14fd28b63a2 100644 --- a/articles/ai-studio/tutorials/screen-reader.md +++ b/articles/ai-studio/tutorials/screen-reader.md @@ -14,7 +14,7 @@ ms.author: sgilley author: sdgilley --- -# QuickStart: Get started using AI Foundry with a screen reader +# QuickStart: Get started using Azure AI Foundry with a screen reader This article is for people who use screen readers such as [Microsoft's Narrator](https://support.microsoft.com/windows/complete-guide-to-narrator-e4397a0d-ef4f-b386-d8ae-c172f109bdb1#WindowsVersion=Windows_11), JAWS, NVDA or Apple's Voiceover. In this quickstart, you'll be introduced to the basic structure of Azure AI Foundry and discover how to navigate around efficiently. diff --git a/articles/ai-studio/what-is-ai-studio.md b/articles/ai-studio/what-is-ai-studio.md index 6ef8fd56344..743c5d467bc 100644 --- a/articles/ai-studio/what-is-ai-studio.md +++ b/articles/ai-studio/what-is-ai-studio.md @@ -26,14 +26,14 @@ ms.custom: ignite-2023, build-2024, ignite-2024 - Explore, build, test, and deploy using cutting-edge AI tools and ML models, grounded in responsible AI practices. - Collaborate with a team for the full life-cycle of application development. -With AI Foundry, you can explore a wide variety of models, services and capabilities, and get to building AI applications that best serve your goals. The Azure AI Foundry platform facilitates scalability for transforming proof of concepts into full-fledged production applications with ease. Continuous monitoring and refinement support long-term success. +With Azure AI Foundry, you can explore a wide variety of models, services and capabilities, and get to building AI applications that best serve your goals. The Azure AI Foundry platform facilitates scalability for transforming proof of concepts into full-fledged production applications with ease. 
Continuous monitoring and refinement support long-term success. :::image type="content" source="./media/explore/ai-studio-home.png" alt-text="Screenshot of the Azure AI Foundry home page with links to get started." lightbox="./media/explore/ai-studio-home.png"::: -When you come to the Azure AI Foundry portal, you find that all paths lead to a project. Projects are easy-to-manage containers for your work—and the key to collaboration, organization, and connecting data and other services. Before you create your first project, you can explore models from many providers, and try out AI services and capabilities. When you're ready to move forward with a model or service, AI Foundry guides you to create a project. Once you are in a project, all of the Azure AI capabilities come to life. +When you come to the Azure AI Foundry portal, you find that all paths lead to a project. Projects are easy-to-manage containers for your work—and the key to collaboration, organization, and connecting data and other services. Before you create your first project, you can explore models from many providers, and try out AI services and capabilities. When you're ready to move forward with a model or service, Azure AI Foundry guides you to create a project. Once you are in a project, all of the Azure AI capabilities come to life. > [!NOTE] -> If you want to focus only on Azure OpenAI models and capabilities, we have a place where you can work with your Azure OpenAI resource instead of a project. For more information, see [What is Azure OpenAI in Azure AI Foundry?](azure-openai-in-ai-studio.md). However, for most situations, we recommend an AI Foundry project to build with a wide range of AI models, functionalities and tools as you build, test, and deploy AI solutions. +> If you want to focus only on Azure OpenAI models and capabilities, we have a place where you can work with your Azure OpenAI resource instead of a project. 
For more information, see [What is Azure OpenAI in Azure AI Foundry?](azure-openai-in-ai-studio.md). However, for most situations, we recommend an Azure AI Foundry project to build with a wide range of AI models, functionalities and tools as you build, test, and deploy AI solutions. ## Work in an Azure AI Foundry project @@ -45,7 +45,7 @@ Once you're in a project, you'll see an overview of what you can do with it on t :::image type="content" source="media/explore/project-view-current.png" alt-text="Screenshot shows the project overview in Azure AI Foundry." lightbox="media/explore/project-view-current.png"::: -The AI Foundry portal is organized around your goals. Generally, as you develop with Azure AI, you'll likely go through a few distinct stages of project development: +The Azure AI Foundry portal is organized around your goals. Generally, as you develop with Azure AI, you'll likely go through a few distinct stages of project development: * **Define and explore**. In this stage you define your project goals, and then explore and test models and services against your use case to find the ones that enable you to achieve your goals. * **Build and customize**. In this stage, you're actively building solutions and applications with the models, tools, and capabilities you selected. You can also customize models to perform better for your use case by fine-tuning, grounding in your data, and more. Building and customizing might be something you choose to do in the Azure AI Foundry portal, or through code and the Azure AI Foundry SDKs. Either way, a project provides you with everything you need. @@ -72,15 +72,15 @@ Azure AI Foundry is monetized through individual products customer access and co The platform is free to use and explore. Pricing occurs at deployment level. -Using AI Foundry also incurs cost associated with the underlying services. To learn more, read [Plan and manage costs for Azure AI services](./how-to/costs-plan-manage.md). 
+Using Azure AI Foundry also incurs costs associated with the underlying services. To learn more, read [Plan and manage costs for Azure AI services](./how-to/costs-plan-manage.md). ## Region availability -AI Foundry is available in most regions where Azure AI services are available. For more information, see [region support for AI Foundry](reference/region-support.md). +Azure AI Foundry is available in most regions where Azure AI services are available. For more information, see [region support for Azure AI Foundry](reference/region-support.md). ## How to get access -You can [explore AI Foundry portal (including the model catalog)](./how-to/model-catalog.md) without signing in. +You can [explore Azure AI Foundry portal (including the model catalog)](./how-to/model-catalog.md) without signing in. But for full functionality there are some requirements: diff --git a/articles/machine-learning/breadcrumb/toc.yml b/articles/machine-learning/breadcrumb/toc.yml index 7f1b448cd41..f46a07cd62f 100644 --- a/articles/machine-learning/breadcrumb/toc.yml +++ b/articles/machine-learning/breadcrumb/toc.yml @@ -107,7 +107,7 @@ items: tocHref: /security/benchmark/azure/ topicHref: /security/benchmark/azure/index -# AI Foundry or Azure ML +# Azure AI Foundry or Azure ML - name: Azure tocHref: /ai/ topicHref: /azure/index diff --git a/articles/machine-learning/concept-hub-workspace.md b/articles/machine-learning/concept-hub-workspace.md index 0343e7b89d7..3c8222ec893 100644 --- a/articles/machine-learning/concept-hub-workspace.md +++ b/articles/machine-learning/concept-hub-workspace.md @@ -33,11 +33,11 @@ In the transition from proving feasibility of an idea, to a funded project, many The goal of hubs is to take away this bottleneck, by letting IT set up a secure, preconfigured, and reusable environment for a team to prototype, build, and operate machine learning models.
-## Interoperability between ML studio and AI Foundry +## Interoperability between ML studio and Azure AI Foundry -Hubs can be used as your team's collaboration environment for both ML studio and [AI Foundry](/azure/ai-studio/what-is-ai-studio). Use ML Studio for training and operationalizing custom machine learning models. Use AI Foundry as experience for building and operating AI applications responsibly. +Hubs can be used as your team's collaboration environment for both ML studio and [Azure AI Foundry](/azure/ai-studio/what-is-ai-studio). Use ML Studio for training and operationalizing custom machine learning models. Use Azure AI Foundry as the experience for building and operating AI applications responsibly. -| Workspace Kind | ML Studio | AI Foundry | +| Workspace Kind | ML Studio | Azure AI Foundry | | --- | --- | --- | | Default | Supported | - | | Hub | Supported | Supported | @@ -54,7 +54,7 @@ Project workspaces that are created using a hub obtain the hub's security settin | Network settings | One [managed virtual network](how-to-managed-network.md) is shared between hub and project workspaces. To access content in the hub and project workspaces, create a single private link endpoint on the hub workspace. | | Encryption settings | Encryption settings pass down from hub to project. | | Storage for encrypted data | When you bring your customer-managed keys for encryption, hub and project workspaces share the same managed resource group for storing encrypted service data. | -| Connections | Project workspaces can consume shared connections created on the hub. This feature is currently only supported in [AI Foundry]() | +| Connections | Project workspaces can consume shared connections created on the hub. This feature is currently only supported in [Azure AI Foundry]() | | Compute instance | Reuse a compute instance across all project workspaces associated to the same hub.
| | Compute quota | Any compute quota consumed by project workspaces is deducted from the hub workspace quota balance. | | Storage | Associated resource for storing workspace data. Project workspaces use designated containers starting with a prefix {workspaceGUID}, and have a conditional [Azure Attribute Based Access](/azure/role-based-access-control/conditions-overview) role assignment for the workspace identity for accessing these containers only. | @@ -69,7 +69,7 @@ Data that is uploaded in one project workspace, is stored in isolation from data Once a hub is created, there are multiple ways to create a project workspace using it: 1. [Using ML Studio](how-to-manage-workspace.md?tabs=mlstudio) -1. [Using AI Foundry](/azure/ai-studio/how-to/create-projects) +1. [Using Azure AI Foundry](/azure/ai-studio/how-to/create-projects) 2. [Using Azure SDK](how-to-manage-workspace.md?tabs=python) 4. [Using automation templates](how-to-create-workspace-template.md) @@ -93,11 +93,11 @@ Features that are supported using hub/project workspaces differ from regular wor | Feature | Default workspace | Hub workspace | Project workspace | Note | |--|--|--|--|--| |Self-serve create project workspaces from Studio| - | X | X | - | -|Create shared connections on hub | |X|X| Only in AI Foundry portal | +|Create shared connections on hub | |X|X| Only in Azure AI Foundry portal | |Consume shared connections from hub | |X|X| - | |Reuse compute instance across workspaces|-|X|X| | |Share compute quota across workspaces|-|X|X|| -|Build GenAI apps in AI Foundry portal|-|X|X|| +|Build GenAI apps in Azure AI Foundry portal|-|X|X|| |Single private link endpoint across workspaces|-|X|X|| |Managed virtual network|X|X|X|-| |BYO virtual network|X|-|-|Use alternative [managed virtual network](how-to-managed-network.md)| @@ -115,6 +115,6 @@ To learn more about setting up Azure Machine Learning, see: + [Create and manage a workspace](how-to-manage-workspace.md) + [Get started with Azure Machine 
Learning](quickstart-create-resources.md) -To learn more about hub workspace support in AI Foundry portal, see: +To learn more about hub workspace support in Azure AI Foundry portal, see: + [How to configure a managed network for hubs](/azure/ai-studio/how-to/configure-managed-network) diff --git a/articles/machine-learning/how-to-configure-private-link.md b/articles/machine-learning/how-to-configure-private-link.md index 34b629a2af9..2d871435a3f 100644 --- a/articles/machine-learning/how-to-configure-private-link.md +++ b/articles/machine-learning/how-to-configure-private-link.md @@ -428,7 +428,7 @@ The following table shows the possible configurations for your workspace and man | Workspace
public network access | Managed online endpoint
public network access | Does the workspace
respect the selected IPs? | Does the online endpoint
respect the selected IPs? | | --- | --- | --- | --- | | Disabled | Disabled | No (all public traffic rejected) | No | -| Disabled | Enabled | No (all public traffic rejected) | Yes | +| Disabled | Enabled | No (all public traffic rejected) | Not supported | | Enabled from selected IPs | Disabled | Yes | No | | Enabled from selected IPs | Enabled | Yes | Yes | diff --git a/articles/machine-learning/how-to-deploy-online-endpoint-with-secret-injection.md b/articles/machine-learning/how-to-deploy-online-endpoint-with-secret-injection.md index da332f9c52f..582f3713dc9 100644 --- a/articles/machine-learning/how-to-deploy-online-endpoint-with-secret-injection.md +++ b/articles/machine-learning/how-to-deploy-online-endpoint-with-secret-injection.md @@ -51,7 +51,7 @@ You can choose to store your secrets (such as API keys) using either: You can create workspace connections to use in your deployment. For example, you can create a connection to Microsoft Azure OpenAI Service by using [Workspace Connections - Create REST API](/rest/api/azureml/2023-08-01-preview/workspace-connections/create). -Alternatively, you can create a custom connection by using Azure Machine Learning studio (see [How to create a custom connection for prompt flow](./prompt-flow/tools-reference/python-tool.md#create-a-custom-connection)) or Azure AI Foundry (see [How to create a custom connection in AI Foundry portal](/azure/ai-studio/how-to/connections-add?tabs=custom#create-a-new-connection)). +Alternatively, you can create a custom connection by using Azure Machine Learning studio (see [How to create a custom connection for prompt flow](./prompt-flow/tools-reference/python-tool.md#create-a-custom-connection)) or Azure AI Foundry (see [How to create a custom connection in Azure AI Foundry portal](/azure/ai-studio/how-to/connections-add?tabs=custom#create-a-new-connection)). 1. 
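The workspace/endpoint public network access matrix shown above is easy to get wrong when reasoning about configurations; it can be encoded as a small lookup for sanity checks (a sketch — the string keys are illustrative, not Azure ML SDK enums):

```python
# (workspace PNA, endpoint PNA) -> (workspace respects selected IPs, endpoint respects selected IPs)
# Values mirror the table above.
ACCESS_MATRIX = {
    ("disabled", "disabled"): ("no (all public traffic rejected)", "no"),
    ("disabled", "enabled"): ("no (all public traffic rejected)", "not supported"),
    ("selected_ips", "disabled"): ("yes", "no"),
    ("selected_ips", "enabled"): ("yes", "yes"),
}

def selected_ip_behavior(workspace_pna: str, endpoint_pna: str):
    """Return the (workspace, endpoint) selected-IP behavior for a configuration."""
    return ACCESS_MATRIX[(workspace_pna, endpoint_pna)]
```

For example, `selected_ip_behavior("disabled", "enabled")` surfaces the unsupported combination before you attempt the deployment.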
Create an Azure OpenAI connection: diff --git a/articles/machine-learning/how-to-manage-hub-workspace-portal.md b/articles/machine-learning/how-to-manage-hub-workspace-portal.md index 9d39513900f..6a1690058ab 100644 --- a/articles/machine-learning/how-to-manage-hub-workspace-portal.md +++ b/articles/machine-learning/how-to-manage-hub-workspace-portal.md @@ -47,7 +47,7 @@ Use the following steps to create a hub from the Azure portal: :::image type="content" source="~/reusable-content/ce-skilling/azure/media/ai-studio/resource-create-basics.png" alt-text="Screenshot of the option to set Azure AI hub basic information." lightbox="~/reusable-content/ce-skilling/azure/media/ai-studio/resource-create-basics.png"::: -1. Select an existing **Azure AI services** resource or create a new one. New Azure AI services include multiple API endpoints for Speech, Content Safety and Azure OpenAI. You can also bring an existing Azure OpenAI resource. Optionally, choose an existing **Storage account**, **Key vault**, **Container Registry**, and **Application insights** to host artifacts generated when you use AI Foundry. +1. Select an existing **Azure AI services** resource or create a new one. New Azure AI services include multiple API endpoints for Speech, Content Safety and Azure OpenAI. You can also bring an existing Azure OpenAI resource. Optionally, choose an existing **Storage account**, **Key vault**, **Container Registry**, and **Application insights** to host artifacts generated when you use Azure AI Foundry. > [!TIP] > You can skip selecting Azure AI Services if you plan to only work in Azure Machine Learning studio. Azure AI Services is required for Azure AI Foundry, and provides access to pre-built AI models for use in prompt flow. 
@@ -123,4 +123,4 @@ You can configure your hub for these resources during creation or update after c ## Next steps -Once you have a workspace hub, you can Create a project using [Azure Machine Learning studio](how-to-manage-workspace.md?tabs=mlstudio), [AI Foundry](/azure/ai-studio/how-to/create-projects), [Azure SDK](how-to-manage-workspace.md?tabs=python), or [Using automation templates](how-to-create-workspace-template.md). +Once you have a workspace hub, you can create a project using [Azure Machine Learning studio](how-to-manage-workspace.md?tabs=mlstudio), [Azure AI Foundry](/azure/ai-studio/how-to/create-projects), [Azure SDK](how-to-manage-workspace.md?tabs=python), or [Using automation templates](how-to-create-workspace-template.md). diff --git a/articles/machine-learning/how-to-manage-hub-workspace-template.md b/articles/machine-learning/how-to-manage-hub-workspace-template.md index e9dc976c5ed..4e997090b62 100644 --- a/articles/machine-learning/how-to-manage-hub-workspace-template.md +++ b/articles/machine-learning/how-to-manage-hub-workspace-template.md @@ -16,7 +16,7 @@ ms.date: 02/29/2024 # Create an Azure Machine Learning hub workspace using a Bicep template -Use a [Microsoft Bicep](/azure/azure-resource-manager/bicep/overview) template to create a [hub workspace](concept-hub-workspace.md) for use in ML Studio and [AI Foundry](/azure/ai-studio/what-is-ai-studio). A template makes it easy to create resources as a single, coordinated operation. A Bicep template is a text document that defines the resources that are needed for a deployment. It might also specify deployment parameters. Parameters are used to provide input values when using the template. +Use a [Microsoft Bicep](/azure/azure-resource-manager/bicep/overview) template to create a [hub workspace](concept-hub-workspace.md) for use in ML Studio and [Azure AI Foundry](/azure/ai-studio/what-is-ai-studio). A template makes it easy to create resources as a single, coordinated operation.
A Bicep template is a text document that defines the resources that are needed for a deployment. It might also specify deployment parameters. Parameters are used to provide input values when using the template. The template used in this article can be found at [https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.machinelearningservices/aistudio-basics](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.machinelearningservices/aistudio-basics). Both the source `main.bicep` file and the compiled Azure Resource Manager template (`main.json`) file are available. This template creates the following resources: @@ -26,7 +26,7 @@ The template used in this article can be found at [https://github.com/Azur - Azure Key Vault - Azure Container Registry - Azure Application Insights -- Azure AI services (required for AI Foundry, and may be dropped for Azure Machine Learning use cases) +- Azure AI services (required for Azure AI Foundry, and may be dropped for Azure Machine Learning use cases) ## Prerequisites diff --git a/articles/machine-learning/how-to-manage-workspace.md b/articles/machine-learning/how-to-manage-workspace.md index 46904822e9d..1b789f2bbcc 100644 --- a/articles/machine-learning/how-to-manage-workspace.md +++ b/articles/machine-learning/how-to-manage-workspace.md @@ -185,7 +185,7 @@ This class requires an existing virtual network. # [Studio](#tab/studio) -1. To create a workspace with disabled internet connectivity via Studio, you should specify a hub workspace that has public network access disabled. Workspaces created without a hub in AI Foundry portal, have public internet access enabled. A private hub has a 'lock' icon. +1. To create a workspace with disabled internet connectivity via Studio, you should specify a hub workspace that has public network access disabled. Workspaces created without a hub in Azure AI Foundry portal have public internet access enabled.
A private hub has a 'lock' icon. :::image type="content" source="media/how-to-manage-workspace/studio-private-hub-selection.png" alt-text="Screenshot of the private hub with the 'lock' icon."::: diff --git a/articles/machine-learning/how-to-managed-network.md b/articles/machine-learning/how-to-managed-network.md index 789994057e8..9b94bb8cc97 100644 --- a/articles/machine-learning/how-to-managed-network.md +++ b/articles/machine-learning/how-to-managed-network.md @@ -70,6 +70,8 @@ If you want to use the integrated notebook or create datasets in the default sto Part of Azure Machine Learning studio runs locally in the client's web browser, and communicates directly with the default storage for the workspace. Creating a private endpoint or service endpoint (for the default storage account) in the client's virtual network ensures that the client can communicate with the storage account. +If the workspace's associated Azure storage account has public network access disabled, ensure the private endpoint created in the client virtual network is granted the Reader role to your workspace managed identity. This applies to both blob and file storage private endpoints. The role is not required for the private endpoint created by the managed virtual network. + For more information on creating a private endpoint or service endpoint, see the [Connect privately to a storage account](/azure/storage/common/storage-private-endpoints) and [Service Endpoints](/azure/virtual-network/virtual-network-service-endpoints-overview) articles. ### Secured associated resources @@ -993,25 +995,27 @@ ml_client._workspace_outbound_rules.begin_remove(resource_group, ws_name, rule_n ## List of required rules -> [!TIP] -> These rules are automatically added to the managed VNet.
- __Private endpoints__: * When the isolation mode for the managed virtual network is `Allow internet outbound`, private endpoint outbound rules are automatically created as required rules from the managed virtual network for the workspace and associated resources __with public network access disabled__ (Key Vault, Storage Account, Container Registry, Azure Machine Learning workspace). * When the isolation mode for the managed virtual network is `Allow only approved outbound`, private endpoint outbound rules are automatically created as required rules from the managed virtual network for the workspace and associated resources __regardless of public network access mode for those resources__ (Key Vault, Storage Account, Container Registry, Azure Machine Learning workspace). +* These rules are automatically added to the managed virtual network. + +For Azure Machine Learning to run normally, a set of service tags is required in both managed and custom virtual network setups. Certain required service tags can't be replaced with alternatives. The following table lists each required service tag and its purpose within Azure Machine Learning. + +| Service tag rule | Inbound or Outbound | Purpose | +| ----------- | ----- | ----- | +| `AzureMachineLearning` | Inbound | Create, update, and delete of Azure Machine Learning compute instance/cluster. | +| `AzureMachineLearning`| Outbound | Using Azure Machine Learning services. Python intellisense in notebooks uses port 18881. Creating, updating, and deleting an Azure Machine Learning compute instance uses port 5831. | +| `AzureActiveDirectory` | Outbound | Authentication using Microsoft Entra ID. | +| `BatchNodeManagement.region` | Outbound | Communication with Azure Batch back-end for Azure Machine Learning compute instances/clusters. | +| `AzureResourceManager` | Outbound | Creation of Azure resources with Azure Machine Learning, Azure CLI, and Azure Machine Learning SDK.
| +| `AzureFrontDoor.FirstParty` | Outbound | Access docker images provided by Microsoft. | +| `MicrosoftContainerRegistry` | Outbound | Access docker images provided by Microsoft. Setup of the Azure Machine Learning router for Azure Kubernetes Service. | +| `AzureMonitor` | Outbound | Used to log monitoring and metrics to Azure Monitor. Only needed if you haven't secured Azure Monitor for the workspace. This outbound is also used to log information for support incidents. | +| `VirtualNetwork` | Outbound | Required when private endpoints are present in the virtual network or peered virtual networks. | -__Outbound__ service tag rules: - -* `AzureActiveDirectory` -* `AzureMachineLearning` -* `BatchNodeManagement.region` -* `AzureResourceManager` -* `AzureFrontDoor.FirstParty` -* `MicrosoftContainerRegistry` -* `AzureMonitor` - -__Inbound__ service tag rules: -* `AzureMachineLearning` +> [!NOTE] +> Service tags alone aren't a sufficient security boundary. For tenant-level isolation, use private endpoints when possible. ## List of scenario specific outbound rules @@ -1172,10 +1176,9 @@ The Azure Machine Learning managed virtual network feature is free. However, you ## Limitations -* Azure AI Foundry doesn't support using your own Azure Virtual Network to secure the hub, project, or compute resources. You can only use the managed network feature to secure these resources. * Once you enable managed virtual network isolation of your workspace (either allow internet outbound or allow only approved outbound), you can't disable it. * Managed virtual network uses private endpoint connection to access your private resources. You can't have a private endpoint and a service endpoint at the same time for your Azure resources, such as a storage account. We recommend using private endpoints in all scenarios. -* The managed virtual network is deleted when the workspace is deleted. +* The managed virtual network is deleted when the workspace is deleted.
When deleting Azure Machine Learning resources in your Azure subscription, disable any resource locks that prevent deletion of resources you created, or that Microsoft created for the managed virtual network. * Data exfiltration protection is automatically enabled for the only approved outbound mode. If you add other outbound rules, such as to FQDNs, Microsoft can't guarantee that you're protected from data exfiltration to those outbound destinations. * Creating a compute cluster in a different region than the workspace isn't supported when using a managed virtual network. * Kubernetes and attached VMs aren't supported in an Azure Machine Learning managed virtual network. @@ -1186,6 +1189,7 @@ The Azure Machine Learning managed virtual network feature is free. However, you * Managed network isolation can't establish a private connection from the managed virtual network to a user's on-premises resources. For the list of supported private connections, see [Private Endpoints](/azure/machine-learning/how-to-managed-network?view=azureml-api-2&tabs=azure-cli&preserve-view=true#private-endpoints). * If your managed network is configured to __allow only approved outbound__, you can't use an FQDN rule to access Azure Storage Accounts. You must use a private endpoint instead. +* Be sure to allowlist the Microsoft-managed private endpoints created for the managed virtual network in your custom policy.
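The required service-tag rules in the table above lend themselves to a quick programmatic audit of a proposed rule set (a sketch with illustrative data structures; this is not an Azure SDK or CLI interface):

```python
# Required (service tag, direction) pairs, mirroring the table above.
REQUIRED_RULES = {
    ("AzureMachineLearning", "inbound"),
    ("AzureMachineLearning", "outbound"),
    ("AzureActiveDirectory", "outbound"),
    ("BatchNodeManagement.region", "outbound"),
    ("AzureResourceManager", "outbound"),
    ("AzureFrontDoor.FirstParty", "outbound"),
    ("MicrosoftContainerRegistry", "outbound"),
    ("AzureMonitor", "outbound"),
    ("VirtualNetwork", "outbound"),
}

def missing_required_rules(configured):
    """Return the required (tag, direction) pairs absent from a configured rule set."""
    return REQUIRED_RULES - set(configured)

# Example: a configuration that forgot the AzureMonitor outbound rule.
partial = REQUIRED_RULES - {("AzureMonitor", "outbound")}
```

A check like this can run in CI against the rule set your infrastructure templates declare, catching a missing tag before a deployment fails.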
### Migration of compute resources diff --git a/articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md b/articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md index 932c6941b25..8ae155afd77 100644 --- a/articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md +++ b/articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md @@ -119,7 +119,7 @@ Workspace managed virtual network is the recommended way to support network isol ## Known limitations -- AI Foundry doesn't support bring your own virtual network, it only supports workspace managed virtual network. +- Azure AI Foundry doesn't support bring your own virtual network; it only supports workspace managed virtual network. - Managed online endpoint with selected egress only supports workspace with managed virtual network. If you want to use your own virtual network, you might need one workspace for prompt flow authoring with your virtual network and another workspace for prompt flow deployment using managed online endpoint with workspace managed virtual network. ## Next steps diff --git a/articles/machine-learning/prompt-flow/troubleshoot-guidance.md b/articles/machine-learning/prompt-flow/troubleshoot-guidance.md index 67e3e1684d7..9f8b91e29dc 100644 --- a/articles/machine-learning/prompt-flow/troubleshoot-guidance.md +++ b/articles/machine-learning/prompt-flow/troubleshoot-guidance.md @@ -78,7 +78,7 @@ There are possible reasons for this issue: :::image type="content" source="./media/faq/datastore-with-wrong-account-key.png" alt-text="Screenshot that shows datastore with wrong account key." lightbox = "./media/faq/datastore-with-wrong-account-key.png"::: -- If you're using AI Foundry, the storage account needs to set CORS to allow AI Foundry access the storage account, otherwise, you see the flow missing issue. You can add following CORS settings to the storage account to fix this issue.
+- If you're using Azure AI Foundry, the storage account needs to set CORS to allow Azure AI Foundry to access the storage account; otherwise, you see the flow missing issue. You can add the following CORS settings to the storage account to fix this issue. - Go to storage account page, select `Resource sharing (CORS)` under `settings`, and select to `File service` tab. - Allowed origins: `https://mlworkspace.azure.ai,https://ml.azure.com,https://*.ml.azure.com,https://ai.azure.com,https://*.ai.azure.com,https://mlworkspacecanary.azure.ai,https://mlworkspace.azureml-test.net` - Allowed methods: `DELETE, GET, HEAD, POST, OPTIONS, PUT` diff --git a/articles/machine-learning/toc.yml b/articles/machine-learning/toc.yml index f9aa7675eeb..93566eceace 100644 --- a/articles/machine-learning/toc.yml +++ b/articles/machine-learning/toc.yml @@ -6,7 +6,7 @@ items: - name: What is Azure Machine Learning? displayName: AML, services, overview, introduction href: overview-what-is-azure-machine-learning.md - - name: "AI Foundry or Azure Machine Learning studio: Which should I choose?" + - name: "Azure AI Foundry or Azure Machine Learning studio: Which should I choose?" href: /ai/ai-studio-experiences-overview?toc=/azure/machine-learning/toc.json&bc=/azure/machine-learning/breadcrumb/toc.json - name: Azure Machine Learning CLI and Python SDK href: concept-v2.md diff --git a/articles/search/cognitive-search-skill-azure-openai-embedding.md b/articles/search/cognitive-search-skill-azure-openai-embedding.md index 1d9d87a4c84..a29ec55f9ce 100644 --- a/articles/search/cognitive-search-skill-azure-openai-embedding.md +++ b/articles/search/cognitive-search-skill-azure-openai-embedding.md @@ -20,7 +20,7 @@ The **Azure OpenAI Embedding** skill connects to a deployed embedding model on y Your Azure OpenAI Service must have an associated [custom subdomain](/azure/ai-services/cognitive-services-custom-subdomains).
If the service was created through the Azure portal, this subdomain is automatically generated as part of your service setup. Ensure that your service includes a custom subdomain before using it with the Azure AI Search integration. -Azure OpenAI Service resources (with access to embedding models) that were created in AI Foundry portal aren't supported. Only the Azure OpenAI Service resources created in the Azure portal are compatible with the **Azure OpenAI Embedding** skill integration. +Azure OpenAI Service resources (with access to embedding models) that were created in Azure AI Foundry portal aren't supported. Only the Azure OpenAI Service resources created in the Azure portal are compatible with the **Azure OpenAI Embedding** skill integration. The [Import and vectorize data wizard](search-get-started-portal-import-vectors.md) in the Azure portal uses the **Azure OpenAI Embedding** skill to vectorize content. You can run the wizard and review the generated skillset to see how the wizard builds the skill for embedding models. 
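Reviewer note on the custom-subdomain requirement above: it can be sanity-checked programmatically before wiring a resource into Azure AI Search. The following is a minimal sketch only; the hostname patterns are heuristic assumptions (a custom subdomain embeds a unique resource name, while a regional endpoint uses a shared region host), not an official Azure contract.

```python
import re

# Heuristic patterns (assumptions, not an official schema):
# custom subdomain, e.g. https://hereismyuniquename.cognitiveservices.azure.com
# regional endpoint, e.g. https://eastus.api.cognitive.microsoft.com
CUSTOM_SUBDOMAIN = re.compile(
    r"^https://[a-z0-9][a-z0-9-]*\.(cognitiveservices\.azure\.com|openai\.azure\.com)/?$"
)
REGIONAL_ENDPOINT = re.compile(r"^https://[a-z0-9]+\.api\.cognitive\.microsoft\.com/?$")


def has_custom_subdomain(endpoint: str) -> bool:
    """Return True if the endpoint appears to use a custom subdomain."""
    endpoint = endpoint.strip().lower()
    return bool(CUSTOM_SUBDOMAIN.match(endpoint)) and not REGIONAL_ENDPOINT.match(endpoint)
```

The example endpoint `https://hereismyuniquename.cognitiveservices.azure.com` from the article passes this check; a shared regional endpoint does not.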
diff --git a/articles/search/index.yml b/articles/search/index.yml
index 8058966b05a..912464c29ae 100644
--- a/articles/search/index.yml
+++ b/articles/search/index.yml
@@ -67,7 +67,7 @@ landingContent:
     linkLists:
       - linkListType: how-to-guide
         links:
-          - text: Create a vector index in AI Foundry portal
+          - text: Create a vector index in the Azure AI Foundry portal
            url: /azure/ai-studio/how-to/index-add
          - text: Chat with your data using Azure OpenAI
            url: /azure/ai-services/openai/use-your-data-quickstart
diff --git a/articles/search/search-get-started-portal-import-vectors.md b/articles/search/search-get-started-portal-import-vectors.md
index 5ce07bb69dd..f2d6abe5155 100644
--- a/articles/search/search-get-started-portal-import-vectors.md
+++ b/articles/search/search-get-started-portal-import-vectors.md
@@ -52,7 +52,7 @@ Use an embedding model on an Azure AI platform in the [same region as Azure AI S
 
 If you use the Azure OpenAI Service, the endpoint must have an associated [custom subdomain](/azure/ai-services/cognitive-services-custom-subdomains). A custom subdomain is an endpoint that includes a unique name (for example, `https://hereismyuniquename.cognitiveservices.azure.com`). If the service was created through the Azure portal, this subdomain is automatically generated as part of your service setup. Ensure that your service includes a custom subdomain before using it with the Azure AI Search integration.
 
-Azure OpenAI Service resources (with access to embedding models) that were created in AI Foundry portal aren't supported. Only the Azure OpenAI Service resources created in the Azure portal are compatible with the **Azure OpenAI Embedding** skill integration.
+Azure OpenAI Service resources (with access to embedding models) that were created in the Azure AI Foundry portal aren't supported. Only the Azure OpenAI Service resources created in the Azure portal are compatible with the **Azure OpenAI Embedding** skill integration.
 
 ### Public endpoint requirements
 
@@ -323,7 +323,7 @@ Chunking is built in and nonconfigurable. The effective settings are:
 
 + For Azure OpenAI, choose an existing deployment of text-embedding-ada-002, text-embedding-3-large, or text-embedding-3-small.
 
-+ For AI Foundry catalog, choose an existing deployment of an Azure or Cohere embedding model.
++ For the Azure AI Foundry catalog, choose an existing deployment of an Azure or Cohere embedding model.
 
 + For AI Vision multimodal embeddings, select the account.
 
diff --git a/articles/search/search-import-data-portal.md b/articles/search/search-import-data-portal.md
index b023c87bd6d..688330840fd 100644
--- a/articles/search/search-import-data-portal.md
+++ b/articles/search/search-import-data-portal.md
@@ -72,7 +72,7 @@ Here are some points to keep in mind about the skills in the following list:
 |------|--------------------|----------------------------------|
 | [AI Vision multimodal](cognitive-search-skill-vision-vectorize.md) | ❌ | ✅ |
 | [Azure OpenAI embedding](cognitive-search-skill-azure-openai-embedding.md) | ❌ | ✅ |
-| [Azure Machine Learning (AI Foundry model catalog)](cognitive-search-aml-skill.md) | ❌ | ✅ |
+| [Azure Machine Learning (Azure AI Foundry model catalog)](cognitive-search-aml-skill.md) | ❌ | ✅ |
 | [Document layout](cognitive-search-skill-document-intelligence-layout.md) | ❌ | ✅ |
 | [Entity recognition](cognitive-search-skill-entity-recognition-v3.md) | ✅ | ❌ |
 | [Image analysis (applies to blobs, default parsing, whole file indexing](cognitive-search-skill-image-analysis.md) | ✅ | ❌ |
diff --git a/articles/search/vector-search-integrated-vectorization-ai-studio.md b/articles/search/vector-search-integrated-vectorization-ai-studio.md
index 7a17d0206f2..0410f2cb19a 100644
--- a/articles/search/vector-search-integrated-vectorization-ai-studio.md
+++ b/articles/search/vector-search-integrated-vectorization-ai-studio.md
@@ -1,7 +1,7 @@
 ---
 title: Integrated vectorization with models from Azure AI Foundry
 titleSuffix: Azure AI Search
-description: Learn how to vectorize content during indexing on Azure AI Search with an AI Foundry model.
+description: Learn how to vectorize content during indexing on Azure AI Search with an Azure AI Foundry model.
 author: gmndrg
 ms.author: gimondra
 ms.service: azure-ai-search
@@ -20,7 +20,7 @@ In this article, learn how to access the embedding models in the [Azure AI Found
 
 The workflow includes model deployment steps. The model catalog includes embedding models from Microsoft and other companies. Deploying a model is billable per the billing structure of each provider.
 
-After the model is deployed, you can use it for [integrated vectorization](vector-search-integrated-vectorization.md) during indexing, or with the [AI Foundry vectorizer](vector-search-vectorizer-azure-machine-learning-ai-studio-catalog.md) for queries.
+After the model is deployed, you can use it for [integrated vectorization](vector-search-integrated-vectorization.md) during indexing, or with the [Azure AI Foundry vectorizer](vector-search-vectorizer-azure-machine-learning-ai-studio-catalog.md) for queries.
 
 > [!TIP]
 > Use the [**Import and vectorize data**](search-get-started-portal-import-vectors.md) wizard to generate a skillset that includes an AML skill for deployed embedding models on Azure AI Foundry. AML skill definition for inputs, outputs, and mappings are generated by the wizard, which gives you an easy way to test a model before writing any code.
@@ -65,7 +65,7 @@ For image embeddings:
 
 Optionally, you can change your endpoint to use **Token authentication** instead of **Key authentication**. If you enable token authentication, you only need to copy the URI and model name, but make a note of which region the model is deployed to.
-   :::image type="content" source="media\vector-search-integrated-vectorization-ai-studio\ai-studio-fields-to-copy.png" lightbox="media\vector-search-integrated-vectorization-ai-studio\ai-studio-fields-to-copy.png" alt-text="Screenshot of a deployed endpoint in AI Foundry portal highlighting the fields to copy and save for later.":::
+   :::image type="content" source="media\vector-search-integrated-vectorization-ai-studio\ai-studio-fields-to-copy.png" lightbox="media\vector-search-integrated-vectorization-ai-studio\ai-studio-fields-to-copy.png" alt-text="Screenshot of a deployed endpoint in the Azure AI Foundry portal highlighting the fields to copy and save for later.":::
 
 1. You can now configure a search index and indexer to use the deployed model.
 
@@ -81,7 +81,7 @@ This section describes the AML skill definition and index mappings. It includes