From 71b308bbc8e1f66cd3f48a967263eee4a5db207f Mon Sep 17 00:00:00 2001
From: ChenJieting <40321821+ChenJieting@users.noreply.github.com>
Date: Wed, 22 May 2024 11:20:41 +0800
Subject: [PATCH] update the link for MaaS deployment guidance (#3324)

# Description

Update links related to the cloud documentation updates on several pages.

# All Promptflow Contribution checklist:

- [x] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other significant changes.**
- [x] **I have read the [contribution guidelines](../CONTRIBUTING.md).**
- [ ] **Create an issue and link to the pull request to get dedicated review from promptflow team. Learn more: [suggested workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices

- [x] Title of the pull request is clear and informative.
- [x] There are a small number of commits, each of which has an informative message. This means that previously merged commits do not appear in the history of the PR. For more information on cleaning up the commits in your PR, [see this page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines

- [x] Pull request includes test coverage for the included changes.
---
 docs/concepts/concept-connections.md                   | 2 +-
 docs/integrations/tools/azure-ai-language-tool.md      | 2 +-
 .../integrations/tools/llmlingua-prompt-compression-tool.md | 6 +++---
 docs/reference/tools-reference/llm-tool.md             | 2 +-
 4 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/concepts/concept-connections.md b/docs/concepts/concept-connections.md
index 0fbd8de0068..1ec19bf9cad 100644
--- a/docs/concepts/concept-connections.md
+++ b/docs/concepts/concept-connections.md
@@ -15,7 +15,7 @@ Prompt flow provides a variety of pre-built connections, including Azure Open AI
 | [Open AI](https://openai.com/) | LLM or Python |
 | [Cognitive Search](https://azure.microsoft.com/en-us/products/search) | Vector DB Lookup or Python |
 | [Serp](https://serpapi.com/) | Serp API or Python |
-| [Serverless](https://learn.microsoft.com/en-us/azure/ai-studio/concepts/deployments-overview#deploy-models-with-model-as-a-service) | LLM or Python |
+| [Serverless](https://learn.microsoft.com/en-us/azure/ai-studio/concepts/deployments-overview#deploy-models-with-model-as-a-service-maas) | LLM or Python |
 | Custom | Python |
 
 By leveraging connections in prompt flow, you can easily establish and manage connections to external APIs and data sources, facilitating efficient data exchange and interaction within their AI applications.
diff --git a/docs/integrations/tools/azure-ai-language-tool.md b/docs/integrations/tools/azure-ai-language-tool.md
index 46b08c16e3e..b0a14666fab 100644
--- a/docs/integrations/tools/azure-ai-language-tool.md
+++ b/docs/integrations/tools/azure-ai-language-tool.md
@@ -18,7 +18,7 @@ Azure AI Language enables users with task-oriented and optimized pre-trained or
 ## Requirements
 PyPI package: [`promptflow-azure-ai-language`](https://pypi.org/project/promptflow-azure-ai-language/).
 - For AzureML users:
-  follow this [wiki](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2#prepare-runtime), starting from `Prepare runtime`.
+  follow this [wiki](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2#prepare-compute-session), starting from `Prepare compute session`.
 - For local users:
   ```
   pip install promptflow-azure-ai-language
diff --git a/docs/integrations/tools/llmlingua-prompt-compression-tool.md b/docs/integrations/tools/llmlingua-prompt-compression-tool.md
index 834e0cbef88..ae52df90bca 100644
--- a/docs/integrations/tools/llmlingua-prompt-compression-tool.md
+++ b/docs/integrations/tools/llmlingua-prompt-compression-tool.md
@@ -6,7 +6,7 @@ LLMLingua Prompt Compression tool enables you to speed up large language model's
 ## Requirements
 PyPI package: [`llmlingua-promptflow`](https://pypi.org/project/llmlingua-promptflow/).
 - For Azure users:
-  follow [the wiki for AzureML](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2#prepare-runtime) or [the wiki for AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/prompt-flow-tools/prompt-flow-tools-overview#custom-tools), starting from `Prepare runtime`.
+  follow [the wiki for AzureML](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2#prepare-compute-session) or [the wiki for AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/prompt-flow-tools/prompt-flow-tools-overview#custom-tools) to prepare the compute session.
 - For local users:
   ```
   pip install llmlingua-promptflow
@@ -14,10 +14,10 @@ PyPI package: [`llmlingua-promptflow`](https://pypi.org/project/llmlingua-prompt
 You may also want to install the [Prompt flow for VS Code extension](https://marketplace.visualstudio.com/items?itemName=prompt-flow.prompt-flow).
 
 ## Prerequisite
-Create a MaaS deployment for large language model in Azure model catalog. Take the Llama model as an example, you can learn how to deploy and consume Meta Llama models with model as a service by [the guidance for Azure AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama?tabs=llama-three#deploy-meta-llama-models-with-pay-as-you-go)
+Create a MaaS deployment for large language model in Azure model catalog. Take the Llama model as an example, you can learn how to deploy and consume Meta Llama models with model as a service by [the guidance for Azure AI Studio](https://learn.microsoft.com/azure/ai-studio/how-to/deploy-models-llama?tabs=llama-three#deploy-meta-llama-models-as-a-serverless-api)
 or [the guidance for Azure Machine Learning Studio
-](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-models-llama?view=azureml-api-2&tabs=llama-three#deploy-meta-llama-models-with-pay-as-you-go).
+](https://learn.microsoft.com/azure/machine-learning/how-to-deploy-models-llama?view=azureml-api-2&tabs=llama-three#deploy-meta-llama-models-with-pay-as-you-go).
 
 ## Inputs
diff --git a/docs/reference/tools-reference/llm-tool.md b/docs/reference/tools-reference/llm-tool.md
index 901af9a9a67..e41718cec42 100644
--- a/docs/reference/tools-reference/llm-tool.md
+++ b/docs/reference/tools-reference/llm-tool.md
@@ -25,7 +25,7 @@ Create OpenAI resources, Azure OpenAI resources or MaaS deployment with the LLM
 - **MaaS deployment**
 
-  Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/en-us/azure/ai-studio/concepts/deployments-overview#deploy-models-with-model-as-a-service)
+  Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/en-us/azure/ai-studio/concepts/deployments-overview#deploy-models-with-model-as-a-service-maas)
 
   You can create serverless connection to use this MaaS deployment.