diff --git a/content/code-security/code-scanning/managing-code-scanning-alerts/responsible-use-autofix-code-scanning.md b/content/code-security/code-scanning/managing-code-scanning-alerts/responsible-use-autofix-code-scanning.md
index a0ad7746c262..71615a041417 100644
--- a/content/code-security/code-scanning/managing-code-scanning-alerts/responsible-use-autofix-code-scanning.md
+++ b/content/code-security/code-scanning/managing-code-scanning-alerts/responsible-use-autofix-code-scanning.md
@@ -22,7 +22,7 @@ redirect_from:
 
 {% data reusables.rai.code-scanning.copilot-autofix-note %}
 
-{% data variables.product.prodname_copilot_autofix_short %} generates potential fixes that are relevant to the existing source code and translates the description and location of an alert into code changes that may fix the alert. {% data variables.product.prodname_copilot_autofix_short %} uses internal {% data variables.product.prodname_copilot %} APIs interfacing with the large language model GPT 4o from OpenAI, which has sufficient generative capabilities to produce both suggested fixes in code and explanatory text for those fixes.
+{% data variables.product.prodname_copilot_autofix_short %} generates potential fixes that are relevant to the existing source code and translates the description and location of an alert into code changes that may fix the alert. {% data variables.product.prodname_copilot_autofix_short %} uses internal {% data variables.product.prodname_copilot %} APIs interfacing with the large language model GPT-4o from OpenAI, which has sufficient generative capabilities to produce both suggested fixes in code and explanatory text for those fixes.
 
 {% data variables.product.prodname_copilot_autofix_short %} is allowed by default and enabled for every repository using {% data variables.product.prodname_codeql %}, but you can choose to opt out and disable {% data variables.product.prodname_copilot_autofix_short %}. To learn how to disable {% data variables.product.prodname_copilot_autofix_short %} at the enterprise, organization and repository levels, see [AUTOTITLE](/code-security/code-scanning/managing-code-scanning-alerts/disabling-autofix-for-code-scanning).
 
diff --git a/content/copilot/managing-copilot/managing-copilot-as-an-individual-subscriber/about-github-copilot-free.md b/content/copilot/managing-copilot/managing-copilot-as-an-individual-subscriber/about-github-copilot-free.md
index fd7a4e642bee..680effa33d7e 100644
--- a/content/copilot/managing-copilot/managing-copilot-as-an-individual-subscriber/about-github-copilot-free.md
+++ b/content/copilot/managing-copilot/managing-copilot-as-an-individual-subscriber/about-github-copilot-free.md
@@ -32,7 +32,7 @@ topics:
 * {% data variables.product.prodname_copilot_cli_short %}
 * {% data variables.product.prodname_windows_terminal %}
 * Block suggestions matching public code
-* Access to the {% data variables.copilot.copilot_claude_sonnet %}, {% data variables.copilot.copilot_gemini_flash %} and o3-mini models
+* Access to {% data variables.copilot.copilot_claude_sonnet_35 %}, {% data variables.copilot.copilot_gemini_flash %} and o3-mini models
 * Access to {% data variables.product.prodname_copilot_extensions_short %} in {% data variables.product.prodname_vscode %}, {% data variables.product.prodname_vs %}, JetBrains IDEs, {% data variables.product.prodname_dotcom_the_website %}, and {% data variables.product.prodname_mobile %}
 
 ## What are the limitations of {% data variables.product.prodname_copilot_free_short %}?
diff --git a/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md b/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md
index 3ecf1067c7d5..92a1ebc71a5f 100644
--- a/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md
+++ b/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md
@@ -79,12 +79,12 @@ Some features of {% data variables.product.prodname_copilot_short %} are availab
 
 > [!NOTE] The following models are currently in {% data variables.release-phases.public_preview %} as AI models for {% data variables.product.prodname_copilot %}, and are subject to change. The [AUTOTITLE](/free-pro-team@latest/site-policy/github-terms/github-pre-release-license-terms) apply to your use of these products.
 
-By default, {% data variables.product.prodname_copilot_chat_short %} uses the GPT 4o model. If you grant access to the alternative models, members of your enterprise can choose to use these models rather than the default GPT 4o model. The available alternative models are:
+By default, {% data variables.product.prodname_copilot_chat_short %} uses the GPT-4o model. If you grant access to the alternative models, members of your enterprise can choose to use these models rather than the default GPT-4o model. The available alternative models are:
 
 * **{% data variables.copilot.copilot_claude_sonnet %}**. See [AUTOTITLE](/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot).
 * **{% data variables.copilot.copilot_gemini_flash %}**. See [AUTOTITLE](/copilot/using-github-copilot/ai-models/using-gemini-flash-in-github-copilot).
 * **OpenAI's o1 and o3 models**
-  * **o1**: This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT 4o model. Each member of your enterprise can make 10 requests to this model per day.
+  * **o1**: This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT-4o model. Each member of your enterprise can make 10 requests to this model per day.
   * **o3-mini**: This is the next generation of reasoning models, following from o1 and o1-mini. The o3-mini model outperforms o1 on coding benchmarks with response times that are comparable to o1-mini, providing improved quality at nearly the same latency. It is best suited for code generation and small context operations. Each member of your enterprise can make 50 requests to this model every 12 hours.
 
 ### {% data variables.product.prodname_copilot_short %} Metrics API access
diff --git a/content/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat.md b/content/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat.md
index 71a2f91896e2..6e973ca29fba 100644
--- a/content/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat.md
+++ b/content/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat.md
@@ -8,7 +8,7 @@ topics:
   - Copilot
 ---
 
-By default, {% data variables.product.prodname_copilot_chat_short %} uses OpenAI's GPT 4o large language model. This is a highly proficient model that performs well for text generation tasks, such as summarization and knowledge-based chat. The model is also capable of reasoning, solving complex math problems and coding.
+By default, {% data variables.product.prodname_copilot_chat_short %} uses OpenAI's GPT-4o large language model. This is a highly proficient model that performs well for text generation tasks, such as summarization and knowledge-based chat. The model is also capable of reasoning, solving complex math problems and coding.
 
 However, you are not limited to using this model. You can choose from a selection of other models, each with its own particular strengths. You may have a favorite model that you like to use, or you might prefer to use a particular model for inquiring about a specific subject.
 
@@ -28,7 +28,7 @@ Changing the model that's used by {% data variables.product.prodname_copilot_cha
 
 ### Limitations of AI models for {% data variables.product.prodname_copilot_chat_short %}
 
-* If you want to use the skills listed in the table above{% ifversion ghec %}, or knowledge bases{% endif %}, on the {% data variables.product.github %} website, only the GPT 4o, {% data variables.copilot.copilot_claude_sonnet %}, and {% data variables.copilot.copilot_gemini_flash %} models are supported.
+* If you want to use the skills listed in the table above{% ifversion ghec %}, or knowledge bases{% endif %}, on the {% data variables.product.github %} website, only the GPT-4o, {% data variables.copilot.copilot_claude_sonnet %}, and {% data variables.copilot.copilot_gemini_flash %} models are supported.
 * Experimental pre-release versions of the models may not interact with all filters correctly, including the duplication detection filter.
 
 ## Changing your AI model
diff --git a/content/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot.md b/content/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot.md
index 10d236cfab79..840bad9b2564 100644
--- a/content/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot.md
+++ b/content/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot.md
@@ -1,5 +1,5 @@
 ---
-title: Using Claude 3.5 Sonnet in Copilot Chat
+title: Using Claude Sonnet in Copilot Chat
 allowTitleToDifferFromFilename: true
 shortTitle: 'Use {% data variables.copilot.copilot_claude_sonnet %}'
 intro: 'Learn how to enable {% data variables.copilot.copilot_claude_sonnet %} in {% data variables.product.prodname_copilot_chat %}, for {% ifversion fpt %}yourself or{% endif %} your organization{% ifversion ghec %} or enterprise{% endif %}.'
@@ -11,13 +11,19 @@ redirect_from:
   - /copilot/using-github-copilot/using-claude-sonnet-in-github-copilot
 ---
 
-> [!NOTE] {% data variables.copilot.copilot_claude_sonnet %} is in {% data variables.release-phases.public_preview %} and subject to change. The [AUTOTITLE](/free-pro-team@latest/site-policy/github-terms/github-pre-release-license-terms) apply to your use of this product.
+> [!NOTE] All {% data variables.copilot.copilot_claude_sonnet %} models are in {% data variables.release-phases.public_preview %} and subject to change. The [AUTOTITLE](/free-pro-team@latest/site-policy/github-terms/github-pre-release-license-terms) apply to your use of this product.
 
 ## About {% data variables.copilot.copilot_claude_sonnet %} in {% data variables.product.prodname_copilot_chat %}
 
-{% data variables.copilot.copilot_claude_sonnet %} is a large language model that you can use as an alternative to the default model used by {% data variables.product.prodname_copilot_chat_short %}. {% data variables.copilot.copilot_claude_sonnet %} excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. Learn more about the [model's capabilities](https://www.anthropic.com/claude/sonnet) or read the [model card](https://assets.anthropic.com/m/61e7d27f8c8f5919/original/Claude-3-Model-Card.pdf).
+{% data variables.copilot.copilot_claude_sonnet %} is a family of large language models that you can use as an alternative to the default model used by {% data variables.product.prodname_copilot_chat_short %}. {% data variables.copilot.copilot_claude_sonnet %} excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. Learn more about [Sonnet's capabilities](https://www.anthropic.com/claude/sonnet).
 
-{% data variables.copilot.copilot_claude_sonnet %} is currently available in:
+{% data variables.copilot.copilot_claude_sonnet_37 %} is currently available in:
+
+* {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vscode %}
+* {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vs %} 2022 version 17.13 or later
+* Immersive mode in {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.github %}
+
+{% data variables.copilot.copilot_claude_sonnet_35 %} is currently available in:
 
 * {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vscode %}
 * {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vs %} 2022 version 17.12 or later
@@ -29,15 +35,17 @@ When using {% data variables.copilot.copilot_claude_sonnet %}, input prompts and
 
 ## Configuring access
 
-You must enable access to {% data variables.copilot.copilot_claude_sonnet %} before you can use the model.
+You must enable access to each {% data variables.copilot.copilot_claude_sonnet %} model individually before you can use it.
 
 {% ifversion fpt %}
 
 ### Setup for individual use
 
+> [!NOTE] {% data variables.copilot.copilot_claude_sonnet_37 %} is not currently available for {% data variables.product.prodname_copilot_free_short %}.
+
 If you have a {% data variables.product.prodname_copilot_free_short %} or {% data variables.product.prodname_copilot_pro_short %} subscription, you can enable {% data variables.copilot.copilot_claude_sonnet %} in two ways:
 
-* The first time you choose to use {% data variables.copilot.copilot_claude_sonnet %} with {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vscode %}, or in the immersive view of {% data variables.product.prodname_copilot_chat_short %}, you will be prompted to allow access to the model.
+* The first time you choose to use {% data variables.copilot.copilot_claude_sonnet %} models with {% data variables.product.prodname_copilot_chat_short %} in {% data variables.product.prodname_vscode %}, or in the immersive view of {% data variables.product.prodname_copilot_chat_short %}, you will be prompted to allow access to the model.
 
   Clicking **Allow** enables you to use {% data variables.copilot.copilot_claude_sonnet %} and updates the policy in your personal settings on {% data variables.product.github %}.
 
@@ -47,7 +55,7 @@ If you have a {% data variables.product.prodname_copilot_free_short %} or {% dat
 
 ### Setup for organization {% ifversion ghec %}and enterprise{% endif %} use
 
-As an {% ifversion ghec %}enterprise or{% endif %} organization owner, you can enable or disable {% data variables.copilot.copilot_claude_sonnet %} for everyone who has been assigned a {% ifversion ghec %}{% data variables.product.prodname_copilot_enterprise_short %} or {% endif %}{% data variables.product.prodname_copilot_business_short %} seat through your {% ifversion ghec %}enterprise or {% endif %}organization. See [AUTOTITLE](/copilot/managing-copilot/managing-github-copilot-in-your-organization/setting-policies-for-copilot-in-your-organization/managing-policies-for-copilot-in-your-organization){% ifversion ghec %} and [AUTOTITLE](/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise){% endif %}.
+As an {% ifversion ghec %}enterprise or{% endif %} organization owner, you can enable or disable {% data variables.copilot.copilot_claude_sonnet %} models for everyone who has been assigned a {% ifversion ghec %}{% data variables.product.prodname_copilot_enterprise_short %} or {% endif %}{% data variables.product.prodname_copilot_business_short %} seat through your {% ifversion ghec %}enterprise or {% endif %}organization. See [AUTOTITLE](/copilot/managing-copilot/managing-github-copilot-in-your-organization/setting-policies-for-copilot-in-your-organization/managing-policies-for-copilot-in-your-organization){% ifversion ghec %} and [AUTOTITLE](/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise){% endif %}.
 
 ## Using {% data variables.copilot.copilot_claude_sonnet %}
diff --git a/content/copilot/using-github-copilot/copilot-chat/asking-github-copilot-questions-in-github.md b/content/copilot/using-github-copilot/copilot-chat/asking-github-copilot-questions-in-github.md
index b1c601f52fad..10e72a2077c4 100644
--- a/content/copilot/using-github-copilot/copilot-chat/asking-github-copilot-questions-in-github.md
+++ b/content/copilot/using-github-copilot/copilot-chat/asking-github-copilot-questions-in-github.md
@@ -37,7 +37,7 @@ For example, asking `Generate a simple calculator using HTML, CSS, and JavaScrip
 
 ## Powered by skills
 
-When using the GPT 4o and {% data variables.copilot.copilot_claude_sonnet %} models, {% data variables.product.prodname_copilot_short %} has access to a collection of skills to fetch data from {% data variables.product.github %}, which are dynamically selected based on the question you ask. You can tell which skill {% data variables.product.prodname_copilot_short %} used by clicking {% octicon "chevron-down" aria-label="the down arrow" %} to expand the status information in the chat window.
+When using the GPT-4o and {% data variables.copilot.copilot_claude_sonnet %} models, {% data variables.product.prodname_copilot_short %} has access to a collection of skills to fetch data from {% data variables.product.github %}, which are dynamically selected based on the question you ask. You can tell which skill {% data variables.product.prodname_copilot_short %} used by clicking {% octicon "chevron-down" aria-label="the down arrow" %} to expand the status information in the chat window.
 
 ![Screenshot of the {% data variables.product.prodname_copilot_short %} chat panel with the status information expanded and the skill that was used highlighted with an orange outline.](/assets/images/help/copilot/chat-show-skill.png)
 
diff --git a/data/reusables/copilot/copilot-chat-models-list-visual-studio.md b/data/reusables/copilot/copilot-chat-models-list-visual-studio.md
index 7c81451ca633..8a0c398c40e1 100644
--- a/data/reusables/copilot/copilot-chat-models-list-visual-studio.md
+++ b/data/reusables/copilot/copilot-chat-models-list-visual-studio.md
@@ -1,8 +1,8 @@
 The following models are currently available through multi-model {% data variables.product.prodname_copilot_chat_short %}:
 
-* **GPT 4o:** This is the default {% data variables.product.prodname_copilot_chat_short %} model. It is a versatile, multimodal model that excels in both text and image processing and is designed to provide fast, reliable responses. It also has superior performance in non-English languages. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/gpt-4o) and review the [model card](https://openai.com/index/gpt-4o-system-card/). GPT 4o is hosted on Azure.
+* **GPT-4o:** This is the default {% data variables.product.prodname_copilot_chat_short %} model. It is a versatile, multimodal model that excels in both text and image processing and is designed to provide fast, reliable responses. It also has superior performance in non-English languages. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/gpt-4o) and review the [model card](https://openai.com/index/gpt-4o-system-card/). GPT-4o is hosted on Azure.
 * **{% data variables.copilot.copilot_claude_sonnet %}:** This model excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. Learn more about the [model's capabilities](https://www.anthropic.com/claude/sonnet) or read the [model card](https://assets.anthropic.com/m/61e7d27f8c8f5919/original/Claude-3-Model-Card.pdf). {% data variables.product.prodname_copilot %} uses {% data variables.copilot.copilot_claude_sonnet %} hosted on Amazon Web Services.
-* **o1:** This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT 4o model. You can make 10 requests to this model per day. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/o1) and review the [model card](https://openai.com/index/openai-o1-system-card/). o1 is hosted on Azure.
+* **o1:** This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT-4o model. You can make 10 requests to this model per day. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/o1) and review the [model card](https://openai.com/index/openai-o1-system-card/). o1 is hosted on Azure.
 * **o1-mini:** This is the faster version of the o1 model, balancing the use of complex reasoning with the need for faster responses. It is best suited for code generation and small context operations. You can make 50 requests to this model per day. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/o1) and review the [model card](https://openai.com/index/openai-o1-system-card/). o1-mini is hosted on Azure.
 
 For more information about the o1 models, see [Models](https://platform.openai.com/docs/models/models) in the OpenAI Platform documentation.
diff --git a/data/reusables/copilot/copilot-chat-models-list.md b/data/reusables/copilot/copilot-chat-models-list.md
index 009294d99dd0..d35a4b4a325b 100644
--- a/data/reusables/copilot/copilot-chat-models-list.md
+++ b/data/reusables/copilot/copilot-chat-models-list.md
@@ -1,13 +1,14 @@
 The following models are currently available through multi-model {% data variables.product.prodname_copilot_chat_short %}:
 
-* **GPT 4o:** This is the default {% data variables.product.prodname_copilot_chat_short %} model. It is a versatile, multimodal model that excels in both text and image processing and is designed to provide fast, reliable responses. It also has superior performance in non-English languages. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/gpt-4o) and review the [model card](https://openai.com/index/gpt-4o-system-card/). GPT 4o is hosted on Azure.
-* **{% data variables.copilot.copilot_claude_sonnet %}:** This model excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. Learn more about the [model's capabilities](https://www.anthropic.com/claude/sonnet) or read the [model card](https://assets.anthropic.com/m/61e7d27f8c8f5919/original/Claude-3-Model-Card.pdf). {% data variables.product.prodname_copilot %} uses {% data variables.copilot.copilot_claude_sonnet %} hosted on Amazon Web Services.
+* **GPT-4o:** This is the default {% data variables.product.prodname_copilot_chat_short %} model. It is a versatile, multimodal model that excels in both text and image processing and is designed to provide fast, reliable responses. It also has superior performance in non-English languages. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/gpt-4o) and review the [model card](https://openai.com/index/gpt-4o-system-card/). GPT-4o is hosted on Azure.
+* **{% data variables.copilot.copilot_claude_sonnet_37 %}:** This model, like its predecessor, excels across the software development lifecycle, from initial design to bug fixes, maintenance to optimizations. It also has thinking capabilities, which you can enable by selecting the thinking version of the model and which are particularly useful in agentic scenarios. Learn more about the [model's capabilities](https://www.anthropic.com/claude/sonnet) or read the [model card](https://assets.anthropic.com/m/785e231869ea8b3b/original/claude-3-7-sonnet-system-card.pdf). {% data variables.product.prodname_copilot %} uses {% data variables.copilot.copilot_claude_sonnet %} hosted on Amazon Web Services.
+* **{% data variables.copilot.copilot_claude_sonnet_35 %}:** This model excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. Learn more about the [model's capabilities](https://www.anthropic.com/claude/sonnet) or read the [model card](https://assets.anthropic.com/m/61e7d27f8c8f5919/original/Claude-3-Model-Card.pdf). {% data variables.product.prodname_copilot %} uses {% data variables.copilot.copilot_claude_sonnet %} hosted on Amazon Web Services.
 * **{% data variables.copilot.copilot_gemini_flash %}:** This model has strong coding, math, and reasoning capabilities that makes it well suited to assist with software development. {% data reusables.copilot.gemini-model-info %}
-* **o1:** This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT 4o model. You can make 10 requests to this model per day. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/o1) and review the [model card](https://openai.com/index/openai-o1-system-card/). o1 is hosted on Azure.
+* **o1:** This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the GPT-4o model. You can make 10 requests to this model per day. Learn more about the [model's capabilities](https://platform.openai.com/docs/models/o1) and review the [model card](https://openai.com/index/openai-o1-system-card/). o1 is hosted on Azure.
 * **o3-mini:** This model is the next generation of reasoning models, following from o1 and o1-mini. The o3-mini model outperforms o1 on coding benchmarks with response times that are comparable to o1-mini, providing improved quality at nearly the same latency. It is best suited for code generation and small context operations. You can make 50 requests to this model every 12 hours. Learn more about the [model's capabilities](https://platform.openai.com/docs/models#o3-mini) and review the [model card](https://openai.com/index/o3-mini-system-card/). o3-mini is hosted on Azure.
 
 For more information about these models, see:
 
-* **OpenAI's GPT 4o, o1, and o3-mini models**: [Models](https://platform.openai.com/docs/models/models) in the OpenAI Platform documentation.
+* **OpenAI's GPT-4o, o1, and o3-mini models**: [Models](https://platform.openai.com/docs/models/models) in the OpenAI Platform documentation.
 * **Anthropic's {% data variables.copilot.copilot_claude_sonnet %} model**: [AUTOTITLE](/copilot/using-github-copilot/ai-models/using-claude-sonnet-in-github-copilot).
 * **Google's {% data variables.copilot.copilot_gemini_flash %} model**: [AUTOTITLE](/copilot/using-github-copilot/ai-models/using-gemini-flash-in-github-copilot).
diff --git a/data/variables/copilot.yml b/data/variables/copilot.yml
index ecf771c38bd4..4659857a2c3c 100644
--- a/data/variables/copilot.yml
+++ b/data/variables/copilot.yml
@@ -28,7 +28,10 @@ copilot_code-review: 'GitHub Copilot code review'
 copilot_code-review_short: 'Copilot code review'
 
 ## LLM models for Copilot
-copilot_claude_sonnet: 'Claude 3.5 Sonnet'
+copilot_claude_sonnet: 'Claude Sonnet'
+copilot_claude_sonnet_35: 'Claude Sonnet 3.5'
+copilot_claude_sonnet_37: 'Claude Sonnet 3.7'
+
 copilot_gemini_flash: 'Gemini 2.0 Flash'
 
 ## Next edit suggestions in VS Code
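
As context for reviewers, here is a minimal sketch of how the variables renamed and added in `data/variables/copilot.yml` are consumed by content files through Liquid `{% data %}` tags. The surrounding prose and the rendered output shown in the comments are illustrative assumptions, not lines from this diff; only the variable names and their values come from the change above.

```markdown
<!-- Hypothetical excerpt from a content file, for illustration only. -->
By default, {% data variables.product.prodname_copilot_chat_short %} uses the GPT-4o model,
but you can switch to {% data variables.copilot.copilot_claude_sonnet_37 %} or
{% data variables.copilot.copilot_claude_sonnet_35 %} where they are available.

<!-- With the values defined in data/variables/copilot.yml, this should render roughly as: -->
<!-- "By default, Copilot Chat uses the GPT-4o model, but you can switch to
      Claude Sonnet 3.7 or Claude Sonnet 3.5 where they are available." -->
```

Because the generic `copilot_claude_sonnet` variable now resolves to the family name 'Claude Sonnet', every existing reference to it picks up the new wording automatically, while the new `copilot_claude_sonnet_35` and `copilot_claude_sonnet_37` variables let individual pages name a specific model.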