Degraded Gemini 2.5 Pro experience in Copilot


Incident resolved in 29h50m3s

Resolved

Between October 1st, 2025 at 1 AM UTC and October 2nd, 2025 at 10:33 PM UTC, the Copilot service experienced a degradation of the Gemini 2.5 Pro model due to an issue with our upstream provider. Before 15:53 UTC on October 1st, users experienced higher error rates with large context requests while using Gemini 2.5 Pro. After 15:53 UTC and until 10:33 PM UTC on October 2nd, requests using Gemini 2.5 Pro were restricted to smaller context windows. No other models were impacted. The issue was resolved by a mitigation put in place by our provider. GitHub is collaborating with our provider to enhance communication and improve the ability to reproduce issues, with the aim of reducing resolution time.

October 2, 2025 - 22:33:20 UTC

Investigating

We have confirmed that the fix for the lower token input limit for Gemini 2.5 Pro is in place and are currently testing our previous higher limit to verify that customers will experience no further impact.

October 2, 2025 - 22:26:25 UTC

Investigating

The underlying issue for the lower token limits for Gemini 2.5 Pro has been identified and a fix is in progress. We will update again once we have tested and confirmed that the fix is correct and globally deployed.

October 2, 2025 - 17:13:18 UTC

Investigating

We are continuing to work with our provider to resolve the issue where some Copilot requests using Gemini 2.5 Pro return a bad request error because they exceed the input size limit.

October 2, 2025 - 02:52:08 UTC

Investigating

We are continuing to investigate and test solutions internally while working with our model provider on a deeper investigation into the cause. We will update again when we have identified a mitigation.

October 1, 2025 - 18:16:57 UTC

Investigating

We are testing other internal mitigations so that we can return to the higher maximum input length. We are still working with our upstream model provider to understand the contributing factors for this sudden decrease in input limits.

October 1, 2025 - 17:37:55 UTC

Investigating

We are experiencing a service regression for the Gemini 2.5 Pro model in Copilot Chat, VS Code, and other Copilot products. The maximum input length of Gemini 2.5 Pro prompts has been decreased, so long prompts or large context windows may result in errors. This is due to an issue with an upstream model provider, and we are working with them to resolve the issue. Other models are available and working as expected.

October 1, 2025 - 16:49:16 UTC

Investigating

We are investigating reports of degraded performance for Copilot.

October 1, 2025 - 16:43:17 UTC