Serverless Inference Issue


Incident resolved in 5h34m36s

Resolved

This incident has been resolved.

Apr 06, 2026 18:02:37 UTC

Update

A fix has been implemented and we are monitoring the results.

Apr 06, 2026 15:15:33 UTC

Investigating

Our Engineering team is investigating an issue with Serverless inference.

At this time, users may experience high error rates for open-source models (Llama 3.3 70B).

We apologize for the inconvenience and will share an update once we have more information.

Apr 06, 2026 12:28:01 UTC