Intermittent Errors with Llama 3.3-70B
Resolved
Issue resolved. Cause: A subset of requests to the Llama 3.3-70B model triggered failures. Impact: Intermittent errors when interacting with the model through serverless inference and/or with agents created using this model. Contact support if issues persist.
Update
Fix deployed. We are monitoring resources related to the Llama 3.3-70B model. Users should no longer experience intermittent errors when making serverless inference requests via APIs and Agents. Awaiting confirmation before closure.
Investigating
We are currently investigating an issue affecting the Llama 3.3-70B model. Symptoms: Users may encounter intermittent errors when making serverless inference requests via APIs and Agents. Current status: Our engineering team is actively working to determine the root cause.