Troubleshooting
This page covers common issues when using AI Gateway. For provider-specific troubleshooting, refer to the relevant provider documentation.
If you receive authentication errors from your AI provider, it usually means AI Gateway did not pass valid credentials upstream. Check the following:
- Verify header placement: Make sure your Cloudflare token is in `cf-aig-authorization`, not `Authorization`. The `Authorization` header is reserved for provider credentials.
- Check your configuration based on endpoint type:
  - Provider-specific endpoints: Confirm your request URL includes the provider path (for example, `/google-vertex-ai/` or `/openai/`). AI Gateway uses this to identify the provider and apply the correct stored credentials.
  - Unified `/compat/chat/completions` endpoint: Confirm your `model` name starts with the provider prefix (for example, `google-vertex-ai/google/gemini-2.5-flash` or `openai/gpt-4o`). AI Gateway uses this prefix to route the request and select the correct stored credentials.
- Verify BYOK key selection: If you have multiple keys configured for a provider, ensure either:
  - You are using the key with the alias `default`, or
  - You include the `cf-aig-byok-alias` header with the correct alias name.
- Verify BYOK configuration: If using BYOK, confirm in the dashboard that your credentials were saved correctly.
For provider-specific authentication issues, refer to the relevant provider's documentation.
For troubleshooting Data Loss Prevention issues such as DLP not triggering or unexpected blocking, refer to DLP troubleshooting.
If requests through AI Gateway fail or return provider errors:
- Check if the upstream provider is experiencing issues
- Consider implementing dynamic routing with fallbacks for transient failures
- Review your rate limiting configuration
- Verify your API key or credentials are valid with the provider directly
- Check the provider's status page for outages
- Review AI Gateway logs for detailed error information
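AI Gateway's dynamic routing can apply fallbacks on the gateway side; the same idea can also be sketched client-side. In this sketch, `send` is a hypothetical stand-in for whatever function actually issues the request, and the retry/backoff parameters are illustrative:

```python
import time

def call_with_fallback(providers, send, retries=2, backoff=0.5):
    """Try each provider in order, retrying transient failures.

    `send(provider)` is assumed to raise on failure and return the
    response on success.
    """
    last_error = None
    for provider in providers:
        for attempt in range(retries):
            try:
                return send(provider)
            except Exception as exc:  # in practice, catch transient errors only
                last_error = exc
                # Exponential backoff between retries of the same provider.
                time.sleep(backoff * (2 ** attempt))
    raise RuntimeError(f"all providers failed: {last_error}")
```

Prefer configuring fallbacks in the gateway itself where possible, so retries are applied consistently across all clients.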
If responses are not being cached as expected:
- Verify caching is enabled for your gateway
- Check that the request method and content type are cacheable
- Streaming responses are not cached by default
- Review your cache TTL settings
- Check if semantic caching is affecting cache behavior
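Cache behavior can also be adjusted per request with headers. A small sketch of building those headers; the header names `cf-aig-cache-ttl` and `cf-aig-skip-cache` are assumptions here, so confirm them against the caching documentation before relying on them:

```python
def cache_headers(ttl_seconds=None, skip=False):
    """Build per-request cache control headers.

    Header names are assumed from AI Gateway's per-request cache
    controls; verify against the caching docs.
    """
    headers = {}
    if ttl_seconds is not None:
        # Override the gateway's default cache TTL for this request.
        headers["cf-aig-cache-ttl"] = str(ttl_seconds)
    if skip:
        # Bypass the cache entirely, e.g. when debugging stale responses.
        headers["cf-aig-skip-cache"] = "true"
    return headers
```

Temporarily skipping the cache on a single request is a quick way to tell whether stale cached responses, rather than the provider, are the source of a problem.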