AI Gateway workshop
A conceptual introduction to the AI Gateway capabilities in Azure API Management.
Model Context Protocol (MCP)
Use the Model Context Protocol (MCP) in Azure API Management to expose APIs as tools to LLMs, with OAuth 2.0 providing authentication and authorization.
Type: Bicep
OpenAI Agents
Integrate OpenAI Agents with Azure OpenAI models and API-based tools, managed through Azure API Management.
Type: Bicep
AI Agent Service
Integrate Azure AI Agent Service with Azure OpenAI models, Logic Apps, and OpenAPI-based APIs using Azure API Management.
Type: Bicep
Function Calling
Manage OpenAI function calling through Azure API Management, with an Azure Functions API serving as the tool backend.
Type: Bicep
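A minimal Bicep sketch of one way the Azure Functions tool API could sit behind API Management: the function key is kept in a secret named value and injected by a backend, so clients calling the gateway never handle the key. The service name, backend URL, and named-value name are illustrative placeholders rather than the lab's actual values.

```bicep
// Sketch only: front an Azure Functions tool API with API Management.
// All names and the function URL below are hypothetical placeholders.
@secure()
param functionKey string

resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

// Keep the function key out of policy documents by storing it as a secret named value
resource fnKeyNamedValue 'Microsoft.ApiManagement/service/namedValues@2023-05-01-preview' = {
  parent: apim
  name: 'function-app-key'
  properties: {
    displayName: 'function-app-key'
    secret: true
    value: functionKey
  }
}

// Backend that adds the key header on every call to the Function App
resource fnBackend 'Microsoft.ApiManagement/service/backends@2023-05-01-preview' = {
  parent: apim
  name: 'function-tools-backend'
  properties: {
    protocol: 'http'
    url: 'https://my-function-app.azurewebsites.net/api'
    credentials: {
      header: {
        'x-functions-key': [
          '{{function-app-key}}'
        ]
      }
    }
  }
  dependsOn: [
    fnKeyNamedValue
  ]
}
```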
Access Control
Enable authorized access to OpenAI APIs with OAuth 2.0 via an identity provider, managed through Azure API Management.
Type: Bicep
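A minimal sketch of how the OAuth 2.0 check could be expressed as an API Management policy deployed with Bicep, using Microsoft Entra ID as an example identity provider. The service and API names, tenant ID, and audience are placeholders.

```bicep
// Sketch only: require a valid bearer token on an existing API.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance' // hypothetical instance name
}

resource api 'Microsoft.ApiManagement/service/apis@2023-05-01-preview' existing = {
  parent: apim
  name: 'openai-api' // hypothetical API name
}

resource apiPolicy 'Microsoft.ApiManagement/service/apis/policies@2023-05-01-preview' = {
  parent: api
  name: 'policy'
  properties: {
    format: 'rawxml'
    value: '''
<policies>
  <inbound>
    <base />
    <!-- Reject requests that do not carry a valid access token -->
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401">
      <openid-config url="https://login.microsoftonline.com/{tenant-id}/v2.0/.well-known/openid-configuration" />
      <audiences>
        <audience>{client-app-id}</audience>
      </audiences>
    </validate-jwt>
  </inbound>
  <backend><base /></backend>
  <outbound><base /></outbound>
  <on-error><base /></on-error>
</policies>
'''
  }
}
```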
Token Rate Limiting
Control API traffic and optimize resource allocation by enforcing token usage limits with rate-limiting policies in Azure API Management.
Type: Azure Portal
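One way to enforce such a limit is the azure-openai-token-limit policy. The Bicep sketch below attaches it to an existing API; the names and the tokens-per-minute value are illustrative, and in a real deployment this fragment would be merged into the API's single policy document.

```bicep
// Sketch only: cap tokens per minute per subscription on an existing API.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

resource api 'Microsoft.ApiManagement/service/apis@2023-05-01-preview' existing = {
  parent: apim
  name: 'openai-api'
}

resource tokenLimitPolicy 'Microsoft.ApiManagement/service/apis/policies@2023-05-01-preview' = {
  parent: api
  name: 'policy'
  properties: {
    format: 'rawxml'
    value: '''
<policies>
  <inbound>
    <base />
    <!-- Requests over the per-subscription token budget receive a 429 response -->
    <azure-openai-token-limit
        counter-key="@(context.Subscription.Id)"
        tokens-per-minute="500"
        estimate-prompt-tokens="true"
        remaining-tokens-header-name="x-remaining-tokens" />
  </inbound>
  <backend><base /></backend>
  <outbound><base /></outbound>
  <on-error><base /></on-error>
</policies>
'''
  }
}
```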
Analytics & Monitoring
Gain insight into model token consumption and usage patterns with Application Insights and the emit-token-metric policy.
Type: Azure Portal
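Whether configured in the portal or deployed as code, the metric emission itself is a policy. A hedged Bicep sketch follows; it assumes an Application Insights logger and API diagnostics are already configured on the instance, the namespace, dimensions, and names are illustrative, and the fragment would be merged with the API's other policies.

```bicep
// Sketch only: record token counts as custom metrics in Application Insights.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

resource api 'Microsoft.ApiManagement/service/apis@2023-05-01-preview' existing = {
  parent: apim
  name: 'openai-api'
}

resource metricsPolicy 'Microsoft.ApiManagement/service/apis/policies@2023-05-01-preview' = {
  parent: api
  name: 'policy'
  properties: {
    format: 'rawxml'
    value: '''
<policies>
  <inbound>
    <base />
    <!-- Emit prompt, completion, and total token counts per request -->
    <azure-openai-emit-token-metric namespace="openai">
      <dimension name="Subscription ID" />
      <dimension name="API ID" />
    </azure-openai-emit-token-metric>
  </inbound>
  <backend><base /></backend>
  <outbound><base /></outbound>
  <on-error><base /></on-error>
</policies>
'''
  }
}
```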
Semantic Caching
Reduce latency and costs with semantic caching in Azure API Management, returning cached completions based on vector proximity of the prompt and a similarity score threshold.
Type: Azure Portal
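Whether set up in the portal or as code, semantic caching comes down to a lookup/store policy pair. The sketch below assumes an external Redis-compatible cache is attached to the instance and that an embeddings backend named embeddings-backend exists; the threshold and cache duration are illustrative values to tune, and the fragment would be merged with the API's other policies.

```bicep
// Sketch only: semantic cache lookup on the way in, store on the way out.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

resource api 'Microsoft.ApiManagement/service/apis@2023-05-01-preview' existing = {
  parent: apim
  name: 'openai-api'
}

resource semanticCachePolicy 'Microsoft.ApiManagement/service/apis/policies@2023-05-01-preview' = {
  parent: api
  name: 'policy'
  properties: {
    format: 'rawxml'
    value: '''
<policies>
  <inbound>
    <base />
    <!-- Serve a cached completion when an earlier prompt is close enough
         in embedding space to clear the similarity score threshold -->
    <azure-openai-semantic-cache-lookup
        score-threshold="0.8"
        embeddings-backend-id="embeddings-backend"
        embeddings-backend-auth="system-assigned" />
  </inbound>
  <backend><base /></backend>
  <outbound>
    <base />
    <!-- Cache new completions for two minutes -->
    <azure-openai-semantic-cache-store duration="120" />
  </outbound>
  <on-error><base /></on-error>
</policies>
'''
  }
}
```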
Semantic Caching (coming soon)
A Bicep-based version of the semantic caching lab above.
Type: Bicep
Dynamic Failover
Use Azure API Management's built-in load balancing to distribute traffic across Azure OpenAI endpoints or mock servers and fail over automatically when a backend is unavailable.
Type: Azure Portal
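A hedged Bicep sketch of one way to realize this: two Azure OpenAI backends, the primary guarded by a circuit breaker, grouped into a priority-based backend pool that the API's set-backend-service policy would point at. Endpoint URLs, names, thresholds, and durations are placeholders.

```bicep
// Sketch only: circuit breaker + priority-based backend pool for failover.
resource apim 'Microsoft.ApiManagement/service@2023-09-01-preview' existing = {
  name: 'my-apim-instance'
}

// Primary Azure OpenAI backend; its breaker trips after repeated 429s
resource primaryBackend 'Microsoft.ApiManagement/service/backends@2023-09-01-preview' = {
  parent: apim
  name: 'openai-backend-1'
  properties: {
    protocol: 'http'
    url: 'https://my-openai-1.openai.azure.com/openai'
    circuitBreaker: {
      rules: [
        {
          name: 'openAIBreakerRule'
          failureCondition: {
            count: 3
            interval: 'PT1M'
            statusCodeRanges: [
              {
                min: 429
                max: 429
              }
            ]
          }
          tripDuration: 'PT1M'
        }
      ]
    }
  }
}

// Secondary backend used while the primary's breaker is open
resource secondaryBackend 'Microsoft.ApiManagement/service/backends@2023-09-01-preview' = {
  parent: apim
  name: 'openai-backend-2'
  properties: {
    protocol: 'http'
    url: 'https://my-openai-2.openai.azure.com/openai'
  }
}

// Pool the two backends; the lower priority number is preferred
resource backendPool 'Microsoft.ApiManagement/service/backends@2023-09-01-preview' = {
  parent: apim
  name: 'openai-backend-pool'
  properties: {
    type: 'Pool'
    pool: {
      services: [
        {
          id: '/backends/${primaryBackend.name}'
          priority: 1
        }
        {
          id: '/backends/${secondaryBackend.name}'
          priority: 2
        }
      ]
    }
  }
}
```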
FinOps Framework
Control AI costs with Azure API Management and the FinOps Framework, automatically disabling API subscriptions through Azure Monitor alerts and Logic Apps.
Type: Bicep
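A small sketch of the piece the automation acts on: an API Management subscription whose state a Logic App could switch to suspended when an Azure Monitor cost alert fires. The product, names, and scope are placeholders; the alert rule and Logic App themselves are out of scope here.

```bicep
// Sketch only: a product-scoped subscription that the FinOps automation
// (Azure Monitor alert -> Logic App) would suspend when costs exceed budget.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

resource product 'Microsoft.ApiManagement/service/products@2023-05-01-preview' existing = {
  parent: apim
  name: 'openai-product' // hypothetical product grouping the AI APIs
}

resource aiSubscription 'Microsoft.ApiManagement/service/subscriptions@2023-05-01-preview' = {
  parent: apim
  name: 'finops-subscription'
  properties: {
    displayName: 'FinOps-controlled subscription'
    scope: '/products/${product.name}'
    state: 'active' // the automation flips this to 'suspended'
  }
}
```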
SLM Self-hosting
Serve a self-hosted Phi-3 small language model (SLM) through Azure API Management's self-hosted gateway, exposed with an OpenAI-compatible API.
Type: Bicep
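A minimal Bicep sketch of the gateway side: register a self-hosted gateway in API Management and associate an already imported, OpenAI-compatible API with it; the gateway container itself would run next to the Phi-3 endpoint. Names and location data are placeholders.

```bicep
// Sketch only: register a self-hosted gateway and attach an API to it.
resource apim 'Microsoft.ApiManagement/service@2023-05-01-preview' existing = {
  name: 'my-apim-instance'
}

resource selfHostedGateway 'Microsoft.ApiManagement/service/gateways@2023-05-01-preview' = {
  parent: apim
  name: 'self-hosted-gateway'
  properties: {
    description: 'Gateway fronting the self-hosted Phi-3 SLM'
    locationData: {
      name: 'on-premises-cluster' // hypothetical location label
    }
  }
}

// Expose the (already imported) OpenAI-compatible API through that gateway
resource gatewayApi 'Microsoft.ApiManagement/service/gateways/apis@2023-05-01-preview' = {
  parent: selfHostedGateway
  name: 'phi-3-api' // hypothetical API name
  properties: {
    provisioningState: 'created'
  }
}
```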