Azure Foundry is a developer-focused platform for building intelligent, multi-agent applications using Azure OpenAI and other Azure services. It provides a robust orchestration layer that enables agents, each representing a distinct persona, tool, or task, to collaborate in structured conversations or workflows. Built with extensibility in mind, Azure Foundry supports custom tool integration, dynamic planning, and role-based behavior, making it ideal for autonomous systems, document analysis, and AI-powered task automation. Developers can use familiar tools like C# or Python to define agents and tools, while benefiting from seamless Azure integration, including identity, storage, and observability.
Why Choose Azure Foundry?
Azure Foundry stands out from other AI agent frameworks for several key reasons:
- Enterprise Integration: Unlike standalone frameworks, Azure Foundry is deeply integrated with Azure services, providing built-in support for enterprise features like authentication, monitoring, and compliance.
- Language Flexibility: While many AI agent frameworks are Python-centric, Azure Foundry supports multiple languages, making it particularly appealing for .NET developers who want to stay in their ecosystem.
- Scalability: Built on Azure's infrastructure, it offers automatic scaling and high availability out of the box, which is crucial for production deployments.
- Cost Efficiency: By leveraging Azure's consumption-based pricing model, you only pay for what you use, making it more cost-effective than maintaining your own infrastructure.
- Security: With built-in Azure security features, including private endpoints and managed identities, it's easier to maintain enterprise-grade security standards.
Comparison with Other Frameworks
| Feature | Azure Foundry | LangChain | AutoGPT |
|---|---|---|---|
| Language Support | Multi-language (C#, Python) | Primarily Python | Python |
| Cloud Integration | Native Azure | Manual | Manual |
| Scalability | Built-in | Manual | Manual |
| Security | Enterprise-grade | Basic | Basic |
| Cost Model | Consumption-based | Self-hosted | Self-hosted |
In this post I will show you how to implement agent tools, communication, and orchestration entirely with C#.
Key Concepts
Before diving into implementation, let's understand the core concepts of Azure Foundry and AI agents:
1. Agents
Agents are autonomous units that can:
- Process input and make decisions
- Execute specific tasks or workflows
- Communicate with other agents
- Use tools to accomplish goals
Think of agents as specialized workers, each with a specific role and set of capabilities.
2. Tools
Tools are the capabilities that agents can use to:
- Interact with external systems
- Process data
- Make API calls
- Perform specific functions
In our C# implementation, tools are typically exposed as API endpoints or services.
3. Orchestration
Orchestration is the process of:
- Coordinating multiple agents
- Managing agent communication
- Handling workflow execution
- Ensuring proper sequencing of tasks
Azure Foundry provides built-in orchestration capabilities that we can leverage through C#.
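To make that concrete, here is a minimal sketch of sequential orchestration in plain C#: one tool's output becomes the next tool's input. The /api/summarize route matches the tool we build in Step 1; the /api/review route and the host name are hypothetical placeholders for a second tool.

// A minimal sequential orchestrator: call the summarize tool, then hand its output to
// a second tool. The /api/summarize route matches Step 1; /api/review and the host
// name are hypothetical placeholders.
public class SequentialOrchestrator
{
    private readonly HttpClient _http;

    public SequentialOrchestrator(HttpClient http) => _http = http;

    public async Task<string> RunAsync(string documentText)
    {
        // Tool 1: summarize the raw document
        var summarizeResponse = await _http.PostAsJsonAsync(
            "https://your-tool-host/api/summarize", new { input = documentText });
        summarizeResponse.EnsureSuccessStatusCode();

        var summary = (await summarizeResponse.Content.ReadFromJsonAsync<JsonElement>())
            .GetProperty("summary").GetString();

        // Tool 2: pass the summary on for review (hypothetical endpoint)
        var reviewResponse = await _http.PostAsJsonAsync(
            "https://your-tool-host/api/review", new { input = summary });
        reviewResponse.EnsureSuccessStatusCode();

        return await reviewResponse.Content.ReadAsStringAsync();
    }
}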
4. Memory and State
Agents can maintain:
- Short-term memory for current tasks
- Long-term memory for historical context
- State management for complex workflows
- Persistent storage for important data
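As a small illustration, short-term memory can be as simple as a per-conversation message list. The types below are an in-memory sketch, not a Foundry API; for persistence you would typically back them with Azure Cosmos DB, Table Storage, or Azure Cache for Redis.

// Illustrative short-term memory: one message list per conversation, kept in process.
using System.Collections.Concurrent;

public record ConversationState(string ConversationId, List<string> Messages);

public class InMemoryConversationStore
{
    private readonly ConcurrentDictionary<string, ConversationState> _store = new();

    public ConversationState GetOrCreate(string conversationId) =>
        _store.GetOrAdd(conversationId, id => new ConversationState(id, new List<string>()));

    public void Append(string conversationId, string message) =>
        GetOrCreate(conversationId).Messages.Add(message);
}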
5. Planning and Reasoning
Agents use:
- Dynamic planning to achieve goals
- Reasoning to make decisions
- Context awareness to adapt behavior
- Error handling and recovery strategies
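One common shape for this is a plan-act loop: decide on the next tool, execute it, feed the result back into the context, and repeat. The sketch below shows that control flow in C#. It is illustrative only; the decision step is a plain delegate standing in for a model call, and none of these types come from Azure Foundry.

// Illustrative plan-act loop, not a Foundry API. In practice, the decide step would
// ask the model which registered tool to call next.
public class PlanActLoop
{
    private readonly Dictionary<string, Func<string, Task<string>>> _tools;
    private readonly Func<string, Task<(string ToolName, string ToolInput, bool Done)>> _decideNextStep;

    public PlanActLoop(
        Dictionary<string, Func<string, Task<string>>> tools,
        Func<string, Task<(string ToolName, string ToolInput, bool Done)>> decideNextStep)
    {
        _tools = tools;
        _decideNextStep = decideNextStep;
    }

    public async Task<string> RunAsync(string goal, int maxSteps = 5)
    {
        var context = goal;

        for (var step = 0; step < maxSteps; step++)
        {
            var (toolName, toolInput, done) = await _decideNextStep(context);

            if (done || !_tools.TryGetValue(toolName, out var tool))
                break; // error handling and recovery strategies would hook in here

            // Feed the tool result back into the context for the next planning step
            context += Environment.NewLine + await tool(toolInput);
        }

        return context;
    }
}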
6. Integration Points
Azure Foundry integrates with:
- Azure OpenAI for language models
- Azure Services for infrastructure
- Custom APIs and services
- External systems and databases
Understanding these concepts will help you design more effective agent-based solutions. Now, let's implement these concepts in C#.
Prerequisites
Before you start:
- An Azure subscription with an Azure OpenAI resource deployed
  - GPT-4 or GPT-3.5 Turbo model access
  - Approximate cost: $0.002-0.03 per 1K tokens, depending on the model
  - Region: any region with Azure OpenAI service availability
- .NET 8 SDK installed (version 8.0.100 or later)
- A .NET-compatible IDE, such as Visual Studio or VS Code
- An HTTP-accessible endpoint for your C# agent services (e.g., Azure App Service or Azure Container Apps)
Required NuGet Packages
Add these packages to your ASP.NET Core Web API project:
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Http" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Logging" Version="8.0.0" />
<PackageReference Include="System.Text.Json" Version="8.0.0" />
<PackageReference Include="JsonSchema.Net" Version="5.3.0" />
<PackageReference Include="Polly" Version="8.2.0" />
</ItemGroup>
Required Using Statements
Add these to your C# files:
using System.Net.Http.Json;
using System.Text.Json;
using System.Text.Json.Serialization;
using Json.Schema;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Polly;
What Are AI Agents?
AI Agents in Azure Foundry are intelligent units that:
- Accept input (text, data, etc.)
- Use reasoning to decide what to do
- Call external tools
- Return output or forward it to another agent
With C#, you can expose these tools and orchestrate their interactions.
Step 1: Create an AI Tool Using ASP.NET Core
Here's a basic example of an AI summarizer tool using the Azure OpenAI API and C#.
SummarizeController.cs
[ApiController]
[Route("api/[controller]")]
public class SummarizeController : ControllerBase
{
    private readonly IOpenAIService _openAIService;

    public SummarizeController(IOpenAIService openAIService)
    {
        _openAIService = openAIService;
    }

    [HttpPost]
    public async Task<IActionResult> Post([FromBody] SummarizeRequest request)
    {
        // Reject empty input here rather than letting the service throw and return a 500
        if (string.IsNullOrWhiteSpace(request.Input))
        {
            return BadRequest("Input text is required.");
        }

        var summary = await _openAIService.GetSummaryAsync(request.Input);
        return Ok(new { summary });
    }
}

public class SummarizeRequest
{
    public string Input { get; set; } = string.Empty;
}
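Once the service from Step 2 is wired up and the API is running, a quick way to exercise the endpoint is a plain HttpClient call; the localhost URL and sample text below are placeholders.

// Quick manual test of the summarize endpoint; adjust the URL to wherever the API runs.
using var client = new HttpClient();
var response = await client.PostAsJsonAsync(
    "https://localhost:5001/api/summarize",
    new SummarizeRequest { Input = "Azure Foundry lets specialized agents collaborate on a task..." });

response.EnsureSuccessStatusCode();
Console.WriteLine(await response.Content.ReadAsStringAsync()); // { "summary": "..." }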
Step 2: Add OpenAI Integration
OpenAIService.cs
public interface IOpenAIService
{
Task<string> GetSummaryAsync(string input);
}
public class OpenAIService : IOpenAIService
{
private readonly HttpClient _httpClient;
private readonly IConfiguration _config;
private readonly ILogger<OpenAIService> _logger;
private readonly IAsyncPolicy<HttpResponseMessage> _retryPolicy;
public OpenAIService(
HttpClient httpClient,
IConfiguration config,
ILogger<OpenAIService> logger)
{
_httpClient = httpClient;
_config = config;
_logger = logger;
// Configure retry policy with exponential backoff
_retryPolicy = Policy<HttpResponseMessage>
.Handle<HttpRequestException>()
.Or<TimeoutException>()
.WaitAndRetryAsync(3, retryAttempt =>
TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)));
}
public async Task<string> GetSummaryAsync(string input)
{
if (string.IsNullOrWhiteSpace(input))
{
throw new ArgumentException("Input text cannot be empty", nameof(input));
}
try
{
var body = new
{
messages = new[]
{
new { role = "system", content = "Summarize the following text." },
new { role = "user", content = input }
},
temperature = 0.5
};
var endpoint = _config["AzureOpenAI:Endpoint"]
?? throw new InvalidOperationException("Azure OpenAI endpoint not configured");
var deployment = _config["AzureOpenAI:Deployment"]
?? throw new InvalidOperationException("Azure OpenAI deployment not configured");
var apiKey = _config["AzureOpenAI:ApiKey"]
?? throw new InvalidOperationException("Azure OpenAI API key not configured");
_httpClient.DefaultRequestHeaders.Clear();
_httpClient.DefaultRequestHeaders.Add("api-key", apiKey);
var response = await _retryPolicy.ExecuteAsync(async () =>
{
var result = await _httpClient.PostAsJsonAsync(
$"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version=2024-03-01-preview",
body
);
result.EnsureSuccessStatusCode();
return result;
});
var json = await response.Content.ReadFromJsonAsync<JsonElement>();
if (json.TryGetProperty("choices", out var choices) &&
choices.GetArrayLength() > 0 &&
choices[0].TryGetProperty("message", out var message) &&
message.TryGetProperty("content", out var content))
{
return content.GetString() ?? throw new InvalidOperationException("Empty response from OpenAI");
}
throw new InvalidOperationException("Unexpected response format from OpenAI");
}
catch (HttpRequestException ex)
{
_logger.LogError(ex, "Failed to communicate with Azure OpenAI service");
throw new OpenAIServiceException("Failed to communicate with Azure OpenAI service", ex);
}
catch (JsonException ex)
{
_logger.LogError(ex, "Failed to parse response from Azure OpenAI service");
throw new OpenAIServiceException("Failed to parse response from Azure OpenAI service", ex);
}
catch (Exception ex) when (ex is not OpenAIServiceException)
{
_logger.LogError(ex, "Unexpected error while processing summary request");
throw new OpenAIServiceException("Unexpected error while processing summary request", ex);
}
}
}
public class OpenAIServiceException : Exception
{
public OpenAIServiceException(string message) : base(message) { }
public OpenAIServiceException(string message, Exception innerException)
: base(message, innerException) { }
}
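Before moving on, the controller and service need to be wired into the host. The sketch below shows one minimal Program.cs for Steps 1 and 2; it assumes the AzureOpenAI:Endpoint, AzureOpenAI:Deployment, and AzureOpenAI:ApiKey values read by OpenAIService are available in configuration (appsettings.json, environment variables, or Key Vault).

// Minimal hosting sketch for Steps 1-2: registers the controllers and a typed
// HttpClient for OpenAIService.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddHttpClient<IOpenAIService, OpenAIService>();

var app = builder.Build();
app.MapControllers();
app.Run();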
Step 3: Register and Expose Tools for Agents
To make your C# tools available to Azure Foundry agents, you need to register them in a way that the platform can discover and use them. Here's how to do it:
1. Create a Tool Definition
First, create a class to define your tool's metadata and schema:
public class ToolDefinition
{
    public string Name { get; set; } = string.Empty;
    public string Description { get; set; } = string.Empty;
    public string Endpoint { get; set; } = string.Empty;
    public string Method { get; set; } = "POST";
    public JsonSchema? InputSchema { get; set; }
    public JsonSchema? OutputSchema { get; set; }
}

public class SummarizeToolDefinition : ToolDefinition
{
    public SummarizeToolDefinition(string baseUrl)
    {
        Name = "summarize_tool";
        Description = "Summarizes input text using Azure OpenAI";
        Endpoint = $"{baseUrl}/api/summarize";
        Method = "POST";

        // JsonSchema.Net builds immutable schemas through JsonSchemaBuilder
        InputSchema = new JsonSchemaBuilder()
            .Type(SchemaValueType.Object)
            .Properties(
                ("input", new JsonSchemaBuilder()
                    .Type(SchemaValueType.String)
                    .Description("The text to summarize")
                    .Build()))
            .Required("input")
            .Build();

        OutputSchema = new JsonSchemaBuilder()
            .Type(SchemaValueType.Object)
            .Properties(
                ("summary", new JsonSchemaBuilder()
                    .Type(SchemaValueType.String)
                    .Description("The generated summary")
                    .Build()))
            .Required("summary")
            .Build();
    }
}
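Because JsonSchema.Net schemas serialize with System.Text.Json, you can dump a definition to inspect the payload that will be sent to the registry in the next step (the base URL here is just a placeholder):

// Print the registration payload for inspection before wiring up the registry service.
var definition = new SummarizeToolDefinition("https://localhost:5001");
Console.WriteLine(JsonSerializer.Serialize(
    definition,
    new JsonSerializerOptions { WriteIndented = true }));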
2. Create a Tool Registry Service
Implement a service to manage tool registration:
public interface IToolRegistryService
{
Task RegisterToolAsync(ToolDefinition tool);
Task<IEnumerable<ToolDefinition>> GetRegisteredToolsAsync();
}
public class ToolRegistryService : IToolRegistryService
{
private readonly IConfiguration _config;
private readonly ILogger<ToolRegistryService> _logger;
private readonly HttpClient _httpClient;
public ToolRegistryService(
IConfiguration config,
ILogger<ToolRegistryService> logger,
HttpClient httpClient)
{
_config = config;
_logger = logger;
_httpClient = httpClient;
}
public async Task RegisterToolAsync(ToolDefinition tool)
{
try
{
var foundryEndpoint = _config["AzureFoundry:Endpoint"];
var apiKey = _config["AzureFoundry:ApiKey"];
_httpClient.DefaultRequestHeaders.Clear();
_httpClient.DefaultRequestHeaders.Add("api-key", apiKey);
var response = await _httpClient.PostAsJsonAsync(
$"{foundryEndpoint}/api/tools/register",
tool
);
response.EnsureSuccessStatusCode();
_logger.LogInformation("Successfully registered tool: {ToolName}", tool.Name);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to register tool: {ToolName}", tool.Name);
throw new ToolRegistrationException($"Failed to register tool: {tool.Name}", ex);
}
}
public async Task<IEnumerable<ToolDefinition>> GetRegisteredToolsAsync()
{
try
{
var foundryEndpoint = _config["AzureFoundry:Endpoint"];
var apiKey = _config["AzureFoundry:ApiKey"];
_httpClient.DefaultRequestHeaders.Clear();
_httpClient.DefaultRequestHeaders.Add("api-key", apiKey);
var response = await _httpClient.GetAsync($"{foundryEndpoint}/api/tools");
response.EnsureSuccessStatusCode();
return await response.Content.ReadFromJsonAsync<IEnumerable<ToolDefinition>>()
?? Enumerable.Empty<ToolDefinition>();
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to retrieve registered tools");
throw new ToolRegistrationException("Failed to retrieve registered tools", ex);
}
}
}
public class ToolRegistrationException : Exception
{
public ToolRegistrationException(string message) : base(message) { }
public ToolRegistrationException(string message, Exception innerException)
: base(message, innerException) { }
}
3. Register Tools at Startup
In your Program.cs (or Startup.cs), register your tools when the application starts:
public class Program
{
    public static async Task Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);
        builder.Services.AddControllers();
        builder.Services.AddHttpClient<IOpenAIService, OpenAIService>();
        builder.Services.AddHttpClient<IToolRegistryService, ToolRegistryService>();

        var app = builder.Build();
        app.MapControllers();

        // Register tools with Azure Foundry before accepting traffic
        using (var scope = app.Services.CreateScope())
        {
            var toolRegistry = scope.ServiceProvider.GetRequiredService<IToolRegistryService>();
            var baseUrl = app.Configuration["BaseUrl"]
                ?? throw new InvalidOperationException("BaseUrl not configured");
            await toolRegistry.RegisterToolAsync(new SummarizeToolDefinition(baseUrl));
        }

        await app.RunAsync();
    }
}
4. Configure Azure Foundry Integration
Add the following to your appsettings.json:
{
"AzureFoundry": {
"Endpoint": "https://your-foundry-instance.azurewebsites.net",
"ApiKey": "your-api-key"
}
}
This setup allows you to:
- Define tools with clear input/output schemas
- Register tools with Azure Foundry
- Manage tool lifecycle
- Handle errors and logging
- Scale tool registration across multiple instances
The tools you register will be available to all agents in your Azure Foundry instance, and they can be used in agent workflows and conversations.
Deployment and Security Considerations
Deployment
- Azure App Service
  - Deploy your API to Azure App Service
  - Configure managed identity for secure access
  - Set up application settings in the Azure Portal
- Azure Container Apps
  - Containerize your application
  - Deploy to Azure Container Apps
  - Configure scaling rules
Security Best Practices
- API Security
  - Use managed identities where possible (see the sketch after this list)
  - Store secrets in Azure Key Vault
  - Implement proper authentication
  - Use HTTPS for all endpoints
- OpenAI Security
  - Use private endpoints for Azure OpenAI
  - Implement rate limiting
  - Monitor token usage
  - Set up alerts for unusual activity
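As a concrete example of the managed-identity bullet above, the api-key header from Step 2 can be replaced with an Entra ID token. This sketch assumes you add the Azure.Identity NuGet package and grant the app's identity the Cognitive Services OpenAI User role on the resource:

using Azure.Core;
using Azure.Identity;

// Acquire a token for Azure OpenAI using the app's managed identity (or developer
// credentials when running locally) and send it as a Bearer token instead of api-key.
var credential = new DefaultAzureCredential();
var token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://cognitiveservices.azure.com/.default" }));

var httpClient = new HttpClient();
httpClient.DefaultRequestHeaders.Authorization =
    new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token.Token);
// Then call the chat completions endpoint exactly as in Step 2, minus the api-key header.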
Performance Optimization
- Caching
  - Implement response caching
  - Use Azure Cache for Redis
  - Cache tool definitions
- Scaling
  - Configure auto-scaling rules
  - Use Azure Front Door for global distribution
  - Implement circuit breakers (see the Polly sketch below)
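For the circuit-breaker item, Polly (already in the package list) can wrap the retry policy from Step 2; the thresholds below are illustrative:

// Retry wraps the circuit breaker: each attempt passes through the breaker, and once
// the breaker opens, calls fail fast for 30 seconds instead of hitting the endpoint.
var retryPolicy = Policy<HttpResponseMessage>
    .Handle<HttpRequestException>()
    .WaitAndRetryAsync(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

var circuitBreaker = Policy<HttpResponseMessage>
    .Handle<HttpRequestException>()
    .CircuitBreakerAsync(handledEventsAllowedBeforeBreaking: 5,
                         durationOfBreak: TimeSpan.FromSeconds(30));

var resilient = Policy.WrapAsync(retryPolicy, circuitBreaker);
// Use resilient.ExecuteAsync(...) around the HTTP call wherever Step 2 uses _retryPolicy.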
Troubleshooting
Common issues and solutions:
- Tool Registration Failures
  - Check API key permissions
  - Verify endpoint accessibility
  - Ensure the JSON schema format is correct
- OpenAI Integration Issues
  - Verify the model deployment
  - Check token limits
  - Monitor rate limits
- Performance Issues
  - Check response times
  - Monitor token usage
  - Review scaling configuration
What's Next?
Want to see a full Maker-Checker system in C# using multiple agent APIs and orchestration? Or should we explore using message queues for agent handoff?
Let me know what you'd like to see in part two!