As your microservices architecture grows, one problem hits hard and fast:
"How the heck do I trace what went wrong across services?"
If you've ever asked yourself that question, then this post is for you.
Let’s fix this with a custom ASP.NET Core middleware that does request-response logging, pushes structured logs directly into Elasticsearch, and lets you search/debug them in Kibana—all in near real-time 🚀.
Picture this setup: a handful of microservices, each handling thousands of API calls per day.
Now imagine trying to debug an issue where a single request flows through 3 services and fails somewhere in the middle.
You’d want:
✅ A single place to search all logs
✅ Ability to filter by endpoint, status code, date, or correlation ID
✅ Dashboards showing error trends and response times
✅ The ability to quickly debug production issues
Why not just sprinkle logs inside Controllers?
Because middleware-based logging wins on every count: it runs once for every request, so there is no duplicated logging code, the log format stays consistent across all endpoints, and your controllers stay focused on business logic.
Here’s what the flow will look like:
Client ➡️ ASP.NET Core Middleware ➡️ Service Logic ➡️ Middleware (on Response) ➡️ Elasticsearch ➡️ Kibana
We’re keeping things lightweight:
✅ Using Elasticsearch Low-Level Client
✅ Skipping Logstash for simplicity
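If you don't already have Elasticsearch and Kibana running locally, a minimal single-node setup can be sketched with Docker Compose. The image versions and disabled security are assumptions for local development only; adjust them to your environment:

```yaml
# docker-compose.yml — local dev only (versions and security settings are assumptions)
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.10
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.10
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```

With this running, Elasticsearch answers on http://localhost:9200 and Kibana on http://localhost:5601.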
Create the project and add the required packages:

```shell
dotnet new webapi -n ApiLoggingWithElasticsearch
cd ApiLoggingWithElasticsearch

dotnet add package Elasticsearch.Net
dotnet add package Newtonsoft.Json
```

These will help serialize our logs and push them to Elasticsearch.
This middleware will:
✅ Read incoming request body
✅ Capture outgoing response body
✅ Measure execution time
✅ Push a structured log to Elasticsearch
```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class RequestResponseLoggingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ElasticsearchService _elasticService;

    public RequestResponseLoggingMiddleware(RequestDelegate next, ElasticsearchService elasticService)
    {
        _next = next;
        _elasticService = elasticService;
    }

    public async Task Invoke(HttpContext context)
    {
        var stopwatch = Stopwatch.StartNew();

        // Capture the request body (buffering lets us rewind it for the next handler)
        var requestBody = await ReadRequestBody(context.Request);

        // Swap the response body stream for one we can read back
        var originalBodyStream = context.Response.Body;
        using var responseBody = new MemoryStream();
        context.Response.Body = responseBody;

        try
        {
            await _next(context); // Proceed with the pipeline
            stopwatch.Stop();

            // Capture the response body
            var responseBodyText = await ReadResponseBody(context.Response);

            // Prepare the structured log entry
            var log = new
            {
                Timestamp = DateTime.UtcNow,
                Path = context.Request.Path.Value,
                Method = context.Request.Method,
                RequestHeaders = context.Request.Headers
                    .ToDictionary(h => h.Key, h => h.Value.ToString()),
                RequestBody = requestBody,
                StatusCode = context.Response.StatusCode,
                ResponseBody = responseBodyText,
                DurationMs = stopwatch.ElapsedMilliseconds
            };

            // Push to Elasticsearch
            await _elasticService.LogToElasticsearchAsync(log);

            // Copy the buffered response back for the client
            await responseBody.CopyToAsync(originalBodyStream);
        }
        finally
        {
            // Always restore the original stream, even if a handler throws
            context.Response.Body = originalBodyStream;
        }
    }

    private static async Task<string> ReadRequestBody(HttpRequest request)
    {
        request.EnableBuffering();
        using var reader = new StreamReader(request.Body, Encoding.UTF8, leaveOpen: true);
        var body = await reader.ReadToEndAsync();
        request.Body.Position = 0; // Rewind so the next handler can read it
        return body;
    }

    private static async Task<string> ReadResponseBody(HttpResponse response)
    {
        response.Body.Seek(0, SeekOrigin.Begin);
        var text = await new StreamReader(response.Body).ReadToEndAsync();
        response.Body.Seek(0, SeekOrigin.Begin); // Rewind before copying to the client
        return text;
    }
}
```

Next, the Elasticsearch service. It connects directly to your local Elasticsearch instance (e.g., http://localhost:9200).
```csharp
using System;
using System.Threading.Tasks;
using Elasticsearch.Net;
using Newtonsoft.Json;

public class ElasticsearchService
{
    private readonly ElasticLowLevelClient _client;
    private const string IndexName = "api-logs";

    public ElasticsearchService(string elasticsearchUrl)
    {
        var settings = new ConnectionConfiguration(new Uri(elasticsearchUrl));
        _client = new ElasticLowLevelClient(settings);
    }

    public async Task LogToElasticsearchAsync(object logEntry)
    {
        var json = JsonConvert.SerializeObject(logEntry);
        var response = await _client.IndexAsync<StringResponse>(IndexName, PostData.String(json));
        Console.WriteLine($"Elasticsearch Log Status: {response.HttpStatusCode}");
    }
}
```

Modify your Program.cs (for .NET 6+ style):
```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();

// Register ElasticsearchService
builder.Services.AddSingleton(new ElasticsearchService("http://localhost:9200"));

var app = builder.Build();

// Plug in our middleware
app.UseMiddleware<RequestResponseLoggingMiddleware>();

app.MapControllers();

app.Run();
```

Run the app:

```shell
dotnet run
```

Then hit an endpoint like:
```shell
curl http://localhost:5000/weatherforecast
```

Open Kibana, create an index pattern for `api-logs`, and your logs become searchable. You can even build custom dashboards showing error rates, average response times, etc.
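You can also query the index directly from Kibana's Dev Tools console. As a sketch, here is a query that finds slow failing requests; the field names match the log object above, and the thresholds (status 500, over 1000 ms) are just example values:

```json
POST /api-logs/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "StatusCode": 500 } },
        { "range": { "DurationMs": { "gte": 1000 } } }
      ]
    }
  }
}
```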
✅ No external logging frameworks needed (No Serilog, NLog, etc.)
✅ Works at middleware level—no code changes needed in Controllers
✅ Searchable, structured JSON logs
✅ Easy to extend with trace IDs, exception handling, etc.
✅ Real-time visibility using Kibana
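As an example of that extensibility, here is a minimal sketch of adding a correlation ID inside the middleware's Invoke method. The header name X-Correlation-Id is an assumption; use whatever convention your services share:

```csharp
// Before calling _next: reuse the caller's correlation ID or mint a new one,
// and echo it back in the response so downstream services can propagate it.
var correlationId = context.Request.Headers.TryGetValue("X-Correlation-Id", out var existing)
    ? existing.ToString()
    : Guid.NewGuid().ToString();
context.Response.Headers["X-Correlation-Id"] = correlationId;

// Then include it in the log object so Kibana can filter by it:
var log = new
{
    Timestamp = DateTime.UtcNow,
    CorrelationId = correlationId,
    Path = context.Request.Path.Value,
    // ... remaining fields as before
};
```

With that in place, one correlation ID ties together every log entry a single request produced across all three services.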
This setup gives you a production-ready, low-maintenance, and extensible foundation for centralized API logging across all your microservices.
If you want, you can find the full source code on GitHub.