Application logging to Azure using Serilog

I'm in the process of creating a cloud-based application that needs to scale well and I'm thinking about error management and logging. There will be a follow up post about the application itself, but for now I want to focus on the logging bit.

In my quest to find the right tool, I remembered reading about Serilog some time ago. I've been meaning to try Serilog but I had to find the right project/opportunity. And now I finally got it!

The thing I like about Serilog is that it’s extremely easy to set up, especially with its fluent API. It comes with a large number of sinks, so you're spoilt for choice when deciding where to output or store your logs. A sink is an output adapter that adheres to Serilog's common API and lets you send the generated logs to a variety of destinations: the command window, a text file, Azure Table Storage and so on. You get the point. If you’re curious about the full list of available sinks, have a look here.

For this proof of concept I created a very simple console application that will send all log entries to an Azure Table Storage. Since I don’t want to mess around with my live Azure account (at this stage I’m still testing things in development), I’ll use the local Azure Storage Emulator which is installed on my machine.

1. Install the Azure SDK

If you don't have the Azure SDK installed, you'll need to grab it from the official downloads page. This will bring down the necessary tooling and templates for Visual Studio. It will also install the Microsoft Azure Storage Emulator, which can be used during development to ensure that you don't spend a fortune during the early stages. Once your code is ready for production, you can simply change the connection string in the app.config/web.config. It's truly that simple.
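With the SDK in place, you can start the emulator from a command prompt before running the sample. The path below is the default install location at the time of writing and may differ depending on your SDK version:

```shell
:: Start the local Azure Storage Emulator (default install path; adjust to match your machine)
"C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" start
```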

2. Create the Console app

Disclosure: this is an extremely basic example, used only for the purpose of showcasing Serilog. In the real world I would be deploying Serilog on top of LibLog to provide the appropriate abstraction.

These are the steps necessary to add Serilog to our application:

  • Open Visual Studio and create a simple console application.
  • Right-click on the solution and launch the NuGet Package Manager.
  • Search for Serilog and install version 1.5.14 (at the time of writing, 2.0 was not fully compatible with the Azure Table Storage Sink)
  • Search for Serilog.Sinks.AzureTableStorage and install version 1.5.14

To install the packages, you can alternatively use the NuGet Package Manager Console in Visual Studio and run the following commands:

Install-Package Serilog -Version 1.5.14  
Install-Package Serilog.Sinks.AzureTableStorage -Version 1.5.14  

With the necessary packages installed, we can now instantiate a logger in the code and start logging. Open your Program.cs file and add the following code inside Main()

// Requires: using Serilog; using Microsoft.Azure; using Microsoft.WindowsAzure.Storage;
var storage = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Option 1: synchronous sink - writes each entry as soon as it is logged
var log = new LoggerConfiguration()
    .WriteTo.AzureTableStorage(storage)
    .CreateLogger();

// Option 2: batched sink - commits entries in batches (see section 3.2)
/*
var log = new LoggerConfiguration()
    .WriteTo.AzureTableStorageWithProperties(storage, storageTableName: "customName",
        writeInBatches: true, batchPostingLimit: 100, period: new System.TimeSpan(0, 0, 3))
    .CreateLogger();
*/

log.Debug("hello world");
log.Error("Oh no, an error");
log.Information("another message, just for info");

Finally, we need to configure our app.config with the appropriate Azure Storage Account connection string. In this case, we’ll use the local/development storage account. Add the following:

    <add key="StorageConnectionString" value="UseDevelopmentStorage=true;" />

This line should be changed to look like the following when you go to production, or whenever you're ready to point your application to the live storage account:

    <add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=<storage name>;AccountKey=<account key>"

Just make sure you replace the values in the angle brackets with your storage account name and key.
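Putting it together, a minimal appSettings section for app.config might look like this during development (a sketch only; the rest of the config file stays as generated by Visual Studio):

```xml
<configuration>
  <appSettings>
    <!-- Local development: point at the Azure Storage Emulator -->
    <add key="StorageConnectionString" value="UseDevelopmentStorage=true;" />
  </appSettings>
</configuration>
```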

3. Code Analysis

3.1 Synchronous Sink

First of all, we need to get a reference to our storage account, which is then passed to the AzureTableStorage sink as we create our logger using the fluent API. You’ll notice that there are two different options for creating a logger. The first is pretty simple: it instantiates a synchronous logger that tries to write a log entity every time we call log.<logLevel>("some message"). This works as expected; however, it’s not ideal when writing thousands of logs against a remote, cloud-based storage account.

3.2 Batch logging

This is where the second option comes in and allows us to write logs to the destination in batches. This is highly optimised for higher workloads where we need to log multiple messages a second. Imagine looping through thousands of records and logging something about each record. Once you enable batching with writeInBatches: true, you can define the maximum number of logs per batch with batchPostingLimit: 100 and how often to commit the batch by setting period: new System.TimeSpan(0, 0, 3). In this case, we have 100 logs per batch and we commit every 3 seconds.

You can also define the minimum log level to capture. By default, anything above debug will be logged/sent to the sink for processing, but you can override this setting based on your needs. The level can also be configured in the app/web.config, so you don’t have to recompile and redeploy your application every time you need to change the log level in production.
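As a sketch of the config-driven approach (the "MinimumLogLevel" appSettings key and the fallback value are my own assumptions, not a Serilog convention), you could read the level at start-up and apply it when building the logger:

```csharp
using System.Configuration;
using Serilog;
using Serilog.Events;

// Hypothetical appSettings key; fall back to Information if missing or invalid
var configured = ConfigurationManager.AppSettings["MinimumLogLevel"];

LogEventLevel level;
if (!System.Enum.TryParse(configured, true, out level))
    level = LogEventLevel.Information;

var log = new LoggerConfiguration()
    .MinimumLevel.Is(level)
    .WriteTo.ColoredConsole()   // any sink will do for this example
    .CreateLogger();
```

Changing the appSettings value now changes the log level without a recompile, though the application still needs a restart to pick it up.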

A nice trick that saves you from messing with configuration files altogether is to use ConfigR, which relies on scriptcs and Roslyn and allows you to provide configuration settings in code. This way, you don’t have to restart your application in production every time you need to change logging levels. Hopefully you can see the pattern here: configuration changes should be seamless to your application, so it's important to use the right tool for the job.

4. Context aware logger

If you want your logger to be aware of the current context or even pass a customized context, you can use the following syntax when instantiating a logger within your class:

//class context
var log = Log.ForContext<MyClass>();

// object context
var myObject = new MyObject();  
var log = Log.ForContext("MyObjectId", myObject.Id);  

5. Debugging and troubleshooting your logger

If things don’t go to plan and you need to work out what’s wrong, you can use the debug and diagnostics tools that come with Serilog. You can enable verbose logging using the following command:

Serilog.Debugging.SelfLog.Out = Console.Out;

If you don't want to use the Console, you can alternatively output the error messages to a file or in-memory StringWriter.
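For example, to capture the self-log in an in-memory StringWriter (the variable names here are just illustrative) and dump it when needed:

```csharp
using System;
using System.IO;
using Serilog.Debugging;

// Redirect Serilog's internal error output to an in-memory buffer
var selfLogBuffer = new StringWriter();
SelfLog.Out = selfLogBuffer;

// ... run your logging code; any sink failures are written to the buffer ...

Console.WriteLine(selfLogBuffer.ToString());
```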

6. Conclusion

The Serilog team has done an excellent job in putting together a very comprehensive logging framework, and they are in the process of updating the code to work with .NET Core. In the meantime, there’s a stable release that works beautifully, is extremely easy to set up and offers the largest variety of sinks/adapters I’ve seen. The fluent API is a joy to use, and it only takes a few lines of code to add logging capabilities to your application. If you need to implement logging in your code, I highly recommend you give Serilog a go and, as always, let me know in the comments how you find it. The same goes for questions.
