serilog-archive / serilog-sinks-azuredocumentdb
A Serilog sink that writes to Azure DocumentDB
License: Apache License 2.0
It appears that Microsoft will soon retire 'fixed' collections, as they are no longer needed. This will allow logs to grow beyond 10 GB.
The code will need updating to track logs by partition id.
The bulk import script will need updating to make its calls against individual partitions.
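As a starting point for that partition work, one option is deriving a logical partition key from each event's timestamp so no single partition grows unbounded. This is only a sketch: the daily-bucket scheme and the `PartitionKeyFor` helper are illustrative assumptions, not the sink's actual design.

```csharp
using System;

class PartitionSketch
{
    // Hypothetical scheme: bucket log documents by UTC day, so each logical
    // partition stays well below any per-partition storage limit.
    static string PartitionKeyFor(DateTime timestampUtc) =>
        timestampUtc.ToString("yyyy-MM-dd");

    static void Main()
    {
        var key = PartitionKeyFor(new DateTime(2017, 3, 15, 19, 46, 13, DateTimeKind.Utc));
        Console.WriteLine(key); // 2017-03-15
    }
}
```

The bulk import script would then target the partition named by each document's key rather than the whole collection.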
My aim is to configure Serilog logging. After installing the following NuGet packages, I'm getting an error.
Serilog https://github.com/serilog/serilog
Serilog DocumentDB https://github.com/serilog/serilog-sinks-azuredocumentdb
The error is attached below; I get it after installing Serilog.Sinks.AzureDocumentDB.
This is .NET Core with MVC 1.0.1 running as an App Service in Azure. The issue does not occur when running locally with the DocumentDB emulator from VS. When I start the app, logging works as expected; then, about 10-15 minutes after startup, the logs stop being written to the database. I'm not seeing any exceptions. This is what I am using to configure:
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
    .MinimumLevel.Override("Hangfire", LogEventLevel.Warning)
    .MinimumLevel.Override("System", LogEventLevel.Error)
    .Enrich.FromLogContext()
    .WriteTo.AzureDocumentDB(
        connectionInfo.EndPoint,
        connectionInfo.Key,
        databaseName: connectionInfo.DbName,
        collectionName: connectionInfo.Collection)
    .CreateLogger();
I am also using the provided code to init the service:
loggerFactory.AddSerilog();
appLifetime.ApplicationStopped.Register(Log.CloseAndFlush);
If I restart the app, it works again for 10-15 minutes and then stops.
Going to clone the repo and try to run it locally and see if I can see what I'm doing wrong or if there is a bug.
I was looking at the documentation here and in the collections section, it has this example:
var fruit = new Dictionary<string,int> {{ "Apple", 1}, { "Pear", 5 }};
Log.Information("In my bowl I have {Fruit}", fruit);
So I tried that, and I get in Document DB:
{
"EventIdHash": 2098721849,
"Timestamp": "2017-03-15 19:46:13.712-04:00",
"Level": "Information",
"Message": "In my bowl I have [(\"Apple\": 1), (\"Pear\": 5)]",
"MessageTemplate": "In my bowl I have {Fruit}",
"Exception": null,
"Properties": {
"Fruit": null,
"SourceContext": "MyProgram.MyClass"
},
"id": "b62d1578-2056-6335-21a4-57655c12f22e"
}
With a list, the property serializes as expected:
{ "Fruit": ["Apple", "Pear", "Orange"] }
but with the dictionary, the property comes through as:
"Fruit": null
Now if you just run a simple console app:
using System;
using System.Collections.Generic;
using Serilog;
using Serilog.Formatting.Json;

namespace ConsoleApp1
{
class Program
{
static void Main(string[] args)
{
var logger = new LoggerConfiguration()
.WriteTo.Console().CreateLogger();
//.WriteTo.Console(formatter: new JsonFormatter()).CreateLogger();
var fruit = new Dictionary<string, int> { { "Apple", 1 }, { "Pear", 5 } };
logger.Information("In my bowl I have {@Fruit}", fruit);
Console.ReadKey();
}
}
}
the output is the same:
2017-03-15 20:40:20 [Information] In my bowl I have [("Apple": 1), ("Pear": 5)]
But if you comment out the first WriteTo.Console() line and use the JsonFormatter overload instead, then the dictionary works and everything looks great... but only in the console.
{"Timestamp":"2017-03-15T20:41:21.0136426-04:00","Level":"Information","MessageTemplate":"In my bowl I have {@Fruit}","Properties":{"Fruit":{"Apple":1,"Pear":5}}}
The DocumentDb sink doesn't have the capability to pass in a new JsonFormatter().
Let me know if you need more information.
Hi,
I'm using the CosmosDb sink in a .NET Core 3.1 web API project. It works for normal logging, but whenever I use BeginScope, logs are only sometimes written to CosmosDb, and most of the time they are NOT.
## Program.cs
public class Program
{
public static void Main(string[] args)
{
var configuration = new ConfigurationBuilder()
.SetBasePath(Directory.GetCurrentDirectory())
.AddJsonFile("appsettings.json")
.Build();
using (var logger = new LoggerConfiguration()
.ReadFrom.Configuration(configuration)
.CreateLogger())
{
Log.Logger = logger;
try
{
Log.Information("Starting up");
CreateHostBuilder(args).Build().Run();
}
catch (Exception ex)
{
Log.Fatal(ex, "Application start-up failed");
}
finally
{
Log.CloseAndFlush();
}
}
}
public static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.UseSerilog()
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseStartup<Startup>();
});
}
## appsettings.json
{
"Serilog": {
"MinimumLevel": {
"Default": "Information",
"Override": {
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"WriteTo": [
{
"Name": "Console",
"Args": {
"outputTemplate": "{Timestamp:HH:mm:ss} [{Level:u3}] [{SourceContext}] {Scope} {Message}{NewLine}{Exception}"
}
},
{
"Name": "AzureTableStorageWithProperties",
"Args": {
"storageTableName": "AppLogs",
"connectionString": "xxxx",
"propertyColumns": [ "Scope", "RequestId" ]
}
},
{
"Name": "AzureDocumentDB",
"Args": {
"endpointUrl": "https://xxxx.documents.azure.com",
"authorizationKey": "xxxx=="
}
},
{
"Name": "File",
"Args": {
"path": "%TEMP%/Logs/AppLogs.txt",
"outputTemplate": "{Timestamp:o} [{Level:u3}] ({Application}/{MachineName}/{ThreadId}/{ThreadName}) {Message}{NewLine}{Exception}"
}
}
],
"Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ],
"Properties": {
"Application": "App"
}
}
}
## Controller
public class LogsController : ControllerBase
{
private readonly ILogger<LogsController> _logger;
public LogsController(
ILogger<LogsController> logger
)
{
_logger = logger;
}
[HttpPost("installation")]
[AllowAnonymous]
public async Task<IActionResult> ReportInstallerLogs(InstallationLogsDto input)
{
try
{
if (input == null)
{
throw new ArgumentNullException(nameof(input));
}
// import the logs
using (_logger.BeginScope($"IPIN {input.Ipin}"))
{
if (input.CompletedSuccessfully)
{
_logger.LogInformation("Install succeeded!");
}
else
{
_logger.LogError("InstallationFailed");
}
}
return Ok();
}
catch (Exception ex)
{
_logger.LogError(ex, ex.Message);
return StatusCode(400, ex.Message);
}
}
}
I have already read this issue, but I'm not sure how I can use it within a controller with DI.
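One workaround that may be worth trying (a sketch, not verified against this sink) is pushing the value through Serilog's LogContext instead of ILogger.BeginScope, since LogContext properties become first-class event properties that sinks serialize. It requires .Enrich.FromLogContext(), which the appsettings above already enable; the console sink and placeholder value here are just for illustration.

```csharp
using System;
using Serilog;
using Serilog.Context;

class Program
{
    static void Main()
    {
        // Enrich.FromLogContext is what lets PushProperty flow onto events.
        Log.Logger = new LoggerConfiguration()
            .Enrich.FromLogContext()
            .WriteTo.Console(outputTemplate: "[{Level:u3}] (Ipin={Ipin}) {Message}{NewLine}")
            .CreateLogger();

        using (LogContext.PushProperty("Ipin", "12345")) // placeholder for input.Ipin
        {
            Log.Information("Install succeeded!");
        }

        Log.CloseAndFlush();
    }
}
```

In a controller, the same `using (LogContext.PushProperty("Ipin", input.Ipin)) { ... }` pattern can wrap the existing _logger calls.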
How to force this sink to output the MessageTemplate property too?
There has apparently been a change in Microsoft.Azure.DocumentDB 1.15.0 that doesn't work with the sink. The following error is generated when using it with the latest version of the Azure DocumentDB sink.
System.MissingMethodException: Method not found: 'Void Microsoft.Azure.Documents.Client.DocumentClient..ctor(System.Uri, System.String, Microsoft.Azure.Documents.Client.ConnectionPolicy, System.Nullable`1)'.
   at Serilog.Sinks.AzureDocumentDb.AzureDocumentDBSink..ctor(Uri endpointUri, String authorizationKey, String databaseName, String collectionName, IFormatProvider formatProvider, Boolean storeTimestampInUtc, Protocol connectionProtocol, Nullable`1 timeToLive)
   at Serilog.LoggerConfigurationAzureDocumentDBExtensions.AzureDocumentDB(LoggerSinkConfiguration loggerConfiguration, Uri endpointUri, String authorizationKey, String databaseName, String collectionName, LogEventLevel restrictedToMinimumLevel, IFormatProvider formatProvider, Boolean storeTimestampInUtc, Protocol connectionProtocol, Nullable`1 timeToLive)
Looking at the code for Microsoft.Azure.DocumentDB.Client.DocumentClient, a new parameter was added to the constructor with a default of null, but simply updating the NuGet package causes the sink to break.
Just curious if this is supported or can be supported in future.
Thanks,
Hi guys,
I have an issue right now: when I write data to Cosmos DB using Serilog, the timestamp is saved as "Timestamp": "2022-03-22 01:47:39.102+00:00". Cosmos DB recommends the ISO 8601 UTC standard using the string format yyyy-MM-ddTHH:mm:ss.fffffffZ.
I wrote a question on Stack Overflow with more descriptive info and configuration:
https://stackoverflow.com/questions/71580138/serilog-to-cosmos-db-not-saving-timestamp-to-iso-8601-utc-standard
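Until the sink offers such an option, the format Cosmos DB recommends can be produced with a plain .NET custom format string. A small standalone check; nothing here is sink-specific, and the `ToCosmosTimestamp` helper name is my own.

```csharp
using System;
using System.Globalization;

class TimestampCheck
{
    // Cosmos DB's recommended ISO 8601 UTC shape: yyyy-MM-ddTHH:mm:ss.fffffffZ
    static string ToCosmosTimestamp(DateTime value) =>
        value.ToUniversalTime()
             .ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ", CultureInfo.InvariantCulture);

    static void Main()
    {
        var ts = new DateTime(2022, 3, 22, 1, 47, 39, 102, DateTimeKind.Utc);
        Console.WriteLine(ToCosmosTimestamp(ts)); // 2022-03-22T01:47:39.1020000Z
    }
}
```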
Hope to hear from you on this issue.
kudos to everyone in here!
Thanks
I put in a pull request last week to support a flag to indicate whether the Timestamp should be the local time or UTC. It's basically the same thing that was done in the MSSQL sink last year. Comments appreciated.
My team has a .NET Core library that targets .NET Standard 1.6. We were running version 3.6.1 and decided to upgrade to 3.8.1, and saw that the requirements for the NuGet package had changed to .NETCoreApp 1.0.
Now NuGet is trying to install the .NET Framework version, which fails to compile because the underlying DocumentDB library is not .NET Standard compatible.
Is there a reason for this change?
Serilog logs into the Diagnostics DB / Logs container by default. But when I point it at the Cosmos DB and container I created, nothing is logged. If I don't specify the database and container name, the logs are created in Diagnostics / Logs; with my own database and container, no logs are written. Please advise.
class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        var logger = new LoggerConfiguration()
            .MinimumLevel.Verbose()
            .WriteTo.AzureDocumentDB("https://myendpoint.documents.azure.com:443/", "mykey", "mycosmosdb", "mycontainer")
            .CreateLogger();
        builder.Services.AddLogging(lb => lb.AddSerilog(logger));
    }
}
Hi, this package only applies to Azure Cosmos DB with the SQL API.
For the MongoDB API, please use serilog-sinks-mongodb:
https://github.com/serilog/serilog-sinks-mongodb
Hi All,
I am using the Azure DocumentDB sink with Azure Service Fabric Actors. When we deploy these actor services in an Azure Service Fabric cluster, logs are not written to DocumentDB. After analyzing dump files, we found that every actor has one thread blocked at the following method:
Serilog.Sinks.Batch.BatchProvider.EventPump
Is this happening due to batching? Why is the actor thread getting blocked? After further investigation, the collection link is 'dbs/Z0FDAA==/colls/Z0FDAJSwbQA='.
Is this normal/correct? I am using the following appsettings configuration to write logs to DocumentDB:
<add key="LogLevel" value="Info"/>
<add key="serilog:using:AzureDocumentDB" value="Serilog.Sinks.AzureDocumentDB" />
<add key="serilog:write-to:AzureDocumentDB.endpointUrl" value="https://XXXXXXX.documents.azure.com:443/" />
<add key="serilog:write-to:AzureDocumentDB.authorizationKey" value="******************x5szfD2bnwa6s40QPhvTAyxw5jfhjk678sfw6sdghdfgecpUF3FMBlXnG58UQ32w==" />
<add key="serilog:write-to:AzureDocumentDB.timeToLive" value="864000" />
I am having problems writing documents to Cosmos DB.
I have made two attempts to write to Cosmos - using the appsettings configuration API and using the fluent configuration API.
The sink can connect to the Cosmos DB in question and also create a "Diagnostics" database and a "Logs" collection if those values are not specified. But nothing gets logged to Cosmos DB.
I have downloaded the "ToDo"-sample app from Azure to verify that documents can indeed be written to my Cosmos - and that works. But not the AzureDocumentDB sink.
My fluent configuration is this:
var logger = new LoggerConfiguration()
.WriteTo.ColoredConsole()
.WriteTo.AzureDocumentDB("https://myendpoint.documents.azure.com:443/", "mykey")
.CreateLogger();
logger.Information("Hello friend");
I have updated all dependent nuget packages in the Visual Studio project.
My appsettings configuration is this:
<add key="serilog:using:AzureDocumentDB" value="Serilog.Sinks.AzureDocumentDB" />
<add key="serilog:write-to:AzureDocumentDB.endpointUrl" value="https://myendpoint.documents.azure.com:443/" />
<add key="serilog:write-to:AzureDocumentDB.authorizationKey" value="mykey" />
<!-- Lifespan of log messages in DocumentDB in seconds; leave empty to disable expiration. -->
<add key="serilog:write-to:AzureDocumentDB.timeToLive" value="" />
With C# initialization:
var log = new LoggerConfiguration()
.ReadFrom.AppSettings()
.CreateLogger();
A long shot at the root cause could be the addition of CORS to Cosmos a couple of months back, but that is just a guess.
More details are available on SO at https://stackoverflow.com/questions/55377272/using-xml-configuration-to-log-to-cosmos-db-with-serilog where I originally logged the issue.
I am using .NET Framework 4.6.1. I developed a logger component on top of Serilog and wanted to use the Azure DocumentDB sink. I am seeing strange behavior: my first log statement appears in DocumentDB, but subsequent log messages never arrive. After deleting the first document, logging again happens only once and never again. Below is my code:
internal class SeriLogger : ILogger
{
private static readonly string EndpointUri = "https://mysample.documents.azure.com:443/";
private static readonly string AuthKey = "UkgOMJT08bA5KOvd1O9IYgo0w==";
private static readonly string DatabaseName = "Diagnostics";
private static readonly string CollectionName = "Logs";
protected readonly Serilog.Core.Logger logger;
public SeriLogger()
{
logger = new LoggerConfiguration()
.WriteTo.AzureDocumentDB(EndpointUri, AuthKey, DatabaseName, CollectionName)
.CreateLogger();
//logger = new LoggerConfiguration()
//.WriteTo.RollingFile(Path.Combine(
// AppDomain.CurrentDomain.BaseDirectory, "log-{Date}.txt"), fileSizeLimitBytes: 536870912)
// .CreateLogger();
}
public void Log(string message, LogLevelEnum logLevel)
{
switch (logLevel)
{
case LogLevelEnum.Debug: logger.Debug(message); break;
case LogLevelEnum.Information: logger.Information(message); break;
case LogLevelEnum.Warning: logger.Warning(message); break;
case LogLevelEnum.Error: logger.Error(message); break;
case LogLevelEnum.Fatal: logger.Fatal(message); break;
}
    }
}
Below are the Nuget versions I am using
<packages>
  <package id="Microsoft.Azure.DocumentDB" version="1.11.1" targetFramework="net461" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="net461" />
  <package id="Serilog" version="2.4.0" targetFramework="net461" />
  <package id="Serilog.Sinks.AzureDocumentDb" version="3.5.18" targetFramework="net461" />
  <package id="Serilog.Sinks.File" version="3.2.0" targetFramework="net461" />
  <package id="Serilog.Sinks.RollingFile" version="3.3.0" targetFramework="net461" />
</packages>
But the logging works perfectly with the Rolling File sink. Please help.
I tried enabling Serilog's self-log diagnostics like this:
Serilog.Debugging.SelfLog.Enable(msg => Debug.WriteLine(msg));
Now I can see two lines in my VS output window.
2017-03-14T07:40:18.5291346Z Sending batch of 2 messages to DocumentDB
2017-03-14T07:40:22.1107764Z One or more errors occurred.
Hey guys,
We did a production release of our codebase yesterday and immediately afterwards saw a massive change in CPU usage on the server. After debugging the problem today, I have narrowed it down to upgrading our Serilog.Sinks.AzureDocumentDB NuGet package from 3.8.0 to 3.8.1.
The following graph shows our Azure web app before and after deployment, when the only change was upgrading Serilog.Sinks.AzureDocumentDB from 3.8.0 to 3.8.1.
The overloaded CPU never stops unless we revert the change from 3.8.1 back down to 3.8.0.
Our implementation of Serilog is pretty simple, wired up to an ASP.NET Core MVC API project. See here:
The code path for this deployment scenario utilizes the DocumentDB configuration.
Upgrading from 3.8.0 to 3.8.1 also seems to bring in a few other dependency updates, so I am uncertain if it could be one of the others, such as the main Serilog package.
So this happens infrequently, and it's not bothering me much, but I thought I'd post it here. Here's the log history: you can see at the top it logged to DocumentDb (and that log shows up correctly), then it missed the batch of 50 for some reason. Then it continues logging happily.
This file shows the exception and stacktrace:
logoutput.txt
This file shows an example of what should have been logged (50 times in this case) to DocumentDb:
DocumentDbEntrySample.txt
Any idea what is null? Unfortunately, it's not reliably reproducible. I'd be happy to submit a PR if you can point me in the right direction.
Hi,
I tried a minimal sample with a Cosmos DB, but unfortunately I do not see any documents dropped into the database.
Sample:
using System;
using System.Linq;
using Serilog;

class Program
{
static void Main()
{
using (var logger = new LoggerConfiguration()
.Enrich.FromLogContext()
.Enrich.WithProperty("Application", "Test-Console")
.WriteTo.AzureDocumentDB(
"https://mpa-logtest-core.documents.azure.com:443/",
"****",
restrictedToMinimumLevel: Serilog.Events.LogEventLevel.Verbose,
storeTimestampInUtc: true,
timeToLive: 60*60*24*14,
logBufferSize: 1000,
batchSize: 10)
.WriteTo.Console()
.CreateLogger())
{
Log.Logger = logger;
logger.Information("Test");
logger.Error(new Exception("Test"), "Exception");
Enumerable.Range(1, 10000).ToList().ForEach(x => logger.Fatal("Hallo {x}", x));
Console.WriteLine("Hello World!");
}
//Thread.Sleep(600 * 1000);
Log.CloseAndFlush();
}
}
Even with the wait time, nothing happens in the database.
I see a lot of traffic between my PC and the database endpoint (it is TLS encrypted, but it is there).
Am I doing anything wrong?
Edit: code sample changed to Cosmos DB from MongoDB (I had supplied the wrong test).
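When the database stays silent like this, a common first diagnostic step is enabling Serilog's SelfLog before building the logger, so exceptions the sink swallows internally become visible. A minimal sketch; the AzureDocumentDB call itself is replaced by a console sink here so the snippet stands alone.

```csharp
using System;
using Serilog;
using Serilog.Debugging;

class Program
{
    static void Main()
    {
        // Route the sink's internal failures to stderr; without this, batch
        // errors are swallowed and the database simply stays empty.
        SelfLog.Enable(Console.Error);

        using (var logger = new LoggerConfiguration()
            .WriteTo.Console() // swap in .WriteTo.AzureDocumentDB(...) to diagnose
            .CreateLogger())
        {
            logger.Information("Test");
        }
    }
}
```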
Using this Serilog sink to write to Azure DocumentDB, there seems to be an issue with sinking Error events when an Exception object is passed.
Here is the code snippet:
public IActionResult Index()
{
try
{
int.Parse("s");
}
catch (Exception exception)
{
_logger.Error("Test error");
_logger.Error(exception, "Test error");
throw;
}
_logger.Information("test");
return View();
}
In the above code:
_logger.Error("Test error"); works well, as we are not passing the exception object.
Whereas _logger.Error(exception, "Test error"); doesn't log any entry in DocumentDB, as we are passing the exception object.
Is there any workaround so that the exception details can be logged to DocumentDB?
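One possible workaround (a sketch, not a confirmed fix for this sink) is flattening the exception to a string property, so the stored document carries only plain values rather than a raw Exception object. The `ExceptionDetail` property name and console sink are my own choices for illustration.

```csharp
using System;
using Serilog;

class Program
{
    static void Main()
    {
        using (var logger = new LoggerConfiguration()
            .WriteTo.Console()
            .CreateLogger())
        {
            try
            {
                int.Parse("s");
            }
            catch (Exception exception)
            {
                // Flatten the exception to a string so the sink never has to
                // serialize a raw Exception object into the document.
                logger.Error("Test error: {ExceptionDetail}", exception.ToString());
            }
        }
    }
}
```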
Hi,
When using this sink with the newer Microsoft.Azure.DocumentDB client (1.11.1), the following error occurs at runtime:
Serilog.Sinks.AzureDocumentDb.<CreateDatabaseIfNotExistsAsync>d__12.MoveNext() +0
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start(TStateMachine& stateMachine) +124
Serilog.Sinks.AzureDocumentDb.AzureDocumentDBSink.CreateDatabaseIfNotExistsAsync(String databaseName) +137
It seems that OpenAsync() in the client library has changed slightly: an optional cancellation token can now be supplied.
I cannot explain this runtime behaviour; as it is a new optional argument, it should still work ;)
I would love to do a PR and update this project to use DocumentDB client 1.11.1, which should make this issue go away. However, I have a hard time even opening this .NET Core project in VS15/17 RC (nothing but VS errors). I will try later on another box...
Hi @serilog/reviewers-azure-documentdb 👋
Via serilog/serilog#1627 - we're unbundling the serilog organization to help distribute the effort involved in managing the various Serilog sub-projects. The serilog organization will now only manage the fundamental sinks and other packages that the rest of the ecosystem builds upon.
If this package is actively maintained, it can be moved to a new organization managed by the maintainers. Otherwise, it can move to the serilog-archive parking lot, from where we hope a new community-run fork might spring in the future 🌷.
Let me know if you're a maintainer and keen to continue this project under a new org; otherwise, I'll shuffle things around and move this one to the archive.
Thanks!