manigandham / serilog-sinks-fastconsole

Serilog sink that writes to console with high-performance non-blocking output.

License: MIT License
With the current implementation, I can only provide a different template via "outputTemplate". But for my project I need to pass an ElasticsearchJsonFormatter directly.
It looks like this for the regular Console sink:
.WriteTo.Console(formatter: new ElasticsearchJsonFormatter(inlineFields: true, formatStackTraceAsArray: false))
It would be nice to be able to provide a formatter directly, since "outputTemplate" does not really work for me.
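If the sink accepted an ITextFormatter, the configuration could mirror the standard console sink. The sketch below is purely hypothetical: the "formatter" parameter does not exist in the current FastConsole API; this is the shape the request is asking for.

```csharp
// Hypothetical: assumes a `formatter` overload is added to FastConsole.
Log.Logger = new LoggerConfiguration()
    .WriteTo.FastConsole(formatter: new ElasticsearchJsonFormatter(
        inlineFields: true,
        formatStackTraceAsArray: false))
    .CreateLogger();
```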
Console output works with the standard console sink:
let logger =
LoggerConfiguration()
.MinimumLevel.Debug()
.WriteTo.File(logfileName, shared = true, restrictedToMinimumLevel = LogEventLevel.Debug)
.WriteTo.Console(restrictedToMinimumLevel = LogEventLevel.Warning)
.CreateLogger()
No console output:
let logger =
LoggerConfiguration()
.MinimumLevel.Debug()
.Destructure.FSharpTypes()
.WriteTo.File(logfileName, shared = true, restrictedToMinimumLevel = LogEventLevel.Debug)
.WriteTo.FastConsole()
.CreateLogger()
Also, no console output at all:
let logger =
LoggerConfiguration()
.MinimumLevel.Debug()
.Destructure.FSharpTypes()
.WriteTo.FastConsole()
.WriteTo.File(logfileName, shared = true, restrictedToMinimumLevel = LogEventLevel.Debug)
.CreateLogger()
Console output only works without the File logger:
let logger =
LoggerConfiguration()
.MinimumLevel.Debug()
.Destructure.FSharpTypes()
.WriteTo.FastConsole()
.CreateLogger()
The normal console sink works in combination with the file sink in both the first and second scenarios.
The sink produces missing or partial console output whenever the pending log output is smaller than the buffer of the StreamWriter used to write to the console's stdout stream.
Each log message should appear in the console in its entirety shortly after it has been generated by the application.
The last line in the console output contains a partial log message. Generating more log messages completes the line.
DemoApp
http://localhost:5000
The screenshot below shows an example of how it might look in step 4:
Generating a constant stream of log messages does not expose this bug.
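The buffering behavior described above can be reproduced with plain BCL types. This is a minimal sketch, not the sink's actual code: with AutoFlush disabled, a StreamWriter keeps data in its internal buffer until the buffer fills or Flush() is called, so a short burst of log output can sit unseen.

```csharp
using System;
using System.IO;
using System.Text;

class BufferingDemo
{
    static void Main()
    {
        var underlying = new MemoryStream();
        // AutoFlush is false by default; writes accumulate in the 4 KB buffer.
        var writer = new StreamWriter(underlying, new UTF8Encoding(false), bufferSize: 4096);

        writer.Write("partial log line");
        Console.WriteLine(underlying.Length); // 0: nothing has reached the stream yet

        writer.Flush();
        Console.WriteLine(underlying.Length); // 16: the full text is now visible downstream
    }
}
```

A constant stream of messages eventually fills the buffer and forces a flush, which is why steady load hides the bug while a quiet period exposes it.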
In my WinForms app I get a deadlock on the line 'WriteQueueWorker.GetAwaiter().GetResult();'.
I am using version 1.3.2 of FastConsoleSink.
Hey @manigandham,
First of all, thanks for the great package. This is amazing work. But can I ask for some benchmarks to prove that this sink is faster than the standard one? I think it would be useful to put them in the description, and it would answer many questions. Right now everyone who wants to test this sink has to do it on their own, so it would be nice to have proof in the package description.
P.S. Personally, I have trouble benchmarking this package because it behaves very strangely in my benchmarks: it outputs too many log entries and sometimes gets stuck.
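A minimal BenchmarkDotNet harness along these lines could produce comparable numbers. This is a sketch under assumptions: it requires the BenchmarkDotNet, Serilog, Serilog.Sinks.Console, and FastConsole packages, and since FastConsole writes on a background worker, a per-call benchmark mostly measures enqueue cost rather than end-to-end output.

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
using Serilog;
using Serilog.Core;

[MemoryDiagnoser]
public class ConsoleSinkBenchmarks
{
    private Logger _console = null!;
    private Logger _fastConsole = null!;

    [GlobalSetup]
    public void Setup()
    {
        _console = new LoggerConfiguration().WriteTo.Console().CreateLogger();
        _fastConsole = new LoggerConfiguration().WriteTo.FastConsole().CreateLogger();
    }

    [Benchmark(Baseline = true)]
    public void StandardConsole() => _console.Information("Hello {Name}", "world");

    [Benchmark]
    public void FastConsoleSink() => _fastConsole.Information("Hello {Name}", "world");
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<ConsoleSinkBenchmarks>();
}
```

For a fair end-to-end comparison you would also want to time logging a fixed batch of events followed by disposing the logger, so the async sink's flush cost is included.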
It would be nice to have an option to block when the log queue is full, to avoid losing log messages when blocking is preferred. Like this:
public void Emit(LogEvent logEvent)
{
    if (_options.BlockWhenFull)
    {
        // Block the caller until the bounded queue has space.
        _writeQueue.Writer.WriteAsync(logEvent).GetAwaiter().GetResult();
    }
    else
    {
        // Drop the event silently if the queue is full.
        _writeQueue.Writer.TryWrite(logEvent);
    }
}
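The difference between the two branches can be seen with a bounded channel in isolation. This is a self-contained sketch of the semantics being proposed, not the sink's code:

```csharp
using System;
using System.Threading.Channels;

class BlockVsDropDemo
{
    static void Main()
    {
        var channel = Channel.CreateBounded<int>(2);

        // Fill the queue to capacity.
        Console.WriteLine(channel.Writer.TryWrite(1)); // True
        Console.WriteLine(channel.Writer.TryWrite(2)); // True

        // Drop behavior: TryWrite fails immediately when full, losing the event.
        Console.WriteLine(channel.Writer.TryWrite(3)); // False

        // Block behavior: free one slot, then WriteAsync completes.
        channel.Reader.TryRead(out _);
        channel.Writer.WriteAsync(3).AsTask().GetAwaiter().GetResult();
        Console.WriteLine("written after space freed");
    }
}
```

Note that blocking synchronously on WriteAsync ties up the calling thread while the queue is full, which is the thread-growth concern raised in the next report.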
With BlockWhenFull=True, when a lot of tasks try to write logs simultaneously and QueueLimit is exhausted, there is massive, fast growth in the number of threads. It looks like new thread creation is not even being throttled by the thread pool (not sure though).
At the same time, this is not the case for Serilog.Sinks.Async (with the same settings), which uses a BlockingCollection.
I think this line of code is the reason.
I also ran some experiments that support my suggestion:
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Threading.Channels;

var lockObject = new object();
var blockingCollection = new BlockingCollection<int>(10);
var channel = Channel.CreateBounded<int>(new BoundedChannelOptions(10) { SingleReader = true });

// Nothing ever reads from the channel or the collection, so once capacity (10)
// is reached every new writer blocks. Spawn writers for 30 seconds, then count threads.
var cts = new CancellationTokenSource();
cts.CancelAfter(30000);
while (!cts.IsCancellationRequested)
{
    Task.Run(() =>
    {
        // Uncomment exactly one line per run to compare thread growth:
        //channel.Writer.WriteAsync(0).AsTask().GetAwaiter().GetResult(); // Result: 147 threads
        //blockingCollection.Add(0); // Result: 52 threads
        //lock (lockObject) { channel.Writer.WriteAsync(0).AsTask().GetAwaiter().GetResult(); } // Result: 56 threads
        //var j = 1 + 1; // Result: 15 threads
    });
}
Console.WriteLine(Process.GetCurrentProcess().Threads.Count);
Why is this a problem? (Just in case:) each blocked thread consumes stack memory and scheduler overhead, so unthrottled thread injection under load can degrade the entire process.