
serilog-sinks-azureblobstorage's People

Contributors

adarshgupta, adementjeva, alexeysas, azure-pipelines[bot], brolaugh, ccubbage, ceemafour, chriswill, daanwasscher, dependabot[bot], dsbut, fdudannychen, ghyath-serhal, jamessampica, krishnakole, marcderaedt, marien-ov, ry8806, symbiotickilla, throck95, tomseida, vip32, vplauzon, yvonnearnoldus


serilog-sinks-azureblobstorage's Issues

Call to log writer never returns

I'm trying to add Serilog logging to my project (ASP .NET Web Application (.NET Framework 4.6.1)) and I can't get it past the actual logging of the statement.

var _logger = new LoggerConfiguration()
	.WriteTo.AzureBlobStorage( "DefaultEndpointsProtocol=https;AccountName=zzzzzzzz;AccountKey=xxxxxxx;EndpointSuffix=core.windows.net", Serilog.Events.LogEventLevel.Debug, "dev-logs", "{yyyy}/{MM}/{dd}/log.txt" )
	.CreateLogger();
_logger.Information( "testing" );

If I put a breakpoint in place, I can step over the .CreateLogger() statement, but if I try to step past the .Information() call, it never makes it to the next statement. No exceptions are thrown, and the entry never makes it to the log file. The file is created, but remains empty. What am I missing? I've tried this with version 1.2.3 and 1.1.1. I've also tried both 2.7.1 and 2.8.0 of Serilog itself. Is there something wrong with my connection string (in which case, how is the file created empty)? What other settings or changes do I need to make in order to get this running properly?

Other packages that got added to my project:
..\packages\Microsoft.Azure.KeyVault.Core.1.0.0\lib\net40\Microsoft.Azure.KeyVault.Core.dll
..\packages\WindowsAzure.Storage.9.3.3\lib\net45\Microsoft.WindowsAzure.Storage.dll
..\packages\Newtonsoft.Json.12.0.1\lib\net45\Newtonsoft.Json.dll
..\packages\Serilog.2.8.0\lib\net46\Serilog.dll
..\packages\Serilog.Sinks.AzureBlobStorage.1.2.3\lib\net45\Serilog.Sinks.AzureBlobStorage.dll
..\packages\Serilog.Sinks.File.4.0.0\lib\net45\Serilog.Sinks.File.dll
..\packages\Serilog.Sinks.PeriodicBatching.2.1.1\lib\net45\Serilog.Sinks.PeriodicBatching.dll
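One way to make silent sink failures like this visible (not part of the original report) is Serilog's built-in SelfLog, which routes the sink's internal errors to a writer of your choice:

```csharp
// Enable Serilog's internal diagnostics before building the logger, so that
// exceptions swallowed by the sink are written to stderr instead of vanishing.
Serilog.Debugging.SelfLog.Enable(Console.Error);

var _logger = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage("<connection string>", Serilog.Events.LogEventLevel.Debug, "dev-logs", "{yyyy}/{MM}/{dd}/log.txt")
    .CreateLogger();
```

Any exception the sink hits while appending to the blob then shows up on the console, which usually narrows the problem to credentials, networking, or batching.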

Naming Storage File

For our application we are creating log files for a specific task. Each time that task is run we would ideally like a new file in the blob storage to be generated with the Task ID (which is a GUID) in the file name. Is there currently a way to set the Storage File Name to one of the log properties?
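The sink's blob-name template only substitutes date components, so one possible workaround (a sketch, not a confirmed feature of the library) is to build a short-lived logger per task run with the GUID baked into the file name:

```csharp
// Hypothetical per-task logger: taskId is the GUID for this task run, and the
// logger is scoped to the task so each run writes to its own blob.
// The escaped {{ }} braces keep the sink's date tokens literal in the string.
using (var taskLogger = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage(connectionString, LogEventLevel.Information,
        "task-logs", $"{{yyyy}}/{{MM}}/{{dd}}/task-{taskId}.txt")
    .CreateLogger())
{
    taskLogger.Information("Task {TaskId} started", taskId);
}
```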

Log file not being created

Hello @chriswill ,

I was just trying out this library. But I am facing an issue where a blob container is being created, but I can't find any log files under that container. Not sure what exactly went wrong. I am using .NET Core 3.1 and Microsoft Azure Storage Emulator for local development btw.

Here is the code snippet I am using:

var bn = new BlobNameFactory("{yyyy}/{MM}/{dd}_V2_Info.txt");
var log = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage(connectionString, LogEventLevel.Information, "logs", bn.GetBlobName(new DateTimeOffset(DateTime.Now)))
    .CreateLogger();

log.Information("test information log");

Rolling log file by size

From what I can see, there's no option for a rolling file with a size limitation, like there is in the File sink. Is this possible?

Also, how do I specify the additional parameters (line format, blob container, and blob name) when using the AppSettings method?
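For reference, Serilog.Settings.AppSettings passes sink arguments as `serilog:write-to:<SinkName>.<parameterName>` keys. A sketch (the key names are assumed to match the sink's parameter names, as in the web.config excerpt quoted in a later issue):

```xml
<add key="serilog:using:AzureBlobStorage" value="Serilog.Sinks.AzureBlobStorage" />
<add key="serilog:write-to:AzureBlobStorage.connectionString" value="..." />
<add key="serilog:write-to:AzureBlobStorage.storageContainerName" value="logs" />
<add key="serilog:write-to:AzureBlobStorage.storageFileName" value="{yyyy}/{MM}/{dd}/log.txt" />
<add key="serilog:write-to:AzureBlobStorage.outputTemplate" value="{Timestamp:HH:mm:ss} [{Level:u3}] {Message:lj}{NewLine}{Exception}" />
</add>
```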

Rolling file with UTC time

Is there a way to create the rolling file with UTC time instead of local time?
.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}//hour{HH}.log")

using ReadFrom.AppSettings() invalid cast from 'System.String'

I am trying to read the settings from my web config file to log to an azure blob storage. This is my code in the global.asax.cs file.

Log.Logger = new LoggerConfiguration()
                    .ReadFrom.AppSettings()
                    .CreateLogger();

these are the settings in my web.config

<appSettings>
    <add key="webpages:Version" value="3.0.0.0" />
    <add key="webpages:Enabled" value="false" />
    <add key="ClientValidationEnabled" value="true" />
    <add key="UnobtrusiveJavaScriptEnabled" value="true" />
    <add key="DataProtectionKeyFolderPath" value="C:\\Temp" />
    <add key="serilog:minimum-level" value="Verbose" />
    <add key="serilog:enrich:with-property:Version" value="1.0" />
    <add key="serilog:using:RollingFile" value="Serilog.Sinks.RollingFile" />
    <add key="serilog:write-to:RollingFile.retainedFileCountLimit" value="365" />
    <add key="serilog:using:AzureBlobStorage" value="Serilog.Sinks.AzureBlobStorage" />
    <add key="serilog:write-to:AzureBlobStorage.connectionString" value="DefaultEndpointsProtocol=https;AccountName=gva4lil4waspdiagstg1;AccountKey=***;EndpointSuffix=core.usgovcloudapi.net"/>
    <add key="SAConnectionString" value="DefaultEndpointsProtocol=https;AccountName=gva4lil4waspdiagstg1;AccountKey=***;EndpointSuffix=core.usgovcloudapi.net" />
    <add key="serilog:write-to:AzureBlobStorage.storageContainerName" value="logcontainer" />
    <add key="serilog:write-to:AzureBlobStorage.formatter" value="Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact" />
    <add key="serilog:write-to:AzureBlobStorage.storageFileName" value="{yyyy}/{MM}/{dd}/log.txt" />
</appSettings>

I'm getting the following error: {"Invalid cast from 'System.String' to 'Serilog.Formatting.ITextFormatter'."}

Empty log.txt file created on Azure Data Lake (built on Azure Blob Storage)

I am using Azure Functions and want to save my logs on Azure Data Lake gen2. Logging to console and a local file works fine; logging to Data Lake gen2 (which is basically blob storage) only creates folders and an empty "log.txt" file with a size of 0B and a type of append blob. Do you know the reason? I see a similar issue from over a year ago regarding WebJobs, and it is not resolved.

public class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        string connectionString = "myConnectionString";
        var logger = new LoggerConfiguration()
            .WriteTo.Console()
            .WriteTo.File("log.txt", rollingInterval: RollingInterval.Minute)
            .WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}/log.txt")
            .CreateLogger();
        builder.Services.AddLogging(lb => lb.AddSerilog(logger));
    }
}

Doesn't append to blob

I can't seem to get it working. It does create the container and the blob, but nothing is appended to the blob. I'm using the ILogger interface. In Application Insights I see a 409 when it tries to log. It must be something I'm doing wrong, but I just can't see what it is.

Using netcoreapp2.2.
Serilog Version="2.8.0"
Serilog.AspNetCore Version="2.1.1"
Serilog.Sinks.AzureBlobStorage Version="1.3.0"

Program.cs

WebHost.CreateDefaultBuilder(args)
                .UseStartup<Startup>()
                .UseSerilog((context, configuration) =>
                    {
                        configuration
                            .MinimumLevel.Debug()
                            .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
                            .MinimumLevel.Override("System", LogEventLevel.Warning)
                            .MinimumLevel.Override("Microsoft.AspNetCore.Authentication", LogEventLevel.Information)
                            .Enrich.FromLogContext()
                            .WriteTo.AzureBlobStorage(CloudStorageAccount.Parse(context.Configuration.GetConnectionString("StorageAccount")), LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}/log.txt")
                            .WriteTo.Console(outputTemplate: "[{Timestamp:HH:mm:ss} {Level}] {SourceContext}{NewLine}{Message:lj}{NewLine}{Exception}{NewLine}", theme: AnsiConsoleTheme.Literate);
                    });

No content output to blob

Hi,

I'm new to Serilog & to this extension, so this is more likely a beginner's mistake than an issue with the software.

Basically I create a logger:

        var logger = new LoggerConfiguration()
            .WriteTo
            .AzureBlobStorage(
                connectionString,
                LogEventLevel.Verbose,
                container,
                "{yyyy}-{MM}-{dd}-log.txt",
                blobSizeLimitBytes:200000000)
            .CreateLogger();

(with a valid connectionString and container)

I DO NOT set Log.Logger (I've tried with same result).

I then pass the logger instance to a service (as a singleton) and use it:

logger.Information("something");

What happens:

  • No exceptions are thrown
  • A blob is created in the right container 2021-03-26-log.txt at the moment the logger.Information line is executed
  • (So far so good)
  • The blob is of length zero and therefore empty and remains empty

I've tried multiple "slap the ATM" tactics such as running multiple logger.Information and never got anything in a blob.

Any tips to resolve this?

Cheers,

VP
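A common cause for reports like this one (an assumption, not confirmed in the thread) is that the sink batches writes, so a short-lived process can exit before the pending batch is posted. Disposing the logger forces a flush:

```csharp
// Dispose (or Log.CloseAndFlush() when using the static Log class) flushes
// any events still buffered by a batching sink before the process exits.
using (var logger = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage(connectionString, LogEventLevel.Verbose, container, "{yyyy}-{MM}-{dd}-log.txt")
    .CreateLogger())
{
    logger.Information("something");
} // pending batch is flushed here
```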

Subsequent statements are not logged

When trying to log more statements in the same method, the subsequent statements are not logged.
For example, if you run the testclient, you could add an extra logging statement

Log.Information("Hello World!");
Log.Information("This line is never logged!");

You will see that only the first line will get logged

The example runs:
Serilog v2.6.0
Serilog.Sinks.AzureBlobStorage v0.8.15
Serilog.Sinks.PeriodicBatching v2.1.1

Thanks for the blob sink implementation @chriswill !!

Call to log writer never returns in Blazor application (from Issue #32)

Hi,

A problem was raised at the end of Issue #32, which was closed because it initially concerned only pre-.NET Standard 2.0 applications. The very same problem arises in Blazor applications: the application stops reading logs after the first line is read, and then it loads endlessly. Help would be greatly appreciated.

Does not log to Blob container

Hi,
I've got an application with plenty logging in it. But somehow I can't get this logging sink to work.
All I see is an empty Blob container 'logs' being created, but no logfiles are being pushed.

Currently, I'm using the following code that I've gotten from one of your examples:

var connectionString = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var configuration = config.Build();
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}/log.txt")
    .CreateLogger();

Also tried the following:

.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, null, null, true, TimeSpan.FromSeconds(15), 10);

With this I'd expect some kind of log files every 15 seconds?

Best regards,
Rick

File doesn't rollover when reaching file size limit

After reaching a configured file size limit, the file doesn't roll over. After stopping and starting the application, a new file is created. It seems the file size is only determined once. A solution could be fetching the file size more often. I tested the solution shown below and it seems to work (the only addition is the FetchAttributesAsync call). Can someone confirm that this is the right solution and maybe fix this?

The code change is done in the class DefaultCloudBlobProvider.

public async Task GetCloudBlobAsync(CloudBlobClient cloudBlobClient, string blobContainerName, string blobName, bool bypassBlobCreationValidation, long? blobSizeLimitBytes = null)
{
    // Check if the current known blob is the targeted blob
    if (currentCloudAppendBlob != null && currentBlobName.Equals(blobName, StringComparison.OrdinalIgnoreCase))
    {
        await currentCloudAppendBlob.FetchAttributesAsync().ConfigureAwait(false);
        // Check if the current blob is within the block count and file size limits
        if (ValidateBlobProperties(currentCloudAppendBlob, blobSizeLimitBytes))
        {
            return currentCloudAppendBlob;
        }
        else
        {
            // The blob is correct but needs to be rolled over
            currentBlobRollSequence++;
            await GetCloudAppendBlobAsync(cloudBlobClient, blobContainerName, blobName, bypassBlobCreationValidation);
        }
    }
    else
    {
        // First time to get a cloud blob, or the blob name has changed
        currentBlobRollSequence = 0;
        await GetCloudAppendBlobAsync(cloudBlobClient, blobContainerName, blobName, bypassBlobCreationValidation, blobSizeLimitBytes);
    }

    return currentCloudAppendBlob;
}

Managed Identity / Key Vault

Can you add an example of using a Managed Identity to access the blob, and/or pulling the connection string from a Key Vault?

Storing the account key in plain text is bad. SAS is slightly better ;), but since the app is running in Azure with a managed identity, we can configure SQL to not use a password, and our other file access uses MI to the storage account, this logging should also access it the same way.

https://docs.microsoft.com/en-us/azure/storage/common/storage-auth-aad-msi
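A sketch of what this could look like (assumptions: a sink version built on Azure.Storage.Blobs v12 that accepts a `blobServiceClient`, with parameter names taken from the Account SAS example later in this list, plus the Azure.Identity package; the account URI is hypothetical):

```csharp
// DefaultAzureCredential picks up the app's managed identity when running in Azure,
// so no account key or SAS token appears in configuration.
var client = new BlobServiceClient(
    new Uri("https://mystorageaccount.blob.core.windows.net"),
    new DefaultAzureCredential());

var log = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage(
        blobServiceClient: client,
        storageContainerName: "logs",
        storageFileName: "{yyyy}/{MM}/{dd}/log.txt")
    .CreateLogger();
```

The managed identity would also need the Storage Blob Data Contributor role on the account for appends to succeed.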

Format option

I wanted to write to the blob in JSON format, for which I used the following overload method in Serilog:
AzureBlobStorage(
this LoggerSinkConfiguration loggerConfiguration,
ITextFormatter formatter,
string connectionString,
LogEventLevel restrictedToMinimumLevel = LogEventLevel.Verbose,
string storageContainerName = null,
string storageFileName = null,
bool writeInBatches = false,
TimeSpan? period = null,
int? batchPostingLimit = null,
bool bypassBlobCreationValidation = false,
ICloudBlobProvider cloudBlobProvider = null)

where for ITextFormatter I passed new JsonFormatter(), but the log file doesn't show the rendered message. If I want to see the rendered message as well, what should I do?

{"Timestamp":"2019-11-05T22:31:10.2994880+13:00","Level":"Information","MessageTemplate":"Test {@template}","Properties":{"template":"test","UtcTimestamp":"2019-11-05T09:31:10.2994880Z"}}
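Serilog's JsonFormatter takes a renderMessage flag in its constructor; a minimal sketch:

```csharp
// renderMessage: true adds a "RenderedMessage" property to each JSON event,
// alongside the message template and structured properties.
var formatter = new JsonFormatter(renderMessage: true);
```

Pass this formatter to the overload above and the rendered text appears in each event.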

File is empty when using with WebJob

Are there any considerations when using the package with appservice WebJob?
I'm using .net core 2.2.
In Program.cs:

var host = new HostBuilder()
    .UseConsoleLifetime()
    ....
    .UseSerilog((hostingContext, loggerConfiguration) =>
    {
        loggerConfiguration.ReadFrom.Configuration(hostingContext.Configuration);
    })
    .Build();

The appsettings.json:

"Serilog": {
  "MinimumLevel": "Debug",
  "WriteTo": [
    {
      "Name": "AzureBlobStorage",
      "Args": {
        "connectionString": "comes-from-ci",
        "storageContainerName": "myapp",
        "storageFileName": "{yyyy}/{MM}/{dd}/log.txt"
      }
    },
    {
      "Name": "ColoredConsole"
    }
  ]
},

As a result, the yyyy/MM/dd/log.txt file is created in the Blob storage, however the file is always 0B size.

Another question is, should I turn on any options in the App Service Logs in the AppService configuration when using this library?

Thanks

JSON configuration works in appsettings.json but not in Application settings.

If I put the connection string in appsettings.json like this it works.
"Serilog": { "WriteTo": [ {"Name": "AzureBlobStorage", "Args": {"connectionString": "", "storageContainerName": "", "storageFileName": ""}} ] }
But if I follow good practice and put it in the applications settings of the web app in Azure like so:

Serilog:WriteTo:AzureBlobStorage.connectionString, this connection string is not used.

I did the same with Serilog:WriteTo:ApplicationInsightsEvents.InstrumentationKey for "Serilog.Sinks.ApplicationInsights and that did work.

Any idea why it is not working here?

Log context?

Does this store the log context anywhere, or just the rendered message?

Thanks!

Date not working for blob name

I have used the same code you have for the rolling files in your readme but my file name is coming out as: "{yyyy}{MM}{dd}.txt", I even tried "{yyyyMMdd}.txt", but this didn't work either. Any ideas?

Unit tests fail when run under NetCoreApp2.0

WriteSingleBlockOnSingleInput() and WriteTwoBlocksOnOnInputOfTwo() in DefaultAppendBlobBlockWriterUT fail when running under netcoreapp2.0, but they pass under .net452.

@chriswill to check logging functionality and revise tests as necessary.

@DaanWasscher just an fyi. If you have any thoughts on the tests, please let me know or submit an update. These were the tests that failed in the CI build, and I can fail them on my desktop now as well using Resharper.

Blob not appending after upgrading

I've been trying to get the rolling blob name functionality working and I realized I was a few revisions behind. When I upgraded to 1.2.1 from 1.0.0.1, the blobs are created but they do not update. I followed the JSON format as you describe in the doc:

{
  "Name": "AzureBlobStorage",
  "Args": {
    "outputTemplate": "",
    "connectionString": "<my connection string>",
    "storageContainerName": "serverlogs",
    "storageFileName": "{yyyy}/{MM}/{dd}/localInstance.txt"
  }
}

The blob is created as the storageFileName dictates, but no log entries are ever written to it. If I switch back to 1.0.0.1, the blobs update as expected. Any insight you may have would be appreciated. Thanks

version 3+ requires .net standard 2.1?

Hi,

When I upgrade from v2.1 to v3.0, the NuGet install fails; it can't support .NET Framework 4.8.
I saw the NuGet dependencies require .NET Standard 2.1.
Why not support .NET Standard 2.0 (.NET Framework 4.8)?
Azure.Storage.Blobs v12 only requires .NET Standard 2.0.

Ability to set the Content Type of the files

Currently, the content type of the files added to Azure Storage is application/octet-stream. This is the default content type for any blob being added to an Azure Storage blob container.
The custom content type of a blob can only be set when creating the blob. Could we have a configuration setting called "ContentType"? When this value is set, the blobs created would have the content type mentioned in the configuration.

Rolling file cleanup

Hello,

Are there any plans to implement some sort of automatic cleanup of the rolling files, like in the File sink?

Thanks

Olivier

Feature Request: Azure.Storage.Blobs 12.x upgrade

Upgrading to the latest major rev would be helpful, and would knock out a couple of extra issues as far as I can tell. Pre-12.x is now considered legacy (they changed package names).

Pros:

  • Upgrade out of legacy dependency
  • 12.x has better integration between Azure libraries, meaning Identity credential management can be integrated for free (would handle at least part of #42)

Cons:

  • 12.x was built from scratch; not an easy upgrade
  • Would be a breaking change for this library: some APIs here expose the old Storage SDK API.

Combine appsettings.json with runtime configuration

I like having Serilog configured in appsettings.json, for example

{
  "Serilog": { 
    "Using": [
      "Serilog.Sinks.AzureBlobStorage"
    ],
    "WriteTo": [
      {
        "Name": "AzureBlobStorage",
        "Args": {
          //"connectionString": "I wish to come from environment variables",
          "storageContainerName": "SomeContainer",
          "outputTemplate": "[{Timestamp:HH:mm:ss} {Level:u3} {SubscriptionName,-10}] {Message:lj} {NewLine}{Exception}",
          "storageFileName": "logs/{yyyy}-{MM}-{dd}-{HH}.txt"
        }
      }
    ]
  }
}

At the same time I would like to configure the connection string at runtime, or leave the connection string in appsettings.json but substitute the storage account name and key with environment variables.

I have tried the approach of building configuration twice, where first

var configurationBuilder = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", false, true)
    .AddEnvironmentVariables();

then create the storage connection string and finally

configurationBuilder.AddInMemoryCollection(new[] { 
    new KeyValuePair<string, string>(
        "Serilog:WriteTo:AzureBlobStorage.connectionString", 
        storageConnectionString)});

but that key is not "hitting" the right field for the Serilog configuration to pick it up.

Any ideas?
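Serilog.Settings.Configuration binds WriteTo as a configuration array, so an override key has to address the sink by its element index rather than by name. A sketch, assuming AzureBlobStorage is the first WriteTo entry in appsettings.json:

```csharp
// "Serilog:WriteTo:0:Args:connectionString" targets Args.connectionString of the
// first (index 0) element of the WriteTo array, per .NET configuration binding.
configurationBuilder.AddInMemoryCollection(new[] {
    new KeyValuePair<string, string>(
        "Serilog:WriteTo:0:Args:connectionString",
        storageConnectionString)});
```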

Log file content not output when using Service Fabric

Whilst running my Service Fabric API within the remote cluster, it fails to output any log information to the log file; however, it does create the folder within the Blob container and it also creates the log file. It just has no content.

I have added file logging to ensure that the log is being populated correctly and that log file when viewed on the VM does indeed contain log information.

If have configured the sink using the appsettings.json file, but I have also configured programmatically with the same result.

I have attached the settings file and I have also attached my StatelessService class to show how I am setting up the Logger.

Identity.cs.txt

appsettings.json.txt

Error message raised by Serilog.Debugging.SelfLog:

2019-05-15T08:10:48.6968808Z Caught exception while emitting to sink Serilog.Sinks.AzureBlobStorage.AzureBlobStorageSink: System.MissingMethodException: Method not found: 'System.Threading.Tasks.Task`1<Int64> Microsoft.Azure.Storage.Blob.CloudAppendBlob.AppendBlockAsync(System.IO.Stream)'.
   at Serilog.Sinks.AzureBlobStorage.DefaultAppendBlobBlockWriter.WriteBlocksToAppendBlobAsync(CloudAppendBlob cloudAppendBlob, IEnumerable`1 blocks)
   at System.Runtime.CompilerServices.AsyncMethodBuilderCore.Start[TStateMachine](TStateMachine& stateMachine)
   at Serilog.Sinks.AzureBlobStorage.DefaultAppendBlobBlockWriter.WriteBlocksToAppendBlobAsync(CloudAppendBlob cloudAppendBlob, IEnumerable`1 blocks)
   at Serilog.Sinks.AzureBlobStorage.AzureBlobStorageSink.Emit(LogEvent logEvent)
   at Serilog.Core.Sinks.SafeAggregateSink.Emit(LogEvent logEvent)

Can append blobs be used?

Is it possible to use append blobs? That definitely makes the most sense for logging to me. Is it more efficient than regular block blobs? What if logging in batches?

Stack Overflow when using sink within Function App

I have migrated our project to latest package dependencies for various things and removed some deprecated Microsoft packages etc. and then found our app no longer deployed within the Function App in Azure. Upon investigation a stack overflow message was encountered and I have traced it back to this package.

Update: I am using Azurite and if I set the restrictedToMinimumLevel to Error then the stack overflow doesn't occur because the log events being raised are Request events an example of which shown at the foot of this post.

I have traced the issue to this function: -

Serilog.Sinks.AzureBlobStorage.AzureBlobProvider.GetCloudBlobAsync: Line 46

Line 46 makes this call: -

Response<BlobProperties> propertiesResponse = await currentAppendBlobClient.GetPropertiesAsync().ConfigureAwait(false);

This emits an event handled by: -

Serilog.Sinks.AzureBlobStorage.Emit

and the first thing that this does is call: -

Serilog.Sinks.AzureBlobStorage.AzureBlobProvider.GetCloudBlobAsync: Line 46

DemoDotNetCore31.zip

[08:48:57 INF] Request [37a4ec4f-9503-4a0a-b12d-ad31e9321b4c] HEAD http://127.0.0.1:10000/devstoreaccount1/azure-webjobs-hosts/locks/andydesktop-1020578781/host
x-ms-version:2020-08-04
Accept:application/xml
x-ms-client-request-id:37a4ec4f-9503-4a0a-b12d-ad31e9321b4c
x-ms-return-client-request-id:true
User-Agent:azsdk-net-Storage.Blobs/12.9.0,(.NET 6.0.3; Microsoft Windows 10.0.19044)
x-ms-date:Wed, 06 Apr 2022 07:48:57 GMT
Authorization:REDACTED
client assembly: Azure.Storage.Blobs
[08:48:57 INF] Response [37a4ec4f-9503-4a0a-b12d-ad31e9321b4c] 200 OK (00.0s)
Server:Azurite-Blob/3.16.0
x-ms-creation-time:Wed, 06 Apr 2022 07:46:51 GMT
x-ms-meta-FunctionInstance:REDACTED
x-ms-blob-type:BlockBlob
x-ms-lease-duration:fixed
x-ms-lease-state:leased
x-ms-lease-status:locked
ETag:"0x1EAFB3D22287DC0"
x-ms-client-request-id:37a4ec4f-9503-4a0a-b12d-ad31e9321b4c
x-ms-request-id:1c7c35a3-7e06-4453-bec5-a8007c115f47
x-ms-version:2021-04-10
Date:Wed, 06 Apr 2022 07:48:57 GMT
Accept-Ranges:bytes
x-ms-server-encrypted:true
x-ms-access-tier:Hot
x-ms-access-tier-inferred:true
x-ms-access-tier-change-time:Wed, 06 Apr 2022 07:46:51 GMT
Connection:keep-alive
Keep-Alive:REDACTED
Last-Modified:Wed, 06 Apr 2022 07:46:51 GMT
Content-Length:0
Content-Type:application/octet-stream
Content-MD5:

For Azure Functions v2, ReadFrom.Configuration does not work.

I have an Azure Function v2, and configured all settings in appsettings.json.

In function.cs, I use ReadFrom.Configuration to try to read the settings and write logs to Azure Blob Storage. But there are no logs generated in Blob Storage (the function runs without errors).

my function.cs:

[FunctionName("Function1")]
public static void Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ExecutionContext context)
{
    var configuration = new ConfigurationBuilder()
        .SetBasePath(Environment.CurrentDirectory)
        .AddJsonFile(@"appsettings.json", false, true)
        .Build();

    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .CreateLogger();

    Log.Logger.Information("test 123.");
}

my appsettings.json

{
  "Serilog": {
    "WriteTo": [
      {
        "Name": "AzureBlobStorage",
        "Args": {
          "connectionString": "storage account connection string",
          "storageContainerName": "test-1",
          "storageFileName": "testfile.txt"
        }
      }
    ]
  }
}

Ambiguous call on .WriteTo.AzureBlobStorage

I've set up the write to blob storage with the code below.

.WriteTo.AzureBlobStorage(configuration.GetSection("AzureAd").Get<AzureSettings>().BlobContainer,
    outputTemplate: "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj} {Exception} {NewLine}",

This produces the following error:

Error CS0121 The call is ambiguous between the following methods or properties: 'LoggerConfigurationAzureBlobStorageExtensions.AzureBlobStorage(LoggerSinkConfiguration, string, LogEventLevel, string, string, string, bool, TimeSpan?, int?, bool, IFormatProvider, ICloudBlobProvider, long?, int?, bool)' and 'LoggerConfigurationAzureBlobStorageExtensions.AzureBlobStorage(LoggerSinkConfiguration, string, IConfiguration, LogEventLevel, string, string, string, bool, TimeSpan?, int?, bool, IFormatProvider, ICloudBlobProvider, long?, int?, bool)'

To get around this, it is necessary to explicitly indicate the value for the connectionString property:

.WriteTo.AzureBlobStorage(connectionString: configuration.GetSection("AzureAd").Get<AzureSettings>().BlobContainer...

My project uses the following packages:

<PackageReference Include="Serilog" Version="2.10.0" />
<PackageReference Include="Serilog.AspNetCore" Version="5.0.0" />
<PackageReference Include="Serilog.Sinks.AzureBlobStorage" Version="3.1.1" />
<PackageReference Include="Serilog.Sinks.Console" Version="4.0.2-dev-00890" />

Named ConnectionString not working

I'm currently defining my connection strings in my appsettings.

"ConnectionStrings": {
    "AzureBlob": ""
  }

I want to use the name of my connection string in my sink configuration.

      {
        "Name": "Logger",
        "Args": {
          "configureLogger": {
            "WriteTo": [
              {
                "Name": "AzureBlobStorage",
                "Args": {
                  "connectionString": "AzureBlob",
                  "storageContainerName": "audit",
                  "storageFileName": "API/{yyyy}/{MM}/{dd}/logs.txt"
                }
              }
            ]
          }
        }
      }

It seems that this is not working. Or am I doing something wrong?

Using Account SAS

Hi, I'm trying to use an Account SAS for a blob container, but it doesn't seem to work; any help would be appreciated. I've tried what feels like everything, but I can't get it working at all.

This is my current setup:

var log = new LoggerConfiguration()
    .WriteTo.Async(a => a.AzureBlobStorage(
        formatter: new CompactJsonFormatter(),
        blobServiceClient: new BlobServiceClient(new Uri(config["SharedAccessSignature"])),
        storageContainerName: "somecontainer",
        storageFileName: "{yyyy}-{MM}-{dd}_log.txt"
    ))
    .CreateLogger();

where config["SharedAccessSignature"] is the full SAS token URI (https://).

The BlobServiceClient seems to get built correctly, or at least no exceptions are thrown with the current setup, and the account name is set as it should be.

Thanks a lot.

Add support for using date fields multiple times

I'm currently in the process of implementing this project. What would really help me is if this library supported having each date part multiple times to allow formats like this:

{yyyy}/{MM}/{dd}/{yyyy}-{MM}-{dd}_{HH}:{mm}.txt
2019/06/20/2019-06-20_14:40.txt

This would help in identifying log files which have been downloaded and no longer have the folder name to give them context. One could put all the date fields into the file name, but that would potentially create HUGE leaf folders in terms of the number of files.

Default cloud blob provider fails when using a SAS token

Hi,
I'm using a SAS token to give access to a specific container in my storage account. In this case the DefaultCloudBlobProvider cannot create the container and ends up in the catch block of the provider. But then the returned CloudAppendBlob is always null.

And in cases where the CloudAppendBlob can be created it seems to me that the rolling over to a new log file for a new day will not work. But I am not fully sure of that.

If you want I could make some PR's for the newly created issues...

Br,
Daan

Method call does not proceed after the logger function is called

Hi,
Trying to use the functionality in my Blazor application. Whenever I call a logger function to log a message, the method call does not proceed after the logger call. Below is my method:

void logmessages()
{
    _logger.LogDebug("Logging Debug");
    _logger.LogError("Logging Error");
    _logger.LogTrace("Logging Trace");
    _logger.LogCritical("Logging Critical");
    _logger.LogInformation("Logging Information");
    _logger.LogWarning("Logging Warning");
}

The method just logs "Logging Debug" (or whichever is the first log call), then the function doesn't proceed and the page keeps loading.

Below is my appsettings.json


"Serilog": {
    "Using": [
      "Serilog.Sinks.AzureBlobStorage", "Serilog.Filters.Expressions"
    ],
    "MinimumLevel": {
      "Default": "Debug",
      "Override": {
        "Default": "Information",
        "Microsoft": "Warning",
        "Microsoft.Hosting.Lifetime": "Information"
      }
    },
    "WriteTo": [
      {
        "Name": "Logger",
        "Args": {
          "configureLogger": {
            "Filter": [
              {
                "Name": "ByIncludingOnly",
                "Args": {
                  "expression": "(@Level = 'Error' or @Level = 'Fatal' or @Level = 'Warning' or @Level = 'Information' or @Level = 'Debug')"
                }
              }
            ],
            "WriteTo": [
              {
                "Name": "AzureBlobStorage",
                "Args": {
                  "connectionString": "Valid connection string",
                  "storageContainerName": "My Log",
                  "storageFileName": "logFile.log",
                  "outputTemplate": "{Timestamp:yyyy-MM-dd hh:mm:ss} [{Level:u3}] ({SourceContext}) {Message}{NewLine}{Exception}"
                }
              }
            ]
          }
        }
      }
    ],
    "Enrich": [
      "FromLogContext",
      "WithMachineName",
      "WithProcessId",
      "WithThreadId"
    ],
    "Properties": {
      "Application": "MyApp"
    }
  }

Below is my code for Program.cs

      public static IHostBuilder CreateHostBuilder(string[] args) =>
           Host.CreateDefaultBuilder(args)
           .ConfigureLogging((hostingContext, logging) =>
           {
               var logger = new LoggerConfiguration()
                    .ReadFrom.Configuration(hostingContext.Configuration)
                    .CreateLogger();

               logging.AddSerilog(logger,dispose:true);
           })
               .ConfigureWebHostDefaults(webBuilder =>
               {
                   webBuilder.UseStartup<Startup>();
               });
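
A workaround sketch for this kind of hang, assuming the blocking happens while the sink writes to storage on the calling thread: wrap the blob sink in Serilog.Sinks.Async (the same wrapper package used in other issues on this page) so writes happen on a background worker. This is an untested sketch, not a confirmed fix; the `"Async"` wrapper name and `"configure"` argument follow that package's JSON configuration shape, and the connection details are placeholders:

    "WriteTo": [
      {
        "Name": "Async",
        "Args": {
          "configure": [
            {
              "Name": "AzureBlobStorage",
              "Args": {
                "connectionString": "Valid connection string",
                "storageContainerName": "My Log",
                "storageFileName": "logFile.log"
              }
            }
          ]
        }
      }
    ]

This requires adding the Serilog.Sinks.Async package alongside the blob sink.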

Batch posting not working using JSON configuration

Hello,

I have used Serilog.Sinks.AzureBlobStorage to write logs to Blob storage, and it works fine. Now I have a performance issue, so, as per the documentation, I tried to enable batch posting (writeInBatches) with the following JSON configuration.

    {
      "Name": "AzureBlobStorage",
      "Args": {
        "connectionString": "BLOB ConnectionString",
        "storageContainerName": "test",
        "storageFileName": "{yyyy}/{MM}/Testlog-{yyyyMMdd}.txt",
        "writeInBatches": true,
        "restrictedToMinimumLevel": "Information"
      }
    }

When I enable batch posting by setting writeInBatches: true, it exits without writing anything and without any exception from here. If I set it back to writeInBatches: false, it works fine.

Please let me know if there is any issue in JSON configuration.
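
For comparison, here is a hedged sketch of a batched configuration. The `period` and `batchPostingLimit` parameter names come from the sink's README; the values below are illustrative, not verified against this version. Note also that with batching enabled, events are buffered in memory, so a process that exits without calling `Log.CloseAndFlush()` can silently drop the buffered events, which can look like batching "not working":

    {
      "Name": "AzureBlobStorage",
      "Args": {
        "connectionString": "BLOB ConnectionString",
        "storageContainerName": "test",
        "storageFileName": "{yyyy}/{MM}/Testlog-{yyyyMMdd}.txt",
        "writeInBatches": true,
        "period": "0.00:00:15",
        "batchPostingLimit": 50,
        "restrictedToMinimumLevel": "Information"
      }
    }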

File is created multiple times per hour

Hello,

When configuring to write to blob storage like below:
    "Name": "AzureBlobStorage",
    "Args": {
      "connectionString": "[connection here]",
      "storageContainerName": "blobContainer",
      "storageFileName": "logs/{yyyy}{MM}/{dd}{HH}bloblogs.txt",
      "outputTemplate": "{Timestamp:o} [{Level:u3}] ({Application}/{MachineName}/{ThreadId}) {Message}{NewLine}{Exception}"
    }

I'm seeing that it is logging multiple files in an hour like the format below.

logs/202006/1000bloblogs.txt
logs/202006/1000bloblogs-001.txt
logs/202006/1000bloblogs-002.txt
logs/202006/1000bloblogs-500.txt

It is creating new files within the same hour, and it does not appear to be due to size, since the file sizes vary; it is creating more than 500 files per hour.

Any idea why is it behaving like this?

JSON configuration is not working in .netcore console app

Hello,

I have created a simple .NET Core 2.2 console app and tried to configure the sink using JSON configuration, but it's not working.

Here are my .csproj package references:

    <PackageReference Include="Serilog.Settings.Configuration" Version="3.1.0" />
    <PackageReference Include="Serilog.Sinks.Async" Version="1.4.0" />
    <PackageReference Include="Serilog.Sinks.Console" Version="3.1.1" />
    <PackageReference Include="Serilog.Sinks.File" Version="4.0.0" />
    <PackageReference Include="Serilog.Sinks.AzureBlobStorage" Version="2.0.2" />

Here is the appsettings.json configuration:

    {
      "Serilog": {
        "Using": [ "Serilog.Sinks.AzureBlobStorage" ],
        "MinimumLevel": "Verbose",
        "WriteTo": [
          {
            "Name": "Async",
            "Args": {
              "configure": [
                {
                  "Name": "Console",
                  "Args": {
                    "theme": "Serilog.Sinks.SystemConsole.Themes.AnsiConsoleTheme::Code, Serilog.Sinks.Console",
                    "outputTemplate": "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] {Message:j}{NewLine}{Properties:j}{NewLine}{Exception}"
                  }
                },
                {
                  "Name": "File",
                  "Args": {
                    "restrictedToMinimumLevel": "Warning",
                    "path": "D:\\Logs\\Test_log.txt",
                    "rollingInterval": "Day",
                    "fileSizeLimitBytes": 10240,
                    "rollOnFileSizeLimit": true,
                    "retainedFileCountLimit": 30
                  }
                },
                {
                  "Name": "AzureBlobStorage",
                  "Args": {
                    "connectionString": "[Connection string]",
                    "storageContainerName": "logs",
                    "storageFileName": "test-logs-{dd-MM-yyyy}.txt",
                    "blobSizeLimitBytes": "4000000",
                    "restrictedToMinimumLevel": "Information"
                  }
                }
              ]
            }
          }
        ],
        "Enrich": [ "FromLogContext", "WithMachineName", "WithExceptionDetails" ],
        "Properties": {
          "ApplicationName": "SampleApp",
          "Environment": "Dev"
        }
      }
    }

Here is the code to create the logger and use it:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json")
        .Build();

    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .CreateLogger();

    Log.Verbose("Here's a Verbose message.");
    Log.Information(new Exception("Exceptions can be put on all log levels"), "Here's an Info message.");

Please let me know if I'm missing something.

Thanks,
Abid Ali
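
One common cause of "nothing is written" in short-lived console apps using Async or batched sinks is that buffered events are never flushed before the process exits; Serilog's documented remedy is to call Log.CloseAndFlush() before exit. Below is a minimal, untested sketch of the program above with that call added (the appsettings.json path and connection details are placeholders from the original report):

    using System;
    using Microsoft.Extensions.Configuration;
    using Serilog;

    class Program
    {
        static void Main()
        {
            var configuration = new ConfigurationBuilder()
                .AddJsonFile("appsettings.json")
                .Build();

            Log.Logger = new LoggerConfiguration()
                .ReadFrom.Configuration(configuration)
                .CreateLogger();

            try
            {
                Log.Verbose("Here's a Verbose message.");
                Log.Information(new Exception("Exceptions can be put on all log levels"), "Here's an Info message.");
            }
            finally
            {
                // Flush any events still buffered by Async/batched sinks
                // before the process exits.
                Log.CloseAndFlush();
            }
        }
    }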
