
Comments (11)

shubhamCedargate commented on July 22, 2024

I'm sorry for not replying. I created a sample application using .NET Framework 4.8 and EF6 but could not replicate the issue. The exception disappeared after I did a fresh install of all the packages. Thank you for the assistance.

from audit.net.

thepirat000 commented on July 22, 2024

You should call Audit.Core.Configuration.Setup() and Audit.EntityFramework.Configuration.Setup() only once, from your start-up or initialization code, since they set a global static configuration.

If you call them from your DbContext constructor, they will be called multiple times, potentially from different threads, overriding the global configuration each time and leading to errors.

So, you should move your Setup calls to your start-up code.
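For example, a minimal start-up configuration might look like this (a sketch; the log directory is illustrative, and the fluent calls are the standard Audit.NET / Audit.EntityFramework setup APIs):

```csharp
// Run once from application start-up (e.g. Global.asax Application_Start),
// never from a DbContext constructor:
Audit.Core.Configuration.Setup()
    .UseFileLogProvider(cfg => cfg.Directory(@"C:\Logs"));

Audit.EntityFramework.Configuration.Setup()
    .ForAnyContext(cfg => cfg
        .IncludeEntityObjects());
```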

An alternative to set the data provider per DbContext instance is to set the AuditDataProvider property of the IAuditDbContext in the constructor:

public CoreDbContext()
{
    _auditContext = new DefaultAuditContext(this);
    // Set the data provider for this DbContext instance: 
    _auditContext.AuditDataProvider = new FileDataProvider(c => c.Directory(@"C:\Logs"));
    _helper.SetConfig(_auditContext);
}

or if you inherited from AuditDbContext, just:

public CoreDbContext()
{
    this.AuditDataProvider = new FileDataProvider(c => c.Directory(@"C:\Logs"));
}

In these two cases, the audited DbContext will not use the globally configured data provider, but the instance provided in the constructor.


shubhamCedargate commented on July 22, 2024

Yes sir,
The audit logs are generated when I add the mentioned code in the constructor:

public CoreDbContext()
{
    _auditContext = new DefaultAuditContext(this);
    // Set the data provider for this DbContext instance:
    _auditContext.AuditDataProvider = new FileDataProvider(c => c.Directory(@"C:\Logs"));
    _helper.SetConfig(_auditContext);
}

But it stops working immediately after installing the Audit.NET.Elasticsearch package without any other code changes.

I need Audit.NET.Elasticsearch in order to write the audit logs onto ES.


thepirat000 commented on July 22, 2024

Note that you must use the same version for all the Audit.* references. It looks like you're using a different version for Audit.EntityFramework:

Audit.NET, Version=21.0.0.0
Audit.NET.Elasticsearch, Version=21.0.0.0
Audit.EntityFramework, Version=20.2.4.0
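Fixing this usually just means aligning the package versions, e.g. updating Audit.EntityFramework to match (an illustrative SDK-style csproj fragment; with packages.config, `Update-Package Audit.EntityFramework -Version 21.0.0` in the Package Manager Console achieves the same):

```xml
<ItemGroup>
  <!-- All Audit.* packages pinned to the same version -->
  <PackageReference Include="Audit.NET" Version="21.0.0" />
  <PackageReference Include="Audit.NET.Elasticsearch" Version="21.0.0" />
  <PackageReference Include="Audit.EntityFramework" Version="21.0.0" />
</ItemGroup>
```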


thepirat000 commented on July 22, 2024

Closing this, assuming it was solved by correcting the version number. Otherwise, please comment below.


shubhamCedargate commented on July 22, 2024

Also, what is the latest version of ES server that Audit.NET.Elasticsearch supports? https://cloud.elastic.co/ only allows the creation of an ES server of version 7.17.9 or higher for trial purposes.

Audit.NET.Elasticsearch 21.0.0 installs the Elasticsearch.Net 7.17.0 package by default.


thepirat000 commented on July 22, 2024

> Also, what is the latest version of ES server that Audit.NET.Elasticsearch supports? https://cloud.elastic.co/ only allows the creation of an ES server of version 7.17.9 or higher for trial purposes.
>
> Audit.NET.Elasticsearch 21.0.0 installs the Elasticsearch.Net 7.17.0 package by default.

You can force the use of 7.17.9 by adding a direct reference to that version in your app, assuming it is binary-compatible with 7.17.0+.


shubhamCedargate commented on July 22, 2024

Can we filter which entities to watch when using a custom data provider?

Previously, I did this with the Audit.EntityFramework package by annotating the entity with [AuditInclude]:

[AuditInclude]
public class Patient : EntityBase

and using .UseOptIn() to track only those entities:

Audit.EntityFramework.Configuration.Setup()
    .ForAnyContext(_ => _
        .AuditEventType(EventTypeEntityFramework)
        .IncludeEntityObjects())
    .UseOptIn();

Now I'm trying to write logs into an Elasticsearch domain through Kinesis Firehose, for which I've created a custom data provider like this:

```csharp
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Amazon.KinesisFirehose;
using Amazon.KinesisFirehose.Model;
using Audit.Core;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

namespace Cwm
{
    public class CustomKinesisFirehoseDataProvider : AuditDataProvider
    {
        private readonly AmazonKinesisFirehoseClient _kinesisFirehoseClient;
        private readonly string _deliveryStreamName;

        public CustomKinesisFirehoseDataProvider(AmazonKinesisFirehoseClient kinesisFirehoseClient, string deliveryStreamName)
        {
            _kinesisFirehoseClient = kinesisFirehoseClient;
            _deliveryStreamName = deliveryStreamName;
        }

        // Serializes the audit event and wraps it in a Firehose PutRecord request.
        private PutRecordRequest BuildPutRecordRequest(AuditEvent auditEvent)
        {
            var json = JsonConvert.SerializeObject(auditEvent, new JsonSerializerSettings
            {
                ContractResolver = new CamelCasePropertyNamesContractResolver(),
                NullValueHandling = NullValueHandling.Ignore,
                ReferenceLoopHandling = ReferenceLoopHandling.Ignore
            });
            return new PutRecordRequest
            {
                DeliveryStreamName = _deliveryStreamName,
                Record = new Record { Data = new MemoryStream(Encoding.UTF8.GetBytes(json)) }
            };
        }

        public override object InsertEvent(AuditEvent auditEvent)
        {
            var request = BuildPutRecordRequest(auditEvent);
            // GetAwaiter().GetResult() instead of .Wait() so a failure surfaces
            // as the original exception rather than an AggregateException.
            _kinesisFirehoseClient.PutRecordAsync(request).GetAwaiter().GetResult();
            return request.Record;
        }

        public override async Task<object> InsertEventAsync(AuditEvent auditEvent, CancellationToken cancellationToken = default)
        {
            var request = BuildPutRecordRequest(auditEvent);
            await _kinesisFirehoseClient.PutRecordAsync(request, cancellationToken);
            return request.Record;
        }

        public override void ReplaceEvent(object eventId, AuditEvent auditEvent)
        {
            _kinesisFirehoseClient.PutRecordAsync(BuildPutRecordRequest(auditEvent)).GetAwaiter().GetResult();
        }

        public override async Task ReplaceEventAsync(object eventId, AuditEvent auditEvent, CancellationToken cancellationToken = default)
        {
            await _kinesisFirehoseClient.PutRecordAsync(BuildPutRecordRequest(auditEvent), cancellationToken);
        }
    }
}
```

and I've used this data provider in the Startup config like this:

```csharp
var credentials = new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY");
var kinesisFirehoseClient = new AmazonKinesisFirehoseClient(credentials, Amazon.RegionEndpoint.USEast1);
var deliveryStreamName = "DATA_STREAM_NAME";
var firehoseProvider = new CustomKinesisFirehoseDataProvider(kinesisFirehoseClient, deliveryStreamName);

Audit.Core.Configuration.DataProviderFactory = () => firehoseProvider;

Audit.Core.Configuration.Setup()
    .UseCustomProvider(firehoseProvider)
    .WithCreationPolicy(EventCreationPolicy.InsertOnStartInsertOnEnd);

Audit.Core.Configuration.DataProvider = firehoseProvider;
```

I am able to write audit logs into the OpenSearch domain, but I'm having a tough time configuring selective tracking of only certain entities as I did previously.

Could you please point out if I'm missing something?
Thank you!


shubhamCedargate commented on July 22, 2024

I missed changing the AuditOptionMode property of the annotation on the DbContext class.

It seems to work after:

[AuditDbContext(Mode = AuditOptionMode.OptIn, IncludeEntityObjects = false, AuditEventType = "EF")]
internal class CoreDbContext : BaseContext


thepirat000 commented on July 22, 2024

ok 👌

please note you don't need the following line:

Audit.Core.Configuration.DataProviderFactory = () => firehoseProvider;

since you're afterwards calling .UseCustomProvider(), which does essentially the same thing.
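With that in mind, the start-up configuration can be reduced to just (a sketch using the same names as in the config above):

```csharp
Audit.Core.Configuration.Setup()
    .UseCustomProvider(firehoseProvider)
    .WithCreationPolicy(EventCreationPolicy.InsertOnStartInsertOnEnd);
```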


shubhamCedargate commented on July 22, 2024

Hello again!
I ran into an issue when writing the audit logs into ES:

{"type":"illegal_argument_exception","reason":"mapper [entityFrameworkEvent.entries.changes.newValue] cannot be changed from type [text] to [ObjectMapper]"}

I'm using Kinesis Firehose to write logs into an Amazon OpenSearch ES cluster.

I'm guessing this error was caused because the newValue and originalValue nodes carry multiple data types. Whichever data type was inserted into the Kibana index first determined the field's mapping ([text] in this case), so when a value of a different type arrived for the same node, the insert threw the exception because the mapping had already been fixed and could not be changed.

Can I get around this scenario?
(Ideally by making originalValue and newValue accept dynamic data types.)

A part of the "entityFrameworkEvent" > "entries" JSON looks like this:

         "action":"Update",
         "changes":[
            {
               "columnName":"Name",
               "originalValue":{
                  "firstName":"Robin",
                  "lastName":"Scherbatsky",
                  "firstNameDecrypted":"Robin",
                  "lastNameDecrypted":"Scherbatsky",
                  "nameDecrypted":true
               },
               "newValue":{
                  "firstName":"ENCRYPTED_STRING_HERE",
                  "lastName":"ENCRYPTED_STRING_HERE",
                  "firstNameDecrypted":"Robin Sparkles",
                  "lastNameDecrypted":"Scherbatsky",
                  "nameDecrypted":true
               }
            },
            {
               "columnName":"NameHash",
               "originalValue":"ENCRYPTED_STRING_HERE",
               "newValue":"ENCRYPTED_STRING_HERE"
            }
         ]

My entity class looks somewhat like this:

public class Patient : EntityBase, IHaveEncryptedProperties
{
    public Guid Id { get; private set; }
    ...
    public Name Name { get; set; }
    ...
}

where the Name class holds the first, middle, and last name properties:

public class Name : IHaveEncryptedProperties
{
    ...
    [Encrypt]
    public string FirstName { get; protected set; }
    [Encrypt]
    public string MiddleName { get; protected set; }
    [Encrypt]
    public string LastName { get; protected set; }
    ...
}
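One possible workaround for a mapping conflict like this (a sketch, not verified against this exact Firehose/OpenSearch setup; GetEntityFrameworkEvent() is the Audit.EntityFramework extension on AuditEvent) is to flatten complex change values into JSON strings before the event is saved, so the field always arrives as [text]:

```csharp
using Audit.Core;
using Audit.EntityFramework;
using Newtonsoft.Json;

// Registered once at start-up: before each audit event is saved, serialize any
// non-string change value to a JSON string so the index mapping stays [text].
Audit.Core.Configuration.AddCustomAction(ActionType.OnEventSaving, scope =>
{
    var efEvent = scope.Event.GetEntityFrameworkEvent();
    if (efEvent == null) return;
    foreach (var entry in efEvent.Entries)
    {
        if (entry.Changes == null) continue;
        foreach (var change in entry.Changes)
        {
            if (change.OriginalValue != null && !(change.OriginalValue is string))
                change.OriginalValue = JsonConvert.SerializeObject(change.OriginalValue);
            if (change.NewValue != null && !(change.NewValue is string))
                change.NewValue = JsonConvert.SerializeObject(change.NewValue);
        }
    }
});
```

The alternative on the Elasticsearch side would be an index template that disables dynamic mapping for those fields, but stringifying at the source keeps the index untouched.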

