
zzzprojects / bulk-operations

142 stars · 11 watching · 36 forks · 365 KB

C# SQL Bulk Operations | High-performance C# bulk insert, update, delete and merge for SQL Server, SQL Azure, SQL Compact, MySQL, and SQLite.

Home Page: https://bulk-operations.net

Language: C# 100.00%
Topics: sqlbulkcopy, csharp, dotnet

bulk-operations's People

Contributors

jonathanmagnan, lempireqc, stgelaisalex, waqasm78


bulk-operations's Issues

How to update selected data with BulkUpdate?

Hi,
I have a question about bulk update.

Very simple example. I have this Table:

| ID | NAME  | PASS  |
|----|-------|-------|
| 1  | UserA | PassA |
| 2  | UserB | PassB |

I want to update the name for user A, and the Pass for user B. How can I do it with bulk?

If I set the Pass value empty for user A, bulk sets it to null, and the same happens for user B with his name. This is a problem for me. How can I resolve it?

What I mean is that I want to send to bulk only the values that I want to update, because of restrictions in my application.

Thanks in advance.
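One possible approach, going by the ColumnMappings overloads used in other issues on this page: map only the columns you want to change, so unmapped columns are never written. A minimal sketch, assuming SQL Server; the Users table and User class here are illustrative, not from the original question:

using System.Collections.Generic;
using System.Data.SqlClient;
using Z.BulkOperations;

public class User
{
    public int ID { get; set; }
    public string NAME { get; set; }
    public string PASS { get; set; }
}

public static class PartialUpdate
{
    // Updates only NAME; PASS is never mapped, so it should be left untouched.
    public static void UpdateNamesOnly(string connectionString, List<User> users)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulk = new BulkOperation<User>(connection))
            {
                bulk.DestinationTableName = "Users";            // illustrative table name
                bulk.ColumnMappings.Add("ID", "ID", true);      // key column
                bulk.ColumnMappings.Add("NAME", "NAME", false); // the only updated column
                bulk.BulkUpdate(users);
            }
        }
    }
}

Rows that need different column sets (NAME for user A, PASS for user B) would then be split into two lists and updated in two passes, one mapping per pass.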

Bulk insert to existing table with different schema

Hello,

We have a table already defined in our DB that we use your library to bulk insert into. The incoming data has changed in column definition, and we now have more columns than the existing table. I was expecting the bulk insert to fail; however, it works without issue and simply ignores the new columns in the data we are trying to insert.

Is there any way to check that the receiving table has the same definition as the source data? How do you handle these scenarios?

Thanks
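There is no library feature cited in these issues for this, but the destination schema can be checked up front with plain ADO.NET before calling BulkInsert: fetch the destination's column list with a metadata-only query and diff it against the source. A sketch, assuming SQL Server and a DataTable source; the helper name is invented:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class SchemaCheck
{
    // Returns the source columns that do not exist in the destination table.
    public static List<string> FindUnmappedColumns(SqlConnection connection, string tableName, DataTable source)
    {
        var destinationColumns = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        // "SELECT TOP 0" returns only metadata, no rows. tableName is assumed trusted here.
        using (var command = new SqlCommand($"SELECT TOP 0 * FROM [{tableName}]", connection))
        using (var reader = command.ExecuteReader(CommandBehavior.SchemaOnly))
        {
            for (int i = 0; i < reader.FieldCount; i++)
                destinationColumns.Add(reader.GetName(i));
        }

        return source.Columns.Cast<DataColumn>()
            .Select(c => c.ColumnName)
            .Where(name => !destinationColumns.Contains(name))
            .ToList();
    }
}

Throwing when the returned list is non-empty gives the fail-fast behaviour the question expected.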


Issue in inserting the identity

Hi,
I am trying to copy data from SQL Server to MySQL using BulkOperation, and I want to insert the identity column as it is in SQL Server, but it is not working for me; the odd thing is that it works for one of the tables.
All of the tables are the same in both databases, and my code is very simple:

bulk.ColumnMappings.Add("Id", "Id",true);
           bulk.UseLogDump = true;
           bulk.InsertKeepIdentity = true;
            bulk.InsertIfNotExists = true;
           bulk.BulkInsert(dr);
           string d= bulk.LogDump.ToString(); 

From the LogDump, I can't see the Id in the list of fields.
Any help is appreciated.
Thanks

Usage of composite AutoMapKeyNames

Hi,

We need to use multiple-key mapping (i.e., assign a group of AutoMapKeyNames) in a bulk merge operation.

We want to validate uploaded data against a combination of columns: if a record with that combination of column values already exists, the existing record should be updated.
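For what it's worth, another issue further down this page shows AutoMapKeyName being assigned a semicolon-separated list (bulk.AutoMapKeyName = "Name;Description"), which suggests composite keys are declared this way. A sketch under that assumption, with invented table and column names:

using (var bulk = new BulkOperation(connection))
{
    bulk.DestinationTableName = "Uploads"; // illustrative
    // Semicolon-separated names appear to declare a composite merge key:
    // a row matches only when both columns match an existing record.
    bulk.AutoMapKeyName = "BatchCode;LineNumber";
    bulk.BulkMerge(dataTable);
}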

How to use BulkOperation.ResolutionType enum?

Hi,
I am using the BulkOperation class (Z.BulkOperations namespace), but I couldn't find documentation about the ResolutionType enum in the wiki or in the assembly file.
I would like to understand the difference between the following enum values (how they work in detail) and the advantages/disadvantages of each, so I know the best option for my cases:

  1. Smart
  2. TemporaryTable
  3. WorkingTable

Thanks.
Andrea

Enquiry

Hello there,
Please, I want to achieve the following scenario.
We have an MVC web application already built using EF and a SQL Server database.
The application and database are both hosted on a local computer (COMPUTER A).
We also have a database with a similar schema hosted on another computer (COMPUTER B) with access to the internet.
We intend to provide internet access to COMPUTER A so it can send its data to COMPUTER B while keeping track of synced data.
Do you think this project can do this?
Thanks.

How to enable the Pro license?

Hi.
To add the license, I added the license name and key to the appSettings section of app.config.
After that, to enable the Pro license, must I call the ValidateLicense() method when the application starts, before using the library? Is that right?
Is it a problem to call this method every time the application runs?

Thanks.
Andrea

Issue with insert into SQL Server table from ExpandoObject

Hello,

I have an expando object whose keys have full stops in the name, i.e. "Avg. Position". The table in SQL Server has exactly the same column names; however, BulkInsert does not work, as it seems to complain about the columns with a full stop in them. The error message is:

' Position' is not a member of type 'System.Object'

Interestingly, if I remove the full stop from the table column name (Avg Position), the data inserts even if the key in the expando object still has it (Avg. Position).

The table is created on the fly depending on the JSON returned from any number of API calls, so I would like to avoid having to amend the column names in code if possible.

Are you able to look into this issue?

Many thanks
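Pending a fix, one workaround suggested by the observation above (the insert works once the names contain no full stops): since the table is created on the fly anyway, strip the full stops from the expando keys and create the table columns from the sanitized names. A sketch; the helper is hypothetical:

using System.Collections.Generic;

public static class ExpandoSanitizer
{
    // Strip full stops from the expando keys so neither side contains a dot.
    // Because the table is created on the fly from the same JSON, its columns
    // can be created from these sanitized names as well.
    public static IDictionary<string, object> SanitizeKeys(IDictionary<string, object> row)
    {
        var clean = new Dictionary<string, object>();
        foreach (var pair in row)
            clean[pair.Key.Replace(".", "")] = pair.Value;
        return clean;
    }
}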


Primary Key not Present in Dictionary

Hello!

I have a MySQL Bulk Insert operation. The primary key is a composite key of { PartyId, PartySourceSystemId }.

In my database, I have a pre-insert trigger that takes care of assigning the primary key. In other words, I do not need to set it explicitly in my code.

var parties = new List<Party>
{
    new Party
    {
        // PartyId - Primary Key (assigned by the pre-insert trigger)
        PartySourceSystemId = ctx.SourceSystemId, // Primary Key
        PartyTypeId = 1, // Non-nullable FK
        PartyTypeSourceSystemId = 1, // Non-nullable FK
        PartyKey = Guid.NewGuid() // Non-nullable
    }
};

try
{
    ctx.BulkInsert<Party>(parties, opts =>
    {
        opts.CaseSensitive = CaseSensitiveType.Insensitive;
    });
}
catch (Exception)
{
    // The "Primary Key not Present in Dictionary" exception surfaces here.
    throw;
}

This results in the "Primary Key not Present in Dictionary" exception.
I can resolve it by manually mapping out the columns:

ctx.BulkInsert<Party>(parties, opts =>
    {
        opts.CaseSensitive = CaseSensitiveType.Insensitive;
        opts.ColumnMappings.Add(a => a.PartyId, true);
        opts.ColumnMappings.Add(a => a.PartySourceSystemId, true);
        opts.ColumnMappings.Add(a => a.PartyKey, false);
        opts.ColumnMappings.Add(a => a.PartyTypeId, false);
        opts.ColumnMappings.Add(a => a.PartyTypeSourceSystemId, false);
    });

But now I have to manually map every column every time I use BulkInsert.
Is there a better way?

Thanks!!!

Facing an issue with bulk merge on a table without a primary key

Hi,

Issue details:

I have an "AA" table that does not have a primary key. Will the merge concept work in this case? Otherwise, how do I pass the primary key value to the merge method?

My code:
BulkOperationManager.BulkOperationBuilder = operation => operation.BatchTimeout = 999999999;
if (scrStringOrderByPKCol == "")
{
    var bulk = new BulkOperation(destinationConnection);
    bulk.DestinationTableName = targetTableName;
    bulk.BatchSize = batchSize;
    bulk.UseInternalTransaction = true;
    bulk.DataSource = stagingReader;
    bulk.CaseSensitive = CaseSensitiveType.Insensitive;
    bulk.BulkMerge();
}

Will the property below work?

bulk.ColumnPrimaryKeyExpression = ??????

Issue:
If I add the above property, I get the error "An error occured while resolving ColumnPrimaryKey expression".

Question:

Will BulkMerge work without a primary key?

please guide me
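One thing worth trying, based on the ColumnMappings overloads used elsewhere in these issues: with no primary key on the table, the merge key can be declared explicitly by flagging one or more mappings as key columns. A sketch with invented column names; this is an extrapolation from those other examples, not a confirmed answer:

var bulk = new BulkOperation(destinationConnection);
bulk.DestinationTableName = targetTableName;
bulk.DataSource = stagingReader;
// The 'true' flag marks a mapping as a key column for the merge join.
bulk.ColumnMappings.Add("Code", "Code", true);      // illustrative key column
bulk.ColumnMappings.Add("Amount", "Amount", false); // illustrative non-key column
bulk.BulkMerge();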

Bulk Synchronize not deleting rows on SQLite

Hello.

I'm having a problem using Bulk Synchronize with SQLite. I'm using the attached files to do a bulk insert of one of them, then the bulk synchronize with the other one. These are the results I got:

  • The new lines are inserted
  • The lines with differences are updated
  • The lines missing are not deleted

This scenario works properly with SQL Server and MySQL, the ones I tested locally. This seems to be a bug related only to SQLite.

testdata.xlsx
updateTestData.xlsx

Getting Error "one of the identified items was in an invalid format" for BulkInsert

So basically this is a very small issue, but I have tried almost all sorts of modifications. I compared the DB data types to the ones in the code, etc.
Here is the piece of code I am using for BulkInsert. For testing purposes I have replaced the original table into which I initially wanted to bulk insert with a simpler one. It just has three fields: one datetime and two string columns.

It basically throws the error mentioned in the title. I have tried setting ColumnDirectionMapping to input and also tried changing the way I am inputting the date in the DAY column. Any suggestions would be helpful.

Update: from my further research, I guess the way I am assigning the date to the DAY column is wrong. Please suggest a proper way of doing that.

Performance difference with BulkOperation Insert and Context.BulkInsert

We are using the ZZZ EntityFramework Core extension version 2.0.10. I tried a couple of options for performing the bulk insert operation:

Option1:

var conn = context.Database.GetDbConnection();
var bulk = new BulkOperation(conn);
bulk.BulkInsert(result);

The entity on which this works has an Id column set as the primary key. This operation takes about 2 minutes to process 200000 records.

Option2:

context.BulkInsert(result, options =>
{
    options.SqlBulkCopyOptions = SqlBulkCopyOptions.TableLock;
    // options.TemporaryTableInsertBatchSize = 10000;
    options.BatchSize = 1000;
    options.ColumnPrimaryKeyExpression = c => new { c.GUID };
});

We are implementing a generic approach and using GUID as the key. We don't have an Id column in the data class/entity. This takes over 12 minutes to process the same 20000 records, or sometimes just hangs. I tried different BatchSize and SqlBulkCopy options, but without much improvement.

Is there an issue with the way this is working?

Problem using the BulkMerge() method when adding a new column to the destination table at runtime

Hi,
I have a problem using the BulkMerge() method when I add a new column to the destination table while the application is running. In this case, the following error occurs:

"Message: An error occured while resolving mapping by name. See the inner exception for details"
InnerException: "Missing Colum : On Table : [Destination table name]"

Here I report the related code:

using (var bulkMerge = new BulkOperation(conn))
{
    bulkMerge.UseCompile = true;
    bulkMerge.Transaction = trans;
    bulkMerge.BatchSize = 10000;
    bulkMerge.DestinationTableName = "BM_Items";

    bulkMerge.ColumnMappings.Add("IdERP", "IdERP", true);
    bulkMerge.ColumnMappings.Add("IdVariant", "IdVariant", true);
    bulkMerge.ColumnMappings.Add("Name", "Name", false);
    bulkMerge.ColumnMappings.Add("Nature", "Nature", false);
    if (!skipUoM) bulkMerge.ColumnMappings.Add("IdERPUoM", "IdERPUoM", false);

    bulkMerge.AllowDuplicateKeys = true;

    bulkMerge.BulkMerge(dataTable);

    dataTable.Dispose();

    trans.Commit();
    conn.Close();
}

I'm exporting data retrieved from a DataTable into the BM_Items table (the first time, the IdERPUoM column has not yet been added to the table).
After the first export has been executed (the app is still running), I add the new column "IdERPUoM" and reinvoke BulkMerge. At that point the above error occurs:

Message: An error occured while resolving mapping by name. See the inner exception for details"
InnerException: "Missing Colum : [IdERPUoM] On Table : [BM_Items]

If I rerun the application, the error doesn't occur and the export works correctly.

Potential bug:
From what I understand, once at least one export with BulkMerge has been done to the destination table, the Bulk library does not detect changes made to that table (such as the new column) while the application is still running. Restarting the app solves the problem.

I need an urgent answer. I would like to know whether this is a bug or whether I'm missing something in the configuration of the BulkOperation object.

Thanks in advance.

BulkInsert and BulkDelete?

How can I batch two tables at the same time?

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // BulkInsert
    using (var bulk = new BulkOperation<Task01Rec>(connection))
    {
        bulk.DestinationTableName = "Task_01_Rec_History";
        bulk.AutoMapOutputDirection = false; // performance can be improved with options

        bulk.BulkInsert(tasklist.ToList<Task01Rec>());

        bulk.BulkDelete(systasklist.ToList<sysTask>());
    }
}

Bulk insert never ends

Hi
Can you help us with this issue?
We developed an application that copies data between 2 servers using a DataReader, and when we tried it in a testing environment the bulk insert never ends.
Is there some known issue or a hint to this?

Thanks for your help

Problem with BulkInserting an IList of dynamic/IDictionary<string, object> on Oracle 12.2

I'm new to ZZZProjects and Bulk-Operations, and I'm trying to copy data from one Oracle environment to another in batches of 200 records.

The "normal" Dapper insert (Execute) works as expected but is somewhat slow (3 to 4 seconds for 200 records).
I'm querying with the dynamic type, which I transform right after the query into an IDictionary<string, object>.

But when trying to use BulkInsert I'm stuck trying different settings. Are dynamics not supported for BulkInsert?

Code I have so far is:

DapperPlusManager.Entity<IDictionary<string, object>>("mapperKey_001")
    .Table(table.Value.Meta.TargetTableName)
    .Map(x => x["ID"]);
transaction.BulkInsert("mapperKey_001", data.Rows);

Am I doing something wrong?

Thanks for any help and suggestions.

Kind regards,
Henry Roeland

Error : Atleast one members must be declared.

Good afternoon,

I cannot seem to get this working.
What I am trying to do is:

a BulkInsert from an ODBC data source into a MySQL table as the target, through an OdbcDataReader.

When using a DataTable to a MySQL table, it works just fine.
When using an OdbcDataReader with a MS SQL database as the target, it works just fine.

But the combination of an OdbcDataReader to a MySQL database as the target gives the error message: "Atleast one members must be declared."

The code I use is something like this (the simplified version):

// Open the ODBC connection and fill the OdbcDataReader (this is the data source)
OdbcConnection sourceConn = new OdbcConnection(SourceOdbcConnString);
sourceConn.Open();
OdbcDataReader Odbcreader;
OdbcCommand commandTC = new OdbcCommand
{
    Connection = sourceConn,
    CommandText = "SELECT * FROM TestTable"
};
Odbcreader = commandTC.ExecuteReader();

// Open the MySQL connection and try to BulkInsert the contents of the OdbcDataReader
MySqlConnection mySqlTargetConn = new MySqlConnection("server=localhost;userid=*******;pwd=**********;database=test");

using (BulkOperation bulk = new BulkOperation(mySqlTargetConn))
{
    bulk.DestinationTableName = "TestTableTarget";
    bulk.BulkInsert(Odbcreader);
}
Odbcreader.Close();
mySqlTargetConn.Close();

Can you perhaps point me in the right direction?
I would be very happy to get this working.

Best Regards,

J. Fielder

BulkUpdate is not updating

Hello,
I'm trying to update some values and it's not working. I'm doing it like the examples I saw, but no values are updated. I'm using SQL Server.
What am I doing wrong?
Thanks.

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulk = new BulkOperation<MyBussines>(connection))
    {
        bulk.DestinationTableName = "MyBussines";
        bulk.ColumnMappings.Add("MyBussinesId", "MyBussinesID", true);
        bulk.ColumnMappings.Add("MyBussinesNameId", "MyBussinesNameID", false);
        bulk.ColumnMappings.Add("MyBussinesSavedByUserId", "MyBussinesSavedByUserID", false);
        bulk.BulkUpdate(values);
    }
}

BulkInsert/BulkMerge in combination with Npgsql (PostgreSQL) has weird behaviour

Hi,
I have the following problem in trying to BulkInsert several entities into a table. The code looks like this:

[Table("Properties")]
[Serializable]
public class Property
{
    public long ID { get; set; }
    public string PropertyGroup { get; set; }
    public string TranslationKey { get; set; }
    public int GroupOrder { get; set; }
    public int Order { get; set; }
}

.
.
.
DapperPlusManager.Entity<Property>().Table("Properties");
DapperPlusManager.Entity<Property>().Identity(x => x.ID);
DapperPlusManager.Entity<Property>().Key(x => x.ID);
.
.
.
using (var dapperDb = new NpgsqlConnection(_connectionString))
{
    dapperDb.Open();
    await dapperDb.BulkActionAsync(x => x.BulkInsert(insert)); // x.BulkMerge(insert) shows the same behaviour
}

'insert' is just a list of 50 elements of the class Property; the ID is set to 0 for every object before BulkInsert is called.

After executing the code above, only the last element of that list is inserted into the DB table and no other element. Calling the method again shows the same behaviour: only the last element is added to the DB table, and so on. So I would have to call the method 50 times to add all elements, which is definitely not intended.

Is there some special option in Dapper to get this working with Npgsql? Or do I have to set something special in the PostgreSQL database to get the bulk methods to work?

Versions I used:

  • PostgreSQL 13.1, compiled by Visual C++ build 1914, 64-bit
  • Npgsql.EntityFrameworkCore.PostgreSQL Version 3.1.4
  • DapperPlus 3.0.29
  • EntityFrameworkCore 3.1.9

System.OutOfMemoryException in MySQL inserting one million rows

Hello,
I'm trying to insert 1 million rows into a table with 3 columns, and I get an out-of-memory exception; this is the stack trace:

at System.Text.StringBuilder.ExpandByABlock(Int32 minBlockCharCount)
   at System.Text.StringBuilder.Append(Char* value, Int32 valueCount)
   at System.Text.StringBuilder.AppendHelper(String value)
   at System.Text.StringBuilder.Append(String value)
   at �.�.�(BulkOperation , DbCommand )
   at �.�.�(DbCommand , BulkOperation )
   at �.�.�(� , DbCommand )
   at �.�.�(List`1 )
   at �.�.�(List`1 )
   at �.�.�(BulkOperation )
   at Z.BulkOperations.BulkOperation.BulkInsert()
   at Z.BulkOperations.BulkOperation.BulkInsert(Object datasource)

Could you please help me?

Thanks in advance.

Problem on SQL Server when executing BulkInsert(dataTable)

Hi,
I get this error when executing BulkInsert(dataTable) with SQL Server:
"Must declare the scalar variable "@Table_0"."
After reading several documents, I think this is probably caused by the undeclared variable "@Table_0".

Any Help Please?

Thanks in advance.

Synchronize with SynchronizeMatchedAndFormula

I'm trying to use BulkSynchronize to synchronize some tables. The problem is that I don't have the entire tables in memory, only the rows that relate to a specific aggregate root. I would expect SynchronizeMatchedAndFormula to be used during the delete phase of BulkSynchronize, but it does not seem to be. Currently all rows that don't belong to the aggregate root are deleted, but I want all rows that don't match the AggregateRootID to be kept.

This is how I use SynchronizeMatchedAndFormula
SynchronizeMatchedAndFormula = $"AggregateRootID = '{id}'";

Giving Composite key of table and Registering the Mapping using DapperPlusManager

Using the code snippets below, I get exceptions.

Code Snippet 1: trying to map the composite key of the table using "Key".

DapperPlusManager.Entity<Tab>()
    .Table("tbl_tab")
    .Map(x => x.Id, "tab_id")
    .Map(x => x.ScreenId, "Screen_id")
    .Map(x => x.Name, "tab_name")
    .Key(x => new { x.Id, x.ScreenId })
    .InsertIfNotExists();
connection.BulkInsert(insertTabs);

Error: "An error occurred while resolving mapping by name."

Code Snippet 2:
I am trying to register the mapping in the constructor of the class where I use it, as follows:
DapperPlusManager.Entity<Tab>().Map(new TabEntityMapper());

where TabEntityMapper is as below:

public class TabEntityMapper : DapperPlusEntityMapper<Tab>
{
    public TabEntityMapper() // note: the constructor must match the class name
    {
        Table("tbl_tab");
        Map(x => x.Id, "tab_id");
        Map(x => x.ScreenId, "Screen_id");
        Map(x => x.Name, "tab_name");
        InsertIfNotExists();
    }
}

Error: An error occurred while retrieving table information.
Inner exception: Invalid object name 'tbl_tab'.

Is there another way to map the composite key of the table and register the table's mapping at program start?
Or is something wrong in the above code snippets?
Please help.

BulkMerge with Postgres Not Working as Expected

With the help of this URL, https://bulk-operations.net/online-examples/cODwTM, I tried to do the same in Postgres. Kindly help me with this issue.

NpgsqlConnection connection = new NpgsqlConnection(connString);
connection.Open();

DataTable dtCustomers = new DataTable("Customers");

dtCustomers.Columns.Add("CustomerID", typeof(int));
dtCustomers.Columns.Add("Name", typeof(string));
//dtCustomers.Columns.Add("Description", typeof(string));

for (int i = 0; i < 10; i++)
{
    // Add every customer twice with the same "CustomerID"
    {
        var drCustomer = dtCustomers.NewRow();
        drCustomer["CustomerID"] = (i + 1);
        drCustomer["Name"] = "Customer_" + i;
        //drCustomer["Description"] = "Customer_Desc_" + i;
        dtCustomers.Rows.Add(drCustomer);
    }

    {
        var drCustomer = dtCustomers.NewRow();
        drCustomer["CustomerID"] = (i + 1);
        drCustomer["Name"] = "Customer_" + i;
        //drCustomer["Description"] = "Customer_Desc_" + i;
        dtCustomers.Rows.Add(drCustomer);
    }
}

// SAVE customers
using (var bulk = new BulkOperation(connection))
{
    bulk.DestinationTableName = "Customers";
    //bulk.AllowUpdatePrimaryKeys = true;
    //bulk.BatchSize = 1000;
    //bulk.AutoMapKeyName = "Name;Description";
    bulk.AllowDuplicateKeys = true;
    bulk.InsertIfNotExists = true;

    bulk.BulkMerge(dtCustomers);
}

Bulk Merge with DataTable

Hello,

I am trying to use the BulkMerge functionality with a DataTable and without Entity Framework. I can't figure out how to set the primary key or the MERGE JOIN columns (these are usually set with ColumnPrimaryKeyExpression and ColumnInputExpression).

I can add a single primary key like this:
bulkMerge.ColumnMappings.Add(primaryKey, true);

How can I add a primary key that has multiple columns?

Also, how do I set the MERGE JOIN columns? Is there another way to do it without ColumnInputExpression? My table is dynamic and does not have EF entities.

Is this possible?

Thanks,
M
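Since the single-column form above takes a column name plus a key flag, a composite key may just be several mappings flagged as keys, as other issues on this page do with the (source, destination, isPrimaryKey) overload. A sketch extrapolated from that pattern (column names invented):

using (var bulkMerge = new BulkOperation(connection))
{
    bulkMerge.DestinationTableName = "MyDynamicTable"; // illustrative
    // Each mapping flagged 'true' would participate in the MERGE JOIN key.
    bulkMerge.ColumnMappings.Add("OrderId", true);
    bulkMerge.ColumnMappings.Add("LineNumber", true);
    bulkMerge.ColumnMappings.Add("Quantity", false);
    bulkMerge.BulkMerge(dataTable);
}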

An error occured, no primary key could be found or resolved.

at Z.BulkOperations.BulkOperation.()
at Z.BulkOperations.BulkOperation.Execute()
at Z.BulkOperations.BulkOperation.BulkDelete()
at Z.BulkOperations.BulkOperation`1.BulkDelete(List`1 datasource)
at yw56.rec.Tasks.Task01Recs.Task01RecService.<>c__DisplayClass14_0.b__0(Object taskindex) in F:\project\rec\3.9.0\src\yw56.rec.Application\Tasks\Task01Recs\Task01RecService.cs:line 304

using Abp.Domain.Entities;
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations.Schema;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace yw56.rec.SysTasks
{
    [Table("sys_task_history4")]
    public class sysTask:Entity<long>
    {
        [Column("taskId")]
        public override long Id { get => base.Id; set => base.Id = value; }

        public long taskCode { get; set; }

        public int status { get; set; }

        public System.String businessCode { get; set; }
        public int platformCode { get; set; }

        public System.DateTime createTime { get; set; }
        public System.String businessBody { get; set; }

        public System.String businessRemark { get; set; }
    }
}




using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // BulkInsert
    using (var bulk = new BulkOperation<sysTask>(connection))
    {
        //bulk.DestinationTableName = "Task_01_Rec_History";
        //bulk.AutoMapOutputDirection = false; // performance can be improved with options

        //bulk.BulkInsert(tasklist.ToList<Task01Rec>());

        bulk.DestinationTableName = "sys_task_history4";
        bulk.BatchSize = 100;
        bulk.BulkDelete(systasklist.ToList<sysTask>());
    }
}

How to get the identity value for each record in a bulk insert

List<IletisimLog> bulkInsertIletisimLog = new List<IletisimLog>();
// there are 1000 values in the paramaters array
foreach (var kId in paramaters)
{
    var iletisimLogInsert = new IletisimLog()
    {
        KullaniciID = kId.KullaniciId,
        EklendigiTarih = DateTime.Now,
        GonderildigiTarih = DateTime.Now,
        BilgilendirmeTurID = bilgilendirmeturId,
    };
    bulkInsertIletisimLog.Add(iletisimLogInsert);
}
_iLetisimLogService.BulkInsertRange(bulkInsertIletisimLog);

// There are 1000 records registered in the database. How do I get the primary key value for each record?

Issue in the BulkInsert function when "System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing" is enabled on the server

Hi Support,

When we use the BulkInsert function we get the following error: "This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms."

This happens when the "System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing" policy is enabled on the server. When we set the FIPS compliance security setting to Disabled, it works fine.

Do you have a DLL that can work with this FIPS policy enabled on the server?

Error Message: Exception has been thrown by the target of an invocation.
Stack Trace: at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
at System.Reflection.RuntimeConstructorInfo.Invoke(BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at System.Security.Cryptography.CryptoConfig.CreateFromName(String name, Object[] args)
at System.Security.Cryptography.MD5.Create()
at .(String )
at .(String , String , Int32& )
at .(ProviderType , Boolean , Boolean , Boolean )
at Z.BulkOperations.BulkOperation.()
Inner Error Message: This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.
Inner Stack Trace: at System.Security.Cryptography.MD5CryptoServiceProvider..ctor()

Regards,
Manoj Ahir

Retrieving IDs of inserted records

After doing a bulk.BulkInsert(t), I would like to retrieve the IDs of the inserted records. At the moment I am only able to get RowsAffectedInserted.

Any ideas?
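One avenue, extrapolating from another issue on this page where bulk.AutoMapOutputDirection = false is described as a performance option: output direction appears to be on by default, which would mean generated identity values are written back onto the inserted objects. If that holds, the IDs can be read from the list after BulkInsert returns. A sketch under that assumption; the Person class and table are illustrative:

using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using Z.BulkOperations;

public class Person
{
    public int Id { get; set; } // identity column in the database
    public string Name { get; set; }
}

public static class InsertAndReadIds
{
    public static void Run(SqlConnection connection, List<Person> people)
    {
        using (var bulk = new BulkOperation<Person>(connection))
        {
            bulk.DestinationTableName = "Persons"; // illustrative
            // AutoMapOutputDirection is left at its default (true), which
            // appears to map generated identity values back to the entities.
            bulk.BulkInsert(people);
        }

        // If the assumption holds, each object now carries its database ID.
        foreach (var person in people)
            Console.WriteLine(person.Id);
    }
}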

How to validate all providers?

Hi,
I purchased the All Providers option for BulkOperations.
I see that if I use the ValidateLicense() method (Z.BulkOperations.LicenseManager class) I can validate a specific provider (e.g. SQL Server).
If I want to validate all providers at once, can I use the "ProviderType.None" enum value as the parameter? Or must I call ValidateLicense(ProviderType) for each provider?

Thanks
Andrea

Bulk Operations do not work on tables outside of the 'dbo' schema

EF Core 2.1.2
Z.EntityFramework.Extensions.EFCore 2.1.33

In doing some testing and tracing, I found that we get an exception stating "Invalid object name 'TableName'". I ran a trace on the MSSQL server while the program was running and found that OBJECT_ID() takes a value in the form [schema].[tablename]. I have changed the SQL statement below to account for not using the schema, though you could have tables with the same name in multiple schemas, and my change does not account for that.
`exec sp_executesql N'
/* SELECT server information */
SELECT @@Version

/* SELECT table information */
SELECT A.Name AS DestinationName ,
( SELECT 1
WHERE EXISTS ( SELECT 1
FROM sys.triggers AS X
WHERE X.parent_id = A.object_id
AND X.is_disabled = 0
AND OBJECTPROPERTY(X.object_id,
''ExecIsInsertTrigger'') = 1 )
) AS HasInsertTrigger ,
( SELECT 1
WHERE EXISTS ( SELECT 1
FROM sys.triggers AS X
WHERE X.parent_id = A.object_id
AND X.is_disabled = 0
AND OBJECTPROPERTY(X.object_id,
''ExecIsUpdateTrigger'') = 1 )
) AS HasUpdateTrigger ,
( SELECT 1
WHERE EXISTS ( SELECT 1
FROM sys.triggers AS X
WHERE X.parent_id = A.object_id
AND X.is_disabled = 0
AND OBJECTPROPERTY(X.object_id,
''ExecIsDeleteTrigger'') = 1 )
) AS HasDeleteTrigger
FROM sys.tables AS A
LEFT JOIN sys.synonyms AS B ON B.object_id = A.object_id
AND COALESCE(PARSENAME(base_object_name,4), @@ServerName) = @@ServerName
AND COALESCE(PARSENAME(base_object_name,3), DB_NAME(DB_ID())) = DB_NAME(DB_ID())
WHERE A.Name = @Table_0
ORDER BY DestinationName

/* SELECT column information */
SELECT A.name AS DestinationName ,
C.name AS ColumnName ,
C.column_id AS ColumnOrder ,
C.precision AS Precision ,
C.scale AS Scale ,
C.max_length AS MaxLength ,
C.collation_name AS Collation ,
C.Is_Identity AS IsIdentity ,
( CASE WHEN EXISTS ( SELECT 1
FROM sys.index_columns AS X
WHERE X.index_id = B.index_id
AND X.object_id = B.object_id
AND X.column_id = C.column_id ) THEN 1
ELSE 0
END ) AS IsPrimaryKey ,
C.system_type_id AS System_Type_Id ,
LOWER(D.Name) AS TypeName,
(CASE WHEN E.base_object_name IS NOT NULL THEN 1 ELSE 0 END) AS IsSynonym,
D.is_user_defined,
F.name
FROM sys.tables AS A
LEFT JOIN sys.synonyms AS E ON E.object_id = A.object_id
AND COALESCE(PARSENAME(base_object_name,4), @@ServerName) = @@ServerName
AND COALESCE(PARSENAME(base_object_name,3), DB_NAME(DB_ID())) = DB_NAME(DB_ID())
LEFT JOIN sys.indexes AS B ON B.object_id = A.object_id
AND B.is_primary_key = 1
INNER JOIN sys.columns AS C ON C.object_id = A.object_id
INNER JOIN sys.types AS D ON D.system_type_id = C.system_type_id
AND D.user_type_id = C.user_type_id
INNER JOIN sys.schemas AS F ON D.schema_id = F.schema_id
WHERE A.name = @Table_0
ORDER BY DestinationName , ColumnOrder
',N'@Table_0 nvarchar(21)',@Table_0=N'Animal'`

BulkInsert many-to-many with implicit link table

I have this entity class which has a many-to-many relation with an implicit link table:

public class DiscountRule
{
	public virtual ISet<CostType> ExcludedCostTypes { get; set; }
              = new HashSet<CostType>();
}

I want to bulk insert a set of DiscountRules and include the implicit link table but not the CostType target table.

If I don't specify an IncludeGraphOperationBuilder option, like this:

await context.BulkInsertAsync(discountRules, options =>
{                    
    options.IncludeGraph = true;
    options.UnsafeMode = true;
});

I get the error "Cannot insert duplicate key row in object 'dbo.CostTypes' with unique index 'IX_CostType_Value'. The duplicate key value is (0)."

So I try to add an IncludeGraphOperationBuilder option to exclude the CostType table:

await context.BulkInsertAsync(discountRules, options =>
{    
    options.IncludeGraph = true;
    options.IncludeGraphOperationBuilder = operation =>
    {
        if (operation is BulkOperation<CostType>)
        {
            var bulk = (BulkOperation<CostType>) operation;
            bulk.IsReadOnly = true;
        }
    };
    options.UnsafeMode = true;
});

But then I get the exception:

"internalexception": {
  "message": "Exception has been thrown by the target of an invocation.",
  "type": "System.Reflection.TargetInvocationException",
  "stacktrace": "   at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)\r\n   at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)\r\n   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)\r\n   at Z.EntityFramework.Extensions.InternalBulkOperationManager.\u0001[\u0001](BulkOperation`1 \u0002, DbContext \u0003, List`1 \u0004, Boolean \u0005, List`1 \u0006, Type \u0007, String \b)\r\n   at Z.EntityFramework.Extensions.InternalBulkOperationManager.\b.\u0001(SchemaEntityType )\r\n   at System.Collections.Generic.List`1.ForEach(Action`1 action)\r\n   at Z.EntityFramework.Extensions.InternalBulkOperationManager.\u0001[\u0001](DbContext \u0002, BulkOperation`1 \u0003, IEnumerable`1 \u0004, List`1 \u0005)\r\n   at Z.EntityFramework.Extensions.InternalBulkOperationManager.\u0001[\u0001](BulkOperation`1 \u0002, DbContext \u0003, IEnumerable`1 \u0004, List`1 \u0005)\r\n   at DbContextExtensions.\u0001[\u0001](DbContext , IEnumerable`1 , Action`1 , List`1 )\r\n   at Z.EntityFramework.Extensions.BulkSaveChanges.\u0001(DbContext , List`1 , List`1 , Action`1 )\r\n   at DbContextExtensions.\u0001(DbContext , Boolean , Action`1 , Boolean )",
  "internalexception": {
    "message": "Incorrect syntax near ';'.",
    "type": "System.Data.SqlClient.SqlException",
    "stacktrace": "   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\r\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\r\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\r\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\r\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\r\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\r\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption, Boolean shouldCacheForAlwaysEncrypted)\r\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\r\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\r\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\r\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\r\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\r\n   at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)\r\n   at System.Data.Common.DbDataAdapter.FillInternal(DataSet dataset, DataTable[] datatables, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)\r\n   at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)\r\n   at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet)\r\n   at Z.BulkOperations.SqlActionExecutor.\u0001(SqlCommand \u0002, BulkOperation \u0003, Int32 \u0004)\r\n   at Z.BulkOperations.SqlActionExecutor.\u0001(DbCommand \u0002, BulkOperation \u0003, Int32 \u0004)\r\n   at Z.BulkOperations.SqlActionExecutor.\u0006(ExecuteAction , DbCommand )\r\n   at Z.BulkOperations.DbActionExecutor.\u0001(List`1 )\r\n   at Z.BulkOperations.SqlProvider.\u0001(List`1 )\r\n   at \u0007.\u0003.\u0001(BulkOperation )\r\n   at Z.BulkOperations.BulkOperation.BulkInsert()\r\n   at Z.EntityFramework.Extensions.InternalBulkOperationManager.\u0001[\u0001](BulkOperation`1 \u0002, DbContext \u0003, List`1 \u0004, Boolean \u0005, List`1 \u0006, Type \u0007, String \b)\r\n   at Z.EntityFramework.Extensions.PublicInternalBulkOperationManager.BulkInsertCast[T](BulkOperation`1 this, DbContext context, List`1 list, Boolean isManager, List`1 entitiesToUpdate, Type type, String typeName)"
  }
}

Which I don't know what to do with.

Can BulkInsert handle a many-to-many relation and include the implicit link table but not the target table?

TimestampTZ issue inserting from a DataTable to Postgres

While doing a bulk merge from a DataTable that has a column of type datetime, inserting/updating into a Postgres table where the field is mapped to the timestamptz data type, we observe after inserting that the datetime value is shifted relative to the current timezone.

Several Bulk problems

I'm using connection.BeginTransaction() and then the Bulk... methods on the transaction, but none of them work. Without a transaction it's the same.

BulkMerge has this problem:
"The DestinationTableName cannot be null or empty. You must specify a valid DestinationTableName."
transaction.BulkMerge(items, x => x.Name, x => x.Type, x => x.Information);

BulkInsert, on the other hand, has this problem (without the selectors; otherwise it's the same message as BulkMerge):
System.Exception: 'Error: 0013: Oops! A general error has occurred. Please report the issue including the stack trace to our support team: [email protected]'

-->

at �.�.�(BulkOperation )
at Z.BulkOperations.BulkOperation.BulkInsert()
at Z.Dapper.Plus.DapperPlusAction.Execute()
at Z.Dapper.Plus.DapperPlusActionSet`1.AddAction(String mapperKey, DapperPlusActionKind actionKind, TEntity item)
at Z.Dapper.Plus.DapperPlusActionSet`1.DapperPlusActionSetBuilder(IDbConnection connection, IDbTransaction transaction, String mapperKey, DapperPlusActionKind actionKind, TEntity item, Func`2[] selectors)
at Z.Dapper.Plus.DapperPlusExtensions.BulkInsert[T](IDbTransaction transaction, String mapperKey, T item, Func`2[] selectors)
at Z.Dapper.Plus.DapperPlusExtensions.BulkInsert[T](IDbTransaction transaction, T item, Func`2[] selectors)

It doesn't matter whether I use the DapperPlusManager.Entity<...>().Map methods or not; it never works. Plain Dapper works fine, and so do all the other Dapper helpers on NuGet.

Calculated column

I want to BulkInsert an IDataReader, calculating some fields on the fly. Is it possible?
Something like this:

bulk.calculatedfield = row => row["A"] + 5;
bulk.bulkinsert(data);
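Nothing quoted in these issues shows a per-row callback like the pseudo-code above. One workaround in plain ADO.NET: buffer the reader into a DataTable, derive the extra field with a DataColumn expression, and bulk insert the table instead. A sketch, assuming the source has a numeric column "A"; note it materializes all rows in memory, unlike streaming the reader:

using System.Data;

public static class ComputedColumns
{
    public static DataTable AddComputedColumn(IDataReader reader)
    {
        var table = new DataTable();
        table.Load(reader); // buffers every row in memory

        // DataColumn.Expression evaluates arithmetic over existing columns.
        table.Columns.Add("APlusFive", typeof(int), "A + 5");
        return table;
    }
}

// usage: bulk.BulkInsert(ComputedColumns.AddComputedColumn(data));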

Timestamp already in UTC but being treated as if in local time

Hi,

I'm trying to bulk upsert some data to a Postgres DB using BulkOperations with npgsql. My entities are mapped to a DataTable before calling either BulkInsert or BulkMerge, depending on the status of the entity in the DB.

Inside my entity I have a couple of DateTimeOffsets, which I can see have the correct time component in UTC in the debugger. These fields are mapped to columns in the database with the type timestamptz.

According to the npgsql documentation, DateTimeOffsets are converted to UTC (so no change) and put in the DB.

However, when I call BulkMerge or BulkInsert, the timestamps in the DB are 1 hour behind the values I saw in the debugger. It looks like BulkOperations assumes my DateTimeOffsets are in local time (BST) and then either it or npgsql performs an unnecessary conversion to UTC.

I feel like I'm going a bit mad, as I'm pretty sure this was working OK a couple of releases ago, but now it seems broken. Any ideas?

Error during bulk insert

Hi,

I'm getting an error during bulk insert on .NET Core 2.0 and library version 2.13.7:

The type initializer for '' threw an exception.

Inner exception:

Method not found: 'System.Reflection.Emit.AssemblyBuilder System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName, System.Reflection.Emit.AssemblyBuilderAccess)'.

Code:

using (var bulk = new BulkOperation<Person>(connection))
{
	bulk.Transaction = transaction;
	bulk.DestinationTableName = "dbo.Persons";
	bulk.BatchSize = _importExportOptions.Import.BatchSize;
	bulk.BatchTimeout = _importExportOptions.Import.BatchTimeout;
	bulk.CaseSensitive = CaseSensitiveType.Insensitive;

	bulk.BulkInsert(newPersons);
}

InsertIfNotExists does not work as expected

The User table has 2 columns: UserId INT PRIMARY KEY, LoginName VARCHAR(255).
sourceDt is a DataTable with exactly the same 2 columns.
Inserting a second time, I hit the exception "Duplicate entry '108' for key 'PRIMARY'".

using (var connection = DbAdapter.GetDbConnection(targetProviderName, targetConnString))
{
    if (connection.State != ConnectionState.Open)
        connection.Open();
    // Bulk insert
    using (var bulk = new BulkOperation(connection))
    {
        bulk.DestinationTableName = "User";
        bulk.InsertIfNotExists = true;
        bulk.ColumnMappings.Add("UserId", true);
        bulk.ColumnMappings.Add("LoginName");
        bulk.BulkInsert(sourceDt);
    }
}

Importing data into an Always Encrypted database with BulkOperation.BulkInsert fails on a TinyInt column

Hi, we are currently using Z.EntityFramework.Extensions v3.17.8 against SQL Server 2017 with many encrypted columns of varying types across many tables.

We have a data transport facility for moving data between clients that are not connected to the same database. This process involves pulling the tables down into a DataSet and converting that DataSet to XML (the data is in its unencrypted form); then that output file is encrypted. The XML file is later decrypted, loaded into a DataSet, and we attempt to use BulkOperation to insert the data from the tables in that DataSet.

Everything works great until it hits an encrypted column of type TinyInt. (Up to this point, say, half a dozen encrypted columns of types varchar and nvarchar were successfully imported.)

We were previously on version 3.14.something, which I couldn't get to import ANY of the data; upgrading to the more recent version 3.17.8 fixed that, but it fails on the TinyInt.

I'm not certain whether this is a problem with your library or not, but I haven't been able to spot an issue on my side. I don't have a reproducible sample project for you, but I might be able to build one if it's essential.

SqlException: Operand type clash: tinyint is incompatible with tinyint encrypted with (encryption_type = 'RANDOMIZED', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'C2defaultCEK1', column_encryption_key_database_name = 'TestDb')
Statement(s) could not be prepared.

Any help on this would be greatly appreciated

BulkInsert errors when using InsertIfNotExists and ColumnMappings.

Given the following table in SQL Server:

CREATE TABLE Product (
    Name nvarchar(50) null
);

The following code throws an error:

class Product
{
    public string Name { get; set; }
}

static void InsertBroken()
{
    using (var connection = new SqlConnection(ConnectionString))
    using (var bulk = new BulkOperation(connection))
    {
        connection.Open();

        var products = new List<Product>
        {
            new Product { Name = "Name" }
        };

        bulk.ColumnMappings = new List<ColumnMapping>
        {
            new ColumnMapping("Name", "Name")
        };
        bulk.DestinationTableName = "Product";
        bulk.AutoMapKeyName = "Name";
        bulk.InsertIfNotExists = true;

        bulk.BulkInsert(products);
    }
}

"System.Exception: 'An error occured, no primary key could be found or resolved.'"

If the ColumnMappings aren't set, this code works fine. The ColumnMapping is not needed for this example, but it is for my actual use case.

Is there anything I can do to get around this or is this a bug?
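One variation worth trying while waiting for an answer: other issues on this page pass a third argument to ColumnMappings.Add to flag a column as the key, and the error here is specifically about key resolution. A sketch of that variation (an assumption extrapolated from those examples, not a confirmed fix):

bulk.DestinationTableName = "Product";
bulk.InsertIfNotExists = true;
// Flag the mapping itself as the key column, as other issues on this page do:
bulk.ColumnMappings.Add("Name", "Name", true);
bulk.BulkInsert(products);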

Transactions

Hi
What's the right way to use a transaction that rolls back the whole BulkInsert? We have tried a few combinations, but some data always gets persisted.
Can you provide some example code?

Thanks for your help
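For reference, other issues on this page show the two ingredients: bulk.Transaction = trans paired with trans.Commit(), and a UseInternalTransaction option. A sketch of the external-transaction pattern, where every batch enlists in one transaction and a failure before Commit rolls the whole insert back (assuming SQL Server; the table name is illustrative):

using System;
using System.Data;
using System.Data.SqlClient;
using Z.BulkOperations;

public static class TransactionalInsert
{
    public static void Run(string connectionString, DataTable dataTable)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                try
                {
                    using (var bulk = new BulkOperation(connection))
                    {
                        bulk.Transaction = transaction;        // enlist every batch in one transaction
                        bulk.DestinationTableName = "Persons"; // illustrative
                        bulk.BulkInsert(dataTable);
                    }
                    transaction.Commit(); // nothing is persisted until this point
                }
                catch
                {
                    transaction.Rollback(); // undoes all batches together
                    throw;
                }
            }
        }
    }
}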
