Bulk copy a DataTable into MySQL (similar to System.Data.SqlClient.SqlBulkCopy)

Similar to SqlBulkCopy in System.Data.SqlClient, the MySqlConnector library provides MySqlBulkCopy for MySQL. Here is an example of how to use it:

using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using MySqlConnector;

public async Task MySqlBulkCopyAsync(DataTable dataTable)
{
    // AllowLoadLocalInfile=True is required because MySqlBulkCopy uses
    // LOAD DATA LOCAL INFILE under the hood.
    using (var connection = new MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
    {
        await connection.OpenAsync();
        var bulkCopy = new MySqlBulkCopy(connection)
        {
            DestinationTableName = "yourtable"
        };
        // Column mappings are required if the table has an identity
        // (auto-increment) column; see the note below.
        bulkCopy.ColumnMappings.AddRange(GetMySqlColumnMappings(dataTable));
        await bulkCopy.WriteToServerAsync(dataTable);
    }
}

private static List<MySqlBulkCopyColumnMapping> GetMySqlColumnMappings(DataTable dataTable)
{
    var colMappings = new List<MySqlBulkCopyColumnMapping>();
    int i = 0;
    foreach (DataColumn col in dataTable.Columns)
    {
        // Map each source column, by ordinal, to the destination column of the same name.
        colMappings.Add(new MySqlBulkCopyColumnMapping(i, col.ColumnName));
        i++;
    }
    return colMappings;
}
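
A call site could look like this (the table and column names here are hypothetical and must match the schema of the destination table):

// Hypothetical DataTable whose columns line up with "yourtable"
var table = new DataTable();
table.Columns.Add("id", typeof(int));
table.Columns.Add("name", typeof(string));
table.Rows.Add(1, "alpha");
table.Rows.Add(2, "beta");

await MySqlBulkCopyAsync(table);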

You can omit the column mappings if the table has no identity column. If it does have one, the mappings are required; without them no records are inserted, and you will just get a message like "x rows were copied but only 0 rows were inserted".
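
For a table with no identity column, a minimal sketch without mappings could look like this (plain_table is a hypothetical table name; the DataTable's columns are then written in ordinal order):

using (var connection = new MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
{
    await connection.OpenAsync();
    var bulkCopy = new MySqlBulkCopy(connection)
    {
        // Hypothetical destination table with no auto-increment column
        DestinationTableName = "plain_table"
    };
    // No ColumnMappings needed: source columns map to table columns by ordinal.
    await bulkCopy.WriteToServerAsync(dataTable);
}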

This class is available in the MySqlConnector NuGet package (Assembly MySqlConnector, Version=1.0.0.0).


Using a bulk-operation NuGet package, you can get this done easily.

Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/

MySqlConnection conn = DbConnection.OpenConnection();
DataTable dt = new DataTable("testtable");
MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
MySqlCommandBuilder cb = new MySqlCommandBuilder(da); // only needed if you call da.Update(dt)
da.Fill(dt);

Instead of using

......
da.UpdateBatchSize = 1000;
......
da.Update(dt);

just the following two lines

var bulk = new BulkOperation(conn);
bulk.BulkInsert(dt);

will take only 5 seconds to copy the whole DataTable into MySQL without first dumping the 100,000 rows from the DataTable to a temporary CSV file.


"Because I think there would be bad performance, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class."

Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:

using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
// rawData is the DataTable holding the 100,000 new rows
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}

(I tried a couple of different values for UpdateBatchSize but they didn't seem to have a significant impact on the elapsed time.)

By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...

// dump the DataTable to a temporary CSV file ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);
}

// ... then bulk-load from that file and delete it afterwards
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);

... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-loading from that file, and deleting the file afterwards.
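
For reference, here is a minimal sketch of what a writer like Rfc4180Writer.WriteDataTable could look like. This is an illustrative assumption matching the call above (it quotes every field and writes "\n" line endings to match MySqlBulkLoader's default line terminator), not the implementation the original answer links to:

using System;
using System.Data;
using System.IO;
using System.Linq;

public static class Rfc4180Writer
{
    public static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            writer.Write(string.Join(",", table.Columns.Cast<DataColumn>()
                .Select(c => Quote(c.ColumnName))) + "\n");
        }
        foreach (DataRow row in table.Rows)
        {
            // Note: DBNull is written as an empty string here; a fuller
            // implementation might emit \N so LOAD DATA sees a SQL NULL.
            writer.Write(string.Join(",", row.ItemArray
                .Select(v => Quote(Convert.ToString(v)))) + "\n");
        }
    }

    // RFC 4180 quoting: enclose in double quotes and double any embedded quotes.
    private static string Quote(string s) =>
        "\"" + (s ?? string.Empty).Replace("\"", "\"\"") + "\"";
}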

Tags: C#, MySQL, Database