View Full Version : Optimization question (C#, MSSQL)

06-13-2007, 08:33 PM
I've coded a simple console app that reads the fields from an Excel file, uses that data to build SQL UPDATE statements, and passes them to the SQL server.

Here's the relevant code:

String sConnectionString =
"Provider=Microsoft.Jet.OLEDB.4.0;" +
"Data Source=C:\\UPDATE.xls;" +
"Extended Properties=Excel 8.0;";

OleDbConnection objConn = new OleDbConnection(sConnectionString);
objConn.Open();

OleDbCommand objCmdSelect = new OleDbCommand("SELECT * FROM [DATABASE$]", objConn);
OleDbDataReader excelReader = objCmdSelect.ExecuteReader();

Console.WriteLine("Connecting to SQL Database");

String connStr = "...";  // SQL Server connection string (omitted)

SqlConnection objSqlConn = new SqlConnection(connStr);
objSqlConn.Open();

Console.WriteLine("SQL Connection Open. Beginning SQL Database update.");

String strSql = "";
SqlCommand objCommand = new SqlCommand();
objCommand.Connection = objSqlConn;

int i = 0;
while (excelReader.Read())
{
    strSql = "UPDATE ZipMasterBack SET " +
        "Distrib='" + excelReader.GetValue(2) + "', " +
        "Region='" + excelReader.GetValue(3) + "', " +
        "txtDistribName='" + excelReader.GetValue(4) + "' " +
        "WHERE txtZip_Code = '" + excelReader.GetValue(1) + "'";

    objCommand.CommandText = strSql;
    objCommand.ExecuteNonQuery();

    i++;
    if (i % 1000 == 0)
        Console.WriteLine(i.ToString() + " Records Updated");
} // end while

This works well on a test file with 50 records. On the real thing, however, it's more than a little slow. It took 2 hours to update 54,500 records. Is there a more efficient method to accomplish this?


06-14-2007, 02:36 AM
Well, for one thing, don't select every field from the sheet if you're only going to use a few of them. Change your SQL statement here:

OleDbCommand objCmdSelect = new OleDbCommand("SELECT * FROM [DATABASE$]", objConn);

so that it only selects the fields you're updating, which are located in the following statement:

strSql = "UPDATE ZipMasterBack SET " +
"Distrib='" + excelReader.GetValue(2) + "', " +
"Region='" + excelReader.GetValue(3) + "', " +
"txtDistribName='" + excelReader.GetValue(4) + "' " +
"WHERE txtZip_Code = '" + excelReader.GetValue(1) + "'";
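For example, the narrowed select might look like this (the column names here are assumptions; use the actual header names from row 1 of the sheet):

```csharp
// Pull only the four columns the UPDATE actually uses, instead of SELECT *.
OleDbCommand objCmdSelect = new OleDbCommand(
    "SELECT txtZip_Code, Distrib, Region, txtDistribName FROM [DATABASE$]",
    objConn);
```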

06-14-2007, 07:17 PM
If you're using SQL Server you should check out BULK INSERT. The only issue is that with bulk inserts the actual file needs to be on the database machine (or you need to use a UNC path and share the file); you would also need a CSV or XML file rather than an .xls.

I'm sure you can find some more references by using Google, too.

semper fi...

06-15-2007, 01:27 PM
Another possible performance gain is using a stored procedure (passing the right arguments to it) instead of 50,000+ separate UPDATE statements.

Try adding some timing output to your program to see where the bottleneck is. I think reading the .xls file is probably the problem here. Perhaps you can read the whole file into a two-dimensional array at once, then use that array to feed the stored procedure (or each separate SQL statement) for each record to update.
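A rough sketch of what the stored-procedure call could look like (the procedure name `UpdateZipMaster` and its parameter names/sizes are made up for illustration; using parameters also avoids the quoting problems of string concatenation):

```csharp
// Assumes a stored procedure along these lines exists on the server:
//   CREATE PROC UpdateZipMaster @Zip varchar(10), @Distrib varchar(50),
//       @Region varchar(50), @DistribName varchar(100) AS
//   UPDATE ZipMasterBack SET Distrib = @Distrib, Region = @Region,
//       txtDistribName = @DistribName WHERE txtZip_Code = @Zip
SqlCommand cmd = new SqlCommand("UpdateZipMaster", objSqlConn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("@Zip", SqlDbType.VarChar, 10);
cmd.Parameters.Add("@Distrib", SqlDbType.VarChar, 50);
cmd.Parameters.Add("@Region", SqlDbType.VarChar, 50);
cmd.Parameters.Add("@DistribName", SqlDbType.VarChar, 100);

while (excelReader.Read())
{
    // Reuse the same prepared command; only the parameter values change.
    cmd.Parameters["@Zip"].Value = excelReader.GetValue(1);
    cmd.Parameters["@Distrib"].Value = excelReader.GetValue(2);
    cmd.Parameters["@Region"].Value = excelReader.GetValue(3);
    cmd.Parameters["@DistribName"].Value = excelReader.GetValue(4);
    cmd.ExecuteNonQuery();
}
```

Wrapping the whole loop in a SqlTransaction would likely speed it up further, since each statement otherwise commits individually.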

06-16-2007, 07:19 PM
If you are using .NET 2.0 then you can use the SqlBulkCopy class to write the data to SQL Server.

First use your OleDbConnection object to copy the data from Excel into a DataTable, and then use SqlBulkCopy to copy the contents of the DataTable to SQL Server, e.g.,

OleDbDataAdapter da = new OleDbDataAdapter("select fields from [Database$]", objConn);
DataTable dt = new DataTable();
da.Fill(dt);

SqlBulkCopy bcp = new SqlBulkCopy(connStr, SqlBulkCopyOptions.KeepIdentity);
bcp.DestinationTableName = tableName;
bcp.WriteToServer(dt);
bcp.Close();

You won't need any SqlConnection or SqlCommand objects with this approach.
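One caveat: SqlBulkCopy only inserts rows. If the goal is to update existing rows, one common pattern is to bulk-copy into a staging table first and then issue a single set-based UPDATE join, e.g. (the staging table name `ZipStaging` is an assumption for illustration):

```csharp
// After bulk-copying the Excel data into a hypothetical ZipStaging table,
// one UPDATE joins it against the real table instead of 50,000 round trips.
SqlCommand merge = new SqlCommand(
    "UPDATE z SET z.Distrib = s.Distrib, z.Region = s.Region, " +
    "z.txtDistribName = s.txtDistribName " +
    "FROM ZipMasterBack z INNER JOIN ZipStaging s " +
    "ON z.txtZip_Code = s.txtZip_Code",
    objSqlConn);
merge.ExecuteNonQuery();
```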

06-18-2007, 11:50 PM
Thank you all for the replies. I eventually got the run time down to about 2 minutes by using the SqlBulkCopy class, as SouthwaterDave suggested, along with some limited recursion in the method that actually does the updating. The thing eats memory for lunch, but since it only has to run once a week, it doesn't really matter.

Thanks again for all the help.