BCP Utility Upload Errors in SQL Azure

[This article was contributed by the SQL Azure team.]

One cause of bcp utility upload errors in SQL Azure is trying to upload too much data in a single batch. Each batch of rows from your table is committed as a single transaction, and SQL Azure places constraints on transactions; if a batch violates those constraints, the upload fails. This article addresses how to avoid violating the constraints.

The bcp utility is a command-line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can find out more about using bcp with SQL Azure in our earlier blog post on the subject.
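In its simplest form, a bcp command follows this general pattern (a sketch of the usage syntax; dbtable and datafile stand in for your own table and data file):

bcp {dbtable | query} {in | out | queryout | format} datafile [options]

The in keyword copies from the data file into the table, and out copies from the table into a data file; the command later in this article is an example of the in direction.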

One type of error you can encounter when uploading is:

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]Communication link failure

This is an example of SQL Azure closing the connection because a transaction constraint was violated, such as the rule that a single transaction must not exceed 1 gigabyte of data.

By default, the bcp utility uploads all the rows in the data file as a single batch, which means the entire upload runs as one transaction. If the rows in a batch exceed the transaction constraints, you could get the error above. To reduce the number of rows in a batch, run the bcp utility with the -b flag and specify the number of rows per batch.
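As a rough worked example (the row size here is assumed purely for illustration): if each row carries about 2 megabytes of data, a 1,000-row batch is roughly 2 gigabytes and exceeds the 1 gigabyte transaction limit, while a 100-row batch stays near 200 megabytes and commits comfortably.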

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -b100 -c -U username@servername -S tcp:servername.database.windows.net -P password

Reducing the row count makes the transfer chattier and slows your rows-per-second transfer rate. However, it might be the only way to get tables with large row sizes into SQL Azure using the bcp utility.

For Better Performance

You can increase your performance by increasing the batch size, so that more rows are inserted per transaction. This only works if the data length of each row is small, the opposite of the rows giving you the error above. I have been uploading batches of 10,000 rows with good results, especially for many-to-many tables that have only four columns.
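A minimal sketch of such a command, reusing the placeholder database, file, and server names from the earlier example; only the -b value changes:

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -b10000 -c -U username@servername -S tcp:servername.database.windows.net -P password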

Do you have questions, concerns, comments? Post them below and we will try to address them.