If you’ve ever wanted to quickly transfer data from a PowerShell script into a SQL Server database for further processing, you’ll find that the Write-SqlTableData cmdlet (from the SqlServer module) is a powerful tool that lets you do just that with a single call.
However, I recently ran into a problem with this cmdlet that took me a while to figure out. I’ll explain the solution here in the hope that it saves the rest of the world some time 🙂
My situation
I have a PowerShell script that uses Write-SqlTableData in three places to write PSCustomObjects to an (already existing) database table, like this:
# Note that for some reason, PSCustomObjects created by Select-Object don't always work well with
# Write-SqlTableData, whereas "[PSCustomObject] @{ ... }" does.
$userDataRows = $users | Sort-Object id | ForEach-Object {
    [PSCustomObject] @{
        id           = $_.id
        name         = $_.name
        employee_num = [string]($_.employee_num)
    }
}

# Be careful with -Force, because the column types appear to be based on the *first row only*, and
# if it contains a $null value its corresponding column will be of type SQL_VARIANT.
$userDataRows | Write-SqlTableData -ServerInstance '.' -Database 'MyDB' -SchemaName 'Staging' `
    -TableName 'User'
The problem
On my development machine this works without issues, even when importing about 18000 rows. On the production machine, however, the call to Write-SqlTableData fails with the following error:
Write-SqlTableData : Failed to connect to server ..
At D:\Scripts\MyScript.ps1:155 char:25
+ ... eDataRows | Write-SqlTableData -ServerInstance '.' -Database 'MyDB' ...
+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (.:String) [Write-SqlTableData], ConnectionFailureException
    + FullyQualifiedErrorId : ConnectionToServerFailed,Microsoft.SqlServer.Management.PowerShell.WriteSqlTableData
The weird thing is, it doesn’t always fail at the same point – usually it fails while importing the 3rd set of data, but occasionally it fails during the 2nd set. Also, some rows usually do get inserted before the failure, but their count varies: sometimes 99, sometimes 141…
The solution
Long story short, I now invoke Write-SqlTableData with the -InputData parameter instead of piping data to it.
That is, changing this:
$userDataRows | Write-SqlTableData ...
into this:
Write-SqlTableData -InputData $userDataRows ...
…appears to have solved my problem. As a bonus, inserting 18000 rows now takes 2 seconds instead of 3.5 minutes.
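If you want to compare the two invocation styles on your own data, here’s a quick sketch using Measure-Command. The server, database, schema, and table names are the ones from the snippet above – adjust them for your environment, and point it at a scratch table, since each run inserts the rows again:

```powershell
# Shared parameters for both calls (splatted below); values assumed from the earlier snippet.
$common = @{
    ServerInstance = '.'
    Database       = 'MyDB'
    SchemaName     = 'Staging'
    TableName      = 'User'
}

# Time the pipeline-based call vs. the -InputData call.
$piped   = Measure-Command { $userDataRows | Write-SqlTableData @common }
$batched = Measure-Command { Write-SqlTableData -InputData $userDataRows @common }

"Piped:      $($piped.TotalSeconds) s"
"-InputData: $($batched.TotalSeconds) s"
```

Splatting the connection parameters into `$common` just keeps the two calls identical except for how the rows are passed in.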
Why, you might ask? Well, during my search for a solution, I found this Azure Feedback link mentioning Write-SqlTableData multi-threading issues, and a threading problem seemed consistent with the fact that the insert kept failing at different points. What also kept bothering me was the performance (inserting 18000 rows took 3.5 minutes).
And then I remembered that, if you’re writing a cmdlet that streams its input from the pipeline, the `process` block can only access a single pipeline element at a time (e.g. see here). If that’s how Write-SqlTableData works internally, it makes sense that it takes each PSCustomObject in turn and handles it with a separate INSERT, which would explain the poor performance. It could also exhaust a system resource, such as lingering sockets or SQL Server connections, more easily – which would explain why I eventually get a “Failed to connect to server” error.
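The one-element-at-a-time behavior of the `process` block is easy to demonstrate with a minimal advanced function (the function name here is hypothetical, purely for illustration):

```powershell
# A streaming function's process block is invoked once per pipeline element;
# it never sees the whole collection at once.
function Test-PipelineStreaming {
    param(
        [Parameter(ValueFromPipeline = $true)]
        $InputObject
    )
    begin   { $count = 0 }
    process {
        $count++
        Write-Host "process sees a single item: $InputObject"
    }
    end     { Write-Host "process ran $count times" }
}

1..3 | Test-PipelineStreaming   # process runs 3 times, once per element
```

A cmdlet written this way has to decide per element what to do – and if it opens a connection or issues an INSERT each time, both the slowness and the resource exhaustion follow naturally.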
Hence, changing the Write-SqlTableData invocation to use -InputData means it receives all 18000 PSCustomObjects at once, enabling it to perform its inserts in a batched fashion. This avoids the “Failed to connect to server” error and gives far better performance.
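Applied to the snippet from the beginning of this post (same assumed server, database, schema, and table names), the fixed invocation looks like this:

```powershell
# Pass all rows at once via -InputData instead of streaming them through the pipeline.
Write-SqlTableData -InputData $userDataRows -ServerInstance '.' -Database 'MyDB' `
    -SchemaName 'Staging' -TableName 'User'
```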
2 comments
Nice find. Thanks.
Dave
This was exactly what I needed; it was clear and to the point.
Matt C.