ARTICLE

Writing to SQL Server

From Learn dbatools in a Month of Lunches by Chrissy LeMaire, Rob Sewell, Jess Pomfret, Cláudio Silva

Manning Publications
Apr 1, 2022


This article focuses on saving data to the place that SQL Server DBAs feel most comfortable keeping data: a table in a SQL Server database!

You’ll learn different ways to write data to a SQL Server table using dbatools.

Take 40% off Learn dbatools in a Month of Lunches by entering fcclemaire into the discount code box at checkout at manning.com.

Let’s start with understanding PowerShell’s pipeline. The pipeline in PowerShell is a feature that you need to be familiar with to use PowerShell effectively. We start with the pipeline because it enables us to write the output of any PowerShell command to SQL Server.

Piping Commands

One of PowerShell’s most powerful features is the pipeline. The pipeline enables you to easily pass output from one command to another without writing a cumbersome foreach loop. For example, you can take the values in an $instances variable and pipe them to Connect-DbaInstance.

You may have also noticed that you don’t need to specify the -SqlInstance parameter for Connect-DbaInstance or Test-DbaConnection: the values are passed along the pipeline from the command on the left-hand side to the next command on the right.
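That pipeline call looks something like the following sketch ("sql01" and "sql02" are placeholder instance names; running it requires live instances and the dbatools module):

```powershell
# Store the instance names in a variable (placeholder names)
$instances = "sql01", "sql02"

# Pipe them to Connect-DbaInstance; no -SqlInstance parameter is needed,
# because the values arrive via the pipeline
$instances | Connect-DbaInstance

# The same pattern works for Test-DbaConnection
$instances | Test-DbaConnection
```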

Imagine you’re a DBA and a Release Manager asks you for the names and sizes of the databases on a particular instance. They also want to know when these databases were last backed up. You know that you can do that with Get-DbaDatabase and you run:

Listing 1. Get the Databases on the instance

PS C:\> $sqlinstance1 = "e608f3dad95c"
PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem

ComputerName : e608f3dad95c
InstanceName : MSSQLSERVER
SqlInstance : e608f3dad95c
Name : Northwind
Status : Normal
IsAccessible : True
RecoveryModel : Full
LogReuseWaitStatus : LogBackup
SizeMB : 8.25
Compatibility : Version130
Collation : SQL_Latin1_General_CP1_CI_AS
Owner : sqladmin
LastFullBackup : 6/10/2019 12:00:00 AM
LastDiffBackup : 6/11/2019 12:00:00 AM
LastLogBackup : 6/11/2019 12:15:00 AM

ComputerName : e608f3dad95c
InstanceName : MSSQLSERVER
SqlInstance : e608f3dad95c
Name : pubs
Status : Normal
IsAccessible : True
RecoveryModel : Full
LogReuseWaitStatus : LogBackup
SizeMB : 8.1875
Compatibility : Version130
Collation : SQL_Latin1_General_CP1_CI_AS
Owner : sa
LastFullBackup : 6/10/2019 12:01:00 AM
LastDiffBackup : 6/11/2019 12:01:00 AM
LastLogBackup : 6/11/2019 12:16:00 AM

You can copy the results from the PowerShell window and paste them into an email or document, but you know that the Release Manager prefers the information in an easier-to-read format.

Easily export the results of a PowerShell command into the clipboard

You can export the results of a PowerShell command to the clipboard by piping to the clip command. Note that clip is only available on Windows.

Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | clip

Much of that output isn’t required. The Release Manager asked only for the names, the sizes, and the last time the databases were backed up. You can use Select-Object (or its alias, Select) to display only the properties that you require by piping the results of Get-DbaDatabase to Select.

Listing 2. Get the Name, Size and Last Full Backup Time of User Databases on the instance

PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | Select Name, Size, LastFullBackup

Name Size LastFullBackup
---- ---- --------------
Northwind 8.25 6/10/2019 12:00:00 AM
pubs 8.18 6/10/2019 12:01:00 AM
db01 16 1/1/0001 12:00:00 AM
db02 16 1/1/0001 12:00:00 AM
db03 16 1/1/0001 12:00:00 AM
db04 16 1/1/0001 12:00:00 AM
db05 16 1/1/0001 12:00:00 AM
db06 16 1/1/0001 12:00:00 AM
db07 16 1/1/0001 12:00:00 AM
db08 16 1/1/0001 12:00:00 AM
db09 16 1/1/0001 12:00:00 AM

And you get a result that the Release Manager finds much easier to use for their report.

Now that you know about the clip command, we hope it saves you time. Look at the example in Listing 3: you aren’t limited to one pipe in your commands. You can carry on piping for as long as a command produces output.

Listing 3. Get the Name, Size and Last Full Backup Time of Databases on the instance to the clipboard

PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | Select Name, Size, LastFullBackup | clip

PowerShell’s built-in commands enable you to save any data that you gather in a number of formats: XML, JSON, text, CSV, and more.

Suppose you want to save the results that you gathered in Listing 3 into a CSV file. PowerShell has a built-in command for this called Export-Csv. The -NoTypeInformation parameter removes the #TYPE information header from the CSV output; it isn’t required in PowerShell 6 or higher, where the header is omitted by default.

Listing 4. Get the name, size and last full backup time of the user databases on the instance and export to CSV

PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | Select Name, Size, LastFullBackup | Export-Csv -Path Databaseinfo.csv -NoTypeInformation

PS C:\> Get-Content DatabaseInfo.csv
"Name","Size","LastFullBackup"
"Northwind","8.25","6/10/2019 12:00:00 AM"
"pubs","8.18","6/10/2019 12:01:00 AM"
"db01","16","1/1/0001 12:00:00 AM"
"db02","16","1/1/0001 12:00:00 AM"
"db03","16","1/1/0001 12:00:00 AM"
"db04","16","1/1/0001 12:00:00 AM"
"db05","16","1/1/0001 12:00:00 AM"
"db06","16","1/1/0001 12:00:00 AM"
"db07","16","1/1/0001 12:00:00 AM"
"db08","16","1/1/0001 12:00:00 AM"
"db09","16","1/1/0001 12:00:00 AM"

You may notice the use of the command Get-Content. This command returns the contents of a file as a collection of string objects and, by default, displays the results on the screen.

Two other built-in PowerShell commands can be used to return information in different formats.

  • ConvertTo-Xml
  • ConvertTo-Json

These require you to pipe to Out-File if you wish to save that information to disk. You can pipe the results of any PowerShell command to Out-File to save the PowerShell output to disk.
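For instance, the same database summary could be saved as JSON with the following sketch (it reuses $sqlinstance1 from the earlier listings; ConvertTo-Json and Out-File are built-in commands):

```powershell
# Convert the selected properties to JSON, then pipe to Out-File
# to persist the result to disk
Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem |
    Select Name, Size, LastFullBackup |
    ConvertTo-Json |
    Out-File -FilePath DatabaseInfo.json
```

The same pattern works with ConvertTo-Xml; only the converter in the middle of the pipeline changes.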

Want to export to other data formats, including SQLite or Excel? You can find numerous useful modules in the PowerShell Gallery.

Exporting to Excel

As data professionals, we’re often asked to provide information in Excel. We recommend that you look at the excellent ImportExcel module written by Doug Finke. Despite its name, it does far more than just import data from Excel.

The module can be installed from the PowerShell Gallery with Install-Module -Name ImportExcel.
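As a quick sketch of what that enables (it assumes the module’s Export-Excel cmdlet and reuses the database summary from earlier; the file path is arbitrary):

```powershell
# Install the ImportExcel module once from the PowerShell Gallery
Install-Module -Name ImportExcel -Scope CurrentUser

# Export any objects straight to an .xlsx file; -AutoSize widens
# the columns to fit the data
Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem |
    Select Name, Size, LastFullBackup |
    Export-Excel -Path .\DatabaseInfo.xlsx -AutoSize
```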

Writing to a database

Writing information to files is useful, but as DBAs we find saving to a database table preferable because we can then make use of the data in Power BI or SSRS reports.

An added benefit to importing files to a database is that we’ll then be the ones in control of availability and backups!

Importing from a CSV to a database table

A common request that DBAs frequently hear is: “Can you add the contents of this CSV into the database?”

dbatools offers two ways to do this:

  • Import-DbaCsv
  • Import-Csv and Write-DbaDataTable

Which one should you use? It’s often a personal preference, but we find that Import-DbaCsv is better suited for larger CSVs because it’s optimized to keep memory usage low.

Using Import-DbaCsv

When you use Import-DbaCsv, it uses some streaming magic to move the data efficiently between the disk and the SQL Server.

The output from the command shows table information, the number of rows copied and even how quickly it copied them. This is useful when you’re testing a script with a smaller amount of data, as you can extrapolate how long it will take to load the data in your production environment.

Listing 5. Import CSV to SQL Server

PS C:\> Get-ChildItem -Path E:\csvs\top.csv | Import-DbaCsv -SqlInstance $sqlinstance1 -Database tempdb -Table top

ComputerName : SQLDEV01
InstanceName : MSSQLSERVER
SqlInstance : SQLDEV01
Database : tempdb
Table : top
Schema : dbo
RowsCopied : 2450
Elapsed : 55.16 ms
RowsPerSecond : 44663
Path : E:\csvs\top.csv

You can even import multiple CSVs at once! Frequently, we find that more than one CSV needs to be imported. In the same way that you passed multiple instances to -SqlInstance, you can pass multiple CSV files to Import-DbaCsv, as demonstrated in Listing 6.

Listing 6. Import multiple CSVs to SQL Server

PS C:\> Get-ChildItem E:\csv\top*.csv | Import-DbaCsv -SqlInstance sql2017 -Database tempdb -AutoCreateTable

ComputerName : SQLDEV01
InstanceName : MSSQLSERVER
SqlInstance : SQLDEV01
Database : tempdb
Table : top-tracks-lastfm-alltime
Schema : dbo
RowsCopied : 2450
Elapsed : 73.02 ms
RowsPerSecond : 33712
Path : E:\csv\top-tracks-lastfm-alltime.csv

ComputerName : SQLDEV01
InstanceName : MSSQLSERVER
SqlInstance : SQLDEV01
Database : tempdb
Table : top-tracks-lastfm-year ❶
Schema : dbo
RowsCopied : 1312
Elapsed : 65.41 ms
RowsPerSecond : 20160
Path : E:\csv\top-tracks-lastfm-year.csv ❷

❶ Auto-generated table name

❷ The CSV file’s base name is used as the auto-generated table name

In Listing 6, you can see that when the table name is unspecified, the base name of the CSV file is used. If the table doesn’t exist, -AutoCreateTable creates it for you. This saves time upfront, but the data types aren’t precise, which can mean longer import times, especially for large data sets. You may also need to transform the data types to use the data effectively.

We recommend that you pre-stage the tables by creating them before you run Import-DbaCsv. To find out more about Import-DbaCsv, visit dbatools.io/csv.
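Pre-staging might look like the following sketch, which creates the table with precise types before running the import (the column names and types here are hypothetical; match them to your CSV):

```powershell
# Pre-create the destination table with precise data types
# ([Rank] is bracketed because RANK is also a T-SQL function name)
$sql = "CREATE TABLE dbo.top ([Rank] int, Plays int, Artist nvarchar(200), Title nvarchar(200))"
Invoke-DbaQuery -SqlInstance $sqlinstance1 -Database tempdb -Query $sql

# Then import; no -AutoCreateTable needed, since the table already exists
Import-DbaCsv -Path E:\csvs\top.csv -SqlInstance $sqlinstance1 -Database tempdb -Table top
```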

Using Import-Csv with Write-DbaDataTable

Import-Csv and Write-DbaDataTable are two commands commonly used by data professionals.

Import-Csv is a powerful command that turns the text within CSV files into objects.

Listing 7. Import a CSV file to a PowerShell object

PS C:\> Import-Csv -Path E:\csv\top-tracks.csv | Select Rank, Plays, Artist, Title

Rank Plays Artist Title
---- ---- ------ -----
1 130 Nizlopi Freedom
2 55 The Courteeners Bide Your Time
3 50 Paloma Faith Stargazer
4 44 Citizen Cope Pablo Picasso
5 42 William Fitzsimmons After Afterall
6 40 Birdy Nam Nam Abbesses
7 39 Glasvegas Geraldine
8 35 Adele Melt My Heart to Stone
9 33 Florence + the Machine Howl
10 20 Paolo Nutini Rewind

Generally, Import-Csv is piped right to Write-DbaDataTable, as seen in Listing 8.

Listing 8. Add Contents of a CSV into a SQL Server database

# Import the CSV and write to the database
PS C:\> Import-Csv -Path .\Databaseinfo.csv | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases -AutoCreateTable

Here, we used tempdb as we know that this database always exists. Please remember that tempdb is recreated every time SQL Server restarts, therefore it isn’t the place to store things permanently!

Piping to Write-DbaDataTable is convenient and extremely fast for small batches, but slows down for larger datasets (similar to SQL’s RBAR concept). If you intend to import a large dataset, use Import-DbaCsv or the following syntax instead:

PS C:\> $csv = Import-Csv \\server\bigdataset.csv
PS C:\> Write-DbaDataTable -SqlInstance sql2014 -InputObject $csv -Database mydb

This syntax can also be found using Get-Help Write-DbaDataTable -Examples. For more information and alternative techniques, visit dbatools.io/rbar.

What have we done? Let’s take a look at the contents of the table “Databases”. You can use Invoke-DbaQuery to execute a SQL Query against a database.

Listing 9. Select from the newly created Databases table

PS C:\> $query = "Select * from Databases"
PS C:\> Invoke-DbaQuery -SqlInstance $sqlinstance1 -Query $query -Database tempdb

This returns results that match the contents of Databaseinfo.csv.

Which quote should we default to when creating strings, single or double? This is a debated topic within the PowerShell community. Microsoft’s own PowerShell documentation details the differences, but doesn’t prescribe one way or the other.

We always use double quotes unless literals are needed. A big reason for this is because T-SQL queries use single quotes. When passing queries to Invoke-DbaQuery, wrapping them in double quotes makes the most sense. Considering queries are such a big part of our PowerShell experience, we continue to use double quotes in other areas in order to remain consistent.
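A quick illustration of the difference (plain PowerShell, no SQL Server required):

```powershell
$name = "Northwind"

# Double quotes expand variables, so the T-SQL's own single quotes
# can stay literal inside the string
$query = "SELECT * FROM sys.databases WHERE name = '$name'"

# Single quotes keep everything literal, including $name
# (a doubled single quote '' is how you escape one inside)
$literal = 'SELECT * FROM sys.databases WHERE name = ''$name'''

$query    # SELECT * FROM sys.databases WHERE name = 'Northwind'
$literal  # SELECT * FROM sys.databases WHERE name = '$name'
```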

Invoke-DbaQuery is likely one of the commands you’ll use the most and we encourage you to explore its features.

By using the -AutoCreateTable parameter, you created a new table called Databases because it didn’t already exist. You also created three columns (Name, Size and LastFullBackup) that match the columns in the CSV file. But what data types do these columns have?

To find the data types of the columns you need another command, Get-DbaDbTable. This command returns information about the tables in a database, and you can use it to get the data types of the columns in a table as follows.

Listing 10. Get the data types of the columns

(Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases).Columns | Select-Object Parent, Name, DataType

This command combines two important things: accessing the properties of a PowerShell command’s result and piping them to Select-Object. The results look like this:

PS:\> (Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases).Columns | Select-Object Parent, Name, DataType

Parent Name DataType
------ ---- --------
[dbo].[Databases] Name nvarchar ❶
[dbo].[Databases] Size nvarchar ❶
[dbo].[Databases] LastFullBackup nvarchar ❶

You can see that the data types are all “nvarchar”.

You can get even more detailed information by expanding the DataType column.

Listing 11. Get detailed information about DataType

PS:\> (Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases).Columns | Select-Object -First 1 -ExpandProperty DataType

Name : nvarchar
SqlDataType : NVarCharMax
Schema :
MaximumLength : -1
NumericPrecision : 0
NumericScale : 0
XmlDocumentConstraint : Default
IsNumericType : False
IsStringType : True

The information here lets you know that the column was created as nvarchar(MAX).

Importing to a database table from a dbatools command

Instead of exporting the results to a CSV and then importing them into a database table, you can import the output of any PowerShell command straight to a database table.

If you’re following the examples, you need to remove the Databases table we created earlier. We can use the Drop() method on the table object returned by Get-DbaDbTable to do this.

Listing 12. Drop the newly created table

PS C:\> $sqlinstance1 = "e608f3dad95c"
PS C:\> (Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases).Drop()

By the way, if you’re wondering what dbatools’ command code looks like behind the scenes: when we created Remove-DbaDbTable, it essentially wrapped this same call, $dbtable.Drop().

Listing 13. Add results of Get-DbaDatabase into a SQL Server Database

# Get the Database information and write to the database
PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | Select Name, Size, LastFullBackup | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases -AutoCreateTable

Select from the table using Invoke-DbaQuery and observe that the results are the same.

Listing 14. Select from the newly created Databases table

PS C:\> $query = "Select * from Databases"
PS C:\> Invoke-DbaQuery -SqlInstance $sqlinstance1 -Query $query -Database tempdb

The data types of the columns in the auto-generated table look like this:

Listing 15. The data types of the Databases table

PS C:\> (Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases).Columns | Select-Object Parent, Name, DataType

Parent Name DataType
------ ---- --------
[dbo].[Databases] Name nvarchar ❶
[dbo].[Databases] Size float ❷
[dbo].[Databases] LastFullBackup datetime2 ❸

❶ Still “nvarchar”

❷ No longer “nvarchar”, now “float”

❸ No longer “nvarchar”, now “datetime2”

This time Write-DbaDataTable created columns with the data types of the incoming object from Get-DbaDatabase. You can use this method with any dbatools command that outputs objects. You can even use it with many PowerShell commands from other modules – both those included with PowerShell and those you add from the PowerShell Gallery or write yourself.

Creating the database table first and then importing from CSV

It’s important to remember that when you use Write-DbaDataTable to create database tables, it tries to create columns with data types that match the incoming object. If you want to explore the data types created, you can use the PowerShell command Get-Member, which shows the data types of the incoming object. Examining the output of Import-Csv and Get-DbaDatabase from the previous examples shows the difference and explains why the tables were created with different data types.

PS C:\> Import-Csv -Path .\Databaseinfo.csv | Get-Member
PS C:\> Get-DbaDatabase -SqlInstance $sqlinstance1 -ExcludeSystem | Select Name, Size, LastFullBackup | Get-Member
Figure 1. The data types of the CSV
Figure 2. The data types of the Database object

You can see that in the results for Import-Csv, the data types shown in the Definition column are all string, while the results for Get-DbaDatabase show datetime, string and double.

Using more accurate types is more efficient, both for PowerShell and for SQL Server.

If the object returned from Get-Member doesn’t have the data types you want, you need to create the table manually first.

Listing 16. Create a Databases table with T-SQL

Use tempdb
GO
CREATE TABLE [dbo].[Databases](
[Name] [nvarchar](7) NULL,
[Size] [float] NULL,
[LastFullBackup] [datetime2](7) NULL
) ON [PRIMARY]
GO

Then, you can import the CSV using Write-DbaDataTable as before, but this time without the -AutoCreateTable parameter. Omitting it isn’t strictly necessary; we do it here because you know for certain that the table has already been created. When you build Write-DbaDataTable into an automated solution where you don’t know whether the table exists, you can leave the -AutoCreateTable switch in: if a table with that name already exists, the command won’t try to create a new one.

You can import the CSV file, Databaseinfo.csv, into the existing Database table in tempdb.

We know! We’re sorry, we fooled you. If you’ve followed these instructions precisely, you’re now looking at results like this.

Listing 17. Good ol’ “String or binary data would be truncated”

PS C:\> Import-Csv -Path .\Databaseinfo.csv | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases
WARNING: [15:25:47][Write-DbaDbTableData] Failed to bulk import to [tempdb].[dbo].[Databases] | String or binary data would be truncated.

You get the error because the width of the Name column is too small to hold the Northwind database name. This is the infamous SQL Server error 8152, which occurs when the source and the destination don’t have matching data types or lengths. At the time of writing (March 2019), SQL Server 2016 (from SP2 CU6), 2017 and 2019 instances can enable trace flag 460, which replaces this error with the new error message 2628, giving more detail about the data that caused the failure.

With large datasets, this makes it tricky to work out which value is causing the failure. We have a PowerShell snippet that helps with this problem.

Listing 18. Getting the max length of columns in a datatable

$columns = ($datatable | Get-Member -MemberType Property).Name
foreach ($column in $columns) {
    $max = 0
    foreach ($row in $datatable) {
        if ($max -lt $row.$column.Length) {
            $max = $row.$column.Length
        }
    }
    Write-Output "$column max length is $max"
}

You need to pass a datatable object to the $datatable variable. To do this in this example, use the ConvertTo-DbaDataTable command with the CSV you created, as follows.

Listing 19. Find the max length of a column in the CSV

$datatable = Import-Csv -Path .\Databaseinfo.csv | ConvertTo-DbaDataTable
$columns = ($datatable | Get-Member -MemberType Property).Name
foreach ($column in $columns) {
    $max = 0
    foreach ($field in $datatable) {
        if ($max -lt $field.$column.Length) {
            $max = $field.$column.Length
        }
    }
    Write-Output "$column max length is $max"
}

This gives the following output:

LastFullBackup max length is 19
Name max length is 9
Size max length is 4

By comparing the length defined for each column in the table you created with the max length found in the datatable, you can see that the Name column is the one causing the issue. You can resolve this by altering the column’s data type.

Listing 20. Alter the column length

use tempdb
GO

ALTER TABLE Databases ALTER COLUMN [Name] nvarchar(10)

Once you alter the column length, the Write-DbaDataTable command succeeds.

Listing 21. Import a CSV into existing database table

PS C:\> Import-Csv -Path .\Databaseinfo.csv | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table Databases
PS C:\> Invoke-DbaQuery -SqlInstance $sqlinstance1 -Query "select * from Databases" -Database tempdb

Name Size LastFullBackup
---- ---- --------------
Northwind 8.25 6/10/2019 12:00:00 AM
pubs 8.18 6/10/2019 12:01:00 AM
db01 16 1/1/0001 12:00:00 AM
db02 16 1/1/0001 12:00:00 AM
db03 16 1/1/0001 12:00:00 AM
db04 16 1/1/0001 12:00:00 AM
db05 16 1/1/0001 12:00:00 AM
db06 16 1/1/0001 12:00:00 AM
db07 16 1/1/0001 12:00:00 AM
db08 16 1/1/0001 12:00:00 AM
db09 16 1/1/0001 12:00:00 AM

In this section you learned how to import data into a SQL Server database table from a CSV and from a dbatools command with Write-DbaDataTable. When your input object doesn’t have suitable data types, you can pre-create the table with more suitable data types than nvarchar(MAX), which is the default.

You also saw the SQL error message that you get if the data length is greater than the column in the table and a snippet of PowerShell code which identifies which column is causing the issue.

Writing the results of other commands to a table

Using Write-DbaDataTable to add data to a database table isn’t limited to CSVs and dbatools commands; you can take the output of any PowerShell command and import it into a database table. For example, you can get the currently running processes with Get-Process and use Write-DbaDataTable to add them to a table.

Listing 22. Import a sample of the running processes on a computer into a database table

PS C:\> Get-Process | Select -Last 10 | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table processes -AutoCreateTable

Try it now

Get the currently running processes, import them into a database table, and examine the results. When you look at the data types of the table, you can see that the command has created suitable data types.

Listing 23. Get the data types

PS C:\> (Get-DbaDbTable -SqlInstance $sqlinstance1 -Database tempdb -Table processes).Columns | Select-Object Parent, Name, DataType

Parent Name DataType
------ ---- --------
[dbo].[processes] Name nvarchar
[dbo].[processes] SI int
[dbo].[processes] Handles int
[dbo].[processes] VM bigint
[dbo].[processes] WS bigint
[dbo].[processes] PM bigint
[dbo].[processes] NPM bigint
[dbo].[processes] Path nvarchar
[dbo].[processes] Company nvarchar
[dbo].[processes] CPU nvarchar
[dbo].[processes] FileVersion nvarchar
[dbo].[processes] ProductVersion nvarchar
[dbo].[processes] Description nvarchar
[dbo].[processes] Product nvarchar
[dbo].[processes] __NounName nvarchar
[dbo].[processes] BasePriority int
[dbo].[processes] ExitCode int
[dbo].[processes] HasExited bit
[dbo].[processes] ExitTime datetime2
...

Writing the results of other commands to an Azure SQL Database

Imagine that you’ve been tasked with loading a database table with the current virtual machines in an Azure resource group. You need the Az module from the PowerShell Gallery to gather this information. Logging in to your Azure subscription is accomplished with the Connect-AzAccount command, which opens a login box.

IMPORTANT: If you use VS Code, this box may open behind the window, and you may need to minimize VS Code to find it.

When you finish the login process you can get the information about the virtual machines using the Get-AzVM command.

As you learned in this article, you can pipe the results of this command to Write-DbaDataTable to add this information to a table in a database.

Listing 24. Loading Azure VM details into SQL Server

# Login to Azure. If using VS Code, this will pop-up underneath a VS Code window
PS C:\> Connect-AzAccount

Account SubscriptionName TenantId Environment
------- ---------------- -------- -----------
dba@dbatools.io Microsoft Azure 7eb75625-3716-461e-bdb4-94670c36e593 AzureCloud

PS C:\> Get-AzVM -Status | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database tempdb -Table AzureVMs -AutoCreateTable

Now that you’ve loaded the output of Get-AzVM into your database, let’s take a look at selected columns.

Listing 25. Checking your work

PS C:\> $query = "SELECT [Name]
,[Location]
,[PowerState]
,[StatusCode]
FROM [AzureVMs]"
PS C:\> Invoke-DbaQuery -SqlInstance $sqlinstance1 -Query $query -Database tempdb

Name Location PowerState StatusCode
---- -------- ---------- ----------
big eastus VM deallocated OK
temp eastus VM deallocated OK
server centralus VM running OK
win10 centralus VM running OK

Notice the $query syntax. Many people are surprised that PowerShell supports multi-line string values, and we were, too! This is one of many ways that PowerShell tries to be as user-friendly as possible.

You may notice that some fields are flattened and contain only a class name as their string value. To get the underlying values into a database, you need to use Select-Object -ExpandProperty. For more information on this topic, please visit dbatools.io/rich.
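The idea is easiest to see locally. Here's a sketch using Get-Process instead of Get-AzVM, so no Azure login is needed; its Threads property is one such rich, nested value:

```powershell
# Without expanding, the rich Threads property flattens to a type name
Get-Process -Id $PID | Select-Object Name, Threads

# -ExpandProperty returns the nested thread objects themselves,
# whose properties can then be selected and written to a table
Get-Process -Id $PID |
    Select-Object -ExpandProperty Threads |
    Select-Object -First 3 Id
```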

We hope that this has given you lots of ideas for information that you can collect with PowerShell and save to a SQL database.

Copying Tables including their data

Now that you’ve learned how to save the results of a PowerShell command to a database table, you may be wondering about copying existing data. As a DBA, you may be tasked with writing or rewriting a query and need some representative data, so you’ll want a copy of the table to work with.

PowerShell splatting

Splatting is a brilliant word that always makes us smile. It’s also an extremely useful way of passing parameters to a PowerShell command in a format that’s easy to read and easy to alter.

Visual Studio Code makes it extra easy to splat. Check out Rob’s article at sqlps.io/splat to find out more.

Compare Figure 3 and Figure 4, each of which performs the same operation. The first is harder to read and harder to reuse with a different table. The second, which uses a technique called splatting, is easier to read, with the parameter values laid out beneath one another.

Figure 3. Copy-DbaDbTableData with parameters, conventional syntax
Figure 4. Copy-DbaDbTableData with parameters, splatting syntax
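Since the figures aren't reproduced here, the contrast looks roughly like this sketch, using Get-ChildItem as a stand-in for Copy-DbaDbTableData:

```powershell
# Conventional syntax: everything on one long line
Get-ChildItem -Path . -Filter *.csv -Recurse -File

# Splatting: the same call, with the parameters collected in a
# hashtable and passed via @ instead of $
$getChildItemSplat = @{
    Path    = '.'
    Filter  = '*.csv'
    Recurse = $true
    File    = $true
}
Get-ChildItem @getChildItemSplat
```

Both calls behave identically; the splatted form is simply easier to scan and to edit when a parameter value changes.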

Imagine that you’re required to perform some work on the Purchasing.PurchaseOrders table in the WideWorldImporters database and you want to have a copy of that table on your local instance to work on. This can be achieved with Copy-DbaDbTableData.

Listing 26. Using Copy-DbaDbTableData to copy table data

PS C:\> $copyDbaDbTableDataSplat = @{
SqlInstance = $sqlinstance1
Database = 'WideWorldImporters'
Table = '[Purchasing].[PurchaseOrders]'
Destination = $localhost
DestinationDatabase = 'WIP'
DestinationTable = 'dbo.PurchaseOrders'
AutoCreateTable = $true
}
PS C:\> Copy-DbaDbTableData @copyDbaDbTableDataSplat

SourceInstance : localhost,15591
SourceDatabase : WideWorldImporters
SourceSchema : Purchasing
SourceTable : PurchaseOrders
DestinationInstance : localhost,15591
DestinationDatabase : WIP
DestinationSchema : dbo
DestinationTable : PurchaseOrders
RowsCopied : 2074
Elapsed : 77.01 ms

It’s as easy as that: 2074 rows copied in a few milliseconds, and the data is ready to work with on your local machine.

Figure 5. Comparing the data

An important point: the command name Copy-DbaDbTableData emphasizes data. If you use the -AutoCreateTable parameter, it doesn’t copy the constraints, filegroups or indexes, as you can see from the table definitions in Figure 6.

If advanced table creation is required, you need to pre-create the table before importing the data.

Figure 6. Incomplete table definitions may be created when using AutoCreateTable

Now that you’ve learned how to write data to a SQL Server table, it’s time to find undocumented SQL Servers in your domain.

That’s all for this article. If you want to learn more about the book, check it out on Manning’s liveBook platform here.
