“Oops…I have deleted some data. Can you put it back?” – dbatools to the rescue

A few days ago I received a request to restore a dozen tables because someone had deleted more data than they were supposed to.
I immediately thought about dbatools for the job!

NOTE: I also thought about the SSMS “Import/Export Data” wizard, and that is OK when someone says “it’s to run just once, just now”. But when you have been in IT for a while, you know it is never only once :-). What if I need to re-run it? Or save it as an example and share it with my colleagues? Doing this with a PowerShell script makes it much more flexible.

In this post, I’m assuming that we have already restored the backup requested by the client, and that this restored copy will be our source database.
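
By the way, that restore can also be done with dbatools. Here is a minimal sketch, assuming a hypothetical backup path and reusing the placeholder instance/database names from the examples below (adjust the path, names and restore options to your environment):

# Minimal sketch: restore the client's backup as the source database we will copy from.
# The instance name, backup path and database name are placeholders.
Restore-DbaDatabase -SqlInstance sql1 -Path '\\fileshare\backups\db1_full.bak' -DatabaseName db1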

Let’s start – Finding the right command

I knew we had a Copy command for table data.

To search for commands within dbatools we can use Find-DbaCommand.

Find-DbaCommand -Pattern Copy*Table*Data

Note: As you can see, we can use wildcards in the search when using the -Pattern parameter.

From the results list, we can see that at least three commands (as of v0.9.824) match the specified pattern. Reading the Synopsis, the first command seems to be the one we need to accomplish the work.
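
Tip: if you want to see the names and synopses side by side, you can pipe the results to Select-Object. A small sketch (CommandName and Synopsis are the property names the output exposed on my build; check Get-Member if yours differs):

# Sketch: show only the command names and their synopses.
Find-DbaCommand -Pattern Copy*Table*Data | Select-Object CommandName, Synopsis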

Getting familiar with the command

Don’t forget to use the Get-Help cmdlet to find the available parameters and to get some examples of how you can use the command.

Get-Help Copy-DbaDbTableData -Detailed

Let’s try the command and copy the data

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
}
Copy-DbaDbTableData @params

Result

WARNING: [14:26:06][Copy-DbaDbTableData] Table [dbo].[Table] cannot be found in db2. Use -AutoCreateTable to automatically create the table on the destination.

Wait a moment…it yielded a warning?

Oh, I see…this will not work if the table does not exist on the destination database.

At this point I realised that some of the requested tables had been dropped. The client had not mentioned that.
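
If you want to confirm exactly which of the requested tables are gone before copying anything, you can ask the destination for them with Get-DbaDbTable. A quick sketch (instance, database and table names are placeholders) – any table that does not come back in the output no longer exists on the destination:

# Sketch: list which of the requested tables still exist on the destination (placeholder names).
$requested = 'Table1', 'Table2', 'Table3'
Get-DbaDbTable -SqlInstance sql2 -Database db2 -Table $requested | Select-Object Database, Schema, Name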

But dbatools has you covered, and the warning gives you a hint: use -AutoCreateTable, which will, per its description,

Creates the destination table if it does not already exist, based off of the “Export…” script of the source table.

That is nice!

I added the -AutoCreateTable parameter and reran the command

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
    AutoCreateTable = $true
}
Copy-DbaDbTableData @params

And now it has completed successfully.
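
If you want to double-check that the rows really landed, a quick count on the destination will do. A minimal sketch reusing the placeholder names from above (Copy-DbaDbTableData also reports the number of rows it copied in its output):

# Sketch: confirm the copy with a row count on the destination (placeholder names).
Invoke-DbaQuery -SqlInstance sql2 -Database db2 -Query 'SELECT COUNT(*) AS TotalRows FROM [dbo].[Table]'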

At this point you may be thinking…”everything is done!”

The client just said that he wanted to replace the data of a dozen tables with the data from the last backup. In this case some of the tables no longer existed because they had been dropped, so adding the -AutoCreateTable parameter helped.

And we are done! Aren’t we?

Not so fast!

If the table doesn’t have any indexes, triggers, etc., or you just want the table skeleton, then yes, the work here is done!
Otherwise, we are not finished yet. And that was my scenario.

The primary key was missing, and so were some constraints and indexes.
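
Because Get-DbaDbTable returns SMO table objects, it is easy to see what the auto-created copy is missing by comparing a few collections between source and destination. A sketch with the placeholder names used above:

# Sketch: compare index/constraint counts between the source table and the auto-created destination table.
$src = Get-DbaDbTable -SqlInstance sql1 -Database db1 -Table 'Table'
$dst = Get-DbaDbTable -SqlInstance sql2 -Database db2 -Table 'Table'

'Indexes           - source: {0} | destination: {1}' -f $src.Indexes.Count, $dst.Indexes.Count
'Foreign keys      - source: {0} | destination: {1}' -f $src.ForeignKeys.Count, $dst.ForeignKeys.Count
'Check constraints - source: {0} | destination: {1}' -f $src.Checks.Count, $dst.Checks.Count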

Beyond default scripting options

As I demonstrated in my last post, Scripting SQL Server objects with dbatools – Beyond default options, we can create a ScriptingOptions object and use the -ScriptingOptionsObject parameter available on the Export-DbaScript command to specify more than the defaults.

Exploring other options

As stated in the previous post, we have some properties like:

  • Indexes
  • DriPrimaryKey
  • DriForeignKeys
  • etc

By default they are set to $false, and that explains why our -AutoCreateTable parameter (which uses the default options) didn’t bring all of these details from our source tables.
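
You can confirm those defaults yourself by dumping the relevant properties of a fresh options object. A small sketch (the wildcards simply catch the constraint and index related properties):

# Sketch: show the default values of the index/constraint related scripting options.
$options = New-DbaScriptingOption
$options.PSObject.Properties |
    Where-Object { $_.Name -like 'Dri*' -or $_.Name -like '*Indexes' } |
    Select-Object Name, Value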

This means that, with a little more code, we will be able to accomplish our requirements.

The code

Note: This is a modified version (to meet my needs) of the original script that I borrowed from Andy Levy’s (b | t) blog post Copying Individual Tables with dbatools. Andy is also a dbatools contributor.

$SourceServer = "SQL1";
$DestinationServer = "SQL2"
$SourceDB = "srcDB";
$DestinationDB = "dstDB";
$tables = "Table1", 'Table2', "Table3";

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriUniqueKeys = $true
$options.DriClustered = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true

$tables | ForEach-Object {
    # Get the table definition from the source
    [string]$tableScript = Get-DbaDbTable -ServerInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru;

    if (-not [string]::IsNullOrEmpty($tableScript)) {
        if ($null -eq (Get-DbaDbTable -ServerInstance $DestinationServer -Database $DestinationDB -Table $_)) {
            # Run the script to create the table in the destination
            Invoke-DbaQuery -Query $tableScript -ServerInstance $DestinationServer -Database $DestinationDB;
        }
        else {
            Write-Warning "Table $_ already exists in detination database. Will continue and copy the data."
        }

        # Copy the data
        Copy-DbaDbTableData -ServerInstance $SourceServer -Database $SourceDB -DestinationDatabase $DestinationDB -KeepIdentity -Truncate -Table $_ -DestinationTable $_;
    }
    else {
        Write-Warning "Table $_ does not exists in source database."
    }
}

This script performs the following steps:

  • New-DbaScriptingOption creates a new ScriptingOptions object, and then we set a handful of its properties to $true.

Then, for each of the tables that we define in our $tables list variable:

  • Export-DbaScript will generate the T-SQL script for the source table using the properties that we have defined. In this case: keys (primary, foreign, unique), default and check constraints, and also clustered and non-clustered indexes.

  • Invoke-DbaQuery will run the generated script on the destination database, so at this point we have the same table structure on both sides. This will only run if the table does not exist on the destination database.

  • Finally, we use our Copy-DbaDbTableData command to copy the records. Here I have chosen to truncate the table with the -Truncate parameter and to keep identity values by specifying -KeepIdentity.

Special callout:

If these tables have relationships between them, you need to specify the table list in a specific order to make sure that parent tables are created before child tables. Otherwise you will get, as expected, some errors.
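
If you are not sure about the right order, the same SMO objects can tell you which tables each one references, so you can put the parents first. A sketch that reuses the $SourceServer, $SourceDB and $tables variables from the script above:

# Sketch: list which tables each requested table references, to help order parents before children.
Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $tables | ForEach-Object {
    [PSCustomObject]@{
        Table      = $_.Name
        References = ($_.ForeignKeys | ForEach-Object { $_.ReferencedTable }) -join ', '
    }
}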

Final thoughts

dbatools has, literally, hundreds of commands, so it’s important to know how to find the right one for the job.
Make sure you take a look at the help to understand what is, or is not, covered by the process. As we saw in this example, assuming that the table would be created was a pitfall. For that we needed another parameter, and even then the parameter description tells us that the table will not be created with its full definition: the primary key, constraints, indexes, etc. will not be created by default.

Because we are using PowerShell, and there is more than one way to accomplish a task, combining commands such as New-DbaScriptingOption makes the process more flexible and makes it possible to reach the final result that we need.

Thanks for reading!

Scripting SQL Server objects with dbatools – Beyond default options

You have probably needed to script out some objects from a SQL Server instance or database, and this is quite easy. You just need to right-click on the object (well…not on every single one – try it with an Availability Group :-), there is no script option available), select “Script XXXX as” and you have it.

But have you realized that this option doesn’t bring everything?
Let’s say you are scripting a table that has a non-clustered index or a trigger…using this option, some of the objects under the table will not be scripted out. I understand this, as each of these is a different object, which means you can go to each one, right-click it and script just that one.

SSMS – “Generate Scripts…” option

This is tedious work and you can easily miss some objects. What can we do in SSMS to make this task easier?

You can accomplish this task by using the “Generate Scripts…” option under “Tasks” when you right-click on the database:

This will open the wizard, and on the “Set Scripting Options” step you can click on the “Advanced” button, where you can change the properties.
Here you can see that some defaults are in place, and “Script Indexes” is one of them.

This is much easier, right? All in one single scripting operation.

But what about automating this task? Say you want to script out multiple objects from different instances/databases.

Enter dbatools’ “Export-” commands

To search for commands within dbatools we can use Find-DbaCommand.

Find-DbaCommand -Tag Export

This command gives us a nice Synopsis text that helps us find the command we want. From the output we can see that we have (as of v0.9.824) 5 commands tagged as Export.

To replicate our SSMS example using PowerShell, we will use Export-DbaScript.

Don’t forget to use the Get-Help cmdlet to find the available parameters and to get some examples of how you can use the command.

Get-Help Export-DbaScript -Detailed

Tip: Try other switches like -Examples or even -ShowWindow (the latter won’t work on PowerShell Core, but dbatools does!).

Example using our “MyTable” object

Here is what MyTable looks like:

  • 3 columns
  • 1 default constraint
  • 1 non-clustered index

Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru

Note: I’m using the -PassThru parameter to output the script to the console; by default, the command creates a .sql file.
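
If you do want the file, drop -PassThru and point the command at a path. A sketch (the path is a placeholder; the parameter for a specific file is -FilePath on recent dbatools versions, while older builds used -Path, so check Get-Help on your build):

# Sketch: write the script to a file instead of the console (placeholder path).
# Recent dbatools versions use -FilePath for a specific file; older builds used -Path.
Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -FilePath 'C:\temp\MyTable.sql'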

The output of that Export-DbaScript run is even more incomplete when compared with SSMS. Here, we don’t even get the default constraint scripted.

Using “New-DbaScriptingOption” command

dbatools has a command that makes it possible to create an object of type ScriptingOptions.
We can then change its properties, just like we did before with the “Generate Scripts…” option in SSMS.

$options = New-DbaScriptingOption
$options | Get-Member

Use Get-Member so you can see what properties the object offers.

Here we start seeing what we need.

By default, what are the values of properties like NonClusteredIndexes and DriDefaults?

$options = New-DbaScriptingOption
$options.NonClusteredIndexes
$options.DriDefaults

False! That explains why they are “missing” from our default Export-DbaScript output.

NOTE: The Export-DbaUser command can also leverage this object. Try it.
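
For instance, something along these lines should script the database users while honouring the same options object. This is a sketch only, so double-check the parameters with Get-Help Export-DbaUser on your version:

# Sketch: reuse the scripting options object with Export-DbaUser (instance/database are placeholders).
$options = New-DbaScriptingOption
$options.IncludeIfNotExists = $true
Export-DbaUser -SqlInstance SQL1 -Database DB1 -ScriptingOptionsObject $options -Passthru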

Let’s change these options to $true, pass our $options object as the value of the -ScriptingOptionsObject parameter, and run the command again.

$options = New-DbaScriptingOption
$options.NonClusteredIndexes = $true
$options.DriDefaults = $true
Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru -ScriptingOptionsObject $options

Nice! Now we can see everything.

Try it yourself

See the other options available, change the values, rerun the command and analyse the output.
Do you need the script to run on a lower SQL Server version? Change the TargetServerVersion option.

# Will script the code as for SQL Server 2014
$options.TargetServerVersion = "Version120"

Do you want to include the “IF NOT EXISTS” statement? Change the IncludeIfNotExists option.

Here is an example that scripts out a list of tables (add more to the $TableName variable):

$SourceServer = "SQL1";
$SourceDB = "DB1";
$TableName = "MyTable", 'YourTable';

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true
$options.Indexes = $true
$options.IncludeIfNotExists = $true

$TableName | ForEach-Object {
    Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru
}

Availability Groups example using dbatools

Try it yourself:

Get-DbaAvailabilityGroup -SqlInstance SQL1 -AvailabilityGroup SQL1_AG1 | Export-DbaScript -Passthru

Summary

We have seen how we can leverage some dbatools commands to generate T-SQL scripts from objects. This way we can version them, or simply run them on another instance/database. We have also seen what the default options do (and don’t) bring, and how to add more options to the output.
We also saw that, for some objects such as Availability Groups, the right-click script option is not available, but dbatools can help you there too.

Thanks for reading!

I will be speaking @ SQLSaturday #795 Porto

On 29th September we will have SQL Saturday Porto and I’m proud to announce that I will be speaking!

I will be speaking about the dbachecks PowerShell module – dbachecks – Checking SQL Server environments with PowerShell.

The event will be held at ISEP, Rua Dr. António Bernardino de Almeida, 431, 4249-015 Porto, Portugal and you can see the full schedule here.

Register, it’s free!

Hope to see you in Porto! If you see me, come and say hi!

I will be speaking @ Tuga IT 2018

Starting on the 19th of July and running for 3 days, the Tuga IT 2018 Summer Edition will be happening in Lisbon!

This edition will have:

  • 4 full-day workshops
  • 38 breakout sessions (1 hour)
  • 5 deep dive sessions (2 hours)

That is over 72 hours of content delivered by 44 awesome speakers, many of them recognized industry experts.

You can see the full schedule here. This edition will have sessions about: Cloud, Data Platform, Development, Integration, IoT, Office, PowerShell and Security.

I will be speaking on the 19th about the dbachecks PowerShell module for SQL Server professionals. You can find more information about the session here: dbachecks – Checking SQL Server Environments With PowerShell.

The event will be held at Microsoft Headquarters in Lisbon.

What are you waiting for? Registration for the main event is free for all attendees, with the exception of the full-day workshops.
Main event registrations
Workshop registrations

Hope to see you in Lisbon!

SQL Saturday #731 Athens 2018


In less than two weeks (on 16th June) it’s time for SQL Saturday Athens and I’m proud to announce that I will be speaking!

This will be my first SQL Saturday ever outside Portugal, both as a speaker and as an attendee, so I’m especially thrilled and excited to be part of it!

I will be presenting my Administrate SSRS/PBIRS with PowerShell session, and I can share first-hand that it will include new demos with some fixes and enhancements that will ship in the next version of the module later this week.

The event will be held at Microsoft Hellas, 221 Kifisias Ave, Marousi, Attica, 151 24, and you can see the full schedule here.

Hope to see you in Athens! If you see me, come and say hi!

Learn, Evolve and Giving Back – TSQL Tuesday #102

https://scribnasium.com/2018/05/giving-back-t-sql-tuesday-102-invite/

This month’s T-SQL Tuesday is brought to us by Riley Major (b | t), and he encourages us to talk about how we are giving back to our community.

This is the 102nd edition of TSQL2sDay – the brainchild of Adam Machanic (b | t).

Let’s go back for a moment:

I heard about SQL Server for the first time back in 2003, when I was in high school and had a database class. It was a slow start and, at that moment, I hadn’t imagined that I would follow that path professionally.

Learn

At that time I didn’t even know that a community existed. But in 2006 I went to a professional course (level IV – a degree before a bachelor’s) that was based on Microsoft technologies – .NET (Web and Windows Forms) – and I had a database class using SQL Server 2000!
That was when I started using the internet more and more for study, and found some blog posts and SQLServerCentral. So I can say that I started consuming knowledge from the community back in 2006.

A year later I started my internship in what was my first IT job. At the time I needed to work with a bit of everything, but I was more focused on SQL Server development. The guy who held my position before I arrived had taken a manager role, but he was a big lover of SQL Server and I learnt a LOT from him! At the time he subscribed to SQL Server Magazine in paper! (Later the name changed to SQL Server Pro.)

SQL Server Magazine, October 2007

You can find the magazine archives in PDF format here.

That was when I started learning who people like Itzik Ben-Gan and Kalen Delaney were!

Evolve

For about five years I was a compulsive consumer of blog posts related to SQL Server development, but mostly when I had a problem and wanted some guidance on the solution. During that time I didn’t share much knowledge on the internet.

Then, in 2012, I went to my first SQL Port user group meetings and my first ever SQL Saturday, 115 Portugal. That was my first real contact with the SQL Server community, where I met some speakers and started to join the monthly meetings regularly.

Giving back

In 2015 I joined Twitter and decided that I wanted to share my knowledge even more. So, after being approached several times, I gathered the courage and did my first public presentation for the SQL Port user group in Lisbon, and right after that I submitted a session for SQL Saturday Oporto 2015 and was accepted.
You know, start small and grow…so I started talking in Portuguese for the user group and then went to the SQL Saturday, where I gave my first public speech in English.

Then came 2016 and the first edition of TugaIT (the CFS is open for this year’s Summer Edition). At that time I was “just” a volunteer, but I had a gut feeling – I had discovered, a few months earlier, that dbatools was a thing, and decided to go and talk with its creator, Chrissy LeMaire. I had written a couple of PowerShell scripts to help me and wanted to ask her whether she intended to extend the module towards best practices. We talked for about an hour, exchanged contacts and started talking furiously about dbatools, and then…I was doing a PR with a full command (Expand-DbaTLogResponsibly). That was when I felt I was making my first “more international” contribution to the community.

Since then I have been more active in the community, mainly around SQL Server and PowerShell.

Some contribution points:

What will you do?

As you can see, I wasn’t born inside the community; I have grown because of it, and I decided to start giving back less than 3 years ago.
Does it hurt? Not at all!
Does it help? A lot!
Does it help me be a better professional? For sure! I have learned so much from many different sources, and every day is a new learning day!

Thanks for reading.