dbatools v1.0? It’s available – Check it out!

Dear reader, before continuing, please open a PowerShell console and run the following command:

Install-Module -Name dbatools

If you are doing this on the date of this blog post, you have just installed dbatools v1.0!
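If you already have a previous version installed from the PowerShell Gallery, updating should get you to v1.0 as well:

Update-Module -Name dbatools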

After more than 200 commits, the work of more than 20 contributors and 20 days since the last published version, dbatools v1.0 is live!

To all of you who have contributed directly or indirectly to the module, a big thank you!

Fun fact

I’m pretty sure this was one of the longest periods without a new release since we started publishing more often.
The bare minimum has been 1 release per week 🙂

But there are good reasons for it! v1.0 brings:
– Standardised code – parameter names / output
– Code cleanup
– More tests
– Azure connections supported
– And of course, fixes and 13 new commands.

You can read the v1.0 change log for more details.

New commands

Of the 13 new commands, I decided to share the ones that make it possible to interact with Server/Database roles.

Here is the list of the newest commands:
Add-DbaDbRoleMember
Get-DbaDbRole
New-DbaDbRole
New-DbaInstanceRole
Remove-DbaDbRole
Remove-DbaDbRoleMember
Remove-DbaInstanceRole

Note: Database Application Roles are not covered yet.
Note 2: A new command to add logins to one (or more) server roles is being cooked up.
This is why we release often – improvements and new features are always in our pipeline.

Code examples

Here is a short script that shows how you can leverage these new commands.
Don’t forget to use Get-Help or visit our docs page to learn more about the commands.
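For example, to see the full help and examples for one of the new role commands:

Get-Help New-DbaDbRole -Detailed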

$instance = "sql2016"
$login = "domain\user"
$newServerRole = "securityMaster"
$defaultExistingServerRole = "sysadmin"

$database = "db1"
$username = $login
$newDatabaseRole = "SPExecutor"

### Create

# Create new login and set default database
New-DbaLogin -SqlInstance $instance -Login $login -DefaultDatabase $database

# Create new server role
New-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole

# Create new database user
New-DbaDbUser -SqlInstance $instance -Database $database -Username $username -Login $login

# Create new database role
New-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Add new user to the newly created database role
Add-DbaDbRoleMember -SqlInstance $instance -Database $database -User $username -Role $newDatabaseRole


### Now using Get-Dba*Role*

# Get all members of a role (or list of roles)
Get-DbaInstanceRoleMember -SqlInstance $instance -ServerRole $defaultExistingServerRole | Format-Table -AutoSize


# Get the newly created server role 'securityMaster' and the default existing role 'sysadmin'
Get-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole, $defaultExistingServerRole


### Database level

# Get the newly created 'SPExecutor' role
Get-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Get all user members of a role (or list of roles)
Get-DbaDbRoleMember -SqlInstance $instance -Database $database -Role $newDatabaseRole


### Clean up

# Remove user from database role
Remove-DbaDbRoleMember -SqlInstance $instance -Database $database -Role $newDatabaseRole -User $username

# Remove role from database
Remove-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Remove server role from instance
Remove-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole
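If you also want to undo the login and user created in the Create section, two existing dbatools commands (not part of the new 13) can do it. A possible sketch:

# Remove the database user
Remove-DbaDbUser -SqlInstance $instance -Database $database -User $username

# Remove the login
Remove-DbaLogin -SqlInstance $instance -Login $login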

Hope you found them as useful as we did!

Wrap up

Even though this is a milestone for us, we will keep working on the module, bringing more awesomeness to it!

We want to hear from you!
If you have questions, suggestions, requests or you just want to give a shout-out to the team, you can:
– Request a feature or report a bug
– Join the #dbatools channel on the SQL Community Slack
– Find us on Twitter – @psdbatools

Other useful links:
Website and blog
Command list and documentation

To celebrate the launch of v1.0 we have a lot of blog posts related to it!

dbatools 1.0 has arrived by Chrissy
dbatools 1.0 – the tools to break down the barriers – Shane O’Neill
dbatools 1.0 is here and why you should care – Ben Miller
dbatools 1.0 and beyond – Joshua Corrick
dbatools 1.0 – Dusty R
Your DBA Toolbox Just Got a Refresh – dbatools v1.0 is Officially Available!!! – Garry Bargsley
updating sql server instances using dbatools 1.0 – Gareth N

Enjoy dbatools v1.0!

Thanks for reading!

“Ups…I have deleted some data. Can you put it back?” – dbatools for the rescue

A few days ago I received a request to restore a dozen tables because someone had deleted more data than they were supposed to.
I immediately thought about dbatools for the job!

NOTE: I also thought about the SSMS “Import/Export Data” wizard. That is OK when someone says “it’s to run just once, just now”. But when you have been in IT for a while, you know it is never just once :-). And what if I need to re-run it? Or save it as an example and share it with my colleagues? Doing this with a PowerShell script makes it much more flexible.

In this post, I’m assuming that we have already restored what will be our source database, using the backup requested by the client.

Let’s start – Finding the right command

I knew we had a Copy command for table data.

To search for commands within dbatools, we can use the Find-DbaCommand command.

Find-DbaCommand -Pattern Copy*Table*Data

Note: As you can see, we can use wildcards in the search when using the -Pattern parameter.

From the results list, we know that we have at least three commands (as of v0.9.824) that match the specified pattern. Reading the Synopsis, the first command seems to be the one we need to accomplish the work.

Get familiar with the command

Don’t forget to use the Get-Help cmdlet to find the available parameters and get some examples of how you can use the command.

Get-Help Copy-DbaDbTableData -Detailed

Let’s try the command and copy the data

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
}
Copy-DbaDbTableData @params

Result

WARNING: [14:26:06][Copy-DbaDbTableData] Table [dbo].[Table] cannot be found in db2. Use -AutoCreateTable to automatically create the table on the destination.

Wait a moment…it yielded a warning?

Oh, I see…this will not work if the table does not exist on the destination database.

At this point I realised that some of the requested tables had been dropped. The client did not mention that.

But dbatools has got you covered, and the warning gives you a hint: use -AutoCreateTable which will, per its description,

Creates the destination table if it does not already exist, based off of the “Export…” script of the source table.

That is nice!

I added the -AutoCreateTable parameter and reran the command:

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
    AutoCreateTable = $true
}
Copy-DbaDbTableData @params

And now it has completed successfully.

At this point you may be thinking…”everything is done!”

The client just said that he wanted to replace all the data from the last backup for a dozen tables. In this case some of the tables no longer existed (they had been dropped), so adding the -AutoCreateTable parameter helped.

And we are done! Aren’t we?

Not so fast!

If the tables don’t have any indexes, triggers, etc., or you just want the table skeleton, then yes, the work here is done!
Otherwise, we are not finished yet. And that was the scenario here.

The primary key was missing, and so were some constraints and indexes.
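A quick way to see what is missing is to compare both sides. Get-DbaDbTable returns an SMO table object, so we can look at, for example, its Indexes collection on the source and on the destination (reusing the instance/table names from the earlier example):

# Compare the index names on each side
(Get-DbaDbTable -SqlInstance sql1 -Database db1 -Table '[dbo].[Table]').Indexes.Name
(Get-DbaDbTable -SqlInstance sql2 -Database db2 -Table '[dbo].[Table]').Indexes.Name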

Beyond default scripting options

As I demonstrated in my last post Scripting SQL Server objects with dbatools – Beyond default options, we can generate a ScriptingOptions object and use the -ScriptingOptionsObject parameter available on the Export-DbaScript command to go beyond the defaults.

Exploring other options

As stated in the previous post, we have some properties like:

  • Indexes
  • DriPrimaryKey
  • DriForeignKeys
  • etc

By default they are set to $false, and that explains why our -AutoCreateTable parameter (which uses the default options) didn’t bring all of these details from our source tables.
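You can confirm it yourself in a couple of lines:

$options = New-DbaScriptingOption
# All of these return False on a fresh ScriptingOptions object
$options.Indexes
$options.DriPrimaryKey
$options.DriForeignKeys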

This means that with a little bit more code we will be able to accomplish our requirements.

The code

Note: This is a modified version (to meet my needs) of the original script that I borrowed from Andy Levy’s (b | t) blog post Copying Individual Tables with dbatools. Andy is also a dbatools contributor.

$SourceServer = "SQL1"
$DestinationServer = "SQL2"
$SourceDB = "srcDB"
$DestinationDB = "dstDB"
$tables = "Table1", "Table2", "Table3"

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriUniqueKeys = $true
$options.DriClustered = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true

$tables | ForEach-Object {
    # Get the table definition from the source
    [string]$tableScript = Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru

    if (-not [string]::IsNullOrEmpty($tableScript)) {
        if ($null -eq (Get-DbaDbTable -SqlInstance $DestinationServer -Database $DestinationDB -Table $_)) {
            # Run the script to create the table in the destination
            Invoke-DbaQuery -Query $tableScript -SqlInstance $DestinationServer -Database $DestinationDB
        }
        else {
            Write-Warning "Table $_ already exists in destination database. Will continue and copy the data."
        }

        # Copy the data to the destination instance/database
        Copy-DbaDbTableData -SqlInstance $SourceServer -Database $SourceDB -Destination $DestinationServer -DestinationDatabase $DestinationDB -KeepIdentity -Truncate -Table $_ -DestinationTable $_
    }
    else {
        Write-Warning "Table $_ does not exist in source database."
    }
}

This will do the following steps:

  • New-DbaScriptingOption creates a new object of the ScriptingOptions type, and then we set a bunch of its properties to $true.

Then, for each of the tables that we define in our $tables list variable:

  • Export-DbaScript will generate the T-SQL script from the source table using the properties that we have defined. In this case: Keys (Primary, Foreign, Unique), Default and Check constraints, and also clustered and non-clustered indexes.

  • Invoke-DbaQuery will run the generated script on the destination database. At this point we have the same table structure on both sides. This step only runs if the table does not exist on the destination database.

  • Finally we use our Copy-DbaDbTableData command to copy the records. Here I have chosen to truncate the table with the -Truncate parameter and keep identity values by specifying -KeepIdentity.

Special callout:

If these tables have relations between them, you need to specify the table list in a specific order, to make sure that parent tables are created before child tables. Otherwise you will get, as expected, some errors.
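For example, if a hypothetical Orders table has a foreign key referencing a Customers table, the parent must come first in the list:

# Parent table first, then the child table(s) - hypothetical names
$tables = "Customers", "Orders"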

Final thoughts

dbatools has, literally, hundreds of commands, so it’s important to know how to find the right one for the job.
Make sure you take a look at the help to understand what is and isn’t covered by the process. As we saw in this example, assuming that the table would be created is a pitfall. For that, we need to use another parameter, and even then, reading the parameter description tells us that the table will not be created with its full definition. This means that the primary key, constraints, indexes, etc. will not be created by default.

Because we are using PowerShell, and there is more than one way to accomplish a task, combining these commands with New-DbaScriptingOption makes the process more flexible and able to reach the final result that we need.

Thanks for reading!

Scripting SQL Server objects with dbatools – Beyond default options

You have probably needed to script out some objects from a SQL Server instance/database, and this is quite easy. You just need to right-click on the object (well…not on every single one – try it with an Availability Group :-), no script option available), select “Script XXXX as” and you have it.

But have you noticed that this option doesn’t script out everything?
Let’s say you are scripting a table that has a non-clustered index or a trigger…using this option, some of the objects under the table will not be scripted out. I understand this, as each one of these is a different object, which means you can go to each one, right-click and script just that one.

SSMS – “Generate Scripts…” option

This is tedious work and you can easily miss some objects. What can we do in SSMS to make this task easier?

You can accomplish this task by using the “Generate Scripts…” option under “Tasks” when you right-click on the database:

This will open the wizard, and on the “Set Scripting Options” step you can just click the “Advanced” button, where you can change the properties.
Here you can see that some defaults are in place, and “Script Indexes” is one of them.

This is much easier, right? All-in-one in a single scripting operation.

What about automating this task? Say you want to script out multiple objects from different instances/databases.

Enter dbatools’ “Export-” commands

To search for commands within dbatools, we can use the Find-DbaCommand command.

Find-DbaCommand -Tag Export

This command gives us a nice Synopsis text that helps us find the command we want. From the output we can see that we have (as of v0.9.824) 5 commands tagged as Export.

To replicate our SSMS example using PowerShell, we will use Export-DbaScript.

Don’t forget to use the Get-Help cmdlet to find the available parameters and get some examples of how you can use the command.

Get-Help Export-DbaScript -Detailed

Tip: Try other switches like -Examples or even -ShowWindow (the latter won’t work on PS Core, but dbatools does!).
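For instance, to jump straight to the usage examples:

Get-Help Export-DbaScript -Examples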

Example using our “MyTable” object

Here is what MyTable looks like:

  • 3 Columns
  • 1 Default constraint
  • 1 Non-Clustered Index

Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru

Note: I’m using the -Passthru parameter to output the script to the console; by default it will create a SQL file.

The output of this execution is even more incomplete when compared with SSMS. Here, we don’t even get the default constraint scripted.

Using “New-DbaScriptingOption” command

dbatools has a command that makes it possible to create an object of type ScriptingOptions.
Then we can change its properties, just like we did before in the “Generate Scripts…” option in SSMS.

$options = New-DbaScriptingOption
$options | Get-Member

Use Get-Member so you can see what properties the object offers.

Here we start seeing what we need.

By default, what are the values of properties like NonClusteredIndexes and DriDefaults?

$options = New-DbaScriptingOption
$options.NonClusteredIndexes
$options.DriDefaults

False! That explains why they are “missing” from our default Export-DbaScript output.

NOTE: The Export-DbaUser command can also leverage this object. Try it.

Let’s change these options to $true, pass our $options object as the value of the -ScriptingOptionsObject parameter, and run the command again.

$options = New-DbaScriptingOption
$options.NonClusteredIndexes = $true
$options.DriDefaults = $true
Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru -ScriptingOptionsObject $options

Nice! Now we can see all the stuff.

Try it yourself

See the other options available, change the values, rerun and analyse the output.
Do you need to export it to run on a lower SQL Server version? Change the TargetServerVersion option.

# Will script the code as for SQL Server 2014
$options.TargetServerVersion = "Version120"

You want to include the “IF NOT EXISTS” statement? Change the “IncludeIfNotExists” option.

Here is an example to script out a list of tables (add more to the $TableName variable):

$SourceServer = "SQL1"
$SourceDB = "DB1"
$TableName = "MyTable", "YourTable"

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true
$options.Indexes = $true
$options.IncludeIfNotExists = $true

$TableName | ForEach-Object {
    Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru
}

Availability Groups example using dbatools

Try it yourself:

Get-DbaAvailabilityGroup -SqlInstance SQL1 -AvailabilityGroup SQL1_AG1 | Export-DbaScript -Passthru

Summary

We have seen how we can leverage some dbatools commands to generate T-SQL scripts from objects. This way we can version them or just run them on another instance/database. We have also seen which options the defaults bring (and don’t bring) and how to add more options.
We also saw that, for some objects, the script option is not available (Availability Groups, for example), but dbatools can help you here too.

Thanks for reading!

Don’t cutoff yourself with dates in T-SQL – Did you know…

Almost all the applications we use have a lot of options. And sometimes we even stumble across them by accident.
Who hasn’t heard something like “Normal users don’t even use 5% of Excel’s capabilities!”?

Other options we know exist but, because the default value is OK (so far), we tend to forget them. They are just sitting there, waiting for the day we want/need to change them.

The oddity

A few days ago I was talking with a friend who showed me some “odd behavior” when working with dates in SQL Server.

Are you mad?! Is this a SQL Server bug?

In case you ask, this is my table definition:

CREATE TABLE DatesFormat
(
    dt DATETIMEOFFSET DEFAULT ('1900-01-01')
);
GO

Let’s take a closer look at the SELECT.

  1. On the 1st column, we can see the complete value of the column
  2. The 2nd column picks the year from the date column
  3. The 3rd one also picks the year from a datetime, but one declared as a string.

But why don’t the 2nd and 3rd columns return the exact same value?!

What is your interpretation?

What do you read when you see a date in a format like “01-Jan-00 00:00:00.000”? Keep in mind that I’m talking about the output directly from the table, without any formatting.
1st of January seems to leave no doubt (just because there is no default date format starting with a two-digit year), but…what about the year part ’00’?
Does it stand for 1900, and the 3rd column is wrong?
Or does it stand for 2000, and the DATEPART function is returning the wrong value?

Both are returning the correct value! Say hello to the “Two Digit Year Cutoff” configuration

You can find it on the Advanced tab of the Server Properties window:

Or by running the sp_configure command:

EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
EXEC sp_configure 'two digit year cutoff';
GO

Or even by using the dbatools PowerShell module:

Get-DbaSpConfigure -SqlInstance sql2016 -ConfigName 'TwoDigitYearCutoff'

Output:

That’s right! This option is the one that makes the last column’s value be translated to 2000 and not 1900.

If we change the configuration to 1999 instead of 2049 (the default value), the result of the DATEPART will be 1900; leaving it at 2049 converts the year to 2000 (for the date passed as a string).

The test:

-- Check the running value
EXEC sp_configure 'two digit year cutoff';
GO

-- Notice the 3rd column
SELECT dt, DATEPART(YEAR, dt), DATEPART(YEAR, '01-Jan-00 12:00:00 AM +00:00') 
  FROM dbo.DatesFormat
GO

-- Change the configuration
EXEC sp_configure 'two digit year cutoff', 1999;
GO
RECONFIGURE;
GO

-- Confirm the running value
EXEC sp_configure 'two digit year cutoff';
GO

-- See the different value on the 3rd column
SELECT dt, DATEPART(YEAR, dt), DATEPART(YEAR, '01-Jan-00 12:00:00 AM +00:00') 
  FROM dbo.DatesFormat

Output:

Remember, this only happens when you use a literal string.

To set a new value using dbatools:

Set-DbaSpConfigure -SqlInstance sql2016 -ConfigName 'TwoDigitYearCutoff' -Value 1999

Output:

What about the format of the returned value?

Yeah, I know – why is the value of the first column returned in that format? You are used to seeing it as yyyy-MM-dd, right?
I’ll explain this in a future post! Stay tuned.

Summary

Next time you have to work with dates in formats like dd-MMM-yy, remember that the “Two Digit Year Cutoff” configuration exists and may mislead you.

To complete the question…“Did you know that the ‘Two Digit Year Cutoff’ configuration can trick you?” Now you do.

Thanks for reading.

dbachecks – Going parallel – Cut off your execution times

At one of the last clients I worked with, I implemented dbachecks in order to get the state of the art and know how good or bad the environments are before starting to knock down the bad practices.

This client has seven different environments with more than 100 instances and more than 2000 databases.

Serial execution

A non-parallel execution (single session) took more than 2 hours.
This is not a big problem when we run it out of hours and we don’t want/need to be watching it or waiting for it to finish.
I set it up to run at 6 in the morning, so when I arrive at the office I can refresh the Power BI dashboard and get a recent state of the art.

Well, 2 hours doesn’t seem too much if we compare it with some other dbachecks users

What if we want to lower that 2-hour execution time? Or Hiram Fleitas’ (b | t) 4.6 hours?

Going parallel

First, let me remind you that this doesn’t come for free! I mean, if you run multiple checks in parallel (different sessions), you will consume more resources (CPU, memory, etc.) on the machine where you are running them. Test the process and find a reasonable limit for it, otherwise this can become slower than the serial execution.

This brought some new needs.

By default, dbachecks works with the values previously saved (for that we use Set-DbcConfig or Import-DbcConfig). This means that when we start a new session and the last session has changed any configuration, that configuration is the one that will be used in the new session.

Can you see the problem?

Imagine that I want to check for databases in the Full recovery model on the production environment, and I want to start (in parallel) a check for the development environment, where I want to check for the Simple recovery model. If this setting is not changed in the correct time frame, we can end up checking for the Full recovery model on the development environment, where we want Simple.

The first time I tried to run tests for some environments in parallel that needed different configs, I didn’t realise this detail, so I ended up with many more failed tests than expected! The bell rang when the majority of the failed tests came from one specific check…the one whose value I had changed.

-Temporary parameter to the rescue!

In my last dbachecks blog post – dbachecks – Setting temporary configuration values – I explained how this works, so if you haven’t read it yet, take a look before continuing.

Ok, now that you know you can use -Temporary to run the tests without interfering with the persisted values, you may already guess what we will do next…

My recipe to run in parallel

Disclaimer: First, let me say that this is just one option; you may come up with a different one. Please drop a comment so I, and others, can become aware of different approaches.

  1. If you don’t have a configuration file for the environment yet, start by configuring all the settings and use Export-DbcConfig to save them.
  2. Split your instances/hosts into one or more groups that can share the exact same configurations.
  3. Start a new PowerShell session and set (using Set-DbcConfig) or import (using Import-DbcConfig) your configurations (set up in step 1), but don’t forget to use the -Temporary parameter.
  4. Run Invoke-DbcCheck.
  5. Repeat steps 3 and 4 for each group of instances, as many times as you need – I encourage you to start with just 2 sessions and monitor your computer resources. Then, if you still have room, add one more.
  6. Grab a coffee, a beer or any other drink of your choice and wait until it finishes.

Again, keep an eye on your resources, and then you can test with one more session. Do it until you find the sweet spot for the number of parallel sessions.

Here is the code you can use.
For the 1st step:

#PROD environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Full -Temporary
Export-DbcConfig -Path "D:\dbachecks\Prod_Configs.json"

#DEV environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple -Temporary
Export-DbcConfig -Path "D:\dbachecks\Dev_Configs.json"

Steps 2, 3 and 4 together:

#PROD instances
$sqlInstances = "prod1", "prod2", "prod3"

#Import Prod_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Prod_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances

#DEV instances
$sqlInstances = "dev1", "dev2", "dev3"

#Import Dev_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Dev_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances

Save these scripts in two different .ps1 files. Then open two different PowerShell sessions and call one script in each session. Let it flow 🙂
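If you prefer to kick both off from a single console, something like this should also work (the script paths are just examples):

# Each Start-Process spawns its own independent PowerShell session
Start-Process powershell -ArgumentList '-File', 'D:\dbachecks\Run-ProdChecks.ps1'
Start-Process powershell -ArgumentList '-File', 'D:\dbachecks\Run-DevChecks.ps1'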

Results

In my case I was able to drop from 2 hours to about 1 hour with 3 parallel sessions. Adding a 4th session made the whole process slower.

Wrap

We saw that we may have problems if we try to run more than one dbachecks session with differently configured values. Using the -Temporary parameter when setting the values comes in handy for this scenario.
This way we can run two or more sessions in parallel, each one on a different environment, without them messing with each other and, hopefully, cut our execution times.

Hope this helps! I would love to hear whether you were able to bring down your execution times, and what they were before and after.

Thanks for reading!

dbachecks – Setting temporary configuration values

dbachecks saw the light of day about two months ago. As I’m writing this blog post, the module counts more than 2600 downloads just from the PowerShell Gallery.
The module has about 110 configurable checks that make our lives easier!

Today I will write about an option that I think users still don’t realize exists.

The default

dbachecks works with the values previously saved (for that we use Set-DbcConfig). This means that when we start a new session and the last session changed any configuration, that configuration is now, by default, the one that will be used in the new session.

What if we want to run a check with a different value just once?!

Today I want to share a different option!

Let’s assume that you have your dbachecks configs set up for the production environment. What do you need to do if you want to change just one check to test it in the test environment?
One option is to use the export/import method that Rob (b | t) wrote about in his dbachecks – Configuration Deep Dive blog post.

What if we could change this property just for the current session, without messing with possible new sessions?

When we start a new session and we import dbachecks (in fact, when PSFramework – a module required by dbachecks – is imported), we get the values from the registry. This means that we will read whatever is there at that moment.

Let me introduce you to the -Temporary parameter

This parameter is available on the Set-DbcConfig command. As said before, this command allows us to set a configuration value which is, by default, persisted. But if we use the -Temporary parameter, we are saying that the configured value is only available in the current session; it will not be persisted for future executions and, hence, will not mess with other new sessions.

You can run the following code to get the parameter description:

Get-Help Set-DbcConfig -Parameter temporary

Here is a demonstration:

This video shows that when we don’t use the -Temporary parameter and we start a new session, we read the last value set. When we run the command with the -Temporary parameter (setting the value to 5), after starting a new session the value read will still be 3.
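If you can’t watch the video, here is a sketch of the same demo (policy.database.maxvlf is used purely as an example configuration name):

Set-DbcConfig -Name policy.database.maxvlf -Value 3              # persisted
Set-DbcConfig -Name policy.database.maxvlf -Value 5 -Temporary   # this session only
Get-DbcConfigValue -Name policy.database.maxvlf                  # returns 5 in this session
# a brand new session will read 3 again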

This way we don’t need to export/import the configurations. Perhaps this will save you time when doing some ad-hoc tests, sparing you the doubt about whether you forgot to put back the older values after testing a different environment with different configurations.

I know what you are thinking…

“But I already have and use the export/import method! Changing this could be more work…”
We’ve got that covered! 💪

If you run

Get-Help Import-DbcConfig -Detailed

you can see that -Temporary is also available on this command.

Hope this brings some new ideas, like making your single, ad-hoc, one-time tests easier to configure!
I have an idea that I will share in my next post about dbachecks!

Wrap

The -Temporary parameter exists on both the Set-DbcConfig and Import-DbcConfig commands.
By using it, you are just changing the values in the current session and won’t overwrite the persisted values. This can come in handy in some cases.
Explore it!

Drop a message in the comments section, whether you already use it (and in which way) or you weren’t aware that it exists and will give it a spin!

Thanks for reading!

dbachecks – A different approach for an in-progress and incremental validation

dbachecks is a new PowerShell module from the SQL Server Community! For more information, read introducing dbachecks.

If you don’t know dbachecks, we have released a good amount of blog posts that will help you:
Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances by Rob Sewell
introducing dbachecks – a new module from the dbatools team! by Chrissy LeMaire
install dbachecks by Chrissy LeMaire
dbachecks commands by Chrissy LeMaire
dbachecks – Using Power BI dashboards to analyse results by Cláudio Silva
My wrapper for dbachecks by Tony Wilhelm
Checking backups with dbachecks by Jess Pomfret
dbachecks please! by Garry Bargsley
dbachecks – Configuration Deep Dive by Rob Sewell
Test Log Shipping with dbachecks
Checking your backup strategy with dbachecks by Joshua Corrick
Enterprise-level reporting with dbachecks by Jason Squires
Adding your own checks to dbachecks by Shane O’Neill
dbachecks – A different approach for an in-progress and incremental validation by Cláudio Silva

Other documentation:
dbachecks readme
dbachecks wiki (for developers)

I will share one of the ways I like to use dbachecks when I’m knocking down problems in order to increase the green percentage and lower the red one!

Output files

How do you save the results?
Do you save one file per instance (all tests included)?
Using -Append?
Alternatively, one per check/environment?

There is not a single way of doing this, nor a single “correct” way.
Here you can find yet another way, which groups your results per application.

I will share the way I like to use it when analyzing the results with the Power BI dashboards, and explain the advantages I get from it.

Choosing a road

My personal choice is to have one file per check and environment. This means that if I run the SuspectPage check for all instances/databases belonging to the development environment, I will end up with a file named dbachecks_1_SuspectPage_DEV.json.
Along the same lines, I will get a file named dbachecks_1_SuspectPage_PRD.json if I run it for production.

$sqlInstances = "dev1", "dev2"

# Get every unique check tag and run the checks one at a time,
# writing one JSON file per check for the DEV environment
$checks = (Get-DbcCheck).UniqueTag
$checks.ForEach{
    Invoke-DbcCheck -SqlInstance $sqlInstances -Check $_ -PassThru -Show Fails | Update-DbcPowerBiDataSource -Environment "DEV" -Path "C:\windows\temp\dbachecks"
}

This will output:

Total number of files

“This will create a lot of files…”

Let’s do some math

Let’s imagine for a moment that we have to manage 3 different environments (DEV, QA, PRD).
Currently, we have 80 checks. If your approach is 1 file per environment, you will end up with 3 files. The way I like to do it, I will end up with 240 files (80 checks × 3 environments).

WOW! Big difference right?

Fear nothing

Yes, it is a big difference, but that is no problem at all! The Power BI file will deal with this increase flawlessly, as I have mentioned before in the dbachecks – Using Power BI dashboards to analyse results blog post.

Advantages

The biggest advantage, for me, is that I can re-run a single check for a single environment and, with it, touch just one of the files. It’s an update of that file.
By doing it for the same destination folder, I will overwrite the existing file, and then I literally just need to go and hit the “Refresh” button on the Power BI dashboards.
This way it takes just the time of that check, not all of them. Quickly and easily, I’m able to confirm that the fix I ran actually worked and my red values are lower! 😀

Real scenario

  1. You run, overnight, all your tests.
  2. In the morning you open the Power BI dashboard and hit “Refresh”
  3. You look to your red values.
  4. You pick one (for this example’s purpose, let’s say “Auto-Close”)
  5. You run a query to fix all databases with the wrong value
  6. Re-run just this check for just that environment (run it multiple times for the various environments)
  7. Go to your Power BI and hit “Refresh” again.
  8. Repeat from point 3.

Point 6 is where you will save huge amounts of time, because if you had just one file with all the tests for one environment, you would need to rerun ALL the tests in order to refresh your environment.
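Following the naming from before, re-running just the example “Auto-Close” check for the DEV environment would look something like this (the check name is illustrative – use Get-DbcCheck to confirm the exact tag):

# Re-run a single check and overwrite only its JSON data source file
Invoke-DbcCheck -SqlInstance $sqlInstances -Check AutoClose -PassThru -Show Fails | Update-DbcPowerBiDataSource -Environment "DEV" -Path "C:\windows\temp\dbachecks"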

Hope this helps!

Thanks for reading!