dbachecks – Going parallel – Cut off your execution times

At one of my recent clients, I implemented dbachecks in order to get the state of the art and see how good or bad the environments were before starting to knock down the bad practices.

This client has seven different environments with more than 100 instances and more than 2000 databases.

Serial execution

A non-parallel execution (single session) took more than 2 hours.
This is not a big problem when we run it out of hours and we don’t want/need to be looking or waiting for it to finish.
I set it up to run at 6 in the morning, so when I arrive at the office I can refresh the Power BI dashboard and get a recent state of the art.

Well, 2 hours does not seem like much if we compare it with some other dbachecks users.

What if we want to bring that 2-hour execution time down? Or Hiram Fleitas’ (b | t) 4.6 hours?

Going parallel

First, let me remind you that this doesn’t come for free! I mean, if you run multiple checks in parallel (different sessions), you will consume more resources (CPU, memory, etc.) on the machine where you are running them. Test the process and find a reasonable limit for it; otherwise, this can become slower than the serial execution.

This brought some new needs.

By default, dbachecks works with the values previously saved (for that we use Set-DbcConfig or Import-DbcConfig). This means that when we start a new session and the last session has changed any configuration, that changed configuration is the one that will be used in the new session.

Can you see the problem?

Imagine that I want to check for databases in Full recovery model on the production environment, and I want to start (in parallel) a new check on the development environment, where I want to check for Simple recovery model. If this setting is not changed in the correct time frame, we can end up checking for Full recovery model on the development environment where we want Simple.

The first time I tried to run tests in parallel for some environments that needed different configs, I didn’t realise this detail, so I ended up with many more failed tests than expected! The bell rang when the majority of the failed tests came from one specific check… the one whose value I had changed.

The -Temporary parameter to the rescue!

In my last dbachecks blog post – dbachecks – Setting temporary configuration values – I explained how this works, so if you haven’t read it yet, take a look before continuing.

Ok, now that you know you can use -Temporary to run the tests without interfering with the persisted values, you may already have guessed what we will do next…

My recipe to run in parallel

Disclaimer: this is just one option – you may come up with a different one. Please drop a comment so that I, and others, can become aware of different approaches.

  1. If you don’t have a configuration file for the environment yet, start by configuring all the settings and use Export-DbcConfig to save them.
  2. Split your instances/hosts into one or more groups that can share the exact same configuration.
  3. Start a new PowerShell session and set (using Set-DbcConfig) or import (using Import-DbcConfig) the configurations saved in step 1, but don’t forget to use the -Temporary parameter.
  4. Run Invoke-DbcCheck.
  5. Repeat steps 3 and 4 in a new session for each group – I encourage you to start with just 2 sessions and monitor your computer’s resources. Then, if you still have room, add one more.
  6. Grab a coffee, a beer or any other drink of your choice and wait until it finishes.

Again, keep an eye on your resources and then test with one more session. Do this until you find the sweet spot for the number of parallel sessions.

Here is the code you can use.
For step 1:

#PROD environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Full -Temporary
Export-DbcConfig -Path "D:\dbachecks\Prod_Configs.json"

 

#DEV environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple -Temporary
Export-DbcConfig -Path "D:\dbachecks\Dev_Configs.json"

Steps 2, 3 and 4 together:

#PROD instances
$sqlInstances = "prod1", "prod2", "prod3"

#Import Prod_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Prod_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances

 

#DEV instances
$sqlInstances = "dev1", "dev2", "dev3"

#Import Dev_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Dev_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances

Save these scripts in two different .ps1 files. Then, open two different PowerShell sessions and call each script from its own session. Let it flow 🙂
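If you prefer to kick everything off from a single console, PowerShell background jobs are an alternative to opening two windows by hand. A minimal sketch, assuming the two scripts above were saved to the (hypothetical) paths below:

```powershell
# Hypothetical paths - adjust to wherever you saved the two scripts above
$scripts = "D:\dbachecks\Run-ProdChecks.ps1", "D:\dbachecks\Run-DevChecks.ps1"

# Start-Job runs each script in its own PowerShell session,
# so the -Temporary configurations never collide
$jobs = foreach ($script in $scripts) {
    Start-Job -FilePath $script
}

# Wait for both sessions to finish and show their output
$jobs | Wait-Job | Receive-Job
```

Note that each job is a separate process, so monitor CPU and memory just as you would with manually opened sessions.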

Results

In my case, I was able to drop from 2 hours to about 1 hour with 3 parallel sessions. Adding a 4th session made the whole process slower.

Wrap

We saw that we may run into problems if we try to run more than one dbachecks session with different configured values. The -Temporary parameter comes in handy for this scenario.
This way we can run two or more sessions in parallel, each one on a different environment, without them interfering with each other and, hopefully, cut our execution times.

Hope this helps! I would love to hear whether you were able to drop your execution times, and what they were before and after.

Thanks for reading!

dbachecks – Setting temporary configuration values

dbachecks saw the light about two months ago. As I’m writing this blog post, the module counts more than 2600 downloads from the PowerShell Gallery alone.
The module has about 110 configurable checks that make our lives easier!

Today I will write about an option that I think users still do not realize exists.

The default

dbachecks works with the values previously saved (for that we use Set-DbcConfig). This means that when we start a new session and the last session has changed any configuration, that configuration is now, by default, the one that will be used in the new session.

What if we want to run a check with a different value just once?!

Today I want to share a different option!

Let’s assume that you have your dbachecks configs set up for the Production environment. What do you need to do if you want to change just one check to test it in the Test environment?
One option is to use the export/import method that Rob (b | t) wrote about in his dbachecks – Configuration Deep Dive blog post.

What if we could change this property just for the current session, without messing with possible new sessions?

When we start a new session and import dbachecks (as a matter of fact, when PSFramework – a module required for dbachecks to work – is imported), we get the values from the registry. This means we will read whatever is there at that moment.

Let me introduce you to the -Temporary parameter

This parameter is available on the Set-DbcConfig command. As said before, this command allows us to set a configuration value which is, by default, persisted. But if we use the -Temporary parameter, we are saying that the configured value is only available for the current session. The value will not be persisted for future executions and, hence, will not mess with new sessions.

You can run the following code to get the parameter description:

Get-Help Set-DbcConfig -Parameter temporary

Here is a demonstration:

The video shows that when we don’t use the -Temporary parameter and start a new session, we read the last value that was set. When we run the command with the -Temporary parameter (setting the value to 5), after starting a new session the value read will still be 3.
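If you want to reproduce the demo yourself, the flow looks like this (I’m using the recovery model configuration purely as an illustration – any configuration name works the same way):

```powershell
# Persist a value - this is written to the registry and survives new sessions
Set-DbcConfig -Name policy.recoverymodel.type -Value Full

# Override it just for the current session
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple -Temporary

# This session now reads the temporary value (Simple)...
Get-DbcConfigValue -Name policy.recoverymodel.type

# ...but open a brand new PowerShell session and the same
# Get-DbcConfigValue call returns the persisted value (Full)
```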

This way we don’t need to export/import the configurations. Perhaps this will save you time when doing ad-hoc tests, and you will no longer be left in doubt about whether you forgot to restore the old values after testing a different environment with different configurations.

I know what you are thinking…

“But I already have and use the export/import method! Changing this can be more work…”.
We got that covered! 💪

If you run

Get-Help Import-DbcConfig -Detailed

you can see the -Temporary is also available in this command.

Hope this brings you some new ideas, like making your single, ad-hoc, one-time tests easier to configure!
I have an idea that I will share in my next post about dbachecks!

Wrap

The -Temporary parameter exists on both the Set-DbcConfig and Import-DbcConfig commands.
By using it, you change the values only for the current session and won’t overwrite the persisted values. This can come in handy in many cases.
Explore it!

Drop a message in the comments section, either if you already use it (and in which way) or if you were not aware it existed and will give it a spin!

Thanks for reading!

Is this command broken? I can’t see some properties! – DefaultDisplayPropertySet, Get-Member and Select-Object *

Every now and again I see some people complaining about not getting the properties they want when using a PowerShell command.

For instance, someone was using the Get-Service command to query what was the “Startup Type” of WinRM service. For that the person used the following command:

Get-Service WinRM

which produces the following output:

As you can see, the “Startup Type” property that we can find on the user interface does not appear here!

“Wait wait wait…what? Is the command broken?”

Fear nothing!

In this case, the property does not belong to the default display property set, but it is still there!

So how can we get the list of available properties?

First, let me say that this person knew that Select-Object can be used to select the properties we want, so he tried to guess the property name using a trial-and-error approach.

The person tried:

Get-Service WinRM | Select-Object Startup, Status, Name, DisplayName

and also:

Get-Service WinRM | Select-Object StartupType, Status, Name, DisplayName

But all of them were just empty.

Let me invoke a cliché that is still true:
when I, and probably most people, started learning PowerShell, we learned that Get-Member (and also Get-Help) are our two best friends.

Get-Help needs no introduction: it retrieves the help for a command! You should always start by reading a command’s help – you will find the overall description, the parameters explained and even examples of how to execute the command.

On the other hand, Get-Member can be less obvious for people who are not familiar with OOP (object-oriented programming). Looking at the documentation, we can see that this command

Gets the properties and methods of objects.

This means it can give you a variety of information on the objects you are working with, including, for our use case, the available properties.

Let’s see if we can find the property we want. We can do this by piping the command we are working with to Get-Member.

Get-Service | Get-Member

We can see all the member types, but since we know we are looking for properties, we can filter the list down using:

Get-Service | Get-Member -MemberType Property

If it returns a big list, we can also filter by the name we think the property has, like “Start”:

Get-Service | Get-Member -MemberType Property -Name Start*

And in this case, we narrow it down to just one result – StartType. Let’s include it in our original command.

Get-Service WinRM | Select-Object StartType, Status, Name, DisplayName

Boom! We now have the property we are looking for!

Select-Object *

I mentioned Select-Object * in the title of this post; that is because we can use it to get ALL the existing properties our object owns, along with their values.

Get-Service WinRM | Select-Object *


As you can see we can find the StartType there.

Why hide some properties by default?

This way the output becomes cleaner and faster.
Faster: an object can have 20 properties, but if only 5 of them are the most useful, displaying just those five by default is faster than formatting all 20.
Cleaner: we don’t fill the screen with information that 90% of the time is not useful to us.

Can we know beforehand what the default properties of a command are?

Yes, we can! And it is actually very easy.

Using our initial example:

(Get-Service WinRM).PSStandardMembers.DefaultDisplayPropertySet

There they are.

Getting just the list of those default property names:

(Get-Service WinRM).PSStandardMembers.DefaultDisplayPropertySet.ReferencedPropertyNames

Bonus

If you use some properties a lot and they are not part of the defaults, or you just would like to change the default properties that are retrieved, you can use the Update-TypeData or Update-FormatData cmdlets to make it work that way.

Quick note: for commands that have format XML, you will need to use Update-FormatData.
Thanks to Friedrich Weinmann (b | t), dbatools architect, who helped me realize this!
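To see the mechanism in action without touching built-in types (remember the format XML caveat above), you can define a default display set for a custom type name of your own – everything below (the type name and the object) is just an illustrative example:

```powershell
# Register a default display set for a made-up custom type name
Update-TypeData -TypeName 'My.Demo.Server' `
    -DefaultDisplayPropertySet Name, Status -Force

# Tag an object with that type name via the PSTypeName key
$server = [PSCustomObject]@{
    PSTypeName = 'My.Demo.Server'
    Name       = 'sql01'
    Status     = 'Online'
    Memory     = 128
    Notes      = 'hidden by default'
}

$server                    # shows only Name and Status
$server | Select-Object *  # still shows all four properties
```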

Wrap

This post was intended to show (or remind you) how you can find out which default properties will be shown when you run a command. I also showed two ways to get the full list of properties: Get-Member (just the property names) and Select-Object *, which also retrieves the values.

Thanks for reading!

Did you know…you can’t do arithmetic operations with datetime2 like with datetime?

I’m currently working on a SQL code migration from Firebird to SQL Server and I hit an error that I haven’t seen for some time.

The error message is the following:

Msg 206, Level 16, State 2, Line 4
Operand type clash: datetime2 is incompatible with int

This rang a bell right away! Somewhere in the code, someone was trying to do an arithmetic calculation without using the proper function.

How so?

In the early days of my T-SQL coding, I used to do this a lot, and I still see code from other applications written this way. Take, for instance, the following code that returns all orders placed more than 1 day ago:

SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < GETDATE() -1

For this example let’s say that the OrderDate column is a DATETIME2. This works just fine because the GETDATE() function returns a DATETIME value and thus we can subtract one day from it.

If we define a variable of the DATETIME2 datatype, assign it a GETDATE() value, and then attempt to subtract 1 from the variable, an error is raised!

DECLARE @vOrderDate DATETIME2 = GETDATE()
SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < @vOrderDate - 1

Msg 206, Level 16, State 2, Line 20
Operand type clash: datetime2 is incompatible with int

But it was working!?

Yes, it was working on the source engine (Firebird), and it will still work on the destination (SQL Server) if the datatype stays the same – DATETIME.

What happened here was the column datatype was changed during the schema migration from DATETIME to DATETIME2.

NOTE: The more recent date/time datatypes appeared with SQL Server 2008. They are DATE, TIME, DATETIME2 and DATETIMEOFFSET.
Also, bear in mind that DATETIME and SMALLDATETIME are actually the only datatypes from the date/time family that support these arithmetic operations.

How to fix this error?

To solve this, we need to convert the

@vOrderDate  -1

to

DATEADD(dd, -1, @vOrderDate) 

The whole code looks like this:

DECLARE @vOrderDate DATETIME2 = GETDATE()
SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < DATEADD(dd, -1, @vOrderDate) 

This way, it will work with the DATETIME, DATETIME2, DATE, SMALLDATETIME and DATETIMEOFFSET datatypes.

NOTE: DATEADD also supports the TIME datatype. I didn’t mention it because in our example we are subtracting DAYS which, as (at least I) expected, raises an error with TIME.

Wrap up

Are you thinking about changing your DATETIME columns to DATETIME2? Or are you just beginning to use it in your projects?
The documentation encourages you to do so (https://docs.microsoft.com/en-us/sql/t-sql/data-types/datetime-transact-sql), but as you can see from this post, you need to pay attention and do the proper testing and T-SQL code revision.

Thanks for reading.

dbachecks – A different approach for an in-progress and incremental validation

dbachecks is a new PowerShell module from the SQL Server Community! For more information, read introducing dbachecks.

If you don’t know dbachecks, we have released a good amount of blog posts that will help you:
Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances by Rob Sewell
introducing dbachecks – a new module from the dbatools team! by Chrissy LeMaire
install dbachecks by Chrissy LeMaire
dbachecks commands by Chrissy LeMaire
dbachecks – Using Power BI dashboards to analyse results by Cláudio Silva
My wrapper for dbachecks by Tony Wilhelm
Checking backups with dbachecks by Jess Pomfret
dbachecks please! by Garry Bargsley
dbachecks – Configuration Deep Dive by Rob Sewell
Test Log Shipping with dbachecks
Checking your backup strategy with dbachecks by Joshua Corrick
Enterprise-level reporting with dbachecks by Jason Squires
Adding your own checks to dbachecks by Shane O’Neill
dbachecks – A different approach for an in-progress and incremental validation by Cláudio Silva

Other documentation:
dbachecks readme
dbachecks wiki (for developers)

I will share one of the ways I like to use dbachecks when I’m knocking down the problems in order to increase the green percentage and lower the red one!

Output files

How do you save the results?
Do you save one file per instance (all tests included)?
Using -Append?
Alternatively, one per check/environment?

There is no single way of doing this, nor a “correct” one.
Here you can find yet another way, grouping your results per application.

I will share the way I like to use it when analysing the results with the Power BI dashboards, and explain the advantages I get from it.

Choosing a road

My personal choice is to have one file per check and environment. This means that if I run the SuspectPage check for all instances/databases belonging to the development environment, I will end up with a file named dbachecks_1_SuspectPage_DEV.json.
Along the same lines, I will get a file named dbachecks_1_SuspectPage_PRD.json if I run it for production.

$sqlInstances = "dev1", "dev2"

# One Invoke-DbcCheck call per unique check tag
# results in one JSON file per check and environment
$checks = (Get-DbcCheck).UniqueTag
$checks.ForEach{
    Invoke-DbcCheck -SqlInstance $sqlInstances -Checks $_ -PassThru -Show Fails |
        Update-DbcPowerBiDataSource -Environment "DEV" -Path "C:\windows\temp\dbachecks"
}

This will output:

Total number of files

“This will create a lot of files…”

Let’s do some math

Let’s imagine for a moment that we have to manage 3 different environments (DEV, QA, PRD):
Currently, we have 80 checks. If your approach is 1 file per environment, you will end up with 3 files. The way I like to do it, I end up with 240 files.

WOW! Big difference right?

Fear nothing

Yes, it is a big difference, but that is no problem at all! The Power BI file deals with this increase flawlessly, as I mentioned before in the dbachecks – Using Power BI dashboards to analyse results blog post.

Advantages

The biggest advantage, for me, is that I can re-run a single test for a single environment and, with it, touch just one of the files – it’s an update of that file.
By using the same destination folder, I overwrite the existing file, and then I literally just need to hit the “Refresh” button on the Power BI dashboards.
This way it takes just the time of that one test and not all of them. Quickly and easily, I’m able to confirm that the fix I ran actually worked and my red values are lower! 😀

Real scenario

  1. You run, overnight, all your tests.
  2. In the morning you open the Power BI dashboard and hit “Refresh”
  3. You look to your red values.
  4. You pick one (for this example’s purpose, let’s say “Auto-Close”)
  5. You run a query to fix all databases with the wrong value
  6. Re-run just this check for one environment (run it multiple times for various environments)
  7. Go to your Power BI and hit “Refresh” again.
  8. Repeat from point 3.

Point 6 is where you save huge amounts of time, because if you had just one file with all the tests for one environment, you would need to rerun ALL the tests in order to refresh that environment’s data.
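As a sketch of point 6, re-running a single check for a single environment is just the body of the earlier loop with a fixed tag (AutoClose here is only an example tag – use Get-DbcCheck to find the one you need):

```powershell
# Re-run one check for one environment; only its JSON file gets overwritten
$sqlInstances = "dev1", "dev2"

Invoke-DbcCheck -SqlInstance $sqlInstances -Checks AutoClose -PassThru -Show Fails |
    Update-DbcPowerBiDataSource -Environment "DEV" -Path "C:\windows\temp\dbachecks"
```

Hit “Refresh” in Power BI afterwards and only that slice of the data changes.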

Hope this helps!

Thanks for reading!

dbachecks – Using Power BI dashboards to analyse results

For the last couple of months, members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL instances. Today it is released for you all to start using 🙂

dbachecks launch – the blog posts series

Here you can find a list of the blog posts made available today:
Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances by Rob Sewell
introducing dbachecks – a new module from the dbatools team! by Chrissy LeMaire
install dbachecks by Chrissy LeMaire
dbachecks commands by Chrissy LeMaire
dbachecks – Using Power BI dashboards to analyse results by Cláudio Silva
My wrapper for dbachecks by Tony Wilhelm
Checking backups with dbachecks by Jess Pomfret
dbachecks please! by Garry Bargsley
dbachecks – Configuration Deep Dive by Rob Sewell
Test Log Shipping with dbachecks
Checking your backup strategy with dbachecks by Joshua Corrick
Enterprise-level reporting with dbachecks by Jason Squires
Adding your own checks to dbachecks by Shane O’Neill
dbachecks – A different approach for an in-progress and incremental validation by Cláudio Silva

Other documentation:
dbachecks readme
dbachecks wiki (for developers)

Let’s begin

In this blog post, I will write about the Power BI dashboards that we have created to analyse the output of the tests.

Disclaimer

Here at the dbachecks team, we don’t have BI gurus. This means that what you are about to see comes from our non-BI minds, plus some research on the web, to achieve the output we wanted.
The main objective is to have something functional that helps us make decisions and know where we should look next.
That being said, if you master Power BI, the M query language, DAX or anything else that we have used (or that could be used) in this project, and you find something archaic or a too-much-year-2000 way of doing things, let us know! We would love to follow best practices and improve the dashboards.

Why Power BI? The background…

I started to work with Power BI after seeing a couple of sessions where it was, and was not, the main attraction. One of those sessions was by Rob Sewell (b | t), where he showed Pester and used Power BI Desktop to display the output. This makes him the principal “culprit” for this choice 🙂. If we add to that the ease of importing JSON (and other) files, the usefulness and the eye-candy dashboards, all together they made the decision easier.
With this in mind, I picked Rob’s initial Power BI file and started digging into it and making some changes.

Note: If your tool of choice is not Power BI or you just feel more comfortable using another tool to work with JSON files, please share your dashboards with us. Write about it and share with the community.

Tests output format

dbachecks output consists of one or more JSON files. When you write your dbachecks scripts, you can choose between one or more files as the result of your validation. For instance, you can generate one output file per context, or just one that includes all contexts. You can even use -Append to append results to an existing file, as Rob described in the dbachecks – Configuration Deep Dive blog post.

Before opening the file that contains the dashboards…

For a more pleasant and smooth experience, we recommend that you update your Power BI desktop to the latest version.

Nowadays, the Power BI team releases a new version every month, and each one brings new features and improvements. Because we may be using the most recent version of Power BI Desktop to take advantage of some of those, we encourage you to use the most recent version as well.
You can download it from the official Power BI website.

Note: If you try to open a Power BI file saved on a more recent version than the one you are using you may see an error message like:

This was the result of an attempt to open a file on the December 2017 version that was saved on a more recent (the February 2018) version.

Which file should I open? pbix or pbit?

dbachecks contains two different files:

  • pbix – is a document created by Power BI Desktop. It contains queries, data models, visualizations, settings, and reports added by the user.
  • pbit – is similar to pbix but it is a template. You can configure variables that will be asked when you open the file. Those can act, for instance, as filters.

By default, dbachecks will output the results to the “c:\windows\temp\dbachecks” folder. This folder is also the default one configured in the pbix file. This path is the only variable that we use on the pbit file, which means if you don’t need to change your path because you relied on the default one, you can just open the pbix file and click refresh.

On the other hand, the intent of the pbit file is to make things easy for people who are not so comfortable with Power BI Desktop; they just need to type their output folder (where all the JSON files are) and they are ready to go.
How is this different from the other file? In the pbix file, you need to go to the Edit Queries menu in order to change your path. That is the more “advanced” way to do it, so if you just want to start seeing your test results in a beautiful way, open the template file (“pbit” extension), fill in the requested path, hit Load and wait for it to load all of your results.

Note: When using pbit, if no objects appear, please make sure you have entered the correct path. Wrong paths will lead to empty dashboards.

You have opened the pbix file but you are not seeing (your) data?

The pbix file will keep the last data you saw on the dashboard. If you have rerun all or some tests, don’t forget to click the “Refresh” button!

Only after refreshing the data source will you get all the data (re)loaded and thus the most recent version of it.

The dashboards

On this liftoff of dbachecks, our Power BI file includes two dashboards.

If you have never used Power BI before, you may not know that you can maximize a single visual (of almost any type) and come back to the report whenever you want. This is handy when you have a lot of information and want to take a closer look.

See this example (click to open on new window):

By Environment

This dashboard gives you a glance at how good/bad your environments are. This is shown, at first, by the 2 circles with percentages and green/red waves.

On the right side you can find a matrix where you can drill down from Environment -> Test -> Context. This drill-down can be hierarchical or absolute. To navigate the levels, select the matrix and use the 3 buttons in the top-left corner of the visual (as shown in the last animation).

On the bottom, you can find a grid that shows all the failed (and only failed) tests with a descriptive message.

To help you read this dashboard, on the left side you have quick filters by Environment and Instance.
There are two other ways to filter the information – you can select a row in the matrix or in the grid to see only the information related to that row.
Example:

By Time

The main objective of this dashboard is to help you understand which tests take the most time to run.
Bear in mind that the times you see are just the time that one test, or the sum of tests, took. The time spent switching between tests is not accounted for.

I use this dashboard to understand which tests should/can run in parallel and, that way, make the full execution faster.
You can also decide that some tests don’t need to run at the same frequency as others.

In the next example, I can see that my production environment is the one taking the most time to complete. I then filter by it and can see that “Testing duplicate indexes” and “Testing Column Identity Usage” are the checks that take the most time. If I want, I can exclude them from the bar chart to take a closer look at the other tests’ times. Finally, I can go to the filters and remove them, resetting the bar chart right back to where we started.

Take a look:

Rules

Yes, there are some rules 🙂

To accomplish these dashboards, some rules must be followed. For example, the “Context” message needs to follow a specific nomenclature. You can read more about it on our wiki on Github.
If these rules are not followed when writing the tests, you may see some weird results on the dashboards.

In this example, you can see six instances, but the last two, “Procedures” and “table”, appear because the tests weren’t written in the right way.

Also, we have a unit test to help you check for these “Context” rules!

Dynamic

The data source in the Power BI file was built to be dynamic. Because we can output just 1 test result per file (a record) or multiple results in the same file (a list), we built it so they can live together!

Load times

You may be wondering how long it takes to load the data and apply all the transformations we make to the files. We had that in mind, and we tried to achieve good performance for that task.
I can load 270 files, totalling 397MB of data, in less than 30 seconds (the time will vary depending on the machine specifications).

If you are curious, each file contains a single test for every instance in a specific environment. At that time I managed 7 environments with more than 100 instances.

Next steps?

Now you can run your tests, analyse the output, make the changes needed, rerun the tests and start seeing your green percentage going up and the number of errors going down!

Our next objectives

We will bring some new dashboards and improve the existing ones whenever possible.
If you have a case that you would like to see covered by a dashboard, share it with us. Do you already have it sorted out? Share it with us and we can replicate it in our file.

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute.

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Fork the repository and add code to improve the module, and please give feedback to make this module even more useful.

You can also come to the SQL Server Community Slack, join the dbachecks channel and get advice, make comments or just join in the conversation.

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free.

Chrissy LeMaire @cl
Rob Sewell @sqldbawithbeard
Fred Weinmann @FredWeinmann
Stuart Moore @napalmgram
Shawn Melton @wsmelton
Garry Bargsley @gbargsley
Stephen Bennett @staggerlee011
Sander Stad @SQLStad
Jess Pomfret @jpomfret
Jason Squires @js0505
Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

I will be speaking at SQL Bits 2018

Two weeks from now, on the 23rd of February, I will be speaking at SQLBits 2018!
It’s my first time at the largest SQL Server conference in Europe for data professionals.

I will deliver a session about SQL Server Reporting Services and PowerShell titled – “Administrating SSRS without boring web based clicks”.

You can check the great content that will be shared on the 4 days:
Training days: Wednesday and Thursday
Regular sessions: Friday and Saturday

Are you gonna be there? More than 2000 people are already registered!

If you want to know more about the conference, you can follow @sqlbits on Twitter or go to the SQLBits website.

See you in London!