Did you know… you can change the default file encoding for new queries in SSMS?

If you have been reading my latest blog posts, you know that I’m currently working on a SQL code migration from Firebird to SQL Server.

The client provided the scripts for all modules (stored procedures, functions, etc.) and, roughly speaking, these are the steps I’m following to convert them:

  1. Open new query window
  2. Copy and paste the object code
  3. Save the file
    This is how the file looks:

  4. Run a PowerShell script that does a find and replace based on a hashtable, applies all the changes, and saves the file again (see the sketch after this list)

  5. The file is refreshed in SSMS
    This is how the file looks after the find and replace:

  6. Unicode characters are broken 😦
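Step 4 is where the magic happens. A minimal sketch of that find-and-replace script could look like the following – the hashtable entries and the file path here are hypothetical, the real script has many more Firebird-to-T-SQL replacements:

#Hypothetical Firebird -> T-SQL replacements (example entries only)
$replacements = @{
    '||'    = '+'   #string concatenation operator
    'FIRST' = 'TOP' #row limiting keyword
}

$file = 'C:\Migration\MyObject.sql'
$content = Get-Content -Path $file -Raw

#Apply every replacement in the hashtable
foreach ($pair in $replacements.GetEnumerator()) {
    $content = $content.Replace($pair.Key, $pair.Value)
}

#Save the file as UTF-8 - this is the step that clashes with the ANSI default
Set-Content -Path $file -Value $content -Encoding UTF8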

So…what is happening?

The file that is used to create a new query window has ANSI encoding, but when the PowerShell script saves the file it saves it as UTF-8, because the client has comments in the code with unicode characters.

In this process, the unicode characters get replaced by odd symbols.

How to solve it?

We can change the default file encoding so that it is the one we want in the first place. What I did was change it from ANSI to UTF-8.

This way, when I use the keyboard shortcut CTRL + N to open a new window and hit Save, I’m saving as UTF-8, which means the PowerShell script can do the find and replace, save the file, and preserve the unicode characters. 🙂

Where is that default file?

My path to the file is C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\SqlWorkbenchProjectItems\Sql, where 140 stands for SSMS v17 (in my case, right now I’m using v17.5).
Inside this folder we can find the file SQLFile.sql.

We just need to open it, for example with Notepad, do a Save As, and choose the encoding we want.
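If you prefer to script that instead of using Notepad, a small PowerShell sketch like this should do it (run it from an elevated session, since the file lives under Program Files; the path is the one from my machine):

#Path to the new-query template (adjust the version folder to your SSMS)
$template = 'C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\SqlWorkbenchProjectItems\Sql\SQLFile.sql'

#Re-save the template as UTF-8 (with BOM) so new query windows inherit the encoding
$text = [System.IO.File]::ReadAllText($template)
[System.IO.File]::WriteAllText($template, $text, [System.Text.Encoding]::UTF8)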

Now if you go to SSMS, open a new query window and hit Save, you can confirm that the file is saved by default with the new encoding you have set up.

With this I have eliminated a tedious step that forced me to do a Save As and choose UTF-8 for every single file I wanted to save. I have hundreds of objects, so this small improvement times hundreds of iterations saves me a good amount of time!

Thanks for reading.

SQL Saturday #731 Athens 2018


In less than two weeks (on 16th June) it’s time for SQL Saturday Athens and I’m proud to announce that I will be speaking!

This will be my first SQL Saturday outside Portugal, both as a speaker and as an attendee, so I’m especially thrilled and excited to be part of it!

I will be presenting my Administrate SSRS/PBIRS with PowerShell session, and I can share first hand that it will include new demos with some fixes and enhancements that will be available in the next version of the module later this week.

The event will be held at Microsoft Hellas, 221 Kifisias Ave, Marousi, Attica, 151 24, and you can see the full schedule here.

Hope to see you in Athens! If you see me, come and say hi!

Learn, Evolve and Giving Back – TSQL Tuesday #102

https://scribnasium.com/2018/05/giving-back-t-sql-tuesday-102-invite/

This month’s T-SQL Tuesday is brought to us by Riley Major (b | t), and he encourages us to talk about how we are giving back to our community.

This is the 102nd edition of TSQL2sDay – Adam Machanic’s (b | t) brainchild.

Let’s go back for a moment:

I heard about SQL Server for the first time back in 2003, when I was in high school and had a database class. It was a slow start and, at that moment, I hadn’t imagined I would follow this path professionally.

Learn

At that time I didn’t even know a community existed, but in 2006 I went to a professional course (level IV – a degree before a bachelor’s) based on Microsoft technologies and .NET (Web and Windows Forms), and I had a database class using SQL Server 2000!
That was when I started using the internet more and more for study, and found some blog posts and SQLServerCentral. So I can say that I started consuming knowledge from the community back in 2006.

A year later I started my internship in what was my first IT job. At the time I worked with a bit of everything, but was more focused on SQL Server development. The guy who was in my place before I arrived had taken a manager position, but he was a big lover of SQL Server and I learnt a LOT from him! At the time he had a subscription to the paper edition of SQL Server Magazine! (Later it was renamed SQL Server Pro.)

SQL Server Magazine, October 2007


You can find the magazine archives in PDF format here.

That was when I started to learn who people like Itzik Ben-Gan and Kalen Delaney were!

Evolve

For about five years I was a compulsive consumer of blog posts related to SQL Server development, though mostly when I had a problem and wanted some guidance on the solution. During that time I didn’t share much knowledge on the internet.

Then, in 2012, I went to my first SQL Port user group meetings and my first ever SQL Saturday, #115 Portugal. That was my first real contact with the SQL Server community, where I met some speakers and started joining the monthly meetings regularly.

Giving back

In 2015 I joined Twitter and decided that I wanted to share my knowledge even more. After being approached several times, I took courage and did my first public presentation for the SQL Port user group in Lisbon, and right after I submitted a session for SQL Saturday Oporto 2015 and was accepted.
You know, start small and grow… so I started speaking in Portuguese for the user group and then went to SQL Saturday, where I gave my first public talk in English.

Then came 2016 and the first edition of TugaIT (the CFS is open for this year’s Summer Edition). At that time I was “just” a volunteer, but I had a gut feeling. A few months earlier I had discovered that dbatools was a thing, and I decided to go and talk with its creator, Chrissy LeMaire. I had written a couple of PowerShell scripts to help me and wanted to ask her whether she intended to extend the module to cover best practices. We talked for about an hour, exchanged contacts, and kept talking furiously about dbatools. Then… I was submitting a PR with a full command (Expand-DbaTLogResponsibly), and that was when I felt I was making my first “more international” contribution to the community.

Since then I have been more active in the community, mainly with SQL Server and PowerShell stuff.

Some contribution points:

What will you do?

As you can see, I wasn’t born inside the community; I have grown because of it and decided to start giving back less than 3 years ago.
Does it hurt? Not at all!
Does it help? A lot!
Does it help me be a better professional? For sure! I have learned so much from many different sources, and every day is a new learning day!

Thanks for reading.

Using CAST() function with User-Defined Data Types…Did you know…

I’m converting some Firebird database code to T-SQL and I’m seeing a lot of code that doesn’t work the same way in SQL Server.
No surprise – I already expected that, but for those who still say “all engines/databases are equal”… “it’s SQL, right?”… here is one more proof that it isn’t true.

In Firebird it is possible to use the CAST function with a DOMAIN (the SQL Server equivalent is a UDDT – User-Defined Data Type) as the target data type. Well, it turns out that in SQL Server… that is not possible.

This means that, even if I have created a UDDT like:

CREATE TYPE dbo.VARCHAR_50 FROM VARCHAR(50);

The following code will not work:

SELECT CAST('value' AS VARCHAR_50)

and it yields the following error message:

Msg 243, Level 16, State 2, Line 1
Type VARCHAR_50 is not a defined system type.

This means that we need to change it to the underlying system type, like:

SELECT CAST('value' as VARCHAR(50))

Maybe it works with the CONVERT() function?!… Not really, the behaviour is the same.

To finish the title… Using the CAST() function with User-Defined Data Types… did you know… it is not possible. You need to use the underlying system type.
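Side note: if, like me, you are dealing with lots of UDDTs during a migration, the catalog views can tell you the underlying system type of each one. A query along these lines should help:

SELECT ut.name AS user_type,
       st.name AS system_type,
       ut.max_length
FROM sys.types AS ut
JOIN sys.types AS st
    ON st.user_type_id = ut.system_type_id
WHERE ut.is_user_defined = 1;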

Thanks for reading.

dbachecks – Going parallel – Cut off your execution times

At one of my last clients, I implemented dbachecks to get the state of the art and learn how good or bad the environments are before starting to knock down the bad practices.

This client has seven different environments with more than 100 instances and more than 2000 databases.

Serial execution

A non-parallel execution (single session) took more than 2 hours.
This is not a big problem when we run it out of hours and we don’t want/need to be watching or waiting for it to finish.
I set it up to run at 6 in the morning, so when I arrive at the office I can refresh the Power BI dashboard and get a recent state of the art.

Well, 2 hours doesn’t seem too much if we compare it with some other dbachecks users.

What if we want to bring that 2-hour execution down? Or Hiram Fleitas’ (b | t) 4.6 hours?

Going parallel

First, let me remind you that this doesn’t come for free! If you run multiple checks in parallel (different sessions), you will consume more resources (CPU, memory, etc.) on the machine where you are running them. Test the process and find a reasonable limit, otherwise this can become slower than the serial execution.

This brought some new requirements.

By default, dbachecks works with the values previously saved (for that we use Set-DbcConfig or Import-DbcConfig). This means that when we start a new session and the last session changed any configuration, that configuration is the one that will be used in the new session.

Can you see the problem?

Imagine that I want to check that databases are in the Full recovery model in the production environment, and I want to start (in parallel) a check on the development environment, where I want to check for the Simple recovery model. If this setting is not changed within the correct time frame, we can end up checking for Full on the development environment, where we want Simple.

The first time I tried to run tests in parallel for environments that needed different configs, I didn’t realise this detail, so I ended up with many more failed tests than expected! The bell rang when the majority of the failures came from one specific test… the one whose value I had changed.

-Temporary parameter to the rescue!

In my last dbachecks blog post – dbachecks – Setting temporary configuration values – I explained how this works, so if you haven’t read it yet, take a look before continuing.

OK, now that you know you can use -Temporary to run the tests without interfering with the persisted values, you may have already guessed what we will do next…

My recipe to run in parallel

Disclaimer: this is just one option – you can come up with a different one. Please drop a comment so I, and others, can become aware of different approaches.

  1. If you don’t have a configuration file for the environment yet, start by configuring all the settings and use Export-DbcConfig to save them.
  2. Split your instances/hosts into one or more groups that can share the exact same configuration.
  3. Start a new PowerShell session and set (using Set-DbcConfig) or import (using Import-DbcConfig) the configurations saved in step 1 – don’t forget the -Temporary parameter.
  4. Run Invoke-DbcCheck.
  5. Repeat steps 3 and 4 for each group – I encourage you to start with just 2 sessions while monitoring your computer’s resources. Then, if you still have room, add one more.
  6. Grab a coffee, a beer or any other drink of your choice and wait until it finishes.

Again, keep an eye on your resources and then test with one more session. Do it until you find the sweet spot for the number of parallel sessions.

Here is the code you can use.
For step 1:

#PROD environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Full -Temporary
Export-DbcConfig -Path "D:\dbachecks\Prod_Configs.json"


#DEV environment
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple -Temporary
Export-DbcConfig -Path "D:\dbachecks\Dev_Configs.json"

Steps 2, 3 and 4 together:

#PROD instances
$sqlInstances = "prod1", "prod2", "prod3"

#Import Prod_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Prod_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances


#DEV instances
$sqlInstances = "dev1", "dev2", "dev3"

#Import Dev_Configs.json with -Temporary
Import-DbcConfig -Path "D:\dbachecks\Dev_Configs.json" -Temporary

#Run the checks - Don't forget to add all the parameters you usually use
Invoke-DbcCheck -SqlInstance $sqlInstances

Save these scripts in two different .ps1 files. Then open two different PowerShell sessions and call each script in its own session. Let it flow 🙂
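If you prefer to launch everything from a single place, something like this should also work (the script names are hypothetical):

#Each Start-Process spawns its own powershell.exe, so the -Temporary configs don't collide
Start-Process -FilePath 'powershell.exe' -ArgumentList '-File "D:\dbachecks\Run-ProdChecks.ps1"'
Start-Process -FilePath 'powershell.exe' -ArgumentList '-File "D:\dbachecks\Run-DevChecks.ps1"'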

Results

In my case I was able to drop from 2 hours to about 1 hour with 3 parallel sessions. Adding a 4th session made the whole process slower.

Wrap

We saw that we may have problems if we try to run more than one dbachecks session with different configured values. The -Temporary parameter comes in handy for this scenario.
This way we can run two or more sessions in parallel, each one on a different environment, without them messing with each other and, hopefully, cut down our execution times.

Hope this helps! I would love to hear whether you were able to bring down your execution times, and what they were before and after.

Thanks for reading!

dbachecks – Setting temporary configuration values

dbachecks saw the light about two months ago. As I write this blog post, the module counts more than 2600 downloads from the PowerShell Gallery alone.
The module has about 110 configurable checks that make our lives easier!

Today I will write about an option that I think users still don’t realize exists.

The default

dbachecks works with the values previously saved (for that we use Set-DbcConfig). This means that when we start a new session and the last session has changed any configuration, that configuration is now, by default, the one that will be used in the new session.

What if we want to run a check with a different value just once?!

Today I want to share a different option!

Let’s assume that you have your dbachecks configs set up for the Production environment. What do you need to do if you want to change just one check to test it in the Test environment?
One option is to use the export/import method that Rob (b | t) wrote about in his dbachecks – Configuration Deep Dive blog post.

What if we could change this property just for the current session, without messing with new sessions?

When we start a new session and import dbachecks (in fact, when PSFramework – a module required for dbachecks to work – is imported), we get the values from the registry. This means we will read whatever is there at that moment.

Let me introduce you to the -Temporary parameter

This parameter is available on the Set-DbcConfig command. As said before, this command allows us to set a configuration value which is, by default, persisted. But if we use the -Temporary parameter, we are saying that the configured value is only available for the current session; it will not be persisted for future executions and hence will not mess with new sessions.

You can run the following code to get the parameter description:

Get-Help Set-DbcConfig -Parameter temporary

Here is a demonstration:

The video shows that when we don’t use the -Temporary parameter and start a new session, we read the last value set. When we run the command with the -Temporary parameter (setting the value to 5), after starting a new session the value read is still 3.
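If you can’t watch the video, here is roughly what it does in script form (the configuration name is just an example – any numeric configuration works):

#Session 1: persist a value, then override it temporarily
Set-DbcConfig -Name policy.dump.maxcount -Value 3
Set-DbcConfig -Name policy.dump.maxcount -Value 5 -Temporary
Get-DbcConfigValue -Name policy.dump.maxcount    #returns 5 in this session

#Session 2 (a brand new PowerShell window): only the persisted value is read
Get-DbcConfigValue -Name policy.dump.maxcount    #returns 3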

This way we don’t need to export/import the configurations. Perhaps this will save you time when doing ad-hoc tests, and you won’t be left in doubt about whether you forgot to restore the older values after testing a different environment with different configurations.

I know what you are thinking…

“But I already have and use the export/import method! Changing this can be more work…”.
We got that covered! 💪

If you run

Get-Help Import-DbcConfig -Detailed

you can see that -Temporary is also available on this command.
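So a one-off run against a different environment can be as simple as this (the path and instance names are examples):

#Load a different environment's configs for this session only, then run the checks
Import-DbcConfig -Path 'D:\dbachecks\Test_Configs.json' -Temporary
Invoke-DbcCheck -SqlInstance 'testSql01'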

Hope this brings some new ideas, like making your single, ad-hoc, one-time tests easier to configure!
I have an idea that I will share in my next post about dbachecks!

Wrap

The -Temporary parameter exists on both the Set-DbcConfig and Import-DbcConfig commands.
By using it, you change the values only for the current session and won’t overwrite the persisted values. This can come in handy in many cases.
Explore it!

Drop a message in the comments section, either if you already use it (and how) or if you weren’t aware it exists and will give it a spin!

Thanks for reading!

Did you know…you can’t do arithmetic operations with datetime2 like with datetime?

I’m currently working on a SQL code migration from Firebird to SQL Server and I hit an error that I haven’t seen for some time.

The error message is the following:

Msg 206, Level 16, State 2, Line 4
Operand type clash: datetime2 is incompatible with int

This rang a bell right away! Somewhere in the code, someone was doing an arithmetic calculation without using the proper function.

How so?

In the early days of my T-SQL coding, I used to do this a lot, and I still see code from other applications written this way. Take, for instance, the following code that returns all orders more than 1 day old:

SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < GETDATE() - 1

For this example, let’s say the OrderDate column is a DATETIME2. This works just fine because the GETDATE() function returns a DATETIME value, and thus we can subtract one day from it.

If we define a variable of the DATETIME2 data type, assign it a GETDATE() value, and then attempt to subtract 1 from the variable, an error is raised!

DECLARE @vOrderDate DATETIME2 = GETDATE()
SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < @vOrderDate - 1

Msg 206, Level 16, State 2, Line 20
Operand type clash: datetime2 is incompatible with int

But it was working!?

Yes, it was working on the source engine (Firebird), and it will still work on the destination (SQL Server) if the data type stays the same – DATETIME.

What happened here was that the column data type was changed from DATETIME to DATETIME2 during the schema migration.

NOTE: The newer date/time data types appeared with SQL Server 2008: DATE, TIME, DATETIME2 and DATETIMEOFFSET.
Also, bear in mind that DATETIME and SMALLDATETIME are the only members of the date/time family that support this kind of arithmetic.
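A quick way to see both behaviours side by side:

DECLARE @dt  DATETIME  = GETDATE();
DECLARE @dt2 DATETIME2 = GETDATE();

SELECT @dt - 1;    -- works: DATETIME supports integer arithmetic
--SELECT @dt2 - 1; -- fails: Operand type clash: datetime2 is incompatible with int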

How to fix this error?

To solve this, we need to convert the

@vOrderDate - 1

to

DATEADD(dd, -1, @vOrderDate) 

The whole code looks like this:

DECLARE @vOrderDate DATETIME2 = GETDATE()
SELECT OrderId, ClientId, Quantity, OrderDate
FROM dbo.Orders
WHERE OrderDate < DATEADD(dd, -1, @vOrderDate) 

This way, it will work with the DATETIME, DATETIME2, DATE, SMALLDATETIME and DATETIMEOFFSET data types.

NOTE: DATEADD also supports the TIME data type. I didn’t mention it because in our example we are subtracting DAYS and, as (at least I) expected, that gives an error.

Wrap up

Are you thinking about changing your DATETIME columns to DATETIME2? Or are you just beginning to use it in your projects?
The documentation encourages you to do so (https://docs.microsoft.com/en-us/sql/t-sql/data-types/datetime-transact-sql), but as you can see from this post, you need to pay attention and do proper testing and T-SQL code revision.

Thanks for reading.