DELETE data on SQL Server HEAP table – Did you know…

Before I complete my question let me provide context.

I received an alert saying that a specific database could not allocate a new page (the disk was full).

The message that you will see on the SQL Server Error log is:

Could not allocate a new page for database ” because of insufficient disk space in filegroup ”. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

I didn’t know the database structure or what was stored there, so I picked up a script from my toolbelt that lists all indexes from all tables, along with some information like the number of rows and the space they occupy. I sorted by occupied space in descending order, and look what I found…
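The script itself isn’t shown here, but the idea can be sketched with a query along these lines (the exact DMVs and column aliases are my assumption, not necessarily the original script):

```sql
-- Sketch: list every index (and heap) with its row count and allocated space,
-- sorted by allocated space in descending order
SELECT OBJECT_SCHEMA_NAME(i.object_id)          AS [schema_name],
       OBJECT_NAME(i.object_id)                 AS [table_name],
       i.name                                   AS index_name,
       i.index_id,
       SUM(ps.row_count)                        AS [rows],
       SUM(ps.reserved_page_count) * 8 / 1024   AS reserved_mb
FROM sys.indexes AS i
JOIN sys.dm_db_partition_stats AS ps
    ON ps.object_id = i.object_id
   AND ps.index_id  = i.index_id
GROUP BY i.object_id, i.name, i.index_id
ORDER BY reserved_mb DESC;
```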

So…my script has a bug? 🙂 No, it hasn’t!

The joy of heaps

First, the definition:

A heap is a table without a clustered index. One or more nonclustered indexes can be created on tables stored as a heap. Data is stored in the heap without specifying an order. Usually data is initially stored in the order in which the rows are inserted into the table, but the Database Engine can move data around in the heap to store the rows efficiently; so the data order cannot be predicted. To guarantee the order of rows returned from a heap, you must use the ORDER BY clause. To specify the order for storage of the rows, create a clustered index on the table, so that the table is not a heap.

Source: MS Docs – Heaps (Tables without Clustered Indexes)

Until now, everything seems normal, it is just a table with unordered data.

Why am I talking about heaps?

Not because of the table name (it was created on purpose for this demo). Let me show you the whole row from the script output:

Do you have a clue? Yup, index_id = 0. That means our table does not have a clustered index defined and is therefore a HEAP.
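If you want to hunt for heaps yourself, a quick query against sys.indexes does the trick; the filter on index_id = 0 is exactly the clue mentioned above:

```sql
-- List all heaps (index_id = 0) in the current database
SELECT OBJECT_SCHEMA_NAME(object_id) AS [schema_name],
       OBJECT_NAME(object_id)        AS [table_name]
FROM sys.indexes
WHERE index_id = 0;
```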

Even so, how is it possible? Zero rows but occupying several MB…

The answer is…in the documentation 🙂

When rows are deleted from a heap the Database Engine may use row or page locking for the operation. As a result, the pages made empty by the delete operation remain allocated to the heap. When empty pages are not deallocated, the associated space cannot be reused by other objects in the database.

source: DELETE (Transact-SQL) – Locking behavior

That explains it!

So…what should I do in order to get my space back when deleting from a HEAP?

On the same documentation page we can read the following:

To delete rows in a heap and deallocate pages, use one of the following methods.

  • Specify the TABLOCK hint in the DELETE statement. Using the TABLOCK hint causes the delete operation to take an exclusive lock on the table instead of a row or page lock. This allows the pages to be deallocated. For more information about the TABLOCK hint, see Table Hints (Transact-SQL).
  • Use TRUNCATE TABLE if all rows are to be deleted from the table.
  • Create a clustered index on the heap before deleting the rows. You can drop the clustered index after the rows are deleted. This method is more time consuming than the previous methods and uses more temporary resources.
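For completeness, the third method could look like this. The column name and index name are purely illustrative; your heap’s columns and delete predicate will differ:

```sql
-- Hypothetical example of method 3: create a clustered index, delete, then drop it
CREATE CLUSTERED INDEX CIX_Heap_Tmp ON dbo.Heap (Id);

DELETE FROM dbo.Heap
WHERE SomeColumn = 'some value'; -- illustrative predicate

DROP INDEX CIX_Heap_Tmp ON dbo.Heap;
```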

Following the documentation, we can use the TABLOCK hint in order to release the empty pages when deleting the data.
Example:

DELETE 
  FROM dbo.Heap WITH (TABLOCK)

What if I didn’t do it that way, or if someone else ran a DELETE without specifying the hint?

You can rebuild your table using this syntax (since SQL Server 2008):

ALTER TABLE dbo.Heap REBUILD

This way, the table will release the empty pages and you will recover the space to use for other objects in the database.

Wrap up

I hope that with this little post you understood how and why a HEAP can have few rows, or even zero, but still occupy lots of space. I have also mentioned two ways to solve this problem.
I have found databases with dozens of HEAPs, almost empty or even completely empty, that were occupying more than 50% of the total space allocated to the database. And guess what? People were complaining about space.

To finish, I need to complete the title: Did you know…you should use the TABLOCK hint when deleting data from a HEAP?

Thanks for reading!

Someone is not following the best practices – dbatools and Pester don’t lie!

This month’s T-SQL Tuesday is brought to us by my good friend Rob Sewell (b | t) with the topic “Let’s get all Posh – What are you going to automate today?”

I have written some blog posts on how I use PowerShell to automate mundane tasks and some other more complex scenarios, like: Find and fix SQL Server databases with empty owner property using dbatools PowerShell module, Have you backed up your SQL Logins today?, or even using the ReportingServicesTools module to deploy reports – SSRS Report Deployment Made Easy – 700 times Faster.

But today I want to bring something a little different. This year, back in May, I saw two presentations from Rob about using Pester to do unit tests for our PowerShell code and also to validate options/infrastructure, like checklists. This got my attention and made me want to play with it!

Therefore, I want to share an example with you using two of my favorite PowerShell modules dbatools and Pester.

Let’s play a game

You go to a client, or you have just started working at a new employer, and you want to know whether the entire SQL Server estate complies with the best practices.

For the purpose of this blog post, we will check:

  • if our databases (from all instances) have the following configurations:
    • PageVerify -> Checksum
    • AutoShrink -> False
  • if each SQL Server instance:
    • has the MaxMemory setting configured to a value lower than the total existing memory on the host.

How would you do that?

Let me introduce to you – dbatools

For those who don’t know, dbatools is a PowerShell module, written by the community, that makes SQL Server administration much easier using PowerShell. Today, the module has more than 260 commands. Go get it (dbatools.io) and try it! If you have any doubt you can join the team on the #dbatools channel at the Slack – SQL Server Community.

In this post I will show some of those commands and how they can help us.

Disclaimer: Obviously this is not the only way to accomplish this request but, for me, it is an excellent way!

Get-DbaDatabase command

One existing command on the dbatools swiss army knife is Get-DbaDatabase.
As its description states:

The Get-DbaDatabase command gets SQL database information for each database that is present in the target instance(s) of SQL Server. If the name of the database is provided, the command will return only the specific database information.

This means that I can run the following piece of PowerShell code and get some information about my databases:

Get-DbaDatabase -SqlServer sql2016 | Format-Table

This returns the following information from all existing databases on this SQL2016 instance.

Too little information

That’s true; when we look at it, it doesn’t bring enough information. I can’t even see the “PageVerify” and “AutoShrink” properties that I want. But that is because, by default, only a handful of properties are output, and this doesn’t mean the others are not there.

To confirm this, we can run the same code without the “| Format-Table” part, which is useful to output the information in a table format but, depending on the size of your window, can show more or fewer columns.
By running the command without Format-Table we can see the following (just showing the first 3 databases):

Now we can see more properties available; look at the ones inside the red rectangle.

I still don’t see the ones I want

You are right. But, as I said before, that does not mean they aren’t there.
To simplify the code, let’s assign our output to a variable named $databases and then have a look at all the members existing on this object:

$databases = Get-DbaDatabase -SqlServer sql2016
$databases | Get-Member

Now we get a lot of stuff! The Get-Member cmdlet tells us the properties and methods of the object (in this case, $databases).

This means that I can use a filter to find members with “auto” in their names:

$databases | Get-Member | Where-Object Name -like *auto*

Some cmdlets have parameters that allow us to filter information without the need to pipe to another cmdlet, so the last command could be written as:

$databases | Get-Member -Name *auto*

Which will output something like this:

So, we have found our “AutoShrink” property. With this in mind, let’s query all the properties we want.

$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify

And here we have the result:

Scaling for multiple instances

This is where the fun begins.
We can pass multiple instance names and the command will go through all of them and output a single object with the data.

$databases = Get-DbaDatabase -SqlServer sql2016, sql2012
$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify

Which outputs:

As you can see, I passed two different instances, sql2016 (in red) and sql2012 (in green), and the output brought information from both.

Using Out-GridView to filter results

We can use another native PowerShell cmdlet called Out-GridView to show our results in a grid format. This grid also makes it possible to use filters.
For the next example, I have misconfigured two databases so we can find them among the others.

$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify | Out-GridView

As you can see, inside the red rectangles we have two configurations that do not follow the SQL Server best practices. You can also see the green rectangle in the top left corner, where you can type text and the results will be filtered as you type. So, if you type “true”, you will end up with just one record.

Checking the MaxMemory configuration

Now that you have seen how to do it for one command, you can start exploring the other ones. As I said at the beginning of this post, we will also check the MaxMemory setting for each instance. For that we will use Get-DbaMaxMemory, whose help page description says:

This command retrieves the SQL Server ‘Max Server Memory’ configuration setting as well as the total physical memory installed on the server.

Let’s run it through our two instances:

Get-DbaMaxMemory -SqlInstance sql2012, sql2016

We can see that the SQL2012 instance is running on a host with 6144MB of total memory but its MaxMemory setting is set to 3072MB, and the SQL2016 instance has 4608MB configured from the 18423MB existing on the host.

Final thought on this fast introduction to dbatools PowerShell module

As you can see, it is pretty easy to run the commands against one or multiple instances and get information to work on. You have also seen different ways to output that information.
I encourage you to use Find-DbaCommand to discover which other commands exist and what they can do for you.

For example, if you want to know which commands work with “memory”, you can run the following code:

Find-DbaCommand -Pattern memory

Automating even more

Using the dbatools module we can verify whether the best practices are in place. But we had to run the command and then verify the values by filtering and looking at each row.

You may be thinking that there must be some more automated method to accomplish that, right?

Say hello to Pester PowerShell module

Pester is a unit test framework for PowerShell. I like to say: if you can PowerShell it, you can Pester it.

Pester provides a framework for running Unit Tests to execute and validate PowerShell commands. Pester follows a file naming convention for naming tests to be discovered by pester at test time and a simple set of functions that expose a Testing DSL for isolating, running, evaluating and reporting the results of PowerShell commands.

Please see how to install Pester module here.

With this framework, which I really encourage you to read more about on the project wiki, we can automate our tests and make it do the validations for us!

As a quick example, if we run the following code:

We are checking whether the login returned by whoami is base\claudio.

This returns green, which means it’s OK!

If it is not OK (because I’m now testing for “base\claudio.silva”), it will return something like this:

Quick walkthrough on Pester syntax

As you can see, to write a test we need:

  • A Describe block (attention: the “{” must be on the same line!)
    • Inside it, a Context block
      • And inside the Context block, the validation that we want to do, using It and Should.
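Putting those pieces together, the whoami example from above could be written like this (a sketch; the exact code from the screenshot may differ slightly):

```powershell
# Minimal Pester test: check that the current login is the one we expect
Describe "whoami" {
    Context "Current logged-in user" {
        It "Should be base\claudio" {
            whoami | Should Be "base\claudio"
        }
    }
}
```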

Let’s join forces

With this in mind, I can create tests for my needs using dbatools and Pester.

I will have a variable ($SQLServers)

$SQLServers = @('sql2012', 'sql2014', 'sql2016')

with all the instances I want to test, and two “Describe” blocks: one for “Testing Database Options” – PageVerify and AutoShrink:

foreach($Server in $SQLServers){
   Describe "Testing Database Options for $Server" {
      #Selecting only the columns we need, so it doesn't take too long returning things we don't want
      $databases = Get-DbaDatabase -SqlServer $Server | Select-Object Name, SqlInstance, CompatibilityLevel, PageVerify, AutoShrink, AutoClose
      foreach($database in $databases) {
         Context "$($database.Name) Validation" {
            It "PageVerify set to Checksum" {
               $database.PageVerify | Should Be "Checksum"
            }
            It "AutoShrink set to False" {
               $database.AutoShrink | Should Be $false
            }
         }
      }
   }
}

And another one for “Testing instance MaxMemory”:

Describe "Testing Instance MaxMemory" {
   foreach($Server in $SQLServers){
      $instanceMemory = Get-DbaMaxMemory -SqlInstance $Server
      Context "Checking MaxMemory value" {
         It "$($Server) instance MaxMemory value $($instanceMemory.SqlMaxMb) is less than host total memory $($instanceMemory.TotalMB)" {
            $instanceMemory.SqlMaxMb | Should BeLessThan $instanceMemory.TotalMB
         }
      }
   }
}

To run these tests, we should save the file with a name ending in “.Tests.ps1”. Let’s save it as “SQLServerBestPractices.Tests.ps1”. Then, to run the tests, we use Invoke-Pester with the file that contains them.

Invoke-Pester .\SQLServerBestPractices.Tests.ps1

Too much noise – can’t find the failed tests easily

You are right; showing all the greens makes us lose the possible red ones. But Pester has an option to show just the failed tests.

Invoke-Pester .\SQLServerBestPractices.Tests.ps1 -Show Failed

But be aware that -Show Fails can be a better option, especially when you are working with multiple Tests.ps1 files.

This way you can see where your errors come from.

Reading and fixing the errors

As you can read from the last image of the -Show Failed execution, the database “dbft” on the “SQL2016” instance has the “AutoShrink” property set to “True”, but we expect the value “False”. Now you can go to the database properties and change this value!

Also, the “PageVerify” value that we expect to be “Checksum” is “TornPageDetection” for the database “dumpsterfire4” on the “SQL2016” instance.

Finally, the MaxMemory configuration on the “SQL2016” instance is set to 46080MB (45GB), but we expect it to be less than 18432MB (18GB), which is the total memory of the host. We need to reconfigure this value too.

This is great!

Yes it is! Now, when a new database is born on an existing instance, or when you add a new instance, you can simply rerun the tests and the new stuff will be included in this set of tests!

If you set it to run daily, or even once per week, you can check your estate and catch new stuff that wasn’t set up by you and maybe is not following the best practices.

You can even get the fails and email them (I will blog about that).

Next steps

  • Explore Pester syntax.
  • Add new instances.
  • Add new tests
    • Check if you have access to the instance (a great way to know quickly if some instance is stopped)
    • Check if your backups are running with success and within your policy’s time interval
    • Check if your data files are set to grow by a fixed value and not by percent. Also, whether that fixed value is more than X MB.
    • Want to test your last backup? Or something completely different, like Rob’s Pester for Presentations – Ensuring it goes ok?

You name it!
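As a sketch of the first idea above, a minimal connectivity check could look like this. I’m using plain Test-Connection (a ping) as a stand-in; a real check would try to open an actual SQL connection:

```powershell
# Hypothetical sketch: ping each host before running the real SQL tests
$SQLServers = @('sql2012', 'sql2014', 'sql2016')

Describe "Testing instance connectivity" {
   foreach($Server in $SQLServers){
      It "$Server responds to ping" {
         Test-Connection -ComputerName $Server -Count 1 -Quiet | Should Be $true
      }
   }
}
```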

Want more?

I hope you have seen some new stuff and get some ideas from this blog post!

If you want to know whether there will be a dbatools presentation near you, visit our presentations page. You can find some of our presentations on our YouTube channel and code examples in the community presentations on GitHub.

For Pester and other examples and use cases, there is the Articles and other resources page maintained by the Pester team.

I’m looking forward to reading the other blog posts (follow the comments on Rob’s post, or the roundup later) for this month’s T-SQL Tuesday and seeing what people are doing with PowerShell.

Thanks for reading.

“Invalid class [0x80041010]” error when trying to access SQLServer’s WMI classes

I was using the open source PowerShell module dbatools (GitHub repository) to get the list of SQL Server services on a bunch of hosts, so I could confirm they are in the “running” state.

— Quick note —
For those who don’t know, dbatools is a module, written by the community, that makes SQL Server administration much easier using PowerShell. Today, the module has more than 260 commands. Go get it and try it! If you have any doubt you can join the team on the #dbatools channel at the Slack – SQL Server Community.
— Quick note —

To accomplish this, I’m using Get-DbaSqlService, initially written by Klaas Vandenberghe (b | t).

This command is very handy, as it will try different ways to connect to the host without us needing to do anything extra. Also, it has a -Credential parameter, so we can use it to connect to hosts in different domains (I have 10 different credentials, one per domain).

Everything was running fine, for the first couple of hosts, until…

I got the following message when running on a specific host:

WARNING: Get-DbaSqlService – No ComputerManagement Namespace on HOST001. Please note that this function is available from SQL 2005 up.

Trying to get more information, I executed the same command but added the -Verbose switch.

From all the blue lines, I spotted this:

VERBOSE: [Get-DbaCmObject][12:23:31] [HOST001] Retrieving Management Information
VERBOSE: [Get-DbaCmObject][12:23:31] [HOST001] Accessing computer using Cim over WinRM
VERBOSE: [Get-DbaCmObject][12:23:47] [HOST001] Accessing computer using Cim over WinRM – Failed!
VERBOSE: [Get-DbaCmObject][12:23:47] [HOST001] Accessing computer using Cim over DCOM
VERBOSE: [Get-DbaCmObject][12:23:48] [HOST001] Accessing computer using Cim over DCOM – Success!

OK, this means that for this specific host I can’t connect via WinRM (using WSMan) but I can using the DCOM protocol. However, the WMI query used to get the list of SQL services fails.

Going further

I opened the Get-DbaSqlService.ps1 script and spotted where the warning message comes from. Then I copied the code to a new window in order to isolate it and run some more tests.

The code is:

$sessionoption = New-CimSessionOption -Protocol DCOM
$CIMsession = New-CimSession -ComputerName $Computer -SessionOption $sessionoption -ErrorAction SilentlyContinue -Credential $Credential
#I have skipped an if ( $CIMSession ) that is here because we know that works.
$namespace = Get-CimInstance -CimSession $CIMsession -NameSpace root\Microsoft\SQLServer -ClassName "__NAMESPACE" -Filter "Name Like 'ComputerManagement%'" -ErrorAction SilentlyContinue |Where-Object {(Get-CimInstance -CimSession $CIMsession -Namespace $("root\Microsoft\SQLServer\" + $_.Name) -Query "SELECT * FROM SqlService" -ErrorAction SilentlyContinue).count -gt 0}

I split the last command to remove the pipeline, since I wanted to analyze each part of the code. I ended up with the following code:

$sessionoption = New-CimSessionOption -Protocol DCOM
$CIMsession = New-CimSession -ComputerName "HOST001" -SessionOption $sessionoption -ErrorAction Continue -Credential $Credentials -Verbose

Get-CimInstance -CimSession $CIMsession -NameSpace root\Microsoft\SQLServer -Query "Select * FROM __NAMESPACE WHERE Name Like 'ComputerManagement%'"
#This one is commented out for now
#Get-CimInstance -CimSession $CIMsession -Namespace $("root\Microsoft\SQLServer\ComputerManagement10") -Query "SELECT * FROM SqlService"

This can return more than one row, with different ComputerManagement entries (like ComputerManagement10), depending on the versions you have installed on the host. The number “10” refers to SQL Server 2008.
Now I can uncomment the last command and run it. The result is:

Get-CimInstance : Invalid class
At line:1 char:1
+ Get-CimInstance -CimSession $CIMsession -Namespace $("root\Microsoft\SQLServer\C …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : MetadataError: (:) [Get-CimInstance], CimException
+ FullyQualifiedErrorId : HRESULT 0x80041010,Microsoft.Management.Infrastructure.CimCmdlets.GetCimInstanceCommand
+ PSComputerName : HOST001

OK, a different error message. Let’s dig into it. I logged in on the host and confirmed that it has a SQL Server 2008 R2 instance installed. This means that I’m not accessing a version lower than 2005, as the initial warning message was suggesting.

I tried to execute the same query locally, but this time using Get-WmiObject instead of Get-CimInstance (which, in this case, wasn’t available because the host only has PowerShell v2.0; it’s a Windows Server 2008 SP2, and the CIM cmdlets appeared in v3.0), and it failed with the same error.

Get-WmiObject : Invalid class
At line:1 char:5
+ gwmi <<<< -Namespace "root\Microsoft\SQLServer\ComputerManagement10" -Query "SELECT * FROM SqlService"
+ CategoryInfo : InvalidOperation: (:) [Get-WmiObject], ManagementException
+ FullyQualifiedErrorId : GetWMIManagementException,Microsoft.PowerShell.Commands.GetWmiObjectCommand

I remembered, from past experiences, that SQL Server Configuration manager relies on WMI classes to show the information, so I tried to open it and I got the following error message:

Cannot connect to WMI provider. You do not have permission or the server is unreachable. Note that you can only manage SQL Server 2005 and later servers with SQL Server Configuration Manager.
Invalid class [0x80041010]

Again, that 2005 callout, but… do you recognize the last sentence? It’s the same error I was getting with Get-CimInstance remotely and Get-WmiObject locally.

Definitely something is broken.

Let’s fix it!

To fix this problem, we need to reinstall the SQL Server WMI provider. To do this, we need to run two commands (I found the fix in this post).

  1. Install classes:
    Go to C:\Program Files (x86)\Microsoft SQL Server\{Version: 110 is SQL2012}\Shared.
    There you can find a file with the .mof extension, named sqlmgmproviderxpsp2up.mof.
    Now, on the command line, run the following command:
    mofcomp sqlmgmproviderxpsp2up.mof
    The output:
  2. Install localization info:
    Navigate to the Shared sub-folder that indicates the locale of your SQL Server installation. In my case it is 1033 (English – US).
    Inside that folder you will find a file with the .mfl extension, named sqlmgmprovider.mfl. On the command line, run the following command:
    mofcomp sqlmgmprovider.mfl
    The output:

With these two actions, we are done.

Now we can try to open SQL Server Configuration Manager again, and it opens as expected, without error messages!

Let’s go back and rerun our commands.
On the host:

Remotely:

And from dbatools Get-DbaSqlService command:

No more “invalid class” messages and we get the output we want!

Thanks for reading.

HTTP 403 error – PowerShell Remoting, Different Domains and Proxies

In my day-to-day work I use the Nagios monitoring software. I want to add some custom SQL Server scripts to enrich the monitoring, and to accomplish this I will need to:

  • Find a folder
  • Create a sub-folder
  • Copy a bunch of files
  • Edit an INI file to verify/add new entries

all of this for every single host in my entire estate. Obviously (for me 🙂 ) I decided to use PowerShell!

Hold your horses!

Yes, calm down. I’m working at a client where the network is anything but simple. As far as I know, they have 10 domains and only a few of them have trust configured; even for those that do, the trust is not configured in both directions… so I didn’t expect an easy journey to get the task done.

Side note: For those wondering how I can live without PowerShell: I can’t! But the majority of my time using PowerShell is with SQL Server, mainly using SMO (with the help of dbatools), which means I hadn’t struggled that much until now.

“…WinRM client received an HTTP status code of 403…”

Ok, here we go!

PowerShell Remoting and different domains…

…needs different credentials. This is a requirement when using an IP address.
If we try to run the following code:

$DestinationComputer = '10.10.10.1'
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer

we will get the following error message:

Default authentication may be used with an IP address under the following conditions: the transport is HTTPS or the destination is in the TrustedHosts list, and explicit credentials are provided.

First, I added the destination computer to my TrustedHosts list. We can do this in two ways:

Using Set-Item PowerShell cmdlet

Set-Item WSMan:\localhost\Client\TrustedHosts "10.10.10.1"

Or using winrm executable:

winrm s winrm/config/client '@{TrustedHosts="10.10.10.1"}'

Note: You can use “*” (asterisk) to say that all remote hosts are trusted, or trust just a segment of IPs, like “10.10.10.*”.
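As a side note, if you already have entries in TrustedHosts and don’t want to overwrite them, Set-Item supports a -Concatenate switch that appends instead of replacing (the IP here is just an example):

```powershell
# Append a host to the existing TrustedHosts list instead of replacing it
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "10.10.10.2" -Concatenate -Force
```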

But there is another requirement, as the error message says: “…and explicit credentials are provided.” This means that we need to provide credentials and, in this case, I really want to use a different credential anyway, so I modified the script to:

$DestinationComputer = '10.10.10.1'
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer -Credential domain2\user1

Now I get prompted for the user’s password, and I can… get a different error message (*sigh*):

[10.10.10.1] Connecting to remote server 10.10.10.1 failed with the following error message : The WinRM client received an HTTP status code of 403 from the remote WS-Management service. For more information, see the about_Remote_Troubleshooting Help topic.
+ CategoryInfo : OpenError: (10.10.10.1:String) [], PSRemotingTransportException
+ FullyQualifiedErrorId : -2144108273,PSSessionStateBroken

This one was new to me, so I jumped to Google and started searching for this error message. Unfortunately, all the references I found were about solving an IIS problem with an SSL checkbox on the website, like this example.

Clearly this is not the problem I was having.

Proxies

I jumped into the PowerShell Slack (you can ask for an invite here and join more than 3 thousand professionals) and asked for help on the #powershell-help channel.
In the meantime, I continued my search and found something about proxies in the The dreaded 403 PowerShell Remoting blog post.
That could actually help, but I didn’t want to remove the existing proxy settings from the remote machine. I had to find another way to do it.

Returning to Slack, Josh Duffney (b | t) and Daniel Silva (b | t) quickly offered to help me and, when I mentioned the blog post on proxies, Daniel showed me the PowerTip PowerShell Remoting and HTTP 403 Error, which I hadn’t found before (don’t ask me why… well, I have an idea: I had copied & pasted the whole error message, that’s why).

ProxyAccessType

The answer, for my scenario, is the ProxyAccessType parameter. As the help page says, this option “defines the access type for the proxy connection”. There are 5 different options: AutoDetect, IEConfig, None, NoProxyServer and WinHttpConfig.

I need to use NoProxyServer to “do not use a proxy server – resolves all host names locally”. Here is the full code:

$DestinationComputer = '10.10.10.1'
$option = New-PSSessionOption -ProxyAccessType NoProxyServer
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer -Credential domain2\user1 -SessionOption $option

This will:

  • create a new PowerShell Session option (line 2) with New-PSSessionOption cmdlet saying that -ProxyAccessType is NoProxyServer.
  • Then, just use the $option as the value of -SessionOption parameter on the Invoke-Command.

This did the trick! I was finally able to run code on the remote host.

Thanks for reading.

Offline Microsoft Documentation? Download it!

In my last article I shared how we can now contribute to Microsoft Documentation. Today I bring another quick tip on Microsoft documentation!

Download Microsoft Documentation

Did you know that we can download PDF files with Microsoft Documentation?

I did not know until my colleague called my attention to it few days ago.

Important note: This tip is not (yet?) available for the entire Microsoft product suite. You should confirm whether it applies to the product you need.

“Which documentation?”

The one we can find at docs.microsoft.com.

Here is why this can be useful

Nowadays, some of us have access to the internet almost 100% of the time, which makes it easy to forget that it can fail. You have probably gone through this: losing internet access right when you need to check a document. You know it can happen.

If it happens, you get stuck because you can’t access a small (or not so small) piece of text that you could have backed up beforehand, but didn’t, right?

Were you using the online documentation to understand what a specific field belonging to a system Dynamic Management View (DMV) means? Or which parameter you need to use to execute a specific system stored procedure?

If you get the PDF, you can continue working offline. Going on a flight? Will you be in a place where you have no internet access at all?

I think you get the point.

“I will give it a try, show me how”

The link is located at the bottom left of the page.

DownloadLink

This will not download just the current page. By using the “Download PDF” link, you get all the content present in the tree view under the “Filter” box on the left side of the page.

treeview

Script to download all existing PDF files

From my search, at least 98 PDF documents (~66MB) exist just for the Relational Databases topic. Downloading them all is not the kind of work I would like to do manually.

PowerShell to the rescue

I wrote a PowerShell script that makes things a little bit easier.

With this script you can download all the files for a specific topic. You can find and download the script, Get-MSDocs, from my GitHub repository; just change the variables and run it.

Let’s see an example

You search for the ‘sys.dm_exec_sessions’ DMV and find the corresponding page in the Microsoft documentation -> sys.dm_exec_sessions

The image below shows where you can find the topic (highlighted in yellow) that you need to set in the $topic variable in the script.

mainTopic

By setting the variable $topic = "relational-databases", the script will download all PDF files for that main topic. I accomplished that by understanding the sql-docs GitHub repository nomenclature.

Each folder in there is the name of one PDF file, plus one for the current folder, ‘Relational-Database’ in this scenario.

Next, choose the destination by setting the $outputFolder variable.

As an example, for the SQL docs you have to choose a folder from the Docs root of the GitHub repository.

If you have any difficulty working with it, let me know by writing a comment on this blog post.

Let’s say you want to do the same but for Azure: you need to change the URLs too. The script currently points to ‘SQL.sql-content’, and for Azure it is ‘Azure.azure-documents’. The way I found this out was by clicking “Download PDF” on one of the pages and reading the URL of the PDF.

Wrap up:

I have shown how you can manually download a copy of the documentation, but also how to get all the existing files for a specific topic.

I also explained that this is not available for every Microsoft product. For example, the PowerShell docs don’t have the link to download a PDF file on the docs.microsoft.com site.

Maybe in the future this will become the standard for every Microsoft product’s documentation.


Thanks for reading

Contribute to Microsoft Documentation

Times have changed, and Microsoft has changed the way we can contribute to documentation!

We already have access to the source code of some products. One example is PowerShell, which has a GitHub repository where anyone can contribute!

Now anyone can contribute to the documentation too!

How and where?

If you haven’t seen it before: there is now a pencil icon in the top right corner that makes it possible to suggest a change.

feature_image


When clicking on that pencil we are redirected, in this case, to the MicrosoftDocs – sql-docs repository on GitHub.

There, we need to fork the repository, make the changes, and submit our suggestion by opening a pull request (PR). After that, we just need to wait for feedback from the Microsoft team, which will review what we have submitted.

Start contributing

In the past, if you saw any errors in the Microsoft documentation, you could not easily help. But now we have no more excuses! If we want to contribute, the process is much easier.

Have you overcome a not-so-common problem and have precious information to add to the documentation? Do you want to add another code example? Or have you “just” found a typo?

Just go ahead and submit a PR.

I will be speaking at PowerShell Conference Asia 2017

What better way to launch my blog than with great news?!

I am so happy and excited to announce that I will be speaking at PowerShell Conference Asia in Singapore!

On the 28th of October I will be presenting two sessions with the following titles:

  • Next step to your script…turn it into an Advanced Function
  • SQLServer Reporting Services administration new best friend – PowerShell

Also, on the 26th there will be 2 precons:

If you want to know more about the conference you can follow @psconfasia on Twitter, go to the psconf.asia website, and join the Slack team at psconfasia.slack.

Looking forward to meeting you in Singapore!