PowerShell Modules Central – Share with community – What PowerShell modules are you using?

As the blog post title states, this is all about sharing with others! My idea is to share with the community which PowerShell modules we are all using.

Let me introduce you to PowerShell Modules Central

PowerShellModulesCentral is a GitHub repository founded as a central hub for a list of PowerShell modules that people know and use. Each module has a file describing its name, basic information about the module, and one or more blog posts/videos from people who have written about or used it.

This way we can reduce friction when people are starting out or are trying to solve similar problems.


When a new module appears on the PowerShell scene it can be difficult to advertise it and gain mindshare among the developers/end users who could be interested in it. There are also times when it is hard to find out whether a good tool exists, whether it's up to date, and how relevant it is in the community.

Why not just use the PS Gallery or script center?

This is, by no means, a replacement for those. Actually, it is the opposite: it is meant to be a community complement. Normally, when you need to do a task you've never done before, you like to have a jump start, such as blog posts or videos, and maybe you find ones that are very close to your real scenario.
This repository helps not only people who write blog posts and want to share them with the community, but also someone new (to PowerShell or just to a task) who is searching for a specific tool to accomplish it.
I can go to the PowerShell Gallery and see that the module I want to use has 1K downloads. That is really cool! It gives me confidence to use it. But next, when I want to start working with it, I would like to see examples. The objective here is to have a quick look at some problems and the tools used to solve them, as they may also be your problems.

Let me tell you a quick story

I went to Google, found a PowerShell Gallery script, and after checking that the script didn't work with some particulars, I did further research and found (on page 3 or 4 of the Google results, due to ranking) a comment on a forum pointing to the GitHub repository. Guess what? The problems I was having were already addressed. 😉

Are you a module owner? Are you writing something new? Do you contribute to a module? Share it! The ones I know and use could be very different from the ones you know and use! Why not share?

How can this help me?

Are you trying to find a module for a specific task? Use the search at the top of the repository page and try to find what you need.

  • Working with Windows? Type “Windows”.
  • Working with SQL Server? Type “SQL Server”.
  • Do you know the author’s name? You can search that way too.
  • Have you read a blog post before and just remember one word or the blogger’s name? Type it and see what you find.

How to contribute?

Just fork the repository, add the information and send a pull request (PR). I will merge it once everything is OK.
For new modules please use the template available here. If the module already exists, you just need to add your URLs and any other information to be updated: tags that you think may be useful, something for the description, etc.

If you use a module that doesn't have a blog post and/or videos yet, you can submit a PR anyway, so the whole community knows it exists and maybe someone will write about it!

Follow up

Follow the repository news by clicking the “Watch” button and/or following the @psmcentral Twitter account.

Feel free to share this blog post! The more people we reach, the better!

Thanks for reading.

Generate Markdown Table of Contents based on files within a folder with PowerShell

Last week I was talking with Constantine Kokkinos (b | t) about generating a Table Of Contents (TOC) for a GitHub repository.

He wrote a cool blog post – Generating Tables of Contents for Github Projects with PowerShell – about it and I will write this one with a different problem/solution.


I’m working on a new project (news coming soon) that uses a GitHub repository, and I expect to have a large number of files within a specific folder.


After some pull requests and merges, I want to update my readme.md file and update the INDEX with this TOC.
For this:

  • I want to be able to generate a TOC based on the existing content of a specific folder.
  • Each TOC entry must be composed of a name and a link to the online .md file.
  • This list must be ordered alphabetically.

Then I can copy & paste it and update the readme.md.
NOTE: For now I just want a semi-automatic way to do it. Maybe later I will set up AppVeyor and make this fully automated 🙂


Get all files with the .md extension, order them by name and, for each one, generate a line with a link to the GitHub repository's .md file.

To build the list I will use the “*” (asterisk) character after a TAB to generate a sub-list (this is Markdown syntax).

The code

I have three parameters:

  • $BaseFolder – the folder's location on disk
  • $BaseURL – used to build the URL for each file; this will be added as a link
  • $FiletypeFilter – to filter the files in the folder. In my case I will use “*.md” because I only want markdown files.

The code is:
UPDATE: Thanks to Jaap Brasser (b | t), who has contributed to this code by adding the help and some improvements, like dealing with special characters (spaces) in the URL. You can find the most recent version of this Convert-FolderContentToMarkdownTableOfContents.ps1 function here on my GitHub.
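Since the published function lives on GitHub, here is only a minimal sketch of the idea. The parameter names come from this post; everything else, including the encoding of spaces in the URL, is an assumption, and the real version is more complete:

```powershell
function Convert-FolderContentToMarkdownTableOfContents {
    # Hypothetical minimal sketch - the published version on GitHub
    # (with Jaap Brasser's improvements) includes help and more features.
    param (
        [string]$BaseFolder,
        [string]$BaseURL,
        [string]$FiletypeFilter = '*.md'
    )

    # Start with the index header and a sub-list entry named after the folder
    $output = "## Index`r`n* $(Split-Path -Path $BaseFolder -Leaf)`r`n"

    Get-ChildItem -Path $BaseFolder -Filter $FiletypeFilter |
        Sort-Object -Property Name |
        ForEach-Object {
            # Display name without the extension; encode spaces for the URL
            $name = $_.BaseName
            $url  = "$BaseURL/$($_.Name)" -replace ' ', '%20'
            $output += "  * [$name]($url)`r`n"
        }

    $output
}
```

The result can then be pasted straight into the readme.md.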

Running this code pointing to my “NewProject” folder

I get this output (the links are fake, just to show the format):


Nice! This has the following markdown behind it:

## Index
* Modules
  * [File1](https://github.com/user/repository/tree/master/Modules/File1.md)
  * [OneNewFile](https://github.com/user/repository/tree/master/Modules/OneNewFile.md)
  * [OtherFile](https://github.com/user/repository/tree/master/Modules/OtherFile.md)

Now, I can copy this markdown code and update my readme.md file.

Final thoughts

This isn't rocket science 🙂 but it is an idea and a piece of code that helps me, and maybe it can help you too 🙂

Read Constantine’s blog post (Generating Tables of Contents for Github Projects with PowerShell) to get different ideas.

Thanks for reading.

[PowerShell] From 20 minutes to 20 seconds – Change domain user password on multiple domains without logging in to each one

I’m working in an environment where I have to deal with multiple domains.
The user's password needs to be updated every 40/45 days (it depends on the domain).

Can you see the pain?

This means that every month and a half I have to dedicate about 20 minutes to changing my password on 10 different domains, logging in to a host that belongs to each domain to be able to change it.

Because I wanted a faster way to do this and I'm not proficient with AD, I asked Jaap Brasser (b | t) for help. He pointed me to a blog post he wrote some time ago called Active Directory Friday: Change a user's password.

That code resets the password rather than changing/updating it. Because I don't have permission to do a reset, that method won't work for me. Also, I don't have the AD module with all the *-AD* cmdlets installed on my machine.

Despite this, his code gave me some clues about how it could work. I did some more research and finally found a way to do it from a single machine.

Disclaimer: For this script to work, you need to be able to query LDAP on the different domains. Some things may differ between my environment's configuration and yours. Example: in my environment I had to use the IP of the DC instead of its name.

When you run the script you will be prompted for your current/old credentials and the new password.

$oldCredential = Get-Credential -Message "Enter domain, user name and old password"
$NewPassword = Read-Host -AsSecureString -Prompt 'Enter new password'

# Get the domain part from the credential - in my case I had to use the IP of the DC
$DC = $oldCredential.UserName.Split('\')[0]
$userName = $oldCredential.UserName.Split('\')[1]
$DomainEntry = New-Object -TypeName System.DirectoryServices.DirectoryEntry "LDAP://$DC", $oldCredential.UserName, $oldCredential.GetNetworkCredential().Password
$DomainName = $DomainEntry.name

# Search the remote domain for our own account
$Searcher = New-Object -TypeName System.DirectoryServices.DirectorySearcher
$Searcher.Filter = "(samaccountname=$userName)"
$Searcher.SearchRoot = $DomainEntry
$user = [adsi]$Searcher.FindOne().Path

# ChangePassword (unlike a reset) only needs the old password, no special permissions.
# The SecureString is converted back to plain text to pass it to the API.
$user.ChangePassword($oldCredential.GetNetworkCredential().Password, [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($NewPassword)))

With this code you should be able to change your password on a different domain from just one location.

I have updated my password on all domains in less than 20 seconds.

Thanks for reading.

Someone is not following the best practices – dbatools and Pester don’t lie!

This month's T-SQL Tuesday is brought to us by my good friend Rob Sewell (b | t), with the invitation “Let's get all Posh – What are you going to automate today?”

I have written some blog posts on how I use PowerShell to automate mundane tasks and some more complex scenarios, like Find and fix SQL Server databases with empty owner property using dbatools PowerShell module, Have you backed up your SQL Logins today?, or even using the ReportingServicesTools module to deploy reports – SSRS Report Deployment Made Easy – 700 times Faster.

But today I want to bring something a little different. This year, back in May, I saw two presentations from Rob about using Pester to unit test our PowerShell code and also to validate options/infrastructure, like checklists. This got my attention and made me want to play with it!

Therefore, I want to share an example with you using two of my favorite PowerShell modules: dbatools and Pester.

Let’s play a game

You go to a client, or you have just started at a new employer, and you want to know if the entire SQL Server estate complies with best practices.

For the purpose of this blog post, we will check:

  • if our databases (from all instances) have the following configurations:
    • PageVerify -> Checksum
    • AutoShrink -> False
  • if each SQL Server instance:
    • has the MaxMemory setting configured to a value lower than the total existing memory on the host.

How would you do that?

Let me introduce to you – dbatools

For those who don't know, dbatools is a PowerShell module, written by the community, that makes SQL Server administration much easier using PowerShell. Today, the module has more than 260 commands. Go get it (dbatools.io) and try it! If you have any doubts you can join the team in the #dbatools channel on the SQL Server Community Slack.

In this post I will show some of those commands and how they can help us.

Disclaimer: Obviously this is not the only way to accomplish this request, but for me it is an excellent one!

Get-DbaDatabase command

One of the commands in the dbatools Swiss army knife is Get-DbaDatabase.
As stated in the command description:

The Get-DbaDatabase command gets SQL database information for each database that is present in the target instance(s) of SQL Server. If the name of the database is provided, the command will return only the specific database information.

This means that I can run the following piece of PowerShell code and get some information about my databases:

Get-DbaDatabase -SqlServer sql2016 | Format-Table

This returns the following information from all existing databases on this SQL2016 instance.

Too little information

That's true: at a glance it doesn't bring enough information. I can't even see the “PageVerify” and “AutoShrink” properties that I want. But that is because, by default, we only output a handful of properties, which doesn't mean the others are not there.

To confirm this, we can run the same code without the “| Format-Table” part. Format-Table is useful to output the information in table format, but depending on the size of your window it can show more or fewer columns.
By running the command without it, we can see the following (showing just the first 3 databases):

Now we can see more properties available; look at the ones inside the red rectangle.

I still can't see the ones I want

You are right. But, as I said before, that does not mean they aren't there.
To simplify the code, let's assign our output to a variable named $databases and then have a look at all the Members of this object:

$databases = Get-DbaDatabase -SqlServer sql2016
$databases | Get-Member

Now we get a lot of stuff! The Get-Member cmdlet shows us the Properties and Methods of the object (in this case, $databases).

This means that I can use a filter to find results with “auto” in their names:

$databases | Get-Member | Where-Object Name -like *auto*

Some cmdlets have parameters that allow us to filter information without the need to pipe it, so the last command could be written as:

$databases | Get-Member -Name *auto*

Which will output something like this:

So, we have found our “AutoShrink” property. With this in mind, let's query all the properties we want.

$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify

And here we have the result:

Scaling for multiple instances

This is where the fun begins.
We can pass multiple instance names and the command will go through all of them and output a single collection with the data.

$databases = Get-DbaDatabase -SqlServer sql2016, sql2012
$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify

Which outputs:

As you can see, I have passed two different instances, sql2016 (in red) and sql2012 (in green), and the output includes information from both.

Using Out-GridView to filter results

We can use another native PowerShell cmdlet called Out-GridView to show our results in a grid format. This grid also makes it possible to use filters.
For the next example, I have misconfigured two databases so we can find them among the others.

$databases | Select-Object SqlInstance, Name, AutoShrink, PageVerify | Out-GridView

As you can see, inside the red rectangles we have two configurations that do not follow SQL Server best practices. You can also see the green rectangle in the top left corner, where you can type text and the results are filtered as you type. So if you type “true” you end up with just one record.

Checking the MaxMemory configuration

Now that you have seen how to do it for one command, you can start exploring the other ones. As I said at the beginning of this post, we will also check the MaxMemory setting for each instance. We will use Get-DbaMaxMemory. From the help page, the description says:

This command retrieves the SQL Server ‘Max Server Memory’ configuration setting as well as the total physical memory installed on the server.

Let’s run it through our two instances:

Get-DbaMaxMemory -SqlInstance sql2012, sql2016

We can see that the SQL2012 instance is running on a host with 6144MB of total memory but its MaxMemory setting is set to 3072MB, and the SQL2016 instance has 4608MB configured from the 18423MB existing on the host.

Final thoughts on this quick introduction to the dbatools PowerShell module

As you can see, it is pretty easy to run the commands for one or multiple instances and get information to work on. You have also seen different ways to output that information.
I encourage you to use Find-DbaCommand to discover what other commands exist and what they can do for you.

For example, if you want to know which commands work with “memory”, you can run the following code:

Find-DbaCommand -Pattern memory

Automating even more

Using the dbatools module we could verify whether the best practices are in place. But we had to run the commands and then verify the values by filtering and looking at each row.

You may be thinking that there must be some more automated method to accomplish that, right?

Say hello to Pester PowerShell module

Pester is a unit test framework for PowerShell. I like to say: if you can PowerShell it, you can Pester it.

Pester provides a framework for running Unit Tests to execute and validate PowerShell commands. Pester follows a file naming convention for naming tests to be discovered by pester at test time and a simple set of functions that expose a Testing DSL for isolating, running, evaluating and reporting the results of PowerShell commands.

Please see how to install Pester module here.

With this framework, which I really encourage you to read more about on the project Wiki, we can automate our tests and let it do the validations for us!

As a quick example, if we run the following code:

We are checking whether the login returned by whoami is base\claudio.
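The original post shows this test as a screenshot; reconstructed as code (the account name base\claudio comes from the post), it would look something like:

```powershell
# Requires the Pester module (Install-Module Pester)
Describe "whoami" {
    Context "Checking the current logged-on user" {
        It "Should be base\claudio" {
            # whoami returns the current domain\user
            whoami | Should Be 'base\claudio'
        }
    }
}
```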

This returns green, which means it's OK!

If it is not OK (because here I'm testing for “base\claudio.silva”), it will return something like this:

Quick walkthrough on Pester syntax

As you can see, to write a test we need:

  • A Describe block (attention: the “{” must be on the same line!)
    • Inside it, a Context block
      • And, inside the Context block, the validations we want to do, using It and Should.

Let’s join forces

With this in mind, I can create tests for my needs using dbatools and Pester.

I will have a variable ($SQLServers)

$SQLServers = @('sql2012', 'sql2014', 'sql2016')

with all the instances I want to test, and two “Describe” blocks: one for “Testing database options” – PageVerify and AutoShrink:

foreach($Server in $SQLServers) {
   Describe "Testing Database Options for $Server" {
      #Just selecting some columns so it doesn't take too much time returning things we don't want
      $databases = Get-DbaDatabase -SqlServer $Server | Select-Object Name, SqlInstance, CompatibilityLevel, PageVerify, AutoShrink, AutoClose
      foreach($database in $databases) {
         Context "$($database.Name) Validation" {
            It "PageVerify set to Checksum" {
               $database.PageVerify | Should Be "Checksum"
            }
            It "AutoShrink set to False" {
               $database.AutoShrink | Should Be $false
            }
         }
      }
   }
}

And another one for “Testing instance MaxMemory”:

Describe "Testing Instance MaxMemory" {
   foreach($Server in $SQLServers) {
      $instanceMemory = Get-DbaMaxMemory -SqlInstance $Server
      Context "Checking MaxMemory value" {
         It "$($Server) instance MaxMemory value $($instanceMemory.SqlMaxMb) is less than host total memory $($instanceMemory.TotalMB)" {
            $instanceMemory.SqlMaxMb | Should BeLessThan $instanceMemory.TotalMB
         }
      }
   }
}

To run these tests, we should save them in a file whose name ends with “.Tests.ps1”. Let's save it as “SQLServerBestPractices.Tests.ps1”. To run the tests we use Invoke-Pester with the file that contains them:

Invoke-Pester .\SQLServerBestPractices.Tests.ps1

Too much noise – I can't find the failed tests easily

You are right: showing all the greens makes it easy to lose the possible reds. But Pester has an option to show just the failed tests:

Invoke-Pester .\SQLServerBestPractices.Tests.ps1 -Show Failed

But be aware that -Show Fails can be a better option, especially when you are working with multiple Tests.ps1 files.

This way you can see where your errors come from.

Reading and fixing the errors

As you can read in the last image, from the -Show Failed execution, the database “dbft” on the “SQL2016” instance has the “AutoShrink” property set to “True” but we expect the value “False”. Now you can go to the database properties and change this value!

Also, the “PageVerify” value that we expect to be “Checksum” is “TornPageDetection” for the database “dumpsterfire4” on the “SQL2016” instance.

Finally, the MaxMemory configuration on the “SQL2016” instance is set to 46080MB (45GB), but we expect it to be less than 18432MB (18GB), which is the total memory of the host. We need to reconfigure this value too.

This is great!

Yes it is! Now, when a new database is born on an existing instance, or you add a new instance to the list, you can simply rerun the tests and the new stuff will be included in this set of tests!

If you set the tests to run daily, or even once per week, you can check your estate and catch new stuff that wasn't set up by you and may not be following the best practices.

Get the fails and email them (I will blog about it).
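That last idea can be sketched with Invoke-Pester's -PassThru switch, which returns a results object instead of only console output (the file name is the one from this post; the actual emailing is left out):

```powershell
# Run the tests silently and capture the results object
$results = Invoke-Pester .\SQLServerBestPractices.Tests.ps1 -PassThru -Show None

# Keep only the failed tests - these are the ones worth reporting/emailing
$failed = $results.TestResult | Where-Object { -not $_.Passed }
$failed | Select-Object Describe, Context, Name, FailureMessage
```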

Next steps

  • Explore Pester syntax.
  • Add new instances.
  • Add new tests
    • Check if you have access to the instance (great way to know quickly if some instance is stopped)
    • Check if your backups are running with success and within your policy's time interval
    • Check if your data files are set to grow by a fixed value and not by percent, and also if that fixed value is more than X MB
    • Want to test your last backup? Or something completely different, like Rob's Pester for Presentations – Ensuring it goes ok?

You name it!

Want more?

I hope you have seen some new stuff and gotten some ideas from this blog post!

If you want to know if there will be a dbatools presentation near you, visit our presentations page. You can find some of our presentations on our YouTube channel and code examples in the community presentations on GitHub.

For more Pester examples and use cases, see the Articles and other resources page maintained by the Pester team.

I'm looking forward to reading the other blog posts for this month's T-SQL Tuesday (follow the comments on Rob's post, or the round-up later) and seeing what people are doing with PowerShell.

Thanks for reading.

“Invalid class [0x80041010]” error when trying to access SQLServer’s WMI classes

I was using the open source PowerShell module dbatools (GitHub repository) to get the list of SQL Server services on a bunch of hosts so I could confirm that they are in the “running” state.

— Quick note —
For those who don't know, dbatools is a module, written by the community, that makes SQL Server administration much easier using PowerShell. Today, the module has more than 260 commands. Go get it and try it! If you have any doubts you can join the team in the #dbatools channel on the SQL Server Community Slack.
— Quick note —

To accomplish this, I'm using Get-DbaSqlService, initially written by Klaas Vandenberghe (b | t).

This command is very handy, as it tries different ways to connect to the host and we don't need to do anything extra. Also, it has a -Credential parameter, so we can use it to connect to hosts in different domains (I have 10 different credentials, one per domain).

Everything was running fine, for the first couple of hosts, until…

I got the following message when running on a specific host:

WARNING: Get-DbaSqlService – No ComputerManagement Namespace on HOST001. Please note that this function is available from SQL 2005 up.

Trying to get more information, I executed the same command but added the -Verbose switch.

From all the blue lines, I spotted this:

VERBOSE: [Get-DbaCmObject][12:23:31] [HOST001] Retrieving Management Information
VERBOSE: [Get-DbaCmObject][12:23:31] [HOST001] Accessing computer using Cim over WinRM
VERBOSE: [Get-DbaCmObject][12:23:47] [HOST001] Accessing computer using Cim over WinRM – Failed!
VERBOSE: [Get-DbaCmObject][12:23:47] [HOST001] Accessing computer using Cim over DCOM
VERBOSE: [Get-DbaCmObject][12:23:48] [HOST001] Accessing computer using Cim over DCOM – Success!

Ok, this means that for this specific host I can't connect via WinRM (using WSMan), but I can when using the DCOM protocol. However, the WMI query used to get the list of SQL services still fails.

Going further

I opened the Get-DbaSqlService.ps1 script and spotted where the warning message comes from. Then, I copied the code to a new window in order to isolate it and run some more tests.

The code is:

$sessionoption = New-CimSessionOption -Protocol DCOM
$CIMsession = New-CimSession -ComputerName $Computer -SessionOption $sessionoption -ErrorAction SilentlyContinue -Credential $Credential
#I have skipped an if ( $CIMSession ) that is here because we know that works.
$namespace = Get-CimInstance -CimSession $CIMsession -NameSpace root\Microsoft\SQLServer -ClassName "__NAMESPACE" -Filter "Name Like 'ComputerManagement%'" -ErrorAction SilentlyContinue |Where-Object {(Get-CimInstance -CimSession $CIMsession -Namespace $("root\Microsoft\SQLServer\" + $_.Name) -Query "SELECT * FROM SqlService" -ErrorAction SilentlyContinue).count -gt 0}

I split the last command to remove the pipeline, since I wanted to analyze each part of the code. I ended up with the following:

$sessionoption = New-CimSessionOption -Protocol DCOM
$CIMsession = New-CimSession -ComputerName "HOST001" -SessionOption $sessionoption -ErrorAction Continue -Credential $Credentials -Verbose

Get-CimInstance -CimSession $CIMsession -NameSpace root\Microsoft\SQLServer -Query "Select * FROM __NAMESPACE WHERE Name Like 'ComputerManagement%'"
#This one is commented out for now
#Get-CimInstance -CimSession $CIMsession -Namespace $("root\Microsoft\SQLServer\ComputerManagement10") -Query "SELECT * FROM SqlService"

This can return more than one row, with different ComputerManagement entries (like ComputerManagement10), depending on the versions you have installed on the host. The number “10” refers to SQL Server 2008.
Now I can uncomment the last command and run it. The result is:

Get-CimInstance : Invalid class
At line:1 char:1
+ Get-CimInstance -CimSession $CIMsession -Namespace $(“root\Microsoft\SQLServer\C …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : MetadataError: (:) [Get-CimInstance], CimException
+ FullyQualifiedErrorId : HRESULT 0x80041010,Microsoft.Management.Infrastructure.CimCmdlets.GetCimInstanceCommand
+ PSComputerName : HOST001

Ok, a different error message. Let's dig into it. I logged in on the host and confirmed that I have a SQL Server 2008 R2 instance installed. This means that I'm not accessing a version lower than 2005, as the initial warning message was suggesting.

I tried to execute the same query locally, but this time using Get-WmiObject instead of Get-CimInstance (which, in this case, wasn't available because the host only has PowerShell v2.0; it's a Windows Server 2008 SP2, and the CIM cmdlets appeared in v3.0), and it failed with the same error.

Get-WmiObject : Invalid class
At line:1 char:5
+ gwmi <<<< -Namespace “root\Microsoft\SQLServer\ComputerManagement10” -Query “SELECT * FROM SqlService”
+ CategoryInfo : InvalidOperation: (:) [Get-WmiObject], ManagementException
+ FullyQualifiedErrorId : GetWMIManagementException,Microsoft.PowerShell.Commands.GetWmiObjectCommand

I remembered from past experience that SQL Server Configuration Manager relies on WMI classes to show its information, so I tried to open it and got the following error message:

Cannot connect to WMI provider. You do not have permission or the server is unreachable. Note that you can only manage SQL Server 2005 and later servers with SQL Server Configuration Manager.
Invalid class [0x80041010]

Again, that 2005 callout, but… did you recognize the last sentence? It's the same error I was getting with Get-CimInstance remotely and Get-WmiObject locally.

Definitely something is broken.

Let’s fix it!

To fix this problem we need to reinstall the SQL Server WMI provider. To do this we need to run two commands (I found this fix in this post):

  1. Install the classes:
    Go to C:\Program Files (x86)\Microsoft SQL Server\{Version}\Shared (version 110 is SQL Server 2012).
    There you can find a file with the .mof extension, named sqlmgmproviderxpsp2up.mof.
    Now, on the command line, run the following command:
    mofcomp sqlmgmproviderxpsp2up.mof
    The output:
  2. Install the localization info:
    Navigate to the Shared sub-folder that indicates the locale of your SQL Server installation. In my case it is 1033 (English-US).
    Inside that folder you will find a file with the .mfl extension, named sqlmgmprovider.mfl. On the command line, run the following command:
    mofcomp sqlmgmprovider.mfl
    The output:
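Put together, the two steps look something like this from a PowerShell prompt. The 110 and 1033 folders are examples for SQL Server 2012 / English-US; adjust them to your installed version and locale:

```powershell
# Version folder: 110 = SQL Server 2012 (use the one matching your instance)
Set-Location 'C:\Program Files (x86)\Microsoft SQL Server\110\Shared'

# 1. Install the classes
mofcomp .\sqlmgmproviderxpsp2up.mof

# 2. Install the localization info (1033 = English-US locale sub-folder)
Set-Location .\1033
mofcomp .\sqlmgmprovider.mfl
```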

With these two actions, we are done.

Now we can try to open SQL Server Configuration Manager again, and it opens as expected, without error messages!

Let’s go back and rerun our commands.
On the host:


And from the dbatools Get-DbaSqlService command:

No more “invalid class” messages and we get the output we want!

Thanks for reading.

HTTP 403 error – PowerShell Remoting, Different Domains and Proxies

In my day-to-day work I use the Nagios monitoring software. I want to add some custom SQL Server scripts to enrich the monitoring, and to accomplish this I need to:

  • Find a folder
  • Create a sub-folder
  • Copy a bunch of files
  • Edit an .ini file to verify/add new entries

all of this for every single host on my entire estate. Obviously (for me 🙂 ) I decided to use PowerShell!

Hold your horses!

Yes, calm down. I'm working at a client where the network is anything but simple. As far as I know, they have 10 domains and only a few of them have trust configured; even for those that do, the trust does not go both ways… so I didn't expect an easy journey to get the task done.

Side note: For those wondering how I can live without PowerShell: I can't! But the majority of my time using PowerShell is spent with SQL Server, mainly using SMO (with the help of dbatools), which means I hadn't struggled that much until now.

“…WinRM client received an HTTP status code of 403…”

Ok, here we go!

PowerShell Remoting and different domains…

…needs different credentials. This is a requirement when using an IP address.
If we try to run the following code:

$DestinationComputer = ''
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer

we will get the following error message:

Default authentication may be used with an IP address under the following conditions: the transport is HTTPS or the destination is in the TrustedHosts list, and explicit credentials are provided.

First, I added the destination computer to my TrustedHosts list. We can do this in two ways:

Using the Set-Item PowerShell cmdlet:

Set-Item WSMan:\localhost\Client\TrustedHosts ""

Or using the winrm executable:

winrm s winrm/config/client '@{TrustedHosts=""}'

Note: You can use “*” (asterisk) to trust all remote hosts, or just a segment of IPs like “10.10.10.*”.

But there is another requirement, as the error message says: “…and explicit credentials are provided.” This means that we need to provide a credential, and in this case I really want to use a different one, so I modified the script to:

$DestinationComputer = ''
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer -Credential domain2\user1

Now I get prompted for the user's password and I can… get a different error message (*sigh*):

[] Connecting to remote server failed with the following error message : The WinRM client received an HTTP status code of 403 from the remote WS-Management service. For more information, see the about_Remote_Troubleshooting Help topic.
+ CategoryInfo : OpenError: ( [], PSRemotingTransportException
+ FullyQualifiedErrorId : -2144108273,PSSessionStateBroken

This one was new to me, so I jumped to Google and started searching for this error message. Unfortunately, all the references I found were for solving an IIS problem with the SSL checkbox on the website, like this example.

Clearly this is not the problem I was having.


I jumped into the PowerShell Slack (you can ask for an invite here and join more than 3 thousand professionals) and asked for help on the #powershell-help channel.
In the meantime, I continued my search and found something to do with proxies in The dreaded 403 PowerShell Remoting blog post.
This could actually help, but I didn't want to remove the existing proxy settings from the remote machine. I had to find another way to do it.

Returning to Slack, Josh Duffney (b | t) and Daniel Silva (b | t) quickly offered to help, and when I mentioned the blog post on proxies, Daniel showed me the PowerTip PowerShell Remoting and HTTP 403 Error, which I hadn’t found before (don’t ask me why… well, I have an idea: I copied & pasted the whole error message, that’s why).


The answer, for my scenario, is the ProxyAccessType parameter. As it says on the help page, this option “defines the access type for the proxy connection”. There are 5 different options AutoDetect, IEConfig, None, NoProxyServer and WinHttpConfig.

I need to use NoProxyServer to “do not use a proxy server – resolves all host names locally”. Here is the full code:

$DestinationComputer = ''
$option = New-PSSessionOption -ProxyAccessType NoProxyServer
Invoke-Command -ScriptBlock { Get-Service *sql* } -ComputerName $DestinationComputer -Credential domain2\user1 -SessionOption $option

This will:

  • Create a new PowerShell session option (line 2) with the New-PSSessionOption cmdlet, setting -ProxyAccessType to NoProxyServer.
  • Use $option as the value of the -SessionOption parameter on Invoke-Command.

This did the trick! Finally I was able to run code on the remote host.
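If you are going to run several commands against the same host, a variation on the code above (not from the original post, just a sketch using the same $DestinationComputer placeholder) is to keep a persistent session with those same options instead of reconnecting on every Invoke-Command:

```powershell
# Reuse the same session options in a persistent PSSession
$option = New-PSSessionOption -ProxyAccessType NoProxyServer
$session = New-PSSession -ComputerName $DestinationComputer -Credential domain2\user1 -SessionOption $option

# Run as many commands as needed over the one connection
Invoke-Command -Session $session -ScriptBlock { Get-Service *sql* }

# Clean up when done
Remove-PSSession $session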

Thanks for reading.

Offline Microsoft Documentation? Download it!

On my last article I shared how we can now Contribute to Microsoft Documentation. Today I bring another quick tip on Microsoft Documentation!

Download Microsoft Documentation

Did you know that we can download PDF files with Microsoft Documentation?

I did not know either, until a colleague called my attention to it a few days ago.

Important note: This tip is not (yet?) available for all Microsoft’s product suite. You should confirm if this tip applies to the product you need.

“Which documentation?”

The one we can find at docs.microsoft.com.

Here is why this can be useful

Nowadays some of us have access to the internet almost 100% of the time, which helps us forget that it can fail. You have probably gone through this: losing internet access right when you needed to check a document. You know it can happen.

If it happens, you get stuck because you can’t access a piece of text (small or not) that you could have backed up beforehand but didn’t, right?

Were you using the online documentation to understand what a specific field of a system Dynamic Management View (DMV) means? Or which parameter you need to pass to execute a specific system stored procedure?

If you get the PDF, you can continue working offline. Going on a flight? Will you be somewhere with no internet access at all?

I think you get the point.

“I will give it a try, show me how”

The link is located on the bottom left of the page.


This will not download just the current page. By using the “Download PDF” link you get all the content that is present in the tree view under the “filter” box on the left of the page.


Script to download all existing PDF files

From my search, at least 98 PDF documents (~66 MB) exist just for the Relational Databases topic. Downloading them all is not the kind of work I would like to do manually.

PowerShell to the rescue

I wrote a PowerShell script that makes things a little bit easier.

With this script, you can download all files for a specific topic. You can find and download the script Get-MSDocs from my GitHub repository, just change the variables and run it.

Let’s see an example

You search for ‘sys.dm_exec_sessions’ DMV and you find the corresponding page from Microsoft documentation -> sys.dm_exec_sessions

The image below shows where you find the topic (highlighted in yellow) that you need to setup on the $topic variable on the script.


By setting the variable $topic = "relational-databases" this script will download all PDF files for that main topic. I have accomplished that by understanding the sql-docs GitHub repository nomenclature.

Each folder in there is the name of one PDF file, plus one for the current folder (‘Relational-Database’ in this scenario).

Next, choose the destination by setting it on the $outputFolder variable.

As an example, for the SQL docs you have to choose a folder from the Docs root of the GitHub repository.

If you find any difficulty working with it let me know by writing a comment to this blog post.

Let’s say you want to do the same but for Azure – you need to change the URLs too. The script currently points to ‘SQL.sql-content’; for Azure it is ‘Azure.azure-documents’. The way I found this was by clicking “Download PDF” on one of the pages and reading the URL of the PDF.
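To make the idea concrete, here is a minimal sketch of the download loop the steps above describe. The URL pattern is a placeholder (the real one is in the Get-MSDocs script on GitHub, and varies per product, as explained above); only $topic, $outputFolder and the Invoke-WebRequest call reflect the actual approach.

```powershell
# Sketch only: $pdfNames would come from the folder names in the docs GitHub repository
$topic = "relational-databases"
$outputFolder = "C:\MSDocs\$topic"
$pdfNames = @("database-engine", "backup-restore")   # hypothetical examples

# Make sure the destination folder exists
New-Item -ItemType Directory -Path $outputFolder -Force | Out-Null

foreach ($name in $pdfNames) {
    # Placeholder URL pattern - take the real one from a "Download PDF" link
    $url = "https://docs.microsoft.com/pdfstore/$topic/$name.pdf"
    Invoke-WebRequest -Uri $url -OutFile (Join-Path $outputFolder "$name.pdf")
}
```

Changing $topic (and the base URL, for other products such as Azure) is all that should be needed to reuse the same loop.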

Wrap up:

I have shown how you can download a copy of the documentation manually but also how to get all existing files for a specific topic.

Also, I explained that this is not available for every Microsoft product. For example, the PowerShell docs don’t have the link to download a PDF file on the docs.microsoft.com site.

Maybe in the future this will become the standard for every Microsoft product’s documentation.


Thanks for reading.