dbatools v1.0? It’s available – Check it out!

Dear reader, before continuing, please open a PowerShell console and run the following command:

 
Install-Module -Name dbatools

If you are doing this on the date of this blog post, you have just installed dbatools v1.0!

After more than 200 commits, the work of more than 20 contributors and 20 days since the last published version, dbatools v1.0 is live!

To all of you who have contributed directly or indirectly to the module, a big thank you!

Fun fact

I’m sure this was one of the longest periods without a new release since we started shipping more often.
Until now, the bare minimum has been one release per week 🙂

But there are good reasons for it! v1.0 brings:
– Standardised code – parameter names and output
– Code cleanup
– More tests
– Azure connections supported
– And, of course, fixes and 13 new commands.

You can read the v1.0 change log for more details.

New commands

Of the 13 new commands, I decided to share the ones that make it possible to interact with server/database roles.

Here is the list of the newest commands:
Add-DbaDbRoleMember
Get-DbaDbRole
New-DbaDbRole
New-DbaInstanceRole
Remove-DbaDbRole
Remove-DbaDbRoleMember
Remove-DbaInstanceRole

Note: Database Application Roles are not covered yet.
Note 2: A new command to add logins to one (or more) server roles is being cooked.
This is why we release often: improvements and new features are always in the pipeline.

Code examples

Here is a short script that shows how you can leverage these new commands.
Don’t forget to use Get-Help or visit our docs page to learn more about the commands.
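
For example, a minimal way to start (any of the new command names above will work here):

# Show the detailed help for one of the new role commands
Get-Help Add-DbaDbRoleMember -Detailed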

$instance = "sql2016"
$login = "domain\user"
$newServerRole = "securityMaster"
$defaultExistingServerRole = "sysadmin"

$database = "db1"
$username = $login
$newDatabaseRole = "SPExecutor"

### Create

# Create new login and set default database
New-DbaLogin -SqlInstance $instance -Login $login -DefaultDatabase $database

# Create new server role
New-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole

# Create new database user
New-DbaDbUser -SqlInstance $instance -Database $database -Username $username -Login $login

# Create new database role
New-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Add new user to the newly created database role
Add-DbaDbRoleMember -SqlInstance $instance -Database $database -User $username -Role $newDatabaseRole


### Now using Get-Dba*Role*

# Get all members of a role (or list of roles)
Get-DbaInstanceRoleMember -SqlInstance $instance -ServerRole $defaultExistingServerRole | Format-Table -AutoSize


# Get newly created server role 'securityMaster' and default existing role 'sysadmin'
Get-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole, $defaultExistingServerRole


### Database level

# Get newly created 'SPExecutor' role
Get-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Get all users that are members of a role (or list of roles)
Get-DbaDbRoleMember -SqlInstance $instance -Database $database -Role $newDatabaseRole


### Clean up

# Remove user from database role
Remove-DbaDbRoleMember -SqlInstance $instance -Database $database -Role $newDatabaseRole -User $username

# Remove role from database
Remove-DbaDbRole -SqlInstance $instance -Database $database -Role $newDatabaseRole

# Remove server role from instance
Remove-DbaInstanceRole -SqlInstance $instance -ServerRole $newServerRole

Hope you found them as useful as we did!

Wrap up

Even though this is a milestone for us, we will keep working on the module bringing more awesomeness to it!

We want to hear from you!
If you have questions, suggestions, requests, or you just want to give a shout-out to the team, you can:
– Request a feature or report a bug
– Join the #dbatools channel on the SQL Community Slack
– Find us on Twitter – @psdbatools

Other useful links:
– Website and blog
– Command list and documentation

To celebrate the launch of v1.0 we have a lot of blog posts related to it!

dbatools 1.0 has arrived by Chrissy
dbatools 1.0 – the tools to break down the barriers – Shane O’Neill
dbatools 1.0 is here and why you should care – Ben Miller
dbatools 1.0 and beyond – Joshua Corrick
dbatools 1.0 – Dusty R
Your DBA Toolbox Just Got a Refresh – dbatools v1.0 is Officially Available!!! – Garry Bargsley
updating sql server instances using dbatools 1.0 – Gareth N

Enjoy dbatools v1.0!

Thanks for reading!

“Oops… I have deleted some data. Can you put it back?” – dbatools to the rescue

A few days ago I received a request to restore a dozen tables because someone had deleted more data than they were supposed to.
I immediately thought about dbatools for the job!

NOTE: I also thought about the SSMS “Import/Export Data” wizard. That is OK when someone says “it’s to run just once, just now”, but when you have been in IT for a while you know it’s never only once :-). And what if I need to re-run it? Or save it as an example and share it with my colleagues? Doing this with a PowerShell script makes it much more flexible.

In this post, I’m assuming that we have already restored what will be our source database, using the backup requested by the client.

Let’s start – Finding the right command

I knew we had a Copy command for table data.

To search for commands within dbatools we can use Find-DbaCommand.

Find-DbaCommand -Pattern Copy*Table*Data

Note: As you can see, we can use wildcards when searching with the -Pattern parameter.

From the results list, we know that we have at least three commands (as of v0.9.824) that match the specified pattern. Reading the Synopsis, the first command seems to be what we need to accomplish the work.

Get familiar with the command

Don’t forget to use the Get-Help cmdlet to find the available parameters and get some examples of how you can use the command.

Get-Help Copy-DbaDbTableData -Detailed

Let’s try the command and copy the data

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
}
Copy-DbaDbTableData @params

Result

WARNING: [14:26:06][Copy-DbaDbTableData] Table [dbo].[Table] cannot be found in db2. Use -AutoCreateTable to automatically create the table on the destination.

Wait a moment…it yielded a warning?

Oh, I see… this will not work if the table does not exist in the destination database.

At this point I realised that some of the requested tables had been dropped. The client did not mention that.

But dbatools has you covered, and the warning gives you a hint: use -AutoCreateTable, which will, per its description,

Creates the destination table if it does not already exist, based off of the “Export…” script of the source table.

That is nice!

I added the -AutoCreateTable parameter and reran the command:

$params = @{
    SqlInstance = 'sql1'
    Destination = 'sql2'
    Database = 'db1'
    DestinationDatabase = 'db2'
    Table = '[dbo].[Table]'
    DestinationTable = '[dbo].[Table]'
    AutoCreateTable = $true
}
Copy-DbaDbTableData @params

And now it has completed successfully.

At this point you may be thinking…”everything is done!”

The client just said that they want to replace all the data from the last backup for a dozen tables. In this case, some of the tables did not exist anymore (they had been dropped), so adding the -AutoCreateTable parameter helped.

And we are done! Aren’t we?

Not so fast!

If the table doesn’t have any indexes, triggers, etc., or you just want the table skeleton, then yes, the work here is done!
Otherwise, we are not finished yet. And that was my scenario.

The primary key was missing, and so were some constraints and indexes.

Beyond default scripting options

As I demonstrated in my last post, Scripting SQL Server objects with dbatools – Beyond default options, we can generate a ScriptingOptions object and use the -ScriptingOptionsObject parameter available on the Export-DbaScript command to specify more than the defaults.

Exploring other options

As stated in the previous post, we have some properties like:

  • Indexes
  • DriPrimaryKey
  • DriForeignKeys
  • etc

By default they are set to $false, which explains why our -AutoCreateTable parameter (which uses the default options) didn’t bring all of these details from our source tables.
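
A quick way to confirm those defaults yourself (a minimal sketch using properties from the list above):

# Inspect the default values of a few scripting options
$options = New-DbaScriptingOption
$options.Indexes          # False by default
$options.DriPrimaryKey    # False by default
$options.DriForeignKeys   # False by default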

This means that with a little bit more code we will be able to accomplish our requirements.

The code

Note: This is a modified version (to meet my needs) of the original script that I borrowed from Andy Levy’s (b | t) blog post Copying Individual Tables with dbatools. Andy is also a dbatools contributor.

$SourceServer = "SQL1";
$DestinationServer = "SQL2"
$SourceDB = "srcDB";
$DestinationDB = "dstDB";
$tables = "Table1", 'Table2', "Table3";

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriUniqueKeys = $true
$options.DriClustered = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true

$tables | ForEach-Object {
    # Get the table definition from the source
    [string]$tableScript = Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru;

    if (-not [string]::IsNullOrEmpty($tableScript)) {
        if ($null -eq (Get-DbaDbTable -SqlInstance $DestinationServer -Database $DestinationDB -Table $_)) {
            # Run the script to create the table in the destination
            Invoke-DbaQuery -Query $tableScript -SqlInstance $DestinationServer -Database $DestinationDB;
        }
        else {
            Write-Warning "Table $_ already exists in destination database. Will continue and copy the data."
        }

        # Copy the data
        Copy-DbaDbTableData -SqlInstance $SourceServer -Destination $DestinationServer -Database $SourceDB -DestinationDatabase $DestinationDB -KeepIdentity -Truncate -Table $_ -DestinationTable $_;
    }
    else {
        Write-Warning "Table $_ does not exist in source database."
    }
}

This will do the following steps:

  • New-DbaScriptingOption creates a new ScriptingOptions object, on which we then set a bunch of properties to $true.

For each of the tables that we define in our $tables list variable:

  • Export-DbaScript will generate the T-SQL script for the source table using the properties that we have defined. In this case: keys (primary, foreign, unique), default and check constraints, and also clustered and non-clustered indexes.

  • Invoke-DbaQuery will run the generated script on the destination database, but only if the table does not exist there. At this point we have the same table structure on both sides.

  • Finally, we use the Copy-DbaDbTableData command to copy the records. Here I have chosen to truncate the destination table with the -Truncate parameter and to keep identity values by specifying -KeepIdentity.

Special callout:

If these tables have relationships between them, you need to specify the table list in a specific order to make sure that parent tables are created before child tables; otherwise you will get, as expected, some errors.
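
For example, a minimal sketch with hypothetical table names, where the child tables reference their parents:

# Hypothetical ordering: parent tables first so the scripted foreign keys can be created
$tables = "Customers", "Orders", "OrderLines"   # Orders references Customers; OrderLines references Orders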

Final thoughts

dbatools has, literally, hundreds of commands so it’s important to know how to find the right one for the job.
Make sure you take a look at the help to understand what is and is not covered by the process. As we saw in this example, assuming that the table would be created is a pitfall. For that, we need to use another parameter, and even then, reading the parameter description tells us that the table will not be created with its full definition. This means that the primary key, constraints, indexes, etc. will not be created by default.

Because we are using PowerShell, and there is more than one way to accomplish a task, combining these commands with New-DbaScriptingOption makes the process more flexible and makes it possible to reach the final result that we need.

Thanks for reading!

Scripting SQL Server objects with dbatools – Beyond default options

You have probably needed to script out some objects from a SQL Server instance/database, and this is quite easy. You just need to right-click on the object (well… not on every single one; try it with an Availability Group :-) and you will find there is no script option available), select “Script XXXX as” and you have it.

But have you realized that this option doesn’t bring everything?
Let’s say you are scripting a table that has a non-clustered index or a trigger… using this option, some of the objects under the table will not be scripted out. I understand why: each one of these is a different object, which means you can go to each one, right-click, and script just that one.

SSMS – “Generate Scripts…” option

This is tedious work and you can easily miss some objects. What can we do in SSMS to make this task easier?

You can accomplish this task by using the “Generate Scripts…” option under “Tasks” when you right-click on the database:

This will open the wizard, and at the “Set Scripting Options” step you can just click the “Advanced” button to change the properties.
Here you can see that some defaults are in place, and “Script Indexes” is one of them.

This is much easier, right? All in one single scripting operation.

But what about automating this task, for example when you want to script out multiple objects from different instances/databases?

Enter dbatools’ “Export-” commands

To search for commands within dbatools we can use Find-DbaCommand.

Find-DbaCommand -Tag Export

This command gives us a nice Synopsis text that helps us find which command is the one we want. From the output we can see that we have (as of v0.9.824) 5 commands tagged as Export.

To replicate our SSMS example using PowerShell, we will use Export-DbaScript.

Don’t forget to use the Get-Help cmdlet to find the available parameters and get some examples of how you can use the command.

Get-Help Export-DbaScript -Detailed

Tip: Try other switches like -Examples or even -ShowWindow (the latter won’t work on PS Core, but dbatools does!).
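
For instance:

# Show only the examples section of the help
Get-Help Export-DbaScript -Examples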

Example using our “MyTable” object

Here is what MyTable looks like:

  • 3 Columns
  • 1 Default constraint
  • 1 Non-Clustered Index

Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru

Note: I’m using the -Passthru parameter to output the script to the console; by default it will create a SQL file.

The output of this execution is even more incomplete when compared with SSMS. Here, we don’t even get the default constraint scripted.

Using “New-DbaScriptingOption” command

dbatools has a command that makes it possible to create an object of type ScriptingOptions.
Then we can change the properties just like we did before with the “Generate Scripts…” option in SSMS.

$options = New-DbaScriptingOption
$options | Get-Member

Use Get-Member so you can see what properties the object offers.

Here we start seeing what we need.

What are the default values of properties like NonClusteredIndexes and DriDefaults?

$options = New-DbaScriptingOption
$options.NonClusteredIndexes
$options.DriDefaults

False! That explains why they are “missing” from our default Export-DbaScript output.

NOTE: The Export-DbaUser command can also leverage this object. Try it.
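
A minimal sketch of that idea (assuming Export-DbaUser accepts the same -ScriptingOptionsObject and -Passthru parameters used here with Export-DbaScript):

# Reuse the same scripting options object when exporting users
Export-DbaUser -SqlInstance SQL1 -Database DB1 -ScriptingOptionsObject $options -Passthru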

Let’s change these options to $true, pass our $options object as the value of the -ScriptingOptionsObject parameter, and run the command again.

$options = New-DbaScriptingOption
$options.NonClusteredIndexes = $true
$options.DriDefaults = $true
Get-DbaDbTable -SqlInstance SQL1 -Database DB1 -Table MyTable | Export-DbaScript -Passthru -ScriptingOptionsObject $options

Nice! Now we can see everything.

Try it yourself

See the other options available, change the values, rerun and analyse the output.
Do you need to export it to run on a lower SQL Server version? Change the TargetServerVersion option:

# Will script the code for SQL Server 2014
$options.TargetServerVersion = "Version120"

Do you want to include the “IF NOT EXISTS” statement? Change the “IncludeIfNotExists” option.

Here is an example to script out a list of tables (add more to the $TableName variable):

$SourceServer = "SQL1";
$SourceDB = "DB1";
$TableName = "MyTable", 'YourTable';

$options = New-DbaScriptingOption

$options.DriPrimaryKey = $true
$options.DriForeignKeys = $true
$options.DriNonClustered = $true
$options.DriChecks = $true
$options.DriDefaults = $true
$options.Indexes = $true
$options.IncludeIfNotExists = $true

$TableName | Foreach-Object {
    Get-DbaDbTable -SqlInstance $SourceServer -Database $SourceDB -Table $_ | Export-DbaScript -ScriptingOptionsObject $options -Passthru;
}

Availability Groups example using dbatools

Try it yourself:

Get-DbaAvailabilityGroup -SqlInstance SQL1 -AvailabilityGroup SQL1_AG1 | Export-DbaScript -Passthru

Summary

We have seen how we can leverage some dbatools commands to generate T-SQL scripts from objects. This way we can version them or just run them on another instance/database. We have seen what the default options do (and don’t) bring, and how to add more options.
We also saw that, for some objects (Availability Groups, for example), the script option is not available in SSMS, but dbatools can help you here too.

Thanks for reading!

Access is denied – Using the DTCPing utility between two Windows Server 2016 machines

A few days ago a client requested the configuration of MSDTC (Microsoft Distributed Transaction Coordinator).

NOTE: If you want to know more about it, here is a nice FAQ from the Microsoft blogs – MSDTC Recommendations on SQL Failover Cluster?

The client has two machines: an application server and a database server.

Both run Windows Server 2016, and the database server runs SQL Server 2016 using the Availability Groups feature (where their databases reside).

This seems normal… but actually SQL Server 2016 SP2 is the first version that provides full support for distributed transactions in availability groups.
For more info, take a look at the Transactions – availability groups and database mirroring help page.

Configuration

To configure MSDTC correctly, you can/should follow all the checklists in How to cluster the DTC service for an Always On availability group.

“OK, but you mentioned the ‘Access is denied’ error in the title” – here is the story behind it

To test and/or troubleshoot whether the configuration of MSDTC is correct, you can rely on two main utilities:
DTCTester – Tests the transactions between two computers if SQL Server is installed on one computer, using ODBC to verify transaction support against a SQL Server database.
DTCPing – Tests the transaction support between two computers without testing SQL Server duties. The DTCPing tool must be run on both the client and the server computer. Read more in Troubleshooting MSDTC issues with the DTCPing tool.

The client requested a test with the DTCPing utility. After hitting the “The RPC server is unavailable” error, which can be overcome by opening the correct firewall rules, I was hitting the “Access is Denied” error.
I read, once again, the troubleshooting post, but the explanation/resolution for this error did not fit my configuration (remember, the application server is a Windows Server 2016, not a “client OS” (AKA Windows 7/8/10) as mentioned in the post).
I tried my google-fu to find more answers, but… nothing. Every single response where people solved their issue fit the troubleshooting post.
I talked with my colleague from the firewall team just to double-check that the traffic was not being blocked at all. It was OK, everything was getting through… so it had to be something different.

When nothing else fits, you need to try anything

The documentation (Troubleshooting MSDTC issues with the DTCPing tool) mentions “Windows XP” and “Windows VISTA”, but that article is from 2008. Translated to today, this should apply to Windows 7/8/10; even so, I decided to give it a try and changed the mentioned registry key on the Windows Server 2016 machines.
Guess what?! It worked!!!

In this case, I had to ignore the statement: “This error will only occur if the destination machine is a Windows XP machine or a Windows VISTA machine.”

This blog post documents it so other people who face the same problem know they should give it a try.

Final thoughts

When nothing else seems to work and your notes say a fix only applies to specific versions/scenarios, sometimes it is worth trying it in your scenario anyway. Assumptions can change over time.

Thanks for reading.

SSMS GUI not working? Try to use T-SQL!

One of the good things about having new clients is that sometimes they have needs that you have never heard of before.
This does not necessarily mean that they are complex. As a matter of fact, they can be really simple… now the question is: are they doable? 🙂

In my experience, this happens mainly for one of two reasons: they have some very specific need, or the way the application is built makes you work with features that you haven’t played with yet.

SQL Server Agent Job Schedules – Scenario

The client approached me and asked “Hey, we have an account that is the owner of our jobs, but we would like to use a different account to change the schedule of the job, mainly the start time: is that possible?”
As I was not sure about it, I jumped into the documentation.

First things first

I double-checked whether the login they mentioned had any permissions on the msdb database. In this case, the login was already part of one of the SQL Server Agent fixed database roles, namely SQLAgentOperatorRole, which has the permissions described here.

If we take a look at the first row of the grid, we can see that a login can change a job schedule if it owns it.

Fair enough, let’s try it

With this information, I was confident that it would work.
To test it, I created a login, added it to the SQLAgentOperatorRole fixed role in the msdb database, and then had to change the schedule owner.

NOTE: We are talking about the schedule owner and not the job owner; these are two different things.

To find out who owns a schedule, we can run the following query:

SELECT name, SUSER_SNAME(owner_sid) AS ScheduleOwner
  FROM msdb.dbo.sysschedules

Then, we need to change the owner to the login we want to use. For this, we can use the sp_update_schedule stored procedure in the msdb database with the following code:

EXEC msdb.dbo.sp_update_schedule 
	@name = 'ScheduleName',
	@owner_login_name = 'NewOwnerLoginName'

Now that the login we want to use to change the schedule is its owner, the client can connect to the instance using SSMS with this login and edit the schedule, right? Unfortunately, no.

Bug in the GUI, or missing detail in the documentation?

I tested in SSMS and the GUI is disabled.

I had SSMS v17.3, which is a little out of date, so I upgraded to v17.9.1, the current GA (General Availability) version, but I got the same behaviour. I also installed the most recent version, v18.0 preview 7 (at the time of this post), but, then again, the same behaviour.

I decided to open bug item 37230619 on SQL Server UserVoice, called “Edit Job Schedule not working when login is the schedule owner”, which you can upvote here if you agree that this should be fixed.

Workaround

Get the schedule id from the list above and you can run the following command (with the login that owns the schedule) to change the schedule properties, in this case the start time, so that it runs at 1 AM.

USE [msdb]
GO
EXEC msdb.dbo.sp_update_schedule @schedule_id=0000, 
		@active_start_time=10000
GO

I agree that the @active_start_time parameter value is not the most intuitive, but if you look closer it follows the ‘hh:mm:ss’ format with the ‘:’ characters removed, expressed as a number.
In this example, ’01:00:00′ translates to 10000.

At least this way it works. The client was happy to have one way to do it.

Bottom line

When the GUI doesn’t work, try to script out the action or find out what runs under the hood and run the command manually. You may get a surprise!

SQL Server Operations Studio and VSCode: The wrong default datetimeoffset format

This post answers the question “You are used to seeing it in the yyyy-MM-dd format, right?” that I raised in my blog post Don’t cutoff yourself with dates in T-SQL – Did you know….

As you could see in that blog post, my screenshots were from VSCode, in this case using the mssql extension, but this also happens in SQL Server Operations Studio.

“But why are my datetimeoffset values in VSCode being shown in that format?” (dd-MMM-yyyy)

The short answer is that it relies on your regional settings.

This means that whatever date settings you have configured when you open VSCode or SQL Operations Studio will be used to display the output of your datetimeoffset columns.

I’m being specific when I say datetimeoffset

The DATETIME and DATETIME2 types always display in the yyyy-MM-dd format, just like in SQL Server Management Studio.
Once again, remember, I’m talking about the default output, not what you get if you use a CAST, CONVERT or FORMAT function to manipulate the results.

How can we fix this?

While looking into this behaviour I did some research and found that a similar problem had been raised, but regarding DATETIME2. You can see it here – issue #570.
With this in mind, I decided to open a new issue (#1139) on the vscode-mssql extension repository on GitHub and point to the other one, already solved.

If you identify with it, please add your vote to the issue.

Now we need to wait and see how it evolves; hopefully a fix will be included in an upcoming release.

Summary

Always try to use an unambiguous date format like “yyyy-MM-dd”.

Remember, copying values that can have a double meaning (an ambiguous format) can lead to unexpected results, as I showed in the other blog post.

And last but not least, if you think something is a bug or a missing feature, please take the time to file an issue. It will help you and others for sure!

Thanks for reading!

Triggers: The hidden logic that will strike back – TSQL Tuesday #106

https://voiceofthedba.com/2018/09/03/t-sql-tuesday-106-trigger-headaches-or-happiness/

This month’s T-SQL Tuesday is brought to us by Steve Jones (b | t), and he wants to know whether triggers cause us headaches or happiness.

This is the 106th edition of TSQL2sDay – an Adam Machanic (b | t) brainchild.

 
Triggers are the kind of database object that is sometimes the best bet, but most of the time they will strike back.

“Why?” – You may ask.

In my experience, triggers are objects that we “tend” to forget are there, and only when we hit a problem, and sometimes only after digging into it, do we end up saying something like:

wait… but why is the value different from what I used?! Ohh… maybe the table has some triggers.

Yes, I have uttered that before. For me, triggers are a hole of possible business logic, written and forgotten… forever.
Let me just clarify the meaning of “forever” here – until you hit a problem, or you need to rewrite some logic and the result is not what you expect. Again, because the code is ‘hidden’.

Recent pain – A project to migrate Firebird SQL to T-SQL

When Steve mentioned “I think I’ve ended up using triggers in 0.01% of my tables or less.”, I remembered this project, and after doing the math I can say they used triggers on ~78.84% and ~63.32% of their tables (there were two databases).

Looking at their code, the type of use they make became obvious: mainly filling one or two columns with the current datetime and/or generating a new ID (because they didn’t use identity columns – supported in v3.0 or higher).

“You never used triggers, is that what you are saying?”

No, it’s not. I have used them before, not on just 0.01% of tables, but maybe on less than 5%.

Let me share a story

Back in the earlier days when I was a full-time T-SQL developer (I think it was 2008/2009), I worked with an application that used triggers to insert data into some tombstone tables that helped us go back and see the data as it was at that time.
Nowadays, we can achieve this with much less code using temporal tables.

One day I hit an issue where the record in the main table wasn’t being inserted. I ran the code, a stored procedure, with some parameters and got back a new ID (we had identity columns), all good. When I ran it with other parameters (the ones not working) I didn’t see the new ID. I went back and forth, seeing the gaps generated by the problematic parameter set, threw some debug messages into the stored procedure, and also added more information to the catch block to understand the problem.

Then, after reading the message carefully, I saw that the problem was in the insert into my tombstone table and not in my “visible” logic. I spent about an hour to get there…

A waste of time?

Or not: from that day on, one of my first questions when I hear that people have problems like this is “are there any triggers on the tables involved?”

With that question alone, I have already saved some hours of work.

Wrap up

As you may understand, I decided to avoid triggers whenever possible, but sometimes we catch the train already in motion and we may need them.
If you find yourself in this last scenario, please, please document them very well!

I’m eager to read others’ stories, because I bet they also have some good points on this interesting topic.

Thanks for reading.