Shortcut: Import and Export Settings in SQL Server Management Studio

Whenever I need to work on a new laptop or server, or whenever I change versions of SQL Server Management Studio, I kick myself for not remembering to export my settings, so I can import them again.

I spend quite a bit of effort getting SSMS configured the way I want, so it only makes sense to save the settings. Saving them isn’t perfect but it’s far better than not having done it.

From the Tools menu, choose Import and Export Settings:

As an example, let’s export the settings. I’ll choose Next:

Notice that it’s not an all-or-nothing export. I can choose which settings, or groups of settings, to export.

In this case, I wanted all of them, so I just need to pick a name and a location:

And next time I change to a different machine or new version, I can just import them and pat myself on the back for remembering.

Opinion: Constant churn breaks community learning for software applications

A current trend that I can’t say I love is constant churn within software applications. I have no interest in going back to the days when we got a new version of SQL Server or Power BI, etc. every few years.

It’s also not a case of “who moved my cheese?”

In fact, I thrive on change. However, I’ve now become really concerned about how anyone:

  • Learns to use a complex application
  • Remembers how to use a complex application when they don’t use it daily

I first really struck this issue with Azure. If I was teaching a class that used Azure, I could check every single lab on Sunday night, and then on Monday morning the students would find it had all changed. That’s OK for an experienced person, but not OK for a learner.

I love the rate of change for Power BI. We’re endlessly getting wonderful new things. But I have to say that every class I teach on it is like a new experience. I’ve got another one this coming Tuesday. I used to look forward to them, but now I have a major hesitation every time, as I wonder which parts of the labs will have broken.

This is now an ongoing challenge for all software of this type though. I helped create some labs for VSTS very recently, and when I look at the product now, it barely resembles the one that I built the labs on.

Is it better? Probably yes.

But even though it might have been a few months ago, it feels like just the other week. And yet, not only has the UI changed, but entire concepts have been added or removed, and the order in which things are done has changed substantially.

I don’t know the answer to this but the current rate of churn is a substantial issue.

I gather the plan with the DevOps guys is to put a set of labs on GitHub, and let the people doing the labs point out the issues day by day as they strike them. Again, for experienced users that might work. But for newcomers, I really wonder what they’ll think.

Will they realize the app must have changed and that’s why it’s all different, or will they just think the product is too hard to use? Either way, they’ll be very frustrated.


And while initially learning the product is one thing, I’m worried about the longer term too. A product like VSTS lets you set up automation, and you’d hope you won’t need to change it constantly. But if every time you go to make a change, you’re struggling to use it like you’re a newbie again, that’s a problem.

Finally, I’m really concerned about ongoing support.

The vast majority of support for software applications today comes from community resources like blogs, webcasts, and so on.

Will those resources continue to be created at the same pace if their authors know they’ll be irrelevant or wrong within a very short time? How will end users learn to do things when none of the online examples they find still work?

I wish I knew the answer to this.

DevOps: Declarative is where we want to be

If you have ever tried to write scripts to configure operating systems, you’ll know just how hard that is to get correct.

For example, if you need to ensure that your virtual machine has two network adapters, and they need to have a specific set of IP addresses, how do you do that?


The traditional approach for PowerShell (and similar tools) was to try to write a step-by-step script to configure the network adapters the way you want. But where do you start? Do you write a script to check for any existing adapters and loop through them to try to remove them? Do you try to add the ones you want, and then remove the others?

You’ll quickly realize that this leads to very, very complex code, because you can’t be sure what your starting point is. Everything has to stay on the path that you prescribed.

And worse, what happens if you run this script more than once?

That’s the problem with imperative code: you are telling the operating system the steps required for configuration.

We don’t want to be doing this.

With a declarative approach, we tell the system how we want it to end up, i.e. the desired state, and let it worry about how to get configured that way. This is what we want to be doing instead of writing imperative code.

PowerShell offers DSC (Desired State Configuration), where you describe, in a PowerShell configuration block, the way you want the system to end up. A bonus of this approach is that it’s idempotent, i.e. no matter how many times you run it, you end up with the same outcome.
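To make the network adapter example concrete, here’s a minimal sketch of what a DSC configuration looks like. The node name, interface alias, and IP address are illustrative, and it assumes the xNetworking resource module (since renamed NetworkingDsc) is installed:

```powershell
# Declare the state we want; the Local Configuration Manager works out
# the steps needed to get there, no matter the starting point.
Configuration ServerNetwork
{
    # xIPAddress is a resource from the xNetworking module (an assumption here;
    # the module must be installed separately).
    Import-DscResource -ModuleName xNetworking

    Node 'SERVER01'
    {
        xIPAddress PrimaryAdapter
        {
            InterfaceAlias = 'Ethernet'
            AddressFamily  = 'IPv4'
            IPAddress      = '192.168.1.10/24'
        }
    }
}

# Running the configuration compiles it to a MOF file, which is then applied.
# Re-running it is safe: if the adapter already matches, nothing changes.
ServerNetwork -OutputPath C:\DscConfigs
Start-DscConfiguration -Path C:\DscConfigs -Wait -Verbose
```

Notice there’s no looping through existing adapters and no cleanup logic: the declaration says what should be true, not how to make it true.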

It’s important wherever possible to be doing declarative configuration not imperative configuration. In later posts, I’ll talk more about how.

Introducing SDU Tools: Free T-SQL Tools for DBAs and Developers

I’ve worked with T-SQL for a very long time, and over the years I’ve lost count of how many tools I’ve found the need to create to help me in my work.

They have lived in a variety of script folders, etc., and whenever I go to use one now, I often have to decide which is the best version of a particular tool, as they’ve also been refined over time.

So I decided to get them into a clean, clear shape, and SDU Tools was born. I’ve grabbed a whole bunch of tools for a start, made sure they’re pretty consistent in the way they do things, and published them as a single script. I figured I might as well also share them with others. They are free (I get you to register so I know what usage they are getting).


For each of the tools, I’ve also created a YouTube video that shows how to use the tool. I’m also planning to create blog posts for each tool so I have the opportunity to show things that won’t fit in a short video and ways of using them that might be less obvious.

I’ve got a big backlog of tools that I’d like to add so I’m intending to add in whichever ones I get reworked during each month. Likely areas in the upcoming months are further functions and procedures related to scripting objects, and also to code generation.

The tools ship as a single T-SQL script, don’t require SQLCLR or anything too clever, and are targeted at SQL Server 2008 and above. They install into a single schema that you can place in whichever database (or databases) suits you best.

Of course, there’s the usual rider that you use them at your own risk. If they don’t do what they should, I’ll be sad and then fix it, but that’s all. :-)

I hope you find them useful.

You can find out more about our free SDU Tools here:

http://sdutools.sqldownunder.com

Enjoy!

Auto-Injection of Enterprise Edition Features

There’s an interesting scenario that’s come up in creating some new samples for SQL Server 2016.

I know that for many, many years, people have asked for a way to limit developer edition to only use standard edition features, or at least to have a way to flag them. I’ve often thought that I’d like a way to set a “target environment” and have the colorization change anything that I can’t use in standard edition to an obviously different color.

However, previously, if you used developer edition to create a database, as long as you didn’t use any enterprise features, you could then backup and restore that database to a standard edition server.

That’s no longer the case with SQL Server 2016.

If you create a temporal table in a database on developer edition, it automatically chooses PAGE compression for the history table, and there is no DDL option to avoid this. The database can then no longer be restored on a standard edition server. To work around it, you need to rebuild the history table without compression after it’s created.
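As a sketch of the scenario (table and column names here are just illustrative), creating a temporal table and then rebuilding its history table without compression looks like this:

```sql
-- Creating a system-versioned temporal table on developer edition.
-- The history table it creates is automatically PAGE compressed.
CREATE TABLE dbo.Customers
(
    CustomerID int NOT NULL PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL,
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomersHistory));

-- Workaround: detach the history table, rebuild it without compression,
-- then re-attach it, so the database can be restored on standard edition.
ALTER TABLE dbo.Customers SET (SYSTEM_VERSIONING = OFF);

ALTER TABLE dbo.CustomersHistory
REBUILD WITH (DATA_COMPRESSION = NONE);

ALTER TABLE dbo.Customers
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomersHistory));
```

The point is that nothing in the CREATE TABLE statement asked for compression; it’s injected for you, and only the rebuild removes it.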

I see this as quite a change in behavior. I don’t think that features that are only available in enterprise edition (or perhaps in Azure) should be “auto-injected”.

Clearly, if you script out the database and create a new database on standard edition using the script, it will create the history table without compression. But I suspect that many people create DBs as starter DBs using developer edition, and then deploy them to other editions that don’t support compression.

I’d be interested in thoughts on how common this practice currently is.

SQL Down Under Podcast 68–Guest Joe Yong–SQL Server 2016 StretchDB

Hi Folks,

One of the intriguing options that’s coming as part of SQL Server 2016 is StretchDB. I’ve been spending some time working with it and learning about it, and I’ve ended up with lots of questions.

It was great to get an opportunity to speak to Joe Yong to get all my questions sorted.

Hope you’ll enjoy it too. You’ll find it here: http://sqldownunder.azurewebsites.net/Podcasts

R Tools for Visual Studio

In recent months, I’ve been brushing up my R skills. I’ve had a few areas of interest in this:

  • R in Azure Machine Learning
  • R in relation to Power BI and general analytics
  • R embedded (somewhat) in SQL Server 2016

As a client tool, I’ve been using RStudio. It’s been good and very simple but it’s a completely separate environment. So I was excited when I saw there was to be a preview of new R tooling for Visual Studio.

I’ve been using a pre-release version of R Tools for Visual Studio for a short while but I’ve already come to quite like it. It’s great to have this embedded directly within Visual Studio. I can do everything that I used to do in RStudio but really like the level of Intellisense, etc. that I pick up when I’m working in R Tools for Visual Studio.

So today I was pleased to see the announcement that these tools have gone public. You’ll find more info here in today’s post from Shahrokh Mortazavi in the Azure Machine Learning blog: https://blogs.technet.microsoft.com/machinelearning/2016/03/09/announcing-r-tools-for-visual-studio-2/

My Sessions from Ignite Australia on the Gold Coast now online

Hi Folks,

Couldn’t make it to Ignite? The team from Microsoft Australia recorded all the sessions and they are online now.

Here are the three sessions that I delivered:


Azure Datacamp Power Hour:   http://www.sqldownunder.com/links/5

Things I Wish Developers Knew About SQL Server: http://www.sqldownunder.com/links/6

Working With SQL Server Spatial: http://www.sqldownunder.com/links/7


Enjoy!

Data Camp Sydney (Free)

Hi Folks, on the 5th of June (yes, that’s Friday next week), I’m running a Data Camp day for the local Microsoft team. It’s being held at Cliftons in the city.

We’ve got four topics for the day:

  • Azure SQL DB
  • Azure DocumentDB
  • Azure Machine Learning
  • Azure Stream Analytics

If you want to get your head around any or all of these, we’d love to see you there. Places are limited, so you must register; you can do so here: https://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032627085&Culture=en-AU&community=0