Reliably dropping a SQL Server database if it exists

I often need to write scripts that drop and recreate databases. The hard part of that has always been reliably dropping a database if it already exists. You wouldn't think that would be hard, but it is.

Built-in Command

T-SQL has a built-in command for this.
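
Using a hypothetical Sales database, it looks like this:

DROP DATABASE IF EXISTS Sales;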

You'd hope that would work, but it doesn't. I wish it did. The problem is that it will fail if anyone is connected to the DB. And to check if anyone is connected, you first need to check if the DB exists, which makes the whole "IF EXISTS" part that was added to this command completely pointless.

Worse, if you have separate code to kick everyone off first, there's always a chance of a race condition between when you kick everyone off and when you execute the command.

Nearly OK

Years back, the Microsoft docs library said to drop a database like this:
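
It was something along these lines (again with the hypothetical Sales database):

USE master;
GO

IF EXISTS (SELECT 1 FROM sys.databases WHERE [name] = N'Sales')
BEGIN
    ALTER DATABASE Sales SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE Sales;
END;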

This was promising, but unfortunately it has an issue as well. Because you were in the master database when you issued the ALTER, you don't know that you are the single user. So periodically, that would fail too.

Best Workaround

Over the last few days, we've had a discussion on an MVP list about how to work around this. Many thanks to Paul White, Erland Sommarskog, and Simon Sabin for contributing to it.

The best outcome I have right now is to use this:
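
Sketched here with the same hypothetical Sales database (substitute whatever DB name you need to drop):

USE master;
GO

IF EXISTS (SELECT 1 FROM sys.databases WHERE [name] = N'Sales')
BEGIN
    EXEC ('
        USE Sales;
        ALTER DATABASE Sales SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        USE master;
        DROP DATABASE Sales;
    ');
END;
GO

USE tempdb;
GO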

To get the DROP to work properly, you need to execute the ALTER DATABASE from within the target database. That way, you end up being the single user, and even though you then change to master, you hold the required session lock on the DB, so the drop works as expected.

Because you can't have a USE Sales in the script if the Sales DB doesn't exist, this unfortunately has to be done in dynamic SQL code, where it is only executed if the DB does exist.

The last change to tempdb is just protection, in case I have a script that then wants to create the DB and change to using it. If that creation goes wrong, I want to end up creating things in tempdb, not somewhere else like master.

What I wanted

What I've been asking for, for a very long time, is this:
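
That is, syntax along these lines (which doesn't exist today):

DROP DATABASE IF EXISTS Sales WITH ROLLBACK IMMEDIATE;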

The ROLLBACK IMMEDIATE needs to be on the DROP DATABASE command, not on a separate ALTER command. Hopefully one day we'll get this.

Book: Implementing Power BI in the Enterprise

It's been a while coming, but my latest book is now out. Implementing Power BI in the Enterprise is now available in both paperback and eBook. The eBook versions are available in all Amazon stores, and also through most book distributors via Ingram Spark distribution.

I've had a few people ask about DRM-free ePub and PDF versions. While the Kindle version on Amazon has their normal DRM setup, you can purchase the DRM-free version directly from us here:

https://sqldownunder.thrivecart.com/implementing-power-bi-ent-ebook/

It contains both the ePub and PDF versions.

Book Details

Power BI is an amazing tool. It's so easy to get started with and to develop a proof of concept. Enterprises want more than that. They need to create analytics using professional techniques.

There are many ways that you can do this, but in this book I've described how I implement these projects. That approach has served me well for many years over many projects.

If you want a book on building better visualizations in Power BI, this is not the book for you.

Instead, this book will teach you about architecture, identity and security, building a supporting data warehouse, using DevOps and project management tools, and using Azure Data Factory and source control with your projects.

It also describes how I implement projects for clients with differing levels of cloud tolerance, from cloud natives, to cloud friendlies, to cloud conservatives, and to those clients who are not cloud friendly at all.

I also had a few people ask about the table of contents. The chapters are here:

  • Power BI Cloud Implementation Models
  • Other Tools That I Often Use
  • Working with Identity
  • Do you need a Data Warehouse?
  • Implementing the Data Model Schema
  • Implementing the Analytics Schema
  • Using DevOps for Project Management and Deployment
  • Staging, Loading and Transforming Data
  • Implementing ELT and Processing
  • Implementing the Tabular Model
  • Using Advanced Tabular Model Techniques
  • Connecting Power BI and Creating Reports

I hope you enjoy it.

MVP Challenge: Data and AI plus some online exams

If you follow anyone that's part of Microsoft's MVP program, you might have heard there has been a global cloud skills challenge happening lately: #TheMVPChallenge.

There were three challenges that each of us could complete:

  • Azure Data & AI Challenge
  • Dynamics 365/Power Platform Challenge
  • Microsoft 365 Challenge

You can imagine which one I chose to complete. I did the Azure Data and AI challenge. Data and AI are a pretty common grouping.

I would have liked to do both the other challenges as well. Power Platform is obviously interesting to me, but I've looked at Dynamics over the years, and it's just not for me. The Microsoft 365 aspects aren't really my territory either, but it might have been interesting to see what I could have learned if I'd done it, given most of us use those products every single day.

Data and AI Challenge

The name of this challenge was odd. There really weren't any data topics. It was all AI, and I don't mind that.

Some years back, I did the full Microsoft Professional Program for Data Science, and I also did the full Microsoft Professional Program for AI. Those offerings are now gone, and I have to say that I was quite sad when they disappeared. They really covered the topics in good detail. I also work with many of the Cognitive Services regularly, so this wasn't a new area for me.

I was really interested to see what Microsoft Learn now offered as a mechanism for free learning on these topics.

In the challenge, we needed to complete 42 modules. The depth really isn't there any more, but for introductory-level material, what's provided is excellent. And did I mention that it's free?

Did I learn any new things while doing the challenge? Yes, a few. I think that any time you go back over an area, you pick up something that you've missed before. Or perhaps there's something you've forgotten because it didn't seem useful to you at the time, and now you realize that it is quite useful.

I'm already using some of the concepts that I picked up while doing the challenge, even though it was introductory level content.

MVP Community

I was really pleased to see how many of the local (and remote) MVP community took part. It's easy to start these types of challenges, then lose focus and stop.

In particular, I loved the way that our CPM Shiva Ford and other MVPs egged each other on to make sure we completed the challenge.

Exams

There was no requirement to do any exams related to this. I had a real interest, though, in knowing what was in all the "Fundamentals" exams. I wouldn't normally have taken them, as they don't count towards certifications, but I decided to do a few exams during the month.

First I took the Azure Fundamentals exam AZ-900.

I thought the exam was OK: not too difficult, and attainable for most people starting out with Azure. One thing I didn't like was how much time it spent examining whether particular services were Platform as a Service (PaaS) or Infrastructure as a Service (IaaS) offerings. For many services that's straightforward, but for services like storage, there are aspects of both. Regardless, that's not the sort of thing people should be all that focussed on. They should understand the difference, and that's enough.

Next I took the Azure Data Fundamentals exam DP-900. This had a little more substance to it. I thought it was a reasonable exam and covered many areas of data. The balance was a bit different to what I would have hoped for, but still OK. I did find questions, though, that were simply wrong, where the author might have read about a concept but clearly didn't understand it. In hindsight, I should have taken the time to comment on the really problematic questions, but I had a bunch of exams that day, and was already tired from fitting them all in.

Next I took the Azure Administrator exam AZ-104. Now this exam was way more challenging than I expected. I wouldn't say that any of the concepts were all that difficult if you've worked with Azure for any length of time, but the way the questions were phrased made them harder than necessary. I was also surprised by the amount of focus on networking, when you consider all the tasks that an Azure administrator needs to handle.

I was glad to have taken AZ-104 though, as I had previously completed the Designing and Implementing Microsoft DevOps Solutions exam AZ-400. I'd been meaning to take AZ-104 for ages, as it was what I still needed for the Microsoft Certified: DevOps Engineer Expert certification.

Upcoming

Now that I'm back into taking some exams again, I plan to do the other fundamentals exams (AI and Power Platform) soon to see what's in those. Then I'll do the other data-related exams to complete those certifications.

Online Exams

I'll finish this post with a few comments about the online exam mechanism.

I didn't love it but it's workable.

I found the process pretty random. With my first exam, I tried to follow their instructions perfectly. When my turn came, a chat window opened, but the text box where I could type wasn't enabled, so I couldn't respond to the monitor person. When I didn't respond, they told me they had to put me back in the queue. Then that happened yet again. I was starting to think I wasn't going to be able to do the exam.

On the third time round the queue, I was able to type into the text box. The person then told me they couldn't see my video. But my video was displayed on screen in their application, so clearly the application could see me. No idea why the monitor person couldn't. I started to explain that in the chat text box, and then they just suddenly started the exam.

The other bizarre part is that they tell you to put your phone out of reach, and that if you leave the video area, you'll fail. So I put my phone in another room. And then a later screen tells you that if they need to contact you, they'll call you on your phone.

For the second exam, again I went round the queue twice. When the person came online in the chat, he told me I wasn't allowed to wear headphones. The bizarre thing is that I'd thought headphones would be best, and I did the first exam with them on. I took them off and he was happy.

The third exam started OK, but then I got pinged for turning my head. I have a tendency, when sitting and pondering a question, to turn my head, and perhaps even lean back and look up. As soon as I did that, the monitor person warned me that if I did it again, I'd fail.

I'm glad there is a way to do these exams online, but I'd really like to see the experience improved. I hope these notes help you if you haven't done any exams this way before.

How to kill off the Camtasia 2021 Launcher Pop Up

Camtasia is one of my favourite products. I use it regularly. I've been excited to get to use Camtasia 2021, which was released just recently. It's a nice step up from an already great product.

But what I didn't like after upgrading is that every time I started Camtasia, instead of the "normal" editing screen, I got a cutesy little popup that asked me what I wanted to do with the product today.

I'm not a fan of the popup; I'd rather the product just opened into a blank new project like it used to. The popup really just slows me down.

So I asked the TechSmith people on Twitter and they came to my rescue!

Here's the registry key that you need to modify to make this go away:

HKEY_CURRENT_USER\SOFTWARE\TechSmith\Camtasia Studio\21.0\Camtasia Studio\21.0\ShowLauncherAtStartup

If that value is 1, you get the launcher. If you change it to 0, it's gone.

Hope that helps someone.

SQL Interview: #14: Set operations using EXCEPT

This is a post in the SQL Interview series. These aren't trick or gotcha questions, they're just questions designed to scope out a candidate's knowledge around SQL Server and Azure SQL Database.

Section: Development
Level: Medium

Question:

UNION and UNION ALL are commonly used to combine two sets of rows into a single set of rows.

EXCEPT is another set operator.

Can you explain what it does?

Answer:

EXCEPT returns the rows from the first set, after removing any rows that also appear in the second set.

For example, in the code below:
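
A sketch, assuming hypothetical dbo.Customers and dbo.Suppliers tables that each have a TradingName column:

SELECT TradingName
FROM dbo.Customers

EXCEPT

SELECT TradingName
FROM dbo.Suppliers;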

The query returns all the Trading Names for customers unless a supplier also has that same name.

In other database engines (e.g. Oracle), this operator is called MINUS.

Note: For EXCEPT to work, the queries must have the same number of columns, with compatible data types. The values must also be comparable. For strings, that means compatible collations. It also means only comparable data types are supported; spatial data types, for example, are not, as they can't be compared.

SQL: Password complexity rules for Azure SQL

Azure SQL (both Azure SQL Database and Azure SQL Managed Instance) has different password complexity rules to SQL Server. I was reading an email discussion list, and a poster asked where he could find the list of password complexity rules for Azure SQL. I said I'd never seen such a list.

Well, it turns out that there is a list, but not where you might have thought to look. The rules are spelled out in this article:

Identify the right Azure SQL Database SKU for your on-premises database (Data Migration Assistant) – SQL Server | Microsoft Docs

To save you reading the whole article, at the time of writing these were the rules for Azure SQL Database:

  • Your password must be at least 8 characters in length and no more than 128 characters in length.
  • Your password must contain characters from three of the following categories – English uppercase letters, English lowercase letters, numbers (0-9), and non-alphanumeric characters (!, $, #, %, etc.).
  • Your password cannot contain all or part of the login name. (Part of a login name is defined as three or more consecutive alphanumeric characters.)
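
As an example, here's a hypothetical login creation with a password that satisfies those rules: it's long enough, it uses more than three of the character categories, and it contains no three-character run from the login name:

CREATE LOGIN WebAppUser WITH PASSWORD = N'Horse#Battery9Staple';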

Slightly more confusing is that the article says it's talking about the server admin password, but it appears that this same list of restrictions also applies to all login/user passwords.

Azure SQL Managed Instance is exactly the same, except that it requires the password to be at least 16 characters in length.

I hope that helps someone, including myself when I next go looking for this.

SQL Interview: #13: Impact of optimize for adhoc workloads

This is a post in the SQL Interview series. These aren't trick or gotcha questions, they're just questions designed to scope out a candidate's knowledge around SQL Server and Azure SQL Database.

Section: Server Tuning
Level: Advanced

Question:

SQL Server 2008 introduced a server option called optimize for adhoc workloads.

Can you explain the problem that it is designed to solve, what most commonly causes the problem, and what impact this option has when you enable it?

Answer:

On a SQL Server, you can end up with a large amount of memory being taken up by query plans that will likely never be used again.

There are two primary causes of this situation:

  • The less common cause is that you have a large number of adhoc queries being executed as one-off queries.
  • The most common cause is that you are using a framework like LINQ that does not handle data typing properly, causing a large number of query plans to be created for each query. You might also be using lower-level frameworks like ADO.NET incorrectly.

As an example, command objects in ADO.NET have a parameters collection. If you add parameters using the AddWithValue() method, you specify the parameter name and the value, but you do not specify the data type. The problem with this is that the framework then tries to work out the data type from the value.

This means that if you pass a string like 'Hello', then it might guess nvarchar(5) but if you pass a string like 'Hello There', it might guess nvarchar(11).

The lengths of data types are part of the signature for each query plan. This means that you can easily end up with a different query plan for every length of string that has ever been passed to the query. We call this Plan Cache Pollution.
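
If you want to see the pollution, a sketch like this (using the plan cache DMVs, filtered here to a hypothetical Customers query) will show near-identical cache entries that differ only in the declared parameter length, e.g. (@p nvarchar(5)) versus (@p nvarchar(11)):

SELECT cp.usecounts, st.[text]
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.[text] LIKE N'%Customers%'
ORDER BY cp.usecounts DESC;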

The correct way to fix this is to avoid methods like AddWithValue() and instead use a method where you specify the data type. The challenge is that many frameworks like LINQ have this behaviour baked in, and the developer cannot change it.
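
For comparison, here's what a properly typed call looks like when it reaches SQL Server, sketched with sp_executesql and a hypothetical Customers table. Because the parameter is always declared as nvarchar(100), every call shares a single plan, no matter how long the string value is:

EXEC sys.sp_executesql
    N'SELECT CustomerID FROM dbo.Customers WHERE TradingName = @TradingName;',
    N'@TradingName nvarchar(100)',  -- fixed declared length for every call
    @TradingName = N'Hello';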

When the optimize for adhoc workloads option has been set, the first time a query is seen, the hash for the query is stored (so it can be remembered) but the query plan is not stored. The next time the same query is seen, the compiled plan is then stored for reuse.

This avoids the plan cache becoming littered with one-off query plans.
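
If you want to turn it on, it's an advanced sp_configure option:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'optimize for adhoc workloads', 1;
RECONFIGURE;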

Another option taken by some frameworks is to just use varchar(4000) or nvarchar(4000), or worse, nvarchar(max), for all strings. That will of course avoid the plan cache pollution, but it's likely to then cause issues with memory grants in poor-quality plans.

T-SQL 101: #96 Choosing from alternatives with IIF in SQL Server T-SQL

In my last T-SQL 101 post, I described the CASE statement. Until SQL Server 2012, that was the only real option that we had for choosing between alternate values. In SQL Server 2012, Microsoft gave us another option with the IIF function.

The IIF function is very similar to the IF function in Microsoft Excel. It takes three parameters:

  • A boolean value to check (normally this is an expression)
  • A value that will be returned if the first parameter is true.
  • A value that will be returned if the first parameter is false.

In the main image above, you can see a CASE statement and the simplified IIF function equivalent.
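
As a sketch of that equivalence, with a hypothetical dbo.Products table:

SELECT CASE WHEN [Size] = N'Large' THEN 12 ELSE -12 END AS CaseVersion,
       IIF([Size] = N'Large', 12, -12) AS IifVersion
FROM dbo.Products;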

Nesting

Again, similar to how IF works in Excel, you can nest IIF functions. For example, you can write:
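
(Again with hypothetical Size and Width columns:)

SELECT IIF([Size] = N'Large', 12, IIF(Width = 13, 0, -12)) AS Points
FROM dbo.Products;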

In this case, we're saying that if the Size is Large, then 12 will be returned. Otherwise, if the Width is 13, then 0 will be returned. In all other cases, -12 will be returned.

Returned Data Type

The function looks at the data types of the second and third parameters (i.e. the values returned for true and false), and chooses the data type with the higher precedence. For example, if the second parameter is an int and the third parameter is a bigint, then a bigint will be returned no matter which value is chosen.
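
You can check the returned type for yourself with SQL_VARIANT_PROPERTY:

SELECT SQL_VARIANT_PROPERTY(IIF(1 = 1, CAST(1 AS int), CAST(2 AS bigint)), 'BaseType');
-- returns bigint, even though the value chosen was the int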

Learning T-SQL

It's worth your while becoming proficient in SQL. If you'd like to learn a lot about T-SQL in a hurry, our Writing T-SQL Queries for SQL Server course is online, on-demand, and low cost.

SDU Tools: Weekday Across Years in SQL Server T-SQL

Another request that I received a while back for our free SDU Tools for developers and DBAs was a function to find the day of the week, for the same day and month, over a range of years. A simple example would be finding what day Christmas falls on each year for the next ten years. So we've added a new function WeekdayAcrossYears.

It takes four parameters:

  • @DayNumber int – day number in the target month
  • @MonthNumber int – target month number
  • @FromYear int – starting year
  • @ToYear int – ending year

The function returns a rowset with YearNumber and WeekDay (in English).
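
The tools contain the full implementation, but as a rough sketch of how a function like this can be written (this isn't the actual SDU Tools source; it uses FORMAT with an en-US culture to force English day names):

CREATE FUNCTION dbo.WeekdayAcrossYears
(
    @DayNumber int,
    @MonthNumber int,
    @FromYear int,
    @ToYear int
)
RETURNS @WeekDays TABLE
(
    YearNumber int,
    [WeekDay] nvarchar(20)
)
AS
BEGIN
    DECLARE @YearNumber int = @FromYear;

    WHILE @YearNumber <= @ToYear
    BEGIN
        -- DATEFROMPARTS builds the date for this year; FORMAT returns the
        -- day name in English regardless of the session language
        INSERT @WeekDays (YearNumber, [WeekDay])
        VALUES (@YearNumber,
                FORMAT(DATEFROMPARTS(@YearNumber, @MonthNumber, @DayNumber), N'dddd', N'en-US'));
        SET @YearNumber += 1;
    END;

    RETURN;
END;
GO

-- What day will Christmas be each year for the next ten years?
SELECT * FROM dbo.WeekdayAcrossYears(25, 12, 2021, 2030);

A real implementation would also need to deal with invalid combinations, like day 29 of month 2 in a non-leap year.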

Find out more

You can see it in action in the main image above, and in the video here. The full current version of the code is also shown below:

You can use our tools as a set, or as a great example of how to write functions like these.

Access to SDU Tools is one of the benefits of being an SDU Insider, along with access to our other free tools and eBooks. Please just visit here for more info:

http://sdutools.sqldownunder.com

Latest version of the code

Note: the code might wrap when displayed below.

SQL Interview: #12: Using UNION vs UNION ALL

This is a post in the SQL Interview series. These aren't trick or gotcha questions, they're just questions designed to scope out a candidate's knowledge around SQL Server and Azure SQL Database.

Section: Development
Level: Intro

Question:

Consider the following code:
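
Assume something like the following, with a hypothetical Sales.Customers table:

SELECT CustomerID
FROM Sales.Customers
WHERE [State] = N'VIC'

UNION

SELECT CustomerID
FROM Sales.Customers
WHERE CreditLimit > 10000;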

A: What would be the difference in results if the query was changed to use a UNION ALL instead of the UNION?

B: What are the performance differences between using a UNION ALL and a UNION in a SELECT statement?

C: How could this query be rewritten without the UNION?

Answer:

A: If the query was changed to use UNION ALL, a customer might be returned more than once. A UNION performs a DISTINCT operation on the combined results; a UNION ALL does not.

B: A UNION invariably involves more work and is slower, as it needs to perform a DISTINCT operation on the results.

C: One option would be to rewrite it as follows:
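
Working from the hypothetical query above, the two predicates can simply be combined with OR:

SELECT DISTINCT CustomerID
FROM Sales.Customers
WHERE [State] = N'VIC'
OR CreditLimit > 10000;

If CustomerID is the table's key, the DISTINCT could be removed as well, as each row can only be returned once.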