SQL: How do I stop my database files going into the Users folder with localdb?

I've seen questions on the forums asking about where localdb stores its database files. Often, the question is about how to stop them going into the C:\Users folder. That's clearly not a good spot for them.

localdb stores database files in the C:\Users area by default because it's trying to find a location that the current user will have permission to write to. While I understand the rationale, that's not where most users will want those files.

The answer is to not let localdb pick where the files will go.

Instead of writing:
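For example, a bare-bones command like this (the database name is just a placeholder):

```sql
CREATE DATABASE Sales;
```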

use a more complete command like:
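Something along these lines (the paths, sizes, and names are placeholders; adjust them to suit your environment):

```sql
CREATE DATABASE Sales
ON PRIMARY
(
    NAME = N'Sales',
    FILENAME = N'D:\Data\Sales.mdf',   -- an explicit location you choose
    SIZE = 100MB,
    FILEGROWTH = 100MB                 -- fixed increment, not a percentage
)
LOG ON
(
    NAME = N'Sales_log',
    FILENAME = N'D:\Data\Sales_log.ldf',
    SIZE = 100MB,
    FILEGROWTH = 100MB
);
```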

While you're at it, check out the CREATE DATABASE syntax and examples for other options that might be useful to you, particularly the ones around FILEGROWTH. (Hint: use a fixed increment, not a percentage.) Not everything will work on localdb, but most options will.


SQL: Why didn't my SQL Server transaction roll back on a primary key violation?

There's a question that I hear all the time in SQL Server forums:
Why didn't the transaction roll back when a primary key violation occurred?
Take a look at the code in the main image above. Would you expect it to execute the second and third INSERTs if the first INSERT failed with a primary key violation? So many people would. If you're one of them, read on.
By default, that's not how SQL Server works.
It might not be a primary key violation; the question might involve any number of other errors, but the answer is still the same.
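A sketch of the kind of code in question (the table and values here are placeholders, not the exact code from the image):

```sql
CREATE TABLE dbo.Customers
(
    CustomerID int PRIMARY KEY,
    CustomerName nvarchar(100)
);

INSERT dbo.Customers VALUES (1, N'Existing Customer');

BEGIN TRAN;

INSERT dbo.Customers VALUES (1, N'Duplicate');  -- fails: primary key violation
INSERT dbo.Customers VALUES (2, N'Second');     -- still executes
INSERT dbo.Customers VALUES (3, N'Third');      -- still executes

COMMIT;  -- commits the second and third rows
```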

What's going on?

The primary key violation is an example of what's called a statement-terminating error. T-SQL data modification statements are atomic: if an INSERT was inserting three rows and fails, none of them is inserted.
But only the statement ends, not the transaction.
If, instead, the error was a batch-terminating error, the outcome is different: SQL Server aborts the batch and rolls back the transaction.

How can I change that?

If you want to change that behaviour, put

SET XACT_ABORT ON;

at the start of your code.

SET XACT_ABORT ON tells SQL Server to promote statement-terminating errors to batch-terminating errors. Once you do that, the transaction code above works as expected: the primary key violation aborts and rolls back the transaction.

But I've seen it work without that!

I also occasionally hear people say that they've seen a different outcome: they didn't use XACT_ABORT and the transaction still rolled back. What they're usually missing is that their data access library has turned it on for them.
You can tell if it's on by executing this code:
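One common way is to test the XACT_ABORT bit (16384) in the @@OPTIONS bitmap:

```sql
-- 16384 is the documented @@OPTIONS bit for XACT_ABORT
SELECT CASE WHEN (@@OPTIONS & 16384) = 16384
            THEN N'XACT_ABORT is ON'
            ELSE N'XACT_ABORT is OFF'
       END AS XactAbortSetting;
```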

Note: there are a few edge cases here. Errors that relate to invalid object names, etc., behave differently. But for standard code, this is how it all works.

SDU Tools: Extracting initials from a name in SQL Server T-SQL

I recently came across a requirement to extract someone's initials from within their name. That was a new one for me, so we added a new function into our free SDU Tools for developers and DBAs. It's called InitialsFromName.

It's a straightforward scalar function that takes two parameters:

@Name nvarchar(max) is the name to extract the initials from

@Separator nvarchar(max) is a separator placed between the returned initials. (Make it an empty string for none.)

In the main image above, you can see the difference between the first two examples. The only difference is the separator.
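A quick usage sketch (this assumes the tools are installed in the SDU_Tools schema, as in a standard install; the name is just a sample value):

```sql
SELECT SDU_Tools.InitialsFromName(N'Mary Jane Smith', N'.') AS SeparatedInitials;
SELECT SDU_Tools.InitialsFromName(N'Mary Jane Smith', N'') AS PlainInitials;
```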

You can use our tools as a complete set or as a library that provides great examples of how to write functions like these.

Find out more

You can see it in action in the main image above, and in the video here. The full current version of the code is also shown below:

Access to SDU Tools is one of the benefits of being an SDU Insider, along with access to our other free tools and eBooks. Please just visit here for more info:


Latest version of the code

Note: the code might wrap when displayed below.


Opinion: On forums, don't do DBTs (drive-by trashings)

I hear about frequent drive-by shootings in some countries. Fortunately, that doesn't happen where I live. But what I come across all the time on Q&A forums is what I'd like to call DBTs (drive-by trashings).
It usually starts when someone makes a genuine effort to try to help answer a question. The DBT (drive-by trasher) pops in and leaves a nasty unhelpful message. It could be "That's misleading" or "That's wrong" or "You don't understand how it works".
But a telltale sign of a DBT is that they never go on to explain their objection. They leave a nasty or condescending message that helps no one. In the end, all they achieve is to add to the toxic nature of many forums.
Often when I see one of these people, I challenge them. Far from being useful, their comment invariably turns out to relate to some rare edge case. It isn't relevant in the slightest and, more importantly, it isn't helpful to the person asking the question.
It's posted as a put-down.
I've written before about how toxic many of the forums are. The worst offenders are also PBs (points-bandits). Their sole purpose seems to be chasing as many points as possible on the forums. They'll do as little as possible to help the person asking the question. Worse, they're often DBTs as well: their aim is to put down anyone else who might try to answer questions.
It's simple. If you are super-knowledgeable and know that what someone has posted is wrong or incomplete, don't be a DBT. The purpose of the forums is to help people, and you're not doing that. Explain yourself, or it's time to step away from the keyboard.

T-SQL 101: #68 Current date and time in SQL Server T-SQL with SYSDATETIME and SYSDATETIMEOFFSET

It's really common to need to know the current date and time in T-SQL code. Ever since the beginning of the SQL Server product, we've had a function called GETDATE(). It returns the current date and time of the server as a datetime value.

In SQL Server 2008, we got new higher precision data types, and so new current date and time functions were added as well.

SYSDATETIME() is a function that returns the current date and time as a datetime2 value. And SYSDATETIMEOFFSET() returns a datetimeoffset value that includes the server's timezone offset as well.

You can see the output below:

You might also want the current date and time, based on Universal Time (UTC aka GMT). SYSUTCDATETIME() does that. It calculates UTC time by deducting the server's current timezone offset from the current local date and time.
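If the image isn't handy, you can compare all of these yourself:

```sql
SELECT GETDATE();            -- datetime, server local time
SELECT SYSDATETIME();        -- datetime2(7), server local time, higher precision
SELECT SYSDATETIMEOFFSET();  -- datetimeoffset(7), includes the timezone offset
SELECT SYSUTCDATETIME();     -- datetime2(7), current UTC date and time
```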

Learning T-SQL

It's worth your while becoming proficient in SQL. If you'd like to learn a lot about T-SQL in a hurry, our Writing T-SQL Queries for SQL Server course is online, on-demand, and low cost.

Learning Mandarin: Cowherd and the Weaver Girl

I'm following up on the promise I made in my last Learning Mandarin post about the Qixi Festival. I said I'd talk about the legend behind it, and that's the Cowherd and the Weaver Girl.
(Awesome main image from qixifestival.com)

The Legend

Niulang was young and poor, but he was kind-hearted. His most prized possession was an old ox. He's the one referred to as the cowherd. (I always hear them say cowherd, but that sounds odd to me. I think they should say cowherder. He's not a herd!)

Zhinü was the seventh daughter of a union between a goddess and the emperor. She's even sometimes described as a fairy.

Curiously, the ox had a history. In some versions of the tale, it had once been the God of Cattle but broke the rules and lost the position.

Either way, Niulang saved the ox when it got sick, and in return, the ox helped Niulang to meet Zhinü who then fell in love with him.

Briefly happy

They married and were really happy. Soon they had two children: a boy and a girl.

The problem was that they had married in secret, because they knew that Zhinü's mother wouldn't approve of her marrying a mere mortal. After all, her mum was the Goddess of Heaven.

Eventually, of course, Zhinü's mum found out about the marriage and was really upset. Celestial soldiers turned up and took Zhinü back to heaven. Niulang and his children went off to heaven to find his wife. (In an odd twist to the tale, he'd killed the ox and worn its skin. Apparently that's what the ox wanted.)

Just before Niulang found his wife, Zhinü's mum used her hairpin to make a river appear between them.

Celestial story

This tale has a celestial counterpart as well. The star Altair represents Niulang, and the star Vega represents Zhinü. The river between them is the Milky Way, as shown here:

Awesome image from qixifestival.com

But where's the happy ending?

These stories usually have a happy ending, and this one's no different.

Niulang and his children were so sad that the magpies felt sorry for them, and flew up to Heaven to make a bridge between the two lovers. They could then meet on the bridge.

Apparently, the Goddess also felt sorry for them and allowed them to meet on the bridge on the 7th day of the 7th month each year.

OK, so not the happiest ending!

Learning Mandarin

I'll write more soon on the best methods for learning Mandarin. If you want to get a taste for it in the meantime though, my current favorite site is iTalki, and my favorite teacher by far is Amy He. If you decide to try it, click here and it's cheaper for both you and me.

Opinion: Don't add pages to your website if you're not going to update them

Today I wanted to call out a common mistake that I see on websites all over the country. Don't add pages to your website if you're not going to update them.

I'm particularly talking about pages with names like "News", "Articles", "Blog Posts", etc. They're often added when someone first builds a website and is full of hope for how it will be used.

And then it isn't.

Old News

I've lost count of how many sites I visit where there's a News section and when I visit it, there are two or three entries, often years apart. Or worse, there are a few entries from five years ago when the website was first created.

Awesome image by Holger Link

This makes your company look worse than if you didn't have those pages at all, so remove them.

Old Social Media and Blog Posts

I see a similar issue at companies that I consult at. I've watched tools like Yammer being introduced, and the CEOs obliging someone by making a post or two. And then the CEO is never heard from again.

It's the same if your website has a link to the CEO's blog. If it does, there had better be a bunch of pretty current content, or you should remove it.

The worst version of links to blog posts is when the only posts are ones that apologise for not posting lately and promise to post more regularly. And then that last post was two years ago and there's been nothing since.

It's simple: don't have these deadwood links in your sites. It's not the front door that you want to show to the world. 

FIX: SQL Server Replication – Cannot alter column because it is 'REPLICATED'

In SQL Server 2005, I really appreciated the addition of DDL replication. Before that, I found that making schema changes to replicated tables was a pain in the neck. Almost every time, I ended up dropping replication and setting it up again later. That's easy but takes a long time, so sometimes I'd resort to lots of manual hacking.

Using DDL replication, I could make a whole lot of different types of changes and the subscribers would be updated automatically, at least in most standard subscriptions.

The issue

That was a good story for me until a few weeks ago. I was working with a client, and they needed to modify the same column in six different databases. Each database was replicating (via transactional replication) to an Azure SQL Database.

The client made the changes and they worked everywhere, apart from one server, which failed with the error:

Cannot alter column 'columnname' because it is 'REPLICATED'
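The change involved was an ordinary column alteration. A hypothetical sketch (the table and column names here are placeholders, not the client's actual schema):

```sql
-- Widen a column on a table that's an article in a transactional publication
ALTER TABLE dbo.Transactions
    ALTER COLUMN TransactionDescription nvarchar(200) NOT NULL;
```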

I can't say I love that error message. Why is REPLICATED in quotes? (Pretend replication??)

So I was wondering what on earth was different about the servers or how they were configured. I did a lot of searching and found many unhelpful posts and comments. I was surprised how many said that you can't modify a column in transactional replication without dropping replication and setting it up again. That's not true, as was obvious from all the other servers where it worked. I checked all their subscribers, and sure enough, the change was applied just as we expected.

The fix

I eventually found a useful comment from Steve Fenton.

He mentioned that he'd seen that error come up if you happened to try to change a column while the Snapshot Agent was running.

I knew the Snapshot Agent shouldn't have been running but I checked with the client. Turns out the Agent had failed when it ran on the previous Sunday. It had a deadlock error, and stopped. So some part of SQL Server thought the Snapshot Agent was still running.

Sure enough, after the client ran the Snapshot Agent again, the column could be altered as expected.

I hope that error message gets fixed, as it's really, really misleading.

SDU Tools: Start of Year, End of Year in SQL Server T-SQL

I mentioned in my last two SDU Tools posts that I get really good feedback all the time from users of our free SDU Tools for developers and DBAs, and nothing draws more of it than our date and time functions. Yet another pair of functions added in the recently-shipped v19 is StartOfYear and EndOfYear.

We had feedback that even though we had StartOfFinancialYear and EndOfFinancialYear, we didn't have plain StartOfYear and EndOfYear. So we've fixed that. These are simple scalar functions that take a single date parameter and return the dates for the start and the end of the calendar year that contains the date provided.

You can use our tools as a set or as a great example of how to write functions like these.
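A quick usage sketch (this assumes the tools are installed in the SDU_Tools schema, as in a standard install; the date is just a sample value):

```sql
SELECT SDU_Tools.StartOfYear('20190820') AS YearStart,
       SDU_Tools.EndOfYear('20190820') AS YearEnd;
```

For 20 August 2019, that returns the dates for 1 January 2019 and 31 December 2019.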

Find out more

You can see it in action in the main image above, and in the video here:

Access to SDU Tools is one of the benefits of being an SDU Insider, along with access to our other free tools and eBooks. Please just visit here for more info:



Certification: Q: When will the new SQL Server certifications come out? A: They won't

I've had a number of people asking me lately about what's happening with SQL Server certifications. A number of the clients I deal with require their staff to update their certifications regularly. It's been part of their ongoing commitment to training.

In particular, I'm asked When will the new SQL Server certifications be available?

And the (perhaps surprising for some) answer is: They won't be.

The way that Microsoft certifies people has gone through a seismic shift. The entire focus is now on certifying people's ability to perform roles, rather than their ability to use a particular product.

And of course in the past, it wasn't just a particular product, it was a particular version of a particular product.

Role Based Levels

The new role-based certifications have three basic levels:

Fundamentals – these certifications demonstrate a basic knowledge of an area and do not expire.

Associate – these certifications are more in depth and are targeted at typical professionals.

Expert – as the name says, these are targeted at highly competent individuals.

There are other questions that I've been hearing:

Q: All the data certifications say "Azure" in the name. Will there be any "on-premises" versions?

A: It's a great question, but the answer seems to be "no". The Azure-named exams are intended to cover that knowledge as well. Mind you, most clients that I deal with now have part of their systems in Azure anyway; most have some type of hybrid environment happening.

An obvious disappointment is the sunsetting of the existing MCSA, MCSD, and MCSE style certifications, particularly for people who were partway through acquiring them.

Q: For data, where are the expert certifications?

A: They don't exist yet. I do hope they will, as that gives people more to strive for.

You can find details of the new role-based certifications here.

How tough are Expert exams?

It's important to understand that the expert-level exams really are targeted at expert-level candidates. The AZ-400 exam is the DevOps Expert-level exam. What seems to surprise many candidates is that the exam is broader than they imagined. They were expecting an exam that just covered what's included in Azure DevOps as delivered by Microsoft.

To pass, you really need to know your way around other common tools: for example, how to integrate Maven, Gradle, SonarCloud, WhiteSource Bolt, etc., how to implement OWASP testing, and how to understand not just GitHub integration but concepts like GitFlow. The contention is that an expert won't just know the Azure DevOps parts; they'll be experienced with integrating other commonly used tools.