Shortcut: Change connection in SQL Server Management Studio

I commonly run into a few connection-related scenarios:

  • I'm working on a large query and need to run it against several servers, not concurrently, but one after the other.
  • I've just made a database connection, and got my query ready, only to discover that I've connected to the wrong server.

Either way, what I've seen people do in these scenarios is to:

  • Select all the text in the current query
  • Copy it
  • Open a new query window
  • Paste the code

That's all good, but SQL Server Management Studio (SSMS) has had a simpler way of doing this for quite a while.

If you right-click in the middle of a query window, you can choose to change the connection. Here's an example. I opened a new query window and entered a query, only to find that it connected to the Azure server that I was working with earlier. The easiest option is to right-click and choose to change the connection:

After you click Change Connection, you can log on again:

And then I can continue in the same query window as before. This is particularly useful if I need to run some code against a test server, and once I've decided that it was correct, I can just change the connection and connect to the "real" server.
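
If I want to be certain where I've ended up before running anything, a quick query also confirms the server and database:

    SELECT @@SERVERNAME AS ServerName,
           DB_NAME() AS DatabaseName;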

At the bottom-right of your query window, you can always see which server you are connected to:

And at the bottom-left, you can see your connection state:

SDU Tools: Extract trigrams from strings in T-SQL

Fellow MVP Paul White and others have discussed indexing small segments of strings to build fast indexes that I think are somewhat like full-text indexes, but potentially more efficient. Three-character substrings are common and are known as trigrams.

I've experimented with these at a number of sites and seen really interesting (good) results.

In our free SDU Tools for developers and DBAs, we added a table-valued function ExtractTrigrams to help with this. You can pass it a string, and it will pull the string apart and return the set of trigrams. That makes it much easier for you to experiment with these types of indexes.
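
If you just want a feel for what trigram extraction produces, here's a rough T-SQL sketch that pulls the three-character substrings out of a value. (The commented-out call at the end shows roughly how the tool itself is used; the SDU_Tools schema name there is my assumption.)

    DECLARE @Value nvarchar(50) = N'SQL Down Under';

    -- One trigram per starting position, as long as at least 3 characters remain
    SELECT SUBSTRING(@Value, n.Position, 3) AS Trigram
    FROM (SELECT TOP (CASE WHEN LEN(@Value) > 2 THEN LEN(@Value) - 2 ELSE 0 END)
                 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS Position
          FROM sys.all_columns) AS n
    ORDER BY n.Position;

    -- The tool call looks roughly like this (schema name assumed):
    -- SELECT * FROM SDU_Tools.ExtractTrigrams(N'SQL Down Under');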

You can see ExtractTrigrams in action here:

To become an SDU Insider and to get our free tools and eBooks, please just visit here:

http://sdutools.sqldownunder.com

Opinion: Don't have blind faith in hardware

There was a discussion the other day (on one of my lists), where someone pointed out that over many years, they had made tape backups, sent them via secured transport, and then paid for high-tech storage. And when they went to restore one of the tapes recently, there was nothing on the tape.

Over the years, I've lost count of the number of times I've heard stories like this. Long ago, I realized that you must never trust hardware.

There are just too many situations where you can get fooled. I'll give you a couple of examples. (I have many, many more).

I used to do maintenance on minicomputers at HP in the mid-1980s. Tape drives were some of the most fascinating pieces of equipment that I used to work on. The capstan-based devices were something, but the high-speed vacuum drives were really wonders of engineering.

This shot from the movie Terminator shows one of the tape drives that I'm talking about, and some of our disk drives that were used with those systems:

I loved the way this guy was poring over the tape drive:

No idea what he would have been thinking about when looking at the drive.

But it reminded me of one of the HP tape drives I worked on where the erase head was stuck hard on. That meant that as the tape went through the drive, it first erased it as planned, then wrote to it, then read it back to make sure it was written correctly. All good to that point. But when it got to the end of the tape and rewound, it erased the entire tape.

I also saw a cartridge tape drive that was a multi-track drive. It had a problem where it wasn't stepping between tracks. So it would write one track, turn around and not step to the next track when it was supposed to, and then write straight back over the same track it just wrote, again and again.

You simply can't trust hardware. No matter how sophisticated it is or how much error checking it has, there's always something unplanned that can happen.

Unless you are periodically restoring your data (wherever it's stored), you have no idea if it's any good.

And importantly, you need to restore it on another system, not the one that you created the backups on. You don't want to find out later that the only device that could restore your backups was the one that created them, and it's now died.

Finally, you need to do a complete rebuild periodically. I've seen far too many systems where even when the restore was successful, people find that there was something else that should have been backed up but wasn't, and they now need it.

Don't be any of these people. You can get burned.


SQL: Make sure to use ORIGINAL_LOGIN when auditing

I regularly see code where SQL Server DBAs and developers are trying to log which user/login took a particular action within the database. There are many functions which appear to return the information needed, but there's only one that should normally be used: ORIGINAL_LOGIN().

A login is the way that a connection is authenticated to the server ie: it's the "who are you?" at the server level.

Most times, a user is a mapping of that login into a particular database. The login and user will often have the same name (and I'd recommend doing that to avoid confusion), but they don't have to. A login Terry could be a user Mary in one database and a user Nga in another database.
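
For example, setting up that Terry/Mary mapping is as simple as this (the names and password here are just placeholders):

    CREATE LOGIN Terry WITH PASSWORD = N'V3ryStr0ngPassword!';
    GO

    -- In the database where the login should appear as Mary:
    CREATE USER Mary FOR LOGIN Terry;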

A user can also represent the role that a login plays in the database. This is commonly seen with administrators, who will appear as dbo (ie: database owner) within each database, rather than under their actual names.

So, using a user name within a database for auditing is normally not appropriate. 

A further complication is that a security context can be changed by impersonation. For example, a user might execute a procedure that has a WITH EXECUTE AS clause. Functions like USER_NAME() will return the impersonated context, not the actual context.

What you normally really want is the actual person who has logged in. ORIGINAL_LOGIN() goes to the bottom of the security stack, and returns that value. It's typically what I use for any type of auditing.

You can see its action in the main image above. Note what happens in the second (impersonated) context.
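
If you can't see that image, the test was along these lines (the impersonated user name is just a placeholder; you'd need a user like it in your database):

    SELECT ORIGINAL_LOGIN() AS OriginalLogin,
           SUSER_SNAME() AS CurrentLogin,
           USER_NAME() AS CurrentUser;

    -- Impersonate a database user (placeholder name)
    EXECUTE AS USER = N'TestUser';

    -- USER_NAME() and SUSER_SNAME() now reflect the impersonated context,
    -- but ORIGINAL_LOGIN() still returns the login that actually connected
    SELECT ORIGINAL_LOGIN() AS OriginalLogin,
           SUSER_SNAME() AS CurrentLogin,
           USER_NAME() AS CurrentUser;

    REVERT;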


Learning Mandarin: Should I use Pin Yin while learning?

When anyone starts learning to read or write Chinese, there are two basic barriers:

  • Understanding the writing and how it's pronounced
  • Understanding the meaning of what's written

Now when people start to learn Chinese, it's tough to tackle both at once, so the usual starting point is to use what's called 拼音 (or Pīnyīn). Pīnyīn is a scheme for romanizing the characters, so that the sounds are written with letters that are familiar to people who use Latin-based alphabets, as English does.

So instead of writing 我喜欢学中文, I could write Wǒ xǐhuān xué zhōngwén (meaning I like to study Chinese language). Clearly, for someone coming from a language like English, the latter is much easier to work with at first.

However, even though the letters look familiar to us, their pronunciation takes a bit of getting used to. For example, the wǒ is pronounced more like war than woe, and the xǐ is pronounced more like she. Also, many of the sounds that are used aren't exactly familiar in English. For example, cái (or 才) is pronounced closer to tsai than to kai.

Once you get over that though, it's certainly easier to get started with.

A big challenge, though, is tones. I'll write more about them another day, but notice the mark (a caron) over the letter o in wǒ (which means I). This indicates that the word is pronounced with the third tone, called 三声 (or Sān shēng). It's critical to learn to get tones correct pretty early on.

In terms of Pīnyīn though, my advice is to try to start reading the Chinese characters as quickly as you can. I remember clearly the day I decided to turn off the Pīnyīn on my learning tools, and just immerse myself with the Chinese characters. My learning accelerated at that point.

The exception to this, however, is typing on a phone or computer. Pīnyīn is by far the easiest way to type Chinese on one of these devices. I'll also write about the different input editors for computers another day.

Book Review: Introduction To Personal Branding – Mel Carson

Over the last few months, I've also been reading a number of branding-related books. One that caught my eye was Introduction To Personal Branding: Ten Steps Toward A New Professional You by Mel Carson.

Part of the reason I looked into it was that Mel was an evangelist at Microsoft and as most would know, I spend a lot of time dealing with Microsoft in various ways.

It was also interesting because it's a low-cost book that Mel published using CreateSpace. I'd often wondered about using that service, so I thought I'd check the outcome of that as well. Finally, it's a short book; listening to it on Audible would probably only take an hour or two.

Mel defines personal branding as the practice of defining a professional purpose, and then being able to explain that to a target audience through digital media and social channels. He also talks about how having the profile in place is important if you want to get the most out of in-person events like conferences.

I did like the way he focused on defining how you see your role, and then honing right in on making sure that all your touch-points really support that. He's keen to see everything else gone. That one's a touchy subject, as I often see people complaining that a well-known figure in one area is posting comments about another area (eg: politics). The defence to that is normally "I'm a real person and a whole person".

While much of the content that he covers would be familiar to anyone who's already been building a brand, there are still good, solid, and timely reminders, many of which I've taken to heart myself.

If you are just starting out with building your personal brand, you could do worse than spending an hour or two hearing what Mel has to offer.

Greg's rating: 7 out of 10

Note: as an Amazon Associate I earn from qualifying purchases, but whether or not I recommend a book is unrelated to this. One day it might just help cover some of my site costs. (But given the rate, that's not really likely anyway. 🙂)


Shortcut: Viewing client statistics in SQL Server Management Studio

While SQL Server is quite fast at executing queries, when you're connecting from a client application like SQL Server Management Studio (SSMS), you might wonder how much of the total time SQL Server spent executing the query, as opposed to how long the communication with the server took.

This type of information is available in the Client Statistics.

Let's see an example. If I connect to a server in an Azure data center, I'll have higher latency than for one in my own site. That will affect the wait time for a server response.

This server is in the Melbourne (Australia Southeast) data center.

Let's execute a simple query against it, but before doing so, on the Query menu, choose Include Client Statistics.

I'll just query the SQL Server version:
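
The query itself is nothing more complicated than this (or something very close to it):

    SELECT @@VERSION AS ServerVersion;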

Notice that an extra tab of data is returned.

From the bottom section of this tab, we can see where the time was spent. In this case, out of a total of 34 milliseconds for the query, 33 milliseconds was spent waiting for the server.

SDU Tools: Extract Trimmed Words from T-SQL Strings

Occasionally I've needed to take a string and extract all the words out of it. For example, a string like 'hello        there     greg' might lead me to want the three words 'hello', 'there', and 'greg'. Note that I usually want them trimmed, not just extracted.

In our free SDU Tools for developers and DBAs, we added a table-valued function ExtractTrimmedWords to help with this. You can pass it a string, and it will pull it apart for you, assuming that you have whitespace separating the words.

We use space, tab, carriage return, and line feed as whitespace characters for separating words.

The main image above shows it in use. As well as returning the words, we decided to return a WordNumber column, in case the ordering of the words matters to you.
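
If you want to try it, the call looks roughly like this (the SDU_Tools schema name is my assumption; check your install):

    SELECT *
    FROM SDU_Tools.ExtractTrimmedWords(N'hello        there     greg')
    ORDER BY WordNumber;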

I wish Microsoft had done that with their STRING_SPLIT function. (And we added that in our SplitDelimitedString function).

You can see ExtractTrimmedWords in action here:

To become an SDU Insider and to get our free tools and eBooks, please just visit here:

http://sdutools.sqldownunder.com

Opinion: Avoid annual subscription surprises for your customers

Yet again, a few days back, I received two invoices that showed I'd just paid (via PayPal fortunately) a pair of annual subscriptions. These are subscriptions that I thought were already cancelled, and we'd stopped using the products many months back.

The problem is that I've now spent quite a bit of my time, and quite a bit of the vendor's time, trying to work out how to cancel and reverse them. For days now, we've had emails going backwards and forwards between ourselves and the third party that they use for provisioning/charging.

That's a serious waste of time for all three organizations, and it means that I now feel worse towards a product (and the company) that I've already stopped using. That makes it even less likely (not more) that I'll use it or them again.

Annual subscriptions and pre-approved payments are becoming somewhat of a cancer in our industry. I get the point of them when I'm signing up for something ongoing. But I do not get the point of pre-approved future payments when I'm buying something one-off.

Why do so many companies do this? And why do so many set auto-renewals without asking you? On many sites, it's almost (if not entirely) impossible to buy something one-off without having to go back into the account after the sale and nuke all the pre-approval and auto-renewal stuff.

Here's a hint: All of these actions come across to the customer as dodgy.

Surely you want your customers to want to deal with you and want to pay you, not to feel tricked into ongoing things, many of which are quite hard to reverse. Are these companies simply hoping for customer apathy?

At least if I pay for these things with something like PayPal, I could set myself a monthly reminder to go into my account and nuke anything that I don't want to be pre-authorizing. But I shouldn't need to do this.

Probably the biggest thing that suppliers with annual subscriptions could do is to send you a reminder that you are about to be billed, a few days before you are actually billed.

It's hard to believe that we've become so "pro-consent" about email addresses and haven't done that for payments. And the IT industry is one of the worst offenders.

It's time for this all to become much, much cleaner and simpler for the customer.


SQL: The T-SQL SIGN function and what's in a return type?

When you've worked with a product like SQL Server for a long time, and more importantly, are one of the odd people who've read a great amount of the documentation simply for interest, it feels really strange to come across a basic function that you'd never noticed before. That's how I felt when someone mentioned the T-SQL SIGN function.

I thought, "the what function??".

Now it works pretty much as you'd expect. It returns:

  • +1 for positive numbers
  • 0 for zero values
  • -1 for negative numbers
  • NULL for NULL values

No surprises there and you can see that in the main picture above.
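
If you'd like to try it yourself, a quick check looks like this:

    SELECT SIGN(-125) AS NegativeResult,
           SIGN(0) AS ZeroResult,
           SIGN(125) AS PositiveResult,
           SIGN(CAST(NULL AS int)) AS NullResult;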

What I wasn't expecting (and, I have to say, is not what I would have done if I were creating it) was the output data types. Here are the values returned:

I don't get why the return type has been designed to match the input type. It seems to me that a value indicating positive, zero, or negative should really have a fixed data type ie: int.

Regardless, I was also intrigued by the "Other types" going to float. It's not all types, as the value appears to need to be directly castable to float:

I tried other numeric data types to see what happens. I pushed a set of them into a temporary table:

And then checked the column data types that were returned:

It really does try to mostly match the input data type. I was particularly interested to see whether decimal values would match the precision and scale, and they do.
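
If you want to reproduce that check, a sketch along these lines works (the particular values and column names here are just illustrative):

    -- Push SIGN results for several numeric types into a temporary table
    SELECT SIGN(CAST(-12.345 AS decimal(18,3))) AS DecimalResult,
           SIGN(CAST(-12 AS bigint)) AS BigIntResult,
           SIGN(CAST(-12 AS smallint)) AS SmallIntResult,
           SIGN(CAST(-12 AS money)) AS MoneyResult
    INTO #SignResults;

    -- Then check which data types the columns ended up with
    SELECT c.name AS ColumnName,
           t.name AS DataTypeName,
           c.precision,
           c.scale
    FROM tempdb.sys.columns AS c
    INNER JOIN tempdb.sys.types AS t
        ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID(N'tempdb..#SignResults');

    DROP TABLE #SignResults;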

There must have been some logic for the varying output data types, and that means there must have been some envisaged functionality beyond just indicating the sign. I'd love to hear from any of you who have ideas on how they could be used in a practical way.

The online documentation also says "from SQL Server 2008", but that's just the oldest supported version anyway. Does anyone know which version this was first introduced in?