SQL Server Upgrade Advisor should check Database db_compat levels, not server version

We have a very common customer scenario where the customer decides to upgrade to a new version (in this case, let's say SQL Server 2012). They run Upgrade Advisor and note that there are a whole lot of problems. They think "Hey, that's way too much to deal with right now. I don't have time."

So they don't deal with them. They upgrade the server to 2012 but leave their db_compat level at a lower level, with the intention of dealing with the problems later.

But when they go to run Upgrade Advisor later, to find the issues they need to resolve before changing the db_compat level to 2012, it tells them that they can no longer use it because their server is already at 2012. How can they now check what needs to be changed? They also cannot now reattach their databases to an earlier-version server, as the databases have been upgraded to the 2012 structure.

I believe that upgrade advisor should check db_compat levels, not server versions, before refusing to run. If you agree, you know what to do. Vote once, vote often:
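In the meantime, you can at least see where each database stands yourself. As a minimal sketch, the compatibility level of every database is exposed in the sys.databases catalog view (SQL Server 2008 and later):

```sql
-- Each database's compatibility level is independent of the server version
SELECT name, compatibility_level
FROM sys.databases
ORDER BY name;
```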


New SQL Down Under Podcast with Roger Doherty on SQL Server 2012

Well it's been a while since I've posted up a new podcast. (I know, I know).

But I've just started a new series for SQL Server 2012. First out of the gate is Roger Doherty (a Senior Program Manager on the SQL Server team) with an overview of all the key SQL Server 2012 pillars and enhancements.

You'll find it here:



SQL Azure DB size limit increased to 150GB

Nice to see the increase in maximum database size on SQL Azure kicked up to 150GB.

In most enterprises I go into, there are a few databases that wouldn't fit, but now the vast majority of databases would fit in SQL Azure.

Also included in the November release are federations and an updated management portal.

More info here: http://msdn.microsoft.com/en-us/library/windowsazure/ff602419.aspx

Avoiding connection timeouts on first connection to LocalDB edition of SQL Server Express

When you first make a connection to the new LocalDB edition of SQL Server Express, the system files, etc. that are required for a new instance are spun up. (The system files, such as the master database files, end up in C:\Users\<username>\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBApp1.) That can take a while on a slower machine, which means that the default connection timeout of 30 seconds (in most client libraries) could be exceeded.

To avoid this hit on the first connection, you can create the required instance of LocalDB beforehand using the SqlLocalDB.exe utility, like this:

<path to binaries>\SqlLocalDB.exe create InstanceName

You can also specify the required version of SQL Server and ask to start the instance like this:

<path to binaries>\SqlLocalDB.exe create InstanceName 11.0 -s

Your application should then connect quickly, even on the first connection.
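Alternatively (or in addition), you could increase the connection timeout on the client side. As a sketch, assuming an ADO.NET-style connection string, a client library that understands the (localdb) server name syntax, and an instance named LocalDBApp1 (that instance name is just for illustration):

```
Server=(localdb)\LocalDBApp1;Integrated Security=true;Connect Timeout=60
```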

SqlLocalDB documentation is starting to appear now. Documentation on the utility is here: http://msdn.microsoft.com/en-us/library/hh212961(v=sql.110).aspx.

FIX: A network error occurred during SQL Server Native Client installation

One of the things that I have been pestering the SQL team to do is to name their updates according to what they contain. For example, instead of just:


What I'd prefer is that the file was called something like:


So I normally rename them as soon as I receive them, to avoid confusion in the future. However, today I found that doing so caused me a problem. After renaming the file and installing it, the installation failed with the error:

"A network error occurred while reading from the file: C:\temp\sqlncli.msi"

A quick inspection of the error shows that the code in the msi is looking for the file by name. Renaming the file back to its original name makes it install OK. It's a pity that the person coding the installer didn't pick up the name of the file programmatically rather than hard-coding it.

Anyway, I hope that helps someone who sees this error.

MVP Deep Dives volume 2 is now published!

It's great to see that volume 2 of MVP Deep Dives is now available and will be distributed at the PASS summit next week. I'm really sad that I won't be at the book signing next week but I'd encourage you all to get along, order a copy and have it signed.

A huge thanks has to go to Kalen Delaney for her management of this project, and a big thanks to my fellow editors Louis Davidson, Brad McGehee, Paul Nielsen, Paul Randal, and Kimberly Tripp for their efforts. A special mention goes to Paul Nielsen, whose ideas and spirit around volume 1 have continued into this new volume.

And of course, a really big thank you to all the authors who gave their time to make this possible.

Please buy a copy and help us to help Operation Smile. You'll find the book's website here: http://www.manning.com/delaney/ 

While you're at it, why not send an extra donation to Operation Smile: https://secure.operationsmile.org/site/Donation2?df_id=10380&10380.donation=form1

Denali: Improved T-SQL Query Optimization

Part of the value in the ongoing evolution of the T-SQL language is that we are moving further and further towards being declarative rather than prescriptive, i.e., we are able to tell SQL Server what we want rather than how to do it. Over time, that raises more and more possibilities for the optimizer to work with us to achieve a better outcome.

For example, note the following query against the AdventureWorksDW database:

SELECT rs.ProductKey, rs.OrderDateKey, rs.SalesOrderNumber,
       rs.OrderDateKey - (SELECT TOP(1) prev.OrderDateKey
                          FROM dbo.FactResellerSales AS prev
                          WHERE rs.ProductKey = prev.ProductKey
                          AND prev.OrderDateKey <= rs.OrderDateKey
                          AND prev.SalesOrderNumber < rs.SalesOrderNumber
                          ORDER BY prev.OrderDateKey DESC,
                                   prev.SalesOrderNumber DESC)
                 AS DaysSincePrevOrder
FROM dbo.FactResellerSales AS rs
ORDER BY rs.ProductKey, rs.OrderDateKey, rs.SalesOrderNumber;

In this query, I'm trying to include details of how long it was since the previous order, beside the details of the current order. Note the option that the new LAG operator provides:

SELECT ProductKey, OrderDateKey, SalesOrderNumber,
       OrderDateKey - LAG(OrderDateKey)
                         OVER (PARTITION BY ProductKey
                               ORDER BY OrderDateKey, SalesOrderNumber)
                 AS DaysSincePrevOrder
FROM dbo.FactResellerSales AS rs
ORDER BY ProductKey, OrderDateKey, SalesOrderNumber;

Also note how much more elegant the code is but, more importantly, look at the difference in optimization:


This is great work from the T-SQL and engine teams. I encourage you to get out and try the new Windowing functions in Denali CTP3.
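As a further sketch of the new windowing syntax, Denali's LEAD function works the same way in the opposite direction (LAG and LEAD also accept optional offset and default arguments). Assuming the same AdventureWorksDW table, something like this would show the gap until the next order instead:

```sql
-- Sketch only: days until the next order for the same product
SELECT ProductKey, OrderDateKey, SalesOrderNumber,
       LEAD(OrderDateKey, 1, NULL)
           OVER (PARTITION BY ProductKey
                 ORDER BY OrderDateKey, SalesOrderNumber)
           - OrderDateKey AS DaysUntilNextOrder
FROM dbo.FactResellerSales
ORDER BY ProductKey, OrderDateKey, SalesOrderNumber;
```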

Denali: Note the small but important things in SSIS

With SQL Server, there often seems to be an over-emphasis on the items that provide bullet points for the marketing brochure. Yet, small but useful changes to the product can make a more profound impact on DBAs/developers than the new items that are highlighted by those bullet points.

One of the things I loved about Reporting Services in SQL Server 2008 R2 was the amount of focus they placed on ease of use. In particular, the addition of data bars, lookup functions, pagination control, rendering targets in expressions, domain scope, etc. made a huge difference to anyone who really needs to build reports.

For Denali, Integration Services has me similarly excited. I've just completed prepping and presenting a session on the Denali enhancements to SSIS (and on the introduction of Data Quality Services) for TechEd Australia. It was interesting to note, though, that the biggest reactions I received from the crowd were for the "little" improvements, not for the big ticket items. Some of the best examples of these are:

  • Zoom control, general UI improvements and the Fit to Window option
  • Improvements around the display of precedence labels (great for those who are color blind and struggle with the standard red/green/blue)
  • Toolbox groupings
  • Ease of integrating new items into the Toolbox
  • Reliability and performance improvements around Merge/Merge Join

And there are many more. The two biggest woohoo moments though, came for:

  • Multi-level undo and redo
  • Remapping work that has been done in the data flow (remapping GUI is great but the way that so many mapping issues are automatically fixed is awesome)

I want to congratulate the team on spending time making these types of "less visible" improvements.

Do you still sharpen your knives?

We end up staying in Sydney several times per year, either for training or mentoring/consulting work. When we do, one of the hotels that we really like is the Westin. I would find it hard, though, to list all the reasons why I like it. (There are some things I don't like about it too, but that's a topic for another day.)

But one of the things that has always surprised me, each and every time I eat breakfast there, is how sharp the knives are. Clearly, someone must be assigned to sharpen them, or they replace them very regularly (unlikely). Alternatively, one of the staff members might have been passionate about it and it was an individual thing, but we stayed about 160 nights in hotels last year and I can tell you that most hotels (big or small) don't bother doing this.

Today at breakfast, I noticed that the knives weren't sharp. The same thing happened last time. I can't help thinking that this might be one little tell-tale sign of some belt-tightening in the running costs of the hotel. If it was an individual effort, perhaps that person has just left.

However, it got me wondering about what impact these little things have, i.e., the little things that companies do that would never appear on a brochure. Do you have little things that you do for your customers that are a bit unique and that your customers appreciate without telling you? Do you still do them? Do you still sharpen your knives?