Avoiding connection timeouts on first connection to LocalDB edition of SQL Server Express

When you first make a connection to the new LocalDB edition of SQL Server Express, the system files required for the new instance are created. (The system files, such as the master database files, end up in C:\Users\<username>\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBApp1.) That can take a while on a slower machine, which means the default connection timeout of 30 seconds (in most client libraries) could be exceeded.

To avoid this hit on the first connection, you can create the required instance of LocalDB beforehand using the SqlLocalDB.exe utility, like this:

<path to binaries>\SqlLocalDB.exe create InstanceName

You can also specify the required version of SQL Server and ask to start the instance like this:

<path to binaries>\SqlLocalDB.exe create InstanceName 11.0 -s
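
If you want to confirm that the instance was created and is running, the utility's info option should tell you (this is just a quick check; the output details vary by version):

<path to binaries>\SqlLocalDB.exe info InstanceName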

Your application should then connect quickly, even on the first connection.
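
As a rough sketch, a client connection string for that pre-created instance would then look something like this (InstanceName being whatever name you passed to SqlLocalDB.exe; the exact syntax depends on your client library and its version):

Server=(localdb)\InstanceName;Integrated Security=true;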

SqlLocalDB documentation is starting to appear now; you'll find the documentation for the utility here: http://msdn.microsoft.com/en-us/library/hh212961(v=sql.110).aspx.

FIX: A network error occurred during SQL Server Native Client installation

One of the things that I have been pestering the SQL team to do is to name their updates according to what is contained in them. For example, instead of just:

sqlncli.msi 

What I'd prefer is for the file to be called something like:

SQLServerNativeClient2k8SP1CU3x64.msi

So I normally rename them as soon as I receive them, to avoid confusion in future. However, today I found that doing so caused me a problem. After I renamed the file and tried to install it, the installation failed with this error:

"A network error occurred while reading from the file: C:\temp\sqlncli.msi"

A quick inspection of the error shows that the code in the MSI is looking for the file by name; renaming the file back to its original name makes it install OK. It's a pity that the person coding the installer didn't pick up the name of the file programmatically, rather than hard-coding it.

Anyway, hope that helps someone that sees this error.

MVP Deep Dives volume 2 is now published!

It's great to see that volume 2 of MVP Deep Dives is now available and will be distributed at the PASS Summit next week. I'm really sad that I won't be at the book signing, but I'd encourage you all to get along, order a copy, and have it signed.

A huge thanks has to go to Kalen Delaney for her management of this project, and a big thanks to my fellow editors Louis Davidson, Brad McGehee, Paul Nielsen, Paul Randal, and Kimberly Tripp for their efforts. A special mention for Paul Nielsen, whose ideas and spirit from volume 1 have continued into this new volume.

And of course, a really big thank you to all the authors that gave their time to make this possible.

Please buy a copy and help us to help Operation Smile. You'll find the book's website here: http://www.manning.com/delaney/ 

While you're at it, why not send an extra donation to Operation Smile: https://secure.operationsmile.org/site/Donation2?df_id=10380&10380.donation=form1

Denali: Improved T-SQL Query Optimization

Part of the value in the ongoing evolution of the T-SQL language is that we are moving further and further towards being declarative rather than prescriptive ie: we are able to tell SQL Server what we want, rather than how to do it. Over time, that raises more and more possibilities for the optimizer to work with us to achieve a better outcome.

For example, note the following query against the AdventureWorksDW database:

SELECT rs.ProductKey, rs.OrderDateKey, rs.SalesOrderNumber,
       rs.OrderDateKey - (SELECT TOP(1) prev.OrderDateKey
                          FROM dbo.FactResellerSales AS prev
                          WHERE rs.ProductKey = prev.ProductKey
                          AND prev.OrderDateKey <= rs.OrderDateKey
                          AND prev.SalesOrderNumber < rs.SalesOrderNumber
                          ORDER BY prev.OrderDateKey DESC,
                                   prev.SalesOrderNumber DESC)
                 AS DaysSincePrevOrder
FROM dbo.FactResellerSales AS rs
ORDER BY rs.ProductKey, rs.OrderDateKey, rs.SalesOrderNumber;

In this query, I'm trying to include details of how long it was since the previous order, beside the details of the current order. Note the option that the LAG operator now provides:

SELECT ProductKey, OrderDateKey, SalesOrderNumber,
       OrderDateKey - LAG(OrderDateKey)
                         OVER (PARTITION BY ProductKey
                               ORDER BY OrderDateKey, SalesOrderNumber)
                 AS DaysSincePrevOrder
FROM dbo.FactResellerSales AS rs
ORDER BY ProductKey, OrderDateKey, SalesOrderNumber;
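
As an aside, LAG returns NULL when there is no previous row in the partition (ie: the first order for each product), so DaysSincePrevOrder will be NULL for those rows. LAG also takes offset and default arguments, so if you'd rather see zero for those rows, a sketch along these lines should work:

       OrderDateKey - LAG(OrderDateKey, 1, OrderDateKey)
                         OVER (PARTITION BY ProductKey
                               ORDER BY OrderDateKey, SalesOrderNumber)
                 AS DaysSincePrevOrder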

Also note how much more elegant the code is but, more importantly, look at the difference in optimization between the two execution plans.

This is great work from the T-SQL and engine teams. I encourage you to get out and try the new Windowing functions in Denali CTP3.

Denali: Note the small but important things in SSIS

With SQL Server, there often seems to be an over-emphasis on the items that provide bullet points for the marketing brochure. Yet, small but useful changes to the product can make a more profound impact on DBAs/developers than the new items that are highlighted by those bullet points.

One of the things I loved about Reporting Services in SQL Server 2008 R2 was the amount of focus the team put on ease of use. In particular, the addition of data bars, lookup functions, pagination control, rendering targets in expressions, domain scope, etc. made a huge difference to anyone who really needs to build reports.

For Denali, Integration Services has me similarly excited. I've just completed prepping and presenting a session on the Denali enhancements to SSIS (and on the introduction of Data Quality Services) for TechEd Australia. It was interesting to note, though, that the biggest reactions I received from the crowd were for the "little" improvements, not for the big ticket items. Some of the best examples of these are:

  • Zoom control, general UI improvements, and the Fit to Window option
  • Improvements around the display of precedence labels (great for those who are color blind and struggle with the standard red/green/blue)
  • Toolbox groupings
  • Ease of integrating new items into the Toolbox
  • Reliability and performance improvements around Merge/Merge Join

And there are many more. The two biggest woohoo moments, though, came for:

  • Multi-level undo and redo
  • Remapping work that has been done in the data flow (remapping GUI is great but the way that so many mapping issues are automatically fixed is awesome)

I want to congratulate the team on spending time making these types of "less visible" improvements.

Do you still sharpen your knives?

We end up staying in Sydney several times per year, either for training or mentoring/consulting work. When we do, one of the hotels that we really like is the Westin. I would find it hard, though, to list all the reasons why I like it. (There are some things I don't like about it too, but that's a topic for another day.)

But one of the things that has always surprised me, each and every time I eat breakfast there, is how sharp the knives are. Clearly, someone must be assigned to sharpen them, or they replace them very regularly (unlikely). Alternatively, one of the staff members might have been passionate about it and it was an individual thing. Either way, we stayed about 160 nights in hotels last year, and I can tell you that most hotels (big or small) don't bother doing this.

Today at breakfast, I noticed that the knives weren't sharp. The same thing happened last time. I can't help thinking that this might be one little tell-tale sign of some belt-tightening in the running costs of the hotel. If it was an individual effort, perhaps that person has just left.

However, it got me wondering about what impact these little things have ie: the little things that companies do but that would never appear on a brochure. Do you have little things that you do for your customers that are a bit unique and that your customers appreciate without telling you? Do you still do them? Do you still sharpen your knives?

More things I learned today: EXEC AS USER on dynamic SQL

I really liked the addition of the WITH EXECUTE AS clause when defining stored procedures and functions, to change the execution context, just for the duration of the stored procedure or function. For example:

CREATE PROC SomeSchema.SomeProc
WITH EXECUTE AS USER = 'Fred'
AS
…

I'd also used the EXEC AS clause to temporarily change my execution context during testing. For example:

EXEC AS USER = 'Fred';
-- Try some code here while running as Fred
REVERT;

But what I didn't realize is that you can set the execution context for a single execution like this:

EXEC ('Some command to be executed as Fred') AS USER = 'Fred';
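
As a quick sanity check (this assumes a database user called Fred actually exists), something like the following should confirm that the dynamic SQL really does run in Fred's context:

EXEC (N'SELECT USER_NAME() AS ExecutingAs;') AS USER = 'Fred';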

Back in SQL 2000, I made a big effort to read all of Books Online. The more I read of the current BOL, the more I think I need to do that again. (Mind you, I'm not sure that that's even possible any more. I think they're writing/modifying it faster than I have time to read it).

Things I learned today: Calling a scalar UDF with EXEC

One thing I love about T-SQL is that every day I seem to find another way to use it that wasn't immediately obvious to me. I was reading the SQL Server Denali documentation and noticed that, in the definition of the EXEC command, it said you could execute a stored procedure or scalar function using EXEC.

While I'd obviously used it to call a stored procedure, it had never dawned on me that you could use EXEC to call a scalar UDF, so I had to try it. Sure enough, it's obviously worked for a long time. Here's an example of a function and calling it as I would have in the past:

USE tempdb;
GO

CREATE FUNCTION dbo.SayHello(@WhoTo NVARCHAR(100))
RETURNS NVARCHAR(120)
AS
BEGIN
  RETURN N'Hello ' + @WhoTo;
END;
GO

SELECT dbo.SayHello(N'Greg');
GO

Notice that you can also call it like this:

DECLARE @ReturnValue NVARCHAR(120);
EXEC @ReturnValue = dbo.SayHello N'Greg';
SELECT @ReturnValue;
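
It also appears that you can pass the parameter by name, just as you would when calling a stored procedure:

EXEC @ReturnValue = dbo.SayHello @WhoTo = N'Greg';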

I was speaking with a member of the SQL Product Group last week and noting just how good the documentation has become in recent years. I continue to be impressed with it.

Setting default values for slicers in PowerPivot

I've been doing some work with PowerPivot and SharePoint/Excel Services this week. I wanted the user interface to have slicers for:

  • Year
  • Month
  • Day

But I wanted the slicer to be preselected for the current month. There is no property on the slicers to set a default value. I read a number of websites and the suggestion was to use VBA code to set the value. That works, but if you want the VBA code to run when the workbook opens, you have to create a macro-enabled workbook, and macro-enabled workbooks are not supported by Excel Services.

So I seemed to be fresh out of luck. However, one of my Spanish colleagues, José Quinto Zamora, came to the rescue. All you need to do is select the slicer filter value that you want as the default before you save the workbook; then every time you open the workbook, that slicer value will already be selected. That's as good as a default value for me. Thanks José!

Hope this helps someone else.