I've been working on a team with Roger Doherty building parts of what's now become the SQL Server 2012 Early Adoption Cook Book.
So, if you work on the bleeding edge of SQL Server and are keen to get your head around what's coming, this is a seriously good resource.
Time to go and get it.
The material is constructed as a large number of bite-sized pieces. Each presentation is about 15 minutes in length, and each demo is about 5 minutes. And there are lots of them.
Look for recordings of these by the original authors on the Channel9 website soon, too, followed by local in-person presentations from Joe Homnick in many locations.
If you have Chinese friends, it's time to say 新年快乐 to them! (Xīnnián kuàilè -> Pronounced like Shin Nien Kwai Ler)
They are welcoming in the year of the dragon.
Temp tables are visible within the scope where they are declared but also in sub-scopes. This means that you can declare a temp table in one stored procedure but access it in another stored procedure that is executed from within the first stored procedure.
There are two reasons that people do this. One is basically sloppy coding, a bit like making all your variables global in a high-level programming language. The more legitimate reason is to avoid the overhead of moving large amounts of data around, given that table-valued parameters are READONLY.
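As a minimal sketch of the pattern (the procedure and table names here are just illustrative):

```sql
CREATE PROCEDURE ChildProc
AS
BEGIN
    -- Relies on #Orders already existing in the caller's scope.
    -- Nothing in this procedure's signature says so.
    SELECT COUNT(*) AS OrderCount FROM #Orders;
END;
GO

CREATE PROCEDURE ParentProc
AS
BEGIN
    CREATE TABLE #Orders (OrderID int PRIMARY KEY);
    INSERT #Orders VALUES (1), (2);
    EXEC ChildProc; -- the child can see #Orders created above
END;
GO
```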
But I've never liked this style of access. It breaks normal coding rules about encapsulation. Nothing in the child procedure gives any clue that it expects the temp table to already be present; a reference to a temporary object could just as easily be a typo. This comes to light when you try to use tools like SQL Server Data Tools (or, previously, a similar issue with DataDude): how can a utility that analyzes your code know whether that reference is valid or just an unresolved reference?
I've previously posted about the need for stored procedures to have contracts: http://www2.sqlblog.com/blogs/greg_low/archive/2010/01/20/stored-procedure-contracts-return-values.aspx
This seems to be another case where a contract might help. The fact that a procedure depends upon the pre-existence of a temporary object should be discoverable from the metadata of the procedure. Perhaps something like this:
CREATE PROCEDURE ChildProc
REQUIRES #SomeTempTable SomeTableType
or, if the temp table isn't based on a table type, syntax similar to a table-valued function's, where you could say:
CREATE PROCEDURE ChildProc
REQUIRES #SomeTempTable TABLE (Table definition here)
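For comparison, the supported alternative today is a READONLY table-valued parameter, which at least makes the dependency part of the procedure's signature (the type and procedure names below are illustrative):

```sql
-- A user-defined table type makes the shape of the data explicit
CREATE TYPE OrderIdList AS TABLE (OrderID int PRIMARY KEY);
GO

CREATE PROCEDURE ChildProc
    @Orders OrderIdList READONLY  -- dependency is declared, and discoverable
AS
BEGIN
    SELECT COUNT(*) AS OrderCount FROM @Orders;
END;
GO
```

The catch, as noted above, is that the parameter is READONLY, so the child procedure can't modify the caller's data, and the rows still have to be passed across.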
I'd love to hear what you think.
PS: I've added a Connect item on this. Please vote if you agree: https://connect.microsoft.com/SQLServer/feedback/details/716565/sql-server-should-allow-declarative-specification-of-temp-tables-used-as-parameters
There's a very common customer scenario where the customer decides to upgrade to a new version (in this case, let's say SQL Server 2012). They run Upgrade Advisor and note that there are a whole lot of problems. They think, "Hey, that's way too much to deal with right now. I don't have time."
So they don't deal with them. They upgrade the server to 2012 but leave their db_compat level at a lower level with the intention of dealing with the problems later.
But when they later run Upgrade Advisor to find the issues they need to resolve before changing the db_compat level to 2012, it tells them it can no longer run because their server is already at 2012. How can they now check what needs to be changed? They also can't reattach their databases to an earlier-version server, as the databases have been upgraded to the 2012 structure.
I believe that Upgrade Advisor should check db_compat levels, not server versions, before refusing to run. If you agree, you know what to do. Vote once, vote often:
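For what it's worth, the compatibility level itself is easy to inspect and change from T-SQL (the database name below is just a placeholder):

```sql
-- Check the compatibility level of every database on the instance
SELECT name, compatibility_level
FROM sys.databases;

-- Once the issues are resolved, raise a database to the 2012 level (110)
ALTER DATABASE SomeDatabase
SET COMPATIBILITY_LEVEL = 110;
```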
Well it's been a while since I've posted up a new podcast. (I know, I know).
But I've just started a new series for SQL Server 2012. First out of the gate is Roger Doherty (Senior Program Manager in the SQL Server team) with an overview of all the key SQL Server 2012 pillars and enhancements.
You'll find it here:
Nice to see the increase in maximum database size on SQL Azure kicked up to 150GB.
In most enterprises I go into, there are a few databases that wouldn't fit, but the vast majority of databases now would fit in SQL Azure.
Also included in the November release are federations and an updated management portal.
More info here: http://msdn.microsoft.com/en-us/library/windowsazure/ff602419.aspx
Over the years, I've seen several causes of this error in SQL Server Integration Services but today I came across another one.
You can get this error if you've used third-party components (particularly data sources) and the licensing for those components has expired.
Hope that helps someone sometime.
When you first make a connection to the new LocalDB edition of SQL Server Express, the system files required for a new instance are spun up. (The system files, such as the master database files, end up in C:\Users\<username>\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBApp1.) That can take a while on a slower machine, which means the default connection timeout of 30 seconds (in most client libraries) could be exceeded.
To avoid this hit on the first connection, you can create the required instance of LocalDB beforehand using the SqlLocalDB.exe utility, like this:
<path to binaries>\SqlLocalDB.exe create InstanceName
You can also specify the required version of SQL Server and ask to start the instance like this:
<path to binaries>\SqlLocalDB.exe create InstanceName 11.0 -s
Your application should then connect quickly, even on the first connection.
SqlLocalDB documentation is starting to appear now. Documentation on the utility is here: http://msdn.microsoft.com/en-us/library/hh212961(v=sql.110).aspx.
One of the things that I have been pestering the SQL team to do is to name their updates according to what is contained in them. For example, instead of just:
What I'd prefer is that the file was called something like:
So I normally rename them as soon as I receive them, to avoid confusion in future. However, today I found that doing so caused me a problem. After renaming the file and installing it, the installation failed with the error:
"A network error occurred while reading from the file: C:\temp\sqlncli.msi"
A quick inspection of the error shows that the code in the msi is looking for the file by name. Renaming the file back to its original name makes it install OK. It's a pity that the person who coded the installer didn't pick up the file name programmatically, rather than hard-coding it.
Anyway, hope that helps someone that sees this error.
It's great to see that volume 2 of MVP Deep Dives is now available and will be distributed at the PASS summit next week. I'm really sad that I won't be at the book signing next week but I'd encourage you all to get along, order a copy and have it signed.
A huge thanks has to go to Kalen Delaney for her management of this project, and a big thanks to my fellow editors Louis Davidson, Brad McGehee, Paul Nielsen, Paul Randal, and Kimberly Tripp for their efforts. A special mention to Paul Nielsen, whose ideas and spirit around volume 1 have continued into this new volume.
And of course, a really big thank you to all the authors who gave their time to make this possible.
Please buy a copy and help us to help Operation Smile. You'll find the book's website here: http://www.manning.com/delaney/
While you're at it, why not send an extra donation to Operation Smile: https://secure.operationsmile.org/site/Donation2?df_id=10380&10380.donation=form1