Fascinating demo of surface technology

Fellow RD Tim Huckaby’s company has been pushing the capabilities of the Surface platform and building a really interesting set of apps. Most are still embargoed, but this one is now allowed to be shown:




I suspect many of you will find it really interesting. Enjoy!


Modifying the Thesaurus in Full Text Search in SQL Server 2008

While I was prepping my full-text search session for TechEd Australia today, I decided to modify the thesaurus. I found the discussion in Books Online a bit confusing regarding the locations of the files involved.

What threw me was that it said the default location of the thesaurus files was:

SQL_Server_install_path\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\FTDATA\

I went there and couldn't find anything. I worked out that for the ENU language, my file should be called tsENU.xml. I searched for that and found it in:

C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Binn\Templates

I modified that and got nowhere. Turns out that as the name implies, this is where the templates for the files live. I then tried creating the missing folder under the stated default location and also got nowhere. Turns out that it's actually related to your data directory location. Where I needed to put it was:


because I'd used C:\SQLData as my data folder. So it's not the SQL Server install path, it's the data file folder. Anyway, that file contains the following XML:

<XML ID="Microsoft Search Thesaurus">
<!--  Commented out
    <thesaurus xmlns="x-schema:tsSchema.xml">
        <diacritics_sensitive>0</diacritics_sensitive>
        <expansion>
            <sub>Internet Explorer</sub>
            <sub>IE</sub>
            <sub>IE5</sub>
        </expansion>
        <replacement>
            <pat>NT5</pat>
            <pat>W2K</pat>
            <sub>Windows 2000</sub>
        </replacement>
        <expansion>
            <sub>run</sub>
            <sub>jog</sub>
        </expansion>
    </thesaurus>
-->
</XML>

The template file has three examples commented out. The diacritics_sensitive entry controls whether or not accents matter, i.e. in words like café. Expansion entries are used to provide synonyms: in this case, if you search for any one of "Internet Explorer", "IE" and "IE5", it will search for all of them. Replacement is used to force an alternate lookup: in the example above, if you search for either "NT5" or "W2K", it will search for "Windows 2000" instead.
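As a sketch of what a customized file looks like (the synonym pair here is hypothetical), note that the thesaurus element must be moved outside the comment block before it takes effect. A minimal tsENU.xml that treats "SQL Server" and "MSSQL" as synonyms would be:

<XML ID="Microsoft Search Thesaurus">
    <thesaurus xmlns="x-schema:tsSchema.xml">
        <diacritics_sensitive>0</diacritics_sensitive>
        <expansion>
            <sub>SQL Server</sub>
            <sub>MSSQL</sub>
        </expansion>
    </thesaurus>
</XML>

Once the file is saved into the FTData folder and reloaded, a full-text query on either term should match rows containing both.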

After modifying the correct file, I then caused it to be loaded by specifying:

EXEC sys.sp_fulltext_load_thesaurus_file 1033 

and then it was working fine with queries like:

SELECT MessageID, Description
FROM dbo.Messages
WHERE CONTAINS(Description, 'FORMSOF(THESAURUS, jog)');

Hope this helps someone get started with it.





Data Driven Subscriptions in SQL Server 2005 Standard Edition

Recently, I was working at a client's site where SQL Server 2000 was still in use. They wanted to upgrade to SQL Server 2005. The only reason they were looking at the Enterprise Edition was for data-driven subscriptions in Reporting Services. The CEO certainly wasn't interested in paying the sort of dollars required for that, just to be able to automatically email out a set of monthly reports to about 30 or so folk within the company.

A quick bit of research led me to Jason L. Selburg's article at Code Project on how he achieved this by modifying the subscription details in the ReportServer database. You'll find his original post here: http://www.codeproject.com/KB/database/DataDrivenSubscriptions.aspx

I wanted to tackle a more complete version than Jason provided. His version had a few key limitations:

1. You'd have to modify the stored procedure every time your parameter list changed (and potentially have different stored procedures for different reports).

2. He used deprecated commands for manipulating the ntext columns. I wanted to avoid these.

3. The code didn't deal with concurrent usage.

Below, you'll find an alternate version of the proc. The concept is still the same. You create a subscription to a report, using tokens for parameters, set the subscription to happen in the past and then execute it using this proc in a scheduled Agent job. The different approach I've taken is:

1. I've provided the ability to deal with a parameter list rather than a single parameter.

2. I found that the ntext columns actually only contained simple xml. While I could have used XQuery to modify it, it's easier to directly manipulate it as varchar data (could have used nvarchar), given all we're doing is replacement of tokens with values. Either way, this avoids the use of UPDATETEXT etc.

3. I've used more of my own style of naming conventions.

4. I've used a global temporary table as a flag to limit concurrent use. (Without this, the scheme could fail).

Hope you find it useful.





/*
  Creates the effect of a data-driven subscription by replacing the fields in
  an existing subscription with the supplied values, executing the report
  and then replacing the original values.

  Parameters:
    @ScheduleID    The Job Name in SQL Server
    @EmailTo       The TO address of the email
    @EmailCC       The Carbon Copy address of the email
    @EmailBCC      The Blind Copy address of the email
    @EmailReplyTo  The Reply To address of the email
    @EmailBody     Any text that you want in the email body
    @ParameterList The parameters for the report in the format
                   'Parameter1Token,Parameter1Value,Parameter2Token,Parameter2Value…'
                   Example: '|StartDate|,20071231,|Salesperson|,GE,|Region|,NW'

  Author: Greg Low, based on a concept from Jason L. Selburg at CodeProject.com

  Limitations:
    ParameterTokens and ParameterValues are limited to 1000 characters
    EmailBody is limited to 8000 characters
    ParameterList is limited to 8000 characters total
*/

CREATE PROCEDURE dbo.ExecuteSubscribedReport
( @ScheduleID uniqueidentifier,
  @EmailTo varchar (1000) = NULL,
  @EmailCC varchar (1000) = NULL,
  @EmailBCC varchar (1000) = NULL,
  @EmailReplyTo varchar (1000) = NULL,
  @EmailBody varchar (8000) = NULL,
  @ParameterList varchar (8000) = NULL
)
AS
BEGIN
  DECLARE @extensionSettingsPointer binary(16),
          @parametersPointer binary(16),
          @tokenPosition int,
          @tokenLength int,
          @subscriptionID uniqueidentifier,
          @parameterToken varchar(1000),
          @parameterValue varchar(1000),
          @parameterPosition int,
          @numberOfParameters int,
          @parameterCounter int,
          @character varchar(1),
          @parseStatus varchar(1), -- 0 ready for another token, 1 in a token, 2 in a value
          @originalExtensionSettings varchar(8000),
          @originalParameters varchar(8000),
          @newExtensionSettings varchar(8000),
          @newParameters varchar(8000);
  DECLARE @parameters TABLE (ParameterID int IDENTITY(1,1),
                             ParameterToken varchar(1000),
                             ParameterValue varchar(1000));

  -- first we need to unpack the parameter list
  IF @ParameterList IS NOT NULL BEGIN
    SET @parameterPosition = 1;
    SET @parseStatus = 0;
    SET @parameterToken = '';
    SET @parameterValue = '';
    SET @numberOfParameters = 0;
    WHILE @parameterPosition <= LEN(@ParameterList) BEGIN
      SET @character = SUBSTRING(@ParameterList,@parameterPosition,1);
      IF @character = ',' BEGIN
        IF @parseStatus = 0 BEGIN -- we had two commas in a row or the first character was a comma
          PRINT 'ParameterList has incorrect format';
          RETURN 1;
        END
        ELSE IF @parseStatus = 1 BEGIN -- we are at the end of the token
          SET @parseStatus = 2;
          SET @parameterValue = '';
        END
        ELSE BEGIN -- we are at the end of a value
          INSERT @parameters (ParameterToken,ParameterValue)
            VALUES (@parameterToken,@parameterValue);
          SET @numberOfParameters = @numberOfParameters + 1;
          SET @parseStatus = 0;
          SET @parameterToken = '';
        END;
      END
      ELSE BEGIN
        IF @parseStatus = 0 BEGIN -- we have the first character of a token
          SET @parseStatus = 1;
          SET @parameterToken = @parameterToken + @character;
        END
        ELSE IF @parseStatus = 1 BEGIN -- we have another character in a token
          SET @parameterToken = @parameterToken + @character;
        END
        ELSE BEGIN -- we have another character in a value
          SET @parameterValue = @parameterValue + @character;
        END;
      END;
      SET @parameterPosition = @parameterPosition + 1;
    END;
    IF @parseStatus = 2 BEGIN -- we were still collecting a value
      INSERT @parameters (ParameterToken,ParameterValue)
        VALUES (@parameterToken,@parameterValue);
      SET @numberOfParameters = @numberOfParameters + 1;
    END;
  END;

  -- we need to wait for our turn at using the subscription system
  WHILE EXISTS(SELECT 1 FROM tempdb.sys.objects WHERE name = '##ReportInUse')
    WAITFOR DELAY '00:00:30';
  CREATE TABLE ##ReportInUse (ReportID int);

  -- once we have the parameters unpacked, we now need to find the subscriptionID
  SELECT @subscriptionID = SubscriptionID
    FROM dbo.ReportSchedule
    WHERE ScheduleID = @ScheduleID;

  -- next we save away the original values of ExtensionSettings and Parameters
  -- (we use them to make it easy to put the values back later)
  -- they are actually xml but it'll be easier to work with them as strings
  SELECT @originalExtensionSettings = CAST(ExtensionSettings AS varchar(8000)),
         @originalParameters = CAST(Parameters AS varchar(8000))
    FROM dbo.Subscriptions
    WHERE SubscriptionID = @subscriptionID;

  SET @newExtensionSettings = @originalExtensionSettings;
  SET @newParameters = @originalParameters;

  -- if they have supplied arguments i.e. not NULL and not blank, process them
  IF COALESCE(@EmailTo,'') <> ''
    SET @newExtensionSettings = REPLACE(@newExtensionSettings,'|TO|',@EmailTo);
  IF COALESCE(@EmailCC,'') <> ''
    SET @newExtensionSettings = REPLACE(@newExtensionSettings,'|CC|',@EmailCC);
  IF COALESCE(@EmailBCC,'') <> ''
    SET @newExtensionSettings = REPLACE(@newExtensionSettings,'|BC|',@EmailBCC);
  IF COALESCE(@EmailReplyTo,'') <> ''
    SET @newExtensionSettings = REPLACE(@newExtensionSettings,'|RT|',@EmailReplyTo);
  IF COALESCE(@EmailBody,'') <> ''
    SET @newExtensionSettings = REPLACE(@newExtensionSettings,'|BD|',@EmailBody);

  IF @numberOfParameters > 0 BEGIN
    -- process each parameter in turn
    SET @parameterCounter = 1;
    WHILE @parameterCounter <= @numberOfParameters BEGIN
      SELECT @parameterToken = ParameterToken,
             @parameterValue = ParameterValue,
             @tokenLength = LEN(ParameterToken)
        FROM @parameters
        WHERE ParameterID = @parameterCounter;
      SET @newParameters = REPLACE(@newParameters,@parameterToken,@parameterValue);
      SET @parameterCounter = @parameterCounter + 1;
    END;
  END;

  -- temporarily update the values
  UPDATE dbo.Subscriptions
    SET ExtensionSettings = CAST(@newExtensionSettings AS ntext),
        Parameters = CAST(@newParameters AS ntext)
    WHERE SubscriptionID = @subscriptionID;

  -- run the job
  EXEC msdb..sp_start_job @job_name = @ScheduleID;

  -- allow enough delay for the report to have started
  WAITFOR DELAY '00:00:30';

  -- put the original extensionsettings and parameter values back
  UPDATE dbo.Subscriptions
    SET ExtensionSettings = CAST(@originalExtensionSettings AS ntext),
        Parameters = CAST(@originalParameters AS ntext)
    WHERE SubscriptionID = @subscriptionID;

  -- finally we free up the subscription system for another person to use
  DROP TABLE ##ReportInUse;
END;


EXEC dbo.ExecuteSubscribedReport
   @ScheduleID = '4CE38C83-6A03-4780-895A-92FD6F8FD5B0',
   @EmailTo = 'greg@sqldownunder.com',
   @EmailCC = 'ozinfo@sqldownunder.com',
   @EmailBCC = 'info@sqldownunder.com',
   @EmailReplyTo = 'greg@sqldownunder.com',
   @EmailBody = 'Hello Greg',
   @ParameterList = '|StartDate|,20071231,|Salesperson|,GE,|Region|,NW';




Indexing Foreign Keys – should SQL Server do that automatically?

I've been doing another performance tuning job today and it highlighted once again the problems that a lack of indexes on foreign key columns can bring.

By adding indexes on the foreign keys on three tables, we saw a reduction of 87% in total I/O load on the server. There are other aspects of the system that I'm now working on but it really struck me that having SQL Server do this by default would avoid a lot of apparent performance problems. It would have to be one of the most common indexing issues that I see in my work.
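As a minimal sketch (the table, column, and index names here are hypothetical), the fix is usually a single nonclustered index on the referencing column:

-- Hypothetical example: dbo.Orders.CustomerID references dbo.Customers.
-- Declaring the foreign key constraint does not create this index for you.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON dbo.Orders (CustomerID);

Without it, every DELETE or key UPDATE on the referenced Customers table has to scan Orders to check the constraint, and joins along the relationship can't seek on the foreign key side.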

What do you think? Should SQL Server simply do this by default when you declare a foreign key reference?

Would you like a Visual Studio MSDN Premium License for a Year?

In the latest podcasts at www.sqldownunder.com, we're giving away a Visual Studio MSDN Premium Online license for a year, courtesy of our friends at Microsoft. It's simple:

* you need to visit Quest's site for info on Litespeed via the simple link http://shrinkster.com/10fn,

* read the article covering 10 things DBAs often don't know about Litespeed

* email a description of two of them to questATsqldownunder.com (replacing AT with @) before August 17th 2008

I'll select one from those that come in correctly and give that person the license. Easy enough?

Thanks to all at Microsoft.



Geek: New Phone -> Yes it's an iPhone

I've also been looking around for a new phone. It became much more urgent last week after I dropped my iMate.


After trying lots of phones, I ended up opting for an iPhone 3G, and after using it for a few days, I couldn't be happier with it. Well, that's not quite true: if it worked as a NextG modem as well, it would be even better, as data plans are quite costly here in Australia.


I've heard people raving about the user interface and wondered if it was just all hype. It's not. The user interface is just so much better than what I’ve used on Windows Mobile devices. I didn’t know how I’d go just using a finger instead of a stylus but I really don’t miss it at all. The setup was beyond simple. When I compare what I had to do to get it to sync with Outlook with the struggles I had with my iMate and Windows Mobile Device Center on Vista, I was left shaking my head as to how simple it was. In fact, it left me thinking that it was exactly the experience that I should have had with WMDC but didn't and gave me faith that these things don't have to be so hard for the end user.


If I’ve had a hassle with anything, it’s been my preconceptions about how the interface should work, based on a Windows Mobile background. Instead of working out how to give you a Windows-like interface in a small package, they seem to have just spent time thinking about how a really good phone would work. First and foremost, a phone needs to be a good phone.


In general, if you just do the most logical thing, you find that’s how the phone works. For example, I spent a short while trying to work out how to call a phone number on the screen. I was looking for selections and menu options, etc. But you simply point at something that looks like a phone number and it asks if you want to call it. It was only my right-click (or hold and tap) background that had me looking for something else.


I also really like the fact that the interface isn’t designed for people with super-eyesight. It’s clear, even outdoors.


For the car, I purchased a SuperTooth Light (BlueTooth) hands-free unit. It was $99AUD and I found it trivial to setup and use. (http://www.supertooth.net/light.html).


Both recommended!


Now the decision is whether or not I want my next notebook to be a Mac so I can dual boot and do development for the iPhone. I took a good look at what they offer developers the other day and it's pretty slick also. I watched a number of the getting started videos and it has me wanting to try it. I've got such a good feeling about what could be done with this quality of interface.

Geek: New Monitor -> Dell 24-inch UltraSharp

I've been chasing around for a while for a new monitor. I find I need to work at 1920×1200, but my eyesight doesn't let me work well on 19-inch screens.

Fellow MVP Ken Schaefer suggested trying a Dell 24-inch UltraSharp (http://accessories.us.dell.com/sna/products/Monitors/productdetail.aspx?c=us&l=en&s=dhs&cs=19&sku=320-6272).

It arrived today. What can I say? I love it.


Parallel Programming In TSQL: Is It In Our Future?

I saw an interesting video podcast from Steve Jones the other day in which he noted that upcoming processors will have many more cores than today's, and hoped that DBAs won't have to learn to write multiprocessor-style code. Also interesting was his description of how he had learned some of this style of coding, but when he came back to it later, he realised how much he had only thought he knew.

Languages like T-SQL have no inherent support for multi-threading. In fact, the only trace of it I can see in T-SQL today is the ability to have multiple readers on a Service Broker queue. In general, we haven't needed it, because SQL Server systems constantly have many requests thrown at them concurrently anyway, so there is a natural style of parallelism happening.
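As a sketch of that one exception (the queue and procedure names here are hypothetical), Service Broker lets you declare how many activation readers the engine may run concurrently:

-- Hypothetical queue: allow up to 5 instances of the activation
-- procedure to process messages from the queue at the same time.
ALTER QUEUE dbo.OrderQueue
WITH ACTIVATION
   ( STATUS = ON,
     PROCEDURE_NAME = dbo.ProcessOrderMessages,
     MAX_QUEUE_READERS = 5,
     EXECUTE AS OWNER );

Notably, the developer only states the degree of parallelism; the engine decides when to spin up additional readers.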

But if languages need to evolve, I think it's important that, wherever possible, it's the languages that work out how to parallelise the code, not the developer. I often mention to people that in one interview I did with the late Dr Jim Gray, he noted that people learning to write multithreaded code tend to go through three phases:

1. when it all just looks like magic and you don't understand any of it.

2. when you think you understand it.

3. when you get wise.

That's very insightful and has stuck with me ever since. I've written a bunch of multithreaded code over the years and my experience exactly parallels (pun intended) the phases he described. Getting it right is much, much harder than it looks.