Suggestion: ALLFILES option for RESTORE

The default action when performing a backup is to append to the existing backup file, yet the default action when restoring is to restore only the first backup set within the file.

I constantly come across situations where customers are puzzled that they seem to have lost data after completing a restore. Invariably, it's just that they haven't restored all the backups contained within a single OS file. This happens most commonly with log backups, but it also happens when they haven't restored the most recent database backup file.

Restoring all the backups within an OS file is not trivial to achieve in simple T-SQL scripts when the number of backup sets within the file is unknown. It really should be.

I'd like to see a FILES=ALLFILES option on the RESTORE command. For RESTORE DATABASE, it should restore the most recent database backup plus any subsequent log backups. For RESTORE LOG (which is the most important missing option), it should just restore all relevant log backups that the file contains.

If you agree, you know what to do. Please vote here:

https://connect.microsoft.com/SQLServer/feedback/details/769204/option-to-restore-all-backups-files-within-a-media-set

Alternatively, how would you write a T-SQL command to restore all log backups within a single OS file where the number of backup sets is unknown? I'd love to hear creative solutions, because all the ones I can think of are pretty messy and need dynamic SQL.
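For example, here's a sketch of the sort of messy, dynamic-SQL approach I mean. The database name (MyDB) and the backup path are placeholders; it simply tries each backup set in turn until a restore fails, which it takes to mean it has run past the end of the file:

```sql
-- Sketch only: MyDB and the path below are placeholder names.
-- Apply each log backup set in the OS file in turn, WITH NORECOVERY.
DECLARE @FileNumber int = 1,
        @sql nvarchar(max);

WHILE 1 = 1
BEGIN
    SET @sql = N'RESTORE LOG MyDB'
             + N' FROM DISK = N''C:\Backups\MyDB_Log.bak'''
             + N' WITH FILE = ' + CAST(@FileNumber AS nvarchar(10))
             + N', NORECOVERY;';
    BEGIN TRY
        EXECUTE (@sql);
    END TRY
    BEGIN CATCH
        BREAK; -- assume we've run past the last backup set in the file
    END CATCH;
    SET @FileNumber += 1;
END;

-- Bring the database online once the log chain has been applied:
RESTORE DATABASE MyDB WITH RECOVERY;
```

The tidier alternative is to capture the output of RESTORE HEADERONLY into a table and loop over the Position column, but that needs a table with a version-specific list of columns, which is exactly the sort of mess I'm complaining about.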

 

Update from Ola Hallengren: Target multiple devices during SQL Server backup

Ola has produced another update of his database management scripts. If you haven't taken a look at them, you should. At the very least, they'll give you good ideas about what to implement and how others have done so. The latest update allows targeting multiple devices during backup. This capability is available in native SQL Server backup and can be helpful with very large databases, and Ola's scripts now support it as well.

Details are here:

http://ola.hallengren.com/sql-server-backup.html

http://ola.hallengren.com/versions.html

The following example shows it backing up to 4 files on 4 drives, one file on each drive:

EXECUTE dbo.DatabaseBackup
    @Databases = 'USER_DATABASES',
    @Directory = 'C:\Backup, D:\Backup, E:\Backup, F:\Backup',
    @BackupType = 'FULL',
    @Compress = 'Y',
    @NumberOfFiles = 4

And this example shows backing up to 16 files on 4 drives, 4 files on each drive:

EXECUTE dbo.DatabaseBackup
    @Databases = 'USER_DATABASES',
    @Directory = 'C:\Backup, D:\Backup, E:\Backup, F:\Backup',
    @BackupType = 'FULL',
    @Compress = 'Y',
    @NumberOfFiles = 16

Ola mentioned that you can now back up to as many as 64 drives.
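For comparison, the native T-SQL syntax simply lists multiple devices in the BACKUP statement. A minimal sketch (the database name and paths are placeholders):

```sql
-- Placeholder names: back up MyDB across four files, one per drive.
BACKUP DATABASE MyDB
TO DISK = N'C:\Backup\MyDB_1.bak',
   DISK = N'D:\Backup\MyDB_2.bak',
   DISK = N'E:\Backup\MyDB_3.bak',
   DISK = N'F:\Backup\MyDB_4.bak'
WITH COMPRESSION;
```

Keep in mind that the devices then form a single media set, so all of them are needed when restoring.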

 

Opinion: Passwords as a concept are completely broken

One thing you get to do as you get older, or once you've been around the industry for a long time, is pontificate. My pet topic today is passwords. I think that they are, as a concept, now completely broken and have been for a long time.

We tell users:

1. Pick something really complex

2. Don't write it down

3. Change it regularly

4. Use a different password for each site, and often each role that you hold in each site

5. Deal with the fact that we apply different rules for passwords on each site

etc, etc.

Is this even humanly possible? I don't think it is. Yet we blame the users when "they" get it wrong. How can they be getting it wrong when we design a system that requires super-human ability to comply? (These guys are potential exceptions: http://www.worldmemorychampionships.com/)

We are the ones who are getting it wrong, and it's long overdue for us, as an industry, to apply our minds to fixing it, instead of assuming that users should just deal with it.

SQL Down Under Show 51 – Guest Conor Cunningham – Now online

Late last night I got to record an interview with Conor Cunningham.

Most people that know Conor have come across him as the product team wizard that knows so much about query processing and optimization in SQL Server. Conor is currently spending quite a lot of time working on Windows Azure SQL Database, which we used to know as SQL Azure.

I'm still trying to think of a good way to say "WASD". I suppose I'll pronounce it like "wassid". Windows Azure SQL Reporting is easier. I think it just needs to be pronounced like "wazza" with a very Australian accent.

In the show, we've spent time on the current state of the platform, on dispelling a number of common misbeliefs about the product, and hopefully on answering most of the common questions that seem to get asked about it. We then ventured into Federations, Data Sync, and Reporting.

You'll find the show (and previous shows) here: http://www.sqldownunder.com/Resources/Podcast.aspx

Enjoy!

PS: For those that like transcripts, we've got the process for producing them much improved now and the transcript should also be up within a few days.

SQL Down Under Podcast 50 – Guest Louis Davidson now online

Hi Folks,

I've recorded an interview today with SQL Server MVP Louis Davidson. In it, Louis discusses some of his thoughts on database design and his latest book.

You'll find the podcast here: http://www.sqldownunder.com/Resources/Podcast.aspx

And you'll find his latest book (Pro SQL Server 2012 Relational Database Design and Implementation) here: http://www.amazon.com/Server-Relational-Database-Implementation-Professional/dp/1430236957/ref=sr_1_2?ie=UTF8&qid=1344997477&sr=8-2&keywords=louis+davidson

Enjoy!

Query Performance Tuning class now has online option

The query performance tuning class has been a popular class but we've only been running it in local cities. We're now running it in a fully-interactive instructor-led online class, complete with hosted hands-on labs. Later this year it will also be available amongst our upcoming on-demand training offerings.

If you're keen to spend some time in a query performance tuning class, let us know.

For the online offering, we've spread it over four half-days to make it easier to attend. We've got the same hands-on labs and course content as the in-person class. The labs are hosted, so you don't need to provide equipment, but we have a lower-cost attendance option if you don't need the hosted labs.

The initial course is scheduled to suit the Asia-Pacific time zone, but we intend to alternate it with an offering to suit US time zones.

Details are online here: http://www.sqldownunder.com/Training/SQLServerQueryPerformanceTuning.aspx

Run database checks but omit large tables or filegroups – New option in Ola Hallengren's Scripts

One of the things I've always wanted in DBCC CHECKDB is the option to omit particular tables from the check. The situation that I often see is that companies with large databases often have only one or two very large tables. They want to run a DBCC CHECKDB on the database to check everything except those couple of tables due to time constraints.

I posted a request about this on the Connect site some time ago:

https://connect.microsoft.com/SQLServer/feedback/details/611164/dbcc-checkdb-omit-tables-option

The workaround from the product team was that you could script out the checks that you did want to carry out, rather than omitting the ones that you didn't. I wasn't keen on this workaround, as clients often have a very large number of objects that they do want to check and only one or two that they don't.

I've always been impressed with the work that our buddy Ola Hallengren has done on his maintenance scripts. He pinged me recently about my old Connect item and said he was going to implement something similar. The good news is that it's available now.

Here are some examples he provided of the newly-supported syntax:

EXECUTE dbo.DatabaseIntegrityCheck
    @Databases = 'AdventureWorks',
    @CheckCommands = 'CHECKDB'

EXECUTE dbo.DatabaseIntegrityCheck
    @Databases = 'AdventureWorks',
    @CheckCommands = 'CHECKALLOC,CHECKTABLE,CHECKCATALOG',
    @Objects = 'AdventureWorks.Person.Address'

EXECUTE dbo.DatabaseIntegrityCheck
    @Databases = 'AdventureWorks',
    @CheckCommands = 'CHECKALLOC,CHECKTABLE,CHECKCATALOG',
    @Objects = 'ALL_OBJECTS,-AdventureWorks.Person.Address'

EXECUTE dbo.DatabaseIntegrityCheck
    @Databases = 'AdventureWorks',
    @CheckCommands = 'CHECKFILEGROUP,CHECKCATALOG',
    @FileGroups = 'AdventureWorks.PRIMARY'

EXECUTE dbo.DatabaseIntegrityCheck
    @Databases = 'AdventureWorks',
    @CheckCommands = 'CHECKFILEGROUP,CHECKCATALOG',
    @FileGroups = 'ALL_FILEGROUPS,-AdventureWorks.PRIMARY'

Note the minus-sign syntax for excluding an object from the list of objects, and the matching option for excluding a filegroup.

Nice! Thanks Ola!

You'll find details here: http://ola.hallengren.com/

 

SQL Down Under Podcast – Gadi Peleg – Data Quality Services

Well it's been a few months but I'm back on a roll creating some SQL Down Under podcasts. The first out the door is an interview with Gadi Peleg from the SQL Server team, introducing Data Quality Services.

Gadi came to Microsoft when Zoomix was acquired.

Details of this podcast (and other available podcasts) are here: http://www.sqldownunder.com/Resources/Podcast.aspx

Hope you enjoy it even though there are some telling signs that I recorded it at 3AM 🙂

If you are using iTunes, you can also subscribe here: http://itunes.apple.com/au/podcast/sql-down-under/id503822116?mt=2

Free eBooks – SQL Server and other Microsoft Technologies

Great to see the advice from Gail Erickson about the release of a number of SQL Server-related eBooks on the new Microsoft eBook Gallery site. It's good to see this sort of content moving over to eBook formats.

The e-books that are currently available include:

  • SQL Server 2012 Transact-SQL DML Reference
  • Master Data Services Capacity Guidelines
  • Microsoft SQL Server AlwaysOn Solutions Guide for High Availability and Disaster Recovery
  • QuickStart: Learn DAX Basics in 30 Minutes
  • Microsoft SQL Server Analysis Services Multidimensional Performance and Operations Guide

You'll find details of them here: http://social.technet.microsoft.com/wiki/contents/articles/11608.e-book-gallery-for-microsoft-technologies.aspx