Has the term "big data" completely lost meaning yet?

There are some terms in IT that go straight into the hype stratosphere, and unfortunately “big data” is one of them. I see very few systems that I’d contend are actually “big data”, yet I endlessly see the term applied to data stores that are trivial by today’s standards. That might help the marketing teams, but it’s sad nonetheless. There are technological challenges that really do start to bite as data volumes become genuinely large, and as the proportion of unstructured and semi-structured data increases. There are also some very interesting new tools that let us process larger volumes of data faster, particularly for analytics, and there is a large market building around Hadoop and its derivatives.

I also see entire teams that claim to focus on big data, yet whenever I discuss their projects with them, none of them are working with databases that are even vaguely in the ballpark of what anyone would have considered big data ten years ago, let alone today. None of the people involved have ever dealt with even really large tables by today’s standards. It’s interesting that so many data-related jobs have suddenly become “big data” jobs. I’d love to know what these teams think they mean when they say that they “focus” on these areas. It simply isn’t possible for so many of them to do so.

I had a chuckle when I read this blog post from Warwick Leitch: Call it big data if you like… but you will look foolish. In that example, Warwick was referring to survey data that was held in a spreadsheet…

For a more serious take on this subject though, there is some interesting material in Stephen Few’s recent blog post: Big Data, Big Deal. Stephen argues that big data is simply a marketing campaign. As always, the comments associated with the blog post make for reading that’s as interesting as the post itself. I don’t totally agree with Stephen on this, as there really has been quite a shift in the available tooling in recent years, but much of his discussion is right on target.

Ironically, yesterday I was working with a team on a project that I would qualify as “big data”, yet they had never thought to call it that.

I suspect we as an industry need to start quantifying what the term “big data” really means at a given point in time; it’s clearly a relative term that changes as technology advances. Otherwise we should either define it further or drop it entirely, as there is currently a great deal of confusion around it.

The whole discussion reminded me of this wonderful xkcd cartoon that compared production levels in the energy industry: http://xkcd.com/1162/

One of the more amusing calls I had last year was with a US-based fast food chain. They told me that they were okay using SQL Server for their analytic work, but they’d decided they needed Oracle for the main database engine, based on the volume of data they needed to handle efficiently. The Oracle sales guy had done a good job. I was intrigued by what volume of data they thought justified this. Later, it became apparent that it was about 30GB…

Without triggering a “that’s not a knife, that’s a knife” moment, I’d love to hear what others currently consider “big data”. I don’t consider “using Hadoop (or HDInsight)” a synonym for “working with big data”.

2013-02-10