One of the interesting things we found interviewing Big Data experts is how they define Big Data.
Most articles and studies define Big Data as:
a collection of datasets so large and complex that it becomes difficult to process using traditional database management tools or data processing applications.
Most definitions also discuss the 3 Vs - volume, variety and velocity - and how the Internet, mobile computing and smart objects are leading to vast increases in all three.
When we asked experts to define Big Data, they consistently gave us some form of this definition. But in the discussions that followed, their usage of Big Data indicated a different definition.
The experts consistently talked about Big Data as any use of the data and analytics that was bigger than normal for their firm or clients.
In multiple cases the experts even talked about using Excel to analyze their Big Data datasets. Obviously, datasets that can be analyzed in Excel are not too big for traditional IT tools.
Also, the experts' usage of Big Data made it clear they include data analysis when they use the term. For example, we heard things like "we use Big Data to target customers better" and "Big Data has really improved our supply chain".
This usage makes perfectly good sense. The "Big" in Big Data is situational - what's big to one firm or person may be very small to another. And data is of little value unless it's analyzed and turned into useful insights.
Because of these interviews we're thinking differently about Big Data. We now define it as:
Any use of data and analytics that leads to actionable information where the volume, variety and/or velocity of the analyzed data is greater than normal for the firm/user.
We think this works for businesses of all sizes and better fits the way the term is being used. We also think it better reflects the importance of turning data into information.
For more on Big Data and what it means for small businesses, see the Intuit 2020 report The New Data Democracy.