In our love affair with all things Big Data, we easily forget the nuances that make it more about ubiquitous data than anything else. The oversimplification of the term is what frustrates many. Just last weekend, Forrester’s John Rymer penned “Big Data: The Worst Category Name Ever.”
Strong words? Not really. The term Big Data has been grossly oversimplified by media looking to give complex ideas a simple name and by vendors wanting to sell one-size-fits-all solutions. The truth is that Big Data takes an infrastructure village of several technologies, including process management, to ‘work’.
While Big Data has been loosely defined by volume, velocity, and variety, the volatility of the data is a key nuance. Volatility speaks to the lifespan of data and comes in three flavors. These flavors define how data gets treated at the lowest level: when it enters our organization from outside, or pops out of an organizational filter or source like social media, analytics, or calculation.
The three flavors have everything to do with gauging immediacy and look like this:
Now – From events to processes
Information is often so volatile that the moment we are aware of it, it needs to be acted upon as an alert or the start of a process. When an order hits the system, an invoice goes out immediately, hastening payment and improving cash flow. When a trader breaches a limit, someone needs to be told and permissions restricted. This is the most volatile of data, with a very short half-life for opportunity or risk. It has operational implications and must be consumed by smart systems in real time.
Soon – From analytics to action
There’s enormous value in immediately flagging data that can enrich the output of analytics. We get bulk files from a partner that immediately need to be put through algorithms to find better answers to supply chain issues. Sales details reveal new buying patterns or trigger stock movements. This is less volatile data, but it is important to tactical decision making.
Someday – Let’s keep it just in case
Gartner’s Doug Laney coined the term Infonomics to describe the concept of data as a formal business asset. Knowing that data has value that may not be apparent, there’s a new emphasis on saving what you have on the chance that it will reveal a meaningful pattern later or will be valuable to a partner or ‘data customer’. This is the least volatile data, but it could have unseen value and needs to be stored as a strategic asset.
These three simple flavors of data volatility need to be part of an enterprise strategy for managing information at a very discrete level, where decisions are made in the moment and automation is at a premium. Clever enterprises manage these data flows through a service-oriented infrastructure and across an information bus, using tools that give flexibility and intelligence to ubiquitous data.
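To make the triage concrete, here is a minimal sketch of how an information bus might route incoming data by its volatility flavor. The `Volatility` enum, the `route` function, and the three destinations are hypothetical names chosen for illustration, not part of any particular product or the article’s source systems:

```python
from enum import Enum

class Volatility(Enum):
    NOW = "now"          # act immediately: alerts, process kickoffs
    SOON = "soon"        # feed into analytics for tactical decisions
    SOMEDAY = "someday"  # retain on the chance of future value

def route(event: dict) -> str:
    """Triage an incoming event by its volatility flavor.

    Anything unlabeled defaults to SOMEDAY, on the Infonomics
    premise that data may hold value that is not yet apparent.
    """
    flavor = event.get("volatility", Volatility.SOMEDAY)
    if flavor is Volatility.NOW:
        return "dispatch to real-time process engine"
    if flavor is Volatility.SOON:
        return "queue for analytics pipeline"
    return "archive to long-term store"

# An order hitting the system is 'Now' data: invoice immediately.
print(route({"type": "order", "volatility": Volatility.NOW}))
# A partner's bulk file is 'Soon' data: run it through the algorithms.
print(route({"type": "bulk_file", "volatility": Volatility.SOON}))
```

The point of the sketch is the default branch: treating unclassified data as “Someday” rather than discarding it is what turns volatility handling into an asset-management policy.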
Are you ready? There’s every indication that the challenges of data volatility are only going to increase, and become a bigger factor in success or failure.