May 02, 2016 • Features • Future of Field Service • big data • Bill Pollock • Business Analytics
Bill Pollock, President and Principal Consulting Analyst with Strategies for GrowthSM explains why Big Data isn’t the holy grail, instead focus on the quality, accuracy, accessibility and application of the data you routinely collect...
While much of the ongoing discourse in the global Information Technology (IT) community nowadays centres on hot topics such as the Internet of Things (IoT) and “Big Data”, research has shown that it is not necessarily the size of the database that matters, but rather the quality, accuracy, accessibility and application of the relevant data that is routinely collected, analysed and shared throughout the organisation.
In other words, data does not necessarily need to be “big”; it simply needs to be relevant, accessible and actionable, in order to be useful.
However, this is an important distinction that is missed by many!
First, let’s talk about what the “big” in “big data” really is. According to IBM, we create 2.5 quintillion bytes of data every day – so much, in fact, that 90% of the data in the world today has been created in the last two years alone.
As a result, field service organisations now have access to an unprecedented amount of data about the performance of their technicians, their vehicles, the equipment they service and their business performance in general.
“The rule of thumb is more a matter of focusing primarily on the data that you ‘need-to-know’ rather than collecting data that is only ‘nice-to-know’”
Other questions are also bandied about, such as “how big is too big data?”, and “what constitutes “big enough” data?”
It is typically in their responses to these types of questions that many field service organisations initially go wrong – that is, they incorrectly believe that because they have already collected mountains of data from multiple sources (i.e., service call activity records, closed call reports, technician-generated utilisation and/or productivity reports, machine-to-machine communications, etc.), they must use all of these data in as many scenarios as possible.
But the rule of thumb is more a matter of focusing primarily on the data that you “need-to-know”, rather than collecting data that is only “nice-to-know”.
The difference between these two types of data may appear subtle at first glance, but it is an important distinction. Data collection, in and of itself, requires a significant expenditure of time, resources and investment, in both human and financial terms; it must be gathered, analysed and disseminated through a highly organised and controlled process, with direct senior management oversight and accountability; and it must bridge virtually all areas within the organisation, from the top down, the bottom up, and all throughout.
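As a rough illustration of that “need-to-know” triage, the sketch below (in Python, with entirely hypothetical field and source names) tags each routinely collected data item with whether it actually feeds a KPI or a decision, and keeps only those items for routine analysis. It is not a prescription, simply one way of making the distinction concrete.

from dataclasses import dataclass

@dataclass
class DataField:
    name: str          # hypothetical field name from a service record
    source: str        # where it is routinely collected from
    feeds_kpi: bool    # does a KPI, forecast or report actually consume it?
    actionable: bool   # would anyone act on it if it changed?

def classify(field):
    # "Need-to-know" data feeds a KPI or a decision; the rest is "nice-to-know".
    return "need-to-know" if (field.feeds_kpi or field.actionable) else "nice-to-know"

collected = [
    DataField("first_time_fix", "closed call report", True, True),
    DataField("technician_utilisation", "productivity report", True, True),
    DataField("cab_temperature_log", "machine-to-machine feed", False, False),
]

need_to_know = [f.name for f in collected if classify(f) == "need-to-know"]
print(need_to_know)   # ['first_time_fix', 'technician_utilisation']

The point of the exercise is not the code itself, but the discipline: every data item earns its place by feeding a decision, not by being available.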
In fact, it is those services organisations that are most successful in managing their business analytics that can easily tell the difference between “big data” and “enough data”.
They are also the ones that can most easily recognise when the bar for data collection, analysis and sharing needs to be raised, whether to accommodate the normal evolution of the organisation’s database needs or more event-driven needs, such as a new product/service launch; increases in the number of customers, installed base and/or field technicians; business mergers, acquisitions or consolidations; new strategic alliance partnerships; and the like.
So … how big does your data really need to be?
The answer is simple: big enough to support the organisation’s ongoing business analytics needs and requirements – that is, the ability to collect, analyse and share all of the data that is deemed important (e.g., business-critical or mission-critical); that is required as input into the organisation’s ongoing metrics, or Key Performance Indicator (KPI), program; and that feeds annual or other periodic planning and forecasting activities, and the like.
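To make “big enough” slightly more concrete, here is a small sketch (again in Python, with invented KPI and field names purely for illustration): the data an organisation collects is sufficient when it covers every input its KPI and forecasting program actually requires, and any gap the check reports is a reason to raise the bar.

# Inputs each (hypothetical) KPI requires from the routinely collected data.
REQUIRED_INPUTS = {
    "mean_time_to_repair": {"call_open_ts", "call_close_ts"},
    "first_time_fix_rate": {"call_id", "repeat_visit_flag"},
    "technician_utilisation": {"hours_on_site", "hours_available"},
}

def coverage_gaps(collected_fields):
    # For each KPI, list the required inputs the current database cannot supply.
    gaps = {}
    for kpi, inputs in REQUIRED_INPUTS.items():
        missing = inputs - collected_fields
        if missing:
            gaps[kpi] = missing
    return gaps

collected = {"call_open_ts", "call_close_ts", "call_id", "hours_on_site"}
print(coverage_gaps(collected))
# e.g. {'first_time_fix_rate': {'repeat_visit_flag'},
#       'technician_utilisation': {'hours_available'}}

Seen this way, “big enough” is a coverage question rather than a volume question: no gaps means the database is already the right size, however modest it may look.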
Whether your organisation finds itself “swimming” in a data lake of epic proportions, or simply maintaining a modest database that fully supports its front and back offices; its field technicians, customers, and partners; its management decision makers; strategic partners; or any other stakeholders within the organisation, it will still require a sound “data analytics” program in order to make it all work.
Once again, it does not need to be “big” – just “big enough”, relevant and actionable.