Across every industry, data is a strategic imperative. Today's business leaders understand that they compete on information as much as on the goods and services they provide. There is no longer any doubt that organisations should be data-driven. Now, it's a matter of how to accelerate and mature analytics initiatives, and how to extend their accessibility beyond a core data science team.
Reaching this “transformative” level in data and analytics is inextricably tied to growth, and a top priority for organisations worldwide. Gartner reports that in recent years, this area has been a number one investment priority for CIOs.
In that same report, however, Gartner explains that the majority of organisations have been slow to advance in data and analytics. There are a number of reasons for this, which vary by geography and the organisation's current level of maturity, and an even greater number of solutions to address these stumbling blocks. But one deterrent to analytics progress has been consistent across many different organisations: data preparation.
It's been widely reported that cleaning and preparing data for analysis takes up 80% of the time and resources in any data project. That makes it the biggest source of inefficiency when working with data, but also the biggest opportunity for improvement. To accelerate analytics across an organisation, data preparation is a good place to start.
Below, we've suggested three ideas for implementing a new data preparation strategy so that it gets off the ground and contributes to the acceleration of analytics initiatives.
Reconsider who owns the work of preparing data
Many analysts are forced to wait in line to get data cleaned, passing specs back and forth, and iterating endlessly before they can interrogate the data or run the algorithms that will improve their business. It’s time to ask why people who know the data best can’t do the preparation. Why aren’t the users with the business context in their heads in a position to take care of data preparation? Trying to meet the needs of an exploding number of analysts and data scientists at a time when IT budgets are flat or shrinking is not efficient.
IT organisations simply can’t scale to meet the data provisioning needs of the business. Enterprises need to shift the burden of the work to end users. It’s the only way to keep up and the only way to stay competitive.
Here's the secret: IT organisations shouldn't covet this work anyway. Remember, it's janitorial work: cleansing, structuring, distilling, enriching, validating, and so on. Give this work to the people doing the analysis and they'll be grateful for it. The shift results in faster cycle times and better insights, because the people preparing the data actually know how it will be used to drive decisions.
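To make the "janitorial" steps above concrete, here is a minimal sketch in Python of what cleansing, structuring, and validating a batch of raw records might look like. The function name, field names, and rules are illustrative assumptions, not part of any particular platform:

```python
def clean_records(rows):
    """Cleanse, structure, and validate a list of raw record dicts."""
    cleaned = []
    for row in rows:
        # Cleansing: trim stray whitespace and normalise name casing
        name = row.get("customer", "").strip().title()
        # Structuring: coerce the amount field from text to a number
        try:
            amount = float(str(row.get("amount", "")).replace(",", ""))
        except ValueError:
            continue  # Validating: drop rows with unparseable amounts
        # Validating: drop records missing a customer name
        if not name:
            continue
        cleaned.append({"customer": name, "amount": amount})
    return cleaned

raw = [
    {"customer": "  acme corp ", "amount": "1,200.50"},
    {"customer": "", "amount": "300"},        # missing name: dropped
    {"customer": "Globex", "amount": "n/a"},  # bad amount: dropped
]
print(clean_records(raw))  # [{'customer': 'Acme Corp', 'amount': 1200.5}]
```

Even a toy example like this shows why business context matters: deciding whether "n/a" should be dropped, defaulted, or flagged is a judgment call that the analyst who uses the data is best placed to make.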
Reconsider how this work should be done
Simply passing this work off to business users doesn't solve all of the problems. For one, many of them are already preparing data. They're using common spreadsheet tools like Excel, and that hasn't increased efficiency: these tools are prone to error, prohibit collaboration, and struggle with large or complex datasets. This forces users to check and recheck their work, or can stall analytics initiatives altogether.
Not to mention, Excel doesn’t provide IT with the governance needed to oversee sensitive data. Scaling analytics initiatives is important, but not at the risk of security.
Instead, organisations need to adopt a modern data preparation platform that accommodates both analysts and IT departments. It should offer analysts an intuitive experience, informed by visual representations and machine learning, so they can transform data without learning to code. Speed is a must: the platform needs to process large datasets quickly enough that analysis is never interrupted. And, of course, it should offer extensive support for open-source and vendor-specific security, metadata management, and governance frameworks so that it works in collaboration with IT.
Share modern data preparation success stories
Using new technologies—and giving business users new responsibilities when it comes to data preparation—is a highly effective approach. However, it takes getting used to, just like any other new business process or technology within an organisation.
At a time when many employees are still learning the impact that a new approach to data preparation can have, sharing case studies is an effective way to demonstrate its value.
When deploying a new data preparation platform, start with a core set of users who have a specific use case to execute. Build on quick wins with these users to roll out larger and more expansive data preparation use cases.