Posted Wed 13 Feb 2019
Enterprise data volumes are growing exponentially. Legal and regulatory pressures have fostered a view that retaining data is a significantly more palatable option than the risks associated with removing it. With storage long seen as relatively cheap, volumes have been allowed to grow. But the world is changing. Overall storage spend is rising (unit costs are falling, but volume growth is far outpacing them), and regulations such as GDPR are driving a greater requirement for better data classification and disposal processes. All of this leaves enterprises with an increasingly complex issue to tackle.
My colleague Raaj Parmar's article covers the links and benefits of building out the case for true enterprise data management in the regulatory/records management space. The focus of this discussion is on the steps required to manage enterprise data, why the effort often fails, and how it can - collaboratively - be managed effectively to drive real benefits for the organisation.
A key question is: who is responsible for data volume management? Is it technology, who own the platform, or the business, who drive the production of the data in the first place? The answer, ultimately, is both. Historically, each party has laid the issue at the other's door, but it's fair to say they both have a point. Let's explore...
Let's start with the fundamental cornerstone of data management: M.I. Yes, data about data. The platform technology teams manage and run the storage infrastructure on the organisation's behalf. They are best placed to know what is stored where, the capacity and utilisation of the underlying assets, and to manage the good-practice activities that maximise asset efficiency (think thin provisioning, compression etc.).
Therefore it is technology who should take the lead. In providing this information, they are also well placed to provide historic trend analysis to the business, leveraging the information built up over the last 3-5 years. If the data and tooling support it, this initial baseline can also benefit from splitting BAU and project growth, and from capturing other restrictions and dependencies such as dedicated assets or disaster recovery and business continuity considerations.
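To make the baseline concrete, here is a minimal sketch of the kind of trend analysis the platform team could derive from its MI. All figures, field names and the simple compound-growth model are illustrative assumptions, not real data or a prescribed method:

```python
# Hypothetical sketch: a baseline built from 3-5 years of platform MI.
# Totals and project allocations below are invented illustrative figures.

historic_tb = {2015: 400, 2016: 480, 2017: 590, 2018: 720}  # total TB stored per year
project_tb = {2016: 30, 2017: 50, 2018: 60}  # known project-driven additions (assumed)

# Strip project demand out of each year's growth; the remainder is BAU growth.
years = sorted(historic_tb)
bau_rates = []
for prev, curr in zip(years, years[1:]):
    growth = historic_tb[curr] - historic_tb[prev]
    bau_growth = growth - project_tb.get(curr, 0)
    bau_rates.append(bau_growth / historic_tb[prev])

avg_bau_rate = sum(bau_rates) / len(bau_rates)

def project_demand(base_tb, years_ahead, planned_project_tb=0):
    """Compound the average BAU rate forward, then add planned project demand."""
    total = base_tb
    for _ in range(years_ahead):
        total *= 1 + avg_bau_rate
    return total + planned_project_tb

print(f"Average BAU growth: {avg_bau_rate:.1%}")
print(f"3-year projection: {project_demand(historic_tb[2018], 3, planned_project_tb=100):.0f} TB")
```

The point of the split is visible even in this toy version: project demand is lumpy and known in advance, so it should be overlaid on the projection rather than baked into the growth rate, which is exactly the refinement business leads can then challenge.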
This initial baseline of MI moves us away from a blank sheet, and provides valuable context: an initial strawman that business leads can develop and refine.
The business leads, now armed with an informative view and an understanding of where they have come from, are best placed to advise where they are going. They can review and challenge the data, calling out inaccuracies or places where historic trends no longer provide an accurate view of the future. They can also identify and address the major consumers of storage in their world. Strategic plans, including any moves to the Cloud, can be overlaid onto the planned growth from project and BAU demand, building a much more holistic picture.
All parties should recognise that it is not going to be perfect at the first pass, but completing these initial steps will place many organisations in a materially better position than their competition.
Accountability for good data management doesn’t stop there, however. Now better informed to take targeted and meaningful action, business leads should review and implement a plan to purge aged and suitably classified legacy data, inform end users how and why they need to employ better housekeeping, and drive better links into the organisation's records management and data retention policies.
TORI estimates that these clean-ups, linked to better data management practices, can reduce current data demands by an average of 20%, and provide an ongoing constraint on growth. That is a significant reduction to any capex budget, and it creates real opportunity to address technical debt or key constraints without resorting to reactive new asset purchases.
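The capex impact of that 20% figure is easy to sanity-check. The sketch below uses invented figures (footprint, cost per TB, growth rate) purely to illustrate the arithmetic; only the 20% average reduction comes from the text above:

```python
import math

# Illustrative arithmetic only: footprint, unit cost and growth rate are assumptions.
current_demand_tb = 1000      # assumed current storage footprint
cost_per_tb = 500             # assumed fully loaded annual cost per TB (GBP)
cleanup_reduction = 0.20      # the ~20% average reduction cited above
annual_growth = 0.15          # assumed ongoing demand growth rate

after_cleanup_tb = current_demand_tb * (1 - cleanup_reduction)
annual_saving = (current_demand_tb - after_cleanup_tb) * cost_per_tb
print(f"Footprint after clean-up: {after_cleanup_tb:.0f} TB")
print(f"Annual cost avoided: £{annual_saving:,.0f}")

# Freed headroom also defers the next asset purchase: how long until the
# cleaned footprint grows back to today's level at the assumed growth rate?
years_deferred = math.log(1 / (1 - cleanup_reduction)) / math.log(1 + annual_growth)
print(f"Next purchase deferred by ~{years_deferred:.1f} years")
```

Even with conservative inputs, a one-off 20% reduction buys well over a year of growth headroom, which is the breathing space in which technical debt can be addressed.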
Thereafter, the process moves into continuous improvement. Data is exchanged between technology and the business leads and continually refined, providing iteratively improved, meaningful insight. Decision making becomes clearer and more strategic, investments become longer term and more cost effective.
Having control of the infrastructure, and a clear view of the demand timeline, provides technology with the means to drive ever greater levels of efficiency. Thin provisioning becomes increasingly scientific, utilisation is driven to optimal levels, and technology can be better maintained and refreshed. The cumulative benefits continue to accrue as the process develops in maturity.
Once this level is reached, the business is truly empowered to understand and ‘own’ its storage consumption. It understands the implications of its decisions in terms of cost and service delivery. This also presents the wider enterprise with significant benefits: businesses that understand their demand and available capacity better, and in near real-time, can barter and trade as their demand inevitably changes. When that happens, the wider enterprise wins.
So whose problem is it anyway? It’s everyone’s, but we think it starts with technology and with quality M.I.