When a business is deemed successful, it's often because great people are doing awesome things with the latest technologies. Yet long-time winners face a built-in problem that follows their success over time: inevitable aging that makes people and machinery alike obsolete.
The aging process leads to a need for ongoing retirement and refresh, even in companies that have in the past deployed prize-winning formulas. Unfortunately, winning arrangements by definition have survived, and through momentum often live well past their prime.
When it comes to data protection, what worked well in the past to help long-surviving IT organizations and well-established businesses protect their mission-critical data most likely no longer works as well as it did in its prime. In fact, given the pace of change in data, applications, architectures, and even the skills and expectations of the available workforce, most organizations are working feverishly just to keep their business applications competitive. Quite often, long-running back-office disciplines like data protection have to limp along as best they can, adding band-aids and patches where it visibly hurts but never refreshing the whole solution to keep it truly current.
At least until something finally breaks in a big way: perhaps a ransomware attack, a rogue ex-employee, a 1,000-year flood, or even a full-blown compliance audit. By then, of course, it's too late to be protected and prepared. The consequences can be fatal; winners can become losers overnight.
Good Legacies Can Beget Bad Ones
I see the legacy data protection challenge arising in three primary areas:
- Protecting legacy technology – Nothing that works well goes away fast. (Long live mainframes!) Even if users, usage, requirements, and expectations have grown and changed significantly over the years, the underlying IT methods, interface protocols, and performance capabilities of many long-successful applications and infrastructure may still be the same as the day they were first deployed – and today, in 2018, that could be multiple decades past. Newer data protection architectures may require significant backward integration to protect legacy technologies appropriately. And sometimes protecting hardware and software built and deployed generations ago still requires legacy data protection technologies, doubling down on the legacy challenge.
- Technical legacies aging out – People grow old, especially experts it seems! Sometimes they leave even before they retire, but either way they inevitably age out of the workforce. And when they leave, there often aren't equivalent knowledge replacements readily available. Old-timers just know things, particularly about legacy technologies, that no one newly available to the market will have any exposure to or experience with. The learning curve for someone new picking up legacy technology expertise may not only be steep, it may be too slippery to climb at all, depending on just how legacy the technology really is. Lack of current documentation, relevant training classes, original equipment vendors, and of course senior staff mentors can all hinder effective knowledge replacement.
- Backup product stagnation – Many backup products have failed to evolve and keep pace with the current state of IT solutions. A partial laundry list would include virtualized servers, hyperconverged infrastructure, hybrid cloud applications, public cloud data pipelines, web applications, multi-cloud deployments, n-way replication, and globalized 7×24 operations. Let's not even talk yet about protecting big data clusters, distributed containerized applications, temporary software-defined storage, or IoT edge devices. In addition, expectations for data availability have changed significantly too, with end users increasingly expecting "Apple Time Machine"-like functionality in every IT application: near-instant recovery time objectives (RTO) and seconds-level recovery point objectives (RPO) from any mobile device anywhere in the world. (A brief sketch of what a seconds-level RPO implies follows this list.)
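For readers less familiar with those acronyms, here is a minimal sketch, in Python, of the kind of check a seconds-level RPO implies: measuring how far the newest recovery point lags behind the present moment. The system names, timestamps, and objective value are entirely made up for illustration; this covers only the RPO side (an RTO check would instead time an actual restore test) and is not any particular product's implementation.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical objective for illustration only: max tolerable data loss window.
RPO = timedelta(seconds=30)

# Made-up completion times of the most recent backup for each system.
last_backup_completed = {
    "orders-db": datetime(2018, 6, 1, 12, 0, 5, tzinfo=timezone.utc),
    "mainframe-batch": datetime(2018, 5, 31, 23, 30, 0, tzinfo=timezone.utc),
}

now = datetime(2018, 6, 1, 12, 0, 20, tzinfo=timezone.utc)

for system, finished in last_backup_completed.items():
    exposure = now - finished  # worst-case data loss if the system failed right now
    status = "within RPO" if exposure <= RPO else "RPO missed"
    print(f"{system}: exposure {exposure}, {status}")
```

The point of the sketch is simply that a seconds-level RPO means this kind of check has to pass, with exposure measured in seconds, around the clock; the older the newest recovery point, the more data is on the line.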
Even where implemented backup solutions have evolved somewhat, the necessary patches, upgrades, and migrations are likely well beyond what many organizations can even consider rolling out. I'm sure top of mind for many is that if a complex legacy solution is even partly working, it's probably best not to mess with it at all rather than risk blowing it up completely.
Not Just Surviving, But Thriving
So what’s the best approach to dealing with age and obsolescence? Fundamentally, it’s not fighting to retain aging staff on contract into their geriatric years, or ignoring the growing wrinkles and weakening bones of your data protection program.
Rather, it’s looking for a trusted service provider that specializes in data protection for enterprises like yours (like most, really). One that can afford to develop and maintain legacy technology expertise because it leverages that expertise across multiple clients, that has current experience with most legacy hardware and software IT solutions, and that can not only maintain but also integrate, optimize, and proactively operate modern data protection solutions on your behalf.
If you have age-related issues with your data protection and want to stay a winning corporation, you might want to ask an expert data protection company like Cobalt Iron to come in and show what it can do to help keep you eternally young.
Learn about freedom from legacy challenges.