At its simplest level, data protection isn’t really a hard concept. We start with a bunch of zeros and ones in a certain order, and we need to ensure that, regardless of disaster, interference, failure, or incompetence, we can always restore those bits to a pristine and fully operational condition. But assuring data protection in practice can be really difficult. Cascading incremental backups, complex snapshot and replication schedules, distributed data sets, increasingly mobile and demanding users, hybrid cloud operations, and motivated hackers all make complete data protection almost impossible.
Once upon a time we could just make a simple tape copy of our “main” frame and store it offsite as a backup in case we ever needed it. Those days are long gone. Today’s production data environments are complex, heterogeneous, even hybrid architectures, largely built from layers of virtualized infrastructure hosting increasingly agile applications. Much of our important data no longer lives strictly within a physically defined data “center”.
What’s to be done? I propose that the future data protection answer has to consist of a broad three-pronged approach. We absolutely need high levels of automation. We need to leverage in-depth, up-to-date (almost real-time) intelligence to identify evolving threats. And the best insight and expertise must be embedded at speed and scale to move operations from reactive to proactive, even predictive. I believe the intelligence necessary to drive all this will be encapsulated in modern, big-data-based intelligent analytics.
All of which means that great data protection will require the equivalent of rocket science in the form of production-grade advanced intelligent analytics.
Survival for us higher life forms depends a great deal on our prediction-capable minds. We could spend our lives just reacting to given situations, but almost all of our best courses of action in any scenario depend on accurately identifying actors (good and bad) and predicting their actions, behaviors, and outcomes. It turns out data protection is much like this grand game of evolution – threats evolve, applications and usage change dynamically, and only the strong survive.
Maybe that analogy sounds a bit over the top, but my point is that enterprise data protection operations now need to be rolled out at larger scale and with greater intelligence than ever before to match our increasingly digitized landscapes. Protection operations must adapt quickly to rapidly evolving threats challenging our vastly distributed and increasingly permeable “data” attack surface.
We absolutely need smarter (and faster) analytics and applications. It’s beyond time to build automated intelligence into enterprise data protection operations. Such automated intelligence can come from astute automation, the encapsulation of best practices, and the well-established discipline of “pattern recognition”. Pattern recognition has seen a revival in the last few years for several reasons: larger available data sets, scalable big data algorithms, cloud computing (elastic resources on demand), and the crushing pressure to proactively identify abnormal behaviors in real time at large scale.
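To make “pattern recognition” on operational data concrete, here is a minimal sketch of one common form it takes: scoring each new observation of an operational metric (say, daily changed bytes per data store) against a rolling baseline and flagging statistical outliers. This is an illustrative toy, not any vendor’s actual analytics; the window size, threshold, and sample data are all assumptions.

```python
from statistics import mean, stdev

def anomaly_scores(history, window=7, threshold=3.0):
    """Score each observation against a rolling baseline of the preceding
    `window` observations; return (index, z-score) for any observation more
    than `threshold` standard deviations from the baseline mean."""
    flagged = []
    for i in range(window, len(history)):
        baseline = history[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # flat baseline: any deviation would divide by zero
        z = abs(history[i] - mu) / sigma
        if z > threshold:
            flagged.append((i, round(z, 2)))
    return flagged

# Daily changed-data volume (GB) for one store: steady, then the kind of
# sudden spike that mass re-encryption of files might produce.
daily_changed_gb = [4, 5, 4, 6, 5, 4, 5, 5, 4, 48]
print(anomaly_scores(daily_changed_gb))  # flags only the final day
```

In production this kind of scoring would run continuously across thousands of metrics, with baselines learned per system rather than fixed by hand.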
Learning and applying automated expertise and intelligence will greatly enhance traditional IT operations. To be clear, we are not talking about sentient computers taking over and running IT anytime soon, but as an industry we are building “smarter” operational applications that embed increasing amounts of intelligent analytics and automated reasoning. To be really smart, we will want to learn about new threats before experiencing them ourselves! And we will want to optimize our operations based on others’ experiences as well as our own. In this respect, service providers have the great benefit of being able to look across the IT operations of many organizations – sometimes thousands or more – when building and training their analytically intelligent services.
If you are still skeptical, consider that many valuable security and protection tasks are no longer manually feasible in our current IT world. These tasks simply have to be intelligently automated. For example, imagine we want to identify security issues with data protection operations globally across thousands of data stores and access points – perhaps recognizing the abnormal signature of some new ransomware that is slowly encrypting vulnerable data stores. We need trained analytics that run, learn, and score all of our systems 24×7, identifying intrusions as early as possible.
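One simple signal behind ransomware detection of the kind described above is byte entropy: encrypted output looks statistically random (close to 8 bits per byte), while most documents and structured data fall well below that. The sketch below, with an assumed threshold of 7.5 bits/byte, illustrates the idea; real detectors combine many such signals and learned baselines.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(sample: bytes, threshold: float = 7.5) -> bool:
    """Heuristic: near-random byte distributions suggest encrypted
    (or compressed) content; plaintext scores much lower."""
    return shannon_entropy(sample) > threshold

plaintext = b"quarterly report: revenue up, costs flat. " * 100
ciphertext_like = os.urandom(4096)  # stand-in for ransomware output

print(looks_encrypted(plaintext))        # ordinary text: False
print(looks_encrypted(ciphertext_like))  # near-random bytes: True
```

Note the heuristic’s known blind spot: legitimately compressed archives also score high, which is exactly why entropy would be one feature among many in a trained model rather than a rule on its own.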
You don’t want (and actually can’t afford) to staff up data protection rocket scientists – there aren’t many of those out there to hire in any case. The smarter route is to engage emerging intelligent services such as those provided by Cobalt Iron. Cobalt Iron is doing more than just talking about intelligent operations; they are delivering smart, analytics-driven data protection services. They have been working diligently to embed advanced levels of expertly trained analytics into their scalable data protection offerings. They have the benefit of scale and focus – they can look across an entire “cohort” of data protection clients to distill best practices and get ahead of emerging threats. They are data protection rocket scientists, and their “Advanced Analytics” solutions are getting smarter every day. Take advantage!