Over the last couple of years, ransomware has taken center stage in data protection, but very few people realize it is only the tip of the iceberg. Everybody wants to protect their data against this new threat, but most solutions available in the market focus only on relatively quick recovery (a low RTO) instead of detection, protection, and recovery. In fact, recovery should be your last resort.
Protection and detection are much more difficult measures to implement than air gaps, immutable backup snapshots, and rapid restore procedures. But when well executed, these two stages of ransomware defense open up a world of new opportunities. Over time, they will help defend your data against cybersecurity threats that are currently less common, or rather, less visible in the news, such as data exfiltration or manipulation. And when I say less visible, it is not only because the incidents go unreported; it is because often nobody knows they happened until it’s too late!
Security and Data Silos
Now that data growth is taken for granted, one of the biggest challenges most organizations face is the proliferation of data silos. Unfortunately, new hybrid, multi-cloud, and edge infrastructures are not helping. We are seeing what we might call a “data silo sprawl”: a multitude of hard-to-manage data repositories proliferating across different locations, each with its own access and security rules. Across these silos, those rules don’t always follow company policy, because the environments differ and we don’t have complete control over them.
As I have written many times in my reports, the user must find a way to consolidate all their data in a single domain. It could be physical—backup is the easiest way in this case—or logical, and it is also possible to use a combination of physical and logical. But in the end, the goal is to get a single view of all the data.
Why is it important? First of all, once you have complete visibility, you know how much data you really have. Secondly, you can start to understand what the data is, who is creating and using it, when they use it, and so on. Of course, this is only the first step, but, among other things, you start to see usage patterns as well. This is why you need consolidation: to gain full visibility.
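As an illustration, a consolidated view can start from something very simple. The sketch below is a minimal, hypothetical example (the directory-walking approach, the `inventory` function name, and the catalog shape are my assumptions, not any specific product's method): it builds one catalog of file counts and total bytes across several storage locations, which is the kind of baseline visibility this step is about.

```python
import os
from collections import defaultdict

def inventory(roots):
    """Build a single catalog of file counts and sizes across
    multiple storage locations ("silos")."""
    catalog = defaultdict(lambda: {"files": 0, "bytes": 0})
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    size = os.path.getsize(path)
                except OSError:
                    continue  # unreadable file: skip it, don't abort the scan
                entry = catalog[root]
                entry["files"] += 1
                entry["bytes"] += size
    return dict(catalog)
```

In a real deployment, each "root" would be a mount point, an object-storage bucket, or a network share, and the catalog would also record owners, timestamps, and access patterns rather than just sizes.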
Now back to our ransomware problem. With visibility and pattern analysis, you can see what is really happening across your entire data domain as seemingly innocuous individual events begin to correlate into disturbing patterns. This can be done manually, of course, but machine learning is becoming more common, making it easier to analyze user behavior and spot unusual events. When done right, once an anomaly is detected, the operator gets an alert and suggestions for possible remediations so they can act quickly and minimize the impact of an attack. When it is too late, the only option is a full data recovery that can take hours, days, or even weeks. This is principally a business problem, so what are your RPO and RTO in case of a ransomware attack? There really aren’t many differences between a catastrophic ransomware attack and a disaster that makes all of your systems unusable.
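The pattern analysis described above does not have to start with sophisticated machine learning; even a statistical baseline catches the most dramatic anomalies, such as a sudden burst of file modifications when ransomware begins encrypting. The sketch below is illustrative only: the z-score approach, the threshold, and the idea of counting events per interval are my assumptions, not a description of any particular product.

```python
from statistics import mean, stdev

def detect_anomalies(event_counts, threshold=3.0, min_history=5):
    """Flag intervals whose activity deviates sharply from the baseline
    built from all preceding intervals.

    event_counts: per-interval counts of a monitored event, e.g. files
    modified per user per hour. Returns indices of anomalous intervals.
    """
    anomalies = []
    for i in range(min_history, len(event_counts)):
        history = event_counts[:i]
        mu = mean(history)
        sigma = stdev(history) or 1.0  # avoid divide-by-zero on flat history
        if (event_counts[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies
```

A steady 10-ish modifications per hour followed by a spike of 300 would be flagged immediately, which is exactly the moment an operator wants an alert rather than a restore job.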
I started by talking about ransomware as malware that encrypts or deletes your data, but is this kind of ransomware the worst of your nightmares? As I mentioned before, such attacks are only one of the demons that keep you up at night. Other threats are sneakier and harder to manage. The first two that come to mind are data exfiltration (another prevalent type of attack in which a ransom is demanded) and internal attacks (such as from a disgruntled employee). And then, of course, there are the regulations and the penalties that may result from the mishandling of sensitive data.
When I talk about regulations, I’m not joking. Many organizations still take some rules lightly, but I would think twice about that. GDPR, CCPA, and similar regulations are now in place worldwide, and they are becoming a more and more pressing issue. Maybe you missed that last year Amazon was fined €746,000,000 (nearly $850,000,000) for not complying with GDPR. And you would be surprised at how many fines Google has received for similar issues. Maybe that’s not much money for them, but this is happening regularly, and the fines are adding up.
There are several questions that a company should be able to answer when authorities investigate. They include:
- Can you preserve data, especially personal information, in the right way?
- Is it well protected and secure against attacks?
- Is it stored in the right place (country or location)?
- Do you know who is accessing that data?
- Are you able to delete all the information about a person when asked (the right to be forgotten)?
If regulatory pressure weren’t reason enough for a fresh look at how prepared your current data management solution is for today’s threats, we could talk for hours about the risks posed by internal and external attacks on your data. These can easily compromise your competitive advantage, create countless legal issues, and ruin your business credibility. Again, a single-domain view of the data, and tools to understand it, are the first steps to staying on top of the game. But what is really necessary to build a strategy around data and security?
Security is a Data Management Problem
It’s time to think about data security as part of a broader data management strategy that includes many other aspects such as governance, compliance, productivity, cost, and more.
To implement such a strategy, there are some critical characteristics of a next-generation data management platform that can’t be overlooked. Many of these are explored in the GigaOm Key Criteria Report for Unstructured Data Management:
- Single domain view of all your data: Visibility is critical, yet attempts to close a visibility gap with point solutions can result in complexity that only heightens risk. Employing multiple management platforms that can’t talk to each other can make it almost impossible to operate seamlessly. When we talk about large-scale systems for the enterprise, ease of use is mandatory.
- Scalability: The data management platform should be able to grow seamlessly with the needs of the user. Whether it is deployed in the cloud, on-prem, or both, it has to scale according to the user’s needs. And scalability has to be multidimensional, meaning that not all organizations have the exact same needs regarding compliance or governance and may start with only a limited set of features to expand later depending on the business and regulatory requirements.
- Analytics, AI/ML: Managing terabytes is hard enough, but when we talk about petabytes distributed across several environments, we need tools that gather information quickly and present it in a human-readable form. Moreover, we need tools that can predict as many potential issues as possible before they become real problems, and remediate them automatically when possible.
- Extensibility: We have often discussed the necessity of a marketplace in our reports. A marketplace can provide quick access to third-party extensions and applications for the data management platform. APIs and standard interfaces are mandatory to integrate these platforms with existing processes and frameworks. But if the IT department wants to democratize access to data management and make it readily available to business owners, it must enable a mechanism that, in principle, resembles the app store of a mobile platform.
From my point of view, these are the main principles of a modern data management platform, and this is the only way to think holistically about data security looking forward.
Data Management is Evolving. Are You?
Now back to the premise of this article. Ransomware is everybody’s top-of-mind threat today, and most organizations are focusing on finding a solution. At the same time, users are now aware of their primary data management needs. In most cases, we are talking about first steps toward more visibility and improved day-to-day operations, including better data placement to save money, global file search, and similar tasks. I usually classify these as infrastructure-focused data management: basic unstructured data management functions performed at the infrastructure level. Still, they need the same visibility, intelligence, scalability, and extensibility characteristics of the advanced data management I mentioned above. And now there are increasingly pressing business needs as well, including compliance and governance, in addition to learning from data to improve other aspects of the business.
Now is the right time to start thinking strategically about next-generation data management. We can deploy several point solutions: one for ransomware, one for other security risks, one for infrastructure-focused data management, and maybe, later, one more for business-focused data management. Or we can start thinking about data management as a whole. Even if the initial cost of a platform approach proves higher than that of single-point solutions, it won’t take long before the improved TCO repays the initial investment. And later, the ROI will be massively different, especially when it comes to promptly answering new business needs.