It’s no secret that cloud adoption is gathering significant steam, with an estimated 9 out of 10 operations expected to be running in the cloud by 2020. We already know that 70% of organisations are managing some form of data in the cloud right now. This is why it’s important for organisations to look at how their data protection platform can aid their future strategy for cloud migration. I believe businesses should choose a data protection platform with multi-cloud functionality, giving them the flexibility to decide on the services and solutions (private, public, or hybrid cloud) that work best for them.
I recently reviewed a white paper from Commvault, “Adapting Data Protection Strategies to Cloud Infrastructure”, which provided some relevant insights into the challenges and considerations of cloud data protection. Following are snippets from that white paper that I thought were worth sharing.
And if you read on, I’ll share with you some real-life cloud data protection scenarios that we’ve run internally at ViFX and with a couple of our clients, to show how easily and effectively it can be done.
The particular challenges of cloud data protection
When you’re considering your data protection strategies as you move workloads, applications, and data sets into the cloud, it is important to clearly understand the particular challenges of cloud data protection.
Managing data across physical and virtual infrastructures creates a number of challenges for IT departments. On top of this, businesses are creating data at increasing rates, resulting in 40-50% data growth annually. Moving data into a public cloud environment brings several benefits; however, there are certainly some challenges when it comes to data protection in public cloud.
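To put that growth rate in perspective, a quick compound-growth calculation (an illustration of my own, not a figure from the white paper) shows how fast volumes escalate:

```python
# Project data volume under compound annual growth.
# Assumption (illustrative): 100 TB starting volume and 45% annual growth,
# the midpoint of the 40-50% range mentioned above.

def projected_volume(start_tb: float, annual_growth: float, years: int) -> float:
    """Return the data volume (TB) after `years` of compound growth."""
    return start_tb * (1 + annual_growth) ** years

start = 100.0   # TB today
growth = 0.45   # 45% per year
for year in (1, 2, 5):
    print(f"Year {year}: {projected_volume(start, growth, year):.0f} TB")
```

At 45% annual growth, a data set roughly doubles every two years, which is exactly why protection strategies that worked last year stop scaling.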
Following are a few of those challenges:
Understand the level of importance of the data in the cloud
You may end up with data residing on multiple storage platforms across multiple data centre locations, as well as on multiple cloud services outside of those data centres. The need to understand the nature of the data and its importance, to ensure it resides on the appropriate tier for recovery time expectations, becomes even more critical.
Fully evaluate the costs of downtime or data loss
For any application or data set you should gain stakeholder alignment on the business impact if it is unavailable. This will enable you to determine the right investments in service levels and disaster recovery planning. Without this alignment you may be paying too much or too little for what you actually need to ensure the appropriate level of availability.
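As a simple way to frame that stakeholder conversation, the expected annual cost of downtime can be estimated from the availability SLA and an agreed hourly impact figure (the numbers below are illustrative assumptions, not from the white paper):

```python
# Estimate annual downtime hours and cost from an availability SLA.
# Assumption (illustrative): $10,000/hour business impact when unavailable.

HOURS_PER_YEAR = 365 * 24  # 8760

def annual_downtime_hours(availability: float) -> float:
    """Expected hours of downtime per year for an availability like 0.999."""
    return HOURS_PER_YEAR * (1 - availability)

def annual_downtime_cost(availability: float, hourly_impact: float) -> float:
    """Expected annual cost of downtime at a given hourly business impact."""
    return annual_downtime_hours(availability) * hourly_impact

for sla in (0.99, 0.999, 0.9999):
    hours = annual_downtime_hours(sla)
    cost = annual_downtime_cost(sla, 10_000)
    print(f"{sla:.2%} availability: {hours:6.2f} h/yr, ~${cost:,.0f}/yr")
```

Working through even a rough version of this with stakeholders quickly shows whether the service level you are paying for matches the one the business actually needs.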
Data migration practicalities
Bandwidth constraints often present additional challenges for migrating data into, and later recovering it from, the appropriate cloud services. Data gravity is a very real and significant force. The white paper notes that for some organisations, even when they were ready to start moving data sets into the public cloud, the physical constraints of how long the transfer would take made the move impractical and cost prohibitive.
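Those physical constraints are easy to quantify with a back-of-the-envelope transfer-time calculation (my own illustration, assuming the link can be dedicated to the migration):

```python
# Estimate how long a bulk data migration takes over a given link.
# Assumption (illustrative): sustained throughput at 80% of line rate.

def transfer_days(data_tb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to move `data_tb` terabytes over a `link_mbps` Mbit/s link."""
    bits = data_tb * 1e12 * 8                      # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)  # effective bits per second
    return seconds / 86_400                        # seconds -> days

# Moving 50 TB over a 100 Mbit/s link takes close to two months:
print(f"{transfer_days(50, 100):.0f} days")
```

Numbers like these are usually what turn a “just upload it” plan into a conversation about seeding, staged migration, or physical transfer options.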
Addressing security concerns
Security remains a key barrier to cloud adoption, and public cloud providers offer a wide range of features to address it: protecting an organisation’s data against intrusion, enforcing user access controls through authentication, and securing upload and download processes. Additionally, you should consider encrypting the data itself to provide another layer of security. This can be done at the source with in-stream encryption, and then extended to the data at rest in the cloud storage target.
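As a minimal sketch of what source-side encryption achieves (a generic example using the third-party Python `cryptography` package, not Commvault’s own in-stream encryption, which is configured in-product), data is encrypted before it ever leaves the site and can only be read back with a key that never travels with it:

```python
# Sketch: encrypt backup data at the source so the copy stored in cloud
# object storage is also protected at rest. Requires the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key on-premises, never with the data
cipher = Fernet(key)

backup_bytes = b"contents of a backup chunk"
encrypted = cipher.encrypt(backup_bytes)   # this is what gets uploaded
restored = cipher.decrypt(encrypted)       # only possible with the key

assert restored == backup_bytes
assert encrypted != backup_bytes
```

The important property is that the cloud provider only ever holds ciphertext; key management stays with the organisation.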
Data retrieval of what you need, when you need it
Many organisations are finding that migrating data to the cloud is becoming easier as tools continually improve; retrieving data, however, is a little more complicated. You need to ensure end users can search for and retrieve the exact file they need, self-serving quickly and without IT intervention. Without this, public cloud services become just another storage tier, and not necessarily a more agile, flexible, or cost-effective one.
Portability of data
To avoid lock-in, and as the market continues to innovate, organisations are looking to use multiple cloud providers across their environments. While at the outset this provides increased agility and a healthy level of diversity, it also increases the operational complexity of managing multiple vendor relationships. The ability to understand what data resides on which platforms, and to retrieve and migrate that data, is a key factor in ensuring your data can continue to be leveraged, and even moved, across a range of cloud infrastructures.
Putting Commvault to the test
Over the last 12 months, ViFX have tested a range of data protection scenarios across the main public cloud platforms (Azure and AWS), both for ourselves and for our customers. The results have been positive, highlighting that Commvault provides the flexibility needed to incorporate cloud services and solutions into an existing data protection strategy with ease.
Following are some cloud data protection scenarios we have investigated:
- Backup to the cloud - using object storage such as Amazon S3 as a storage location for remote copies of local backup data. This provides an alternative to traditional off-siting strategies such as tape rotation, and brings benefits such as expandable capacity (no more running out of tapes) and continuous availability (no waiting to recall tapes for restores).
- Backup in the cloud - protecting workloads in AWS and Azure by running Commvault infrastructure on cloud instances. This type of solution can be deployed very rapidly in Azure, as a preconfigured Commvault solution is available from the Azure catalogue with a ‘bring your own licence’ model, allowing you to get up and running in minutes.
- Backup from the cloud - backing up public cloud workloads directly to an on-premises/internal Commvault platform.
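To give a feel for the first scenario, here is a generic sketch of off-siting a backup copy to S3 using the AWS `boto3` SDK (this is not Commvault’s cloud-library integration, which is configured in-product; the bucket name and paths are hypothetical):

```python
# Sketch: copy a local backup archive to Amazon S3 as an off-site copy.
# Assumptions (illustrative): the bucket "example-backup-offsite" exists
# and AWS credentials are configured in the environment.
import datetime
import socket

def offsite_key(archive_name: str) -> str:
    """Build a dated, host-prefixed object key so copies are easy to locate."""
    today = datetime.date.today().isoformat()
    return f"{socket.gethostname()}/{today}/{archive_name}"

def offsite_copy(local_path: str, archive_name: str, bucket: str) -> None:
    """Upload one backup archive to an infrequent-access S3 storage class."""
    import boto3  # third-party AWS SDK (pip install boto3)
    s3 = boto3.client("s3")
    s3.upload_file(
        local_path,
        bucket,
        offsite_key(archive_name),
        ExtraArgs={"StorageClass": "STANDARD_IA"},  # cheaper off-site tier
    )

# Usage (hypothetical paths):
# offsite_copy("/backups/daily.tar.gz", "daily.tar.gz", "example-backup-offsite")
```

Even this simple shape shows the appeal over tape: the “off-site vault” is just an API call away, and capacity expands on demand.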
Additionally, one very cool scenario involved backing up a virtual machine on an internal platform (VMware or Hyper-V) and then restoring it directly into Azure as a new instance, up and running immediately. Such functionality can be used to rapidly migrate on-premises workloads to the cloud for permanent relocation, or to stand up test or development copies of production.
The rewarding results
I wouldn’t want to limit my choice of cloud service simply because my existing data protection platform didn’t accommodate the requirement. I can appreciate that organisations may feel locked into what they have for now, but I wonder what the cost to the business is of not accommodating cloud. Even if it doesn’t stack up financially for that one requirement, what about the requirements that follow?
The shift towards cloud consumption is certainly ramping up, and IT professionals are looking for ways to not only accommodate this movement but also provide cost-effective, easily managed solutions. No enterprise data protection platform is cheap, so it really needs to deliver on value, and part of that value is enabling businesses to consume the cloud services they want.
What challenges has your organisation come across when protecting data in the cloud? Let us know in the comments below.