Can data transparency be the future of outsourcing?

Matt Kingswood is the Head of Managed Services at Midlands and London-based ITS, a provider of managed IT services across the UK. ITS is part of the US Reynolds and Reynolds company, which has a strong heritage in data backup and recovery services. In his position, Matt is responsible for developing managed IT services within the UK and is currently focused on the next generation of cloud and recovery products. Matt has more than 20 years of experience in the information technology industry, and was formerly CEO of The IT Solution – a full-service IT supplier acquired by ITS. Since joining ITS, he has led efforts to introduce a range of managed services based on the new ITS cloud platform. Earlier in his career, Matt worked in technology for several top-tier investment banks before founding and selling several companies in the IT services industry. Matt has an MBA from The Wharton School of the University of Pennsylvania and a Master’s in computer science from Cambridge University.


The benefits of storing data in the cloud are clear. However, as businesses are beginning to closely examine what having data in the cloud entails, they’re discovering that their relationships with cloud vendors are sometimes, well, cloudy.

In a 2015 Forrester Consulting survey, more than 60% of businesses said issues with transparency were stalling further expansion into the cloud. These organisations are justified in being wary, because knowing where data is going and how it is being treated is paramount.

I’ll explain why the next wave of successful cloud providers will compete on these issues rather than price, product or market.

Why is location important?

If backups are vaulted in the wrong geographic location, businesses limit their ability to rebound from an incident within the necessary recovery time objectives (RTOs), due to latency concerns and bandwidth costs. The goal of strategically selecting where data will be vaulted is to minimise organisational risk as much as possible. To achieve this goal, businesses need two separate RTOs: one for operational issues specific to the individual environment (such as a server outage), and one for regional disasters.

A business would likely require a lower RTO for an operational issue, which would allow for local data vaulting, whereas a regional disaster could have an equal or less aggressive RTO, simply because customers view an event affecting several providers in a given area differently to an event affecting a single entity only. Unfortunately, many organisations focus solely on addressing operational RTOs in the disaster recovery (DR) planning process, which can prove catastrophic in a widespread event.

One of the benefits of the cloud is that it allows businesses to achieve a solution that addresses both operational and DR RTOs – they just need to know where the cloud provider’s data centres are located. The important thing is to have their data as close as possible, yet far enough away to ensure there is no common risk between geographies. The further apart locations are, the greater the availability and recovery challenges – and, of course, the cost and latency (affecting communications, user experience and so on).
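The trade-off above – far enough to avoid shared regional risk, close enough to keep latency and bandwidth workable – can be sketched as a simple distance filter. A minimal illustration, with hypothetical site names, coordinates and distance thresholds (the 200 km and 1,500 km figures are assumptions for the example, not recommendations):

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points in km (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def pick_vault_sites(primary, candidates, min_km=200, max_km=1500):
    """Return candidate vault sites far enough from the primary site to avoid
    a shared regional disaster, but close enough to keep replication latency
    and bandwidth cost manageable. Nearest acceptable site comes first."""
    scored = sorted((distance_km(primary, loc), name) for name, loc in candidates.items())
    return [name for d, name in scored if min_km <= d <= max_km]

# Hypothetical primary data centre and candidate vault locations
birmingham = (52.48, -1.89)
sites = {
    "London": (51.51, -0.13),      # too close: likely shares regional risk
    "Edinburgh": (55.95, -3.19),
    "Frankfurt": (50.11, 8.68),
}
viable = pick_vault_sites(birmingham, sites)  # London filtered out
```

In practice the filter would also weigh jurisdiction, provider independence and link cost, but the two-threshold idea – a minimum separation driven by regional risk and a maximum driven by RTO – is the core of the location decision.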

It’s also important to know where data is stored, as compliance obligations – whether explicit or implicit – sometimes restrict the flow of data across EU borders, and many cloud-based solutions have multiple back-end data centres spanning multiple regions.

Knowing where their data is being sent allows businesses to dictate risk aversion or assumption. To win new outsourcing deals, cloud providers will be upfront about where an organisation’s data will be sent. Additionally, the provider will demonstrate that it can meet the client organisation’s RTOs by offering a service level agreement (SLA) that both IT and executive management can understand.

What data handling responsibilities must a business meet?

One issue that complicates the matter of data transparency is the fact that the data businesses manage isn’t just data that’s critical to operations – it’s personal data entrusted to the business by its employees and customers.

Many organisations have to adhere to regulatory requirements which oblige them to handle sensitive data in accordance with a specific set of standards. Under the EU’s General Data Protection Regulation (GDPR), for example, organisations handling personal data belonging to EU residents will be responsible for protecting that information and will be held accountable for breaches of it. The GDPR promises citizens the right to be forgotten, easier access to their data and a right to data portability. This responsibility extends to third-party cloud providers, which is why transparency into service providers’ data management practices is crucial.

In the wake of Brexit, UK businesses might expect that GDPR is no longer relevant, but they’d be mistaken. GDPR analyst Chiara Rustici is one of many pundits who argue that UK companies should continue forging a path toward GDPR compliance. The reason, she argues, is that GDPR will affect UK businesses serving EU customers. GDPR isn’t so much about where the company handling the data resides, but rather where the person to whom the data belongs resides.

Even if GDPR were removed from the picture, UK businesses would still need to be aware of how their and their customers’ data is being handled in the cloud. Certain types of data carry mandated retention time frames – for example, the six-year retention period for payroll records required by the Taxes Management Act 1970.
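Retention obligations like these are simple to express as a schedule that a business (or its cloud provider) can check before destroying records. A minimal sketch – the six-year payroll figure reflects the Taxes Management Act 1970, while the other record types and periods are purely illustrative assumptions:

```python
from datetime import date

# Hypothetical retention schedule in years. Only the payroll entry reflects a
# real requirement (Taxes Management Act 1970); the rest are examples.
RETENTION_YEARS = {
    "payroll": 6,
    "invoices": 6,
    "marketing": 2,
}

def earliest_deletion(record_type: str, created: date) -> date:
    """Earliest date a record of this type may lawfully be destroyed."""
    years = RETENTION_YEARS[record_type]
    return created.replace(year=created.year + years)

def may_delete(record_type: str, created: date, today: date) -> bool:
    """True once the record's mandated retention period has elapsed."""
    return today >= earliest_deletion(record_type, created)
```

A cloud provider that is transparent about its archival practices should be able to honour a schedule like this – and, conversely, to certify destruction once the period has elapsed.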

To reassure businesses that they can protect sensitive data, cloud vendors will emphasise their encryption and archival practices. In the event that an individual invokes their right to be forgotten under GDPR, cloud providers should be able to identify all the places that data resides and provide the client organisation with a written record of the applicable records’ destruction.

Again, successful cloud vendors will be those which back all of the above guidelines with an SLA that includes predefined, clearly outlined terms.

How much does price matter?

Price has long been touted as a benefit of moving to the public cloud, and public cloud providers are continually competing for customers by slashing prices. In 2013, for example, RightScale reported that four major public cloud providers (Amazon Web Services, Azure, Google and Rackspace) rolled out a total of 25 price drops, up from 22 in 2012. When it comes to overall cost reductions, the providers are in a constant battle: in 2012, Azure had the most cuts, only to be edged out by Amazon in 2013, and these price wars have continued into 2016. However, this so-called race to the bottom draws the focus away from the more important issue of data transparency.

Today, while cost is important, it’s not the top driver for cloud adoption. Research released by 451 Research earlier this year indicated that customers are concerned with cloud providers’ ability to provide value in terms of managed services, especially with regulations requiring strict data protection measures. 451 Research’s Cloud Commodity Score (CCS) measures price sensitivity – the higher the score, the greater the impact of price on adoption – and European customers scored just 12 per cent.

If a cloud provider is unable to meet an organisation’s need for data security and comprehensive support, the organisation won’t hesitate to invest more in a provider that can meet its needs. For cloud vendors to be able to win more deals, they should focus on value-added services – not cost.

Data transparency is a legitimate concern for businesses, but it needn’t be a barrier to cloud expansion. Competitive cloud providers will be upfront about their data handling practices and work with their clients to help them fulfil their compliance requirements.
