Confidential Computing: A Second Chance for Safeguarded Dataset Migration

Over the last ten years, while migrating data from on-premises datacenters to the public cloud for UK Government and enterprise organisations, I have consistently found myself in conversations with security teams that ended up mentioning the Patriot Act (not the Netflix show).

Because the big three cloud providers are all headquartered in the US, they are subject to the Patriot Act. This law can compel US-based organisations to hand over data they hold for their customers, without judicial approval.

While we can use TLS, IPsec and other encryption methods to secure data in transit, and storage encryption to protect data at rest, a vulnerability remains: applications must work with unencrypted data in memory at runtime.
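To make that gap concrete, here is a minimal Python sketch using the third-party `cryptography` package. The record is ciphertext on disk and on the wire, but the moment the application needs to work on it, a plaintext copy has to exist in ordinary process memory, where a sufficiently privileged host component could read it.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice this would come from a KMS/HSM
cipher = Fernet(key)

# "Data at rest": what lands on disk or in object storage is ciphertext.
stored_blob = cipher.encrypt(b"NI number: QQ123456C")

# "Data in use": to actually process the record, the application must
# decrypt it, so a plaintext copy now lives in ordinary RAM.
record = cipher.decrypt(stored_blob)
print(record.decode())

# Anything that can read this process's raw memory (a debugger, a memory
# dump, or a privileged hypervisor/host operator) can see `record` here.
```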

In theory, any workload in the cloud can be compromised through access to its raw memory via the hypervisor.

"Confidential computing" is a technology that helps close the last gap in the data-security lifecycle: data in use.

Confidential computing is the protection of data in use by performing computation inside a hardware-based, attested Trusted Execution Environment (TEE).

TEEs prevent unauthorised access to, or modification of, applications and data during computation, so data remains protected even while it is being processed. A TEE is a trusted environment that provides assurance of data confidentiality, data integrity and code integrity.
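The "attested" part is what makes a TEE more than just isolation: before any secrets are handed to the workload, a verifier checks a hardware-signed report stating exactly what code is running. The sketch below is purely conceptual, with an HMAC standing in for the hardware-rooted signature (real TEEs such as AMD SEV-SNP or Intel TDX use certificate chains anchored in the CPU vendor), but it shows the shape of the decision a relying party makes before releasing a data key.

```python
from __future__ import annotations

import hashlib
import hmac
import secrets

# Stand-in for the hardware root of trust; in reality this is a key fused
# into the CPU and vouched for by the vendor's certificate chain.
HARDWARE_KEY = secrets.token_bytes(32)


def tee_quote(workload_code: bytes) -> dict:
    """What the TEE would produce: a measurement of the loaded code,
    signed by the hardware (here: HMAC as a placeholder signature)."""
    measurement = hashlib.sha384(workload_code).hexdigest()
    signature = hmac.new(HARDWARE_KEY, measurement.encode(), hashlib.sha384).hexdigest()
    return {"measurement": measurement, "signature": signature}


def verify_and_release_secret(quote: dict, expected_measurement: str, secret: bytes) -> bytes | None:
    """Relying party: only release the data key if the report is genuine
    AND it measures the exact code we expect to be handling our data."""
    recomputed = hmac.new(HARDWARE_KEY, quote["measurement"].encode(), hashlib.sha384).hexdigest()
    if not hmac.compare_digest(recomputed, quote["signature"]):
        return None  # report not signed by the hardware we trust
    if quote["measurement"] != expected_measurement:
        return None  # unexpected code is running inside the TEE
    return secret


workload = b"def process(dataset): ..."
expected = hashlib.sha384(workload).hexdigest()
quote = tee_quote(workload)
print(verify_and_release_secret(quote, expected, b"data-encryption-key") is not None)
```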

Code outside the TEE cannot read or tamper with data inside it. The confidential computing threat model aims to remove or reduce the ability of the cloud provider's operators, and of other actors in the tenant's domain, to access code and data while they are being executed.

It effectively establishes isolation and protection for CPU operations and a portion of the operating memory at a hardware level.

What's even better is that Trusted Execution Environments can be used not only by an operating system running directly on the server, but also by a virtual machine or even a container!
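If you want a quick hint as to whether the VM you have been handed really is a confidential instance, the guest kernel usually exposes one. The sketch below is a rough heuristic only: the device paths (`/dev/sev-guest` for AMD SEV-SNP, `/dev/tdx_guest` or `/dev/tdx-guest` for Intel TDX) depend on the kernel version and drivers your image ships, so treat them as assumptions, and remember that the presence of a device node is no substitute for real remote attestation.

```python
from pathlib import Path

# Guest device nodes typically created by the confidential-guest drivers.
# Exact paths vary by kernel version and distro, so treat this list as a guess.
CANDIDATE_DEVICES = {
    "/dev/sev-guest": "AMD SEV-SNP guest device",
    "/dev/tdx_guest": "Intel TDX guest device",
    "/dev/tdx-guest": "Intel TDX guest device (older driver naming)",
}


def detect_confidential_guest() -> list[str]:
    """Return human-readable hints that this VM is a confidential guest."""
    return [
        f"{path}: {label}"
        for path, label in CANDIDATE_DEVICES.items()
        if Path(path).exists()
    ]


if __name__ == "__main__":
    hints = detect_confidential_guest()
    if hints:
        print("Confidential-guest devices found:")
        for hint in hints:
            print("  " + hint)
    else:
        print("No confidential-guest devices found - this may be a normal VM,")
        print("or the guest kernel simply lacks the sev-guest/tdx-guest drivers.")
```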

Some thoughts that come to mind:

  • This reopens some of the conversations abandoned around 2015, at the peak of cloud adoption: can this or that dataset be moved to the public cloud?

  • Organisations might want to re-evaluate security solutions such as Oracle Vault, which provide similar functionality via software encryption

  • Existing services can benefit from an improved security posture, and the zero-trust model becomes more attainable than ever

  • Sharing resources doesn't seem like such a bad idea when each container can be effectively fenced off

Want more use cases? Just append the word 'confidential' to any existing service you are using. It seems like this is what cloud providers' marketing teams have done:

  • Confidential VMs

  • Confidential Kubernetes

  • Confidential Serverless

  • Confidential Database

  • Confidential <name>

Andrey Kozichev

© MetaOps 2024
