Content / Context / Device Aware Cloud Data Protection

In this two-part blog, I am going to talk about the Intel Cloud Data Protection solution, which helps our customers utilize their data in both a context- and content-aware manner.

This is a newer set of technologies that has hit the market in the last few years. In the past, we used to think just encrypting the transport layer (such as TLS/SSL) was good enough. Given the complex nature of services and API composition, we quickly realized that is not enough. Then we moved to protecting the messages themselves (most times the entire message), or to field-level protection of specific sensitive fields. The problem with any of these approaches is that they are somewhat static in nature; somewhere there exists a definition of what “sensitive data” is, and it is strictly enforced. That is fine as far as it goes, but when there is a real need to send sensitive data out, along with a need to protect it, making sure only the authenticated party can receive and/or use the message becomes very important.
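
To make the contrast concrete, here is a minimal sketch of field-level message protection: rather than relying on TLS alone, the individual sensitive fields are encrypted before the message leaves the trusted zone. The field names, the static SENSITIVE_FIELDS policy, and the use of the Python "cryptography" package are my own illustrative assumptions, not any particular product's API.

```python
# Minimal sketch of field-level message protection (illustrative only).
# Assumes the "cryptography" package; field names are hypothetical.
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"ssn", "credit_card"}   # static policy: what counts as "sensitive"

key = Fernet.generate_key()                 # in practice this would live in an HSM/key manager
cipher = Fernet(key)

def protect_message(message: dict) -> dict:
    """Encrypt only the fields the policy marks as sensitive."""
    protected = {}
    for name, value in message.items():
        if name in SENSITIVE_FIELDS:
            protected[name] = cipher.encrypt(str(value).encode()).decode()
        else:
            protected[name] = value
    return protected

print(protect_message({"name": "Alice", "ssn": "078-05-1120", "amount": 42}))
```

Note that this sketch has exactly the limitation described above: the definition of “sensitive data” is a static list, enforced the same way for every message, no matter who is asking or from where.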

Essentially, “content/context aware” data protection is data protection on steroids. Remember the years when we used DLP technologies to identify data leakage/data loss based on certain policies and parameters, and simply stopped the transaction without offering any alternative? The problem with DLP is that it is passive in most cases: it identifies sensitive data based on some context/policy combination and then blocks the transaction. While this can work for rigid enterprise policy sets, it may not work for cloud environments, where you need these policies to be flexible. The issue is that when someone who is authorized really needs that data, it is annoying to have the transaction stopped. What if there were a way to do data protection that is identity aware, location aware, and invocation aware, yet still policy based, compliance based, and, more importantly, very dynamic? In other words, what if you could provide data protection based on content and context awareness?

Gone are the days in which you get your systems compliant and you are done. Read my blog on why getting compliant is not enough anymore (link here). That is because your data is NOT staying within your compliant enterprise Ft. Knox anymore; it is moving around. Getting your systems compliant, risk averse, and secure is just not good enough when your data is moving through other ecosystems, not just yours.
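
As a rough, hypothetical illustration of what “identity aware, location aware, invocation aware” could look like, here is a small policy function that decides how to treat the same sensitive content differently depending on context, instead of always blocking the way a classic DLP rule would. The attribute names and the decision rules are assumptions made up for this sketch.

```python
# Hypothetical context-aware decision: the same sensitive content can be
# allowed, tokenized, encrypted, or blocked depending on the context.
from datetime import datetime

def protection_decision(identity: dict, context: dict) -> str:
    """Return one of: 'allow', 'tokenize', 'encrypt', 'block'."""
    during_business_hours = 8 <= datetime.utcnow().hour < 18

    if not identity.get("authenticated"):
        return "block"
    if context.get("destination") == "public_cloud":
        # Sensitive data may leave, but only in protected form.
        return "tokenize" if context.get("contains_pci") else "encrypt"
    if identity.get("clearance") == "high" and during_business_hours:
        return "allow"
    return "encrypt"

print(protection_decision(
    {"authenticated": True, "clearance": "high"},
    {"destination": "public_cloud", "contains_pci": True},
))
```

The point is the shape of the decision: blocking is only one of several outcomes, and the same request can be allowed, tokenized, or encrypted depending on who is asking, from where, and when.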

When you move your data through cloud providers (especially public cloud) and add removable devices (mobility) to the mix, the issue gets even more interesting. Sprinkle data residency issues on top of that to spice it up.

First of all, take a look at your cloud provider contract closely if you haven’t done so already.

  1. Are there any guarantees on where the data is stored (meaning the location of the data residency)?
  2. Are there any guarantees on where the data will be processed (meaning location of data processing)?
  3. Are they willing to share the liability with you if they lose your or your customers’ data?

Yes, some providers are better than others, but I have seen some of those contracts, and they give me a heart attack. No wonder companies are scared to death about protecting their data when moving to the cloud.

Data residency issues are especially big for some of our European customers. When you provide multi-country services, they look to restrict not only where data at rest resides, but also mandate where that data may be processed. Imagine dealing with financial, healthcare, or other sensitive data for a specific country: they ask not only that you store the data within the legal boundaries of that country, but also that you process it within data centers located in that country. So you need to sanitize the data, route the messages to services located in a specific place, desensitize the data for processing, and sanitize it again for storage.
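
As a rough sketch of that routing step (with entirely hypothetical country-to-endpoint mappings and function names), a residency-aware gateway might pick the processing endpoint from the data's country of origin and refuse to route anything it cannot place in-country:

```python
# Hypothetical residency-aware routing: process and store data only in the
# country it belongs to, and stop it before it travels anywhere else.
IN_COUNTRY_ENDPOINTS = {
    "DE": "https://de.example.internal/process",   # hypothetical endpoints
    "FR": "https://fr.example.internal/process",
}

def route_for_residency(message: dict) -> str:
    """Pick the processing endpoint based on the data's country of origin."""
    country = message.get("data_country")
    endpoint = IN_COUNTRY_ENDPOINTS.get(country)
    if endpoint is None:
        raise ValueError(f"No in-country processing endpoint for {country!r}")
    return endpoint

print(route_for_residency({"data_country": "DE", "payload": "..."}))
```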

Essentially, your solution needs to:

  1. Have a strong encryption engine which has all the possible security certifications that you can think of – such as FIPS 140-2 Level 3, DOD PKI, CC EAL 4+, etc.
  2. Use very strong encryption standards/algorithms for data, whether in storage or in transit.
  3. Protect the encryption keys with your life. There is no point in encrypting the data yet giving away the “Keys to the Kingdom” easily.
  4. Have a solution that can sanitize the data very dynamically and very granularly, based either on pre-defined policies (such as XACML) or on DLP-driven discovery.
  5. Make a decision based on the content/context and protect the data based on the need. This means having the flexibility to encrypt the entire message or only the specific sensitive fields in it, to preserve the format of the sensitive data, and/or to tokenize the data as needed.
  6. Encrypt the message while preserving the format, so it won’t break the backend systems.
  7. Tokenize the PCI and/or PII data for compliance and security reasons (a minimal tokenization sketch follows after this list).
  8. Scrutinize the message more deeply if it is intended to go to a non-secure location or endpoint – such as mobile devices, a cloud location, a third-world country, etc.
  9. Comply with data residency requirements by mandating that data be processed and stored in a specific instance of the service based on where it is located.
  10. Have an elaborate access-control mechanism for the data, based on user/application clearance, data classification, and the time and day of the access request.
  11. Most importantly, all of the above should be policy based, with policies that can be changed dynamically as the need arises.
  12. Do all of the above seamlessly (or “automagically”).
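
To make the tokenization item (item 7) concrete, here is a minimal, vault-based tokenization sketch: the real card number never leaves the trusted zone, and downstream systems only ever see a surrogate of the same shape. The in-memory dictionary stands in for what would be a hardened, HSM-backed token vault in a real gateway; all names here are hypothetical.

```python
# Minimal vaulted-tokenization sketch (illustrative, not production code).
# A real gateway would back the vault with an HSM-protected data store.
import secrets

_vault = {}   # token -> original value

def tokenize(pan: str) -> str:
    """Replace a card number with a random surrogate of the same length."""
    token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only trusted, in-scope systems may call this."""
    return _vault[token]

t = tokenize("4111111111111111")
print(t)                 # format-compatible surrogate, safe for downstream systems
print(detokenize(t))     # original PAN, available only inside the trusted zone
```

Format-preserving encryption (item 6) serves a similar purpose, but it derives the surrogate cryptographically instead of storing a mapping, so there is no vault to protect.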

In part 2 of my blog, I will discuss how Intel Cloud data privacy solutions (or the Cloud encryption/tokenization gateway) elegantly solve this problem and should be the only toolkit you will ever need in your arsenal to address this issue.

In the meantime, you can check out information about our tokenization and cloud data privacy solutions here.

Intel Cloud Data Privacy/ Tokenization Solutions

Intel Cloud/ API resource center

About Andy Thurai
This blog is published by Andy Thurai, Program Director - API Economy, IoT, Connected cloud solutions with IBM. The views expressed here are my own and not those of my employer. Please feel free to comment or engage in a stimulating conversation, but please keep it professional. I can be reached via the “Contact Me” page here. You can also find me on LinkedIn or on Twitter @AndyThurai
