The Façade Proxy

KuppingerCole analyst Craig Burton (originally of Burton Group) recently wrote an article about Façade proxies. You can read the article here.

As Craig notes,

“A Façade is an object that provides simple access to complex – or external – functionality. It might be used to group together several methods into a single one, to abstract a very complex method into several simple calls or, more generically, to decouple two pieces of code where there’s a strong dependency of one over the other. By writing a Façade with the single responsibility of interacting with the external Web service, you can defend your code from external changes. Now, whenever the API changes, all you have to do is update your Façade. Your internal application code will remain untouched.”

I call this the “touchless proxy.” We have been building touchless gateways for over a decade, and now, using the same underlying concept, we provide a touchless API gateway, or façade proxy.
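The pattern Craig describes can be sketched in a few lines of Python. The names below (a hypothetical `LegacyPaymentsClient` standing in for an external API, and a `PaymentsFacade` wrapping it) are illustrative assumptions, not part of any product:

```python
class LegacyPaymentsClient:
    """Stand-in for a third-party API client with an awkward interface."""
    def post(self, endpoint, payload):
        if endpoint == "/v2/charges":
            return {"status": "ok", "charged": payload["amount"]}
        raise ValueError("unknown endpoint")

class PaymentsFacade:
    """Single responsibility: talk to the external service.
    If the provider's API changes, only this class changes;
    the rest of your application code remains untouched."""
    def __init__(self, client):
        self._client = client

    def charge(self, account_id, amount_cents):
        # Map our simple call onto the provider's request shape.
        payload = {"account": account_id, "amount": amount_cents}
        return self._client.post("/v2/charges", payload)["status"]
```

Callers only ever see `facade.charge(...)`; when the provider renames an endpoint or reshapes its payload, the change is absorbed inside the façade.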

While the KuppingerCole analyst note highlights Intel as a strong solution, Craig raises the following point:

“When data leaves any school, healthcare provider, financial services or government office, the presence of sensitive data is always a concern.”

This is especially timely as healthcare providers, financial institutions, and educational institutions rush to expose their data to partners through APIs.


Content/Context/Device-Aware Cloud Data Protection

In this two-part blog, I am going to talk about the Intel cloud data protection solution that helps our customers utilize their data in both a context- and content-aware manner.

This is a newer set of technologies that has hit the market in the last few years. In the past, we thought that just encrypting the transport layer (with TLS/SSL) was good enough. Given the complex nature of service and API composition, we quickly realized it is not. We then moved on to protecting messages (most often the entire message), or to field-level protection of specific sensitive fields. The problem with all of these approaches is that they are static: somewhere there exists a fixed definition of what “sensitive data” is, and it is strictly enforced. That works until there is a legitimate need to send sensitive data out while still protecting it; at that point, making sure only an authenticated, authorized party can receive and use the message becomes very important.


Essentially, “content/context-aware” data protection is data protection on steroids. Remember yesteryear, when DLP technologies identified data leakage based on certain policies and parameters? The problem with DLP is that it is passive in most cases: it identifies sensitive data based on some context/policy combination and then simply blocks the transaction. While this can work for rigid enterprise policy sets, it may not work for cloud environments, where policies need to be flexible. When someone who is authorized genuinely needs that data, it is frustrating to have the transaction stopped. What if there were a way to do data protection that is identity aware, location aware, and invocation aware, yet policy based, compliance based and, more importantly, very dynamic? In other words, what if you could provide data protection based on content and context awareness?

Gone are the days when you got your systems compliant and were done. Read my blog on why getting compliant is not enough anymore. Your data is NOT staying within your compliant enterprise Ft. Knox anymore; it is moving around. Getting your systems compliant, risk averse, and secure is just not good enough when your data is moving through other ecosystems, not just yours.
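The contrast between "block everything sensitive" and "decide based on who is asking, from where" can be sketched in a few lines. The field names, roles, and rules below are illustrative assumptions, not a description of any shipping product:

```python
import hashlib

SENSITIVE_FIELDS = {"ssn", "card_number"}

def protect(record, caller):
    """Release each field as-is or tokenized, depending on the caller's
    identity and location, instead of blocking the whole transaction."""
    out = {}
    for field, value in record.items():
        if field not in SENSITIVE_FIELDS:
            out[field] = value  # non-sensitive content passes through
        elif caller["role"] == "auditor" and caller["location"] == "on-prem":
            out[field] = value  # authorized identity + location: release
        else:
            # any other context: replace with a deterministic token
            out[field] = "TOK-" + hashlib.sha256(value.encode()).hexdigest()[:8]
    return out
```

Unlike a passive DLP block, the transaction always completes; only the sensitive fields change form based on context.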


Who is more sensitive – you or your data?

Sooner or later, the following not-so-hypothetical quandary will undoubtedly arise: when moving your data to the cloud, you will face an array of decisions. What considerations will you make for the protection of your data? In the not-so-distant past, you most likely invested a lot of time and resources into building an “enterprise Ft. Knox”: a state-of-the-art, highly advanced, and very expensive solution, replete with sophisticated gadgets strategically positioned around the enterprise perimeter. You took a moment to breathe a sigh of relief, taking solace in knowing that no one could penetrate the fortress you had built. You even went so far as to give yourself a pat on the back, enjoying the moment.

Alas, the respite ended with a tap on the shoulder! The King, also known as the CIO, has informed you that the rules have changed! Apparently, while you were working hard building this impenetrable boundary around the edge and fixing the exposure, he made a deal for the kingdom (in this case, your company) that expanded its territory. As a result, the short but life-changing edict is to move processing to a faraway land (in other words, to the cloud). Gulp.

Medieval comparisons aside, the fact of the matter is that your IT systems have moved to the cloud, whether public, private, or hosted. With the stroke of a quill (or pen), the limits of your perimeter have changed. Unfortunately, protecting your databases, processes, applications, app servers, web servers, systems, middleware, and back-end systems the old way won't work anymore; in most such scenarios, you will have absolutely no control over them in a cloud environment. It is highly likely that you won't even know where things are running most of the time.

The advantages of moving to the cloud cannot be denied, but the paradigm shift is not without headaches: real concerns around data privacy, security, auditing, compliance, and residency (at times, for example, you cannot let data leave certain countries), in addition to worrying about exposure to hackers on a 24×7 basis.

Now what? Well, there is an easy way to solve this problem. Instead of protecting all of the above, you can simply protect your data. This is exactly where Intel cloud encryption/data privacy gateways shine. We created these gateways a few years ago with this ever-changing landscape in mind.

So how do we do it? For starters, the Intel cloud encryption gateway is the ONLY solution available in multiple form factors: as a hardware appliance, as software, and as a virtual appliance. It can also be offered as a hosted solution through our partners, should you choose that option. Unlike competing vendors in the market, our appliances are not merely “virtual appliances”; we provide a true appliance. This is imperative in the security field, especially when you need FIPS 140-2 Level 3 compliance in government or other highly secure environments such as healthcare. (As a side note, I recently read a competitor's spec in which the company claimed to “enable” you to plug in and use FIPS 140-2 if needed. It is not certain exactly what they meant, or how to parse the finely nuanced language used in their advertisements. In contrast, we are completely straightforward about our enterprise-class capabilities. And, yes, we have that feature built in already.)

In addition, our appliance has a unique set of features, including tokenization, encryption, and Format-Preserving Encryption (FPE), as well as others that help ensure the authenticity, integrity, and validity of your data. That's not all. What makes us unique is that our cloud encryption gateways are built to fit your current ecosystem. This means that regardless of the protocol, identity system, logging system, monitoring system, or data/message type, we can encrypt or tokenize the data flowing in and out of your organization.
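To illustrate what format preservation means: an FPE scheme encrypts, say, a 16-digit card number into another 16-digit string, so downstream systems that validate formats keep working. The toy transform below only demonstrates that format-preserving property; it is not a real FPE cipher (production systems use NIST-standardized modes such as FF1):

```python
import hashlib

def toy_fpe(digits, key, decrypt=False):
    """Toy digit-wise transform: same length, same character set out.
    NOT cryptographically sound FPE; for illustration only."""
    stream = hashlib.sha256(key).digest()  # keyed digit stream
    out = []
    for i, ch in enumerate(digits):
        k = stream[i % len(stream)] % 10
        d = (int(ch) - k) % 10 if decrypt else (int(ch) + k) % 10
        out.append(str(d))
    return "".join(out)
```

The output is still a 16-digit string, so schema checks, Luhn-style field widths, and legacy databases are none the wiser, yet the original value is recoverable only with the key.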

Let’s think about that for a second. You get these appliances, drop them in the line of traffic, do a bit of configuration, and you are done. Either you keep the sensitive data and send tokens to the cloud, or you send the protected (encrypted) data to the cloud and keep the keys to yourself. This allows you to stay compliant and mitigate your risk. No more long, drawn-out IT engagements, and no more sleepless nights trying to figure out what will happen when you move your sensitive data to the cloud.
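The "keep the data, send the tokens" workflow can be sketched as a minimal token vault. This is an illustrative sketch of the general technique, not a description of the gateway's internals:

```python
import secrets

class TokenVault:
    """Sensitive values stay in this local vault; only random,
    meaningless tokens ever leave for the cloud."""
    def __init__(self):
        self._store = {}  # token -> original value, kept on premises

    def tokenize(self, value):
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = value
        return token  # safe to send to the cloud

    def detokenize(self, token):
        # Only callers with access to the local vault can reverse a token.
        return self._store[token]
```

Because the tokens are random and the mapping never leaves your premises, a breach on the cloud side exposes nothing of value.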

This is really important where time to market (TTM) is key. We can have you up and running, poised for production readiness, in a matter of days (or, as in most cases, a matter of hours). When making a decision, it is also essential to include ROI and TCO in your calculus. When you evaluate a similar solution from someone else, make sure to ask yourself these questions: Will I have to spend hundreds of hours building this? How long will it take to integrate within my ecosystem? We can get you connected quickly with most existing enterprise systems, such as logging, monitoring, auditing, middleware, identity systems, databases, (web) services, and SIEM systems such as ArcSight/Nitro. And you get the added advantage of mobile enablement already built in.

There is one last note worth a chuckle that I want to share. I saw a competitor's blog suggesting that they are rated by Gartner for tokenization and encryption gateways, and are rated “close enough” to Intel and McAfee in this area. I will just close by saying that we are Intel-McAfee, and we thankfully don't feel compelled to make similar associations with someone else just to bolster our viability or engender notions of greater stability. We genuinely care about our customers and know that we will be here for many years to come.

Please contact me if you need more information. I’m more than happy to send you any additional information that you may need.
