Taming Big Data Location Transparency

Andy Thurai, Chief Architect & CTO, Intel App security & Big Data (@AndyThurai) | David Houlding, Privacy Strategist, Intel (@DavidHoulding)

Original version of this article appeared on VentureBeat.

Concern over big-government surveillance and security vulnerabilities has reached global proportions. Big data and analytics, government surveillance, online tracking, behavioral profiling for advertising, and other major tracking trends have elevated privacy risks and identity-based attacks. This has prompted review, and discussion of revoking or revising, the data protection laws governing trans-border data flow, such as the EU Safe Harbor framework and the privacy laws of Singapore and Canada. The business impact to the cloud computing industry is projected to be as high as US $180B.

The net effect is that privacy has emerged as a key decision factor for consumers and corporations alike. Data privacy, and more importantly identity-protected, risk-mitigated data processing, are likely to grow further in importance as major new privacy-sensitive technologies emerge: wearables, the Internet of Things (IoT), APIs, and the social media that powers big data and analytics, all of which increase the associated privacy risks and concerns. Brands that establish and build trust with users will be rewarded with market share, while those that repeatedly abuse user trust with privacy faux pas will see both erode. Providing transparency and protection for users’ data, regardless of how it is stored or processed, is key to establishing and building that trust. This can happen only if providers are willing to offer location and processing transparency to the corporations that use them.


Don’t be stupid, use (cloud) protection!

– By Andy Thurai (Twitter: @AndyThurai)

This article originally appeared on PandoDaily.

Looks like Obama read my blog! The White House got the message. Politicians now seem to understand that while they try to do things to save the country, such as creating NSA programs, they cannot do so at the cost of thriving, innovative businesses, especially cloud computing, which is in its infancy. Recently, Obama met with technology leaders from Apple, AT&T, Google and others behind closed doors to discuss this issue.

While American initiatives, both federal and commercial, are trying everything to fix this issue, I see vultures circling. I have seen articles urging nationalism among Canadian companies, asking them to "go Canadian." They are also using scare tactics to steer business their way, which, in my view, is not going to help global companies.


Snowden gone, ripples remain!

– By Andy Thurai (Twitter: @AndyThurai)

[Original version of this blog appeared on PandoDaily magazine.]

Though Snowden is long gone now, the ripple effects that he created are going to remain for a long time to come. If you haven’t done so already, I suggest you read about the NSA surveillance programs PRISM and XKeyscore before you continue with this article.

Essentially, these government programs are creating nervous times for my Canadian, European, and APAC customers who use US cloud providers. Given the very strict data residency and data privacy requirements protecting citizens’ sensitive data in those parts of the world, the latest incidents have, through “guilt by association” alone, implicated most corporations that move data across borders. One thing is certain: these programs were exposed only because someone went public. Just because a specific country’s cloud provider hasn’t been accused yet (or found guilty) doesn’t necessarily mean it isn’t doing the same thing. It may simply not have been caught yet.

Unfortunately, the cloud community spent years alleviating enterprises’ fears about moving data to the cloud. In those days, the fear was about hackers and disgruntled employees or partners accidentally or willfully exposing data. Now the community faces an uphill battle convincing those same enterprises, not about hackers, but about governments and other legal entities.


5 Practical Steps to Building an Enterprise Class API Program

When it comes to building API programs, everyone seems to think in terms of technology, platforms, scalability, security, execution, hackathons, etc., but people tend to forget the most important thing. What do you think it is? TTM (time to market)? Additional revenue? New partners? TCO (total cost of ownership)? Usability? IT approval? Something else?

If you want to know what that is, and how to effectively build an enterprise-class API program, please attend the webinar I am co-presenting with Mashery and Capital One. Every customer seems to have an aha! moment after our conversation.

This live webinar is at 1 pm EST on May 22 (this Wednesday). You can register here: http://tiny.cc/0ywexw.

The Façade Proxy

KuppingerCole analyst Craig Burton (originally of Burton Group) recently wrote an article about Façade proxies. You can read it here: http://blogs.kuppingercole.com/burton/2013/03/18/the-faade-proxy/

As Craig notes,

“A Façade is an object that provides simple access to complex – or external – functionality. It might be used to group together several methods into a single one, to abstract a very complex method into several simple calls or, more generically, to decouple two pieces of code where there’s a strong dependency of one over the other. By writing a Façade with the single responsibility of interacting with the external Web service, you can defend your code from external changes. Now, whenever the API changes, all you have to do is update your Façade. Your internal application code will remain untouched.”

I call this a “Touchless Proxy.” We have been building touchless gateways for over a decade, and now, using the same underlying concept, we provide a touchless API gateway, or façade proxy.
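To make Craig's point concrete, here is a minimal Python sketch of a façade with the single responsibility of talking to an external web service. The `PaymentFacade` class, its `charge()` method, and the injected client are illustrative assumptions, not any particular product's API.

```python
# A minimal sketch of the Façade pattern for an external web service.
# All names here (PaymentFacade, create_transaction) are assumptions
# for illustration, not a real vendor API.

class PaymentFacade:
    """Single point of contact with an external payment API.

    If the external API changes shape, only this class is updated;
    application code that calls charge() stays untouched.
    """

    def __init__(self, client):
        # The external API client is injected, so it can be swapped
        # or faked in tests without touching application code.
        self._client = client

    def charge(self, card_token, amount_cents):
        # Translate our simple internal call into whatever shape the
        # external service expects today.
        response = self._client.create_transaction(
            token=card_token,
            amount=amount_cents,
            currency="USD",
        )
        return response["transaction_id"]
```

When the upstream API changes, only the body of `charge()` is rewritten; every caller keeps its one-line interface.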

While Intel is highlighted as a strong solution in this KuppingerCole analyst note, Craig raises the following point:

“When data leaves any school, healthcare provider, financial services or government office, the presence of sensitive data is always a concern.”

This is especially timely as healthcare providers, financial institutions, and educational institutions rush to expose their data to partners through APIs.


PCI / Cloud Data Privacy webinar – Wednesday Mar/20 @ 12:25 pm

For those who missed it: PCI/PII webinar replay link.

———————————————————————————————————————————-

I am speaking at the SC World eConference this Wednesday (12:25 PM – 01:05 PM) with our customer WestJet on PCI compliance and cloud data privacy issues. You can register at the link below; it is free, and you earn CPE credits! Attend the session to hear WestJet's use case on how they used the Intel solution to become PCI compliant quickly without a long, drawn-out IT engagement.

You can register here: http://tiny.cc/5p15tw

State of CA – Split Personality Syndrome?

It’s interesting to see that the state of CA seems to have a split personality disorder! About a year ago I wrote a blog about how the state of CA was being a model citizen by forcing companies to protect consumers' sensitive data, classifying ZIP codes and other details as PII and imposing penalties on companies that don't comply. (Link here) But now the state has sided with Apple, ruling that for online transactions vendors can collect additional PII that is off-limits to brick-and-mortar vendors. This means that if you are an online retailer collecting such PII, you need a mechanism to protect all of the information you collect from your consumers, not just the PCI data but the PII data as well. To comply with this dual personality, you will need a solution that can encrypt and tokenize sensitive information as necessary, and as seamlessly as possible.

http://news.cnet.com/8301-13579_3-57567526-37/apple-wins-california-credit-card-privacy-case/

You are Gazetted…

Recently the government of Singapore passed a bill (or "gazetted" it, as they call it, which sounds a lot fancier) protecting the personal data of consumers:

Click to access Annex%20D_Draft%20PDP%20Bill%20for%20Consultation.pdf

“Protection of personal data

26. An organisation shall protect personal data in its custody or under its control by making reasonable security arrangements to prevent unauthorised access, collection, use, disclosure, copying, modification or disposal or similar risks.

Cross-border Transfers

The PDPA also permits an organisation to transfer personal data outside Singapore provided that it ensures a comparable standard of protection for the personal data as provided under the PDPA (Section 26(1)). This can be achieved through contractual arrangements.”

In other words, gone are the days when a business that loses its customers' data can simply tell consumers, "Oops, sorry, we lost your data…" and leave it at that. Governments are now taking initiatives that hold companies responsible for being careless with consumer data, forcing them to protect it diligently or face the consequences.

http://europa.eu/rapid/press-release_IP-12-46_en.htm?locale=en

This means that, as a corporation, you need to protect data not only in storage and in transit but also, given the cross-border restrictions (especially strictly enforced in Europe; see the URLs above), find a way to keep the data, and the risk, to yourself instead of passing them on to third parties. The easiest way to achieve that is to tokenize the sensitive data, keep it in your own secure vault, and send only the tokens to the other end. Even if the other end is compromised, your sensitive data and your integrity remain intact, and in an audit it will be easy to prove that you went above and beyond, not only complying with laws such as this but genuinely caring for your customers' sensitive personal data. Brand reputation is a lot more important than you think.
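The tokenize-and-vault flow described above can be sketched in a few lines of Python. This is a deliberately minimal, in-memory illustration; a real deployment would use a hardened, persistent, access-controlled vault (often HSM-backed), and the `TokenVault` class and method names here are assumptions for illustration only.

```python
# Minimal sketch of the tokenize-and-vault flow: sensitive data stays
# in your own vault; only opaque tokens cross the border or reach
# third parties. In-memory and illustrative only -- not production code.
import secrets

class TokenVault:
    def __init__(self):
        # token -> sensitive value; this mapping never leaves your control
        self._vault = {}

    def tokenize(self, sensitive_value):
        # Random token carries no information about the original value,
        # so it cannot be reversed without access to the vault.
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token  # only this is sent to the other end

    def detokenize(self, token):
        # Lookup is only possible inside your trust boundary.
        return self._vault[token]
```

If the receiving party is breached, the attacker holds only meaningless tokens; the mapping back to real data exists solely in your vault.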

Check out some of my older blogs on this topic:

Who is more sensitive – you or your data?

Content/ Context / Device aware Cloud Data Protection

Part 2: Context aware Data Privacy

Also, keep in mind Intel Token Broker and Cloud Security Gateway solutions can help you solve this fairly easily without messing with your existing systems too much.

Check out more details on Intel cloud data privacy solutions.

Effective PCI tokenization methods

Recently a colleague and friend of mine wrote a great article about the different ways to become PCI 2.0 compliant by tokenizing PAN data. In case you missed it, I want to draw your attention to it.

Essentially, if you are looking to become PCI-DSS 2.0 compliant, there are a few ways to get there. The most painful is obviously a rip-and-replace strategy; the easiest is an incremental, less intrusive approach.

The first approach, the monolithic big-bang approach, is the legacy way of doing things. Once you identify the areas of your system that are non-compliant (either storing PAN data, encrypted or not, or processing PAN in the clear), you decide whether each component needs to be PCI compliant. Because a PCI audit is extensive, time-consuming, and methodical, examining every process, application, storage system, and database, it is also very expensive. Once you know which components must be compliant, you can rip and replace: touch every component that needs to be modified and rewrite the system to become compliant. This may mean touching every component and changing your entire architecture, making it the most expensive, most painful, and slowest path to compliance. While it can be effective as a spot solution, it becomes a problem if you have to repeat it every time the PCI-DSS requirements change (which seems to be every year).
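To see why tokenization shrinks audit scope, here is a hedged Python sketch of tokenizing a PAN while preserving its length and last four digits, a common practice so receipts and support workflows keep working after the swap. The function name and the plain-dict vault are assumptions for illustration, not any vendor's implementation; production systems add format validation, collision checks, and a hardened vault.

```python
# Illustrative PAN tokenization: the real card number goes into a
# vault, and a same-length surrogate with the last four digits
# preserved takes its place in downstream systems.
import secrets

def tokenize_pan(pan, vault):
    """Replace a PAN with a random same-length token that keeps the
    last four digits; the real PAN is stored only in the vault."""
    last4 = pan[-4:]
    # Random digits for the leading portion; carries no card data.
    leading = "".join(secrets.choice("0123456789")
                      for _ in range(len(pan) - 4))
    token = leading + last4
    vault[token] = pan  # vault stays inside the PCI-scoped zone
    return token
```

Systems that store only such tokens no longer hold cardholder data themselves, which is what lets the incremental approach pull them out of PCI audit scope without a rewrite.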

