Are you a “data liberal” or a “data conservative”?

– By Andy Thurai (@AndyThurai). This article was originally published on the Xively blog.

In the last decade, as a society, we have worked very hard toward “liberating our data”: unshackling it from the plethora of constraints unnecessarily imposed by IT. By contrast, in the 90s and early 00s, data was kept in the Stygian depths of the data warehouse, where only an elite few had access to it or understood its defining characteristics.

Once we had the epiphany that we could glean amazing insights from data, even from our “junk” data, our efforts quickly refocused on exposing data in every possible way. We exposed data at the bare-bones level through data APIs, at the level of value-added data platforms, or even as industry-specific solution platforms.

Thus far, we have spent a lot of time analyzing and finding patterns, in other words innovating, with data that has already been collected. I see, however, many companies taking things to the proverbial next level.

In order to innovate, we must evolve to collect what matters most to us, rather than resigning ourselves to using only what has been given to us. In other words, to invent, you need to start with an innovative data collection model. That means moving with speed and collecting the specific data that will add value in a meaningful way, not only for us but for our customers.

Read more of this post on the Xively blog.


Prescriptive Analytics: Predict and Shape the Future

This article originally appeared on Gigaom

– By Andy Thurai (@AndyThurai) and Atanu Basu (@atanubasu). Andy Thurai is the Chief Architect and CTO for Intel's App Security unit. Atanu Basu is the CEO of Ayata.

Knowledge is power, according to Francis Bacon, but knowing how to use knowledge to create an improved future is even more powerful. The birth of a sophisticated Internet of Things has catapulted hybrid data collection, which mixes structured and unstructured data, to new heights.

Broken Analytics

According to Gartner, 80% of the data available has been collected within the past year. In addition, 80% of the world’s data today is unstructured. Using older analysis, security, and storage tools on this rich data set is not only painful but produces laughable results.

Even now, most corporations use descriptive/diagnostic analytics. They use existing structured data and correlated events, but usually leave the newer, richer, bigger unstructured data untouched. The analyses are built on partial data and usually produce incomplete takeaways.

Smarter Analytics to the rescue

Gaining momentum is a newer analytics technology called prescriptive analytics, which is about figuring out the future and shaping it using this hybrid data set. Prescriptive analytics is evolving to a stage where business managers, without the need for data scientists, can predict the future and make prescriptions to improve it.

Prescriptive analytics is working toward that “nirvana” of event prediction paired with a proposed set of desired actions that can mitigate an unwanted situation before it happens. If a machine prescribes a solution in anticipation of a future issue and you ignore it, the machine can think forward and adapt automatically: it realizes no action was taken, predicts a different course of events based on the missed action, and generates a different prescription that takes the new future into account.
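That predict-prescribe-replan loop can be sketched in a few lines. This is a toy illustration only; every function name, threshold, and risk formula below is a hypothetical assumption, not taken from any real prescriptive-analytics product.

```python
# Illustrative sketch of a prescriptive-analytics feedback loop.
# All names and numeric thresholds are hypothetical.

def predict_outcome(state, action_taken):
    """Toy predictor: a mitigated risk shrinks, an ignored one grows."""
    risk = state["risk"]
    return risk * 0.5 if action_taken else min(1.0, risk * 1.5)

def prescribe(risk):
    """Map a predicted risk level to a recommended action."""
    if risk > 0.7:
        return "shut down and service the machine"
    if risk > 0.3:
        return "schedule preventive maintenance"
    return "no action needed"

def prescriptive_loop(state, operator_acted):
    """One iteration: re-predict based on whether the previous
    prescription was followed, then issue a new prescription."""
    state["risk"] = predict_outcome(state, operator_acted)
    return prescribe(state["risk"])

state = {"risk": 0.4}
# The operator ignores the first prescription, so the predicted risk
# grows and the next prescription escalates automatically.
first = prescriptive_loop(state, operator_acted=False)
second = prescriptive_loop(state, operator_acted=False)
```

The key point the sketch captures is that the missed action feeds back into the next prediction, so the machine's second prescription accounts for the new, worse future rather than repeating the first one.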

Read more of this post

How to effectively build a hybrid SaaS API management strategy

– By Andy Thurai (@AndyThurai) and Blake Dournaee (@Dournaee). This article was originally published on Gigaom

Summary: Enterprises seeking agility are turning to the cloud, while those concerned about security are holding tight to their legacy, on-premises hardware. But what if there’s a middle ground?

If you’re trying to combine a legacy and a cloud deployment strategy without having to do everything twice, a hybrid strategy might offer the best of both worlds. We discussed that in our first post, “API Management – Anyway you want it!”.

In that post, we discussed the different API deployment models as well as the need to understand the components of API management, your target audience, and your overall corporate IT strategy. The article drew tremendous readership and positive comments. (Thanks for that!) But there seems to be some confusion about one particular deployment model we discussed: the Hybrid (SaaS) model. We heard from a number of people asking for more clarity on this model. So here it is.

Meet Hybrid SaaS

A good definition of Hybrid SaaS would be: “Deploy the software as a SaaS service and/or as an on-premises solution, make those instances co-exist and communicate securely with each other, and have each be a seamless extension of the other.”

Read more of this post

API Management – Anyway you want it!

– By Andy Thurai (Twitter:@AndyThurai) and Blake Dournaee (@Dournaee). This article originally appeared on Gigaom.

Enterprises are building an API First strategy to keep up with their customers' needs and to provide resources and services that go beyond the confines of the enterprise. With this shift to using APIs as an extension of enterprise IT, the key challenge remains choosing the right deployment model.

Even with bullet-proof technology from a leading provider, your results could be disastrous if you start off with the wrong deployment model. Consider developer scale, innovation, ongoing costs, the complexity of API platform management, and so on. On the other hand, forcing internal developers to hop out to the cloud to get API metadata when your internal API program is just starting is an exercise in inefficiency and inconsistency.

Components of APIs

But before we get to deployment models, you need to understand the components of API management, your target audience and your overall corporate IT strategy. These certainly will influence your decisions.

Not all enterprises embark on an API program for the same reasons: enterprise mobility programs, rationalizing existing systems as APIs, or finding new revenue models, to name a few. All of these factors influence your decisions.

API management has two major components: the API traffic and the API metadata. The API traffic is the actual data flow, and the metadata contains the information needed to certify, protect, and understand that data flow. The metadata describes the details about the collection of APIs. It consists of information such as interface details, constructs, security, documentation, code samples, error behavior, design patterns, compliance requirements, and the contract (usage limits, terms of service). This is the rough equivalent of the registry and repository from the days of service-oriented architecture, but it contains a lot more. It differs in a key way: it is usable and human-readable. Some vendors call this the API portal or API catalog.
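To make the traffic/metadata split concrete, here is what one catalog entry might look like, modeled as a plain Python dictionary. Every field name, URL, and value is an illustrative assumption, not any vendor's actual portal schema.

```python
# Hypothetical API-catalog (metadata) entry. The API traffic itself is
# the runtime request/response flow; this record merely describes it.
orders_api = {
    "name": "orders",
    "interface": {
        "base_url": "https://api.example.com/v1/orders",  # illustrative
        "methods": ["GET", "POST"],
        "format": "JSON",
    },
    "security": {"auth": "OAuth 2.0", "transport": "TLS"},
    "documentation": "https://developer.example.com/docs/orders",
    "error_behavior": {"429": "rate limit exceeded; retry with backoff"},
    "compliance": ["PCI DSS"],
    "contract": {
        "usage_limit_per_hour": 1000,
        "terms_of_service": "https://example.com/tos",
    },
}

# An API portal could render or validate entries like this one.
required = {"name", "interface", "security", "contract"}
assert required.issubset(orders_api)
```

Unlike an old SOA registry export, an entry like this is meant to be read by a human developer as much as by tooling, which is exactly the "usable and human-readable" distinction above.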

Next you have developer segmentation, which falls into three categories – internal, partner, and public. The last category describes a zero-trust model where anyone could potentially be a developer, whereas the other two categories have varying degrees of trust. In general, internal developers are more trusted than partners or public, but this is not a hard and fast rule.
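The three developer tiers can be sketched as a simple policy model. The tiers themselves come from the text above; the specific quota numbers are hypothetical assumptions chosen only to show that the zero-trust public tier is typically the most constrained.

```python
# Sketch of three-tier developer segmentation. Quota values are
# illustrative assumptions, not recommendations.
from enum import Enum

class DeveloperTier(Enum):
    INTERNAL = "internal"
    PARTNER = "partner"
    PUBLIC = "public"  # zero-trust: anyone may register

def default_rate_limit(tier: DeveloperTier) -> int:
    """Hypothetical per-hour quotas by trust tier. Internal developers
    are generally more trusted than partners or the public, though, as
    noted above, this is not a hard and fast rule."""
    return {
        DeveloperTier.INTERNAL: 10_000,
        DeveloperTier.PARTNER: 1_000,
        DeveloperTier.PUBLIC: 100,
    }[tier]
```

In practice a real API gateway would attach many more policies (auth strength, metadata visibility, approval workflow) to these tiers; the quota is just the simplest one to show.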

Armed with this knowledge, let’s explore popular API Management deployment models, in no particular order.

Read more of this post

How APIs Fuel Innovation

– By Andy Thurai (Twitter: @AndyThurai)

This article originally appeared on ProgrammableWeb.

There has been much talk about how APIs add new revenue channels, create brand-new partnerships, let business partners integrate with ease, and help promote your brand. But an important and often overlooked aspect, a byproduct of this new paradigm shift, is the faster innovation channel they provide. Yes, Mobile First and the API economy are both enabled by APIs.


Read more of this post

Taming Big Data Location Transparency

Andy Thurai, Chief Architect & CTO, Intel App security & Big Data (@AndyThurai) | David Houlding, Privacy Strategist, Intel (@DavidHoulding)

Original version of this article appeared on VentureBeat.

Concern over big-government surveillance and security vulnerabilities has reached global proportions. Big data and analytics, government surveillance, online tracking, behavior profiling for advertising, and other major tracking trends have elevated privacy risks and identity-based attacks. This has prompted review and discussion of revoking or revising the data protection laws governing trans-border data flow, such as EU Safe Harbor, Singapore's privacy laws, and Canadian privacy laws. The business impact to the cloud computing industry is projected to be as high as US $180B.

The net effect is that the need for privacy has emerged as a key decision factor for consumers and corporations alike. Data privacy, and more importantly identity-protected, risk-mitigated data processing, are likely to rise further in importance as major new privacy-sensitive technologies emerge. These include wearables, the Internet of Things (IoT), APIs, and the social media that power big data and analytics, all of which increase the associated privacy risks and concerns. Brands that establish and build trust with users will be rewarded with market share, while those that repeatedly abuse user trust with privacy faux pas will see eroding trust and market share. Providing transparency and protection for users' data, regardless of how it is stored or processed, is key to establishing and building that trust. This can only happen if providers are willing to offer this location and processing transparency to the corporations that use them.

Read more of this post

Don’t be stupid, use (cloud) protection!

– By Andy Thurai (Twitter: @AndyThurai)

This article originally appeared on PandoDaily.

Looks like Obama read my blog! The White House got the message. Politicians now seem to understand that while they are trying to do things to save the country, such as creating NSA programs, they cannot do that at the cost of thriving and innovative businesses, especially cloud programs, which are in their infancy. Recently, Obama met with technology leaders from Apple, AT&T, Google and others behind closed doors to discuss this issue.

While American initiatives, both federal and commercial, are trying everything to fix this issue, I see vultures in the air. I have seen articles urging nationalism among Canadian companies, asking them to go Canadian. They are also using scare tactics to steer business toward them, which in my view will not help global companies.

Read more of this post
