Value of Things – Internet of Things

Recently, I had the privilege to present on IoT security, alongside Michael Curry of IBM, at the MassTLC “Value of Things” conference. You can see the slides here http://www.slideshare.net/MassTLC/andy-thurai-iot-security. I will post the video once it is published.

One of the topics I discussed, which resonated well with the crowd, was IoT (Internet of Things) devices doing both data collection and process control on the same device — not only on the same device, but most times on the same plane. This means that anyone with access to those data collection mechanisms also gets to control the processes, which could be dangerous in the wrong hands.

[Image: data collection and process control on different planes]

Very often I see customers use the same device to both collect the data and control the systems. This is especially true in so-called "industrial automation," such as manufacturing, power grids, and other "smart" systems. Though these systems were put in place long before the IoT, they are getting Internet-enabled now, which is a little scary. Security was never a prime concern for these networks because most of these controllers sat on private, and often completely isolated, networks. Putting these devices, and their associated isolated networks, on the Internet without beefing up the security is asking for disaster.

SCADA systems (Supervisory Control And Data Acquisition) and the larger category of ICS (Industrial Control Systems) all fall into this bucket. They were all built before the current IoT infestation (and I am one of those guys who started his career working on those systems waaay back when), so you can't really blame the way they were built. For the time, the purpose, and the network they were built for, I think it was a solid design. But you need to be very careful when you put them on the Internet.

Generally, hackers try to break into your system for one of two reasons: either they want to steal your data so they can monetize it (credit card numbers, financial data, etc.), or they want to disrupt your system to cause chaos (power grid interruption, supply chain failure, etc.). If — and in some cases it is just a matter of when — the bad guys break into your systems for one of those reasons, giving them the opportunity to do the other as well is the worst-case scenario, and it could lead to disastrous results.

To begin with, mixing data collection and control signals on the same plane is a clear violation of separation of duties/responsibilities. The reason this is so important is that if one of them is compromised, the other will be too.
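The separation of duties argued for above can be pictured with a small sketch. This is a hypothetical illustration only — the class names, the shared-secret check, and the valve command are all invented, not any real SCADA or IoT API — but it shows the idea of keeping the read-only data plane apart from the authenticated control plane, so that compromising one does not hand over the other:

```python
# Hypothetical sketch: a device whose telemetry (data plane) has no
# path to actuation (control plane). Names are invented for illustration.

class TelemetryPlane:
    """Read-only data collection; exposes copies only, no actuation."""
    def __init__(self):
        self._readings = []

    def record(self, sensor_id, value):
        self._readings.append((sensor_id, value))

    def export(self):
        # Reachable from the collection network; returns a copy.
        return list(self._readings)

class ControlPlane:
    """Actuation lives behind its own credential check."""
    def __init__(self, shared_secret):
        self._secret = shared_secret
        self.valve_open = False

    def actuate(self, command, secret):
        if secret != self._secret:
            raise PermissionError("control plane rejects unauthenticated command")
        if command == "open_valve":
            self.valve_open = True

# Reading the telemetry export does not yield control:
telemetry = TelemetryPlane()
control = ControlPlane(shared_secret="rotate-me")
telemetry.record("temp-1", 71.3)
data = telemetry.export()          # data is visible...
try:
    control.actuate("open_valve", secret="guessed")
except PermissionError:
    pass                           # ...but actuation is not
```

In a real deployment the two planes would ideally sit on separate network segments with separate credentials, but even this in-process split breaks the "steal the data, get the controls for free" failure mode.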

Granted, this is a more difficult problem to solve because the device footprints are generally tiny — you can't have parallel devices each doing one thing. But before you put these devices on the Internet, you will be better off doing a process and security architecture review. It might save you a lot of headaches.

 

Are you a “data liberal” or a “data conservative”?

- By Andy Thurai (@AndyThurai). This article was originally published on Xively blog site.

In the last decade, as a society, we have worked very hard toward "liberating our data" — unshackling it from the plethora of constraints unnecessarily imposed by I.T. In contrast, in the 90s and early 00s, data had been kept in the Stygian depths of the data warehouse, where only an elite few had access to it, or knowledge about it and the characteristics defining it.

Once we had the epiphany that we could glean amazing insights from data — even from our "junk" data — our efforts quickly refocused on exposing data in every possible way. We exposed data at the bare-bones level using data APIs, at the level of value-added data platforms, or even as industry-specific solution platforms.

Thus far, we have spent a lot of time analyzing and finding patterns — in other words, innovating — with a set of data that had already been collected. I see, however, many companies taking things to the next proverbial level.

In order to innovate, we must evolve to collect what matters most to us, as opposed to resigning ourselves to just using what has been given to us. In other words, in order to invent, you need to start with an innovative data collection model. What this means is moving with speed and collecting the specific data that will add value in a meaningful way, not only for us but for our customers.

Read more of this blog on Xively blog site.

Kin Lane – the stand-up guy

Recently, I had a great conversation with Kin Lane, the API messiah, on a variety of topics including APIs, IoT, security, and enterprises coming of (digital) age in the API space. I appreciated his time after such a long trip, especially with the issues he had finding parking for his jet and all :) (Those Canadians are never kind to American jets, for sure).

One of the topics of conversation was compromising integrity and beliefs for money. You might have seen his personal blog in the news lately about him turning down a big-money offer so he could continue to do what he likes without the shackles. His blog, and the follow-up conversation we had, resonated very well with me. Some of his liberating thoughts were eye-opening to me (http://kinlane.com/2014/05/07/partnering-for-me-is-about-sharing-of-ideas-research-and-stories/).

Obviously, Kin needs no introduction. I respect his stand and his thought process. If you are not following his blogs, you are missing a lot. You can find his blog at APIevangelist.com.

Kin, kudos to you. I hope when my time comes, I can be as noble and stand-up as you are. But knowing me well, I doubt that. :)

Prescriptive Analytics: Predict and Shape the Future

This article originally appeared on Gigaom

- By Andy Thurai (@AndyThurai) and Atanu Basu (@atanubasu). Andy Thurai is the Chief Architect and CTO of the Intel App Security unit. Atanu Basu is the CEO of Ayata.

Knowledge is power, according to Francis Bacon, but knowing how to use knowledge to create an improved future is even more powerful. The birth of a sophisticated Internet of Things has catapulted hybrid data collection, which mixes structured and unstructured data, to new heights.

Broken Analytics

According to Gartner, 80% of the data available has been collected within the past year. In addition, 80% of the world's data today is unstructured. Using older analysis, security, and storage tools on this rich data set is not only painful; it will also produce laughable results.

Even now, most corporations use descriptive/diagnostic analytics. They use existing structured data and correlated events, but usually leave the newer, richer, bigger unstructured data untouched. The analyses are built on partial data and usually produce incomplete takeaways.

Smarter Analytics to the rescue

Gaining momentum is a newer type of analytics technology, called prescriptive analytics, which is about figuring out the future and shaping it using this hybrid data set. Prescriptive analytics is evolving to a stage where business managers – without the need for data scientists – can predict the future and make prescriptions to improve this predicted future.

Prescriptive analytics is working towards that “nirvana” of event prediction and a proposed set of desired actions that can help mitigate an unwanted situation before it happens. If a machine prescribes a solution anticipating a future issue and you ignore it, the machine can think forward and adapt automatically. It can realize there was no action taken and predict a different course of events based on the missed action and generate a different prescription that takes into account the new future.
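The ignore-and-adapt loop described above can be sketched in a few lines. Everything here is a toy illustration — the temperature threshold, the action names, and the log format are invented assumptions, not any vendor's algorithm — but it captures the idea that a missed prescription feeds back into the next prediction instead of being assumed fixed:

```python
# Toy sketch of a prescriptive loop: predict an issue, prescribe an
# action, and when the prescription is ignored, record that and
# generate a fresh prescription from the still-risky state.
# Threshold and action names are invented for illustration.

def predict(temperature):
    """Predict the near-future state from one reading."""
    return "overheat" if temperature > 90 else "nominal"

def prescribe(prediction):
    """Propose a desired action for a predicted problem."""
    return "reduce_load" if prediction == "overheat" else None

def prescriptive_loop(readings, actions_taken):
    log = []
    for temp, acted in zip(readings, actions_taken):
        prediction = predict(temp)
        action = prescribe(prediction)
        if action and not acted:
            # Prescription ignored: the machine does not assume the
            # problem was fixed; it re-predicts on the next reading.
            log.append((prediction, action, "ignored, re-predicting"))
        else:
            log.append((prediction, action, "ok"))
    return log

log = prescriptive_loop([95.0, 97.0, 85.0], [False, True, True])
```

Real prescriptive systems would of course weigh many hybrid data streams and possible futures at once; the point of the sketch is only the feedback step when a prescription goes unheeded.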

Read more of this post

What the Frack?

I was recently doing some research for an article on a practice in the ONG (Oil & Natural Gas) sector that has been making huge headlines: "fracking."

For those who ask, "What the Frack?"

Fracking, or frack (or hydraulic fracturing — oh my, how much we love shortening things into cute names), is a procedure in which you are essentially fracturing (or cracking) rock with hydraulics, hoping to find oil or gas. This gives us an opportunity to do horizontal drilling, which was otherwise impossible.

Conventional places are running dry so we need to find new sources – oil out of sand, gas out of rocks. We are becoming God by performing these miracles!

Read more of this post

How to effectively build a hybrid SaaS API management strategy

- By Andy Thurai (@AndyThurai) and Blake Dournaee (@Dournaee). This article was originally published on Gigaom

Summary: Enterprises seeking agility are turning to the cloud while those concerned about security are holding tight to their legacy, on-premise hardware. But what if there’s a middle ground?

If you're trying to combine both a legacy and a cloud deployment strategy without having to do everything twice, a hybrid strategy might offer the best of both worlds. We discussed that in our first post, "API Management – Anyway you want it!"

In that post, we discussed the different API deployment models, as well as the need to understand the components of API management, your target audience, and your overall corporate IT strategy. The article drew tremendous readership and positive comments (thanks for that!). But there seems to be a little confusion about one particular deployment model we discussed — the Hybrid (SaaS) model. We heard from a number of people asking for more clarity on this model. So here it is.

Meet Hybrid SaaS

A good definition of Hybrid SaaS would be: "Deploy the software as a SaaS service and/or as an on-premises solution, make those instances co-exist, communicate securely with each other, and be seamless extensions of each other."
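One common policy in such a deployment is routing by data sensitivity: sensitive traffic stays with the on-premises instance while everything else goes to the SaaS instance, with the two acting as extensions of each other. The sketch below is purely illustrative — the endpoint URLs, the data classes, and the request shape are all invented, not any product's API:

```python
# Hypothetical hybrid SaaS routing policy: sensitive data classes go
# on-premises, everything else goes to the SaaS instance.
# URLs and data classes are invented for illustration.

ON_PREM = "https://apigw.internal.example.com"
SAAS = "https://api.example-saas.com"

SENSITIVE_CLASSES = {"pii", "payment", "health"}

def route(request):
    """Pick a deployment for a request shaped like
    {'path': ..., 'data_class': ...}; unknown classes default to SaaS."""
    base = ON_PREM if request.get("data_class") in SENSITIVE_CLASSES else SAAS
    return base + request["path"]

payments_url = route({"path": "/v1/payments", "data_class": "payment"})
catalog_url = route({"path": "/v1/catalog", "data_class": "public"})
```

The same calling code works against either instance, which is exactly the "seamless extension" property the definition asks for.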

Read more of this post

Which kind of Cyborg are you?

By Andy Thurai (@AndyThurai)

[This article is a result of my conversations with Chris Dancy (www.Chrisdancy.com) on this topic. The original version of this was published on Wired magazine @ http://www.wired.com/insights/2014/01/kind-cyborg/].

Machines are replacing humans in the thinking process. The field of cognitive thinking combines rich data collection (with a wide array of sensors), machine learning, predictive analysis, and cognitive anticipation in the right mix. Machines can do "just-in-time machine learning" rather than relying on pre-built predictive models, making them virtually model-free.

The Cognitive Computing concept revolves around a few combined concepts:

  1. Machines learn and interact naturally with people to extend what either humans or machines could do on their own.
  2. They help human experts make better decisions.
  3. These machines collect richer data sets and use them in their decision making process, which creates the need for intelligent interconnected devices. This creates a network of intelligent sensors feeding the super brain.
  4. Machine learning algorithms sense, predict, infer, think, analyze, and reason before they make decisions.
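A minimal sketch of what "model-free, just-in-time machine learning" can look like in practice: instead of training a model up front, a detector folds each new sensor reading into running statistics and flags outliers as it goes. This uses Welford's online mean/variance algorithm; the 3-standard-deviation threshold is an illustrative assumption, not a standard from any cognitive computing product:

```python
# Illustrative online ("just-in-time") learner: no pre-trained model,
# statistics are updated with every reading (Welford's algorithm).
# The 3-sigma anomaly threshold is an invented example choice.

import math

class OnlineAnomalyDetector:
    def __init__(self):
        self.n = 0          # readings seen so far
        self.mean = 0.0     # running mean
        self.m2 = 0.0       # running sum of squared deviations

    def update(self, x):
        """Fold one reading in; return True if it looked anomalous
        relative to everything seen before it."""
        anomalous = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) > 3 * std
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

detector = OnlineAnomalyDetector()
flags = [detector.update(v) for v in [10.0, 10.2, 9.9, 10.1, 25.0]]
```

After four unremarkable readings the detector has "learned" the normal range on the fly, so the fifth reading is flagged — the network of intelligent sensors feeding the super brain, in miniature.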

Which kind of cyborg are you?

The field of cybernetics has been around for a long time. Essentially, it is the science (or art) of the evolution of cyborgs. Cyborgs have evolved from assistive cyborgs to creative cyborgs. Not only can they adapt to human situations, but they can also learn from human experiences (machine learning), think (cognitive thinking), and figure out (situation analysis) how to help us rather than being told.

Read more of this post
