Is Your AI Ethical?


[Pic Courtesy: Atlantic Re:think]

A group of teachers successfully sued the Houston Independent School District (HISD) in 2017, claiming their Fourteenth Amendment rights were violated when the district used an opaque artificial intelligence (AI) algorithm to evaluate teachers and terminate 221 of them. The judge overturned the district's use of the algorithm, writing, “When a public agency adopts a policy of making high stakes employment decisions based on secret algorithms (aka, AI and Neural Networks) incompatible with a minimum due process, the proper remedy is to overturn the policy.”

The fields of computer-modeled risk assessment and algorithmic decision making have been around for a while, but AI takes them to the next level, as demonstrated by Cambridge Analytica’s recent infamous work. AI is having an even bigger impact on our lives than the one portrayed in movies like Terminator and I, Robot. While those movies suggest that robots might end human freedom and control us, the biased, unfair, or downright unethical decision-making algorithms that machines automatically create and apply pose a bigger risk to humanity.

Read more of this post

Cognitive disruption: Where man and machine become one!

In 2017, digital disruption is already history. If you are not thinking about cognitive disruption, your business is well behind the technology curve.

In IBM’s recent annual survey of global CEOs, about 73 percent said cognitive computing will play an important role in the near future of their organizations, a sentiment echoed by other C-suite executives as well. Yet while almost three-fourths of CEOs agree that their businesses and their industries will be disrupted by cognitive computing in the near future, surprisingly only about half of them plan to adopt it by 2020.

While that may seem stunning, the primary reason is pretty clear: infusing cognition into an existing infrastructure is extremely difficult, time-consuming, and expensive.

Read more of this post

Weather predictions, APIs, IoT, and a powerful digital platform for your uninterrupted Business

This article was originally published on IBM Big Data Hub.

Many of us watch weather forecasts to figure out what to wear the next day and forget about them right after, unless of course there is snow in the forecast. Here in the Northeast especially, we dread watching the weather report for about six months of the year.

For this reason, IBM’s acquisition of The Weather Company was a head-scratching moment for many, because we are used to only the consumer side of weather, not the business side—especially given the heavy speculation in The Wall Street Journal.

Why, then, did IBM, an IT software company, go after The Weather Company? IBM started this fundamental shift a few years ago, transforming itself from a big IT and mainframe provider into a digital, data, and insight company. Recent speeches by IBM’s CEO clearly articulate that its main focus has shifted toward cognitive computing, analytics, IoT, APIs, hybrid cloud, and digital platforms that help big corporations reinvent themselves and engage in the digital economy.

Read more of this article.

Digitizing Healthcare, Because Our Lives Matter

This article originally appeared on IBM Big Data & Analytics Hub.

The United States spends around 17–18% of its GDP on healthcare every year. In dollar terms, that is a mind-boggling $2.9 trillion.

Unfortunately, that spending will now grow at an even faster rate as baby boomers age; at about 76 million people, they are the largest demographic in the U.S., accounting for roughly 25% of the population. Healthcare spending is expected to grow faster than the sub-5% annual rate of the last decade.

Unless the U.S. gets this spiraling healthcare spending under control, in a few short years we will be spending almost 25% of our entire GDP on healthcare, trying to fix people’s failing health instead of spending it elsewhere, where it is desperately needed. Obviously, we can’t stop the population from aging, but we can make the healthcare system more efficient. Overall, chronic diseases account for about 86% of healthcare spending in the U.S. Severe chronic conditions such as heart disease, arthritis, asthma, and diabetes alone account for 33% of the total spending.
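A quick sanity check of the figures above, using only the numbers the post itself cites (the GDP value is implied by the 17–18% figure, not stated directly):

```python
# All inputs are the article's own numbers.
total_spend = 2.9e12             # annual U.S. healthcare spending, in USD
implied_gdp = total_spend / 0.175  # if spending is ~17.5% of GDP

chronic_share = 0.86             # chronic disease share of healthcare spend
severe_share = 0.33              # heart disease, arthritis, asthma, diabetes

chronic_spend = total_spend * chronic_share
severe_spend = total_spend * severe_share

print(f"Implied GDP: ${implied_gdp / 1e12:.1f}T")          # ≈ $16.6T
print(f"Chronic-disease spend: ${chronic_spend / 1e12:.2f}T")  # ≈ $2.49T
print(f"Severe chronic spend: ${severe_spend / 1e12:.2f}T")    # ≈ $0.96T
```

The chronic-disease figure alone, roughly $2.5 trillion a year, shows why efficiency gains there matter more than anywhere else.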

Read more of this post

Bringing your ideas to life in the digital economy

Bringing your life-changing ideas to fruition needs a different mindset (and toolset) in the digital economy. The need for speed with digital innovation is more important than ever, with every start-up trying to push the envelope with their new ideas.

Remember the good old days, when you had an idea and tried to prove it by doing a POC (proof of concept)? While a POC technically stands for proving a concept, most of the time it is really a proof of technology: we try to figure out how to shoehorn the new idea into an existing IT infrastructure and make it work, often failing. I still remember the days when we used to wait for the security admins to open the appropriate ports and grant us the right access before we could even start to “install” our software and try things out. Those days are gone!

In the digital economy, once you get an idea, you cannot afford to sit around and build it for years, not even months, like legacy enterprise software.

Read more of this post

Success of Data Insight–Driven Enterprises in Digital Economy

This article originally appeared on IBM Data Magazine.

Connecting everything to the Internet—the Internet of Everything—brings an interesting problem to the forefront: data onslaught. Examples of data onslaught in the new digital economy include the 2.5 quintillion bytes of new data collected every single day (expected to triple by 2017), the 2.5 PB of data a major retailer collects every hour, and the fact that by 2015, 1 trillion devices are expected to be connected to the Internet, generating data for consumption.

A key point that almost every organization seems to miss in the data economy is that just because they are collecting so much data doesn’t mean they are collecting the right data, or even enough data. They may be collecting very little of something very important, or not collecting the right data at all. Even more appalling are situations in which organizations collect huge amounts of data and do absolutely nothing with it. People often make the mistake of equating volume with value.

Read more of this post

Going beyond the mobile app gold rush

Recently, I wrote a blog post, What powers the mobile economy?, which sparked a lot of interesting conversations. A few large enterprise customers reached out and said they could relate to the points in my post. In my follow-up conversations with them, a couple more interesting views came up.


Read more of this post

Did Germany Cheat in World Cup 2014?

– By Andy Thurai (@AndyThurai)

This blog originally appeared on BigML blog site.

Now that I got your attention about Germany’s unfair advantage in the World Cup, I want to talk about how they used analytics to their advantage to win the World Cup—in a legal way.


I know the first thing that comes to mind when talking about unfair advantage is either performance-enhancing drugs (baseball and cycling) or SpyCam (football, the NFL kind). Being a Patriots fan, it hurts to even write about SpyCam, but there are ways to gain a similar edge without recording the opposing coaches’ signals or play calling.

It looks like Germany did a similar thing, legally, and had a virtual 12th man on the field the whole time. For those who don’t follow football (the soccer kind) closely, it is played with 11 players per side.

So much has been said about big data, analytics, and machine learning from a technology standpoint. But the World Cup gave us all an outstanding use case for the application of those technologies.

Read more of this blog.

Not with Intel Any More…

You might have read my recent blog about Kin Lane. I didn’t realize that I would have to make a decision of my own when I wrote that blog. Though our situations were entirely different, it is always tough to call it right when you are faced with multiple choices, especially when all of them seem like the right answer. In any case, I have decided to move on from my position at Intel in pursuit of other opportunities.

Read more of this post

Prescriptive Analytics: Predict and Shape the Future

This article originally appeared on Gigaom.

–  By Andy Thurai (@AndyThurai) and Atanu Basu (@atanubasu). Andy Thurai is the Chief Architect and CTO for Intel App Security unit. Atanu Basu is the CEO of Ayata.

Knowledge is power, according to Francis Bacon, but knowing how to use knowledge to create an improved future is even more powerful. The birth of a sophisticated Internet of Things has catapulted hybrid data collection, which mixes structured and unstructured data, to new heights.

Broken Analytics

According to Gartner, 80% of the data available today was collected within the past year, and 80% of the world’s data is unstructured. Using older analysis, security, and storage tools on this rich data set is not only painful but produces laughable results.

Even now, most corporations use descriptive or diagnostic analytics. They rely on existing structured data and correlated events but usually leave the newer, richer, bigger unstructured data untouched. The resulting analyses are built on partial data and usually produce incomplete takeaways.

Smarter Analytics to the rescue

Gaining momentum is a newer type of analytics technology, called prescriptive analytics, which is about figuring out the future and shaping it using this hybrid data set. Prescriptive analytics is evolving to a stage where business managers, without the need for data scientists, can predict the future and prescribe actions to improve that predicted future.

Prescriptive analytics is working toward that “nirvana” of event prediction plus a proposed set of actions that can mitigate an unwanted situation before it happens. If a machine prescribes a solution anticipating a future issue and you ignore it, the machine can think ahead and adapt automatically: it realizes no action was taken, predicts a different course of events based on the missed action, and generates a new prescription that accounts for the new future.

Read more of this post
