
Equipping AI with emotional intelligence can improve outcomes

There is a significant gap between an organization’s ambitions for using artificial intelligence (AI) and the reality of how those projects turn out, Intel chief data scientist Dr. Melvin Greer said in a conversation with VentureBeat founder and CEO Matt Marshall at last week’s Transform 2021 virtual conference.

One of the key areas is emotional intelligence and mindfulness. The pandemic highlighted this gap: The way people had to juggle home and work responsibilities meant their ability to stay focused and mindful could be compromised, Greer said. This could be a problem when AI is used in a cyberattack, like when someone is trying to use a chatbot or some other adversarial machine learning technique against us.

“Our ability to get to the heart of what we’re trying to achieve can be compromised when we are not in an emotional state and mindful and present,” Greer said.

Align AI with cloud projects

In a recent Harvard Business Review survey of 3,000 executives in 14 industry sectors, just 20% of respondents said they have actually implemented AI as part of their core business.

To bridge the gap between ambition and reality in AI, it is “absolutely critical” that organizations align AI with their cloud computing and cybersecurity initiatives, Greer said. When organizations align AI with other ongoing digital transformation initiatives, such as cybersecurity and cloud computing, that alignment becomes a force multiplier. These initiatives don’t require the same skills, move at the same pace, or achieve the same goals, but they do fit together. Cloud computing, as a place where large volumes of data are stored, can be a catalyst for AI, he said. Cybersecurity is another because the data, data models, and algorithms need to be protected.

“What we are seeing is that there is an inflection point, and what it requires us to do is to think more clearly around all the other initiatives that are going on in our digital transformation or artificial intelligence projects,” he added.

Quantum vs. neuromorphic

Enterprise leaders have to stay current with trends because the field is evolving rapidly, but some emerging technologies are still years away from practical use. Quantum computing and neuromorphic computing are two very exciting research areas, Greer said, but neither has reached the point of commercial application. In 2017, Intel formed its neuromorphic research community with about 100 universities and 50 industry partners. Researchers get access to hardware and computing platforms, along with a software development kit built specifically for software optimization, Greer said.

“We will see commercial applications in neuromorphic, brain-inspired computing much sooner than we will with quantum,” Greer predicted, but noted that was still five to 10 years out.

In the past few years, Intel has remade itself as a data-centric organization that treats AI as a core competency. Yet across the industry, Greer said, a significant gap remains between the ambitions organizations set for AI and the insights their data and programs actually deliver. Closing that gap, he argued, includes accounting for the emotional intelligence and mindfulness of the people working with AI, which the pandemic has put under strain as individuals juggle multiple tasks at once.

Growing AI capabilities

Greer noted that while investments in AI initiatives have tripled since 2016, many of those are driven by the fear of missing out, rather than successes in the development and deployment of AI. The enthusiasm, investment, and activity around AI aside, organizations need a pragmatic approach, Greer said.

One thing to consider is that in some cases, AI is not a suitable option, he said. It is important to be “absolutely crystal clear” about the problem to solve before trying to figure out whether to run deep learning applications.

Understanding the workforce, which means building diverse teams to develop and distribute AI capabilities, is most critical, Greer said. The lack of diverse talent “requires us to pretend everybody is representative of the very homogeneous people that make up the talent pool,” he said.

Having a data strategy

Another gap enterprises often overlook is the amount of data they have and what they can do with it. Many enterprises lack the access and management capabilities needed to make their data useful. Greer estimated that 85% of a data scientist’s job is making data available, manageable, and governable so it can be used. Data needs to be classified, managed, and labeled at the point it is created. Considering that data is being created at 3.7 terabytes per person every day, it isn’t easy to go back and clean data later. Before an organization can develop an AI strategy, it first has to create a data strategy.

“We’re still very much in a situation where if we have really bad data, we will simply do stupid things faster with machines, and we will train them to do things which are inherently erroneous or biased,” Greer said.

It is imperative that researchers, scientists, and developers take a human-centric approach to data and AI systems. Intel has published its ethical principles, or human rights policy, around how AI should be used, and is engaged with non-governmental and international organizations on how to use AI for good, Greer said.

“Because no, data is not oil. And data is not fuel. Data is people,” Greer said.
