

Intel Shares A Broader Perspective On The Artificial Intelligence Technology Landscape

This article is more than 5 years old.

At VentureBeat’s recent Transform event, I had the opportunity to interview Amir Khosrowshahi, CTO of Intel’s Artificial Intelligence Products Group (AIPG), on stage about how AI will change marketing. AI for marketing isn’t some futuristic set of tools; real work can be accomplished with AI today. For example, available AI technologies enable image recognition, content matching and recommendation engines. Additionally, new voice interfaces and chatbots are fueled by AI functions such as natural language processing and sentiment analysis.

While much of the AI discussion has focused on what underlying processor technology to use and how to design algorithms, it’s important to remember that there’s no single technique or technology for building AI. For example, graphics processing units (GPUs) have gained attention in the AI arena, but many companies are using CPUs for AI as well. Why? One reason is that everyone has CPUs. Additionally, as companies transition workloads to the cloud, existing CPU resources become available for AI.

Over the past several years, other types of processors have been invented to support AI, such as vision processing units (VPUs) for computer vision and chips optimized for TensorFlow, such as tensor processing units (TPUs). Specialized AI chips in the works, such as Intel's Neural Network Processors (NNPs), will be used to optimize deep learning workloads. Deep learning is a class of machine learning inspired by the structure of neural networks in the human brain.

During the interview, Khosrowshahi described how Taboola was using AI to improve content recommendations for websites. Taboola is an Israel-based ISV and service operator that delivers 360 billion content recommendations to over a billion unique users every month across thousands of premium sites. It's a SaaS offering that requires a high level of computational speed. Taboola's challenge was to increase data-processing speed by 30-50 percent while delivering content recommendations using deep learning inference on its custom neural network, built on a TensorFlow topology.

The conventional wisdom says GPUs are the answer to all AI problems. However, enhancements in software stacks and other optimizations mean companies can use CPUs more efficiently for specific AI workloads. Ariel Pisetzky, vice president of information technology at Taboola, shared in a separate conversation that Taboola had evaluated GPUs but decided to stay with CPUs for reasons of speed and cost. Pisetzky said, "Nvidia's chip was far faster, but time spent shuffling data back and forth to the chip negated the gains." By refining its code and adopting new software that optimizes CPUs for machine learning, Taboola achieved a 280 percent performance boost on CPUs. As a result, the same servers could handle more than twice as many requests.
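Pisetzky's point about data shuffling can be captured in a back-of-the-envelope model: end-to-end latency is the device's compute time plus any host-to-device transfer time. The numbers below are purely illustrative, not Taboola's measurements.

```python
def total_latency_ms(transfer_ms: float, compute_ms: float) -> float:
    """End-to-end inference latency: copying data to the device plus compute."""
    return transfer_ms + compute_ms

# Illustrative numbers only -- not Taboola's actual figures.
cpu_latency = total_latency_ms(transfer_ms=0.0, compute_ms=8.0)  # data already in host memory
gpu_latency = total_latency_ms(transfer_ms=6.0, compute_ms=3.0)  # faster chip, but PCIe copies

# Even though the GPU computes faster (3 ms vs. 8 ms), the copy
# overhead can make its end-to-end latency worse than the CPU's.
```

This is why a "far faster" accelerator can still lose: whenever transfer overhead exceeds the compute savings, the CPU, which already holds the data, comes out ahead.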

During the fireside chat, Khosrowshahi also shared how JD.com was using AI to enhance retail experiences. JD.com is the second largest online retailer in China with approximately 25 million registered users. JD.com is using CPUs for deep learning inference functions, such as image feature extraction. This type of AI workload enables image-similarity search and picture deduplication.
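Image-similarity search of this kind typically works by extracting a feature vector (embedding) per image with a neural network, then comparing vectors. The sketch below assumes the embeddings are already computed; the item names and 4-dimensional vectors are invented for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query_vec: np.ndarray, catalog: dict) -> list:
    """Rank catalog items by similarity to the query image's embedding."""
    scored = [(item, cosine_similarity(query_vec, vec)) for item, vec in catalog.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical 4-dimensional embeddings standing in for CNN feature vectors.
catalog = {
    "red_shoe": np.array([0.9, 0.1, 0.0, 0.2]),
    "blue_shoe": np.array([0.6, 0.4, 0.2, 0.3]),
    "toaster":  np.array([0.0, 0.9, 0.8, 0.1]),
}
query = np.array([0.85, 0.15, 0.05, 0.25])  # embedding of an uploaded photo

ranking = most_similar(query, catalog)
# Near-duplicate images score close to 1.0, so deduplication reduces to a threshold check.
```

In production the comparison runs over millions of vectors with approximate nearest-neighbor indexes, but the core idea is the same: similar images produce nearby embeddings.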

However, CPUs and GPUs aren’t the only technologies used to process large amounts of AI data. Microsoft’s AI platform, Project Brainwave, runs compute-intensive workloads on Intel Arria 10 FPGAs to speed up inference for machine learning models. Microsoft’s Bing also uses Intel FPGAs today to power search results. In other marketing and retail cases, VPUs are used to support footfall analysis, inventory monitoring and building security.

Privacy, a primary concern for all marketers, was also addressed in the session. Amid the negativity around data privacy, it was encouraging to hear that several industry-wide AI research projects support privacy, ranging from fully encrypted data and models for maximum privacy to federated learning approaches in which the user defines a subset of data that remains private.
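The federated approach can be illustrated with a minimal federated-averaging step: each participant trains on its own private data and shares only model parameters, which a coordinator combines. The client weights and dataset sizes below are invented for illustration.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Size-weighted average of client model parameters (FedAvg-style).

    Only parameters leave each client; the raw training data stays local.
    """
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Hypothetical: three clients train locally and share only weight vectors.
client_weights = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
client_sizes = [10, 10, 20]  # number of private examples per client

global_weights = federated_average(client_weights, client_sizes)
```

Clients with more data pull the global model further toward their local parameters, yet no client ever reveals the underlying examples.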

The key takeaway from my conversations with Intel is that AI isn’t one thing: it spans multiple technology architectures and use cases. Eventually, AI will be incorporated into every product and service. Marketers should also frequently re-evaluate the state of AI technology because it is changing rapidly. Over time, the industry will continue to advance existing architectures, such as GPUs, for AI, and the market will add new AI-specific architectures such as NNPs and VPUs. Companies can, and should, select a range of AI technologies based on budget and workload requirements.
