
How to Become a Data-First Enterprise

Jeremy_Rader

Insights powered by diverse and rapidly growing data sources are the currency of today’s digital business. The top trends in technology, including edge deployments, autonomous vehicles, blockchain, natural language processing, and virtual reality, all depend on analytics and artificial intelligence (AI). As society generates an unprecedented amount of data, there is an opportunity to make systems more efficient and to create entirely new, immersive services. The challenge is the journey from raw data to useful insights.


Intel is no stranger to this challenge. We have 315 petabytes of data across more than 140,000 different sources, and the volume grows exponentially each year. Seven years ago, our data landscape likely looked like that of most enterprises at the time: silos scattered across supply chain, marketing, customer, manufacturing, finance, and other functions. We set out on a mission to connect our corporate data where it made sense in order to drive more value. That journey led us to create a unified data repository. By bringing our data together and setting a framework for cleaning, storing, and managing it, we enabled machine learning to take hold, setting the stage for advanced analytics and AI.


With edge devices expected to generate as much as 75% of enterprise data by 2025, an enterprise’s data journey can no longer be siloed; it needs to be a flowing system in which data can be moved, stored, and processed wherever it’s needed. You cannot expect to tap into machine learning and deep learning without first getting the rest of your data pipeline in working order. To expand data analytics capacity, enterprises need an end-to-end strategy that addresses inefficiencies in the data pipeline, optimizing software, hardware, and libraries at every stage of the data life cycle: ingest, prepare, model, and deploy.
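
To make those four stages concrete, here is a minimal sketch in Python of one pass through the life cycle. The file names, columns, and model choice are all hypothetical, and a real pipeline would distribute these stages across edge, data center, and cloud rather than run them in one script.

```python
# Minimal sketch of the four life-cycle stages (ingest, prepare, model, deploy).
# All file names, columns, and the model choice are hypothetical.
import pandas as pd
import joblib
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Ingest: pull raw data from wherever it lands (files, streams, databases).
raw = pd.read_csv("sensor_readings.csv")

# Prepare: clean and reshape the data so it is model-ready.
clean = raw.dropna().assign(temp_c=lambda df: (df["temp_f"] - 32) * 5 / 9)

# Model: train and validate on the prepared features.
X, y = clean[["temp_c", "vibration"]], clean["failure"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Deploy: persist the trained model for a serving layer to load.
joblib.dump(model, "failure_model.joblib")
```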



Find End-to-End Solutions


Intel works with an ecosystem of partners to create end-to-end solutions for the modern data pipeline. For example, Alluxio, Hazelcast, H2O.ai, and Vizion.ai all have projects underway to help enterprises build more efficient data pipelines that are ready to take advantage of advancements in analytics and AI.


Alluxio, an open source cloud data orchestration software company, is collaborating with Intel to offer an in-memory acceleration layer built on 2nd Generation Intel® Xeon® Scalable processors and Intel® Optane™ persistent memory (PMem). Together, Alluxio and Intel are improving how customers manage and process data, including optimizations for Intel Deep Learning Boost. On 4TB decision support queries, benchmarking shows that adding Alluxio with Intel persistent memory delivers 2.13x faster completion than local HDFS and a 1.92x speedup over a DRAM cache.
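
From an application’s point of view, an Alluxio caching layer is largely transparent: jobs address data through Alluxio’s HDFS-compatible alluxio:// scheme, and the workers serve hot data from their fast tiers. The sketch below is a hedged illustration using PySpark; the cluster address and path are hypothetical, the Alluxio client library must be on Spark’s classpath, and tier placement (DRAM versus Optane PMem) is configured on the Alluxio workers rather than in application code.

```python
# Hypothetical PySpark job reading through an Alluxio caching layer instead of
# hitting HDFS directly; repeated reads are served from Alluxio's fast tiers.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("alluxio-cache-demo").getOrCreate()

# Same data a job would otherwise read via hdfs://, addressed through Alluxio.
# 19998 is Alluxio's default master RPC port; host and path are made up.
df = spark.read.parquet("alluxio://alluxio-master:19998/warehouse/lineitem")
df.groupBy("l_returnflag").count().show()
```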


Hazelcast’s Project Veyron focuses on optimizing the performance of the Hazelcast in-memory computing platform for 2nd Gen Intel Xeon Scalable processors and Intel Optane PMem. Hazelcast and Intel are currently working to provide an integrated edge-to-cloud IoT processing solution for the financial services, telecommunications, energy, manufacturing, and entertainment industries. Project Veyron will accelerate the completion of parallel in-memory tasks, complex analyses for more sophisticated models, and the use of structured and unstructured data sets. In persistent mode, Hazelcast can use Intel Optane PMem as a fast medium for its Hot Restart Store feature, and initial benchmarks show that restarts can be up to 3.5x faster with Intel Optane PMem.
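
For a sense of what the platform looks like in code, here is a small sketch using the open source Hazelcast Python client (hazelcast-python-client); the cluster address and map contents are hypothetical. Hot Restart persistence on Optane PMem is configured on the server members, so client code like this is the same whether or not PMem backs the cluster.

```python
# Hypothetical client session against a Hazelcast cluster; entries in the
# distributed map live in cluster memory, partitioned across members.
import hazelcast

client = hazelcast.HazelcastClient(cluster_members=["hazelcast-node-1:5701"])

readings = client.get_map("sensor-readings").blocking()
readings.put("device-42", {"temp_c": 71.3, "status": "ok"})
print(readings.get("device-42"))

client.shutdown()
```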


H2O.ai’s Project Blue Danube focuses on accelerating and scaling H2O.ai technologies on Intel platforms, including 2nd Gen Intel Xeon Scalable processors, so enterprises can gain a competitive edge with a highly scalable, cost-effective path to AI insights and results. With the “Make Your Own AI” recipes, which combine the Intel® Data Analytics Acceleration Library (DAAL) with the H2O.ai open source recipe repository, customers can now achieve machine learning at speed and scale, handling data sets up to 4x larger than traditional memory systems allow.
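
As a rough illustration of the open source H2O-3 workflow those recipes build on, the sketch below trains a simple model with the h2o Python package; the dataset, columns, and parameters are hypothetical, and the DAAL-accelerated recipes themselves are not shown.

```python
# Hypothetical H2O-3 training run (pip install h2o). H2O keeps the frame in
# its own distributed, in-memory store rather than in the Python process.
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

h2o.init()  # start or attach to a local H2O cluster

frame = h2o.import_file("transactions.csv")
frame["is_fraud"] = frame["is_fraud"].asfactor()  # classification target
train, valid = frame.split_frame(ratios=[0.8], seed=42)

model = H2OGradientBoostingEstimator(ntrees=50)
model.train(x=["amount", "merchant", "hour"], y="is_fraud",
            training_frame=train, validation_frame=valid)
print("validation AUC:", model.auc(valid=True))
```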


Vizion.ai’s multi-cloud solution is built on 2nd Gen Intel Xeon Scalable processors and accelerated by Intel Optane technology. The SaaS offering gives customers the ability to search, analyze, and store massive volumes of data while lowering costs. "We observed a reduction in latency by 80 percent and accelerated indexing by 3x compared to a popular hyperscale cloud environment," says William Bell, EVP of Product at phoenixNAP, which uses the Vizion.ai Analytics Service for Elasticsearch™ with Intel Optane PMem.
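
Because the service exposes a standard Elasticsearch-compatible endpoint, the indexing and search workload it accelerates looks like ordinary client code. A hedged sketch with the official Python client follows; the endpoint, index, and documents are hypothetical, and a real deployment would also pass credentials.

```python
# Hypothetical index-and-search round trip with the Elasticsearch Python client
# (pip install elasticsearch); any PMem-backed acceleration is server-side.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://search.example.com:9200")

es.index(index="logs", document={"service": "billing", "msg": "payment timeout"})
es.indices.refresh(index="logs")  # make the new document searchable now

hits = es.search(index="logs", query={"match": {"msg": "timeout"}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"])
```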



Invest in Consistent Architecture


The foundation of a valuable end-to-end solution is a consistent, flexible architecture. As advanced analytics and AI become part of nearly every business process, it’s important for enterprises to get started now, but to do so within the appropriate framework. While the executive “AI mandate” holds in almost every industry, the reality is that “77% of organizations report business adoption of big data and AI initiatives as a big challenge.” Prioritizing use cases that are critical and unique to the company, and collaborating with internal business partners from the beginning, is the first step to overcoming adoption challenges.


By aiming for quick, deliverable wins, enterprises can prioritize projects with the highest ROI and lowest complexity. Because there is no one-size-fits-all approach to AI, a consistent architecture with open, flexible, and scalable AI software lets enterprises pursue projects ranging from statistical analysis to machine learning to deep learning without investing in single-purpose products. Even if some AI projects fail, the underlying infrastructure is already prepared for new projects to begin, and large incremental compute investments have not gone to waste. Proven, world-class 2nd Gen Intel Xeon Scalable processors deliver scalable performance for a wide variety of analytics and other enterprise applications, speeding up deployment on infrastructure you already trust.
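
As a toy illustration of that triage, the snippet below ranks hypothetical candidate projects by ROI per unit of complexity so that quick, deliverable wins float to the top; the projects, scores, and weighting are all invented, and a real assessment would also weigh strategic value, data readiness, and risk.

```python
# Toy prioritization: hypothetical projects ranked by ROI per unit complexity.
candidates = [
    {"name": "demand forecasting",   "roi": 8, "complexity": 3},
    {"name": "visual inspection AI", "roi": 9, "complexity": 8},
    {"name": "chatbot pilot",        "roi": 4, "complexity": 2},
]

for p in sorted(candidates, key=lambda p: p["roi"] / p["complexity"], reverse=True):
    print(f"{p['name']}: score {p['roi'] / p['complexity']:.2f}")
```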



Become Data-Centric


As the volume of data continues to rise, enterprises face new challenges in moving, storing, and processing that data. Intel’s portfolio leadership comes from creating products that address multiple steps in the data journey, from Ethernet and silicon photonics to persistent memory and storage, and from FPGAs and accelerators to world-class CPUs. These products are tied together by software partners and by our own team of more than 15,000 software engineers across the globe, who optimize every layer of the system software infrastructure, from the operating system to applications, including libraries, industry frameworks, and tools.


Our enterprise software strategy is to simplify and accelerate the development of end-to-end solutions that address the entire data analytics and AI pipeline, from ingest to insights. Optimized for individual workloads from databases to in-memory analytics, Intel technology helps businesses extract actionable insights from their data. For a deeper look at how your enterprise can plan for the evolution of analytics, read this business brief that outlines how Intel technology can improve performance and decrease TCO.

About the Author
Jeremy Rader, General Manager, Senior Director, Digital Transformation and Scale Solutions, Data Platforms Group, is responsible for enabling business transformation through analytics, AI, and HPC solutions and for shaping next-generation silicon requirements.