Sponsored by the Dell Technologies Center for Edge Computing and 5G

6 Good Reasons to Adopt Edge Computing


Every situation is unique. What is clear, however, is that a balance between cloud and edge computing will likely drive tomorrow’s IoT architecture.

Why should companies consider edge computing as a technology strategy? Edge introduces its own forms of complexity to information architectures, but at the same time it helps bring greater speed and security to the large, complex hairballs that today’s computing environments have become.

That’s the word from Deloitte, which makes the case for edge in a new report citing a number of advantages to moving processing and data to the edge. As much as 55 percent of IoT data could soon be processed near the source, either on the device or through edge computing, the report’s authors, Ken Carroll and Mahesh Chandramouli, both with Deloitte, state.

See also: Moving Targets: Defining the Edge and Its Architecture

There are a number of advantages organizations can realize as they move toward edge computing, Carroll and Chandramouli state. They point to no fewer than six benefits that can be seen from edge deployments:

1) Latency: This is perhaps the most compelling use case. “Having the smallest possible latency between data generation and the decision or action can be critical to preserve an organization’s agility,” Carroll and Chandramouli point out. “Because data can become essentially valueless after it is generated, often within milliseconds, the speed at which organizations can convert data into insight and then into action is generally considered mission-critical. In a cloud-only world, the data ends up traveling hundreds or even thousands of miles, so where latency is critical to a solution, edge computing can become key.”

2) Bandwidth availability and use: While bandwidth has been amped up significantly in recent years, this still is not keeping up with the enormous volume of data now surging through networks. “Bringing processing closer to the source of the data could help equalize the equation – local processing (including compression, filtering, and prioritization) can help make effective and efficient use of the available bandwidth.”

3) Avoiding interruptions in connectivity: Many IoT devices and systems have local processing and storage capabilities, and therefore do not require wider connectivity. In addition, not everyone has access to stable network connections, and events such as storms could disrupt connectivity. “Having functionality at the edge can help ensure that applications are not disrupted when network connectivity issues occur,” the Deloitte authors state.

4) Security and privacy: Then, there’s the inherent security in distributing data across scattered locations, as well as maintaining data locally rather than sending it over networks. “Using a form of hub-and-spoke cloud/edge architecture can help localize or compartmentalize device secrets into multiple nodes, rather than storing all of them in one place, thereby helping to improve security.”

5) Data normalization and filtering: Many IoT devices and systems generate enormous streams of data, which puts tremendous loads on networks and centralized systems as they are transported. Localized devices can be programmed to filter the data, passing along only what is relevant to the network (a minimal sketch of this pattern follows this list). In addition, many devices also have their own data protocols, which may be difficult to manage from centralized locations. “Data normalization or homogenization at the edge is often coupled with filtering and other data compression functions, thereby also helping to make more efficient use of available bandwidth.”

6) Simpler, cheaper devices: It’s often expensive to use up computing cycles in centralized locations. “Gateways and edge servers enable simpler, cheaper devices, as data storage and processing are transferred to the gateway or edge server instead of being embedded in the sensor or device,” Carroll and Chandramouli point out.
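To make the filtering, normalization, and compression ideas in points 2 and 5 concrete, here is a minimal sketch of what an edge gateway might do before forwarding sensor data upstream. It is illustrative only: the field names, thresholds, and batch size are assumptions, not details from the Deloitte report.

```python
import gzip
import json
import random
import time

# Hypothetical thresholds and schema -- assumptions for illustration only.
VALID_RANGE_C = (-40.0, 125.0)   # discard readings outside the sensor's rated range
REPORT_DELTA_C = 0.5             # forward a reading only if it changed meaningfully
BATCH_SIZE = 50                  # readings to accumulate before an uplink

def normalize(raw: dict) -> dict:
    """Map a raw device record onto a common schema (here: Fahrenheit -> Celsius)."""
    temp_c = raw["temp"] if raw["unit"] == "C" else (raw["temp"] - 32.0) * 5.0 / 9.0
    return {"device_id": raw["id"], "ts": raw["ts"], "temp_c": round(temp_c, 2)}

def edge_filter(readings):
    """Drop out-of-range values and near-duplicate readings per device."""
    last_seen = {}
    for r in readings:
        lo, hi = VALID_RANGE_C
        if not (lo <= r["temp_c"] <= hi):
            continue  # out of range: likely a sensor fault, do not forward
        prev = last_seen.get(r["device_id"])
        if prev is not None and abs(r["temp_c"] - prev) < REPORT_DELTA_C:
            continue  # value barely changed: not worth the bandwidth
        last_seen[r["device_id"]] = r["temp_c"]
        yield r

def compress_batch(batch) -> bytes:
    """Serialize and gzip a batch before sending it upstream."""
    return gzip.compress(json.dumps(batch).encode("utf-8"))

if __name__ == "__main__":
    # Simulated raw samples standing in for real device reads.
    raw_samples = [
        {"id": f"sensor-{i % 3}", "ts": time.time() + i,
         "unit": random.choice(["C", "F"]),
         "temp": random.uniform(60, 90)}
        for i in range(BATCH_SIZE)
    ]
    batch = list(edge_filter(normalize(s) for s in raw_samples))
    payload = compress_batch(batch)
    print(f"{len(raw_samples)} raw readings -> {len(batch)} forwarded, "
          f"{len(payload)} bytes after compression")
    # In a real gateway, `payload` would be sent to a cloud ingestion endpoint.
```

In this sketch, only readings that change meaningfully leave the gateway, and what does leave is already normalized and compressed, which is the bandwidth and normalization argument the report makes in miniature.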


There is no single right way to implement a mix of edge and centralized computing, and the Deloitte co-authors do not discount the costs and complexities of maintaining networks of edge devices. “Edge processing is highly distributed and often includes far-flung or difficult-to-access locations, including sensors/actuators and gateways in offices, plants, at campuses, on pipelines, and in various remote field sites,” they state. “All these edge nodes have firmware, operating systems, some form of virtualization and containers, and software installed, some of which are provided by manufacturers and some by solution providers.”

Still, the benefits seen in terms of latency, scalability, and increased access to information may well outweigh the complexities of maintaining edge systems. “An IoT solution should be only as simple as it needs to be, and no simpler,” Carroll and Chandramouli state. “Conversely, it should be only as complex as it needs to be and no more complex. These seemingly straightforward, yet essential points can make a difference in the success of a solution. Every situation is unique. What is clear, however, is that a balance between cloud and edge computing will likely make up tomorrow’s IoT architecture.”


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
