How can Edge Analytics deliver value for your company?

Submitted by Yugo Sakamoto on Tue, 03/26/2019 - 18:33

The new challenges of an increasingly connected industry

The Internet of Things (IoT) will continue to grow exponentially over the next few years, rising from 27 billion devices installed globally in 2017 to 73 billion in 2025, according to IHS Markit.

In a traditional cloud architecture – centralized, with processing on remote servers – the challenges of managing and processing the large volume of data generated by this explosion of devices are compounded by latency and high bandwidth consumption.

To address situations where latency and bandwidth consumption are critical to the business, two new terms have gained prominence in the IoT universe: Edge Computing and Fog Computing.

Despite the similar names, each concept plays a distinct role and it is important to clarify their differences.

Edge Computing

Edge Computing proposes bringing processing and storage closer to the sensors or devices that read and capture the data. This shift allows data to be analyzed and handled locally and in a distributed way, reducing or even eliminating the need to send it to a remote cloud or centralized system and cutting down heavy traffic on the network.
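
As a rough illustration of this idea, the sketch below simulates an edge device that aggregates a window of readings locally and sends only a summary upstream; read_sensor and send_to_cloud are hypothetical stand-ins for a real sensor driver and a real uplink.

```python
# Minimal sketch of edge-side pre-processing: instead of streaming every raw
# reading to the cloud, the device aggregates locally and ships only a summary.
# Sensor reads and the cloud endpoint are simulated; names are illustrative.
import random
import statistics
import time

def read_sensor():
    """Stand-in for a real sensor driver (e.g., a temperature probe)."""
    return 20.0 + random.gauss(0, 0.5)

def send_to_cloud(summary):
    """Stand-in for an MQTT/HTTP publish to a remote platform."""
    print("uplink:", summary)

def edge_loop(window_size=60, cycles=3):
    for _ in range(cycles):
        window = [read_sensor() for _ in range(window_size)]
        # Only the aggregated result crosses the network, not the raw samples.
        send_to_cloud({
            "count": len(window),
            "mean": round(statistics.mean(window), 2),
            "max": round(max(window), 2),
            "min": round(min(window), 2),
        })
        time.sleep(0.1)  # in practice, the sampling window defines this cadence

if __name__ == "__main__":
    edge_loop()
```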

According to a study conducted by Gartner, Edge Computing is among the 10 biggest strategic technology trends for 2019. It is primarily driven by the need to place processing close to where the data are located – far from a public cloud or a data center – in scenarios where connectivity is limited or where an immediate response, with low tolerance for delays, is required.

In the same study, Gartner emphasizes that, contrary to what many may think, Edge Computing does not compete with Cloud Computing; it complements it, adding aspects of distributed processing to the cloud.

Figure: Cloud Computing and Edge Computing are complementary concepts.

Source: Gartner (2018).

Fog Computing

Fog Computing is a principle that defines how Edge Computing should operate, facilitating processing, storage, and communication between edge devices and the cloud platform. The term was coined by Cisco and is also understood as an extension of cloud computing functions into a layer closer to the edges of the network.
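
A minimal sketch of that layering, with simulated transports and purely illustrative class names (EdgeDevice, FogNode): the fog node sits close to a handful of edge devices, consolidates their readings, and forwards a single payload to the cloud platform.

```python
# Illustrative sketch of the edge -> fog -> cloud layering described above.
# A fog node close to several edge devices consolidates their messages and
# forwards a single payload upstream. All transports are simulated.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EdgeDevice:
    device_id: str

    def read(self):
        # stand-in for a local measurement
        return {"device": self.device_id, "value": 42.0}

@dataclass
class FogNode:
    devices: List[EdgeDevice] = field(default_factory=list)

    def collect_and_forward(self):
        readings = [d.read() for d in self.devices]
        payload = {"node": "fog-01", "readings": readings}
        self.forward_to_cloud(payload)
        return payload

    def forward_to_cloud(self, payload):
        # stand-in for a publish to the cloud platform
        print("cloud uplink:", payload)

if __name__ == "__main__":
    fog = FogNode(devices=[EdgeDevice("sensor-a"), EdgeDevice("sensor-b")])
    fog.collect_and_forward()
```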

With the aim of guiding this ecosystem fostered by devices from a variety of manufacturers, in 2015 Cisco Systems, Intel, Microsoft, Princeton University, Dell, and ARM Holdings came together to create the OpenFog Consortium, a group dedicated to defining architectural standards to deliver fully interoperable systems and components.

Figure: Fog Computing as a layer between the cloud and the edge.

Source: OpenFog (2019).

Enabling Analytics at the Edge

The value in an IoT system is not in connecting a single sensor event, or millions of stored sensor events. The significant value of IoT is in the interpretation and decisions made by that data.

(Perry Lea, Internet of Things for Architects)

Edge Analytics is the concept that adds analytical power to Edge and Fog Computing, and proposes bringing prediction and optimization directly to edge devices, allowing analysis to be performed locally and in real time.

Moving analytical capabilities to the edge can bring a series of advantages for a specific set of problems:

  • Reduced latency in data consumption and in the delivery of results, since the model runs in the same location where the data reside.
  • Improved scalability of the analyses through decentralization, allowing processing units to be replicated and multiplied.
  • Lower bandwidth consumption, since only the results of the analytical models travel across the network instead of raw or cleaned data (illustrated in the sketch after this list).
  • Access to restricted data for analysis – data that previously could not be sent to a processing center in the cloud because of the sensitivity of the information or legal and security constraints.
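
A minimal sketch of the bandwidth and latency points above, assuming a toy z-score model and hypothetical score_window and publish_result helpers: the anomaly score is computed where the data live, and only the result leaves the device instead of the raw window.

```python
# Hedged sketch of edge analytics: a simple anomaly score is computed locally,
# and only the result (not the raw window of readings) is sent upstream.
import random
import statistics

def score_window(window):
    """Z-score of the latest reading against the local window (a toy model)."""
    mean = statistics.mean(window)
    stdev = statistics.pstdev(window) or 1.0
    return abs(window[-1] - mean) / stdev

def publish_result(result):
    """Stand-in for sending the analytical result upstream."""
    print("result:", result)

if __name__ == "__main__":
    readings = [random.gauss(100, 2) for _ in range(200)]
    readings.append(140.0)  # injected anomaly
    score = score_window(readings)
    # Only the small result payload travels, instead of the full 201-sample window.
    publish_result({"anomaly_score": round(score, 2), "alert": score > 3.0})
```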

Not all types of analysis can or should be moved to the edge. It is necessary to evaluate the scope and volume of the data to be consumed and the level of processing required by each analysis.

According to Accenture, few edge devices are able to provide an environment that supports training, execution, and retraining of these models. It is therefore more appropriate to apply Edge Analytics to devices in the Fog layer, which have access to more data coming from the connected edge devices and will likely have more storage and processing capacity for such tasks.
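
To make this division of labour concrete, the sketch below assumes a hypothetical fog-layer gateway that periodically refits a simple mean and standard-deviation baseline from the data it aggregates from several edge devices, then scores new readings against that baseline; the heavier retraining step stays in the fog layer, while the scoring itself remains lightweight.

```python
# Sketch of a fog-layer gateway: with more storage and CPU than the edge devices
# it serves, it periodically "retrains" a baseline from aggregated data and then
# scores new readings against it. Everything here is simulated.
import random
import statistics

class FogBaselineModel:
    """A toy model: a mean/stdev baseline refit from recent aggregated data."""

    def __init__(self):
        self.mean = 0.0
        self.stdev = 1.0

    def retrain(self, history):
        self.mean = statistics.mean(history)
        self.stdev = statistics.pstdev(history) or 1.0

    def score(self, value):
        return abs(value - self.mean) / self.stdev

if __name__ == "__main__":
    model = FogBaselineModel()
    # Data accumulated from several connected edge devices (simulated).
    history = [random.gauss(50, 5) for _ in range(1000)]
    model.retrain(history)                         # heavier step, feasible at the fog layer
    print("score:", round(model.score(75.0), 2))   # lightweight inference
```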

We should also consider the potential loss of data when only the results of the edge analysis are sent to the processing centers: potential insights may be lost along the way. In such cases, we should assess the tolerance for data loss and whether it is really necessary to keep all the raw data coming from the source under study.

Conclusion

Edge Analytics is still a vast and fertile field to be explored, and, as we have seen, it is not applicable to every situation. The problem under study must allow us to build analytical models with a reduced scope of data and/or with distributed processes that aggregate results in higher layers of the system. The concept brings benefits to problems related to connectivity, bandwidth resources, and response time.

Energy and Utilities Management Systems play a crucial role within the architecture of IoT because they act as a platform for the gateways that receive and process data generated by devices at the edge, and they are excellent candidates for running analytical models and exploiting the advantages of Edge Analytics.

Good energy and utilities management systems should be prepared to deliver extensibility and allow the addition of analytical models at various levels in the hierarchy of equipment, as well as the interoperability necessary to integrate the wide range of devices and platforms that make up the company's ecosystem.
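
Purely as an illustration (this is not Viridis's actual API), the snippet below sketches one way such extensibility could look: a registry where analytical models are attached at different levels of the equipment hierarchy and executed by the management system.

```python
# Illustrative sketch of a plugin-style registry: analytical models can be
# attached at different levels of the equipment hierarchy (e.g., plant, line,
# gateway, device) and run by the management system. Names are hypothetical.
from typing import Callable, Dict, List

AnalyticalModel = Callable[[dict], dict]

class ModelRegistry:
    def __init__(self):
        self._models: Dict[str, List[AnalyticalModel]] = {}

    def register(self, level, model):
        self._models.setdefault(level, []).append(model)

    def run(self, level, data):
        return [model(data) for model in self._models.get(level, [])]

def energy_balance(data):
    # hypothetical model attached at the gateway level
    return {"balance": data["input_kwh"] - data["output_kwh"]}

if __name__ == "__main__":
    registry = ModelRegistry()
    registry.register("gateway", energy_balance)
    print(registry.run("gateway", {"input_kwh": 120.0, "output_kwh": 112.5}))
```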


Software developer, Viridis

Software developer at Viridis holding a degree in computer science from the Federal University of Viçosa. Wide-ranging experience in designing corporate solutions and building high-performance software.
