
The Merits of Cloud Functions in an Edge Computing Context

For years, IT infrastructures were concentrated in data centers or outsourced to the cloud for greater efficiency. According to an IDC study on Industrial IoT in Germany[1], however, more and more IT resources are now being placed at the network edge. Industrial IoT use cases, for example, are characterized by an increased use of edge or fog environments, which are added to the classic centralized cloud or core infrastructure to realize the everything-as-a-service approach.


Applications typically consist of a multitude of components: some need locality, while others need significant computational resources to fulfil their purpose. Even though the efficiency argument for Function-as-a-Service operation is sound, latency, computing costs or even data privacy are often showstoppers when using cloud services or data centers for IIoT use cases. Combining edge computing with cloud computing can solve some of the problems of purely centralized computing by adding decentralized processing power, moving towards a true edge-cloud computing continuum.

A balanced cloud-core-edge setup adds flexibility and increases reliability for IIoT use cases. Edge computing is therefore complementary to most IIoT environments and does not replace central IT infrastructures. The focus is on the right composition of the IIoT infrastructure, depending on the intended application scenarios and their respective requirements, which can differ fundamentally. We can examine the edge-cloud domain from the following three main requirements for Function-as-a-Service:

[1. Cloud Costs and Volume of Data]

IoT use cases often produce a high volume of data in small chunks. It is therefore often worthwhile to process as much data as possible directly at the edge. This not only prevents networks from being overloaded, it also reduces the cost of transferring data to the cloud. In addition, collecting historical data at the edge and uploading it to the cloud in larger batches reduces cloud costs while still benefiting from the availability and durability of cloud storage, as sketched below.
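To make the batching idea concrete, here is a minimal Python sketch of an edge-side buffer that collects readings and ships them to the cloud in larger batches. The `upload_to_cloud` function, the batch size and the maximum batch age are placeholder assumptions for illustration, not part of any specific platform.

```python
import json
import time
from collections import deque

# Minimal sketch of edge-side batching: readings are buffered locally and
# only shipped to the cloud once the batch is large (or old) enough.
# `upload_to_cloud` is a placeholder for whatever ingest API is actually used.

BATCH_SIZE = 500          # assumed readings per upload
MAX_BATCH_AGE_S = 60.0    # never hold data back longer than this

_buffer: deque = deque()
_batch_started_at = time.monotonic()


def upload_to_cloud(payload: bytes) -> None:
    """Placeholder for an HTTP/object-store/MQTT upload to the central cloud."""
    print(f"uploading {len(payload)} bytes to cloud storage")


def on_sensor_reading(reading: dict) -> None:
    """Called for every incoming reading; flushes when the batch is full or stale."""
    global _batch_started_at
    _buffer.append(reading)
    batch_age = time.monotonic() - _batch_started_at
    if len(_buffer) >= BATCH_SIZE or batch_age >= MAX_BATCH_AGE_S:
        payload = json.dumps(list(_buffer)).encode("utf-8")
        upload_to_cloud(payload)          # one large transfer instead of many small ones
        _buffer.clear()
        _batch_started_at = time.monotonic()
```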

[2. Real-Time Response]

From the application user's point of view, the place of deployment and the time of execution cannot be separated: the user is only interested in receiving the final, optimized result. Using a centralized service in the cloud, far away from the user, may introduce latency or other factors that make this choice slower than a localized deployment on the edge. If low latency is crucial, the data should therefore be processed directly at the edge, which keeps the transport distance short. Depending on how quickly, or how close to real time, processing has to take place, the analysis of this trade-off calls for a combined space-time approach to handle the placement decision, as in the sketch below.
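As a rough illustration of such a placement decision, the following Python sketch measures the round-trip time to an assumed cloud endpoint and falls back to local execution on the edge when the latency budget cannot be met. The endpoint URL, the budget and the two handlers are hypothetical.

```python
import time
import urllib.request

# Minimal sketch of a latency-driven placement decision: if the measured
# round trip to the cloud endpoint exceeds what the use case tolerates,
# the function is invoked locally on the edge node instead.

CLOUD_ENDPOINT = "https://cloud.example.com/invoke"   # assumed endpoint
LATENCY_BUDGET_MS = 50.0                              # assumed requirement


def measure_cloud_rtt_ms(url: str = CLOUD_ENDPOINT) -> float:
    """Rough round-trip estimate via a single HTTP request."""
    start = time.monotonic()
    try:
        urllib.request.urlopen(url, timeout=1.0).read()
    except Exception:
        return float("inf")               # unreachable counts as "too slow"
    return (time.monotonic() - start) * 1000.0


def run_locally(event: dict) -> dict:
    return {"handled_at": "edge", "event": event}


def run_in_cloud(event: dict) -> dict:
    return {"handled_at": "cloud", "event": event}


def dispatch(event: dict) -> dict:
    """Place the invocation on the edge when the cloud cannot meet the budget."""
    if measure_cloud_rtt_ms() > LATENCY_BUDGET_MS:
        return run_locally(event)
    return run_in_cloud(event)
```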

[3. Computing Power]

IIoT use cases, in particular those involving AI methods, require high computing power and storage capacity. It is often worthwhile to run the compute-intensive tasks, such as training inference models, in cloud or core environments and to run only the inference task on the edge, as in the sketch below.
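The split can be illustrated with a deliberately tiny Python sketch: a least-squares model stands in for the compute-heavy training job that would run in the cloud or core, while the edge only loads the serialized model and performs inference. Both functions and the toy data are illustrative assumptions, not the project's actual pipeline.

```python
import json
import numpy as np

# Minimal sketch of splitting the workload: the compute-heavy training step
# runs in the cloud/core, and only the cheap inference step runs on the edge.


def train_in_cloud(features: np.ndarray, targets: np.ndarray) -> str:
    """Runs on cloud/core hardware; returns a small, serializable model."""
    coeffs, *_ = np.linalg.lstsq(features, targets, rcond=None)
    return json.dumps({"coeffs": coeffs.tolist()})


def infer_on_edge(model_json: str, sample: np.ndarray) -> float:
    """Runs on the edge node; only a dot product, no training required."""
    coeffs = np.array(json.loads(model_json)["coeffs"])
    return float(sample @ coeffs)


if __name__ == "__main__":
    X = np.random.rand(1000, 4)
    y = X @ np.array([0.5, -1.0, 2.0, 0.1])
    model = train_in_cloud(X, y)          # heavy step: cloud/core
    print(infer_on_edge(model, X[0]))     # light step: edge
```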

Methods like complex event processing, pre-filtering and batching of data, and serverless or event-driven functions deployed in a continuum environment spanning edge, cloud and core will help to overcome these obstacles and realize value-adding IoT use cases. A minimal pre-filtering example follows below.
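As a final illustration, the following Python sketch shows a simple event-driven pre-filter at the edge: raw readings are inspected locally, and only an aggregated event is forwarded when a pattern of consecutive threshold violations is detected. The threshold, window size and `forward_event` target are assumptions for illustration.

```python
from collections import deque

# Minimal sketch of edge-side pre-filtering in an event-driven style:
# instead of every raw reading, only one aggregated event is forwarded
# when several consecutive readings exceed a threshold.

THRESHOLD_C = 80.0   # assumed temperature threshold
WINDOW = 5           # assumed number of consecutive violations

_recent: deque = deque(maxlen=WINDOW)


def forward_event(event: dict) -> None:
    """Placeholder for invoking a downstream cloud/core function."""
    print("forwarding", event)


def on_event(reading: float) -> None:
    """Edge handler invoked per reading; emits one event per detected pattern."""
    _recent.append(reading)
    if len(_recent) == WINDOW and all(r > THRESHOLD_C for r in _recent):
        forward_event({"type": "overheat", "window": list(_recent)})
        _recent.clear()
```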

The PHYSICS project will further elaborate on these methods over the coming years to realize a true continuum computing space.

[1] https://blog.de.fujitsu.com/connected-services/industrial-iot-in-deutschland-edge-computing-ist-das-fehlende-puzzleteil-fuer-viele-iot-use-cases/



