• Estimates the energy consumption of IoT applications.
• End-to-end energy cost model for data stream analysis in IoT.
• Comparison of edge and core cloud solutions for IoT.
Internet of Things (IoT) is bringing an increasing number of connected devices that have a direct impact on the growth of data and energy-hungry services. These services rely on Cloud infrastructures for storage and computing capabilities, transforming their architecture into a more distributed one based on edge facilities provided by Internet Service Providers (ISPs). Yet, between the IoT device, the communication network and the Cloud infrastructure, it is unclear which part consumes the most energy. In this paper, we provide end-to-end energy models for edge Cloud-based IoT platforms. These models are applied to a concrete scenario: the analysis of data streams produced by cameras embedded in vehicles. The validation combines measurements on real test-beds running the targeted application with simulations on well-known simulators to study the scaling-up with an increasing number of IoT devices. Our results show that, for our scenario, the edge Cloud part embedding the computing resources consumes 3 times more energy than the IoT part comprising the IoT devices and the wireless access point.
In 2011, Ericsson and Cisco announced that 50 billion devices would be connected to the Internet by 2020 [1,2]. Indeed, connected devices are progressively invading our everyday lives with ever-widening application fields: personal health equipment, intelligent buildings, smart grids, connected vehicles, etc. The count in 2016 was under 20 billion devices, including Internet-of-Things (IoT) devices, smartphones, tablets and computers. Current forecasts estimate approximately 30 billion devices by 2020.
All these objects, linked to telecommunication networks (most commonly the Internet), can interact with other connected devices or with distributed computing infrastructures, such as Clouds, to store information or perform computations, for instance. The growth in the number of connected objects and supporting infrastructures poses scientific challenges, notably in terms of scalability, the heterogeneity of the communication networks used (Ethernet, WiFi, 3G, etc.), the migration of computations between objects and supporting infrastructures, and energy consumption.
The development of IoT (Internet of Things) equipment, the popularization of mobile devices, and emerging wearable devices bring new opportunities for context-aware applications in Cloud computing environments. Since 2008, the U.S. National Intelligence Council has listed the IoT among the six technologies most likely to impact U.S. national power by 2025. The potentially disruptive impact of IoT stems from its pervasiveness: it should constitute an integrated heterogeneous system connecting an unprecedented number of physical objects to the Internet. A basic example of such objects is vehicles and their numerous sensors.
Among the many challenges raised by the IoT, one is currently receiving particular attention: making computing resources easily accessible from the connected objects to process the huge amounts of data streaming out of them. Cloud computing has historically been used to enable a wide range of applications. It naturally offers distributed sensory data collection, global resource and data sharing, remote and real-time data access, elastic resource provisioning and scaling, and pay-as-you-go pricing models. However, it requires extending the classical centralized Cloud computing architecture towards a more distributed architecture that includes computing and storage nodes installed close to users and physical systems. Such an edge Cloud architecture needs to deal with flexibility, scalability and data privacy issues to allow for efficient computation offloading services.
While computation offloading to the edge can be beneficial from a Quality of Service (QoS) point of view, from an energy perspective it relies on less energy-efficient resources than centralized Cloud data centers. On the other hand, with the increasing number of applications moving to the Cloud, it may become untenable to meet the increasing energy demand, which is already reaching worrying levels. Edge nodes could help to slightly alleviate this energy consumption, as they could relieve data centers of part of their overwhelming power load and reduce data movement and network traffic. In particular, as edge Cloud infrastructures are smaller than centralized data centers, they can make better use of renewable energy.
On the device side, as the IoT involves billions of connected devices communicating mainly through wireless networks, their power consumption is a major concern and a limitation to the widespread adoption of IoT. An IoT device does not consume much power by itself, typically from a few milliwatts to a few watts [13,14]. Yet, the increasing number of devices produces a scale effect and also has a non-negligible impact on the Cloud infrastructures that provide the computing power IoT devices require to offer services. To cope with the traffic increase caused by IoT devices, Cloud computing infrastructures are starting to explore newly proposed distributed architectures, in particular edge Cloud architectures where small data centers are located at the edge of the Cloud, typically in Internet Service Providers' (ISP) edge infrastructures [16,17].
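The scale effect mentioned above can be illustrated with a back-of-the-envelope calculation. The per-device power range (a few milliwatts to a few watts) comes from the cited measurements, and the 30-billion fleet size is the 2020 forecast quoted earlier; the sketch below is purely illustrative and is not part of the paper's models.

```python
# Back-of-the-envelope scale effect: even modest per-device power,
# multiplied by a forecast fleet of 30 billion devices, yields a
# nationally significant aggregate power draw. All figures illustrative.
def aggregate_gw(per_device_w: float, devices: float = 30e9) -> float:
    """Aggregate power draw of the whole device fleet, in gigawatts."""
    return devices * per_device_w / 1e9

for per_device_w in (0.005, 0.5, 5.0):  # 5 mW, 500 mW, 5 W per device
    print(f"{per_device_w} W/device -> {aggregate_gw(per_device_w):.2f} GW")
```

At 0.5 W per device this already amounts to 15 GW for the fleet, roughly the output of a dozen large power plants, which is why the scale effect dominates the per-device figure.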
While the current state of the art offers numerous studies on energy models for IoT devices [18,19] and Cloud infrastructures [20,21], to the best of our knowledge, none of them provides the overall picture. It is thus hard to estimate, for instance, the energy consumption induced on Cloud infrastructures by the increase in IoT devices. The issue lies in obtaining an end-to-end energy estimation covering all the involved devices and infrastructures, including ISP network devices and Cloud servers. Such results could also serve to identify which part consumes the most, and where energy-efficiency efforts should therefore be focused.
In this paper, we investigate the end-to-end energy consumption of IoT platforms. Our aim is to evaluate, on a concrete use case, the benefits of edge computing platforms for IoT in terms of energy consumption. We propose end-to-end energy models for estimating the consumption when offloading computation from the objects to the edge or to the core Cloud, depending on the number of devices and the desired application QoS, in particular trading off performance (response time) against reliability (service accuracy).
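To make the notion of an end-to-end model concrete, one natural structure is an additive decomposition over the three parts named above: the IoT devices, the network, and the edge or core Cloud servers. The sketch below assumes such a decomposition with illustrative parameter names and values; it is not the paper's calibrated model, which is developed in the following sections.

```python
from dataclasses import dataclass

@dataclass
class EnergyModel:
    """Toy additive end-to-end decomposition (illustrative assumption):
    E_total = E_iot + E_network + E_cloud."""
    p_device_w: float      # average IoT device power draw (W)
    e_net_j_per_mb: float  # network energy cost per transferred MB (J/MB)
    p_server_w: float      # edge/core server power while processing (W)

    def total_joules(self, duration_s: float, data_mb: float, cpu_s: float) -> float:
        e_iot = self.p_device_w * duration_s      # device runs for the whole period
        e_net = self.e_net_j_per_mb * data_mb     # cost of moving the data stream
        e_cloud = self.p_server_w * cpu_s         # cost of processing offloaded work
        return e_iot + e_net + e_cloud

# Hypothetical edge deployment: 2 W camera device, 1 J/MB uplink, 80 W server.
edge = EnergyModel(p_device_w=2.0, e_net_j_per_mb=1.0, p_server_w=80.0)
print(edge.total_joules(duration_s=60, data_mb=10, cpu_s=30))  # 2530.0
```

Comparing an edge and a core Cloud instance of such a model, with different network costs and server efficiencies, is precisely the kind of trade-off the end-to-end approach makes visible.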