Abstract
Introduction
Background
Research methodology
Classification of the fog offloading approaches
Open issues and future trends
Conclusion and limitation
References
Abstract
Fog computing is considered a formidable next-generation complement to cloud computing. Nowadays, in light of the dramatic rise in the number of IoT devices, several problems have arisen in cloud architectures. By introducing fog computing as an intermediate layer between the user devices and the cloud, one can extend cloud computing's processing and storage capability. Offloading can be utilized as a mechanism that transfers computation, data, and energy consumption from resource-limited user devices to resource-rich fog/cloud layers in order to achieve an optimal quality of experience for applications and improve system performance. This paper provides a systematic and comprehensive study that evaluates current and recent work on fog offloading mechanisms. The pros and cons of each selected paper are explored and analyzed to identify and address the potentialities and open issues of offloading mechanisms in a fog environment. We classify offloading mechanisms in a fog system into four groups: computation-based, energy-based, storage-based, and hybrid approaches. Furthermore, this paper explores the offloading metrics, applied algorithms, and evaluation methods related to the chosen offloading mechanisms in fog systems. Finally, the open challenges and future trends derived from the reviewed studies are discussed.
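For orientation, the offloading decision summarized above is commonly formalized in the literature as a simple latency and energy trade-off. The sketch below is illustrative only; its symbols (C for a task's required CPU cycles, D for its input data size in bits, B for the uplink bandwidth, f_loc and f_fog for the local and fog processing speeds, p_tx and p_cpu for the radio and CPU power draws) are assumptions made here, not notation defined in the surveyed paper.

% Textbook-style offloading criterion (illustrative; not the surveyed paper's model).
% Offloading a task of C cycles and D input bits to the fog is beneficial for latency when
\[
\frac{D}{B} + \frac{C}{f_{\mathrm{fog}}} \;<\; \frac{C}{f_{\mathrm{loc}}},
\]
% and beneficial for device energy when the radio energy spent on the transfer is below
% the CPU energy of executing the task locally:
\[
p_{\mathrm{tx}}\,\frac{D}{B} \;<\; p_{\mathrm{cpu}}\,\frac{C}{f_{\mathrm{loc}}}.
\]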
Introduction
Fog computing, also known as fog or fog networking, has been introduced as an emergent and novel paradigm for extending cloud services. Although cloud computing is presented as a model that provides on-demand and ubiquitous access to a shared pool of computing and storage resources, cloud resources are far from users, and as a result, the cloud alone cannot support low-latency services. The fog can extend these computing and storage resources by incorporating a transitional layer between the IoT devices and the cloud, leading to a three-layer hierarchy: the user devices layer, the fog layer, and the cloud layer. The middle fog layer consists of a set of base stations, routers, and gateways that are geographically distributed and placed as near as possible to the IoT devices. It is widely known that fog brings the cloud closer to the ground (IoT devices) [72, 88].
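To make the three-layer hierarchy concrete, the following minimal Python sketch models a device, a fog, and a cloud tier and picks the execution tier with the lowest estimated completion time. All class names, functions, and parameter values here are hypothetical illustrations chosen for this sketch, not an implementation taken from the surveyed works.

# Minimal sketch of the three-layer hierarchy (device -> fog -> cloud).
# All names and numbers are illustrative assumptions, not the surveyed paper's model.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    cpu_cycles_per_sec: float  # processing speed of the tier
    rtt_sec: float             # round-trip time from the device to this tier
    uplink_bps: float          # uplink bandwidth from the device (inf for local execution)

def completion_time(tier: Tier, task_cycles: float, input_bits: float) -> float:
    """Estimated latency: transfer the input data, then execute on the chosen tier."""
    transfer = tier.rtt_sec + input_bits / tier.uplink_bps
    return transfer + task_cycles / tier.cpu_cycles_per_sec

def choose_tier(tiers, task_cycles, input_bits):
    """Pick the tier with the lowest estimated completion time (a latency-only policy)."""
    return min(tiers, key=lambda t: completion_time(t, task_cycles, input_bits))

if __name__ == "__main__":
    device = Tier("device", 1e9,  0.0,   float("inf"))   # no network hop for local execution
    fog    = Tier("fog",    10e9, 0.005, 50e6)           # nearby fog node: low RTT
    cloud  = Tier("cloud",  50e9, 0.080, 20e6)           # distant cloud: high RTT, fast CPUs
    best = choose_tier([device, fog, cloud], task_cycles=2e9, input_bits=8e6)
    print(f"Offload target: {best.name}")

With the example numbers above, the fog tier wins: it is slower than the cloud but close enough that the saved transfer time dominates, which is the rationale for placing an intermediate fog layer near the IoT devices.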
Conclusion and limitation
In conclusion, this paper presented a systematic study focused on current research on offloading mechanisms in fog computing, including its architecture, technologies, and applications. In this study, by applying our search query, 131 publications were selected in the initial selection. In the final selection, we chose 37 papers with reference to the research questions and classified them based on their contents. According to RQ2, the applied mechanisms in fog computing offloading were classified into four groups, with computation-based mechanisms accounting for the highest share of studies at 38%, followed by hybrid mechanisms at 35%, energy-based mechanisms at 16%, and storage-based mechanisms at 11% of all types of applied mechanisms. They were compared and analyzed according to their significance and crucial evaluation metrics. The key differences, advantages, disadvantages, and important factors of each of the selected works were addressed in the context of offloading in fog. Based on RQ3, the most important metrics in the various proposed approaches were energy and response time, at 24%, and cost, at 20%. According to RQ4, simulation (67% of the papers) was the dominant evaluation method in most categories, followed by design (22% of the papers) and real testbeds (8% of the papers). In addition, with respect to RQ5, the most common algorithms were non-heuristic ones, at 74%, versus heuristic ones at 26%. Furthermore, based on RQ6, the existing fog offloading mechanisms face several open issues and future trends, such as trustworthiness and security, multi-objective mechanisms, big data analytics, new-generation mobile networks, network management, and carbon-aware offloading for geo-distributed infrastructures. Clearly, the most important challenges are scalability and real-testbed implementation.