Storing and Processing Image Files

Persian article title: An Optimal Solution for Storing and Processing Small Image Files on Hadoop
English article title: An Optimal Solution of Storing and Processing Small Image Files on Hadoop
Journal/Conference: Procedia Computer Science
Related disciplines: Computer Engineering
Related specializations: Computer Systems Architecture
Persian keywords: Hadoop Distributed File System (HDFS), MapReduce, small image files, self-defined storage and I/O format
English keywords: Hadoop Distributed File System (HDFS), MapReduce, small image files, self-defined storage and I/O format
Writing type: Research Article
Digital Object Identifier (DOI): https://doi.org/10.1016/j.procs.2019.06.092
University: School of Computer Science & Engineering, South China University of Technology, Guangzhou, Guangdong, China
English article pages: 7
Publisher: Elsevier
Presentation type: Journal
Article type: ISI
Publication year: 2019
Impact factor: 1.257 (2018)
H-index: 47 (2019)
SJR: 0.281 (2018)
ISSN: 1877-0509
English article format: PDF
Translation status: Not translated
English article price: Free
Is this a base article: No
Does this article have a conceptual model: No
Does this article have a questionnaire: No
Does this article have variables: No
Product code: E12353
References: Includes in-text citations and a reference list at the end of the article
Table of Contents (English)

Abstract

1-Introduction

2-Background

3-Proposed Solution

4-Conclusion

Acknowledgement

References

Excerpt from the Article (English)

Abstract

The rapid development of the Internet, especially the mobile Internet, makes it much easier for people to make social contacts online. Nowadays people tend to spend more and more time on social network services, producing a lot of image files. This poses a challenge to the traditional standalone framework in handling the continually increasing number of image files, so it is advisable to find a new way to face this challenge. Hadoop is a notable, widely used project for distributed storage and computation with high efficiency, data integrity, reliability and fault tolerance. The Hadoop Distributed File System and MapReduce are its two primary subprojects, for big data storage and computation respectively. However, Hadoop does not provide any interface for image processing. Worse, both the Hadoop Distributed File System and MapReduce have trouble processing large numbers of small files, which decreases the efficiency of file access and distributed computation. This prevents us from performing image processing on Hadoop. This paper proposes a method to optimize the storage of small image files on Hadoop and self-defines an input/output format that enables Hadoop to process image files.
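
To make the abstract's "self-defined input/output format" concrete, the sketch below shows a common Hadoop MapReduce pattern for reading each small image file as a single record. It is a minimal illustration in Java against the standard org.apache.hadoop.mapreduce API, not the authors' implementation; the class names WholeImageInputFormat and WholeImageRecordReader are assumed here for illustration only.

// Illustrative sketch: a self-defined input format that emits one
// (file name, raw image bytes) record per small image file.
// Assumed class names; not the code from the paper.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class WholeImageInputFormat extends FileInputFormat<Text, BytesWritable> {

    @Override
    protected boolean isSplitable(JobContext context, Path file) {
        // An image is only meaningful as a whole, so never split it across mappers.
        return false;
    }

    @Override
    public RecordReader<Text, BytesWritable> createRecordReader(
            InputSplit split, TaskAttemptContext context) {
        return new WholeImageRecordReader();
    }

    /** Reads the entire file backing a split into a single BytesWritable value. */
    public static class WholeImageRecordReader extends RecordReader<Text, BytesWritable> {
        private FileSplit fileSplit;
        private Configuration conf;
        private final Text key = new Text();
        private final BytesWritable value = new BytesWritable();
        private boolean processed = false;

        @Override
        public void initialize(InputSplit split, TaskAttemptContext context) {
            this.fileSplit = (FileSplit) split;
            this.conf = context.getConfiguration();
        }

        @Override
        public boolean nextKeyValue() throws IOException {
            if (processed) {
                return false;          // each split yields exactly one record
            }
            Path path = fileSplit.getPath();
            byte[] contents = new byte[(int) fileSplit.getLength()];
            FileSystem fs = path.getFileSystem(conf);
            FSDataInputStream in = null;
            try {
                in = fs.open(path);
                IOUtils.readFully(in, contents, 0, contents.length);
            } finally {
                IOUtils.closeStream(in);
            }
            key.set(path.getName());                  // key: image file name
            value.set(contents, 0, contents.length);  // value: raw image bytes
            processed = true;
            return true;
        }

        @Override
        public Text getCurrentKey() { return key; }

        @Override
        public BytesWritable getCurrentValue() { return value; }

        @Override
        public float getProgress() { return processed ? 1.0f : 0.0f; }

        @Override
        public void close() { /* nothing held open between records */ }
    }
}

A job driver would typically register this format with job.setInputFormatClass(WholeImageInputFormat.class), so that each map() call receives one file name as the key and the complete image bytes as the value, ready to be decoded or transformed.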

Introduction

Meanwhile, the rapid development of mobile technology and the wide spread of the Internet have increased the time people spend on social networking services every day. They chat and share their lives with friends through pictures or videos whenever they access the net. Take Weibo as an example. According to The Report of Weibo Users Development 2016 [2], the number of monthly active users on Weibo had reached 297 million, and daily active users 132 million, by September 2016. The report also revealed that users active on Weibo mainly post in the form of images with some text description, which accounts for 60% of posts. We can thus roughly estimate that, even considering Weibo alone, about 70 million images are created, stored and uploaded every day. Clearly, there are many other network services similar to Weibo that are popular and produce large amounts of data, especially image data. It is therefore hard for a traditional standalone framework to store and handle such a large number of image files, and a solution to this problem is needed.