Immersive human–computer interactive virtual environment

Persian article title: Immersive human–computer interactive virtual environment using a large-scale display system
English article title: Immersive human–computer interactive virtual environment using large-scale display system
Journal/Conference: Future Generation Computer Systems
Related fields of study: Computer Engineering, Information Technology Engineering
Related specializations: Software Engineering, Computer Networks
Persian keywords: Human–computer interaction, virtual reality, gesture recognition, motion estimation
English keywords: Human computer interaction; Virtual reality; Gesture recognition; Motion estimation
Article type: Research Article
Digital Object Identifier (DOI): http://dx.doi.org/10.1016/j.future.2017.07.058
University: College of Information Engineering, China Jiliang University, 258 Xueyuan Street, Hangzhou, China, 310018.
English article page count: 26
Publisher: Elsevier
Presentation type: Journal
Journal indexing: ISI
Publication year: 2019
Impact factor: 7.007 (2018)
H-index: 93 (2019)
SJR: 0.835 (2018)
ISSN: 0167-739X
Quartile: Q1 (2018)
English article format: PDF
Translation status: Not translated
English article price: Free
Is this a base article: No
Product code: E12088
Table of contents (English)

Abstract

1. Introduction

2. Related works

3. A novel immersive human–computer interactive VR system using large-scale screen

4. Result and comparison

5. Conclusion and future work

Acknowledgments

References

Excerpt from the article (English)

Abstract

A large-scale display system with immersive human–computer interaction (HCI) is an important solution for virtual reality (VR) systems. In contrast to traditional human–computer interactive VR systems, which require the user to wear heavy VR headsets for visualization and data gloves for HCI, the proposed method utilizes a large-scale display screen (with or without 3D glasses) to visualize the virtual environment and a bare-handed gesture recognition solution to receive user instructions. The entire framework is referred to as an immersive HCI system. Through a virtual 3D interactive rectangular parallelepiped, we establish the correspondence between the virtual scene and the control information. A bare-handed gesture recognition method is presented based on an extended genetic algorithm, and an arm motion estimation method is designed based on fuzzy predictive control theory. Experimental results show that the proposed method has lower error rates than most existing solutions, with acceptable recognition frequency rates.
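The abstract's correspondence between the user's workspace and the virtual scene is built around a virtual 3D interactive rectangular parallelepiped. As a rough, hypothetical illustration of that idea (not taken from the paper; the box bounds, function names, and clamping behavior are assumptions), the Python sketch below maps a tracked hand position inside an axis-aligned interaction box to normalized control coordinates:

```python
# Hypothetical illustration (not from the paper): mapping a tracked hand
# position inside a 3D interaction box (rectangular parallelepiped) to
# normalized control coordinates for the virtual scene.
from dataclasses import dataclass


@dataclass
class Box:
    """Axis-aligned rectangular parallelepiped in camera/world coordinates."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float


def to_control_coords(box: Box, x: float, y: float, z: float) -> tuple:
    """Map a hand position inside the box to [0, 1]^3 control coordinates.

    Values are clamped so that positions slightly outside the box still
    produce valid control information for the virtual scene.
    """
    def norm(v: float, lo: float, hi: float) -> float:
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)

    return (norm(x, box.x_min, box.x_max),
            norm(y, box.y_min, box.y_max),
            norm(z, box.z_min, box.z_max))


# Example: an assumed 1 m x 1 m x 0.8 m interaction volume in front of the screen.
if __name__ == "__main__":
    box = Box(-0.5, 0.5, 0.0, 1.0, 0.4, 1.2)
    print(to_control_coords(box, 0.1, 0.7, 0.8))  # -> (0.6, 0.7, 0.5)
```

Normalizing to [0, 1]^3 keeps the control information independent of how the interaction volume happens to be positioned in front of the screen.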

Introduction

Large-scale display technology based on multi-projector systems presents an important solution for human–computer interaction (HCI) [1, 2]. In addition, remote control using bare-hand gestures, body poses, and limb motions is preferable for the virtual reality (VR) experience compared with wearing heavy VR devices such as VR headsets and data gloves [3]. In this study, users experience the virtual environment by roaming in front of a large-scale screen driven by a multi-projector system. Instructions and orders are received through a bare-handed gesture recognition solution [4]. The experimental setup of this study is illustrated in Fig. 1; it consists of a large-scale display, fifteen projectors, fifteen client PCs, two cameras, and one server. The user's hand gestures and arm motions are captured by the two cameras. The hand gestures are translated into operating commands such as grabbing, releasing, and rotating, while the arm motions generate navigation commands such as move left, move right, and move forward. By combining the operating commands with the navigation commands, users can experience immersive HCI in the virtual environment.
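The two cameras thus feed two independent command channels: hand gestures become operating commands and arm motions become navigation commands. The following hypothetical sketch (the gesture and motion labels, enum names, and mappings are assumptions, not the authors' implementation) shows one way such per-frame recognition results could be merged into a single interaction stream:

```python
# Hypothetical sketch (not the authors' code): combining operating commands
# derived from hand gestures with navigation commands derived from arm motions.
from enum import Enum, auto
from typing import Optional, Tuple


class OperatingCommand(Enum):      # produced by the hand-gesture recognizer
    GRAB = auto()
    RELEASE = auto()
    ROTATE = auto()


class NavigationCommand(Enum):     # produced by the arm-motion estimator
    MOVE_LEFT = auto()
    MOVE_RIGHT = auto()
    MOVE_FORWARD = auto()


# Assumed label vocabularies; the real recognizers' outputs may differ.
GESTURE_TO_OPERATION = {
    "fist": OperatingCommand.GRAB,
    "open_palm": OperatingCommand.RELEASE,
    "wave": OperatingCommand.ROTATE,
}
MOTION_TO_NAVIGATION = {
    "left": NavigationCommand.MOVE_LEFT,
    "right": NavigationCommand.MOVE_RIGHT,
    "forward": NavigationCommand.MOVE_FORWARD,
}


def interpret_frame(
    gesture: Optional[str],
    motion: Optional[str],
) -> Tuple[Optional[OperatingCommand], Optional[NavigationCommand]]:
    """Translate per-frame recognition labels into commands for the VR scene."""
    operation = GESTURE_TO_OPERATION.get(gesture) if gesture else None
    navigation = MOTION_TO_NAVIGATION.get(motion) if motion else None
    return operation, navigation


# Example: the user makes a fist while moving the arm forward.
print(interpret_frame("fist", "forward"))
# -> (<OperatingCommand.GRAB: 1>, <NavigationCommand.MOVE_FORWARD: 3>)
```

Keeping the two channels separate until the final dispatch mirrors the setup described above, where manipulating an object and roaming through the scene can occur in the same frame.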