Abstract
Large-scale display systems with immersive human–computer interaction (HCI) are an important solution for virtual reality (VR) systems. In contrast to traditional interactive VR systems, which require the user to wear heavy VR headsets for visualization and data gloves for HCI, the proposed method uses a large-scale display screen (with or without 3D glasses) to visualize the virtual environment and a bare-handed gesture recognition solution to receive user instructions. The entire framework is named an immersive HCI system. Through a virtual 3D interactive rectangular parallelepiped, we establish the correspondence between the virtual scene and the control information. A bare-handed gesture recognition method is presented based on an extended genetic algorithm, and an arm motion estimation method is designed based on fuzzy predictive control theory. Experimental results show that the proposed method achieves lower error rates than most existing solutions at an acceptable recognition frequency.
Introduction
Large-scale display technology based on multi-projector systems presents an important solution for human–computer interaction (HCI) [1, 2]. Moreover, remote control using bare-hand gestures, body poses, and limb motions is preferable for the virtual reality (VR) experience compared with wearing heavy VR devices such as VR headsets and data gloves [3]. In this study, users experience the virtual environment by roaming in front of a large-scale screen driven by a multi-projector system, while instructions and orders are received through a bare-handed gesture recognition solution [4]. The experimental setup, illustrated in Fig. 1, consists of a large-scale display, fifteen projectors, fifteen client PCs, two cameras, and one server. The user's hand gestures and arm motions are captured by the two cameras. The hand gestures are translated into operating commands such as grabbing, releasing, and rotating, while the arm motions generate navigation commands such as moving left, moving right, and moving forward. By combining the operating commands and the navigation commands, the system enables users to experience immersive HCI in the virtual environment; the sketch below illustrates this command-fusion step.
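As a minimal sketch of the command-fusion step, the following Python code maps a recognized hand gesture to an operating command and an estimated arm motion to a navigation command, then combines them into a single instruction for the virtual scene. Only the command vocabulary (grab, release, rotate; move left/right/forward) comes from the text above; all type and table names (Gesture, ArmMotion, Instruction, the two lookup tables) are hypothetical, not part of the paper's implementation.

```python
# Hypothetical command-fusion sketch: gesture -> operating command,
# arm motion -> navigation command, both combined into one instruction.
from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    GRAB = auto()
    RELEASE = auto()
    ROTATE = auto()


class ArmMotion(Enum):
    LEFT = auto()
    RIGHT = auto()
    FORWARD = auto()


# Operating commands produced by hand gestures (vocabulary from the paper).
OPERATING_COMMANDS = {
    Gesture.GRAB: "grab_object",
    Gesture.RELEASE: "release_object",
    Gesture.ROTATE: "rotate_object",
}

# Navigation commands produced by arm motions (vocabulary from the paper).
NAVIGATION_COMMANDS = {
    ArmMotion.LEFT: "move_left",
    ArmMotion.RIGHT: "move_right",
    ArmMotion.FORWARD: "move_forward",
}


@dataclass
class Instruction:
    operate: str | None   # operating command from the hand gesture, if any
    navigate: str | None  # navigation command from the arm motion, if any


def fuse(gesture: Gesture | None, motion: ArmMotion | None) -> Instruction:
    """Combine one recognized gesture and one estimated arm motion into
    the instruction sent to the rendering server."""
    return Instruction(
        operate=OPERATING_COMMANDS.get(gesture),
        navigate=NAVIGATION_COMMANDS.get(motion),
    )


if __name__ == "__main__":
    # e.g. the user grabs an object while stepping forward
    print(fuse(Gesture.GRAB, ArmMotion.FORWARD))
```

In this sketch the two recognition channels stay independent, so either command may be absent in a given frame; a real system would fill the two slots from the gesture recognizer and the arm motion estimator running on the camera streams.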