Document Type: Review Article

Authors

Department of Engineering, Faculty of Vocational Studies, Universitas Airlangga, Surabaya, Indonesia.

Abstract

Head-movement gestures can aid people with disabilities by enabling hands-free human-computer interaction. Currently, motion-based sensors are the most widely used approach for recognizing head gestures. Identifying head movement is important for controlling a robotic manipulator in an assistive device. However, the most effective methodology for assessing angular head movement has yet to be established. This paper combines data from two sensors, a visual sensor and a gyro sensor, to identify head orientation with high precision. Head orientation was measured from the data distribution while a meal-assistance robot manipulator was operated by a user in a sitting position. The accuracy of the system was evaluated for both the visual sensor and the gyro sensor. Experimental results show that head movements are identified correctly with an average accuracy of 82%. Therefore, we propose applying position control of a meal-assistance robot based on the user's head movement in a sitting position.
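The abstract states that visual-sensor and gyro-sensor data are combined to estimate head orientation, but does not specify the fusion scheme. The sketch below is a minimal, illustrative example of one common way to do this, a complementary filter blending the gyro-integrated angle with a vision-based angle, followed by a simple threshold mapping to a manipulator command; the function names, sample rate, blending factor, and thresholds are assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed fusion scheme, not the paper's method): fuse a
# gyroscope angular-rate reading with a vision-based head-angle estimate
# using a complementary filter, then map the fused yaw to a command.

def complementary_filter(angle_prev_deg, gyro_rate_dps, vision_angle_deg,
                         dt_s, alpha=0.98):
    """Blend the gyro-integrated angle (responsive but drift-prone) with
    the vision estimate (drift-free but noisy) to track head yaw."""
    gyro_angle = angle_prev_deg + gyro_rate_dps * dt_s  # integrate angular rate
    return alpha * gyro_angle + (1.0 - alpha) * vision_angle_deg


def classify_head_gesture(yaw_deg, threshold_deg=15.0):
    """Map the fused yaw angle to a left/right/neutral command,
    e.g. for positioning a meal-assistance manipulator."""
    if yaw_deg > threshold_deg:
        return "right"
    if yaw_deg < -threshold_deg:
        return "left"
    return "neutral"


# Example: one update at an assumed 50 Hz rate, with a hypothetical gyro
# reading of 20 deg/s and a vision-based yaw estimate of 12 deg.
yaw = complementary_filter(angle_prev_deg=10.0, gyro_rate_dps=20.0,
                           vision_angle_deg=12.0, dt_s=0.02)
print(yaw, classify_head_gesture(yaw))
```

In such a scheme the gyro term supplies smooth short-term tracking while the visual term corrects long-term drift; the reported 82% accuracy would then reflect how well the thresholded gesture labels match the intended head movements.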

Keywords
