Purpose – Achieving natural interaction between humans and robots by means of vision and speech is a major goal that many researchers are working toward. This paper describes a gesture-based human-robot interaction (HRI) system built on a knowledge-based software platform.

Design/methodology/approach – A frame-based knowledge model is defined for gesture interpretation and HRI. In this model, frames are defined for the known users, robots, poses, gestures and robot behaviors. The system first identifies the user with the eigenface method. Face and hand poses are then segmented from the camera frame buffer using that person's specific skin-color information and classified by the subspace method.

Findings – The system recognizes static gestures composed of face and hand poses, as well as dynamic gestures of the face in motion. By combining computer vision with a knowledge-based approach, the system improves its adaptability to different people.

Originality/value – Provides information on an experimental HRI system implemented on the frame-based software platform for agent and knowledge management, using the AIBO entertainment robot; the system has been demonstrated to be useful and efficient within a limited situation.
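The frame-based knowledge model described above can be pictured as a set of named frames with slots linking gestures to robot behaviors. The following is a minimal sketch only: the frame names, slot names and the lookup function are illustrative assumptions, not the paper's actual schema.

```python
# Hypothetical frame store: each frame is a dict of slots.
# Slot names ("is_a", "trigger", ...) are illustrative, not from the paper.
frames = {
    "pose:open-palm":  {"is_a": "pose"},
    "gesture:wave":    {"is_a": "gesture", "poses": ["pose:open-palm"],
                        "motion": "lateral"},
    "behavior:greet":  {"is_a": "behavior", "trigger": "gesture:wave",
                        "action": "wag-tail"},
}

def behavior_for(gesture: str, frames: dict):
    """Return the name of the behavior frame triggered by a gesture,
    or None if no behavior frame matches."""
    for name, slots in frames.items():
        if slots.get("is_a") == "behavior" and slots.get("trigger") == gesture:
            return name
    return None
```

A recognized gesture is thus mapped to a behavior by a slot lookup, e.g. `behavior_for("gesture:wave", frames)` yields `"behavior:greet"`.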
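The skin-color segmentation step can be sketched as a per-pixel threshold in a chrominance space. This is only an assumption-laden illustration: the paper uses person-specific skin-color models, whereas the generic YCrCb-style thresholds below are placeholders.

```python
import numpy as np

def skin_mask(frame, cr_range=(133, 173), cb_range=(77, 127)):
    """Return a boolean mask of skin-colored pixels.

    frame: H x W x 3 uint8 array assumed to be in Y/Cr/Cb channel order.
    The threshold ranges are generic defaults, not the per-person
    model the paper describes.
    """
    cr = frame[..., 1]
    cb = frame[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))
```

Connected regions of the resulting mask would then be cropped as candidate face and hand poses for classification.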
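Both the eigenface identification and the subspace pose classification rest on the same idea: project flattened images onto a low-dimensional principal subspace and compare them there. A minimal sketch using PCA via SVD and nearest-neighbor matching, under the assumption that training images are pre-cropped and flattened (function names are illustrative):

```python
import numpy as np

def fit_subspace(X, k):
    """X: n_samples x n_pixels matrix of flattened training images.
    Returns (mean, basis), where basis rows are the top-k principal
    axes ("eigenfaces" when X holds face images)."""
    mean = X.mean(axis=0)
    # SVD of the centered data; right singular vectors span the subspace.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

def project(x, mean, basis):
    """Coordinates of one flattened image in the subspace."""
    return basis @ (x - mean)

def classify(x, mean, basis, train_proj, labels):
    """Nearest-neighbor label in the subspace."""
    z = project(x, mean, basis)
    d = np.linalg.norm(train_proj - z, axis=1)
    return labels[int(np.argmin(d))]
```

In use, the user's identity (eigenface step) or the face/hand pose class (subspace step) is the label of the closest projected training image.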
Industrial Robot: An International Journal – Emerald Publishing
Published: Jan 1, 2006
Keywords: Robotics; Man machine interface