Purpose – The purpose of this paper is to propose a fast object detection algorithm based on structured-light analysis, which detects and recognizes human gestures and poses and then derives the corresponding commands for human-robot interaction control.

Design/methodology/approach – The human poses are estimated and analyzed by the proposed scheme, and the resulting data, processed by a fuzzy decision-making system, are used to launch the corresponding robotic motions. An RGB camera and an infrared light module are used to estimate the distance to one or more bodies.

Findings – The modules provide not only image perception but also object skeleton detection. Within the infrared light module, a laser source emits invisible infrared light, which passes through a filter and is scattered into a semi-random but constant pattern of small dots projected onto the environment in front of the sensor. The reflected pattern is then detected by an infrared camera and analyzed for depth estimation. Since the depth of an object is a key parameter for pose recognition, the system estimates the distance to each dot and recovers depth information from the known distance between emitter and receiver.

Research limitations/implications – Future work will consider reducing the computation time for object estimation and tuning parameters adaptively.

Practical implications – The experimental results demonstrate the feasibility of the proposed system.

Originality/value – This paper achieves real-time human-robot interaction by visual detection based on structured-light analysis.
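The depth-recovery step described in the Findings can be illustrated with a standard structured-light triangulation sketch. This is not the authors' implementation; the focal length, emitter-camera baseline, and disparity values below are hypothetical, and real systems add per-dot pattern matching and calibration on top of this geometry.

```python
# Illustrative sketch of structured-light depth recovery: an IR emitter
# projects a dot pattern, and an IR camera offset by a known baseline
# observes each dot shifted by a disparity that depends on surface depth.
# All numeric values here are hypothetical, for demonstration only.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulated depth z = f * b / d; a larger disparity means a closer surface."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 580 px focal length, 7.5 cm emitter-camera baseline,
# a dot observed with 29 px of disparity lies 1.5 m from the sensor.
z = depth_from_disparity(disparity_px=29.0, focal_px=580.0, baseline_m=0.075)
print(round(z, 3))  # 1.5 (metres)
```

Because disparity appears in the denominator, depth resolution degrades with distance, which is one reason such sensors are typically specified for a limited working range.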
Engineering Computations: International Journal for Computer-Aided Engineering and Software – Emerald Publishing
Published: Oct 28, 2014