In this paper, we propose a marker-less full-body human motion capture system designed for humanoid robot applications. The system is based on a stereo camera and is therefore highly portable. Tracking is implemented within the particle filter framework, and the high-dimensionality problem is addressed through partitioned sampling. Taking advantage of the stereo setup, we propose a depth cue which...
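The partitioned-sampling idea this abstract refers to can be sketched as a toy particle filter over a split state. This is an illustrative sketch, not the paper's implementation: the per-partition likelihoods, diffusion noise, and particle count below are placeholder assumptions.

```python
import random

def partitioned_particle_filter(init_state, weight_fns, n_particles=200):
    """Toy partitioned sampling: rather than sampling the full
    high-dimensional pose at once, each partition (e.g. torso, then
    each limb) is diffused and resampled in turn, so the particle
    count needed grows far more slowly with state dimension.
    weight_fns[i] scores partition i (hypothetical likelihoods)."""
    particles = [list(init_state) for _ in range(n_particles)]
    for i, weight in enumerate(weight_fns):
        # Diffuse only partition i of every particle.
        for p in particles:
            p[i] += random.gauss(0.0, 0.5)
        # Weight and resample using partition i alone.
        weights = [weight(p[i]) for p in particles]
        particles = [list(c) for c in
                     random.choices(particles, weights, k=n_particles)]
    # After the final resampling, weights are uniform, so a plain
    # mean per partition serves as the state estimate.
    return [sum(p[i] for p in particles) / n_particles
            for i in range(len(init_state))]
```

With two scalar partitions whose (assumed) likelihoods peak at +1 and -1 respectively, the estimate drifts toward each peak partition by partition, instead of searching the joint space at once.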
This paper proposes a plausible approach for a humanoid robot to discover its own body parts based on the coherence of two different sensory feedback channels: vision and proprioception. The image cues of a visually salient region are stored in a visuomotor base together with the level of visuo-proprioceptive coherence. High coherence between the motions in vision and proprioception suggests the visually...
This paper proposes a plausible approach for a robot to discover its own body based on the coherence of two different sensory feedback channels: vision and proprioception. The image cues of a moving region are stored in an image base with a visuo-proprioceptive coherence label. Coherence between vision and proprioception suggests that the visually detected object is correlated to its...
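The vision–proprioception coherence test described in the two abstracts above can be sketched as a simple correlation between motion signals. The signal representation (per-frame scalar motion magnitudes) and the 0.8 labeling threshold are assumptions for illustration, not the authors' actual measure.

```python
def coherence(visual_motion, joint_motion):
    """Pearson correlation between a visual motion signal (e.g. optical
    flow magnitude of a tracked region) and a proprioceptive signal
    (e.g. joint velocity). Values near 1 mean the two move together."""
    n = len(visual_motion)
    mv = sum(visual_motion) / n
    mj = sum(joint_motion) / n
    cov = sum((v - mv) * (j - mj)
              for v, j in zip(visual_motion, joint_motion))
    sv = sum((v - mv) ** 2 for v in visual_motion) ** 0.5
    sj = sum((j - mj) ** 2 for j in joint_motion) ** 0.5
    return cov / (sv * sj)

def label_region(visual_motion, joint_motion, threshold=0.8):
    """Label a tracked region as part of the robot's own body when its
    motion is highly coherent with proprioception (threshold assumed)."""
    if coherence(visual_motion, joint_motion) >= threshold:
        return "own-body"
    return "external"
```

A region whose motion tracks the commanded joint motion gets labeled "own-body"; an independently moving object does not.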
In human-robot interaction, it is important for the robot to know the head movement, gaze direction, and expression of the conversation partner, since such information is closely related to attention, intention, and emotion. Recently, many types of real-time measurement systems for head pose and gaze direction have been proposed and utilized for human-interface and ergonomics applications. We...
This paper presents a method to generate vision- and task-based robot manipulator trajectories. In the framework of research on daily assistive robots, the task formulated for this work consists of arranging boxes on a shelf, or picking them up and placing them in a shopping cart. For object recovery, SIFT keypoint tracking in combination with stereo vision is used. A collision-free trajectory between...
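The stereo step used to recover object (box) positions can be illustrated by basic pinhole triangulation on a rectified image pair. The focal length and baseline values in the example are arbitrary, and the full pipeline (SIFT matching, trajectory planning) is not reproduced here.

```python
def triangulate_depth(focal_px, baseline_m, x_left, x_right):
    """Depth from a rectified stereo pair: Z = f * B / d, where the
    disparity d = x_left - x_right is the horizontal shift (in pixels)
    of the same keypoint between the left and right images."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity
```

For example, with an assumed 500 px focal length and 0.1 m baseline, a keypoint seen at column 120 in the left image and 100 in the right lies at `triangulate_depth(500.0, 0.1, 120.0, 100.0)` → 2.5 m.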
Robotic assistants designed to coexist and communicate with humans in the real world should be able to interact with them in an intuitive way. This requires that the robots are able to recognize typical gestures performed by humans such as head shaking/nodding, hand waving, or pointing. In this paper, we present a system that is able to spot and recognize complex, parameterized gestures from monocular...
This paper proposes a face detection and tracking system using an embedded computing system for a humanoid robot environment, which allows the user to observe or recognize an unknown person. To detect faces in an image sequence, the system uses a skin color model and deformable ellipse matching. Faces in a view are detected by maximizing the image gradient magnitude around the perimeter of the ellipse...
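The ellipse-matching criterion this abstract states, maximizing image gradient magnitude around the ellipse perimeter, can be sketched directly. The perimeter sampling density and the brute-force candidate search below are simplifications, not the paper's deformable matching procedure.

```python
import math

def ellipse_score(grad_mag, cx, cy, a, b, n_samples=64):
    """Sum of gradient magnitude sampled along the perimeter of the
    ellipse hypothesis (cx, cy, a, b); a face contour should score
    highly because edge energy concentrates on its boundary."""
    score = 0.0
    h, w = len(grad_mag), len(grad_mag[0])
    for k in range(n_samples):
        t = 2.0 * math.pi * k / n_samples
        x = int(round(cx + a * math.cos(t)))
        y = int(round(cy + b * math.sin(t)))
        if 0 <= y < h and 0 <= x < w:
            score += grad_mag[y][x]
    return score

def best_ellipse(grad_mag, candidates):
    """Pick the candidate (cx, cy, a, b) maximizing the perimeter score."""
    return max(candidates, key=lambda c: ellipse_score(grad_mag, *c))
```

In practice the candidate set would come from skin-color regions, with the ellipse parameters refined around each region rather than enumerated.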
In this paper, a stereo vision based tracking system for a humanoid robot is proposed. The goal is to locate and track the end-effector in unknown and dynamic environments. A robust approach is developed by utilizing visual features: color, disparity, and adaptive template matching are combined to detect and track the end-effector robustly. An improved active search method...
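The adaptive template matching mentioned in the abstract can be sketched as an exhaustive sum-of-squared-differences search followed by blending the matched patch back into the template. The SSD distance and the 0.1 blend rate are assumptions for illustration; the paper's improved active search (which restricts the search window) is not reproduced here.

```python
def ssd(patch, template):
    """Sum of squared differences between two equally sized patches."""
    return sum((p - t) ** 2
               for row_p, row_t in zip(patch, template)
               for p, t in zip(row_p, row_t))

def track(frame, template, alpha=0.1):
    """Find the template's best match in `frame` by exhaustive SSD
    search, then adapt the template toward the matched patch so it
    follows gradual appearance changes of the end-effector."""
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for y in range(len(frame) - th + 1):
        for x in range(len(frame[0]) - tw + 1):
            patch = [row[x:x + tw] for row in frame[y:y + th]]
            d = ssd(patch, template)
            if d < best:
                best, best_pos = d, (x, y)
    # Blend the matched patch into the template (adaptive update).
    mx, my = best_pos
    matched = [row[mx:mx + tw] for row in frame[my:my + th]]
    new_template = [[(1 - alpha) * t + alpha * m
                     for t, m in zip(rt, rm)]
                    for rt, rm in zip(template, matched)]
    return best_pos, new_template
```

The returned position is the top-left (x, y) of the best match; the updated template is used for the next frame.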