Research Movies 2010

Human Tracking using Ultrasonic Sensor for Mobile Robot

SHIINA Makoto

This human tracking system estimates the position of an ultrasonic transponder, attached to the waist of the target person, by triangulation. The hardware consists of three ultrasonic sensors mounted on the mobile robot. The system is built on the ultrasonic sensor driver "HiSonic" developed in our laboratory and a PIC18F4550 microcontroller with a built-in USB module.
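The triangulation step above can be sketched as follows. This is a minimal illustration, not the demo's actual code: the sensor layout and function name are hypothetical, and the three range equations are linearized by subtracting the first circle equation from the other two, giving a 2x2 linear system for the transponder position.

```python
import math

def trilaterate(sensors, dists):
    """Estimate the 2D transponder position from three range readings.

    sensors: [(x1, y1), (x2, y2), (x3, y3)] known sensor positions (robot frame)
    dists:   [d1, d2, d3] measured ranges to the transponder
    """
    (x1, y1), (x2, y2), (x3, y3) = sensors
    d1, d2, d3 = dists
    # Subtracting the first circle equation from the others removes the
    # quadratic terms, leaving a linear system A [x, y]^T = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("degenerate (collinear) sensor layout")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With collinear sensors the determinant vanishes, which is why the three receivers on the robot must not lie on a single line.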

Winning Entries of Demo Program Contest 2010

1st Place: Posture and Position Control of Folding Handle Cart Pushed by Autonomous Mobile Robot

WATANABE Atsushi

This is a demonstration of controlling the posture and position of a passive folding handle cart pushed by an autonomous mobile robot. The positions of the cart's handle bar and of the feet of the person walking in front are measured using a laser range scanner 'URG' mounted on the robot. The robot's contact position and angle on the cart are then adjusted by a feedback controller so that the cart stays within a set distance and angle of the person. As a result, the folding handle cart is steered by the robot to follow the person walking ahead.
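The feedback loop described above might look like the following proportional controller. This is only a sketch under assumed conventions; the function name, gains, and sign conventions are hypothetical, not taken from the demo.

```python
def cart_push_control(contact_offset, cart_angle, k_off=1.5, k_ang=0.8, v_fwd=0.3):
    """Proportional controller for pushing a passive cart.

    contact_offset: lateral error [m] of the robot's contact point on the
                    handle bar from the desired point (positive = right)
    cart_angle:     angular error [rad] of the cart relative to the desired
                    heading toward the person
    Returns (forward velocity [m/s], turn rate [rad/s]) for the robot.
    """
    # Steer so as to drive both the contact-point error and the cart
    # angle error back to zero while pushing forward at constant speed.
    omega = -k_off * contact_offset - k_ang * cart_angle
    return v_fwd, omega
```

With zero errors the robot pushes straight ahead; a positive offset or angle error produces a corrective turn in the opposite direction.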

2nd Place: Run meros!!!

BANDO Shigeru

In this system, the robot (Yamabico meros) moves along a free route designated by the directions of markers set on the floor. A camera captures an image of each marker, and the marker region is segmented out and analyzed using ARToolKit. A virtual arrow is then overlaid on the marker, showing the direction in which the robot will move.
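Once ARToolKit has estimated the marker pose, the direction the marker points can be read off its rotation. The sketch below assumes a 3x4 transform whose first column is the marker's x-axis (the pointing direction) and that the camera looks down at the floor; the function name and these conventions are illustrative, not the demo's actual code.

```python
import math

def marker_direction(T):
    """Heading [rad] of the marker's x-axis projected onto the floor plane.

    T: 3x4 homogeneous transform (3 rows of rotation plus translation column),
    as produced by an ARToolKit-style pose estimator. The yaw of the marker's
    x-axis is recovered from the top-left 2x2 block of the rotation.
    """
    return math.atan2(T[1][0], T[0][0])
```

The returned heading can then be fed to the robot's steering controller, and a virtual arrow drawn along the same axis for the overlay display.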

3rd Place: Using Only Arc Paths to Achieve Human-like Parking Skills

OGATA Kazuki

In this demonstration, a 34 cm wide mobile robot equipped with a 240-degree laser range sensor (URG) travels along a narrow 40 cm path and reverse-parks into a garage. Data acquired from the URG sensor are used to estimate the distances to the garage walls, even when they fall within the robot's blind spot. This enables the robot to maneuver into the garage by repeatedly turning left and right and moving forward, much as a driver aligns with the side wall before parking the car.

4th Place: Vision-based Karugamo

MORIKAWA Naoki

Instead of a laser scanner, human following is achieved with a camera alone. A color histogram is built from the image region where the target person appears. Backprojecting this histogram onto the image extracts only the pixels matching that color distribution, and the result is noise-reduced. Human following is then performed by tracking the largest extracted region: its displacement from the center of the image drives the robot's motion. The method works reasonably well, but recognition becomes difficult when the scene is overly bright or the target's color does not stand out distinctly.
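The histogram backprojection step above can be sketched in a few lines of NumPy. This is an illustrative reimplementation under assumed conventions (single hue channel in [0, 180), hypothetical function names), not the demo's actual code:

```python
import numpy as np

def backproject(hue_img, roi, bins=16):
    """Likelihood that each pixel's hue came from the tracked region.

    hue_img: 2D array of hue values in [0, 180)
    roi:     2D array of hue values cut from the target region
    Builds a normalized hue histogram of the ROI and looks up each image
    pixel's bin, producing a per-pixel probability map.
    """
    hist, _ = np.histogram(roi.ravel(), bins=bins, range=(0, 180))
    hist = hist / max(hist.max(), 1)                    # normalize to [0, 1]
    idx = np.clip((hue_img.astype(int) * bins) // 180, 0, bins - 1)
    return hist[idx]

def track_offset(prob, thresh=0.5):
    """Horizontal offset [px] of the tracked color mass from the image center.

    Returns None when no pixel exceeds the threshold (target lost).
    """
    ys, xs = np.nonzero(prob > thresh)
    if xs.size == 0:
        return None
    return float(xs.mean() - prob.shape[1] / 2.0)
```

The sign of the returned offset tells the robot which way to turn to keep the person centered; a connected-component pass (as in the demo's noise reduction) can replace the simple centroid here.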

5th Place: DanceDanceRobot

The movie is not publicly available due to copyright considerations.
KISHITA Kazuki

By capturing the projected game screen with a webcam, the robot plays the game Dance Dance Revolution. As the arrows scroll from the bottom to the top of the screen, the robot detects and recognizes them, sends the corresponding direction inputs to the PC running the game, and dances to them at the same time.
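The arrow-to-input mapping described above can be sketched as a simple lane-and-trigger scheme. The lane boundaries, trigger band, and function name below are hypothetical illustrations of the idea, not the demo's actual values:

```python
# Horizontal pixel ranges of the four arrow lanes in the captured image
# (hypothetical values for illustration).
LANES = {"left": (0, 40), "down": (40, 80), "up": (80, 120), "right": (120, 160)}
# Vertical band near the top of the screen where an arrow should be "hit".
TRIGGER_Y = (20, 30)

def arrows_to_press(detections):
    """Map detected arrow centroids (x, y) to game directions.

    An arrow generates an input only while its centroid lies inside the
    trigger band; its lane is determined from the x coordinate.
    """
    presses = []
    for x, y in detections:
        if TRIGGER_Y[0] <= y <= TRIGGER_Y[1]:
            for name, (x0, x1) in LANES.items():
                if x0 <= x < x1:
                    presses.append(name)
                    break
    return presses
```

Each returned direction would then be forwarded to the PC running the game (e.g. as a key press) while the same signal drives the robot's dance motion.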