New Robot Uses Kinect for Peepers
The list of uses for Kinect, other than petting an adorable baby tiger or getting your groove on, just got a little bit longer.
Researchers at the MIT Personal Robotics Group have built a prototype robot that sees using Microsoft's Kinect motion controller. The robot is built on the iRobot Create platform, and is essentially a Roomba with an onboard computer, a range finder, and - of course - a Kinect sensor.
The robot learns its surroundings by exploring and collecting information with Kinect's depth sensor. It relays that information back to a host computer, where it is used to essentially "paint" a map of the area that becomes more and more detailed as the robot explores. The robot is also capable of identifying human beings and tracking their limbs, which allows people to direct the robot by pointing in the direction they want it to go.
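The article doesn't describe the team's actual software, but the "paint a map" idea is the core of occupancy-grid mapping: each depth reading is projected from the robot's pose into world coordinates, and the cell it lands in is marked occupied. Here is a minimal, hypothetical sketch of that update step (the function name, grid representation, and scan format are all illustrative assumptions, not MIT's implementation):

```python
import math

def update_map(grid, pose, depth_scan, cell_size=0.1):
    """Mark grid cells hit by depth readings as occupied.

    grid:       dict mapping (ix, iy) cell indices to "occupied"
    pose:       (x, y, heading) of the robot, in metres and radians
    depth_scan: list of (angle_offset, distance) readings from the sensor
    cell_size:  grid resolution in metres (hypothetical 10 cm cells)
    """
    x, y, heading = pose
    for angle, dist in depth_scan:
        # Project the reading from the robot's frame into world coordinates.
        wx = x + dist * math.cos(heading + angle)
        wy = y + dist * math.sin(heading + angle)
        # Snap the hit point to a grid cell and mark it.
        grid[(int(round(wx / cell_size)), int(round(wy / cell_size)))] = "occupied"
    return grid
```

As the robot drives around and feeds in new scans from new poses, more cells get filled in, which is why the map grows more detailed the longer it explores.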
It's unlikely that this is the first time anyone has thought about using a depth sensor on a robot, but I wouldn't be surprised if Kinect is opening up new avenues because it's cheaper and more convenient than building a custom solution. It's a little like universities and other institutions - like the US Air Force [http://www.escapistmagazine.com/news/view/100631-Air-Force-Might-Be-Troubled-by-PS3-Other-OS-Removal] - buying up PS3s to use as processor clusters. It's not that other solutions didn't exist, it's just that using gaming hardware was more cost effective.
It's pretty remarkable what the researchers at MIT have been able to do with Kinect in such a short time. It makes you wonder what they, and the other people tinkering with Kinect right now, will come up with months from now.
Source: Engadget [http://www.engadget.com/2010/11/17/kinect-sensor-bolted-to-an-irobot-create-starts-looking-for-tro/]