More fun out of MIT's Computer Science and Artificial Intelligence Lab. Grad student Peng Yu happily showed off a couple of flying demos on our visit, controlling an AR.Drone through a number of methods, including keyboard, tablet (touch), voice and gesture, each naturally presenting its own positives and negatives in terms of ease of use and specificity. The latter was certainly the most intriguing of the bunch, executed via a Kinect hack that allowed Yu to direct the flying robot over a small model town in the middle of the lab.
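For a rough sense of what the simplest of those control paths involves, here's a minimal keyboard-control sketch in Python, assuming the AR.Drone's documented AT-command-over-UDP interface on port 5556. This is only an illustration of that public protocol, not the lab's actual code, and the key bindings are made up.

```python
import socket
import struct

DRONE_IP = "192.168.1.1"   # default AR.Drone access-point address
AT_PORT = 5556             # UDP port the drone listens on for AT commands

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 0  # every AT command carries an increasing sequence number

def send(command):
    """Format in the sequence number and send one AT command string."""
    global seq
    seq += 1
    sock.sendto((command.format(seq=seq) + "\r").encode(), (DRONE_IP, AT_PORT))

def takeoff():
    # AT*REF with the takeoff bit set on top of the always-on bits.
    send("AT*REF={seq},290718208")

def land():
    send("AT*REF={seq},290717696")

def move(roll=0.0, pitch=0.0, gaz=0.0, yaw=0.0):
    # AT*PCMD expects each float reinterpreted as a signed 32-bit integer.
    args = ",".join(str(struct.unpack("i", struct.pack("f", v))[0])
                    for v in (roll, pitch, gaz, yaw))
    send("AT*PCMD={seq},1," + args)

# Hypothetical key bindings mapping single keystrokes to drone actions.
BINDINGS = {
    "t": takeoff,
    "l": land,
    "w": lambda: move(pitch=-0.1),  # tilt forward
    "s": lambda: move(pitch=0.1),   # tilt backward
    "a": lambda: move(roll=-0.1),   # bank left
    "d": lambda: move(roll=0.1),    # bank right
}

if __name__ == "__main__":
    while True:
        key = input("key (t/l/w/a/s/d, q to quit): ").strip().lower()
        if key == "q":
            break
        action = BINDINGS.get(key)
        if action:
            action()
```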
Voice, meanwhile, played an important role in a computer demo in keeping with a vision from Boeing of a future (some 20 or 30 years out, by its estimates) in which citizens use personal aircraft capable of carrying two to four people to, say, commute to work. Speaking into the system, the user essentially negotiates with the aircraft, giving a destination, a hoped-for flight duration and any pit stops to be made along the way. In the demo, the system adjusted for storms and let Yu know how quickly it thought it could make the run.
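To make that give-and-take a little more concrete, here is one way the exchange could be modeled. Every name below (FlightRequest, plan_flight, the fixed leg times and storm penalty) is hypothetical, chosen only to illustrate the kind of negotiation Yu described, not the lab's planner.

```python
from dataclasses import dataclass, field

@dataclass
class FlightRequest:
    """What the user asks for by voice (all fields hypothetical)."""
    destination: str
    desired_minutes: float
    stops: list = field(default_factory=list)

@dataclass
class FlightPlan:
    """What the aircraft offers back after checking its constraints."""
    route: list
    estimated_minutes: float
    notes: list

def plan_flight(request, storm_areas):
    """Toy planner: pad the estimate for any stop sitting under a storm."""
    route = ["home"] + request.stops + [request.destination]
    estimate = 10.0 * (len(route) - 1)  # pretend each leg takes 10 minutes
    notes = []
    for area in storm_areas:
        if area in route:
            estimate += 15.0  # crude penalty for routing around weather
            notes.append(f"detouring around storm near {area}")
    if estimate > request.desired_minutes:
        notes.append("cannot meet requested duration; best effort shown")
    return FlightPlan(route, estimate, notes)

# Example negotiation: ask for a 25-minute hop with one pit stop.
plan = plan_flight(FlightRequest("office", 25.0, stops=["coffee"]),
                   storm_areas=["coffee"])
print(plan.estimated_minutes, plan.notes)
```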
Demos of all of the above can be found after the break.