Blog

Latest entries for tag 'demo'

Video about Hybrid Reasoning on the PR2

We have posted a video introducing the DFG Research Unit on Hybrid Reasoning in general, and the C1 Robotics sub-project in particular. We present a demo on the PR2 robot that will serve as a baseline system and testbed for further research. The C1 project is a joint effort of the Research Lab Autonomous Intelligent Systems, University of Freiburg, the Knowledge-Based Systems Group, RWTH Aachen University, and the Research Group Foundations of Artificial Intelligence, University of Freiburg.

The Hybrid Reasoning C1 Robotics project investigates effective and efficient ways to combine quantitative and qualitative aspects of knowledge representation and reasoning. For the video in particular, we implemented a baseline system for active perception: we want the robot to reason about its current beliefs and, if necessary, decide what to do to improve them.

The base system builds on ROS for the PR2 and the TidyUpRobot demo from Freiburg. The PDDL planning domain was adapted for action planning in our active perception scenario. The perception system is based on Fawkes' tabletop-objects plugin and the generic robot database recording with MongoDB. The planner decided on a sequence of positions to move to, waited a short time at each position, and recorded the timestamp of that observation. The database recording was running the whole time, storing in particular transforms, images, and point clouds from the Kinect. After each position, the planner triggered the pcl-db-merge plugin as another action to merge and align the point clouds; the data for the recorded timestamps was retrieved from the database. The initial offset estimate for each point cloud was based on the robot's AMCL (Fawkes port) pose at the respective recording time. The point clouds were then further aligned using pair-wise ICP (implemented using PCL). The perception pipeline itself was extended to detect cylinder-shaped objects such as certain cups. Running the pipeline on the merged point clouds eventually led to better results, because occlusion shadows and incomplete object shapes were resolved by taking data from multiple perspectives into account.
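To make the merge step more concrete, here is a minimal sketch of the idea behind pcl-db-merge: point clouds recorded near the waypoint timestamps are fetched from MongoDB, pre-aligned using the localization pose, and then refined with pair-wise ICP. The actual plugin is written in C++ against PCL and the Fawkes MongoDB logging; the sketch below uses pymongo and Open3D as stand-ins, and the database name, collection, and document fields are assumptions, not the real schema.

```python
# Hypothetical sketch of the pcl-db-merge idea: fetch clouds by timestamp,
# pre-align with the localization pose, refine with pair-wise ICP.
# Database/collection names and document fields are invented for illustration.
import numpy as np
import open3d as o3d
from pymongo import MongoClient

db = MongoClient()["robot_memory"]  # assumed database name

def cloud_at(stamp):
    """Fetch the point cloud and localization pose recorded closest before 'stamp'."""
    doc = db["pointclouds"].find_one({"timestamp": {"$lte": stamp}},
                                     sort=[("timestamp", -1)])
    points = np.asarray(doc["points"], dtype=float)   # assumed Nx3 point array
    pose = np.asarray(doc["amcl_pose"], dtype=float)  # assumed 4x4 transform
    cloud = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    return cloud, pose

def merge(stamps, max_dist=0.05):
    """Merge clouds from several viewpoints into the frame of the first one."""
    merged, ref_pose = cloud_at(stamps[0])
    for stamp in stamps[1:]:
        cloud, pose = cloud_at(stamp)
        # Initial offset estimate from the localization poses, as in the demo.
        init = np.linalg.inv(ref_pose) @ pose
        # Pair-wise ICP refinement (Open3D here; the plugin uses PCL).
        result = o3d.pipelines.registration.registration_icp(
            cloud, merged, max_dist, init,
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        merged += cloud.transform(result.transformation)
    return merged
```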

In the future, we want to integrate the system with Readylog to model the belief states and reason about their current quality.

The Fawkes-related code is available in the timn/pcl-db-merge and abdon/tabletop-recognition branches. The planning code is in the hybris_c1-experimental branch of the alufr-ros-pkg repository.

Posted by Tim Niemueller on March 18, 2013 14:59

5th BRICS Research Camp

Last week, the BRICS project invited young researchers and robotics experts from around the globe to the 5th BRICS Research Camp in Granada, Spain. The overall goal was to integrate and analyze a mobile manipulation task on the Care-O-bot and YouBot platforms, and to integrate and evaluate components and ideas developed in the BRICS project.

Group 6 was tasked with porting the YouBot architecture onto the Care-O-bot -- and back. Rather than porting the software components bit by bit, it was decided that it would be more interesting to port capabilities and behavior to accomplish the same robot task. For this, a semantic task abstraction level was introduced which provides skills as a unified interface between the high-level task coordination and the lower-level subsystems. This mid-level system is much in the tradition of Fawkes' Lua-based Behavior Engine, but here it was implemented as an intermediate Python layer called ActionCmdr. In particular, it supports splitting the behavior code into multiple packages, allowing for general skills where possible and specialized ones where necessary. This made it possible to run the exact same high-level program on both robots. After the code had been ported to the Care-O-bot, two groups cooperated in bringing it back onto the YouBot, showing the generality and portability of the new architecture. The picture to the right shows Group 6 and Group 1 with the Care-O-bot and two YouBots (from left to right, back row: Lucian Goron, Peter Schüller, Florian Weißhardt, Andreas Schierl, Tim Niemueller, Oliver Zendel, Felix Meßmer; front row: Leif Jentoft, Chandan Datta, and Daniel Ortiz Morales).
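To illustrate the idea of general skills with robot-specific specializations (this is a minimal, hypothetical sketch; the class and method names are invented and do not reflect the real ActionCmdr interface), a skill can be written once against a small robot abstraction while only the platform-specific parts are swapped out:

```python
# Hypothetical sketch of the general-vs-specialized skill idea; names are
# made up and do not reflect the real ActionCmdr API.
from abc import ABC, abstractmethod

class RobotBackend(ABC):
    """Robot-specific part, implemented once per platform (Care-O-bot, YouBot)."""
    @abstractmethod
    def move_to(self, place: str) -> bool: ...
    @abstractmethod
    def grasp(self, obj: str) -> bool: ...

class CareOBotBackend(RobotBackend):
    def move_to(self, place): print(f"[cob] driving to {place}"); return True
    def grasp(self, obj):     print(f"[cob] grasping {obj}");     return True

class YouBotBackend(RobotBackend):
    def move_to(self, place): print(f"[youbot] driving to {place}"); return True
    def grasp(self, obj):     print(f"[youbot] grasping {obj}");     return True

def fetch_object(robot: RobotBackend, obj: str, place: str) -> bool:
    """General skill: the same high-level code runs unchanged on both robots."""
    return robot.move_to(place) and robot.grasp(obj)

# The exact same high-level program, parameterized only by the backend:
for backend in (CareOBotBackend(), YouBotBackend()):
    fetch_object(backend, "cup", "kitchen_table")
```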

There is more on the Care-O-bot news page and in the program on the Research Camp wiki.

Care-O-bot Demo

YouBot Demo

Thanks to Group 1 we were able to port the code back to the YouBot and run the demo.

Posted by Tim Niemueller on November 6, 2012 18:03

robOCD: Robotic Order Cups Demo - An Interactive Domestic Service Robotics Demo

We have uploaded a new video highlighting a demo that integrates natural user interaction by speech and gesture, decision-theoretic planning, and autonomous task execution on the domestic service robot Caesar at the KBSG, RWTH Aachen University. It was conceived and implemented during the RoboCup German Open.

In a home-like environment, Caesar's task is to help set the table. Besides the basic capabilities of an autonomous mobile robot, it uses methods for human-robot interaction and has a sophisticated high-level control that allows for decision-theoretic planning. We use this demo to illustrate the interplay of several modules of our robot control software in carrying out complex tasks. The overall system makes it possible to perform robust and reliable service robotics in domestic settings such as the RoboCup@Home league.

Also, we show how our high-level programming language provides a powerful framework for agent behavior specification that can be beneficially deployed for service robotics applications.
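To give a flavor of what decision-theoretic planning means here, the high-level control chooses among alternative ways to carry out a step by weighing success probabilities against rewards and costs. The following is a purely conceptual sketch: the actions, probabilities, and numbers are invented, and the robot actually uses our high-level language rather than this Python code.

```python
# Conceptual sketch of decision-theoretic action selection. All actions,
# probabilities, and rewards are invented for illustration only.
candidates = {
    # action: list of (probability, reward) outcome pairs
    "ask_human_to_hand_over_cup": [(0.95, 8.0), (0.05, -2.0)],
    "grasp_cup_directly":         [(0.70, 10.0), (0.30, -5.0)],
}

def expected_utility(outcomes):
    return sum(p * r for p, r in outcomes)

best = max(candidates, key=lambda a: expected_utility(candidates[a]))
print(best, expected_utility(candidates[best]))
```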

Posted by Tim Niemueller on October 5, 2012 13:24

BendIT - An Interactive Game with Two Robots

Below you can watch a video of the achievements of the Lab Course "Controlling Interactive Games and Robots" at the Knowledge-Based Systems Group of RWTH Aachen University in the winter term 2011/2012. The game was designed and implemented by the students of the lab course.

The human player uses torso movements to steer a Festo Robotino robot along a pre-defined course. Our domestic service robot Caesar acts as referee: it autonomously follows the Robotino and makes sure that it stays within a corridor along the path. If the player manages to keep the Robotino within the corridor for the whole path, they win. The game can be used, for example, to engage people in physical training, such as rehabilitation after an injury.
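A minimal sketch of the referee's corridor check (an illustration under assumed names; the actual implementation on Caesar is not shown here): the Robotino's position is compared against the pre-defined path, and the player loses as soon as the lateral deviation exceeds the corridor half-width.

```python
# Hypothetical sketch of the corridor check: distance from the Robotino's
# position to the pre-defined path, compared against the corridor half-width.
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to the line segment a-b (all 2D numpy arrays)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / max(np.dot(ab, ab), 1e-9), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def inside_corridor(robot_xy, path_xy, half_width=0.5):
    """True while the robot stays within half_width meters of the path."""
    p = np.asarray(robot_xy, dtype=float)
    path = [np.asarray(w, dtype=float) for w in path_xy]
    dist = min(point_to_segment(p, a, b) for a, b in zip(path, path[1:]))
    return dist <= half_width

# Example: a straight path segment and a robot 0.3 m off to the side.
print(inside_corridor((1.0, 0.3), [(0, 0), (2, 0), (2, 2)]))  # True
```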

Posted by Tim Niemueller on October 5, 2012 13:19