Blog

Latest entries for tag 'pcl'

Fawkes 1.0.0 released

Today we are happy to announce the release of Fawkes 1.0.0. This release, four years after the last major release (we still need to get better at this), contains a vast number of new software components and plugins, sub-systems, and overall system and framework improvements. You may also call this the tenth anniversary release (the first commit was made on September 5th, 2006). In the past years, Fawkes (or parts thereof) has been used especially for domestic service robots and industrial multi-robot logistics tasks.

In the following we present some additions and changes in 1.0.0 since 0.5.0.

Behavior Modeling
A crucial part of any robotics system is the component that models, reasons about, decides on, and eventually executes the intended behavior for the robot to accomplish its task. This version contains significant improvements in this area.
  • CLIPS-based agent framework: building on the integration of the CLIPS rule-based production system in the 0.5 release, we now include several plugins that allow for easy modeling and execution of agents as a rule-based action selection system.
  • OpenPRS: we have integrated and used the procedural reasoning system in the RoboCup Logistics League.
  • ECLiPSe CLP: this constraint logic programming system is used to power a Golog-based agent framework.
  • ROS-BE: we have backported advances made to the ROS version of the Lua-based Behavior Engine to Fawkes. Additionally, it allows for native access to the ROS universe using roslua.
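The rule-based action selection idea behind the CLIPS agent plugins can be sketched in a few lines. The facts, rule conditions, and action names below are purely illustrative and not the actual Fawkes/CLIPS API:

```python
# Generic sketch of rule-based action selection, in the spirit of the
# CLIPS-based agent plugins (names and structure are illustrative, not
# the actual Fawkes/CLIPS API).

def select_action(facts, rules):
    """Return the action of the first rule whose condition holds."""
    for condition, action in rules:
        if condition(facts):
            return action
    return None

# Hypothetical world state and rules for a logistics-style agent.
facts = {"holding": None, "at": "machine-1", "order-pending": True}

rules = [
    (lambda f: f["order-pending"] and f["holding"] is None, "pick-workpiece"),
    (lambda f: f["holding"] is not None, "deliver-workpiece"),
    (lambda f: True, "wait"),
]

print(select_action(facts, rules))  # -> pick-workpiece
```

A real CLIPS agent matches rules against a working memory of asserted facts and retracts or asserts facts as actions execute; the linear scan above only captures the selection idea.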
Topological Map Graph
A topological map graph can be useful for global path finding, for relating relevant points of interest (POI) from symbolic names to geometric coordinates, and for storing semantic information and properties. In Fawkes, the newly introduced navgraph component provides these capabilities. It is integrated with the behavior components and additionally provides constraint management (block nodes or edges, apply cost factors to edges), which can be used, for example, for path reservation. It features automatic navgraph generation given POIs, obstacles, and a cell-based map. A navgraph can be represented as a YAML file, and there is support for interactive manipulation through ROS rviz.
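As a rough illustration of what constraint-aware path search over such a graph involves, here is a minimal Dijkstra sketch with blocked edges and edge cost factors. Node names and the function signature are made up for illustration, not the actual navgraph API:

```python
import heapq

# Illustrative sketch of navgraph-style path search with edge constraints
# (blocked edges, cost factors); not the actual Fawkes navgraph interface.

def shortest_path(edges, start, goal, blocked=(), cost_factor=None):
    """Dijkstra over an undirected edge dict {(a, b): length}."""
    cost_factor = cost_factor or {}
    graph = {}
    for (a, b), length in edges.items():
        if (a, b) in blocked or (b, a) in blocked:
            continue  # constraint: edge blocked, e.g. reserved by another robot
        w = length * cost_factor.get((a, b), cost_factor.get((b, a), 1.0))
        graph.setdefault(a, []).append((b, w))
        graph.setdefault(b, []).append((a, w))
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
    return None

edges = {("dock", "hall"): 2.0, ("hall", "lab"): 3.0, ("dock", "lab"): 6.0}
print(shortest_path(edges, "dock", "lab"))                             # via hall
print(shortest_path(edges, "dock", "lab", blocked=[("hall", "lab")]))  # direct
```

Blocking an edge here simply removes it from the search graph, which is the effect a path-reservation constraint has on planning.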
Gazebo Simulation
The new Gazebo simulation system allows easy access to many types of sensors and full interaction with the Gazebo middleware. This integration is the basis for the simulation integration of the RoboCup Logistics League 2015 domain release.
New Hardware Drivers
Fawkes 1.0 supports several new hardware components, such as the Sick TiM 55x and 57x laser range finders, the Asus Xtion, the Point Grey Bumblebee2, the Cruizcore XG1100 IMU, and arbitrary Dynamixel servo chains. Based on our libkindrv library, the Jaco arm is fully supported, even in dual-arm configurations. The Robotino driver supports a direct-access communication mode that completely bypasses OpenRobotino (tested on Robotino 3). This allows access to sensor timestamps and better control. It also includes a simple acceleration controller which makes the Robotino drive much more smoothly.
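The acceleration controller mentioned above essentially limits how much the commanded velocity may change per control cycle. A minimal sketch, with illustrative parameters rather than the actual driver code:

```python
# Minimal sketch of an acceleration-limited velocity controller, in the
# spirit of the Robotino driver's smoother drive behaviour (the rate,
# acceleration limit, and target below are illustrative values).

def ramp_velocity(current, target, max_accel, dt):
    """Move `current` toward `target` by at most max_accel * dt per step."""
    step = max_accel * dt
    if target > current:
        return min(current + step, target)
    return max(current - step, target)

v = 0.0
for _ in range(5):  # five control cycles at a hypothetical 50 Hz
    v = ramp_velocity(v, 0.8, max_accel=2.0, dt=0.02)
print(round(v, 2))  # 0.2: velocity ramps up instead of jumping to 0.8
```

Sending such ramped setpoints instead of raw velocity commands avoids abrupt wheel-speed jumps, which is what produces the smoother driving.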
Other Items
Some of the other noteworthy changes include switching to YAML-based configuration files, switching to the tf2 transforms library, a plugin to recognize clusters in laser data, bash completion support, and support for IPv6 networking. Additionally, a full pipeline for storing, retrieving, and merging point clouds to and from a MongoDB database has been added. Interfaces are now packed more densely, which means external access libraries must be updated (e.g., jfawkes).
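One small but essential piece of such a point-cloud storage pipeline is retrieving the recording closest to a query timestamp. A toy sketch, with a plain Python list standing in for the MongoDB query (this is not the actual plugin code):

```python
import bisect

# Sketch of retrieving the stored recording nearest to a query timestamp;
# a sorted list stands in for a timestamp-indexed MongoDB collection.

def closest_recording(timestamps, query):
    """Return the stored timestamp nearest to `query` (timestamps sorted)."""
    i = bisect.bisect_left(timestamps, query)
    candidates = timestamps[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda t: abs(t - query))

stamps = [10.0, 10.5, 11.0, 12.5]
print(closest_recording(stamps, 10.7))  # -> 10.5
```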
There are many other fixes and performance improvements throughout the software.

Thank you to the contributors who made this release possible. The new release is available on the download page. The documentation is continuously extended and improved in our Trac Wiki.

Posted by Tim Niemueller on December 30, 2016 16:17

MongoDB Logging presented at ROSCon 2013

Last weekend some of us went to ROSCon 2013 to listen to all the great stuff happening in the ROS ecosystem and to present some of our own work.

The item which probably gained the most attention was the presentation and announcement of the MoveIt release. It's definitely something to look into and a platform to consider in the future for doing mobile manipulation. Another interesting presentation was on the Robot Web Tools. While the presentation leaned a bit more toward the fancy than the technical (probably a web people thing), it showed some very nice integration possibilities. Depending on how platform-agnostic it is, it could be re-used for Fawkes. Another thing to look into is tf2. It seems to finally be getting rolled out in ROS, making it worthwhile to consider it for integration again.

A particular thing that was talked much about was ROS 2.0, the magic next version which solves everything and makes all dreams come true -- sort of. It shows a lot of promise and the open process by which its design decisions should be made is appealing. Some of the things that were discussed indicate that ROS 2.0 might be much easier to integrate with other non-ROS software.

On our end, Ingo Lütkebohle (Bielefeld University) and Tim Niemueller (RWTH Aachen University) presented data recording and evaluation techniques. The MongoDB-based logging received quite some attention. It seems to be time to backport the advancements we made in the Fawkes logger version back to ROS. Additionally, the RobotMetaLogger was presented, which might benefit from supporting the MongoDB logger as an input source.

On-line Use of Recorded Data from Database

One particular example we presented is the use of data recording to remember point clouds and other data. The timestamps of certain observation points are recorded during a run. Later, the data and associated transforms are restored and the point clouds are merged to fill shadows and occlusions in the data, yielding a complete perception result. Below are some screenshots of the visualization documenting the process. You can find a video of the process in our recent video on deliberative active perception employing hybrid reasoning.
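The merge step relies on the transforms stored alongside each recording: every cloud is first brought into a common frame, and the transformed clouds are then concatenated. A 2D toy sketch of that idea (the actual pipeline uses PCL and full 3D transforms; points and poses below are made up):

```python
import math

# Illustrative merge step: point clouds recorded at different robot poses
# are transformed into a common world frame using the stored transforms,
# then concatenated. 2D only; not the actual PCL-based pipeline.

def to_world(points, pose_xy, yaw):
    """Transform (x, y) points from the robot frame to the world frame."""
    c, s = math.cos(yaw), math.sin(yaw)
    px, py = pose_xy
    return [(c * x - s * y + px, s * x + c * y + py) for x, y in points]

# Two hypothetical observations of the same scene point from opposite sides.
cloud_a = to_world([(1.0, 0.0)], pose_xy=(0.0, 0.0), yaw=0.0)
cloud_b = to_world([(1.0, 0.0)], pose_xy=(2.0, 0.0), yaw=math.pi)
merged = cloud_a + cloud_b
print(merged)  # both observations land at (1, 0), up to float rounding
```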

Posted by Tim Niemueller on May 16, 2013 13:36

Video about Hybrid Reasoning on the PR2

We have posted a video introducing the DFG Research Unit on Hybrid Reasoning in general, and the C1 Robotics sub-project in particular. We present a demo on the PR2 robot that will serve as a baseline system and testbed for further research. The C1 project is a joint effort of the Research Lab Autonomous Intelligent Systems, University of Freiburg, the Knowledge-Based Systems Group, RWTH Aachen University, and the Research Group Foundations of Artificial Intelligence, University of Freiburg.

The Hybrid Reasoning C1 Robotics project investigates effective and efficient ways to combine quantitative and qualitative aspects of knowledge representation and reasoning. For the video in particular, we implemented a baseline system for active perception. We want the robot to reason about its current beliefs and, if necessary, decide what to do to improve them.

The base system was built on ROS for the PR2 and the TidyUpRobot demo from Freiburg. The PDDL-defined planning domain was adapted for action planning in our active perception scenario. The perception system was based on Fawkes' tabletop-objects plugin and the generic robot database recording with MongoDB. The planner decided on a sequence of positions to move to, waited a short time at each position, and noted the timestamp there. The database recording ran all the time, storing in particular the transforms, images, and point clouds of the Kinect. After each position, the pcl-db-merge plugin was triggered by the planner as another action to merge and align the point clouds. The data for the recorded timestamps was retrieved from the database. The initial point cloud offset estimate was based on the robot's AMCL (Fawkes port) pose information at the respective recording time of each point cloud. The point clouds were then further aligned using pair-wise ICP (implemented using PCL). The perception pipeline itself was improved to detect cylinder-shaped objects such as certain cups. The pipeline was run on the merged point clouds, eventually leading to better results because occlusion shadows and incomplete object shapes had been resolved by taking data from multiple perspectives into account.
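The pair-wise alignment can be illustrated with a toy 2D version: given known point correspondences, find the least-squares rigid transform (rotation plus translation) mapping one cloud onto the other. Real ICP, as used via PCL, additionally estimates the correspondences iteratively; this sketch shows only the closed-form solve, with made-up points:

```python
import math

# Toy version of one point-cloud alignment step: with known point
# correspondences, find the rigid 2D transform mapping cloud b onto
# cloud a. Real ICP also estimates correspondences iteratively.

def align_2d(a, b):
    """Least-squares rotation/translation mapping points b onto points a."""
    n = len(a)
    ca = (sum(p[0] for p in a) / n, sum(p[1] for p in a) / n)  # centroid of a
    cb = (sum(p[0] for p in b) / n, sum(p[1] for p in b) / n)  # centroid of b
    sxx = sxy = 0.0
    for (ax, ay), (bx, by) in zip(a, b):
        dax, day = ax - ca[0], ay - ca[1]
        dbx, dby = bx - cb[0], by - cb[1]
        sxx += dbx * dax + dby * day    # dot part
        sxy += dbx * day - dby * dax    # cross part
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    tx = ca[0] - (c * cb[0] - s * cb[1])
    ty = ca[1] - (s * cb[0] + c * cb[1])
    return theta, (tx, ty)

a = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
b = [(1.0, 1.0), (1.0, 2.0), (0.0, 1.0)]  # a, rotated by 90 deg and shifted
theta, t = align_2d(a, b)
print(round(math.degrees(theta), 1))      # -90.0: the rotation is undone
```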

In the future, we want to integrate the system with Readylog to model the belief states and reason on their current quality.

The Fawkes related code is available in the timn/pcl-db-merge and abdon/tabletop-recognition branches. The planning code is in the hybris_c1-experimental branch of the alufr-ros-pkg repository.

Posted by Tim Niemueller on March 18, 2013 14:59

Fawkes 0.5.0 released

Today we are happy to announce the release of Fawkes 0.5.0. This release, two years after the last major release (we need to get better at this), contains a vast number of new software components and plugins, sub-systems, and overall system and framework improvements. For this release, the large majority of additions and changes have been made to functional components and plugins, rather than the core framework. This indicates that Fawkes has matured over the years and provides a solid base for robot software applications.

The new software components cover typical robot tasks like self-localization, (point cloud based) perception, robot arm motion planning, and integration with other software frameworks. Many of these components are possible because we integrated third-party robot software components and made them available within the Fawkes ecosystem. We have also added support for several common robot platforms like the Nao, the Robotino, and the Roomba. These robots can now be used easily out-of-the-box with Fawkes.

Here is a more detailed (yet still incomplete) list of additions and changes in Fawkes 0.5.0.

ROS Integration
This version integrates closely with ROS, the Robot Operating System. It can provide data acquired in Fawkes to ROS and vice versa, integrate ROS' move_base locomotion planner, and several plugins now use rviz to visualize their internal state.
OpenNI Integration
Fawkes can now use OpenNI to acquire RGB-D data from sensors like the Kinect, and make use of the provided hand and user tracking capabilities.
Point Cloud Processing
New tool support and plugins have been added to make use of the Point Cloud Library (PCL). For example, a plugin to analyse tabletop scenes has been added that identifies the position of a table in front of the robot and the objects on it.
OpenRAVE Manipulation Planning
An integration plugin for OpenRAVE has been added that allows plugins to use, for example, its motion planning capabilities. The Katana 5 DoF arm hardware plugin has been extended to make use of this new capability.
New Hardware Platforms
Fawkes can now work on robot platforms like the Nao, the Robotino, and the Roomba. The plugins integrate the robot's hardware capabilities and make it easily available to other plugins.
Self-localization
Fawkes now comes with an Adaptive Monte Carlo Localization plugin which has been ported from ROS. Using a known map and frequently taken 2D laser scans, it can determine the robot's position within the map.
Coordinate Frame Transforms Framework
Fawkes now includes a framework and library for easily calculating transforms for points in different coordinate frames. The system is based on and thus compatible with ROS' tf framework.
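The core idea of such a transforms framework is composing per-frame transforms into chains. A 2D toy illustration with homogeneous matrices (the real library works in 3D and handles time-stamped frames; the frame names and offsets are made up):

```python
import math

# Toy illustration of chained coordinate-frame transforms, the idea behind
# the tf-style framework: 3x3 homogeneous matrices for 2D poses.

def transform(x, y, yaw):
    """Homogeneous 2D transform: rotate by yaw, then translate by (x, y)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def compose(t1, t2):
    """Matrix product t1 * t2: apply t2 first, then t1."""
    return [[sum(t1[i][k] * t2[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Hypothetical frames: a laser mounted 0.2 m ahead of the robot base,
# and the base at (3, 1) in the map, facing +y.
map_to_base = transform(3.0, 1.0, math.pi / 2)
base_to_laser = transform(0.2, 0.0, 0.0)
map_to_laser = compose(map_to_base, base_to_laser)
print(round(map_to_laser[0][2], 2), round(map_to_laser[1][2], 2))  # 3.0 1.2
```

Chaining lets every component publish only the transform it knows (e.g., base to laser), while consumers query the composed result between any two frames.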
RRD graphing
A new plugin provides RRD graphing capabilities for plugins. For example, performance graphs for the MongoDB Logging project have been created with this framework.
CLIPS Expert System Integration
The CLIPS rule engine for building Expert Systems has been integrated into Fawkes. Plugins can now easily acquire a CLIPS environment and start using it. For example, the Carologistics team has used this to create a reasoning agent to participate in the RoboCup Logistics League sponsored by Festo.

Thank you to the contributors who made this release possible. The new release is available on the download page. The documentation is continuously extended and improved in our Trac Wiki.

Posted by Tim Niemueller on September 27, 2012 12:43