RoboCup@Home | RoboCup German Open 2019

The CRAM team took part in the RoboCup@Home league at the RoboCup German Open 2019! Within the master project Suturo, three people worked with CRAM on the HSR robot. We are happy to announce that CRAM as a framework was the cornerstone for solving the robot's tasks. We used CRAM to write general plans, so the robot managed to enter the arena on its own and perform pick and place tasks. Of course, we also cooperated with other frameworks developed at the institute where we work.

Participating in a RoboCup was an exciting and rewarding experience for all of us. Furthermore, it showed the students once again why it is so important to incorporate failure handling and to test their system for all eventualities. We look forward to the future and hope to bring CRAM to RoboCup a second time.

2019/06/12 13:31 · Vanessa Hassouna

CRAM Logo is out

This logo went through quite a process before it was finished. At first it was too detailed, even overloaded. After all, a CRAM logo needs everything: a camera, some tires, a lot of grippers and much more (because CRAM can do so much).

However, we had to reduce it to the point where it is abstract but still recognizable. We are very satisfied with the result: the logo represents the manipulation of itself and of the environment at the same time. We hope you enjoy it too!

2019/02/05 16:27 · Vanessa Hassouna

CRAM v0.7.0

Four contributors have been working hard to bring you this new version of CRAM.

Here is a summary of the main new features:

  • We have worked hard on implementing environment manipulation and the first prototype is included in v0.7.0!
    • Currently, the PR2 can perform opening and closing actions on any drawer in the kitchen (see the first sketch after this list).
    • Support for rotational joints is not finished yet. Coming up in the next version, stay tuned!
    • A location costmap for positioning the robot base such that manipulation is possible is provided (the right arm does not always produce good costmaps, so stick to the left arm for now).
    • Plans for high-level actions of accessing and sealing containers are implemented: collision and IK checks are done in projection prior to execution.
    • New types of failures are included specifically for environment manipulation.
    • Objects can be attached to the environment so that they follow drawers that are being manipulated. Once the robot grasps such an object, the attachment is destroyed, as expected.
    • The environment is now loaded as a URDF; semantic map support is there but not used by default. To enable collision checks between the robot and the environment, Bullet engine collision flags are set for both.
    • The environment meshes are represented in the *.obj file format and are loaded through Assimp into Bullet as compound shape objects, to circumvent the convex hull collision algorithm that Bullet uses by default.
    • An update is published on a ROS topic whenever the environment representation changes, so that it can be visualized, e.g., in RViz.
  • Some new location costmaps have been implemented and a lot of the old ones have been refactored and nicified.
    • ON and IN relations are now implemented using URDF axis-aligned bounding boxes and the ON and IN keys require an object designator as a value.
    • RANGE, RANGE-INVERSE and SIDE have been implemented for spatial relation costmaps to specify parts of the costmap region.
    • LEFT-OF, RIGHT-OF, IN-FRONT-OF, BEHIND, FAR, NEAR have been refactored to work with and without supporting planes and are more robust now (see the second sketch after this list).
  • GiskardPy is now used as the manipulation controller, with support for collision avoidance:
    • Different phases of PICKING-UP and PLACING actions have different collision flags.
    • The environment representation of GiskardPy is updated through the usual CRAM event system.
  • Grasping interfaces and their implementation have been reworked and further improved, and some bugs have been fixed (although some still remain :) ).
  • By default, installing KnowRob is no longer required for using the core CRAM packages and projection with the PR2 robot.
    • Since ROS Java causes numerous installation problems, loosening the dependency on KnowRob should simplify the setup process for ROS beginners.
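
To give a flavor of the new feature, here is a minimal sketch of what opening a drawer could look like in projection (the first sketch referenced above). It follows the usual CRAM designator conventions, but the package prefixes, the URDF link name and the DISTANCE value are illustrative assumptions, so adapt them to your own setup:

  ;; Open a drawer of the kitchen URDF in the Bullet projection
  ;; environment. SINK-AREA-LEFT-UPPER-DRAWER-MAIN is a hypothetical
  ;; link name; use one from your own environment URDF.
  (pr2-proj:with-simulated-robot
    (exe:perform
     (desig:an action
               (type opening)
               (arm left)
               (object (desig:an object
                                 (type drawer)
                                 (urdf-name sink-area-left-upper-drawer-main)
                                 (part-of kitchen)))
               (distance 0.3))))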

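The second sketch shows how the reworked location costmaps could be queried through location designators. The object types and URDF names are again hypothetical placeholders:

  ;; ON with an object designator as value, restricted to one SIDE.
  (desig:a location
           (on (desig:an object
                         (type counter-top)
                         (urdf-name kitchen-island-surface)
                         (part-of kitchen)))
           (side left)
           (for (desig:an object (type bowl))))

  ;; A location described through the refactored relational keys.
  (desig:a location
           (left-of (desig:an object (type bowl)))
           (near (desig:an object (type bowl))))
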
Please do take a look at the detailed ChangeLog to get familiar with the other smaller but nonetheless important changes.

2018/08/14 19:36 · Gayane Kazhoyan

CRAM Plan Transformation

Within the scope of his bachelor's thesis, Arthur Niedzwiecki researched the topic of plan transformation in CRAM. He implemented new transformation rules for pick and place scenarios and developed a plan transformation pipeline. The video explains three possible transformations that are applied to the plans used on our PR2 (and Boxy) robots, in projection as well as in the real world. By analyzing the task tree of an executed CRAM plan, improvable patterns are found and altered to generate a better-performing plan.
The code is currently under review and will be merged soon, so keep an eye open for the new cram_plan_transformation package!

2018/05/18 16:28 · Arthur Niedzwiecki

CRAM v0.6.0

Yet another version is out, yay!

  • New feature: using plan projection for adapting a plan to its environment (see the first sketch after this list):
    • changed PERFORM from a CRAM function into a goal, so that the execution trace tools can be used
    • moved CET:*EPISODE-KNOWLEDGE* projection var declaration into pr2-proj package
    • TASK-TREE-NODE now has an optional NODE param
    • added predicates and testing functions for traversing task tree
    • AT, DURING and THROUGHOUT now come from the CET package
    • implemented finding best projection round according to driving distance
    • fixed (HOLDS ?TIMELINE ?OCC …)
    • LOC occasion now returns a designator with pose-stamped
    • made relevant desigs of fetch and deliver plans into explicit args
    • implemented WITH-PROJECTED-TASK-TREE
    • projection reasoning works on the real robot
  • Fetch and deliver plans are now cleaner and nicer:
    • GRASP is now an attribute of the PICKING-UP action desig (see the second sketch after this list)
    • moved SEARCHING out of FETCHING into TRANSPORTING
    • world is now simulated after spawning a perceived obj
    • in the demo, placing locations are now more easily reachable
    • ROBOT-STATE-CHANGED is asserted even if navigation action failed
    • stopped using CAD model fitter for perception for CUP and BOWL
    • added second lift pose to GET-OBJECT-GRASPING-POSES.
  • Various smaller new features:
    • implemented broadcasting of TF from projection (robot + items)
    • added *SPAWN-DEBUG-WINDOW* parameters for belief state setup
    • added virtual links of the robot's URDF into projection TF tree.
  • Bugfixes and small improvements:
    • allow asking CURRENT-DESIG also on NULL objects
    • fixed PROJECTION-RUNNING predicate
    • WITH-REAL-ROBOT now creates a named top level
    • RS now only supports input parameters such as TYPE and CAD-MODEL
    • added null pointer guards in CPL constructs
    • bugfix: attached objects are now ignored when checking for collisions
    • bugfix: the IK solver only works if every arm movement is asserted into TF
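
As a first sketch, this is roughly how a plan can be run inside the Bullet projection environment that the new projection reasoning builds on. The package names follow the CRAM tutorials and are assumptions on our part, so adapt them to your installation:

  ;; Run a plan in the simulated world instead of on the real robot.
  (proj:with-projection-environment
      pr2-proj::pr2-bullet-projection-environment
    (cpl:top-level
      (exe:perform
       (desig:an action
                 (type detecting)
                 (object (desig:an object (type bottle)))))))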

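As a second sketch, a PICKING-UP action designator with the new GRASP attribute might look as follows. The available grasp values (e.g., FRONT or TOP) depend on the object knowledge definitions, and ?object is assumed to come from a preceding DETECTING action:

  ;; Pick up a previously perceived object with a front grasp.
  (exe:perform
   (desig:an action
             (type picking-up)
             (arm right)
             (grasp front)
             (object ?object)))
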
Please see the detailed changelog on GitHub.

2018/04/11 19:28 · Gayane Kazhoyan
