POMDP Planning Under Object Composition Uncertainty: Application to Robotic Manipulation

Research output: Contribution to journal › Article › Scientific › peer-review



Manipulating unknown objects in a cluttered environment is difficult because segmentation of the scene into objects, that is, the object composition, is uncertain. Due to this uncertainty, prior work has either identified the "best" object composition and chosen manipulation actions accordingly, or greedily gathered information about the "best" object composition. We instead, first, use different possible object compositions in planning; second, utilize object composition information provided by robot actions; and third, consider the effect of competing object hypotheses on the desired task. We cast the manipulation planning problem as a partially observable Markov decision process (POMDP) that plans over possible object composition hypotheses. The POMDP chooses the action that maximizes long-term expected task-specific utility and, in doing so, accounts for informative actions and for the effect of different object hypotheses on succeeding in the task. In simulation and physical robot experiments, the probabilistic approach outperforms using only the most likely object composition, and long-term planning outperforms greedy decision making.
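The core idea of the abstract, planning over competing object-composition hypotheses rather than committing to the single most likely one, can be illustrated with a toy sketch. This is not the authors' implementation; the hypotheses ("one_object" vs. "two_objects"), the observation model for a push action, and the grasp rewards are all invented for illustration. The sketch shows a one-step-lookahead POMDP-style decision: compare the expected utility of grasping now under the current belief against first taking an informative push and grasping under the updated belief.

```python
# Two competing object-composition hypotheses for two adjacent segments
# in a cluttered scene (hypothetical names and numbers throughout).
HYPOTHESES = ("one_object", "two_objects")

# Assumed observation model: probability of each observation when pushing,
# conditioned on the true object composition.
OBS_MODEL = {
    ("push", "one_object"): {"resists": 0.9, "separates": 0.1},
    ("push", "two_objects"): {"resists": 0.2, "separates": 0.8},
}

# Assumed task rewards: grasping both segments jointly succeeds only if
# they really are one object, and vice versa for a single-segment grasp.
REWARD = {
    ("grasp_joint", "one_object"): 1.0,
    ("grasp_joint", "two_objects"): -1.0,
    ("grasp_single", "one_object"): -1.0,
    ("grasp_single", "two_objects"): 1.0,
}


def bayes_update(belief, action, obs):
    """Posterior over composition hypotheses after observing `obs` from `action`."""
    post = {h: belief[h] * OBS_MODEL[(action, h)][obs] for h in belief}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}


def expected_reward(belief, grasp):
    """Expected task reward of a grasp, averaged over the belief."""
    return sum(belief[h] * REWARD[(grasp, h)] for h in belief)


def best_grasp_value(belief):
    return max(expected_reward(belief, g) for g in ("grasp_joint", "grasp_single"))


def plan(belief, push_cost=0.05):
    """One-step lookahead: grasp now, or take an informative push first?

    The push's value is the expectation, over possible observations, of the
    best grasp value under the updated belief, minus the push's cost.
    """
    act_now = best_grasp_value(belief)
    push_value = -push_cost
    for obs in ("resists", "separates"):
        p_obs = sum(belief[h] * OBS_MODEL[("push", h)][obs] for h in belief)
        push_value += p_obs * best_grasp_value(bayes_update(belief, "push", obs))
    return ("push", push_value) if push_value > act_now else ("grasp", act_now)


belief = {"one_object": 0.5, "two_objects": 0.5}
action, value = plan(belief)  # with a uniform belief, pushing is worth it
```

Under the uniform belief every immediate grasp has expected reward 0, while the push is worth 0.65 despite its cost, so the planner chooses the informative action; with a sufficiently peaked belief, `plan` would grasp immediately instead. The actual paper plans over longer horizons with a POMDP solver rather than this single-step lookahead.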

Original language: English
Pages (from-to): 1-16
Number of pages: 16
Journal: IEEE Transactions on Robotics
Early online date: 20 Jul 2022
Publication status: E-pub ahead of print - 20 Jul 2022
MoE publication type: A1 Journal article-refereed


  • Grasping
  • Image color analysis
  • Image segmentation
  • partially observable Markov decision process (POMDP)
  • Planning
  • Probability distribution
  • robotic manipulation
  • Robots
  • Task analysis
  • task planning
  • Uncertainty


