Dual-Arm Robotic Transformer Models
Title: Dual-Arm Robotic Transformer Models
DNr: Berzelius-2024-446
Project Type: LiU Berzelius
Principal Investigator: Volker Krueger <volker.krueger@cs.lth.se>
Affiliation: Lunds universitet
Duration: 2024-11-29 – 2025-06-01
Classification: 10207
Keywords:

Abstract

Human-centric environments are generally designed for people with two arms; even simple tasks such as carrying a tray or playing with Lego are very challenging with only one arm. Interestingly, most industrial robots, with notable exceptions such as the ABB YuMi or Baxter, have only one arm. Programming them not only requires special tricks such as fixtures and custom-made grippers, but some tasks, e.g. handling flexible materials, are nearly impossible with a single arm, creating a strong need to explore dual-arm skills. At the same time, data-driven methods have moved to center stage in robotic manipulation, demonstrating the potential for flexible handling of a wide range of skills using rich observations (images, language, depth, etc.) across different scenarios (household, in the wild, etc.), and these methods can naturally be applied to dual-arm tasks. However, existing data-driven models each face different kinds of challenges. The objective of this project is therefore to explore and deploy current state-of-the-art deep-learning models for dual-arm manipulation and to seek ways to improve or build upon them.