Self-generated dataset for category and pose estimation of deformable object


Saved in:
Bibliographic Details
Main Authors: Hou Y.C., Sahari K.S.M.
Format: Article
Published: Atlantis Press 2023
Description
Summary: This work considers the problem of garment handling by a general household robot, focusing on the classification and pose estimation of a hanging garment during an unfolding procedure. Classification and pose estimation of deformable objects such as garments are considered a challenging problem in autonomous robotic manipulation because these objects come in different sizes and can deform into different poses while being manipulated. Hence, we propose a self-generated synthetic dataset for classifying the category and estimating the pose of a garment using a single manipulator. We present an approach to this problem by first fitting a garment mesh model, built with particle-based modeling, to a piece of garment crudely spread out on a flat platform; parameters such as landmarks and robotic grasping points can then be estimated from the garment mesh model. Next, the spread-out garment is picked up by a single robotic manipulator and the 2D garment mesh model is simulated in a 3D virtual environment. A dataset of hanging garments is generated by capturing depth images of the real garment at the robotic platform and images of the garment mesh model from offline simulation. The synthetic dataset collected from simulation showed that the approach performed well and is applicable to a variety of similar garments. Thus, category and pose recognition of the garment can be developed further. © 2019 The Authors. Published by Atlantis Press SARL.
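The particle-based garment modeling and hanging simulation described in the summary can be illustrated with a minimal mass-spring sketch: a grid of particles connected by distance constraints, with a single pinned particle standing in for the manipulator's grasp point, integrated with Verlet steps under gravity. This is an illustrative assumption of how such a simulation might look, not the authors' implementation; all names and parameters (`make_cloth`, spacing, iteration counts) are hypothetical, and shear/bend springs and collision handling are omitted.

```python
# Hedged sketch of a particle-based garment model hanging from a single
# grasp point. Not the paper's code; a generic mass-spring/Verlet scheme.
import math

class Particle:
    def __init__(self, x, y, z, pinned=False):
        self.p = [x, y, z]      # current position
        self.prev = [x, y, z]   # previous position (for Verlet integration)
        self.pinned = pinned    # pinned particles emulate the robot's grasp

def make_cloth(rows, cols, spacing=0.05):
    """Flat rows x cols particle grid (the 'spread-out' garment) with
    structural springs; one corner is pinned as the single grasp point."""
    parts = [[Particle(c * spacing, r * spacing, 0.0,
                       pinned=(r == 0 and c == 0))
              for c in range(cols)] for r in range(rows)]
    springs = []  # (particle_a, particle_b, rest_length)
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                springs.append((parts[r][c], parts[r][c + 1], spacing))
            if r + 1 < rows:
                springs.append((parts[r][c], parts[r + 1][c], spacing))
    return parts, springs

def step(parts, springs, dt=0.01, gravity=-9.81, iters=10):
    """One simulation step: Verlet integration under gravity along -z,
    then a few relaxation passes over the distance constraints."""
    for row in parts:
        for pt in row:
            if pt.pinned:
                continue
            x, y, z = pt.p
            px, py, pz = pt.prev
            pt.prev = [x, y, z]
            pt.p = [2 * x - px, 2 * y - py,
                    2 * z - pz + gravity * dt * dt]
    for _ in range(iters):
        for a, b, rest in springs:
            d = [b.p[i] - a.p[i] for i in range(3)]
            dist = math.sqrt(sum(v * v for v in d)) or 1e-9
            corr = (dist - rest) / dist / 2.0
            for i in range(3):
                if not a.pinned:
                    a.p[i] += d[i] * corr
                if not b.pinned:
                    b.p[i] -= d[i] * corr

if __name__ == "__main__":
    parts, springs = make_cloth(8, 8)
    for _ in range(200):
        step(parts, springs)
    # the free corner hangs below the pinned grasp point
    print(parts[7][7].p[2] < parts[0][0].p[2])
```

After a few hundred steps the unpinned particles fall and settle into a hanging configuration below the grasp point; the resulting particle positions are the kind of geometry from which depth images of the simulated garment mesh could be rendered.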