The recent demographic trend across developed nations shows a dramatic increase in the aging population, falling fertility rates, and a shortage of caregivers. Hence, the demand for service robots to assist with dressing, an essential Activity of Daily Living (ADL), is increasing rapidly. Robotic clothing assistance is a challenging task since the robot has to deal with two demanding tasks simultaneously: (a) manipulation of non-rigid, highly flexible cloth and (b) safe human–robot interaction while assisting humans whose posture may vary during the task. Humans, on the other hand, handle these tasks with relative ease. In this paper, we propose a framework for robotic clothing assistance by imitation learning from a human demonstration to a compliant dual-arm robot. In this framework, we divide the dressing task into three phases: the reaching phase, the arm dressing phase, and the body dressing phase. We model the arm dressing phase as a global trajectory modification using Dynamic Movement Primitives (DMPs), and the body dressing phase as a local trajectory modification using a Bayesian Gaussian Process Latent Variable Model (BGPLVM). We show that the proposed framework, developed to assist the elderly, generalizes to various people and successfully performs a sleeveless-shirt dressing task. We also present participant feedback from a public demonstration at the International Robot Exhibition (iREX) 2017. To our knowledge, this is the first work to perform full dressing of a sleeveless shirt on a human subject with a humanoid robot.
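To make the DMP-based trajectory modification concrete, the sketch below shows a minimal one-dimensional discrete DMP in the standard Ijspeert-style formulation: a forcing term is learned from a single demonstration by locally weighted regression, and changing the goal at rollout time globally reshapes the trajectory while preserving its overall form. This is an illustrative sketch only; the class, parameter values, and basis-function heuristics here are generic assumptions, not the specific implementation used in the paper.

```python
import numpy as np

class DMP1D:
    # Minimal discrete Dynamic Movement Primitive (illustrative sketch;
    # gains and basis-width heuristic are generic choices, not the paper's).
    def __init__(self, n_basis=30, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n_basis, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        # Gaussian basis centres spaced evenly in phase x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.h = n_basis / self.c  # widths grow as the phase decays

    def _psi(self, x):
        # basis activations at phase value(s) x
        return np.exp(-self.h * (np.atleast_1d(x)[..., None] - self.c) ** 2)

    def fit(self, demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        self.y0, self.g = demo[0], demo[-1]
        self.tau = (len(demo) - 1) * dt
        t = np.arange(len(demo)) * dt
        x = np.exp(-self.alpha_x * t / self.tau)       # canonical system
        yd = np.gradient(demo, dt)
        ydd = np.gradient(yd, dt)
        # invert the transformation system to get the target forcing term
        f_target = self.tau**2 * ydd - self.alpha * (self.beta * (self.g - demo) - self.tau * yd)
        Psi = self._psi(x)                              # (T, n_basis)
        s = x * (self.g - self.y0)                      # forcing-term scale
        # locally weighted regression: one weight per basis function
        self.w = (Psi * (s * f_target)[:, None]).sum(0) / ((Psi * (s**2)[:, None]).sum(0) + 1e-10)
        return self

    def rollout(self, dt, goal=None):
        """Integrate the DMP; a new goal globally modifies the trajectory."""
        g = self.g if goal is None else goal
        y, yd, x = self.y0, 0.0, 1.0
        traj = [y]
        for _ in range(int(round(self.tau / dt))):
            psi = self._psi(x)[0]
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            ydd = (self.alpha * (self.beta * (g - y) - self.tau * yd) + f) / self.tau**2
            yd += ydd * dt
            y += yd * dt
            x += (-self.alpha_x * x / self.tau) * dt
            traj.append(y)
        return np.array(traj)
```

For example, fitting the DMP to a minimum-jerk reach from 0 to 1 and then rolling it out with `goal=2.0` yields a smoothly stretched trajectory that converges near the new goal, which is the global-modification property exploited in the arm dressing phase.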


Video Demonstration: