School of Mechanical Engineering Seminar
Wednesday, August 7, 2024 at 14:00
Wolfson Building of Mechanical Engineering, Room 206
Learning In-Hand Perception and Manipulation with Adaptive Robotic Hands
Osher Azulay
Advisor: Dr. Avishai Sintov
Robots require efficient interaction with a large variety of objects across environments, from industrial applications to domestic tasks. Such interaction involves grasping and dexterously manipulating objects to complete various tasks. While manipulating an object within the hand is a fundamental skill for humans, it remains challenging for robots. To cope with the diversity of objects and manipulation tasks in the real world, robots must understand their environment and quickly infer the physical properties of objects.
Despite the impressive accuracy of industrial robotic hands, their complexity, fragility, high cost, and control challenges remain significant obstacles. The development of affordable, robust, and adaptable robot hardware has created opportunities to dramatically enhance robot autonomy, but such hardware must cope with the constant variability and uncertainty of real-world environments. Recently, underactuated hands, which are simpler and more flexible, have emerged as a promising alternative; nevertheless, their inherent uncertainties make them difficult to model.
This research aims to advance robot manipulation skills by combining adaptive, underactuated hands with advanced learning algorithms. We have addressed several key aspects necessary for their efficient perception and adaptability: developing tactile fingers to establish the touch modality, optimizing the synergy between perception and action, and creating the algorithmic foundation for planning and control. Initially, we utilized haptic sensing for precise object pose estimation and manipulation, addressing the limitations of visual feedback in occluded environments. Subsequently, we developed the concept of haptic glances and applied reinforcement learning to achieve high accuracy in insertion tasks under spatial uncertainty. To further enhance perception, we introduced AllSight, a 3D-printed tactile sensor designed for robotic in-hand manipulation. AllSight offers high-resolution contact state estimation, including position, forces, and torsion, and demonstrates zero-shot learning capabilities, making it a low-cost and effective solution for advanced manipulation tasks. Additionally, we proposed SightGAN, a bi-directional GAN that generates real-like synthetic images from simulated data, bridging the sim-to-real gap and improving model training for robotic tasks.
Our current focus is on integrating these advances into learnable tactile-based policies that enable robots to better understand external contacts and adapt to different tasks. By tightly coupling perception and action, we aim to allow the hand, despite its compliant nature, to perform complex tasks through self-supervision, with minimal human effort and rapid adaptation to new situations. This research lays the algorithmic foundation for robot sensing, applicable to a variety of robotic systems for overcoming uncertainty, making high-capability robots more accessible.