Department Seminar of Osher Azulay - Learning In-Hand Perception and Manipulation with Adaptive Robotic Hands

07 August 2024, 14:00 - 15:00 
Faculty of Engineering

School of Mechanical Engineering Seminar
Wednesday, August 7, 2024 at 14:00
Wolfson Building of Mechanical Engineering, Room 206

Learning In-Hand Perception and Manipulation with Adaptive Robotic Hands

Osher Azulay

Advisor: Dr. Avishai Sintov

Robots must interact efficiently with a large variety of objects across different environments, from industrial settings to domestic ones. Such interaction involves grasping and dexterously manipulating objects to complete various tasks. While manipulating an object within the hand comes naturally to humans, it remains challenging for robots. To cope with the diversity of objects and in-hand manipulation tasks in the real world, robots need to understand their environment and quickly infer the physical properties of the objects they handle.

Despite the impressive accuracy of industrial robotic hands, their complexity, fragility, high cost, and demanding control remain significant obstacles. The development of affordable, robust, and adaptable robot hardware has therefore created opportunities to dramatically enhance robot autonomy, though such hardware must still cope with the constant variability and uncertainty of real-world environments. Underactuated hands, which are simpler and more flexible, have recently emerged as a promising alternative; however, their inherent uncertainties make them difficult to model.

This research aims to advance robot manipulation skills using adaptive, underactuated hands and advanced learning algorithms. We have addressed several key aspects necessary for efficient perception and adaptability: developing tactile fingers to establish the touch modality, optimizing the synergy between perception and action, and creating the algorithmic foundation for planning and control. Initially, we utilized haptic sensing for precise object pose estimation and manipulation, addressing the limitations of visual feedback in occluded environments. Subsequently, we developed the concept of haptic glances and applied reinforcement learning to achieve high accuracy in insertion tasks under spatial uncertainty.

To further enhance perception, we introduced AllSight, a 3D-printed tactile sensor designed for robotic in-hand manipulation. AllSight offers high-resolution contact state estimation, including contact position, forces, and torsion, and demonstrates zero-shot learning capabilities, making it a low-cost and effective solution for advanced manipulation tasks. Additionally, we proposed SightGAN, a bi-directional GAN that generates realistic synthetic images from simulated data, bridging the sim-to-real gap and improving model training for robotic tasks.
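As a rough illustration of the contact state estimation described above, the sketch below regresses contact position, force, and torsion from a single tactile image with a small convolutional network. The architecture, input resolution, and the 7-dimensional output layout are assumptions made for illustration only, not the actual AllSight implementation.

```python
import torch
import torch.nn as nn

class ContactStateNet(nn.Module):
    """Hypothetical CNN that maps one RGB tactile image to a contact
    state vector: 3D contact position, 3D force, and torsion (7 values).
    Illustrative sketch only; not the AllSight codebase."""
    def __init__(self, out_dim: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool spatial dims to 1x1
        )
        self.head = nn.Linear(64, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Example: a batch of 224x224 tactile images -> contact state vectors.
model = ContactStateNet()
images = torch.rand(8, 3, 224, 224)
contact_state = model(images)  # shape: (8, 7)
```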

 

Figure 1: Three AllSight sensors on the fingers of an adaptive robotic hand. The sensors provide real-time tactile images for contact state estimation during the manipulation of an object.

Our current focus is on integrating these advancements into learnable tactile-based policies that enable robots to better understand external contacts and adapt to different tasks. By tightly coupling perception and action, we aim to allow the hand, despite its compliant nature, to perform complex tasks through self-supervision, with minimal human effort and rapid adaptation to new situations. This research lays the algorithmic foundation for robot sensing, applicable to a variety of robotic systems facing uncertainty, making high-capability robots more accessible.
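In its simplest form, a tactile-based policy of the kind described above might fuse tactile features with the hand's proprioception to produce actuator commands. The sketch below is a hypothetical actor network under that assumption; the feature dimensions, network widths, and action space are illustrative and not taken from this work.

```python
import torch
import torch.nn as nn

class TactilePolicy(nn.Module):
    """Hypothetical actor for an underactuated hand: fuses a tactile
    feature vector (e.g., from a pretrained tactile encoder) with joint
    proprioception and outputs normalized actuator commands."""
    def __init__(self, tactile_dim: int = 64, proprio_dim: int = 8,
                 action_dim: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(tactile_dim + proprio_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, action_dim), nn.Tanh(),  # commands in [-1, 1]
        )

    def forward(self, tactile: torch.Tensor,
                proprio: torch.Tensor) -> torch.Tensor:
        # Concatenate tactile and joint-state features, then map to actions.
        return self.net(torch.cat([tactile, proprio], dim=-1))

# Example: one control step from current tactile features and joint state.
policy = TactilePolicy()
action = policy(torch.rand(1, 64), torch.rand(1, 8))  # shape: (1, 4)
```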
