Tactile sensing in dexterous robot hands

Z Kappassov, JA Corrales, V Perdereau - Robotics and Autonomous …, 2015 - Elsevier
Tactile sensing is an essential element of autonomous dexterous robot hand manipulation. It
provides information about forces of interaction and surface properties at points of contact …

A review of tactile information: Perception and action through touch

Q Li, O Kroemer, Z Su, FF Veiga… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Tactile sensing is a key sensor modality for robots interacting with their surroundings. These
sensors provide a rich and diverse set of data signals that contain detailed information …

Learning the signatures of the human grasp using a scalable tactile glove

S Sundaram, P Kellnhofer, Y Li, JY Zhu, A Torralba… - Nature, 2019 - nature.com
Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material
properties while applying the right amount of force—a challenging set of tasks for a modern …

Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition

G Li, S Liu, L Wang, R Zhu - Science Robotics, 2020 - science.org
Robot hands with tactile perception can improve the safety of object manipulation and also
improve the accuracy of object identification. Here, we report the integration of quadruple …

A hierarchically patterned, bioinspired e-skin able to detect the direction of applied pressure for robotics

CM Boutry, M Negre, M Jorda, O Vardoulis… - Science Robotics, 2018 - science.org
Tactile sensing is required for the dexterous manipulation of objects in robotic applications.
In particular, the ability to measure and distinguish in real time normal and shear forces is …

Data-driven grasp synthesis—a survey

J Bohg, A Morales, T Asfour… - IEEE Transactions on …, 2013 - ieeexplore.ieee.org
We review the work on data-driven grasp synthesis and the methodologies for sampling and
ranking candidate grasps. We divide the approaches into three groups based on whether …

Making sense of vision and touch: Self-supervised learning of multimodal representations for contact-rich tasks

MA Lee, Y Zhu, K Srinivasan, P Shah… - … on robotics and …, 2019 - ieeexplore.ieee.org
Contact-rich manipulation tasks in unstructured environments often require both haptic and
visual feedback. However, it is non-trivial to manually design a robot controller that …

More than a feeling: Learning to grasp and regrasp using vision and touch

R Calandra, A Owens, D Jayaraman… - IEEE Robotics and …, 2018 - ieeexplore.ieee.org
For humans, the process of grasping an object relies heavily on rich tactile feedback. Most
recent robotic grasping work, however, has been based only on visual input, and thus …

Making sense of vision and touch: Learning multimodal representations for contact-rich tasks

MA Lee, Y Zhu, P Zachares, M Tan… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
Contact-rich manipulation tasks in unstructured environments often require both haptic and
visual feedback. It is nontrivial to manually design a robot controller that combines these …

Integrated task and motion planning in belief space

LP Kaelbling, T Lozano-Pérez - The International Journal of …, 2013 - journals.sagepub.com
We describe an integrated strategy for planning, perception, state estimation and action in
complex mobile manipulation domains based on planning in the belief space of probability …