Tactile sensing in dexterous robot hands
Tactile sensing is an essential element of autonomous dexterous robot hand manipulation. It
provides information about forces of interaction and surface properties at points of contact …
A review of tactile information: Perception and action through touch
Tactile sensing is a key sensor modality for robots interacting with their surroundings. These
sensors provide a rich and diverse set of data signals that contain detailed information …
Learning the signatures of the human grasp using a scalable tactile glove
Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material
properties while applying the right amount of force—a challenging set of tasks for a modern …
Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition
G Li, S Liu, L Wang, R Zhu - Science Robotics, 2020 - science.org
Robot hands with tactile perception can improve the safety of object manipulation and also
improve the accuracy of object identification. Here, we report the integration of quadruple …
A hierarchically patterned, bioinspired e-skin able to detect the direction of applied pressure for robotics
Tactile sensing is required for the dexterous manipulation of objects in robotic applications.
In particular, the ability to measure and distinguish in real time normal and shear forces is …
Data-driven grasp synthesis—a survey
We review the work on data-driven grasp synthesis and the methodologies for sampling and
ranking candidate grasps. We divide the approaches into three groups based on whether …
Making sense of vision and touch: Self-supervised learning of multimodal representations for contact-rich tasks
Contact-rich manipulation tasks in unstructured environments often require both haptic and
visual feedback. However, it is non-trivial to manually design a robot controller that …
More than a feeling: Learning to grasp and regrasp using vision and touch
For humans, the process of grasping an object relies heavily on rich tactile feedback. Most
recent robotic grasping work, however, has been based only on visual input, and thus …
Making sense of vision and touch: Learning multimodal representations for contact-rich tasks
Contact-rich manipulation tasks in unstructured environments often require both haptic and
visual feedback. It is nontrivial to manually design a robot controller that combines these …
Integrated task and motion planning in belief space
We describe an integrated strategy for planning, perception, state estimation and action in
complex mobile manipulation domains based on planning in the belief space of probability …