Title: High-resolution touch sensing.
Marr said that vision tells you "what is where by looking." By analogy, tactile sensing tells you "what is where by feeling." As with vision, the mapping between world properties and sensory data is complex, and so the inverse problem, inferring world properties from sensory data, is also complex. Until recently, artificial touch sensors provided such crude information that it was hard to infer much from their data. However, with the advent of camera-based sensors such as GelSight, we now have a plethora of high-dimensional data, including both normal and tangential force and displacement at high resolution. Deep learning helps us make sense of this data stream. While many people think of local pressure as the basic measurement underlying touch, it is often better to think of touch processing in terms of geometrical measurements. The spatio-temporal patterns of skin deformation can help identify objects and their material properties, such as hardness, roughness, and slipperiness. They can also convey accurate object pose within the gripper.