Toward Artificial Palpation
Representation Learning of Touch on Soft Bodies

Technion - Israel Institute of Technology

A proof-of-concept artificial palpation system for breast cancer detection using tactile sensors.
The red dot marks the current tactile sensor position.

Abstract

Palpation, the use of touch in medical examination, is almost exclusively performed by humans. We investigate a proof of concept for an artificial palpation method based on self-supervised learning. Our key idea is that an encoder-decoder framework can learn a representation from a sequence of tactile measurements that contains all the relevant information about the palpated object. We conjecture that such a representation can be used for downstream tasks such as tactile imaging and change detection. With enough training data, it should capture intricate patterns in the tactile measurements that go beyond a simple map of forces -- the current state of the art. To validate our approach, we both develop a simulation environment and collect a real-world dataset of soft objects and corresponding ground truth images obtained by magnetic resonance imaging (MRI). We collect palpation sequences using a robot equipped with a tactile sensor, and train a model that predicts sensory readings at different positions on the object. We investigate the representation learned in this process, and demonstrate its use in imaging and change detection.
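As a minimal sketch (not the paper's actual architecture), the core idea described above — encoding a sequence of tactile measurements into a single object representation, then decoding predicted readings at new query positions — might look like the following, with all dimensions and the mean-pooling aggregation being illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(measurements, W_enc):
    """Aggregate a sequence of (position, reading) pairs into one
    fixed-size object representation via a mean over projected features."""
    # measurements: (T, d_pos + d_touch) array of position + tactile reading
    feats = np.tanh(measurements @ W_enc)      # (T, d_rep)
    return feats.mean(axis=0)                  # (d_rep,)

def decode(representation, query_positions, W_dec):
    """Predict a tactile reading at each query position from the
    shared object representation."""
    # pair the same representation with every query position
    rep = np.broadcast_to(representation,
                          (len(query_positions), len(representation)))
    inputs = np.concatenate([rep, query_positions], axis=1)
    return inputs @ W_dec                      # (Q, d_touch)

# toy dimensions: 2-D positions, 3-D tactile readings, 8-D representation
d_pos, d_touch, d_rep = 2, 3, 8
W_enc = rng.normal(size=(d_pos + d_touch, d_rep))
W_dec = rng.normal(size=(d_rep + d_pos, d_touch))

palpation_seq = rng.normal(size=(10, d_pos + d_touch))  # 10 pokes
z = encode(palpation_seq, W_enc)
preds = decode(z, rng.normal(size=(4, d_pos)), W_dec)   # readings at 4 new positions
print(z.shape, preds.shape)  # (8,) (4, 3)
```

In the self-supervised setting, such a model would be trained by holding out some pokes from the input sequence and regressing the decoder's predictions against the held-out sensor readings; the learned representation `z` is then reused for downstream imaging and change detection.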

Video

Object Fabrication

For a better appreciation of the task's difficulty, we created a special insert with a transparent gel instead of the regular one. As can be seen in the video, the lump moves when touched, making it hard to infer the hidden 3D structure. Due to these dynamics, non-data-driven approaches often fail.


Data Collection

Automatic Poking

We use a robotic arm to automatically palpate our fabricated objects. Using the QR code and a mounted camera, we also associate each object with its appropriate ground truth.


MRI Scan

Here we visualize one of our MRI scans, sweeping through the slices at different z values. To save time and resources, we scanned 6 inserts at a time.


Processed MRI

In the preprocessing stage presented in the paper, we separate the individual inserts from each scan and discretize them.
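One common way to discretize a segmented MRI volume is block-averaging followed by thresholding; the sketch below illustrates this generic approach on a toy volume, with the block size and threshold chosen arbitrarily (the paper's actual preprocessing may differ):

```python
import numpy as np

def discretize_mri(volume, block=4, lump_threshold=0.7):
    """Downsample an MRI intensity volume by block-averaging, then
    threshold into a binary lump / background grid."""
    # crop each axis to a multiple of the block size
    z, y, x = (s - s % block for s in volume.shape)
    v = volume[:z, :y, :x]
    # average over non-overlapping block x block x block cells
    coarse = v.reshape(z // block, block,
                       y // block, block,
                       x // block, block).mean(axis=(1, 3, 5))
    return (coarse > lump_threshold).astype(np.uint8)

# toy volume with a bright cubic "lump" in the center
vol = np.zeros((32, 32, 32))
vol[12:20, 12:20, 12:20] = 1.0
grid = discretize_mri(vol)
print(grid.shape, int(grid.sum()))  # (8, 8, 8) 8
```

The resulting coarse binary grid serves as a ground-truth target that tactile-based imaging predictions can be compared against cell by cell.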

PalpationSim

Here we show example videos of poke trajectories from our proposed PalpationSim.


BibTeX

@inproceedings{rimontoward,
  title={Toward Artificial Palpation: Representation Learning of Touch on Soft Bodies},
  author={Rimon, Zohar and Shafer, Elisei and Tepper, Tal and Shimron, Efrat and Tamar, Aviv},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems}
}