hgupt3 · 2025 · MIT

Sensor-Invariant Tactile Representation (SITR) Dataset

A large-scale tactile perception dataset comprising 1M simulated samples across 100 sensor configurations and real-world classification/pose estimation data from 7 tactile sensors. The dataset enables training sensor-invariant tactile representations for zero-shot inference and fine-tuning on various tactile perception tasks.

Downloads: 122
Episodes: 1M (simulated) + 140K (classification) + 24K (pose estimation)
Likes: 1

Why This Matters for Physical AI

This dataset enables development of sensor-invariant tactile representations that can generalize across different tactile sensor hardware, a critical capability for deploying tactile perception in diverse robotic systems.

Technical Profile

Modalities
rgb, tactile, depth
Environment
simulation, lab
Task Types
tactile_classification, pose_estimation, feature_extraction
Episodes
1M (simulated) + 140K (classification) + 24K (pose estimation)
Data Format
PNG images, NPY arrays (depth maps, surface normals, pose data)
Annotation Types
class_labels, pose_labels, calibration_images
License
MIT
Part of the SITR family
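
Since samples are distributed as PNG images plus NPY arrays (depth maps, surface normals, pose data), loading a sample reduces to pairing the image with its arrays. Below is a minimal sketch of such a loader; the file names (`depth.npy`, `normals.npy`) and per-sample directory layout are assumptions for illustration, not the dataset's documented structure.

```python
import tempfile
from pathlib import Path

import numpy as np

def load_sample(sample_dir: Path):
    """Load the NPY arrays for one tactile sample.

    Assumed (hypothetical) layout: each sample directory holds a
    depth map and a surface-normal map as .npy files; the actual
    SITR file names may differ.
    """
    depth = np.load(sample_dir / "depth.npy")      # (H, W) depth map
    normals = np.load(sample_dir / "normals.npy")  # (H, W, 3) surface normals
    return depth, normals

# Demo with synthetic stand-in data (real samples come from the dataset).
tmp = Path(tempfile.mkdtemp())
np.save(tmp / "depth.npy", np.zeros((240, 320), dtype=np.float32))
np.save(tmp / "normals.npy", np.zeros((240, 320, 3), dtype=np.float32))

depth, normals = load_sample(tmp)
print(depth.shape, normals.shape)
```

The corresponding PNG tactile image for each sample can be read with any image library and stacked with these arrays to form a training example.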

