theairlabcmu · 2025 · cc-by-4.0

TartanGround

A large-scale multi-modal dataset for advancing the perception and autonomy of ground robots in diverse environments: over 1.44 million samples from 878 trajectories, collected across 63 photorealistic simulation environments.

Downloads: 42K
Episodes: 878
Likes: 3

Why This Matters for Physical AI

TartanGround provides large-scale multi-modal sensor data across diverse simulated environments and robot morphologies to advance ground robot perception, SLAM, and autonomous navigation capabilities.

Technical Profile

Modalities
rgb, depth, semantic_segmentation, optical_flow, stereo_disparity, lidar, imu, proprioception, point_cloud
Robot Embodiments
omnidirectional_ground_robot, differential_drive, quadruped
Environment
simulation
Task Types
navigation, semantic_occupancy_prediction, visual_slam, scene_representation, bird's_eye_view_prediction
Episodes
878
Annotation Types
semantic_segmentation, ground_truth_poses, semantic_occupancy_maps
License
cc-by-4.0
Part of the TartanGround family
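As a rough illustration of how the modalities listed above might come together in a single record, here is a minimal sketch of a sample-validation helper. The field names and array shapes are illustrative assumptions, not the dataset's published schema; consult the actual TartanGround release for its real layout.

```python
# Hypothetical sketch of a TartanGround-style multi-modal sample.
# Modality names follow the Technical Profile above; shapes are
# placeholder assumptions (None = variable-length dimension).

MODALITY_SHAPES = {
    "rgb": (480, 640, 3),                 # color image, H x W x C
    "depth": (480, 640),                  # metric depth map
    "semantic_segmentation": (480, 640),  # per-pixel class IDs
    "optical_flow": (480, 640, 2),        # (dx, dy) per pixel
    "stereo_disparity": (480, 640),
    "lidar": (None, 3),                   # N x (x, y, z) points
    "imu": (6,),                          # accelerometer + gyroscope
    "proprioception": (12,),              # e.g. quadruped joint states
    "point_cloud": (None, 3),
}

def missing_modalities(sample: dict) -> list:
    """Return the listed modalities absent from a sample dict."""
    return [m for m in MODALITY_SHAPES if m not in sample]

# Usage: a sample carrying every modality passes; a partial one
# reports exactly what is missing.
full = {m: None for m in MODALITY_SHAPES}
partial = {"rgb": None, "depth": None}
print(missing_modalities(full))     # []
print("imu" in missing_modalities(partial))  # True
```

A real loader would of course attach actual arrays (and the ground-truth poses noted under Annotation Types) rather than placeholders; the point here is only the shape of a multi-modal record.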

Community Signals

Top 5% by downloads
HuggingFace Discussions: 1

