
Olfaction-Vision-Language Dataset

An open-sourced multimodal dataset for prototyping olfaction-vision-language tasks in AI, robotics, and AR/VR domains, featuring synchronized sensor streams for applications like vision-scent navigation and odor source localization.

Downloads: 185
Episodes: ~118,000
Likes: 3

Why This Matters for Physical AI

This dataset supports multimodal perception for robotic olfactory navigation and hazard detection: by pairing vision and language modalities with chemical compound representations, it enables robots to localize odor sources and navigate complex environments.
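As a minimal illustration of odor source localization, the sketch below runs a greedy concentration-gradient search against a toy sensor model. The quadratic-decay sensor, grid step, and source position are all assumptions for demonstration; they are not part of this dataset or any method it prescribes.

```python
def read_concentration(pos, source=(5.0, 5.0)):
    """Toy sensor model: concentration decays with squared distance from a
    hypothetical odor source. In practice this reading would come from a
    chemical sensor on the robot."""
    dx, dy = pos[0] - source[0], pos[1] - source[1]
    return 1.0 / (1.0 + dx * dx + dy * dy)

def localize_odor_source(start=(0.0, 0.0), step=0.5, iters=200):
    """Greedy gradient ascent: repeatedly move to whichever neighboring grid
    position (including staying put) reports the highest concentration."""
    pos = start
    for _ in range(iters):
        neighbors = [(pos[0] + dx * step, pos[1] + dy * step)
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        pos = max(neighbors, key=read_concentration)
    return pos

print(localize_odor_source())  # converges near the toy source at (5.0, 5.0)
```

Real episodes would replace the toy sensor with logged olfaction readings, and a learned policy would replace the greedy step, but the search structure is the same.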

Technical Profile

Modalities
rgb, language, olfaction
Robot Embodiments
drone, robot
Environment
lab, indoor, outdoor
Task Types
navigation, odor_source_localization, image_classification, image_to_text
Episodes
~118,000
Data Format
JSON/NoSQL
Annotation Types
language_instructions, expert-generated, machine-generated
License
MIT
Part of the Olfaction-Vision-Language Dataset family
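Since the data format is JSON, an episode record can be parsed with the standard library. The record below is a hypothetical example of what a multimodal entry might look like; the field names (`episode_id`, `modalities`, `task`) are assumptions for illustration, not the dataset's official schema.

```python
import json

# Hypothetical episode record; keys are assumed, not the official schema.
sample_episode = json.dumps({
    "episode_id": 42,
    "modalities": {
        "rgb": "frames/000042.png",
        "language": "Navigate toward the citrus odor source.",
        "olfaction": {"compound": "limonene", "ppm": 3.2},
    },
    "task": "odor_source_localization",
})

def load_episode(raw: str) -> dict:
    """Parse one JSON episode record and check that the three expected
    modalities (rgb, language, olfaction) are all present."""
    episode = json.loads(raw)
    missing = {"rgb", "language", "olfaction"} - episode["modalities"].keys()
    if missing:
        raise ValueError(f"episode {episode['episode_id']} lacks: {missing}")
    return episode

episode = load_episode(sample_episode)
print(episode["modalities"]["olfaction"]["compound"])  # limonene
```

Validating modality presence up front is useful for a dataset of this scale (~118,000 episodes), where a training loop should fail fast on malformed records.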

Community Signals

Top 50% by downloads

