furuharu · MIT

Drone Furniture Fly-Around Dataset

A synthetic Vision-Language-Action dataset for drone navigation generated using the Genesis physics simulator, consisting of FPV camera images, natural language instructions, and drone control actions. Designed for LoRA fine-tuning of OpenVLA 7B models on furniture fly-around tasks.
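As a sketch of how one such episode might be structured: the card says episodes pair FPV JPEG frames with a language instruction and drone control actions, but the field names and file layout below are illustrative assumptions, not the dataset's documented schema.

```python
import json

# Hypothetical episode record. "image" paths, "instruction", and the
# velocity-action keys are assumptions for illustration only.
episode_json = json.dumps({
    "instruction": "Fly around the armchair, keeping it in view.",
    "steps": [
        {"image": "episode_000/frame_000.jpg",
         "action": {"vx": 0.5, "vy": 0.0, "vz": 0.0, "yaw_rate": 0.2}},
        {"image": "episode_000/frame_001.jpg",
         "action": {"vx": 0.5, "vy": 0.0, "vz": 0.0, "yaw_rate": 0.2}},
    ],
})

def load_episode(raw: str):
    """Parse one episode into (instruction, [(image_path, action), ...])."""
    ep = json.loads(raw)
    return ep["instruction"], [(s["image"], s["action"]) for s in ep["steps"]]

instruction, steps = load_episode(episode_json)
```

A loader like this would yield one language instruction per episode and a per-frame stream of (image, action) pairs, which is the shape OpenVLA-style fine-tuning pipelines typically consume.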

Downloads: 37
Episodes: 300

Why This Matters for Physical AI

This dataset supports Vision-Language-Action model research for drone navigation using automatically generated synthetic data, providing a foundation for training embodied AI models on language-conditioned, vision-guided navigation tasks.

Technical Profile

Modalities: rgb, language
Robot Embodiments: drone
Action Space: body_frame_velocities
Environment: simulation
Task Types: navigation, object_approach, orbit
Episodes: 300
Data Format: JSON + JPEG
Annotation Types: language_instructions, action_labels
License: MIT
Part of the Drone Furniture Fly-Around Dataset family
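The body_frame_velocities action space means commands are expressed relative to the drone's own orientation rather than the world frame. A minimal sketch of the usual planar conversion, assuming a standard yaw convention (this dataset's exact frame conventions are not stated in the card):

```python
import math

def body_to_world(vx: float, vy: float, yaw: float):
    """Rotate a body-frame planar velocity (vx forward, vy left) into the
    world frame given the drone's yaw angle in radians. A common convention,
    assumed here for illustration."""
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    wx = cos_y * vx - sin_y * vy
    wy = sin_y * vx + cos_y * vy
    return wx, wy

# Facing 90 degrees left of world +x, a pure forward command (vx=1)
# moves the drone along world +y.
wx, wy = body_to_world(1.0, 0.0, math.pi / 2)
```

Predicting velocities in the body frame keeps the action semantics tied to what the FPV camera sees, which is why it is a common choice for vision-conditioned drone policies.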
