
EgoBlind: Enhancing AI Vision for the Visually Impaired
The first egocentric VideoQA dataset from blind individuals' perspectives
EgoBlind introduces a dataset of 1,210 egocentric videos recorded from the perspective of blind individuals, designed to evaluate and improve multimodal large language models (MLLMs) for visual assistance.
- Contains 4,927 questions posed by blind individuals about their recorded surroundings
- Captures authentic daily activities from a first-person perspective
- Focuses on real-world assistance needs across various scenarios
- Provides a foundation for developing more helpful AI visual assistants
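The dataset pairs each egocentric video with questions posed by blind users. A minimal sketch of how such question-video pairs might be represented and grouped per clip (field and function names are hypothetical, not EgoBlind's actual schema):

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical record layout for one EgoBlind-style QA pair;
# field names are illustrative, not the dataset's published schema.
@dataclass
class VideoQASample:
    video_id: str          # identifier of the egocentric video clip
    question: str          # question posed by a blind user
    answers: List[str] = field(default_factory=list)  # reference answers

def group_by_video(samples: List[VideoQASample]) -> Dict[str, List[VideoQASample]]:
    """Group QA pairs by source video, since one clip can carry several questions."""
    grouped: Dict[str, List[VideoQASample]] = {}
    for s in samples:
        grouped.setdefault(s.video_id, []).append(s)
    return grouped

samples = [
    VideoQASample("clip_001", "Is the crosswalk signal green?", ["yes"]),
    VideoQASample("clip_001", "Is there anything in my path?", ["a bicycle"]),
    VideoQASample("clip_002", "What does this label say?", ["aspirin"]),
]
grouped = group_by_video(samples)
print(len(grouped["clip_001"]))  # 2 questions attached to the same clip
```

A grouping like this reflects the dataset's structure at a glance: many first-person questions anchored to a single stretch of egocentric footage.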
Medical Impact: By enabling AI systems that provide real-time visual guidance, this research addresses critical accessibility challenges, supporting blind individuals' autonomy in medical settings and everyday activities.
EgoBlind: Towards Egocentric Visual Assistance for the Blind People