JAKARTA - Microsoft has announced a new collaboration with Be My Eyes to provide high-quality, disability-representative data for training artificial intelligence (AI) systems.

According to Microsoft, data from disability communities is often underrepresented, resulting in less accurate recognition of disability-related objects.

A Microsoft Research report on AI image-description performance found that objects such as braille devices are far less common in image-text datasets, ultimately reducing object recognition accuracy by 30 percent.

The collaboration is intended to address this data shortage: Microsoft and Be My Eyes aim to make Microsoft's AI models more inclusive for the 340 million people worldwide who are blind or have low vision.

Under the agreement, Be My Eyes will provide video datasets featuring distinctive objects that reflect the lived experiences of blind people, with personal information removed before the data is shared.

Microsoft will then use the data to improve the accuracy of its image descriptions, making AI applications easier for the blind community to use.

The collaboration also builds on Microsoft's commitment to inclusive and responsible AI, which began in 2017 with the integration of Be My Eyes into its technical support services.

With this move, Microsoft is strengthening its efforts to deliver more inclusive technologies for all users, especially those with disabilities.

"At Microsoft, we are committed to building inclusive AI that represents everyone who uses it, while protecting marginalized members of society from widespread biases that can affect education, work, and community engagement," said Jenny Lay-Flurrie, Vice President and Chief Accessibility Officer at Microsoft.


The English, Chinese, Japanese, Arabic, and French versions are automatically generated by AI, so inaccuracies may remain; please refer to the Indonesian version as the primary source. (system supported by DigitalSiber.id)