For people who are blind or visually impaired, everyday tasks like shopping can be challenging at best. When presented with a rack of clothes, how are they to know the size, color, or price of the items without asking for help? How can they be sure a cashier gives them bills in the correct denomination? A device created by Media Lab researchers Suranga Nanayakkara and Roy Shilkrot aims to change that.
EyeRing is a finger camera attached to a smartphone that allows users to receive information about various objects in their surroundings.
Simply point the ring at something, say a command like “color” or “text,” and push the thumb button. The device sends a picture via Bluetooth to the smartphone, which processes the image using computer-vision algorithms and quickly sends the answer back via text-to-speech and earphones worn by the user. Watch the device in action (in the Coop!) in the video below.
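The flow described above — a spoken command routes a captured photo to a matching computer-vision routine, and the result comes back as speech — can be sketched as a simple dispatch loop. This is an illustrative toy, not the researchers' actual code: the handler functions and their trivial logic are stand-ins for real vision algorithms.

```python
# Hypothetical sketch of an EyeRing-style command pipeline.
# The image is simplified to an (R, G, B) average; real handlers
# would run computer-vision algorithms on the Bluetooth photo.

def detect_color(image):
    # Stand-in for color recognition: name the dominant channel.
    names = ["red", "green", "blue"]
    return names[image.index(max(image))]

def read_text(image):
    # Stand-in for OCR on the captured photo.
    return "placeholder text"

HANDLERS = {"color": detect_color, "text": read_text}

def process(command, image):
    """Dispatch on the user's spoken command, as the phone app does
    when a photo arrives, and return the phrase to speak back."""
    handler = HANDLERS.get(command)
    if handler is None:
        return "unknown command"
    return handler(image)

print(process("color", (200, 30, 30)))  # -> red
```

In the real system the returned phrase would be passed to the phone's text-to-speech engine and played through the user's earphones.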
Currently, the device deciphers currency, color, text, and pricing, but Nanayakkara and Shilkrot, who are part of the Media Lab’s Fluid Interfaces research group, are working on other applications as well, such as aiding with navigation or translation. They are also considering making the code for the device open source. It runs with an Android app, but the duo is working on an iPhone version.
Nanayakkara is visiting faculty in the Fluid Interfaces group as well as a tenure-track professor at the Singapore University of Technology and Design (SUTD), which was established in collaboration with MIT. Shilkrot is a first-year PhD student in the group. EyeRing is not ready for market yet, though that's the intention. Right now, the duo is working to make the system identify a wider range of details.