More often than not, we’re telling you about some artificial intelligence innovation that threatens to take away someone’s job.
But that doesn’t mean we don’t recognize the abundant benefits of AI technology. We do. You could call us cautiously optimistic around here.
And today’s story helps illustrate the reason for that more clearly than most.
Researchers at the National University of Singapore developed a device, aptly named AiSee, that helps the visually impaired "see" the world around them using a camera and artificial intelligence.
In a blog post discussing the research, Project AiSee's lead researcher, Associate Professor Suranga Nanayakkara, said: "With AiSee, our aim is to empower users with more natural interaction. By following a human-centred design process, we found reasons to question the typical approach of using glasses augmented with a camera. People with visual impairment may be reluctant to wear glasses to avoid stigmatisation. Therefore, we are proposing an alternative hardware that incorporates a discreet bone conduction headphone."
How it works is quite amazing. You hold an object up to AiSee; its camera and associated artificial intelligence platform identify what you're showing it, and a bone conduction earpiece then tells you what it is without blocking external sounds. This last aspect was of particular concern for the team, as the ability to hear environmental sounds is essential to "provide essential information for decision-making, especially in situations involving safety considerations."
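For readers curious about the flow, the capture-identify-announce loop described above can be sketched in a few lines of Python. To be clear, everything here is illustrative: the function names, the stubbed-out classifier, and the "pixel signature" inputs are our own placeholders, not AiSee's actual software, which has not been published.

```python
def identify_object(image_signature):
    """Stand-in for the camera + AI identification step.

    A real system would run an image-recognition model on camera
    frames here; this stub just maps a fake 'pixel signature' to a
    label so the pipeline can be demonstrated end to end.
    """
    known_objects = {
        "signature_soup_can": "a can of tomato soup",
        "signature_ten_dollar_note": "a ten-dollar note",
    }
    return known_objects.get(image_signature, "an unidentified object")


def announce(label):
    """Stand-in for the bone conduction earpiece output.

    Bone conduction delivers audio through the skull, leaving the
    ear canal open, which is why environmental sounds stay audible.
    """
    return f"[earpiece] This looks like {label}."


def aisee_pipeline(image_signature):
    # Hold up an object -> identify it -> announce it to the user.
    return announce(identify_object(image_signature))


print(aisee_pipeline("signature_soup_can"))
```

The key design point the sketch preserves is the separation between identification and output: because the announcement goes through an open-ear channel rather than headphones that seal the ear, the user never trades awareness of their surroundings for the description.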
You can check out a video discussing the innovation on YouTube.
Any thoughts you might have on using artificial intelligence and cameras to “see” the world around you are welcome in the comments.