Wednesday, January 3, 2024

Restoring Vision: Blind “See” Through Sound with Smart Glasses

Link to article


A 2012 report from the World Health Organization estimated that 39 million people around the world are blind and another 285 million are visually impaired. Of the latter group, 246 million (86%) have "low vision," meaning vision that can't be corrected with glasses, contact lenses, or surgery. Those, however, are conventional glasses, with lenses that focus incoming light onto the eyes. What about the devices called smart glasses, and how do they work?


In the U.S., normal vision is measured as 20/20, meaning that from 20 feet away you are able to see clearly what the average person can see on an eye chart at that distance. In Europe, normal vision is 6/6, because the chart distance is 6 meters instead. (Six meters is 19.685 feet.) In the Japanese system, 1.0 is normal vision, 1.5 is 20/15, 2.0 is 20/10, and so on; anything less than 1.0 is below normal. A person can be legally blind and still see something. Generally speaking, though, when people talk about being blind, they mean a person who can see nothing at all, only darkness.
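All three notations express the same thing: the ratio of the testing distance to the distance at which a normal eye could read the same line. A minimal sketch of the conversion (the function name is invented here):

```python
# Convert a Snellen-style fraction (US feet or European meters) into the
# decimal acuity used in the Japanese system. Decimal acuity is simply the
# testing distance divided by the chart's reference distance.

def snellen_to_decimal(test_distance: float, reference_distance: float) -> float:
    """20/20, 6/6, or 20/10 style fraction -> decimal acuity."""
    return test_distance / reference_distance

print(snellen_to_decimal(20, 20))  # 1.0  (US normal, 20/20)
print(snellen_to_decimal(6, 6))    # 1.0  (European normal, 6/6)
print(snellen_to_decimal(20, 10))  # 2.0  (sharper than normal)
```

Because only the ratio matters, 20/20 and 6/6 both come out to a decimal acuity of 1.0, which is why the American and European "normal" lines on an eye chart describe the same vision.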

American and Japanese eye charts

An implantable bionic eye called the Phoenix99 was developed in 2021. Glasses with a camera send images to a telemetry implant above one ear, which then sends a filtered signal to a visual stimulator (VS) inserted near the back of one eyeball. The VS passes that signal to the retina, which may still have some functioning photoreceptor cells, and the person can see something, even if only a fuzzy image. This works only if there are active photoreceptors, and surgery is obviously required. Another brand, the Argus II, costs $150,000 plus the expense of the surgery.

Diagram of Phoenix99 system (from syfy.com)

The IrisVision website lists its picks of 5 top brands of vision aids that are not implants; these are made for people with low vision, not for those who are blind. Some speak what they see and can be programmed to read text or barcodes as well as identify faces.

  • IrisVision mounts a smartphone. ($2,950)
  • MyEye attaches to the side of any glasses. ($3,500)
  • NuEyes magnifies with a telephoto lens. ($5,995 – $6,195)
  • Jordy is another magnifier that also comes with a separate stand. ($2,500)
  • eSight uses its camera to display high resolution images. ($5,950)


But what about people who are completely blind? You might say they are blind as a bat, and although the vision analogy doesn't work, the fact that bats navigate by sound does suggest how a new technology works. It speaks to the wearer somewhat as the brands above do, but it's a more complex and entirely new type of technology.

Led by Professor Howe Yuan Zhu, researchers at the University of Technology Sydney collaborated with the University of Sydney and the start-up company ARIA Research to develop a new type of smart glasses. They combined cameras, GPS, a microphone, and inertial-measurement and depth-sensing units to create "acoustic touch technology". Acoustic refers to sound, so this technology joins hearing with touch to help blind people "see".

Bats aren't blind, but their eyes are attuned to low-light conditions. About 70% of bat species use a process called echolocation to navigate and find food. Echolocation means emitting ultrasonic pulses from the larynx through the mouth or nose; after the sound bounces off an object, the returning echoes tell the bat's ears its size, shape, texture, movement, and location.
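The distance part of echolocation is simple arithmetic: a pulse travels out to the target and back, so the one-way distance is half the round trip. A rough sketch, assuming sound in air at room temperature:

```python
# Echolocation distance arithmetic: a pulse travels to the target and back,
# so distance = (speed of sound * round-trip time) / 2.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to an object from the delay of its echo."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

print(echo_distance(0.01))  # 1.715 -> an echo heard 10 ms later means ~1.7 m away
```

A bat's brain does this (and far more, judging shape and texture from the echo's character) without any arithmetic at all.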

Image from batwatch.ca

Acoustic touch technology uses cameras, not sound, to locate objects the person is facing. Software in the glasses translates what the cameras detect into a sound associated with the object, or into a word that identifies it. The sound of rustling paper coming from the glasses' speaker might indicate the person is facing a book or newspaper, for example; or the glasses might simply say "book". Speakers are built into the temples of the glasses, and motion-capture reflective markers on the frames track head movements.
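At its core, this is a lookup from a recognized object label to an auditory icon, with spoken words as the fallback. A hypothetical sketch (the sound file names and the fallback convention are invented for illustration, not taken from the actual system):

```python
# Hypothetical "acoustic touch" lookup: the recognizer labels the object the
# wearer is facing, and the glasses play a matching auditory icon, falling
# back to speaking the label for objects without one.

AUDITORY_ICONS = {
    "book": "rustling_pages.wav",
    "bottle": "glass_scrape.wav",
    "bowl": "lid_on_bowl.wav",
    "cup": "cup_on_table.wav",
}

def cue_for(label: str) -> str:
    # Objects without an assigned icon get a spoken label instead.
    return AUDITORY_ICONS.get(label, f"speak:{label}")

print(cue_for("book"))    # rustling_pages.wav
print(cue_for("laptop"))  # speak:laptop
```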

Beta testing the glasses (from YouTube)

The glasses were originally NREAL brand augmented-reality glasses made by ARIA. Augmented reality is a term for viewing the real world while games, computer screens, or videos are superimposed on it. This is different from virtual reality, which blocks out the real world completely and substitutes a simulated one.

Augmented-reality view from NREAL glasses. Wearer is in kitchen accessing internet screens.

The Australian researchers tested acoustic touch technology on 14 subjects. Seven of them were blind or had low vision. The other seven were blindfolded sighted individuals who served as a control group. They were asked to identify and locate four objects: a book, a bowl, a bottle, and a cup. The software told them what each object was and how far away it was.

Test subject. Bottom left shows the actual view, and under that is the computerized identification (PLOS One, 2023)

Each item was assigned a specific sound that the wearer heard: the book, a page-turning sound; the bottle, the scrape of a glass bottle; the bowl, a lid being placed on a bowl; and the cup, a cup being set on a wooden table. The subjects trained until they were comfortable with each object and its acoustic label.

Seated, they did 60 trials:
  • They "looked" around the table and listened to the acoustic cue on the glasses speaker to create a mental image of the location of the items on the table (20 trials)
  • They heard directions for the items based on a clock, like pilots use. "Bowl, 12 o'clock" cue meant it was directly in front of them. "Bottle, 3 o'clock" meant it was on their right.  (20 trials)
  • Bluetooth speakers in front of the items played the auditory cue three times before the next one to its right did. (20 trials) 
Then they were asked to pick up a specific item without being told where it was. If they touched it, the trial was scored as a success.
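The clock-face cue in the second seated condition is a simple mapping from horizontal angle to an hour hand. A minimal sketch, assuming 0° means straight ahead and positive angles are to the wearer's right (the exact convention the researchers used isn't stated in the article):

```python
# Sketch of the clock-face direction cue: convert a horizontal angle relative
# to the wearer's heading into the "N o'clock" phrasing used in the trials.
# Assumes 0 degrees = straight ahead, positive = to the right.

def clock_direction(azimuth_degrees: float) -> str:
    # Each clock hour spans 30 degrees (360 / 12).
    hour = round((azimuth_degrees % 360) / 30) % 12
    return f"{12 if hour == 0 else hour} o'clock"

print(clock_direction(0))    # 12 o'clock (directly in front)
print(clock_direction(90))   # 3 o'clock  (on the right)
print(clock_direction(-90))  # 9 o'clock  (on the left)
```

So a "Bottle, 3 o'clock" cue corresponds to an object roughly 90 degrees to the wearer's right.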

Seated task (PLOS One, 2023)
Standing, they did 3 trials:

Four items were placed on each of three tables surrounding the subject. Eleven of the twelve were different sizes and shapes of the same item (for example, 11 bottles), and the subject had to find the one object different from the rest (for example, a cup).

Standing task  (PLOS One, 2023)

The researchers measured heart rate, breathing rate, and skin temperatures during the trials to detect physiological signs of stress, and they gave the subjects a questionnaire to assess their mental workload at different stages of the project.

The blindfolded sighted people reported more effort overall, and in the first seated trial they made more mistakes than the blind or low-vision subjects. This was expected, given their unfamiliarity with working "in the dark".

Everyone had trouble distinguishing the acoustic icons for the bowl and the bottle, so future tests will make them more distinct. The clock-direction test did not include information on how far away the objects were, so that will be added to the next trials. Otherwise, the tests showed great promise, and a second round of testing is being planned. With 3-10% of the world's population suffering from vision problems, smart glasses hold promise to help millions.

Percent of people worldwide with vision problems (WHO, 2021)


Here is a 2:34 video on the technology