A new “bat-like” sensor, which could help social distancing as lockdown measures are eased, has been developed by a Scottish start-up company.
Alex Bowen’s creation uses echolocation rather than light to form images – much as a bat uses sound to map its surroundings – allowing artificial intelligence (AI) to interpret the physical world without capturing identifiable data.
Although the sensor was first designed for domestic applications, it is hoped it could make social distancing in office buildings and other spaces easier while still protecting privacy.
Mr Bowen said: “All AIs need to constantly learn and adapt to understand the world like we do.
“But industry continues to face the challenge of how to teach AI about what happens in people’s homes without invading users’ privacy through human oversight or camera use.
“As with many problems, nature had the solution. In the wild, bats send out a screech and they listen for the echoes to understand distances and the location of physical objects.
“In this way, the bat can interpret its surroundings. Our sensors work in a similar way using echolocation to create a picture without any identifying data so that privacy is protected.”
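The principle Mr Bowen describes – emit a pulse, time the echo, and convert the delay into a distance – can be sketched in a few lines. This is a purely illustrative example of the general time-of-flight idea, not IMERAI’s actual method; the function name and figures are assumptions for illustration.

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 °C

def distance_from_echo(echo_delay_s: float) -> float:
    """Return the distance in metres to an object, given the round-trip echo delay in seconds."""
    # The pulse travels out to the object and back, so halve the round trip.
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving 0.01 s after the pulse implies an object about 1.7 m away -
# close to the 2 m guideline used for social distancing.
print(round(distance_from_echo(0.01), 2))
```

Because only delays (and hence distances) are measured, no image of a person is ever formed – which is how this approach can sense occupancy without identifiable data.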
A graduate of Edinburgh’s Heriot-Watt University, Mr Bowen set up his company, IMERAI, in 2018; it is based in the city’s Business School Incubator.
IMERAI recently received investment to add five new roles to its engineering team.
The company built the sensor from MEMS microphones – already widely used in mobile phones and smart home assistants – and it could now provide a base for more advanced AI products.
Mr Bowen added: “As the UK debates how to ease lockdown measures safely, this type of technology could be used to count how many people are present in an office location and how far apart they are to aid with social distancing and infection control.
“For assisted living, this could be game-changing for people with dementia and others with care needs, allowing their movements to be monitored and any deterioration to be picked up more quickly.
“The virtual AI assistants already on the market are manually triggered by voice but our technology will allow the AI to be more intuitive by understanding how its user is moving around.
“For example, if you are following a recipe and the virtual assistant is reading out the instructions, it will be able to ‘see’ when you are ready for the next ingredient rather than waiting to be prompted for the next instruction.”