
New augmented-reality system changes how humans and robots interact

ARROCH offers two-way communication and integrates AI so robots can adjust plans as needed

Researchers in Assistant Professor Shiqi Zhang's lab sent small robots through the Engineering Building to retrieve objects and then return as a human monitored their progress. With the robots is PhD student Yan Ding. Image Credit: Jonathan Cohen.

As automation becomes more prevalent in warehouses and other environments, humans and robots are increasingly sharing the same spaces. Real-time collaboration between them remains tricky, however, and current approaches focus largely on how people adapt to their mechanical co-workers.

Researchers from Binghamton University and the consumer electronics and mobile communications company OPPO have developed a new way to bridge the human communication gap with multiple robots using augmented reality (AR), which overlays digital visuals, sound or other stimuli on a real-world view.

Assistant Professor Shiqi Zhang, a faculty member in the Thomas J. Watson College of Engineering and Applied Science’s Department of Computer Science, led a team that created a system it calls Augmented Reality for Robots Collaborating with a Human, or ARROCH for short.

Included on the team were PhD students Kishan Chandan and Vidisha Kudalkar as well as Xiang Li, a senior engineer for augmented reality research at OPPO​.

In 2020, Zhang received a $50,000 OPPO Faculty Research Award to support the research. OPPO is developing wearable AR devices, and this project could lead to new avenues for its hardware and software.

Their research on ARROCH was presented in June at the International Conference on Robotics and Automation (ICRA).

Utilizing AR interfaces to manage robots is not a new concept, but ARROCH offers a few new features. First, multiple robots can be directed at the same time. Also, humans can observe the robots’ current locations and provide feedback that affects what the robots do next.

“Usually, people just have AR for visualization of what a robot is currently doing — maybe I can hold up a tablet and see the robot is making a delivery, for example,” Zhang said. “But here, we provide a bidirectional communication channel where the human can comment on the robot’s plan or behaviors. The communication channel is not very complicated, but it’s the first time we have that kind of interaction.”
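To make that two-way channel concrete, here is a minimal sketch of the idea: robots publish their plans for the AR view, and human feedback flows back to adjust them. All names below (`RobotPlan`, `ARChannel`, `publish_plan`, `send_feedback`) are illustrative assumptions, not the team's actual implementation, which the article does not describe.

```python
from dataclasses import dataclass


@dataclass
class RobotPlan:
    robot_id: str
    waypoints: list        # remaining (x, y) positions the robot will visit
    paused: bool = False


class ARChannel:
    """Shares each robot's plan with the AR view and routes human feedback back."""

    def __init__(self):
        self.plans = {}    # robot_id -> RobotPlan

    def publish_plan(self, plan):
        # Robot -> human: push the current plan so the AR interface can draw it.
        self.plans[plan.robot_id] = plan

    def send_feedback(self, robot_id, pause):
        # Human -> robot: a comment on the plan, here a simple pause request.
        if robot_id in self.plans:
            self.plans[robot_id].paused = pause


channel = ARChannel()
channel.publish_plan(RobotPlan("robot-1", [(0, 0), (3, 4)]))
channel.send_feedback("robot-1", pause=True)   # "please hold this movement"
print(channel.plans["robot-1"].paused)         # True
```

The point of the sketch is the symmetry: the same channel that visualizes robot intentions also carries human comments back, which is what distinguishes ARROCH from visualization-only AR interfaces.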

The technology for human/robot interaction used in warehouses and other industrial settings is primitive by comparison, generally confining people and robots to separate zones.

“Pretty recently, Amazon started to encourage workers to wear special vests that send signals, so if humans have to go into the robot zone to fix something or because there is a box falling from a shelf, all the robots around the human immediately stop. That’s definitely not the best way for humans and robots to co-exist or collaborate with each other.

“With this interface, the human holding a tablet or wearing AR glasses can say: ‘What are the robots doing? Is there any robot coming into this area in the near future?’ If the human sees there are robots coming in, the human can say: ‘Please hold this movement for a few minutes — I need to work in this area.’ Such functionalities are not in the warehouse environment yet.”
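A hold request like the one Zhang describes could be modeled as a reserved work area that robots check before moving. The sketch below is hypothetical: the rectangle-based region, the time limit, and the `reserve_area` and `is_blocked` names are assumptions for illustration, not part of ARROCH's published interface.

```python
import time

# Hypothetical sketch of the "please hold this movement" request: a human
# reserves a rectangular work area for a few minutes, and robots defer any
# waypoint that falls inside it while the reservation is active.
reservations = []  # (x_min, y_min, x_max, y_max, expires_at)


def reserve_area(x_min, y_min, x_max, y_max, minutes=5):
    """Human-side request sent from the AR tablet or glasses."""
    reservations.append((x_min, y_min, x_max, y_max, time.time() + 60 * minutes))


def is_blocked(x, y):
    """Robot-side check before committing to its next waypoint."""
    now = time.time()
    return any(
        x0 <= x <= x1 and y0 <= y <= y1
        for (x0, y0, x1, y1, expires) in reservations
        if expires > now
    )


reserve_area(2, 2, 6, 6)   # human claims the area in front of the shelf
print(is_blocked(4, 3))    # True: robots wait or reroute
print(is_blocked(9, 9))    # False: outside the reserved area
```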

The team’s study included sending three robots to retrieve objects and return to the closed door of the lab. A human in the lab worked on a task while the robots were gone, then needed to open the door when they arrived. Throughout the experiments, the human used ARROCH to monitor the robots’ progress.

Zhang believes the development of ARROCH opens up a number of new research directions.

“If the human is moving in a big warehouse environment, how does the robot know what a human is doing? This coupled system needs to be tighter,” he said.

“Also, how do they communicate with each other? Currently, we have an interface where the robot avatars are shown. Imagine a warehouse with maybe 500 robots: if a human opens this AR interface and the robot avatars are everywhere, the whole screen will be super-crowded, and then what does the human do? If a wearable AR interface were being used, the human probably couldn’t even see what’s in the real world — it’s all robot avatars. We’re thinking of developing more advanced visualization strategies for better communication.”