Who Am I Working With?
Enabling Human-Robot-Human Collaboration for Future Distributed Manufacturing

There is a growing manufacturing skills gap in the U.S., resulting from an aging workforce, a lack of work-life balance, and geographical mismatches between where people live and where jobs are. Our research explores multiple humans collaborating physically with one another across geographical barriers, using robots as a medium, to help fill the manufacturing skills gap and enhance current and future distributed manufacturing. Human-robot-human (HRH) collaboration will be key to such geographically distributed manufacturing environments. One important question we hope to address is how a teleoperator can effectively collaborate with onsite workers as if they were co-located. As a start toward answering this question, a multidisciplinary team with expertise in human factors & ergonomics, robotics, and industrial psychology will explore scenarios in which one human remotely controls a robot that collaborates with a second human, emphasizing the human-technology partnership.
We completed 14 interviews with industry stakeholders in robot manufacturing, automobile/aircraft manufacturing, safety research, and supply chain (Figure 1). Each interview lasted about 50 minutes. The interviews provided a range of perspectives on HRH collaboration and identified potential future work tasks that may benefit from it. To analyze the interview data, we transcribed all the recordings and began exploring them using Hugging Face, an open-source community and data science platform for natural language processing.
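To illustrate the kind of exploratory analysis this enables, the sketch below tags an interview excerpt with candidate HRH themes using the Hugging Face transformers library. The excerpt, the theme labels, and the model choice are illustrative assumptions, not the study's actual data or coding scheme.

```python
from transformers import pipeline

# Hypothetical transcript excerpt; real data would be loaded from the
# transcribed interview recordings.
excerpt = (
    "If the remote operator could see what I'm doing and hand me parts "
    "through the robot, we could split the assembly work between us."
)

# Candidate themes are illustrative assumptions, not the study's coding scheme.
themes = [
    "trust in automation",
    "task allocation",
    "communication",
    "safety",
]

# Zero-shot classification with a pretrained NLI model from the Hugging Face Hub.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(excerpt, candidate_labels=themes)

# Print each candidate theme with its relevance score, highest first.
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```

A zero-shot approach like this can surface candidate themes across many transcripts before any manual coding scheme is fixed, which suits the early, exploratory stage of the analysis described above.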


