
Leveraging Human-Robot Collaboration in Construction (Continuation)

Project Team

Martin Fischer (CEE) and Oussama Khatib (CS), Cynthia Brosque (CEE), Elena Galbally (ME)

Research Overview

Observed Problem:

Prefabrication has improved efficiency and quality in construction. However, these systems still require manual assembly on site that can be repetitive and strenuous for workers. Despite the benefits of robots in prefabrication, field robots face challenges operating autonomously in highly unstructured environments. Most existing construction robots rely on bulky industrial robotic arms and simplistic control, leading to unsafe operating conditions and limited capabilities that do not leverage human expertise.

Primary Research Objective:

Develop a bimanual, mobile, physical robot prototype for on-site bolting of prefabricated steel structures that enables collaboration between humans and robots.

Potential Value to CIFE Members and Practice:

  • Simulate safe and light robot designs for construction
  • Utilize state-of-the-art robotic technologies like haptics for intuitive human-robot collaboration
  • Explore hazardous construction tasks with bimanual manipulation

Research and Theoretical Contributions

Our contributions to the robotics field are an exploration of human-robot collaboration in an unstructured environment and an expanded understanding of mobile grasping and manipulation skills for bolting.
For the construction industry, we aim to contribute the design of a bolting robot that can relieve workers of a strenuous and repetitive manual construction task.

Industry and Academic Partners



Keywords: Human-robot collaboration, haptics, robotics, automation, safety



Previous Seed Project

Research Updates & Progress Reports

Progress Report June 1, 2021

Our research has advanced autonomous welding and the sealing of concrete joints on a six-level parking structure provided by our industry partner Goldbeck. The welding example incorporated human-robot collaboration (HRC) through haptic control, and the concrete-joint robot applied visual and tactile force feedback to complete the task.

1. Welding

We implemented a simulated environment with a Franka Emika Panda 7-DOF (degrees of freedom) robot arm to learn the welding task under haptic control. Haptic control provides an intuitive user interface for commanding the robot and can record and repeat motions.
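The record-and-repeat capability mentioned above can be sketched as a simple trajectory recorder. This is a minimal illustration only; the `HapticDevice`-style `read_pose()` and `Robot`-style `set_desired_pose()` interfaces are assumptions for the sketch, not the actual SAI 2.0 API.

```python
import time

class TrajectoryRecorder:
    """Record end-effector poses from a haptic demonstration and replay them."""

    def __init__(self, rate_hz=100):
        self.dt = 1.0 / rate_hz
        self.samples = []          # list of recorded poses

    def record(self, haptic_device, duration_s):
        """Sample the haptic device's commanded pose at a fixed rate."""
        n = int(duration_s / self.dt)
        for _ in range(n):
            self.samples.append(haptic_device.read_pose())
            time.sleep(self.dt)

    def replay(self, robot):
        """Stream the recorded poses back to the robot as position targets."""
        for pose in self.samples:
            robot.set_desired_pose(pose)
            time.sleep(self.dt)
```

In practice the recorded demonstrations would also capture contact forces, which is what makes them useful for extracting task parameters later.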

The environment includes the BIM at LOD>300 for a section of the parking structure as a visual mesh. Moreover, we modeled collision meshes for key aspects of the task, such as a rectangular mesh for the welding target plate and a cylinder mesh at the contact point of the welding tool, which is critical for assessing the accuracy of the task.

Figure 1: Welding simulation environment in SAI 2.0.

The welding robot is controlled in two states: an auto-navigation state and a haptic-control state. In the auto-navigation state, the robot navigates from one welding plate to the next in the regular grid of the parking structure. When autonomous navigation reaches the target position in the grid, the human operator launches the haptic-control state to explore the edge of the welding target by commanding the desired position of the welding tool's end-effector. The haptic state is represented by a green ball in front of the robot. This state takes advantage of control libraries provided within SAI 2.0 (the Stanford Robotics simulation and control platform).
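The two-state control scheme can be sketched as a small state machine. The waypoint grid, distance tolerance, and the `operator_done` signal are illustrative assumptions for the sketch, not the SAI 2.0 implementation:

```python
import math

AUTO_NAV, HAPTIC = "auto_navigation", "haptic_control"

class WeldingController:
    """Alternate between autonomous navigation to each welding plate
    and operator-driven haptic control at the plate."""

    def __init__(self, plate_positions, tolerance=0.01):
        self.plates = list(plate_positions)   # (x, y) grid of welding plates
        self.tolerance = tolerance            # meters
        self.state = AUTO_NAV
        self.current = 0

    def update(self, robot_xy, operator_done):
        target = self.plates[self.current]
        if self.state == AUTO_NAV:
            # Drive toward the next plate; switch when within tolerance.
            if math.dist(robot_xy, target) < self.tolerance:
                self.state = HAPTIC   # hand control to the operator
        elif self.state == HAPTIC and operator_done:
            # Operator finished exploring the weld edge; resume navigation.
            self.current = (self.current + 1) % len(self.plates)
            self.state = AUTO_NAV
        return self.state, self.plates[self.current]
```

The key design choice is that the transition into haptic control is spatial (reaching the plate), while the transition out is operator-driven, keeping the human in the loop exactly where expertise is needed.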

The next step of this project involves extracting parameters from the human demonstrations with haptics that are key to task success.

2. Concrete Joints

In the six-level parking structure, workers also have to complete more than 6,000 meters of concrete joints for the precast slabs. The task involves three different teams, each performing one step of the process: concrete pouring, material projection to increase the structural strength of the joints, and finally three layers of waterproof coating.

Given the repetitive and manually intensive nature of these tasks, our industry partner Goldbeck is interested in testing robotic solutions for all three steps. Starting from the current machine, which is guided by a human worker, we designed an autonomous mobile solution with a nozzle that distributes the material into the concrete joints.

The process involves:

  1. Joint identification
  2. Edge detection and tracking
  3. Material pouring (following a set sequence)

The researchers explored multi-sensory sources of information, combining vision and tactile feedback for edge identification and tracking. First, we measured the section of the slab selected for the simulation in our sample environment. Once the mapping is complete, the robot navigates the joints in a specified order according to structural reliability guidelines. Upon reaching a joint, the robot lowers the nozzle into the groove and begins pouring. Force sensing at the tip of the end-effector (EE) confirms that the nozzle is correctly positioned at the lower edge of the joint groove. The robot then travels along the joint until it reaches the edge of the precast slab. Subsequently, the nozzle is retracted and pouring stops so that the base can move to the next location. To detect that the robot has reached a slab intersection, we incorporated two flaps with force sensors on the nozzle.
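The force-based joint-following logic can be sketched as a simple control loop. The thresholds, step sizes, and the `robot`/`nozzle` interfaces below are hypothetical illustrations of the approach, not the deployed controller:

```python
CONTACT_THRESHOLD_N = 2.0   # nozzle-tip force confirming groove contact (assumed)
FLAP_THRESHOLD_N = 1.5      # flap force indicating a slab intersection (assumed)

def pour_joint(robot, nozzle):
    """Lower the nozzle into the groove, pour along the joint, and stop
    when the flap sensor detects the edge of the precast slab."""
    # Lower until the tip force confirms the nozzle sits at the groove bottom.
    while nozzle.tip_force() < CONTACT_THRESHOLD_N:
        robot.lower_nozzle(step=0.005)        # 5 mm increments

    nozzle.start_pouring()
    # Follow the joint until a flap contacts the slab intersection.
    while nozzle.flap_force() < FLAP_THRESHOLD_N:
        robot.advance_along_joint(step=0.01)  # 10 mm increments

    nozzle.stop_pouring()
    robot.retract_nozzle()
```

Using force thresholds for both the vertical positioning and the end-of-joint detection is what lets the robot tolerate the imprecision of the as-built geometry relative to the BIM.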

Figure 2: Pre-cast slab joints and sequence identification.

Figure 3: Nozzle flap extends perpendicular to the joint.

Preliminary challenges to achieve this task include:

  • Accurately interpreting sensed forces to control the EE
  • Designing the nozzle flap
  • Rotating the nozzle flap between vertical and horizontal joints

Progress Report November 2020

Link to research presentations:

CIFE summer seminar

Simulation set-up in SAI aligned to the Robotics Center lab set-up for testing.

Detailed Research Overview & Progress Updates

Overview & Observed Problem

Prefabrication in construction has improved efficiency and quality (Saidi et al. 2016), but these systems still require manual assembly on-site that can be repetitive and strenuous for workers. For example, a six-level prefabricated parking structure required two crewmembers to install 380 beams between 190 columns with a total of 3040 bolts, which took about 38 hours of strenuous and repetitive manual work.

We see the potential to apply robots in the field for these assembly tasks, especially under current circumstances that require limiting the number of workers on site. However, there are challenges to using autonomous robots on site. These robots require real-time perception of the environment, vision, contact information, mobility, and Building Information Models (BIM) at a high level of development, which takes considerable effort (Groll et al., 2017; Kangari, 1985). Even with sensory data, current field robots require an operator on site to oversee the robot's tasks.

The researchers observed opportunities to simplify robot control with advances in haptic interfaces from the robotics field. Haptics allows the robot to be controlled from a safe distance while relying on human vision, perception, and touch. The sensory feedback lets the operator feel the forces the robot senses and act accordingly.

Theoretical & Practical Points of Departure

Our previous work tested the use of haptic technology in simulation for five strenuous and repetitive construction tasks (drywall, painting, pouring concrete, welding, and bolting) (Brosque et al., 2020). We used SAI (Simulation and Active Interfaces) as the main simulation method. SAI was developed by the Stanford Robotics lab in collaboration with Google to simulate and control the physical interactions of robots in the field.

SAI allows its users to control and test the robot’s behavior through haptic or other user interfaces and provides efficient and realistic simulations of complex environments (Khatib et al., 2016). This simulation interface was used in real field tests like the Ocean One project, a robot designed for underwater exploration. The haptic-visual interface allowed the researchers to combine the dexterity, flexibility, problem-solving, and expertise of humans with the strength, endurance, and precision of robots in the field.

In construction, we can take real production data from the field, extract the project BIM, and simulate the robot's performance as it interacts with the construction site and task objects. In SAI, we can test multiple robot designs in simulation to get industry feedback. We can also measure the adaptability of a robot design to workplace constraints such as height limits, component sizes, acceptable payloads, speed, and space.

Our industry partner in this CIFE seed project, Goldbeck, prioritized the bolting of steel connections as a key strenuous and repetitive task to automate in the field. Additionally, the robotics lab had already simulated and prototyped a bimanual hot-cable robot controlled with haptics, which serves as a point of departure for the bolting robot prototype. The hot-cable robot used two Franka arms, one with a hand end-effector and the other holding a wrench. The operator controlled the robot with a bimanual haptic device from a safe distance instead of working overhead next to the hot cables. Mobility was not yet incorporated into this prototype but is key for a construction robot intended to perform tasks in the field.

Hot cable bimanual robot design with Franka arms.

Research Methods & Work Plan

This research will develop a haptic-based human-robot interaction for the specific task of bolting steel elements and contribute to broader construction applications of human-robot collaboration that can distance workers from strenuous and repetitive manual tasks.

Research tasks: from haptic simulation (left) to physical prototype (right).

The research tasks include:

  1. Collecting field production data of bolting tasks.
  2. Exporting the project BIM into SAI for simulation and testing of robot designs with haptic control.
  3. Developing a physical prototype using real construction tools and materials.
  4. Validating the human-robot collaboration design with two arms on a mobile platform.

Tasks 1 and 2 were completed as part of the previous CIFE seed project. Task 3 is already underway with our industry partner Goldbeck. Task 4 will use hardware in Stanford's Robotics Laboratory and includes setting up the hardware in the lab, tuning controllers, and conducting the physical tests.

Expected Contributions to Practice

This study will test the feasibility of haptic applications in construction with a real test case and contribute to the design of a human-robot collaboration robot that can distance the workers from a strenuous and repetitive manual task.

Haptically controlled simulation of drywall task in SAI.

Expected Contributions to Theory

From a robotics perspective, we will advance the understanding of mobile manipulation skills in unstructured environments. From a construction perspective, we seek to develop new construction robot designs that are autonomous, safe, and collaborative with humans.




  • Bock, T., & Linner, T. (2015). Robotic Industrialization: Automation and Robotic Technologies for Customized Component, Module, and Building Prefabrication. Cambridge: Cambridge University Press.
  • Brosque, C., Galbally, E., Khatib, O., & Fischer, M. (2020). Human-Robot Collaboration in Construction: Opportunities and Challenges. In 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA) (pp. 1–8). IEEE.
  • Groll, T., Hemer, S., Ropertz, T., & Berns, K. (2017). A behavior-based architecture for excavation tasks. ISARC 2017 - Proceedings of the 34th International Symposium on Automation and Robotics in Construction, (Isarc).
  • Kangari, R. (1985). Robotics Feasibility in the Construction Industry.
  • Khatib, O., Yokoi, K., Brock, O., Chang, K., & Casai, A. (1999). Robots in human environments. Proceedings of the 1st Workshop on Robot Motion and Control, RoMoCo 1999, 213–221.
  • Khatib, O., Yeh, X., Brantner, G., Soe, B., Kim, B., Ganguly, S., … Creuze, V. (2016). Ocean one: A robotic avatar for oceanic discovery. IEEE Robotics and Automation Magazine, 23(4), 20–29.
  • Khatib, O. (1998). Mobile manipulation: The robotic assistant. Robotics and Autonomous Systems, (26), 175–183.
  • Saidi, K. S., Bock, T., & Georgoulas, C. (2016). Robotics in Construction. In Siciliano & Khatib (Eds.), Springer Handbook of Robotics (2nd ed., pp. 1493–1519). Springer.
  • Smith, C., Karayiannidis, Y., Nalpantidis, L., Gratal, X., Qi, P., Dimarogonas, D. V., & Kragic, D. (2012). Dual arm manipulation—A survey. Robotics and Autonomous systems, 60(10), 1340-1353.
