Tactile Robotic Assembly
Autonomous assembly of modular constructions by AI-driven robots with visuo-tactile sensors

In the Tactile Robotic Assembly project, an interdisciplinary team develops modular dry-joint constructions and methods for their assembly and disassembly by autonomous robots. Dry-joint elements develop their load-bearing capacity through interlocking and can be reused in different configurations. With the help of visuo-tactile sensors and machine learning, robots are to assemble and disassemble these elements autonomously. Computation and robotics allow us to rethink prefabrication and component assembly in order to save precious resources and reduce waste in architecture and the construction industry.

Our society faces the global challenge of meeting an enormous demand for housing now and in the coming decades with an industry whose productivity is traditionally low and which suffers from a shortage of skilled workers. At the same time, innovations are needed to dramatically reduce CO2 emissions and waste in construction. An important aspect of mastering this challenge is a continuous digital process chain, from design, materialization, and prefabrication through to component assembly, supported by artificial intelligence, robotics, and automation.

This project is dedicated to the prototypical development of a modular building system and its assembly by autonomous robots. First, the interdisciplinary team of the Intelligent Autonomous Systems Group (IAS), headed by Prof. Jan Peters, and the Digital Design Unit (DDU) of Prof. Oliver Tessmann is developing elements that can be dry-joined into interlocking assemblies. The elements differ in material and function, enabling load transfer, moisture regulation, light transmission, and more. The material system is optimized to be assembled, disassembled, and reassembled by robots.

The sequential assembly into a demonstrator structure, including all assembly steps, is derived from an abstract description (a 3D model); a sketch of such a description follows below. Robots then perform Task and Motion Planning (TAMP) and “understand” the weight, dimensions, and surface properties of components using visuo-tactile sensors. Machine learning and sensor technology help them to autonomously perform construction tasks that cannot be programmed in advance because they demand constant adaptation.

The project investigates how architectural and constructive aspects can be represented as rewards for Reinforcement Learning (RL): stable constructions yield positive rewards, while falling elements incur negative rewards (penalties), so the robot develops an understanding of construction and of stable construction states via AI and RL (see the reward sketch below). Algorithmic design tools for element-based construction are developed to fully exploit the novel design and construction capabilities of autonomous building robots.

The knowledge gained in the project will be applied and evaluated using prototypes and demonstrator constructions. Ultimately, the findings on using architectural and constructive criteria in machine learning are to be generalized so that they can serve a wide range of design and construction applications and so that architects develop a deeper understanding of machine learning in construction. Sensor technology allows robots to work in collaboration with humans, so automation can be introduced gradually in the highly fragmented construction industry.
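To make the idea of an abstract assembly description concrete, the following Python sketch shows one possible way to encode elements with their support dependencies and to derive a valid placement order from them. The data model, field names, and demonstrator values are illustrative assumptions, not the project's actual format.

```python
# A minimal sketch of an abstract assembly description; the dataclass
# fields and the demonstrator values are hypothetical, not the
# project's actual data model.
from dataclasses import dataclass
from graphlib import TopologicalSorter


@dataclass(frozen=True)
class Element:
    """One dry-joint element from the abstract 3D model."""
    name: str
    mass_kg: float                            # verified by tactile sensing
    dimensions_mm: tuple[float, float, float]
    supports: tuple[str, ...] = ()            # elements that must be placed first


def assembly_sequence(elements: list[Element]) -> list[str]:
    """Derive a valid placement order from the support dependencies."""
    graph = {e.name: set(e.supports) for e in elements}
    return list(TopologicalSorter(graph).static_order())


demo = [
    Element("base_a", 4.2, (400, 200, 100)),
    Element("base_b", 4.2, (400, 200, 100)),
    Element("beam", 6.8, (800, 200, 100), supports=("base_a", "base_b")),
]
print(assembly_sequence(demo))  # ['base_a', 'base_b', 'beam']
```

A topological ordering of the support graph guarantees that every element is placed only after the elements it rests on, which is the minimum a TAMP layer needs as input.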
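The reward structure described above could, for example, take the following shape: a positive reward for a stable, accurately placed element and a negative reward when an element falls. This is a minimal sketch; the pose representation, the tolerance, and the shaping term are assumptions for illustration only.

```python
# A minimal sketch of the stated reward idea: stable placements earn
# positive reward, toppling earns negative reward. Thresholds and
# shaping are hypothetical.
def placement_reward(target_pose, observed_pose, toppled: bool,
                     pos_tol_mm: float = 2.0) -> float:
    """Reward one placement step in an assembly episode."""
    if toppled:
        return -1.0  # falling elements are penalized
    # Distance between where the element should be and where it ended up.
    error = sum((t - o) ** 2 for t, o in zip(target_pose, observed_pose)) ** 0.5
    if error <= pos_tol_mm:
        return 1.0   # stable and accurately placed
    # Stable but imprecise: shaped reward decaying with placement error.
    return max(0.0, 1.0 - error / (10 * pos_tol_mm))


# Example: a stable placement 0.6 mm off target earns the full reward.
print(placement_reward((0.0, 0.0, 100.0), (0.5, -0.3, 100.2), toppled=False))
```

Shaping the stable-but-imprecise case, rather than returning a flat zero, gives the learner a gradient toward accurate placements instead of a sparse pass/fail signal.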

RESEARCH TEAM

Technische Universität Darmstadt

Digital Design Unit (DDU)

Intelligent Autonomous Systems Group (IAS)

PROJECT MANAGEMENT

Prof. Dr.-Ing. Oliver Tessmann

Prof. Jan Peters, PhD

TEAM

Boris Belousov, IAS

Mehrzad Esmaeili, DDU

Yuxi Liu, DDU

Yvonne Machleid, DDU

LINKS

Digital Design Unit (DDU)

Intelligent Autonomous Systems (IAS)

Zukunft Bau

FUNDING AGENCY

This project is funded by the Federal Institute for Research on Building, Urban Affairs and Spatial Development (BBSR) on behalf of the Federal Office for Building and Regional Planning (BBR) with funds from the Zukunft Bau research program.