Deep Reinforcement Learning-Based Control for Real-Time Hybrid Simulation of Civil Structures
Real-time Hybrid Simulation (RTHS) is a cyber-physical technique that studies the dynamic behavior of a system by combining physical and numerical components that are coupled through a boundary condition enforcer. In structural engineering, the numerical components are subjected to environmental loads, and the resulting dynamic displacements are applied to the physical substructure through an actuator. However, the dynamics of the coupling between components and the complexities of the system lead to synchronization challenges that affect the accuracy of the simulation; thus, tracking controllers are required to ensure the fidelity of the simulation. This paper studies deep reinforcement learning (DRL) as a novel alternative for designing the tracking controller. Three controllers are designed: a DRL agent combined with a conventional time delay compensation, a conventional feedback controller combined with a DRL agent, and a DRL agent with a complex neural network architecture. The proposed approaches are tested using a virtual RTHS benchmark problem, and the results are compared with an optimized controller consisting of a proportional-integral-derivative controller and phase-lead compensation. The results show that DRL can address the synchronization challenges of RTHS with a model-free approach and simple neural network architectures. The work presented in this study is a critical step toward model-free control methodologies that can transform and further develop the RTHS method. The proposed methodology can be used to address important challenges related to RTHS, including nonlinearities and uncertainties of the physical substructure, complex boundary conditions, and computational efficiency when substructures with complex dynamics are present.
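To make the control setting concrete, the sketch below is a minimal, hypothetical illustration of the RTHS tracking problem described in the abstract: a toy single-degree-of-freedom numerical substructure generates a target displacement, a first-order lag stands in for the actuator dynamics, and a small neural-network policy (the kind a DRL agent would learn) outputs the actuator command that should synchronize the measured and target displacements. The substructure model, actuator lag, reward, and all names are illustrative assumptions, not the authors' benchmark or implementation.

```python
# Minimal sketch (assumed, not the authors' code): toy virtual RTHS tracking loop.
import numpy as np
import torch
import torch.nn as nn

class ToyRTHSEnv:
    """Hypothetical 1-DOF numerical substructure plus first-order actuator lag.
    Observation: [target displacement, measured displacement, tracking error]."""
    def __init__(self, dt=1e-3, wn=2 * np.pi, zeta=0.05, tau=0.01):
        self.dt, self.wn, self.zeta, self.tau = dt, wn, zeta, tau
        self.reset()

    def reset(self):
        self.x, self.v, self.y, self.t = 0.0, 0.0, 0.0, 0.0
        return self._obs()

    def _obs(self):
        return np.array([self.x, self.y, self.x - self.y], dtype=np.float32)

    def step(self, command):
        # Numerical substructure driven by a sinusoidal excitation.
        a = -2 * self.zeta * self.wn * self.v - self.wn**2 * self.x + np.sin(2 * np.pi * self.t)
        self.v += a * self.dt
        self.x += self.v * self.dt
        # Actuator modeled as a first-order lag on the commanded displacement.
        self.y += (command - self.y) / self.tau * self.dt
        self.t += self.dt
        err = self.x - self.y
        reward = -err**2  # penalize synchronization (tracking) error
        return self._obs(), reward, False, {}

class Actor(nn.Module):
    """Small policy network mapping the observation to an actuator command."""
    def __init__(self, obs_dim=3, act_dim=1, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, act_dim),
        )

    def forward(self, obs):
        return self.net(obs)

# Rollout with an untrained policy; in practice the actor would be trained with
# an off-policy DRL algorithm (e.g., DDPG or TD3) against the tracking reward.
env, actor = ToyRTHSEnv(), Actor()
obs = env.reset()
for _ in range(1000):
    with torch.no_grad():
        cmd = actor(torch.from_numpy(obs)).item()
    obs, reward, done, _ = env.step(cmd)
```

In this framing, the three controllers described above differ only in what the policy sees and outputs: the DRL agent can augment a conventional time-delay compensator, correct a conventional feedback controller, or replace the compensation entirely with a larger network.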
Metadata
| Work Title | Deep Reinforcement Learning-Based Control for Real-Time Hybrid Simulation of Civil Structures |
|---|---|
| Access | |
| Creators | |
| License | CC BY-NC 4.0 (Attribution-NonCommercial) |
| Work Type | Article |
| Publisher | |
| Publication Date | January 25, 2025 |
| Publisher Identifier (DOI) | |
| Deposited | June 20, 2025 |