Abstract
In a re-entrant job shop (RJS), an entity can visit the same resource type multiple times; this property, called re-entrancy, occurs frequently in real industries. Re-entrancy renders scheduling NP-hard, so production control in such shops is dominated by heuristics. The stochastic arrivals caused by re-entrancy require an appropriately designed dispatching rule. Reinforcement learning (RL) is an effective technique for establishing robust dispatching rules; however, only a few cases that coordinate RL-based production control with a digital twin (DT) have been reported. This study proposes a novel production control model that applies a DT and horizontal coordination to RL-based production control. The requirements for dispatching in the RJS and for coordination between RL and the DT were defined. A suitable architectural framework, service composition, and systematic logic-library schema were developed to exploit the advanced characteristics of the DT and improve existing production control methods. This study is an early case of coordinating RL and a DT, and the findings revealed that RL policy networks should be imported during DT creation procedures rather than being synchronised to the DT. The results should be a valuable reference for research on horizontal coordination in other types of RL-based production control.
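The core mechanism the abstract describes — stochastic arrivals, lots revisiting the same resource, and an RL agent selecting a dispatching rule — can be sketched in miniature. The following toy model is an illustrative assumption, not the paper's actual architecture: a single machine serves a queue of lots that each require several visits, and an epsilon-greedy bandit learns which of three classic dispatching rules (FIFO, shortest processing time, least remaining visits) minimises accumulated waiting.

```python
import random

# Toy sketch (all names, dynamics, and parameters are illustrative assumptions,
# not the paper's model). Lots are [visits_left, proc_time]; re-entrant lots
# rejoin the queue after service, so the same resource is visited repeatedly.

RULES = {
    "FIFO": lambda q: 0,                                          # oldest lot first
    "SPT": lambda q: min(range(len(q)), key=lambda i: q[i][1]),   # shortest proc. time
    "LRV": lambda q: min(range(len(q)), key=lambda i: q[i][0]),   # least remaining visits
}

def simulate(rule, rng, horizon=200):
    """Serve one lot per step under `rule`; return negative total waiting."""
    queue, waiting = [], 0
    for _ in range(horizon):
        if rng.random() < 0.7:  # stochastic arrival; re-entrancy inflates the load
            queue.append([rng.randint(1, 3), rng.randint(1, 5)])
        if queue:
            lot = queue.pop(RULES[rule](queue))
            lot[0] -= 1
            if lot[0] > 0:
                queue.append(lot)  # re-entrancy: the lot revisits this machine
        waiting += len(queue)
    return -waiting  # higher (less negative) = less waiting

def train(episodes=300, eps=0.1, alpha=0.1, seed=42):
    """Epsilon-greedy bandit over dispatching rules; returns learned values."""
    rng = random.Random(seed)
    q = {r: 0.0 for r in RULES}
    for _ in range(episodes):
        rule = (rng.choice(list(RULES)) if rng.random() < eps
                else max(q, key=q.get))
        q[rule] += alpha * (simulate(rule, rng) - q[rule])
    return q
```

In the paper's setting, a trained policy network would replace this tabular value table and, per the finding above, be imported into the DT during its creation procedure rather than synchronised at runtime; the sketch only illustrates the dispatching-rule learning problem itself.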
| Original language | English |
|---|---|
| Pages (from-to) | 2151-2167 |
| Number of pages | 17 |
| Journal | International Journal of Production Research |
| Volume | 60 |
| Issue number | 7 |
| DOIs | |
| State | Published - 2022 |
Keywords
- Asset administration shell
- digital twin
- production control
- re-entrant job shop
- reinforcement learning