TY - JOUR
T1 - Neural radiance fields for construction site scene representation and progress evaluation with BIM
AU - Jeon, Yuntae
AU - Tran, Dai Quoc
AU - Vo, Khoa Tran Dang
AU - Jeon, Jaehyun
AU - Park, Minsoo
AU - Park, Seunghee
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/4
Y1 - 2025/4
N2 - Efficient progress monitoring is crucial for construction project management to ensure adherence to project timelines and cost control. Traditional methods, which rely on either 3D point cloud data or 2D image transformations, face challenges such as data sparsity in point clouds and the need for extensive human labeling. Recent NeRF-based methods offer high-quality image rendering for accurate evaluation, but challenges remain in comparing as-built scenes with as-planned designs or measuring actual dimensions. To address these limitations, this paper proposes a NeRF-based scene understanding approach synchronized with BIM. Additionally, a formalized progress evaluation method and the automatic generation of ground truth masks for comparison using BIM on NVIDIA Omniverse are introduced. This approach enables precise progress evaluation using smartphone-captured video, enhancing its applicability and generalizability. Experiments conducted on three different scenes from the concrete pouring process demonstrate that our method achieves a measurement error range of 1% to 2.2% and 8.7 mAE for element-wise segmentation performance in completed scenes. Furthermore, it achieves 5.7 mAE for progress tracking performance in ongoing process scenes. Overall, these findings are significant for improving vision-based progress monitoring and efficiency on construction sites.
AB - Efficient progress monitoring is crucial for construction project management to ensure adherence to project timelines and cost control. Traditional methods, which rely on either 3D point cloud data or 2D image transformations, face challenges such as data sparsity in point clouds and the need for extensive human labeling. Recent NeRF-based methods offer high-quality image rendering for accurate evaluation, but challenges remain in comparing as-built scenes with as-planned designs or measuring actual dimensions. To address these limitations, this paper proposes a NeRF-based scene understanding approach synchronized with BIM. Additionally, a formalized progress evaluation method and the automatic generation of ground truth masks for comparison using BIM on NVIDIA Omniverse are introduced. This approach enables precise progress evaluation using smartphone-captured video, enhancing its applicability and generalizability. Experiments conducted on three different scenes from the concrete pouring process demonstrate that our method achieves a measurement error range of 1% to 2.2% and 8.7 mAE for element-wise segmentation performance in completed scenes. Furthermore, it achieves 5.7 mAE for progress tracking performance in ongoing process scenes. Overall, these findings are significant for improving vision-based progress monitoring and efficiency on construction sites.
KW - Building information modeling (BIM)
KW - Computer vision
KW - Construction progress evaluation
KW - Neural radiance field (NeRF)
KW - Neural rendering
KW - NVIDIA Omniverse
KW - Segment anything model (SAM)
UR - https://www.scopus.com/pages/publications/85217029142
U2 - 10.1016/j.autcon.2025.106013
DO - 10.1016/j.autcon.2025.106013
M3 - Article
AN - SCOPUS:85217029142
SN - 0926-5805
VL - 172
JO - Automation in Construction
JF - Automation in Construction
M1 - 106013
ER -