From hand sketches to daylight performance: a mixed-input neural prediction framework

Thanh Luan Le, Hee Gun Chong, Binh M. Le, H. Nguyen-Xuan, Sung Ah Kim

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Deep learning can accelerate daylight analysis, but existing methods require multiple tools and complex coding. This paper proposes a streamlined framework enabling daylight predictions from architectural hand sketches with real-time 3D visualization. The method is implemented through three main modules: (1) hand-sketch recognition and conversion, (2) a mixed-input neural network (MINN), and (3) a mixed-input pix2pix (MIpix2pix). The three modules were integrated into a concept application, allowing comprehensive daylight prediction from a hand-sketched floor plan. Training data were generated using Rhino, Grasshopper, PlanFinder, Ladybug, and Honeybee. The MINN achieved a coefficient of determination above 0.92 for spatial daylight autonomy and 0.959 for annual sunlight exposure. The MIpix2pix model generates useful daylight illuminance images with SSIM values exceeding 0.93, closely aligning with simulations. This high-accuracy, fully integrated approach streamlines daylight analysis from concept to evaluation. By simplifying AI-based predictions, the framework offers a practical, efficient alternative to existing workflows.

Original language: English
Journal: Journal of Building Performance Simulation
DOIs
State: Accepted/In press - 2025

Keywords

  • Artificial Intelligence (AI)
  • architectural hand sketches
  • augmented reality (AR)
  • daylight predictions
  • mixed-input neural network
  • mixed-input pix2pix
