TY - JOUR
T1 - Enhanced gamma-ray spectrum transformation
T2 - NaI(Tl) scintillator to HPGe semiconductor via machine learning
AU - Saeidi, Zohreh
AU - Afarideh, Hossein
AU - Ghergherehchi, Mitra
N1 - Publisher Copyright:
© The Author(s) 2025.
PY - 2025/3
Y1 - 2025/3
N2 - Thallium-activated sodium iodide scintillation (NaI(Tl)) and high-purity germanium (HPGe) semiconductor detectors are two commonly employed gamma spectroscopy devices. NaI(Tl) detectors are preferred for their cost-effectiveness, efficiency, and ease of construction, while HPGe detectors offer superior resolution but face operating-temperature challenges and are expensive. This article investigates the application of machine learning algorithms, specifically K-Nearest Neighbors (KNN) and Multi-Channel Output Regression based on Support Vector Regression (MCO-SVR), to enhance the performance of NaI(Tl) detectors by transforming their gamma spectra into HPGe-equivalent spectra. The models were trained on datasets generated from a limited radioisotope library and demonstrated excellent performance across a diverse range of measured experimental test data. The evaluation covered various scenarios, including low-count spectra and background effects. The KNN model performed best, achieving an accuracy of 98.69% with a Manhattan distance metric. The MCO-SVR model, employing both direct and chained approaches, produced varied results across kernel types, with the polynomial kernel in the direct approach yielding 97.45% accuracy. Overall, the results indicate that machine learning algorithms can improve the performance of NaI(Tl) detectors and expand their applications in various areas of nuclear security.
AB - Thallium-activated sodium iodide scintillation (NaI(Tl)) and high-purity germanium (HPGe) semiconductor detectors are two commonly employed gamma spectroscopy devices. NaI(Tl) detectors are preferred for their cost-effectiveness, efficiency, and ease of construction, while HPGe detectors offer superior resolution but face operating-temperature challenges and are expensive. This article investigates the application of machine learning algorithms, specifically K-Nearest Neighbors (KNN) and Multi-Channel Output Regression based on Support Vector Regression (MCO-SVR), to enhance the performance of NaI(Tl) detectors by transforming their gamma spectra into HPGe-equivalent spectra. The models were trained on datasets generated from a limited radioisotope library and demonstrated excellent performance across a diverse range of measured experimental test data. The evaluation covered various scenarios, including low-count spectra and background effects. The KNN model performed best, achieving an accuracy of 98.69% with a Manhattan distance metric. The MCO-SVR model, employing both direct and chained approaches, produced varied results across kernel types, with the polynomial kernel in the direct approach yielding 97.45% accuracy. Overall, the results indicate that machine learning algorithms can improve the performance of NaI(Tl) detectors and expand their applications in various areas of nuclear security.
UR - https://www.scopus.com/pages/publications/85218264587
U2 - 10.1140/epjp/s13360-025-06048-y
DO - 10.1140/epjp/s13360-025-06048-y
M3 - Article
AN - SCOPUS:85218264587
SN - 2190-5444
VL - 140
JO - European Physical Journal Plus
JF - European Physical Journal Plus
IS - 2
M1 - 113
ER -