Score distillation for anomaly detection

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Recently, significant performance improvements have been achieved in deep learning-based anomaly detection methods by introducing large neural network architectures and complex anomaly scoring functions. However, the computational cost and memory usage required in the inference phase have also increased significantly, thereby limiting their use in real-time applications. In this paper, we propose a score distillation method that adopts the concept of knowledge distillation. An existing high-performance anomaly detection method is used as the teacher. A small neural network is then trained as the student to mimic the scoring function of the teacher. In the inference phase, the anomaly score for a query instance is obtained by a single forward pass through the student network, without any further complex computation. We demonstrate that the proposed method makes anomaly detection faster and more efficient while maintaining high performance.
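The idea in the abstract can be sketched in a few lines. This is not the paper's implementation; it is a minimal illustration under stand-in assumptions: the "teacher" here is a k-nearest-neighbor distance score (substituting for the expensive detector), and the "student" is a tiny one-hidden-layer network trained by plain gradient descent to regress the teacher's scores, so that inference reduces to a single forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "teacher": an expensive anomaly scorer. Here we use the mean
# distance to the 5 nearest training points, which costs O(N) per query.
X_train = rng.normal(size=(500, 2))

def teacher_score(q):
    d = np.linalg.norm(X_train - q, axis=1)
    return np.sort(d)[:5].mean()

# Score distillation: label a set of query points with teacher scores,
# then fit a small student network to mimic the scoring function.
X_distill = rng.normal(size=(2000, 2)) * 2.0
y = np.array([teacher_score(q) for q in X_distill])

# Tiny one-hidden-layer student, trained with gradient descent on MSE.
W1 = rng.normal(scale=0.5, size=(2, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)
lr = 0.01
for _ in range(2000):
    H = np.tanh(X_distill @ W1 + b1)                  # hidden activations
    pred = (H @ W2 + b2).ravel()
    err = pred - y
    gW2 = H.T @ err[:, None] / len(y)                 # MSE gradients
    gb2 = err.mean(keepdims=True)
    gH = err[:, None] @ W2.T * (1 - H**2)             # backprop through tanh
    gW1 = X_distill.T @ gH / len(y)
    gb1 = gH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def student_score(q):
    # Inference is a single forward pass — no neighbor search needed.
    return (np.tanh(q @ W1 + b1) @ W2 + b2).item()
```

After distillation, the student should preserve the teacher's ordering of queries (e.g., ranking a point far from the training data above one near its center) while replacing the per-query neighbor search with one small matrix multiply, which is the speed/memory trade the abstract describes.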

Original language: English
Article number: 111842
Journal: Knowledge-Based Systems
Volume: 295
DOIs
State: Published - 1 Jul 2024

Keywords

  • Anomaly detection
  • Knowledge distillation
  • Score distillation
  • Unsupervised learning
