Luma-Attention-based Chroma Intra Prediction for Versatile Video Coding

Bumyoon Kim, Yongseong Kim, Hyunki Jeong, Byeungwoo Jeon

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The remarkable advancements in machine learning are impacting most areas of research, and video coding is no exception. In this paper, we study a neural network-based chroma intra prediction technique that utilizes weighted coefficients obtained only from luma attention information. Experimental results show coding gains of 0.45%, 1.46%, and 1.28% for the Y, Cb, and Cr channels, respectively, compared to VVC Test Model (VTM) version 23.0. These results highlight the potential of predicting chroma solely from luma information as a novel approach to chroma intra prediction.
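To make the idea concrete, the sketch below illustrates one generic way attention over luma can produce weighted coefficients for chroma prediction: the block's co-located luma samples attend to reconstructed reference luma samples, and the resulting softmax weights combine the co-located reference chroma values. All names, shapes, and the scaled dot-product form are illustrative assumptions, not the architecture proposed in the paper.

```python
import numpy as np

def luma_attention_chroma_pred(luma_block, luma_ref, chroma_ref):
    """Illustrative sketch (not the paper's actual network): predict chroma
    samples of a block from luma attention only.

    luma_block : (N, d) co-located luma features at the N chroma positions
    luma_ref   : (M, d) luma features at M reconstructed reference positions
    chroma_ref : (M,)   reconstructed chroma values at those positions
    Returns    : (N,)   predicted chroma values
    """
    d = luma_block.shape[1]
    # Scaled dot-product scores between block luma and reference luma
    scores = luma_block @ luma_ref.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over references
    # Each chroma sample is an attention-weighted sum of reference chroma;
    # the weights are derived entirely from luma information.
    return weights @ chroma_ref
```

Because the softmax weights form a convex combination, each predicted chroma value lies within the range of the reference chroma samples; actual learned designs would add trainable projections and nonlinearities around this core.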

Original language: English
Title of host publication: 2025 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2025
Publisher: IEEE Computer Society
ISBN (Electronic): 9798331519988
DOIs
State: Published - 2025
Externally published: Yes
Event: 20th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2025 - Dublin, Ireland
Duration: 11 Jun 2025 - 13 Jun 2025

Publication series

Name: IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB
ISSN (Print): 2155-5044
ISSN (Electronic): 2155-5052

Conference

Conference: 20th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2025
Country/Territory: Ireland
City: Dublin
Period: 11/06/25 - 13/06/25

Keywords

  • Intra Chroma Prediction
  • Neural Network-based Video Coding
  • VVC
