TY - GEN
T1 - QSG Transformer: Transformer with Query-Attentive Semantic Graph for Query-Focused Summarization
T2 - 45th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2022
AU - Park, Choongwon
AU - Ko, Youngjoong
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/7/7
Y1 - 2022/7/7
AB - Query-Focused Summarization (QFS) aims to extract essential information from a long document and organize it into a summary that answers a given query. Transformer-based summarization models have recently been widely adopted for QFS. However, the plain Transformer architecture cannot directly exploit the relationships between distant words or the information provided by the query. In this study, we propose the QSG Transformer, a novel QFS model that leverages structural information from a Query-attentive Semantic Graph (QSG) to address these issues. Specifically, in the QSG Transformer, the QSG node representations are improved by a proposed query-attentive graph attention network, which spreads the information of the query node over the QSG using Personalized PageRank; these representations are then used to generate a summary that better reflects the relationship between the query and the document. The proposed method is evaluated on two QFS datasets and outperforms state-of-the-art models.
KW - graph neural networks
KW - graph-based method
KW - query-focused summarization
UR - https://www.scopus.com/pages/publications/85135036186
U2 - 10.1145/3477495.3531901
DO - 10.1145/3477495.3531901
M3 - Conference contribution
AN - SCOPUS:85135036186
T3 - SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
SP - 2589
EP - 2594
BT - SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
PB - Association for Computing Machinery, Inc
Y2 - 11 July 2022 through 15 July 2022
ER -