
2024 | Original Paper | Book Chapter

Path-Aware Cross-Attention Network for Question Answering

Authors: Ziye Luo, Ying Xiong, Buzhou Tang

Published in: Advances in Knowledge Discovery and Data Mining

Publisher: Springer Nature Singapore


Abstract

Reasoning is an essential ability in QA systems, and integrating this ability into QA systems has been the subject of considerable research. A prevalent strategy incorporates domain knowledge graphs via Graph Neural Networks (GNNs) to augment the performance of pre-trained language models. However, this approach primarily focuses on individual nodes and fails to fully leverage the extensive relational information present within the graph. In this paper, we present a novel model called the Path-Aware Cross-Attention Network (PCN), which incorporates meta-paths containing relational information into the model. PCN features a multi-layered, bidirectional cross-attention mechanism that facilitates information exchange between the textual representation and the path representation at each layer. By injecting rich inference information into the language model and contextual semantic information into the path representation, this mechanism enhances the overall effectiveness of the model. Furthermore, we incorporate a self-learning mechanism for path scoring, enabling weighted evaluation of paths. The performance of our model is assessed on three benchmark datasets covering commonsense question answering (CommonsenseQA, OpenBookQA) and medical question answering (MedQA-USMLE). The experimental results validate the efficacy of the proposed model.
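The abstract does not give implementation details, so the following is a minimal PyTorch sketch of what one bidirectional cross-attention layer with self-learned path scoring could look like. The class names (BiCrossAttentionLayer, PathScorer), tensor shapes, head count, and residual/normalization choices are all assumptions for illustration, not the authors' implementation.

```python
# A minimal sketch of one bidirectional cross-attention layer plus a
# self-learned path scorer, assuming hypothetical shapes and hyperparameters.
import torch
import torch.nn as nn


class BiCrossAttentionLayer(nn.Module):
    """One layer exchanging information between text and path representations.

    text:  (batch, n_tokens, d) -- contextual token embeddings from the LM
    paths: (batch, n_paths, d)  -- embeddings of meta-paths from the KG
    """

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        # Text attends to paths: injects relational (inference) information.
        self.text_to_path = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Paths attend to text: injects contextual semantic information.
        self.path_to_text = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_text = nn.LayerNorm(d_model)
        self.norm_path = nn.LayerNorm(d_model)

    def forward(self, text: torch.Tensor, paths: torch.Tensor):
        # Queries come from one stream, keys/values from the other.
        text_upd, _ = self.text_to_path(text, paths, paths)
        path_upd, _ = self.path_to_text(paths, text, text)
        # Residual connections preserve each stream's original content.
        return self.norm_text(text + text_upd), self.norm_path(paths + path_upd)


class PathScorer(nn.Module):
    """Self-learned scalar score per path, used to weight paths before pooling."""

    def __init__(self, d_model: int):
        super().__init__()
        self.score = nn.Linear(d_model, 1)

    def forward(self, paths: torch.Tensor) -> torch.Tensor:
        # (batch, n_paths, d) -> (batch, n_paths) softmax weights.
        weights = torch.softmax(self.score(paths).squeeze(-1), dim=-1)
        # Weighted sum pools the paths into one graph-side vector per example.
        return torch.einsum("bp,bpd->bd", weights, paths)


# Example usage with hypothetical dimensions:
layer = BiCrossAttentionLayer(d_model=768)
text, paths = layer(torch.randn(2, 64, 768), torch.randn(2, 16, 768))
graph_vec = PathScorer(768)(paths)  # (2, 768)
```

In the paper's described design, several such layers would be stacked so that the two representations are fused repeatedly; how the final text and path vectors are combined for answer scoring is not specified in the abstract and is omitted here.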


Metadata
Title
Path-Aware Cross-Attention Network for Question Answering
Authors
Ziye Luo
Ying Xiong
Buzhou Tang
Copyright Year
2024
Publisher
Springer Nature Singapore
DOI
https://doi.org/10.1007/978-981-97-2253-2_9
