India | Computer Science and Information Technology | Volume 15 Issue 1, January 2026 | Pages: 143 - 148
Natural Language Processing (NLP): A Comparative Study on Applications of GNNs for the Tasks Like Text Classification, Semantic Role Labelling, and Abstract Meaning Representation Parsing
Abstract: The dominant paradigm in Natural Language Processing (NLP) has traditionally relied on sequential modelling, often overlooking the rich, non-linear structural dependencies inherent in linguistic data. Graph Neural Networks (GNNs) offer a powerful alternative by modelling text as structured graphs, yet a systematic evaluation of their comparative efficacy across tasks of varying structural complexity remains under-explored. This paper presents a comprehensive comparative study of three prominent GNN architectures, namely Graph Convolutional Networks (GCN), Graph Attention Networks (GAT), and GraphSAGE, applied to three distinct NLP tasks: Text Classification, Semantic Role Labelling (SRL), and Abstract Meaning Representation (AMR) Parsing. We construct task-specific graph topologies, utilizing heterogeneous corpus graphs for document classification and syntactic dependency trees for SRL and AMR. Our empirical results, derived from benchmarks on the 20 Newsgroups, CoNLL-2009, and AMR 2.0 datasets, reveal a clear correlation between architectural properties and task complexity. While GCNs provide a robust and efficient baseline for global text classification (Accuracy: 96.6%), they struggle with the fine-grained structural modelling required for semantic tasks. Conversely, GATs demonstrate superior performance on syntax-heavy tasks, achieving a Labelled F1 score of 85.8% on SRL and a Smatch score of 74.5 on AMR parsing, significantly outperforming the GCN baseline (82.5% and 71.2%, respectively). These findings suggest that the anisotropic aggregation capability of attention mechanisms is critical for capturing the nuanced, long-range dependencies in natural language, establishing GAT as the preferred architecture for structure-dependent NLP applications.
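As context for the architectures compared in the abstract, the following is a minimal sketch (not drawn from the paper, and with illustrative layer sizes and a toy graph as assumptions) of how GCN, GAT, and GraphSAGE layers can be applied to the same dependency-style token graph using PyTorch Geometric:

```python
# Minimal sketch: contrasting the three GNN layer types compared in the study.
# The graph, feature dimensions, and hyperparameters are illustrative assumptions,
# not the authors' experimental configuration.
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, GATConv, SAGEConv

# Toy dependency-style graph: 4 token nodes, edges follow head -> dependent arcs.
x = torch.randn(4, 16)                       # node features (e.g., word embeddings)
edge_index = torch.tensor([[0, 0, 1, 2],     # source nodes
                           [1, 2, 3, 3]])    # target nodes
data = Data(x=x, edge_index=edge_index)

gcn = GCNConv(16, 8)                          # isotropic: neighbours weighted by degree normalisation
gat = GATConv(16, 8, heads=2, concat=False)   # anisotropic: learned attention weights per edge
sage = SAGEConv(16, 8)                        # neighbourhood sampling and aggregation

for name, layer in [("GCN", gcn), ("GAT", gat), ("GraphSAGE", sage)]:
    out = layer(data.x, data.edge_index)
    print(name, out.shape)                    # each layer yields one 8-dim vector per node
```

The contrast shown here mirrors the paper's central distinction: GCN aggregates neighbours with fixed, degree-based weights, whereas GAT learns per-edge attention weights, which the study identifies as the key property for structure-dependent tasks such as SRL and AMR parsing.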
Keywords: Natural Language Processing (NLP), Graph Neural Networks (GNNs), Deep Learning, Graph Convolutional Networks (GCNs), Graph Attention Networks (GATs), GraphSAGE, Text Classification, Semantic Role Labelling (SRL), Abstract Meaning Representation (AMR) Parsing
How to Cite?: Pathan Ashhad Zakirkhan, Sandhya Kaprawan, "Natural Language Processing (NLP): A Comparative Study on Applications of GNNs for the Tasks Like Text Classification, Semantic Role Labelling, and Abstract Meaning Representation Parsing", Volume 15 Issue 1, January 2026, International Journal of Science and Research (IJSR), Pages: 143-148, https://www.ijsr.net/getabstract.php?paperid=MR251231122151, DOI: https://dx.doi.org/10.21275/MR251231122151