Abstract
Knowledge Graph Question Answering (KGQA) systems allow users to interact with knowledge graphs using natural language queries, which are translated into structured database queries such as Cypher. Existing KGQA approaches often rely on large language models, leading to high computational costs and slow inference that impede real-time applications. To address these challenges, DistilCypherGPT is introduced as an efficient KGQA framework that employs knowledge distillation in a teacher-student architecture, optimized for Cypher query generation on academic knowledge graphs. DistilCypherGPT significantly reduces computational demands, enabling deployment in resource-constrained environments while retaining high accuracy. Experimental results show that DistilCypherGPT maintains 99.51% accuracy while achieving a 23% reduction in model size and a 30% improvement in inference speed compared to the baseline. These findings underscore DistilCypherGPT’s potential as a scalable, high-performance solution for KGQA, advancing efficient, real-time query translation with minimal computational overhead.
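To make the task concrete, the sketch below illustrates what "translating a natural language question into Cypher" means for an academic knowledge graph. It is a toy template-based translator, not the paper's distilled model; the node labels, relationship type, and question patterns (`Author`, `Paper`, `WROTE`) are hypothetical examples chosen for illustration.

```python
import re

# Illustrative question-pattern -> Cypher-template pairs for a
# hypothetical academic knowledge graph schema (Author)-[:WROTE]->(Paper).
# "$X" marks where the extracted entity is substituted.
TEMPLATES = [
    (re.compile(r"Who wrote (.+)\?", re.IGNORECASE),
     'MATCH (a:Author)-[:WROTE]->(p:Paper {title: "$X"}) RETURN a.name'),
    (re.compile(r"Which papers did (.+) write\?", re.IGNORECASE),
     'MATCH (a:Author {name: "$X"})-[:WROTE]->(p:Paper) RETURN p.title'),
]

def to_cypher(question: str) -> str:
    """Translate a natural-language question into a Cypher query string."""
    for pattern, template in TEMPLATES:
        match = pattern.match(question)
        if match:
            # Substitute the captured entity into the Cypher template.
            return template.replace("$X", match.group(1))
    raise ValueError(f"No template matches: {question!r}")

query = to_cypher("Who wrote Attention Is All You Need?")
print(query)
```

A KGQA model such as the one described here replaces the hand-written templates with a learned sequence-to-sequence mapping, but the input/output contract (question in, executable Cypher out) is the same.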
| Original language | English |
|---|---|
| Article number | 74 |
| Journal | Data Mining and Knowledge Discovery |
| Volume | 39 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Nov 2025 |
Free Keywords
- Generative pretrained transformers
- Knowledge distillation
- Knowledge graph
- Language model
- Question answering
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computer Networks and Communications