Multilingual Question Answering for Malaysia History with Transformer-based Language Model

Qi Zhi Lim, Chin Poo Lee, Kian Ming Lim, Jing Xiang Ng, Eric Khang Heng Ooi, Nicole Kai Ning Loh

Research output: Journal Publication › Article › peer-review


In natural language processing (NLP), a Question Answering System (QAS) is a system or model designed to understand and respond to user queries in natural language. Recent advancements in QAS reveal a paradigm shift in methods, from traditional machine learning and deep learning approaches towards transformer-based language models. While significant progress has been made, the utilization of these models for historical QAS and the development of a QAS for the Malay language remain largely unexplored. This research aims to bridge these gaps by developing a multilingual QAS for the history of Malaysia using a transformer-based language model. The system development process encompasses several stages: data collection, knowledge representation, data loading and pre-processing, document indexing and storing, and the establishment of a querying pipeline with a retriever and a reader. A dataset comprising 100 articles, including web blogs related to the history of Malaysia, was constructed to serve as the knowledge base for the proposed QAS. A significant aspect of this research is the use of the dataset translated into English rather than the raw dataset in Malay; this decision was made to leverage the effectiveness of well-established retriever and reader models trained on English data. Moreover, an evaluation dataset comprising 100 question-answer pairs was created to evaluate the performance of the models. A comparative analysis of six transformer-based language models, namely DeBERTaV3, BERT, ALBERT, ELECTRA, MiniLM, and RoBERTa, was conducted, and their effectiveness was examined through a series of experiments to determine the best reader model for the proposed QAS. The experimental results reveal that the proposed QAS achieved the best performance when employing RoBERTa as the reader model.
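The retriever–reader split described above can be sketched in miniature. The toy term-overlap scoring below merely stands in for the actual indexing, retriever, and transformer reader models (such as RoBERTa) evaluated in the paper; the function names and sample documents are illustrative only, not taken from the authors' system.

```python
# Minimal sketch of a retriever-reader querying pipeline.
# The term-overlap scoring is a stand-in for real retriever and
# transformer reader models; documents and names are illustrative.

def retrieve(query, documents, top_k=2):
    """Rank documents by term overlap with the query (stand-in for a real retriever)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def read(query, passages):
    """Pick the best-matching sentence as the answer (stand-in for a transformer reader)."""
    q_terms = set(query.lower().split())
    sentences = [s.strip() for p in passages for s in p.split(".") if s.strip()]
    return max(sentences, key=lambda s: len(q_terms & set(s.lower().split())))

# Illustrative two-document knowledge base.
knowledge_base = [
    "Malaysia gained independence in 1957. Tunku Abdul Rahman was the first Prime Minister.",
    "Malacca was founded by Parameswara. It became a major trading port.",
]
question = "Who was the first Prime Minister of Malaysia"
passages = retrieve(question, knowledge_base)
print(read(question, passages))
```

In a real pipeline the retriever narrows the indexed knowledge base to a handful of candidate passages, and the reader then extracts the answer span from those passages, which is the division of labor the paper's experiments vary on the reader side.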
Finally, the proposed QAS was deployed on Discord and equipped with multilingual support through the incorporation of language detection and translation modules, enabling it to handle queries in both Malay and English.
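The multilingual routing step can be illustrated as follows: detect the query language, translate Malay queries into English before they reach the English-language pipeline, and return the answer. The stopword heuristic and the dictionary "translator" below are toy stand-ins for the real language detection and translation modules; all names and word lists are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch of multilingual query routing.
# Stopword detection and the word-lookup "translation" are toy
# stand-ins for real language-detection and MT modules.

MALAY_HINTS = {"siapa", "apakah", "bila", "mengapa", "yang", "dan"}

def detect_language(text):
    """Classify a query as 'ms' (Malay) or 'en' from a few frequent Malay words."""
    words = set(text.lower().split())
    return "ms" if words & MALAY_HINTS else "en"

def translate_ms_to_en(text):
    """Toy word-level translation; a real system would call an MT model."""
    lookup = {"siapa": "who", "perdana": "prime", "menteri": "minister", "pertama": "first"}
    return " ".join(lookup.get(w, w) for w in text.lower().split())

def handle_query(query, answer_fn):
    """Route a query through detection and translation before answering."""
    lang = detect_language(query)
    english_query = translate_ms_to_en(query) if lang == "ms" else query
    return lang, answer_fn(english_query)

lang, answer = handle_query("Siapa perdana menteri pertama", lambda q: f"answered: {q}")
print(lang, "->", answer)
```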

Original language: English
Pages (from-to): 675-686
Number of pages: 12
Journal: Emerging Science Journal
Issue number: 2
Publication status: Published - Apr 2024
Externally published: Yes


Keywords

  • BERT
  • DeBERTaV3
  • Historical Knowledge
  • MiniLM
  • Natural Language Processing
  • Question Answering
  • RoBERTa

ASJC Scopus subject areas

  • General


