    Transformer-based models for answer extraction in text-based question/answering

View/Open: Ahmadi_Najafabadi_Marzieh.pdf (11.58 MB)
Date: 2023-04-01
Author: Ahmadi Najafabadi, Marzieh
    Abstract
The success of transformer-based language models has led to a surge of research across natural language processing tasks, among which extractive question answering (answer span detection) has received considerable attention in recent years. To date, however, no comprehensive study has compared the performance of different transformer-based language models on the question-answering (QA) task. Furthermore, while these models capture significant semantic and syntactic knowledge of natural language, their potential for improved QA performance through the incorporation of linguistic features remains unexplored. In this study, we compare the efficacy of multiple transformer-based models on QA, including their performance on particular question types. Moreover, we investigate whether augmenting these models with a set of linguistic features extracted from the question and the context passage can enhance their QA performance. In particular, we examine several feature-augmented transformer-based architectures to explore the impact of these linguistic features on a range of transformer-based language models, and we conduct an ablation study to analyze the individual effect of each feature. Through extensive experiments on two question-answering datasets (SQuAD and NLQuAD), we show that the proposed framework improves the performance of transformer-based models.
URI: https://hdl.handle.net/10155/1598
    Collections
    • Electronic Theses and Dissertations [1369]
    • Master Theses & Projects [302]
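
As context for the abstract above, here is a minimal sketch of extractive question answering (answer span detection) with an off-the-shelf transformer model, in Python. This is not the thesis's feature-augmented architecture; it uses the public Hugging Face transformers question-answering pipeline, and the checkpoint named below is an illustrative SQuAD-fine-tuned baseline, not necessarily one the author evaluated.

    # Minimal extractive-QA sketch (assumes the Hugging Face `transformers`
    # library is installed). The checkpoint is a common SQuAD-fine-tuned
    # baseline chosen for illustration only.
    from transformers import pipeline

    qa = pipeline("question-answering",
                  model="distilbert-base-cased-distilled-squad")

    context = (
        "Transformer-based language models capture semantic and syntactic "
        "knowledge that can be exploited for answer span detection: given a "
        "question and a context passage, the model predicts the start and "
        "end positions of the answer span within the passage."
    )
    question = "What does an extractive QA model predict?"

    # The pipeline returns the extracted span text, a confidence score, and
    # the character offsets of the span within the context.
    result = qa(question=question, context=context)
    print(result["answer"], round(result["score"], 3))

In the extractive setting sketched here, the model scores every token as a potential span start and end; feature augmentation of the kind studied in the thesis would supply additional linguistic signals (extracted from the question and passage) alongside the token representations.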
