
Applying Distilled BERT for Question Answering on ASRS Reports

Abstract: This paper applies the Bidirectional Encoder Representations from Transformers (BERT) language model, fine-tuned for the question answering task, to the free-text reports of the Aviation Safety Reporting System (ASRS) dataset, which describe incident occurrences in an international aviation safety context. A four-step method is used to evaluate the results. The paper outlines the limitations of this approach, as well as its usefulness for extracting information from thirty randomly selected free-text reports when asking the question: “When did the incident happen?”. Our aim is to integrate one of the algorithms resulting from recent advances in Natural Language Processing (NLP) to leverage the information contained in natural-language narratives, as opposed to working directly with the structured part of the ASRS dataset. We find that our approach yields promising results, with roughly seventy percent of answers correct, including answers containing information that does not overlap with the ASRS dataset’s metadata.
Keywords: NLP, aviation safety, BERT, ASRS
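The setup described in the abstract can be sketched in a few lines: an extractive question-answering model built on DistilBERT reads an incident narrative and returns the text span that best answers the fixed question. This is our reconstruction, not the authors' code; the Hugging Face checkpoint name and the sample narrative below are assumptions for illustration.

```python
# Hedged sketch of the paper's approach: extractive QA over an ASRS-style
# free-text report. The checkpoint is a standard DistilBERT model fine-tuned
# on SQuAD (assumed here; the paper does not name its exact checkpoint).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# Hypothetical narrative in the style of an ASRS free-text report.
narrative = (
    "During climb-out at approximately 0230 local time, the crew received "
    "a cargo smoke warning and elected to return to the departure airport."
)

result = qa(question="When did the incident happen?", context=narrative)
print(result["answer"])  # an extracted span from the narrative
```

Because the model is extractive, the answer is always a span copied from the narrative itself, which is why the paper can compare the extracted answers against the structured metadata fields of the same reports.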
Contributor: Laurence Porte
Submitted on: Wednesday, January 6, 2021 - 5:50:28 PM
Last modification on: Tuesday, October 19, 2021 - 11:02:55 AM

Samuel Kierszbaum, Laurent Lapasset. Applying Distilled BERT for Question Answering on ASRS Reports. NTCA 2020 New Trends in Civil Aviation, Nov 2020, Prague, Czech Republic. pp.33-38, ⟨10.23919/ntca50409.2020.9291241⟩. ⟨hal-03094753⟩