Conference paper

Applying Distilled BERT for Question Answering on ASRS Reports

Abstract: This paper applies the Bidirectional Encoder Representations from Transformers (BERT) language model, fine-tuned for the question-answering task, to the free-text reports of the Aviation Safety Reporting System (ASRS) dataset, which describe incident occurrences in an international aviation safety context. A four-step method is used to evaluate the produced results. This paper outlines the limitations of this approach, as well as its usefulness, when extracting information from thirty randomly selected free-text reports by asking the following question: “When did the incident happen?”. Rather than working directly with the structured part of the ASRS dataset, we aim to integrate an algorithm resulting from recent advances in Natural Language Processing (NLP) to leverage the information contained in natural-language narratives. We find that our approach yields interesting results, with roughly seventy percent correct answers, including answers containing information that does not overlap with the ASRS dataset’s metadata.
Keywords: NLP, aviation safety, BERT, ASRS
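The approach described in the abstract, extractive question answering with a distilled BERT model over a free-text incident narrative, can be sketched with the Hugging Face Transformers library. This is a minimal illustration, not the authors' exact setup: the SQuAD-fine-tuned checkpoint and the sample narrative below are assumptions for demonstration purposes.

```python
# Sketch: extractive QA over an ASRS-style narrative with a distilled BERT
# model. The checkpoint and narrative are illustrative assumptions, not the
# paper's actual configuration or data.
from transformers import pipeline

# DistilBERT fine-tuned on SQuAD for extractive question answering
qa = pipeline(
    "question-answering",
    model="distilbert-base-uncased-distilled-squad",
)

# A hypothetical free-text incident report in the style of ASRS narratives
narrative = (
    "During climb out at approximately 0715 local time, we received a "
    "TCAS resolution advisory and descended to avoid conflicting traffic."
)

# The question studied in the paper
result = qa(question="When did the incident happen?", context=narrative)

# result is a dict with 'answer', 'score', and character offsets; the
# answer is a span copied verbatim from the narrative
print(result["answer"], result["score"])
```

The model returns a span of the original text rather than generated words, which is why answers can contain information absent from the dataset's structured metadata fields.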

https://hal-enac.archives-ouvertes.fr/hal-03094753
Contributor: Laurence Porte
Submitted on: Wednesday, January 6, 2021 - 17:50:28
Last modified on: Wednesday, November 3, 2021 - 08:17:50

File

camera-ready.pdf
Files produced by the author(s)

Citation

Samuel Kierszbaum, Laurent Lapasset. Applying Distilled BERT for Question Answering on ASRS Reports. NTCA 2020 New Trends in Civil Aviation, Nov 2020, Prague, Czech Republic. pp.33-38, ⟨10.23919/ntca50409.2020.9291241⟩. ⟨hal-03094753⟩
