Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task

Giuseppe Attardi; Valeriya Slovikovskaya
2020-01-01

Abstract

Transformer models, trained and publicly released over the last few years, have proved effective on many NLP tasks. We tested their usefulness specifically on the stance detection task. We performed experiments on the data from the Fake News Challenge Stage 1 (FNC-1), adding contextual representations from several transformer models to an MLP base classifier. We were indeed able to improve on the reported state of the art for the challenge by exploiting the generalization power of large language models based on the Transformer architecture. Specifically, (1) we improved the best-performing FNC-1 model by exploiting BERT sentence embeddings as features for the input sentences, and (2) we fine-tuned the BERT, XLNet, and RoBERTa transformers on the extended FNC-1 dataset and obtained state-of-the-art results on the FNC-1 task.
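
As a concrete illustration of approach (2), the sketch below casts FNC-1 stance detection as sentence-pair classification and fine-tunes a pretrained transformer with the HuggingFace transformers library. It is a minimal, hypothetical example: the model name, hyperparameters, and input texts are illustrative assumptions, not the configuration reported in the paper.

    # Minimal sketch (assumed setup, not the paper's exact configuration):
    # fine-tune a pretrained transformer for FNC-1 stance detection,
    # treating (headline, article body) as a sentence pair.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    STANCES = ["agree", "disagree", "discuss", "unrelated"]  # FNC-1 labels

    tokenizer = AutoTokenizer.from_pretrained("roberta-base")
    model = AutoModelForSequenceClassification.from_pretrained(
        "roberta-base", num_labels=len(STANCES))

    # Encode one headline/body pair; article bodies usually exceed the
    # 512-token limit, so truncation is required.
    headline = "Example headline"  # placeholder text
    body = "Example article body, typically much longer than the headline."
    inputs = tokenizer(headline, body, truncation=True, max_length=512,
                       return_tensors="pt")

    # One gradient step with cross-entropy over the four stance classes.
    label = torch.tensor([STANCES.index("unrelated")])
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    model.train()
    loss = model(**inputs, labels=label).loss
    loss.backward()
    optimizer.step()

In this formulation the headline and the article body are encoded jointly as one sequence pair, and the pretrained encoder is trained end to end with a four-way classification head over the FNC-1 stance labels.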

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1040216