BERT-Based Sentiment Analysis Using Distillation
| dc.contributor.author | Lehečka, Jan | |
| dc.contributor.author | Švec, Jan | |
| dc.contributor.author | Ircing, Pavel | |
| dc.contributor.author | Šmídl, Luboš | |
| dc.date.accessioned | 2021-03-01T11:00:24Z | |
| dc.date.available | 2021-03-01T11:00:24Z | |
| dc.date.issued | 2020 | |
| dc.description.abstract-translated | In this paper, we present our experiments with BERT (Bidirectional Encoder Representations from Transformers) models in the task of sentiment analysis, which aims to predict the sentiment polarity of a given text. We trained an ensemble of BERT models on a large self-collected dataset of movie reviews and distilled the knowledge into a single production model. Moreover, we propose an improved pooling layer architecture for BERT, which outperforms the standard classification layer while also enabling per-token sentiment predictions. We demonstrate our improvements on a publicly available dataset of Czech movie reviews. | en |
| dc.format | 13 pp. | en |
| dc.format.mimetype | application/pdf | |
| dc.identifier.citation | LEHEČKA, J., ŠVEC, J., IRCING, P., ŠMÍDL, L. BERT-Based Sentiment Analysis Using Distillation. In Statistical Language and Speech Processing, SLSP 2020. Cham: Springer, 2020. pp. 58-70. ISBN 978-3-030-59429-9, ISSN 0302-9743. | cs |
| dc.identifier.doi | 10.1007/978-3-030-59430-5_5 | |
| dc.identifier.isbn | 978-3-030-59429-9 | |
| dc.identifier.issn | 0302-9743 | |
| dc.identifier.obd | 43930643 | |
| dc.identifier.uri | 2-s2.0-85092196103 | |
| dc.identifier.uri | http://hdl.handle.net/11025/42765 | |
| dc.language.iso | en | en |
| dc.project.ID | TN01000024/Národní centrum kompetence - Kybernetika a umělá inteligence | cs |
| dc.publisher | Springer | en |
| dc.relation.ispartofseries | Statistical Language and Speech Processing, SLSP 2020 | en |
| dc.rights | The full text is accessible within the university to logged-in users. | en |
| dc.rights | © Springer | en |
| dc.rights.access | restrictedAccess | en |
| dc.subject.translated | Sentiment analysis | en |
| dc.subject.translated | BERT | en |
| dc.subject.translated | Knowledge distillation | en |
| dc.title | BERT-Based Sentiment Analysis Using Distillation | en |
| dc.type | conference paper | en |
| dc.type | conferenceObject | en |
| dc.type.status | Peer-reviewed | en |
| dc.type.version | publishedVersion | en |
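The abstract above describes distilling the knowledge of an ensemble of BERT models into a single production model. As a rough illustration of what such a distillation step can look like, the minimal PyTorch sketch below matches a student classifier against the averaged, temperature-softened predictions of a teacher ensemble. The function, the model call signatures, and the temperature value are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only: one knowledge-distillation training step for a
# sentiment classifier. "student" and each element of "teachers" are assumed
# to be callables that map (input_ids, attention_mask) to class logits; these
# interfaces and the temperature are hypothetical, not the authors' code.
import torch
import torch.nn.functional as F

def distillation_step(student, teachers, input_ids, attention_mask, temperature=2.0):
    with torch.no_grad():
        # Soft labels: average of the teachers' temperature-softened probabilities.
        teacher_probs = torch.stack([
            F.softmax(t(input_ids, attention_mask) / temperature, dim=-1)
            for t in teachers
        ]).mean(dim=0)

    # Student distribution at the same temperature.
    student_logits = student(input_ids, attention_mask)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between student and ensemble soft labels,
    # scaled by T^2 as is common in distillation setups.
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature ** 2
    return loss
```

In the same spirit, the improved pooling layer mentioned in the abstract scores individual tokens rather than relying only on a single sequence-level classification vector, which is what makes per-token sentiment predictions possible; the exact architecture is described in the paper and is not reproduced here.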