
AUTHOR

Dan Tufiş


A Lite Romanian BERT: ALR-BERT

2022

Large-scale pre-trained language representations and their promising performance in various downstream applications have become an area of interest in natural language processing (NLP). There has been considerable interest in further increasing model size in order to surpass the best previously reported results. At some point, however, increasing the number of parameters may lead to saturation due to the limited memory capacity of GPUs/TPUs. In addition, such models are mostly available only in English or as part of a shared multilingual model. Hence, in this paper, we propose a lite BERT trained on a large corpus solely in the Romanian language, which we call ALR-BERT.
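The "lite" in ALR-BERT refers to the ALBERT recipe, which shrinks BERT mainly through factorized embedding parameterization and cross-layer parameter sharing. A minimal back-of-the-envelope sketch of why that helps, using illustrative BERT-base-like dimensions (these numbers are assumptions, not the actual ALR-BERT configuration):

```python
# Illustrative parameter counts: standard BERT vs. an ALBERT-style lite model.
# All sizes below are assumed BERT-base-like values, not ALR-BERT's real config.

V = 30_000   # vocabulary size (assumed)
H = 768      # transformer hidden size
E = 128      # reduced embedding size used by ALBERT's factorization
L = 12       # number of transformer layers

# Standard BERT ties the embedding size to the hidden size: V x H parameters.
bert_embed = V * H

# ALBERT factorizes the embedding into V x E plus an E x H projection.
albert_embed = V * E + E * H

# Rough per-layer count for one transformer block: four H x H attention
# projections plus a feed-forward of width 4H (biases/LayerNorm ignored).
per_layer = 4 * H * H + 2 * H * (4 * H)

# BERT keeps L distinct layers; ALBERT shares one layer's weights across all L.
bert_total = bert_embed + L * per_layer
albert_total = albert_embed + per_layer

print(f"BERT-like params:   {bert_total:,}")    # ~108M
print(f"ALBERT-like params: {albert_total:,}")  # ~11M
```

Under these assumptions the lite model carries roughly a tenth of the parameters, which is what makes training on a single-language corpus with modest hardware feasible.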

SUBJECTS

Human-Computer Interaction; Computer Networks and Communications; Computers; VDP::Technology: 500::Information and communication technology: 550

KEYWORDS

BERT; transformers; ALBERT; NLP; Romanian