
Seid Muhie Yimam edited this page Nov 14, 2021 · 2 revisions

Contextualized Semantic Models

In this section, we present the two contextual embedding models that are built on the Amharic corpus.

RoBERTa

The RoBERTa model is already available in the Hugging Face model repository.

You can download and use the model as follows:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("uhhlt/am-roberta")
model = AutoModelForMaskedLM.from_pretrained("uhhlt/am-roberta")
```
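Since the model was trained with a masked-language-modeling objective, one way to try it out is the `fill-mask` pipeline from `transformers`. A minimal sketch (the Amharic example sentence is illustrative; substitute your own text):

```python
from transformers import pipeline

# Build a fill-mask pipeline from the pretrained Amharic RoBERTa model
fill_mask = pipeline("fill-mask", model="uhhlt/am-roberta")

# RoBERTa-style tokenizers use <mask> as the mask token
mask = fill_mask.tokenizer.mask_token

# Predict the five most likely fillers for the masked position
# (illustrative sentence: "Addis Ababa is the capital <mask> of Ethiopia")
predictions = fill_mask(f"አዲስ አበባ የኢትዮጵያ ዋና {mask} ናት።", top_k=5)
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dictionary with the proposed token and its probability score.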

FLAIR
