Hello, I would like to use pre-trained word embeddings. Which code would I need to change? Thank you.
You need to change the code here https://github.com/atulkum/pointer_summarizer/blob/master/data_util/data.py#L35 and here https://github.com/atulkum/pointer_summarizer/blob/master/training_ptr_gen/model.py#L45
You might need something like:

word_emb_matrix = get_word_embd(vocab, config)   # (vocab_size, emb_dim) numpy matrix of pre-trained vectors
embd_vector = torch.from_numpy(word_emb_matrix).float()
self.word_embeds = nn.Embedding.from_pretrained(embd_vector, freeze=False)   # freeze=False keeps the embeddings trainable
This link might help https://github.com/atulkum/sequence_prediction/blob/e1659b6414ca951a8229e737ae032a9040bde81d/neural_ner/model_utils.py#L81
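For reference, a minimal sketch of what such a get_word_embd helper could look like, assuming GloVe-style text vectors on disk and a Vocab with size() and id2word() as in data.py; config.glove_path is an illustrative name, not an existing option in the repository's config:

import numpy as np

def get_word_embd(vocab, config):
    # Load "word v1 v2 ... vd" lines into a dict of numpy vectors.
    pretrained = {}
    with open(config.glove_path, encoding='utf-8') as f:
        for line in f:
            parts = line.rstrip().split(' ')
            if len(parts) == config.emb_dim + 1:
                pretrained[parts[0]] = np.asarray(parts[1:], dtype=np.float32)

    # Start from small random vectors, then overwrite the rows for
    # vocabulary words that have a pre-trained vector.
    word_emb_matrix = np.random.uniform(
        -0.1, 0.1, (vocab.size(), config.emb_dim)).astype(np.float32)
    for word_id in range(vocab.size()):
        word = vocab.id2word(word_id)
        if word in pretrained:
            word_emb_matrix[word_id] = pretrained[word]
    return word_emb_matrix

Words not found in the pre-trained file (including special tokens like [PAD] and [UNK]) simply keep their random initialization and are learned during training.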
Let me know how it goes.