Problem in the model load part when running interactive_shell_NER.ipynb #32

Open
wwjhang opened this issue Apr 12, 2022 · 0 comments

Comments


wwjhang commented Apr 12, 2022

When I run main(), it fails with RuntimeError: Input, output and indices must be on the current device and NER does not run. Is there something I need to fix? I installed the dependencies exactly as listed in requirements.txt.
Here is the full error:
```
RuntimeError Traceback (most recent call last)
Input In [5], in <cell line: 1>()
----> 1 main()

Input In [3], in main()
45 list_of_input_ids = tokenizer.list_of_string_to_list_of_cls_sep_token_ids([input_text])
46 x_input = torch.tensor(list_of_input_ids).long()
---> 47 list_of_pred_ids = model(x_input)
49 list_of_ner_word, decoding_ner_sentence = decoder_from_res(list_of_input_ids=list_of_input_ids, list_of_pred_ids=list_of_pred_ids)
50 print("output>", decoding_ner_sentence)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/modules/module.py:889, in Module._call_impl(self, *input, **kwargs)
887 result = self._slow_forward(*input, **kwargs)
888 else:
--> 889 result = self.forward(*input, **kwargs)
890 for hook in itertools.chain(
891 _global_forward_hooks.values(),
892 self._forward_hooks.values()):
893 hook_result = hook(self, input, result)

File /electra/ner_tagging/pytorch-bert-crf-ner/model/net.py:41, in KobertCRF.forward(self, input_ids, token_type_ids, tags)
38 attention_mask = input_ids.ne(self.vocab.token_to_idx[self.vocab.padding_token]).float()
40 # outputs: (last_encoder_layer, pooled_output, attention_weight)
---> 41 outputs = self.bert(input_ids=input_ids,
42 token_type_ids=token_type_ids,
43 attention_mask=attention_mask)
44 last_encoder_layer = outputs[0]
45 last_encoder_layer = self.dropout(last_encoder_layer)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/modules/module.py:889, in Module._call_impl(self, *input, **kwargs)
887 result = self._slow_forward(*input, **kwargs)
888 else:
--> 889 result = self.forward(*input, **kwargs)
890 for hook in itertools.chain(
891 _global_forward_hooks.values(),
892 self._forward_hooks.values()):
893 hook_result = hook(self, input, result)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/pytorch_pretrained_bert/modeling.py:730, in BertModel.forward(self, input_ids, token_type_ids, attention_mask, output_all_encoded_layers)
727 extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype) # fp16 compatibility
728 extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
--> 730 embedding_output = self.embeddings(input_ids, token_type_ids)
731 encoded_layers = self.encoder(embedding_output,
732 extended_attention_mask,
733 output_all_encoded_layers=output_all_encoded_layers)
734 sequence_output = encoded_layers[-1]

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/modules/module.py:889, in Module._call_impl(self, *input, **kwargs)
887 result = self._slow_forward(*input, **kwargs)
888 else:
--> 889 result = self.forward(*input, **kwargs)
890 for hook in itertools.chain(
891 _global_forward_hooks.values(),
892 self._forward_hooks.values()):
893 hook_result = hook(self, input, result)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/pytorch_pretrained_bert/modeling.py:267, in BertEmbeddings.forward(self, input_ids, token_type_ids)
264 if token_type_ids is None:
265 token_type_ids = torch.zeros_like(input_ids)
--> 267 words_embeddings = self.word_embeddings(input_ids)
268 position_embeddings = self.position_embeddings(position_ids)
269 token_type_embeddings = self.token_type_embeddings(token_type_ids)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/modules/module.py:889, in Module._call_impl(self, *input, **kwargs)
887 result = self._slow_forward(*input, **kwargs)
888 else:
--> 889 result = self.forward(*input, **kwargs)
890 for hook in itertools.chain(
891 _global_forward_hooks.values(),
892 self._forward_hooks.values()):
893 hook_result = hook(self, input, result)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/modules/sparse.py:156, in Embedding.forward(self, input)
155 def forward(self, input: Tensor) -> Tensor:
--> 156 return F.embedding(
157 input, self.weight, self.padding_idx, self.max_norm,
158 self.norm_type, self.scale_grad_by_freq, self.sparse)

File /electra/ner_tagging/.venv/lib/python3.8/site-packages/torch/nn/functional.py:1916, in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
1910 # Note [embedding_renorm set_grad_enabled]
1911 # XXX: equivalent to
1912 # with torch.no_grad():
1913 # torch.embedding_renorm_
1914 # remove once script supports set_grad_enabled
1915 no_grad_embedding_renorm(weight, input, max_norm, norm_type)
-> 1916 return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)

RuntimeError: Input, output and indices must be on the current device
```
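This RuntimeError typically means the model weights were loaded onto the GPU while `x_input` was created on the CPU (or vice versa). A minimal sketch of a possible fix, assuming `model` and `x_input` are as in the notebook's `main()` — the helper name `run_on_model_device` is hypothetical, not part of the repository:

```python
import torch

def run_on_model_device(model: torch.nn.Module, x_input: torch.Tensor) -> torch.Tensor:
    # Find out which device the model parameters actually live on
    # (e.g. cuda:0 if the checkpoint was loaded with .cuda()).
    device = next(model.parameters()).device
    # Move the input tensor to that same device before the forward pass,
    # so embedding lookups see indices on the current device.
    return model(x_input.to(device))
```

Alternatively, forcing everything onto the CPU with `model.to("cpu")` before inference would also make the devices match.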

@wwjhang wwjhang changed the title from "checkpoint missing when running interactive_shell_NER.ipynb" to "Problem in the model load part when running interactive_shell_NER.ipynb" Apr 12, 2022