
Is the code still able to run? #9

Open · bing0037 opened this issue Aug 28, 2021 · 3 comments

@bing0037

Hi,

I am trying to reproduce your BERT results. I followed the prerequisites:

# Pytorch pretrained BERT
git clone https://github.com/pmichel31415/pytorch-pretrained-BERT
cd pytorch-pretrained-BERT
git checkout paul
cd ..
# Install pytorch-pretrained-BERT:
cd pytorch-pretrained-BERT
pip install .
cd ..
# Run the code:
bash experiments/BERT/heads_ablation.sh MNLI

But I got this error:

02:06:57-INFO: Weights of BertForSequenceClassification not initialized from pretrained model: ['classifier.weight', 'classifier.bias']
02:06:57-INFO: Weights from pretrained model not used in BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']
Traceback (most recent call last):
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 582, in <module>
    main()
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 275, in main
    model.bert.mask_heads(to_prune)
  File "/home/guest/anaconda3/envs/huggingface_env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 594, in __getattr__
    type(self).__name__, name))
AttributeError: 'DataParallel' object has no attribute 'bert'


1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or suggestions?
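For context, a minimal sketch (not from the original report) that reproduces the AttributeError in the traceback: torch.nn.DataParallel is itself an nn.Module, so submodules of the wrapped model are only reachable through its .module attribute.

import torch.nn as nn

class Toy(nn.Module):
    # Hypothetical stand-in for BertForSequenceClassification, with a `bert` submodule.
    def __init__(self):
        super().__init__()
        self.bert = nn.Linear(4, 4)

wrapped = nn.DataParallel(Toy())
print(wrapped.module.bert)  # OK: the original model lives under .module
wrapped.bert  # AttributeError: 'DataParallel' object has no attribute 'bert'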

@pmichel31415 (Owner)

Hi @bing0037, I haven't run this code in a while, but it used to work. My first guess would be an incompatibility with a newer version of PyTorch. Can you try again in an environment with PyTorch 1.0 or 1.1?

If that doesn't solve it, then I'm not too sure... I wasn't using DataParallel in the code, so I'm not sure why it would show up in the error message... let me know how changing the PyTorch version goes.
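Judging by the fix in the next comment (which points at run_classifier.py around line 260), the example script itself appears to wrap the model in the usual multi-GPU idiom. A sketch of that pattern, assuming the standard idiom rather than quoting the script:

import torch
import torch.nn as nn

def wrap_for_gpus(model: nn.Module) -> nn.Module:
    # Common idiom: with more than one visible GPU, wrap the model.
    # DataParallel then hides attributes like model.bert behind .module.
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)
    return model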

@caidongqi

I ran into the same problem today and solved it by adding the following line in pytorch-pretrained-BERT/examples/run_classifier.py:

# around line 260
model = torch.nn.DataParallel(model)
+ model = model.module  # unwrap DataParallel so attributes like model.bert are reachable again

Hope it helps.

References:

  1. https://blog.csdn.net/weixin_41990278/article/details/105127101
  2. https://zhuanlan.zhihu.com/p/92759707
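A more defensive variant of the same idea, sketched here rather than taken from the repository, unwraps only at the call site so multi-GPU execution stays enabled (reassigning model = model.module right after wrapping effectively disables DataParallel):

import torch.nn as nn

def unwrap(model: nn.Module) -> nn.Module:
    # Return the underlying model if it is wrapped in DataParallel, else the model itself.
    return model.module if isinstance(model, nn.DataParallel) else model

# Then, instead of model.bert.mask_heads(to_prune):
# unwrap(model).bert.mask_heads(to_prune)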

@vrunm commented Feb 4, 2023

Hi @caidongqi, I tried changing the run_classifier.py file as you described, but I ran into the same errors as @bing0037.
I was also trying to reproduce the BERT results.
I am using Python 3.8 and PyTorch 1.8.0.

1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or solutions to this?
