
Cannot reproduce the recognition accuracy of certain expressions in the paper #8

Open
AMG024 opened this issue Apr 26, 2021 · 8 comments

Comments

@AMG024

AMG024 commented Apr 26, 2021

I trained the model with the MS-Celeb pre-trained weights and the default parameters from the paper, but the accuracy I get on certain expressions is lower than the recognition accuracy reported in the paper's confusion matrix. How can I reproduce the per-expression recognition accuracy shown in the paper's confusion matrix?
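For what it's worth, here is a minimal sketch of how I compute per-expression accuracy to compare against the paper's confusion matrix. It assumes scikit-learn and NumPy; the label names and the dummy predictions are illustrative only, not the dataset's canonical order or real model outputs.

```python
# Per-expression accuracy from a confusion matrix (sketch).
# Assumes the 7 basic expressions used by RAF-DB; the order below
# is illustrative, so map it to your own label indices.
import numpy as np
from sklearn.metrics import confusion_matrix

labels = ["surprise", "fear", "disgust", "happy", "sad", "anger", "neutral"]

# Hypothetical targets/predictions; replace with your model's outputs.
y_true = [3, 3, 6, 0, 4, 3, 6, 5]
y_pred = [3, 6, 6, 0, 4, 5, 6, 5]

cm = confusion_matrix(y_true, y_pred, labels=range(len(labels)))

# Row-normalize: the diagonal then gives per-class recall, which is
# what the paper's confusion matrix reports on its diagonal.
per_class = cm.diagonal() / cm.sum(axis=1).clip(min=1)
for name, acc in zip(labels, per_class):
    print(f"{name}: {acc:.2%}")
```

Note that overall accuracy and mean per-class accuracy differ when the test set is imbalanced (as RAF-DB's is), so it is worth checking which one the paper reports.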

@nothingeasy

I have the same problem. Has anyone reproduced the result in the paper?

@anony-mous-rep

I also have the same problem. Is there an error in my dataset? Training on the RAF-DB dataset, I only get 86.44% accuracy.

@nothingeasy

Has anyone tried this code? What results did you get? Please help~

@yun-kai

yun-kai commented Jun 10, 2021

I trained on the RAF-DB dataset and got 86.25% accuracy, and on the AffectNet dataset I got 56.8%.
I don't know why the AffectNet accuracy is so far from the paper's result. Has anyone run into the same problem?

@hhwwxx11

The MS-Celeb pre-trained weights can no longer be downloaded. Can you share them?

@lbbbbbbbbbb

Do you have them yet? The link is still broken.

@ZechengLi19

> The MS-Celeb pre-trained weights can no longer be downloaded. Can you share them?

Same question!

@ZechengLi19

> Do you have them yet? The link is still broken.

You can check this link: https://github.com/zyh-uaiaaaa/relative-uncertainty-learning

7 participants