Calculation of the loss function #7
Hi there,

Thank you for your excellent work!

The imputation loss of MIT does not appear to cover the complement feature vector in the code.

Secondly, the paper describes taking the raw data X, without artificial masking, as input to the MIT formula, but in the corresponding code you use the artificially masked X̂.

Is there something I am not understanding? I look forward to your clarification!
Hi, thank you for your feedback. I think you may have been confused by the code at modeling/SA_models.py#L199. We use X_tilde_3 to calculate the imputation loss of MIT. This is not a problem, because the imputation loss comes from the artificially missing values, and all imputations in the complement feature vector come from X_tilde_3. I don't understand your second point: in the line of code for the MIT loss calculation mentioned above, we use …
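For readers who hit the same question, here is a minimal sketch of the idea described above: the MIT loss is a masked MAE evaluated only at the artificially masked positions, with X_tilde_3 as the estimation, so it is unaffected by how the complement feature vector treats the originally observed entries. The helper `masked_mae`, the toy shapes, and the names `X_hat` / `indicating_mask` are illustrative assumptions rather than the repository's actual code; only `X_tilde_3` is named in the discussion above.

```python
import torch


def masked_mae(predictions, targets, mask):
    """MAE computed only at positions where mask == 1."""
    return torch.sum(torch.abs(predictions - targets) * mask) / (torch.sum(mask) + 1e-9)


# Toy shapes: [batch, time steps, features]
X = torch.randn(2, 5, 3)                                # ground truth (fully observed here for simplicity)
indicating_mask = (torch.rand(2, 5, 3) < 0.2).float()   # 1 = artificially masked out for MIT
X_hat = X * (1 - indicating_mask)                       # model input with values artificially removed
X_tilde_3 = torch.randn(2, 5, 3)                        # stand-in for the model's final estimation

# The MIT loss only looks at the artificially masked positions.
mit_loss = masked_mae(X_tilde_3, X, indicating_mask)
print(mit_loss)
```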
I really appreciate your reply; it answered my doubts. The second question was caused by my own carelessness and has now been resolved. Thank you. Best regards,
Absolutely my pleasure 😃