
Why use Hadamard product in Matrix Factorization router? #26

Closed
awsvmaringa opened this issue Jul 16, 2024 · 1 comment
@awsvmaringa

What is the significance of using the Hadamard product between the model embedding and the query embedding in the Matrix Factorization router? Why not directly compute the score using the dot product?

@thwu1
Collaborator

thwu1 commented Jul 18, 2024

Thanks for the great question!

The reason is that the Hadamard product preserves more information about the interaction between the two vectors than a single scalar score does. Initially, we explored applying additional non-linear operations to the fusion of these vectors. However, we found that a linear classifier on top provided the best performance. In this case, the Hadamard product is not strictly necessary, because all of the linear transformations can be collapsed into one, as discussed in #9.
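To make the collapse concrete, here is a minimal sketch (PyTorch assumed; the tensor names are illustrative, not the router's actual code) showing that a linear layer applied to the Hadamard product reduces to a plain dot product with a rescaled embedding:

```python
import torch

torch.manual_seed(0)
dim = 8
model_emb = torch.randn(dim)  # hypothetical model embedding
query_emb = torch.randn(dim)  # hypothetical query embedding
w = torch.randn(dim)          # weights of the linear head on top

# Score via Hadamard product followed by a linear layer:
#   score = w . (model_emb * query_emb)
score_hadamard = torch.dot(w, model_emb * query_emb)

# Equivalent dot product after folding w into the model embedding:
#   score = (w * model_emb) . query_emb
score_dot = torch.dot(w * model_emb, query_emb)

assert torch.allclose(score_hadamard, score_dot)
```

With a non-linearity between the Hadamard product and the classifier, this folding would no longer hold, which is why the elementwise product carries extra expressive power only in the non-linear case.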

Despite this, we decided to keep the original code for simplicity and potential future modifications.
