What is the significance of using the Hadamard product between the model embedding and the query embedding in the Matrix Factorization router? Why not directly compute the score using the dot product?
The reason is that the Hadamard product preserves the per-dimension interactions between the two vectors, whereas a dot product immediately collapses them into a single scalar. Initially, we explored applying additional non-linear operations on top of this fused vector. However, we found that a linear classifier on top gave the best performance. In that case, the Hadamard product is not strictly necessary, since all the linear transformations can be collapsed into a single dot product, as discussed in #9.
Despite this, we decided to keep the original code for simplicity and potential future modifications.
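For reference, here is a minimal PyTorch sketch of this design (the class and parameter names are hypothetical, not taken from the actual codebase). It illustrates why the linear head makes the Hadamard product collapsible: since w·(m ⊙ q) = (w ⊙ m)·q, the classifier weights can be folded into the model embedding, reducing the score to a plain dot product.

```python
import torch
import torch.nn as nn

class MFRouter(nn.Module):
    """Hypothetical sketch of a matrix-factorization router head:
    a linear classifier over the Hadamard product of a learned
    model embedding and a precomputed query embedding.
    Bias is omitted so the dot-product equivalence below is exact."""

    def __init__(self, num_models: int, dim: int):
        super().__init__()
        self.model_emb = nn.Embedding(num_models, dim)
        self.classifier = nn.Linear(dim, 1, bias=False)

    def forward(self, model_ids: torch.Tensor, query_emb: torch.Tensor) -> torch.Tensor:
        m = self.model_emb(model_ids)               # (batch, dim)
        fused = m * query_emb                       # Hadamard product: keeps per-dimension interactions
        return self.classifier(fused).squeeze(-1)   # (batch,) scalar scores

# Equivalence check: folding the classifier weights into the model
# embedding yields the same score as a plain dot product.
router = MFRouter(num_models=4, dim=8)
ids = torch.tensor([0, 2])
q = torch.randn(2, 8)
folded = router.model_emb(ids) * router.classifier.weight.squeeze(0)
assert torch.allclose(router(ids, q), (folded * q).sum(-1), atol=1e-6)
```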