https://github.com/Deriq-Qian-Dong/III-Retriever/blob/main/src/modeling.py#L193C68-L193C90
Hello, at this line the query reconstructor's input looks like 32 token embeddings that are all the embedding of the mask token, which seems inconsistent with the 32 learnable embeddings described in the paper. I would appreciate your clarification.
Hello, thanks for your interest. The mask token's embedding is updated along with the rest of the model during training. You could also randomly initialize a new special token here instead; the effect is the same.
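A minimal sketch of the two equivalent choices described above (this is not the repository's code; the hidden size, vocabulary size, and mask token id are assumptions for illustration):

```python
import torch
import torch.nn as nn

d = 768          # hidden size (assumption)
num_slots = 32   # number of reconstructor input positions

# Option A: reuse the [MASK] token's embedding for all 32 positions.
# The embedding table is part of the model, so this single vector is
# updated by training like any other parameter.
word_embeddings = nn.Embedding(30522, d)                   # BERT vocab size (assumption)
mask_token_id = 103                                        # BERT's [MASK] id (assumption)
mask_emb = word_embeddings(torch.tensor([mask_token_id]))  # (1, d)
recon_inputs_a = mask_emb.expand(num_slots, -1)            # (32, d), all identical

# Option B: randomly initialize a new special token and share it instead.
# It is likewise a single trainable vector repeated across the 32 positions.
special_emb = nn.Parameter(torch.randn(1, d) * 0.02)
recon_inputs_b = special_emb.expand(num_slots, -1)         # (32, d), all identical
```

In both variants the 32 positions share one trainable vector, which is why the two are interchangeable in practice.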
Thanks for the reply. There is only one mask token, right? The point I am stuck on is: '32' learnable embeddings vs. '1' learnable embedding (see the sketch below).
Another question: what is the motivation for first training the reconstructor on queries generated by T5? If the query-passage training data were large enough, could the real queries be used directly instead of the fake queries?
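For contrast, a sketch of the other reading of the paper, i.e. 32 distinct learnable embeddings rather than one shared vector (again hypothetical; shapes and initialization are assumptions):

```python
import torch
import torch.nn as nn

d = 768          # hidden size (assumption)
num_slots = 32   # number of reconstructor input positions

# 32 independent trainable vectors, one per position. Each slot has its own
# parameters, unlike the single shared [MASK]/special-token embedding above.
learnable_slots = nn.Parameter(torch.randn(num_slots, d) * 0.02)  # (32, d)
recon_inputs = learnable_slots   # fed to the reconstructor as 32 distinct embeddings
```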