Some of the pre-trained models are just described as "pre-trained", while others are described as "pre-trained then fine-tuned on x". What data was the original pre-training performed on, and for how long?
e.g. from the docs:
- `'gin_supervised_contextpred'`: A GIN model pre-trained with supervised learning and context prediction
- `'gin_supervised_masking_BACE'`: A GIN model pre-trained with supervised learning and masking, and fine-tuned on BACE
You may find the details of pre-training in https://arxiv.org/abs/1905.12265. `supervised` means supervised pre-training was performed on the ChEMBL dataset, and `contextpred` means self-supervised pre-training with context prediction was performed on the ZINC15 dataset.
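For illustration, here is a minimal sketch of loading the two checkpoints quoted from the docs above, assuming the `dgllife.model.load_pretrained` API:

```python
from dgllife.model import load_pretrained

# Pre-trained only: supervised pre-training (ChEMBL) plus
# self-supervised context prediction (ZINC15), no fine-tuning.
model = load_pretrained('gin_supervised_contextpred')
model.eval()

# Pre-trained with supervised learning and masking,
# then fine-tuned on the BACE dataset.
model_bace = load_pretrained('gin_supervised_masking_BACE')
model_bace.eval()
```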