Feature extraction - ESM embeddings #13
Commits on Jul 25, 2023

eae44be

40f40ad  fix(dataset): -esm +prot_seq #8
Storing ESM embeddings won't work since they are 320-d vectors per amino acid. The better approach is to store just the sequence strings and leave the ESM embedding calculation to the model side. See #8 for more.
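
A minimal sketch of the approach described in that commit, assuming the ESM-2 8M checkpoint from the HuggingFace transformers hub (whose per-residue embeddings are 320-d); the checkpoint id and example sequence are illustrative, not taken from the repo.

```python
import torch
from transformers import AutoTokenizer, EsmModel

# The dataset stores only the raw protein sequence string; the per-residue
# ESM embeddings (320-d each for the 8M ESM-2 model) are computed here on
# the model side instead of being written to disk.
tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t6_8M_UR50D")
esm = EsmModel.from_pretrained("facebook/esm2_t6_8M_UR50D")
esm.eval()

prot_seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # illustrative sequence
inputs = tokenizer(prot_seq, return_tensors="pt")

with torch.no_grad():
    out = esm(**inputs)

per_residue = out.last_hidden_state  # shape: (1, seq_len + 2, 320)
print(per_residue.shape)             # +2 for the special BOS/EOS tokens
```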
Commits on Jul 26, 2023

33069a6

db0d963  test(results): kiba run init results
Not too great; outliers ruining training?

3469618  fix(train_test): og_model_opt t/f -> string input
This has to be done since argparse doesn't handle boolean choices well through the CLI.
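
A short sketch of the string-flag workaround; the flag name comes from the commit title, but the exact choices and default are assumptions.

```python
import argparse

parser = argparse.ArgumentParser()
# Use explicit string choices instead of type=bool: argparse would call
# bool() on the raw CLI string, so both "True" and "False" are non-empty
# strings and would silently parse as True.
parser.add_argument("--og_model_opt", choices=["T", "F"], default="F",
                    help="use the original model options (T/F)")

args = parser.parse_args(["--og_model_opt", "T"])
use_og_model = args.og_model_opt == "T"  # convert back to a real bool
print(use_og_model)                      # True
```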

2cbfbb1  Merge branch 'feature_extraction' of https://github.com/jyaacoub/MutDTA into feature_extraction

f13b71e

34155b6

519ac8e  Merge branch 'feature_extraction' of https://github.com/jyaacoub/MutDTA into feature_extraction

f0761b6

7a1f64b

8038746  perf: optimize memory usage during training
The main change is freezing the ESM layers, since the ESM model is too large to fine-tune.
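
A minimal sketch of freezing the ESM encoder so only the downstream head is trained; the class name, checkpoint id, and pooling/head are illustrative, not the repo's actual model.

```python
import torch
from transformers import EsmModel

class FrozenESMEncoder(torch.nn.Module):
    def __init__(self, checkpoint="facebook/esm2_t6_8M_UR50D"):
        super().__init__()
        self.esm = EsmModel.from_pretrained(checkpoint)
        for p in self.esm.parameters():
            p.requires_grad = False  # frozen: no gradients, no optimizer state
        self.esm.eval()              # keep dropout/LayerNorm in inference mode
        self.head = torch.nn.Linear(self.esm.config.hidden_size, 1)  # 320 -> 1

    def forward(self, input_ids, attention_mask=None):
        with torch.no_grad():        # also avoids caching activations for backprop
            reps = self.esm(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
        return self.head(reps.mean(dim=1))  # simple mean-pool over residues
```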

c34897f
This is useful since epochs would otherwise be cluttered with transformer tokenizer warning logs.
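
A sketch of one way to quiet those logs, assuming the tokenizer comes from the HuggingFace transformers library; the commit itself doesn't show which mechanism was used.

```python
import os
import warnings

from transformers import logging as hf_logging

# Silence the tokenizer's fork/parallelism warning and transformers warnings
# so per-epoch training output stays readable.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
warnings.filterwarnings("ignore", module="transformers")
hf_logging.set_verbosity_error()  # only log errors, not warnings/info
```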