OpenFold Local Jupyter Notebook 📔 | Metrics, Plots, Concurrent Inference #484
Overview
This PR introduces a fully featured local Jupyter notebook for performing inference, computing metrics, ranking models, and generating plots in a structured and reproducible manner, particularly for experimentation with large datasets.
The metrics are similar to those in the Colab notebook but are tailored to a local installation with Docker. The notebook also supports concurrent execution to leverage multiple GPUs.
The notebook works by issuing Docker commands through the Docker client and calling OpenFold functions inside a standalone container environment. This keeps the OpenFold codebase untouched: the notebook acts purely as a client that reproduces the Colab notebook's metrics and results locally.
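To illustrate the approach, the sketch below shows how a single containerized inference call could be dispatched with the Docker SDK for Python. The image name, mounted paths, and script arguments are placeholder assumptions, not necessarily the values the notebook uses; see notebooks/OpenFoldLocal.ipynb for the actual implementation.

```python
# Sketch only: dispatch one OpenFold inference run inside a Docker container.
# The image tag, host paths, and arguments below are illustrative assumptions.
import docker

docker_client = docker.from_env()

logs = docker_client.containers.run(
    image="openfold:latest",  # assumed tag of the image built in the setup step
    command=[
        "python3", "run_pretrained_openfold.py",  # OpenFold's inference entry point
        "/data/fasta_dir",                        # directory of input FASTA files
        "/data/pdb_mmcif/mmcif_files",            # template mmCIF directory
        "--output_dir", "/data/output",
    ],
    volumes={"/path/to/host/data": {"bind": "/data", "mode": "rw"}},
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    remove=True,
)
print(logs.decode())
```

Because each run is an isolated container, several such calls can be launched concurrently (for example with concurrent.futures), each pinned to a different GPU, which is one plausible way to realize the multi-GPU execution the notebook provides.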
Usage
Refer to the instructions in notebooks/OpenFoldLocal.ipynb
Set up the notebook
First, build OpenFold using Docker. Follow this guide.
Then, go to the notebook folder
cd notebooks
Create an environment to run Jupyter with the requirements
mamba create -n openfold_notebook python==3.10
Activate the environment
mamba activate openfold_notebook
Install the requirements
pip install -r src/requirements.txt
Start your Jupyter server in the current folder
jupyter lab . --ip="0.0.0.0"
Open the notebook via the printed URL, or connect remotely using VS Code.
Inference example
Initializing the client:
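The snippet below is a purely illustrative sketch of what client initialization might look like; the class, module, and parameter names are hypothetical stand-ins, not the notebook's actual API. Refer to notebooks/OpenFoldLocal.ipynb for the real interface.

```python
# Hypothetical sketch; names are placeholders, not the notebook's actual API.
from src.openfold_client import OpenFoldDockerClient  # hypothetical import

client = OpenFoldDockerClient(
    docker_image="openfold:latest",   # image built in the setup step
    data_dir="/path/to/host/data",    # databases, inputs, and weights on the host
    output_dir="./results",           # where predictions, metrics, and plots land
    num_workers=2,                    # number of concurrent jobs / GPUs to use
)
```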
Running Inference:
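Again a hypothetical sketch: sequences are submitted, inference runs (concurrently if configured), and the results are ranked and plotted. Method names are placeholders for whatever the notebook actually exposes.

```python
# Hypothetical sketch; method names are placeholders.
sequences = {
    "query_1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
    "query_2": "GSHMSLFDFFKNKGSAATATDRLKLILAKER",
}

results = client.run_inference(sequences)   # runs the jobs, in parallel if configured
best_model = client.rank_models(results)    # ranks models by confidence metrics
client.plot_metrics(results)                # generates the summary plots
```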
Using a file:
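Equally hypothetical, input can also come from an existing FASTA file rather than in-memory sequences:

```python
# Hypothetical sketch; the method name is a placeholder.
results = client.run_inference_from_file("inputs/monomers.fasta")
```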
Screenshots