Native Windows installation and execution instructions #300
This was the procedure that finally led me to success:

```bash
# WSL INSTALL ON WINDOWS 11
# restart Windows
#-------------------------------------------------------------------------------
sudo apt-get install build-essential
#sudo apt update && sudo apt upgrade
#-------------------------------------------------------------------------------
sudo apt install software-properties-common
# check what the default version is
# check installed versions
# install missing package
#-------------------------------------------------------------------------------
# check success
#-------------------------------------------------------------------------------
```
With WSL, I was able to run Zamba from the CLI, but how can I run it as a Python package? I'm still not sure even after reading all the tutorials. I have VS Code.
From the documentation, you can see the basic example of using zamba as a Python package. I copy that here:

```python
from zamba.models.model_manager import predict_model
from zamba.models.config import PredictConfig

predict_config = PredictConfig(data_dir="example_vids/")
predict_model(predict_config=predict_config)
```

As linked from that documentation, the same things that you can pass to the CLI, you can pass into the `PredictConfig`. So, for example, if I wanted to predict on a different directory called `my_vids/`:

```python
from zamba.models.model_manager import predict_model
from zamba.models.config import PredictConfig

predict_config = PredictConfig(
    data_dir="my_vids/",
    save_dir="zamba_output/",  # save to different directory
    model_name="european",  # run the model with European species
    weight_download_region="eu",  # download the model from the EU data center
)
predict_model(predict_config=predict_config)
```

If you save that script as …
Thanks for your reply. I have Windows 11 + WSL/VS Code and have to put everything under `if __name__ == "__main__":` to run correctly.
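A likely explanation: on Windows, Python's multiprocessing uses the "spawn" start method, which re-imports the main module in each worker process, so any top-level code would be executed again in every worker. Below is a minimal sketch of the guarded script, reusing the `PredictConfig` example from above; the file name `predict_zamba.py` and the directory name are illustrative assumptions, not from the zamba documentation.

```python
# predict_zamba.py -- hypothetical file name; guards the prediction call so it
# only runs in the main process.
from zamba.models.model_manager import predict_model
from zamba.models.config import PredictConfig


def main():
    # Same basic usage as in the example above.
    predict_config = PredictConfig(data_dir="example_vids/")
    predict_model(predict_config=predict_config)


if __name__ == "__main__":
    # On Windows, multiprocessing workers are started with "spawn" and
    # re-import this module; without this guard the prediction would be
    # launched again in every worker process.
    main()
```

You would then run it as an ordinary script, e.g. `python predict_zamba.py`, from the same environment where zamba is installed.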
Currently, we recommend Docker Desktop or WSL for Windows. It should be possible to run natively if `yolox` at our version pin can be compiled. PR #222 had some initial work on this, but it was not generalizable to all Windows systems.