Enable sm_35 support in pytorch #129
I've been testing more and more, and with this docker-compose.yaml file TensorFlow detects the GPU OK:
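For reference, a minimal sketch of the kind of docker-compose.yaml that exposes an NVIDIA GPU to a container — the service name, image tag, and device settings here are assumptions for illustration, not the exact file referred to above:

```yaml
# Hypothetical compose file; service and image names are placeholders.
version: "3.8"
services:
  tensorflow:
    image: tensorflow/tensorflow:latest-gpu   # assumed image tag
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```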
Output:
I think I managed to get this working! First, this is on Ubuntu 20.04; we need to install Python 3.7 and do all of this as the root user:
Create a python 3.7 venv and activate it:
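The two steps above might look like this — the deadsnakes PPA and the venv path are assumptions (any route to Python 3.7 on Ubuntu 20.04 works):

```shell
# Install Python 3.7 on Ubuntu 20.04 (as root); deadsnakes PPA is one common option.
apt-get update
apt-get install -y software-properties-common
add-apt-repository -y ppa:deadsnakes/ppa
apt-get install -y python3.7 python3.7-venv

# Create and activate a Python 3.7 venv (path is a placeholder).
python3.7 -m venv /root/torch-venv
source /root/torch-venv/bin/activate
```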
Now we want to install a proper version of torch that includes the stuff we need. I chose the same version of torch that's used in the deepstack install:
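Inside the activated venv that could be, for example, the following — the version string is deliberately a placeholder, since the point is to match whatever the DeepStack image actually ships:

```shell
# Check the torch version inside the running DeepStack container first, e.g.:
#   docker exec deepstack python3 -c "import torch; print(torch.__version__)"
# Then install the matching version into the venv (versions below are placeholders):
pip install torch==<matching-version> torchvision==<matching-version>
```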
Run deepstack and map in the alternative torch package:
This will map in the alternative torch packages, which in my case support sm_35.

Results:
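A hedged sketch of what "map in the alternative torch packages" could look like as a bind mount — the host and container paths are assumptions that must match your setup, though `deepquestai/deepstack:gpu` is the public image name and `VISION-DETECTION` is DeepStack's standard toggle:

```shell
# Bind-mount the venv's torch package over the one baked into the image.
# Paths below are illustrative; verify them against your venv and the container.
docker run --rm --gpus all \
  -v /root/torch-venv/lib/python3.7/site-packages/torch:/usr/local/lib/python3.7/dist-packages/torch \
  -e VISION-DETECTION=True \
  -p 5000:5000 \
  deepquestai/deepstack:gpu
```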
Compared to running on a 4GB Jetson Nano:
Output of
I'm trying to get GPU processing working on my older 2GB GeForce GT 710, which I believe should be about as fast as a Jetson Nano...
When I try to run deepstack with GPU enabled, I get:
Is there a way to enable sm_35 support in the pytorch used in these containers? I can't quite see where it gets set...
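For what it's worth, in PyTorch the set of compiled GPU architectures is fixed at build time via the `TORCH_CUDA_ARCH_LIST` environment variable, so enabling sm_35 generally means building from source. A sketch of that, not the exact recipe these containers use:

```shell
# Build PyTorch from source with sm_35 (compute capability 3.5) included.
# TORCH_CUDA_ARCH_LIST is the real build-time knob; the arch list here is an example.
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
export TORCH_CUDA_ARCH_LIST="3.5;5.0;6.1;7.5"
python setup.py install
```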
Here's the `nvidia-smi` output from within the container:

Supported versions seem to be:
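The compatibility check here boils down to comparing the device's compute capability against the architectures the torch wheel was compiled for — torch exposes these via `torch.cuda.get_device_capability()` and `torch.cuda.get_arch_list()`. A small self-contained sketch of that comparison, with a hypothetical helper name (this illustrates the check, it is not torch's internal code):

```python
def wheel_supports_device(arch_list, capability):
    """Return True if a torch build's arch list covers a device.

    arch_list: strings like "sm_35" (as from torch.cuda.get_arch_list()).
    capability: (major, minor) tuple (as from torch.cuda.get_device_capability()).
    Hypothetical helper -- illustrates the check, not torch's internals.
    """
    major, minor = capability
    return f"sm_{major}{minor}" in arch_list

# A GeForce GT 710 reports compute capability (3, 5):
print(wheel_supports_device(["sm_37", "sm_50", "sm_60", "sm_70"], (3, 5)))  # False
print(wheel_supports_device(["sm_35", "sm_50"], (3, 5)))  # True
```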