Changing the requirements.txt should be enough to run the model on GPU, since it uses torch as the backend for transformers. Can you please make a pull request so that I can merge it?
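For reference, here is a minimal sketch of what running on GPU with the torch backend could look like. The model name and generation call are placeholders for illustration, not necessarily what this repo uses; the main point is checking for CUDA and moving both the model and its inputs to the same device:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoint; substitute the model this repo actually loads.
model_name = "gpt2"

# Use the GPU when CUDA is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

# Inputs must live on the same device as the model.
inputs = tokenizer("Hello world", return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that on Colab and Kaggle a CUDA-enabled torch build is usually preinstalled, so the requirements.txt mainly needs to avoid pinning a CPU-only torch version.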
I actually used the model without installing the libraries from requirements.txt, because I ran into some issues while trying to install them. It still managed to run all the examples mentioned in the README, plus a few I came up with myself. So, if it's possible, could you update requirements.txt so the model can run on a GPU? That would be awesome!
I am using this repo in Colab and Kaggle, but it always runs on the CPU. I have tried many things but cannot get it to run on the GPU.
Can you give me a solution for how to run it on the GPU?