
change model from gpu to cpu #65

Open
fanlamda opened this issue Nov 2, 2016 · 5 comments
@fanlamda commented Nov 2, 2016

I want to train the model on GPU but run prediction on CPU, so I tried cudnn.convert(self.model, nn). However, it seems that some part of the model remains in CUDA form.

Is there any method to solve this problem? Any advice is welcome.

@SeanNaren (Owner) commented

To switch back to CPU mode, you also need to call self.model:float() in addition to cudnn.convert(self.model, nn).
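
For reference, a minimal sketch of the full conversion (assuming the trained network is held in a `model` variable):

```lua
require 'cudnn'
require 'nn'

-- Replace each cudnn.* module with its nn.* equivalent where one exists.
cudnn.convert(model, nn)
-- Move the parameters from CudaTensor to FloatTensor so inference runs on CPU.
model:float()
```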

@fanlamda (Author) commented Nov 2, 2016

Yes, I called self.model:float() after cudnn.convert(), but I found that cudnn.BatchBRNNReLU remains unchanged. I get an error like 'unknown object cudnn.BatchBRNNReLU'.
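
A quick way to see what is blocking the conversion is to walk the module tree and print any module whose type is still a cudnn one (nn.Module:listModules() is part of stock nn):

```lua
-- Print every module type still prefixed with 'cudnn.'; these are the
-- modules that cudnn.convert() could not map to an nn equivalent.
for _, m in ipairs(model:listModules()) do
   local t = torch.type(m)
   if t:find('^cudnn%.') then
      print('still cudnn: ' .. t)
   end
end
```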

@SeanNaren (Owner) commented

Ah, this is my fault: there is no CPU version of cudnn.BatchBRNNReLU. I'll have to modify the DeepSpeechModel class to take this into account. The alternative is to use the rnn package's SeqBRNN, which does allow conversion, but that means training a new model.
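
Roughly, the idea would be to build the bidirectional layer from the rnn package instead. A sketch, assuming nn.SeqBRNN(inputSize, outputSize) from Element-Research's rnn; the size below is a placeholder, not the model's real dimension:

```lua
require 'rnn'

local hiddenSize = 1760 -- placeholder; use the model's actual hidden size
-- nn.SeqBRNN runs a forward and a backward sequence RNN over the input and
-- merges their outputs. Unlike cudnn.BatchBRNNReLU it has a CPU
-- implementation, so a model built with it stays convertible via
-- cudnn.convert() and :float().
local brnn = nn.SeqBRNN(hiddenSize, hiddenSize)
```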

@SeanNaren (Owner) commented

I'm training a smaller, CPU-compatible model on AN4 (though training it on GPU with a few hacks). I'll add it to the pre-trained networks once it's finished.

@saurabhvyas commented

I don't have a GPU and I'm still learning. Is there any small pretrained model that will run on CPU only (a Core i5 laptop, or a Raspberry Pi 3)?
