gpu memory changes in recent updates #383 (Open)

cybershrapnel opened this issue Jul 7, 2023 · 10 comments

@cybershrapnel

I have been running Bark on my 1650 Ti machine (6 GB VRAM) for a couple of months, and it works fine without the small-models option.

Then I installed it on my 2070 (8 GB VRAM) and got the out-of-memory error that is discussed in another issue thread.

I took the advice in that thread and enabled the small models, but the audio was far worse than what my 1650 Ti was producing: garbled output every other generation or so.

So I figured, what the heck, I'd just copy the working directory from my 1650 Ti PC over to the 2070 and try it. Sure enough, it works fine: it uses all 8 GB of VRAM on the 2070 and the garbled audio is gone. It runs much slower than the small models, of course.

I reinstalled PyTorch repeatedly, thinking it had to be a PyTorch issue, because why would Bark run on my 6 GB card but not on my 8 GB card? I haven't had time to look through the code to see what changed yet, but I thought I would mention it since I didn't see this covered in resolved issues, and there are definitely people in other threads treating this purely as a memory problem. Which I guess it is, since the full models nominally need 12 GB, but the older version I have installed runs just fine on both the 6 GB and 8 GB cards, and I am definitely NOT running small models; I checked and double-checked. On the current git version it only uses 2 to 4 GB of VRAM and I have to enable small models, whereas the old git version uses all 6 or 8 GB without small models enabled.
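
For reference, this is roughly how the small models get enabled; a minimal sketch based on the Bark README (the SUNO_USE_SMALL_MODELS environment variable has to be set before bark is imported, and exact behavior may differ between versions):

```python
# Minimal sketch: force Bark's small models via the documented environment
# variable. It must be set before the first `import bark`.
import os
os.environ["SUNO_USE_SMALL_MODELS"] = "True"

from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav

preload_models()  # loads the text, coarse, fine and codec models (small variants)
audio = generate_audio("Quick VRAM test on the 2070.")
write_wav("test.wav", SAMPLE_RATE, audio)
```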

@cybershrapnel (Author)

Oh, FYI: I suspect it has to do with the CLI workaround that was merged from the other issue a week or two ago. Just a stab in the dark, though.

@cybershrapnel (Author)

Meant to post this on the main Bark page; I thought it had been deleted, fml.

@cybershrapnel (Author)

Another interesting thing: when generating audio on the new version, the first pass counts from 1 up to somewhere around 400-700, e.g. 1/432 ... 432/432.
The old version that works goes from 1 to 100 on the first pass.

@anjy077 commented Jul 8, 2023

I seem to have the same problem as you.

@cybershrapnel (Author)

Confirmed that the old version works. I don't know exactly which version it is, but I installed it about a month or so ago, and the new version is from a week or two ago. I'm going to repost the old version on my GitHub just to have it archived, when I get a chance.

@tongbaojia (Contributor)

Hi @cybershrapnel, thanks for reporting this. I'd like to understand this better.

Could you provide the git revision on which you observe the memory issue, and the revision(s) on which you don't? I went through the last month's commit history and it is not obvious to me which change would cause such a big difference.

Also, it would be helpful to know what input you use and how you run Bark, so I can try to reproduce it. If you can provide environment details as well, like the Python and PyTorch versions, even better.
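
For anyone attaching that information, a quick sketch of collecting it (assumes it is run from inside a git checkout of bark with torch installed; `bark.__version__` may not exist on older checkouts, hence the getattr):

```python
# Sketch: gather the environment details requested above.
import subprocess
import sys

import torch
import bark

print("python :", sys.version.split()[0])
print("torch  :", torch.__version__, "| CUDA:", torch.version.cuda)
print("gpu    :", torch.cuda.get_device_name(0) if torch.cuda.is_available() else "none")
print("bark   :", getattr(bark, "__version__", "unknown"))
print("git rev:", subprocess.run(["git", "rev-parse", "--short", "HEAD"],
                                 capture_output=True, text=True).stdout.strip())
```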

@JonathanFly (Contributor)

I think that was my PR, so I hope it didn't cause a problem... it shouldn't change generation at all, just the speed calculation reported in the CLI. It's purely an output change.

@cybershrapnel (Author)

I'll post more on this soon; I need to figure out which version of Bark I have been shuffling around on my older machines. Just an FYI: I bought a 3060 12 GB and two 3090s, and the current version runs flawlessly on them, but the older 8 GB and 6 GB cards still exhibit these strange audio hiccups.

From the logs, it looks like PyTorch tries to reserve the maximum VRAM at startup to check the memory, and it asks for about 15 MiB more than the card has every time, so it fails for being 15 MiB short of VRAM even though generation normally never needs anywhere near that much. You can also see where it goes wrong: it is any time the prompt contains music or tone markers, like quotes, brackets, or music-note symbols. It doesn't seem able to process them and kicks the output into garble mode, so I just strip that stuff out of prompts on cards with less than 12 GB.

I'm in the middle of setting up a bunch of equipment, but I will package up the version of Bark I'm using and upload it to my GitHub repos; sorry, I'm the type that sets something up and then forgets about it. But Bark runs amazingly well on a 12 GB 3060, just as an FYI. :)
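
For anyone wanting to try the same workaround, a minimal sketch of stripping those markers from a prompt before calling generate_audio; the marker set here is just an illustration, not an official Bark token list:

```python
import re

def strip_nonspeech_markers(prompt: str) -> str:
    """Drop bracketed cues, quotes and music-note symbols from a Bark prompt.

    Mirrors the workaround described above for cards with < 12 GB VRAM;
    the marker set is illustrative, not an official Bark token list.
    """
    prompt = re.sub(r"\[.*?\]", "", prompt)            # e.g. [laughs], [music]
    prompt = prompt.replace("♪", "").replace('"', "")  # music notes and quotes
    return re.sub(r"\s+", " ", prompt).strip()         # collapse leftover whitespace

print(strip_nonspeech_markers('♪ "Hello there" [music] how are you?'))
# -> Hello there how are you?
```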

@lc-spxl commented Aug 20, 2023

@cybershrapnel how does it perform on the 3060? Mine takes ~15 s to output 2 s of audio; is that in line?
Thanks!
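
To compare numbers like that across cards, a simple end-to-end timing sketch using the standard Bark API (model download/load time is excluded by calling preload_models first):

```python
# Sketch: rough timing of a single Bark generation.
import time

from bark import SAMPLE_RATE, generate_audio, preload_models

preload_models()
start = time.perf_counter()
audio = generate_audio("This is a short benchmark sentence.")
elapsed = time.perf_counter() - start
print(f"generated {len(audio) / SAMPLE_RATE:.1f}s of audio in {elapsed:.1f}s")
```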

@asterocean

I figured out a solution: load the models on demand rather than loading them all at the same time. Try this pull request: #531
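
Related to that approach, newer Bark versions also document a CPU-offload switch that keeps only the submodel currently in use on the GPU; a minimal sketch (the SUNO_OFFLOAD_CPU variable is listed in the Bark README, but behavior may vary between versions):

```python
# Sketch: CPU offloading so only the active submodel occupies VRAM.
# The variable must be set before bark is imported.
import os
os.environ["SUNO_OFFLOAD_CPU"] = "True"

from bark import generate_audio, preload_models

preload_models()
audio = generate_audio("Testing on a low-VRAM card.")
```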
