I receive the following error when starting webui.py with -enablemps on my Mac mini (M2). It only happens with Swap or Clone Voice; TTS works fine.
File "/Users/jan/anaconda3/lib/python3.11/site-packages/torch/serialization.py", line 165, in validate_cuda_device
raise RuntimeError('Attempting to deserialize object on a CUDA '
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
It does not seem to load the Hubert_model onto MPS, but instead tries to use CUDA (which I do not have).
Do I need to adjust the code somewhere? I thought setting the -enablemps switch would be enough.
Thank you very much for your support.
I solved this by replacing the line "model.load_state_dict(torch.load(path))" with "model.load_state_dict(torch.load(path, map_location='mps'))" in "customtokenizer.py" under "/bark-gui/bark/hubert/". Hope this helps others.
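A minimal sketch of the idea behind that one-line change, assuming the checkpoint was saved from a CUDA machine: passing map_location to torch.load tells PyTorch where to place the restored tensors instead of trying to recreate them on the (missing) CUDA device. The pick_map_location helper and load_hubert_tokenizer wrapper below are hypothetical illustrations, not part of bark-gui; the actual line in customtokenizer.py may differ between versions.

```python
import torch

def pick_map_location():
    # Choose a device that actually exists on this machine
    # (hypothetical helper; hardcoding "mps" or "cpu" also works).
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

def load_hubert_tokenizer(model, path):
    # Equivalent of the fix above: map_location remaps tensors saved
    # on a CUDA device onto MPS (or CPU), avoiding the
    # "Attempting to deserialize object on a CUDA device" error.
    state_dict = torch.load(path, map_location=pick_map_location())
    model.load_state_dict(state_dict)
    return model
```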