Errors when running collab notebook #14

Open
Rsalganik1123 opened this issue Oct 20, 2023 · 0 comments
Hello,
I am experiencing a series of errors when trying to run the Colab notebook provided with the code.

First, installing jukebox throws the following error:

```
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
```

This can be worked around by installing a different fork of jukebox:

```
!pip install --upgrade git+https://github.com/craftmine1000/jukebox-saveopt.git
```

However, the initialization block then produces the following warnings:

```
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.0.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.0.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.1.model.0.model.1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
...
```

These warnings are raised at the line: `top_prior = make_prior(hparams, vqvae, device)#device)`

Please advise on how to resolve these errors. Thanks in advance.