
grid_sampler_2d_backward is not implemented #25

Open
Louvivien opened this issue Jun 11, 2023 · 0 comments
Hello, when running the training I get this error:

Traceback (most recent call last):
  File "/content/stylegan2-ada-pytorch/train.py", line 538, in <module>
    main() # pylint: disable=no-value-for-parameter
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/click/decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/content/stylegan2-ada-pytorch/train.py", line 531, in main
    subprocess_fn(rank=0, args=args, temp_dir=temp_dir)
  File "/content/stylegan2-ada-pytorch/train.py", line 383, in subprocess_fn
    training_loop.training_loop(rank=rank, **args)
  File "/content/stylegan2-ada-pytorch/training/training_loop.py", line 284, in training_loop
    loss.accumulate_gradients(phase=phase.name, real_img=real_img, real_c=real_c, gen_z=gen_z, gen_c=gen_c, sync=sync, gain=gain)
  File "/content/stylegan2-ada-pytorch/training/loss.py", line 131, in accumulate_gradients
    (real_logits * 0 + loss_Dreal + loss_Dr1).mean().mul(gain).backward()
  File "/usr/local/lib/python3.10/dist-packages/torch/_tensor.py", line 487, in backward
    torch.autograd.backward(
  File "/usr/local/lib/python3.10/dist-packages/torch/autograd/__init__.py", line 200, in backward
    Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
RuntimeError: derivative for aten::grid_sampler_2d_backward is not implemented
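The traceback shows the failure happens while backpropagating `loss_Dr1` (the R1 regularizer), which requires a second derivative through `grid_sample` used by the ADA augmentation pipe; PyTorch's built-in `aten::grid_sampler_2d` does not implement that second derivative. The repo ships a custom double-backward op in `torch_utils/ops/grid_sample_gradfix.py`, but it is guarded by a PyTorch version check that fails on newer releases, so the built-in op is used instead. The following is a minimal sketch of that kind of guard, assuming a version check of this rough shape (the helper name and exact accepted versions here are illustrative, not copied from the repo):

```python
# Hypothetical sketch of a version-gated custom-op guard, as used in
# stylegan2-ada-pytorch's gradfix modules. When the running PyTorch
# version falls outside the accepted range, the custom double-backward
# implementation is silently skipped, and R1/double-backward through
# grid_sample then hits the unimplemented built-in derivative.

def should_use_custom_op(torch_version: str, enabled: bool = True) -> bool:
    """Decide whether to route grid_sample through the custom op.

    torch_version: a version string such as "1.7.1" or "2.0.1+cu118".
    enabled: global switch for the custom op.
    """
    # Strip any local suffix like "+cu118" before parsing.
    base = torch_version.split("+")[0]
    major, minor = (int(x) for x in base.split(".")[:2])
    # A strict whitelist (e.g. only 1.7.x) breaks on newer PyTorch;
    # the commonly suggested workaround is to relax this check so the
    # custom op stays enabled on 1.7 and later.
    return enabled and (major, minor) >= (1, 7)
```

Under this reading, the usual community workaround is to relax (or remove) the version check in `torch_utils/ops/grid_sample_gradfix.py` (and similarly `conv2d_gradfix.py`) so the custom op stays enabled, or alternatively to disable augmentation with `--aug=noaug`, which avoids the double backward through `grid_sample` entirely.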