
feature(wrh): add edm initial implementation #1

Open · wants to merge 16 commits into base: main
Conversation

ruiheng123

def __init__(self, config: Optional[EasyDict] = None) -> None:
    super().__init__()
    self.config = config
Member


Use a single-layer config here.
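One possible reading of this suggestion, sketched below, is to hand the model only the flat sub-config it needs rather than the full nested tree; the key names and the EDMDiffusionModel constructor call are assumptions for illustration, not taken from the PR.

from easydict import EasyDict

# Nested, multi-layer config: the model would have to reach into
# config.edm_model.* for every field it needs.
nested_config = EasyDict(dict(
    edm_model=dict(
        precondition_type="EDM",
        sigma_data=0.5,
    ),
))

# Single-layer config: pass only the flat block the model consumes.
model_config = nested_config.edm_model
# model = EDMDiffusionModel(config=model_config)  # hypothetical constructor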

grl/generative_models/edm_diffusion_model/edm_utils.py (review thread, outdated and resolved)
grl/generative_models/edm_diffusion_model/test.ipynb (review thread, outdated and resolved)
return loss


def _get_sigma_steps_t_steps(self, num_steps=18, epsilon_s=1e-3, rho=7):
Member


If this method is called with the same arguments during training, call it once and reuse the result rather than recomputing it on every call.
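A minimal sketch of this caching, assuming the schedule depends only on the method's arguments; the class name, cache attribute, and default sigma values below are illustrative and not taken from the PR.

import torch

class EDMScheduleCache:
    """Illustrative only: compute the sigma/t schedule once per argument tuple."""

    def __init__(self, sigma_min: float = 0.002, sigma_max: float = 80.0):
        self.sigma_min = sigma_min
        self.sigma_max = sigma_max
        self._schedule_cache = {}

    def get_sigma_steps_t_steps(self, num_steps=18, epsilon_s=1e-3, rho=7):
        # epsilon_s is kept only to mirror the reviewed signature; it is part
        # of the cache key so other schedule types could reuse this pattern.
        key = (num_steps, epsilon_s, rho)
        if key not in self._schedule_cache:
            # Karras et al. (2022) rho-spaced noise levels; in the EDM
            # parameterization t_i = sigma_i, so both tensors coincide.
            i = torch.arange(num_steps, dtype=torch.float64)
            sigma_steps = (
                self.sigma_max ** (1 / rho)
                + i / (num_steps - 1)
                * (self.sigma_min ** (1 / rho) - self.sigma_max ** (1 / rho))
            ) ** rho
            self._schedule_cache[key] = (sigma_steps, sigma_steps.clone())
        return self._schedule_cache[key]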

    # input- and noise-conditioning coefficients for the preceding precondition type
    c_in = 1 / (sigma ** 2 + 1).sqrt()
    c_noise = self.M - 1 - self.round_sigma(sigma, return_index=True).to(torch.float32)
elif self.precondition_type == "EDM":
    # EDM preconditioning coefficients (Karras et al., 2022)
    c_skip = self.sigma_data ** 2 / (sigma ** 2 + self.sigma_data ** 2)
Member


Constants such as self.sigma_data ** 2 can be pre-computed once to save computation.
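A minimal sketch of this precomputation, assuming the constant is stored once at construction time; the class and the attribute name sigma_data_sq are assumptions for illustration, not the PR's implementation.

import torch

class EDMPreconditioner:
    """Illustrative only: cache constants instead of re-evaluating them per call."""

    def __init__(self, sigma_data: float = 0.5):
        self.sigma_data = sigma_data
        # Pre-compute the squared constant once rather than computing
        # self.sigma_data ** 2 on every forward pass.
        self.sigma_data_sq = sigma_data ** 2

    def c_skip(self, sigma: torch.Tensor) -> torch.Tensor:
        # Same expression as the reviewed line, using the cached square.
        return self.sigma_data_sq / (sigma ** 2 + self.sigma_data_sq)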

@zjowowen force-pushed the main branch 3 times, most recently from 49f5fc0 to 9fa1b41 on October 31, 2024, 13:16
Labels: None yet
Projects: None yet
2 participants