
normalize function doesn't actually normalize anything in blending functions #10

blepping opened this issue Aug 5, 2024

The `normalize()` in `latent_utils.py` never actually has an effect, because `target_min` and `target_max` are never passed: it just rescales the latent to its own min/max.

Strictly speaking the output isn't identical, due to small imprecision from floating point math, but I verified that the output passes `torch.isclose(input, output, atol=1e-05, rtol=1e-05)`. The difference is enough to change seeds, but no actual normalization is occurring in blend functions like

```python
    # Simulates a brightening effect by adding tensor b to tensor a, scaled by t.
    'linear dodge': lambda a, b, t: normalize(a + b * t),
```
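
For reference, here is a minimal reproduction of the no-op behavior. The `normalize` below is a paraphrase of what the current implementation appears to do, not a verbatim copy of `latent_utils.py`:

```python
import torch

def normalize(latent, target_min=None, target_max=None, dim=(-3, -2, -1)):
    # Paraphrase of the current behavior: the target range defaults to the
    # latent's own min/max, so without explicit targets the rescale is an
    # identity up to floating point error.
    min_val = latent.amin(dim=dim, keepdim=True)
    max_val = latent.amax(dim=dim, keepdim=True)
    if target_min is None:
        target_min = min_val
    if target_max is None:
        target_max = max_val
    normalized = (latent - min_val) / (max_val - min_val)
    return normalized * (target_max - target_min) + target_min

latent = torch.randn(1, 4, 64, 64)
out = normalize(latent)  # no target range passed, as in the blend functions
print(torch.isclose(latent, out, atol=1e-05, rtol=1e-05).all())  # tensor(True)
```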

Since there are two latents involved, you could possibly do something like:

```python
def normalize(latent, *, reference_latent=None, dim=(-3, -2, -1)):
    # Without a reference there is nothing meaningful to rescale to.
    if reference_latent is None:
        return latent
    min_val, max_val = (
        latent.amin(dim=dim, keepdim=True),
        latent.amax(dim=dim, keepdim=True),
    )
    target_min, target_max = (
        reference_latent.amin(dim=dim, keepdim=True),
        reference_latent.amax(dim=dim, keepdim=True),
    )

    # Rescale to [0, 1], then into the reference latent's range.
    normalized = (latent - min_val) / (max_val - min_val)
    return normalized * (target_max - target_min) + target_min
```

and

```python
    # Simulates a brightening effect by adding tensor b to tensor a, scaled by t.
    'linear dodge': lambda a, b, t: normalize(a + b * t, reference_latent=a),
```

to normalize the result to the same scale as `a` (`reference_latent` is keyword-only in the signature above, so it has to be passed by name). I'm not really sure what's reasonable here, since picking `a` as the reference seems arbitrary.
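
A quick sanity check of the proposed version (the shapes and values below are just illustrative):

```python
import torch

a = torch.randn(1, 4, 64, 64)
b = torch.randn(1, 4, 64, 64) * 3  # deliberately larger range than a
blended = normalize(a + b * 0.5, reference_latent=a)

# The blended latent's min/max now match a's min/max per sample.
print(blended.amin(dim=(-3, -2, -1)), a.amin(dim=(-3, -2, -1)))
print(blended.amax(dim=(-3, -2, -1)), a.amax(dim=(-3, -2, -1)))
```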
