
Activation Checkpointing has hardcoded GPT2Block #32

Open

Description

@le1nux

We should be able to set the modules that are subject to activation checkpointing via the config.

Currently, this is hardcoded:

def is_module_to_apply_activation_checkpointing(submodule: torch.nn.Module):
    return isinstance(submodule, GPT2Block)

see: https://github.com/Modalities/modalities/blob/main/src/modalities/activation_checkpointing.py#L15
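One possible direction, sketched below: read a list of module class names from the config and check submodules against those types instead of the hardcoded GPT2Block. This is only an illustration; the config field name (activation_checkpointing_modules), the helper names, and the dotted class path in the comments are hypothetical and not part of the current Modalities API.

import importlib
from typing import List, Type

import torch


def resolve_module_types(class_paths: List[str]) -> List[Type[torch.nn.Module]]:
    # Turn dotted class paths from the config into actual classes,
    # e.g. "modalities.models.gpt2.gpt2_model.GPT2Block" (path shown for illustration only).
    resolved = []
    for path in class_paths:
        module_name, _, class_name = path.rpartition(".")
        resolved.append(getattr(importlib.import_module(module_name), class_name))
    return resolved


def is_module_to_apply_activation_checkpointing(
    submodule: torch.nn.Module, target_module_types: List[Type[torch.nn.Module]]
) -> bool:
    # Check against the configured module types instead of a single hardcoded class.
    return isinstance(submodule, tuple(target_module_types))


# The predicate could then be partially applied (functools.partial) and passed as the
# check_fn to torch's apply_activation_checkpointing, for example:
# check_fn = functools.partial(
#     is_module_to_apply_activation_checkpointing,
#     target_module_types=resolve_module_types(config.activation_checkpointing_modules),
# )

Accepting class paths (or registered names) in the config would keep the checkpointing wrapper model-agnostic, so supporting a new architecture would not require touching activation_checkpointing.py.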

Metadata

Assignees: no one assigned

Labels: bug (Something isn't working), enhancement (New feature or request), help wanted (Extra attention is needed)
