
Legibility of saved tensors #69

Open
@Varal7

Description

The commit pytorch/pytorch@1efa863 changes the way variables are saved during the forward pass (in certain cases).

This can create comical graphs when visualized with torchviz because the saved variable shares the same id as the input.

For instance:

import torch
from torchviz import make_dot

a = torch.randn(5, requires_grad=True)
y = a * a
make_dot(y, show_saved=True)

Before:

(graph image: the input appears as a blue node and each saved tensor is drawn as its own orange node)

After pytorch/pytorch@1efa863:

(graph image)

In the second image, the blue node and the two orange nodes are merged into one.
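For reference, the id sharing can be checked directly. A minimal sketch, assuming a PyTorch version that exposes saved tensors through the grad_fn._saved_* attributes:

import torch

a = torch.randn(5, requires_grad=True)
y = a * a

# MulBackward0 exposes the tensor it saved for backward as _saved_self.
saved = y.grad_fn._saved_self

# After the commit above, unpacking the saved variable can hand back the input
# tensor itself; torchviz keys graph nodes by id(), which is why the blue input
# node and the orange saved-tensor nodes collapse into one.
print(id(saved) == id(a))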

I'm opening this issue to discuss how we want to fix this.

One option is to revert to the old behavior by giving each node a unique name (so something like dot.node(SAVED_PREFIX + attr + str(id(val)), get_var_name(val, attr), fillcolor='orange')); a sketch of what that could look like follows the next paragraph.
The drawback is that the user loses the information that those three nodes are indeed backed by the same tensor.
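A rough sketch of that first option inside make_dot's loop over saved tensors (the loop is paraphrased here, not the exact torchviz source; SAVED_PREFIX, get_var_name and dot are the names used in the snippet above):

# Hypothetical excerpt of make_dot's handling of a grad_fn's saved tensors.
for attr in dir(fn):
    if not attr.startswith(SAVED_PREFIX):
        continue
    val = getattr(fn, attr)
    attr = attr[len(SAVED_PREFIX):]
    if torch.is_tensor(val):
        # Keying the node on the attribute name plus the tensor id keeps one
        # orange node per saved attribute, even when several attributes (or
        # the input itself) are backed by the same tensor object.
        node_name = SAVED_PREFIX + attr + str(id(val))
        dot.node(node_name, get_var_name(val, attr), fillcolor='orange')
        dot.edge(str(id(fn)), node_name, dir='none')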

Another option would be to draw a dotted edge between the saved variable and its base, for instance:

(graph image: proposed rendering with a dotted edge between the saved-tensor node and its base)
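A sketch of how such a dotted edge could be added with graphviz, assuming the saved-tensor node gets a unique name as in the first option and the base tensor already has a node keyed by str(id(val)) (illustrative names, not the actual torchviz code):

node_name = SAVED_PREFIX + attr + str(id(val))
dot.node(node_name, get_var_name(val, attr), fillcolor='orange')

# Dotted edge from the saved-variable node back to the node of the tensor
# that backs it, so the shared storage stays visible without merging nodes.
dot.edge(node_name, str(id(val)), style='dotted')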

cc @soulitzer @albanD
