Motivation
We have a limitation: if someone has a custom operator, the custom operator's kernels for functorch transforms are not allowed to call a functorch transform themselves. This is problematic for developers (e.g. the functionalize rule for cond should call functionalize on both branches) and for users (autograd.Function with functorch -- e.g. a user may not call a functorch transform inside backward); a sketch of the failing pattern follows.
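A minimal sketch of the kind of pattern this limitation blocks, assuming PyTorch 2.x with torch.func available; MySin and its backward rule are hypothetical, purely illustrative, and not taken from the issue:

```python
import torch
from torch.func import grad, vmap  # grad is used in the commented line below

# Hypothetical custom op implemented via autograd.Function; MySin is an
# illustrative name, not something from the issue.
class MySin(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sin(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # The problematic part: calling a functorch transform from inside
        # an operation's implementation (here, the backward rule).
        cos = vmap(torch.cos)(x.unsqueeze(0)).squeeze(0)
        return grad_out * cos

x = torch.randn(3, requires_grad=True)

# Plain autograd is fine: backward runs outside any functorch transform.
MySin.apply(x).sum().backward()

# But composing this op with a transform, e.g.
#   grad(lambda t: MySin.apply(t).sum())(x)
# is where the limitation bites: the op's kernel for the transform would
# need to call a functorch transform itself, which is not allowed today.
```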
Note that in general, users are able to call functorch transforms from within other functorch transforms (e.g. vmap(vmap(vmap(f)))(x), as in the snippet below); the distinction is that one cannot call a functorch transform from inside the implementation of an operation.
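For contrast, a tiny runnable sketch of the nesting that is supported (again assuming torch.func from PyTorch 2.x; f is just an illustrative function):

```python
import torch
from torch.func import vmap

def f(x):
    return x.sin()

x = torch.randn(2, 3, 4)
# Each vmap peels off one leading batch dimension; nesting transforms
# around whole functions like this is supported and works as expected.
out = vmap(vmap(vmap(f)))(x)
assert out.shape == (2, 3, 4)
```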
For further reading
More information in this doc: https://docs.google.com/document/d/1iZ_ioIC8u6osb87Ue43yGrr4ACIQRx9dMlqg2KQx2co/edit#heading=h.49xwcme71bbe
cc @samdow @Chillee @ezyang -- the notion of "levels being a tree" sounds similar to some of the torch_dispatch + mode + functorch generalizations we thought through last year.
Those wrapper tensors aren't escaping from their level. Wrapper tensors with the same level (but that come from different interpreters!) should never interact, assuming a correctly implemented functorch (and non-adversarial user code).
The immediate problem is that we have some internal asserts that say there cannot be two different live interpreters with the same level.