pass transforms around instead of making duplicates #416
base: main
Conversation
This pull request was exported from Phabricator. Differential Revision: D65155103
Summary: Pull Request resolved: facebookresearch#422. Sobol generators would return the incorrect shape when handling multi-stimulus generation. This did not cause problems only because the ask conversion inadvertently avoided the issue. Fixed, and clarified in the docstring what should happen. This is something of a band-aid fix; tensor shapes may need to be unified more carefully when n-choice models are implemented. Differential Revision: D65239074
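The multi-stimulus shape bookkeeping mentioned here can be illustrated with a small sketch. The grouping convention below (one group of `n_stimuli` points per trial) and the helper name are assumptions for illustration, not the library's actual tensor layout:

```python
def reshape_multi_stimuli(flat_points, n_stimuli):
    """Hypothetical helper: group a flat sequence of generated points into
    per-trial groups of n_stimuli points. This illustrates the kind of
    shape handling the fix addresses; the convention is assumed, not
    taken from the AEPsych source."""
    it = iter(flat_points)
    # zip over the same iterator n_stimuli times chunks the sequence
    return list(zip(*[it] * n_stimuli))

# Four generated points grouped into two trials of two stimuli each:
print(reshape_multi_stimuli([0.1, 0.2, 0.3, 0.4], n_stimuli=2))
# → [(0.1, 0.2), (0.3, 0.4)]
```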
Summary: Pull Request resolved: facebookresearch#401. Parameter transforms will be handled by wrapping generator and model objects. The wrappers surface the base object's API completely and even appear to be the wrapped object upon type inspection. Methods that require the transformations are overridden by the wrapper to apply the required (un)transforms. The wrappers expect transforms from BoTorch, and new transforms should follow BoTorch's InputTransforms. As a baseline, a log10 transform is implemented. Differential Revision: D64129439
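A minimal sketch of the wrapper idea described here, assuming a simple transform/untransform interface (the real project builds on BoTorch InputTransforms; `DummyGenerator`, `GeneratorWrapper`, and the method names are hypothetical stand-ins):

```python
import math

class Log10Transform:
    """Illustrative baseline transform with a transform/untransform pair."""
    def transform(self, x):
        return math.log10(x)

    def untransform(self, x):
        return 10 ** x

class GeneratorWrapper:
    """Delegates unknown attributes to the wrapped generator and mimics
    its type under inspection; names here are hypothetical."""
    def __init__(self, generator, transform):
        self._generator = generator
        self._transform = transform

    def __getattr__(self, name):
        # Surface the base object's API transparently.
        return getattr(self._generator, name)

    @property
    def __class__(self):
        # Make isinstance() see the wrapped object's type.
        return type(self._generator)

    def gen(self):
        # Override only the methods that need (un)transforming.
        raw = self._generator.gen()
        return self._transform.untransform(raw)

class DummyGenerator:
    def gen(self):
        return 2.0  # pretend this value lives in log10 space

g = GeneratorWrapper(DummyGenerator(), Log10Transform())
assert isinstance(g, DummyGenerator)  # type inspection sees the base type
assert g.gen() == 100.0               # 10**2.0, untransformed output
```

The `__class__` property trick is the same one `unittest.mock` uses to make mocks pass `isinstance` checks, which matches the summary's claim that the wrapper "appears to be the wrapped object upon type inspection".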
Summary: Instead of creating a duplicate transform whenever we need one, we create a single transform from the config and initialize the wrapped model and wrapped generators with that one transform. Passing the same transform object around lets the transforms learn parameters while staying in sync across the wrapped objects.
Differential Revision: D65155103
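The shared-transform pattern described in this summary can be sketched in plain Python. The class names (`ParameterTransform`, `ModelWrapper`, `GeneratorWrapper`) are hypothetical stand-ins, not the actual AEPsych API:

```python
# Hypothetical sketch of sharing one transform instance across wrappers;
# the class names are illustrative, not the project's real classes.

class ParameterTransform:
    """A transform whose parameters may be learned/updated over time."""
    def __init__(self, scale=1.0):
        self.scale = scale

    def transform(self, x):
        return x * self.scale

class ModelWrapper:
    def __init__(self, model, transform):
        self.model = model
        self.transform = transform  # stores the shared reference, no copy

class GeneratorWrapper:
    def __init__(self, generator, transform):
        self.generator = generator
        self.transform = transform  # same shared reference

# Build ONE transform (e.g., from the config), then hand it to every wrapper.
shared = ParameterTransform()
wrapped_model = ModelWrapper(model=object(), transform=shared)
wrapped_gen = GeneratorWrapper(generator=object(), transform=shared)

# A parameter learned through one wrapper is immediately visible through
# the other, because both hold the same object rather than duplicates.
wrapped_model.transform.scale = 2.0
assert wrapped_gen.transform is wrapped_model.transform
assert wrapped_gen.transform.scale == 2.0
```

Had each wrapper built its own transform from the config, a parameter learned in one copy would silently diverge from the others; sharing the single instance is what keeps them synced.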