
pass transforms around instead of making duplicates #416

Open
wants to merge 3 commits into main from export-D65155103

Conversation

JasonKChow
Contributor

Summary: Instead of creating a duplicate transform whenever one is needed, we create a single transform from the config and initialize the wrapped model and wrapped generators with that one transform. Passing the same transform object around lets the transforms learn parameters while staying in sync across all wrapped objects.

Differential Revision: D65155103
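
For illustration, here is a minimal self-contained sketch of the pattern this diff describes: one learnable transform instance is built once and handed to every wrapper by reference, so parameter updates stay in sync. The class names (MinMaxTransform, WrappedModel, WrappedGenerator) are hypothetical stand-ins, not the actual AEPsych API.

```python
import torch

# Hypothetical stand-in for a learnable parameter transform (in AEPsych this
# would follow BoTorch's InputTransform interface).
class MinMaxTransform:
    def __init__(self, dim: int):
        self.lo = torch.full((dim,), float("inf"))
        self.hi = torch.full((dim,), float("-inf"))

    def update(self, x: torch.Tensor) -> None:
        # "Learn" normalization bounds from observed data.
        self.lo = torch.minimum(self.lo, x.min(dim=0).values)
        self.hi = torch.maximum(self.hi, x.max(dim=0).values)

    def transform(self, x: torch.Tensor) -> torch.Tensor:
        return (x - self.lo) / (self.hi - self.lo)

class WrappedModel:
    def __init__(self, transform):
        self.transform = transform  # stored by reference, not copied

class WrappedGenerator:
    def __init__(self, transform):
        self.transform = transform  # the very same object as the model's

# Build the transform once (in AEPsych, from the config) and share it:
t = MinMaxTransform(dim=2)
model, generator = WrappedModel(t), WrappedGenerator(t)
t.update(torch.rand(10, 2))
# Because both wrappers hold the same object, the learned bounds agree:
assert model.transform is generator.transform
```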

@facebook-github-bot added the CLA Signed label on Oct 29, 2024.
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65155103

JasonKChow added commits to JasonKChow/aepsych that referenced this pull request between Oct 29 and Nov 1, 2024, each repeating the summary above; facebook-github-bot acknowledged each export with the same Phabricator comment.

@JasonKChow force-pushed the export-D65155103 branch 2 times, most recently from b85b9fb to fb39cf3 on October 30, 2024 at 22:16.
Summary:
Pull Request resolved: facebookresearch#422

Sobol generators would return an incorrect shape when handling multi-stimulus generation. This did not cause problems in practice because the conversion done in ask inadvertently avoided the issue.

Fixed the shape and clarified in the docstring what should happen.

This is something of a band-aid fix; tensor shapes may need to be unified more carefully when n-choice models are implemented.

Differential Revision: D65239074
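
As a hedged illustration of the shape contract described above, assuming a (num_points, dim, stimuli_per_trial) convention for multi-stimulus candidates (the actual AEPsych convention may differ, and this is not the PR's code), drawing Sobol points jointly and reshaping might look like:

```python
import torch
from torch.quasirandom import SobolEngine

# Assumed convention: multi-stimulus candidates have shape
# (num_points, dim, stimuli_per_trial).
num_points, dim, stimuli_per_trial = 5, 2, 2
engine = SobolEngine(dimension=dim * stimuli_per_trial, scramble=True)
draws = engine.draw(num_points)  # shape: (num_points, dim * stimuli_per_trial)
points = draws.reshape(num_points, stimuli_per_trial, dim).transpose(-1, -2)
assert points.shape == (num_points, dim, stimuli_per_trial)
```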
@JasonKChow force-pushed the export-D65155103 branch 2 times, most recently from b300da4 to f503e5f on October 31, 2024 at 23:28.
… (facebookresearch#401)

Summary:
Pull Request resolved: facebookresearch#401

Parameter transforms will be handled by wrapping generator and model objects.

The wrappers surface the base object's API completely and even appear to be the wrapped object upon type inspection. Methods that require the transformations are overridden by the wrapper to apply the required (un)transforms.

The wrappers expect transforms from BoTorch, and new transforms should follow BoTorch's InputTransform interface.

As a baseline, a log10 transform is implemented.

Differential Revision: D64129439
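
A minimal sketch of both ideas, assuming BoTorch's InputTransform interface (transform/untransform plus the transform_on_* flags); the wrapper and class names here are illustrative, not the diff's actual implementation:

```python
import torch
from botorch.models.transforms.input import InputTransform

class Log10Transform(InputTransform, torch.nn.Module):
    """Illustrative log10 parameter transform in the style of BoTorch's
    InputTransform (not the diff's actual implementation)."""

    def __init__(self):
        super().__init__()
        self.transform_on_train = True
        self.transform_on_eval = True
        self.transform_on_fantasize = True

    def transform(self, X: torch.Tensor) -> torch.Tensor:
        return X.log10()  # raw parameter space -> transformed space

    def untransform(self, X: torch.Tensor) -> torch.Tensor:
        return torch.pow(10.0, X)  # transformed space -> raw parameter space

class TransformedGenerator:
    """Illustrative wrapper that surfaces the wrapped generator's API via
    __getattr__ and overrides only the methods that need (un)transforms."""

    def __init__(self, generator, transform: InputTransform):
        self._generator = generator
        self._transform = transform

    def __getattr__(self, name):
        # Anything not overridden here falls through to the base object.
        return getattr(self._generator, name)

    def gen(self, *args, **kwargs):
        # Generate in transformed space, return points in raw space.
        return self._transform.untransform(self._generator.gen(*args, **kwargs))
```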
Labels: CLA Signed, fb-exported
Projects: None yet
2 participants