Remove normalize_inputs and replace with parameter transform #431

Open
JasonKChow wants to merge 4 commits into main

Conversation

JasonKChow
Contributor

Summary:
`normalize_inputs` (the one that min-max scales parameters) is confusingly named (there's another `normalize_inputs` that concatenates data and ensures it is all the right type) and is a hard-coded transformation applied to all parameters. This means there is no way to turn the behavior off selectively, nor is it obvious that it is happening at all.

This diff removes the `normalize_inputs` method and replaces it with a parameter transform, which also allows selective application of the transform via an index.
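
For illustration, here is a minimal sketch of the selective-scaling idea using BoTorch's `Normalize` input transform and its `indices` argument (the actual transform introduced by this diff may differ in name and configuration):

```python
# Minimal sketch (illustrative, not the AEPsych API): min-max scale only some
# parameters by passing column indices to BoTorch's Normalize input transform.
import torch
from botorch.models.transforms.input import Normalize

# Three parameters; normalize columns 0 and 2 only, leave column 1 untouched.
transform = Normalize(d=3, indices=[0, 2])

X = torch.tensor([[0.0, 5.0, 10.0], [4.0, 7.0, 20.0]])
transform.train()                        # in train mode Normalize learns bounds from X
X_scaled = transform(X)                  # selected columns mapped to [0, 1]
X_raw = transform.untransform(X_scaled)  # inverse transform back to raw space
```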

Differential Revision: D65069497

Summary:
Pull Request resolved: facebookresearch#422

Sobol generators would return an incorrect shape when handling multi-stimuli generation. This did not cause problems because the conversion in `ask` inadvertently avoided the issue.

Fixed the shape and clarified in the docstring what should happen.

This is somewhat of a band-aid fix; tensor shapes may need to be unified more carefully when n-choice models are implemented.
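
As a purely illustrative note on shapes (the layout below is an assumed convention, not necessarily the one this fix settles on), a multi-stimuli Sobol draw can be laid out so each trial carries its own block of stimuli:

```python
# Illustrative only: draw quasi-random points for multi-stimuli trials and
# reshape them so each trial holds `n_stimuli` stimuli of dimension `dim`.
# The (n_trials, n_stimuli, dim) layout here is a hypothetical convention.
import torch
from torch.quasirandom import SobolEngine

n_trials, n_stimuli, dim = 5, 2, 3
engine = SobolEngine(dimension=dim, scramble=True)
points = engine.draw(n_trials * n_stimuli)          # (n_trials * n_stimuli, dim)
points = points.reshape(n_trials, n_stimuli, dim)   # one block of stimuli per trial
```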

Differential Revision: D65239074
@facebook-github-bot added the CLA Signed label Nov 1, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65069497

…esearch#401)

Summary:
Pull Request resolved: facebookresearch#401

Parameter transforms will be handled by wrapping generator and model objects.

The wrappers surface the base object's API completely and even appear to be the wrapped object upon type inspection. Methods that require the transformations are overridden by the wrapper to apply the required (un)transforms.

The wrappers expect transforms from BoTorch, and new transforms should follow BoTorch's `InputTransform` interface.

As a baseline, a log10 transform is implemented.
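
A minimal sketch of the wrapping idea (hypothetical names, not the actual AEPsych wrapper): proxy every attribute of the wrapped object and override only the methods that need (un)transformed inputs, with BoTorch's `Log10` as the baseline transform:

```python
# Sketch only: a wrapper that forwards everything to the wrapped generator and
# untransforms generated points back into raw parameter space.
import torch
from botorch.models.transforms.input import Log10


class TransformedGeneratorSketch:
    def __init__(self, generator, transform):
        self._generator = generator
        self._transform = transform

    def __getattr__(self, name):
        # Fall through to the wrapped generator for anything we don't override.
        return getattr(self._generator, name)

    def gen(self, num_points: int) -> torch.Tensor:
        # Generate in transformed space, then map back to the raw parameter space.
        transformed = self._generator.gen(num_points)
        return self._transform.untransform(transformed)


# Baseline transform: log10-scale the first parameter only.
log10 = Log10(indices=[0])
```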

Differential Revision: D64129439
JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 1, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65069497

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 1, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65069497

Summary: Instead of creating duplicate transforms whenever we need one, we create a single transform from the config and initialize the wrapped model and wrapped generators with that one transform. Passing the same transform object around allows the transformations to learn parameters while staying in sync across all wrapped objects.
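
A hypothetical sketch of the single-shared-transform idea (names are illustrative, not the actual config API): build the transform once and hand the same instance to every wrapper, so any state the transform learns stays in sync:

```python
# Sketch only: one transform instance shared by all wrapped objects.
from botorch.models.transforms.input import ChainedInputTransform, Log10, Normalize

# Build the transform once from the (hypothetical) configured options...
shared_transform = ChainedInputTransform(
    log10=Log10(indices=[0]),
    normalize=Normalize(d=2),
)

# ...and pass the same instance everywhere, so parameters learned by the
# transform (e.g. Normalize bounds learned in train mode) are never duplicated.
# wrapped_model = ModelWrapperSketch(model, transform=shared_transform)              # hypothetical
# wrapped_generator = GeneratorWrapperSketch(generator, transform=shared_transform)  # hypothetical
```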

Differential Revision: D65155103
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65069497

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 1, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D65069497

Labels: CLA Signed, fb-exported