Remove normalize_inputs and replace with parameter transform #431

Open
wants to merge 4 commits into base: main

Commits on Oct 30, 2024

  1. fix sobol generator multi stimuli reshape (facebookresearch#422)

    Summary:
    Pull Request resolved: facebookresearch#422
    
    Sobol generators would return the incorrect shape when handling multi-stimuli generation. This did not cause problems in practice because the ask conversion inadvertently avoided the issue.
    
    Fixed the behavior and clarified in the docstring what should happen.
    
    This is somewhat of a band-aid fix; tensor shapes may need to be unified more carefully when n-choice models are implemented.
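    For context, a minimal sketch of the intended shape contract, built on PyTorch's `SobolEngine`; the function and argument names here are illustrative, not AEPsych's actual signature:

    ```python
    import torch
    from torch.quasirandom import SobolEngine

    def gen_sobol(lb: torch.Tensor, ub: torch.Tensor, num_points: int,
                  stimuli_per_trial: int = 1) -> torch.Tensor:
        """Draw quasi-random points inside [lb, ub]; illustrative only."""
        dim = lb.shape[0]
        engine = SobolEngine(dimension=dim * stimuli_per_trial, scramble=True)
        draws = engine.draw(num_points)  # [num_points, dim * stimuli_per_trial]
        draws = draws.reshape(num_points, stimuli_per_trial, dim)
        points = lb + (ub - lb) * draws  # scale the unit cube into the bounds
        if stimuli_per_trial == 1:
            return points.squeeze(1)    # [num_points, dim]
        return points.permute(0, 2, 1)  # [num_points, dim, stimuli_per_trial]
    ```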
    
    Differential Revision: D65239074
    JasonKChow authored and facebook-github-bot committed Oct 30, 2024
    0acb6c5

Commits on Nov 1, 2024

  1. implement parameter transforms as generator/model wrappers (facebookresearch#401)
    
    Summary:
    Pull Request resolved: facebookresearch#401
    
    Parameter transforms will be handled by wrapping generator and model objects.
    
    The wrappers surface the base object's API completely and even appear to be the wrapped object upon type inspection. Methods that require the transformations are overridden by the wrapper to apply the required (un)transforms.
    
    The wrappers expect transforms from BoTorch, and new transforms should follow BoTorch's InputTransform API.
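    A rough sketch of the wrapping idea (hypothetical names and details, not necessarily the exact classes in this diff): delegate the full API to the wrapped object and report the wrapped object's type on inspection:

    ```python
    class ParameterTransformedGenerator:
        """Sketch of the wrapper pattern; illustrative, not the actual class."""

        def __init__(self, generator, transforms):
            self._base = generator
            self._transforms = transforms

        def gen(self, num_points=1, **kwargs):
            # Generate in transformed space, then map points back to raw space.
            x = self._base.gen(num_points, **kwargs)
            return self._transforms.untransform(x)

        def __getattr__(self, name):
            # Surface the rest of the base object's API unchanged.
            return getattr(self._base, name)

        @property
        def __class__(self):
            # Make isinstance()/type inspection report the wrapped type.
            return type(self._base)
    ```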
    
    As a baseline, a log10 transform is implemented.
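    A minimal sketch of what such a log10 transform could look like, following BoTorch's `ReversibleInputTransform` pattern (assumed here; the implementation in the diff may differ):

    ```python
    import torch
    from botorch.models.transforms.input import ReversibleInputTransform

    class Log10(ReversibleInputTransform, torch.nn.Module):
        """Log10-scale all parameters; illustrative sketch only."""

        def __init__(self):
            super().__init__()
            # Flags the BoTorch InputTransform machinery expects.
            self.transform_on_train = True
            self.transform_on_eval = True
            self.transform_on_fantasize = True
            self.reverse = False

        def _transform(self, X: torch.Tensor) -> torch.Tensor:
            return X.log10()

        def _untransform(self, X: torch.Tensor) -> torch.Tensor:
            return torch.pow(10.0, X)
    ```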
    
    Differential Revision: D64129439
    JasonKChow authored and facebook-github-bot committed Nov 1, 2024
    93000cc
  2. pass transforms around instead of making duplicates

    Summary: Instead of creating a duplicate transform whenever we need one, we create a single transform from the config and initialize the wrapped model and wrapped generators with that one transform. Passing the same transform object around lets transforms that learn parameters stay synced across the wrapped objects.
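    Roughly, the intended usage looks like the following. This is a sketch reusing the hypothetical wrapper and `Log10` transform from above; `ParameterTransformedModel` is likewise a stand-in, and the `SobolGenerator`/`GPClassificationModel` constructor signatures are assumed:

    ```python
    import torch

    lb = torch.tensor([1e-3, 0.0])
    ub = torch.tensor([1e2, 1.0])

    # One transform instance, built once (e.g., from the config)...
    transforms = Log10()

    # ...handed to every wrapper, rather than constructing a fresh copy per object.
    generator = ParameterTransformedGenerator(SobolGenerator(lb, ub), transforms)
    model = ParameterTransformedModel(GPClassificationModel(lb, ub), transforms)

    # Both wrappers reference the same object, so a transform that learns
    # parameters stays in sync across wrapped objects.
    assert generator._transforms is model._transforms
    ```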
    
    Differential Revision: D65155103
    JasonKChow authored and facebook-github-bot committed Nov 1, 2024
    46d0957
  3. Remove normalize_inputs and replace with parameter transform (facebookresearch#431)
    
    Summary:
    Pull Request resolved: facebookresearch#431
    
    `normalize_inputs` (the one that min-max scales parameters) is confusingly named (there's another `normalize_inputs` that concatenates data and ensures it is all the right type) and is a hard-coded transformation applied to all parameters. This means that there is no way to turn the behavior off selectively, nor is it obvious that it is happening.
    
    This diff removes the normalize_inputs method and replaces it with a parameter transform, which also allows selective application of the transform via an index.
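    A sketch of the replacement idea, again following BoTorch's `ReversibleInputTransform` pattern, with an `indices` argument so the min-max scaling can be restricted to a subset of parameters (class name and details hypothetical):

    ```python
    import torch
    from botorch.models.transforms.input import ReversibleInputTransform

    class MinMaxScale(ReversibleInputTransform, torch.nn.Module):
        """Min-max scale only the parameters listed in `indices`; sketch only."""

        def __init__(self, lb, ub, indices=None):
            super().__init__()
            self.transform_on_train = True
            self.transform_on_eval = True
            self.transform_on_fantasize = True
            self.reverse = False
            self.register_buffer("lb", torch.as_tensor(lb))
            self.register_buffer("ub", torch.as_tensor(ub))
            # Default to every parameter, i.e., the old hard-coded behavior.
            self.indices = list(range(len(self.lb))) if indices is None else list(indices)

        def _transform(self, X: torch.Tensor) -> torch.Tensor:
            lb = self.lb[self.indices].to(X)
            ub = self.ub[self.indices].to(X)
            X = X.clone()
            X[..., self.indices] = (X[..., self.indices] - lb) / (ub - lb)
            return X

        def _untransform(self, X: torch.Tensor) -> torch.Tensor:
            lb = self.lb[self.indices].to(X)
            ub = self.ub[self.indices].to(X)
            X = X.clone()
            X[..., self.indices] = X[..., self.indices] * (ub - lb) + lb
            return X
    ```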
    
    Differential Revision: D65069497
    JasonKChow authored and facebook-github-bot committed Nov 1, 2024
    1a07ce2