
Add scaled_dot_product_attention fallback to controlmodel_ipadapter.py for PyTorch v1 compatibility #2707

Open
wants to merge 3 commits into base: main

Commits on Mar 29, 2024

  1. Add scaled_dot_product_attention fallback to controlmodel_ipadapter.py for PyTorch v1 compatibility
     ochen1 committed Mar 29, 2024
     90bc061

Commits on Mar 31, 2024

  1. 34b964f
  2. 642cc4d
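
The diff itself is not shown here, but the PR title describes a common compatibility pattern: `torch.nn.functional.scaled_dot_product_attention` only exists in PyTorch 2.x, so code that must also run on PyTorch 1.x needs a manual fallback. The sketch below illustrates what such a fallback typically looks like; the exact function name, argument handling, and placement in controlmodel_ipadapter.py are assumptions for illustration, not the PR's actual code.

```python
import math
import torch
import torch.nn.functional as F

# Hypothetical fallback sketch: use the native kernel when available
# (PyTorch >= 2.0), otherwise compute softmax(Q K^T / sqrt(d)) V explicitly.
if hasattr(F, "scaled_dot_product_attention"):
    scaled_dot_product_attention = F.scaled_dot_product_attention
else:
    def scaled_dot_product_attention(query, key, value, attn_mask=None,
                                     dropout_p=0.0, is_causal=False):
        # query/key/value: (..., seq_len, head_dim)
        scale = 1.0 / math.sqrt(query.size(-1))
        attn = query @ key.transpose(-2, -1) * scale
        if is_causal:
            L, S = query.size(-2), key.size(-2)
            causal = torch.ones(L, S, dtype=torch.bool,
                                device=query.device).tril()
            attn = attn.masked_fill(~causal, float("-inf"))
        if attn_mask is not None:
            if attn_mask.dtype == torch.bool:
                attn = attn.masked_fill(~attn_mask, float("-inf"))
            else:
                attn = attn + attn_mask
        attn = attn.softmax(dim=-1)
        if dropout_p > 0.0:
            attn = torch.dropout(attn, dropout_p, train=True)
        return attn @ value
```

With a shim like this, call sites can use `scaled_dot_product_attention(q, k, v)` unconditionally and get the fused PyTorch 2.x kernel when it is available, while still working on PyTorch 1.x.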