Commit

[Chore] add dummy lora attention processors to prevent failures in other libs (huggingface#8777)

sayakpaul authored Jul 3, 2024
1 parent 84bbd2f commit 06ee4db
Showing 1 changed file with 20 additions and 0 deletions: src/diffusers/models/attention_processor.py
@@ -2775,6 +2775,26 @@ def __call__(
return hidden_states


class LoRAAttnProcessor:
def __init__(self):
pass


class LoRAAttnProcessor2_0:
def __init__(self):
pass


class LoRAXFormersAttnProcessor:
def __init__(self):
pass


class LoRAAttnAddedKVProcessor:
def __init__(self):
pass


ADDED_KV_ATTENTION_PROCESSORS = (
AttnAddedKVProcessor,
SlicedAttnAddedKVProcessor,

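As a minimal sketch of why these no-op classes matter: downstream libraries historically imported these processor names from `diffusers.models.attention_processor` and branched on them with `isinstance` checks, so removing the classes outright would raise `ImportError` in those libraries. The downstream helper below (`is_lora_processor`, and the `LORA_PROCESSOR_CLASSES` tuple) is a hypothetical illustration, not code from diffusers; the class bodies mirror the commit's additions.

```python
# Two of the no-op placeholder classes added by this commit, reproduced
# standalone so the downstream-usage pattern can be shown without
# importing diffusers.

class LoRAAttnProcessor:
    """No-op placeholder kept so downstream imports keep working."""

    def __init__(self):
        pass


class LoRAAttnProcessor2_0:
    """No-op placeholder kept so downstream imports keep working."""

    def __init__(self):
        pass


# Hypothetical downstream pattern: a library groups the LoRA processor
# classes into a tuple and uses isinstance checks to detect them. If the
# classes were deleted from diffusers, the import itself would fail
# before any check could run.
LORA_PROCESSOR_CLASSES = (LoRAAttnProcessor, LoRAAttnProcessor2_0)


def is_lora_processor(processor) -> bool:
    """Return True if `processor` is one of the (dummy) LoRA processors."""
    return isinstance(processor, LORA_PROCESSOR_CLASSES)
```

Because the classes only need to exist and be importable, empty `__init__` bodies are enough to keep such checks from crashing.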