support for wrapped schedulefree optimizer when using deepspeed (#3266)
* support for wrapped schedulefree optimizer when using deepspeed

* add comment and lint
winglian authored Dec 2, 2024
1 parent c6f34a0 commit 4a100ee
Showing 1 changed file with 7 additions and 0 deletions.
src/accelerate/optimizer.py
@@ -126,6 +126,13 @@ def train(self):
         """
         if hasattr(self.optimizer, "train") and callable(self.optimizer.train):
             self.optimizer.train()
+        elif (
+            hasattr(self.optimizer, "optimizer")
+            and hasattr(self.optimizer.optimizer, "train")
+            and callable(self.optimizer.optimizer.train)
+        ):
+            # the deepspeed optimizer further wraps the optimizer
+            self.optimizer.optimizer.train()
 
     def eval(self):
         """
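For context, here is a minimal, self-contained sketch of the behavior this commit fixes. Schedule-free optimizers (for example, schedulefree.AdamWScheduleFree) expose train()/eval() methods that must be called when switching between training and evaluation. Under DeepSpeed, the optimizer held by Accelerate's wrapper is itself a wrapper that keeps the real optimizer at .optimizer, so a hasattr check on the outer object alone misses the hook. The two classes below are illustrative stand-ins, not the real schedulefree or DeepSpeed APIs; only the forwarding logic mirrors the patch.

class ScheduleFreeLikeOptimizer:
    """Stand-in for a schedule-free optimizer exposing train()/eval() hooks."""

    def train(self):
        print("underlying optimizer switched to train mode")

    def eval(self):
        print("underlying optimizer switched to eval mode")


class DeepSpeedLikeWrapper:
    """Stand-in for a DeepSpeed-style wrapper: it has no train() of its own,
    but keeps the real optimizer at .optimizer."""

    def __init__(self, optimizer):
        self.optimizer = optimizer


def forward_train_call(optimizer):
    # Mirrors the patched AcceleratedOptimizer.train() logic above.
    if hasattr(optimizer, "train") and callable(optimizer.train):
        optimizer.train()
    elif (
        hasattr(optimizer, "optimizer")
        and hasattr(optimizer.optimizer, "train")
        and callable(optimizer.optimizer.train)
    ):
        # the deepspeed optimizer further wraps the optimizer
        optimizer.optimizer.train()


forward_train_call(ScheduleFreeLikeOptimizer())  # direct: the first branch fires
forward_train_call(DeepSpeedLikeWrapper(ScheduleFreeLikeOptimizer()))  # wrapped: the elif fires

Note that the patch attempts only one level of unwrapping, and the duck-typed hasattr/callable checks keep the fallback a no-op for optimizers that define neither method.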
