[Bugfix]Change clip_grad_norm_fp8 to clip_grad_norm_fp32 (#97)
**Description**

The function name clip_grad_norm_fp8 in msamp/megatron/__init__.py is wrong; it should be clip_grad_norm_fp32. The gpt3 example cannot run successfully until this PR is merged.
tocean authored Aug 23, 2023
1 parent 9f6d59e commit aed29d6
Showing 1 changed file with 2 additions and 2 deletions.
msamp/megatron/__init__.py (4 changes: 2 additions & 2 deletions)

@@ -3,7 +3,7 @@

 """Expose the interface of MS-AMP megatron package."""

-from msamp.megatron.optimizer.clip_grads import clip_grad_norm_fp8
+from msamp.megatron.optimizer.clip_grads import clip_grad_norm_fp32
 from msamp.megatron.distributed import FP8DistributedDataParallel
 from msamp.common.utils.lazy_import import LazyImport

@@ -13,6 +13,6 @@
 FP8DistributedOptimizer = LazyImport('msamp.megatron.optimizer.distrib_optimizer', 'FP8DistributedOptimizer')

 __all__ = [
-    'clip_grad_norm_fp8', 'FP8DistributedDataParallel', 'FP8LinearWithGradAccumulationAndAsyncCommunication',
+    'clip_grad_norm_fp32', 'FP8DistributedDataParallel', 'FP8LinearWithGradAccumulationAndAsyncCommunication',
     'FP8DistributedOptimizer'
 ]
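
As a quick illustration (not part of the commit), the sketch below assumes MS-AMP and its Megatron-LM dependency are installed. After this fix, the name exported at package level matches the function actually defined in msamp.megatron.optimizer.clip_grads, so the import resolves; before the commit, importing msamp.megatron failed because __init__.py referenced the non-existent clip_grad_norm_fp8.

# Sanity-check sketch (assumes MS-AMP and Megatron-LM are installed).
from msamp.megatron import clip_grad_norm_fp32

# The symbol is re-exported from the module imported in __init__.py above.
print(clip_grad_norm_fp32.__module__)  # msamp.megatron.optimizer.clip_grads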
