[Docathon] Fix No.28 and No.29 API labels #57623

Merged 2 commits on Sep 26, 2023.
python/paddle/incubate/optimizer/lars_momentum.py (1 addition, 1 deletion)

@@ -44,7 +44,7 @@ class LarsMomentumOptimizer(Optimizer):
     This parameter is required in dygraph mode. \
     The default value is None in static graph mode, at this time all parameters will be updated.
     regularization (WeightDecayRegularizer, optional): The strategy of regularization. There are two methods: \
-    :ref:`api_base_regularizer_L1Decay` , :ref:`api_base_regularizer_L2Decay` . If a parameter has set \
+    :ref:`api_paddle_regularizer_L1Decay` , :ref:`api_paddle_regularizer_L2Decay` . If a parameter has set \
     regularizer using :ref:`api_paddle_ParamAttr` already, the regularization setting here in optimizer will be \
     ignored for this parameter. Otherwise, the regularization setting here in optimizer will take effect. \
     Default None, meaning there is no regularization.
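The corrected labels point into the paddle.regularizer module. As a minimal sketch of what the two references resolve to (the coeff values below are illustrative, not defaults):

import paddle

# The two weight-decay regularizers the corrected labels refer to;
# coeff is the coefficient applied to the regularization penalty.
l1 = paddle.regularizer.L1Decay(coeff=0.0001)
l2 = paddle.regularizer.L2Decay(coeff=0.0001)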
python/paddle/incubate/optimizer/lbfgs.py (1 addition, 1 deletion)

@@ -57,7 +57,7 @@ class LBFGS(Optimizer):
     This parameter is required in dygraph mode. The default value is None.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \
python/paddle/optimizer/adadelta.py (1 addition, 1 deletion)

@@ -54,7 +54,7 @@ class Adadelta(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \
python/paddle/optimizer/adam.py (1 addition, 1 deletion)

@@ -72,7 +72,7 @@ class Adam(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization.
     It can be a float value as coeff of L2 regularization or
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already,
     the regularization setting here in optimizer will be ignored for this parameter.
     Otherwise, the regularization setting here in optimizer will take effect.
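The docstring above says weight_decay accepts either a float or a regularizer object. A minimal sketch of both forms (the layer shape and coefficients are illustrative):

import paddle

linear = paddle.nn.Linear(10, 10)

# A plain float is interpreted as the coefficient of L2 regularization.
opt_float = paddle.optimizer.Adam(
    learning_rate=0.001,
    parameters=linear.parameters(),
    weight_decay=0.01,
)

# An explicit regularizer object selects the decay strategy directly.
opt_reg = paddle.optimizer.Adam(
    learning_rate=0.001,
    parameters=linear.parameters(),
    weight_decay=paddle.regularizer.L1Decay(coeff=0.01),
)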
python/paddle/optimizer/adamax.py (1 addition, 1 deletion)

@@ -67,7 +67,7 @@ class Adamax(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization.
     It can be a float value as coeff of L2 regularization or
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already,
     the regularization setting here in optimizer will be ignored for this parameter.
     Otherwise, the regularization setting here in optimizer will take effect.
python/paddle/optimizer/lbfgs.py (1 addition, 1 deletion)

@@ -339,7 +339,7 @@ class LBFGS(Optimizer):
     This parameter is required in dygraph mode. The default value is None.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \
python/paddle/optimizer/momentum.py (1 addition, 1 deletion)

@@ -59,7 +59,7 @@ class Momentum(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \
python/paddle/optimizer/optimizer.py (1 addition, 1 deletion)

@@ -108,7 +108,7 @@ class Optimizer:
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \
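The base-class docstring also spells out the precedence rule: a regularizer attached to a parameter through ParamAttr wins over the optimizer-level setting. A minimal sketch of that behavior (shapes and coefficients are illustrative):

import paddle

# The weight carries its own L1 regularizer via ParamAttr, so per the
# docstring the optimizer-level L2Decay below is ignored for it; the
# bias has no regularizer and falls back to the optimizer setting.
linear = paddle.nn.Linear(
    10,
    10,
    weight_attr=paddle.ParamAttr(
        regularizer=paddle.regularizer.L1Decay(coeff=0.001)
    ),
)
opt = paddle.optimizer.SGD(
    learning_rate=0.01,
    parameters=linear.parameters(),
    weight_decay=paddle.regularizer.L2Decay(coeff=0.01),
)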
python/paddle/optimizer/rmsprop.py (1 addition, 1 deletion)

@@ -91,7 +91,7 @@ class RMSProp(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization.
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already,
     the regularization setting here in optimizer will be ignored for this parameter.
     Otherwise, the regularization setting here in optimizer will take effect.
python/paddle/optimizer/sgd.py (1 addition, 1 deletion)

@@ -40,7 +40,7 @@ class SGD(Optimizer):
     The default value is None in static graph mode, at this time all parameters will be updated.
     weight_decay (float|WeightDecayRegularizer, optional): The strategy of regularization. \
     It can be a float value as coeff of L2 regularization or \
-    :ref:`api_base_regularizer_L1Decay`, :ref:`api_base_regularizer_L2Decay`.
+    :ref:`api_paddle_regularizer_L1Decay`, :ref:`api_paddle_regularizer_L2Decay`.
     If a parameter has set regularizer using :ref:`api_paddle_ParamAttr` already, \
     the regularization setting here in optimizer will be ignored for this parameter. \
     Otherwise, the regularization setting here in optimizer will take effect. \