## Normalize ##
```lua
module = nn.Normalize(p, [dim], [eps])
```
Normalizes the input Tensor to have unit `L_p` norm over dimension `dim` (by default -1, i.e., the last dimension). The smoothing parameter `eps` prevents division by zero when the input contains all zero elements (default = `1e-10`).
The `dim` parameter can take both positive and negative values (in which case it is counted from the end). Negative dimensions are especially useful if one wants the module to be invariant to batch mode.
```lua
A = torch.randn(3, 5)
m = nn.Normalize(2)
B = m:forward(A) -- B is also 3 x 5
print(torch.norm(B, 2, 2)) -- norms are [1, 1, 1]
```
Here is an example of normalizing the feature maps of an image:

```lua
I = torch.randn(2, 3, 2, 2)
m = nn.Normalize(1, -3) -- normalize over the third dimension from the end
B = m:forward(I)
print(torch.norm(B, 1, 2))
```
`Normalize` has a specialized implementation for the `inf` norm, which corresponds to the maximum norm.
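As a quick sketch of the max-norm case (assuming the constructor accepts `math.huge` for the `inf` norm, as in the Torch convention), each row is divided by its largest absolute entry:

```lua
require 'nn'

m = nn.Normalize(math.huge)  -- dispatches to the specialized inf-norm path
A = torch.randn(3, 5)
B = m:forward(A)
print(torch.abs(B):max(2))   -- the maximum absolute value in each row is 1
```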