Commit 62effe1

updated the docs
1 parent 1e5a24c commit 62effe1

12 files changed: +17 -14 lines changed

docs/docs/callbacks/train_logger.mdx

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ TrainLogger is a NeuralPy Callback class that generates training logs. It genera
 
 ## Supported Arguments
 
-- `path`: (String) Directory where the logs will be stored
+- `path`: (String) Directory where the logs will be stored
 
 ## Example Code
 
docs/docs/layers/activation_functions/gelu.mdx

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To learn more about GELU, please check PyTorch [documentation](https://pytorch.o
 
 ## Supported Arguments
 
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/leaky_relu.mdx

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ To learn more about LeakyReLU, please check PyTorch [documentation](https://pyto
 ## Supported Arguments
 
 - `negative_slope`: (Float) A negative slope for the LeakyReLU, default value is 0.01
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/relu.mdx

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To learn more about ReLU, please check PyTorch [documentation](https://pytorch.o
 
 ## Supported Arguments
 
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/selu.mdx

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To learn more about SELU, please check PyTorch [documentation](https://pytorch.o
 
 ## Supported Arguments
 
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/sigmoid.mdx

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To learn more about Sigmoid, please check PyTorch [documentation](https://pytorc
 
 ## Supported Arguments
 
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/softmax.mdx

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ To learn more about Softmax, please check PyTorch [documentation](https://pytorc
 ## Supported Arguments
 
 - `dim`: (Integer) A dimension along which Softmax will be computed (so every slice along dim will sum to 1).
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/activation_functions/tanh.mdx

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ To learn more about Tanh, please check PyTorch [documentation](https://pytorch.o
 
 ## Supported Arguments
 
-- `name`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
+- `name=None`: (String) Name of the activation function layer, if not provided then automatically calculates a unique name for the layer.
 
 ## Example Code
 
docs/docs/layers/linear/bilinear.mdx

Lines changed: 3 additions & 2 deletions
@@ -14,12 +14,13 @@ hide_title: true
 neuralpy.layers.linear.Bilinear(n_nodes, n1_features=None, n2_features=None, bias=True, name=None)
 ```
 
-:::info
+:::danger
 
-Bilinear Layer is mostly stable and can be used for any project. In the future, any chance of breaking changes is very low.
+Bilinear Layer is unstable and buggy, not ready for any real use
 
 :::
 
+
 Bilinear layer performs a bilinear transformation of the input.
 
 To learn more about Bilinear layers, please check [pytorch documentation](https://pytorch.org/docs/stable/nn.html?highlight=bilinear) for it.

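The signature in the hunk above is enough for a construction sketch; note that the commit itself now flags the layer as unstable, so this is illustrative only (argument values are made up):

```python
# Arguments taken from the Bilinear signature shown in the diff above.
from neuralpy.layers.linear import Bilinear

# Bilinear transform of two inputs (32 and 64 features) into 16 output nodes.
bilinear = Bilinear(n_nodes=16, n1_features=32, n2_features=64, bias=True, name="bilinear_1")
```
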
docs/docs/layers/pooling/avgpool3d.mdx

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 ---
 id: avgpool3d
 title: AvgPool3D
-sidebar_label: AvgPool1D
+sidebar_label: AvgPool3D
 slug: /layers/pooling-layers/avgpool3d
 description: Applies Batch Normalization over a 2D or 3D input
 image: https://user-images.githubusercontent.com/34741145/81591141-99752900-93d9-11ea-9ef6-cc2c68daaa19.png
