updated documentation to include fairness loss function.
saist1993 committed Sep 12, 2022
1 parent 48e84f9 commit cbaa4dd
Showing 4 changed files with 12 additions and 0 deletions.
4 changes: 4 additions & 0 deletions README.md
@@ -49,6 +49,10 @@ for inputs, labels, protected_attributes in train_iterator:
optimizer.step()
```


We highly recommend **standardizing features** by removing the mean and scaling to unit variance.
This can be done using the `StandardScaler` module in sklearn.
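A minimal sketch of this preprocessing step, assuming the training features are available as a NumPy array `X_train` (the variable name is illustrative):

```python
from sklearn.preprocessing import StandardScaler
import numpy as np

# Illustrative feature matrix; replace with your own training data.
X_train = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])

scaler = StandardScaler()
# Fit on the training split only, then reuse the fitted scaler for validation/test data.
X_train_scaled = scaler.fit_transform(X_train)
```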

# Citation
```
@article{maheshwari2022fairgrad,
3 changes: 3 additions & 0 deletions docs/source/introduction.rst
@@ -51,6 +51,9 @@ Below is a simple example::

For a complete worked-out example, refer to the example folder on GitHub.

We highly recommend **standardizing features** by removing the mean and scaling to unit variance.
This can be done using the ``StandardScaler`` module in sklearn.
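A minimal sketch of this step, assuming the training features are stored in a NumPy array ``X_train`` (the name is illustrative)::

    from sklearn.preprocessing import StandardScaler
    import numpy as np

    X_train = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])  # illustrative data
    scaler = StandardScaler()
    X_train_scaled = scaler.fit_transform(X_train)  # fit on the training split only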

Citation
-------------

1 change: 1 addition & 0 deletions docs/source/reference.rst
@@ -7,3 +7,4 @@ Reference

reference/fairness_function.rst
reference/torch/cross_entropy.rst
reference/torch/fairness_loss.rst
4 changes: 4 additions & 0 deletions docs/source/reference/torch/fairness_loss.rst
@@ -0,0 +1,4 @@
fairgrad.torch.fairness_loss
=============================
.. automodule:: fairgrad.torch.fairness_loss
:members:
