From 341582bf6eec0772a204d6bf847f7b6ca718fcce Mon Sep 17 00:00:00 2001
From: jafermarq
Date: Tue, 19 Dec 2023 20:39:26 +0100
Subject: [PATCH] Add deprecation notice to the TF-Privacy example

---
 doc/source/ref-changelog.md     | 2 ++
 examples/dp-sgd-mnist/README.md | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/doc/source/ref-changelog.md b/doc/source/ref-changelog.md
index 603e6c602274..a7a76c5b13a0 100644
--- a/doc/source/ref-changelog.md
+++ b/doc/source/ref-changelog.md
@@ -10,6 +10,8 @@

 - FedVSSL [#2412](https://github.com/adap/flower/pull/2412)

+- **Deprecating the TF-Privacy example.** We are bringing a Flower-native way of adding DP and other PETs to your FL setup; this example will be updated accordingly soon ([#2725](https://github.com/adap/flower/pull/2725))
+
 ## v1.6.0 (2023-11-28)

 ### Thanks to our contributors

diff --git a/examples/dp-sgd-mnist/README.md b/examples/dp-sgd-mnist/README.md
index fcf602306c90..890fcb584c08 100644
--- a/examples/dp-sgd-mnist/README.md
+++ b/examples/dp-sgd-mnist/README.md
@@ -1,5 +1,7 @@
 # Flower Example Using TensorFlow/Keras and TensorFlow Privacy

+> This example is deprecated. It will soon be replaced with a Flower-native way of adding DP to FL experiments.
+
 This Flower example trains a federated learning system in which clients are free to choose between non-private and private optimizers. Specifically, clients can train Keras models with either the standard SGD optimizer or __Differentially Private__ SGD (DPSGD) from [TensorFlow Privacy](https://github.com/tensorflow/privacy). The MNIST dataset is split artificially among clients, so the resulting partitions are i.i.d. Clients using DPSGD track the amount of privacy spent and report it at the end of training.
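
For readers who reach the deprecated example before the Flower-native DP support lands, here is a minimal sketch of the two behaviors the README describes: a client choosing between plain SGD and DPSGD, and the privacy accounting that DPSGD clients perform. It assumes TensorFlow Privacy's `DPKerasSGDOptimizer` and `compute_dp_sgd_privacy`; the `build_optimizer` helper and all hyperparameter values are illustrative, not taken from the `dp-sgd-mnist` example itself.

```python
# A sketch of the optimizer choice and privacy accounting described in the
# README above. The build_optimizer helper and all hyperparameter values are
# illustrative, not taken from the dp-sgd-mnist example.
import tensorflow as tf
from tensorflow_privacy.privacy.optimizers.dp_optimizer_keras import (
    DPKerasSGDOptimizer,
)
from tensorflow_privacy.privacy.analysis.compute_dp_sgd_privacy import (
    compute_dp_sgd_privacy,
)


def build_optimizer(dpsgd: bool, learning_rate: float = 0.15):
    """Return a private or non-private optimizer, as the client chooses."""
    if dpsgd:
        # DPSGD: clip per-example gradients, then add calibrated Gaussian noise.
        return DPKerasSGDOptimizer(
            l2_norm_clip=1.0,      # max L2 norm of each microbatch gradient
            noise_multiplier=1.1,  # noise stddev as a fraction of the clip norm
            num_microbatches=32,   # must evenly divide the batch size
            learning_rate=learning_rate,
        )
    # Standard (non-private) SGD.
    return tf.keras.optimizers.SGD(learning_rate=learning_rate)


# Privacy accounting, matching the README's note that DPSGD clients track
# the privacy spent. Values here are illustrative.
eps, _ = compute_dp_sgd_privacy(
    n=60_000,              # MNIST training-set size
    batch_size=32,
    noise_multiplier=1.1,
    epochs=3,
    delta=1e-5,            # conventionally chosen below 1/n
)
print(f"Estimated privacy spent: eps = {eps:.2f}")
```

One caveat worth noting with the DP optimizer: the loss must be computed per example (e.g. `tf.keras.losses.SparseCategoricalCrossentropy(reduction=tf.losses.Reduction.NONE)`) so that gradients can be clipped per microbatch before averaging.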