Merge branch 'main' into fds-improve-ds-type-error-message
jafermarq authored Sep 13, 2024
2 parents d5f72bc + 541064b commit b442142
Showing 182 changed files with 1,299 additions and 911 deletions.
2 changes: 1 addition & 1 deletion .github/actions/bootstrap/action.yml
@@ -3,7 +3,7 @@ description: "Bootstrap Python environment (install and configure Python version
 inputs:
   python-version:
     description: "Version range or exact version of Python or PyPy to use, using SemVer's version range syntax."
-    default: 3.8
+    default: 3.9
   pip-version:
     description: "Version of pip to be installed using pip"
     default: 24.1.2
2 changes: 1 addition & 1 deletion .github/workflows/cache-cleanup.yml
@@ -34,7 +34,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: 3.8
+          python-version: 3.9

       - name: Cleanup caches by directories
         # Only keep caches that match the latest keys for each directory
2 changes: 1 addition & 1 deletion .github/workflows/e2e.yml
@@ -127,7 +127,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: 3.8
+          python-version: 3.9
      - name: Install build tools
        run: |
          python -m pip install -U pip==23.3.1
2 changes: 1 addition & 1 deletion e2e/strategies/pyproject.toml
@@ -9,7 +9,7 @@ description = "Keras Federated Learning Quickstart with Flower"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = { path = "../../", develop = true, extras = ["simulation"] }
 tensorflow-cpu = "^2.9.1, !=2.11.1"
 tensorflow-io-gcs-filesystem = "<0.35.0"
2 changes: 1 addition & 1 deletion examples/advanced-pytorch/pyproject.toml
@@ -12,7 +12,7 @@ authors = [
 ]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 flwr-datasets = { extras = ["vision"], version = ">=0.0.2,<1.0.0" }
 torch = "1.13.1"
2 changes: 1 addition & 1 deletion examples/advanced-tensorflow/pyproject.toml
@@ -9,7 +9,7 @@ description = "Advanced Flower/TensorFlow Example"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 flwr-datasets = { extras = ["vision"], version = ">=0.0.2,<1.0.0" }
 tensorflow-cpu = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "platform_machine == \"x86_64\"" }
2 changes: 1 addition & 1 deletion examples/android-kotlin/gen_tflite/pyproject.toml
@@ -5,7 +5,7 @@ description = ""
 authors = ["Steven Hé (Sīchàng) <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 numpy = ">=1.23,<2.0"
 tensorflow-cpu = ">=2.12,<3.0"
 pandas = ">=2.0,<3.0"
2 changes: 1 addition & 1 deletion examples/android-kotlin/pyproject.toml
@@ -9,5 +9,5 @@ description = ""
 authors = ["Steven Hé (Sīchàng) <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
2 changes: 1 addition & 1 deletion examples/android/pyproject.toml
@@ -9,7 +9,7 @@ description = "Android Example"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 tensorflow-cpu = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "platform_machine == \"x86_64\"" }
 tensorflow-macos = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "sys_platform == \"darwin\" and platform_machine == \"arm64\"" }
2 changes: 1 addition & 1 deletion examples/app-pytorch/pyproject.toml
@@ -9,7 +9,7 @@ description = "Multi-Tenant Federated Learning with Flower and PyTorch"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = "^3.8"
+python = "^3.9"
 # Mandatory dependencies
 flwr = { version = "^1.8.0", extras = ["simulation"] }
 torch = "2.2.1"
2 changes: 1 addition & 1 deletion examples/custom-mods/pyproject.toml
@@ -9,7 +9,7 @@ description = "Multi-Tenant Federated Learning with Flower and PyTorch"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = { path = "../../", develop = true, extras = ["simulation"] }
 tensorboard = "2.16.2"
 torch = "1.13.1"
2 changes: 1 addition & 1 deletion examples/ios/pyproject.toml
@@ -9,5 +9,5 @@ description = "Example Server for Flower iOS/CoreML"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
@@ -9,7 +9,7 @@ description = "PyTorch: From Centralized To Federated with Flower"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 flwr-datasets = { extras = ["vision"], version = ">=0.0.2,<1.0.0" }
 torch = "1.13.1"
2 changes: 1 addition & 1 deletion examples/quickstart-jax/pyproject.toml
@@ -5,7 +5,7 @@ description = "JAX example training a linear regression model with federated lea
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = "1.0.0"
 jax = "0.4.17"
 jaxlib = "0.4.17"
2 changes: 1 addition & 1 deletion examples/quickstart-mlcube/pyproject.toml
@@ -9,7 +9,7 @@ description = "Keras Federated Learning Quickstart with Flower"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0" # For development: { path = "../../", develop = true }
 tensorflow-cpu = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "platform_machine == \"x86_64\"" }
 tensorflow-macos = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "sys_platform == \"darwin\" and platform_machine == \"arm64\"" }
7 changes: 4 additions & 3 deletions examples/quickstart-monai/monaiexample/task.py
@@ -189,9 +189,10 @@ def _download_and_extract_if_needed(url, dest_folder):
     # Download the tar.gz file
     tar_gz_filename = url.split("/")[-1]
     if not os.path.isfile(tar_gz_filename):
-        with request.urlopen(url) as response, open(
-            tar_gz_filename, "wb"
-        ) as out_file:
+        with (
+            request.urlopen(url) as response,
+            open(tar_gz_filename, "wb") as out_file,
+        ):
             out_file.write(response.read())

     # Extract the tar.gz file
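The hunk above switches to parenthesized context managers, a syntax that is formally part of the grammar from Python 3.10 (CPython 3.9's new PEG parser already accepts it), which is why the codebase can adopt it once Python 3.8 support is dropped. A minimal, self-contained illustration of the form, using in-memory streams rather than a real download:

```python
import contextlib
import io

# Several context managers in one `with`, grouped by parentheses so the
# statement can be split cleanly across lines (no backslash needed).
src = io.StringIO("federated")
dst = io.StringIO()
with (
    contextlib.closing(src) as response,
    contextlib.closing(dst) as out_file,
):
    out_file.write(response.read())
    copied = out_file.getvalue()  # capture before the stream is closed
```

Both streams are closed on exit from the block, exactly as with the older single-line form.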
2 changes: 1 addition & 1 deletion examples/quickstart-pandas/pyproject.toml
@@ -10,7 +10,7 @@ authors = ["Ragy Haddad <[email protected]>"]
 maintainers = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 flwr-datasets = { extras = ["vision"], version = ">=0.0.2,<1.0.0" }
 numpy = "1.23.2"
2 changes: 1 addition & 1 deletion examples/quickstart-tabnet/pyproject.toml
@@ -9,7 +9,7 @@ description = "Tabnet Federated Learning Quickstart with Flower"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = ">=1.0,<2.0"
 tensorflow-cpu = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "platform_machine == \"x86_64\"" }
 tensorflow-macos = { version = ">=2.9.1,<2.11.1 || >2.11.1", markers = "sys_platform == \"darwin\" and platform_machine == \"arm64\"" }
2 changes: 1 addition & 1 deletion examples/vertical-fl/pyproject.toml
@@ -9,7 +9,7 @@ description = "PyTorch Vertical FL with Flower"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = { extras = ["simulation"], version = ">=1.0,<2.0" }
 torch = "2.1.0"
 matplotlib = "3.7.3"
2 changes: 1 addition & 1 deletion examples/whisper-federated-finetuning/pyproject.toml
@@ -9,7 +9,7 @@ description = "On-device Federated Downstreaming for Speech Classification"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = { extras = ["simulation"], version = ">=1.0,<2.0" }
 transformers = "4.32.1"
 tokenizers = "0.13.3"
2 changes: 1 addition & 1 deletion examples/xgboost-comprehensive/pyproject.toml
@@ -9,7 +9,7 @@ description = "Federated XGBoost with Flower (comprehensive)"
 authors = ["The Flower Authors <[email protected]>"]

 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.9,<3.11"
 flwr = { extras = ["simulation"], version = ">=1.7.0,<2.0" }
 flwr-datasets = ">=0.2.0,<1.0.0"
 xgboost = ">=2.0.0,<3.0.0"
18 changes: 18 additions & 0 deletions glossary/aggregation.mdx
@@ -0,0 +1,18 @@
+---
+title: "Aggregation"
+description: "Combine model weights from sampled clients to update the global model. This process enables the global model to learn from each client's data."
+date: "2024-05-23"
+author:
+  name: "Charles Beauville"
+  position: "Machine Learning Engineer"
+  website: "https://www.linkedin.com/in/charles-beauville/"
+  github: "github.com/charlesbvll"
+related:
+  - text: "Federated Learning"
+    link: "/glossary/federated-learning"
+  - text: "Tutorial: What is Federated Learning?"
+    link: "/docs/framework/tutorial-series-what-is-federated-learning.html"
+---
+
+During each Federated Learning round, the server receives model weights from sampled clients and needs a function to improve its global model using those weights. This is what is called `aggregation`. It can be a simple weighted average function (like `FedAvg`), or it can be more complex (e.g. incorporating optimization techniques). Aggregation is where FL's magic happens: it allows the global model to learn and improve from each client's particular data distribution using only their trained weights.
+
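To make the new glossary entry concrete, here is a minimal NumPy sketch of `FedAvg`-style weighted averaging. This is an illustration only, not Flower's actual implementation; the function name and data layout are invented for the example:

```python
import numpy as np

def fed_avg(client_weights, num_examples):
    """Weighted average of client model weights (FedAvg-style sketch).

    client_weights: one list of np.ndarray layers per client
    num_examples:   number of local training examples per client
    """
    total = sum(num_examples)
    # For each layer, sum the clients' weights scaled by their data share
    return [
        sum(n / total * layer for n, layer in zip(num_examples, layers))
        for layers in zip(*client_weights)
    ]

# Two clients, one layer each; the second client has 3x as much data,
# so its weights dominate the aggregate.
w1 = [np.array([0.0, 0.0])]
w2 = [np.array([4.0, 8.0])]
agg = fed_avg([w1, w2], num_examples=[1, 3])
```

With a 1:3 data split the aggregate is 0.25·w1 + 0.75·w2 per layer.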
17 changes: 17 additions & 0 deletions glossary/client.mdx
@@ -0,0 +1,17 @@
+---
+title: "Client"
+description: "A client is any machine with local data that connects to a server, trains on received global model weights, and sends back updated weights. Clients may also evaluate global model weights."
+date: "2024-05-23"
+author:
+  name: "Charles Beauville"
+  position: "Machine Learning Engineer"
+  website: "https://www.linkedin.com/in/charles-beauville/"
+  github: "github.com/charlesbvll"
+related:
+  - text: "Federated Learning"
+    link: "/glossary/federated-learning"
+  - text: "Tutorial: What is Federated Learning?"
+    link: "/docs/framework/tutorial-series-what-is-federated-learning.html"
+---
+
+Any machine with access to some data that connects to a server to perform Federated Learning. During each round of FL, a sampled client receives the global model weights from the server, trains on its local data, and sends the resulting trained weights back to the server. Clients can also be sampled to evaluate the global model weights on their local data; this is called federated evaluation.
22 changes: 22 additions & 0 deletions glossary/docker.mdx
@@ -0,0 +1,22 @@
+---
+title: "Docker"
+description: "Docker is a containerization tool that allows for consistent and reliable deployment of applications across different environments."
+date: "2024-07-08"
+author:
+  name: "Robert Steiner"
+  position: "DevOps Engineer at Flower Labs"
+  website: "https://github.com/Robert-Steiner"
+---
+
+Docker is an open-source containerization tool for deploying and running applications. Docker
+containers encapsulate an application's code, dependencies, and configuration files, allowing
+for consistent and reliable deployment across different environments.
+
+In the context of federated learning, Docker containers can be used to package the entire client
+and server application, including all the necessary dependencies, and then be deployed on various
+devices such as edge devices, cloud servers, or even on-premises servers.
+
+In Flower, Docker containers are used to containerize various applications like `SuperLink`,
+`SuperNode`, and `SuperExec`. Flower's Docker images allow users to quickly get Flower up and
+running, reducing the time and effort required to set up and configure the necessary software
+and dependencies.
40 changes: 40 additions & 0 deletions glossary/edge-computing.mdx
@@ -0,0 +1,40 @@
+---
+title: "Edge Computing"
+description: "Edge computing is a distributed computing concept of bringing compute and data storage as close as possible to the source of data generation and consumption by users."
+date: "2024-09-10"
+author:
+  name: "Chong Shen Ng"
+  position: "Research Engineer @ Flower Labs"
+  website: "https://discuss.flower.ai/u/chongshenng"
+  github: "github.com/chongshenng"
+related:
+  - text: "IoT"
+    link: "/glossary/iot"
+  - text: "Run Flower using Docker"
+    link: "/docs/framework/docker/index.html"
+  - text: "Flower Clients in C++"
+    link: "/docs/examples/quickstart-cpp.html"
+  - text: "Federated Learning on Embedded Devices with Flower"
+    link: "/docs/examples/embedded-devices.html"
+---
+
+### Introduction to Edge Computing
+
+Edge computing is a distributed computing concept: bringing compute and data storage as close as possible to where data is generated and consumed by users. By performing computation close to the data source, edge computing aims to address limitations typically encountered in centralized computing, such as bandwidth, latency, privacy, and autonomy.
+
+Edge computing works alongside cloud and fog computing, but each serves a different purpose. Cloud computing delivers on-demand resources like data storage, servers, analytics, and networking via the Internet. Fog computing, however, brings computing closer to devices by distributing communication and computation across clusters of IoT or edge devices. While edge computing is sometimes used interchangeably with fog computing, edge computing specifically handles data processing directly at or near the devices themselves, whereas fog computing distributes tasks across multiple nodes, bridging the gap between edge devices and the cloud.
+
+### Advantages and Use Cases of Edge Computing
+
+The key benefit of edge computing is that the volume of data moved is significantly reduced, because computation runs directly on the device, on the acquired data. This reduces long-distance communication between machines, which improves latency and lowers transmission costs. Examples of edge computing that benefit from offloading computation include:
+1. Smart watches and fitness monitors that measure live health metrics.
+2. Facial recognition and wake word detection on smartphones.
+3. Real-time lane departure warning systems in road transport that detect lane lines using on-board videos and sensors.
+
+### Federated Learning in Edge Computing
+
+When deploying federated learning systems, edge computing is an important component to consider. Edge devices typically take the role of "clients" in federated learning. In a healthcare use case, servers in different hospitals can train models on their local data. In mobile computing, smartphones perform local training (and inference) on user data, for example for next-word prediction.
+
+### Edge Computing with Flower
+
+With the Flower framework, you can easily deploy federated learning workflows and maximise the use of edge computing resources. Flower provides the infrastructure to perform federated learning, federated evaluation, and federated analytics, all in an easy, scalable, and secure way. Start with our tutorial on running Federated Learning on Embedded Devices (link [here](https://github.com/adap/flower/tree/main/examples/embedded-devices)), which shows you how to run Flower on NVIDIA Jetson devices and Raspberry Pis as your edge compute.
19 changes: 19 additions & 0 deletions glossary/evaluation.mdx
@@ -0,0 +1,19 @@
+---
+title: "Evaluation"
+description: "Evaluation measures how well the trained model performs by testing it on each client's local data, providing insights into its generalizability across varied data sources."
+date: "2024-07-08"
+author:
+  name: "Heng Pan"
+  position: "Research Scientist"
+  website: "https://discuss.flower.ai/u/pan-h/summary"
+  github: "github.com/panh99"
+related:
+  - text: "Server"
+    link: "/glossary/server"
+  - text: "Client"
+    link: "/glossary/client"
+---
+
+Evaluation in machine learning is the process of assessing a model's performance on unseen data to determine its ability to generalize beyond the training set. This typically involves using a separate test set and various metrics like accuracy or F1-score to measure how well the model performs on new data, ensuring it isn't overfitting or underfitting.
+
+In federated learning, evaluation (or distributed evaluation) refers to the process of assessing a model's performance across multiple clients, such as devices or data centers. Each client evaluates the model locally using its own data and then sends the results to the server, which aggregates all the evaluation outcomes. This process allows for understanding how well the model generalizes to different data distributions without centralizing sensitive data.
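A small sketch of how a server might combine per-client evaluation results into global metrics. Example-weighted averaging is one common choice among several; the function name and tuple layout here are illustrative, not Flower's API:

```python
def aggregate_evaluate(results):
    """Combine per-client evaluation results into global metrics.

    results: list of (num_examples, loss, accuracy) tuples, one per
    client. Returns the example-weighted average loss and accuracy, so
    clients with more local test data count proportionally more.
    """
    total = sum(n for n, _, _ in results)
    loss = sum(n * client_loss for n, client_loss, _ in results) / total
    acc = sum(n * client_acc for n, _, client_acc in results) / total
    return loss, acc

# Three clients with different amounts of local test data
results = [(100, 0.40, 0.90), (300, 0.60, 0.80), (600, 0.50, 0.85)]
loss, acc = aggregate_evaluate(results)
```

No raw test data reaches the server here, only each client's example count and summary metrics.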
14 changes: 14 additions & 0 deletions glossary/federated-learning.mdx
@@ -0,0 +1,14 @@
+---
+title: "Federated Learning"
+description: "Federated Learning is a machine learning approach where model training occurs on decentralized devices, preserving data privacy and leveraging local computations."
+date: "2024-05-23"
+author:
+  name: "Julian Rußmeyer"
+  position: "UX/UI Designer"
+  website: "https://www.linkedin.com/in/julian-russmeyer/"
+related:
+  - text: "Tutorial: What is Federated Learning?"
+    link: "/docs/framework/tutorial-series-what-is-federated-learning.html"
+---
+
+Federated learning is an approach to machine learning in which the model is trained on multiple decentralized devices or servers holding local data samples, without exchanging them. Instead of sending raw data to a central server, model updates are computed locally and only the model parameters are aggregated centrally. In this way, user privacy is maintained and communication costs are reduced, while collaborative model training is enabled.
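The round structure described in the entry above can be made concrete with a toy simulation: one server, three clients, each taking a local gradient step on its own data, with only weights (never raw data) exchanged. This is a pedagogical sketch, not Flower code; all names are invented for the illustration:

```python
import numpy as np

def local_train(global_w, data, lr=0.1):
    """One client's local update: a single gradient step of linear
    regression on (X, y). Stands in for any local training routine."""
    X, y = data
    grad = 2 * X.T @ (X @ global_w - y) / len(y)
    return global_w - lr * grad, len(y)

def fl_round(global_w, clients):
    """One federated round: broadcast the global weights, let each
    client train locally, then aggregate by example-weighted average."""
    updates = [local_train(global_w, data) for data in clients]
    total = sum(n for _, n in updates)
    return sum(n / total * w for w, n in updates)

# Three clients, each holding a private slice of a shared linear problem
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = [(X := rng.normal(size=(50, 2)), X @ true_w) for _ in range(3)]

w = np.zeros(2)
for _ in range(100):
    w = fl_round(w, clients)
# w converges toward true_w although no client ever shared its data
```

The same loop generalizes to neural networks by replacing the gradient step with a few local epochs of SGD.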