Merge branch 'main' into add-tensorboard-middleware
tanertopal authored Dec 20, 2023
2 parents 3711e0d + aded3e0 commit af835c9
Showing 36 changed files with 731 additions and 108 deletions.
3 changes: 3 additions & 0 deletions .github/CODEOWNERS
@@ -5,3 +5,6 @@

# Flower Baselines
/baselines @jafermarq @tanertopal @danieljanes

# Flower Examples
/examples @jafermarq @tanertopal @danieljanes
8 changes: 4 additions & 4 deletions .github/workflows/docker-server.yml
@@ -7,8 +7,8 @@ on:
description: "Version of Flower e.g. (1.6.0)."
required: true
type: string
base-image-version:
description: "Version of the Flower base image."
base-image-tag:
description: "The tag of the Flower base image."
required: false
type: string
default: "py3.11-ubuntu22.04"
@@ -27,9 +27,9 @@ jobs:
file-dir: src/docker/server
build-args: |
FLWR_VERSION=${{ github.event.inputs.flwr-version }}
BASE_IMAGE_VERSION=${{ github.event.inputs.base-image-version }}
BASE_IMAGE_TAG=${{ github.event.inputs.base-image-tag }}
tags: |
${{ github.event.inputs.flwr-version }}-${{ github.event.inputs.base-image-version }}
${{ github.event.inputs.flwr-version }}-${{ github.event.inputs.base-image-tag }}
${{ github.event.inputs.flwr-version }}
latest
secrets:
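
For context, this rename means a manual run of the workflow now takes a base-image-tag input instead of base-image-version. A hypothetical invocation via the GitHub CLI, assuming the workflow is triggered through workflow_dispatch (the trigger block itself is not shown in this hunk):

# hypothetical manual dispatch; input names taken from the diff above
$ gh workflow run docker-server.yml \
    -f flwr-version=1.6.0 \
    -f base-image-tag=py3.11-ubuntu22.04
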
12 changes: 6 additions & 6 deletions .github/workflows/e2e.yml
@@ -30,12 +30,12 @@ jobs:
- name: Test wheel
run: ./dev/test-wheel.sh
- name: Upload wheel
if: ${{ github.repository == 'adap/flower' && !github.event.pull_request.head.repo.fork }}
if: ${{ github.repository == 'adap/flower' && !github.event.pull_request.head.repo.fork && github.actor != 'dependabot[bot]' }}
id: upload
env:
AWS_DEFAULT_REGION: ${{ secrets. AWS_DEFAULT_REGION }}
AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets. AWS_SECRET_ACCESS_KEY }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
run: |
cd ./dist
echo "WHL_PATH=$(ls *.whl)" >> "$GITHUB_OUTPUT"
@@ -73,7 +73,7 @@ jobs:
dataset: |
import tensorflow as tf
tf.keras.datasets.cifar10.load_data()
- directory: tabnet
dataset: |
import tensorflow_datasets as tfds
@@ -83,7 +83,7 @@
dataset: |
from torchvision.datasets import CIFAR10
CIFAR10('./data', download=True)
- directory: pytorch-lightning
dataset: |
from torchvision.datasets import MNIST
@@ -102,7 +102,7 @@
- directory: fastai
dataset: |
from fastai.vision.all import untar_data, URLs
untar_data(URLs.MNIST)
untar_data(URLs.MNIST)
- directory: pandas
dataset: |
2 changes: 1 addition & 1 deletion .github/workflows/update-pr.yml
@@ -9,7 +9,7 @@ jobs:
runs-on: ubuntu-22.04
steps:
- name: Automatically update mergeable PRs
uses: adRise/[email protected]
uses: adRise/update-pr-branch@cd305ecbd76bf63056c9400ce2c725293fc3e0c0 # v0.7.0
with:
token: ${{ secrets.FLWRMACHINE_TOKEN }}
base: 'main'
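
The action reference above is now pinned to a full commit SHA (with the tag noted in a comment) instead of a mutable tag, so re-tagging upstream cannot silently change what runs in CI. A generic way to check that a pinned commit still matches the advertised tag, not specific to this repository:

# lists the tag (and, if annotated, the dereferenced commit) to compare against the pinned SHA
$ git ls-remote --tags https://github.com/adRise/update-pr-branch 'v0.7.0*'
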
16 changes: 16 additions & 0 deletions datasets/e2e/pytorch/pyproject.toml
@@ -0,0 +1,16 @@
[build-system]
requires = ["poetry-core>=1.4.0"]
build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "fds-e2e-pytorch"
version = "0.1.0"
description = "Flower Datasets with PyTorch"
authors = ["The Flower Authors <[email protected]>"]

[tool.poetry.dependencies]
python = "^3.8"
flwr-datasets = { path = "./../../", extras = ["vision"] }
torch = "^1.12.0"
torchvision = "^0.14.1"
parameterized = "==0.9.0"
131 changes: 131 additions & 0 deletions datasets/e2e/pytorch/pytorch_test.py
@@ -0,0 +1,131 @@
import unittest

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from datasets.utils.logging import disable_progress_bar
from parameterized import parameterized_class, parameterized
from torch import Tensor
from torch.utils.data import DataLoader
from torchvision.transforms import Compose, ToTensor, Normalize

from flwr_datasets import FederatedDataset


class SimpleCNN(nn.Module):
def __init__(self):
super(SimpleCNN, self).__init__()
self.conv1 = nn.Conv2d(3, 6, 5)
self.pool = nn.MaxPool2d(2, 2)
self.conv2 = nn.Conv2d(6, 16, 5)
self.fc1 = nn.Linear(16 * 5 * 5, 120)
self.fc2 = nn.Linear(120, 84)
self.fc3 = nn.Linear(84, 10)

def forward(self, x):
x = self.pool(F.relu(self.conv1(x)))
x = self.pool(F.relu(self.conv2(x)))
x = x.view(-1, 16 * 5 * 5)
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = self.fc3(x)
return x


# Using parameterized testing, two different sets of parameters are specified:
# 1. CIFAR10 dataset with the simple ToTensor transform.
# 2. CIFAR10 dataset with a composed transform that first converts an image to a tensor
# and then normalizes it.
@parameterized_class(
[
{"dataset_name": "cifar10", "test_split": "test", "transforms": ToTensor()},
{"dataset_name": "cifar10", "test_split": "test", "transforms": Compose(
[ToTensor(), Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
)},
]
)
class FdsToPyTorch(unittest.TestCase):
"""Test the conversion from FDS to PyTorch Dataset and Dataloader."""

dataset_name = ""
test_split = ""
transforms = None
trainloader = None
expected_img_shape_after_transform = [3, 32, 32]

@classmethod
def setUpClass(cls):
"""Disable progress bar to keep the log clean.
"""
disable_progress_bar()

def _create_trainloader(self, batch_size: int) -> DataLoader:
"""Create a trainloader from the federated dataset."""
partition_id = 0
fds = FederatedDataset(dataset=self.dataset_name, partitioners={"train": 100})
partition = fds.load_partition(partition_id, "train")
partition_train_test = partition.train_test_split(test_size=0.2)
partition_train_test = partition_train_test.map(
lambda img: {"img": self.transforms(img)}, input_columns="img"
)
trainloader = DataLoader(
partition_train_test["train"].with_format("torch"), batch_size=batch_size,
shuffle=True
)
return trainloader

def test_create_partition_dataloader_with_transforms_shape(self) -> None:
"""Test if the DataLoader returns batches with the expected shape."""
batch_size = 16
trainloader = self._create_trainloader(batch_size)
batch = next(iter(trainloader))
images = batch["img"]
self.assertEqual(tuple(images.shape),
(batch_size, *self.expected_img_shape_after_transform))

def test_create_partition_dataloader_with_transforms_batch_type(self) -> None:
"""Test if the DataLoader returns batches of type dictionary."""
batch_size = 16
trainloader = self._create_trainloader(batch_size)
batch = next(iter(trainloader))
self.assertIsInstance(batch, dict)

def test_create_partition_dataloader_with_transforms_data_type(self) -> None:
"""Test to verify if the data in the DataLoader batches are of type Tensor."""
batch_size = 16
trainloader = self._create_trainloader(batch_size)
batch = next(iter(trainloader))
images = batch["img"]
self.assertIsInstance(images, Tensor)

@parameterized.expand([
("not_nan", torch.isnan),
("not_inf", torch.isinf),
])
def test_train_model_loss_value(self, name, condition_func):
"""Test if the model trains and if the loss is a correct number."""
trainloader = self._create_trainloader(16)
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Create the model, criterion, and optimizer
net = SimpleCNN().to(device)
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)

# Training loop for one epoch
net.train()
loss = None
for i, data in enumerate(trainloader, 0):
inputs, labels = data['img'].to(device), data['label'].to(device)
optimizer.zero_grad()
outputs = net(inputs)
loss = criterion(outputs, labels)
loss.backward()
optimizer.step()

self.assertFalse(condition_func(loss).item())


if __name__ == '__main__':
unittest.main()
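
A minimal sketch of how this test file might be run locally, assuming the commands are executed from a full clone of the repository (the exact invocation used by CI is not part of this diff):

$ cd datasets/e2e/pytorch
$ poetry install
$ poetry run python pytorch_test.py
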
2 changes: 1 addition & 1 deletion dev/format.sh
@@ -18,7 +18,7 @@ python -m docformatter -i -r examples

# Notebooks
python -m black --ipynb -q doc/source/*.ipynb
KEYS="metadata.celltoolbar metadata.language_info metadata.toc metadata.notify_time metadata.varInspector metadata.accelerator metadata.vscode cell.metadata.id cell.metadata.heading_collapsed cell.metadata.hidden cell.metadata.code_folding cell.metadata.tags cell.metadata.init_cell cell.metadata.vscode"
KEYS="metadata.celltoolbar metadata.language_info metadata.toc metadata.notify_time metadata.varInspector metadata.accelerator metadata.vscode cell.metadata.id cell.metadata.heading_collapsed cell.metadata.hidden cell.metadata.code_folding cell.metadata.tags cell.metadata.init_cell cell.metadata.vscode cell.metadata.pycharm"
python -m nbstripout doc/source/*.ipynb --extra-keys "$KEYS"
python -m nbstripout examples/*/*.ipynb --extra-keys "$KEYS"

135 changes: 135 additions & 0 deletions doc/source/contributor-how-to-build-docker-images.rst
@@ -0,0 +1,135 @@
How to build Flower Docker images locally
=========================================

Flower provides pre-made Docker images on `Docker Hub <https://hub.docker.com/r/flwr/server/tags>`_
that include all necessary dependencies for running the server. You can also build your own custom
Docker images from scratch with a different version of Python or Ubuntu if that is what you need.
In this guide, we explain which images exist and how to build them locally.

Before we can start, we need to meet a few prerequisites in our local development environment.

#. Clone the Flower repository.

.. code-block:: bash
$ git clone https://github.com/adap/flower.git && cd flower
#. Verify the Docker daemon is running.

Please follow the first section on
`Run Flower using Docker <https://flower.dev/docs/framework/how-to-run-flower-using-docker>`_
which covers this step in more detail.

Currently, Flower provides two images, a base image and a server image. There will also be a client
image soon. The base image, as the name suggests, contains basic dependencies that both the server
and the client need. This includes system dependencies, Python and Python tools. The server image is
based on the base image, but it additionally installs the Flower server using ``pip``.
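
Once both images have been built locally (as shown in the sections below), this layering can be inspected
with a generic Docker command; the image name here is just the example tag used later in this guide:

.. code-block:: bash
$ docker history flwr_server:0.1.0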

The build instructions that assemble the images are located in the respective Dockerfiles. You
can find them in the subdirectories of ``src/docker``.

Both the base and the server image are configured via build arguments, which make the build more
flexible. For example, in the base image we can specify the version of Python to install using the
``PYTHON_VERSION`` build argument. Some of the build arguments have default values; others must be
specified when building the image. All available build arguments for each image are listed in one of
the tables below.

Building the base image
-----------------------

.. list-table::
:widths: 25 45 15 15
:header-rows: 1

* - Build argument
- Description
- Required
- Example
* - ``PYTHON_VERSION``
- Version of ``python`` to be installed.
- Yes
- ``3.11``
* - ``PIP_VERSION``
- Version of ``pip`` to be installed.
- Yes
- ``23.0.1``
* - ``SETUPTOOLS_VERSION``
- Version of ``setuptools`` to be installed.
- Yes
- ``69.0.2``
* - ``UBUNTU_VERSION``
- Version of the official Ubuntu Docker image.
- Defaults to ``22.04``.
-

The following example creates a base image with Python 3.11.0, pip 23.0.1 and setuptools 69.0.2:

.. code-block:: bash
$ cd src/docker/base/
$ docker build \
--build-arg PYTHON_VERSION=3.11.0 \
--build-arg PIP_VERSION=23.0.1 \
--build-arg SETUPTOOLS_VERSION=69.0.2 \
-t flwr_base:0.1.0 .

The name of the image is ``flwr_base`` and the tag is ``0.1.0``. Remember that the build arguments as
well as the name and tag can be adapted to your needs. These values serve as examples only.
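
``UBUNTU_VERSION`` was omitted above because it has a default value. A hypothetical build that overrides
it as well, run from the same ``src/docker/base/`` directory, could look like this:

.. code-block:: bash
$ docker build \
--build-arg PYTHON_VERSION=3.11.0 \
--build-arg PIP_VERSION=23.0.1 \
--build-arg SETUPTOOLS_VERSION=69.0.2 \
--build-arg UBUNTU_VERSION=22.04 \
-t flwr_base:0.1.0 .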

Building the server image
-------------------------

.. list-table::
:widths: 25 45 15 15
:header-rows: 1

* - Build argument
- Description
- Required
- Example
* - ``BASE_REPOSITORY``
- The repository name of the base image.
- Defaults to ``flwr/server``.
-
* - ``BASE_IMAGE_TAG``
- The image tag of the base image.
- Defaults to ``py3.11-ubuntu22.04``.
-
* - ``FLWR_VERSION``
- Version of Flower to be installed.
- Yes
- ``1.6.0``

The following example creates a server image with the official Flower base image py3.11-ubuntu22.04
and Flower 1.6.0:

.. code-block:: bash
$ cd src/docker/server/
$ docker build \
--build-arg BASE_IMAGE_TAG=py3.11-ubuntu22.04 \
--build-arg FLWR_VERSION=1.6.0 \
-t flwr_server:0.1.0 .

The name of the image is ``flwr_server`` and the tag is ``0.1.0``. Remember that the build arguments as
well as the name and tag can be adapted to your needs. These values serve as examples only.

If you want to use your own base image instead of the official Flower base image, all you need to do
is set the ``BASE_REPOSITORY`` and ``BASE_IMAGE_TAG`` build arguments. The value of
``BASE_REPOSITORY`` must match the name of your image and the value of ``BASE_IMAGE_TAG`` must match
the tag of your image.

.. code-block:: bash
$ cd src/docker/server/
$ docker build \
--build-arg BASE_REPOSITORY=flwr_base \
--build-arg BASE_IMAGE_TAG=0.1.0 \
--build-arg FLWR_VERSION=1.6.0 \
-t flwr_server:0.1.0 .

After creating the image, we can test whether the image is working:

.. code-block:: bash
$ docker run --rm flwr_server:0.1.0 --help
7 changes: 6 additions & 1 deletion doc/source/how-to-install-flower.rst
@@ -23,7 +23,7 @@ For simulations that use the Virtual Client Engine, ``flwr`` should be installed
Verify installation
-------------------

The following command can be used to verfiy if Flower was successfully installed. If everything worked, it should print the version of Flower to the command line::
The following command can be used to verify if Flower was successfully installed. If everything worked, it should print the version of Flower to the command line::

python -c "import flwr;print(flwr.__version__)"
1.5.0
@@ -32,6 +32,11 @@ The following command can be used to verfiy if Flower was successfully installed
Advanced installation options
-----------------------------

Install via Docker
~~~~~~~~~~~~~~~~~~

`How to run Flower using Docker <https://flower.dev/docs/framework/how-to-run-flower-using-docker.html>`_

Install pre-release
~~~~~~~~~~~~~~~~~~~
