Fawkes is a privacy protection system developed by researchers at SANDLab, University of Chicago. For more information about the project, please refer to our project webpage. Contact us at [email protected].
We published an academic paper to summarize our work "Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models" at USENIX Security 2020.
This code is intended only for personal privacy protection or academic research.
$ fawkes
Options:
- -m, --mode : the tradeoff between privacy and perturbation size. Select from min, low, mid, high. The higher the mode, the more perturbation is added to the image and the stronger the protection.
- -d, --directory : the directory with images to run protection on.
- -g, --gpu : the GPU id when using GPU for optimization.
- --batch-size : number of images to run optimization on together. Change to >1 only if you have extremely powerful compute power.
- --format : format of the output image (png or jpg).

When --mode is custom:

- --th : perturbation threshold
- --max-step : number of optimization steps to run
- --lr : learning rate for the optimization
- --feature-extractor : name of the feature extractor to use
- --separate_target : whether to select a separate target for each face in the directory.
fawkes -d ./imgs --mode min
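If the presets do not fit your needs, custom mode exposes the optimization knobs directly. The following invocation is only an illustrative sketch combining the custom-mode flags listed above; the threshold, step count, and learning rate values are placeholders, not recommended settings.

fawkes -d ./imgs --mode custom --th 0.01 --max-step 1000 --lr 2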
- The perturbation generation takes ~60 seconds per image on a CPU machine, and it is much faster on a GPU machine. Use batch-size=1 on CPU and batch-size>1 on GPUs.
- Turn on separate target if the images in the directory belong to different people; otherwise, turn it off.
- Run on GPU. The current Fawkes package and binary do not support GPU. To use GPU, you need to clone this repository, install the required packages in setup.py, and replace tensorflow with tensorflow-gpu. Then you can run Fawkes with python3 fawkes/protection.py [args] (see the example command after this list).
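As a concrete illustration of the GPU workflow described in the last tip (assuming you have cloned the repository and installed tensorflow-gpu), a run could look like the following; the GPU id, batch size, and image directory are placeholders:

python3 fawkes/protection.py -d ./imgs --mode min -g 0 --batch-size 8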
We are actively working on this. Python scripts that can test the protection effectiveness will be ready shortly.
Install from PyPI:
pip install fawkes
If you don't have root privileges, please try installing into your user site-packages: pip install --user fawkes.
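To verify that the package is visible to your Python environment, you can use pip's standard inspection command (this is generic pip behavior, not a Fawkes-specific feature):

pip show fawkes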
For academic researchers, whether seeking to improve Fawkes or to explore potential vulnerabilities, please refer to the following guide to test Fawkes.
To protect a class in a dataset, first move that label's images to a separate location and run Fawkes on them. Please use the --debug option and set batch-size to a reasonable number (e.g., 16 or 32). If the images are already cropped and aligned, also use the no-align option (see the example command below).
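Putting the steps above together, a protection run over a single label's images could look like the following sketch; the directory path, mode, and batch size are placeholders, and the exact spelling of the alignment flag may differ across versions:

python3 fawkes/protection.py -d ./label_images --mode high --debug --batch-size 16 --no-align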
If you would like to contribute to making the Fawkes software better, please check out our project list, which contains our TODOs. If you are confident in helping, please open a pull request and explain your planned changes. We will try our best to approve it as soon as possible, and once approved, you can work on it.
@inproceedings{shan2020fawkes,
title={Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models},
author={Shan, Shawn and Wenger, Emily and Zhang, Jiayun and Li, Huiying and Zheng, Haitao and Zhao, Ben Y},
booktitle={Proc. of {USENIX} Security},
year={2020}
}