Anomaly detection demo for SBIC 2022
Anomaly detection is a common problem found in many different industries. This repository demonstrates the use of FELT for anomaly detection on distributed data. Imagine 3 factories, each producing the same product (cables in our case). Using FELT and Ocean Protocol we can calculate statistics across all datasets without revealing the data itself. We can then use those statistics to detect anomalies in each factory.
Using version 0.5.1 of the FELT Labs application. All requirements are listed in requirements.txt.
You can find the dataset files in the datasets folder. The same files are published on Ocean Protocol as 3 different datasets with the following DIDs:
- did:op:493def4e00cda410adde2017ebaf5d644cf2bdec81cec5fee29d1fc9c73d66fa
- did:op:9e7f56f83422b156c016fe83e87722dac4b882ba9cd03f6d88e39fea04495669
- did:op:53ffd278eff009d92a130db6f3b7415158d17f176ceef8114b0071bf6ec40a88
The data are created from this dataset and capture basic properties of a copper wire production line. Each row represents one production period.
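To get a quick feel for the data, you can load one of the files locally, for example (the file name below is illustrative; check the datasets folder for the actual names):

```python
import pandas as pd

# Load one of the local dataset files (illustrative name; see the datasets folder).
df = pd.read_csv("datasets/factory_1.csv")

# Each row captures one production period of the copper wire line.
print(df.head())
print(df.describe())
```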
The algorithm used for anomaly detection is in src/detection_algorithm.py. We published this file as an Ocean Protocol asset with the following DID:
The algorithm works as follows. Based on the trained model (containing the values of the mean and standard deviation), we calculate the z-score for each data point: `z = (x - mean) / std`.
We then consider a point an anomaly if its z-score is greater than 2. Finally, the algorithm creates a simple chart and stores the results.
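For illustration, here is a minimal sketch of that detection logic (the actual implementation is in src/detection_algorithm.py; the function name, threshold handling, and sample data below are ours):

```python
import numpy as np
import matplotlib

matplotlib.use("Agg")  # render the chart without a display
import matplotlib.pyplot as plt

def detect_anomalies(values, mean, std, threshold=2.0):
    """Flag points whose absolute z-score exceeds the threshold."""
    z_scores = (values - mean) / std
    return np.abs(z_scores) > threshold

# Illustrative data; in the real run, mean and std come from the trained model.
values = np.random.default_rng(0).normal(loc=10.0, scale=1.5, size=200)
anomalies = detect_anomalies(values, mean=10.0, std=1.5)

# Simple chart marking the anomalous points, saved similarly to result.jpg.
plt.plot(values, "b.", label="normal")
plt.plot(np.where(anomalies)[0], values[anomalies], "rx", label="anomaly")
plt.legend()
plt.savefig("result.jpg")
```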
- Go to the FELT web application: https://app.feltlabs.ai/multiple (you will need to connect your MetaMask wallet)
- Pick a name and fill in the DIDs of our datasets listed above
- In the next step pick `Analytics` and select both `Mean` and `Standard deviation`
- In the last step leave the target column equal to `-1` and start the training
This will start local jobs on each dataset. We won't be downloading those datasets; the computation happens where the data are stored, and only the final encrypted results are made available to us. To combine the local results into the final model, do the following:
- Go to: https://app.feltlabs.ai/jobs
- Open up your job and click `Reload` until all local jobs are finished
- Click the `Aggregation` button at the bottom to create the final model
- Wait for the aggregation to finish (use the `Reload` button) and download the final model
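Conceptually, the aggregation pools the local statistics into global ones. The sketch below shows one way such pooling can work in plain Python; FELT performs this step on encrypted results, so this is only an illustration (all names and numbers are ours):

```python
import numpy as np

def aggregate(counts, means, stds):
    """Pool per-factory means and standard deviations into global statistics."""
    counts = np.asarray(counts, dtype=float)
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)

    total = counts.sum()
    global_mean = (counts * means).sum() / total
    # Pooled population variance: E[x^2] - (E[x])^2 across all factories.
    global_var = (counts * (stds**2 + means**2)).sum() / total - global_mean**2
    return global_mean, np.sqrt(global_var)

# Example with made-up local statistics from the 3 factories.
mean, std = aggregate(counts=[120, 80, 100], means=[9.8, 10.1, 10.0], stds=[1.4, 1.6, 1.5])
print(f"global mean={mean:.3f}, std={std:.3f}")
```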
We have already published the evaluation (anomaly detection) algorithm, so we just need to run it on our data (this could be new data coming from those factories).
- Go to: https://app.feltlabs.ai/experimental
- Pick a name, upload the final model and fill in the DIDs of our datasets listed above
- Start the algorithm using the `Run` button
- Go to: https://app.feltlabs.ai/jobs
- Wait until your compute job is finished (use the `Reload` button) and then download `result.jpg` with the final chart
This simulates the decentralized process in a local environment.
Install Python 3.8 or newer.
```bash
pip install -r requirements.txt

# In case you want to contribute to the repository, run the following as well:
pre-commit install
```
Then you can start Jupyter Lab/Notebook as follows:

```bash
jupyter lab
```
Alternatively, you can use the Makefile, which will create a virtual environment and install the requirements for you, using the following command:

```bash
make install
```
Once you have the requirements installed and Jupyter running, you can open main.ipynb, which will walk you through the main usage of FELT.
You can also open the notebook in Google Colab: