Requirements: MKL, CUDA, PyTorch, TensorFlow (used only to fetch MNIST)
Usage:
python kfac_pytorch.py
Example output:

Using MKL
Step 0 loss 97.542419434
Step 1 loss 62.339828491
Step 2 loss 44.860393524
Step 3 loss 79.031013489
Step 4 loss 56.055324554
Step 5 loss 48.206447601
Step 6 loss 43.934066772
Step 7 loss 40.302700043
Step 8 loss 38.371196747
Step 9 loss 38.781795502
Times: min: 388.66, median: 400.81, mean: 2198.33
- Write-up: Optimizing deeper networks with KFAC in PyTorch.
- Experiments: deep_autoencoder.ipynb
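The core KFAC update can be sketched for a single fully connected layer: approximate the Fisher as a Kronecker product A ⊗ G, where A is the second moment of the layer's inputs and G the second moment of the backpropagated output gradients, then precondition the gradient as G⁻¹ (dL/dW) A⁻¹. Below is a minimal NumPy sketch of that math on a toy linear-regression problem; it is illustrative only (the problem sizes, damping value, and variable names are assumptions, not this repository's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression y = W_true x, solved with a KFAC-style
# preconditioned update on a single weight matrix W.
n, d_in, d_out = 256, 20, 5
X = rng.standard_normal((n, d_in))
W_true = rng.standard_normal((d_out, d_in))
Y = X @ W_true.T

W = np.zeros((d_out, d_in))
lr, damping = 0.05, 0.1  # damping keeps the Kronecker factors invertible

losses = []
for step in range(50):
    err = X @ W.T - Y                      # residuals, shape (n, d_out)
    losses.append(0.5 * (err ** 2).sum() / n)
    grad = err.T @ X / n                   # dL/dW (sum over outputs, mean over samples)

    # Kronecker factors: A from layer inputs, G from per-sample
    # gradients w.r.t. the layer output (here just the residuals).
    A = X.T @ X / n + damping * np.eye(d_in)
    G = err.T @ err / n + damping * np.eye(d_out)

    # KFAC natural-gradient step: W -= lr * G^{-1} grad A^{-1}
    W -= lr * np.linalg.solve(G, grad) @ np.linalg.inv(A)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The damped factor inverses are the key design choice: inverting the small d_in × d_in and d_out × d_out factors is far cheaper than inverting the full (d_in·d_out)² Fisher they approximate.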