The easiest way for me to learn is by breaking topics down to their fundamental components and then teaching the concepts to others, à la Richard Feynman.
This is my public drafting space, where I apply that approach to learn about or reinforce interesting computational topics, including machine learning algorithms, deep learning models, image processing techniques, and statistical methods, among others.
The educational materials I create for each topic are published on Medium in my 'Breaking it Down' series, typically hosted by the Towards Data Science publication. These materials lean heavily on manim for visualizations.
Example trailer for the K-Means post: `K-Means.Trailer.mov`
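Below is a minimal, hypothetical sketch (assuming the Manim Community edition) of the kind of manim scene that might underlie a K-Means visualization like the trailer above: a cluster of points is drawn, then a centroid is animated toward their mean. The class and variable names are illustrative and are not taken from the actual posts.

```python
# A minimal sketch of a manim (Community edition) scene; names are illustrative.
from manim import Scene, Dot, VGroup, Create, BLUE, RED
import numpy as np


class CentroidUpdate(Scene):
    def construct(self):
        # Scatter a small cluster of points around the origin.
        rng = np.random.default_rng(0)
        points = VGroup(*[
            Dot(point=[x, y, 0], color=BLUE)
            for x, y in rng.normal(0, 1, size=(10, 2))
        ])
        self.play(Create(points))

        # Place a centroid and animate it moving to the cluster mean,
        # the core update step of a K-Means iteration.
        centroid = Dot(point=[2.5, 2.5, 0], color=RED).scale(1.5)
        self.play(Create(centroid))
        self.play(centroid.animate.move_to(points.get_center()))
        self.wait()
```

A scene like this could be previewed at low quality with `manim -pql scene.py CentroidUpdate`.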
| Field      | Method                        | Date (MM/DD/YYYY) |
| ---------- | ----------------------------- | ----------------- |
| ML         | K-Means                       | 11/06/2022        |
| ML         | Logistic Regression           | 08/21/2022        |
| ML/DL      | Gradient Descent              | 07/25/2022        |
| ML/DL      | Softmax                       | 06/23/2022        |
| Statistics | Principal Component Analysis  | 06/18/2022        |
My goal is to convert each of these articles into a Medium post; the Post Guidelines describe my approach to this process.
Please feel free to open an issue if you find anything wrong with the examples or have any suggestions for improvement.
Copyright © 2024 Jacob Bumgarner, Ph.D.