A web application that helps children learn sign language interactively. Uses machine learning and image recognition to capture pictures of the user's hand from their webcam and classify the hand gesture as its corresponding American Sign Language letter. Wrote the frontend in React and the backend in Python with Flask, using TensorFlow and OpenCV to build a neural network and a multinomial logistic regression model.
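
A minimal sketch of what the Flask classification endpoint could look like, assuming a saved Keras model file (asl_model.h5), a /predict route, a 64x64 grayscale input, and a 26-letter A-Z label set; these names, shapes, and labels are illustrative assumptions, not the project's actual code.

```python
# Hypothetical Flask endpoint: receives a webcam frame from the React frontend,
# preprocesses it with OpenCV, and classifies it with a TensorFlow model.
import string

import cv2
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
model = tf.keras.models.load_model("asl_model.h5")  # assumed saved model file
LABELS = list(string.ascii_uppercase)                # assumes one class per letter


@app.route("/predict", methods=["POST"])
def predict():
    # Decode the uploaded image bytes into an OpenCV BGR frame.
    file_bytes = np.frombuffer(request.files["image"].read(), np.uint8)
    frame = cv2.imdecode(file_bytes, cv2.IMREAD_COLOR)

    # OpenCV preprocessing: grayscale, resize, and normalize to the
    # (assumed) training input format.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
    batch = resized.reshape(1, 64, 64, 1)

    # Run the model and return the most likely ASL letter as JSON.
    probs = model.predict(batch)
    return jsonify({"letter": LABELS[int(np.argmax(probs))]})


if __name__ == "__main__":
    app.run(debug=True)
```

The React frontend would capture a frame from the webcam, POST it to this endpoint as multipart form data under the "image" field, and display the returned letter to the child.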