# Sign Language Detection

A real-time application for detecting and recognizing sign language gestures from a webcam feed. The project uses Flask for the web interface and a TensorFlow/Keras model for gesture recognition.
## Features

- Real-time sign language gesture detection.
- Web interface showing the live video feed and recognized gestures.
- CNN (convolutional neural network) model for gesture recognition.
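The repository's exact architecture isn't reproduced here, but a small Keras CNN for classifying gesture images might look like the following sketch. The input size (64×64 RGB) and the number of classes (26, one per letter) are assumptions for illustration, not values taken from this project:

```python
# Hypothetical sketch of a small CNN gesture classifier, not the repo's exact model.
# Input shape (64, 64, 3) and num_classes=26 are illustrative assumptions.
from tensorflow import keras
from tensorflow.keras import layers

def build_model(num_classes: int = 26) -> keras.Model:
    model = keras.Sequential([
        layers.Input(shape=(64, 64, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
print(model.output_shape)  # (None, 26)
```

A softmax output of this shape maps each frame to a probability over the gesture classes; the highest-probability class is what the interface would display.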
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/RiaanSadiq/Sign-Language-Detection.git
  cd Sign-Language-Detection
  ```
- Create and activate a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  ```
- Install the required libraries:

  ```bash
  pip install -r requirements.txt
  ```
- Ensure a webcam is connected to your system.
## Usage

- Start the Flask application:

  ```bash
  python app.py
  ```
- Open your web browser and navigate to `http://127.0.0.1:5000/`.
- The web interface will display the webcam feed and the detected sign language gestures.
## Technologies Used

- Flask: web framework
- OpenCV: video capture and processing
- NumPy: numerical operations
- TensorFlow/Keras: machine learning model
- split-folders: dataset splitting
- logging: application log messages
## Contact

For questions or suggestions, please open an issue or contact me at [[email protected]].