Featured Project
Gesture Controlled Rock Paper Scissors
Play against a computer, as if it can really see you

Python · TensorFlow · PyTorch · OpenCV · NumPy
Overview
This game processes a live video feed of the user's hand gestures, analyzing each frame for recognized patterns. Landmark detection was applied to a custom dataset of expected gestures, and the resulting landmarks were used to train the main model.
The project includes:
• Convolutional neural networks for image feature extraction
• Live video feed
• Instant response from a randomly generated computer 'move'
The model was trained on bursts of images captured of my own hands; a minimal sketch of the live capture-and-classify loop is shown below.
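As an illustration only, the loop might look like the following sketch. It assumes MediaPipe Hands for landmark extraction and a webcam opened through OpenCV; the project's actual landmark pipeline and classifier interface aren't shown here, so `landmarks_to_vector` and the classifier hook are placeholders.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def landmarks_to_vector(hand_landmarks):
    # Flatten the 21 detected (x, y, z) hand landmarks into a 63-value feature vector.
    return np.array([[lm.x, lm.y, lm.z] for lm in hand_landmarks.landmark]).ravel()

cap = cv2.VideoCapture(0)  # webcam feed
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            features = landmarks_to_vector(results.multi_hand_landmarks[0])
            # `features` would be passed to the trained gesture classifier here (placeholder).
        cv2.imshow("Rock Paper Scissors", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```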
Challenges
• Balancing model sensitivity and specificity (see the sketch after this list)
• Interfacing the trained model with live video data
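One common way to strike that balance on a live feed is to accept a gesture only when the model is confident and recent frames agree. The snippet below is a hedged sketch of that idea; the threshold, window size, and class names are illustrative rather than taken from the project.

```python
from collections import Counter, deque

import numpy as np

CLASSES = ["rock", "paper", "scissors"]   # illustrative class order
CONF_THRESHOLD = 0.8                      # raise to cut false positives, lower to miss fewer gestures
recent = deque(maxlen=10)                 # predictions from the last few frames

def stable_prediction(probabilities):
    """Accept a gesture only when the model is confident and recent frames agree."""
    idx = int(np.argmax(probabilities))
    if probabilities[idx] < CONF_THRESHOLD:
        return None
    recent.append(CLASSES[idx])
    label, count = Counter(recent).most_common(1)[0]
    # Commit only once the window is full and a clear majority agrees.
    if len(recent) == recent.maxlen and count > recent.maxlen // 2:
        return label
    return None
```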
Outcomes
• Responsive gameplay
• Accurate detection of the user's gestures
• Scoreboard (a sketch of the round logic follows)
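The computer's random move, the win/loss decision, and the scoreboard reduce to a few lines of game logic. A hypothetical version, with names and structure chosen for illustration rather than taken from the project's code, might look like this:

```python
import random

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
score = {"player": 0, "computer": 0, "ties": 0}

def play_round(player_move):
    """Resolve one round against a random computer move and update the scoreboard."""
    computer_move = random.choice(list(BEATS))
    if player_move == computer_move:
        score["ties"] += 1
        result = "tie"
    elif BEATS[player_move] == computer_move:
        score["player"] += 1
        result = "you win"
    else:
        score["computer"] += 1
        result = "computer wins"
    return f"You: {player_move} | Computer: {computer_move} | {result} | {score}"

print(play_round("rock"))
```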
Year: 2024
View Code | Live Demo