Rhythmic Lyre
Personal Project, 2023
Game Design - Python - Gesture Recognition - AI/ML - Sound Design - Prototyping - Solo Project Execution
Details
A personal experiment combining AI/machine learning and game design. I explored video recognition as a hands-free game controller, using camera input to track simple physical gestures (lifting printed arrow signs) to control a minimalist rhythm-based game built in Unity.
Motivation
I wanted to explore how AI and computer vision could be used for alternative game controls. Rather than building another keyboard-controlled game, I challenged myself to create a system where simple camera-based gestures trigger in-game actions, turning physical movement into interaction.
Approach & Constraints
Use video recognition to interpret directional gestures
Design a simple, responsive game loop that reacts in real time
Avoid advanced ML models and keep the solution lightweight
Build the full experience solo: from input tracking to game design and sound logic
Make it playable with a basic webcam and printed arrows (a detection sketch follows this list)
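To make the lightweight, webcam-only constraint concrete, here is a minimal sketch of how directional arrow detection could look in Python with OpenCV. The project trained its own basic recognizer; this version swaps in simple template matching as a stand-in, and the template file names and match threshold are assumptions rather than project values.

```python
# Illustrative sketch: lightweight "up"/"down" arrow detection from a webcam
# using OpenCV template matching (a stand-in for the project's trained
# recognizer; the image files and 0.6 threshold are assumptions).
import cv2

TEMPLATES = {
    "up": cv2.imread("arrow_up.png", cv2.IMREAD_GRAYSCALE),
    "down": cv2.imread("arrow_down.png", cv2.IMREAD_GRAYSCALE),
}
MATCH_THRESHOLD = 0.6  # assumed confidence cutoff


def detect_gesture(frame):
    """Return 'up', 'down', or None for a single BGR webcam frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    best_label, best_score = None, MATCH_THRESHOLD
    for label, template in TEMPLATES.items():
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)  # take the best match score
        if score > best_score:
            best_label, best_score = label, score
    return best_label


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gesture = detect_gesture(frame)
        if gesture:
            print("detected:", gesture)
        cv2.imshow("webcam", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```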
My Contributions
As the solo developer and designer, I handled every part of the project:
Trained a basic video recognition system to detect “up” and “down” arrow signs
Programmed the gesture-based control system using webcam input
Designed a minimalist game where the player moves a dot to collect falling “notes” before time runs out (the core loop is sketched after this list)
Balanced timing, spawning, and feedback to create a rhythm-like, reactive experience
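The Unity prototype itself isn't reproduced here, but the core loop is small enough to sketch. The Python/Pygame version below mirrors the logic only: keyboard arrows stand in for the gesture detector, and the two-lane layout, note speed, spawn interval, and 30-second round length are all assumptions for illustration.

```python
# Illustrative Pygame sketch of the core loop: gesture-driven dot, timed note
# spawning, and a countdown. The real prototype was built in Unity; the
# numbers and the two-lane layout below are assumptions, not project values.
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 360))
clock = pygame.time.Clock()

LANES_Y = [120, 240]      # assumed upper/lower lane heights
DOT_X = 80
SPAWN_EVERY_MS = 900      # assumed spawn interval (the "beat")
ROUND_MS = 30_000         # assumed round length

dot_lane = 0
notes = []                # each note: [x, lane_index]
score = 0
last_spawn = pygame.time.get_ticks()
start = last_spawn

running = True
while running:
    now = pygame.time.get_ticks()
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN:
            # In the real project, the "up"/"down" output of the gesture
            # detector replaces these key checks.
            if event.key == pygame.K_UP:
                dot_lane = 0
            elif event.key == pygame.K_DOWN:
                dot_lane = 1

    # Spawn a note in a random lane on a fixed, beat-like interval.
    if now - last_spawn >= SPAWN_EVERY_MS:
        notes.append([640, random.randint(0, 1)])
        last_spawn = now

    # Move notes toward the dot; collect the ones in the matching lane.
    for note in notes:
        note[0] -= 4
    score += sum(1 for n in notes if n[0] <= DOT_X and n[1] == dot_lane)
    notes = [n for n in notes if n[0] > DOT_X]

    # End the round when time runs out.
    if now - start >= ROUND_MS:
        running = False

    screen.fill((20, 20, 30))
    pygame.draw.circle(screen, (90, 200, 255), (DOT_X, LANES_Y[dot_lane]), 14)
    for x, lane in notes:
        pygame.draw.circle(screen, (255, 210, 90), (x, LANES_Y[lane]), 8)
    pygame.display.flip()
    clock.tick(60)

print("score:", score)
pygame.quit()
```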
Results
The final game is a fully functional prototype that explores gesture-based input without a traditional controller, offering a simple, musical experience driven entirely by physical movement. The project taught me a great deal about machine learning, camera input, and real-time controls, and about what it takes to make a game feel responsive and fun to play.