This is a public Edge Impulse project: use the navigation bar to see all data and models in this project, or clone it to retrain or deploy to any edge device.
Hand Gestures
In this project, we train four gesture classes: neutral (fist), five (open hand), peace (V-sign), and good (thumbs up). The object detection model returns each detected gesture's class along with its bounding box (x, y, w, and h values), which we use as real-time input to replace a keyboard or joystick in the classic game we're developing. The trained model is then deployed to a Raspberry Pi and integrated into our Python code, which uses the PyGame library for straightforward game logic and rendering on an LCD display.
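The core idea above is turning each frame's detections into game input. A minimal sketch of that mapping is below; the bounding-box dict layout (`label`, `x`, `y`, `width`, `height`, `value`), the frame width, and the gesture-to-action table are illustrative assumptions, not the exact output of this project's model or SDK.

```python
# Sketch: convert one frame's object-detection results into game input,
# replacing a keyboard or joystick. Field names and the action mapping
# are assumptions for illustration.

FRAME_WIDTH = 320  # assumed camera frame width in pixels

# Hypothetical mapping from the four gesture labels to in-game actions.
ACTIONS = {
    "neut": "idle",    # fist: do nothing
    "five": "fire",    # open hand
    "peace": "pause",  # V-sign
    "good": "start",   # thumbs up
}

def pick_box(boxes, min_confidence=0.5):
    """Return the highest-confidence detection above the threshold, or None."""
    candidates = [b for b in boxes if b["value"] >= min_confidence]
    return max(candidates, key=lambda b: b["value"]) if candidates else None

def gesture_to_input(boxes):
    """Map detections to (action, horizontal position in [-1, 1]).

    The horizontal position of the hand's bounding-box center stands in
    for a joystick axis: -1 is the left edge of the frame, +1 the right.
    """
    box = pick_box(boxes)
    if box is None:
        return ("idle", 0.0)
    center_x = box["x"] + box["width"] / 2
    steer = 2.0 * center_x / FRAME_WIDTH - 1.0
    return (ACTIONS.get(box["label"], "idle"), steer)

if __name__ == "__main__":
    frame = [{"label": "five", "x": 140, "y": 60,
              "width": 40, "height": 60, "value": 0.92}]
    print(gesture_to_input(frame))  # box centered in frame -> steer 0.0
```

In the actual game loop, this function would be called once per inference result, and the returned action and steering value fed into PyGame's update step in place of keyboard events.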
Dataset summary

| Data collected | 107 items |
| Labels | five, good, neut, peace |

Project info

| Project ID | 766818 |
| License | 3-Clause BSD |
| No. of views | 164 |
| No. of clones | 0 |