AI-driven Ancillary Lab Assistant w/ UNO Q
About this project
Based on the Arduino UNO Q, this lab assistant shows real-time sensor readings, identifies users by fingerprint, recognizes lab equipment via object detection, generates AI lessons via Gemini, and communicates through its full-fledged web dashboard.
This FOMO (Faster Objects, More Objects) object detection model is trained on images of various lab equipment. It lets the lab assistant identify each piece of equipment and then generate an AI lesson about it via Google Gemini, based on the provided lesson questions.
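The detection-to-lesson step could be sketched as below. This is a minimal illustration, not the project's actual code: `build_lesson_prompt`, the placeholder API key, and the `gemini-1.5-flash` model name are all assumptions; the Gemini call uses the `google-generativeai` Python client.

```python
# Sketch: turn a detected equipment label plus a lesson question into a
# Gemini request. Helper name and model name are assumptions, not the
# project's actual implementation.

def build_lesson_prompt(equipment_label, question):
    """Compose a lesson request from a FOMO label like 'bunsen_burner'."""
    name = equipment_label.replace("_", " ")
    return (f"You are a lab assistant. Generate a short lesson about the "
            f"{name}, answering this question: {question}")

if __name__ == "__main__":
    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")            # placeholder key
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name
    prompt = build_lesson_prompt("bunsen_burner",
                                 "How do I light it safely?")
    print(model.generate_content(prompt).text)
```

The pure prompt-building function is kept separate from the network call so it can be reused by the web dashboard.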
While labeling the lab equipment image samples, I simply used the name of the target lab equipment:
- skeleton_model
- microscope
- alcohol_burner
- bunsen_burner
- dynamometer
After training and validating the model, I deployed it. Since I developed this project on the Arduino UNO Q, I was able to use the official pipeline to import the model directly into Arduino App Lab by linking my Arduino account to Edge Impulse Studio.
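Running the deployed `.eim` binary on the UNO Q's Linux side could look like the sketch below, using the Edge Impulse Linux Python SDK. The model path, frame source, and confidence threshold are assumptions; only the result structure (`bounding_boxes` entries with `label` and `value` keys) follows the SDK's documented output.

```python
# Sketch: classify a camera frame with a deployed FOMO .eim model and
# keep only confident detections. Paths and threshold are assumptions.

def top_detections(bounding_boxes, threshold=0.6):
    """Keep boxes at or above a confidence threshold, best first."""
    boxes = [b for b in bounding_boxes if b["value"] >= threshold]
    return sorted(boxes, key=lambda b: b["value"], reverse=True)

if __name__ == "__main__":
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    with ImageImpulseRunner("model.eim") as runner:  # hypothetical path
        runner.init()
        img = cv2.imread("frame.jpg")                # hypothetical frame
        features, cropped = runner.get_features_from_image(img)
        res = runner.classify(features)
        for box in top_detections(res["result"].get("bounding_boxes", [])):
            print(f'{box["label"]}: {box["value"]:.2f}')
```

Filtering in a standalone helper keeps the dashboard logic independent of the SDK's result format.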
The project GitHub repository provides:
- Code files
- The lab assistant App Lab application's ZIP folder
- PCB design files (Gerber)
- 3D part design files (STL)
- Edge Impulse FOMO object detection model (EIM binary for UNO Q)
Dataset summary
| Data collected | 150 items |
| Labels | alcohol_burner, bunsen_burner, dynamometer, microscope, skeleton_model |
Project info
| Project ID | 947877 |
| License | 3-Clause BSD |