Bee Winter
HappyBees: Distributed IoT Beehive Monitoring System Winter Model
About this project
HappyBees is a distributed IoT beehive monitoring system. It uses edge ML on a Raspberry Pi Pico 2 W to detect swarming events via acoustic analysis and uploads telemetry to a central server with a real-time dashboard. The project is powered by the Bee Summer and Bee Winter models.
This project contains the Winter Model, designed to detect life-threatening anomalies like freezing or starvation during the dormant season.
Architecture
The model is a PyTorch Autoencoder deployed as an anomaly detector. To ensure efficiency on the Pico 2 W, the pipeline follows three steps:
- Develop: Custom PyTorch Autoencoder (compression/reconstruction network).
- Integrate: The trained network is exported to ONNX and imported via Edge Impulse's Bring Your Own Model (BYOM) feature.
- Deploy: Exported as an optimized TensorFlow Lite neural network (C++ library).
It took us 8 models before we got here.
Biological Basis
While Summer is defined by high-energy foraging and swarming, Winter is defined by energy conservation and clustering. Bees form a tight "winter cluster" to vibrate their wing muscles and generate heat, maintaining a core temperature even in freezing weather.
This activity creates a distinct "heating" acoustic signature (specific frequency hum) and stable thermal patterns. By learning these healthy clustering patterns, the model can detect when the cluster breaks (due to starvation or extreme cold) without needing to know exactly what a "dead" hive looks like.
Winter Model Logic
The Winter model functions as an Anomaly Detector. It monitors the stability of the "heating cluster."
- Primary Logic: Autoencoder (Reconstruction) - The model compresses input data into a tiny summary (3 numbers) and attempts to rebuild it. It is trained only on healthy, clustering bees (Nov 1 - Nov 21 data).
- The "Anomaly" Trigger - If the bees are healthy, the model reconstructs the data accurately (Low Error). If the cluster breaks or the hive dies, the data looks "alien" to the model, resulting in a high reconstruction error (High Anomaly Score).
- Efficiency Metric - It specifically monitors "Heater Efficiency"—are the bees generating enough heat relative to the amount of noise (energy) they are expending?
Key Insight: We do not train on "Dead" hives. We train on "Healthy" ones. Anything that deviates significantly from "Healthy" is flagged as an anomaly.
Model Architecture
1.1 Input Features (5 elements)
The model expects a 5-element float32 feature vector focusing on thermal stability and specific audio frequencies.
| Index | Feature | Description |
|---|---|---|
| 0 | Temperature | Current internal hive temperature (°C) |
| 1 | Humidity | Current internal humidity (%) |
| 2 | Temp Stability | Variance over last 12 readings (~3 hours). Formula: Var = Σ(x - μ)² / N |
| 3 | Heater Power | Sum of audio energy in "Clustering Frequencies" (180 Hz–250 Hz). |
| 4 | Heater Ratio | Efficiency metric: Heater Power / Total Audio Density. |
For input details and how to extract them correctly, see the HappyBees GitHub Model Guide.
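As a rough illustration of how the last three features in the table could be derived, here is a sketch in NumPy. The sample rate, helper names, and FFT windowing are assumptions for illustration, not the project's actual extraction code:

```python
import numpy as np

SAMPLE_RATE = 16000            # assumed microphone sample rate
CLUSTER_BAND = (180.0, 250.0)  # "Clustering Frequencies" from the table

def temp_stability(last_12_temps):
    """Population variance over the last 12 readings (~3 hours): Var = sum((x - mu)^2) / N."""
    x = np.asarray(last_12_temps, dtype=np.float32)
    return float(np.mean((x - x.mean()) ** 2))

def heater_features(audio):
    """Heater Power = spectral energy in 180-250 Hz; Heater Ratio = that energy / total energy."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / SAMPLE_RATE)
    band = (freqs >= CLUSTER_BAND[0]) & (freqs <= CLUSTER_BAND[1])
    heater_power = float(spectrum[band].sum())
    total = float(spectrum.sum()) + 1e-9   # guard against divide-by-zero on silence
    return heater_power, heater_power / total

def build_feature_vector(temp_c, humidity, last_12_temps, audio):
    """Assemble the 5-element float32 vector in table order."""
    power, ratio = heater_features(audio)
    return np.array(
        [temp_c, humidity, temp_stability(last_12_temps), power, ratio],
        dtype=np.float32,
    )
```

For example, a pure 200 Hz tone falls entirely inside the clustering band, so its Heater Ratio comes out near 1.0.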
1.2 Sensor Wiring
The bees build mathematically perfect hexagons, and here I am making solder joints that look like a pigeon flew over the board and just let go.
Use hardware at your own risk.
Thankfully you can try out the model using numeric inputs here at Edge Impulse Studio.
1.3 Output Interpretation
Unlike a classifier, this model does not output a simple "Class."
- Input: Feed the 5 raw features.
- Output: The model returns its "best guess" (reconstruction) of those 5 features.
- Result: Calculate the Mean Squared Error (MSE) between Input and Output.
- MSE < Threshold: Normal (Healthy Cluster)
- MSE > Threshold: Anomaly (Cluster broken/Starvation)
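The scoring steps above can be sketched in a few lines. The threshold value here is a placeholder; in practice it would be calibrated on held-out healthy data:

```python
import numpy as np

# Hypothetical threshold; calibrate on held-out healthy readings in practice.
ANOMALY_THRESHOLD = 0.5

def anomaly_score(features, reconstruction):
    """Mean Squared Error between the 5 input features and the model's reconstruction."""
    f = np.asarray(features, dtype=np.float32)
    r = np.asarray(reconstruction, dtype=np.float32)
    return float(np.mean((f - r) ** 2))

def is_anomaly(features, reconstruction, threshold=ANOMALY_THRESHOLD):
    """MSE above threshold -> cluster broken / starvation; below -> healthy cluster."""
    return anomaly_score(features, reconstruction) > threshold
```

A perfect reconstruction scores 0.0 and is classified as normal; a reconstruction that misses even one feature badly pushes the MSE over the threshold.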
Technical Specifications
The Winter Model is a symmetric Autoencoder wrapped in a custom scaling layer.
For full code see the HappyBees GitHub.
| Feature | Specification |
|---|---|
| Architecture | PyTorch Autoencoder (Reconstruction) |
| Structure | Encoder (Compress) → Bottleneck → Decoder (Expand) |
| Bottleneck | 3 Neurons (Data compressed to just 3 numbers) |
| Input Shape | (Batch, 5) |
| Format | ONNX (with embedded Min/Max scaling wrapper) |
Architecture Flow
The model forces the data through a severe "information bottleneck" to learn only the most essential patterns of a healthy hive.
- Wrapper: Normalizes inputs to 0-1 range using baked-in min/max stats (from training data).
- Encoder: Linear(5→12)→Tanh→Linear(12→6)→Tanh→Linear(6→3)
- Decoder: Linear(3→6)→Tanh→Linear(6→12)→Tanh→Linear(12→5)
- Wrapper: Denormalizes output back to real-world units for error calculation.
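Under the layer sizes listed above, the wrapped autoencoder might look like this in PyTorch. The min/max buffers here are placeholders standing in for the baked-in training-set statistics:

```python
import torch
import torch.nn as nn

class WinterAutoencoder(nn.Module):
    """Symmetric autoencoder with a 3-neuron bottleneck and min/max scaling wrapper.
    feat_min/feat_max are placeholders for the real training-data statistics."""

    def __init__(self, feat_min, feat_max):
        super().__init__()
        self.register_buffer("feat_min", torch.tensor(feat_min))
        self.register_buffer("feat_max", torch.tensor(feat_max))
        self.encoder = nn.Sequential(
            nn.Linear(5, 12), nn.Tanh(),
            nn.Linear(12, 6), nn.Tanh(),
            nn.Linear(6, 3),
        )
        self.decoder = nn.Sequential(
            nn.Linear(3, 6), nn.Tanh(),
            nn.Linear(6, 12), nn.Tanh(),
            nn.Linear(12, 5),
        )

    def forward(self, x):
        scale = self.feat_max - self.feat_min
        z = (x - self.feat_min) / scale        # wrapper: normalize to 0-1
        recon = self.decoder(self.encoder(z))  # squeeze through the 3-neuron bottleneck
        return recon * scale + self.feat_min   # wrapper: back to real-world units
```

Because the scaling is part of `forward`, the exported ONNX graph can accept and return raw, real-world feature values.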
Training Parameters
- Optimizer: Adam (lr=0.002)
- Loss Function: MSELoss (Mean Squared Error)
- Dataset: Trained exclusively on "Healthy" November data (Nov 1 - Nov 21).
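A minimal training sketch under these parameters, using random stand-in data in place of the healthy November recordings; the epoch count and full-batch updates are assumptions, not the project's actual training script:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for the real (N, 5) matrix of healthy November features, already scaled to 0-1.
healthy = torch.rand(256, 5)

# Same layer shapes as the Winter autoencoder (no activation at the bottleneck).
model = nn.Sequential(
    nn.Linear(5, 12), nn.Tanh(), nn.Linear(12, 6), nn.Tanh(), nn.Linear(6, 3),
    nn.Linear(3, 6), nn.Tanh(), nn.Linear(6, 12), nn.Tanh(), nn.Linear(12, 5),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.002)
loss_fn = nn.MSELoss()

losses = []
for epoch in range(50):                       # epoch count is an assumption
    optimizer.zero_grad()
    loss = loss_fn(model(healthy), healthy)   # reconstruct the input itself
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

Since the target is the input itself, the loss directly measures reconstruction error, which is the same quantity later used as the anomaly score at inference time.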
Watch demo
Watch HappyBees Demo on YouTube
Want to learn more?
More details on the Summer model can be found in the HappyBees GitHub Model Guide.
More details on the whole project can be found on HappyBees GitHub.
Check out the Summer Bees Edge Impulse project
Project info
| Project ID | 837280 |
| License | 3-Clause BSD |