HappyBees: Distributed IoT Beehive Monitoring System Winter Model

audio humidity temperature weight

About this project

HappyBees: Distributed IoT Beehive Monitoring System

beewatch_logo.png

HappyBees is a distributed IoT beehive monitoring system. It uses edge ML on a Raspberry Pi Pico 2 W to detect swarming events via acoustic analysis and uploads telemetry to a central server with a real-time dashboard. The project is powered by two models: Bee Summer and Bee Winter.

This project contains the Winter Model, designed to detect life-threatening anomalies like freezing or starvation during the dormant season.

HappyBees_small.gif


Architecture

The model is a PyTorch Autoencoder deployed as an anomaly detector. To ensure efficiency on the Pico 2 W, the pipeline follows three steps:

  1. Develop: Custom PyTorch Autoencoder (compression/reconstruction network).
  2. Integrate: Brought into Edge Impulse via the Bring Your Own Model (BYOM) feature, exported as ONNX (a short export sketch follows below).
  3. Deploy: Exported as an optimized TensorFlow Lite neural network (C++ library).

It took us 8 models before we got here.
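
To make the BYOM step concrete, here is a minimal sketch of exporting a PyTorch model to ONNX. The stand-in model, file name, and tensor names below are illustrative placeholders, not the project's exact code (see the HappyBees GitHub for that).

```python
import torch
import torch.nn as nn

# Placeholder stand-in for the trained Winter autoencoder: (Batch, 5) -> (Batch, 5).
# The real architecture is described under "Model Architecture" below.
model = nn.Sequential(nn.Linear(5, 3), nn.Tanh(), nn.Linear(3, 5))
model.eval()

dummy_input = torch.zeros(1, 5)  # one 5-element feature vector
torch.onnx.export(
    model,
    dummy_input,
    "winter_autoencoder.onnx",        # file uploaded to Edge Impulse BYOM
    input_names=["features"],
    output_names=["reconstruction"],
    opset_version=13,
)
```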


Biological Basis

While Summer is defined by high-energy foraging and swarming, Winter is defined by energy conservation and clustering. Bees form a tight "winter cluster" and vibrate their flight muscles to generate heat, maintaining a stable core temperature even in freezing weather.

This activity creates a distinct "heating" acoustic signature (specific frequency hum) and stable thermal patterns. By learning these healthy clustering patterns, the model can detect when the cluster breaks (due to starvation or extreme cold) without needing to know exactly what a "dead" hive looks like.

Screenshot 2025-11-30 at 20.44.25.png


Winter Model Logic

The Winter model functions as an Anomaly Detector. It monitors the stability of the "heating cluster."

  • Primary Logic: Autoencoder (Reconstruction) - The model compresses input data into a tiny summary (3 numbers) and attempts to rebuild it. It is trained only on healthy, clustering bees (Nov 1 - Nov 21 data).
  • The "Anomaly" Trigger - If the bees are healthy, the model reconstructs the data accurately (Low Error). If the cluster breaks or the hive dies, the data looks "alien" to the model, resulting in a high reconstruction error (High Anomaly Score).
  • Efficiency Metric - It specifically monitors "Heater Efficiency"—are the bees generating enough heat relative to the amount of noise (energy) they are expending?

Key Insight: We do not train on "Dead" hives. We train on "Healthy" ones. Anything that deviates significantly from "Healthy" is flagged as an anomaly.

output.gif


Model Architecture

1.1 Input Features (5 elements)

The model expects a 5-element float32 feature vector focusing on thermal stability and specific audio frequencies.

Index | Feature | Description
------|---------|------------
0 | Temperature | Current internal hive temperature (°C)
1 | Humidity | Current internal humidity (%)
2 | Temp Stability | Variance over the last 12 readings (~3 hours). Formula: Var = Σ(x - μ)² / N
3 | Heater Power | Sum of audio energy in the "clustering frequencies" (180 Hz–250 Hz).
4 | Heater Ratio | Efficiency metric: Heater Power / Total Audio Density.

For input details and how to extract them correctly, see the HappyBees GitHub Model Guide.
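
As a rough illustration of how such a vector could be assembled (the function name, the sample rate, and the FFT-based band-energy calculation are assumptions, not the project's exact preprocessing):

```python
import numpy as np

def build_feature_vector(temp_c, humidity, temp_history, audio, sample_rate=16000):
    """Assemble the 5-element input vector from raw sensor readings.

    temp_history: the last 12 temperature readings (~3 hours).
    audio: 1-D array of raw microphone samples; sample_rate is an assumption here.
    """
    # Index 2: variance of the last 12 temperature readings.
    temp_stability = float(np.var(temp_history))

    # Power spectrum of the audio window.
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)

    # Index 3: energy in the "clustering" band (180-250 Hz).
    band = (freqs >= 180) & (freqs <= 250)
    heater_power = float(spectrum[band].sum())

    # Index 4: heater power relative to the total audio energy.
    total_energy = float(spectrum.sum()) + 1e-9   # avoid division by zero
    heater_ratio = heater_power / total_energy

    return np.array([temp_c, humidity, temp_stability, heater_power, heater_ratio],
                    dtype=np.float32)
```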


1.2 Sensor Wiring

happybees_schematic.png

sensors_top.jpeg

sensor_wiring_back.jpg

The bees build mathematically perfect hexagons, and here I am making solder joints that look like a pigeon flew over the board and just let go.

Use hardware at your own risk.

Thankfully, you can try out the model with numeric inputs directly in Edge Impulse Studio.


1.3 Output Interpretation

Unlike a classifier, this model does not output a simple "Class."

  1. Input: Feed the 5 raw features.
  2. Output: The model returns its "best guess" (reconstruction) of those 5 features.
  3. Result: Calculate the Mean Squared Error (MSE) between Input and Output.
    • MSE < Threshold: Normal (Healthy Cluster)
    • MSE > Threshold: Anomaly (Cluster broken/Starvation)
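
A minimal host-side sketch of that calculation, assuming the ONNX file and tensor names from the export sketch above (the on-device version uses the generated TensorFlow Lite C++ library instead, and the threshold must be calibrated on healthy data):

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("winter_autoencoder.onnx")

def anomaly_score(features, threshold):
    """Return the reconstruction MSE and a normal/anomaly verdict."""
    x = np.asarray(features, dtype=np.float32)[None, :]            # shape (1, 5)
    reconstruction = session.run(None, {"features": x})[0][0]
    mse = float(np.mean((x[0] - reconstruction) ** 2))
    return mse, ("anomaly" if mse > threshold else "normal")

# Arbitrary placeholder values, not real hive data:
# [temperature, humidity, temp stability, heater power, heater ratio]
mse, status = anomaly_score([22.4, 58.0, 0.3, 1200.0, 0.41], threshold=0.05)
```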

Technical Specifications

The Winter Model is a symmetric Autoencoder wrapped in a custom scaling layer.

For the full code, see the HappyBees GitHub repository.

Feature | Specification
--------|--------------
Architecture | PyTorch Autoencoder (Reconstruction)
Structure | Encoder (Compress) → Bottleneck → Decoder (Expand)
Bottleneck | 3 neurons (data compressed to just 3 numbers)
Input Shape | (Batch, 5)
Format | ONNX (with embedded Min/Max scaling wrapper)

Architecture Flow

The model forces the data through a severe "information bottleneck" to learn only the most essential patterns of a healthy hive.

  1. Wrapper: Normalizes inputs to 0-1 range using baked-in min/max stats (from training data).
  2. Encoder: Linear(5→12) → Tanh → Linear(12→6) → Tanh → Linear(6→3)
  3. Decoder: Linear(3→6) → Tanh → Linear(6→12) → Tanh → Linear(12→5)
  4. Wrapper: Denormalizes output back to real-world units for error calculation.
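
A minimal PyTorch sketch of this flow. The class and attribute names are illustrative, and the min/max statistics are whatever was baked in from the training data; see the HappyBees GitHub for the full code.

```python
import torch
import torch.nn as nn

class WinterAutoencoder(nn.Module):
    """Symmetric autoencoder with a baked-in min/max scaling wrapper."""

    def __init__(self, feat_min, feat_max):
        super().__init__()
        # Per-feature min/max from the training data, stored with the model.
        self.register_buffer("feat_min", torch.as_tensor(feat_min, dtype=torch.float32))
        self.register_buffer("feat_max", torch.as_tensor(feat_max, dtype=torch.float32))
        self.encoder = nn.Sequential(
            nn.Linear(5, 12), nn.Tanh(),
            nn.Linear(12, 6), nn.Tanh(),
            nn.Linear(6, 3),                 # 3-neuron bottleneck
        )
        self.decoder = nn.Sequential(
            nn.Linear(3, 6), nn.Tanh(),
            nn.Linear(6, 12), nn.Tanh(),
            nn.Linear(12, 5),
        )

    def forward(self, x):
        # Wrapper: normalize raw features to the 0-1 range.
        span = (self.feat_max - self.feat_min).clamp_min(1e-9)
        scaled = (x - self.feat_min) / span
        reconstruction = self.decoder(self.encoder(scaled))
        # Wrapper: map the reconstruction back to real-world units.
        return reconstruction * span + self.feat_min
```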

Training Parameters

  • Optimizer: Adam (lr=0.002)
  • Loss Function: MSELoss (Mean Squared Error)
  • Dataset: Trained exclusively on "Healthy" November data (Nov 1 - Nov 21).
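
A sketch of the training setup under those parameters, reusing the WinterAutoencoder sketch from above. The random placeholder data, epoch count, and the mean-plus-3-sigma threshold rule are assumptions, not the project's exact values.

```python
import torch
import torch.nn as nn

# Stand-in for the Nov 1 - Nov 21 "healthy" feature vectors, shape (N, 5).
healthy = torch.rand(1000, 5)
feat_min, feat_max = healthy.min(dim=0).values, healthy.max(dim=0).values

model = WinterAutoencoder(feat_min, feat_max)
optimizer = torch.optim.Adam(model.parameters(), lr=0.002)
criterion = nn.MSELoss()

for epoch in range(200):                              # epoch count is illustrative
    optimizer.zero_grad()
    loss = criterion(model(healthy), healthy)         # learn to reconstruct healthy data only
    loss.backward()
    optimizer.step()

# Calibrate an anomaly threshold from reconstruction errors on the healthy set.
with torch.no_grad():
    errors = ((model(healthy) - healthy) ** 2).mean(dim=1)
threshold = (errors.mean() + 3 * errors.std()).item()
```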

Watch demo

Watch HappyBees Demo on YouTube


Want to learn more?

More details on the summer model can be found in the HappyBees GitHub Model Guide.

More details on the whole project can be found in the HappyBees GitHub repository.

Check out the Summer Bees Edge Impulse project


Project info

Project ID 837280
License 3-Clause BSD