Automated Bee Traffic Monitoring
- Bee traffic monitoring is the automated quantification and analysis of honey bee ingress and egress using sensor and imaging systems.
- It integrates machine learning, computer vision, and edge computing to deliver high-resolution, real-time data on forager activity and colony health.
- Practical applications include early-warning systems for colony collapse and optimized apiary management through accurate, non-invasive monitoring.
Bee traffic monitoring is the automated quantification and analysis of individual honey bees entering and exiting a hive, typically using visual or sensor-based systems. The primary aim is to acquire high-resolution time series on forager activity, colony ingress/egress patterns, and health-related anomalies; these signals serve as proxies for colony strength, resource intake, population trends, and early warning of stressors such as disease and environmental threats. Modern bee traffic monitoring systems integrate machine learning, computer vision, and edge/embedded computation, supporting non-invasive, scalable, and near-real-time deployment across a wide range of apicultural settings.
1. Physical Instrumentation and Hardware Design
Two primary physical configurations dominate the literature: narrow tunnel (quasi-1D traffic) systems and full-entrance wide-field systems. Tunnel-based approaches constrain bee movement through acrylic or PVC corridors (typically 2–3 cm wide, 1–2 cm high), allowing simpler segmentation and flow direction inference. Overhead or frontal camera mounting (10–50 cm from the entrance) with wide-angle lenses (90°–120° field of view) is used for higher-throughput settings, at the cost of increased occlusion and background complexity (Bilik et al., 2022, Bilik et al., 13 Jun 2024).
Lighting strategies include pure daylight (with vented, rain-proof enclosures as in IntelliBeeHive (Narcia-Macias et al., 2023)), diffused white/IR LED panels for uniform exposure, and infrared-only variants (e.g., IR tunnels for 24-hour operation and minimal visual disturbance [CHEN2012], (Boenisch et al., 2018)). Embedded deployments favor low-power cameras (e.g., Raspberry Pi Camera V2.1, 8 MP) paired with NVIDIA Jetson Nano/TX2, Coral Edge TPU, or ARM Cortex-M MCUs, depending on model complexity and power availability (Sucipto et al., 10 Sep 2025).
Typical example (IntelliBeeHive (Narcia-Macias et al., 2023)); a capture sketch follows the list:
- Enclosure: 3D-printed PLA or laser-cut plywood, mesh viewing window, camera at 120–155 mm from flight path.
- Imaging: 1280×720 px at 10 FPS, downscaled to 640×420 for processing.
- Edge compute: Jetson Nano (PoE powered), temperature/humidity sensor (BME680), water-tight cable routing.
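As a concrete illustration of the capture stage, the following is a minimal sketch assuming OpenCV and a V4L2-compatible camera; the device index is deployment-specific, and the resolution/FPS values mirror the IntelliBeeHive figures above.

```python
import cv2

# Minimal capture loop mirroring the IntelliBeeHive settings:
# 1280x720 @ 10 FPS, downscaled to 640x420 before inference.
cap = cv2.VideoCapture(0)  # device index is deployment-specific
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
cap.set(cv2.CAP_PROP_FPS, 10)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    small = cv2.resize(frame, (640, 420))  # downscale for the detector
    # ... hand `small` to the detection/tracking pipeline ...

cap.release()
```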
2. Computer Vision Methods for Traffic Detection and Counting
2.1 Conventional Computer Vision (Pre-2018)
Early methods rely on foreground-background segmentation (e.g., Gaussian Mixture Models/MOG2 (Sucipto et al., 10 Sep 2025, Bilik et al., 2022)), frame differencing, and optical flow [Mukherjee & Kulyukin, 2020], occasionally augmented with Hough transforms or SVMs on HOG features for tagged-bee detection [CHEN2012]. Motion-based pipelines partition the ROI into spatial slices, tracking bulk signal sweeps to infer ingress/egress, but degrade under occlusion or with slow-moving/clustered bees (Bilik et al., 13 Jun 2024).
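A minimal sketch of such a motion-based front end, assuming OpenCV's MOG2 background subtractor; the morphology kernel and minimum blob area are illustrative assumptions, not values from the cited systems.

```python
import cv2

# Classical motion pipeline: MOG2 foreground mask -> morphology -> blobs.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

def bee_blobs(frame, min_area=50):
    """Return bounding boxes of moving blobs (candidate bees)."""
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```

As noted above, this approach fails once bees cluster or the background shifts, which motivates the detector-based pipelines below.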
2.2 Convolutional Neural Networks and Object Detection
Recent systems predominantly use deep CNNs and single-stage detectors (a minimal inference sketch follows this list):
- YOLO family (YOLOv3/v4/v5/v7/v8-tiny): anchor-based detectors with low parameter counts (e.g., 8–12M for "tiny" models), running post-quantization and pruning to support real-time inference on Jetson Nano or TX2 (~25 FPS) (Sucipto et al., 10 Sep 2025, Narcia-Macias et al., 2023, Bilik et al., 2022).
- MobileNetV2/EfficientNet-Lite: inverted residual and compound-scaled architectures suited for Coral Edge TPU/MCU deployment (Sucipto et al., 10 Sep 2025).
- ResNet-50 and SqueezeNet: used as sliding-window classifiers for bee cutouts, achieving 87–93% test accuracy depending on dataset and lighting (Bilik et al., 13 Jun 2024).
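A minimal inference sketch using the `ultralytics` package (which provides the YOLOv8 models cited here); the weights filename and image path are placeholders, and edge deployments would additionally export to TensorRT or TFLite.

```python
from ultralytics import YOLO

# "bee_yolov8n.pt" is a placeholder for weights fine-tuned on
# hive-entrance imagery; it is not a published checkpoint.
model = YOLO("bee_yolov8n.pt")

results = model.predict("entrance_frame.jpg", imgsz=640, conf=0.25)
for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()  # pixel coordinates
    print(f"bee at ({x1:.0f},{y1:.0f})-({x2:.0f},{y2:.0f}), "
          f"conf={float(box.conf):.2f}")
```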
Detection/tracking logic typically includes (a centroid-association sketch follows this list):
- Input resizing (416×416 or 640×420 px), RGB normalization, letterbox padding (standard YOLOv7 preprocessing).
- Kalman filtering or centroid matching to associate detections frame-to-frame (Bilik et al., 2022, Sucipto et al., 10 Sep 2025), with matching thresholds tuned (e.g., ±50 px in IntelliBeeHive).
- Event rules: inbound/outbound assigned when tracklets cross virtual gates or band thresholds (e.g., Y-axis bands).
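A greedy centroid-association sketch using the ±50 px threshold cited above; real systems may substitute a Kalman filter prediction for the raw previous centroid.

```python
import math
from itertools import count

_next_id = count()

def match_centroids(tracks, detections, max_dist=50.0):
    """Greedily associate detections to existing tracks by nearest
    centroid (threshold per IntelliBeeHive's +/-50 px rule).
    `tracks` maps id -> (x, y); `detections` is a list of (x, y).
    Unmatched detections start new, ephemeral track ids."""
    updated, used = {}, set()
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for i, (dx, dy) in enumerate(detections):
            if i in used:
                continue
            d = math.hypot(dx - tx, dy - ty)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            updated[tid] = detections[best]
    for i, det in enumerate(detections):
        if i not in used:
            updated[next(_next_id)] = det  # new ephemeral track
    return updated
```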
Table 1. Representative bee traffic detection models and metrics
| Model | Precision | Recall | F1-score | Hardware | FPS |
|---|---|---|---|---|---|
| YOLOv7-tiny (bees) | 0.981 | 0.981 | 0.950 | Jetson Nano | ~37* |
| YOLOv7-tiny (pollen) | 0.821 | 0.821 | 0.831 | Jetson Nano | ~37* |
| YOLOv4-tiny | 0.96 | 0.94 | 0.95 | Raspberry Pi | ~5 |
| ResNet-50 (BUT2) | – | – | 0.93 | RTX 3090 | ~10 |
| YOLOv8m | – | – | 0.97 (mAP@0.5) | RTX 3060 | ~33 |
*YOLOv7-tiny inference time reduced to 27 ms/frame using TensorRT (Narcia-Macias et al., 2023).
3. Tracking and Flow Rate Computation
Tracking methods span frame-to-frame centroid association, Hungarian algorithm-based global assignment (as in BeesBook (Boenisch et al., 2018)), SORT/DeepSORT/ByteTrack for object re-identification (Bilik et al., 2022), and marker-based identity persistence (e.g., 12-bit circular tags in BeesBook). For markerless systems (YOLO+tracking), bee identity is usually ephemeral and event-based (new ID per crossing).
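For globally optimal (rather than greedy) frame-to-frame assignment, the Hungarian algorithm is standard; the sketch below uses a plain Euclidean cost, whereas BeesBook combines spatial, angular, and ID-probability features.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign(prev_xy, curr_xy, max_dist=50.0):
    """Globally optimal one-to-one centroid matching between frames.
    Returns (prev_index, curr_index) pairs whose distance <= max_dist."""
    cost = np.linalg.norm(prev_xy[:, None, :] - curr_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]

# Two bees move slightly; a third detection starts a new track.
prev = np.array([[10.0, 20.0], [100.0, 40.0]])
curr = np.array([[12.0, 22.0], [103.0, 38.0], [200.0, 10.0]])
print(assign(prev, curr))  # [(0, 0), (1, 1)]
```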
Key equations (IntelliBeeHive, (Narcia-Macias et al., 2023)):
- Throughput: $T_{\Delta t} = A_{\Delta t} + L_{\Delta t}$
- Net flow: $F_{\Delta t} = A_{\Delta t} - L_{\Delta t}$
- Accuracy: $\text{Acc} = 1 - |c_{\text{auto}} - c_{\text{manual}}| / c_{\text{manual}}$, where $A_{\Delta t}$, $L_{\Delta t}$ are arrivals/leavings in interval $\Delta t$, and $c_{\text{auto}}$, $c_{\text{manual}}$ are the automated and manual counts.
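A direct transcription of these formulas in code (function and variable names are illustrative):

```python
def throughput(arrivals: int, leavings: int) -> int:
    """Total crossings T = A + L over the interval."""
    return arrivals + leavings

def net_flow(arrivals: int, leavings: int) -> int:
    """Net flux F = A - L (positive when ingress dominates)."""
    return arrivals - leavings

def count_accuracy(auto: int, manual: int) -> float:
    """Acc = 1 - |c_auto - c_manual| / c_manual vs. human ground truth."""
    return 1.0 - abs(auto - manual) / manual

print(count_accuracy(auto=193, manual=200))  # 0.965
```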
BeesBook (Boenisch et al., 2018) provides a robust two-stage ID tracking system:
- Tracklets linked via linear SVM (spatial + angular + ID probability vector), followed by Random Forest–based merging to bridge short gaps.
- Reported tracking ID error rates: raw detection ≈ 13.3%, after two-stage linking ≈ 1.9%.
Event logic commonly includes band-based triggers: e.g., "Arriving" if the Y centroid crosses below y₁, "Leaving" above y₂, or via direction-class counts across gates (Narcia-Macias et al., 2023, Bilik et al., 13 Jun 2024).
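A band-trigger sketch corresponding to this rule; the band positions y1 and y2 are illustrative and depend on camera geometry (note that image Y grows downward in OpenCV conventions).

```python
def classify_crossing(y_trace, y1=150, y2=300):
    """Classify a finished tracklet from its Y-centroid history:
    'arriving' if it crossed the upper band y1, 'leaving' if it
    crossed the lower band y2, else None (band values illustrative)."""
    start, end = y_trace[0], y_trace[-1]
    if start > y1 >= end:
        return "arriving"
    if start < y2 <= end:
        return "leaving"
    return None

print(classify_crossing([320, 240, 120]))  # 'arriving'
print(classify_crossing([140, 260, 330]))  # 'leaving'
```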
Advanced pipelines compute local densities, speed/flow fields, dwell-time distributions, and support spatiotemporal analyses of jammed flows or anomalous collective states (Boenisch et al., 2018, Bilik et al., 2022).
4. Performance Benchmarks, Datasets, and Evaluation
Standard metrics include Precision, Recall, F1-score, mean Average Precision (mAP@0.5), counting accuracy, and Mean Absolute Percentage Error (MAPE); representative results follow, with a small computation sketch after the list:
- YOLOv7-tiny bee recognition: F1 = 0.95, precision/recall = 0.981 (Narcia-Macias et al., 2023)
- YOLOv4-tiny (BeeVid): precision = 0.96, recall = 0.94 (Sucipto et al., 10 Sep 2025)
- ResNet-50 classifier: 87–93% test accuracy (Bilik et al., 13 Jun 2024)
- YOLOv8m: mAP@0.5 = 0.97 (Bilik et al., 2022)
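A small computation sketch for the scalar metrics (the mAP pipeline requires a full detection-evaluation harness and is omitted):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def mape(pred, true):
    """Mean Absolute Percentage Error over paired interval counts."""
    return 100.0 * sum(abs(p - t) / t for p, t in zip(pred, true)) / len(true)

print(round(f1(0.96, 0.94), 2))                       # 0.95 (YOLOv4-tiny row)
print(round(mape([95, 110, 40], [100, 100, 50]), 1))  # 11.7 (%)
```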
Public datasets:
- BeeVid: 5,819 RGB frames, YOLO annotation (Sucipto et al., 10 Sep 2025)
- VnPollenBee: 2,051 images, 60,826 boxes (pollen, non-pollen) (Sucipto et al., 10 Sep 2025)
- "BUT1"/"BUT2" (Bilik et al., 13 Jun 2024): >3,000 images, 10,000+ annotations for fine-grained bee behaviors
Jetson Nano and TX2 support real-time inference (up to 25–37 FPS with quantized YOLO models); inference on MCU/TPU platforms (e.g., Coral Edge TPU, GAP9) can reach sub-10 ms latencies with heavily pruned/quantized networks (Sucipto et al., 10 Sep 2025). Accuracy falls under severe occlusion or complex outdoor lighting, especially for unpruned large-class models and at extreme traffic densities (average tracking accuracy, ATA, drops from 0.80 to 0.20 above 50 bees/frame (Bilik et al., 2022)).
5. Applications in Colony Health and Apiary Management
Automated bee traffic monitoring underpins early-warning systems for colony collapse, disease, and swarm prediction by tracking sudden drops or anomalies in ingress/egress patterns and pollen-forager ratios (Narcia-Macias et al., 2023, Sucipto et al., 10 Sep 2025). Metrics such as time-series throughput, net flux, and pollen/bee ratios enable statistical anomaly detection (e.g., z-scoring hourly rates), while cross-modality integration (weight, temperature/humidity sensors) supports more holistic hive-state assessment.
In practical deployments, real-time dashboards report flow counts, speed distributions, and environmental sensor data, with custom alert logic (e.g., drop in daily in-count by >30% triggers warning (Bilik et al., 13 Jun 2024)). Systems such as IntelliBeeHive are scalable (software open-source, hardware <$250 USD, PoE-chained Jetson Nanos, cloud aggregation), extending cost-effective traffic monitoring to both hobbyists and commercial apiaries (Narcia-Macias et al., 2023).
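A minimal sketch of such alert logic, combining the z-score screen and the >30% daily-drop rule; the baseline window and all numbers are illustrative.

```python
import statistics

def hourly_zscore(baseline_rates, current):
    """Z-score of the current hourly in-count against a trailing baseline."""
    mu = statistics.fmean(baseline_rates)
    sigma = statistics.stdev(baseline_rates)
    return (current - mu) / sigma

def daily_drop_alert(today_in, yesterday_in, threshold=0.30):
    """True when the daily in-count drops by more than the threshold."""
    return today_in < (1.0 - threshold) * yesterday_in

baseline = [420, 455, 390, 470, 430, 445, 410]
print(hourly_zscore(baseline, current=210))                # strongly negative
print(daily_drop_alert(today_in=2600, yesterday_in=4100))  # True -> warn
```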
6. Challenges, Trade-Offs, and Future Directions
Principal challenges include:
- Occlusion and clustering: Bee overlaps impair segmentation/detection, requiring channeling (tunnels) or advanced model/data association (OSNet/appearance embeddings) (Bilik et al., 2022).
- Lighting and background nonstationarity: Deep networks gain robustness via transfer learning, domain adaptation, and aggressive on-the-fly augmentation; background subtraction fails in non-tunnel settings (Bilik et al., 13 Jun 2024).
- Energy constraints: Embedded MCUs, event-driven cameras, and duty-cycled processing (vibration/IR sensors triggering camera activation) are critical for off-grid, solar-powered deployments (Sucipto et al., 10 Sep 2025); see the duty-cycle sketch after this list.
- Dataset shift: Models trained on one location can degrade elsewhere due to entrance/hive/bio-geographical variation; adaptive edge learning and standardization initiatives (MLPerf Tiny) are ongoing responses (Sucipto et al., 10 Sep 2025).
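A duty-cycling sketch for the energy-constraints point above; `pir_triggered` is a hypothetical stub standing in for a GPIO read of a vibration/IR trigger line.

```python
import time

def pir_triggered() -> bool:
    """Hypothetical stub: on real hardware this would poll a GPIO pin
    wired to a PIR/vibration/IR trigger."""
    return False

def duty_cycled_capture(burst_seconds=30.0, idle_poll=0.5):
    """Keep camera and detector powered down until a cheap sensor fires,
    then process a short burst of frames (sketch only)."""
    while True:
        if pir_triggered():
            start = time.monotonic()
            # ... power up camera, run detector for the burst window ...
            while time.monotonic() - start < burst_seconds:
                pass  # process frames here
            # ... power camera back down ...
        time.sleep(idle_poll)  # low-power wait between polls
```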
Emerging directions prioritize multi-modal sensor fusion (visual, acoustic, environmental), deployment-specific benchmarking (accuracy/energy/memory), multi-task CNNs (traffic, pollen, Varroa, genus) (Bilik et al., 2022), and real-time streaming anomaly detection with direct, cloud-agnostic farmer notifications.
7. Summary and Comparative Table of Representative Methods
| Reference | Sensing/Model | Key Metrics (F1/Acc) | Deployment Context |
|---|---|---|---|
| IntelliBeeHive (Narcia-Macias et al., 2023) | YOLOv7-tiny + tracking | Bees: 0.95 (F1), 96.3% (acc) | Jetson Nano, low-cost |
| BeesBook (Boenisch et al., 2018) | IR tunnel + marker + SVM/RF | ID error: 1.9% | Hi-res, long-term |
| Ngo2021 (Bilik et al., 2022) | YOLOv3-tiny + Kalman | Pollen: 0.94 | Jetson Nano, real-time |
| "CV+ResNet-50" (Bilik et al., 13 Jun 2024) | Sliding window CNN | 87–93% (class acc) | Tunnel, but low-power |
| YOLOv8m+ByteTrack (Bilik et al., 2022) | YOLOv8m + track | mAP: 0.97 | Full-entrance, GPU |
Bee traffic monitoring is thus a mature but rapidly evolving field, with validated methodologies for high-resolution, automated forager flow assessment now broadly accessible. Continuing research is focused on increased robustness, data-efficient domain transfer, and multi-modal integration to further enhance deployment sustainability and ecological insight.