Station Framework: Scalable Cosmic Ray Detection
- The Station Framework is a modular, internet-enabled system for cosmic ray detection, integrating flexible detector configurations and real-time data processing.
- It employs programmable trigger logic and precise GPS-timed TDC measurements to ensure accurate event detection and synchronization.
- Its scalable, containerized backend and robust IoT protocols support dynamic network expansion and reliable data storage for scientific campaigns.
The Station Framework, introduced in "A modular and flexible data acquisition system for a cosmic rays detector network" (Saito et al., 2022), is a highly modular, internet-enabled detection and data acquisition system engineered for a scalable network of cosmic ray detector stations. Each station functions as an autonomous, internet-controllable hardware node interfacing with up to four detector planes (eight discriminated channels). The architecture emphasizes flexibility (a variety of detector geometries and orientations on a single baseline design) and robustness, leveraging commodity mixed-signal hardware for precise event timing, environmental monitoring, and cloud-native data operations.
1. Hardware Architecture and Signal Processing
Each station consists of up to four identical frontend detector modules aggregated by a single backend. The backend centralizes power distribution, timing synchronization, data acquisition, trigger definition, and network connectivity. The frontend modules integrate the cosmic ray detection and analog preprocessing stack:
- Detector Plane (Frontend):
- Plastic scintillator slabs (10 mm × 150–400 mm).
- Optical coupling to up to four silicon photomultipliers (SiPM; 3.9 mm × 3.9 mm) via optical grease.
- Two-channel, low-noise amplifiers and fast discriminators. Adjustable SiPM summing for slab size scaling.
- Onboard EEPROM (I²C) for board-ID/inventory management.
- Dual firmware-controlled LEDs for in situ calibration.
- Signal-Shaping and Discrimination:
- Transimpedance/non-inverting amplification for rapid edge formation into the discriminator.
- Discriminator threshold set by onboard DAC (per-channel, controlled via the frontend microcontroller).
- Per-channel fine tuning of SiPM bias voltage.
- Backend Processing Module:
- Input: Up to 4 HDMI-style frontend modules (each: 2 discriminated channels + 2 LED lines).
- Cypress PSoC5LP microcontroller implements analog routing, a user-configurable trigger LUT, event builder, and memory interface.
- Dual TI TDC7200s: one for GPS timestamping (Δt_GPS = t_trigger − t_PPS), one for Time-over-Threshold (ToT = t_fall − t_rise).
- u-Blox Neo-6 GPS receiver (σ_GPS ≈ 30 ns) for absolute event time.
- ESP32 Wi-Fi SoC streams data via MQTT or stores locally on flash if connectivity is unavailable.
- Flexible Geometry:
- Single-layer telescopes, double-area stacks, or arbitrary slab orientations are all supported, enabled by programmable backend logic.
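The user-configurable trigger LUT can be illustrated with a short sketch: the eight discriminator outputs (four frontends × two channels) form an 8-bit address into a 256-entry truth table, so any coincidence condition reduces to one table lookup. The LUT layout and the example condition below are assumptions for illustration, not the actual PSoC5LP implementation.

```python
# Sketch of LUT-based trigger logic (layout is an assumption, not the
# PSoC5LP firmware): the 8-bit channel mask indexes a 256-entry table.

def build_coincidence_lut(condition):
    """Precompute trigger/no-trigger for every 8-bit channel mask."""
    return [bool(condition(mask)) for mask in range(256)]

def plane_coincidence(mask):
    """Example condition: both channels of any one frontend fired."""
    return any((mask >> (2 * fe)) & 0b11 == 0b11 for fe in range(4))

lut = build_coincidence_lut(plane_coincidence)

assert lut[0b00000011]      # frontend 0, both channels -> trigger
assert not lut[0b00000101]  # single channels on two frontends -> no trigger
```

Because the table is data rather than logic, pushing a new 256-entry LUT over the network replaces the trigger condition without reflashing firmware, which is exactly what makes the geometry flexibility above cheap to exploit.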
2. Embedded Firmware Workflows
The PSoC5LP's mixed-signal architecture orchestrates timing, triggering, data acquisition, and buffering:
- Trigger Generation:
- Discriminator outputs multiplexed to PSoC inputs.
- Programmable trigger LUT evaluates the channel pattern and raises an interrupt on a match.
- Data Acquisition Pipeline:
- TDCs measure the trigger-to-GPS-PPS interval (Δt) and the Time-over-Threshold (ToT).
- SPI bus: PSoC retrieves TDC results and packages them as event records (Δt, ToT, channel mask).
- Circular buffer queues outbound event records.
- Timing Performance:
- TDC time resolution σ_TDC ≈ 50 ps; combined system resolution σ_t ≈ 30 ns (dominated by GPS jitter).
- All events timestamped with respect to the prior GPS pulse: Δt = t_detector – t_GPS_pulse.
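The event record and timestamping arithmetic above can be sketched in a few lines. The field widths, byte order, and integer-nanosecond convention are assumptions for illustration, not the framework's actual wire format; only the (Δt, ToT, channel mask) structure and Δt = t_detector − t_GPS_pulse come from the text.

```python
import struct

# Hypothetical packed event record: Δt in ns since the prior GPS PPS
# (uint32; values < 1e9 fit), ToT in ps (uint32), 8-bit channel mask.
# '<' = little-endian, no padding.
EVENT_FMT = "<IIB"

def pack_event(dt_ns, tot_ps, channel_mask):
    return struct.pack(EVENT_FMT, dt_ns, tot_ps, channel_mask)

def unpack_event(payload):
    return struct.unpack(EVENT_FMT, payload)

def absolute_event_time_ns(pps_utc_second, dt_ns):
    """Absolute event time (ns since epoch): PPS second plus TDC offset."""
    return pps_utc_second * 1_000_000_000 + dt_ns

rec = pack_event(dt_ns=123_456_789, tot_ps=18_500, channel_mask=0b00000011)
assert len(rec) == 9
assert unpack_event(rec) == (123_456_789, 18_500, 0b11)
```

Since the PPS marks an integer UTC second, the absolute time is exact up to the quoted σ_t ≈ 30 ns GPS jitter; the 50 ps TDC resolution contributes negligibly.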
3. Environmental and Orientation Sensing
Each frontend is equipped for situational awareness, supporting contextual event analysis:
- IMU Subsystem:
- ICM-20948: 3-axis accelerometer, gyroscope, magnetometer, sampled at 100 Hz (configurable).
- Orientation (pitch, roll, heading) is read at the start of each "Cosmic Block".
- Environmental Measurements:
- BME680: temperature, humidity, barometric pressure at 1–10 Hz.
- Data Fusion:
- At each "Cosmic Block", IMU and BME680 are polled and the results timestamped.
- Environmental/orientation data associated with event-level blocks for offline correlation (e.g., rate dependence on pressure/temperature/plane misalignment).
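One standard offline use of the recorded pressure is the barometric rate correction, R_corr = R_meas · exp(β · (P − P_ref)). The exponential form is the conventional correction; the coefficient value below is purely illustrative, since β must be fitted per detector.

```python
import math

# Illustrative barometric correction of the cosmic ray rate using the
# BME680 pressure readings; BETA_PER_HPA is a placeholder, not a fitted value.
BETA_PER_HPA = 0.002   # detector-specific barometric coefficient, 1/hPa
P_REF_HPA = 1013.25    # reference pressure, hPa

def pressure_corrected_rate(rate_hz, pressure_hpa,
                            beta=BETA_PER_HPA, p_ref=P_REF_HPA):
    return rate_hz * math.exp(beta * (pressure_hpa - p_ref))

# At the reference pressure the correction is the identity; at higher
# pressure the atmosphere absorbs more, so the corrected rate is scaled up.
assert pressure_corrected_rate(1.5, P_REF_HPA) == 1.5
assert pressure_corrected_rate(1.5, 1023.25) > 1.5
```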
4. Software Stack and Data Network Infrastructure
Data, configuration, and meta-information flow through a cloud-oriented, containerized backend that scales to full-network operation:
- Communication Protocol:
- MQTT topics for configuration (station/{id}/config), events (station/{id}/event), and environment (station/{id}/env); QoS level 1 for at-least-once delivery.
- Binary or JSON event payloads structured as (Δt, ToT, channel mask) per event.
- Streaming and Persistence:
- Python off-loader subscribes to all station topics; feeds:
- Redis (real-time dashboards, e.g., Grafana).
- MySQL (permanent storage; tables: stations, events, environmental readings, blocks).
- Server Infrastructure:
- Docker Compose stack: Mosquitto MQTT broker, Redis, MySQL, Python offloader, Grafana frontend, JupyterHub notebooks.
- Stateless and volume-backed services enable seamless horizontal scaling and replication.
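On the off-loader side, routing incoming messages reduces to parsing the topic hierarchy described above. The topic layout (station/{id}/config|event|env) is taken from the text; the router function itself is an illustrative sketch, not the framework's code.

```python
# Sketch of off-loader topic routing for the station/{id}/{kind} hierarchy.

def parse_station_topic(topic):
    """Return (station_id, kind) for a valid station topic, else None."""
    parts = topic.split("/")
    if (len(parts) == 3 and parts[0] == "station"
            and parts[2] in ("config", "event", "env")):
        return parts[1], parts[2]
    return None

assert parse_station_topic("station/42/event") == ("42", "event")
assert parse_station_topic("station/42/env") == ("42", "env")
assert parse_station_topic("other/42/event") is None
```

In practice the subscriber would register a wildcard filter such as station/+/event with the broker at QoS 1 and dispatch each message by the parsed kind (Redis for dashboards, MySQL for permanent storage).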
5. Performance, Scalability, and Modularity
The framework is designed for both scientific reliability and ease of scaling to large deployments:
- Single-Station Performance:
- Typical event rate ≈ 1.5 events/s under 40 cm concrete at 760 m a.s.l.
- Backend processing latency ≲ 100 μs per event.
- Cloud round-trip < 100 ms per record.
- Network Scalability:
- MQTT broker supports thousands of messages/s; each station load ≲ 2 msg/s.
- Redis/MySQL scalability via sharding, read replicas.
- Docker orchestration (Kubernetes/Swarm) for dynamic off-loader/visualization pod deployment.
- Modularity and Upgradability:
- Frontends are hot-swappable and auto-detected by backend.
- LUT-based trigger logic reconfigured via MQTT (minutes, no firmware flash).
- Backend extensible to eight discriminator channels and to Cherenkov detectors via firmware and minor hardware updates.
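Reconfiguring the trigger LUT over MQTT can be sketched as building a config message for a station's config topic. The topic comes from the protocol section; the JSON schema and the "trigger_lut" key are assumptions for illustration only.

```python
import json

# Hypothetical LUT-reconfiguration message for station/{id}/config;
# the "trigger_lut" payload schema is an assumption, not the real format.

def make_trigger_config(station_id, lut_bits):
    """Build (topic, payload) carrying a full 256-entry trigger LUT."""
    if len(lut_bits) != 256:
        raise ValueError("LUT must cover all 256 channel-mask states")
    topic = f"station/{station_id}/config"
    payload = json.dumps({"trigger_lut": [int(b) for b in lut_bits]})
    return topic, payload

# Example: trigger whenever any channel fires (every mask except 0):
topic, payload = make_trigger_config("7", [m != 0 for m in range(256)])
assert topic == "station/7/config"
assert json.loads(payload)["trigger_lut"][0] == 0
assert json.loads(payload)["trigger_lut"][255] == 1
```

Publishing such a message at QoS 1 is what makes the quoted "minutes, no firmware flash" reconfiguration possible: the backend only rewrites a data table.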
| Parameter | Value | Notes |
|---|---|---|
| TDC Resolution | ≈ 50 ps | TI TDC7200 |
| GPS Jitter | ≈ 30 ns | u-Blox Neo-6, 1 PPS |
| Event Throughput | ≈ 1.5 Hz | 40 cm concrete, 760 m a.s.l. |
| Processing Latency | ≲ 100 μs/event | Backend PSoC FIFO to MQTT |
| MQTT Msg Rate | ≲ 2 Hz/station | |
Taken together, the hardware modularity, programmable trigger logic, robust time-stamping, and containerized backend enable rapid expansion of the detector network, easy integration of new detector hardware, and real-time operation and monitoring at scale. The system targets both educational outreach deployments and scientific cosmic ray measurement campaigns, balancing timing and trigger accuracy with robust IoT-style data flows and horizontal backend scalability.