Bayesian De-quantization and Data Compression for Low-Energy Physiological Signal Telemonitoring (1506.02154v1)

Published 6 Jun 2015 in cs.IT and math.IT

Abstract: We address the issue of applying quantized compressed sensing (CS) to low-energy telemonitoring. So far, few works have studied this problem in applications where signals are only approximately sparse. We propose a two-stage data compressor based on quantized CS: signals are first compressed by CS, and the compressed measurements are then quantized with only 2 bits per measurement. This compressor can greatly reduce the transmission bit-budget. To recover signals from the underdetermined, quantized measurements, we develop a Bayesian de-quantization algorithm. It exploits both a model of the quantization errors and the correlated structure of physiological signals to improve recovery quality. The proposed data compressor and recovery algorithm are validated on a dataset recorded from 12 subjects during fast running. Experimental results showed that an average estimation error of 2.596 beats per minute (BPM) was achieved by jointly using CS with a 50% compression ratio and a 2-bit quantizer. The results imply that we can effectively transmit n bits instead of n samples, which is a substantial improvement for low-energy wireless telemonitoring.
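The two-stage compressor described in the abstract can be illustrated with a minimal sketch: a random sensing matrix halves the sample count (50% compression ratio), and each measurement is then coded with 2 bits, so n samples become n bits in total. The uniform mid-rise quantizer and the Gaussian sensing matrix below are illustrative assumptions; the paper's exact quantizer and matrix design may differ, and the Bayesian de-quantization recovery step is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: compressed sensing with a 50% compression ratio (m = n/2).
n, m = 256, 128
x = np.cumsum(rng.standard_normal(n))           # correlated stand-in for a physiological signal
Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix (assumed design)
y = Phi @ x                                     # m compressed measurements

# Stage 2: quantize each measurement to 2 bits (4 levels).
# A simple uniform quantizer over the measurement range is assumed here.
levels = 4
lo, hi = y.min(), y.max()
step = (hi - lo) / levels
codes = np.clip(((y - lo) / step).astype(int), 0, levels - 1)  # 2-bit codes to transmit
y_hat = lo + (codes + 0.5) * step               # naive de-quantization at the receiver

bits_sent = m * 2                               # 2 bits per measurement
print(bits_sent)                                # n bits in place of n raw samples
```

With m = n/2 measurements at 2 bits each, the bit-budget is exactly n bits, matching the abstract's "n bits instead of n samples" claim; the paper's contribution is recovering x from these coarse codes far better than the naive mid-point de-quantization shown here.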
