
Annotating sleep states in children from wrist-worn accelerometer data using Machine Learning (2312.07561v1)

Published 9 Dec 2023 in eess.SP, cs.CV, cs.CY, and cs.LG

Abstract: Sleep detection and annotation are crucial for researchers to understand sleep patterns, especially in children. Modern wrist-worn watches with built-in accelerometers make it possible to collect sleep logs, but annotating these logs into distinct sleep events (onset and wakeup) is challenging: the annotations must be automated, precise, and scalable. We propose to model the accelerometer data using different ML techniques such as support vector machines, boosting, ensemble methods, and more complex approaches involving LSTMs and region-based CNNs. We then aim to evaluate these approaches using the Event Detection Average Precision (EDAP) score (similar to the IoU metric) to compare their predictive power and model performance.
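
The abstract describes the pipeline only at a high level. As a rough illustration of the idea, the sketch below builds windowed features from a wrist-accelerometer signal, fits a random-forest sleep/wake classifier (one of the ensemble baselines mentioned), and matches predicted onset/wakeup events to ground truth within a time tolerance, in the spirit of the EDAP metric. The column name `enmo`, the window size, and the tolerance are illustrative assumptions, not details taken from the paper.

```python
"""Minimal sketch (not the authors' code) of a wrist-accelerometer
sleep/wake baseline plus a simplified tolerance-based event match."""
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def windowed_features(df: pd.DataFrame, window: str = "5min") -> pd.DataFrame:
    # df is assumed to have a datetime index and an "enmo" activity column.
    feats = df["enmo"].resample(window).agg(["mean", "std", "min", "max"])
    return feats.fillna(0.0)

def fit_sleep_classifier(features: pd.DataFrame, labels: np.ndarray):
    # labels: 1 = asleep, 0 = awake, one label per window.
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(features.values, labels)
    return clf

def match_events(pred_times, true_times, tolerance_s: int = 300) -> int:
    # Greedily match predicted onset/wakeup timestamps to ground truth
    # within a tolerance; returns the number of true positives.
    remaining = list(true_times)
    tp = 0
    for p in pred_times:
        hits = [t for t in remaining
                if abs((p - t).total_seconds()) <= tolerance_s]
        if hits:
            remaining.remove(min(hits, key=lambda t: abs((p - t).total_seconds())))
            tp += 1
    return tp
```

Precision and recall then follow from the matched count (tp / number of predictions and tp / number of true events); the full EDAP score additionally sweeps prediction confidences and tolerances before averaging precision, which is omitted here for brevity.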


