Extracting Digital Biomarkers for Unobtrusive Stress State Screening from Multimodal Wearable Data (2303.04484v1)

Published 8 Mar 2023 in cs.LG, cs.HC, and eess.SP

Abstract: With the development of wearable technologies, a new kind of healthcare data has become valuable as medical information. These data provide meaningful information about an individual's physiological and psychological states, such as activity level, mood, stress, and cognitive health. These biomarkers are called digital because they are collected from digital devices integrated with various sensors. In this study, we explore digital biomarkers related to stress by examining data collected from mobile phones and smartwatches. We apply machine learning, specifically Random Forest, to the Tesserae dataset to extract stress biomarkers. Using feature selection techniques, we draw on weather, activity, heart rate (HR), stress, sleep, and location (work-home) measurements from wearables to determine the most important stress-related biomarkers. We contribute to the interpretation of stress biomarkers using a wide range of features from different devices. In addition, we classify the $5$ different stress levels with the most important features, and our results show that we can achieve $85\%$ overall class accuracy by adjusting for class imbalance and adding extra features related to personality characteristics. We achieve similar and even better results in recognizing stress states with digital biomarkers in a daily-life scenario, while targeting a higher number of classes than related studies.
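The abstract describes a pipeline of feature selection, class-imbalance adjustment, and Random Forest classification over multimodal wearable features. The sketch below illustrates one way such a pipeline could look in scikit-learn; the file name, column names, and the imbalance strategy (class weighting rather than resampling) are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative sketch only: Random Forest multi-class stress classification
# from wearable-derived features, with importance-based feature selection
# and class-imbalance handling. Column names and file path are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report

# Hypothetical feature table: one row per participant-day with sensor features
# (HR, activity, sleep, weather, work-home location) plus personality traits,
# and a 5-level stress label in the "stress_level" column.
df = pd.read_csv("tesserae_features.csv")  # placeholder path
X = df.drop(columns=["stress_level"])
y = df["stress_level"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Address class imbalance with class weighting (resampling, e.g. SMOTE,
# would be an alternative choice).
rf = RandomForestClassifier(
    n_estimators=500, class_weight="balanced", random_state=42
)

# Importance-based feature selection: keep features whose importance exceeds
# the median, then refit the classifier on the reduced feature set.
selector = SelectFromModel(rf, threshold="median").fit(X_train, y_train)
X_train_sel = selector.transform(X_train)
X_test_sel = selector.transform(X_test)

rf_final = RandomForestClassifier(
    n_estimators=500, class_weight="balanced", random_state=42
).fit(X_train_sel, y_train)

y_pred = rf_final.predict(X_test_sel)
print("Selected features:", list(X.columns[selector.get_support()]))
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))
```

The selected-feature list printed at the end stands in for the "most important stress-related biomarkers" the abstract refers to; the per-class report helps check that the imbalance handling does not simply favor the majority stress level.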

Authors (2)
  1. Berrenur Saylam (3 papers)
  2. Özlem Durmaz İncel (4 papers)
Citations (1)
