MunchSonic: Tracking Fine-grained Dietary Actions through Active Acoustic Sensing on Eyeglasses (2405.21004v2)

Published 31 May 2024 in cs.HC and cs.ET

Abstract: We introduce MunchSonic, an AI-powered active acoustic sensing system integrated into eyeglasses to track fine-grained dietary actions. MunchSonic emits inaudible ultrasonic waves from the eyeglass frame; the reflected signals capture detailed positions and movements of the body parts involved in eating, including the mouth, jaw, arms, and hands. These signals are processed by a deep learning pipeline to classify six actions: hand-to-mouth movements for food intake, chewing, drinking, talking, face-hand touching, and other activities (null). In an unconstrained study with 12 participants, MunchSonic achieved a 93.5% macro F1-score in a user-independent evaluation at a 2-second tracking resolution. It also demonstrated effectiveness in detecting eating episodes and estimating food-intake frequency within those episodes.
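The evaluation protocol described in the abstract (classifying non-overlapping 2-second windows and scoring with macro F1) can be sketched as below. This is a minimal illustration, not the authors' code: the function names, the sampling-rate parameter, and the use of non-overlapping windows are assumptions; only the six class labels and the 2-second resolution come from the abstract.

```python
from typing import List, Sequence

# The six dietary-action classes reported in the paper.
CLASSES = ["hand_to_mouth", "chewing", "drinking",
           "talking", "face_hand_touch", "null"]

def sliding_windows(samples: Sequence[float], rate_hz: int,
                    win_s: float = 2.0) -> List[Sequence[float]]:
    """Split a 1-D signal into non-overlapping windows of win_s seconds
    (2 s is the tracking resolution reported in the paper).
    Non-overlapping windows are an assumption for this sketch."""
    step = int(rate_hz * win_s)
    return [samples[i:i + step]
            for i in range(0, len(samples) - step + 1, step)]

def macro_f1(y_true: List[str], y_pred: List[str],
             classes: List[str] = CLASSES) -> float:
    """Macro F1: the unweighted mean of per-class F1 scores, so rare
    classes (e.g. drinking) count as much as frequent ones (e.g. null)."""
    scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)
```

In a user-independent evaluation like the one reported, the classifier would be trained on some participants and `macro_f1` computed on windows from held-out participants.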

Authors (7)
  1. Saif Mahmud (11 papers)
  2. Devansh Agarwal (21 papers)
  3. Ashwin Ajit (2 papers)
  4. Qikang Liang (3 papers)
  5. Thalia Viranda (2 papers)
  6. Cheng Zhang (388 papers)
  7. Francois Guimbretiere (13 papers)
