Cross-Device Motion Interaction via Apple's Native System Frameworks (2508.01110v1)
Abstract: We introduce an open-source, fully offline pipeline that transforms a consumer-grade iPhone into a motion controller with real-time tactile feedback, using only native Apple frameworks. Designed for rapid prototyping and applied mobile HCI scenarios, the system integrates CoreMotion for inertial sensing, MultipeerConnectivity for peer-to-peer data transmission at 10 Hz, and CoreHaptics for immediate tactile confirmation. A built-in logger captures end-to-end latency without requiring clock synchronization, yielding a mean delay of 70.4 ms and a 95th percentile below 74 ms on a typical 5 GHz Wi-Fi link (-55 dBm RSSI). We validated the pipeline through a real-time demonstrator game, KeepCalm, deployed during a public event with 21 participants. Results showed stable connections, zero packet loss, and negligible power impact (24 mW on an iPhone 13 mini). With fewer than 500 lines of Swift code and no reliance on cloud infrastructure, this system provides a compact, reproducible foundation for embodied interaction research, casual games, and offline educational tools. All source code, latency logs, and provisioning scripts are openly released under the MIT license.
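The abstract names the three frameworks and the 10 Hz sensing rate but not the wiring between them. The following is a minimal Swift sketch of how such a pipeline could be assembled, assuming a JSON-encoded attitude sample, an already-connected MCSession, and a single transient tap as the tactile confirmation; the type names MotionSample and ControllerPipeline are illustrative and not taken from the paper's released code.

```swift
import CoreMotion
import MultipeerConnectivity
import CoreHaptics

// Illustrative payload: one attitude sample, JSON-encoded for readability.
struct MotionSample: Codable {
    let roll: Double
    let pitch: Double
    let yaw: Double
    let timestamp: TimeInterval
}

final class ControllerPipeline {
    private let motionManager = CMMotionManager()
    private let session: MCSession
    private var hapticEngine: CHHapticEngine?

    init(session: MCSession) {
        self.session = session
        hapticEngine = try? CHHapticEngine()
        try? hapticEngine?.start()
    }

    // CoreMotion: sample device attitude at 10 Hz and forward each sample.
    func startStreaming() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 10.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let attitude = motion?.attitude else { return }
            let sample = MotionSample(roll: attitude.roll,
                                      pitch: attitude.pitch,
                                      yaw: attitude.yaw,
                                      timestamp: Date().timeIntervalSince1970)
            self.send(sample)
        }
    }

    // MultipeerConnectivity: unreliable delivery keeps latency low for a 10 Hz stream.
    private func send(_ sample: MotionSample) {
        guard let data = try? JSONEncoder().encode(sample),
              !session.connectedPeers.isEmpty else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }

    // CoreHaptics: a single transient tap as immediate tactile confirmation.
    func playConfirmationTap() {
        guard let engine = hapticEngine else { return }
        let event = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: 0)
        if let pattern = try? CHHapticPattern(events: [event], parameters: []),
           let player = try? engine.makePlayer(with: pattern) {
            try? player.start(atTime: CHHapticTimeImmediate)
        }
    }
}
```

In this sketch, a receiving device would decode MotionSample in its MCSessionDelegate callback and trigger playConfirmationTap() on arrival; the paper's open-source implementation and its clock-free latency logger may structure this differently.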