Finger-Level Wrench Measurements
- Finger-level wrench measurement is the direct sensing of forces and torques at individual finger segments through methods like optical, magnetic, and tendon tension sensing.
- It leverages advanced modalities such as LED-PDMS arrays, photo-reflector sensors, Hall-effect taxels, and IMU–EMG fusion to achieve high-resolution, real-time measurements.
- Applications span robotic dexterous manipulation, prosthetics, and wearable interfaces, with performance enhanced via calibration, modeling, and data-driven control.
Finger-level wrench measurement refers to the direct sensing of forces and torques (collectively, the "wrench") applied at the level of a single finger segment in human or robotic hands. Accurate, high-bandwidth wrench measurement at this scale is foundational for dexterous manipulation, tactile skill transfer, prosthetics, and haptic feedback. Methods span tactile skin arrays, embedded optical and magnetic sensors, tendon-tension proportional sensors, and sophisticated wearable fusion systems. Recent work demonstrates compact, high-resolution, real-time solutions suitable for integration into both robot fingers and wearable human–machine interfaces, with sub-Newton error rates and compliance tuning for robust manipulation.
1. Fundamental Principles and Sensor Modalities
Finger-level wrench measurement systems sense external contact-induced actions (forces F and moments τ) using three main physical principles:
- Deformation-Based Sensing: Mechanical displacement in a compliant structure (elastomer, flexure, or skin) is transduced via optical (LED, photoreflector), magnetic (Hall-effect), or strain-gauge elements. Local deformation fields are mapped to 3- or 6-DOF wrench estimates.
- Tendon Tension Sensing: In tendon-driven actuators, tension sensors in series with the tendon infer fingertip force by mapping transmitted force through joint geometry.
- Physiological Indirect Sensing: In human hands, muscle activity (EMG) and inertial measurements (IMU) are fused via deep learning to estimate per-finger force in real time.
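For the tendon-tension case, the mapping from measured tension to fingertip force can be sketched via tendon moment arms and the finger Jacobian; the function below is an illustrative least-squares formulation under stated assumptions (all names and numbers are hypothetical, not from the cited papers):

```python
import numpy as np

def fingertip_force_from_tension(tension, moment_arms, jacobian):
    """Estimate fingertip force from a single tendon tension.

    tension     : measured tendon tension T [N]
    moment_arms : (n_joints,) tendon moment arms r_i [m]
    jacobian    : (3, n_joints) fingertip Jacobian J(q)

    Joint torques are tau_i = r_i * T; static equilibrium gives
    J^T f = tau, solved here in a least-squares sense.
    """
    tau = np.asarray(moment_arms) * tension        # (n_joints,)
    J = np.asarray(jacobian)                       # (3, n_joints)
    f, *_ = np.linalg.lstsq(J.T, tau, rcond=None)  # solve J^T f = tau
    return f

# Square, well-conditioned toy case: three joints, diagonal Jacobian
f = fingertip_force_from_tension(10.0, [0.01, 0.01, 0.01], np.eye(3) * 0.05)
```

In a real tendon-driven finger the Jacobian and moment arms vary with joint configuration, so this mapping would be re-evaluated at each control step.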
A representative comparison of recent finger-level wrench sensors is as follows:
| Sensing Principle | Integration Context | Key Performance Metrics |
|---|---|---|
| Optical/LED-PDMS | Inside robot finger | MAE ≈ 0.05–0.07 N (xyz, 0–2 N); 500 Hz BW |
| Photo-reflector | Tendon path in finger | Resolution 9.9 mN; RMSE 0.455 N; 5 kHz sampling |
| Hall-effect (taxels) | Fingertip (human/robot) | MAE (x,y,z): 0.21, 0.16, 0.44 N; 100 Hz |
| IMU+EMG fusion | Wearable (human) | RMSE ≈ 1 N (force), r ≈ 0.76; 30 ms lag |
2. Mechanical and Sensor Design Architectures
Optical LED-Based Displacement Sensing
A compact, finger-scale six-axis force/torque sensor is constructed using two parallel rigid plates (Ø ≃ 27 mm, h ≃ 20 mm) separated by a transparent polydimethylsiloxane (PDMS) elastomer (10:1 base/curing agent). Each plate supports custom PCBs with 6 LED emitters and 24 LED receivers arranged in three clusters. PDMS serves as a compliant, kinematically constrained six-DOF flexure. Under load, plate displacement δ shifts the alignment of emitter–receiver pairs, modulating receiver intensity ΔI with near-linear ΔI = αδ + β for small δ. No amplification electronics or external optical paths are required (El-Azizi et al., 2024).
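The ΔI = αδ + β model can be calibrated per emitter–receiver pair by ordinary least squares and inverted at runtime; the following is a minimal sketch with synthetic data (function names are illustrative, not the paper's):

```python
import numpy as np

def fit_intensity_model(delta, delta_I):
    """Fit delta_I = alpha * delta + beta for one emitter-receiver pair."""
    A = np.column_stack([delta, np.ones_like(delta)])
    (alpha, beta), *_ = np.linalg.lstsq(A, delta_I, rcond=None)
    return alpha, beta

def displacement_from_intensity(delta_I, alpha, beta):
    """Invert the linear model at runtime."""
    return (delta_I - beta) / alpha

# Synthetic calibration sweep with known alpha = 2.0, beta = 0.1
delta = np.linspace(0.0, 0.5, 50)
dI = 2.0 * delta + 0.1
alpha, beta = fit_intensity_model(delta, dI)
```

In practice the per-pair (α, β) would be fit on a calibration rig with ground-truth displacement, and the 24 inverted displacement signals fed to the downstream wrench estimator.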
Miniature Photo-Reflector Tension Sensor
A 13 mm × 7 mm × 6.5 mm symmetric AL7075-T6 elastic element with flexure hinges and fillets is integrated into the tendon pathway of a robotic finger. A VCNT2020 photo-reflector (IR LED and phototransistor) operates in the 0.2–0.5 mm near-field, outputting a voltage V(δ) fit via a 3rd-order polynomial to the displacement of the elastic element. Mechanical modeling uses Timoshenko beam theory, confirmed by FEM. Assembly requires no adhesives: the elastic element is bolt-clamped through PCB holes. This configuration achieves <0.01 N resolution and sub-1% nonlinearity/hysteresis (Kim et al., 1 Jul 2025).
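The V(δ) polynomial calibration and its runtime inversion can be sketched as follows, restricted to the 0.2–0.5 mm near-field range (the coefficients and helper names are synthetic placeholders, not the paper's values):

```python
import numpy as np

# Synthetic calibration: displacement delta [mm] vs sensor voltage V
delta = np.linspace(0.2, 0.5, 30)                  # near-field range [mm]
V = 1.0 + 4.0 * delta - 3.0 * delta**2 + 0.5 * delta**3

coeffs = np.polyfit(delta, V, deg=3)               # 3rd-order fit of V(delta)

def displacement_from_voltage(v, coeffs, lo=0.2, hi=0.5):
    """Invert V(delta) by locating the real root inside the sensing range."""
    shifted = coeffs.copy()
    shifted[-1] -= v                               # solve V(delta) - v = 0
    roots = np.roots(shifted)
    real = roots.real[np.abs(roots.imag) < 1e-8]
    in_range = real[(real >= lo) & (real <= hi)]
    return float(in_range[0])
```

Because the fitted cubic is monotonic over the near-field window, exactly one root falls in range, so the inversion is well defined at runtime.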
Hall-Effect Taxel Fingertip Arrays
Each FingerTac tactile sensor comprises 20 three-axis Hall-effect ICs (Melexis MLX90393) paired with neodymium magnets embedded in hard-silicone bumps, distributed as taxels across a flexible PCB shell fitted to the fingertip (human/robot). Local deformation moves the magnet relative to the Hall sensor, yielding (h_x, h_y, h_z), which are mapped via a calibrated 2nd-order polynomial to local (f_x, f_y, f_z) force vectors (Sathe et al., 2023).
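The per-taxel 2nd-order polynomial calibration can be fit by ordinary least squares; this sketch allows full coefficient matrices even though cross-terms are reported as negligible (synthetic data, illustrative names):

```python
import numpy as np

def fit_taxel(h, f):
    """Fit the per-taxel 2nd-order map f = W0 + W1 h + W2 h^2.

    h : (N, 3) Hall readings (h_x, h_y, h_z) from a calibration sweep
    f : (N, 3) reference forces from a calibration rig
    """
    A = np.hstack([np.ones((len(h), 1)), h, h**2])  # (N, 7) design matrix
    W, *_ = np.linalg.lstsq(A, f, rcond=None)       # (7, 3) coefficients
    return W

def taxel_force(h, W):
    """Map Hall readings to force vectors with fitted coefficients."""
    A = np.hstack([np.ones((len(h), 1)), h, h**2])
    return A @ W

# Synthetic ground truth: identical per-axis quadratic response
rng = np.random.default_rng(0)
h_cal = rng.normal(size=(200, 3))
f_cal = 0.1 + 0.5 * h_cal + 0.2 * h_cal**2
W = fit_taxel(h_cal, f_cal)
```

Each of the 20 taxels would carry its own W, fitted independently against the rig reference.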
Wearable IMU–EMG Fusion Systems
Wrist2Finger employs a ring with a 9-axis IMU (thumb) and a smartwatch EMG sensor (wrist), streaming 30–55 Hz data via BLE. A dual-branch transformer model fuses kinematic (IMU) and muscular (EMG) signals, outputting both pose (joint angles) and per-finger force estimates. Calibration includes orientation normalization (IMU), rest/max contraction scaling (EMG), and applies losses for pose accuracy, force prediction, smoothness, and physiological saturation (Xiao et al., 5 Oct 2025).
3. Signal Processing, Calibration, and Modeling
Optical and Magnetic Sensor Calibration
- LED-Displacement Sensors: Baseline correction (first 50 samples) is performed, followed by median filtering (45 samples at 500 Hz). Features are normalized to zero mean/unit variance. A supervised feed-forward neural network (3×128 shared ReLU layers; 6 heads per force/torque axis) is trained with Adam (lr=1e–3, batch size=2000, 50 epochs) to minimize mean squared error on (force, torque) ground truth (El-Azizi et al., 2024).
- Photo-Reflector Tension Sensor: A 16-bit ADC digitizes output; force-voltage mapping uses a polynomial fit V(F) during calibration, inverted for runtime force estimation. Zero-force noise is ≈9.9 mN, and nonlinearity/hysteresis are both <1% (Kim et al., 1 Jul 2025).
- FingerTac Taxels: Each taxel is regressed independently using:
$\begin{pmatrix} f_x \\ f_y \\ f_z \end{pmatrix} = W_0 + W_1 \begin{pmatrix} h_x \\ h_y \\ h_z \end{pmatrix} + W_2 \begin{pmatrix} h_x^2 \\ h_y^2 \\ h_z^2 \end{pmatrix}$
where $W_0$, $W_1$, and $W_2$ are per-taxel regression coefficients, with negligible cross-terms. Wrenches are computed by summing force vectors and position-weighted moments from all taxels (Sathe et al., 2023).
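The aggregation of per-taxel forces and position-weighted moments into a net wrench can be sketched as follows (frame conventions and names are illustrative):

```python
import numpy as np

def net_wrench(positions, forces):
    """Aggregate calibrated per-taxel forces into a net fingertip wrench.

    positions : (N, 3) taxel locations r_i relative to the wrench frame [m]
    forces    : (N, 3) per-taxel force vectors f_i [N]

    Returns (F, tau): net force sum(f_i) and net moment sum(r_i x f_i).
    """
    F = forces.sum(axis=0)
    tau = np.cross(positions, forces).sum(axis=0)
    return F, tau

# Single taxel 1 cm above the frame origin, loaded in shear
F, tau = net_wrench(np.array([[0.0, 0.01, 0.0]]),
                    np.array([[1.0, 0.0, 0.0]]))
```

The choice of wrench frame (e.g. the fingertip center) fixes the r_i and hence the reported moments.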
Wearable Sensor Fusion and Learning
IMU features include raw acceleration and orientation rotation matrices. EMG signals are rectified, low-pass filtered (LPF ≈ 5 Hz), min–max scaled by per-user rest/MVC, and embedded via 1D convolution. The transformer network employs cross-modal attention, MLP fusion, and a biomechanically informed multi-term loss (pose, force, kinematic smoothness, force saturation). Final force outputs are mapped to absolute physical scale using per-user calibration (Xiao et al., 5 Oct 2025).
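The EMG conditioning chain (rectify, low-pass, rest/MVC scaling) can be sketched with a first-order IIR envelope standing in for the paper's unspecified filter (all parameters illustrative):

```python
import numpy as np

def preprocess_emg(emg, fs=50.0, fc=5.0, rest=0.0, mvc=1.0):
    """Rectify, low-pass, and rest/MVC-normalize one EMG channel.

    fs, fc     : sampling rate and low-pass cutoff [Hz]
    rest, mvc  : per-user calibration values (rest level, max contraction)
    """
    x = np.abs(emg)                              # full-wave rectification
    alpha = 1.0 - np.exp(-2.0 * np.pi * fc / fs)
    env = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc += alpha * (v - acc)                 # first-order IIR envelope
        env[i] = acc
    # Scale to [0, 1] between rest level and maximum voluntary contraction
    return np.clip((env - rest) / (mvc - rest + 1e-8), 0.0, 1.0)
```

The normalized envelope, rather than the raw signal, is what the per-user min–max scaling makes comparable across sessions and wearers.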
4. Performance Evaluation and Metrics
Performance metrics reflect accuracy, dynamic range, bandwidth, and robustness.
| Sensor Type | Force Accuracy (MAE/RMSE) | Dynamic Range | Bandwidth / Latency |
|---|---|---|---|
| LED-PDMS optical | 0.05–0.07 N (xyz, MAE) | 0–2 N (tuned), up to 10 N | 2.5 kHz sampling, 500 Hz log |
| Photo-reflector tension | 9.9 mN (resolution), 0.455 N (RMSE) | 0–200 N | Up to 5 kHz sampling |
| Hall array (FingerTac) | 0.16–0.44 N (MAE), 0.21–0.52 N (RMSE) | ±6 N (normal), ±2 N (shear) | 100 Hz |
| IMU–EMG fusion | ≈1 N (force RMSE), r ≈ 0.76 | ≈0–25 N (calibrated) | 8–29 ms (real-time) |
LED-based and photo-reflector designs achieve high accuracy and fast response, with the latter offering superior resolution in tension-driven contexts. Hall-effect arrays are capable of distributed 3D force mapping with low per-axis errors. IMU–EMG fusion yields real-time, per-finger force prediction suitable for wearable applications, with lower absolute accuracy compared to embedded approaches but significant utility for interaction, VR/AR, and prosthetics.
5. Integration Strategies and Application Contexts
Robotic Hands
- Embedded Optical/Mechanical Sensors: LED-PDMS and photo-reflector sensors are dimensioned for finger-distal segment integration, using FFC or CAN-FD for electronics, and are suited for multi-finger and underactuated hands (El-Azizi et al., 2024, Kim et al., 1 Jul 2025).
- Taxel-Based Tactile Skins: Hall-effect arrays are fitted as fingertip shells, providing full 3D contact information for manipulation skill transfer, haptic teleoperation, and grasp controller feedback (Sathe et al., 2023).
- Tendon-Driven Actuators: Photo-reflector tension sensors, aligned with the tendon, output high-fidelity tension data for direct mapping to fingertip force via kinematic parameters.
Wearable and Human-In-the-Loop Systems
- IMU–EMG Fusion Wearables: Minimal hardware—one ring + one EMG channel—enables real-time finger force estimation and pose tracking. Applications include VR/AR hand control, ergonomic assessment, and low-intrusion prosthetics (Xiao et al., 5 Oct 2025).
- Interchangeable Fingertip Sensors: FingerTac modules can be swapped between human and robot, facilitating direct measurement and transfer of human tactile strategies for robotic learning and feedback.
6. Limitations and Prospective Developments
- Axis Sensitivity and Cross-Talk: In LED-based optical designs, z-axis force and torque sensitivity depend critically on emitter–receiver geometry and may require further geometric or material optimization. Reflective coatings and refined placement can amplify out-of-plane responses (El-Azizi et al., 2024).
- Hysteresis and Drift: Elastomer-based devices (PDMS, flexure) show hysteresis and mechanical settling delays; active drift compensation or higher-frequency demodulation is proposed.
- Wearable-Specific Issues: EMG-based approaches display user, position, and context dependence, necessitating per-user calibration and potential meta-learning. Multi-channel/extra-ring extensions would improve performance for digits with low baseline signal (e.g., little finger) (Xiao et al., 5 Oct 2025).
- Bandwidth Constraints: Photo-reflector tension sensors can sample at 5 kHz, but actuation system compliance may impose lower force-control bandwidths (~10 Hz observed in TSA/PI experiments) (Kim et al., 1 Jul 2025).
- Assembly and Integration: Hall-effect arrays and photo-reflector systems are designed for ease of installation (bolted, snap-fit), with low cost and minimal adhesive requirement; however, adapting them to highly miniaturized or strongly anthropomorphic hands may require further size reduction.
- Data-Driven Controllers: Direct embedding of model inference (e.g., neural network for optical sensors) on hardware MCUs enables closed-loop, high-bandwidth grasp control and tactile exploration applications (El-Azizi et al., 2024).
7. Comparative Evaluation and Future Directions
Recent research demonstrates a diverse range of finger-level wrench measurement architectures optimized for form factor, bandwidth, and integration:
- LED-displacement sensors achieve full 6-axis measurement with sub-0.1 N error and no amplification electronics, suitable for dense finger integration (El-Azizi et al., 2024).
- Photo-reflector tension sensors provide high-resolution, low-drift tension-to-force mapping for real-time force feedback in tendon-driven hands (Kim et al., 1 Jul 2025).
- Hall-effect taxel arrays offer distributed, human–robot-interchangeable 3-axis force readings for dexterous skill transfer and multimodal manipulation (Sathe et al., 2023).
- Wearable IMU–EMG systems enable per-finger force estimation in socially acceptable packages with sub-cm pose estimation for AR/VR and prosthetic scenarios (Xiao et al., 5 Oct 2025).
A plausible implication is a trend toward fusion—combining embedded high-bandwidth physical sensors with data-driven wearable systems for end-to-end manipulation pipelines in human–robot interactive contexts. Future work will likely address axis decoupling, adaptive learning across users/contexts, and the development of standardized interfaces for robotic hand sensor integration.
References:
- (El-Azizi et al., 2024) Compact LED-Based Displacement Sensing for Robot Fingers
- (Xiao et al., 5 Oct 2025) Wrist2Finger: Sensing Fingertip Force for Force-Aware Hand Interaction with a Ring-Watch Wearable
- (Kim et al., 1 Jul 2025) A Miniature High-Resolution Tension Sensor Based on a Photo-Reflector for Robotic Hands and Grippers
- (Sathe et al., 2023) FingerTac -- An Interchangeable and Wearable Tactile Sensor for the Fingertips of Human and Robot Hands