Log2Motion debuted at the CHI 2026 conference and is already being discussed as a candidate standard for evaluating the usability of mobile app interfaces.
The system pairs a neural network model with a digital "twin" of the hand containing 63 muscle nodes. It models the load on the fingers during typical actions: scrolling feeds, taps, and swipes. The **Screen Mirror** function replays user actions in real Android apps, capturing every micro-movement and even input errors.
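The article does not describe Log2Motion's internals, so the following is only an illustrative sketch of the general idea: replaying a log of touch events and accumulating a per-finger load score, where faster movements cost more. The `TouchEvent` type, the cost formula, and the constants are all hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    t: float       # timestamp in seconds
    x: float       # screen coordinates in pixels
    y: float
    finger: str    # which digit performed the action

def estimate_load(events):
    """Accumulate a crude per-finger load score from a touch log.

    Each event adds a base cost of 1.0, plus a term proportional to
    the movement speed since that finger's previous event (a stand-in
    for the biomechanical load a real muscle model would compute).
    """
    load: dict[str, float] = {}
    last: dict[str, TouchEvent] = {}
    for e in sorted(events, key=lambda e: e.t):
        cost = 1.0
        prev = last.get(e.finger)
        if prev is not None:
            dt = max(e.t - prev.t, 1e-3)
            dist = ((e.x - prev.x) ** 2 + (e.y - prev.y) ** 2) ** 0.5
            cost += (dist / dt) / 1000.0  # faster swipes cost more
        load[e.finger] = load.get(e.finger, 0.0) + cost
        last[e.finger] = e
    return load

# Example: a thumb swipe of 500 px in 0.5 s
log = [TouchEvent(0.0, 0.0, 0.0, "thumb"),
       TouchEvent(0.5, 0.0, 500.0, "thumb")]
scores = estimate_load(log)  # {"thumb": 3.0}
```

A real system would replace the speed heuristic with forces predicted by the muscle model, but the replay-and-accumulate loop over logged events would look similar.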
The technology detects the kind of overstrain in the wrists and fingers that is characteristic of prolonged gadget use.
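How overstrain is flagged is not specified in the article; a minimal sketch, assuming load scores like those above and an arbitrary threshold, might simply report the digits whose accumulated load exceeds a safe level:

```python
def flag_overstrain(load_scores, threshold=50.0):
    """Return the fingers whose accumulated load score exceeds the
    (hypothetical) safety threshold, sorted for stable output."""
    return sorted(f for f, s in load_scores.items() if s > threshold)

# Example: only the thumb is over the illustrative limit
flag_overstrain({"thumb": 80.0, "index": 10.0})  # ["thumb"]
```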
Log2Motion will help developers create ergonomic interfaces, reducing the risk of chronic injuries such as tendinitis and carpal tunnel syndrome. A revolution in UX design is expected: buttons, gestures, and scrolling will be optimized for hand biomechanics.
Smartphones have become an integral part of daily life, yet their impact on hand health remains underestimated.