When an application such as a password manager requires mouse movements to generate enough entropy to build a secret key, how is that entropy calculated? Is it based on (x, y) coordinates? Given that there are 9 possible positions adjacent to (or equal to) any non-edge pixel (x > 0, y > 0), that's approximately log2(9) ≈ 3.17 bits per pixel. But a user is unlikely to reverse the direction of their mouse after a large movement, so much of that movement is biased and the real entropy per pixel is lower.
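To make the bias concrete, here is a small sketch (all names and the sample data are hypothetical) comparing the uniform log2(9) estimate with the Shannon entropy of an observed stream of (dx, dy) deltas; a biased stream, like a mostly-rightward drag, scores far below 3.17 bits per sample:

```python
import math
from collections import Counter

def naive_bits_per_move() -> float:
    # Each sampled move lands on one of 9 positions (8 neighbors plus
    # staying put), so a uniform model gives log2(9) ≈ 3.17 bits.
    return math.log2(9)

def empirical_bits_per_move(deltas) -> float:
    # Shannon entropy of the observed (dx, dy) deltas; directional bias
    # shows up as a value well below the uniform 3.17 bits.
    counts = Counter(deltas)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical sample: a mostly-rightward drag, heavily biased.
sample = [(1, 0)] * 80 + [(1, 1)] * 10 + [(0, 0)] * 10
print(round(naive_bits_per_move(), 2))        # 3.17
print(round(empirical_bits_per_move(sample), 2))  # 0.92
```

Real estimators are typically even more conservative than this, using min-entropy (based on the single most likely delta) rather than Shannon entropy.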
Is the measurement based on time somehow? After t seconds, n bits are credited? If so, what is the reasoning behind that? Perhaps a combination of time and coordinate deltas is used?
Basically, how does a developer determine when 128 bits of entropy have been accumulated based strictly on mouse movements? Of course, in practice the developer should use the operating system's CSPRNG, but I'm interested in the theory of computer-mouse entropy.
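For context, the scheme I imagine such applications use looks roughly like the following toy sketch (illustrative only, not production crypto; the 2-bits-per-event credit is an assumption I made up): hash every (timestamp, dx, dy) event into a pool and credit each event with a conservative entropy estimate until the running total reaches 128 bits.

```python
import hashlib
import time

class MousePool:
    """Toy entropy pool: hash each mouse event into a SHA-256 state and
    credit a fixed, conservative number of bits per event."""

    TARGET_BITS = 128

    def __init__(self, bits_per_event: float = 2.0):
        # Assumed flat credit per event; a real estimator would derive
        # this from the measured min-entropy of the event stream.
        self.bits_per_event = bits_per_event
        self.credited = 0.0
        self.pool = hashlib.sha256()

    def feed(self, dx: int, dy: int, t_ns: int) -> None:
        # Mix both the coordinate delta and the event timestamp.
        self.pool.update(f"{t_ns}:{dx}:{dy}".encode())
        self.credited += self.bits_per_event

    def ready(self) -> bool:
        return self.credited >= self.TARGET_BITS

    def key(self) -> bytes:
        if not self.ready():
            raise RuntimeError("not enough estimated entropy yet")
        # 256-bit digest assumed to hold at least the credited entropy.
        return self.pool.digest()

pool = MousePool()
events = 0
while not pool.ready():
    # In a real UI these would be actual mouse events; here we just
    # count how many events the 2-bit credit implies are needed.
    pool.feed(1, 0, time.monotonic_ns())
    events += 1
print(events)  # 64 events at 2 bits each to reach 128 bits
```

The open question is exactly how that per-event credit should be justified, which is what I'm asking about.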