For what it’s worth, Apple has had an attention API (for checking whether the user is looking at / interacting with the device) since the debut of its facial tracking sensors on the iPhone X. Apple makes it very clear it’s not to be used for ads and such, though. For what it’s worth, I don’t know of any developers (or Apple) abusing that API.
Thanks for the question, it actually made me go look for the API. Looks like I misremembered: there aren’t any exposed APIs for developers regarding attention. Internally, iOS uses attention detection for Face ID (checking that you’re looking at the screen) and for keeping the screen lit while you’re reading.
There are APIs that expose the position of the user’s head to developers, but apparently they exclude eye information. It also looks pretty resource-intensive, and it’s mainly meant for AR applications.
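If you’re curious, this is roughly what the ARKit face-tracking route looks like. A minimal sketch, assuming a TrueDepth-equipped device; the `FaceTracker` class name is mine, but `ARFaceTrackingConfiguration` and `ARFaceAnchor` are the real ARKit types:

```swift
import ARKit

class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking only works on devices with the TrueDepth camera
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // face.transform is the head's position and orientation
            // relative to the world coordinate space
            print(face.transform)
        }
    }
}
```

Note that this spins up a full AR session just to get a head transform, which is part of why it’s so resource-hungry compared to whatever iOS uses internally.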
The Face ID / Touch ID API essentially only tells you “authenticated”, “authenticating”, or “unauthenticated”. The prompts / UI are stock iOS and cannot be altered, save showing a reason string.
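Concretely, the whole developer-facing surface is the LocalAuthentication framework: you hand it a reason string, and you get back a boolean. A minimal sketch (the reason string is just an example):

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check whether biometrics (Face ID / Touch ID) are available at all
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(
        .deviceOwnerAuthenticationWithBiometrics,
        localizedReason: "Unlock your notes"  // the only UI text you control
    ) { success, authError in
        // success is all you get: authenticated or not.
        // No face data, no attention data, nothing else leaks out.
    }
}
```

The system draws the entire prompt itself; your app never sees camera frames or any attention signal, just that boolean.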