It’s my understanding that eye tracking isn’t great as an input method; it should be used more for stuff like rendering or NPC interactions.



I think that eye tracking could perhaps be used to enhance gesture-based input methods, though. It could provide a hint at which object a gesture is directed at in cases where that would otherwise be ambiguous.

I tried eye tracking as a primary input method some years ago in another setting, and I very much did not like the experience.


> I think that eye tracking perhaps could be used to enhance gesture-based input methods

That's how it works on the Vision Pro. I didn't really feel annoyed by it, but make sure the device is calibrated to your eyes, or the results might not be that great.
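
For what it's worth, the app-side code never deals with gaze at all. A minimal SwiftUI sketch of the interaction model (view and variable names are my own, not from any Apple sample) might look like this, assuming the system resolves look-plus-pinch into an ordinary tap and keeps the raw gaze data to itself:

    import SwiftUI

    // Sketch for visionOS: the app declares a tappable control and an optional
    // hover effect. The system highlights whatever the user is looking at and
    // only delivers a tap when they pinch; gaze coordinates never reach the app.
    struct LauncherView: View {
        @State private var status = "Waiting"

        var body: some View {
            VStack(spacing: 24) {
                Text(status)

                Button("Open Document") {
                    // Runs only after look + pinch; no gaze data is visible here.
                    status = "Opened"
                }
                .hoverEffect(.highlight) // drawn out of process by the system
            }
            .padding()
        }
    }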


It's going to be a great, sneaky way of installing an ad blocker onto wetware - users will train themselves to not even gaze at anything resembling an ad, lest they look at it long enough for the headset to register it as a click.


Apps are not able to register where you’re looking. Only the headset itself does.


Sure, but the proof of looking is in the clicking. The threat scenario I'm describing here is "looked at the ad long enough for the headset to interpret it as wanting to click on it".


Does iOS not rely on hovering/mouse-over for alt text or similar? I'm mostly on Windows, but I wonder if they'll eventually concede a second gesture for "put cursor here".


On visionOS the APIs essentially ask all UI elements for "alt text" and the system displays it when appropriate.
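
In SwiftUI terms it's roughly this (a sketch under my own assumptions; identifiers are made up): the app only attaches a label and optional help text to the control, and the system decides if and when to surface them.

    import SwiftUI

    // Sketch: the app declares the "alt text"; the system (or assistive tech)
    // chooses when to present it.
    struct ShareButton: View {
        var body: some View {
            Button {
                // activation handler (look + pinch on visionOS)
            } label: {
                Image(systemName: "square.and.arrow.up")
            }
            .accessibilityLabel("Share")   // spoken / surfaced by the system
            .help("Share this document")   // tooltip-style hint
        }
    }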