Hacker News

I see a minor complaint about eye tracking, but people conveniently forget about the status quo. The status quo is holding large objects strapped to your hands and using point-and-shoot to click on items. You literally have to shoot at items as if you were on a shooting range and hope that you hit. It's ridiculous. You can't use standard UIs with that kind of interaction, because everything is way too small.

And while you're strapped to those controllers, you can't use a keyboard or anything else with your hands. You can't scratch an itch or put something in your mouth. Your hands are occupied and you feel like you're in jail. It turns out you actually want to have your hands free.

This is the single most important innovation in Apple Vision Pro. It gives you your hands back.




Only it was already available on other VR/AR platforms. Hand tracking has been available on the Meta Quest platform (albeit not very reliably, I admit) since December 2019.


It's the combination of eye tracking and hand gestures. Your eyes are the primary pointing device on Vision Pro; your hands are used for clicking, scrolling, etc.



