>The eye-tracking-driven input method, which was seen as a holy grail, turns out to be annoying after a while because people don't naturally always look at what they want to click on.
This has always been the case and this technology has been around for a while. I'm surprised Apple would have chosen to use it for user input.
I was curious about this, so I tried surfing the web and watching how I click on things. I am literally unable to click on buttons or text inputs without actually looking at them. If I try to only use the corner of my eye, I miss the buttons 90% of the time. How do people not look at what they are clicking?
My guess is that people actually do look in order to aim their mouse, but by the time they click, their eyes have already moved on to wherever they're looking next.
One reviewer said they kept hitting "No" and "Cancel" when they meant "Yes": their natural progression was to look at the buttons in order, pick out the one they wanted, and only click once they'd finished scanning all of them, expecting their "mouse" to have stayed paused on the one they wanted to hit. It's kind of fascinating that the Vision Pro's input method, when it does work, works so well that it tricks people into doing this; they really do just expect it to work like magic.
Yep, I focused on the play button on a YouTube video to get my mouse there, and then my eyes went right back to the center of the video a fraction of a second before I clicked the mouse button.
Maybe they could "remember" where your eyes were looking a few milliseconds ago and click on that, but then they couldn't tell the difference between that and a case where you actually did mean to click in the center of the content area.
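In code, that lookback idea would just be a short buffer of gaze samples that the pinch handler reads slightly behind real time. Rough Swift sketch, with made-up names and a guessed 150 ms window (none of this is actual visionOS API):

```swift
import Foundation
import CoreGraphics

// One gaze sample from the eye tracker: where the user was looking, and when.
struct GazeSample {
    let point: CGPoint
    let timestamp: TimeInterval
}

final class GazeHistory {
    private var samples: [GazeSample] = []
    private let retention: TimeInterval = 0.5   // keep half a second of gaze data

    func record(_ point: CGPoint, at time: TimeInterval) {
        samples.append(GazeSample(point: point, timestamp: time))
        // Drop anything older than the retention window.
        samples.removeAll { time - $0.timestamp > retention }
    }

    /// When a pinch arrives, resolve the click against where the user was
    /// looking `lookback` seconds earlier, rather than right now.
    func clickTarget(pinchTime: TimeInterval, lookback: TimeInterval = 0.15) -> CGPoint? {
        let cutoff = pinchTime - lookback
        // Most recent sample at or before the cutoff, falling back to the oldest one.
        return samples.last(where: { $0.timestamp <= cutoff })?.point ?? samples.first?.point
    }
}
```

The ambiguity is still there, though: if you really did mean to click where you're looking right now, the lookback guesses wrong.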
The problem is that visionOS has no cursor. Cursor position is critical: your eyes move around a lot more than your cursor does, but in visionOS, cursor position and eye position are the same thing. Very annoying. Maybe they could put a little cursor in the center of the screen that you could move around with another gesture.
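For what that could look like: the cursor keeps its own state and only snaps to the gaze point on a deliberate gesture, so a click always lands where the cursor last stopped. Purely a sketch with invented names, not anything from visionOS:

```swift
import CoreGraphics

// A cursor decoupled from gaze: it moves only on an explicit gesture,
// instead of tracking the eyes continuously.
struct DecoupledCursor {
    var position: CGPoint = .zero

    // Gaze is treated as a suggestion: the cursor jumps to the gaze point
    // only when the user performs a deliberate "move cursor" gesture.
    mutating func moveGesture(toGaze gaze: CGPoint) {
        position = gaze
    }

    // A click always lands on the stable cursor position, no matter where
    // the eyes have wandered since the last move gesture.
    func clickTarget() -> CGPoint {
        position
    }
}
```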
That's if you want to stick with the old mouse paradigm, though. I rather think people will very quickly get used to looking at what they want to click on for slightly longer.
From when I've used eye trackers in the past, the issue is the difference between looking in the general direction of something and using your eye like a joystick that jumps around if you happen to glance a little to the left or right.
Just now I moved the mouse to hover over "Reply" while I was still reading the comment. It was close enough to where my eyes were focusing that I could put the pointer over it, but I never looked at it directly. I've noticed this kind of cueing behavior in screen recordings many, many times over the years. I'm sure most people do it without thinking.