Hacker News

The pixel density in the Vision Pro is high enough that I don't think this is a significant concern.

The displays are really, really good.




I mean, it is non-Retina for a reason: the pixels are noticeable by eye, so presumably lower density would worsen that.

It is obscured, though, by the softened, out-of-focus presentation; maybe that blurring makes the difference unnoticeable.


Let me put it this way, then: having used a Vision Pro for two weeks (I bought one and returned it) I would gladly take a greater field of view in exchange for a slightly lower pixel density, and it would be a very easy decision.

I had some major issues with the Vision Pro, but pixel density was not one of them.


Narrow FOV means you have to keep your eyes "locked forward", and people don't realize how much they look at things by moving their eyes.


When I did the AVP demo I was impressed by how quickly my eyes would relax and drift away slightly from looking at something after initially focusing on it. It took some conscious effort at first to maintain steady eye focus on an object whilst actuating it with a gesture.


This is, coincidentally, one of my biggest issues with the Vision Pro. I never got used to it. I'd very frequently want to select one thing while also moving my eyes around to look at other things.


You don’t realize how often you click on something you had been looking at (but no longer) until clicking requires a constant gaze.


New addition to the wish list:

• The pinching focus remains on the control you just looked at for a fraction of a second as your eyes zoom off it.

That would be killer.

In retrospect, it would have been vintage Apple to have realized this was an important detail and resolved it already.

Also, widely spreading your fingers within a moment of looking at a control could act as "hovering" with a mouse pointer. Add some kind of glow to the control as an intuitive indicator that it is staying in pinch focus. So you can lock in, still look around, and pinch if you decide to, or just relax the hand to abort.

This ability to lock on last gaze, and pinch/abort, would dramatically free the eyes to continue roaming naturally.
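The lock/hover/abort behavior wished for above can be sketched as a small state machine. This is purely illustrative: the `GazeLock` and `Control` names, the states, and the event handlers are all hypothetical, not anything visionOS actually exposes.

```python
from enum import Enum, auto

class PinchFocus(Enum):
    IDLE = auto()      # no control in pinch focus
    FOCUSED = auto()   # eyes are resting on a control
    LOCKED = auto()    # fingers spread: control glows, eyes free to roam

class Control:
    """Hypothetical UI control; `glowing` is the 'intuitive indicator'."""
    def __init__(self, name):
        self.name = name
        self.glowing = False

class GazeLock:
    def __init__(self):
        self.state = PinchFocus.IDLE
        self.target = None

    def gaze_at(self, control):
        # Looking at a control focuses it, unless another control is locked,
        # in which case the eyes are free to roam without losing the target.
        if self.state is not PinchFocus.LOCKED:
            self.target = control
            self.state = PinchFocus.FOCUSED if control else PinchFocus.IDLE

    def spread_fingers(self):
        # Spreading the fingers while focused locks the control and lights it up.
        if self.state is PinchFocus.FOCUSED and self.target is not None:
            self.state = PinchFocus.LOCKED
            self.target.glowing = True

    def pinch(self):
        # Pinch activates the locked (or currently focused) control.
        activated, self.target = self.target, None
        if activated is not None:
            activated.glowing = False
        self.state = PinchFocus.IDLE
        return activated

    def relax_hand(self):
        # Relaxing the hand aborts the lock without activating anything.
        if self.state is PinchFocus.LOCKED:
            self.target.glowing = False
            self.target = None
            self.state = PinchFocus.IDLE
```

For example, gazing at a control, spreading the fingers, then looking elsewhere leaves the original control glowing and still pinchable.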


> • The pinching focus remains on the control you just looked at for a fraction of a second as your eyes zoom off it.

I'm not convinced this would solve it. What if I really do want to select a different button quickly? Certainly, when I use traditional computers I hate mouse latency. You could add additional pinch controls like you said, but I think it would all continue to be finicky.

I think the problem is that eye tracking is bad, at least as a primary input mechanism.

If I were Apple, I would focus more on hand gestures. The Vision Pro lets you move application windows around with your hands (although you need to initially select them with your eyes), and this always felt much more natural to me than eye tracking. I can imagine a system where you use your hands to manipulate a cursor in space, possibly even a 3D cursor.


I think if you pinch within a fraction of a second after a minimum-duration previous gaze, and before another minimum-duration new gaze, it would work really well.

These minimum gaze times could be quite small; when we move our eyes, they move fast. I am guessing on the order of 10-30 ms, just under our perception of what we are actually doing.

I find myself deciding and starting to pinch, but moving my eyes away at the same time. That is a small window where carry-over gaze would reliably do what I mean without undercutting a follow-up gaze and pinch.
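The timing windows proposed above can be sketched as a small resolver: a pinch lands on the control currently under a settled gaze, or falls back to the control the eyes just left. The constants are the commenter's guesses, not measured values, and all names here are hypothetical.

```python
import time

# Hypothetical timing constants (not Apple's values), following the guess above.
MIN_GAZE_DWELL_S = 0.02   # gaze must rest this long to count as a settled target
CARRY_OVER_S = 0.25       # pinch still hits the last target this long after gaze leaves

class GazeCarryOver:
    """Resolve a pinch to the control the user was (or just was) looking at."""

    def __init__(self, now=time.monotonic):
        self._now = now
        self._target = None       # control currently under the gaze
        self._gaze_start = None
        self._last_target = None  # last target whose gaze was settled
        self._left_at = None      # when the eyes left that target

    def on_gaze(self, control, t=None):
        t = self._now() if t is None else t
        if control != self._target:
            # Eyes moved off the old target; remember it only if it was settled.
            if self._target is not None and t - self._gaze_start >= MIN_GAZE_DWELL_S:
                self._last_target, self._left_at = self._target, t
            self._target, self._gaze_start = control, t

    def on_pinch(self, t=None):
        t = self._now() if t is None else t
        # Prefer the control currently under a settled gaze...
        if self._target is not None and t - self._gaze_start >= MIN_GAZE_DWELL_S:
            return self._target
        # ...otherwise fall back to the one the eyes just left (carry-over window).
        if self._last_target is not None and t - self._left_at <= CARRY_OVER_S:
            return self._last_target
        return None
```

So a pinch that arrives a few milliseconds after the eyes zoom off a button still hits that button, while a pinch after a new gaze has settled hits the new target.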



