
Wow, I remember when it was announced, the press materials gave the impression that it would be closely integrable with macOS. I was picturing using VS Code from within it. Which I guess I could do if I set up VNC... but I'm not paying thousands for that.

Is an AR productivity tool really so hard? Apple owns the whole stack here. Nintendo can do Mario Kart AR in your living room with an RC car, but I can't get unlimited AR desktops for software development etc?




It does AirPlay from a Mac to the headset. You look at the Mac and then tap the "connect" button that appears in mid air above it. Apparently it's very seamless and has good latency.

What you don't get is Mac windows intermingled with AVP windows; it's just your Mac screen as one window that you can move about. It sounds good though, and there has been a fair amount of movement in screen sharing over the last few macOS releases, which suggests more could be coming here (like multi-screen support).


Yeah, that's not great. I guess it's slightly useful, but what I'm looking for is something more like virtual desktops, with the ability to place each desktop in physical space. Being able to do that with individual windows would be even better, but I'd be fine with the virtual desktops. If the Vision Pro can do AirPlay, I don't know why Apple decided not to do an integration with virtual desktops from macOS. Maybe they will, but not having that at day one is puzzling to me. Without it, it seems we've got yet another AR/VR device that is a glorified tech demo. I really hope it gets better (and less expensive) than what we're currently seeing.


I've been keeping an eye on the Simula One for some time; it does what I think you're looking for [1]. I have to assume that outside of the Apple ecosystem this type of thing is already possible with existing VR headsets, but the closest I've seen is Virtual Desktop [2].

I guess for the Apple ecosystem we've gotta wait for some software updates to make it more useful. Not that I can justify paying my own money for the Vision Pro or Simula One.

[1] https://youtu.be/a3uN7d51Cco?t=16 [2] https://www.vrdesktop.net/


I'm sure this will be possible in future Vision Pro devices; the main issue right now is probably performance. MacBook Pros can already only push 2-3 monitors (albeit at higher resolution), and the Vision Pro is also running its own apps simultaneously alongside the Mac window (I'm curious whether it's the Mac or the Vision Pro doing the compression and other processing work for streaming to the headset). Apple, being obsessive about frame rates and performance, is likely erring on the side of caution before allowing too much. See, for example, the recent downgrade from 1080p to 720p when streaming the Vision Pro screen to other devices.


Yeah, I’d love to use one for PCB design at a cafe, but I don’t even own a Mac. I’m not going to drop $5k for an experience I can already get with my old laptop, just a little bit fancier.


I'm surprised we still don't have an X-like "render on the drawing device" model. Rather than sending pixels, you send a higher-abstraction model from which the UI can be drawn.


Absolutely. This is how RDP works as well. It's one of the few areas where I think the Windows ecosystem has a better approach.


I suspect as hardware encode gets better and wireless bandwidth gets greater there’s less and less merit in this.


You can theoretically serialize draw calls, but that's significantly more complicated than sending over a video stream.
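As a rough sketch of the trade-off, assuming a toy command set (nothing here is a real protocol, and the command names are made up for illustration):

```python
import json

def encode_frame(commands):
    """Serialize a list of high-level draw commands to a wire format."""
    return json.dumps(commands).encode("utf-8")

def decode_frame(payload):
    """The display device decodes and replays the commands locally."""
    return json.loads(payload.decode("utf-8"))

frame = [
    {"op": "fill_rect", "x": 0, "y": 0, "w": 1920, "h": 1080, "color": "#ffffff"},
    {"op": "draw_text", "x": 40, "y": 60, "text": "Hello", "font": "Menlo 13"},
]

payload = encode_frame(frame)
assert decode_frame(payload) == frame

# A few hundred bytes of commands versus ~8 MB of raw 1080p pixels, but
# shared state (fonts, images, clipping) must now be synchronized too,
# which is where the extra complexity comes from.
print(len(payload), "bytes on the wire")
```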


Then you lose the power of a second computing device if all view rendering happens on the AVP.


We still had this (--NSDisplay ?) in the first Developer Previews of Mac OS X.


> but I can't get unlimited AR desktops for software development etc?

You can't because it's computationally impossible. There is simply no computing device that can render unlimited high-res desktops at 60Hz per eye.

Not to mention the need to stream all that data to the headset - since you're not going to put a high-end graphics card needed to even attempt this in anything approaching a wearable form factor. Good luck getting multiple 4k or even just HD streams between your laptop and your AR headset over Wi-Fi.
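A quick back-of-the-envelope check of the bandwidth argument (the bitrates below are rough assumptions, not measured figures):

```python
def raw_mbps(width, height, fps=60, bits_per_pixel=24):
    """Uncompressed video bandwidth in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# One uncompressed 4K stream at 60 fps:
print(round(raw_mbps(3840, 2160)))  # ~12,000 Mbps, i.e. ~12 Gbps

# Even a well-compressed 4K60 stream runs on the order of 50-100 Mbps,
# so several simultaneous desktops would eat a large share of real-world
# Wi-Fi throughput before accounting for latency and retransmits.
```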


To clarify, I should have said virtual "desktops" instead of monitors. I don't even need all the desktops to be active at once, just available in physical space for me to access in a way that's more intuitive and customizable than pressing F2 on my MacBook keyboard and then hovering my cursor over the desktops bar. That would totally suffice. Also, somehow that bar can show what's on each virtual desktop, and in real time, so I know that's at least possible.


> Also, somehow that bar can show what's on each virtual desktop, and in real time, so I know that's at least possible.

That bar shows a small, lower-res preview of each desktop; it is not rendering all the desktops at full 4K resolution. Essentially, the smaller a window appears on the screen, the less effort is needed to draw it.
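As a minimal sketch of that idea (the function name and policy are mine, not Apple's), a compositor could cap each desktop's render resolution at its on-screen footprint:

```python
def preview_render_size(native_w, native_h, onscreen_w, onscreen_h):
    """Render a surface at no more resolution than it occupies on screen."""
    scale = min(onscreen_w / native_w, onscreen_h / native_h, 1.0)
    return (max(1, round(native_w * scale)), max(1, round(native_h * scale)))

# A 4K desktop shown as a 320x180 Mission Control thumbnail:
print(preview_render_size(3840, 2160, 320, 180))  # (320, 180)
# ~58k pixels instead of ~8.3M: roughly a 144x reduction in fill cost.
```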


AVP knows exactly which monitor you are looking at. The Mac doesn't need to render all the monitors at full resolution, just the one being looked at.


True, but apart from video, games, and possibly a browser, they absolutely don’t need to refresh at 60 Hz. I’m surprised they couldn’t somehow work in a “one nice screen, the rest refresh when needed” solution, though maybe that’s just too clunky for Apple.


The rest of the VR industry already has systems that only really put the rendering hardware to use where you're looking. It's called foveated rendering, because your eyes can't even see the detail at the edges of your vision if it were rendered there. It handles all of this very fast and seamlessly.

It's not a stretch that the same thing could be applied to a virtual desktop environment, especially when you have eye tracking.
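A toy sketch of the same idea applied to a desktop environment, with made-up thresholds (real headsets tune these per display and per eye):

```python
import math

def render_scale(gaze, window_center, inner_deg=10.0, outer_deg=30.0):
    """Return a 0.25..1.0 resolution multiplier from angular distance.

    Coordinates are treated as degrees of visual field; windows near the
    gaze point render at full resolution, the periphery at a fraction.
    """
    dx = window_center[0] - gaze[0]
    dy = window_center[1] - gaze[1]
    angle = math.hypot(dx, dy)
    if angle <= inner_deg:
        return 1.0
    if angle >= outer_deg:
        return 0.25
    # Linear falloff between the foveal and peripheral zones.
    t = (angle - inner_deg) / (outer_deg - inner_deg)
    return 1.0 - 0.75 * t

print(render_scale((0, 0), (5, 0)))   # 1.0  (foveal: full resolution)
print(render_scale((0, 0), (40, 0)))  # 0.25 (peripheral: quarter resolution)
```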



