PSVR2 hardware authentication has been cracked on PC (neogaf.com)
119 points by haunter on May 8, 2023 | 81 comments


One of the more interesting parts of the PSVR2 is that it shows the PS5 can output two different video streams independently (HDMI and the front USB-C port), something no other console can do. The PSVR1 made do with a passthrough hack.

Imagine the possibilities if Microsoft had actually added multiple HDMI ports on the Xbox Series X:

- screen mirroring for capture cards without passthrough

- native single console multiscreen cockpit view modes for Forza

- split-screen Halo/GoldenEye without actually splitting the screen - a LAN party with one console

- minimaps and HUDs on separate screens, like the Wii U

- Discord / chat on a separate screen while streaming

- support for the ecosystem of actually sometimes quite high quality Windows Mixed Reality VR headsets

I wonder if they could technically add more video out ports via the PCI-E based expansion card slot...


Imagine they did and now you have to pay $50 extra to never use one of these niche and gimmicky possibilities


The original PS3 demoed at E3 2005 had 2 HDMI ports and I suspect they removed it for the same reason. Extra cost and little value.


The Raspberry Pi 4 has dual HDMI and it costs $35 for the entire board. Cost is absolutely NOT the factor for an extra port. Extra ports and extended monitor support are literal pennies.

Ignoring HDCP for the secondary port, it would cost an additional $0.27 to add a secondary hardware port along with whatever cost for the software, which still wouldn't even add up to $5, let alone $50.


I don't know how you see the prospect of saving $27 million and say that cost isn't a factor.
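For scale, the implied arithmetic (assuming roughly 100 million lifetime consoles, which is what turns $0.27 into $27M; the sales figure is my assumption, not the commenter's):

    # Back-of-envelope BOM savings using the thread's numbers.
    unit_cost = 0.27             # claimed extra hardware cost per console
    units = 100_000_000          # assumed lifetime sales, roughly PS4-scale
    print(f"${unit_cost * units:,.0f}")  # -> $27,000,000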


Pretty much all PCs support multiple video outputs. It's not a gimmick at all.


True - but I can't think of a single game which actually makes use of multiple monitors even on PC where plenty of people have the hardware to support that sort of thing. A second monitor with Discord on it, or for streaming? Sure. But nothing that video games themselves make use of.

Even if games supported it, who has multiple TVs in their lounge room close enough to connect both of them to their Xbox? That sounds super niche.


Supreme Commander did it all the way back in 2007: you could have a second independent viewport on the second monitor to keep an eye on a different part of the map. IIRC it had to jump through some weird hoops to pull that off with the graphics APIs of the time; it spawned a second instance of the game which acted as a spectator client for the primary instance.



Simracers use triples quite often! Alignment is a pain.


Gran Turismo 5 supported a triple monitor setup, but you needed three PS3s and three copies of the game.


Gran Turismo 4 had that too, all you needed was three PS2s, three network adapters, three TVs and three copies of the game. The feature was mainly intended for installations at E3 and the like but it was left in the retail build so anyone dedicated enough could set it up at home.

https://www.youtube.com/watch?v=wHM42OcnX_E


That is cool, I didn’t know that! PC simracers just use wide viewports, I think.


Live For Speed renders from multiple cameras - one camera per monitor - to prevent distortions. It can render a 360 degree view if you have enough monitors. Can't do that with a single viewport.


Perspective projection only works for FOV < 180°. At FOV = 180° the projected coordinate would have to be +inf, so games that support FOVs beyond 180° stitch together multiple viewports, either on one display or across multiple displays.
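To make that concrete, a quick sketch of the pinhole-projection math (mine, not from the comment): the image-plane half-width grows as tan(FOV/2), which diverges at 180°, so wide rigs render several narrower frusta instead.

    import math

    # Image-plane half-width at unit focal distance: w = tan(fov / 2).
    # It blows up as the field of view approaches 180 degrees.
    for fov in (90, 120, 150, 170, 179, 179.9):
        print(f"{fov:6.1f} deg -> half-width {math.tan(math.radians(fov) / 2):10.1f}")

    # Stitching instead: cover 360 degrees with three 120-degree viewports,
    # each rendered by its own camera yawed to a different heading.
    yaws = [-120, 0, 120]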


I really want to try that one, it sounds like an amazing game.


I figured out what you meant eventually, but when I first read this comment I thought you meant paying $50 extra if you didn't want these features.


I think that about the HDMI In port every time I see it on my OG Xbox One.


Is that why modern GPUs are so costly? Half a dozen unused connections?




They’re expensive simply because people will pay that much.


I remember reading about a technology that used the 3D functionality of TVs, along with 3D glasses, to display two separate video feeds to two separate users - split screen, but without splitting the screen.

Instead of the video feed alternating between left and right images, it alternates between first-player and second-player feeds. The glasses, instead of alternating between left-open/right-closed and vice versa, alternate between both-open/both-closed.
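A minimal sketch of that schedule (my own naming, not any shipping product's API):

    from itertools import cycle

    # Per-vsync schedule: (frame on screen, player 1 glasses, player 2 glasses).
    # In ordinary stereo 3D the two shutter columns would be left eye / right eye.
    dual_view = cycle([
        ("player 1 frame", "both open",   "both closed"),
        ("player 2 frame", "both closed", "both open"),
    ])

    for _ in range(4):
        print(next(dual_view))

Each player effectively sees the panel at half its refresh rate, the same trade-off frame-sequential 3D makes per eye.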

I'm not sure how audio would be handled in this case, and I don't know if anyone actually really supported it in practice, but it was a pretty neat idea.


You’re describing the gaming PC. They’re quite popular is my understanding.


Not every game is on PC (e.g. the NHL series), and for some popular games that are, the PC version isn't the best one available (e.g. The Last of Us).


NHL 97 is on the PC and that’s all anyone should ever need. :)


Plus sometimes I don't want to develop a high-dimensional regression model for how the settings affect the FPS, or deal with all this shader compilation stutter.

It can be a lot of fun to squeeze out every frame you can, but other times I just want to play a game.


Atomic Heart had the best shader compile I’ve seen yet. It happened with a loading bar on the main menu and didn’t stop you from exploring settings. IMO the compile should be part of the install step outside the game though


Setting up shader precomp is apparently a manual process that can be really developer intensive, especially in open world titles. That said, I agree, there is a lot of low hanging fruit that can be handled. Devs seem to be deciding between all or nothing and often just pick the nothing path.


There is never a reason to disable the main menu during shader compilation.

Just because everyone is doing it does not mean it was ever a good idea.


The newest Call of Duty does that as well


You could just leave it on the defaults? You'll get better FPS than on console and higher quality graphics, assuming a decent gaming PC.


You still have to manage the PC and games. My console has never let me down when I want to relax. I do this for a day job; I prefer not to fuck with troubleshooting at night.


> My console has never let me down when I want to relax.

You've never run into the 'must update the system firmware and download a 400TB patch to play this game you played yesterday' thing, I guess.


Nope, my PS5 and previous PS4 both update(d) themselves while asleep at night. I only have to update games I haven’t played in forever. My internet is fast enough that I don’t care about that either.

When I had a gaming PC a few years ago, it always wanted to update something in Steam or Windows and then restart.


My gaming PC always does that while asleep at night...


The defaults can be a double-edged sword. Places like DF put out optimized settings for really mainstream configs for just that reason.

If you have a 3080 Ti or 4090, sure, you have paid to be able to just push play and expect stuff to work (well, that still doesn't solve the compilation issues), but even then you are leaving a lot of performance / fidelity on the table.

However, if you are even somewhat underpowered (think of all the people that bought a 6500 XT or 3050 because of the GPU shortage), you are going to have to do some kind of manual tuning. According to the April Steam survey, most users run something worse than a 3060, and a large fraction of those run something far worse.

Plus the CPU side isn't always a solved issue. I play on a 64-core Epyc I primarily use for simulation. I have plenty of power on tap, but the lower clocks plus the NUMA arch force me to do some fiddling. And I'm spoiled on that front; over half of gamers still play on 4- and 6-core machines (again, see the Steam hardware survey for April) that choke GPUs very quickly.

A $300 Series S may not show the crispest image, but I often use mine to avoid the whole song and dance.


Depends on the game. My 5800x/3080 Ti doesn’t do much better at maintaining a stable 60fps in Jedi Survivor than the PS5 does. Recent triple-A PC releases have been super bad.


Unless you play on an HDR/OLED TV, in which case the better colors add more quality than any graphics-setting difference.


Oh, you mean like the QD-OLED monitor hooked to my PC?


PC HDR sucks a lot compared to consoles.


An extra $10 BOM cost that results in a $25 price premium for a gaming PC is no big deal. That same $25 would have a huge impact on console sales.


Sadly the lead dev has stepped away from this. https://twitter.com/iVRy_VR/status/1655554037650006017?s=20



I struggle to follow these threads. I have no idea what's going on.


Two tips for following Twitter threads:

1) Only ever use it from a desktop browser with a sufficiently configured ad blocker. The mobile apps are intentionally designed to pull you out of the thread with only a single line of text delineating where it does so and no way to actually continue with the thread. It's a miserable experience and the people responsible should be ashamed of themselves. They try to pull the same bullshit on the website but it's more obvious there.

2) Twitter has two reply modes. Direct replies are put under the post they're replying to, much like normal linear forum discussions. But then there are quote tweets. They create a new thread, quoting the previous thread in the topic starter, so to speak... and unlike the standard linear model, the topic starter is top-posted, with the quoted bit below.


I think it's part stress of the project and part stress of the spotlight. I would imagine the developer was stressing over how to manage requesting donations and then all the feedback (good and bad) that they were receiving.


I could see how he was getting tired of people pestering him, but then I managed to navigate to the author's profile, and there's a pinned tweet where he's selling this thing as a product for $20:

> Use your Sony #PSVR, Oculus, Pico or Daydream Headset, or Smartphone as a #SteamVR #VR Headset for your PC. Evaluate for free, use In-App Purchase or Steam DLC for "Premium Edition"

If he doesn't want to deal with customers, maybe he shouldn't be selling this as a product

That said, there's nothing wrong with discontinuing the product if he doesn't feel like working on it any more


To clarify, he isn't selling any of the work that he's doing on the PSVR2 currently. He had a donation page for that work that he has since removed after some minor criticisms.

The advertisement is for the iVRy Driver that he developed for some other hardware options.

The work he's done on reversing the PSVR2 is unreleased and closed source for now.


> there's a pinned tweet where he's selling this thing as a product for $20

That's not what they are selling. 'This thing' is the work on the PSVR2, not the PSVR; the original PSVR is one of the headsets supported by his existing, available-now product.


Yeah, I went straight to their profile and found a thread that made sense. Even if they wanted a little money, the thankless work of being a hacker/dev/whatever on a highly anticipated project sucks and drove this guy to take a break. Good on him.


Hopefully Sony won't get overly excited and get the developer squashed like they did after the PS3 got cracked last decade.


Isn't one of the major barriers that you'd have to reimplement the entire tracking code for PC? Or does that actually run on the headset independently and not on the PS5?


I'm pretty sure that runs on the headset.

If you plug a headset into a VirtualLink adapter, it will go into the "cinematic mode" and function as a floating 1080p monitor, with 3DoF tracking done by the headset.

https://twitter.com/Thrilluwu/status/1635374573368778752?s=2...


The difference between the headset doing 3DoF by default (expected for something like timewarp, which only needs accelerometers) and getting an API that parses the camera feed for 6DoF head and controller tracking, then wiring that up to OpenXR, seems like a huge leap.


My understanding is that this wasn't really the issue; the main problem for PC use is that you'd need a VirtualLink adaptor, and that will hold people back from doing much work on this.


The PSVR2 turned out not to actually use VirtualLink, just more-or-less standard DisplayPort + USB 3 + USB PD over USB-C. Most of the compatibility issues seem to be from it requiring parts of the spec that aren't widely implemented at the same time, like 12V support and display compression. The main open question is how to actually switch it into VR mode so that it's possible to send a stereo image instead of 2D virtual cinema mode, and I don't think anyone is quite sure whether this helps with that at the moment.
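For anyone curious where poking at that open question starts, a hedged sketch using pyusb (Sony's USB vendor ID is 0x054C; the PSVR2's product ID and any actual mode-switch request are unknown here, so this only dumps descriptors):

    import usb.core  # pyusb, backed by libusb

    # List Sony USB devices and their interfaces. If a VR-mode switch exists
    # as a vendor-specific control transfer, it would hang off one of these
    # interfaces -- but that's exactly the open question.
    for dev in usb.core.find(find_all=True, idVendor=0x054C):
        print(f"device {dev.idVendor:04x}:{dev.idProduct:04x}")
        for cfg in dev:
            for intf in cfg:
                print(f"  interface {intf.bInterfaceNumber}, "
                      f"class 0x{intf.bInterfaceClass:02x}")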


> USB PD

USB Power Delivery (USB PD)

In case anybody else also wasn't familiar


VirtualLink isn't that exotic, just another USB-C Alt mode that has 4xDP lanes with USB3 signaling pushed over the USB2 pins. There's likely plenty of ICs on the market that could do it, just few products that implement the feature because there's zero demand.

If someone gets the PSVR2 working on PC, there would be incentive among enthusiasts to create at least hobbyist-grade hardware.


That and the fact that the PSVR2 uses camera-based SLAM tracking, similar to the Oculus Quest, and reimplementing that from scratch is non-trivial to say the least.


There's already a mostly-working open source SLAM implementation that people have been using on a few of the existing headsets. What complicates matters here is that at least feature extraction seems to run on the PSVR2 headset itself so the existing code can't just be dropped in as-is. (Also, whilst the controllers work in basically the same way as many other VR controllers using a constellation of IR LEDs, there's not really good open source support for any of those controllers either.)


Isn't the SLAM tracking processing done on the headset itself and then the computed movement data sent to the PC/PS5 via USB?


Is it? I haven't seen it definitively confirmed either way, but intuitively it would make more sense to do SLAM on the PS5's main processor rather than driving up the PSVR2's BOM with an on-board processor fast enough to do it. They have to stream the camera frames back to the PS5 anyway for the passthrough feature to work.


> driving up the PSVR2's BOM with an on-board processor fast enough to do it

The Quest 2 can do SLAM on the headset plus the 2x2K-resolution 3D rendering in real time, on a four-year-old Qualcomm chip running Android on top, and it costs less than the PSVR2.

So Sony can definitely afford to put a processor in the headset just for the SLAM compute alone since the 3D rendering is done on the PS5.

The chip in any optical mouse is a low-resolution camera plus a DSP running motion-tracking algos hundreds of times a second, and it costs peanuts. VR headsets have more resolution to process and a third dimension to account for, but they also don't have to process the entire picture, just the differential movement of the LEDs captured in the blanking intervals, processed as white dots in B/W images, which further simplifies things.

So I'm pretty sure the PSVR2's SLAM compute can be done on an ARM chip or FPGA worth ~$10 nowadays.
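The "white dots in B/W images" step really is cheap. As a rough illustration (a toy numpy/scipy sketch, not Sony's pipeline): threshold the IR frame and take the centroid of each bright blob.

    import numpy as np
    from scipy import ndimage

    def led_centroids(frame, thresh=200):
        """Toy constellation-tracking front end: threshold a grayscale IR
        frame and return the (row, col) centroid of each bright blob."""
        mask = frame > thresh
        labels, n = ndimage.label(mask)  # connected bright regions
        return ndimage.center_of_mass(mask, labels, range(1, n + 1))

    # Synthetic 8-bit frame with two "LED" dots.
    frame = np.zeros((480, 640), dtype=np.uint8)
    frame[100:103, 200:203] = 255
    frame[300:303, 400:403] = 255
    print(led_centroids(frame))  # ~[(101.0, 201.0), (301.0, 401.0)]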


Apparently the Radeon RX 6xxx series has them, as do Nvidia 2xxx cards with USB-C ports. Though yes, it seems unlikely there will be continued support.


Yeah, GPU manufacturers seem to have given up on supporting VR headsets.


There's nothing to support: the VirtualLink consortium was dissolved and it's now a zombie standard with nobody at the wheel. The VR headset makers that originally backed it aren't supporting it anymore either; Valve announced and then cancelled a VirtualLink cable for the Index, and HTC/Oculus/Microsoft never made it as far as announcing any products.


For anyone interested in why VirtualLink didn't go anywhere, one of the limiting factors was that USB 3 and above is entirely separate from USB 2 and below.

Normal USB-C DP Alt Mode delivers either four lanes of DP and USB 2 or two lanes of DP and both varieties of USB. VirtualLink takes the 4xDP variant and upgrades the USB pairs to support 3.x modes, but that means if you plugged a USB 2 device in to the headset itself there'd be nowhere to route that traffic.
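In shorthand (my own summary of the above, not spec text):

    # USB-C: four high-speed lane pairs plus the legacy USB 2.0 pins.
    lane_budget = {
        "DP Alt Mode, 4-lane": {"high_speed": "4x DP",           "usb2_pins": "USB 2.0"},
        "DP Alt Mode, 2-lane": {"high_speed": "2x DP + USB 3.x", "usb2_pins": "USB 2.0"},
        "VirtualLink":         {"high_speed": "4x DP",           "usb2_pins": "USB 3.x"},
    }
    # VirtualLink's trade-off: with USB 3.x signaling on the USB 2.0 pins,
    # plain USB 2 traffic from devices on the headset has nowhere to go.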

A chip was developed to bridge the gap and allow USB 2 devices to connect to the USB 3 bus, but there were quirks and it ended up not being ready in time to be useful to the VR market. Valve eventually cancelled their adapter plans and that was the end of it.

This Hackaday article has a lot of useful links including an open source breakout board design for the chip in question: https://hackaday.com/2022/03/07/a-chip-to-address-the-fundam...


I suspect the rise of good-enough wireless PCVR over commodity WiFi hardware didn't help matters either: being able to tether a headset over a single USB-C cable is mildly convenient, but nowhere near as convenient as having no cable at all. Oculus and Pico both went down that route, and it looks like Valve is also going to with their next headset.


>nowhere near as convenient as having no cable at all.

AIUI the tech just isn't there for this.

There's latency and packet loss to deal with, whenever there's a radio in the path.


The tech has been there for years; if you look at discussions of using the Quest for PCVR, the default recommendation is WiFi streaming rather than the tethered USB mode. The trick is that the reprojection still runs on the headset itself, so the part critical for avoiding motion sickness has minimal latency regardless of signal quality.
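The core trick is simple enough to sketch (a toy rotational-only version, not Oculus' actual implementation):

    def reproject_shift(yaw_at_render, yaw_now, frame_width_px, fov_deg=90.0):
        """Toy rotational reprojection: how far to shift the (possibly tens of
        ms old) streamed frame so the world stays put as the head turns. This
        runs on-headset every vsync, independent of WiFi latency."""
        px_per_deg = frame_width_px / fov_deg
        return (yaw_now - yaw_at_render) * px_per_deg

    # Frame rendered at yaw 10.0 deg, head is at 10.8 deg when displayed:
    print(reproject_shift(10.0, 10.8, 2000))  # shift ~18 px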


As an early adopter of the Vive (didn't preorder but bought within the first few weeks) who also has a Quest 2, the "good-enough" part of your previous post is definitely open to interpretation.

I will say I agree with using it wireless being the default recommendation, but there's a reason people still buy fiber optic USB-C cables.

If you have a good WiFi setup, which a lot of people do not (and a lot of people who think they should have good WiFi because they spent a lot of money on it still don't because they don't understand what they bought), it's perfectly fine for most games that don't depend on tight timing, but you can still tell when playing Beat Saber or similar. I noticed it most as input jank on the controllers, both compared to the Vive and native Quest apps.

IMO the right answer is a Quest-like standalone headset with a VirtualLink-style input to allow uncompressed signals with predictable latency to flow between the two. Best of both worlds. A wired Quest today is effectively just using the USB port as a network interface rather than a display.


I've been using Virtual Desktop (vrdesktop.net) for a while now and it works great. With some judicious use of its resolution scaling options and a physically nearby Wifi 6 router it's pretty easy to get latency down to 30-40ms. The only real trouble spots I still run into are games that have wildly unoptimized elements like VRChat.


30-40ms is an eternity for VR.

I get a headache just from thinking about it.


It's cool to hear what I use every day is tech that isn't here yet :)

I use my Quest 2 with my PC (4090) all the time and it's great. The lag is minimal and I have no motion sickness.

I use a dedicated WiFi 6 AP with its own SSID, though, because I noticed multiple devices on the same AP cause stuttering. Probably from switching between higher and lower modulations.

But this way it works perfectly. I hope the Quest 3 will get WiFi 6E for more free channels.


Ah, damn, I had no idea


The first time I saw someone using a VR headset was in 1994; the technology might have improved, but the adoption issues have hardly changed.


There have been more proper VR headsets sold than Xbox Series consoles.


That still doesn't make VR a success at scale.

How many of them are actually being used and not stored in some closet?


It might not make VR a success per se, but objectively there are more VR headsets sold per capita today than in 1994.


Sure, if we ignore that after all these decades it is as much of a success as 3D TVs.



