If this were true, there would be a lot more similarity between the CV1 and the Vive. From my medium-depth technical understanding, they seem about as dissimilar as it's possible to be while still both being PC-connected VR headsets compatible with the same third-party Unity/Unreal Engine games.
They’re both nearly identical in the ways that matter except for their outward appearance and minor things like audio, accessory ports, and the Vive’s camera.
They both have OLED displays of the exact same resolution, a butterfly mechanism to adjust the IPD, and Fresnel lenses. They both use IMUs as the primary tracking system, with external components used for drift correction.
They both use optical tracking (except that the Vive's is inside-out and the Rift's is outside-in).
They both have incredibly similar software, with only a very minor tweak needed to translate all the Rift calls so they work completely on the Vive.
The original Rift and Vive are very, very similar headsets. The biggest difference between them is their controllers.
This sort of analogizing might fool a layman, but it will not fool someone who has looked at teardowns and done actual software development in the space. You need to be able to tell whether a similarity is more like two cars having the same engine controller firmware, or more like two cars both having carburetors, or more like two cars both having bought tires from the same third party supplier.
You picked the wrong two headsets to compare. The Steam Sight HMD is not the Vive. You read a comment about the Steam Sight HMD and made a comment about the Vive.
What secret knowledge of the Steam Sight headset do you have?
It's not really fair to call the Lighthouse system an "inside-out" tracking system. "Inside-out" generally refers to the tracking data collection and processing happening on data collected exclusively by the headset, without any specialized, external reference. That's not what Lighthouse does.
Lighthouse is nearly identical to outside-in camera tracking, with the one wrinkle being that the photons flow in the opposite direction. Instead of a fixed, designed constellation of point-like emitters on the headset, you have a constellation of point-like detectors. Instead of a stationary grid detector, you have a stationary grid emitter. But otherwise, the data is practically the same, the math is all the same, the calibrations are all the same, and the whole system doesn't work without those stationary, external reference points.
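To make the "same data, same math" point concrete, here's a rough Python sketch of how both systems boil an observation down to a pair of bearing angles toward a known marker and feed them into the same kind of pose solver. The sweep period, the pinhole parameters, and every function name here are illustrative assumptions, not anything from an actual SDK.

    import numpy as np

    # Toy illustration only: a Lighthouse sweep hit and a camera pixel both
    # reduce to a pair of bearing angles from a fixed reference point toward
    # a known marker, so the downstream pose math is identical.

    SWEEP_PERIOD_S = 1.0 / 60.0   # assumed: one laser sweep per 60 Hz cycle

    def bearing_from_lighthouse(t_hit_x, t_hit_y):
        """Times at which the horizontal/vertical sweeps hit a photodiode,
        converted to azimuth/elevation angles from the base station."""
        azimuth = (t_hit_x / SWEEP_PERIOD_S) * np.pi - np.pi / 2
        elevation = (t_hit_y / SWEEP_PERIOD_S) * np.pi - np.pi / 2
        return azimuth, elevation

    def bearing_from_camera(u, v, fx, fy, cx, cy):
        """A blob centroid in a tracking-camera image, converted to the same
        azimuth/elevation representation via a pinhole model."""
        return np.arctan2(u - cx, fx), np.arctan2(v - cy, fy)

    def residual(pose_R, pose_t, marker_model, observed_bearings):
        """The reprojection-style error either system minimizes: transform the
        known marker layout (LEDs or photodiodes) by a candidate pose and
        compare predicted bearings with observed ones. A real tracker hands
        this to a nonlinear least-squares solver."""
        err = []
        for p_model, (az_obs, el_obs) in zip(marker_model, observed_bearings):
            p = pose_R @ p_model + pose_t
            err.append(np.arctan2(p[0], p[2]) - az_obs)
            err.append(np.arctan2(p[1], p[2]) - el_obs)
        return np.array(err)

Swap which end emits and which end detects and nothing in residual() changes, which is the sense in which Lighthouse is just outside-in tracking with the photons running the other way.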
Similarly, ray tracing doesn't simulate photons leaving light sources, bouncing off surfaces, and arriving at a camera sensor. The simulation is of anti-photons, leaving the camera, bouncing off surfaces, and seeing what lights they hit. It's like conventional current actually being the opposite direction of electron flow. The systems can run forwards or backwards and get the same answer.
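Same idea in miniature, as a toy Python snippet: rays leave the camera, hit a surface, and then ask which lights they can see. The one-plane, one-light scene and all the constants are made up purely for illustration.

    import numpy as np

    LIGHT_POS = np.array([0.0, 4.0, 2.0])   # made-up point light
    LIGHT_POWER = 60.0
    FLOOR_Y = -1.0                          # horizontal plane at y = -1
    ALBEDO = 0.8

    def trace_eye_ray(origin, direction):
        """Follow a ray backwards, from the camera into the scene; on a hit,
        cast a shadow ray toward the light and return the reflected light."""
        if direction[1] >= 0:               # pointing up: misses the floor
            return 0.0
        t = (FLOOR_Y - origin[1]) / direction[1]
        hit = origin + t * direction
        to_light = LIGHT_POS - hit
        dist2 = to_light @ to_light
        cos_theta = max(to_light[1] / np.sqrt(dist2), 0.0)   # floor normal is +Y
        # Lambertian shading: the same number you would get by following
        # photons the other way, light -> floor -> camera.
        return ALBEDO / np.pi * LIGHT_POWER / dist2 * cos_theta

    d = np.array([0.0, -0.5, 1.0])
    print(trace_eye_ray(np.zeros(3), d / np.linalg.norm(d)))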
Actual inside-out tracking does a completely different thing. The acronym "SLAM" stands for "simultaneous localization and mapping". It's building up a coherent, consistent model of the world around it. It adapts to new surroundings.
Bump a Lighthouse emitter or CV1 camera out of position and everything stops working, because the data no longer makes sense. Designing a Lighthouse headset or controller requires giving the tracking code a 3D model of the positions of all the detectors.
But move the furniture around in a room and SLAM catches up in a few seconds. SLAM also doesn't care about the shape of the thing you're tracking. Hell, it really doesn't care all that much about the quality of the camera feed, other than it being relatively high-framerate and not very noisy.
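If it helps, here's a heavily simplified Python sketch of that localize-against-a-map loop, using OpenCV's solvePnP. Matching by landmark id hand-waves the hard data-association and triangulation work, and the intrinsics are placeholders; real pipelines (ORB-SLAM, Quest's Insight, etc.) are far more involved.

    import numpy as np
    import cv2

    K = np.array([[500.0,   0.0, 320.0],    # assumed pinhole intrinsics
                  [  0.0, 500.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    landmark_map = {}   # landmark id -> 3D point in the world frame

    def track_frame(observations, prev_rvec, prev_tvec):
        """observations: list of (landmark_id, (u, v)) feature detections."""
        known = [(lid, uv) for lid, uv in observations if lid in landmark_map]

        # 1. Localize: solve the headset pose from 2D-3D matches against the map.
        if len(known) >= 4:
            obj = np.array([landmark_map[lid] for lid, _ in known], dtype=np.float64)
            img = np.array([uv for _, uv in known], dtype=np.float64)
            _, rvec, tvec = cv2.solvePnP(obj, img, K, None, prev_rvec.copy(),
                                         prev_tvec.copy(), useExtrinsicGuess=True)
        else:
            rvec, tvec = prev_rvec, prev_tvec   # coast (in reality, on the IMU)

        # 2. Map: features that matched nothing (the couch you just moved) get
        #    triangulated from multiple views and inserted into landmark_map
        #    here, which is why the tracker catches up within seconds.
        return rvec, tvec

The key design difference from Lighthouse or Constellation is that the model being matched against is whatever the cameras have seen recently, not a factory-calibrated marker layout.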
It's simple enough to say that on the Lighthouse system, the sensors are on the headset/controllers, and the beacon signals are cast out by the Lighthouse boxes.
On the Oculus system, the sensors were the external cameras, and the beacons were the LEDs on the devices.
My experience with both was that the Oculus system did really well seated, but for room-scale games the Lighthouse system does better, especially when the controllers go behind you, like in Valve's archery game.
I haven't bought an Oculus system since the DK2, though, so I'm not sure how sophisticated it is now.
Windows MR (both VR headsets and the HoloLens), Magic Leap, Vive Focus, Pico Neo, and the upcoming Lynx all use inside-out cameras, all with their own implementations. HTC Vive, Vive Pro, Vive Cosmos, Valve Index, Varjo XR-3 and VR-3, and PiMax headsets are the only ones using outside-in tracking anymore, and they're all using specifically Lighthouse.
First of all, you just don't really do that very often. People have rotator cuffs and elbows that make any action in those regions fairly uncomfortable.
Second, all current VR systems primarily use inertial tracking. The visual tracking is only there to correct for drift relative to the reference frame. Whether it's Lighthouse, the Rift CV1's outside-in cameras, or the inside-out cameras on every other system, you can put your hands in the sensor blind spot for several seconds before it becomes a problem.
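A quick back-of-the-envelope sketch of why those several seconds are survivable: with the gyro handling orientation, position error from a residual accelerometer bias only grows with t squared, so it stays small for a second or two and then blows up. The bias value below is an assumed ballpark for consumer MEMS parts, not a spec.

    ACCEL_BIAS = 0.02    # m/s^2, assumed residual bias after calibration
    RATE_HZ = 500        # typical IMU sample rate
    DT = 1.0 / RATE_HZ

    def drift_after(seconds):
        """Double-integrate a constant accelerometer bias for `seconds`."""
        vel, pos = 0.0, 0.0
        for _ in range(int(seconds * RATE_HZ)):
            vel += ACCEL_BIAS * DT
            pos += vel * DT
        return pos

    for t in (0.5, 1, 2, 5, 10):
        print(f"{t:>4}s without optical correction -> ~{100 * drift_after(t):.1f} cm drift")

A second or two barely registers; ten seconds is a controller drifting across the room, which is why every system still needs the external or visual correction eventually.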
99% of the time, you're working with your hands in front of you. Lighthouse doesn't care about your hands in relation to your body. But it does care about your body in relation to the base stations. Lighthouse's blind spots are constantly changing over the course of your play session. Quest's are always in the same spot.
So many times I've found myself in a corner on the far side of the room from my base stations, with my own body blocking my controllers' view of them. When that happens, your hands start slowly floating away while you're trying to work on something, and unless you've kept enough awareness of your orientation in the real-world room, you have no idea why. It's literally immersion-breaking.
"Inside-out cameras can't track behind your head" is really not the problem that your random Valve fanboy on Reddit makes it out to be.
Controllers behind the head are tracked by some algorithm magic that fuses the last position seen by the cameras with accelerometer and gyro data to cover the blind spots. It seems to work like a charm. Probably not as good as a full Lighthouse system, but good enough.
Every extant tracking system uses IMUs as the primary sensor. The Lighthouse base stations and the Oculus camera tracking are used to correct for sensor drift.
You need it to be this way, because the IMUs can run at fairly high frequencies (200-1000 Hz), which (in part) is what keeps latency low. The data paths and processing needed for the reference-frame corrections are so complex that they can't run anywhere near as fast. It's why the hand tracking on the Quest is so high-latency: there's no IMU on your hand.
And it's not "algorithm magic". It's mostly just Kalman Filtering.
According to Yates, the optical tracking and Fresnel lenses were the only things Oculus changed from the Steam Sight to the CV1. The rest of the actual architecture was the same.
The question will be what changed from the CV1 to the Rift, but I don't think Valve is going to sue Oculus either way. This is just to expand the understanding of how slimy a company Oculus was even before it was acquired by Facebook.
The Vive is a device engineered by HTC. The Steam Sight headset was the device Valve developed in their office. They are different headsets, and most people have never seen a Steam Sight headset.
The Vive was developed in a hurry once Valve realized the extent of Oculus's betrayal and the very real chance that the VR renaissance would end up owned primarily by Facebook and Oculus's walled garden.
Unfortunately, FB is already gaining the market at the low end. While not as high quality, you can even plug your Quest into a PC (assuming the PC is powerful enough to run VR) and run PC VR games. And with video cards so expensive, the only way VR is going mainstream any time soon is with lower-end, less expensive setups. Vive is about to release the Flow, which looks interesting, but from their product page I don't fully understand the niche it's trying to fill.
The Flow is an ultra-portable headset (think an ultra-light HMD that uses glasses-style arms to support itself on your head) that is phone-controlled (accelerometer and screen swiping) and phone-driven (as in, it uses the phone's graphics): a 3DOF device aimed primarily at content consumption rather than games. Unfortunately, at its price point, and lacking the controllers and movement I'm used to with my headsets, it's a non-starter for me.
It's an interesting idea at half its price, maybe.
Facebook gaining control of mass VR is basically why I feel pretty bearish about VR in general. If there's any company that can turn it into an anti-consumer, ad-driven experience that experiments on you, it's Facebook.
Hopefully Valve continues to put out competing hardware, even if it's niche hardware for people willing to pay for a premium experience. The main issue right now is, as you say, video card availability and prices.