Neat theory, but AFAIK it was just the opposite. At least, as a young optical engineer, I was bombarded with technical experts proclaiming with great assurance that "digital would never fully replace film photography", because "its resolution would never compare".
And yet, if you took a crayon and extended the trend line of maximum resolution achieved on a single chip... that line wasn't plateauing.
Somehow, everyone believed Moore's Law, just not as it applied to detectors (which are basically transistors, the very thing Moore's Law describes).
The tl;dr is that it uses garbage collection to let readers keep seeing the version of the tree they were already walking, while writers go on updating the latest copy.
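To make that concrete, here's a minimal sketch of the general idea, not the paper's actual data structure: a copy-on-write binary search tree in Java where readers grab a snapshot of the root and traverse it undisturbed, writers path-copy and atomically swap in a new root, and the JVM's garbage collector reclaims old versions once no reader still holds them. The class and method names are my own illustration.

```java
import java.util.concurrent.atomic.AtomicReference;

public class SnapshotTree {
    // Immutable node: updates never modify nodes in place.
    private static final class Node {
        final int key;
        final Node left, right;
        Node(int key, Node left, Node right) {
            this.key = key; this.left = left; this.right = right;
        }
    }

    private final AtomicReference<Node> root = new AtomicReference<>(null);

    // Readers grab the current root once and walk that version,
    // even if writers publish newer roots in the meantime.
    public boolean contains(int key) {
        Node n = root.get();   // snapshot of the tree at this instant
        while (n != null) {
            if (key == n.key) return true;
            n = key < n.key ? n.left : n.right;
        }
        return false;
    }

    // Writers copy the path from root to the insertion point (path copying)
    // and atomically swap in the new root; concurrent readers still see the
    // old root until they finish. Retry on contention.
    public void insert(int key) {
        while (true) {
            Node oldRoot = root.get();
            Node newRoot = insertInto(oldRoot, key);
            if (root.compareAndSet(oldRoot, newRoot)) return;
        }
    }

    private static Node insertInto(Node n, int key) {
        if (n == null) return new Node(key, null, null);
        if (key == n.key) return n;
        if (key < n.key) return new Node(n.key, insertInto(n.left, key), n.right);
        return new Node(n.key, n.left, insertInto(n.right, key));
    }
}
```

The GC is doing the hard part here: in a non-GC'd language you'd need epochs, hazard pointers, or RCU-style deferred reclamation to know when an old version can safely be freed.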
I also read a paper the other day about concurrent interval skip lists, which was interesting.