The notion that encoding/transmitting could be simpler than decoding/receiving is interesting. It reminds me of the way optical drives for many years could write at, say, 48x but read at 8x, such that the majority of time spent was the verification step (if enabled) rather than the burn step. Just speculating, I assume it's because of things like error correction, filtering out noise/degradation. Producing the extra bits that facilitate error correction is one trivial calculation, while actually performing error correction on damaged media is potentially many complex calculations. Yeah?
I'd always assumed that was due to differences in the power levels needed for reading versus writing, and because writing to the disc is more error-prone at higher speeds. Not necessarily anything to do with a difference in the algorithm for encoding versus decoding the bits on the disc itself.
As best I understand it, we can start by thinking about it in terms of a vinyl record. For the sake of easy math, let's say a vinyl spins at 60 rpm, or one revolution every second, to "read" the song. (It's actually about half that.) This is somewhat similar to how a music CD works, and is why you can only get around 70-80 minutes of music on a CD that could hold hours of that same music in a compressed data format: the audio is uncompressed, much like on a vinyl. This establishes our 1x speed, in this case one revolution per second.
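If it helps to see the arithmetic behind that 1x figure, here's a quick sketch using the standard published Red Book numbers (these are not derived from the vinyl analogy above):

```python
# Red Book audio: 44,100 samples/s, 2 channels, 16 bits per sample.
sample_rate = 44_100
channels = 2
bytes_per_sample = 2  # 16 bits

audio_rate = sample_rate * channels * bytes_per_sample
print(f"1x audio rate: {audio_rate:,} bytes/s")  # 176,400 bytes/s

# At 1x the disc passes the laser at 75 sectors per second, and an
# audio sector carries 2,352 bytes, which is exactly the rate above.
print(audio_rate // 75)  # 2352 bytes per sector
```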
Now for the speed differences. To read, the laser only needs to see a reflection (or not) at a specific point, while to write, the laser needs time to heat up that same point. It's like the difference between seeing a laser reflect off a balloon versus the time required for that same laser to pop it. This is how recordable CDs are written: points on the disc are quite literally heated until they are no longer reflective. That's why it's called "burning". More power can speed up the process, but some amount of time is still required. Meanwhile, all that's needed to read faster is to sample the reflected light more frequently.
With more powerful lasers operating at a higher frequency and with more precision, a drive can "see" these differences at 48 times the normal speed, but can only burn at 8 times the normal speed before the reliability of the process suffers.
Bonus: a rewritable disc works slightly differently. Instead of destructively burning the disc, you can think of it as a material that becomes non-reflective at one temperature and reflective again at another. This allows data to be "erased". Also, when you "close" a disc to prevent rewriting, you aren't actually preventing it from being rewritten. It's more like using a sharpie to write "do not overwrite" on the disc, which all drive software/firmware respects.
It's more to do with the speed of writing. While the last generations of CD writers reached "48x" speeds, the quality of the written media suffers at such high speeds. I remember a c't magazine test years ago which found that everything written above 8x would develop reading errors sooner. Maybe it's better now, but I wouldn't count on it, since investment in optical drives has been practically zero these past years.
The Voyager had an experimental Reed-Solomon encoder. Encoding is "just" a lookup table from an n-bit value to an m-bit one, with m > n. Such a table takes 2^n × m bits.
Decoding can also be table-driven, but the table then takes 2^m × n bits, and that's larger.
For example, encoding each byte in 16 bits (picking an example that leads to simple math), the encoding table would be 256 × 16 bits = 512 bytes and the decoding one 65,536 × 8 bits = 64kB.
The problem for Voyager was that even 2^n × m was already large for the time.
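To put concrete numbers on that, a sketch of the table-size arithmetic from the example above (a real Reed-Solomon decoder would use syndrome decoding rather than a flat 2^m-entry lookup, but the asymmetry is the point):

```python
# Table sizes for a code mapping n-bit inputs to m-bit codewords, m > n.
n, m = 8, 16  # the byte-to-16-bit example above

encode_table_bits = 2**n * m  # one m-bit entry per possible input value
decode_table_bits = 2**m * n  # one n-bit entry per possible received word

print(encode_table_bits // 8)          # 512 bytes
print(decode_table_bits // 8 // 1024)  # 64 kB
```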
Others have noted you got the CD-R speeds wrong, but sometimes sending is indeed easier than receiving. I used to work on radio signal processing for phones, and we'd spend far more of both DSP cycles and engineering effort on the receive side. Transmission is basically just implementing a standardized algorithm, but on the receive side you can do all kinds of clever things to extract signal from the noise and distortions.
Video codecs like h264 or VP9 are the opposite: Decoding is just following an algorithm, but an encoder can save bits by spending more effort searching for patterns in the data.
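A toy sketch of that asymmetry, if it helps (this has nothing to do with the actual h264/VP9 bitstream; the predictors here are made up): the encoder spends its cycles searching over candidate predictions, while the decoder mechanically replays whichever choice got recorded.

```python
# Candidate predictors the encoder may choose between, per sample.
PREDICTORS = {
    "zero": lambda prev: 0,
    "repeat": lambda prev: prev,
    "double": lambda prev: 2 * prev,
}

def encode(samples):
    """Search: try every predictor, keep the one with the smallest residual."""
    out, prev = [], 0
    for s in samples:
        name, residual = min(
            ((name, s - pred(prev)) for name, pred in PREDICTORS.items()),
            key=lambda t: abs(t[1]),
        )
        out.append((name, residual))
        prev = s
    return out

def decode(stream):
    """No search at all: apply the recorded predictor, add the residual."""
    samples, prev = [], 0
    for name, residual in stream:
        prev = PREDICTORS[name](prev) + residual
        samples.append(prev)
    return samples

data = [10, 11, 22, 22, 0]
assert decode(encode(data)) == data
```

Making the encoder's search larger (more predictors, deeper lookahead) can shrink the residuals without the decoder getting any slower, which is the trade-off real encoders exploit.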
> Video codecs like h264 or VP9 are the opposite: Decoding is just following an algorithm, but an encoder can save bits by spending more effort searching for patterns in the data.
This is a more general point about the duality of compact encoding (compressing data to the lowest number of bits, e.g. for storage) and redundant encoding (expanding data to allow error detection when it's transmitted across a noisy medium).
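A minimal illustration of the two directions, using textbook toy codes rather than anything from the codecs above: run-length encoding squeezes redundancy out for storage, while a 3x repetition code deliberately adds it back so a flipped bit can be voted away.

```python
from itertools import groupby

def rle_encode(s):
    """Compact encoding: collapse runs into (symbol, count) pairs."""
    return [(ch, len(list(g))) for ch, g in groupby(s)]

def repeat3(bits):
    """Redundant encoding: triple every bit before transmission."""
    return [b for b in bits for _ in range(3)]

def majority_decode(bits):
    """Survives any single flipped bit per transmitted triple."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

print(rle_encode("aaaabbc"))   # [('a', 4), ('b', 2), ('c', 1)]
noisy = repeat3([1, 0, 1])
noisy[1] ^= 1                  # flip one transmitted bit
print(majority_decode(noisy))  # [1, 0, 1], error corrected
```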
This is the era I'm referring to, and I recall the difference being a bit beyond marginal. Literally the verification (i.e. read) phase of the burning sequence would take several times longer... in practice, not in terms of advertised maximums. Maybe it would read data discs at 48x but it would refuse to read audio discs beyond 8x or something like that. Same goes for ripping software like Exact Audio Copy (EAC); it could not read at high speed. And I don't think Riplock had anything to do with it, as that's a DVD thing whereas my experience dates back to CDs.
You and the GP are misremembering (and the abundant misinformation sticking around the web doesn't help). CD-Rs are mostly obsolete, but some of us still have working equipment and continue to burn them, so that era hasn't completely ended.
No idea exactly what you're referring to that took several times longer; perhaps the software was misconfigured.
However, here is what is more likely: the market was flooded with terrible-quality media, and with advertised write speeds that were more about marketing than any concern for integrity, it was easy to burn discs just at the edge of readability, with marginal signal and numerous errors. That would make the effective read speed terrible, but it was an indication that the discs were poor quality and/or poorly written, not of any inherent limitation in the process or in how drives work.
There are 48X "max" CD burners, but that maximum is no different from the maximum for reading. It's a MAX because that speed is only attainable at the extreme outside of the disc. These higher-speed drives operate with constant angular velocity (essentially a fixed RPM). Attaining 52X at the inside of the disc would require around 30k RPM, and no CD drive gets anywhere near that (though this was a common misconception). The top RPM for half-height drives is around 10k, which works out to about 50x the 1x linear velocity at the outside of the disc.
Currently I usually use a Lite-On iHAS124 DVD/CD burner made in the last 6 years. It will write at up to 48X, and that speed really is the maximum. The average burn speed for an entire disc when using "48x" is about 25x, or just about 3 minutes for the disc. For supported media it runs at a constant angular velocity of around 10k RPM.
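For anyone who wants to check the RPM arithmetic, a quick sketch (assuming the standard CD geometry of roughly 25 mm inner to 58 mm outer data radius, and a 1x linear velocity of about 1.3 m/s; the spec allows roughly 1.2 to 1.4):

```python
import math

V_1X = 1.3       # m/s, 1x linear velocity
R_INNER = 0.025  # m, start of the data area
R_OUTER = 0.058  # m, edge of the data area

def rpm(speed_x, radius_m):
    """RPM needed to hit speed_x times the 1x linear velocity at this radius."""
    return speed_x * V_1X * 60 / (2 * math.pi * radius_m)

print(f"52x at the inner edge: {rpm(52, R_INNER):,.0f} RPM")  # ~25,800
print(f"48x at the outer edge: {rpm(48, R_OUTER):,.0f} RPM")  # ~10,300
```

With 1.4 m/s for 1x, the inner-edge figure comes out closer to 28k, so "around 30k RPM" is the right ballpark.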
Exact Audio Copy / Red Book CD audio ripping is an entirely different subject. It can take longer due to cache busting and other issues that have less to do with the physical capabilities of the drive and more to do with the difficulty of directly streaming Red Book audio, plus quirks of specific drives and their firmware. You can read at top speed with a properly configured setup, though; I do it all the time.
> Red Book CD audio ripping is an entirely different subject
> difficulty of directly streaming Red Book Audio
Actually, it's what I was alluding to this whole time. Sorry for not saying so out of the gate. Red Book audio was my life for a while. I recall writing cue sheets [0] for CDRWIN by hand! Ripping groups would brag that a given release was created with EAC at no more than 2.4x or something like that...
I believe data CDs (the Yellow Book, if I recall) had more robust error correction, given that computer files can't just have glitches interpolated the way audio can to some extent. That's why, if you completely filled a CD with Red Book audio (74/80 minutes), ripped it to an uncompressed format like WAV/AIFF, and tried to put all of it on a data-format CD as files, it wouldn't fit; it was a decent amount larger than 650/700 MB, and not just due to metadata.
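The sector arithmetic backs this up (standard sector sizes; a Mode 1 data sector gives up 304 of its 2,352 raw bytes to sync, headers, and the extra error-correction layer):

```python
SECTORS_PER_SECOND = 75
AUDIO_PAYLOAD = 2352  # bytes per Red Book audio sector (all payload)
DATA_PAYLOAD = 2048   # bytes per Mode 1 data sector (rest is sync/ECC)

minutes = 74
sectors = minutes * 60 * SECTORS_PER_SECOND

print(f"{sectors * AUDIO_PAYLOAD / 2**20:.0f} MiB as raw audio")  # ~747 MiB
print(f"{sectors * DATA_PAYLOAD / 2**20:.0f} MiB as data files")  # ~650 MiB
```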