Outside of lossy compression, we've been fairly close to the theoretical limits for quite some time. What keeps changing is partly what we want to compress, but mostly how we leverage resources to do the compressing.
A good deal of the time we care about decompression speed, or the tradeoff between speed and bandwidth. The algorithms that reigned in the 90's, and still kind of reign today, are unwieldy. So new techniques that get within a few percent of optimal much faster, or with less memory, are an easy sell.
And once in a while we get something like Burrows-Wheeler, which isn't a compressor at all but a transform (hence BWT). It can unearth broader patterns in a file and make it more conducive to compression, without needing a memory structure that grows faster than the data under inspection.
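To make that concrete, here's a minimal sketch of the transform (my own toy example, not how production libraries do it): a naive version that sorts all rotations of the input, which is O(n^2 log n); real implementations like bzip2 use suffix-array-style construction. Note the transform doesn't shrink anything by itself, it just clusters characters with similar contexts so a simple follow-up coder (move-to-front + run-length + entropy coding) does much better.

```python
def bwt(s: str, terminator: str = "\0") -> str:
    """Naive Burrows-Wheeler transform: last column of the sorted rotation matrix."""
    s = s + terminator  # unique sentinel so the transform is invertible
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

print(repr(bwt("banana")))  # 'annb\x00aa' -- runs of identical characters appear
```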
I'd say there are still some possibilities in compression formats that are transparent to certain operations, that is, compressed data you can process as-is (without decompressing).
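A toy illustration of that idea, using plain run-length encoding (my example, not a specific format): some queries, like total length or character counts, can be answered straight off the compressed runs without ever materializing the original data. Succinct structures like the FM-index take this much further, supporting substring search directly over BWT output.

```python
from collections import Counter

def rle_encode(data: str) -> list[tuple[str, int]]:
    """Encode a string as (character, run_length) pairs."""
    runs: list[tuple[str, int]] = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

def length(runs: list[tuple[str, int]]) -> int:
    return sum(n for _, n in runs)  # answered without decompressing

def char_counts(runs: list[tuple[str, int]]) -> Counter:
    counts: Counter = Counter()
    for ch, n in runs:
        counts[ch] += n  # also answered without decompressing
    return counts

runs = rle_encode("aaaabbbcccccd")
print(length(runs), char_counts(runs)["c"])  # 13 5
```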