That is what it should mean, but in image compression people use it to mean "mosquito noise" artifacts, which come from quantizing the DCT coefficients of blocks that contain edges. (A sharp edge is made of an infinite number of frequencies, so a truncated or quantized DCT can't represent it exactly.)
That is the exact same phenomenon. Artificial sharpening introduces high-frequency components into a signal. If you band-limit (low-pass) a signal to fit below the Shannon-Nyquist limit, you will get ringing, because the signal cannot be represented exactly and will smear in the time domain. Given a bandwidth constraint, artificial sharpening above a certain threshold will result in ringing.
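The band-limiting half of this claim is easy to demonstrate: brick-wall low-pass a step edge and it oscillates (the Gibbs phenomenon). A minimal numpy sketch (the cutoff of 16 bins is an arbitrary choice for illustration):

```python
import numpy as np

# A step edge: 0 for the first half, 1 for the second.
n = 256
signal = np.zeros(n)
signal[n // 2:] = 1.0

# Ideal (brick-wall) low-pass: zero out all but the lowest 16 frequency bins.
spectrum = np.fft.rfft(signal)
cutoff = 16
spectrum[cutoff:] = 0.0
limited = np.fft.irfft(spectrum, n)

# The band-limited signal overshoots and oscillates around the edge
# (Gibbs phenomenon) -- this is the ringing being described.
print(f"overshoot above 1: {limited.max() - 1:.3f}")
print(f"undershoot below 0: {limited.min():.3f}")
```

The overshoot converges to roughly 9% of the step height no matter how many bins you keep, which is why ringing around hard edges is so stubborn.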
Images don't have infinite bandwidth, so that doesn't apply. The integer transform used in H.264 and newer codecs is exact and fully reversible; there aren't artifacts from applying it. The artifacts come from the rounding (quantization) afterward.
Sharpening and band-limiting produce the exact same effect, because the maximum sharpness of an image, like any other signal, is set by its bandwidth. There is no difference in the type of artifact produced. That's why the artifact from both goes by the same name: "ringing".
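For the sharpening side of the comparison, an unsharp mask on the same kind of step edge produces the same overshoot/undershoot shape. A sketch (box blur and gain of 2.0 are arbitrary illustrative choices):

```python
import numpy as np

# Unsharp masking: sharpened = original + amount * (original - blurred).
n = 64
edge = np.zeros(n)
edge[n // 2:] = 1.0

kernel = np.ones(5) / 5.0                     # simple 5-tap box blur
blurred = np.convolve(edge, kernel, mode="same")
sharpened = edge + 2.0 * (edge - blurred)

# Overshoot above 1 and undershoot below 0 around the edge -- the same
# ringing shape an ideal band-limiting filter produces.
print(f"max: {sharpened.max():.2f}, min: {sharpened.min():.2f}")
```

The halo you see around edges in over-sharpened photos is exactly this overshoot.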
It doesn't come from "sharpening" at all, though. Sharpening implies boosting high frequencies. This compression artifact comes from rounding the coefficients, so they're moved in both directions, some up and some down.
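The rounding-moves-coefficients-both-ways point can be shown directly on a single DCT block. A sketch using scipy's DCT-II (the block values and quantizer step of 40 are made up for illustration, not taken from any codec's tables):

```python
import numpy as np
from scipy.fft import dct, idct

# An 8-sample block containing a sharp edge, like one row of a codec block.
block = np.array([0.0, 0.0, 0.0, 0.0, 255.0, 255.0, 255.0, 255.0])

coeffs = dct(block, norm="ortho")

# Quantize: divide by a step size and round. Each coefficient snaps to the
# nearest multiple of q, so some move up and some move down.
q = 40.0
quantized = np.round(coeffs / q) * q

recon = idct(quantized, norm="ortho")
print(recon.round(1))
# The formerly flat regions now oscillate around 0 and 255 -- quantization
# error concentrated near the edge, i.e. mosquito noise.
```

Note the transform itself is lossless here; run `idct(coeffs, norm="ortho")` without rounding and you get the block back exactly.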