> I find using decimal places (like 16.1% and 21.1%) in human experiments pretty irritating. It feels like false precision.
Whether you say 21.1%, 21%, or 20%, you still have a single number. You could make an argument that decimal places like 21.147258% add clutter, but without an actual measure of uncertainty, all you're doing is reporting a summary of the data in the sample with different amounts of arbitrary rounding. That's not particularly helpful as a substitute for the full distribution.
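To make "an actual measure of uncertainty" concrete: instead of arguing over rounding, one can report the point estimate together with a confidence interval. A quick sketch using a Wilson score interval, with hypothetical counts (19 of 90 participants, which gives roughly the 21.1% from the example):

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Hypothetical data: 19 successes out of 90 trials (~21.1%).
lo, hi = wilson_interval(19, 90)
print(f"21.1% (95% CI {lo:.1%} to {hi:.1%})")
```

With an interval this wide (roughly 14% to 31%), the question of whether to write 21.1% or 21% becomes largely moot, which is the point: the interval carries the precision information that trailing decimal places only gesture at.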
It's not just a number. It's a string of characters conveying information about a measured value, and the way it's written tells the reader the number of significant figures (https://en.wikipedia.org/wiki/Significant_figures).