Anandtech takes care of testing both cases, 100% random data and non-random data, to give a range of expected performance on real-world workloads.
However, they only do this for tests run with IOMeter, because most other benchmarking tools are unable to write random data. This brings me back to one of my earlier points: many benchmarking tools are flawed in the sense that they don't take potential deduplication into account.
If you only test the pathological case, the results aren't informative when your real workloads could benefit from compression and deduplication.
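To see why the data pattern a benchmark writes matters so much, here is a minimal sketch (my own illustration, not the code of any actual benchmarking tool) contrasting an incompressible random buffer with a repeating pattern. A compressing or deduplicating controller would handle these two payloads very differently, which is exactly why a tool that only writes one kind of data gives a one-sided picture:

```python
import os
import zlib

BLOCK_SIZE = 1 << 20  # 1 MiB test buffer

# Incompressible payload, comparable to a "full random data" setting.
random_block = os.urandom(BLOCK_SIZE)

# Highly compressible payload, like a tool that writes a fixed pattern.
patterned_block = b"\x00" * BLOCK_SIZE

def compressed_ratio(data: bytes) -> float:
    """Compressed size as a fraction of the original size (lower = more compressible)."""
    return len(zlib.compress(data)) / len(data)

print(f"random:    {compressed_ratio(random_block):.3f}")   # close to (or above) 1.0
print(f"patterned: {compressed_ratio(patterned_block):.3f}")  # near 0.0
```

The random buffer cannot be shrunk at all, while the patterned one collapses to almost nothing; a drive that compresses or deduplicates internally will show correspondingly different write performance, so testing both ends of this spectrum brackets the real-world result.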