There are all kinds of ways to cheat verification and validation. In my experience, it usually comes down to misrepresenting a system to avoid missing a schedule milestone. One example that comes to mind was a programmer who added an artificial flag to stop static analysis tools from reporting certain errors; passing static analysis was a requirement for the system to be cleared for use. When confronted, the team said they did it because if the report identified issues they'd be forced to fix them, and they needed to meet schedule. I have lots of other examples, some more egregious, some less. Some border on silly, such as the team that claimed they didn't need to validate their software because the system required a human to turn it on, so it wasn't "software" by their definition of the word.
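To make the flag trick concrete, here is a minimal sketch of how that kind of dodge can work, assuming a C codebase checked with Clang's static analyzer (which defines the `__clang_analyzer__` macro while it runs; other tools have similar predefined macros). The function name and its body are invented for illustration, not taken from the anecdote above:

```c
#include <string.h>

/* Hypothetical function; the flag hides the risky code from analysis. */
void copy_input(char *dst, const char *src) {
#ifdef __clang_analyzer__
    /* The path the analyzer sees: does nothing, so nothing to report. */
    (void)dst;
    (void)src;
#else
    /* The path that ships: an unbounded copy the analyzer never inspects. */
    strcpy(dst, src);
#endif
}
```

The tool honestly reports zero findings, the milestone gate shows green, and the overflow still ships.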
For context, these were safety-critical systems. When something goes wrong on a research effort, people can often cover their tracks by calling it an "anomaly" and press forward without further digging. But many of those "anomalies" can be traced back to skirted requirements or outright lies about test outcomes, most of the time due to schedule pressure or a simple lack of expertise. And because many safety issues are low-probability events, people get lulled into complacency: the behavior becomes normal because it's still rare for something bad to actually happen.
Another example further down in the discussion is VW cheating emissions tests by changing their vehicles' operation when connected to a test stand.
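The general shape of that "defeat device" pattern is simple to sketch. Everything below is a hypothetical illustration; the struct, thresholds, and helper names are invented and say nothing about VW's actual implementation:

```c
#include <stdbool.h>

/* Invented sensor snapshot for illustration only. */
typedef struct {
    double speed_kph;     /* driven wheels spinning...                */
    double steering_deg;  /* ...while the steering wheel never moves. */
} SensorFrame;

/* Wheels turning with zero steering input suggests a dyno, not a road. */
static bool looks_like_test_stand(const SensorFrame *f) {
    return f->speed_kph > 20.0 && f->steering_deg == 0.0;
}

void select_engine_map(const SensorFrame *f) {
    if (looks_like_test_stand(f)) {
        /* Under test: run the emissions-compliant calibration.   */
        /* load_map(COMPLIANT_MAP);   -- hypothetical helper call */
    } else {
        /* On the road: run the dirtier, higher-performance map.  */
        /* load_map(PERFORMANCE_MAP); -- hypothetical helper call */
    }
}
```

Same theme as the static analysis flag: the system behaves correctly precisely when it is being watched.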