2. You create a system that detects high-entropy content before it is logged.
3. You don't want to drop all high-entropy content, so you write some rules about where in a request to look for it.
4. Something about the request structure changes, breaking your log filtering.
5. Nothing notices the drop in the amount of content being filtered out of the logs.
There are oodles of ways this could happen. I'd wager that more than half of all businesses with a website that handles passwords have logged passwords in plaintext somewhere.
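A minimal sketch of how steps 2–4 play out, assuming requests arrive as flat dicts; the field names, entropy threshold, and `redact` helper are all illustrative, not any particular company's implementation:

```python
import math

def shannon_entropy(s: str) -> float:
    """Estimate bits of entropy per character from the string's own distribution."""
    if not s:
        return 0.0
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

# Step 3: rules about *where* to look -- a hypothetical allow-list of field names.
SENSITIVE_FIELDS = {"password", "token", "secret"}

def redact(request: dict, threshold: float = 3.0) -> dict:
    """Replace high-entropy values, but only in the fields we thought to list."""
    out = {}
    for key, value in request.items():
        if key in SENSITIVE_FIELDS and isinstance(value, str) \
                and shannon_entropy(value) > threshold:
            out[key] = "[REDACTED]"
        else:
            # Step 4: a renamed field (say "passwd") sails through unfiltered,
            # and (step 5) nothing counts how often redaction actually fires.
            out[key] = value
    return out
```

The fragility is the allow-list: the moment a client starts sending `passwd` instead of `password`, the filter silently stops matching, which is why step 5 (monitoring the redaction rate) matters.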
Perhaps checking logs for the x most common passwords could also work? At Facebook's scale, a bug like this would very likely trigger some positives.
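That check could be as simple as grepping log lines for a small list of the most common passwords; the list here is illustrative, and in practice you'd match field values rather than whole lines, since strings like "password" also appear in field names:

```python
# Illustrative stand-in for a real top-x common-passwords list.
COMMON_PASSWORDS = {"123456", "qwerty", "letmein", "iloveyou", "dragon"}

def leaked_password_lines(log_lines):
    """Yield log lines that contain any common password verbatim."""
    for line in log_lines:
        if any(p in line for p in COMMON_PASSWORDS):
            yield line

lines = [
    "POST /login user=alice pw=123456",
    "POST /login user=bob pw=S3cure!",
]
hits = list(leaked_password_lines(lines))
```

Even a handful of hits from a scan like this is a strong signal that raw credentials are reaching the logs.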