If you're trying to do UT1 (70%-80% of max HR) training and instead you're doing UT2 (55%-70%) or AT (80%-85%), then you're not going to get the benefits you were going for.
Further, the only way you can accurately measure your rate of improvement is by comparing average watts across time at a fixed HR. So if your HR isn't at your target level then not only are you not going to be making progress at an acceptable rate, but you're not even going to know that you're not on track until it's already too late.
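To make that concrete, here's a rough Python sketch of the comparison I mean. The data layout, max HR, and zone band are made up for illustration; this isn't how any particular app stores or computes it:

    # Given per-sample (watts, heart_rate) pairs for each session, compare
    # average watts while HR sits inside a fixed target band, e.g. UT1 at
    # 70-80% of an assumed max HR.
    from statistics import mean

    MAX_HR = 190                                  # assumed max heart rate
    UT1_BAND = (0.70 * MAX_HR, 0.80 * MAX_HR)

    def avg_watts_in_band(samples, band=UT1_BAND):
        """samples: list of (watts, heart_rate) pairs; returns mean watts
        over the samples whose HR falls inside the band, or None."""
        lo, hi = band
        in_band = [w for w, hr in samples if lo <= hr <= hi]
        return mean(in_band) if in_band else None

    # Rising average watts at the same HR band across sessions = improvement.
    sessions = {"2024-01-05": [(215, 148), (220, 150)],   # made-up samples
                "2024-03-02": [(232, 149), (238, 151)]}
    for date, samples in sorted(sessions.items()):
        print(date, avg_watts_in_band(samples))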
Agree with everything you said. But if you're running for 90 minutes, say, checking in on the last one-minute average heart rate should be quite sufficient to set your pace (rough sketch of what I mean below).
And I certainly expect the app would easily be able to show you average watts vs. heart rate over time. Given weight, height, pace via GPS, and heart rate for every run, you could definitely do the necessary analytics.
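By "last one-minute average" I just mean something like the rolling window below. Names and numbers are illustrative, not Garmin's or anyone's actual implementation:

    # Keep a rolling window of (timestamp, heart_rate) samples and average
    # over the most recent 60 seconds.
    from collections import deque

    class RollingHR:
        def __init__(self, window_s=60):
            self.window_s = window_s
            self.samples = deque()        # (timestamp, hr) pairs

        def add(self, t, hr):
            """Add a sample at time t (seconds) and drop anything older
            than the window."""
            self.samples.append((t, hr))
            while self.samples and t - self.samples[0][0] > self.window_s:
                self.samples.popleft()

        def average(self):
            if not self.samples:
                return None
            return sum(hr for _, hr in self.samples) / len(self.samples)

    # Feed one sample per second during the run and glance at average()
    # to decide whether to ease off or push to stay in the target zone.
    hr = RollingHR()
    hr.add(0, 145); hr.add(30, 150); hr.add(61, 152)
    print(hr.average())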
The watch isn't showing you the one-minute average, though; it's showing the instantaneous value. The researchers averaged it for the purposes of the study.
It's not gobbledygook, it's training at a serious level. Those levels correspond well with blood lactate levels for various types of training. I've both rowed at a high level and done tech research work with an Olympic cycling team, and this (or some variant of it) is used frequently. HR for power (not pace etc.) is a good measure of personal fitness and wellbeing (e.g. a high resting heart rate means you're possibly falling ill or not well rested). However, all this doesn't matter for anyone who just wants to be healthy. That said, I defer to Garmin for proper HR smart watches and accessories, which are tailored for specific sports.