Nearly all x86 CPU designs originate from desktop/server parts, which had much higher thermal and power limits, while nearly all ARM CPU designs originate from embedded parts, where thermal and power limits are strict.
It's not as if there's something fundamental about x86 that makes it use more power. It used to be the case that x86 CPUs needed many more transistors (and thus used more power and produced more heat) to implement their larger instruction set, but I don't believe that's true anymore. I think the situation now is that it's easier to scale an architecture up than it is to scale one down.
If you look at something like Project Denver, it started as an x86-compatible CPU that would have had ARM-like power usage.