AFAIK iOS (and tvOS) devices only support playing formats+codecs with native hardware decoder support; they don't have any pure-CPU decoding implementations.
This, from what I recall, was an intentional choice to disincentivize developers from taking the "easy" route of shipping assets in a single format+codec for all devices, which would burn the iOS device's battery on CPU decoding. Instead, most developers choose to encode their assets separately for iOS so that they can use the iOS media frameworks, which conveniently means they'll be using the efficient hardware decoder and barely touching the battery at all.
However, you can ship custom CPU decoding in your app if you want (e.g. VLC.ipa).
Websites, of course, aren't (usually) built for a single platform, so this is a bit of a pain for them. Larger web video providers (most porn sites, for example) keep iOS-specific encodings on their CDN and sniff the browser's user agent to decide which version to deliver. Other providers give up on that approach entirely and build dedicated iOS apps to ensure the correct media is delivered (which is why there are more "video-sharing site" apps for iOS than for Android).
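The user-agent approach those larger providers use can be sketched roughly like this; a minimal server-side example, where the rendition names and CDN paths are entirely made up for illustration:

```python
def pick_rendition(user_agent: str) -> str:
    """Return the CDN path for a rendition this client can hardware-decode.

    Hypothetical paths; real providers maintain many more renditions
    (bitrates, resolutions) and often serve iOS via HLS instead.
    """
    ua = user_agent.lower()
    # iOS/tvOS browsers identify as iPhone/iPad/iPod/Apple TV; they need
    # the H.264-in-MP4 rendition the hardware decoder supports.
    if any(token in ua for token in ("iphone", "ipad", "ipod", "apple tv")):
        return "/cdn/video/h264-mp4/"
    # Everyone else gets the default rendition (e.g. VP9 in WebM).
    return "/cdn/video/default/"
```

UA sniffing is famously brittle, which is part of why the app-based route mentioned above is attractive: a native app knows exactly what hardware it is running on.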
When iOS was first released, everybody and his dog was still running his own proprietary movie (container) format: Microsoft was pushing .wmv, websites used Flash, and illegal sources were distributing .avi files.
Apple forced these sites to provide their streams in a standardized format, MP4.
It might not be the standard you'd prefer, since it isn't patent-free, but at least it is a standard: it isn't controlled by Apple, it works on just about any device, and it works pretty well.