For the longest time (like, the past 15 years) I've wanted a cheap small box that can overlay custom info, e.g. text, on the TV. The reason such a thing does not exist, other than projects like Bunnie's NeTV (https://makezine.com/article/technology/review-hardware-hack...), is that decoding (HDCP-protected) HDMI signals is illegal if you're not in the club. I think the ELI5 reason is that content providers are afraid of unauthorized copying of DVDs (try taking a screenshot of a frame from your DVD player on your laptop).
I’d be willing to pay up to $199 for such a box if it has an open API to input overlay text and icons.
While reading your comment I was reminded of the much-touted key capabilities of the Video Toaster (late 80s), its character generator and chroma key, which were used to do exactly that: overlaying text on video for a fraction of what professional video production cost. And now, 30+ years later, we still don't have an accessible way to do that using modern video tech/protocols, but now it's about rent seeking and licensing rather than technical capability.
That brings back some good memories. I had access to a Video Toaster setup a couple days a week after school, as well as library checkouts of big VHS camera recorders. We'd get a stack of tapes, do the 1-day rental of a camera and record all over the neighborhood, then after school we'd take the tapes into the studio and mix them together.
Thanks for the suggestion; however, this is a video production device which aims for different functionality than the broad one I had in mind.
The simplest useful app would be: watching a movie together with friends, have them type their comments on their phones and display them on my TV, in a lower third. The chat part is trivial, but how to get that from a server into the video stream? Maybe there's a simple way to do it using a device like the Mini Pro, but when I search I don't see it.
While it's not the same as what you're asking for, you could do this pretty easily with a Raspberry Pi.
That feels like such an obvious thing to say that I hesitated even saying it, but then I started thinking about how funny it is. A solution to HDMI being so locked down is to buy a $20-ish computer, that does all the things a computer can do, so that we can feed the TV the video stream we want. Over HDMI.
Given that it's so easy to circumvent now, I wonder why anyone even bothers with HDMI any more, aside from inertia.
The ATEM can definitely do this. There are YouTube tutorials on it.
You use one of the many apps that can output your text and set it to output the background as green or whatever.
Then you can use the ATEM to do one video stream inside the other, picture-in-picture, and you can customize the size of the picture in the picture.
You can also have it chroma key out the green (or whatever) from the text app, so it will only display the text and the background will be the main video stream.
> watching a movie together with friends, have them type their comments on their phones and display them on my TV, in a lower third
This is what twitch streamers do all day using OBS and twitch chat. Hang out with their followers and watch/react to YouTube videos and read chat comments.
Also higher speeds. HDMI 2.1 has a bandwidth of 48 Gbit/s. But because of driver shenanigans in the linked article, us pesky computer users are stuck with HDMI 2.0, which has a bandwidth of 18 Gbit/s.
The latest version of DisplayPort has a bandwidth of 80 Gbit/s, and you get drivers on day 1.
Thinking about it more, there's no downside to DP. Every card manufacturer should spend an extra $2 and ship with a DP -> HDMI adapter and drop HDMI universally. That $2 will be well worth the licensing fees they don't have to pay and the software they don't have to write.
Even if it isn't 2.1, the off-the-shelf adapters will definitely cost something, but it is a long-term strategy. If every card supports DP natively and everyone is shipping HDMI 2.0 adapters, monitor manufacturers will be strongly inclined to include a native DP port everywhere, and there will be no more HDMI-only devices, potentially including TVs. At that point the card manufacturers stop shipping the dongle.
This was great, and it specifically regards overlaying video on an HDMI signal, explicitly without needing to actually decode the HDMI data stream, thus not violating the DMCA.
Right now it’s just a proof of concept (check out the video with a moving overlay rectangle and the red 8 that is converted into a partially green 8), but it wouldn’t take a whole lot of work to add support for subtitles.
There are still some of these boxes for sale on Amazon, but supply is limited.
The really dumb thing is that nobody wants to copy DVDs or Blu-Rays via HDMI anyways. It's much more convenient to decrypt the discs directly with tools like MakeMKV.
However, I suspect that isn't the real holdup here, since DisplayPort also fully supports HDCP.
You can get devices that "hack" HDMI to various other unprotected signals such as SDI. So you get a Decimator or something, convert HDMI to SDI, and then do the needful.
Blackmagic's Mini Converters are at their core nothing more than SerDes chips and an FPGA that does the protocol translation. I _think_ that the recent models with USB-C power supply inputs also have the data pins connected to the FPGA.
Might be worth a try to check how hard BMD has locked down the FPGA bitstream.
Maybe the closest thing to this is a Blackmagic box of some sort. I use the UltraStudio 4K to capture 4K60 output, pipe it through OBS and pass it through as well.
Will run you $1K though. Corsair/Elgato has some solutions in your price range but the devil is in the details of precisely what you're trying to accomplish.
An HDMI splitter (i.e. an HDCP down-converter/stripper) and a video capture dongle or card can be had for like $60, plus a computer running OBS. It's not an elegant small box, but it'll get the job done for lower-resolution needs.
Is there a reason a RPi or whatever with a cheap USB HDMI capture card isn't a good option? I guess there might be a minor quality loss but it should do what you want.
HDCP usually is an issue if you can't turn it off on the source. There are ways around that, though those are more than $199 if you're dealing with an HDMI 2.x source.
From what I understand (and I could be very wrong here), the biggest issues are the amount of memory required to keep a copy of the current frame of an HDMI signal, and the fact that to modify anything in the HDMI signal you need to decrypt it (if watching an HDCP-protected stream), decode each frame, and then finally re-encrypt the HDMI signal.
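As a rough sanity check on the memory point, here's a back-of-the-envelope calculation (plain Python; figures are approximate since blanking intervals and exact pixel formats vary):

```python
# Back-of-the-envelope: memory needed to buffer one uncompressed frame,
# ignoring blanking intervals and any line-buffer tricks.

def frame_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

for name, w, h, bpp in [
    ("1080p, 8-bit RGB", 1920, 1080, 24),
    ("4K,    8-bit RGB", 3840, 2160, 24),
    ("4K,   10-bit RGB", 3840, 2160, 30),
]:
    print(f"{name}: ~{frame_bytes(w, h, bpp) / 2**20:.1f} MiB per frame")

# 1080p, 8-bit RGB: ~5.9 MiB
# 4K,    8-bit RGB: ~23.7 MiB
# 4K,   10-bit RGB: ~29.7 MiB
```

A few tens of MiB is nothing for a PC, but it's more than the on-chip RAM of the small FPGAs these boxes tend to use, which presumably pushes designs toward external DRAM or toward processing only a few scanlines at a time.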
I was looking into what would be necessary to build an HDMI switch that allowed for seamless switching of video between different inputs, and basically there was no hardware for it. The closest chips that I could find were Analog Devices' ADV7626 and Lattice's SiI9777S.
Lattice CP9777 might be worth having a look at if you can understand anything about the datasheet :)
As far as I can see there's no need to re-encrypt the signal - just leave it as a non-HDCP stream?
There are HDMI splitters which silently strip HDCP to output to two screens at once - they don't advertise this feature but they do it anyway in order to function. In order to achieve the overlay you'd just need one of these and then a non-HDCP-aware overlay apparatus before the display.
It would be a LOT more seamless if the output didn't have any HDCP to re-synchronize. Though with the different frame timings between the inputs and outputs (gen-lock is a term you want to search for), such a system might introduce presentation latency, which would have to be accounted for in the audio streams.
I don't know if the ~$20 HDMI USB capture cards work on a Raspberry Pi, but connecting one to a Pi or laptop, capturing the signal as a v4l stream, overlaying in software and then outputting on the built-in HDMI out might work?
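A minimal sketch of that idea in Python with OpenCV, assuming the capture stick enumerates as a standard UVC device (the device index, resolution and the hard-coded comment text are all placeholders; a real version would pull the text from the chat server mentioned upthread):

```python
# Sketch: grab frames from a cheap HDMI-to-UVC capture stick, paint a
# lower-third comment bar, and show the result full screen so it goes
# back out over the machine's own HDMI output. Requires opencv-python.
import cv2

CAPTURE_INDEX = 0                            # assumption: the stick is /dev/video0
OVERLAY_TEXT = "chat: this scene is great"   # placeholder; fetch from a server in practice

cap = cv2.VideoCapture(CAPTURE_INDEX)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

cv2.namedWindow("out", cv2.WINDOW_NORMAL)
cv2.setWindowProperty("out", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # Darken the bottom ~15% of the frame and draw the comment on top.
    banner = frame.copy()
    cv2.rectangle(banner, (0, int(h * 0.85)), (w, h), (0, 0, 0), -1)
    frame = cv2.addWeighted(banner, 0.6, frame, 0.4, 0)
    cv2.putText(frame, OVERLAY_TEXT, (40, h - 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.2, (255, 255, 255), 2, cv2.LINE_AA)

    cv2.imshow("out", frame)
    if cv2.waitKey(1) == 27:                 # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

Latency and HDCP are the obvious caveats: this only works if the source will output an unprotected signal (or something upstream strips it).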
I have been down this rabbit hole many times over the past 15 years as well. I worked for a company that had a vested interest in me finding a cost effective solution.
Unfortunately, it never materialized. Bunnie is a hero and his work is liberating. I could never get the NeTV stable enough to go into commercial production and the NeTV2 is excellent but a little too expensive.
It sounds like you've taken a deep long journey into the mess here: I'm curious, what does it take to be a member of the club?
I hope it's not just FAANGs and blessed hardware OEMs... but it also sounds like it must be; you can't let just anyone in if you're worried about the specs leaking. Then again, that sounds weird too, in that they used to provide the spec more openly until 2.1?
HDMI adopter registration details here[1]; in general, a $10k annual fee plus royalties and a bunch of legalese. The current list of 2,451 adopters and affiliates is here[2]; yes, AMD (listed unabbreviated as Advanced Micro Devices) has been a registered Adopter since Aug 2006 and is currently listed as a 2.1b licensee.
There are a lot of answers here already, but I would remind everyone: if your target is the 38-in-1 DVD/HD/whatever market, reselling for $5, the only thing you need is a $30 cam pointed at the display.
Yes, it's not pixel perfect. Thing is, the people who buy 38-in-1 do not care about a pixel-perfect picture. They just want an affordable way to spend an hour and a half with their family.
A 38-in-1 is some random storage medium (USB stick, DVD, SD card, CD) with "38" (or 45, or 2, or 99; pick a number) media files (movies, TV shows, YouTube series, whatever) in various resolutions, formats, modes etc., sold for next to nothing on the street.
They can be played on Android-type boxes or even on phones / tablets.
Essentially sneaker-net but for watchable media.
Not something you will see much of in NA or Europe, but very common in Asia and Africa, where bandwidth / internet and, more importantly, electricity are relatively expensive.
The 90 minutes with your family comment is spot on.
In NA and Europe, this is mostly because folks don't have access to continuous internet: you are in the back country, or don't have service, or whatever.
In the Africa / Asia case, it's because you don't have internet and / or power.
Most of the time, this media is viewed on TVs running on batteries charged by solar, off a USB port. The electricity budget (because battery) says you get "90" mins of TV a day (not exactly 90: the power is shared between lights and charging phones; there would be more time, but because electricity is limited, "TV time" is limited).
So those "90" minutes are family time, we all watch the same thing together.
Point being, in that world a cam rip or SD or whatever works; there's no need for HD, or UHD, or 4K or 8K, it's completely worthless. The screens the content is being watched on are 720p most of the time.
Something like this (edit: replaced the original link with a new one that explains what is happening better)
I think this is also the reason most "scene" releases of TV shows and movies were xvid encoded, and capped at 175MB and 700MB respectively for so long. So that they wouldn't break compatibility with burned CDs on existing players.
The reply by @throwaway201606 is on point, though I would add that it was extremely popular in Eastern Europe too, until broadband/4G became ubiquitous.
I think the last time I saw a 9-in-1 DVD on sale there is somewhere around... 2018 probably?
There are some no name USB capture sticks that take any HDMI signal and expose it as a standard USB UVC camera, no drivers required. This could be a start of a DIY project.
If it's to watch a movie, why bother with the capture card at all? Why not throw together a way to stream the video from the Pi doing the text overlay to all the remote participants.
Is the GP commenter suggesting that they'd like to have people in remote locations each load a copy of their DVD into a player at the same time so they can comment on it together? If so, then yeah, you need a capture card I guess, but the notion of doing that seems a bit bonkers.
> HDMI is not packet-based and so when it uses USB-C it takes over the entire cable (alt-mode)
AFAIK, when HDMI is used over USB-C, it's actually DisplayPort over USB-C; while a HDMI alt mode was specified, nobody actually implemented it, everyone instead implemented the DisplayPort alt mode and then used a DP-to-HDMI converter chip whenever an HDMI output was required.
> DisplayPort is packet-based and can be multiplexed with other USB-C traffic through a hub
That's the case only for USB4 (and AFAIK earlier Thunderbolt 3). Other USB-C ports with DisplayPort alt mode simply use some of the USB 3.x pairs for raw DisplayPort, and whenever the use of all four USB 3.x pairs is required for DisplayPort due to the target resolution, then only USB 2.x can be available (USB 2.x has its own dedicated pair in the cable).
Back in the Thunderbolt 3 era, it was up to each motherboard manufacturer to decide how many pairs they routed to USB alt-mode, so it's not necessarily a safe bet to depend on it working.
So, in one specific example, 4k60 over USB alt-mode DisplayPort is not supported on Apple's $4999 iMac Pro, as USB alt-mode is only assigned two pairs; however, 4k60 over Thunderbolt DisplayPort is supported, as Thunderbolt is assigned four pairs. The only way to get 4k60 out of that device is to use a Thunderbolt-only DisplayPort adapter, that has no USB mode at all.
This was resolved by USB 4 and Thunderbolt 4 both incorporating a more modern DSC (display stream compression), among other things as described above, but your mileage will vary much more wildly with USB 3.
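To put rough numbers on the lane split above, here's a quick sketch (HBR2 is 5.4 Gbit/s per lane with 8b/10b coding; blanking overhead is ignored, so treat the figures as approximate):

```python
# Why 4K60 RGB doesn't fit in a 2-lane DP alt mode allocation at HBR2
# but does fit in 4 lanes. Blanking is ignored, so the real requirement
# is somewhat higher than the payload figure shown here.

HBR2_GBPS_PER_LANE = 5.4      # raw line rate
CODING_EFFICIENCY = 0.8       # 8b/10b leaves 80% for payload

def link_payload_gbps(lanes):
    return lanes * HBR2_GBPS_PER_LANE * CODING_EFFICIENCY

def video_payload_gbps(w, h, fps, bits_per_pixel):
    return w * h * fps * bits_per_pixel / 1e9

need = video_payload_gbps(3840, 2160, 60, 24)
print(f"4K60 8-bit RGB needs ~{need:.1f} Gbit/s")                    # ~11.9
print(f"2 lanes of HBR2: ~{link_payload_gbps(2):.2f} Gbit/s -> not enough")
print(f"4 lanes of HBR2: ~{link_payload_gbps(4):.2f} Gbit/s -> enough")
```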
> DisplayPort is packet-based and can be multiplexed with other USB-C traffic through a hub
This is part of why DP is $$$ compared to HDMI.
I would love to see DP start eating HDMI's lunch after this and the absolute shit show that was the HDMI 2.0 rollout, but cheaper-to-implement is almost certainly going to be the driving factor for consumer-grade TVs / displays, and no console or other set-top box maker is going to bother putting DisplayPort on their device if nobody's got a TV that can use it.
In terms of complexity, implementing DP vs HDMI 2.1 is not materially different. They both have fixed rate links, packets, Reed-Solomon forward error correction, DSC, etc.
Speaking about the 'packet-based approach': the way it's presented in the popular press, as if it's similar to Ethernet, is a pretty gross distortion, and its impact is not nearly as significant as people seem to think.
The majority of DP traffic is still brute force video data, interspersed with heavily packetized secondary data.
Over the years, I've spent many hours wading through DisplayPort data debug traces, and I've always wondered what people were smoking when they called it 'packetized like Ethernet'. It's just not true. (And FWIW: even old HDMI can transport secondary data packets just the way you can with DP. It's how audio-over-HDMI is done...)
And HDMI is pretty much a direct digital translation of analogue CRT signalling, blanking and timing weirdness included. That's also not how modern LCDs implement their internal screen driving, so it needs a similar level of complexity to decode and convert.
I think the biggest difference in DP vs HDMI cost is simply scale - there's probably orders of magnitude more HDMI chips sold than DP.
ARC and CEC come to mind. ARC probably isn't too important, but CEC is nice being able to control various boxes with the same remote. Or being able to control the TV's volume with another remote.
I see no __technical__ reason why you couldn't do CEC over the AUX channel... all that's required is for the software on either end to be updated to squirt and interpret the bits, and it's a low-speed protocol, so it'll fit just fine in the AUX channel.
"ARC" seems to be "Audio over HDMI using CEC for discovery and control", which (if you've bothered to run CEC over the DisplayPort AUX channel) you get the rest automatically with DisplayPort.
However, because both CEC and ARC are HDMI standards, you bet your biscuits that the HDMI Consortium will bar anyone who wants to use HDMI ports on their devices from shipping official firm/software that does the braindead-simple thing of running CEC over DP AUX, and having ARC-compatible firm/software.
As is nearly always the case (and when it's not, it's not for long) DisplayPort is totally capable of everything HDMI is, but the HDMI Consortium stands in the way of having commercially-distributed DisplayPort "home theater" products that are compatible with the nice-to-have HDMI features.
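For what it's worth, CEC framing itself is trivial, which is why "just squirt it over the AUX channel" is so plausible. A toy sketch: the frame layout below follows the published CEC spec, but the AUX transport function is purely hypothetical (no such standard exists, which is the whole point of the complaint above).

```python
# Toy sketch: build a CEC frame and hand it to a hypothetical DP AUX
# transport. A CEC frame is just a header byte (initiator and destination
# logical addresses packed as nibbles), an opcode, and optional operands.

CEC_IMAGE_VIEW_ON = 0x04   # "wake up and show my input"
CEC_STANDBY = 0x36

def build_cec_frame(initiator: int, destination: int, opcode: int,
                    operands: bytes = b"") -> bytes:
    header = ((initiator & 0xF) << 4) | (destination & 0xF)
    return bytes([header, opcode]) + operands

def send_over_dp_aux(frame: bytes) -> None:
    # Hypothetical: there is no standardized CEC-over-AUX tunnel today.
    # A real design would wrap the frame in whatever AUX transaction the
    # sink firmware had agreed to interpret.
    print("would write over AUX:", frame.hex(" "))

# Playback device (logical address 4) telling the TV (address 0) to wake up.
send_over_dp_aux(build_cec_frame(4, 0, CEC_IMAGE_VIEW_ON))
```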
When I say DisplayPort needs these features, I don't necessarily mean they need the HDMI Consortium's specific implementation of them. I see no reason VESA couldn't implement their own alternatives to CEC, ARC, Auto Lipsync, etc. These HDMI features solve a host of problems that are unique to home theater setups, so any HDMI alternative that wants to supplant it needs to also solve them as well.
Like you point out, there's no technical reason DisplayPort cannot provide similar features. The issue is the lack of any standards for them built into the DisplayPort specification. Some of these features, like CEC, are 20 years old and could easily be improved upon in an ecosystem that doesn't have to worry about backwards compatibility.
> When I say DisplayPort needs these features, I don't necessarily mean they need the HDMI Consortium's specific implementation of them.
Ah, okay.
Well, DisplayPort handles EDID, DDC/CI, E-DDC & friends... that's the CEC-equivalents handled. VESA does have standards for remote control of displays... it turns out that that's a thing that people want to be able to do.
> The issue is the lack of any standards for them built into the DisplayPort specification.
Nah. DisplayPort already supports everything (or just about everything) CEC can do.
The reason you don't see this stuff often making its way into TVs is because the HDMI Consortium gets in the way of folks who want to add DP ports to their TVs. The reason you don't see explicit support for HDMI Consortium protocols such as CEC in VESA standards is -again- because the HDMI Consortium gets in the way of folks who want to add DP ports to their TVs... so why bother? (Especially when actually supporting the protocol is trivial... squirt exactly what you'd send over the HDMI cable over the DP AUX channel, instead.)
If it was actually politically possible to have DP ports on TVs, then you'd see some of the more esoteric aspects of CEC (like manipulating a TV tuner) be quickly standardized into the VESA equivalents... assuming that VESA didn't just say "Oh yeah, y'all just go talk CEC to these new DP-equipped TVs. You already know how.".
eARC is important not only because of cable management and CEC, but because it's the only high-bitrate digital audio output on most TVs: the only other commonly available digital audio output, TOSLINK, is limited to two-channel PCM[1], AC3, and DTS.
[1] While the physical TOSLINK cable is able to support higher channel counts via ADAT[2], I'm not aware of any TVs with ADAT support.
Similarly, while 192 kHz / 24-bit TOSLINK support is common in pro audio and audiophile gear, the standard only requires 48 kHz / 20-bit. I imagine most TVs output 48 kHz / 20-bit, if only for the sake of configuration simplicity: TOSLINK is strictly unidirectional, so automatically negotiating format support beyond mandatory minima is impossible.
> eARC lets your TV pass those along to your compatible receiver even if the TV itself can't make heads or tails of the format.
DisplayPort Multi Stream Transport (MST) serves that role. Given that you'd only be passing along the audio data, rather than the video data, you could save money by putting the slowest available hardware (only BRR-capable) into the receiver.
Or, the TV could "just" pass along the audio stream to any plugged-in receiver. You don't gotta have a standard for that... just a standardized way to ask the TV to do it (assuming that you want to control the behavior and not have the conditional be "Is there a receiver plugged in? Pass along the audio and don't send it to the TV speakers.").
> DisplayPort v1.2 also adds new audio enhancements including the following: — Audio Copy Protection and category codes — High definition audio formats such as Dolby MAT, DTS HD, all Blu-Ray formats, and the DRA standard from China — Synchronization assist between audio and video, multiple audio channels, and multiple audio sink devices using Global Time Code (GTC)
MAT claims to require an Atmos-capable decoder, so that sure seems like Atmos to me. I dunno what DTS HD is, but it sounds like a souped-up DTS. Also take note of the "high definition audio formats such as" statement... that list of formats is incomplete.
Actually, it is the only (display) protocol. HDMI Alt Mode, while specified, never really manifested and is effectively dead. DisplayPort is the only alternative and is more flexible anyway.
Its packet-based nature is not relevant for DisplayPort Alt Mode, by the way, because it gets dedicated pins. Sometimes enough for two lanes, sometimes enough for four. It's hilariously consumer-unfriendly. Only Thunderbolt leverages the packet stuff and can transport up to eight lanes worth of DisplayPort.
You can run DP over USB-C; it's a use case they specifically support. However, if you are on the cutting edge you will be using actual DP cords. DP 2.1 uses 80 Gbit/s, and Thunderbolt 3 over USB-C is only 40 Gbit/s. Once you start pushing high resolutions (at least 4K) and high frame rates, the entire system will be bottlenecked by the cable. Top-end displays are always designed to take advantage of the latest DP port and use the entire bandwidth.
USB4 will push 120 Gbit/s. That will be enough for 8K 120 Hz / 4K 240 Hz but not much more, which will probably be enough for high-end consumer displays and GPUs for a decade or so.
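The back-of-the-envelope math behind that claim (uncompressed 10-bit RGB; blanking and link coding overhead ignored):

```python
# Uncompressed payload rates for a couple of high-end modes. Blanking and
# link coding overhead are ignored, so real links need a bit of headroom
# (or DSC) on top of these figures.

def gbps(w, h, fps, bits_per_pixel):
    return w * h * fps * bits_per_pixel / 1e9

print(f"4K 240 Hz, 10-bit RGB: ~{gbps(3840, 2160, 240, 30):.0f} Gbit/s")  # ~60
print(f"8K 120 Hz, 10-bit RGB: ~{gbps(7680, 4320, 120, 30):.0f} Gbit/s")  # ~119
```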
This is upsetting on principle alone. But I'm also upset because I have my PC hooked up to my TV for couch gaming and was hoping to do that on Linux sooner rather than later when HDR works properly. I wasn't aware HDMI 2.1 was another problem... I guess Windows is sticking around for that.
FWIW I have my Linux desktop hooked up to a 4k/120hz TV via a $30 CableMatters DisplayPort->HDMI dongle that seemingly supports most of the relevant functionality, as far as I can tell. It even works very smoothly over a 50 foot active fiber optic HDMI cable. There is a ton of discussion about these dongles in the issue thread linked in the article.
That's an excellent point. Coincidentally I also have a 15m active fiber optic HDMI cable. If and when this is the last blocker, I'll definitely look at available adapters!
I have been looking for a DisplayPort -> HDMI adapter with VRR support for a while. AFAIK nothing on the market currently supports it, including the CableMatters cable mentioned by the OP.
HDMI 2.0 is capable of doing 4K @ 120 Hz, but only up to YCbCr 4:2:0 8-bit instead of the full YCbCr 4:4:4 10-bit (though I'm not sure how close Wayland is to 10-bit support). It will be fine for gaming, just not for everyday PC use.
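The rough arithmetic behind that limitation (HDMI 2.0 tops out at 18 Gbit/s on the wire, roughly 14.4 Gbit/s of payload after 8b/10b coding; blanking ignored):

```python
# Why 4K120 only fits in HDMI 2.0 with 4:2:0 chroma subsampling.
# Figures ignore blanking, so they're slightly optimistic.

HDMI20_PAYLOAD_GBPS = 18.0 * 0.8   # 8b/10b coding leaves ~14.4 Gbit/s

def gbps(w, h, fps, avg_bits_per_pixel):
    return w * h * fps * avg_bits_per_pixel / 1e9

modes = {
    "4:4:4 10-bit": 30,   # full chroma, 10 bits per component
    "4:4:4  8-bit": 24,
    "4:2:0  8-bit": 12,   # chroma shared across 2x2 blocks
}
for name, bpp in modes.items():
    rate = gbps(3840, 2160, 120, bpp)
    fits = "fits" if rate <= HDMI20_PAYLOAD_GBPS else "does not fit"
    print(f"4K120 {name}: ~{rate:.1f} Gbit/s -> {fits}")
```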
FWIW I also didn't realise this until just now. I've been running my desktop at 4k@120hz recently for the buttery smooth neovide, but have been noticing that text rendering, especially syntax-highlighted text, looks awful. I'd seen the same oled panel render text way better in WSL/Windows (both using a custom pixel geometry[0] via Mactype, but also without), so I spent more time than I'm willing to admit to wrapping my head around custom pixel layouts and hinting in freetype. But no, turns out it was this all along.
If you want to see the effect of 420 vs 444 chroma subsampling on text rendering, this writeup[1] has some great test images and is well worth a read. Also, if you happen to have an LG OLED panel, you can get a little debug window that confirms your signal format and refresh rate, by pressing the green button on the remote 7-8 times.
I disagree. Gaming can involve lots of (colored) text too. And besides, if you have invested in a 4k@120Hz monitor are you really going to accept chroma subsampling even for non-text content?
I hate how the connection just magically switches to chroma subsampling (and/or DSC with newer connection standards) instead of that being something you have to explicitly enable. Well, with my monitor (and AMD card limited to HDMI 2.0 due to the BS in TFA) it at least defaults to 60 Hz, but there is no indication of what the consequences of setting it to 120 Hz are.
You're not even told that your connection has been downgraded by any OS I have tried. Absurd that you have to resort to these test patterns.
I once had a bad cable, which forced a 4k@60 monitor into chroma subsampling. Or rather Windows decided it would rather enable Chroma subsampling than drop resolution or framerate.
I immediately noticed there was something wrong. I agree it's terrible for desktop use. I would probably also have blamed freetype if it only happened on Linux, though.
There are different cable converters available. That said, if the hardware is user hostile don't use it. What are birthdays for if not throwing out trash and replacing it.
What is the hardware I would be replacing here? My TV? I'm not aware of any options with DP support. My GPU? I only just replaced my 3070 Ti with a 7900 XTX because of the open source drivers and much better Linux support. The converters are an excellent point though. Not sure why I didn't think of them...
> What are birthdays for if not throwing out trash and replacing it.
You are free to send me a couple grand for my birthday to buy a new equivalent TV with DisplayPort connectors. Except you can't even do that, because such a TV does not exist.
I think it's unlikely to be a patent issue. More likely a contractual one. If you sign up to the HDMI Forum to get access to the spec, you will have to sign away certain rights. Those contract terms will have been put in place by the same people who think HDCP is a good idea.
If that's the case, someone needs to either leak the spec or reverse engineer the driver. One or the other will happen eventually.
Taking a bit of a step back from your question, this Hackaday article on why DisplayPort is so much better than HDMI from a technical perspective is great.
If you want big resolutions and big refresh rates. For example I have a 4K screen with 144 Hz refresh rate. That sort of bandwidth can only be delivered right now via HDMI 2.1.
Yes there are DisplayPort standards that can reach that, but in practice all the top end GPUs have only the old DisplayPort 1.4 version. That means the only choices are to use HDMI 2.1 or to use video compression over DisplayPort 1.4.
I will say that VESA has been very successful with their Display Stream Compression (DSC). Not that the algorithm is something special, but in terms of marketing: you never see monitor reviewers talk about the fact that using DSC will give you artifacts. VESA markets this compression as visually lossless, which every non-expert interprets as actually lossless. In reality, losslessness is defined in the standard as the point where observers fail to correctly identify the reference image in more than 75% of trials. Beyond that, like with any compression, it will fail at high-entropy images. Take for example this 2017 study, titled Large Scale Subjective Evaluation of Display Stream Compression [1], which found that performance on some images was not visually lossless; however, those images were challenging images with high entropy.
That's great, but have you actually seen those artifacts ever?
(DSC also has a great property of making the image delivery significantly more resilient to cable issues and thus much more reliable for most people using high res monitors.)
> That's great, but have you actually seen those artifacts ever?
No, I use HDMI 2.1 without DSC.
Otherwise though, yes I have very good vision and work with images professionally. I do spot artifacts when they are there.
I get it that not everyone needs perfect images. I mean, people watching YouTube certainly won't be able to tell if there is an additional artifact on top of the low bitrate video they're watching.
DSC also has the great property of causing my RTX 3060 to crash if I try to watch videos on an external monitor and do the wrong thing on my main display. This issue has gone unresolved for over a year now. HDMI, DSC and NVIDIA can go eat a big bag of dicks.
Nope, affects several different monitors. It's also not flickering, the GPU driver will crash and stop sending output to the external monitor. At times it's even hung my entire system.
Most displays that are marketed as a "TV" instead of a "monitor" are exclusively HDMI. This is why many TVs from ~2020 don't have any 4K@120 input, despite actually displaying at 4K@120 (from the internal GPU that does frame interpolation), because HDMI 2.0 is limited to 4K@60.
There are many more displays that support HDMI. It is a simpler protocol. DisplayPort is better and more modern; it's packetised, so the timing is not fixed by the source (a bit like UDP). HDMI is fixed timing and used to be much simpler hardware.
> HDMI is fixed timing and used to be much simpler hardware.
What do you mean by 'fixed timing'? The fact that the transmission data rate is proportional to the pixel clock?
We're talking HDMI 2.1 here, which uses FRL (fixed rate link) and thus has the link rate decoupled from the pixel clock, just like DP, with data split into packets. There's not a lot of difference between DP and HDMI in terms of functionality and complexity.
It is true that HDMI 2.1 requires more logic, but one of the annoying parts of the earlier HDMI versions is that a sink needs a PLL that covers a large, continuous frequency range. DP and HDMI FRL only need to support a few fixed frequencies, which is much easier for analog designers to design for.
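A concrete illustration of that clocking difference: under TMDS the per-lane bit rate is 10x the pixel clock, so the sink PLL has to lock anywhere across a wide continuous range, while FRL and DP only ever run at a handful of fixed rates (the pixel clocks below are the standard CTA values).

```python
# TMDS (HDMI up to 2.0): per-lane bit rate = 10 x pixel clock, so the
# sink PLL must cover a wide continuous range.
# FRL (HDMI 2.1) and DP: only a few fixed link rates.

tmds_modes_mhz = {        # standard CTA pixel clocks
    "720p60":  74.25,
    "1080p60": 148.5,
    "4K30":    297.0,
    "4K60":    594.0,
}
for name, pclk in tmds_modes_mhz.items():
    print(f"TMDS {name}: {pclk * 10 / 1000:.3f} Gbit/s per lane")

print("FRL fixed rates (Gbit/s per lane):", [3, 6, 8, 10, 12])
print("DP  fixed rates (Gbit/s per lane):", [1.62, 2.7, 5.4, 8.1])
```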
Clicking through to the product page it claims that VRR/Gsync is not supported with that adapter unfortunately.
It is also a USB-C to HDMI adapter, not a pure DP adapter. I think only the 20-series of Nvidia GPUs and maybe some of the 5000-series AMD GPUs have DP capable USB-C outputs. And apparently the M1/M2 Macs that the first article is about.
CableMatters is a good brand so if anyone could do it it'd be them, but it appears it's not quite there yet. Maybe when DP 2.0 becomes more common on GPUs it'll be a solved issue.
> I think only the 20-series of Nvidia GPUs and maybe some of the 5000-series AMD GPUs have DP capable USB-C outputs.
Based on my experience, this doesn't matter. I have a video card with no USB-C outputs. I go from full-sized DisplayPort to a female <-> female DisplayPort coupler, to this [0] bidirectional DP <-> USB-C cable, which plugs into my monitor. It works great and even does 4K 60FPS uncompressed HDR no problem.
I see no reason why you'd be unable to then slap on a female <-> female USB-C coupler and then that USB-C -> HDMI adapter.
(There's also no reason you couldn't cut out the first DP cable and coupler and plug the DP <-> USB-C cable directly into the video card. I just have very long (fiber-optic) DP cables that the second cable plugs into.)
A 65-inch OLED isn't going to have DisplayPort input. Most laptops don't have DisplayPort output, just HDMI or USB-C, which may or may not support DP Alt Mode, and even then that caps at 40 Gbps.
In some laptops the HDMI port is managed by the integrated GPU and the DisplayPort by the discrete GPU, so ultimately it affects which GPU you're using.
Yes, and they achieve that by routing the data through the integrated GPU.
I had a problem with Windows and AMD drivers where in this configuration the integrated GPU would run full throttle despite not doing anything serious, making the system run hot.
I "solved" it by using the DisplayPort - 10°C difference on the CPU and, more importantly, no throttling.
I am writing this in Linux, while looking at a Dell monitor connected through DisplayPort, and my loudspeakers are connected to an audio output of the monitor.
Can AMD fork HDMI, add in driver support, and call it their own standard that just so happens to be compatible with HDMI?
I'm sure they wouldn't make such a move lightly, but is there something fundamental here about the laws or specifications that prevents them from doing this?
I think they could, simply by IP laws, but they have a contract with the HDMI Forum that probably doesn't allow that. They could cancel it (and reverse-engineer HDMI instead of going by the NDA'd spec), but I don't think they'd do that just for Linux support.
Hopefully someone leaks the spec. Maybe it's already available?
Note that the latest VESA specs are also restricted.
FWIW, it's easily possible to sniff the control channel of an HDMI or DP connection. At that point one could attempt to reverse engineer the enabling features.
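As a trivial starting point on Linux, the DDC/EDID side of the control channel is already exposed by the kernel, no sniffing hardware required. A small sketch (the connector path is an example and varies per machine; list /sys/class/drm/ to find yours):

```python
# Read and minimally decode the EDID the kernel already fetched over the
# display's DDC channel.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")   # example path

edid = EDID_PATH.read_bytes()
assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not a valid EDID block"

# Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
word = int.from_bytes(edid[8:10], "big")
mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
print("manufacturer:", mfg)
print("EDID version:", f"{edid[18]}.{edid[19]}")
```

That only covers capability discovery, of course; the HDCP handshake and the protected content itself are a different matter entirely.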
Acknowledging these documents can taint your mind for life if you're not careful.
In the emulator scene, there is a set of documents that describe, at deep and intimate levels, the inner workings of the N64, released by a disgruntled SGI employee somewhere on USENET. It is common knowledge that reading those documents taints you, barring you from working on basically any graphics- or game-related source for the rest of your life.
This doesn't just apply to leaked things. There are people who've worked on the Deep Parts of Windows, macOS, etc. who are basically barred from making contributions to certain open source projects (e.g. Wine, Asahi Linux), as anything they do would likely involve secrets tainted with knowledge from their former employer.
Every graphics, emulator, game engine, and embedded guru on the planet has watched the Gigaleaks out of Nintendo with caution, as they now have to be VERY careful where some things come from. If someone reads code from The Gigaleak and then contributes code to an emulator, the emulator may be tainted.
This came to a head when the PowerVR SGX drivers were leaked ( https://www.phoronix.com/news/MTg0NTQ ) and several developers' eyes were burned as a result.
I understand the "clean room" reason behind avoiding tainting one's mind, but in court how do people distinguish plagiarism from "convergent evolution"?
Only the truly stupid would do this stuff under their real identities. Then again, a lot of OSS contribution is for ego anyway.
> This came to a head when the PowerVR SGX drivers were leaked
...and some far-East modding communities managed to make unofficial Windows 9x and XP drivers using that. They will of course not tell you who they really are.
Does anyone know what the HDMI forum want to achieve by locking down the spec like this? I can't remember any reasons given back when they decided to restrict publication of future spec updates.
Microsoft is as pro Linux as the average web company values your privacy. They are pro using Linux where it is profitable. They are not pro Linux getting ahead wherever they can prevent it.
At best: Get kicked out of the forum and can never support HDMI products going forward.
At worst: Get sued for breaking the NDA, lose, pay a metric fuck ton in damages, get kicked out of the forum and never be able to support HDMI products going forward.
If _someone_ released an unofficial patch for the official OSS driver, it might get by. It might get "DMCA'ed", though the DMCA might not be the correct takedown method for patent violations (I'm presuming this _someone_ released only their own code and nothing from AMD, so AMD wouldn't have a copyright claim over the code itself), but hey, that's never stopped companies misusing the DMCA in the past! To file a valid counter-notice to a DMCA, that _someone_ would have to give details which the Forum could then use to sue the publisher of the patch.
But yeah even if the patch some how was made public, and it wasn't nuked out of orbit, ongoing support and bug fixes would be a pain in the ass. (Because as an example, no one from AMD would be allowed to touch the "patch code")
<edit>
If AMD's planned patch were leaked, for example: since AMD had not officially released it, it's not yet "open source" and therefore not yet public, and I'm sure there will be a clause in the terms stating that AMD would have to go on the offensive to get their code removed from any public repos.
</edit>
When it comes to lawfare, The Forum wouldn't even have to be in the right (in a legal sense), just have a big enough war chest to make everyone else's life a pain in the ass!
Someone else in the comments has suggested that maybe an entity or someone in a country like France, where copyright laws are less strict, might be able to write and maintain the patch without fear of prosecution, but as you said, those guys may somehow still make life a pain in the ass :(
So what if you can't host the patch on GitHub or another corporate platform? You can't (yet) successfully get things off the entire Internet, DMCA or not. And I doubt the HDMI Forum would put in too much effort to stop this, considering the various unlicensed dongles that are available for sale.
"HDMI" is a trademark. You can't claim to offer HDMI in any of your products.
The patent pools around approximately all the codecs used for media delivery are heavily cross-licensed. That includes HDMI and HDCP, but also h.264 and h.265. Most likely AMD can't legally use hardware decoding or encoding of any popular codecs at that point. Good luck with game video, streaming, or playing media discs.
So it would cost AMD - for example - their entire PlayStation/XBox business. At a minimum.
> Serious question, what if they just ignore the Forum and do it anyways?
Probably makes it tricky to get HDMI forum's blessing for any future devices.
AFAIK, the HDMI standards are public-ish, so anybody can create a device that is HDMI compatible, but you're only allowed to put "HDMI" on the packaging / marketing material with the Forum's blessing.
Just call it a 19-pin display connector in that case. Or only put in dual-mode DisplayPort connectors (DP++) and let people get passive adapters (maybe bundle them for a bit; you can probably label the passive adapters as HDMI).
> Video Out port: DVI-D signal in 640х480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations :shush: The first letter is H, and the last one is I.
This is pretty much an actual strategy. The academic/crowdfunded ULX3S FPGA development board went with calling it "GPDI" (general-purpose digital interface). Other strategies include labeling the connector with a little TV/monitor icon, calling it "digital video", or just having a little diagram in the marketing/documentation material with it connected to a TV with a vaguely HDMI-shaped cable.
It's funny how baby mentality drives these companies. Really, writing "HDMI" is an issue? But if I write "HDMl", then that's fine? Come on. Find a real way to generate revenue rather than wasting everyone's time.
They don't even need this to generate revenue. The HDMI forum is mostly companies making TVs and other HDMI devices. You'd think they want as many compatible sources as possible.
To most consumers, DisplayPort is just "that thing that won't fit into my HDMI port for some reason." DP is royalty-free, but it seems like HDMI is cheaper to implement in the end because it's on every low-end device.
I'm sure people said similar things when indoor bathrooms were a new thing, but there's always some luddite who says "we never needed those things before!"
You are the one making bad-faith arguments by claiming that there is a customer choice here, when you know full well that monitors with similar specs are more expensive and there aren't even any comparable options at the high end.
I recognize the choice isn't palatable to the likes of many, but ultimately there is a choice being made. The choice of convenience and entertainment over principles of privacy and ownership.
All I'm saying is that there are still choices, they aren't perfect.
I searched for "Unicorn" and also found a bunch. Doesn't mean I can actually get one not to mention one that is even remotely comparable to the HDMI TV I'd be replacing.
Are there? This seems to be one of those illusion-of-choice industries where the different brands are just relabelings of the same factory reference designs.
Seems like you could have a separate shell organization that built this functionality by looking at existing drivers, and just provide the patch to the kernel.
I'm wondering if this is not something that could go against antitrust laws, at least in Europe.
Here we see proof that a private consortium is forcing one of its members to restrict its innovation, to keep competitors of the HDMI Forum's members from having access to its technology...
Take their closed-source driver, run it through a decompiler and get an LLM to "understand" it, then write a "cleanroom" implementation? Might be one of the few good things this new AI stuff is good for.
I find this type of blatant anti-consumer and anti-competitive behavior so frustrating.
From my understanding, the spirit of these industry consortiums/forums is to foster industry-wide collaboration and innovation while still protecting the IP and patents of the originating companies.
But as usual, even when the companies are well served, consumers don't have a say in those decisions...
I wish the FTC would investigate the economic impact of these types of decisions on consumers and the market in general.
The funny thing is that this exists: FreeSync over HDMI works via precisely this mechanism (sending a DP signal over HDMI), since it predates the official introduction of HDMI Org VRR, lol.
I think you are fabulously optimistic if you think that Linux support will even make a noticeable difference to HDMI adoption. Even other computer uses are likely an afterthought to the HDMI consortium.
The attitude, however, towards someone actively working for free to make your product better and more widely used is what makes me say this.
Actively working against someone trying to make your stuff work in more places and give more people the opportunity to use your product is nothing if not a clear indication that you wish for your product to die.
Especially when you're literally spending more effort working against your own stuff..