HDMI vs DisplayPort vs USB C Explained – HDMI 2.2, DisplayPort 2.1, DisplayPort ALT Mode
What cable to use keeps getting more and more complicated, so let me do my best to break it down so you know what to use, what to look for, and what is a load of crap you should steer clear of. Sound good? Awesome, let’s dive in. Let me start with the arguably simpler option, DisplayPort, so we can work our way up to the utter nonsense that is HDMI.

DisplayPort’s version numbering is pretty clear, with most displays that have one being either 1.4 or, on newer models, 2.0, with 2.1 becoming more popular on brand-new panels too. The connector has stayed the same between versions, meaning you absolutely can connect a DisplayPort 2.1 rated cable to a 1.4 port on a PC and a 1.2 port on an old monitor with no problems. DisplayPort is also backwards compatible, so connecting a new 2.0 or 2.1 monitor to an old 1.2 GPU will still work; it just won’t be able to do everything the monitor needs to give you the full resolution, colour or refresh rate. The different versions, then, are really about what goes on in the cable – how fast each end can talk to the other – rather than adding more pins. The big difference between DisplayPort 1.4 and 2.0, 2.1 and 2.1a is really just what speed the connection can carry. DisplayPort uses four lanes (four pairs of wires), each of which can run at up to 8.1 Gb/s on 1.4, or up to a whopping 20 Gb/s on 2.1, meaning up to 32.4 Gb/s of maximum bandwidth on a 1.4 port, but up to 80 Gb/s on a 2.1 port! That’s pretty cool!
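The lanes-times-speed arithmetic is simple enough to sketch in a few lines of Python. The per-lane rates below are the published link-rate figures; the dictionary and its labels are just my own way of laying them out:

```python
# Sketch: DisplayPort total raw bandwidth = number of lanes x per-lane link rate.
# Link-rate names (HBR3, UHBR10, etc.) are from the spec; this layout is my own.
LANES = 4  # DisplayPort always uses four main-link lanes

per_lane_gbps = {
    "HBR3 (DP 1.4)": 8.1,
    "UHBR10 (DP 2.1)": 10.0,
    "UHBR13.5 (DP 2.1)": 13.5,
    "UHBR20 (DP 2.1)": 20.0,
}

for name, rate in per_lane_gbps.items():
    print(f"{name}: {LANES * rate:.1f} Gb/s total")  # e.g. HBR3 -> 32.4 Gb/s
```

Note these are raw link rates; the usable data rate is a little lower once link encoding overhead is subtracted (more so on 1.4’s older encoding than on 2.1’s).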
That extra bandwidth is what allows most of the other differences – for example, DisplayPort 1.4 can run a 4K display at 120Hz, but flat out can’t run an 8K display at 120Hz. DisplayPort 2.1 running at the full UHBR20 (Ultra High Bit Rate) speed can do everything short of 8K 120Hz and up natively, without any compression. If you do add in compression, specifically a feature called DSC, or Display Stream Compression, you can do 8K at up to 240Hz! There are some extra differences, like how an extra twisted pair of wires in a DisplayPort cable acts as an auxiliary (AUX) channel for bidirectional data, carrying things like the monitor’s EDID information. DisplayPort supports audio too, although it isn’t often used. Because DisplayPort is a royalty-free standard, it is common to find more of its ports on, say, your graphics card, and because it doesn’t change as often, it’s normally more than enough for any monitor’s requirements, so the only important thing to do when buying a new monitor is check that your graphics card supports the same (or higher) DisplayPort version as the monitor you want. DisplayPort 2.1 is pretty new, so on the NVIDIA side only the 50 series GPUs come with it, although AMD was ahead of the curve, as the 7000 series and newer have it. Everything older (back to the RTX 20 series) is DisplayPort 1.4, which for anything short of 8K is still perfectly fine. Generally speaking, DisplayPort cables are pretty tolerant, but if you do want the best stuff, look for UHBR20 branding and VESA certification. UHBR10 is also plenty for basically anything short of a high refresh rate 8K display, but if you need that, you can shell out for a UHBR20 cable no problem!
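If you want to sanity-check which modes fit, a rough back-of-the-envelope calculation works: pixels × refresh rate × bits per pixel. This sketch (the helper function is my own) deliberately ignores blanking intervals and link encoding overhead, so real-world requirements run a few percent higher:

```python
# Rough sketch of uncompressed video bandwidth. Ignores blanking intervals and
# link encoding, so treat the results as lower bounds, not exact figures.
def data_rate_gbps(width, height, refresh_hz, bits_per_channel=8):
    bits_per_pixel = bits_per_channel * 3  # one channel each for R, G, B
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K 120Hz  8-bit: {data_rate_gbps(3840, 2160, 120):.1f} Gb/s")      # ~23.9
print(f"8K  60Hz 10-bit: {data_rate_gbps(7680, 4320, 60, 10):.1f} Gb/s")   # ~59.7
print(f"8K 120Hz 10-bit: {data_rate_gbps(7680, 4320, 120, 10):.1f} Gb/s")  # ~119.4
```

Even this optimistic estimate shows why 8K 120Hz (around 120 Gb/s uncompressed) blows past UHBR20’s 80 Gb/s link and needs DSC, while 4K 120Hz squeaks through a DisplayPort 1.4 connection.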
USB C, on the other hand, is messier. A USB C cable can carry just power; power plus USB 2 data (two wires); USB 3 data (six wires); USB 3.2 Gen 1, Gen 2 or Gen 2×2 with even more data; or Thunderbolt, or USB4… Yeah, it’s a mess. Finding a cable that actually supports a display signal isn’t as easy as you’d like, but looking for a USB 3.2 rated cable is a good start. USB C can carry a DisplayPort signal over its pairs of wires, known as DisplayPort Alt Mode. This basically lets a USB C cable act as a one-cable solution: the monitor charges your source device (say, a laptop or tablet), the device supplies the video signal, and the monitor can also act as a USB 3 hub, so with one cable you get power, video out, and data in. Amazing! A number of office and gaming monitors have USB C inputs which let you use the display as a KVM (keyboard, video and mouse) switch: you connect your peripherals to the monitor’s USB hub, and when you select the USB C input, the monitor switches your peripherals to the USB C port; when you switch to DisplayPort or HDMI, it switches the USB hub to connect via the USB B port instead.
HDMI, though, is just the worst. HDMI is arguably the most common display connector around, with even gaming monitors generally coming with two HDMI ports but just one DisplayPort. That is because it is basically the standard display connector – consoles, TV boxes, TVs, monitors, PCs, laptops, projectors… everything that takes a display signal – or gives one – almost always has an HDMI port, for better or for worse. HDMI has been around for a long while – since 2002 for version 1.0. HDMI’s versions, namely 1.4, 2.0, 2.1 and now 2.2, are somewhat similar to DisplayPort in that they upgrade the bandwidth each time, but HDMI also has a bunch of feature differences between the versions, so let’s dive head first into the utter insanity that is HDMI 2.0 and 2.1. HDMI 2.0 was the most common HDMI version you’d find on devices until very recently, offering up to 18 Gb/s of total bandwidth (barely more than half what DisplayPort 1.4 offers before DSC!), although it was essentially renamed to HDMI 2.1 in late 2021. This massive failure of branding means you now need in-depth knowledge of the differences between what was an HDMI 2.0 port (which, for example, can run a 4K display at 60Hz with full 8-bit colour, or 120Hz only with chroma subsampling) and a ‘full spec’ HDMI 2.1 port (which can do 4K 120Hz natively, or 240Hz with DSC) just to get your stuff working together. This is so incredibly stupid.
Anyway, HDMI 2.0 is fine for up to 4K 60Hz, 1440p 144Hz, or 100Hz for a 1440p ultrawide. A true (full bandwidth) HDMI 2.1 port can do 48 Gb/s, which means most display configurations will once again fit through an HDMI cable. At least when comparing 2.1 to 2.1 (a truly stupid yet necessary statement), beyond the “full” bandwidth of 48 Gb/s (rather than the 24 Gb/s some ports offer, or the 2.0 standard of 18 Gb/s), other features to look out for are VRR (variable refresh rate), ALLM (auto low-latency mode), DSC (display stream compression), QFT (quick frame transport) and perhaps eARC (enhanced audio return channel) and QMS (quick media switching). Unlike DisplayPort, where the feature set comes with the version, HDMI Licensing has decided to make their standard less of a standard and more of a confusing mess by making all of these features optional. Don’t you just love a non-standard? Anyway, I’m happy to report that these qualms may be coming to a (slow, gradual) end, as HDMI 2.2 is coming, and it offers double the bandwidth again at 96 Gb/s, and in theory should make things a little simpler.
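You can extend the earlier back-of-the-envelope maths to check whether a mode squeezes through a given HDMI tier. The link figures below are the nominal maximums mentioned above; the flat ~10% allowance for blanking and encoding overhead is my own rough assumption, not a spec value:

```python
# Sketch: does a given mode fit a given HDMI link? Nominal link rates from the
# text above; the ~10% overhead allowance is a rough assumption of mine.
hdmi_gbps = {"HDMI 2.0": 18, "HDMI 2.1 (full FRL)": 48, "HDMI 2.2": 96}

def fits(width, height, refresh_hz, link_gbps, bits_per_channel=8):
    raw = width * height * refresh_hz * bits_per_channel * 3 / 1e9
    return raw * 1.1 <= link_gbps  # pad ~10% for blanking/encoding overhead

for name, gbps in hdmi_gbps.items():
    print(f"4K 120Hz 8-bit on {name}: {fits(3840, 2160, 120, gbps)}")
```

Run it and 4K 120Hz comes back False for HDMI 2.0 but True for full-spec 2.1 and 2.2, which is exactly the 2.0-vs-2.1 trap described above.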
Picking a suitable HDMI cable is a mess too, although the new “Ultra96” branding is meant to bring some clarity, helping ensure new cables meet the full bandwidth requirement, and therefore the full feature set. With HDMI you really need to be careful and look at both your display source AND your display to make sure the two match. I just experienced this with my sim rig build and the wonderful Philips EVNIA 6500, which has two HDMI 2.0 ports; I had to check the product page to find out that if I used an HDMI cable here, I’d get just 100Hz, instead of the 175Hz DisplayPort can do. For some monitors you absolutely have to check whether they support features like VRR and ALLM (especially if you want to game on a console) before buying. Just reading “HDMI 2.1” isn’t enough – it needs to advertise those features and the correct refresh rate too. This is why DisplayPort is (and always has been) my preference. You don’t need to think about it. It just works. Screw HDMI.
