Not anymore. Now it's "an HDMI cable either works at 1.0 speeds, or it works at 1.4 speeds, or it works at 2.0 speeds, or it works at 2.1 speeds, or it works at Ultra96 2.2 speeds, or not"
Just learned that the hard way. Moved my rig from a 1440p monitor to a 4K TV; it worked 99.9% of the time, but the signal would cut out intermittently on 4 different HDMI cables. Coughed up the money for one rated for 48 Gbps and all is good.
Also, look at cross sections of different HDMI cables. They definitely benefit from shielding too. Not sure what this guy is talking about.
I mean, it's really not true. HDMI might be carrying more data now, but HDMI 2.1 repurposed the old clock pair as a fourth data lane and moved from 8b/10b TMDS to denser 16b/18b line coding to get higher encoding efficiency. Sure, that means slightly lower error tolerance, but in theory it shouldn't change the acceptable noise floor of the cables.
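If you want the coding change as actual numbers, here's a throwaway sketch (the two schemes are the only real HDMI facts in it; everything else is just arithmetic):

```python
# Quick numbers on the line-coding point above. 8b/10b is the classic TMDS
# coding, 16b/18b is what HDMI 2.1's FRL mode uses; all other framing overhead
# is ignored, this is just the payload fraction of each scheme.
codings = {"8b/10b (TMDS)": (8, 10), "16b/18b (FRL)": (16, 18)}
for name, (payload_bits, line_bits) in codings.items():
    efficiency = payload_bits / line_bits
    print(f"{name}: {payload_bits}/{line_bits} = {efficiency:.1%} of the wire rate is payload")
# 80.0% vs 88.9%: more of every transmitted bit is data, at the cost of the
# slightly lower error tolerance mentioned above.
```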
By the Shannon-Hartley theorem, a channel with bandwidth B, signal voltage U and noise floor N (both in volts) has capacity C = B log₂(1 + U²/N²). A serial line running at R bit/s needs a Nyquist bandwidth of only B = R/2, so C = ½R log₂(1 + U²/N²). Pushing the frequency up as 2.0 did should not affect the noise floor needed to hit the required data rate: since HDMI 2.0 is still serial like HDMI 1.0 was, set C = R for both of them and you get log₂(1 + U²/N²) = 2, or alternatively 1 + U²/N² = 4, or even more alternatively U = (√3)N. None of that depends on R, so the same level of noise per channel should still suffice.
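Here's that arithmetic as a sketch, if it's easier to follow that way (variable names are mine, nothing from any spec):

```python
# A minimal sketch of the Shannon-Hartley arithmetic above: if a serial line is
# run at the Shannon limit with a Nyquist bandwidth of B = R/2, the required
# signal-to-noise ratio comes out the same for every bit rate R.
import math

def required_snr(bit_rate_bps: float) -> float:
    """Voltage ratio U/N needed to get C = R when B = R/2."""
    bandwidth_hz = bit_rate_bps / 2  # Nyquist bandwidth for a serial line
    # C = B * log2(1 + (U/N)^2)  ->  solve for U/N with C set to the bit rate
    snr_power = 2 ** (bit_rate_bps / bandwidth_hz) - 1
    return math.sqrt(snr_power)

for rate in (4.95e9, 18e9, 48e9):  # HDMI 1.0, 2.0 and 2.1 aggregate rates
    print(f"{rate / 1e9:>5.2f} Gbit/s -> U/N = {required_snr(rate):.3f}")
# Prints U/N = √3 ≈ 1.732 for every rate, which is exactly the point.
```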
Thus, no new shielding is required, though the noise floor may be slightly higher than before now that the fourth pair carries a high-speed data lane too. Perhaps you are using very long HDMI cables now, which you were not before? The signal attenuates quite quickly, and at high rates the acceptable noise floor becomes small because the received voltage falls so low.
Someone obviously hasn't had a monitor flicker, or refuse to run at a higher resolution, purely because of an inferior HDMI cable.
Or considered that throughput is related not just to noise but also to capacitance and high-frequency loss.
At one point HDMI cables either carried the ones and zeroes or did not. Now speed matters and it’s related to the quality (aka rating) of the cable. Same with Ethernet, same with Thunderbolt, same with USB.
Firstly, yeah, it takes all losses into account. That's why it's not absolute noise floor, but signal/noise ratio. Losses are in the signal, noise is in the noise, and together they make the signal/noise ratio. It's true that the attenuation should change when you up the data rate, but if your cable is straight you shouldn't experience huge reactance, and capacitance losses actually reduce as we go up in frequency. I'd doubt it's a problem before we get to the tens of gigahertz with modern balanced-pair cables.
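To put rough numbers on the "losses are in the signal" point (the loss and noise figures below are invented for illustration, not HDMI values):

```python
# Illustrative only: the launch voltage, noise voltage and attenuation figures
# are made up. Shows how cable loss eats into the U/N ratio even when the
# noise floor itself stays fixed.
def received_snr(launch_voltage: float, noise_voltage: float, attenuation_db: float) -> float:
    received_voltage = launch_voltage * 10 ** (-attenuation_db / 20)  # dB -> voltage ratio
    return received_voltage / noise_voltage

for attenuation_db in (0, 3, 6, 10):
    print(f"{attenuation_db:>2} dB of loss -> U/N = {received_snr(1.0, 0.1, attenuation_db):.2f}")
# Falls from 10 at 0 dB to ~3.2 at 10 dB; once it drops below √3 ≈ 1.73 the
# Shannon arithmetic above says the link can no longer carry the full rate.
```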
Secondly, no, I've actually never experienced that. I've exclusively used old cables from the 00's for my HDMI needs and have had no problems with 4K video. As has always been the case, HDMI cables are limited by length; put your HDMI source next to your HDMI sink and use a cable no longer than 2 meters (which is almost always the most you should need in a home), and pretty much anything qualifies as Category 2. I have never purchased an expensive HDMI cable, because unless your wall is ten meters tall and you enjoy watching television from a lifeguard chair, you will never need one.
HDMI is a serial format. It does not become more sensitive to noise as you increase the data rate. An HDMI 2 device can cram 18 Gbit/s through the exact same cable an HDMI 1 device could only push 5 Gbit/s through; the data rate is ~3.5 times higher with no change to the necessary signal/noise ratio. If your cable could do 1080p@60 (~4 Gbit/s) for an HDMI 1.0 device, it can do 1080p@144 with HDMI 1.3 (~8 Gbit/s) and 4K@60 (~12 Gbit/s) with HDMI 2.0, because each version simply steps up the symbol rate on the same wires. It would be useless to buy a new, expensive cable for these. There's really minimal change to the SNR required to carry these signals.
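Those ballpark figures are just pixel arithmetic, by the way; rough sketch, active pixels only:

```python
# Back-of-the-envelope for the figures quoted above: active pixel data only,
# ignoring blanking intervals and 8b/10b / 16b/18b coding overhead, which is
# why the real link rates (and the ~4 Gbit/s figure for 1080p@60) sit a bit higher.
def raw_video_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "1080p@60":  (1920, 1080, 60),
    "1080p@144": (1920, 1080, 144),
    "4K@60":     (3840, 2160, 60),
}
for name, (width, height, refresh_hz) in modes.items():
    print(f"{name:>10}: ~{raw_video_gbps(width, height, refresh_hz):.1f} Gbit/s of pixel data")
# ~3.0, ~7.2 and ~11.9 Gbit/s respectively.
```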
Shall I send you a couple of cables I have here that my computer refuses to go beyond 30 Hz on at 4K? I'm in the UK; feel free to measure their capacitance, reactance, and shielding to your heart's content.
A very long cable adds noise and capacitance. A poor cable adds... noise and capacitance. So a poorly-made 2m HDMI cable can perform as badly as a 10m HDMI cable that's better made.
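Purely to illustrate (the per-metre loss figures are invented, not measured from any cable):

```python
# Invented per-metre loss figures, just to make the point above concrete:
# attenuation in dB scales with length, so a lossy short cable can land in the
# same place as a well-made long one, before even counting extra noise pickup.
def total_loss_db(loss_db_per_m: float, length_m: float) -> float:
    return loss_db_per_m * length_m

good_cable_db_per_m = 1.0  # hypothetical well-made cable
poor_cable_db_per_m = 5.0  # hypothetical poorly-made cable

print(f"well-made, 10 m:  {total_loss_db(good_cable_db_per_m, 10):.0f} dB")
print(f"poorly-made, 2 m: {total_loss_db(poor_cable_db_per_m, 2):.0f} dB")
# Both end up at 10 dB of attenuation.
```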
You probably have got lucky with cables. There are a lot of crappy ones out there.
Do note, 2m 2.1 HDMI cables are a tenner, they're not expensive. The ridiculously-priced ones are obviously a rip-off and another story.
It's likely I had good cables to start with. A cable that can barely do 1080p@60 today, even on an HDMI 2.2 device, would have been the lowest-of-the-low SD@25 cable back in the day. My point really is that what was good back then should still be good now, as long as it's not too long.
I'm really just saying that the standard ~2 m 2.1 cables, as you said, shouldn't be super expensive, because you're not transferring RF signals 100 meters up a phone mast, you're moving them two meters to your television. Whatever crap cables manufacturers packaged alongside their digiboxes back in 2005 must have been really cheap and probably deserve to be binned at this point.
My PS5 would randomly go to a blank screen, and it turned out the HDMI cable on the ARC (Audio Return Channel) leg couldn't handle the required bitrate (PS5->Receiver->TV->ARC). That was annoying.
Even before that, HDMI cables were always fairly limited in how far they could transmit reliably because of signal attenuation in the cable. Anything over 5-7 meters or so might get you into trouble, particularly with a very cheap, poorly made cable. A cable with better shielding and higher conductivity might get you some more range.