HDMI cable purchasing is about to get a whole lot more complicated

Posted:
in General Discussion
Source-based tone mapping will be introduced with HDMI 2.1a at the 2022 Consumer Electronics Show, but between that and the newly redefined HDMI 2.1 spec, buying cables won't get any easier.

HDMI 2.1a won't make it any easier to buy a cable


The HDMI Forum has announced that HDMI 2.1a will be introduced during CES 2022. The new spec replaces the HDMI 2.1 spec introduced in 2017, but little will change for customers.

According to The Verge, HDMI 2.1a adds support for source-based tone mapping to devices.

The addition means that an Apple TV or PlayStation 5 will be able to perform HDR tone mapping before sending the data to the television. Offloading that processing to the source device could mean reduced lag times, improved picture calibration, and better mixed-content mapping.
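
To illustrate what that tone mapping involves, the sketch below compresses HDR luminance into a display's reported peak brightness using a generic Reinhard-style curve. It is only an illustration of the concept; it is not the actual algorithm HDMI 2.1a defines, and the brightness figures are hypothetical.

    # Hypothetical sketch: what "tone mapping on the source device" means in
    # practice. Generic Reinhard-style curve for illustration only; this is
    # not the SBTM algorithm defined by HDMI 2.1a.

    def tone_map(luminance_nits: float,
                 display_peak_nits: float = 600.0,
                 content_peak_nits: float = 4000.0) -> float:
        """Compress HDR content luminance into the display's usable range."""
        x = luminance_nits / content_peak_nits            # normalize to 0..1
        mapped = x / (1.0 + x)                            # simple Reinhard curve
        peak_mapped = 1.0 / (1.0 + 1.0)                   # curve value at content peak
        return display_peak_nits * (mapped / peak_mapped) # rescale to display peak

    # A 1,000-nit highlight is compressed rather than clipped on a 600-nit TV.
    print(round(tone_map(1000.0), 1))  # ~240.0 nits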

The HDMI Forum says this update could be added to existing devices via a firmware update, though hardware manufacturers may wait to introduce the spec in new products.

The HDMI spec isn't the easiest to understand. The HDMI Forum states that manufacturers are not required to adopt new features in order to label products with the latest specs. Earlier in December, the HDMI Forum eliminated the HDMI 2.0 spec entirely and folded all HDMI 2.0 devices into HDMI 2.1 as a subset, despite HDMI 2.0 devices like the MacBook Pro not supporting some HDMI 2.1 features.

Manufacturers are asked to clearly label which features a product takes advantage of, but that labeling isn't always clear. Marketing names can often obfuscate the actual capabilities of a cable or port, and identification labels aren't always used.

Customers will have to remain vigilant to ensure they are getting the right HDMI features when purchasing a TV, monitor, or cable. CES takes place from January 5 through January 8, and AppleInsider will be covering the show as it progresses.

Read on AppleInsider

Comments

  • Reply 1 of 37
    Yet another convoluted mess. Many of the cables aren't even marked. So if you have them stuffed into a bag there's no way to know which ones are worth keeping.
  • Reply 2 of 37
    It's easy. Just read the clearly labeled spec version that's printed on the HDMI plugs ... not.
    The HDMI Forum should replace the people who make decisions. They are useless.
  • Reply 3 of 37
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
  • Reply 4 of 37
    lkrupp Posts: 10,557 member
    The Apple Discussion Forums (Apple TV Hardware section) are chock-full of questions and issues about getting Dolby Vision and Dolby Atmos working with their Apple TV 4K, AVR gear, sound bars, and TVs. Lots of arguing over HDMI cables (48Gbps certified, 18Gbps certified, etc.), AVR and TV settings to enable things, and misinformation about HDR, HDR10, HDR10+, and Dolby Vision. It usually boils down to blaming the Apple TV for all their troubles. I mean, how could Sony, Denon, Samsung, Marantz, Yamaha, or TCL possibly be the cause? Third-party hardware/firmware is always assumed to be perfect by those with issues.

    The HDMI cable is the root cause of issues most of the time, with the length, the speed, or the version being the culprit. And of course Samsung TVs don't support Dolby Vision but instead hawk Samsung's proprietary HDR10+ format. Talk about a labyrinth.


    edited December 2021
  • Reply 5 of 37
    Because of course they had to make it “simpler”…

     :s 

    …and they wonder why people get confused.
    edited December 2021
  • Reply 6 of 37
    Xed Posts: 2,543 member
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
    👍 That's an interesting idea. Not just HDMI but how other cables are rated, too. I have USB-C-to-USB-C cables that are a mess: 60W vs. 100W PD, PD vs. data.
  • Reply 7 of 37
    Xed Posts: 2,543 member
    Yet another convoluted mess. Many of the cables aren't even marked. So if you have them stuffed into a bag there's no way to know which ones are worth keeping.
    I haven't done it to all my cables (yet), but I have been using the label maker below to create labels for many of my cables. The iOS app still leaves a lot to be desired, but it works well enough, has countless options for configuring the output, and has many built-in templates for use cases like cable management.

    I also use it to put labels on PSUs so I know what product they're for and what their output is. Apple's PSUs are easy to remember, but most other products' are hard to read or generic.

    https://www.brother-usa.com/products/ptp300bt
  • Reply 8 of 37
    Beats Posts: 3,073 member
    Apple should create a new cable (USB-C?) that makes HDMI obsolete. HDMI has so many issues nowadays that it needs to be discontinued.

    A new high-bandwidth cable that isn't dependent on its own version, so it doesn't matter if you bought it at a thrift store or a high-end audio website: it acts the same. The bandwidth is future-proof for 20+ years, so it can handle anything that's thrown at it. New features would be dictated by the hardware/software, and the cable is just a tube for features. For example, PlayStation 6 wants to add 3D images to a TV: done. Apple TV 6 wants to add an exclusive audio standard like Spatial Audio to receivers: done. The cable you bought is irrelevant because the capability is so high from the start.

    Can a new “HDMI 3.1a gen 3x Series 2” standard add these features? Sure. But the fact that it has the same form factor adds more chaos to sort through. HDMI needs to end ASAP.
  • Reply 9 of 37
    Beats Posts: 3,073 member
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
     But now you have a new device you need to purchase for ONE niche purpose that shouldn’t even exist.

    Better if all new devices could detect what is plugged into them. For example, you plug your HDMI cord into your TV and it displays "HDMI 2.1a plugged in..." before the notification slowly fades away. The settings on every device, whether it's a game console or an Apple TV, should tell you what cable is plugged in. This is ridiculous.
  • Reply 10 of 37
    Xed Posts: 2,543 member
    lkrupp said:
    The Apple Discussion Forums (Apple TV Hardware section) are chock-full of questions and issues about getting Dolby Vision and Dolby Atmos working with their Apple TV 4K, AVR gear, sound bars, and TVs. Lots of arguing over HDMI cables (48Gbps certified, 18Gbps certified, etc.), AVR and TV settings to enable things, and misinformation about HDR, HDR10, HDR10+, and Dolby Vision. It usually boils down to blaming the Apple TV for all their troubles. I mean, how could Sony, Denon, Samsung, Marantz, Yamaha, or TCL possibly be the cause? Third-party hardware/firmware is always assumed to be perfect by those with issues.

    The HDMI cable is the root cause of issues most of the time, with the length, the speed, or the version being the culprit. And of course Samsung TVs don't support Dolby Vision but instead hawk Samsung's proprietary HDR10+ format. Talk about a labyrinth.
    tvOS has a Settings option for checking your HDMI cable. I've run this lengthy test many times without it ever finding an issue with a cable, yet my audio issues went away after switching cables.

    I wish the test were smart enough to tell me the capabilities of the cable and of the overall connection between devices. In the same vein as how BlackMagic tells you what kinds of media your disk performance can handle, I wish it would tell me what the cable is, what my overall connection supports, and what features I may not be getting.
  • Reply 11 of 37
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
    Here's one, but it doesn't cover 2.1 yet.
    https://www.hdtvsupply.com/hdmi-testers.html  

    Here's 2.1 8K stuff...
    https://www.hdtvsupply.com/8k-hdmi-product.html
    edited December 2021
  • Reply 12 of 37
    zimmie Posts: 651 member
    Beats said:
    Apple should create a new cable (USB-C?) that makes HDMI obsolete. HDMI has so many issues nowadays that it needs to be discontinued.

    A new high-bandwidth cable that isn't dependent on its own version, so it doesn't matter if you bought it at a thrift store or a high-end audio website: it acts the same. The bandwidth is future-proof for 20+ years, so it can handle anything that's thrown at it. New features would be dictated by the hardware/software, and the cable is just a tube for features. For example, PlayStation 6 wants to add 3D images to a TV: done. Apple TV 6 wants to add an exclusive audio standard like Spatial Audio to receivers: done. The cable you bought is irrelevant because the capability is so high from the start.

    Can a new “HDMI 3.1a gen 3x Series 2” standard add these features? Sure. But the fact that it has the same form factor adds more chaos to sort through. HDMI needs to end ASAP.
    That would be nice, but the problem is we're at the limits of what simple copper cables can do. This is why Thunderbolt, HDMI, and several other standards are moving to active cables, which is what leads to all of these problems where a cable with USB-C on both ends could have any of a dozen performance levels.

    We're also at the current limit of what affordable active cables can do, which is why these standards aren't moving forward even faster. Timing-correction and error-correction chips that can operate at higher speeds are enormously more expensive. You want something faster than HDMI 2.1's 48 gigabits per second? Get ready to pay $350 for a single 1-meter cable.
  • Reply 13 of 37
    Now I see why Apple was pushing USB-C over HDMI
  • Reply 14 of 37
    cpsro Posts: 3,198 member
    Vote with your wallet
  • Reply 15 of 37
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
    Not really possible.  As far as cables are concerned, there is no "version" and no ID chip.  There is simply the maximum amount of bandwidth that the cable can carry with a clean signal:

    • Category 1 ("standard") HDMI requires cables to pass a 74.25 MHz signal.  It says nothing about digital bits-per-second that can be transmitted without error but should be able to support any combination of features from HDMI 1.0 through 1.3 (which may require up to about 5 Gbit/s).
    • Category 2 ("high speed") HDMI requires 340 MHz, but also has no data-bits requirement.  It should be able to support all HDMI 1.0 through 1.4 features (up to 10.2 Gbit/s).
    • "Premium High Speed" is also category 2, but with new certification rules that require the cable to pass 18 Gbit/s.  This is because HDMI 2.0 adds a bunch of features that may require up to 18 Gbit/s, but claims to be compatible with existing category 2 cables - some of which can't actually handle it.
    • Category 3 ("ultra high speed", "48G") HDMI says nothing about frequency, but requires the cable to pass 48 Gbit/s, which is enough to support every valid combination of HDMI 2.1 features.


    The important thing is that the amount of bandwidth you require depends on what features your devices are using (resolution, color depth, encoding, etc.).  Even if your devices are HDMI 2.1 compliant, if your particular combination of devices only requires (for example) 8 Gbit/s, then you can use cheap category 2 ("high speed") cables.  You don't need a category 3 cable unless your devices require more than 18 Gbit/s.

    The real problem is that if you're not a very technical person, you probably don't know how much bandwidth your devices require.  So you'll probably take a look at the version number and pick your cable based on that (e.g. get a category 3 cable if the devices are HDMI 2.1), but this may easily result in paying for more bandwidth than you actually require.
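
    As a rough back-of-the-envelope sketch (the blanking and encoding overheads below are approximations picked for illustration, not certification figures), you could estimate the link rate a given video mode needs and then pick the cheapest cable category that covers it:

    # Rough estimate of the HDMI link rate a video mode needs, so you can pick
    # the cheapest cable category that covers it. The blanking and encoding
    # overheads are approximations, not official certification numbers.

    def hdmi_bandwidth_gbps(h_active, v_active, refresh_hz, bits_per_pixel,
                            blanking_overhead=1.2,    # assume ~20% extra pixels for blanking
                            encoding_overhead=1.25):  # TMDS 8b/10b; HDMI 2.1 FRL is lower
        active_bits_per_sec = h_active * v_active * refresh_hz * bits_per_pixel
        return active_bits_per_sec * blanking_overhead * encoding_overhead / 1e9

    # 4K60 at 8-bit RGB lands right around the 18 Gbit/s "Premium High Speed" ceiling:
    print(f"4K60, 8-bit:   ~{hdmi_bandwidth_gbps(3840, 2160, 60, 24):.1f} Gbit/s")
    # 4K120 at 10-bit color needs a category 3 ("ultra high speed") cable:
    print(f"4K120, 10-bit: ~{hdmi_bandwidth_gbps(3840, 2160, 120, 30):.1f} Gbit/s")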
    edited December 2021
  • Reply 16 of 37
    Xed said:
    There should be some sort of handheld device that, when you plug in the cable, can detect which version.
    👍 That's an interesting idea. Not just HDMI but how other cables are rated, too. I have USB-C-to-USB-C cables that are a mess: 60W vs. 100W PD, PD vs. data.

    Yes, good idea. Maybe the device you are using it with could simply tell you that? Simply and intuitively?
  • Reply 18 of 37
    MplsP Posts: 3,925 member
    Beats said:
    Apple should create a new cable (USB-C?) that makes HDMI obsolete. HDMI has so many issues nowadays that it needs to be discontinued.

    A new high-bandwidth cable that isn't dependent on its own version, so it doesn't matter if you bought it at a thrift store or a high-end audio website: it acts the same. The bandwidth is future-proof for 20+ years, so it can handle anything that's thrown at it. New features would be dictated by the hardware/software, and the cable is just a tube for features. For example, PlayStation 6 wants to add 3D images to a TV: done. Apple TV 6 wants to add an exclusive audio standard like Spatial Audio to receivers: done. The cable you bought is irrelevant because the capability is so high from the start.

    Can a new “HDMI 3.1a gen 3x Series 2” standard add these features? Sure. But the fact that it has the same form factor adds more chaos to sort through. HDMI needs to end ASAP.
    Now I see why Apple was pushing USB-C over HDMI
    Except USB-C isn't any better. In many ways it's worse. For starters, USB-C is just the connector. You have no idea what it's connecting to. It could be USB 3, USB 4, Thunderbolt, or just a plain power port. And a given USB-C cable may be capable of supporting one use but not the others. 

    In general, HDMI cables have been simpler. They either work or they don’t. 
  • Reply 19 of 37
    MplsP Posts: 3,925 member
    lkrupp said:
    The Apple Discussion Forums (Apple TV Hardware section) are chock-full of questions and issues about getting Dolby Vision and Dolby Atmos working with their Apple TV 4K, AVR gear, sound bars, and TVs. Lots of arguing over HDMI cables (48Gbps certified, 18Gbps certified, etc.), AVR and TV settings to enable things, and misinformation about HDR, HDR10, HDR10+, and Dolby Vision. It usually boils down to blaming the Apple TV for all their troubles. I mean, how could Sony, Denon, Samsung, Marantz, Yamaha, or TCL possibly be the cause? Third-party hardware/firmware is always assumed to be perfect by those with issues.

    The HDMI cable is the root cause of issues most of the time, with the length, the speed, or the version being the culprit. And of course Samsung TVs don't support Dolby Vision but instead hawk Samsung's proprietary HDR10+ format. Talk about a labyrinth.


    My biggest issue has been HDCP. Trying to play copy-protected content from an Apple device has turned into a crapshoot. I've even had movies stop playing 30 minutes in. 
  • Reply 20 of 37
    So the EU has an issue with Apple's continued use of its Lightning port, even while Apple moves everything else to Thunderbolt 3 (as 3 and 4 are the same on Apple systems; Windows users need to utilize Thunderbolt 4)… 

    Meanwhile, there are a dozen different cable standards for HDMI and DisplayPort. Bleh. 