New DisplayPort 2.0 spec uses Thunderbolt 3 for 16K displays

Posted in General Discussion
VESA has published the DisplayPort 2.0 video standard, supporting displays up to 16K -- and it uses the USB-C connector and the Thunderbolt 3 physical layer to do so.

The new DisplayPort 2.0 will support up to 16K resolutions


VESA announced the DisplayPort 2.0 video standard, the first major update to DisplayPort since March 2016, delivering up to three times the data bandwidth of the previous version, DisplayPort 1.4a.

DisplayPort 2.0 is backward compatible with previous versions of DisplayPort and includes all the key features of DisplayPort 1.4a, such as visually lossless Display Stream Compression, HDR metadata transport, and Forward Error Correction.

Utilizing the Thunderbolt 3 physical interface layer, DisplayPort 2.0 boosts data bandwidth and promotes convergence across industry-leading I/O standards.

These new data rates enable multi-stream transport from a single DisplayPort connector on the source device, driving multiple displays either via a docking station or by daisy-chaining. They will also allow simultaneous higher-speed USB data transfer without compromising display performance.

DisplayPort 2.0 also supports resolutions up to 16K, higher refresh rates, HDR at higher resolutions, improved support for multiple displays, and improvements to augmented and virtual reality displays.

Specifically, single-display resolutions are 15360 x 8640 (16K) at 60Hz with compression, or 10240 x 4320 (10K) at 60Hz without compression. Daisy-chaining allows for two 8K displays at 120Hz, or three 10K displays at 60Hz, all over the Thunderbolt 3 physical layer. Using only two lanes with a non-Thunderbolt 3 USB-C cable, DP Alternate Mode allows for three 4K displays at 144Hz, two 4K x 4K displays for virtual reality at 120Hz, or three 2560 x 1440 displays at 120Hz.
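For readers who want to sanity-check those figures, here is a rough back-of-the-envelope calculation. It is a sketch, not taken from the VESA release: it assumes 30 bits per pixel for HDR and 24 bpp for SDR, ignores blanking intervals and protocol overhead, and approximates DisplayPort 2.0's effective payload rate as the 80Gbps raw rate less 128b/132b channel-coding overhead.

```python
# Rough sanity check of the bandwidth math behind the modes listed above.
# Simplified: counts active pixels only (ignores blanking and protocol
# overhead beyond channel coding), so treat the numbers as approximate.

def raw_video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video payload rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DP 2.0 top speed (UHBR20): 4 lanes x 20Gbps = 80Gbps raw, roughly
# 77.6Gbps of usable payload after 128b/132b channel coding.
DP20_EFFECTIVE_GBPS = 80 * 128 / 132

sixteen_k = raw_video_gbps(15360, 8640, 60, 30)  # 16K @ 60Hz, 30bpp HDR
ten_k = raw_video_gbps(10240, 4320, 60, 24)      # 10K @ 60Hz, 24bpp SDR

print(f"16K needs ~{sixteen_k:.0f} Gbps uncompressed -> requires DSC")
print(f"10K needs ~{ten_k:.0f} Gbps uncompressed -> fits in ~{DP20_EFFECTIVE_GBPS:.0f} Gbps")
```

On these assumptions, 16K at 60Hz works out to roughly 239Gbps uncompressed, about three times the link's capacity, which is why that mode relies on Display Stream Compression, while 10K at 60Hz fits uncompressed at roughly 64Gbps.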

According to the VESA press release, the first products incorporating DisplayPort 2.0 are projected to appear on the market by late 2020.

Comments

  • Reply 1 of 23
melgross Posts: 31,787, member
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
  • Reply 2 of 23
22july2013 Posts: 739, member
    16K! Does that make the new modular Mac Pro with its 6K monitor obsolete even before it's released? :smile: Will Google release a Chromebook with a 16K monitor for $998, one less dollar than the Apple monitor stand?

    I'm interested in Mel's comments. His questions seem reasonable but they shouldn't affect any purchasing decisions that I will make. I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
  • Reply 3 of 23
melgross Posts: 31,787, member
    16K! Does that make the new modular Mac Pro with its 6K monitor obsolete even before it's released? :smile: Will Google release a Chromebook with a 16K monitor for $998, one less dollar than the Apple monitor stand?

    I'm interested in Mel's comments. His questions seem reasonable but they shouldn't affect any purchasing decisions that I will make. I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
  • Reply 4 of 23
sflocal Posts: 4,674, member
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
  • Reply 5 of 23
StrangeDays Posts: 8,262, member
    16K! Does that make the new modular Mac Pro with its 6K monitor obsolete even before it's released?
    No, because obsolete means no longer made, used, or useful. That higher resolution specialized tools exist doesn’t render a lower resolution tool obsolete. Thus my 1080 monitor sitting on my desk isn’t obsolete today, either. 
  • Reply 6 of 23
AppleExposed Posts: 1,381, unconfirmed member
    16K! Does that make the new modular Mac Pro with its 6K monitor obsolete even before it's released?
    No, because obsolete means no longer made, used, or useful. That higher resolution specialized tools exist doesn’t render a lower resolution tool obsolete. Thus my 1080 monitor sitting on my desk isn’t obsolete today, either. 

He may have been joking, but I have to wonder what the use of anything beyond 10K is.

    I can see it being useful for commercial purposes like theme parks and movie theaters.
  • Reply 7 of 23
16K? Funk that... I want 32K. On my phone. Or maybe just go to 64K -- that should be good enough for anybody, according to a respected source.
  • Reply 8 of 23
melgross Posts: 31,787, member
    If only we could get rid of HDMI.

    What various USB-C connectors are you referring to? Do you mean the internal connection types, like whether it has TB or not, or "DP-alternate" or whatever the modes are for video output?
It would be easier if you responded to someone using the quote button at the bottom right of a post. Otherwise it’s hard to be sure who you’re responding to.
  • Reply 9 of 23
melgross Posts: 31,787, member
sflocal said:
It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
Where did you see that?
  • Reply 10 of 23
elijahg Posts: 988, member
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    Actually "amplified" cabling won't solve squat. Amplified noise is just loud noise. Optical is the next reasonable step really, but it's pricey. And TB isn't cheap now.
  • Reply 11 of 23
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
  • Reply 12 of 23
iqatedo Posts: 1,608, member
    elijahg said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    Actually "amplified" cabling won't solve squat. Amplified noise is just loud noise. Optical is the next reasonable step really, but it's pricey. And TB isn't cheap now.
    Interested to know where the cost is in optical... the drivers perhaps (optoelectronics)?
  • Reply 13 of 23
melgross Posts: 31,787, member
    elijahg said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    Actually "amplified" cabling won't solve squat. Amplified noise is just loud noise. Optical is the next reasonable step really, but it's pricey. And TB isn't cheap now.
Amplified digital cables have been used for many years, though amplified isn’t the technical term. They’re repeaters, and allow the signal to cross a much greater distance. Optical cables also have these devices, amplifiers, repeaters, or whatever you want to call them. It’s an established technology.
  • Reply 14 of 23
melgross Posts: 31,787, member
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
    Optical cables are very expensive. Look up the prices. They also use, well, let’s just say repeaters, since the term amplification seems to raise bad connotations among some here.

No, the first connector was USB with an addition on the top. It was shot down by the USB group, who didn’t want to allow a modified USB standard. You see, I said nobody would remember. You could have looked it up.

    from Wikipedia:

    CNET's Brooke Crothers said it was rumored that the early-2011 MacBook Pro update would include some sort of new data port, and he speculated it would be Light Peak (Thunderbolt).[35] At the time, there were no details on the physical implementation, and mock-ups appeared showing a system similar to the earlier Intel demos using a combined USB/Light Peak port.[36] Shortly before the release of the new machines, the USB Implementers Forum (USB-IF) announced they would not allow such a combination port, and that USB was not open to modification in that way.

    link:

    https://en.wikipedia.org/wiki/Thunderbolt_(interface)

  • Reply 15 of 23
    melgross said:
Amplified digital cables have been used for many years, though amplified isn’t the technical term. They’re repeaters, and allow the signal to cross a much greater distance. Optical cables also have these devices, amplifiers, repeaters, or whatever you want to call them. It’s an established technology.
    In fact, there are amplifiers, and they will amplify noise as well. They are useful because in certain important scenarios loss is a bigger problem than noise. However, there's only so much amplification you can do before noise becomes a problem (and there's no room for that at all in TB3 cables), and then you need to regenerate the signal. Repeaters that do that are expensive and powered, not good qualities for a home-use long cable. I don't think they exist at all for TB3, unless you want to count actual devices like disks that have dual TB ports.
  • Reply 16 of 23
iqatedo Posts: 1,608, member
    melgross said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
    Optical cables are very expensive. Look up the prices. They also use, well, let’s just say repeaters, since the term amplification seems to raise bad connotations among some here...

I suppose by very expensive you're referring to total system cost (cable, drivers, etc.), because the fibre itself shouldn't be expensive. For the lengths of fibre required for local hardware connectivity (computer to monitor or external device), no repeaters would be required, not by a long measure.
  • Reply 17 of 23
    melgross said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
    Optical cables are very expensive. Look up the prices. They also use, well, let’s just say repeaters, since the term amplification seems to raise bad connotations among some here.

No, the first connector was USB with an addition on the top. It was shot down by the USB group, who didn’t want to allow a modified USB standard. You see, I said nobody would remember. You could have looked it up.
    Oh, that. I thought you were talking about actual released TB, not the Light Peak tech demo.

As for optical cable pricing... Yes, of course. (If you added up all the optical cabling I've got in data centers, it'd be measured in miles.) That's why 40gbps 2m TB3 cables are "active" copper cables with chips embedded in the connectors on each end. But POF may provide a reasonable compromise. Or not... in the next year or two we should know. But it really doesn't matter. You are not going to see copper cables of any length at those speeds.

    The amp/repeater situation is NOT the same between optical and copper. With copper, you can't do TB cables longer than a couple of meters, even "active". Repeaters would be silly. With optical, you can do many meters, and a repeater could have legitimate uses in some cases (long video cable runs, for example). It's just like Ethernet on a slightly smaller scale - nobody buys repeaters for copper (and if they really need it, they just put a switch in). But for optical, once you exceed the range of long-distance optics (140km? I don't remember exactly), repeaters are necessary.
  • Reply 18 of 23
1983 Posts: 1,184, member
I thought the resolution wars had stopped with 8K. I don’t see any practical use cases for 10K-16K video, except maybe for a few very niche markets. 8K is already over 33MP. That’s still considered high-res for stills, let alone video!
  • Reply 19 of 23
melgross Posts: 31,787, member
    iqatedo said:
    melgross said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
    Optical cables are very expensive. Look up the prices. They also use, well, let’s just say repeaters, since the term amplification seems to raise bad connotations among some here...

I suppose by very expensive you're referring to total system cost (cable, drivers, etc.), because the fibre itself shouldn't be expensive. For the lengths of fibre required for local hardware connectivity (computer to monitor or external device), no repeaters would be required, not by a long measure.
Very expensive varies as to time. When I was using SCSI, later cables cost almost $200 for one meter. That’s about double in today’s dollars. Today, $80 for a 2-meter cable is considered to be expensive.
  • Reply 20 of 23
melgross Posts: 31,787, member

    melgross said:
    melgross said:
    While this is nice, and Anandtech has a very detailed report about it, there is something that worries me. Intel has released the thunderbolt spec to a royalty free group. While this seems good, as we can see by this use of it here, my question is what it means for the future of the TB spec.

Going back to the beginning, Intel stated that in ten years TB would be at 100Gb/s. It’s still at 40. We know all about the cable “problem”, which DisplayPort now shares. But that problem can be overcome with amplified cabling. At a cost, of course. But are we now at the end of the TB advance? With Intel giving the license out for free—no more charging OEMs for ports—does that mean they’re letting go of TB altogether? Nobody knows that outside of Intel right now.
    For "amplified" substitute "optical". It's unlikely that you'll ever see copper thunderbolt at 100gbps- I'd guess that optical will be both cheaper and better, to the extent that if you need a powered cable it'll be a hybrid glass/copper. (Or POF/copper- POF may finally reach this performance level outside the labs.)

    Ultimately, if Intel stops developing TB, it won't matter that much. The demand for faster cabling is there - DP is a perfect example, and AR/VR will continue to drive bandwidth needs for a while at least. USB will keep pushing forward if TB doesn't.
    [...]I don't have a negative impression of TB yet, but I do have a negative impression of USB-C with all of its complexity and different connector types. Maybe Intel can take the best of both and merge them into a single cable called USTB.
    It's already done, and it's called "USB4".
    melgross said:
    Intel and other companies are restrained by the working groups that control these standards. For example, the first TB used a different connector. Do you remember which one? No, because it was disallowed almost immediately. We had similar problems with hdmi. Apple had an adapter, but had to withdraw it.
    What are you talking about? The original TB connector was mDP, and it stuck around for years (it was used in TB2 also).
    sflocal said:
    It’s my understanding that TB4 is final and will be at  80gb/s.  It’s a fantastic technology and a superior tech with lower overhead than USB.  I really hope Intel does not do something stupid and kill the tech.  They have been screwing up as of late.
    You are confused. There is no TB4 yet, and nobody knows what it will be (aside from guessing at a doubling of data rates to 80gbps, which seems moderately likely). You might be thinking about PCIe4.
    Optical cables are very expensive. Look up the prices. They also use, well, let’s just say repeaters, since the term amplification seems to raise bad connotations among some here.

No, the first connector was USB with an addition on the top. It was shot down by the USB group, who didn’t want to allow a modified USB standard. You see, I said nobody would remember. You could have looked it up.
    Oh, that. I thought you were talking about actual released TB, not the Light Peak tech demo.

As for optical cable pricing... Yes, of course. (If you added up all the optical cabling I've got in data centers, it'd be measured in miles.) That's why 40gbps 2m TB3 cables are "active" copper cables with chips embedded in the connectors on each end. But POF may provide a reasonable compromise. Or not... in the next year or two we should know. But it really doesn't matter. You are not going to see copper cables of any length at those speeds.

    The amp/repeater situation is NOT the same between optical and copper. With copper, you can't do TB cables longer than a couple of meters, even "active". Repeaters would be silly. With optical, you can do many meters, and a repeater could have legitimate uses in some cases (long video cable runs, for example). It's just like Ethernet on a slightly smaller scale - nobody buys repeaters for copper (and if they really need it, they just put a switch in). But for optical, once you exceed the range of long-distance optics (140km? I don't remember exactly), repeaters are necessary.
    It was more than a demo. I saw those early boards. But Apple was about to go into production with those connectors, and had to redesign their boards once Intel went to DisplayPort.