Intel announces next-gen Thunderbolt with 4K resolution support, 20Gbps speeds coming in 2014
At the National Association of Broadcasters (NAB) Show in Las Vegas, Nevada, Intel on Monday announced the next generation of its Thunderbolt transfer technology, which promises data transfer speeds of up to 20 gigabits per second, double that of existing Thunderbolt hardware.
Source: Intel
According to Intel, the upgraded Thunderbolt interface will be able to hit 20Gbps transfer speeds over two channels, meaning adopters could see a theoretical doubling in performance over current Thunderbolt systems, reports Engadget. The high-speed tech, code-named Falcon Ridge, allows for simultaneous 4K video file transfer and display over two I/O channels.
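For context, a rough back-of-the-envelope calculation shows why the extra headroom matters: an uncompressed 3840x2160 stream at 60Hz and 24 bits per pixel works out to roughly 3840 x 2160 x 24 x 60, or about 11.9Gbps of raw pixel data. That is more than a single 10Gbps channel of today's Thunderbolt can carry, but comfortably within a 20Gbps link, with room left over for simultaneous file transfer. (The figure ignores blanking intervals and protocol overhead.)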
Intel also detailed a new Thunderbolt controller, codenamed Redwood Ridge, which is expected to arrive alongside the company's fourth-generation Core processor lineup this year. Improvements over the previous Cactus Ridge controller include DisplayPort 1.2 capability when connecting to native DP displays, improved power management and a reported reduction in material cost.
The Thunderbolt ecosystem has been slowly gaining momentum after Apple and Intel introduced the interface in 2011, with manufacturers finally bringing products to the consumer marketplace.
For example, Corning showed off its thin optical cables for the standard at this year's Macworld. Apple was first to support Thunderbolt with its early-2011 MacBook Pro. Some Windows-based PCs have started to implement the protocol, though it remains far less common than competing technologies like USB 3.0.
Due to the high initial pricing of Thunderbolt hardware, which mostly consisted of external hard drive arrays, the interface remained out of reach for non-professional buyers. Adding to the tech's adoption problems were Intel's reportedly strict licensing practices.
Intel is preparing for production to begin later this year ahead of a 2014 release, and notes that existing Thunderbolt cables and connectors will be compatible with the upgraded protocol.
Comments
8 bits = 1 byte
Cool hardware though
Quote:
Originally Posted by Phone-UI-Guy
20Gbps is not 20-Gigabytes per second... Standard nomenclature is a lower case b is bits and not Bytes which is uppercase B.
Yeah but how fast is 20Gpbs in the headline?
Clearly faster than the speed of thought.
This stuff is hot off the press, there's going to be more typos than usual.
Another total FAIL at description/reporting (again!)
Quote:
Originally Posted by hmurchison
With this speed increase Retina is possible across the iMac line. It may also offer display daisy chaining
What indication do you have that it is currently not possible...or that it has anything to do with Thunderbolt bandwidth?
Meaning, other than a Retina iMac extending its display to a Retina Thunderbolt Display, what does this have to do with actually producing either product?
Yes, what resolution is possible on current Thunderbolt? Obviously not 3840x2160...?
(edit: I see someone did get 2560x1600 on an external monitor
http://chris.dziemborowicz.com/blog/2012/07/04/fix-external-monitor-resolution-on-macbook-pro-with-retina-display/
and Apple says it supports 2 such displays
http://www.apple.com/macbook-pro/specs-retina/
)
Quote:
Originally Posted by SolipsismX
The headline now says 20Gpbs. Gibi pers bit second?
A lot of people don't know what a Gibi is. I know what a GiB is.
Quote:
Originally Posted by SolipsismX
The headline now says 20Gpbs. Gibi pers bit second?
I'm not sure what the difference is, but the chart tells me that whatever it is, it's 4 times faster than USB 3.
For some reason, the dual Thunderbolt ports seem to support more than 10Gbps using (I assume) one controller. 2560 x 1600 x 24bpp x 60Hz x 2 = 11.8Gbps. Anandtech got 11Gbps data transfer with dual TB ports. Each port can't do more than 10Gbps though and 4K needs 3840 x 2160 x 60 x 32 = 15.9Gbps.
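For anyone sanity-checking those figures, here is a minimal sketch of the same raw-bandwidth arithmetic; the display_gbps helper is purely illustrative, and it ignores blanking intervals and protocol overhead:

```python
# Raw, uncompressed display bandwidth; ignores blanking and protocol overhead.
def display_gbps(width, height, bits_per_pixel, refresh_hz, screens=1):
    """Pixel bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * refresh_hz * screens / 1e9

print(display_gbps(2560, 1600, 24, 60, screens=2))  # ~11.8 Gbps: two 2560x1600 displays
print(display_gbps(3840, 2160, 32, 60))             # ~15.9 Gbps: one 4K display at 32bpp
```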
Monitors yes, TVs under 60" no.
I flip back and forth between Windows and Mac depending on the client/project, and the lack of plug-and-play with a Thunderbolt drive basically forced me to switch to USB3.
Seriously, did they not think about this, or are they trying to supplant USB3 as number 1 by completely dropping the ball on Windows PCs? I mean, when was the last time you had a device that wasn't PnP?
1) All TVs have built-in monitors.
2) Where is this new "under 60" qualifier coming from? Are you now saying that 60" and up will eventually be 4K? If so, then you are then agreeing with my original point.
I don't know about that; I find 1080p monitors to be very grainy.
As for this report, I think I will go to the horse's mouth, as this article is confused.
I think it's just a matter of there not being any reason to have a 4K monitor at this point: Blu-ray is only 1080p, and it'll be a while before any film companies start encoding at 4K and any streaming/digital companies start selling 4K. The bandwidth is way higher, and they won't be able to charge that much extra for 4K in most cases. Also, so many people have invested in Blu-ray drives and 1080p that sales of 4K TVs will probably be pretty slow. 1080p upscaled to 4K won't be noticeable on smaller televisions either. (Although 60" is a pretty big cutoff; I would say 1080p upscaled to 4K wouldn't be very noticeable on a 40" TV at normal viewing distance, so 60" is a bit of a stretch.)
On a computer monitor, however, 4K will be very useful, especially when it comes to photo editing and graphic design. A pair of Retina-type (or higher) displays at 24-26" will be great to work on!