When will it be (or will it ever be) plug-and-play in Windows?
MS has completely lost direction, and frankly I'm not sure they will ever get their mojo back. Businesses are actually developing a hostile attitude towards the company due to all of the issues around running and upgrading Windows. I also think MS has a bit of an attitude due to being left out of the Apple/Intel partnership that developed this technology.
I flip back and forth between Windows and Mac depending on the client/project, and the lack of plug-and-play support for a Thunderbolt drive basically forced me to switch to USB3.
I've stated this again and again but Apple got everything they wanted out of TB as a docking port. Everything on top of that is gravy for them.
Seriously, did they not think about this, or are they trying to supplant USB3 as number 1 while completely dropping the ball on Windows PCs? I mean, when was the last time you had a device that wasn't PnP?
2) Where is this new "under 60" qualifier coming from? Are you now saying that 60" and up will eventually be 4K? If so, then you are then agreeing with my original point.
I'm not sure what others have been thinking, but from my perspective, when 1080p monitors come up, that resolution on a sufficiently large screen is not acceptable. I suspect this is why Apple uses higher-res screens in the iMac. 4K may be overkill in some cases for a computer monitor, but the 1080p node is just terrible.
Absolutely! In the 4K HDTV thread from a few weeks back I talked about that and ran numbers as to where the "Retina" effect most of us saw on smaller 1080p and even 720p panels starts to fall away in the 50" to 60" range. I even talked about people buying a new HDTV and thinking it's worse than their previous picture precisely because they've enlarged the display but not their living room, thus giving a worse experience in certain ways.
Where is this new "under 60" qualifier coming from? Are you now saying that 60" and up will eventually be 4K? If so, then you are then agreeing with my original point.
TVs coming out that are that size are 4K. They just cost around $38k.
You have been participating in a thread where I specifically link to an article that lists 55" and 65" UltraHD TVs from Sony to be released in 2 weeks for $4,999 and $6,999, respectively, yet you are now claiming that they all cost around $38,000, on top of assuming that today's prices will hold in the future, without any acknowledgment that Thunderbolt with 4K support isn't yet available. You are sounding like a Luddite, with falsified claims and a desire to never see a display get to (or surpass) 4K despite phones now available with 1080p.
You have been participating in a thread where I specifically link to an article that lists 55" and 65" UltraHD TVs from Sony to be released in 2 weeks for $4,999 and $6,999, respectively, yet you are now claiming that they all cost around $38,000, on top of assuming that today's prices will hold in the future
I'm well aware of the $5k Sony display and even referenced it before you posted the link. I don't think 4K is beneficial at those sizes. You'd have to be less than 6ft away - who sits less than 6ft away from a 60" TV? With an 85" display, I could see some people sitting closer than 10ft away. Between 60"-85", any pixels/blurriness would be more noticeable the higher up it went.
You are sounding like a Luddite, with falsified claims and a desire to never see a display get to (or surpass) 4K despite phones now available with 1080p.
It's not that I don't want to see them go to 4K or 8K or higher, I just don't see how it's going to make a difference. If I said to you, "4K is no good, some media is higher than 4K, so why not go all the way to 16K and future-proof it?", wouldn't you think that was unnecessary? I just think moving to 4K is unnecessary for the purposes of home entertainment. Obviously manufacturers are running out of marketing terms and TV margins are too low, so they need to push something new. It's a solution without a problem, though.
I'm well aware of the $5k Sony display and even referenced it before you posted the link. I don't think 4K is beneficial at those sizes. You'd have to be less than 6ft away - who sits less than 6ft away from a 60" TV? With an 85" display, I could see some people sitting closer than 10ft away. Between 60"-85", any pixels/blurriness would be more noticeable the higher up it went.
That is all sorts of wrong. You made this statement before, where you think that a higher PPI means that you need to sit closer. Again, 1080p isn't cutting it as we move into larger and larger displays. For example (as I showed in the other thread), a 55" 1080p HDTV requires the user to sit at least 7' away to get the minimum Retina effect for someone with 20/20 (6/6) vision. Unless you want to argue that HEC displays will not get bigger and that living rooms will get shorter, thus making 1080p the only option for the foreseeable future, but that's a hard position to reasonably take, especially with Intel, VESA, the HDMI Consortium, Sony, etc. all supporting 4K right now.
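To put numbers on the 7-foot claim: the 1-arcminute acuity rule of thumb behind it can be sketched in a few lines of Python. This is a back-of-the-envelope model, assuming a 16:9 panel and the common 1-arcmin resolving limit for 20/20 vision; the helper name is mine, not from the thread.

```python
import math

def min_retina_distance_ft(diagonal_in, horiz_px, aspect=(16, 9)):
    """Distance (feet) beyond which one pixel subtends <= 1 arcminute,
    i.e. where 20/20 vision stops resolving individual pixels."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
    pixel_pitch_in = width_in / horiz_px            # size of one pixel
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

print(round(min_retina_distance_ft(55, 1920), 1))  # ~7.2 ft for a 55" 1080p set
print(round(min_retina_distance_ft(55, 3840), 1))  # ~3.6 ft at 4K, half the distance
```

So the "Retina" threshold for a 55" 1080p panel lands right around 7 feet, and doubling the resolution halves that distance rather than requiring anyone to sit closer.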
I'm not sure what you mean here. It's not yet available.
That's exactly what I mean. You keep talking about today when all this discussion is about what's coming tomorrow. Hence the article title which contains announces, next-gen, and coming in 2014.
If I said to you, "4K is no good, some media is higher than 4K, so why not go all the way to 16K and future-proof it?", wouldn't you think that was unnecessary?
Reductio ad absurdum. We're now just getting to the point where 2x 1080p can be supported effectively with HW and within a few years of the prices being where HDTVs were when they took off and you want to go 4x and 8x the number of pixels? In the words of the prophet Ed Lover, "Come on, Son!"
I'm surprised PC fans haven't posted the usual defensive post about how awesome USB3 is.
Well, it *IS* finally useful, and for a lot of applications may be a perfectly reasonable, lower-cost alternative to TB. Like an external drive that's not SSD or RAID, or capturing a single video stream.
I'm surprised PC fans haven't posted the usual defensive post about how awesome USB3 is.
1) Macs now have USB 3.0 with Ivy Bridge. They usually stop something being a feature after Apple adds it. Usually it's the "it's about time" or "such-and-such has had it for x-long" type comments, often followed by a pic of someone's collection of gadgets that they think proves they aren't a troll.
2) A lot of "PC" manufacturers, especially at the top end, have adopted TB or DP protocols, often with mDP ports, which makes it hard to argue how much Apple sucks when the vendor they claim is better is also backing it.
It would be nice if they would upgrade the iDevices to TB sooner rather than later. USB just doesn't cut it for syncing 64GB. Sure, the internal flash only goes so fast, but that won't be the bottleneck forever. Even more interesting, though, would be TB on the AppleTV, especially dual TB, with iOS given the ability to recognize external drives.
It would be nice if they would upgrade the iDevices to TB sooner rather than later. USB just doesn't cut it for syncing 64GB. Sure, the internal flash only goes so fast, but that won't be the bottleneck forever.
Even more interesting, though, would be TB on the AppleTV, especially dual TB, with iOS given the ability to recognize external drives.
1) Even once the NAND gets faster, it has a long way to go before it comes close to exceeding USB 2.0 speeds: 18-25MB/s for the NAND versus 35-40MB/s actual throughput for USB 2.0. Then consider that they could move to USB 3.0 speeds, which would jump it ahead considerably.
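For a sense of scale, here's a quick Python sketch of what those throughput figures mean for the 64GB sync mentioned above. The NAND and USB 2.0 rates come from the post; the USB 3.0-class figure is my own rough assumption.

```python
def sync_minutes(size_gb, throughput_mb_s):
    """Minutes to transfer size_gb gigabytes at a sustained MB/s rate."""
    return size_gb * 1000 / throughput_mb_s / 60

# NAND-limited vs. bus-limited, for a full 64GB sync
print(f"NAND at 25 MB/s:     {sync_minutes(64, 25):.0f} min")   # ~43 min
print(f"USB 2.0 at 38 MB/s:  {sync_minutes(64, 38):.0f} min")   # ~28 min
print(f"USB 3.0 at 300 MB/s: {sync_minutes(64, 300):.0f} min")  # ~4 min
```

As long as the NAND tops out around 25 MB/s, the flash, not the USB 2.0 bus, is the bottleneck; only once the NAND outruns ~40 MB/s does a faster interconnect start to pay off.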
2) You'll hear people say that an iDevice can't have a TB controller because it doesn't have an Intel Core processor. That's not accurate. Since we're only talking about the iDevice being a peripheral it's possible, just like we have with displays, external drives, etc. That said, the cost, power usage, and size all make it less than likely for some time to come.
What indication do you have that it is currently not possible...or that it has anything to do with Thunderbolt bandwidth?
Meaning, other than a Retina iMac extending its display to a Retina Thunderbolt Display, what does this have to do with actually producing either product?
"Improvements over the previous Cactus Ridge controller are DisplayPort 1.2 capability when connecting to native DP displays".
DisplayPort 1.2 supports daisy chaining of DisplayPort 1.2 monitors. No Apple Thunderbolt Display required.
For some reason, the dual Thunderbolt ports seem to support more than 10Gbps using (I assume) one controller. 2560 x 1600 x 24bpp x 60Hz x 2 = 11.8Gbps. Anandtech got 11Gbps data transfer with dual TB ports. Each port can't do more than 10Gbps though and 4K needs 3840 x 2160 x 60 x 32 = 15.9Gbps.
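Those figures are just raw pixel math; as a sanity check, a tiny Python helper reproduces them (uncompressed video, decimal gigabits, blanking intervals ignored):

```python
def raw_gbps(h, v, bpp, hz):
    """Uncompressed video bandwidth in Gb/s (no blanking overhead)."""
    return h * v * bpp * hz / 1e9

print(round(raw_gbps(2560, 1600, 24, 60) * 2, 1))  # 11.8 -> dual 2560x1600 displays
print(round(raw_gbps(3840, 2160, 32, 60), 1))      # 15.9 -> 4K at 32 bpp
```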
Wikipedia lists other values: 2560 × 1600 × 30 bpp @ 60 Hz for 10.46 Gb/s for CVT and 8.06 Gb/s for CVT-R.
For 4K they list: 3840 × 2160 × 30 bpp @ 60 Hz 21.39 Gb/s for CVT and 16.00 Gb/s for CVT-R, which is supported with DP 1.2 that is available with today's TB chipset update. The delay of the Mac Pro (and ATD) updates, and new rumours of notebook updates this quarter might have been planned around the DP 1.2 update in the new chipset.
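The CVT-R figures above include blanking overhead on top of the active pixels. A rough Python reconstruction of the reduced-blanking timing reproduces them, assuming the fixed 160-pixel horizontal blank and minimum 460 µs vertical blanking period of VESA's CVT reduced-blanking scheme (full CVT blanking, with its larger overhead, is not modeled here):

```python
import math

def cvt_rb_gbps(h_active, v_active, bpp, hz):
    """Approximate CVT reduced-blanking (CVT-R) link rate in Gb/s."""
    # Total line count must leave at least 460 us of vertical blanking per frame.
    total_v = math.ceil(v_active / (1 - 460e-6 * hz))
    pixel_clock = (h_active + 160) * total_v * hz  # fixed 160-pixel horizontal blank
    return pixel_clock * bpp / 1e9

print(f"{cvt_rb_gbps(2560, 1600, 30, 60):.2f}")  # 8.06, matching the quoted figure
print(f"{cvt_rb_gbps(3840, 2160, 30, 60):.2f}")  # 16.00
```

The 16.00 Gb/s result for 4K/30bpp/60Hz is exactly why DP 1.2's 17.28 Gb/s of payload bandwidth is enough for 4K only with reduced blanking.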
We need a daisy-chainable, realistically priced (read: simple, max 350 euros) box that has space to fit one PCIe card and about 3-4 2.5-inch drives. Amen.
You could have different models, but they should all be chainable. With another model much the same but with two PCIe slots, you could have a controller/PCIe SSD and discrete graphics...
I leave the rest to your imagination. That would seriously allow some crazy applications that Macs don't really fit that well currently...
2) You'll hear people say that an iDevice can't have a TB controller because it doesn't have an Intel Core processor. That's not accurate. Since we're only talking about the iDevice being a peripheral it's possible, just like we have with displays, external drives, etc. That said, the cost, power usage, and size all make it less than likely for some time to come.
That's a good point, though I imagine people are also hoping that Thunderbolt on an iPad or iPhone could also be used for video out, which presumably would require the Intel chipset?
Out of interest, in the hypothetical situation where Apple puts a Thunderbolt controller and slim port on an iPhone (perhaps after switching to an Intel CPU), does Thunderbolt support crossover for attaching controller to controller when you plug it into your Mac? I.e., is it like FireWire and Ethernet, or like USB? Seems like it'd be a good idea, but I don't think I've seen it stated anywhere.
Comments
TB was never intended to replace USB in any form.
I'm not sure what others have been thinking, but from my perspective, when 1080p monitors come up, that resolution on a sufficiently large screen is not acceptable. I suspect this is why Apple uses higher-res screens in the iMac. 4K may be overkill in some cases for a computer monitor, but the 1080p node is just terrible.
Quote:
Originally Posted by Marvin
[...] 4K needs 3840 x 2160 x 60 x 32 = 15.9Gbps.
Why times 60 rather than 30? What source is 60 frames per second?
Don't get me wrong, I'm all for supporting 60fps, I'm just curious how you arrived at that figure?
Absolutely! In the 4K HDTV thread from a few weeks back I talked about that and ran numbers as to where the "Retina" effect most of us saw on smaller 1080p and even 720p panels starts to fall away in the 50" to 60" range. I even talked about people buying a new HDTV and thinking it's worse than their previous picture precisely because they've enlarged the display but not their living room, thus giving a worse experience in certain ways.
TVs coming out that are that size are 4K. They just cost around $38k.
A computer refresh rate is usually 60Hz or 75Hz.
You have been participating in a thread where I specifically link to an article that lists 55" and 65" UltraHD TVs from Sony to be released in 2 weeks for $4,999 and $6,999, respectively, yet you are now claiming that they all cost around $38,000, on top of assuming that today's prices will hold in the future, without any acknowledgment that Thunderbolt with 4K support isn't yet available. You are sounding like a Luddite, with falsified claims and a desire to never see a display get to (or surpass) 4K despite phones now available with 1080p.
I'm well aware of the $5k Sony display and even referenced it before you posted the link. I don't think 4K is beneficial at those sizes. You'd have to be less than 6ft away - who sits less than 6ft away from a 60" TV? With an 85" display, I could see some people sitting closer than 10ft away. Between 60"-85", any pixels/blurriness would be more noticeable the higher up it went.
I'm not sure what you mean here. It's not yet available.
It's not that I don't want to see them go to 4K or 8K or higher, I just don't see how it's going to make a difference. If I said to you, "4K is no good, some media is higher than 4K, so why not go all the way to 16K and future-proof it?", wouldn't you think that was unnecessary? I just think moving to 4K is unnecessary for the purposes of home entertainment. Obviously manufacturers are running out of marketing terms and TV margins are too low, so they need to push something new. It's a solution without a problem, though.
Originally Posted by jlandd
Joke about supporting hardware shipping in 2024 in 3, 2,....
I don't see how that would apply here, given that Apple has currently adopted Thunderbolt infinitely faster than any other computer manufacturer.
Infinitely because none of them use it yet, after three years.
Now, if you're talking PC adoption, that's a joke.
Originally Posted by v5v
What source is 60 frames per second? Don't get me wrong, I'm all for supporting 60fps…
ARE YOU INSANE, MAN?! DON'T QUESTION IT, THEN!
Get the support out there from the get-go. That way the whiners won't be able to complain, "Well, there's no support for it; let's just not do it."
That is all sorts of wrong. You made this statement before, where you think that a higher PPI means that you need to sit closer. Again, 1080p isn't cutting it as we move into larger and larger displays. For example (as I showed in the other thread), a 55" 1080p HDTV requires the user to sit at least 7' away to get the minimum Retina effect for someone with 20/20 (6/6) vision. Unless you want to argue that HEC displays will not get bigger and that living rooms will get shorter, thus making 1080p the only option for the foreseeable future, but that's a hard position to reasonably take, especially with Intel, VESA, the HDMI Consortium, Sony, etc. all supporting 4K right now.
That's exactly what I mean. You keep talking about today when all this discussion is about what's coming tomorrow. Hence the article title which contains announces, next-gen, and coming in 2014.
Reductio ad absurdum. We're now just getting to the point where 2x 1080p can be supported effectively with HW and within a few years of the prices being where HDTVs were when they took off and you want to go 4x and 8x the number of pixels? In the words of the prophet Ed Lover, "Come on, Son!"
I know an I-O/peripherals company has some interesting things planned in conjunction with this.
Quote:
Originally Posted by Marvin
Quote:
Originally Posted by v5v
Why times 60 rather than 30? What source is 60 frames per second?
A computer refresh rate is usually 60Hz or 75Hz.

Oh yeah... duh. How can you tell I'm a TV guy first and computer guy second... or ninth...
I'm surprised PC fans haven't posted the usual defensive post about how awesome USB3 is.
Quote:
Originally Posted by Suddenly Newton
I'm surprised PC fans haven't posted the usual defensive post about how awesome USB3 is.
Well, it *IS* finally useful, and for a lot of applications may be a perfectly reasonable, lower-cost alternative to TB. Like an external drive that's not SSD or RAID, or capturing a single video stream.
1) Macs now have USB 3.0 with Ivy Bridge. They usually stop something being a feature after Apple adds it. Usually it's the "it's about time" or "such-and-such has had it for x-long" type comments, often followed by a pic of someone's collection of gadgets that they think proves they aren't a troll.
2) A lot of "PC" manufacturers, especially at the top end, have adopted TB or DP protocols, often with mDP ports, which makes it hard to argue how much Apple sucks when the vendor they claim is better is also backing it.
Even more interesting, though, would be TB on the AppleTV, especially dual TB, with iOS given the ability to recognize external drives.
1) Even once the NAND gets faster, it has a long way to go before it comes close to exceeding USB 2.0 speeds: 18-25MB/s for the NAND versus 35-40MB/s actual throughput for USB 2.0. Then consider that they could move to USB 3.0 speeds, which would jump it ahead considerably.
2) You'll hear people say that an iDevice can't have a TB controller because it doesn't have an Intel Core processor. That's not accurate. Since we're only talking about the iDevice being a peripheral it's possible, just like we have with displays, external drives, etc. That said, the cost, power usage, and size all make it less than likely for some time to come.
Quote:
Originally Posted by pmz
What indication do you have that it is currently not possible...or that it has anything to do with Thunderbolt bandwidth?
Meaning, other than a Retina iMac extending its display to a Retina Thunderbolt Display, what does this have to do with actually producing either product?
"Improvements over the previous Cactus Ridge controller are DisplayPort 1.2 capability when connecting to native DP displays".
DisplayPort 1.2 supports daisy chaining of DisplayPort 1.2 monitors. No Apple Thunderbolt Display required.
Wikipedia lists other values: 2560 × 1600 × 30 bpp @ 60 Hz for 10.46 Gb/s for CVT and 8.06 Gb/s for CVT-R.
For 4K they list: 3840 × 2160 × 30 bpp @ 60 Hz 21.39 Gb/s for CVT and 16.00 Gb/s for CVT-R, which is supported with DP 1.2 that is available with today's TB chipset update. The delay of the Mac Pro (and ATD) updates, and new rumours of notebook updates this quarter might have been planned around the DP 1.2 update in the new chipset.
You could have different models, but they should all be chainable. With another model much the same but with two PCIe slots, you could have a controller/PCIe SSD and discrete graphics...
I leave the rest to your imagination. That would seriously allow some crazy applications that Macs don't really fit that well currently...
Quote:
Originally Posted by SolipsismX
2) You'll hear people say that an iDevice can't have a TB controller because it doesn't have an Intel Core processor. That's not accurate. Since we're only talking about the iDevice being a peripheral it's possible, just like we have with displays, external drives, etc. That said, the cost, power usage, and size all make it less than likely for some time to come.
That's a good point, though I imagine people are also hoping that Thunderbolt on an iPad or iPhone could also be used for video out, which presumably would require the Intel chipset?
Out of interest, in the hypothetical situation where Apple puts a Thunderbolt controller and slim port on an iPhone (perhaps after switching to an Intel CPU), does Thunderbolt support crossover for attaching controller to controller when you plug it into your Mac? I.e., is it like FireWire and Ethernet, or like USB? Seems like it'd be a good idea, but I don't think I've seen it stated anywhere.