How does Thunderbolt work with a pci-e video card? On a desktop?
It doesn't. Or, rather:
Quote:
Why are there no DATA only Thunderbolt cards?
The spec requires video throughput. Thunderbolt that doesn't do video isn't Thunderbolt at all. And I think the ports have to also be on the main logic board, not any daughterboard or expansion card.
Quote:
Thunderbolt data cards that can take DP in?
This is asking for exactly what a Thunderbolt port is supposed to be, but on an expansion card, and again, I don't think that's allowed.
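To make the "data-only doesn't fit the spec" point concrete, here's a toy sketch (my own illustration, not the actual protocol) of the idea that a Thunderbolt controller time-multiplexes DisplayPort and PCIe traffic onto the same link, which is why a port that carries no video wouldn't be Thunderbolt:

```python
from itertools import zip_longest

def mux(dp_packets, pcie_packets):
    """Round-robin interleave of the two protocol streams onto one link.

    Toy scheduler only -- the real transport layer is far more involved,
    but it shows why the port is defined as video + data, not data alone.
    """
    link = []
    for dp, pcie in zip_longest(dp_packets, pcie_packets):
        if dp is not None:
            link.append(("DP", dp))
        if pcie is not None:
            link.append(("PCIe", pcie))
    return link

print(mux(["frame0", "frame1"], ["read0"]))
# [('DP', 'frame0'), ('PCIe', 'read0'), ('DP', 'frame1')]
```

Strip out the DP stream and what's left is just external PCIe, not Thunderbolt as specced.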
It is pretty simple: I can't see Apple buying into TB without a solution for its ARM-based machines. Beyond that, there is nothing in the publicly available specs that indicates the tech is wedded to Intel processors.
Quote:
Originally Posted by SolipsismX
I don't follow. TB is on Macs and Macs are Intel. So far every requirement for TB has to have an Intel chip in the mix.
No, what you are seeing is current implementations. An implementation does not imply a requirement. I would not be surprised at all to see TB built into the iPad sometime soon. It is the perfect alternative to RF technologies and provides Apple with a long-term solution for hard-wired connections.
It makes sense to me. If you think about it from the standpoint of running your own business, would you want your future high-speed port of choice only hooking up to a fraction of your products?
Quote:
Maybe that's why Apple can't use TB in its iDevices that run on ARM CPUs. Maybe the iPhone and iPad will get Atom CPUs when they are fabbed at the 22 nm process node.
I can't ever see Apple going Intel for the portable iOS devices.
Quote:
It sounds crazy, but from what I've read, Atom at 22 nm will be a real competitor to ARM CPUs. If Intel uses their process advantage against ARM, they may be able to make CPUs that ARM can't match in terms of performance and low power consumption.
Competitive with three-year-old ARM tech. Mind you, ARM tech built on older processes. If ARM cores draw only 500 milliamps at 45 or 60 nm, just imagine what happens at 28 nm! One of the reasons I kinda believe that Apple's A5X is real is that Apple may be able to double performance at around the same power profile. Intel has recently switched to Imagination GPUs in the latest Atom chips and is running them very fast (600 to 800 MHz; my memory is getting old, so ballpark figures). In any event, it gives us something to speculate about on A6/A5X.
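For what it's worth, the "500 mA at 45 nm" speculation can be sanity-checked with a back-of-envelope calculation. Assume (very crudely) that switching current scales linearly with feature size; real scaling depends on voltage, leakage, and design, so treat the numbers as illustrative only:

```python
def scaled_current_ma(current_ma, node_from_nm, node_to_nm):
    """Naive linear scaling of core current with process node (illustrative only)."""
    return current_ma * node_to_nm / node_from_nm

# A core drawing ~500 mA at 45 nm would, under this naive model,
# draw roughly 311 mA at 28 nm -- plenty of headroom for Apple to
# spend on extra GPU cores at the same power profile.
print(round(scaled_current_ma(500, 45, 28)))  # 311
```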
Quote:
It's easy to write off Intel in this area as they haven't done anything special so far. But that could be about to change.
Sure they could; in fact, I think they need to be aggressive here. In the end, though, ARM solutions are very attractive. Intel could see a good portion of their business dry up if the industry shifts even more towards ARM, so Intel certainly needs to be on the ball.
I don't buy that either. I don't really believe that Apple is happy that they are literally pulling Intel into the future kicking and screaming. Apple has pretty much demanded OpenCL and better all-around GPU support from Intel. For good reason too, as NVidia's 9400M really provided a benchmark for what a low-cost system could do.
Of course, we won't likely know for some time what actually happened, nor Apple's mindset with Intel. What is obvious, though, is that Intel's future and Apple's just don't seem to be aligned.
Intel did basically kick NVidia out there, so I'm not surprised Apple would want similar performance now from Intel. Apple needs to be proactive in getting decent GPU drivers out, though.
Quote:
Originally Posted by wizard69
In fact I'd go so far as to say that it has to work on other hardware. Do you really think Apple would invest in TB if it didn't have a play for its embedded ARM chips?
They've gone back on stuff before. I think it'll depend on adoption and reduced costs.
Quote:
It is pretty simple: I can't see Apple buying into TB without a solution for its ARM-based machines. Beyond that, there is nothing in the publicly available specs that indicates the tech is wedded to Intel processors.
No, what you are seeing is current implementations. An implementation does not imply a requirement. I would not be surprised at all to see TB built into the iPad sometime soon. It is the perfect alternative to RF technologies and provides Apple with a long-term solution for hard-wired connections.
Why would Intel let Thunderbolt be used in a chain that doesn't have an Intel chip on one end? This has nothing to do with an iDevice having an ARM CPU but whether the machine it connects has Thunderbolt which requires Intel's approval which means an Intel chip. I'm sure you know that AMD is working on their own Thunderbolt competitor.
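The "Intel chip on one end" claim amounts to a rule about chain topology. A trivial sketch of that rule (a hypothetical model for the sake of the argument; the names and structure are my own, not a real API):

```python
def chain_has_intel_host(chain):
    """True iff the host end of a TB daisy chain uses an Intel controller.

    'chain' is a list of (device_name, controller_vendor) tuples, host first.
    Hypothetical illustration of the licensing argument, not a real API.
    """
    return bool(chain) and chain[0][1] == "Intel"

print(chain_has_intel_host([("MacBook Pro", "Intel"), ("RAID box", "Intel")]))  # True
print(chain_has_intel_host([("ARM tablet", "Other"), ("RAID box", "Intel")]))   # False
```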
Quote:
The spec requires video throughput. Thunderbolt that doesn't do video isn't Thunderbolt at all. And I think the ports have to also be on the main logic board, not any daughterboard or expansion card.
This is asking for exactly what a Thunderbolt port is supposed to be, but on an expansion card, and again, I don't think that's allowed.
What about front case ports? They are on a daughterboard hooked to headers on the motherboard.
Saying main board only is dumb. And what about servers? A lot of them have basic, mostly VGA, onboard video on the PCI (not PCI-e) bus.
Quote:
Why would Intel let Thunderbolt be used in a chain that doesn't have an Intel chip on one end? This has nothing to do with an iDevice having an ARM CPU but whether the machine it connects has Thunderbolt which requires Intel's approval which means an Intel chip. I'm sure you know that AMD is working on their own Thunderbolt competitor.
That is a big antitrust issue. And at the desktop level I don't see Intel getting away with locking out ATI and NVIDIA video cards.
No idea, but Nvidia drivers have tended to be better than ATI's for the past five years when it comes to gaming. A hotly debated topic; in the end, anyway, as you imply, Intel GPU/Nvidia/ATI/Windows just plain fails at a decent gaming experience. Since late last year I gave up on PC gaming and finally got an Xbox 360. Not looking back.
I haven't had any more problems with ATI/AMD drivers in ten years of self-builds than Nvidia drivers. AMD seems a bit late with game updates in the last few months, that's about it. YMMV. And on the Apple side, they aren't made just by the chip manufacturer, Apple is a big part of the development too.
Quote:
What about front case ports? They are on a daughterboard hooked to headers on the motherboard.
Then depending on the means used to connect said ports to the motherboard, they'd either be possible or impossible. I would guess they'd be perfectly fine, given that they're not generally a user-changeable accessory.
Quote:
Saying main board only is dumb. And what about servers? A lot of them have basic, mostly VGA, onboard video on the PCI (not PCI-e) bus.
What does that have to do with anything? Instead of VGA out you'd have Thunderbolt out. I see absolutely nothing to prevent Thunderbolt on servers, particularly when that's where it's supposed to be used in the first place.
Adoption is more important than any lock-in they might get. If they have to, they may license the tech. Just look at the difference between the PCI & USB markets vs FireWire. Success for USB and PCI came from wide availability, not Intel-only chipsets.
Quote:
Originally Posted by SolipsismX
Why would Intel let Thunderbolt be used in a chain that doesn't have an Intel chip on one end? This has nothing to do with an iDevice having an ARM CPU but whether the machine it connects has Thunderbolt which requires Intel's approval which means an Intel chip. I'm sure you know that AMD is working on their own Thunderbolt competitor.
AMD's best bet would be to license the tech from Intel and maintain compatibility. In the long run it is in Intel's best interest to make sure TB is widely adopted.
Quote:
Then depending on the means used to connect said ports to the motherboard, they'd either be possible or impossible. I would guess they'd be perfectly fine, given that they're not generally a user-changeable accessory.
What does that have to do with anything? Instead of VGA out you'd have Thunderbolt out. I see absolutely nothing to prevent Thunderbolt on servers, particularly when that's where it's supposed to be used in the first place.
I'm not sure what the guy responding to you was basing his comments on, but PCI Express is used extensively in server machines. So that comment is bogus. As you note, servers could easily migrate away from VGA ports; eventually they will have to. Like many other video standards, eventually hardware won't be there to connect to VGA ports.
Quote:
No idea, but Nvidia drivers tend to be better than ATI for the past five years when it comes to gaming. A hotly debated topic, in the end anyways as you imply Intel GPU/ Nvidia/ ATI/ Windows just plain fails at a decent gaming experience. Since late last year I gave up on PC gaming and finally got an Xbox360. Not looking back.
NVidia is known for better drivers on Windows. While there are a lot of things that annoy me about the Mac drivers, they haven't been so far behind NVidia under OS X.
Quote:
I'm not sure what the guy responding to you was basing his comments on but PCI-Express is used extensively in server machines. So that comment is bogus. As you note servers could easily migrate away from VGA ports, eventually they will have to.
Servers tend to have low-end video chips on the PCI 33 MHz bus and do not use video built into the chipset.
They may move to DVI/HDMI, but routing video over a local I/O or external port in a server seems like a useless thing.
And I don't see a server needing the added cost of DP out, or eating up board space to have DVI, VGA, and DP ports on the system, or a DP + VGA port.
Quote:
No, what you are seeing is current implementations. An implementation does not imply a requirement. I would not be surprised at all to see TB built into the iPad sometime soon.
That's an interesting point you bring up.
Quote:
That is a big antitrust issue. And at the desktop level I don't see Intel getting away with locking out ATI and NVIDIA video cards.
There is nothing antitrust about Intel creating a technology and keeping it for themselves.
Because Intel's drivers are great...?
FYI, Intel's drivers are horrible.