Originally Posted by al_bundy
with USB and PCI it took Intel a long time to get their technology into computers. with Apple's record of shipping computers without "legacy" ports they found a partner that will ship millions of units with their port.
PCI was adopted very quickly, as I remember. It seems like it only took one motherboard generation. USB was a different matter; I think the biggest hurdle was OS support.
Originally Posted by fyngyrz
The future is wireless. Get the damned cables off my desk. All of them. I don't want to see anything more than power cords, and I'm not all that happy with them, either.
Sometimes I think the world just gets stuck. Cables are out. RF is in.
Wires contain the signal and limit interference, which makes them easier to design for. For every wireless standard, you'll find a wired counterpart that's faster. Also, the higher the frequency, the more line of sight matters. Wireless still requires power that a wired cable usually supplies by default; even this Light Peak idea offers power connectors. On top of that, wireless is less efficient with its signal: it radiates in all directions, while a wire sends most of its signal down a controlled path.
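To put a rough number on the frequency point: the standard Friis free-space path loss formula shows attenuation climbing about 6 dB every time the carrier frequency doubles, which is why higher-frequency radios (5 GHz, 60 GHz) need cleaner line of sight. A minimal sketch (the 10 m distance and the specific frequencies are just illustrative picks):

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis): FSPL(dB) = 20*log10(4*pi*d*f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# At a fixed 10 m distance, loss grows with carrier frequency:
for ghz in (2.4, 5.0, 60.0):
    print(f"{ghz:5.1f} GHz: {free_space_path_loss_db(10, ghz * 1e9):6.1f} dB")
```

Note this is the idealized free-space case; walls and furniture only make the high-frequency picture worse.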
Originally Posted by mdriftmeyer
Apple invented Firewire, created miniDVI/miniDisplayPort and many other modifications to standards all subsequently added to standards used by the Industry.
Who else has adopted miniDVI? Has anyone else adopted miniDP yet?
Originally Posted by Joe The Dragon
but can this do pci-e over it?
video cards need to be on the pci-e bus and not a super usb bus with high cpu load.
Video cards don't have to be on PCIe. That's just the currently favored standard. If that changes, the card makers will adapt and switch to a new connector, as they have in the past with the ISA -> PCI -> AGP -> PCIe transitions.
Originally Posted by X38
Systems like Verizon FIOS have already solved the "last mile" problem and easily beat cable in cities lucky enough to have it. Even FIOS is bottlenecked by copper once it gets inside the home, though. Maybe we should call that the "last foot" problem. I've heard they have been working on a solution, but haven't made it economical yet.
This Light Peak looks like it might quickly end up being the de facto solution to the "last foot" problem. Just imagine - a high speed optical connection all the way from the phone company trunk lines to the motherboard of your computer.
It's not a problem for the immediate future: FIOS doesn't even saturate 100Mbit Ethernet. Then there's gigabit, and there are 10-gigabit Ethernet standards that use copper too. Over shorter distances, optical doesn't have quite the same advantage, though it's probably more reliable at 10gig anyway.
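The arithmetic backs this up: even a generous FIOS tier is a small fraction of what the copper already in the house can carry. A quick sketch (the 50 Mbit FIOS tier and the 4 GiB file size are just example figures, and these are nominal line rates ignoring protocol overhead):

```python
# Ideal time to move a 4 GiB file over various links (no protocol overhead).
links_mbit = {
    "FIOS 50 Mbit tier":  50,      # example residential tier
    "Fast Ethernet":      100,
    "Gigabit Ethernet":   1_000,
    "10GBASE-T":          10_000,
}

file_bits = 4 * 1024**3 * 8  # 4 GiB expressed in bits

for name, mbit in links_mbit.items():
    seconds = file_bits / (mbit * 1_000_000)
    print(f"{name:20s} {seconds:8.1f} s")
```

So the "last foot" copper is an order of magnitude faster than the "last mile" feed today; the question is only how long that stays true.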
Originally Posted by huntercr
Apple *did* have the 1000base-T ports in desktops far before anyone else. I always thought that was really weird since no mortal user even had a switch that could handle the bandwidth ( at home anyway ), but I guess edit houses using the "brand new" ( at the time ) Final Cut probably loved it.
The chart has a double standard anyway, comparing the high end Mac platform against mainstream PCs. Not only that, if a high end Mac gets a port, the entire Mac platform is counted, but the Wintel platform isn't counted unless most machines get it. Gigabit ethernet is pretty common on PCs now; it's been standard on workstations and business desktops for maybe five years, and is starting to encroach on even the budget machines.
I still need to get a reliable gigE switch. I have one, but about once a week it would quit working properly, requiring a power cycle to fix, so I just quit using it. This is despite my having several PCs and Macs, and as of this week even a printer, with gigE.