Originally Posted by Avon B7
SATA and Firewire belong in different universes because they were designed for completely different reasons. SATA was designed to be an internal storage technology. Firewire was designed to be an external peripheral technology. They both move data over networks but from a design and planning point of view they had very different goals.
eSATA was simply an afterthought to SATA, and it shows. If they had considered an external option from the start, cable length and bus power simply would not be issues now, needing to be tacked on at a later date.
In the case of HANA still more problems had to be overcome. It's one thing to send data over ethernet from computer to computer (where there are ample resources at both ends and someone can administer traffic) but something completely different to send media to audio/video equipment of all kinds where resources will be extremely limited and non-upgradeable in a non-administered environment. In those situations things have to 'just work' as Apple likes to put it.
Networks move data, and that leads to congestion at some point. The key is finding a way to guarantee QoS where necessary. Firewire has always provided for this.
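To put numbers on that QoS point: FireWire's isochronous mode reserves bandwidth in fixed 125 µs cycles, with up to roughly 80% of each cycle available to reserved streams. Here's a back-of-the-envelope sketch; the S400 signaling rate of 393.216 Mb/s, the 80% share, and the ~28.8 Mb/s DV figure are my reading of the 1394/DV specs, and the function names are just illustrative:

```python
CYCLE_S = 125e-6             # one isochronous cycle: 125 microseconds
ISO_SHARE = 0.8              # up to ~80% of each cycle is reservable
S400_BITS_PER_S = 393.216e6  # actual S400 signaling rate (nominal "400 Mb/s")

def iso_budget_bytes(bit_rate=S400_BITS_PER_S, share=ISO_SHARE):
    """Bytes of guaranteed isochronous payload available per 125 us cycle."""
    return bit_rate * CYCLE_S * share / 8

def streams_that_fit(stream_bits_per_s):
    """How many constant-rate streams fit inside the reservable budget."""
    per_cycle = stream_bits_per_s * CYCLE_S / 8  # bytes each stream needs per cycle
    return int(iso_budget_bytes() // per_cycle)

# A DV camcorder stream is roughly 28.8 Mb/s including audio and overhead.
print(iso_budget_bytes())        # ~4915 bytes reservable per cycle
print(streams_that_fit(28.8e6))  # ~10 DV streams on one S400 bus
```

That per-cycle reservation is what lets FW 'just work' for video without an administrator: once a stream's slice is granted, congestion from other traffic can't take it away.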
You keep repeating that same line about E-SATA; can't you come up with another one? It's tiring reading it again.
SCSI was also, at first, an internal implementation. It was also extended externally, and never had bus power. It was very successful despite that.
SATA is the same sort of interface. E-SATA extends it. Power is not THAT important, but is being added.
We'll even be getting power over Ethernet before too long. 10 Gb Ethernet with power will be kick-ass.
The truth is that even the amount of power FW provides isn't nearly enough for most practical purposes. Most photographers in the field who shoot digital (almost all of them) use generators to get enough power for fashion and other big shoots. They don't power their external drives off the internal battery of their laptops. That's a joke!
As far as power in the studio, or any other fixed location goes, power over the bus simply isn't important.
Power over eSATA is coming out simply because few people really want to run single FW drives anymore once they compare the speeds with E-SATA. I hope you read the link I posted yesterday to Bare Feats. Even FW 400 doesn't have much of an advantage over USB 2 for HDDs now.
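For a rough sense of why eSATA wins those comparisons, here are the theoretical payload ceilings of the buses involved. A simplification I'm making: SATA's 8b/10b line coding is factored in as a 0.8 multiplier, the other buses use their nominal data rates, and real-world throughput is lower across the board:

```python
def max_mb_per_s(line_rate_mb, coding_factor=1.0):
    """Nominal payload ceiling in MB/s from a line rate in Mb/s."""
    return line_rate_mb * coding_factor / 8

buses = {
    "USB 2.0":   max_mb_per_s(480),        # ~60 MB/s
    "FW 400":    max_mb_per_s(400),        # ~50 MB/s
    "FW 800":    max_mb_per_s(800),        # ~100 MB/s
    "SATA 1.5G": max_mb_per_s(1500, 0.8),  # ~150 MB/s after 8b/10b
    "SATA 3.0G": max_mb_per_s(3000, 0.8),  # ~300 MB/s after 8b/10b
}
for name, mbs in sorted(buses.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} ~{mbs:.0f} MB/s")
```

Even first-generation SATA has triple the ceiling of FW 400, which is why single external drives stopped making sense on FW once E-SATA ports showed up.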
It really leaves just a few things: older camcorders, a few new models, and a few audio devices.
In another year or so, most of those will be on USB 2 or 3, make no mistake about that.
I bet MOTU is kicking itself for removing the USB 2 interface now.
The problem FW has is that it wasn't thought out too well back in the early days. In the beginning, FW was 100 Mb/s. That's all most camcorders can do, as they only have 100 Mb/s FW chips. The first FW chips couldn't even handle anything other than camcorders; it took over a year after the first implementations before that problem was resolved. It was thought that FW 400 would be all anyone would really need, so faster implementations were put on the back burner, until after USB 2 came out and they suddenly realized that they had waited too long. Then they started to work on an ambitious program.
Unfortunately, they found that FW was pretty complex, as we learned from all the problems we had with lost data and corrupted drives. FW 800 was about two years late. 1600 should have been out over 18 months ago, and 3200 should be here NOW, today, in machines.
They screwed the pooch, as it's said, and it isn't likely they will ever make up for it.
One of the biggest failures was not being able to convince HDD manufacturers to produce "native" FW HDD controllers, as they did with IDE, SCSI, and now SATA. That doomed the standard in the beginning.
Even HDTV, for which FW 400 was originally part of the standard, abandoned it a while ago.
So, while Apple did jump a bit early, the trend is already established. FW is dying. The fact that it will linger on for a while in some expensive equipment doesn't matter for the vast majority. Those needing that expensive equipment will pay for expensive computers to run it, as they always have. Everyone else will get something else.
As far as networks go, you are repeating the same errors.
Most equipment now has CPUs inside that can, and do, manage data (streaming or not), priorities, etc. All BD players have a computer built in, for example. All set-top boxes are computers, etc. FW is becoming less important all around as computing reaches the "ubiquitous" stage, with everything having its own built-in intelligence.
In the end, Intel and other CPU manufacturers win, because the cost of this intelligence has dropped to pennies per device. My toaster has a four-bit CPU with its own Flash memory, and the toaster only costs about $50.
I doubt there is a single camcorder these days without a CPU controlling its functions. Even some SLR lenses have two computers inside, and the cameras often have at least three.
The iPhone has three computer chips, and most other phones have at least one.
The argument for FW was good back when CPUs were still expensive and didn't exist in most other devices. The world has changed, so that's no longer true.
Thus, that last argument for FW has disappeared, as each device communicates around the network, and can participate in managing itself.
Ethernet has been taking over these functions too.
This will become even more obvious as time moves on.