Comments
I think you need to study the history of Apple a bit further. Dating back to the original Apple I & II projects, they have always tended to design things themselves when existing technology wasn't good enough (e.g. Wozniak's custom disk controller). They also used SCSI instead of serial/parallel ports, and Firewire instead of USB for many years. Jobs was notorious for shunning technologies he didn't feel were well-designed (even if they were cheaper).
The only reason USB succeeded (and Apple finally relented) is because Intel ate all of the development costs and pretty much forced it to become the standard on PCs by making it artificially cheap for PC manufacturers. Intel wanted USB to succeed because it requires very complex logic to control the bus, so it maintains their business model where connected devices need a CPU to communicate with each other. Whereas Firewire didn't require a CPU to control the bus, the logic could be embedded in much simpler chips on the devices themselves. Hence why you could connect things like cameras and TVs together without needing a computer. But it didn't have a big company taking a loss to make it cheaper. That's the reality of why USB succeeded -- it wasn't better by any means, it was just artificially cheaper.
Anyways, to me, it's not a sign of arrogance to design your own technology, it's an attention to detail and care that most companies just simply don't have. Sure, if there's an existing technology that works well enough, then it should be used. But design-by-committee doesn't always lead to the best technologies coming to market.
Your understanding of USB vs FireWire is false. Apple made USB popular with the iMac G3. At that time, most PCs were still stuck on serial ports.
The Mac marketshare was abysmally small in 1998 when the iMac G3 debuted. Yes, PCs at the time were still using serial ports, but USB emerged as the PC standard although it took years. Don't forget that the iMac G3 also had two FireWire ports in addition to the USB 1.1 ports.
Remember, it was back in 2001 that Apple itself debuted a media player -- the iPod -- that could only sync via FireWire. The iPod was basically a Mac-only device; PC users had Creative and other MP3 players. Only as the iPod began to increase in popularity did Apple eventually add Windows PC and USB support.
Your understanding of USB vs FireWire is false. Apple made USB popular with the iMac G3. At that time, most PCs were still stuck on serial ports.
Apple put USB on the iMac because they needed a port for mice and keyboards and were transitioning away from the old ADB ports. It was never considered for anything more than low-bandwidth devices. Apple was behind FireWire as the next-generation, high-bandwidth connector for things like external hard drives, cameras, iPods, etc.
Shrinking the computer to pocket size and adding radios, sensors, auxiliary processors, displays and other outputs of several kinds means that Apple and any other company has to expand research just to keep things integrated and increasingly smaller and more functional. Your component suppliers won't do this for you. Your battery tech depends on your processor and your display tech and your software, etc.
The old Apple won't work in this new environment. As to your Omnivision point, I can't speak. Could be Apple was their usual arrogant, all-business selves. And Sony? Yes they lost focus, but surely the camera sensor division is still world class for mass manufacture plus quality, no?
The Mac marketshare was abysmally small in 1998 when the iMac G3 debuted. Yes, PCs at the time were still using serial ports, but USB emerged as the PC standard although it took years. Don't forget that the iMac G3 also had two FireWire ports in addition to the USB 1.1 ports.
Remember, Apple itself debuted a media player -- the iPod -- that could only sync via FireWire in 2002. The iPod was basically a Mac-only device; PC users had Creative and other MP3 players. Only as iPod began to increase in popularity did Apple eventually add Windows PC and USB support.
The first iMac did not have FireWire ports. PCs did not adopt USB even though it was developed by Intel, because Windows PCs sell with very small profit margins and their makers could not afford to add USB without hurting those slim profits. But after Apple introduced the iMac with USB, many peripheral companies introduced new devices that worked with USB, especially keyboards and mice. That made it easier for the PC companies to switch to USB.
I think Apple reached its peak when Steve Jobs died. Cook has followed a route that is un-Apple. The old Apple was never so active in research.
You have no idea what you're talking about. Apple spends, and has spent for years, a higher fraction of its total expenditures on research than most high tech companies. This was the case in the early '80s (I was in the Apple//&/// division then), and it's true now.
As for switching camera suppliers, it's not unethical, it's normal business practice when something better is available. Too bad your business couldn't keep up.
The first iMac did not have FireWire ports.
Remember, FireWire was a technology developed by Apple. It was standardized as IEEE 1394, but Apple's marketing department wanted a more consumer-friendly moniker than the IEEE number and picked "FireWire."
FireWire was conceptually intended to replace SCSI and parallel interfaces for high-speed data transfer. The original USB was so slow (1.5 Mbit/s low-speed, 12 Mbit/s full-speed) that it was really only suitable to replace low-speed serial ports and Apple's own aging ADB interfaces.
The above photo of the iMac G3 has two FireWire 400 ports.
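To put that speed gap in perspective, here's a rough back-of-the-envelope comparison -- just a throwaway Python snippet, with the 650 MB "music library" as an assumed example size and raw signaling rates standing in for real-world throughput:

```python
# Rough transfer-time comparison for the interface speeds mentioned above.
# Raw signaling rates only; protocol overhead is ignored, and the 650 MB
# "music library" is an arbitrary example size.
library_bits = 650 * 8 * 10**6  # ~650 MB expressed in bits (decimal MB)

rates_mbit = {
    "USB 1.x low-speed (1.5 Mbit/s)": 1.5,
    "USB 1.x full-speed (12 Mbit/s)": 12,
    "FireWire 400 (400 Mbit/s)": 400,
}

for name, mbit in rates_mbit.items():
    minutes = library_bits / (mbit * 10**6) / 60
    print(f"{name}: ~{minutes:.1f} min")
```

Even at USB's 12 Mbit/s full-speed rate that sync takes over seven minutes, versus well under a minute on FireWire 400, which helps explain why the original iPod synced over FireWire only.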
The first iMac did not have FireWire ports.
You're right. It was the second iteration of the iMac (about 1 year later) which added FireWire ports. But regardless, it was a fixture on all Macs after that point up until recently.
Some PC manufacturers had already adopted USB at that point; otherwise, device manufacturers wouldn't have bothered with what was effectively a Mac-only market (which, despite the success of the iMac, was still very tiny in 1998-2000). Also, PS/2-to-USB adapters were commonly included with devices as a cheap way to add USB support.
But it was really the fact that Intel essentially gave USB away for free in their reference motherboard designs that made it ubiquitous (only extra cost was the ports). As opposed to FireWire where you had to pay a license fee to include it. So, as I said, Intel ate the R&D costs for it.
Let's just call a spade a spade here and say that most manufacturers simply want cheap (as little R&D spend as possible, no licensing fees), not great. They're all fighting to sell products as cheaply as possible. Apple and Sony aim for great (high R&D spending, paying for talent, taking risks on unproven technology), instead of trying to be the cheapest, and get called arrogant for doing so.
You're right. It was the second iteration of the iMac (about 1 year later) which added FireWire ports. But regardless, it was a fixture on all Macs after that point up until recently.
Some PC manufacturers had already adopted USB at that point; otherwise, device manufacturers wouldn't have bothered with what was effectively a Mac-only market (which, despite the success of the iMac, was still very tiny in 1998-2000). Also, PS/2-to-USB adapters were commonly included with devices as a cheap way to add USB support.
But it was really the fact that Intel essentially gave USB away for free in their reference motherboard designs that made it ubiquitous (only extra cost was the ports). As opposed to FireWire where you had to pay a license fee to include it. So, as I said, Intel ate the R&D costs for it.
Let's just call a spade a spade here and say that most manufacturers simply want cheap (as little R&D spend as possible, no licensing fees), not great. They're all fighting to sell products as cheaply as possible. Apple and Sony aim for great (high R&D spending, paying for talent, taking risks on unproven technology), instead of trying to be the cheapest, and get called arrogant for doing so.
You forget one of my original points. The majority of consumers do not like to spend more money on a product than necessary; otherwise we would not see so many special deals posted on AI. The success of a product is not determined by 'great'. Sony became very successful because of the Walkman, which was not very expensive.
You have no idea what you're talking about. Apple spends, and has spent for years, a higher fraction of its total expenditures on research than most high tech companies. This was the case in the early '80s (I was in the Apple//&/// division then), and it's true now.
As for switching camera suppliers, it's not unethical, it's normal business practice when something better is available. Too bad your business couldn't keep up.
I was talking about the front-facing camera in the iPhone 6. It has the same 1.2 MP resolution as the 5S camera, which was supplied by OmniVision. Why did Apple need to switch suppliers for a camera with the same resolution? My guess is that the switch is probably due to an ex-Aptina employee who has now worked at Apple for several years. Aptina was a fierce competitor to OmniVision, and I think this person was involved in the decision.
Off-topic, but I vividly remember listening to the original demo tape that came with the Walkman back in 1979. It completely blew me away, and the unit itself was very well constructed.
I think Apple reached its peak when Steve Jobs died. Cook has followed a route that is un-Apple. The old Apple was never so active in research. Apple used USB in the iMac and Wi-Fi in the AirPort base station. Neither technology was developed by Apple, but both have benefited mankind in the most unimagined ways. Apple is working closely with Sony and making itself more and more like Sony. Sony is a dying corporation. It developed several technologies that are best in class, but they are so expensive that consumers became repelled by Sony. Sony has not changed recently. It is still an arrogant corporation.
Developing all technologies in-house has a big drawback. Even the most valuable corporation in the world cannot hire all the talent in the world. Those lucky enough to be hired by Apple will obviously work hard to outperform. Then Apple will become arrogant and overlook the technologies developed by other companies.
The third problem is that Apple is becoming evil. It easily discarded long-term suppliers like OmniVision, which supplied the 1.2 MP front cameras for years. This is an unethical business practice.
LOLOLOL. Sony's actually in the middle of a turnaround. The PlayStation division is generating billions in profits again, and all divisions except the mobile division are showing a profit. But keep telling yourself that.
I was talking about the front-facing camera in the iPhone 6. It has the same 1.2 MP resolution as the 5S camera, which was supplied by OmniVision. Why did Apple need to switch suppliers for a camera with the same resolution? My guess is that the switch is probably due to an ex-Aptina employee who has now worked at Apple for several years. Aptina was a fierce competitor to OmniVision, and I think this person was involved in the decision.
Sony makes the front camera for the iPhone 6 as well; Apple was both simplifying the supply chain and getting a better product.
Japan has a rich history of technology. Remember the Walkman?
It's pleasing to see Apple investing in Japan, as they seem a friendly and wise people, and their country is in dire straits. And it's not China.
And Japanese consumers love Apple products. iPhone has something like 48% of the Japanese smartphone market.
The first full quarter of iPhone 6/6+ sales should bump that even higher.
Source: http://appleinsider.com/articles/14/12/03/huge-iphone-6-sales-drive-ios-to-40-smartphone-market-share-in-australia-us-uk-japan
So what might the new R&D facility be for? Well, I seem to recall that Apple had worked with Sharp on advanced OLED display technology. And supposedly the two had patented a manufacturing process for producing high quality OLED panels at low cost with a novel "printing" process. Just search for "Sharp Apple OLED inkjet printing daisy wheel." (I know. Sounds weird.) You'll get plenty of hits from all kinds of sources.
You forget one of my original points. The majority of consumers do not like to spend more money on a product than necessary; otherwise we would not see so many special deals posted on AI. The success of a product is not determined by 'great'. Sony became very successful because of the Walkman, which was not very expensive.
Sony had already been working with cassette playback and recording technology for a long time before they released the Walkman. It's only because the components had come down in cost (and the R&D costs were much lower by then) that they were able to produce the Walkman so cheaply at that point in time. They also had a fantastic design team. This is one of my favourites.
Both Apple and Sony have typically operated in the high-end consumer electronics market. The reason is because they spend the money to do R&D on cutting edge technologies (and/or license it) and typically do end-to-end integration in-house. It's high risk and potentially high reward (or very expensive failure). However, they should not be derided as arrogant simply because they choose to go their own path in many cases instead of contributing their work to a technology pool or settling for existing technology which may not create the type of user experience they're aiming for.
But it's perfectly fine that other manufacturers look for the lowest cost components to create similar products for cheaper. There's room for both types of products to exist in the marketplace. This includes using technology designed by groups of companies who are willing to fund the R&D (like Intel) because they'll benefit in the long term. Similar to how Android is developed and given away for free to manufacturers by Google so that, in the long run, they can stream advertisements and collect data from users. Manufacturers get the benefit of a free mobile OS (low R&D cost), which brings the cost of the product down.
Consumers benefit from both types of companies: those who work on the cutting edge and design/integrate new technology into consumer products, and those who work to make existing technology for as cheap as possible (thus giving more people access to it).
Or a tax break from the Japanese government.
That's OK. Apple doesn't pay any taxes anyway...
/s
I think you need to study the history of Apple a bit further. Dating back to the original Apple I & II projects, they have always tended to design things themselves when existing technology wasn't good enough (e.g. Wozniak's custom disk controller). They also used SCSI instead of serial/parallel ports, and Firewire instead of USB for many years. Jobs was notorious for shunning technologies he didn't feel were well-designed (even if they were cheaper).
The only reason USB succeeded (and Apple finally relented) is because Intel ate all of the development costs and pretty much forced it to become the standard on PCs by making it artificially cheap for PC manufacturers. Intel wanted USB to succeed because it requires very complex logic to control the bus, so it maintains their business model where connected devices need a CPU to communicate with each other. Whereas Firewire didn't require a CPU to control the bus, the logic could be embedded in much simpler chips on the devices themselves. Hence why you could connect things like cameras and TVs together without needing a computer. But it didn't have a big company taking a loss to make it cheaper. That's the reality of why USB succeeded -- it wasn't better by any means, it was just artificially cheaper.
Anyways, to me, it's not a sign of arrogance to design your own technology, it's an attention to detail and care that most companies just simply don't have. Sure, if there's an existing technology that works well enough, then it should be used. But design-by-committee doesn't always lead to the best technologies coming to market.
Your understanding of USB vs FireWire is false. Apple made USB popular with the iMac G3. At that time, most PCs were still stuck on serial ports.
Apple made USB popular because Steve could see the trajectory it was on long before PC manufacturers could. However, Auxio is correct that Firewire is technically (vastly) superior to USB, providing full peer-to-peer communications that can be implemented in a low-resourced, peripheral environment whereas USB is master-slave. If I had the time to comment on your earlier posts I would, as some of your claims have fact and fiction well and truly intertwined.
All the best.
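To make that master/slave vs peer-to-peer distinction concrete, here's a toy sketch in Python -- invented class names, nothing resembling the real USB or FireWire stacks -- of the difference between a host-scheduled bus and one where nodes talk to each other directly:

```python
# Toy models only (invented classes, not real USB/FireWire code) to
# illustrate host-scheduled (master/slave) vs peer-to-peer bus topologies.

class Device:
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def respond_to_poll(self):
        # Host-polled model: a device only ever answers the host.
        return f"{self.name}: data for the host"

    def receive(self, payload, origin):
        # Peer-to-peer model: a device can be addressed by any other node.
        self.inbox.append((origin, payload))


class HostScheduledBus:
    """USB-style: the host initiates and schedules every transaction."""
    def __init__(self, devices):
        self.devices = devices

    def run_frame(self):
        return [d.respond_to_poll() for d in self.devices]


class PeerToPeerBus:
    """FireWire-style: any node may initiate a transfer to any other node."""
    def __init__(self, nodes):
        self.nodes = {n.name: n for n in nodes}

    def send(self, src, dst, payload):
        self.nodes[dst].receive(payload, origin=src)


camera, disk = Device("camera"), Device("disk")

usb_like = HostScheduledBus([camera, disk])
print(usb_like.run_frame())                    # everything flows through the host

fw_like = PeerToPeerBus([camera, disk])
fw_like.send("camera", "disk", "video frame")  # no host CPU involved
print(disk.inbox)
```

In the first model a camera can only hand data to the host, which then has to copy it to the disk itself; in the second the camera addresses the disk directly, which is why FireWire devices like cameras and TVs could talk to each other without a computer on the bus.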
I think Apple reached its peak when Steve Jobs died. Cook has followed a route that is un-Apple. The old Apple was never so active in research. Apple used USB in the iMac and Wi-Fi in the AirPort base station. Neither technology was developed by Apple, but both have benefited mankind in the most unimagined ways. Apple is working closely with Sony and making itself more and more like Sony. Sony is a dying corporation. It developed several technologies that are best in class, but they are so expensive that consumers became repelled by Sony. Sony has not changed recently. It is still an arrogant corporation.
Developing all technologies in-house has a big drawback. Even the most valuable corporation in the world cannot hire all the talent in the world. Those lucky enough to be hired by Apple will obviously work hard to outperform. Then Apple will become arrogant and overlook the technologies developed by other companies.
The third problem is that Apple is becoming evil. It easily discarded long-term suppliers like OmniVision, which supplied the 1.2 MP front cameras for years. This is an unethical business practice.
There are two ways to compete. Use the same technology as everyone else and compete on price, or try to develop something better and compete on features not price. There is room for both in the world. The first kind is good because it means more people can afford things. The second is good because it pushes technology forward faster.
But I do agree they should not develop everything in house, only when they can't find a good enough supplier already. For example with the Retina iMac. They could have used a 4K screen from suppliers, but they thought 4K was not a high enough DPI for 27", so they developed their own custom hardware. Custom display port connector, custom timing controller. But that is not a case of being arrogant and rejecting suppliers, it is a case of there being no supplier so you have to do it yourself.
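On the Retina iMac point, the pixel-density arithmetic backs that up. A quick sketch (standard 4K UHD and 5K resolutions plugged into the usual diagonal-pixels-per-inch formula):

```python
# Pixel density of a 27" panel at 4K UHD vs the Retina iMac's 5K resolution.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return hypot(width_px, height_px) / diagonal_inches

print(f'4K UHD (3840x2160) at 27": ~{ppi(3840, 2160, 27):.0f} ppi')  # ~163 ppi
print(f'5K (5120x2880) at 27":     ~{ppi(5120, 2880, 27):.0f} ppi')  # ~218 ppi
```

Roughly 163 ppi versus 218 ppi, and 5120x2880 is exactly four times the pixels of the older 27" iMac's 2560x1440 panel, so everything scales cleanly at 2x.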