Originally Posted by djames42
Gotcha - I assumed you were talking about their computer hardware. No doubt they make a pretty penny on much of their software, and it's well known that the iPod line carries a huge markup. Makes me wonder why Microsoft doesn't really lower the price of the Zune. As one of the few real competitors to the iPod (yes, there are the Sansa, the Archos, and the Zen lines, all of which do more than either the iPod or the Zune, but they don't have the integrated ecosystem), Microsoft's markup must be similar, and they could easily undercut the iPod and increase their market share.
My guess is they probably don't think cutting the price will make a huge impact. They were too late to the game and, quite frankly, the product isn't as good.
I understand being happy with margins, especially the big ones that Apple has on their computers. But I think in Apple's case you could make an argument for trimming top-end bonuses (and they'd still be very nice) and cutting margins only slightly.
Historically Apple's hardware cost has run around 60% of the sale price, which is quite low, and the margin grows greatly when people add options via custom builds. For instance, it costs $75 to upgrade from a 320GB SATA HDD on the iMac to a 640GB one. Apple likely pays less than retail for the drive itself AND will put the 320GB drive back into manufacturing. A Seagate 7200RPM Barracuda at 320GB costs $55 on NewEgg. The 640GB Seagate with 32MB of cache that I bought is $70. So realistically it is a $15 upgrade. If Apple charged $20 they would maintain a similar profit margin to the base model, but instead they hammer the buyer with a $75 upgrade fee. And those drives are loaded off of a mirror, so it doesn't take much work to load the system on them. They actually have them preloaded. Even $25 would be a reasonable price for the upgrade, but $75 is a HUGE markup.
It costs $100 to go from 2GB of DDR1066 to 4GB of it. A 4GB kit from Kingston at NewEgg costs $61.99, and Apple pays less for memory. Yes, they'd have to pull out the 2x1GB sticks, but those would go right back into manufacturing, and 2x1GB costs $35.49. So a $30 or even $35 upgrade fee would be reasonable and they'd still be making extra cash. Instead they charge $100. Yes, you can do it yourself just like the HDD, but then you get the privilege of tossing the 2GB worth of RAM instead of Apple reusing it in manufacturing. It would seem "greener" to me to charge a reasonable price for a memory upgrade and ensure end users don't just toss their extra RAM.
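To make the two upgrade markups above concrete, here's a quick back-of-the-envelope sketch. The prices are the NewEgg retail figures quoted in the post; Apple's actual component costs are unknown and presumably lower, so treat the markup numbers as conservative estimates.

```python
# Markup over the retail price difference for a build-to-order upgrade,
# assuming the removed part goes straight back into manufacturing.
def upgrade_markup(apple_fee, new_part, old_part):
    net_part_cost = new_part - old_part
    return apple_fee - net_part_cost

# HDD: $75 fee, $70 640GB drive vs. $55 320GB drive
hdd = upgrade_markup(apple_fee=75.00, new_part=70.00, old_part=55.00)
# RAM: $100 fee, $61.99 4GB kit vs. $35.49 2x1GB sticks
ram = upgrade_markup(apple_fee=100.00, new_part=61.99, old_part=35.49)

print(round(hdd, 2))  # 60.0
print(round(ram, 2))  # 73.5
```

Even at retail part prices, both upgrades carry $60+ of pure markup over the actual cost swing.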
I'm not going to deny being a fanboy (although I don't think that's necessarily a bad thing, I have a strong preference [based on real-world experience with various platforms] and definitely stick to it--and my bias shows
). However, I also argue that there is more here than simply cost and performance, there is also user experience as well as included extras. I find the Mac experience to be genuinely enjoyable as well as productive, while I find using Windows to be quite unpleasurable, very utilitarian, and frequently unproductive (especially when I consider the amount of lost data my co-workers experience with their Dell laptops and their almost monthly blue screens as well as the downtime involved with bi-weekly reboots due to security patches, and the performance hit taken by the anti-virus software we have to use).
I like OSX better myself. I'm not a fan of buying a ready-made PC, but I've found Vista to be fairly enjoyable. I like that I was able to build a machine without a bunch of preloaded garbage, something Apple is also very good about on the machines they build.
I should say that I've had very few security patches with Vista thus far, but I got it with SP1 already included. No crashes, no problems, and it is the most secure Windows OS I've ever used.
I also have no problem paying a premium for a Mac, but the question is how much of a premium.
A base iMac is $1199. Let's assume the usual 60% cost, so Apple pays around $719 for the hardware. Other factors have to be rolled in, such as development, marketing, etc. However the partnership with Intel has cut down Apple's costs for engineering, especially on the logic board. Let's assume an additional 20% on top of that $719 for those costs, though they shrink the more iMacs are sold because they are distributed across the line. That brings us to $863. Let's assume a 15% profit margin on the iMac based on the end price. The price would be about $1015 for a base iMac. Let's go up to $1049 just to make it all nice and even.
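That price walk-through can be reproduced as a short sketch. All the percentages (60% hardware cost, 20% overhead, 15% target margin) are this post's assumptions, not Apple's published figures.

```python
# Back-of-the-envelope base iMac pricing, using the post's assumptions.
retail = 1199.00                    # current base iMac price
hardware_cost = retail * 0.60       # assumed 60% hardware cost
loaded_cost = hardware_cost * 1.20  # +20% for development, marketing, etc.
# A 15% margin on the end price means cost is 85% of that price:
suggested_price = loaded_cost / 0.85

print(round(hardware_cost))    # 719
print(round(loaded_cost))      # 863
print(round(suggested_price))  # 1016
```

Rounding up to a clean $1049 sticker, as suggested above, would still leave a margin a bit above the assumed 15%.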
And remember that as HDD price, CPU price, GPU price, and RAM prices go down the margins get better.
I would guess, and I've heard it from friends that work closely with Apple, that Apple tends to run about 30% profit on their hardware before bonuses are paid out.
The margins are likely the same or better on laptops.
Apple would still make a boatload of money starting the iMac out at $1099 AND they would also get more people in the door.
Imagine if they had a desktop that started at $799 or $899 - hardly a bottom feeder at that price - with dual core and quad core options as upgrades at REASONABLE upgrade prices.
I don't think Apple need give up premium pricing because I think A LOT of people don't mind paying $100-200 extra on the lower end of things for a Mac. I know I wouldn't have a problem with it. The problem is that on the bottom end of things it is likely closer to $400-500 and it gets worse as you move up.
My example comparing to my machine is based on Apple's builds. The reason THAT looks so bad for Apple is because they offer nothing like my machine. Mac Pro is overkill and the iMac is underkill, but they overcharge to get to 8GB (Apple charges $1100 to upgrade to 8GB). I know 4GB sticks are hard to find and expensive, but they're not THAT expensive. $400-500 would be reasonable for an upgrade to 8GB on the iMac, though a design with 4 memory slots would make 8GB far more attainable. OSX is 64 bit but if you don't have a MacPro then good luck affordably getting beyond 4GB.
It wasn't the motherboard and controller components I was referring to necessarily. However, if I think back to the hardware I've built, I've often used low-end Taiwan-made motherboards with their questionable driver support, off-branded burners and graphics cards (which may use the same chipset, but are unquestionably not built to the same standards as the name brand cards). The freezes I frequently experience, I attribute to the crummy drivers that come with the hardware.
I'm very careful when I build. I check the MoBo. I check all the hardware. I pay more for better parts. I didn't buy the bottom-rung Asus MoBo. I got a Seagate HDD, the same make my iMac had. I got a Gigabyte nVidia GPU.
However, that seems to go against the paradigm shift in thinking that even Microsoft is pushing for. It's been said that we now have too much information to keep organised with simple hierarchies, so we might as well stuff all our documents into a single document folder, our images in a single folder, and use metadata to keep our information organised, much as software such as iTunes and iPhoto (or in my case, Aperture) does. I used to be anal about keeping my music and photo folders organised. Now I let the software take care of it. I've found it to be more efficient, and I don't have to think about it anymore.
I understand why people would like this, but I hate the performance hit a system takes while indexing the drive, so I always turn indexing off. I just prefer to stay organized from the beginning.
Hmm, I do recall a small (but significant) decline in the previous quarter, but don't recall one prior to that. Statistics can be fudged to look however we want them to, so I say this with a grain of salt, but one could attribute much of that decline to the economy. Sales of technology items are down pretty much across the board. Apple's appear to be down less than the majority of the sector.
I believe it was a 1.1% decline first quarter versus a .6% growth for the overall PC industry largely driven by netbook sales.
The word is that Apple has experienced a 3% decline thus far in the new quarter, with iMacs being the hardest hit.
I honestly haven't looked much at the new hardware, other than the MacBook Pro, as my 1st generation 1.83 CD is very aged, and doesn't perform as well as I'd like when working in Aperture. I'll almost certainly be stretching this machine out another year though. And while I certainly did pay a premium for this laptop, it's already lived a longer life than my previous two Windows laptops, one of which was an even more expensive Vaio.
Honestly, I don't disagree at all on the laptops, though I dislike the lack of selection. I don't want to have to go to the MacBook Pro to get a 15" screen and I do think the prices on the MacBook are a bit high right now. The previous iteration was fairly priced, but they got really out of whack on this one and assumed that minor upgrades were worth more than they truly are.
I'm not a laptop guy. I have had them, and if I bought another it would likely be a Mac, but I find the MacBook to be about $200 more than it should be - and even with that markdown it would still cost more than equivalent PC notebooks. That's fine, because it is much better.
I love the idea of an iMac. They're beautiful, functional, and minimalist. I agree that something more powerful would be very welcome. I don't much care for upgradability. I generally find that by the time I feel the need for a more powerful video card, it's also time for a full CPU and motherboard upgrade. Seems like room could be made for a second hard drive though.
That's why I love the desktop. I can spend $50 for a new nVidia card and link it up with my current one via SLI and double or better my GPU power. I can also buy a socket-compatible processor. My MoBo supports AM3, so quad-core 3GHz and beyond will be available for me with more L1, L2, and L3 cache at a cost of $150 or maybe less, depending on when I upgrade.
Have they? I think much of Microsoft's troubles are rooted in their insistence on backward compatibility with 15 year old software. While I haven't tried it yet, I have an old card game I used to play on Windows 3.1 that would probably still work under Windows 7. I know it worked fine under XP. That's nice, but as I said before, create a new architecture and provide a compatibility sandbox for legacy software. That's the only way to get software companies to really migrate to the new technology.
I know I can't run my old Windows stuff in Vista. That's one reason why MS got dumped on for it - they threw out a lot of compatibility. The 64-bit version can't run 16-bit Windows stuff at all, and only some of the full-blown Win32 stuff can be run.
That's why you can download Virtual PC for free for Vista. They did that to address the compatibility issues. Also a lot of old hardware won't work.
I disagree that the Mini is a horribly incapable machine. I wouldn't use it for my Aperture work (although I hear the new one performs comparably there to my aging MBP), but I bought last year's version for the office and it's my primary work machine. It works great running Entourage, NeoOffice, X11, Pages, and a multitude of other tools I use day in and out. I'd argue that it's really the perfect machine for the vast majority of homes who really don't need anything more powerful than that.
Anyway, I do have to thank you for writing a cordial response. It was more than I deserved (or expected) after my hostile response to your post.
The big problem I have with the mini, again, is the price and lack of any flexibility. The average user cannot upgrade the memory, so they're stuck paying Apple's crazy prices ($150 for an upgrade to 4GB on the base model). That $599 model comes with no keyboard or mouse, a 120GB HDD, and only 1GB of RAM. For $399 or $449 it might be okay, but the 250GB HDD costs very little extra and an extra GB of RAM probably costs Apple $10, if that.