US Mac sales down 5% year over year in Apple's June quarter, NPD says


Comments

  • Reply 21 of 40
    hydr wrote: »
    Apple should become more aggressive with their Macs. More new products, shorter update cycles, more features. Make the Mac a main agenda item, like iOS.

    Add a fingerprint reader and rapidly expand iCloud. Why can't *all* my Mac settings be stored in iCloud? Why can't I just swipe my finger on a brand-new Retina MacBook Air and have iCloud automatically download and configure everything in OS X?

    More focus on product, Mr. Cook. Bring the passion back; get Apple aggressive again. Invent, push, and improve our lives.

    Actually, what you are advocating is less focus: put all of Apple's products into a blender, add a fingerprint reader, and poop out "innovation." News flash: spray and pray is not how Apple works.
  • Reply 22 of 40
    gqb Posts: 1,934 member

    Quote:

    Originally Posted by ecs View Post

    So, people playing next-gen video games, amateur artists, individual developers, etc., are "high-end professionals"? Anybody who needs more than a web browser with Facebook is a "high-end professional"? It's obvious nobody here is a computer user, but a social-network user. And yes, current Intel offerings are more than enough for you, so I'm glad you're happy with Intel not increasing CPU performance. However, just consider what I was saying: the Mac is for people who need more power than the iPad can deliver. If the Mac doesn't get substantial 2x performance increases because of Intel failing to deliver them, then it's no wonder desktops become less interesting (why buy a new desktop, or a new laptop, if your old one has only slightly lower performance?)


    His point was that every one of the use cases you cite functions perfectly well with existing CPU speeds, and that, by and large, the only ones really benefiting from a 2x jump are likely to be pros who need to squeeze every second out of performance in the service of billable hours.

  • Reply 23 of 40
    muum79 Posts: 1 member
    I kind of understand it, since in the last nine months only the MacBook Airs have been updated. I think Apple is in a bad cycle pattern where everything (or the majority) gets updated at the same time: Macs, iPad, iPhone. Tim Cook even acknowledged they want to spread out the updates. I think it makes sense to have product launches in the summer, prior to the back-to-school period, and in the winter, prior to the holidays.
  • Reply 24 of 40
    mstone Posts: 11,510 member

    Quote:

    Originally Posted by ecs View Post

    If the Mac doesn't get substantial 2x performance increases because of Intel failing to deliver them, then it's no wonder desktops become less interesting (why buy a new desktop, or a new laptop, if your old one has only slightly lower performance?)



    As it turns out, Moore's law is mistakenly represented as a straight first-degree linear function, whereas over time we will see a plateau, such as a Gompertz function, because at some point CPUs won't need to be faster except in extreme cases, where academics will simply cluster them to achieve the required computational power. The majority of the population will probably never need more computational power than they already have. Faster networks, yes, but more CPU/GPU, not so much. Why do you think the iPad is such a success? That is all people really need. If you need more than a 15" Retina MacBook Pro with an SSD is pumping out, then you should probably give up on notebooks and go for a multi-CPU desktop or server with lots of memory. If you are in such a niche market that you demand extreme power, then price should be no object.
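    To make that plateau concrete, here is a minimal Python sketch comparing the two shapes. The parameters (doubling every two years, an arbitrary performance ceiling, the Gompertz constants) are made up for illustration, not fitted to real CPU data:

        import numpy as np

        # Exponential "Moore's law" curve: performance doubles every 2 years.
        years = np.arange(0, 21)
        moore = 2.0 ** (years / 2.0)

        # Gompertz curve: grows quickly at first, then saturates at a ceiling.
        ceiling = 1000.0          # arbitrary asymptotic performance limit
        b, c = 5.0, 0.25          # displacement and growth-rate constants
        gompertz = ceiling * np.exp(-b * np.exp(-c * years))

        for y in (0, 5, 10, 15, 20):
            print(f"year {y:2d}: exponential {moore[y]:7.1f}x, gompertz {gompertz[y]:6.1f}x")

    By year 20 the exponential curve has passed 1000x while the Gompertz curve is only crawling toward its ceiling, which is the difference between "twice the speed every upgrade" and "a niche of users clustering machines for more power."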

  • Reply 25 of 40
    tallest skil Posts: 43,388 member
    muum79 wrote: »
    Tim Cook even acknowledged they want to spread out the updates.

    When?
  • Reply 26 of 40
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by Tallest Skil View Post

    When?


    I think he was referring to this.

  • Reply 27 of 40
    tallest skil Posts: 43,388 member
    hmm wrote: »
    I think he was referring to this.

    Oh, but that was only because of the stupidity where you couldn't buy an iMac for three months because they hadn't made enough of the new ones yet. Two ways to go about that intelligently, and they didn't pick either: keep selling the old iMac and don't even mention the new one, or allow PREORDERS of the new one during the entire time you don't have anything for sale.

    That's why Apple doesn't announce products before they're ready to ship. It's a shame Cook had to learn that at all, much less the hard way, but at least it got out of the way early.

    "But the Mac Pro…"

    Is a completely different argument in that–no, shhh–in that the continued sale of the current Mac Pro is to shut up all the whiners who don't think the new model will be the future of all high-end professional computing for the next decade or two. This way you can buy a 2010 Mac Pro and have your precious (slow) giant tower. Then you don't get to complain when it's taken off the market.

    Apple did the same thing with the Intel transition. They gave people six months' notice that, hey, our computers are going to be different in the future. You want one of the ones we sell now? Have at it. And heck, the Power Mac G5 remained on sale until August of the following year, so over 12 months with that. People still bought it during the transition, mainly for the same reason as above.

    The iMac quote isn't proof–or even evidence–that they wished the overall updates were more spread out.
  • Reply 28 of 40
    analogjack Posts: 1,073 member


    I am amazed that so many people care about percentage shifts that eventually just translate into numbers in a bank account somewhere, which basically just means strings of electron spin orientation holding information.

    Sure, I get that people need a certain amount of money in order to live, but as long as Apple is able to produce the splendid products it now produces (from my POV, an rMBP and an iPod touch), that's worth more to me, as far as enjoying this amazing age we live in, than whether Apple makes 20 billion next year or 10 billion. For sure they are not going to go broke in the foreseeable future.

    And in the midst of their 'decline' in share price, they are releasing the extraordinary, paradigm-shifting new Mac Pro. As long as they keep up this level of quality product introduction, which they seem to be doing, I fail to see why anyone cares, other than those who have shares in Apple and are intent on accumulating zeros on the end of their bank balances.

  • Reply 29 of 40
    qamf Posts: 87 member


    Where is the proof that iPad sales are the factor?

    Anyhow, until we know how the Windoze OEMs are doing, it means nothing.



    -QAMF

  • Reply 30 of 40
    arlor Posts: 533 member

    Quote:

    Originally Posted by Seankill View Post

    I think it is the best approach. I like Intel processors; I am willing to pay the premium for their processors over AMD's. It seems like all the AMD PCs I have used have been pathetic.

    Very true nowadays, and fifteen years ago. Five to ten years ago, AMD was the price-to-performance leader (at least on desktops) by a wide margin. I still thank them for forcing Intel to up their game.

  • Reply 31 of 40
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by Tallest Skil View Post

    Oh, but that was only because of the stupidity where you couldn't buy an iMac for three months because they hadn't made enough of the new ones yet. Two ways to go about that intelligently, and they didn't pick either: keep selling the old iMac and don't even mention the new one, or allow PREORDERS of the new one during the entire time you don't have anything for sale.

    That's why Apple doesn't announce products before they're ready to ship. It's a shame Cook had to learn that at all, much less the hard way, but at least it got out of the way early.

    "But the Mac Pro…"

    Is a completely different argument in that–no, shhh–in that the continued sale of the current Mac Pro is to shut up all the whiners who don't think the new model will be the future of all high-end professional computing for the next decade or two. This way you can buy a 2010 Mac Pro and have your precious (slow) giant tower. Then you don't get to complain when it's taken off the market.

    Apple did the same thing with the Intel transition. They gave people six months' notice that, hey, our computers are going to be different in the future. You want one of the ones we sell now? Have at it. And heck, the Power Mac G5 remained on sale until August of the following year, so over 12 months with that. People still bought it during the transition, mainly for the same reason as above.

    The iMac quote isn't proof–or even evidence–that they wished the overall updates were more spread out.




    It's not healthy to be angry about everything. In the context of the Mac Pro, its last real update was in 2010. Even today the bottom configuration contains a 2009-era CPU option, and the GPU is two generations back. I don't think they are making many at the moment, so it shouldn't be a big deal. With the iMac, they could have pushed the announcement back and kept the old one on sale; they set up their own problem there. Even after invoking the Osborne effect, they could have maintained minimal production of the old one. Two-week-window returns can happen either way. The G5 was different: some of those applications didn't run suitably through Rosetta, so on the software end it was a much bigger transition. I bought one during that transition, even though I later changed to a first-generation Mac Pro; the reason was that my G4 died at that time.

    Anyway, I agree it's not proof. It was from a one-post account. Take it as you will.

  • Reply 32 of 40
    epsico Posts: 39 member


    I seriously hope Apple doesn't lose interest in the Mac; it's the main reason I choose Apple products at all.


    Quote:

    Originally Posted by ecs View Post

    The problem begins with Intel, IMHO: they're so worried about power efficiency that every new processor adds only about a 10% performance boost over the previous generation. While the (integrated) GPUs seem to be progressing, the new CPUs are a shame. How would you be interested in a new computer if it doesn't at least _double_ the performance of your older computer? (as was usual in the past)

    There's a need for powerful computers. Emulators such as MAME or MESS need powerful new CPUs to emulate recent arcade boards or recent consoles. Latest-generation ray tracers and unbiased renderers take a day to complete a render on current hardware. New video games need GPUs that just cannot fit inside an iPad. So new computers are needed. But new computers that double or triple previous performance, not the jokes Intel is releasing.


    I have to address this.

    You still get significant improvements, but processor technology has matured to the point where it's no longer possible to deliver them transparently. Since around 2004, CPUs have kept providing incremental improvements in the form of optimizations, but the bigger gains now arrive as new features that software must be rewritten to exploit.

    Case in point: a dual-core CPU has as much processing power as two single-core CPUs using the same technology at the same clock speed, yet applications written for single-core CPUs simply don't scale to multicore CPUs at all. They need to be rewritten, with their logic fundamentally changed to maximize parallelism and minimize contention.

    Another example: Intel introduced AVX with Sandy Bridge, and I have heavily optimized code written for SSE4a and NEON that could, in theory, run 2-3 times faster if I ported it to AVX as well (part of the speed improvement, and the reason it would go beyond 2x, is that AVX instructions support three operands, like most NEON instructions). However, I can't be bothered, because each time I did one of these optimizations I spent months on it, squeezing the most out of the instruction set and avoiding pipeline stalls as much as possible.
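    As a minimal sketch of that rewrite-for-multicore point (the chunked square-root busywork is a hypothetical stand-in for a real workload, not anything from this thread): the serial version can never use more than one core, no matter how many the CPU has, while the restructured version hands independent chunks to a pool of worker processes.

        import math
        import time
        from concurrent.futures import ProcessPoolExecutor

        def work(n):
            # CPU-bound stand-in for a real single-threaded computation.
            return sum(math.sqrt(i) for i in range(n))

        def run_serial(chunks):
            # Original single-core logic: one core, regardless of the CPU.
            return [work(n) for n in chunks]

        def run_parallel(chunks):
            # Same total work, restructured into independent chunks that a
            # pool of worker processes can execute on separate cores.
            with ProcessPoolExecutor() as pool:
                return list(pool.map(work, chunks))

        if __name__ == "__main__":
            chunks = [2_000_000] * 8
            t0 = time.perf_counter(); run_serial(chunks)
            t1 = time.perf_counter(); run_parallel(chunks)
            t2 = time.perf_counter()
            print(f"serial:   {t1 - t0:.2f}s")
            print(f"parallel: {t2 - t1:.2f}s")

    On a quad-core machine the parallel version should finish several times faster, but only because the loop was restructured into independent pieces; nothing sped up on its own.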




    Regarding ray tracers and similarly highly parallel code, their turf is the GPU, not the CPU. On a dual six-core Xeon you can run 24 threads at the same time (and I'm being generous here by counting Hyper-Threading as true parallel processing), while a single high-end consumer-grade NVIDIA card can keep 16,384 threads in flight, provided your kernels or shaders avoid branching. Video cards also tend to have much faster memory, and their SIMT design is better suited to running massively parallel code.
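    A hypothetical sketch of what "avoid branching" looks like in practice, using Python with Numba's CUDA support (this assumes an NVIDIA GPU and the numba package; the ReLU-style kernel is an invented example, not one from this thread). The conditional select is written as max() so every thread in a warp follows the same path:

        import numpy as np
        from numba import cuda

        @cuda.jit
        def relu_kernel(x, out):
            # One GPU thread per array element.
            i = cuda.grid(1)
            if i < x.size:                  # uniform bounds guard
                # max() instead of if/else keeps all threads in a warp
                # on the same path, avoiding divergent branches.
                out[i] = max(x[i], 0.0)

        x = np.random.randn(1 << 20).astype(np.float32)
        out = np.zeros_like(x)
        threads_per_block = 256
        blocks = (x.size + threads_per_block - 1) // threads_per_block
        relu_kernel[blocks, threads_per_block](x, out)
        print(out[:5])

    Launching a million threads here is routine for a GPU, which is exactly the scale mismatch with 24 Xeon threads described above.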




    EDIT: Fixed a few typos, hope nobody noticed.

  • Reply 33 of 40
    seankill Posts: 568 member

    Quote:

    Originally Posted by ecs View Post

    So, people playing next-gen video games, amateur artists, individual developers, etc., are "high-end professionals"? Anybody who needs more than a web browser with Facebook is a "high-end professional"? It's obvious nobody here is a computer user, but a social-network user. And yes, current Intel offerings are more than enough for you, so I'm glad you're happy with Intel not increasing CPU performance. However, just consider what I was saying: the Mac is for people who need more power than the iPad can deliver. If the Mac doesn't get substantial 2x performance increases because of Intel failing to deliver them, then it's no wonder desktops become less interesting (why buy a new desktop, or a new laptop, if your old one has only slightly lower performance?)

    You shouldn't make comments like "no one here is a computer user." I use my MacBook Pro for engineering work that includes simulations, so I use the quad-core to its maximum potential.

    If it's so easy to increase performance twofold, why isn't AMD passing Intel?

  • Reply 34 of 40
    qamf Posts: 87 member

    Quote:

    Originally Posted by Arlor View Post

    Very true nowadays, and fifteen years ago. Five to ten years ago, AMD was the price-to-performance leader (at least on desktops) by a wide margin. I still thank them for forcing Intel to up their game.



    Agreed. It also showed how strong Intel's relationships with OEMs are (sadly, were and still are super-strong).


    Quote:

    Originally Posted by hmm View Post

    It's not healthy to be angry about everything. In the context of the Mac Pro, its last real update was in 2010. Even today the bottom configuration contains a 2009-era CPU option, and the GPU is two generations back. I don't think they are making many at the moment, so it shouldn't be a big deal. With the iMac, they could have pushed the announcement back and kept the old one on sale; they set up their own problem there. Even after invoking the Osborne effect, they could have maintained minimal production of the old one. Two-week-window returns can happen either way. The G5 was different: some of those applications didn't run suitably through Rosetta, so on the software end it was a much bigger transition. I bought one during that transition, even though I later changed to a first-generation Mac Pro; the reason was that my G4 died at that time.

    Anyway, I agree it's not proof. It was from a one-post account. Take it as you will.



    Er, there are GTX 680 cards (http://videocardz.com/40799/evga-announces-geforce-gtx-680-mac-edition) that you can get for the current Mac Pro.

    Also, to be honest, CPUs haven't gotten much faster, just less power-hungry (a generalization, of course), which is kind of a non-issue with a workstation.

    To be honest, I am very happy they changed the case. I admit I am pro-AMD, so I also love the dual Tahiti GPUs.

    I have a Power Mac G5 (bought about a year ago). I almost never use it, but the case is very nice, and I plan to use it for my cheap-ass gaming PC build. I'll have to take it apart; currently it's only used to back up my music.

    -QAMF

  • Reply 35 of 40
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by QAMF View Post

    Agreed. It also showed how strong Intel's relationships with OEMs are (sadly, were and still are super-strong).

    Er, there are GTX 680 cards (http://videocardz.com/40799/evga-announces-geforce-gtx-680-mac-edition) that you can get for the current Mac Pro.

    Also, to be honest, CPUs haven't gotten much faster, just less power-hungry (a generalization, of course), which is kind of a non-issue with a workstation.

    To be honest, I am very happy they changed the case. I admit I am pro-AMD, so I also love the dual Tahiti GPUs.

    I have a Power Mac G5 (bought about a year ago). I almost never use it, but the case is very nice, and I plan to use it for my cheap-ass gaming PC build. I'll have to take it apart; currently it's only used to back up my music.

    -QAMF





    CPUs haven't gotten much faster at equivalent core counts; the bigger gains come from core-count increases. Notice how Apple switched from 2 x 6 cores to 1 x 12. The gains at comparable price levels have also been arguably weak. Power consumption isn't that much lower at the Xeon EP level, maybe somewhat with Ivy Bridge, and it's a much bigger deal with servers than workstations. AMD never chased the computation market prior to Tahiti; the FirePros were really built to add extra driver features and deal with certain OpenGL apps. I still suspect it may be mostly a rebranding effort, as 7970s with 6GB of VRAM retail around $600 each. It's probably something like that plus ECC RAM, so those could be $1000+ per card in Apple's calculations. I do not think Apple would go with 2 x $3k cards. I don't even think they will hold that $3k retail price much longer; typically these things drop off a bit on longer cycles.

  • Reply 36 of 40
    ecs Posts: 307 member
    seankill wrote: »
    You shouldn't make comments like "no one here is a computer user." I use my MacBook Pro for engineering work that includes simulations, so I use the quad-core to its maximum potential.

    If it's so easy to increase performance twofold, why isn't AMD passing Intel?
    Finally, let's see if my point is understood! If your simulations take hours to complete (and I don't label FEA users as "high-end professionals," but normal computer users), and a new $2000 Mac cuts the simulation time in half, would you be interested in a new Mac? Well, probably. But if all you get is a 20% saving, will you be interested in replacing your old Mac? Likely not. That was all I was saying: in the past, new computers had the candy of higher performance. That candy isn't there nowadays. No wonder desktops and laptops sell less: new products aren't such a substantial improvement over old ones. I'm not saying it's easy to improve CPU performance, just that new CPUs aren't really "new CPUs."
  • Reply 37 of 40
    cash907 Posts: 893 member


    Well, of course sales are down. They updated the Air line first, when what people wanted was an update for the rest of the Macs. Duh. I'm not dumping 1,700 bucks on a Retina MacBook when a significant update is set to drop in a couple of months.

    Mavericks is plenty stable. GM this bad boy and release the refreshes already.

  • Reply 38 of 40
    seankill Posts: 568 member

    Quote:

    Originally Posted by ecs View Post

    Finally, let's see if my point is understood! If your simulations take hours to complete (and I don't label FEA users as "high-end professionals," but normal computer users), and a new $2000 Mac cuts the simulation time in half, would you be interested in a new Mac? Well, probably. But if all you get is a 20% saving, will you be interested in replacing your old Mac? Likely not. That was all I was saying: in the past, new computers had the candy of higher performance. That candy isn't there nowadays. No wonder desktops and laptops sell less: new products aren't such a substantial improvement over old ones. I'm not saying it's easy to improve CPU performance, just that new CPUs aren't really "new CPUs."




    I agree with your comments as far as people not upgrading as much. That's just how things work; computers should be replaced on a 3-5 year cycle, not a one-year cycle. So if a 10% boost happens three times, you are looking at a processor about 33.1% faster. Now that sounds interesting. The boost in graphics over that three-year cycle is even better.
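    The compounding is easy to verify (a quick Python check, assuming a flat 10% gain per generation):

        # Three consecutive 10% generational gains compound multiplicatively.
        gain_per_generation = 1.10
        print(f"{gain_per_generation ** 3:.3f}x")   # 1.331x, i.e. ~33% faster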




    BTW, FEA on large models can be quite demanding. The Xeon processor in my workstation is often running pretty hard. It'd be cooler if I got to add an SSD, though.


    I have to push back on "FEA users being normal computer users." I don't believe my mother does FEA on her laptop. I would call engineering work pretty high-end professional work, as much as that of a film editor or even an artist.




    As for the die-hard gamers, the market will never truly care about them too much; they are a minority. Intel would go broke catering only to them.

  • Reply 39 of 40
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by Epsico View Post

    Regarding ray tracers and similarly highly parallel code, their turf is the GPU, not the CPU. On a dual six-core Xeon you can run 24 threads at the same time (and I'm being generous here by counting Hyper-Threading as true parallel processing), while a single high-end consumer-grade NVIDIA card can keep 16,384 threads in flight, provided your kernels or shaders avoid branching. Video cards also tend to have much faster memory, and their SIMT design is better suited to running massively parallel code.

    EDIT: Fixed a few typos, hope nobody noticed.



    Well, most ray tracing is performed by offline renderers, and in most cases the GPU is actually more limiting there. You'll notice any offline CUDA or OpenCL renderer tends to have severe limitations on the depth of the shader stack and the amount of texture data that can be loaded. What you say could be true in the future, but people have been saying the same thing for years. They aren't there yet, even if they have the potential to greatly advance that functionality. We are at a point where you might be able to deal with still shots and possibly some motion graphics work, just not where GPUs would start to displace render farms. It would be cool to see enormous CUDA render farms.

  • Reply 40 of 40
    If fewer and fewer people want to blow $1,500 on an Apple laptop, then OS X will be moved 100% onto the iOS income model.

    I reckon 2015 will be the big move that way, though I would not rule out it happening in 2014.