Intel, Nvidia show off next-gen silicon potentially bound for Apple's future Macs

Comments

  • Reply 21 of 112
    ksec Posts: 1,569 member
    Performance in the past few years has already gone past the level where any consumer would care, especially with SSD-equipped machines. And with stacked-DRAM GPUs coming in a few years' time, that should finally take care of the minority of us who still want more performance. So even for prosumer/professional usage, I think Apple is waiting for time to fix itself.

    For the rest who need as much computation as they can get, that is where the Mac Pro fits in, and I think ECC RAM would be extremely important as the amount of memory grows. It would still need a 2-socket CPU setup, and hopefully 2x Tesla K20 GPUs with 2x SSDs in RAID. (And hopefully all of this will fit into a cube shape.)
  • Reply 22 of 112
    solipsismx Posts: 19,566 member
    zoffdino wrote: »
    What I meant was that a workstation with unbuffered RAM would be suitable for some usage scenarios.

    I know exactly what you meant. How is it my comments are unclear to both you and jragosta?
  • Reply 23 of 112
    dysamoria Posts: 3,430 member
    ksec wrote: »
    ... I think ECC RAM would be extremely important as the amount of memory grows.

    Exactly. It should be standard by now.
  • Reply 24 of 112
    The only reason I want a new computer is for better graphics performance. Everything else I do with my computer could be done satisfactorily with a ten-year-old machine. I would love to have one of those super Nvidia chips in a portable machine with a video-out port.
  • Reply 25 of 112
    mike fix Posts: 270 member
    All signs are pointing to Apple returning to the G5 PPC processor for their Mac Pro line of computers. Since they can't seem to make any steps forward, a step backwards is the logical conclusion.
  • Reply 26 of 112
    solipsismx Posts: 19,566 member
    dysamoria wrote: »
    Exactly. It should be standard by now.

    I don't see why more RAM would require ECC. There are too many cons and not enough pros for consumer devices. Where do you draw the line? Android phones are shipping with 2GB of RAM, and that's many times more than machines had in the '70s, so should those get ECC, too?
  • Reply 27 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by Mike Fix
    Since they can't seem to make any steps forward…

    Knock off the FUD or add an /s to this sort of thing.

  • Reply 28 of 112
    hmm Posts: 3,405 member

    Quote:
    Originally Posted by SolipsismX
    I don't see why more RAM would require ECC. There are too many cons and not enough pros for consumer devices. Where do you draw the line? Android phones are shipping with 2GB of RAM, and that's many times more than machines had in the '70s, so should those get ECC, too?

    This gets posted at times. The concern is the continually increasing density. If bit flipping were a common problem in properly functioning RAM, ECC would probably become the standard. Even then, ECC correction is specifically aimed at single-bit errors, so like I said, it just prevents bit-flipping anomalies. It isn't the only point of concern when it comes to data. I will say that pricing is relatively similar to non-ECC variants today. Ten or more years ago that was not the case. IIRC, workstations at that time didn't always ship with ECC by default due to the higher cost.
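    For anyone wondering what "aimed at single-bit errors" means in practice, here is a minimal sketch of the idea in Python: a toy Hamming(7,4) code. Real ECC DIMMs use a wider SECDED code over 64 data bits (and can additionally detect double-bit errors), but the mechanism is the same: extra parity bits let the controller locate any one flipped bit and flip it back.

    Code:
    # Toy Hamming(7,4) encoder/corrector -- the same single-bit-correction idea
    # ECC memory uses, just at 4 data bits instead of 64. Positions are 1-based;
    # parity bits live at positions 1, 2 and 4.
    def encode(d):                                   # d = [d1, d2, d3, d4]
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]  # 7-bit codeword

    def correct(c):
        # Recompute each parity over the positions it covers; the syndrome is
        # the 1-based index of a single flipped bit, or 0 if the word is clean.
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            c[syndrome - 1] ^= 1                     # flip the bad bit back
        return [c[2], c[4], c[5], c[6]]              # recovered data bits

    word = encode([1, 0, 1, 1])
    word[5] ^= 1                                     # simulate a random bit flip
    assert correct(word) == [1, 0, 1, 1]             # data comes back intact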

  • Reply 29 of 112
    jcm722 Posts: 40 member

    Quote:
    Originally Posted by Tallest Skil
    Yep, let's just go ahead and ignore that iMac, shall we?

    I realize this is my first post, and I will be looked at as an automated troll for a while. Hey, I've been away so long I forgot my old user name and password, so all is new. Yeah, let's ignore the iMac as a desktop box. It's an AIO and I feel that is a category of its own. I may have lost my mind, but years back the Pro could be downgraded in cost to something like $1899. Anyone remember that? I want to select my own screen, not an AIO. So that leaves me with the mini or the Pro. The mini is more than enough for my needs. I realize others want/need more. No matter what you own, it will become outdated in about 5 years, right? Just because the iMac is neat and tidy doesn't make it the best choice. Why replace both computer and screen? I would use the same monitor for as long as possible, not pitch the whole thing for a refresh.

  • Reply 30 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by JCM722
    Yeah, let's ignore the iMac as a desktop box. It's an AIO and I feel that is a category of its own.

    It's not, though.

    Originally Posted by JCM722
    I want to select my own screen, not an AIO.

    What's wrong with the H-IPS panel the iMac uses?

    Originally Posted by JCM722
    The mini is more than enough for my needs.

    But it's "laptop parts", as was said, so it's "invalid"…

    Originally Posted by JCM722
    Why replace both computer and screen?

    Because you can use an iMac as a display anyway. So use it for seven years, buy a brand-new Mac mini, and plug it in.

  • Reply 31 of 112
    nikon133 Posts: 2,600 member
    gazoobee wrote: »
    The average computer user shouldn't have to experience these kinds of waits (especially on what is essentially a background task), but no one has ever devoted serious resources to solving those kind of problems.  Instead it's always … "better graphics!"  As a life long computer user that doesn't give a crap about gaming, I find it annoying. 

    Graphics are one obvious element to improve, as there is an easy target to focus on: making visuals lifelike.

    But it's not as if other aspects of computing haven't improved. From the power-hungry, overheating pig the Intel P4 was, to the Core i7. From 128MB of RAM (what my first laptop had when purchased back in 2001) to 4GB, soon likely to be 8GB standard. From 40GB 4200rpm IDE HDDs to 1TB 7200rpm SATA3 spinners. From CD readers to Blu-ray burners. From 14" CRT screens to 20+" LEDs. Every aspect of personal computing has improved.

    The problem here is that most people do not care about - or even notice - a number of those improvements. A five-year-old Core 2 Duo will feel as fast in everyday tasks as the latest i7. In fact, the Core 2 Duo is already more than fast enough for the majority of those tasks, which is one reason low-powered tablets became so popular.

    Technology evolves much faster than mankind. We are the bottleneck right now. We do more Internet, more emailing, more social networking - but we don't do it faster than we did before. We still type more or less as fast as we ever did, and read as fast as we ever did, and a 20-minute YouTube video still takes 20 minutes to watch. My current hardware with my current broadband speed can probably stream dozens of videos at once, but to what end? I can still really focus on only one.

    So, graphics: one of the areas where we can actually see improvement. True, not for non-gamers and others who do not benefit from faster graphics hardware; but then, those people will not notice improvement from almost anything IT can bring. Those who do - and they are not only gamers, but everyone who benefits from the hardware acceleration modern graphics can bring to data processing - are, at present, the market segment willing to pay for better. Manufacturers are only answering market demand.

    I happen to be a gamer, but I also edit lots of photos and videos... so I'm definitely interested in better graphics and all the benefits they bring. I've been playing computer games since the Sinclair ZX81 and have also developed an interest in observing how that part of technology has evolved - and it has been an amazing voyage, from Scramble on the ZX81 to Battlefield 3 and Far Cry 3 on a modern PC. I'm really looking forward to what tomorrow brings.

    But even if I weren't... I mean, I don't race cars, but I'm still not annoyed when Ferrari releases a new supercar.
  • Reply 32 of 112
    Marvin Posts: 15,309 moderator
    jcm722 wrote: »
    I forgot my old user name and password, so all is new.

    http://forums.appleinsider.com/u/56644/WPLJ42
    jcm722 wrote: »
    Just because the iMac is neat and tidy, doesn't make it the best choice. Why replace both computer and screen? I would use the same monitor for as long as possible, not pitch the whole thing for a refresh.

    It's not just about what the buyer wants but what works best for the seller. Intel and NVidia could easily double performance every single year if they wanted to, but instead they do it every other year if we're lucky, so they stay in business twice as long. If they don't have any competition, what do they care?

    Apple doesn't want to sell you a machine and a display separately because all you'd do is go and buy a display from someone else that doesn't cost $1000. For some people a machine gets outdated more quickly, for others the display does. With an AIO, it doesn't matter, they both have to upgrade more often and Apple makes the margins on both parts.

    It's not an important issue, because laptops are by far the highest-selling PCs in the world and they are AIOs. That doesn't make iMacs laptops; it means they follow the same upgrade model, and the PC industry has woken up to the fact that it might actually be a profitable way to do business: it increases their average selling price, allows them to have unique selling points beyond raw performance, and cuts down on manufacturing since they aren't building two boxes with two power supplies.
  • Reply 33 of 112
    philboogie Posts: 7,675 member
    gazoobee wrote: »
    Sometimes I think it's funny when I'm importing some shows into iTunes and the whole computer just dies for a minute or two while it does it.<snip> I find it annoying. 

    Oh, that's a good point! You are really making sense here by sharing this experience. iTunes is really faulty; no matter how fast your Mac is, iTunes simply does not update fast. Or quickly. It stalls. When renaming, updating info, adding lyrics - it stalls. It even happens with scrolling. I have the media on an HDD and the rest is on ~, which is on an incredibly fast PCIe SSD. I know that info gets written into the header of .mp3 files and such, so the HDD is obviously the slowdown in the chain. But just from using iTunes you can tell the software is slow as well. From watching Activity Monitor, iTunes still uses only one CPU core, not even with HT.
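    Just to illustrate how you could check that outside Activity Monitor, here's a rough sketch using the third-party psutil package (the process name "iTunes" and the 150% threshold are my own assumptions, not anything from the post): on a multi-core Mac, a process whose CPU usage never climbs much past 100% is effectively stuck on one core.

    Code:
    # Rough sketch: sample a process's CPU usage; ~100% is roughly one full core,
    # so a reading that never passes that on a many-core Mac suggests the app is
    # effectively single-threaded for the work being measured.
    import psutil

    def looks_single_threaded(name, sample_seconds=2.0):
        cores = psutil.cpu_count(logical=True)
        for proc in psutil.process_iter(['name']):
            if proc.info['name'] == name:
                usage = proc.cpu_percent(interval=sample_seconds)
                print(f"{name}: {usage:.0f}% of one core, {cores} logical cores available")
                return usage < 150          # heuristic: never saturating a second core
        return None                         # process not running

    looks_single_threaded("iTunes")
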
    solipsismx wrote: »
    I know exactly what you meant. How is it my comments are unclear to both you and jragosta?

    It simply has to be a misread, forgetting the "un-" part. There's no other explanation.
  • Reply 34 of 112
    evilution Posts: 1,399 member


    2016! That's 10 years away!

    OK, 3, but it always seems like forever when you are waiting for awesome tech.

  • Reply 35 of 112
    wizard69 Posts: 13,377 member
    Sadly, it doesn't look like it will be as good an improvement as many of us hoped for. We still have the HD 5000 to come. I suspect, though, that even geeks care about better graphics in Intel's processors. That is really what makes things like the Air so attractive today.
    ascii wrote: »
    It could make the Mac Mini quite a bit better (due to the HD4600 integrated graphics). Better integrated graphics are a reason for non-geeks to care about new Intel chips.
  • Reply 36 of 112
    jcm722 Posts: 40 member


    Marvin ... Well, I have a unique low-vision situation and cannot use any of Apple's monitors. I also can't use zoom/magnification, as blurry just gets blurrier. So I am presently using a W7 AIO with an 18.5" screen @ 1366 x 768. The standard text in the AI forum is tiny and hard to see. I have sight and am not ready to surrender to VoiceOver. Also, Apple is rich, while I am poor. I don't care what Apple wants; I am the customer, and will do what I want and must do for my visual needs. If the 27-inch iMac could double-down, for lack of a better term, to 1280 x 720 with no loss in clarity, Apple just might get me.

    Tallest Skil ... I was not aware the iMac could be used as a monitor.

  • Reply 37 of 112
    jragosta Posts: 10,473 member
    jcm722 wrote: »
    Marvin ... Well, I have a unique low-vision situation, and cannot use any of Apple's monitors. I also can't use zoom/magnification, as blurry gets blurrier. So, I am presently using a W7 AIO with an 18.5 screen @ 1366 x 768. The standard text in the AI forum is tiny and hard to see. I have sight and am not ready to surrender to VoiceOver. Also, Apple is rich, while I am poor. Don't care what Apple wants, I am the customer, and will do what I want and must do for my visual needs. If the 27 inch iMac could double-down, for lack of a better term, to 1280 x 720, with no loss in clarity, Apple just might get me.

    Tallest Skil ... I was not aware the iMac could be used as a monitor.

    Wait a second. You can't use a 27" Apple AIO, but you can use an 18.5" W7 AIO?

    Considering that Apple has typically offered far better-quality screens on its AIOs than the competition, that's pretty hard to believe. Or were you simply unaware that you can change the resolution on an iMac, too? The newest 27" iMac will certainly handle 1366 x 768 - it may go lower as well. And considering the quality and size of the screen compared to your 18.5" AIO, the iMac would be FAR more readable than what you have.

    Head into an Apple Store or Best Buy and set the resolution to its minimum to see how much better it is.
  • Reply 38 of 112
    wizard69 Posts: 13,377 member
    Enough baloney to feed every hungry person in the world!!
    gazoobee wrote: »
    Yeah, but I would argue that this *needn't* be the case and that while the computer has enough "power" for the average user, their experience could certainly be improved in many ways.

    One of the reasons the average user upgrades is to get better performance. Admittedly, many of those users are running Windows, with all the performance issues there.

    gazoobee wrote: »
    It's been known for a long time now that the gaming industry "drives" the improvements in chip technology, but this just means that the type of improvements we get are aimed at the needs of that group.

    It is a driver for GPU cards; processors (CPUs) in general are designed to serve many purposes. CPUs advance when technology allows. This is why Haswell is seeing modest integer gains while benefiting from significant floating-point increases. It could be a long time before Intel engineers find a way to boost integer performance significantly.

    gazoobee wrote: »
    The average user's experience could be vastly improved by focussing on other things like concurrent execution and multi-processing,

    You have heard of multi-core processors, haven't you?

    gazoobee wrote: »
    but that wouldn't help games at all, so nothing is ever really done about that.

    That depends upon the game and the state of the toolkit used to build the game.

    gazoobee wrote: »
    I mean even on a multi-core Mac Pro with the best graphic cards and gigs and gigs of memory, it can still basically only do one intensive task at a time.

    Baloney!! If the user knows what he is doing, a Mac Pro can easily handle many processes running at once. Frankly, that is one of the reasons to have workstation-class machines.

    gazoobee wrote: »
    Sometimes I think it's funny when I'm importing some shows into iTunes and the whole computer just dies for a minute or two while it does it. It's hardly any different to when I was using my IBM XT and waiting for a print job to execute all those years ago.

    That is called using all available resources to get the job done!

    gazoobee wrote: »
    The average computer user shouldn't have to experience these kinds of waits (especially on what is essentially a background task), but no one has ever devoted serious resources to solving those kind of problems.

    Again, this is baloney and makes me wonder if you have any sense of history. I have on many occasions run a compiler, iTunes and Safari at the same time on an old 2008 MBP, with the machine remaining functional if slow. That is with a machine containing only 2GB of RAM. The fact that the machine can do this at all highlights the strength and reliability of Apple's Mac OS.

    Now I sure would love better performance, and I'm sure I would get that with an upgrade, but who doesn't expect better performance when upgrading a 5-year-old computer?

    gazoobee wrote: »
    Instead it's always … "better graphics!" As a life long computer user that doesn't give a crap about gaming, I find it annoying.

    I find you out of touch. First, GPUs are used in a number of ways these days, so a good GPU doesn't imply a gaming platform. Second, how did you miss all the improvements that have gone into Intel's CPUs over the last few years?
  • Reply 39 of 112
    jcm722 Posts: 40 member


    jragosta ... You missed the part where I said blurry gets blurrier. I am also using W7 and not OS X. So far, I cannot use any monitor that is not running at its native resolution. I have a 20-inch iMac gathering dust, since I cannot see it. Nope, I cannot drop the resolution either. Some people can see through the slight out-of-focus look left behind at a non-native resolution. I can't. I might be able to use a mini and a 27-inch/1080p monitor. If not, my next choice is a 26-inch HDTV at 720/768 resolution. Trust me, I've been messing around with this ever since the demise of the CRT, where you could alter the resolution with no loss in clarity.

    My point ... Apple could use desktop CPUs if the mini wasn't so small. The Pro is the only other true desktop from Apple. OK, fine, everyone seems to want notebooks anyway, so to hell with desktops and choice. The iMac is a fixed-resolution display to me, and as a result, useless. Same goes for everything but the iPad.

    Also, how does the "Site Only (no email)" feature work? How does that differ from not subscribing? Yes, I've been away from the AI forum a while.

  • Reply 40 of 112
    jragosta Posts: 10,473 member
    jcm722 wrote: »
    jragosta ... You missed the part where I said blurry gets blurrier. I am also using W7 and not OS X. So far, I cannot use any monitor that is not in the native resolution. I have an iMac 20 inch gathering dust, since I cannot see it. Nope, cannot drop the resolution either. Some people can see through the slight out of focus left behind at a different resolution. I can't. I might be able to use a mini and a 27 inch/1080 monitor. If not, my next choice is a 26 inch HDTV at 720/768 resolution. Trust me, I've been messing around with this ever since the demise of the CRT, where you could alter the resolution with no loss in clarity.

    OK, so your point is that you ruled out the 27" iMac without even trying it. With its incredible native resolution, it's going to be far sharper than you think. And even if it's run pixel-doubled, there won't be any interpolation - and the pixels will still be larger than what you have now.

    If you run it pixel-doubled, you have a 1280x720 monitor - the panel's native 2560x1440 is exactly twice that in each dimension, so every logical pixel maps to a clean 2x2 block of physical pixels, and it's not going to be blurry. It's nothing like the resolution interpolation you're thinking about. Go try one out before complaining that it won't work.
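    That arithmetic is easy to sanity-check. A minimal sketch in Python (the helper name is just for illustration): an exact, matching integer scale factor between the native and requested resolution means each logical pixel covers a whole block of physical pixels, so nothing gets interpolated; anything else, like 1366x768 on a 2560x1440 panel, forces blurry scaling.

    Code:
    # Does a requested resolution divide the native panel resolution by the same
    # whole number in both dimensions?
    def scale_factor(native, requested):
        nw, nh = native
        rw, rh = requested
        if nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh:
            return nw // rw          # integer factor: clean pixel doubling/tripling
        return None                  # non-integer factor: interpolated, i.e. blurry

    print(scale_factor((2560, 1440), (1280, 720)))   # 2 -> sharp, pixel-doubled
    print(scale_factor((2560, 1440), (1366, 768)))   # None -> interpolated
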
    jcm722 wrote: »
    My point ... Apple could use desktop CPUs if the mini wasn't so small. The Pro is the only other true desktop from Apple. OK fine, everyone seems to want notebooks anyway, so to hell with desktops and choice. The iMac is a fixed resolution display to me, and as a result, useless. Same goes for everything but the iPad.

    Then get an iMac and add whatever monitor you want. It's still cheaper than buying a Pro. And what's wrong with the Mini? It's a perfectly useful desktop computer - and you can add your own monitor. What are you doing that the Mini is unsuitable for?