Intel, Nvidia show off next-gen silicon potentially bound for Apple's future Macs

1356 Comments

  • Reply 41 of 112
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by wizard69 View Post

    Baloney!! If the user knows what he is doing a Mac Pro can easily handle many processes running at once. Frankly that is one of the reasons to have workstation-class machines.

    That is called using all available resources to get the job done!


    A lot of modern software has the ability to set priorities, especially when you want something to run in the background on available cycles. It already happens in a variety of places. It's just that at the level of "pro apps," for lack of a better term, some level of micromanagement is often available without the use of scripting.
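    As a rough sketch of that idea: on a Unix system (OS X included) you can drop a background job's priority yourself via the nice level. This is only an illustration of the concept, not how any particular pro app implements it, and the worker's task here is made up:

```python
# A minimal sketch of background work yielding to foreground work via the
# Unix nice level (os.nice is POSIX-only).
import os
import multiprocessing

def background_task(results):
    os.nice(19)  # lowest priority: only use cycles nothing else wants
    results.put(sum(i * i for i in range(100_000)))

if __name__ == "__main__":
    results = multiprocessing.Queue()
    worker = multiprocessing.Process(target=background_task, args=(results,))
    worker.start()
    # The foreground process keeps normal priority and stays responsive.
    total = results.get()
    worker.join()
    print(total)
```

    The worker still finishes; it just yields the CPU whenever anything at normal priority wants it.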

  • Reply 42 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by JCM722 View Post


    Also, how does the "Site Only (no email)" feature work? How does that differ from not subscribing?  Yes, I've been away from the AI forum a while.

    Let's see… there's a user page where you can see your subscriptions. If you have a thread set to site only, it shows up there but you don't get e-mail about it. If you turn off subscription, nothing new is added to that page (but previous subscriptions will still be there).

    I've never really understood the forum 'subscription' setup, myself. 

  • Reply 43 of 112
    jeffdm Posts: 12,951 member
    solipsismx wrote: »
    Less expensive RAM and accompanying logic board are the biggest benefits. Unbuffered (non-ECC) RAM may also have better performance than ECC RAM due to the removal of the error correction. Error correction sounds like a good thing but you have plenty of other ways to verify data integrity. If we're talking about a server or workstation it can make sense but not so much for a consumer system.

    Buffered and ECC are different things. I don't think I can explain it exactly, but buffering is a way of reducing the fan-in and fan-out of digital inputs and outputs, so you can reliably run more RAM chips on a memory bus, somewhat like a USB hub. The principle is that there are only so many devices you can drive on a bus, and buffering acts as a multiplier by hiding more devices behind it. ECC is simply a way of detecting and correcting bit-flip errors. You can have buffered ECC RAM or unbuffered ECC RAM. I think you can have buffered non-ECC, but it probably doesn't make that much sense.

    I'm not sure what other means you have in mind to verify data integrity. I suppose you can keep checksums on blocks of data, but if you detect an error that way, you can't correct it.

    Bit-flip errors aren't that common, though; I don't think it matters a lick in consumer systems. Except for bad memory, I've never seen a bit flip recorded whenever I checked. I've seen it written that you can expect a bit flip per week per gigabyte of RAM, but I have 10GB on my old Mac Pro and have never seen one recorded in the System Report window.
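    For anyone wondering how ECC corrects rather than merely detects, here's a toy Hamming(7,4) sketch. Real ECC DIMMs use a wider SECDED code over 64-bit words, so treat this purely as an illustration of the principle:

```python
# Hamming(7,4) sketch: encode 4 data bits with 3 parity bits, then locate
# and repair any single bit flip from the parity-check syndrome.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Return (corrected codeword, 1-based error position, or 0 if clean)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 * 1 + s2 * 2 + s3 * 4   # syndrome spells out the error position
    c = list(c)
    if pos:
        c[pos - 1] ^= 1              # flip the bad bit back
    return c, pos

word = encode([1, 0, 1, 1])
flipped = list(word)
flipped[4] ^= 1                      # simulate a cosmic-ray bit flip
fixed, pos = correct(flipped)
assert fixed == word and pos == 5    # located and repaired
```

    Flip any single bit of the codeword and the syndrome points straight at it; a plain checksum would only tell you that something, somewhere, is wrong.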
  • Reply 44 of 112
    ecs Posts: 307 member


    Does this mean NVIDIA doesn't have any remarkable new release before 2016? This must be a joke; we don't even know what computers we'll be using by 2016...

  • Reply 45 of 112
    ecs Posts: 307 member

    Quote:

    Originally Posted by mitchell_pgh View Post


    [...]


    I love the iMac, minus the tangled mess of wires you have if you have a few external hard drives, media reader, etc. Also, I would much rather have two 22" screens vs. one 27" screen.

    My feelings exactly. For all users who want a discrete GPU and a custom screen setup, the only thing Apple has to say is "buy a Mac Pro". It's nonsense to force you to buy a Xeon if you don't want an "all in one".

    An iMac without screen, please...

  • Reply 46 of 112
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by ecs View Post

    My feelings exactly. For all users who want a discrete GPU and a custom screen setup, the only thing Apple has to say is "buy a Mac Pro". It's nonsense to force you to buy a Xeon if you don't want an "all in one".

    An iMac without screen, please...





    Buying a Xeon doesn't inherently add that much to the price. It requires an LGA2011 board rather than one set up for LGA1155, and the LGA2011 i7s cost the same amount. Looking at the bottom configuration, they use the daughterboard configuration to avoid having to use dual-socket components. The CPU ($300 retail), GPU (PC version $149 retail), RAM (under $50 retail), hard drive ($80-ish retail for a comparable one), sleds, and pretty much everything in that case are not that expensive, especially when compared to a 27" display panel. Please don't fall into the trap of linking its price to the use of Xeons. It is more likely that Apple priced it that way to space it out from the iMac; the rest of their line reads similarly. Now you don't have to like that, but it's pretty much what they make. The outrage at Xeons is just misdirected anger, as the components needed to build the 3.2 are nowhere near the price of those needed to build a maxed-out 12-core. It's also important to note that dual-socket machines are often sold at much higher markups.

  • Reply 47 of 112
    Marvin Posts: 15,323 moderator
    ecs wrote: »
    My feelings exactly. For all users who want a discrete GPU and a custom screen setup, the only thing Apple has to say is "buy a Mac Pro". It's nonsense to force you to buy a Xeon if you don't want an "all in one".

    An iMac without screen, please...

    You can buy an iMac with a 3rd party display or two, or even a MacBook Pro. If you don't like the display, shove it on the floor like you would a PC tower. The only difference is you can't easily upgrade your internal storage. You can get a refurb MBP though:

    http://store.apple.com/us/product/FD103LL/A/refurbished-macbook-pro-23ghz-quad-core-intel-i7

    put in an SSD, 16GB RAM, hook up a 3rd party display and you even get a DVD drive. That's roughly the price Apple would sell a display-less iMac for. You're not getting a 680 GPU but it's still a powerful machine. It's equivalent to a desktop with a Core i7-3770T and the Radeon 5770 they use in the Mac Pro.
    ecs wrote:
    Does this mean NVIDIA doesn't have any remarkable new release before 2016? This must be a joke; we don't even know what computers we'll be using by 2016.

    They will have Maxwell next year:

    http://www.digitalartsonline.co.uk/news/creative-hardware/nvidia-reveals-next-generation-graphics-chips-maxwell-volta/

    This year is rebranding time, so a Kepler refresh: probably some higher clock speeds, but not much different from last year.

    Their stacked DRAM later on might not make that much of an improvement. They talk about things like moving a Blu-ray's worth of data in 1/50th of a second or whatever, but video memory is limited in size anyway, usually to 4GB or less. It'll help with higher resolutions and larger textures, but it won't boost raw calculation speed the way more cores do. I hope they won't go the route of marketing other features to distract from processing power. They will probably struggle to hit lower process nodes and they're just going to leave this wide open for Intel. The target looks healthy enough though, at what looks like 24 DP GFLOPS per watt. That would mean higher-end desktop GPUs would be capable of around 5 TFLOPS double precision and laptop GPUs around 0.5-1 TFLOP.
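    As a back-of-the-envelope check of those numbers (the TDP figures here are my own assumptions for typical power budgets, not anything Nvidia stated):

```python
# Sanity-check the 24 DP GFLOPS-per-watt roadmap figure against assumed
# power budgets for a high-end desktop GPU and a laptop GPU.
GFLOPS_PER_WATT = 24          # double precision, from the roadmap target

desktop_tdp_w = 225           # assumed high-end desktop GPU power budget
laptop_tdp_w = 35             # assumed laptop GPU power budget

desktop_tflops = GFLOPS_PER_WATT * desktop_tdp_w / 1000
laptop_tflops = GFLOPS_PER_WATT * laptop_tdp_w / 1000

print(f"desktop: {desktop_tflops:.1f} DP TFLOPS")  # 5.4
print(f"laptop:  {laptop_tflops:.2f} DP TFLOPS")   # 0.84
```

    Which lands right in the ballpark of 5 TFLOPS for desktops and under 1 TFLOP for laptops.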
  • Reply 48 of 112
    hmm Posts: 3,405 member


    I wanted to point out that they don't always coincide well with Intel's refreshes, and of course sometimes the low end continues in rebranded form. I wouldn't totally count on Maxwell in the Macs next year. It depends on what vendor Apple chooses and what is available at the time they want to launch. I hope they stick with Nvidia. CUDA can run certain things where OpenCL isn't yet supported or isn't as fast. I'm curious what they mean by this statement.

    Quote:


    Kayla brings the computing power of GeForce and Tesla to one computer, Huang said. Nvidia said the computer is capable of doing real-time ray tracing, which generates accurate images by tracing paths of light. In addition, the computer also supports CUDA 5, OpenGL and also PhysX.



    The example of raytracing is extremely ambiguous given the range covered by that term. They probably used it because it has the potential to be a highly parallel operation, which makes it a good candidate for comparison, with some amount of context. I would point out that a lot of details affect the rate of calculation. Uninterpolated samples require a greater number of rays to clear up noise. Reversed raytracing (camera to light) is also typically faster than tracing from light source to camera.
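    To make the camera-to-scene direction concrete, here's a toy backward raytracer: one ray per "pixel" cast from the camera at a single sphere, with simple diffuse shading printed as ASCII art. Purely a sketch of the principle, nothing like Nvidia's demo:

```python
# Toy backward raytracer: rays start at the camera, so we only pay for
# paths that could actually reach the eye.
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Camera at the origin looking down -z at a sphere; one ray per pixel.
WIDTH, HEIGHT = 16, 8
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
light_dir = normalize([1.0, 1.0, 1.0])  # direction toward the light

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a point on an image plane at z = -1.
        px = 2 * (x + 0.5) / WIDTH - 1
        py = 1 - 2 * (y + 0.5) / HEIGHT
        d = normalize([px, py, -1.0])
        t = ray_sphere((0, 0, 0), d, sphere_center, sphere_radius)
        if t is None:
            row += "."
        else:
            hit = [t * di for di in d]
            n = normalize([h - c for h, c in zip(hit, sphere_center)])
            shade = max(0.0, sum(a * b for a, b in zip(n, light_dir)))
            row += " .:-=+*#"[int(shade * 7.99)]  # brighter facing the light
    print(row)
```

    Starting at the light instead would mean tracing huge numbers of rays that never reach the camera, which is why the camera-to-scene direction is typically faster.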

  • Reply 49 of 112
    Marvin Posts: 15,323 moderator
    hmm wrote: »
    I'm curious what they mean by this statement.
    Kayla brings the computing power of GeForce and Tesla to one computer, Huang said. Nvidia said the computer is capable of doing real-time ray tracing, which generates accurate images by tracing paths of light. In addition, the computer also supports CUDA 5, OpenGL and also PhysX.

    The example of raytracing is extremely ambiguous given the range that is covered by that term. They probably used it because it has the potential to be a highly parallel operation, which makes it a good candidate for comparison with some amount of context. I would point out that a lot of details affect the rate of calculation. Uninterpolated samples require a greater number to clear up noise. Reversed raytracing (camera to light) is also typically faster than light source to camera.

    They have a video demo of Kayla:


    [VIDEO]


    It's a mobile platform too, as in Tegra. It continuously renders, so it won't get final quality in real time, but it looks quite powerful for what it is. The Logan version they said would be fanless, so potentially these would go into tablets and phones. I think they said they'd ship with OpenGL 4.3 support. If only everyone could do that. They have the whole GTC presentation on their YouTube page:



    They show off a new facial animation demo for Titan in part 2. I can't believe they replaced the fairy. The only reason computers exist is so that we can one day render virtually perfect women that do everything we ask and they switched over to some Steve Ballmer lookalike.
  • Reply 51 of 112
    jcm722 Posts: 40 member


    Originally Posted by Marvin View Post

    You can buy an iMac with a 3rd party display or two, even a Macbook Pro. If you don't like the display, shove it on the floor like you would with a PC tower. The only difference is you can't easily upgrade your internal storage. You can get a refurb MBP though:



    http://store.apple.com/us/product/FD103LL/A/refurbished-macbook-pro-23ghz-quad-core-intel-i7



    put in an SSD, 16GB RAM, hook up a 3rd party display and you even get a DVD drive. That's roughly the price Apple would sell a display-less iMac for. You're not getting a 680 GPU but it's still a powerful machine. It's equivalent to a desktop with a Core i7-3770T and the Radeon 5770 they use in the Mac Pro.

    Marvin ... You're kidding, right? Why would anyone want to purchase an iMac and not use the screen? Why would someone purchase an iMac, with poor accessibility, instead of the Pro? I truly know nothing about Xeon CPUs, but it does seem odd, as "hmm" mentions, that the 3.4GHz i7 in the upgraded ($2199) iMac costs $294 and the Xeon CPU in the $2499 Mac Pro is also $294. So Marvin, why would anyone, for a difference of $300 at this point, shove an iMac on the floor, and not a fully accessible Mac Pro? For now, the Pro still has an optical drive. For that matter, so does the MacBook Pro, right?

  • Reply 52 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by JCM722 View Post


    Why would someone purchase an iMac, with poor accessibility, instead of the Pro?


    Then buy the Mac Pro!






    I truly know nothing about Xeon CPUs, but it does seem odd, as "hmm" mentions, the 3.4GHz i7 in the upgraded ($2199) iMac costs $294. The Xeon CPU in the $2499 Mac Pro is also $294.


    Different processor families, different processor lines, released at different times…






    For now, the Pro still has an optical drive. For that matter, so does the MacBook Pro, right?


    Why should that be a factor in purchase anymore? 

  • Reply 53 of 112
    Marvin Posts: 15,323 moderator
    jcm722 wrote: »
    Why would anyone want to purchase an iMac and not use the screen?

    Some people want an iMac without the screen, so buying one and not using the screen has the same effect. Worst case, you'd be paying for a display you don't use. Big deal; it's still cheaper than the Mac Pro, and if you really had to, you could rip the display out and sell it on eBay, which has the added benefit of making it accessible. Most likely, once you unboxed it you would use the display, so it solves two problems: you get an iMac without the screen and you get a free screen so you don't have to buy one.
    jcm722 wrote: »
    why would anyone, for the difference of $300 at this point, shove an iMac on the floor, and not a fully accessible Mac Pro?

    To save $300 and get USB 3 and Thunderbolt, possibly a Fusion drive, an iSight camera (which they can use if they sit on the floor) and a nice 27" IPS display, which would otherwise cost at least $350 for 1080p or $650 for the same resolution.

    A better question is: why would anyone pay $300 more for a large box that you most likely won't open more than once, that doesn't have a display, is slower yet uses more power, doesn't have USB 3, takes up more space, and is heavier to move around?
  • Reply 54 of 112
    jcm722 Posts: 40 member


    Marvin ... My old iMac never moved from the spot it was in. Same goes for my HP AIO. So, what happens if the HD in an iMac fails? You have to take the entire unit to an Apple Store for repair, rather than just replacing the HD yourself as in a Pro. If AppleCare has expired, it must cost way more for a certified Apple tech to replace the drive than the DIY ease of the Pro. Don't you also believe the Pro will get Thunderbolt and USB 3 in 2013? Looking at the specs, the wireless and Bluetooth need updating in the Pro. If the Xeon CPU is such a dog, why is Apple using it? I believe the Sandy Bridge i7 iMac 27 was proven to be faster than the single-CPU Pro. I still use my optical drive and LightScribe from time to time.

  • Reply 55 of 112
    philboogie Posts: 7,675 member
    jcm722 wrote: »
    In looking at the specs, the Wireless and Bluetooth need updating in the Pro.

    Wow, I don't know a single company, institution, or person using a MP who relies on wireless anything... for pro usage. Dual Ethernet, on the other hand...
  • Reply 56 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by JCM722 View Post

    If the Xeon CPU is such a dog, why is Apple using it?

    Because there isn't anything faster out.

  • Reply 57 of 112
    Marvin Posts: 15,323 moderator
    jcm722 wrote: »
    My old iMac never moved from the spot it was in. Same goes for my HP AIO. So, what happens if the HD in an iMac fails? You have to take the entire unit to an Apple Store for repair, rather than replace just the HD in a Pro.

    It is a problem, and one I hoped Apple would fix instead of making worse by gluing it shut, but they obviously want to make sure they make the money on the storage, and there is a design issue with how they allow access to it. With the Mac Pro, they already get their margin with the high entry price, so it's not important to lock it down. The laptops, on the other hand, are very easily accessible for now, as they can't feasibly seal them. The Mini is a bit of a hassle, but it can be done in under 20 minutes.

    If they offered affordable SSD-only options, it wouldn't be much of an issue and this will come in time.
    jcm722 wrote: »
    Don't you also believe the Pro will get Thunderbolt and USB in 2013? In looking at the specs, the Wireless and Bluetooth need updating in the Pro. If the Xeon CPU is such a dog, why is Apple using it? I believe the Sandy Bridge i7 iMac 27 was proven to be faster than the single CPU Pro.

    The Intel chipset for Ivy Bridge EP won't support USB 3 so Apple would have to implement it using a separate controller. Haswell-EP chipsets support USB 3. Thunderbolt depends on what they do with the GPU. The Xeon class of processor allows Apple to use multiple CPUs and scale beyond 4 cores. When it only has 4 cores though, the i7 is better value.
  • Reply 58 of 112
    macronin Posts: 1,174 member


    Now we can go ahead and turn this into an xMac thread…!

    I, for one, would LOVE it if Apple were to come out with a consumer-oriented mini-tower, one that would allow me to have adequate cooling for all components and a full-size graphics card.

    Right now I am using an older MacBook, and WoW (World of Warcraft) kicks its ASS… SMC Fan Control is installed; regular operation is at 4k rpm & about 120 degrees F. Launch WoW and the fan cranks up to 6k+, the temp skyrockets to 210+ F, and if I run it too long, eventually the system just plain shuts down…

    So, if I could get a desktop i7 with a decent fan & heatsink, and a full-size GPU, again with a decent fan & heatsink, these would be good things…

    I fully realize that games can be played on the hardware in the top-of-the-line iMac, but all of the iMacs I have used get pretty hot when things are cranking. A nice mini-tower might allow the heating issues to be alleviated.

    Looks like my next Mac might actually end up being a Hackintosh, unless I can get a REALLY good deal on an outgoing single-CPU Mac Pro (once the 'new' Mac Pro comes out sometime this year) and throw in that 'new' (yeah, a year old I think, but WAY newer than the card that currently comes in the Mac Pro) ATI card…

  • Reply 59 of 112
    philboogie Posts: 7,675 member
    macronin wrote: »
    Now we can go ahead and turn this into an xMac thread…!

    Please don't. If you want anything faster than a MacBook, get a Mac Pro. Well, only if you're outside of the EU, or wait for a new model to arrive if there's no pressing need to get one this very moment.
  • Reply 60 of 112
    tallest skil Posts: 43,388 member


    Originally Posted by PhilBoogie View Post

    Please don't. If you want anything faster than a MacBook, get a Mac Pro.




    Or suck it the heck up and get an iMac.
