Step 1: Add more cores, Step 2: ???, Step 3: Faster Computer!

Posted in Future Apple Hardware
Assuming that most programs today are single-threaded, how does a multi-core or multi-CPU setup benefit the user, unless you're a pro who multitasks most of the time? I realize that modern OSes are SMP, and there are a few programs out there that support SMP, but the 80% (just a guess on my part; anyone have a number on this?) that are single-threaded will perform worse on a slower multi-core system than on a faster single-CPU one.



I know that 'multi-multi-core' is the future, but at present it looks to me like the new 'megahertz myth'. Instead of "Our CPU is 25% faster than the competitor!", it's "Our CPU has FOUR CORES and the competitor only has TWO!". Meanwhile, almost no (individual) programs on either system will benefit from the additional cores, until future versions are released. For the programmers out there (I'm studying hardware/networking): is there any way to patch a current program to use SMP? That would require a complete recompile/rewrite, correct?




Comments

  • Reply 1 of 22
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by iPoster


    Assuming that most programs today are single-threaded, how does a multi-core or multi-CPU setup benefit the user, unless you're a pro who multitasks most of the time?



    Because OS X is heavily multitasked regardless. Even when you spend all your time in just one app, say Microsoft Word, you still have several dozen other processes running, from the more obvious ones (Finder and Dock) to the less obvious ones, such as SystemUIServer.



    And then, of course, there's stuff that you may just like to keep running, like iTunes for background music, Adium/iChat/Skype/etc. for instant messages, your e-mail client or at least a notifier for new e-mail, etc. It adds up.



    Even assuming that the one program you use the most is single-threaded, your other core(s) will be able to host all the other stuff, and that will help greatly.



    Quote:

    I realize that modern OSes are SMP, and there are a few programs out there that support SMP, but the 80% (just a guess on my part; anyone have a number on this?) that are single-threaded will perform worse on a slower multi-core system than on a faster single-CPU one.



    Yes.



    Quote:

    For the programmers out there (I'm studying hardware/networking): is there any way to patch a current program to use SMP? That would require a complete recompile/rewrite, correct?



    It requires the program to use a lot of threads. Threading isn't exactly the easiest programming topic, so a lot of developers try to avoid it.
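
    To make the "use a lot of threads" point concrete, here is a minimal sketch - nothing Apple-specific, just generic POSIX threads that OS X (or any pthreads platform) can compile, with made-up names like worker() and chunk - showing a serial summing loop split across two worker threads, one per core:

    Code:
        /* Toy example: summing an array with two POSIX threads.
         * Compile with something like: cc -o sum sum.c -lpthread */
        #include <pthread.h>
        #include <stdio.h>

        #define N 1000000
        static double data[N];

        struct chunk { int start; int end; double sum; };

        /* Each worker sums only its own slice of the array. */
        static void *worker(void *arg)
        {
            struct chunk *c = arg;
            int i;

            c->sum = 0.0;
            for (i = c->start; i < c->end; i++)
                c->sum += data[i];
            return NULL;
        }

        int main(void)
        {
            struct chunk a = { 0, N / 2, 0.0 };   /* first half of the data  */
            struct chunk b = { N / 2, N, 0.0 };   /* second half of the data */
            pthread_t ta, tb;
            int i;

            for (i = 0; i < N; i++)
                data[i] = 1.0;

            pthread_create(&ta, NULL, worker, &a);
            pthread_create(&tb, NULL, worker, &b);
            pthread_join(ta, NULL);
            pthread_join(tb, NULL);

            printf("total = %f\n", a.sum + b.sum);
            return 0;
        }

    Which is also why the answer to the recompile-versus-rewrite question is "rewrite": the work has to be carved up and the data access reorganised as above, not just recompiled.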
  • Reply 2 of 22
    wmf Posts: 1,164 member
    Quote:
    Originally Posted by Chucker


    Because OS X is heavily multitasked regardless. Even when you spend all your time in just one app, say Microsoft Word, you still have several dozen other processes running, from the more obvious ones (Finder and Dock) to the less obvious ones, such as SystemUIServer.



    And then, of course, there's stuff that you may just like to keep running, like iTunes for background music, Adium/iChat/Skype/etc. for instant messages, your e-mail client or at least a notifier for new e-mail, etc. It adds up.



    It only adds up to about 20% of a core. So you use one core to run your main app, 20% of the second core to run all the background stuff, and the other cores are doing what?



    Fortunately, I don't think this problem will really hit until 2008 (when the iMac goes quad).
  • Reply 3 of 22
    marcuk Posts: 4,442 member
    Well, the reality is that for most uses you don't need more than 2 cores at present. As has been said, one core takes care of your main app while the other is doing background services. Indeed, AMD's K8L can shut down 2 cores and run the other two at less than full speed if required, and I suspect that for most of you this is pretty much how your quad-core chips will operate most of the time.



    Clearly, with the X360, PS3 and Cell, developers are going to be thinking a lot more about multithreading games, so in reality, unless you're playing cutting-edge games or spending a lot of time on 3D rendering, video encoding - that sort of thing - quad cores will probably be a bit of a waste.



    But we'll still want them!
  • Reply 4 of 22
    iposter Posts: 1,560 member
    Quote:
    Originally Posted by MarcUK


    Well, the reality is that for most uses you don't need more than 2 cores at present. As has been said, one core takes care of your main app while the other is doing background services. Indeed, AMD's K8L can shut down 2 cores and run the other two at less than full speed if required, and I suspect that for most of you this is pretty much how your quad-core chips will operate most of the time.



    Clearly, with the X360, PS3 and Cell, developers are going to be thinking a lot more about multithreading games, so in reality, unless you're playing cutting-edge games or spending a lot of time on 3D rendering, video encoding - that sort of thing - quad cores will probably be a bit of a waste.



    But we'll still want them!



    My point was that the hardware technology for multiple cores seems to be outstripping the software's ability to make full use of those 2/4 (eventually 6/8) core machines.



    But like you said, we'll still want one!
  • Reply 5 of 22
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by iPoster


    My point was that the hardware technology for multiple cores seems to be outstripping the software's ability to make full use of those 2/4 (eventually 6/8) core machines.



    But like you said, we'll still want one!



    The number of cores will probably always be a power of two, so there probably won't be a six-core anything, any more than you can find a single stick of memory that's 0.75 GB. I doubt a typical computer configuration would offer both a two-core and a four-core CPU in the same computer.
  • Reply 6 of 22
    leonard Posts: 528 member
    Quote:
    Originally Posted by MarcUK


    Well, the reality is that for most uses you don't need more than 2 cores at present. As has been said, one core takes care of your main app while the other is doing background services. Indeed, AMD's K8L can shut down 2 cores and run the other two at less than full speed if required, and I suspect that for most of you this is pretty much how your quad-core chips will operate most of the time.



    That's the same argument as saying most users don't need more than a 2 GHz computer. You're talking about computing power, no matter whether you get it from more GHz or more CPUs/cores. We eventually find a use, and lo and behold, you do in your next paragraph:



    Quote:
    Originally Posted by MarcUK


    Clearly, with the X360, PS3 and Cell, developers are going to be thinking a lot more about multithreading games, so in reality, unless you're playing cutting-edge games or spending a lot of time on 3D rendering, video encoding - that sort of thing - quad cores will probably be a bit of a waste.



    But we'll still want them!



    That's the same comment as
  • Reply 7 of 22
    tht Posts: 5,443 member
    Quote:
    Originally Posted by iPoster


    Assuming that most programs today are single-threaded, how does a multi-core or multi-CPU setup benefit the user, unless you're a pro who multitasks most of the time?



    We will have to wait for the software to catch up to the hardware this time around.



    In the future, the extra cores will be used to support CPU/IO system services that are much more elaborate than those used today. Filesystem journaling takes up a non-trivial amount of CPU (5 to 10%). Spotlight indexing, caching, and whatnot take a non-trivial amount of CPU and I/O. Virus protection, scanning, etc. take up non-trivial CPU.



    With more cores, perhaps these sorts of services can be expanded to make them even more useful, thus requiring more threads, more cores, and a lot more I/O.



    Quote:

    I know that 'multi-multi-core' is the future, but at present it looks to me like the new 'megahertz myth'. Instead of "Our CPU is 25% faster than the competitor!", it's "Our CPU has FOUR CORES and the competitor only has TWO!". Meanwhile, almost no (individual) programs on either system will benefit from the additional cores, until future versions are released.



    Single-threaded performance will still increase in the future. Witness the Core 2 Duo's single-threaded performance increase over existing CPUs - they claimed 20% instead of 25%. I don't think that will change in the near term.
  • Reply 8 of 22
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by Leonard


    That's the same argument as saying most users don't need more than a 2 GHz computer. You're talking about computing power, no matter whether you get it from more GHz or more CPUs/cores.



    Currently, there isn't much need for the mainstream use of 2+ GHz or multi-core chips except for games, and Macs are simply not the platform of choice for that anyway.



    The real question is, will there be a use that comes about during the life of the computer? I don't know, but I'm generally pretty happy with 3+ year-old computers, though I'll admit that I just bought a decommissioned original-release Power Mac G5 dual 2.0. I have it set to "Reduced" in the power-saving settings just so that the fans don't spin up and annoy me. I suppose it's like a dual 1.0 GHz machine; I don't really know how far the G5 downclocks.
  • Reply 9 of 22
    marcuk Posts: 4,442 member
    Quote:
    Originally Posted by Leonard


    That's the same argument as saying most users don't need more than a 2 GHz computer. You're talking about computing power, no matter whether you get it from more GHz or more CPUs/cores. We eventually find a use, and lo and behold, you do in your next paragraph:







    That's the same comment as



    I don't think most users need more than a 2 GHz processor today. Besides, there's 2 GHz and then there's 2 GHz. Pentium 4 or Core 2?



    Anyway, there is an application that millions of us will soon be using that requires all the horsepower you can throw at it - and that is the playback of HD content. But I guess Sony pretty much have that wrapped up. I see the original HD players were using 3.6 GHz Pentium 4s - and even then the experience isn't that nice. No wonder they're so expensive.
  • Reply 10 of 22
    robm Posts: 1,068 member
    Hey Mark - you'd be a brave man to bank on Sony for anything at the moment, other than short term.



    Sony are the Microsoft of the media industry atm - corporate arrogance will be their downfall.

    They have that aplenty - it pervades their culture and marketing.

    I won't bore everyone with unnecessary details - but their launch of XDCAM here in NZ was nothing short of pathetic.

    As an example.

    Blu-ray or HD DVD???

    The average joe is still paying off the plasma they didn't really need but thought they did.



    Sorry for wandering OT there - but I don't necessarily buy into any of the seemingly foregone conclusions about what consumers will stump up and pay for. I mean, go back a few years - who'd have thunk MP3 would be good enough to listen to?

    Who knows what Joe Blow will deem acceptable as far as HD content compression is concerned ...
  • Reply 11 of 22
    iposter Posts: 1,560 member
    Quote:
    Originally Posted by JeffDM


    The number of cores will probably always be a power of two, so there probably won't be a six-core anything, any more than you can find a single stick of memory that's 0.75 GB. I doubt a typical computer configuration would offer both a two-core and a four-core CPU in the same computer.



    That 6 was an error in my post; I meant the progression of cores on a single die: 2->4->8, etc.
  • Reply 12 of 22
    marcuk Posts: 4,442 member
    Quote:
    Originally Posted by RobM


    Hey Mark - you'd be a brave man to bank on Sony for anything at the moment, other than short term.



    Sony are the Microsoft of the media industry atm - corporate arrogance will be their downfall.

    They have that aplenty - it pervades their culture and marketing.

    I won't bore everyone with unnecessary details - but their launch of XDCAM here in NZ was nothing short of pathetic.

    As an example.

    Blu-ray or HD DVD???

    The average joe is still paying off the plasma they didn't really need but thought they did.



    Sorry for wandering OT there - but I don't necessarily buy into any of the seemingly foregone conclusions about what consumers will stump up and pay for. I mean, go back a few years - who'd have thunk MP3 would be good enough to listen to?

    Who knows what Joe Blow will deem acceptable as far as HD content compression is concerned ...



    But short of an insane disaster, there are going to be about 10 million Blu-ray drives shipped with the PS3 by this time next year. Even if the PS3 is a disaster, there are still going to be millions more Blu-ray players in use than HD DVD ones. And Cell is a technical wonder given the right tools - Sony did ship Linux for the PS2, so if they ship some kind of OS for the PS3 - and I have reason to think they will - people are going to be buying these for the insane power of the Cell. Check out the link below for possibilities.

    http://news.bbc.co.uk/1/hi/technology/5287254.stm
  • Reply 13 of 22
    robm Posts: 1,068 member
    Heh - yeah, I'm agreeing with you.

    We'll be getting them whether we need them or not.

    Cell tech is the future - no argument from me there.



    I'm just really "boo" on Sony right now, for any number of reasons. And M$ as well.

    I dunno - we are at a crux right now.

    On the one hand, video/digital cinema cameras are coming out offering 2K-4K capture resolutions that WILL require the sheer horsepower of Cell computing in order to be able to work with the footage.

    On the other, we seem to want to be able to deliver the content on handheld devices - crazy, no?



    Roll on this from InPhase and others http://www.inphase-technologies.com/...nal/index.html

    Blu-ray and HD DVD may well go the way of the dodo faster than the big corporations are planning.
  • Reply 14 of 22
    endymion Posts: 375 member
    Open Activity Monitor and show the '# of threads' column, and you'll see that most processes have more than one thread that can be distributed across multiple cores.
  • Reply 15 of 22
    wmf Posts: 1,164 member
    Quote:
    Originally Posted by Endymion


    Open Activity Monitor and show the '# of threads' column, and you'll see that most processes have more than one thread that can be distributed across multiple cores.



    What matters is the number of threads that are running, not the number of threads that exist.
  • Reply 16 of 22
    What also matters is the breakdown of those threads. If one thread does 90% of the work, you don't see much gain. But as more and more programs become multi-threaded, we'll start to see bigger gains.
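
    To put a rough number on that (this is just the textbook Amdahl's-law arithmetic, not a measurement of any real app): if one thread accounts for 90% of the time and only the remaining 10% can spread across cores, a second core buys you about a 1.05x speedup, and even an unlimited number of cores tops out around 1.11x. A throwaway calculator, with p and the core counts as illustrative values:

    Code:
        /* Toy Amdahl's-law calculator: speedup = 1 / ((1 - p) + p / n),
         * where p is the fraction of the work that can run in parallel
         * and n is the number of cores.  Illustrative only. */
        #include <stdio.h>

        int main(void)
        {
            double p = 0.10;                  /* only 10% of the work is parallelizable */
            int cores[] = { 1, 2, 4, 8, 16 };
            int i;

            for (i = 0; i < 5; i++) {
                double speedup = 1.0 / ((1.0 - p) + p / cores[i]);
                printf("%2d cores: %.2fx\n", cores[i], speedup);
            }
            return 0;
        }

    Push p up toward 0.9 and the picture changes completely, which is exactly why multi-threaded versions of the big apps matter so much.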



    Remember that dual-core is roughly 18 months old on the desktop and low-end server. And only in the last 9 months has dual-core become semi-mainstream (with the introduction of Yonah and the price cuts in X2s/P-Ds). Its existence in high-end IBM or Sun servers is pretty irrelevant to user software. Multi-core desktops are very, very new, and I think we'll see a lot of improvement in the next version or two of major software.
  • Reply 17 of 22
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by ZachPruckowski


    Remember that dual-core is roughly 18 months old on the desktop and low-end server. And only in the last 9 months has dual-core become semi-mainstream (with the introduction of Yonah and the price cuts in X2s/P-Ds). Its existence in high-end IBM or Sun servers is pretty irrelevant to user software. Multi-core desktops are very, very new, and I think we'll see a lot of improvement in the next version or two of major software.



    I don't know what Sun and IBM servers have to do with this. Apple has had dual-CPU computers available for quite some time, and Apple knew that the Core Duo systems were coming but didn't bother to update any of the media software included on them to take advantage of the additional power. Despite the fact that media encoding is relatively easily parallelizable, Apple's included encoders are limited in the amount of CPU power they can use at any given instant: they can't use more than the equivalent of 100% of one CPU, or 50% each of two.
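
    For what it's worth, the "easily parallelizable" part of encoding usually looks something like the sketch below - hypothetical code, not the QuickTime or iDVD API; encode_frames() is just a stand-in for a real codec - where the movie is chopped into independent frame ranges and each worker thread encodes its own range:

    Code:
        /* Hypothetical frame-range split across worker threads. */
        #include <pthread.h>
        #include <stdio.h>

        #define TOTAL_FRAMES 2400
        #define NUM_WORKERS  2           /* one per core on a dual-core Mac */

        struct job { int first_frame; int last_frame; };

        /* Stand-in for the actual per-frame encoding work. */
        static void encode_frames(int first, int last)
        {
            printf("encoding frames %d..%d\n", first, last);
        }

        static void *encode_worker(void *arg)
        {
            struct job *j = arg;
            encode_frames(j->first_frame, j->last_frame);
            return NULL;
        }

        int main(void)
        {
            pthread_t threads[NUM_WORKERS];
            struct job jobs[NUM_WORKERS];
            int per = TOTAL_FRAMES / NUM_WORKERS;
            int i;

            /* Give each worker an independent slice of the movie. */
            for (i = 0; i < NUM_WORKERS; i++) {
                jobs[i].first_frame = i * per;
                jobs[i].last_frame  = (i == NUM_WORKERS - 1) ? TOTAL_FRAMES - 1
                                                             : (i + 1) * per - 1;
                pthread_create(&threads[i], NULL, encode_worker, &jobs[i]);
            }
            for (i = 0; i < NUM_WORKERS; i++)
                pthread_join(threads[i], NULL);

            return 0;
        }

    The catch - and probably part of why the bundled encoders don't all do it - is that chunked encoding forces a keyframe at every split point and complicates rate control across the whole movie, and the chunks still have to be stitched back together at the end.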
  • Reply 18 of 22
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by JeffDM


    I don't know what Sun and IBM servers have to do with this. Apple has had dual-CPU computers available for quite some time, and Apple knew that the Core Duo systems were coming but didn't bother to update any of the media software included on them to take advantage of the additional power. Despite the fact that media encoding is relatively easily parallelizable, Apple's included encoders are limited in the amount of CPU power they can use at any given instant: they can't use more than the equivalent of 100% of one CPU, or 50% each of two.



    What are you talking about? QuickTime has been multithreaded for a long time. Converting a song in iTunes creates no less than five additional threads.
  • Reply 19 of 22
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by Chucker


    What are you talking about? QuickTime has been multithreaded for a long time. Converting a song in iTunes creates no less than five additional threads.



    iDVD will only use the equivalent of one full CPU when encoding a DVD. QuickTime's encoder is only useful if you pay extra for the Pro feature, and even then it will only use two full CPUs when encoding - with some codecs, only one. iTunes audio conversion isn't that relevant because it doesn't take much time; I don't remember seeing it take advantage of more than 50% each of two CPUs.
  • Reply 20 of 22
    I realise the thin-client topic has come up before, but it seems increasingly plausible and provides a direct answer to the question that was originally posted. Apple produces a powerful multi-core workstation, excellent server software that is apparently orders of magnitude easier for non-specialists to use than any other offering, and wireless communication and simple networking technology integrated into its product line.



    In the old days, mainframe computers were expensive and Unix was developed in ways that allowed multiple users to share the same machine. Thus, at its roots, OS X shares this legacy of multiple users sharing one computer. It would seem possible for multiple users who are sharing a single machine to each use a single core for their work. This would not (I assume) require too much fancy coding as each user would have a dedicated hardware domain. Software solutions to improve CPU-sharing could evolve with time, but as this was possible decades ago, it should be quite possible today. I realise that the older among us remember the days of banks being brought to their knees by the "computer being down", but systems are much more stable these days.



    How could thin clients be used? I can think of two very significant markets: education and the home. Increasingly, homes are equipped with multiple networked computers, and Apple also has some new wireless networking technology brewing with iTV. A dedicated CPU for each user is overkill in the home; one beefy computer with networked clients would be a great and inexpensive alternative. Probably a more realistic market in the short term, however, is education. Even though "laptop for every student" programs are well publicised, computer labs and desktop machines in classrooms are a more pervasive reality. Multi-user workstation computers with wireless networking would reduce initial costs, maintenance and software administration.



    Why not?