Road to Mac OS X Snow Leopard: 64-bits, Santa Rosa, and the great PC swindle

124 Comments

  • Reply 61 of 97
    Quote:
    Originally Posted by solipsism View Post


    That isn't what you said. You said there are more 64-bit Windows applications now than there ever will be for 64-bit OS X. High-priced apps account for a small part of all apps, and your knowledge seems to stem from a limited, anecdotal accounting of 64-bit apps. Since all 32-bit apps work on 64-bit Leopard but the same can't be said for Windows, shouldn't we also count all of them, since the original issue stemmed from 64-bit Windows having very limited app numbers compared to 32-bit Windows?



    Yes, you're correct: DOS and 16-bit applications won't run under Vista x64.



    And no, you are incorrect wrt 32-bit applications running under Vista x64.



    So let's count all 32-bit and 64-bit applications on both platforms, shall we?
  • Reply 62 of 97
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by franksargent View Post


    Yes, you're correct: DOS and 16-bit applications won't run under Vista x64.







    Quote:

    And no, you are incorrect wrt 32-bit applications running under Vista x64.



    So let's count all 32-bit and 64-bit applications on both platforms, shall we?



    I don't mind being incorrect, but I do like to have some verifiable facts that will help educate me. Call me a cynic, but I actually require proof before changing my mind.
  • Reply 63 of 97
    Quote:
    Originally Posted by zmonster View Post


    HUH?? You are completely clueless. The Objective-C object size in a 64-bit compilation is exactly 2x the size of the 32-bit data structure. The integer is 2x the size. Almost every other data structure is nearly 2x the size. 64-bit apps take up ROUGHLY 2x the size that 32-bit apps do. PERIOD. You are wrong, my friend.



    If you don't believe me, find a 64-bit machine and install both the 32-bit version of Java and a 64-bit JVM. Run the same app on both. This is NOT limited to Apple; EVERY 64-bit architecture takes up nearly twice the memory that a 32-bit architecture takes up. Take a computer science class.



    x86-64 instructions range in size from 1 to 15 bytes. When AMD designed the architecture, several considerations were made to minimize the overhead associated with the increase in register/address space. By default, pointers are 64 bits wide, but as other people have pointed out, while pointers are prevalent they are by no means the only determinant of program size. Most if not all x86-64 long mode instructions actually use 32-bit operands unless overridden by a REX prefix, which effectively keeps the overall instruction size equal to that of legacy or compatibility mode instructions. AMD figured 32-bit fixed-point integer arithmetic would suffice for most applications. Also, 64-bit applications by default use RIP-relative addressing. A side effect/benefit of this is that the instruction size is limited, since the relative pointer is often 32-bit.



    Architecture decisions like the above allow the implementation of data models such as LLP64, used by Microsoft Visual C++, where only pointers and long long are 64-bit. It is of course a compiler/application-dependent issue, and any program has the freedom to bloat as much as the architecture allows. However, a properly optimized program should not do this. The 2x bloating certainly does not happen on Linux 32 vs. 64, or Windows Vista 32 vs. 64.
  • Reply 64 of 97
    Quote:
    Originally Posted by solipsism View Post








    I don't mind being incorrect, but I do like to have some verifiable facts that will help educate me. Call me a cynic, but I actually require proof before changing my mind.



    I don't have hard numbers to give you at this time.



    All I can say is that Vista x64 SP1 has been able to run 100% of the x86 applications that I've thrown at it to date.



    That is in fact one of Vista x64's selling points: better x86 compatibility than XP SP2 x64.



    In fact, I convinced one engineer at work to switch over to XP SP2 x64 (the government doesn't support Vista yet), basically because of the 32-bit OS memory and speed limitations of XP SP3 x86. He has the Windows/Intel equivalent of the top-end Mac Pro (8 cores of 3GHz Xeon goodness in a 12-month-old PC workstation) and was running a 2D wave-overtopping RANS model.



    A lot of horsepower was going to waste there, with only 3GB of memory and a 64-bit application running as a 32-bit application: VM thrashing the HD, with a typical 40-hour runtime for a single run. He typically did eight runs at a time (one run for each core). He also needed to get the latest Intel FORTRAN x86/x64 compiler.
  • Reply 65 of 97
    amdahl Posts: 100 member
    Most 64-bit apps should not see a doubling in the memory used; I think I remember reading that 33% larger was typical. While 'long' may double in size, apps that have been using 'long' as a 32-bit value should instead specify 'int', which continues to be 32 bits even on 64-bit systems. If an app needs a 64-bit value, it would already be using 'long long' on 32-bit systems. Pointers, of course, do double, and there is nothing you can do about that.



    But it is NOT TRUE that a 64-bit version of an app needs to take twice as much RAM. Carelessness could certainly cause that, but it is not a requirement.
  • Reply 66 of 97
    The article is very clear that 64-bit Windows does exist, but if it lacks 64-bit chipsets... Also, the point was made that earlier Macs had 32-bit chipsets, which would encounter similar issues.



    "No version of 32-bit Windows supports this..." is in no way including 64-bit versions...



    Other articles that I have read this year have discussed the same issue, though with much less technical detail.



    Cheers !





    Quote:
    Originally Posted by minderbinder View Post


    That's wrong. There's also XP64, which can use as much RAM as a Mac (actually more in most cases, since there are so many more 64-bit Windows apps).



    I guess there are probably some Windows users who have more RAM installed than the system can handle, but there are also plenty of PC users running the 64-bit version of either XP or Vista and using every bit of their RAM.



    This article is REALLY misleading; it spins things as if Mac users can use all the RAM while PC users can't, when in fact PC users can use it all with a 64-bit OS, and there are way more 64-bit apps on the PC side.



  • Reply 67 of 97
    Quote:
    Originally Posted by AppleInsider View Post


    With Intel's "Santa Rosa" platform, Apple's Core 2 Duo machines gained chipset support to internally handle 8GB of address space. This allows Santa Rosa Macs to shove MMIO up into the high end of the space and reclaim all of the addresses below the 4GB mark, making the full amount available to the system. No version of 32-bit Windows supports this, and conversely, there is no 32-bit version of Mac OS X Leopard, so the "where is my full 4GB?" issue is now a Windows-only problem going forward.



    Wow! When was Santa Rosa added to the C2D Mac Mini? I've been waiting for Apple to upgrade the Mini to Santa Rosa since May 2007. I thought all Mac Minis were still using Napa chips. Although at this point, I am expecting the Mac Mini to leapfrog Santa Rosa and go straight to Montevina chips. Won't that be fun, jumping from GMA 950 graphics to GMA X4500 graphics?
  • Reply 68 of 97
    Quote:
    Originally Posted by ascii View Post


    Not all video cards take address space away from main memory, do they? Some are communicated with via the PCI bus.



    The article isn't talking about onboard graphics. The video memory on a separate graphics card is still part of the 32-bit address space, as the CPU needs to keep track of it somehow.
  • Reply 69 of 97
    ascii Posts: 5,941 member
    Quote:
    Originally Posted by Squozen View Post


    The article isn't talking about onboard graphics. The video memory on a separate graphics card is still part of the 32-bit address space, as the CPU needs to keep track of it somehow.



    What do you mean by "keep track of it?" The GPU needs to be able to address the bytes of video ram individually, but the CPU just needs to be able to push data to it through the expansion bus. The only main memory address space required is for the control/data lines of the bus.
  • Reply 70 of 97
    dr_lha Posts: 236 member
    Quote:
    Originally Posted by franksargent View Post


    The Mac GUI is still 32-bit; only under the hood are applications 64-bit native.



    No, that's not true; it was true for Tiger, but not Leopard. Cocoa is 64-bit, and it's perfectly possible using Cocoa to make fully 64-bit native apps, including the GUI. Only Carbon lacks 64-bit support. Of course, most of the "major" apps on the Mac (Photoshop, Office) use Carbon.
  • Reply 71 of 97
    dr_lha Posts: 236 member
    Quote:
    Originally Posted by HipPriest View Post


    What 32-bit Linux apps did you have trouble running? We run an all-64-bit Debian Linux environment at work on hundreds of systems, and my experience has been much better than yours. 64-bit Debian has packages for the most common 32-bit libraries used by third parties, and of course most Linux software is open source and 64-bit native. The only 32-bit things we run are Shake, which runs fine, and the Flash plugin, which needs a wrapper and runs as well as... Flash usually runs.



    The use of the word "most" in this paragraph is the killer: yes, most stuff works, but for mission-critical machines it *all* has to work. Installing 32-bit Debian made it all work; 64-bit wasn't worth the hassle for us. As an aside, the hassle that Linux gives us in general is why we've moved almost exclusively to Macs in the space of 4 years at work.
  • Reply 72 of 97
    welshdog Posts: 1,817 member
    Will Apple rewrite Final Cut Studio 2 so all apps are 64-bit? It seems like this suite would greatly benefit from 64-bit addressing. I've already read that QuickTime will finally be 64-bit.
  • Reply 73 of 97
    Can anyone explain what the advantage of a 64-bit kernel is? I feel like Leopard's 32/64-bit hybrid approach is the best of both worlds: you get access to large amounts of memory, you can run 64-bit apps natively, and you have good backward compatibility. Once Snow Leopard comes out, a lot of hardware won't be usable with it because of a lack of 64-bit drivers. I tried using Vista x64 and ran into this very issue.
  • Reply 74 of 97
    seek3r Posts: 179 member
    Quote:
    Originally Posted by dr_lha View Post


    This may be true for a hardcore Linux user, but let me tell you: at work, one of our guys recently got a new Linux box with 64-bit Linux installed, and none of the software we used worked on it. Getting the box to a point where it could actually run 32-bit Linux apps was by no means trivial, and in the end we just decided to install 32-bit Debian on it as the easiest answer. So please don't make out that 64-bit Linux is completely trivial compared to Mac OS X.



    As for 64 bit Windows don't make me laugh, there's a reason why nobody uses it.



    Macs have had by far the smoothest transition to 64-bit. I've been running 64-bit apps on my Mac since Tiger came out. The only ball that Apple has dropped was not delivering 64-bit Carbon.



    Apple's one-OS strategy for 32-bit/64-bit CPUs is far superior to the separate versions of Windows and Linux, both for compatibility and for user experience. I speak as someone who works in a three-OS environment: we have 64-bit-capable chips everywhere, but due to the issues with Linux and Windows, only our Macs are actually capable of running 64-bit apps.



    What were you guys running that you couldn't get to port easily to a 64-bit install? I haven't had issues with 64-bit Linux (and yes, I'm a Debian guy) in a *very* long time. Hell, even most precompiled 32-bit binaries will probably run with the 32-bit compat libs installed (apt-get install ia32-libs).



    Also, as far as separate versions go... Debian uses a more "pure" approach than many distros in its 64-bit release, which is why the 32-bit libs may not work; then you get the *fun* of creating a 32-bit chroot, which may be what your coworker needed (though on Lenny it'll be even less of a problem than it is now, and honestly it isn't usually much of a problem for most things). You don't need to do that with all distros, though, and Debian's approach makes very good sense for most of *its* direct users: rock solid on servers, where we've been solidly 64-bit for quite a while, and in most large-scale server-side apps.
  • Reply 75 of 97
    Quote:
    Originally Posted by zmonster View Post


    HUH?? You are completely clueless. The Objective-C object size in a 64-bit compilation is exactly 2x the size of the 32-bit data structure. The integer is 2x the size. Almost every other data structure is nearly 2x the size. 64-bit apps take up ROUGHLY 2x the size that 32-bit apps do. PERIOD. You are wrong, my friend.



    If you don't believe me, find a 64-bit machine and install both the 32-bit version of Java and a 64-bit JVM. Run the same app on both. This is NOT limited to Apple; EVERY 64-bit architecture takes up nearly twice the memory that a 32-bit architecture takes up. Take a computer science class.



    Warning: computer science geekery!



    zmonster:



    There are a couple of mistakes in your math there. The only data type that automatically doubles in size is the pointer, which must by definition be 64 bits wide to reference 64 bits of address space. While modern apps do use a lot of pointers, pointers are only a tiny part of a program's total memory usage. 'long' also doubles in size, but to be fair, it's rarely used precisely because its size is a big "it depends".



    For everything else, programmers are free to continue with the same-sized types as before. 'int' and 'float' were 32-bit types before, and they still are now. 'long long' and 'double' were 64-bit types before, and they still are now. Really, nothing has changed other than 'long long' now calculating just as fast as 'int'; on 32-bit systems, it was slower.



    Programmers choose types based on just how big their numbers can get, and if a range of zero to four billion is big enough for their needs, they'll keep using 32-bit math as long as they like. The jump to 32-bit systems didn't stop 8-bit 'char' and 16-bit 'short' from working either.



    Objective-C objects also do not completely double in size. They definitely get bigger; every object stores a pointer to its class definition, and when you work with objects in your code you store pointers to the objects. But that's just the overhead. When it comes to the actual *data* contained within each object, programmers are free to use whatever types they want. And to be fair, many "power-user" apps only use Objective-C for the interface; the meat of the app is often done in a language with less RAM overhead.



    Lastly, about Photoshop: this part kills your argument. 64-bit Photoshop does not exist yet, so you're stating facts about something that is fiction outside of Adobe. But even if it did exist, I doubt it would be that much bigger than the current Photoshop. The vast majority of Photoshop's memory usage is image data, and 8-bit images eat up one byte per color channel, per pixel, per layer, regardless of what kind of computer you're running on.



    Rich
  • Reply 76 of 97
    (Ultimately, I think that, as smooth as the transition to 64-bit has been for the OS, narrowing the upgrade path from Carbon+Cocoa to Cocoa-only for apps with a UI creates a development bottleneck that could exclude a lot of heavyweights for years, or forever. The results are there for all to see: Photoshop 64 delayed for another year or year and a half, no Maya 64 on the horizon, etc. Perhaps the only major creative apps to transition to 64 bits are Lightroom (Cocoa-based from the start) and Cinema 4D. In that sense, Windows, for all its accidents, is progressing far more solidly.)
  • Reply 77 of 97
    booga Posts: 1,081 member
    Quote:
    Originally Posted by Clive At Five View Post


    I'm going to have to agree with Sol on this one. I consider myself a pro user, and I still couldn't put up with 64-bit XP, though I can't speak for Vista. A pro-user friend of mine is running 64-bit Vista and loves it, but he's a Windows-only guy, so who knows?



    I know I criticize Apple a lot, but I have to hand it to them: their transition to 64-bit has been seamless. I'm really hoping Windows 7 will make it easier to switch, because there's no chance I'm going to install Vista.



    -Clive



    When Apple does make the switch to a 64-bit kernel, we'll see. Until then, we don't know how Apple's 64-bit transition is going to go; it hasn't fully happened yet. That's unlike Windows, where the fully 64-bit version (something that doesn't exist on Mac OS) is rapidly overtaking the 32-bit version in popularity and driver support.
  • Reply 78 of 97
    Quote:
    Originally Posted by thor79 View Post


    Yeah... PC users are getting screwed over if they stay uninformed. If they choose to become informed, even the most basic research will deliver the correct answer: upgrade to a 64-bit OS. Since all recent essential hardware supports 64-bit, just throwing on a 64-bit OS is all that is needed to solve the problem.







    PC fanboys such as me already know of the issue; this is nothing new. This only affects non-techies who know crap about their computer. These are the same people who take their computer troubles to the Geek Squad and get screwed over there, losing documents unnecessarily and buying unnecessary hardware when their current hardware is fine. These are the same people who, when they get fed up with their PC, switch to Mac after some salesman tells them Macs run flawlessly, which is another lie.



    @thor79



    I am not a gamer, but reading some of their posts, gamers seem to favor PCs... they build their own boxes, [supposedly] selecting higher-quality/higher-performance matched components, use the latest and greatest graphics cards, yadda, yadda, yadda. One of the things they all seem to do is max out the RAM to get the best performance.



    Have these "gamers" found a way to exploit the RAM address space? Or do they upgrade from 2GB to 4GB knowing that they can really only use 0.3GB of the additional RAM?



    If the above is true, wouldn't gamers put a lot of pressure on MS to fix the OS, and/or pressure the game manufacturers to port to a 64-bit OS, like the Mac?
  • Reply 79 of 97
    Quote:
    Originally Posted by franksargent View Post


    And it has nothing to do with the OS's bitness.



    But it has a lot to do with one system using modern APIs while the other is still constrained by API decisions made for 16-bit Windows and carried all the way over to 64-bit Windows.
  • Reply 80 of 97
    Quote:
    Originally Posted by seek3r View Post


    What were you guys running that you couldn't get to port easily to a 64-bit install? I haven't had issues with 64-bit Linux (and yes, I'm a Debian guy) in a *very* long time. Hell, even most precompiled 32-bit binaries will probably run with the 32-bit compat libs installed (apt-get install ia32-libs).



    Your needs are very clearly different from mine. For example, a lot of the software we use cannot simply be recompiled without costing us a butt-load of money. Also: again the use of that word "most".



    Quote:

    Also, as far as separate versions go... Debian uses a more "pure" approach than many distros in its 64-bit release, which is why the 32-bit libs may not work; then you get the *fun* of creating a 32-bit chroot, which may be what your coworker needed (though on Lenny it'll be even less of a problem than it is now, and honestly it isn't usually much of a problem for most things).



    It makes life more complicated, though, doesn't it? Apple's approach is seamless to users, which is why I believe it is the superior one. It may be scoffed at by Linux hackers, but our users simply want their machines to work, without having to fiddle around with chrooted logins and the like.