32-bit lifespan

Posted in Future Apple Hardware, edited January 2014
Now that 64-bit is here, how many years do you think it will be before killer apps and everyday apps make the transition? I'm interested in whether any of you forward-thinking folks can imagine must-have apps (the 2005 equivalent of iSight, say) that will depend on 64-bit.

Just trying to get a little brain-storming session going.

Comments

  • Reply 1 of 60
    hmurchison Posts: 12,419 member
    32-bit apps will be around for a Looooooooooong time. It took over a decade to move from 16-bit apps, and 16-bit had way more weaknesses than 32-bit does.
  • Reply 2 of 60
    neutrino23 Posts: 1,561 member
    Most apps will probably never make use of 64-bit capabilities. 32 bits is just fine for a whole lot of stuff: a 32-bit number lets you deal with 4 billion items in a chunk. If you only occasionally need to exceed that, you can use double-width integers. 64 bits helps apps that need more than 4 GB of memory and that routinely deal with larger numbers (like the national deficit for this year).
  • Reply 3 of 60
    nevyn Posts: 360 member
    Just echoing the others. 32 won't die for a long, long time. Probably a close approximation of "never" in embedded space.



    There were a LOT of reasons why 8-, 16-, and 24-bit computers were 'too small'.



    1) Instruction space

    2) Loops/algorithms

    3) RAM usage

    4) plenty of others I've skipped.



    But '32-bit' is PLENTY for #1 and #2, especially if you have the option of making some of your code 32-bit and some _other_ part of your code 64-bit.



    For #1, you won't _ever_ need more instructions. 4 billion is plenty. This isn't even really a discussion.



    For #2:

    Most loops are small loops. Sometimes they pass 256, and then a machine with larger 'bitness' is a plus. But loop sizes are strongly weighted towards the short end. If I have 100 loops that fit inside 8 bits and _one_ that fits inside 16 bits, the 8-bit processor should still crush the 16-bit computer on these loops (the 16-bit processor has to move twice the bandwidth around, with twice the RAM requirements, etc.). The way to "do" a 16-bit loop on an 8-bit machine is also easy to understand: you keep counter i and counter j; j is incremented every pass, and when j overflows back to zero, i is incremented once, and so on. That's somewhere around 512 "extra" operations on an 8-bit machine in a program that loops 65,000+ times, so there's no big overhead there. The 16-bit version on the 16-bit CPU has zero extra algorithmic overhead, but it has to move a lot more information around.



    Most loops even today are a heck of a lot smaller than 0-65,000, let alone 0-4,000,000,000 (the 32-bit range).



    Now, there _are_ 16-bit algorithms that are nuts to run on an 8-bit CPU, and likewise 64-bit ones on a 32-bit CPU.... But what if we had a CPU that could switch back and forth on the fly? If your particular loop requires more than 4 billion, you say "Hey, I'd like 64-bit mode" and you have what you want. If it doesn't, then you don't need that mode, and you still have the optimum for your program.



    For #3:

    Well, yeah. It's almost like Moore's law: DRAM chips will keep getting larger (in GB) and programs will gobble it up. But having more than 4 GB of RAM isn't going to force _everything_ to go 64-bit. Mac OS X appears to be going towards giving each process its own individual 4 GB address space, unless the process explicitly says "I need more". If iCal, or Calculator.app, or a whole slew of other programs _ever_ become 64-bit, it will be because the CPU no longer has a 32-bit mode. And anything _forced_ to go 64-bit will be moving more bits around than it needs to, making it a bandwidth and memory hog.



    The 970's ability to switch back and forth is _VERY_ nice. There is no reason I can think of to _EVER_ lose this ability. If it happened, it would be a chip-hardware-simplification issue, and this mode seems pretty darn simple to keep.



    To force an all-64-bit world, you need the number of 64-bit _algorithms_ to grow until 64-bit algorithms >> 32-bit algorithms. That may take a while.
  • Reply 4 of 60
    arty50 Posts: 201 member
    Perhaps a better question would be when will OS X become a true 64-bit OS? Or will such a thing ever really exist?
  • Reply 5 of 60
    tht Posts: 5,421 member
    When a desktop operating system comes with a "lifestream"-like, database-like filing and user-interface system, 64-bit systems will be necessary. Just imagine: the computer will archive every little bit in the system through time. You can keep every single website you've ever visited. You can see whatever you were doing on the computer in the present, 6 hours ago, 1 day ago, 7 days ago, 1 year ago, all the way back to the beginning when the 64-bit computer was first turned on. I'm waiting on it, Apple.



    Oh, I forgot: 64-bit systems will start to become necessary in about 4.5 years, when computers ship with 4 GB of RAM out of marketing necessity.
  • Reply 6 of 60
    snoopy Posts: 1,901 member
    Quote:

    Originally posted by Arty50

    Perhaps a better question would be when will OS X become a true 64-bit OS? Or will such a thing ever really exist?



    What do you mean by a true 64-bit OS? Everyone has something different in mind about what makes an OS 64-bit, but few ever say what it is. For example, here are four possibilities.



    1. The OS is all 64-bit code and requires 64-bit applications. Old 32-bit applications must be modified to run.



    2. The OS is all 64-bit code but it runs both 32 and 64-bit applications.



    3. The OS is a mix of 64 and 32-bit code but all performance critical code is 64-bit and it runs both 32 and 64-bit applications.



    4. The OS fully supports and runs 64-bit applications, but does not achieve optimum 64-bit performance. It runs both 32 and 64-bit applications.
  • Reply 7 of 60
    smircle Posts: 1,035 member
    Quote:

    Originally posted by onit

    Now 64-bit is here, how many years do you think before killer and normal apps make the transition?



    As long as the PPC in its various incarnations is around, 32-bit applications will be supported, since the PPC ISA was 32/64-bit from the beginning.



    As long as the CPU offers a 32-bit mode, moving iCal, iChat and so on to 64-bit doesn't make any sense, since they would just be slower.
  • Reply 8 of 60
    AngryAngel
    Smircle:

    I have seen it quoted many times recently that PowerPC is a 64-bit architecture with a 32-bit offshoot. There is provision for a temporary 32-to-64-bit bridge architecture (as used in the 970). But it is definitely a temporary bridge.
  • Reply 9 of 60
    programmer Posts: 3,457 member
    Quote:

    Originally posted by AngryAngel

    Smircle:

    I have seen it quoted many times recently that PowerPC is a 64-bit architecture with a 32-bit offshoot. There is provision for a temporary 32-to-64-bit bridge architecture (as used in the 970). But it is definitely a temporary bridge.




    What is temporary about the bridge is the particular way in which 32-bit code can be made to work in a 64-bit address space. The 32-bit mode of the processor is a permanent part of the spec and is something all 64-bit PowerPCs have had, and likely will have, going into the future. The effort required to support 32-bit mode is quite minor, and the reasons to support it are quite strong. Once Apple's OS has full 64-bit implementations of all its user-space system software, the temporary bridge will no longer be needed, but the OS will still support 32-bit applications.



    A 32-bit application is not a bad thing. If an app only needs 100K of memory and never deals with numbers >4 billion then it is better off as a 32-bit application... and that is most applications.
  • Reply 10 of 60
    onit Posts: 44 member
    For those apps that deal with numbers >4 billion is there any more simplicity in writing for 32-bit than 64-bit? Apologies if that's a stupid question, but I really have no idea.
  • Reply 11 of 60
    programmer Posts: 3,457 member
    Quote:

    Originally posted by onit

    For those apps that deal with numbers >4 billion is there any more simplicity in writing for 32-bit than 64-bit? Apologies if that's a stupid question, but I really have no idea.



    If you aren't worried about getting absolute maximum performance, then most compilers will actually generate the necessary code to work on 64-bit numbers even though you're in 32-bit mode. It's a bit slower, but unless you are doing some extremely heavy integer math you won't even notice. And if you're doing math that heavy, you really ought to consider using floating point, which is significantly faster on the G5 in many ways... and 64-bit floating point works just fine in 32-bit mode (and all PowerPCs can do it, at least those found in Macs).
  • Reply 12 of 60
    yevgeny Posts: 1,148 member
    I rather suspect that five years down the road, Mac OS X will be 64-bit only, because all that Apple will be shipping will be 64-bit machines. Apple has a tendency to use the OS as a cudgel to force users to upgrade (can't say that I blame them).
  • Reply 13 of 60
    flofighter
    The question is, when will we see 128-bit computers?
  • Reply 14 of 60
    smircle Posts: 1,035 member
    Quote:

    Originally posted by flofighter

    The question is, when will we see 128-bit computers?



    I am not sure I'll live long enough for that.
  • Reply 15 of 60
    nevyn Posts: 360 member
    Quote:

    Originally posted by onit

    For those apps that deal with numbers >4 billion is there any more simplicity in writing for 32-bit than 64-bit? Apologies if that's a stupid question, but I really have no idea.



    Yes and no.

    It depends on how heavily used those numbers >4B are.



    Also note that this has to be _integers_ greater than 4B. 64-bit floating-point numbers dwarf this whole discussion: high accuracy and very large scales are both readily available in floating-point land, so 64-bit integers don't seem crucial to 'numerical computing' type problems (fluid dynamics, heat transfer, weather modeling, etc.).



    There's a whole bunch of things that "use" 64-bit-range numbers on 32-bit machines and won't see a net benefit: things that can be expressed as a loop inside a loop, for instance. It isn't necessarily _simpler_, but it may well be faster and consume fewer resources.



    Then there are other sorts of apps that deal with >4B. Large databases come to mind: in that case you are doing lots of 64-bit compares, 64-bit integer adds, and so on. A 64-bit compare is _at_least_ twice as efficient on a machine with 64-bit registers. So apps of this sort (Oracle, some of the integer-only math libraries, all the crypto that isn't Altivec already) will use 64-bit operations, even if Apple screams "that's all unsupported for now" from the rooftops.



    So programs fall into one of these categories (more if you consider different pieces of a program separately):



    1) I don't need to count that high, and even the best program in my class EVAR won't need to count that high -> I will see a performance drop if _forced_ to use 64-bit operations. (Calculator.app, Safari, most everything on a stock Mac)



    2) I don't need to count that high, but perhaps someday I could. -> A drop for now, but future expansion potential.



    3) I do need to count that high, but it is such a trivial use that I'm better off faking it by using two separate 32-bit entities. I would see a speed drop if _forced_ to use 64-bit operations.



    4) I do need to count that high, and there's a whole slew of 64-bit operations going on. I'll see a dramatic increase in speed. (376% was reported on one 64-bit-intensive benchmark comparing a 32-bit AMD to a 64-bit AMD after MHz was factored out).



    5) 64-bit? Heck, I need 256-bit-and-counting. I guess 64-bit would help though. (These programs would also see a dramatic increase on 64-bit machines relative to 32-bit machines).



    The thing is that the VAST majority of programs are in category 1, and category 2 is similarly more populated than the later ones. Category 3 contains many more programs than 4 and 5...



    Note well the use of the word _forced_ in 1-3. The 970 does _not_ force these programs to deal with 64-bit operations, so we're getting the best of both pieces of the puzzle: category 1 programs run at full tilt, category 3 programs can mix and match if they like, and category 4/5 programs run better than they could on _any_ current 32-bit machine.
  • Reply 16 of 60
    yevgeny Posts: 1,148 member
    Quote:

    Originally posted by flofighter

    The question is, when will we see 128-bit computers?



    Depends on what you mean by 128-bit computers. G4s are 128-bit computers in that the Altivec registers are 128 bits wide.



    I cannot see the practical reasons for computers with general-purpose 128-bit integer registers; there is no reason you would need such a large memory address. I'm speaking as a programmer who DESPERATELY wishes that 64-bit x86 were as common as 64-bit PPC will soon be. I am not aware of any integer-based problems that would need more than 64 bits.



    Either way, the answer is pretty straightforward: if we ever see 128-bit computers, it won't be anytime soon.
  • Reply 17 of 60
    nevyn Posts: 360 member
    Quote:

    Originally posted by flofighter

    The question is, when will we see 128-bit computers?



    What's Altivec?



    RAM usage is going along and doubling... (4k, 8k, 16k...)

    CPU-bitness is going along and _doubling_in_exponent_.

    (2^8, 2^16, 2^32, 2^64...)



    It will take a LONG time.

    (Research the RAM-doubling rate, research the CPU-bitness-doubling-rate, calculate how many years it will take to intersect. Now calculate time-to-heat-death-of-universe. Chuckle.)



    Note also that if we ever do get to the point where there's a strong desire for >64-bit computing, it is quite possible it would be a separate unit. So many things would be hampered that you'd probably be far better off with a 64-bit general-purpose CPU plus a separate, task-specific, designed-for-the-job 128-bit unit.



    Note: that's what Altivec _is_. It isn't a general-purpose 'CPU', since it can't do a lot of the things you'd expect a 128-bit CPU to do (like loop from 0 to 2^127). But it does allow some useful operations on 128-bit entities, like comparing and permuting... The GPU is another task-specific coprocessor that might be considered >64-bit...
  • Reply 19 of 60
    snoopy Posts: 1,901 member
    Quote:

    Originally posted by Yevgeny

    I rather suspect that five years down the road, that Mac OS X will be 64 bit only because all that Apple will be shipping will be 64 bit machines. Apple has a tendency to use the OS as a cudgel to force users to upgrade (can't say that I blame them).



    If you mean someday OS X will not run 32-bit applications, that would be a silly thing for Apple to do. The ability for 64-bit PPC processors to run 32-bit applications with zero penalty is a great selling point. If you read Programmer's earlier posting, you see that it is also very easy to keep this capability in the OS. Quote: "The required effort to support 32-bit mode is quite minor, and the reasons to support it are quite strong." So I say it is a permanent feature of the Mac.
  • Reply 20 of 60
    tjm Posts: 367 member
    Personally, I have great faith in software engineers' ability to build software that maxes out the capabilities of any system available. Today's desktops run programs that were in the realm of multimillion-dollar supercomputers a decade or two ago. If consumer-grade computers can run it, somebody will figure out a way to make climate modeling or nuclear-explosion simulation code useful in some sort of desktop application (probably games first).



    IMO, it won't take long before the G5s, Opterons, and Itaniums are at their limits and people start screaming for 128-bit capabilities. My guess is that 128-bit chips will debut before 2020.