How long before we're living in a 64-bit world?

Since there are some people around here who know a bit about this sort of thing, I have some questions.



64-bit machines will be upon us soon, even if consumers may not be using them for a while yet. Looking at the prospects for IBM (PPC), AMD, and Intel, how long do you all think it will be before 64-bit machines make up the baseline of computing? Like when 32-bit supplanted 16-bit so many years ago.



The reason I ask is that while it looks feasible for an IBM PPC970 variant to appear in a notebook, nothing I've read about Hammer or Itanium suggests they'll be in a laptop anytime soon. They both look hot and hungry. Still, Tom's Hardware had a list of quite a few ready-to-go Hammer motherboards, some of which looked to be targeted at the consumer/enthusiast market.



Then, even if these x86 replacements come in cheap enough for consumer use, it doesn't look like they'll be ready for laptop use for some time (0.09 micron at minimum, and probably only with some severe energy-saving schemes). Will we end up with a few years of hybridized 32/64-bit environments?



I dunno. Where do you guys see computing going over the next 2-3 years, on both the Mac and the PC, laptop and desktop, consumer and pro?



Will there be a big divide?

Comments

  • Reply 1 of 7
    hmurchison Posts: 12,425 member
    Well, I don't think 64-bit is going to play much of a part for consumers.



    I'd like to see computers move to different form factors, and they need to get much better about power consumption and noise.



    Perhaps we can move back to a thin-client model where you stash the "guts" of your computer away somewhere and use just the screen and storage devices on your desk. Your computer becomes just another node.
  • Reply 2 of 7
    kecksy Posts: 1,002 member
    64-bit will become mainstream this decade. 32-bit will disappear like 16-bit did in the early '90s. It will arrive sooner than we think, since companies are likely to market the hell out of it. Trust me, Apple, AMD, and Intel will all try to convince everyone they need 64 bits in order to sell more machines.
  • Reply 3 of 7
    rogue27
    I think Apple will be the first to bring 64-bit processors to the mainstream, with the help of IBM's PPC970. That chip sounds like it can run in laptops as well as desktops, and it has legs to scale up over the next few years.



    I wouldn't be surprised if nVidia ships a video card with 64-bit color coupled with an OS X update that supports it.



    For people who need or want 64-bit, the Mac will be the only decent choice for a year or two while everybody else plays catch-up.
  • Reply 4 of 7
    matsu Posts: 6,558 member
    I was thinking exactly the same thing: IBM looks like the only company that could possibly supply a 64-bit laptop solution. Still, Hammer boards look like they have a consumer-oriented focus (at least some of them), but I dunno about laptops. Intel is interesting too, in that as far as I can tell they don't really have a 64-bit roadmap for anything but workstations.



    If Apple went 64-bit on the pro machines (laptops and desktops), how long before they transitioned the consumer machines? Would they take their time? Or is the PPC-32 roadmap so abysmal that they'd change those machines as well, or at least face intense pressure to do so?
  • Reply 5 of 7
    ast3r3x Posts: 5,012 member
    [quote]Originally posted by rogue27:

    I think Apple will be the first to bring 64-bit processors to the mainstream, with the help of IBM's PPC970. That chip sounds like it can run in laptops as well as desktops, and it has legs to scale up over the next few years.

    I wouldn't be surprised if nVidia ships a video card with 64-bit color coupled with an OS X update that supports it.

    For people who need or want 64-bit, the Mac will be the only decent choice for a year or two while everybody else plays catch-up.[/quote]



    Do you mean a 64-bit processor? Because I thought 24-bit color was already 16.7 million colors (2^24 = 16,777,216), and the human eye can't even see all of them.
  • Reply 6 of 7
    xmoger Posts: 242 member
    The human eye can't distinguish 64 bits of color. However, when a single pixel has a half-dozen operations done on it before it gets to the screen, the rounding errors accumulate and become visible. So more precision is needed internally.
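    A minimal sketch of that accumulation effect (my own illustration, not from the thread; the darken-to-25%-then-brighten pipeline and the numbers are assumed): round to 8 bits after each step and count how many distinct levels survive, versus a floating-point pipeline that would return every level exactly.

    [code]
    #include <stdio.h>
    #include <math.h>

    /* Assumed example: darken an 8-bit channel to 25%, then brighten it back.
     * In floating point the round trip is exact; rounding to 8 bits after each
     * step collapses 256 input levels into roughly 65 visible bands. */
    int main(void) {
        int distinct = 0, last = -1;

        for (int in = 0; in < 256; in++) {
            int dark = (int)lround(in * 0.25);   /* quantize after darkening  */
            int back = (int)lround(dark * 4.0);  /* quantize again after undo */
            if (back > 255) back = 255;
            if (back != last) { distinct++; last = back; }
        }

        printf("distinct levels after the 8-bit round trip: %d of 256\n", distinct);
        return 0;
    }
    [/code]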
  • Reply 7 of 7
    kecksy Posts: 1,002 member
    nVidia and ATi are skipping 64-bit color and going straight to 128-bit floating-point color for the best accuracy. 128-bit color will be used internally, and the final image will be converted to 32-bit for viewing.
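    Roughly, that final downconversion amounts to something like the sketch below (my illustration; the struct layout and ARGB packing order are assumptions, not any vendor's actual format): each 32-bit float channel is clamped to 0..1 and packed into an 8-bit-per-channel, 32-bit pixel.

    [code]
    #include <stdint.h>
    #include <stdio.h>
    #include <math.h>

    /* Assumed sketch: a 128-bit pixel (four 32-bit floats, RGBA) clamped and
     * packed into a 32-bit, 8-bit-per-channel value for the display. */
    typedef struct { float r, g, b, a; } rgba128;

    static uint8_t to8(float c) {
        if (c < 0.0f) c = 0.0f;          /* clamp out-of-range intermediates */
        if (c > 1.0f) c = 1.0f;
        return (uint8_t)lrintf(c * 255.0f);
    }

    static uint32_t pack32(rgba128 p) {
        return (uint32_t)to8(p.a) << 24 | (uint32_t)to8(p.r) << 16 |
               (uint32_t)to8(p.g) << 8  | (uint32_t)to8(p.b);
    }

    int main(void) {
        rgba128 p = { 0.25f, 0.5f, 1.0f, 1.0f };   /* arbitrary example color */
        printf("packed 32-bit ARGB: 0x%08X\n", (unsigned)pack32(p));
        return 0;
    }
    [/code]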