32-bit vs. 64-bit


Comments

  • Reply 21 of 23
    hirohiro Posts: 2,663 member
    Quote:

    Originally posted by Zandros

    Yes, yes it has. It was only the PS2 and Dreamcast that were touted as 128-bit; I was wrong. Though they aren't true 128-bit either.



    They were touted as having 128-bit graphics, a function of the GPU, not the CPU.
  • Reply 22 of 23
    toweltowel Posts: 1,479 member
    Quote:

    Originally posted by lundy

    Marketing-speak can also claim that something is 64- or 128- or 256-bit when in fact it's referring to some internal bus in the product that has nothing to do with the CPU.



    I thought I remembered reading a long time ago that the marketing-speak bitness was actually a sum of several numbers: 32 bits for the CPU word size, 32 more for the GPU, 32-bit color, 32-bit addressing, and so on. Add them all up and you got a completely meaningless 64 or 128. I can't source that recollection, though.
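
    If that recollection is right, the "math" would have been nothing more than a straight sum. Here's a throwaway sketch in Python; all of the component names and widths below are made up for illustration and don't describe any real console:

    # Hypothetical "add up the bits" marketing math.
    # These component widths are illustrative, not real console specs.
    component_bits = {
        "CPU word size": 32,
        "GPU data path": 32,
        "color depth": 32,
        "memory addressing": 32,
    }

    # Sum the individual widths to get the advertised number.
    advertised = sum(component_bits.values())
    print(f"Advertised as {advertised}-bit")  # prints "Advertised as 128-bit"

    Individually each number means something; the sum means nothing, which is exactly the point.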
  • Reply 23 of 23
    As stated above, the whole 8-bit to 128-bit scheme was more of a marketing scheme than a measure of actual CPU/GPU performance. Back in those days, "modern" America was still relatively computer-illiterate. A PC in the home wasn't even the norm yet. But obviously things changed.



    While it's true that it began with the Atari and NES being marketed as 8-bit, and the SNES and Genesis as 16-bit, those consoles were technologically primitive and in truth little more than a motherboard and 2-16 KB of RAM. The CPU was integrated onto the board rather than being the kind of standalone, headline part we think of today.



    But as consoles evolved and the systems became more complex, the "bit" scheme was dropped. Why? Because, as stated above, it wasn't really true anymore. But also because as systems started getting more parts in them, the performance of the system could no longer be ranked on one part.



    For example, without the Xbox's GeForce 3/4 hybrid GPU in it, its Pentium chip would have made little difference compared with the PS2.



    Obviously it's more complex than that, but that's the basics. The GPU wasn't really a factor in an NES because the main board had the power to render the game without any aid from a separate chipset.



    What I said isn't exactly correct, but it gets the point across.