Why are game requirements on consoles lower than on a computer?
Seems strange to me, and it always has. A lot of games that work amazingly well on game consoles have pretty hefty requirements in their computer versions.
I remember back when the N64 was out, the games available on it graphically blew away most computer games. But now it just seems the requirements are a lot higher for computers.
For example: doesn't the GameCube have a ~485 MHz processor? And it's sort of G3/G4 related, yet games that run on it and have Mac versions often require a lot more Mac horsepower.
Comments
Because games only have to output NTSC resolution, which is 720 x 480.
Computer users aren't happy unless they're pushing games at 1280 x 1024 or higher.
Makes me wonder if I should run my computer games at 640 x 480 and just sit back from the monitor.
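To put rough numbers on that, here's a quick back-of-the-envelope sketch comparing how many pixels each target has to fill per frame (just the resolutions mentioned above, nothing more):

```cpp
#include <iostream>

int main() {
    // Resolutions taken from the post above: NTSC console output vs. a
    // typical PC gaming resolution.
    const long consolePixels = 720L * 480;    // 345,600 pixels per frame
    const long pcPixels      = 1280L * 1024;  // 1,310,720 pixels per frame

    std::cout << "Console frame: " << consolePixels << " pixels\n";
    std::cout << "PC frame:      " << pcPixels      << " pixels\n";
    std::cout << "The PC fills "
              << static_cast<double>(pcPixels) / consolePixels
              << "x as many pixels per frame\n";  // ~3.79x
}
```

So at 1280 x 1024 a PC is filling nearly four times the pixels of an NTSC console, every single frame.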
Originally posted by hmurchison
Because games only have to output NTSC resolution, which is 720 x 480.
Computer users aren't happy unless they're pushing games at 1280 x 1024 or higher.
Makes me wonder if I should run my computer games at 640 x 480 and just sit back from the monitor.
But take Tony Hawk Pro Skater 4, for example.
It runs fine and dandy on a GameCube,
yet it requires a 733 MHz G4 and a 32 MB Radeon or GeForce 2 MX, and I assume that is just for 640 x 480.
And I'm pretty sure the GameCube and Xbox support higher resolutions, since they have high-def support.
A console also takes a long time to come up to speed: note that all the consoles are still getting faster, more graphically impressive games as developers learn all the tricks that can be used.
The computer world is different. It is a moving target. There is more than one configuration that a game has to support. All the tricks that a developer learns about the hardware change by the time a game is released. There is a complex OS running behind the game that can do stuff the game can't affect.
Consider this: two years after a console is released, it is really hitting its stride. Two years after PC hardware is released (e.g., a video card), it is obsolete and no one is developing for it anymore.
Finally, consoles have a different balance of hardware. More production cost goes to the graphics and audio subsystems, while less goes to the CPU (see the weak CPUs in all three current consoles). This is feasible because the target software of the console is well known (games). For a PC, most users will be able to make use of a fast CPU more often than a fast graphics card, so you get machines skewed toward that. Current cheap PCs are shipping with Intel Extreme Graphics, which is painfully slow. The Power Mac G5s start with a GeForce FX 5200, still slower than high-end cards of two years ago. Low-end Macs are still shipping with painfully slow video systems: GeForce 4 MX, Radeon 7500. You want games? Get a console.
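The "more than one configuration" point deserves a concrete illustration. A console game is written against exactly one known chip; a PC game has to probe the hardware at startup and ship several render paths. The sketch below is purely illustrative (GpuCaps, pickRenderPath, and the thresholds are invented for this example, not any real engine's API), but it shows the kind of branching a console title never needs:

```cpp
#include <iostream>
#include <string>

// Invented stand-ins for whatever a real engine would query from the
// driver at startup. On a console none of this exists: every unit has
// identical, known hardware.
struct GpuCaps {
    bool hasHardwareTnL;   // hardware transform & lighting?
    bool hasPixelShaders;  // programmable pixel shaders?
    int  videoMemoryMB;
};

// A PC game ships several render paths and picks one at runtime.
std::string pickRenderPath(const GpuCaps& caps) {
    if (caps.hasPixelShaders && caps.videoMemoryMB >= 64)
        return "high-detail shader path";
    if (caps.hasHardwareTnL)
        return "fixed-function T&L path";
    return "software fallback path";  // Intel Extreme Graphics territory
}

int main() {
    GpuCaps radeon9700{true, true, 128};  // a high-end card of the day
    GpuCaps geforce4mx{true, false, 32};  // GeForce 4 MX: no pixel shaders

    std::cout << pickRenderPath(radeon9700) << "\n";  // high-detail shader path
    std::cout << pickRenderPath(geforce4mx) << "\n";  // fixed-function T&L path
}
```

Every extra path is development and testing effort that a console developer can instead spend squeezing the one known chip.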
The Xbox and GC have very smooth games.
And this thing sells for $99... not $800. How can that be?
I suppose part of it probably has to do with the OS; since the GC is a dedicated gaming machine, there is no bulky OS to deal with.
Also, from what I understand, the GPUs in the GC and Xbox are finely tuned to allow for smoothing and what have you.
But it is certainly pretty weird how consoles can play better-looking games (for the most part) with such low specs, comparatively.
It might say a lot about the overhead the OS itself puts on the system. (?)
Aren't console games written in low-level code instead of higher-level languages? Do the gaming/graphics engines, being dedicated frameworks, get more streamlined than the swiss-army-knife approach of computers? Does the chip/hardware technology get specifically tailored for the task instead of being ready for just about anything like a desktop/productivity computer? Do consoles use hard drives, or is everything still flash memory, huge caches, and that sort of thing?
Just some points to ponder. I'm totally ignorant with regard to games and console machines.
Originally posted by BuonRotto
It might say a lot about the overhead the OS itself puts on the system. (?)
Aren't console games written in low-level code instead of higher-level languages? Do the gaming/graphics engines, being dedicated frameworks, get more streamlined than the swiss-army-knife approach of computers? Does the chip/hardware technology get specifically tailored for the task instead of being ready for just about anything like a desktop/productivity computer? Do consoles use hard drives, or is everything still flash memory, huge caches and that sort of thing?
Just some points to ponder. I'm totally ignorant with regard to games and console machines.
I'm ignorant too (obviously), but...
1. Pretty sure they use higher-level code, hence how they are so easily "ported" to multiple consoles/platforms (see the sketch after this list).
2. I would think (or hope) that the frameworks (OpenGL, etc.) would be just as streamlined, and even if not, that the greater hardware power would more than make up for it.
3. Probably a bit more so.
4. Consoles use optical media and load a set amount into RAM, so I don't see how that is much different; if anything, computers using hard drives would have an advantage. And I do believe computers have faster RAM, and definitely more of it.
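On point 1, here's a toy sketch of why higher-level code ports so easily: gameplay logic is written once against an interface, and only a thin platform layer changes per port. The class names and the GAMECUBE build flag are invented for illustration (though GX really is the GameCube's native graphics API):

```cpp
#include <iostream>

// Illustrative only: Renderer, GLRenderer, and GcRenderer are invented
// names, not classes from any real engine.
class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void drawFrame() = 0;
};

class GLRenderer : public Renderer {      // PC/Mac build targets OpenGL
public:
    void drawFrame() override { std::cout << "drawing via OpenGL\n"; }
};

class GcRenderer : public Renderer {      // GameCube build targets GX
public:
    void drawFrame() override { std::cout << "drawing via GX\n"; }
};

// All the gameplay code is shared; it never knows which platform it's on.
void runGame(Renderer& r) {
    r.drawFrame();
}

int main() {
#ifdef GAMECUBE   // hypothetical build flag, set per platform
    GcRenderer renderer;
#else
    GLRenderer renderer;
#endif
    runGame(renderer);
}
```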
I think, while the GameCube is impressive...
the Xbox is even more of a conundrum. The thing practically IS a PC, only with a very slow clock speed (533 MHz, is it?) and a custom-designed GeForce 3 variant, and it runs a very stripped-down version of Windows... etc.
Yet it pushes some major eye-candy games... WTF?
Originally posted by Wrong Robot
I think, while the GameCube is impressive...
the Xbox is even more of a conundrum. The thing practically IS a PC, only with a very slow clock speed (533 MHz, is it?) and a custom-designed GeForce 3 variant, and it runs a very stripped-down version of Windows... etc.
Yet it pushes some major eye-candy games... WTF?
EXACTLY.
I'm so confused.
Originally posted by Wrong Robot
the Xbox is even more of a conundrum. The thing practically IS a PC, only with a very slow clock speed (533 MHz, is it?) and a custom-designed GeForce 3 variant, and it runs a very stripped-down version of Windows... etc.
Yet it pushes some major eye-candy games... WTF?
The graphics card in an Xbox is better than the one in an eMac. It was almost three generations ahead of PC graphics cards when it was designed/released. Now, when they write a game for a PC, they can't assume someone has the best available graphics card. When they do, the games look better than an Xbox game.
Basically, it all comes down to crappy programming.
A bad console game sells 500,000 copies.
A hit PC game sells 500,000 copies.
The economics of this means that even mediocre console games can afford to have the same effort invested in them as a hit PC game.
When you consider Mac games, which sell far fewer copies, there just isn't the money available to pay a dev team to optimise the shit out of games.
Also note that, because of the size of the Mac market, there just aren't as many programmers around who can push systems to their limits.
There are a lot of reasons for consoles being faster, but money, and being a well-understood, standing target, really are at the top.
Originally posted by Wrong Robot
the Xbox is even more of a conundrum. The thing practically IS a PC, only with a very slow clock speed (533 MHz, is it?) and a custom-designed GeForce 3 variant, and it runs a very stripped-down version of Windows... etc.
A PIII at 733 MHz.