What exactly happened to the Playstation 3?


Comments

  • Reply 121 of 322
    Quote:
    Originally Posted by gregmightdothat View Post


    ...but everything I've heard is just that it's gimmicky and doesn't add any real value to the games.



    I hear lots of things too, but it doesn't mean I automatically believe them. Kind of like when I heard that Iraq had WMDs...I didn't automatically believe it.



    Short story: go play one for yourself before spreading info that you heard from someone else.
  • Reply 122 of 322
    sdw2001 Posts: 18,016 member
    Quote:
    Originally Posted by Carniphage View Post


    "The King's new outfit is the smartest thing you ever did see! Everyone agrees."



    The PS2 was utterly crappy hardware, but despite that, fierce brand loyalty and Sony's steroid-pumped marketing muscle made it a success. The crapness was perhaps more apparent to developers than it was to end users, and it was the developers who paid the price for the weakness in the hardware.



    But people have memories, and there are only so many times that the old bait-and-switch trick works. It's clear that fewer people are being fooled this time around.



    Suddenly a voice cries out, "Hey everyone! The King is stark bollock naked!"



    More evidence - eBay prices are now lower than store prices.

    Analysts are touting a price cut. And developers are showing declining confidence in the PS3.



    C.



    The best selling game console in history was a disaster for developers.



    That's the crux of your statement. Get some perspective.
  • Reply 123 of 322
    sdw2001 Posts: 18,016 member
    Quote:
    Originally Posted by OldCodger73 View Post


    Carniphage, a question. You come across so anti-Sony. What happened, did they turn down one of your games?



    Yeah really. It's one thing to talk about why a console might succeed or fail, but he's off the reservation on this one. I'm sorry...I don't care how hard the PS2 was to write for, it clearly didn't matter in the end. It had great games and was hugely successful.



    As for the PS3, well it's clear he's pulling out the same line. He may be right about the price, but I think that's a different argument. It's not going to be the architecture that it loses on.
  • Reply 124 of 322
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by Carniphage View Post


    I wish I was at the top of the food chain! Lol! I am more in a bottom feeding role right now.



    Splinemodel - the world has moved on. Apart from product design, everyone uses sub-ds. Subdivision surfaces have all the benefits of splines, without the disadvantages. Check out Pixar's recent work etc.



    This article might interest you. It describes exactly what you were saying about using the CPU to generate geometry on the fly...



    http://arstechnica.com/articles/paed.../xbox360-1.ars



    C.



    Enough folks in the community have commented that while procedural synthesis has its place, it isn't a panacea. As I said...great for forests, not so much for dungeons. And the conclusion of that article says for the 360 it "shows great promise but we'll see," though there were some interesting things there.



    WRT splines, they and lofts are still used when you want to model something accurately, or simply as a guide to build polys and sub-d's. Just another tool in the chest. Not that I do any of that stuff, but I depend on folks who do for my models, so you need to learn enough to be dangerous. Airplanes and subs, I was told.



    Vinea
  • Reply 125 of 322
    carniphage Posts: 1,984 member
    Quote:
    Originally Posted by vinea View Post


    Enough folks in the community have commented that while procedural synthesis has its place, it isn't a panacea. As I said...great for forests, not so much for dungeons. And the conclusion of that article says for the 360 it "shows great promise but we'll see," though there were some interesting things there.



    WRT splines, they and lofts are still used when you want to model something accurately, or simply as a guide to build polys and sub-d's. Just another tool in the chest. Not that I do any of that stuff, but I depend on folks who do for my models, so you need to learn enough to be dangerous. Airplanes and subs, I was told.



    Vinea



    What I was clumsily trying to say was that Splinemodel's suggestion of utilising the Cell to tessellate spline patches on the fly was equivalent to this technology.



    Namely, the procedural synthesis technology in the 360 could be used to dynamically tessellate and animate subdivision surfaces (not just plonk grass into a field, or bricks into a dungeon). During animation, sub-ds are less prone to discontinuities than NURBS.



    And yes, while the same thing is possible on the PS3 - I think there are bandwidth issues which make it less attractive as a solution.



    C.
  • Reply 126 of 322
    carniphage Posts: 1,984 member
    Quote:
    Originally Posted by SDW2001 View Post


    Yeah really. It's one thing to talk about why a console might succeed or fail, but he's off the reservation on this one. I'm sorry...I don't care how hard the PS2 was to write for, it clearly didn't matter in the end. It had great games and was hugely successful.



    As for the PS3, well it's clear he's pulling out the same line. He may be right about the price, but I think that's a different argument. It's not going to be the architecture that it loses on.



    OK, instead of being quite so provocative, I will try to offer up a balanced statement.



    The PS3 still could become the dominant console platform in this round.

    A year ago no-one doubted it. However, now people do. I am hearing doubting voices; many of those voices are in the development community.



    The reasons working against Sony are these:



    1) Sony are charging a premium to consumers for their platform. Sales don't look that great. There is consensus that the launch was botched and outside the die-hard fans, the PS3 does not currently offer great value. (yet)



    2) Sony were second to market. Meaning weak PS3 launch titles are pitted against second gen 360 titles. Real world 360 prices are falling. MS now breaks even on hardware sales.



    3) Their architecture is - arguably - less capable than the main rival product.

    (this is an arguable point - Sony wins on Flops and Storage. 360 wins on GPU, Ram and Processors)



    4) Software development costs are 20%-50% higher on the Sony platform. This did not hurt Sony for PS2, but it did hurt some developers. Free middleware and the option of PC SKUs are tempting for many developers, even some in Asia.



    5) Sony's unblemished track-record is tarnished by the failure of the PSP - and generally Sony as an organisation looks more vulnerable than it has for many years.



    6) And this is the kicker. The sales needed to break even on a Next Gen game title are around the 500,000 mark. With low console sales and low attach rates developers are getting cold feet.

    Or at the very least would prefer to hedge their bets.



    Working *for* Sony are:



    1) Phenomenal brand loyalty.



    2) Everyone hates Microsoft. (me included!! Vista sucks etc.)



    C.
  • Reply 127 of 322
    sdw2001 Posts: 18,016 member
    Quote:
    Originally Posted by Carniphage View Post


    OK, instead of being quite so provocative, I will try to offer up a balanced statement.



    The PS3 still could become the dominant console platform in this round.

    A year ago no-one doubted it. However, now people do. I am hearing doubting voices; many of those voices are in the development community.



    The reasons working against Sony are these:



    1) Sony are charging a premium to consumers for their platform. Sales don't look that great. There is consensus that the launch was botched and outside the die-hard fans, the PS3 does not currently offer great value. (yet)



    2) Sony were second to market. Meaning weak PS3 launch titles are pitted against second gen 360 titles. Real world 360 prices are falling. MS now breaks even on hardware sales.



    3) Their architecture is - arguably - less capable than the main rival product.

    (this is an arguable point - Sony wins on Flops and Storage. 360 wins on GPU, Ram and Processors)



    4) Software development costs are 20%-50% higher on the Sony platform. This did not hurt Sony for PS2, but it did hurt some developers. Free middleware and the option of PC SKUs are tempting for many developers, even some in Asia.



    5) Sony's unblemished track-record is tarnished by the failure of the PSP - and generally Sony as an organisation looks more vulnerable than it has for many years.



    6) And this is the kicker. The sales needed to break even on a Next Gen game title are around the 500,000 mark. With low console sales and low attach rates developers are getting cold feet.

    Or at the very least would prefer to hedge their bets.



    Working *for* Sony are:



    1) Phenomenal brand loyalty.



    2) Everyone hates Microsoft. (me included!! Vista sucks etc.)



    C.



    1. Sales data just can't be compared right now due to the initial availability. But if you must compare, at least consider that Sony had a lot less product to sell, and still sold what, 750,000 right off the bat?



    2. I think that's pretty much irrelevant. People know the bigger titles are coming.



    3. Fully disagree. The PS3 hardware has far more potential, particularly in terms of the processor.



    4. Support your figures.



    5. PSP sales have dropped badly, it's true. However, the data I can find shows that they've sold almost 25 million of them. I don't know that it qualifies as a failure, nor do I think it impacts the PS3 all that much, if at all.



    6. Unsupported. Console sales are not "low." We'll talk in a year or so and see what sales are like.
  • Reply 128 of 322
    carniphage Posts: 1,984 member
    Quote:
    Originally Posted by SDW2001 View Post


    1. Sales data just can't be compared right now due to the initial availability. But if you must compare, at least consider that Sony had a lot less product to sell, and still sold what, 750,000 right off the bat?



    2. I think that's pretty much irrelevant. People know the bigger titles are coming.



    3. Fully disagree. The PS3 hardware has far more potential, particularly in terms of the processor.



    4. Support your figures.



    5. PSP sales have dropped badly, it's true. However, the data I can find shows that they've sold almost 25 million of them. I don't know that it qualifies as a failure, nor do I think it impacts the PS3 all that much, if at all.



    6. Unsupported. Console sales are not "low." We'll talk in a year or so and see what sales are like.



    1. Yes, it can be compared. Check out this...

    http://www.gamesindustry.biz/content_page.php?aid=22621

    Which shows that *attach rates* for the PS3 are very low.



    2. So is Halo 3. So is the Zephyr. There's always something coming. Which title do you think will make the big difference for PS3?



    3. Like I said, this is arguable. And here we are arguing. I have said enough on this already.

    But look at:

    http://www.1up.com/do/feature?cId=3155393

    http://www.theinquirer.net/default.aspx?article=32171 Yes, I know the Inquirer is crap.

    and

    http://rofl-at.us/?page_id=85



    4. Costs for PS3 development are *significantly* higher, but it's hard to provide you with anything that is not anecdotal. Look into the licensing costs of middleware. Find out about development team sizes, production times, etc.



    5. Again you are looking at console sales (money lost) instead of game sales (money made).

    Perhaps this will help the PSP recover...

    http://www.joystiq.com/2007/02/05/hm...s3-pre-orders/



    6. Yeah they really are low.

    http://www.gamesindustry.biz/content_page.php?aid=22595

    But the attach rate is the big problem.



    I'm happy to leave this for six months or so and see where things are then.



    If you personally had nine or ten million dollars to invest in a videogame project and you needed to sell half a million units to break even, would you:

    a) Go exclusively for the PS3 - given what we now know?

    b) Spread your risk, and go multi-format?

    c) Do PC/360 only and save some cash?



    Which would you do?



    C.
  • Reply 129 of 322
    Quote:
    Originally Posted by Carniphage View Post


    What I was clumsily trying to say was that Splinemodel's suggestion of utilising the Cell to tessellate spline patches on the fly was equivalent to this technology.



    You have to realize that a lot of the things I say are hypothetical. I try to phrase them that way. Having done some "animation" work (mostly positioning for stills), I don't think I've ever operated directly with spline patches. I do in the modeling phase, but that's neither here nor there. My mantra here is that I think it's best to optimize new technologies, even if it means changing the paradigm. If spline patches can deliver a better product on the Cell, then I say use them (I haven't looked into whether they can or cannot, and quite possibly they cannot). If Markov modeling or fuzzy matrices can deliver a better AI product on the Cell than procedural means can, then I say use them. The journey here takes some effort, but it ends up advancing the whole state of computer science, which I think is a good thing.



    When optimal methods are used for each, there's no doubt in my mind that the PS3 can deliver a vastly better gaming product than the Xbox 360 can. You are hung up on the process of developing optimal methods. That's your prerogative, but I can guarantee you that for every developer like you, there is one who wants to optimize for the PS3/Cell and is working towards that. We have already seen evidence that optimized Cell middleware is indeed becoming a profitable market. So, again, it's your prerogative to target the 360, but sooner than you think, you're going to be yesterday's game developer.
  • Reply 130 of 322
    Quote:
    Originally Posted by Splinemodel View Post


    You have to realize that a lot of the things I say are hypothetical. ... There's no doubt in my mind that the PS3 can deliver a vastly better gaming product than the Xbox 360 can.



    Yup. You are right. That is a hypothetical statement all right. In fact it's more than hypothetical. It's bordering on a religious type of belief. (What do you call Cell worshippers? Celloastrian? Cellebate? Cellebrant? Sonanist?)



    But every religion has its Richard Dawkins. After all, who would not want to be married to Lalla Ward?



    Have a read of this again...

    http://www.theinquirer.net/default.aspx?article=32171

    ... and have an epiphany on me.



    C.
  • Reply 131 of 322
    Quote:
    Originally Posted by Carniphage View Post


    Yup. You are right. That is a hypothetical statement all right. In fact it's more than hypothetical. It's bordering on a religious type of belief. (What do you call Cell worshippers? Celloastrian? Cellebate? Cellebrant? Sonanist?)



    It's not very religious, because it's based entirely on fact. If I were to use the colloquial vernacular rather than the appropriate term (hypothesis) I would have used the word "theory" instead, to convey the idea that there are scientific grounds behind the argument. We know all about the Cell. Any hypotheses I make are based on observations of the Cell and my knowledge of comp-arch and programmability. I think this falls into hard-science by definition.



    I do have a degree in EE, after all, and I've spent a lot of time working with digital signals. I can tell you that I know with certainty that most elements of a game program can be broken down into chunks that work well on the Cell. Again, whether you decide to use the old ways instead of the "new" ways is up to you, but I can guarantee you that there are other game developers at work on the new ways.
  • Reply 132 of 322
    carniphage Posts: 1,984 member
    Let's, for the sake of science, do a thought experiment.

    And, just for an instant, I will set aside my view that the Cell is good for videodisc playing and lousy for game logic, and assume that the Cell is the mightiest processor in Christendom.



    So let us think of game programming.

    Think of a game program as a string. (each part is connected to the next)



    Input->Game Logic & AI -> Simulation & Animation -> Transformation -> Rasterization



    Science shows us that metal chains are much stronger than weak string, right? And stronger is better right?



    So imagine we replace half of the string with a mighty cellumantium chain, it will be stronger right?



    Wrong. Because every kid knows that a chain is only as strong as its weakest point.



    The GPU on the PS3 was added too late (their engineers thought they could do everything on the Cell). The resultant design means the GPU is an afterthought. The limited memory bandwidth is really a serious design flaw - and in game terms it becomes the bottleneck. The almighty Cell can simulate every leaf on a tree - but the GPU can't draw 'em. Especially not at 1080p, when all of the available bandwidth is eaten up just servicing the frame buffer.



    What we have here is a 1000 Horsepower car - with slippery tyres.

    This is the 500 Watt Hi Fi Amp - with Radio Shack speakers.

    This is Flo-Jo on ice.

    This is the Williams sisters with badminton rackets.



    C.



    mmmmmm Lalla Ward.
  • Reply 133 of 322
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by Carniphage View Post


    Have a read of this again...

    http://www.theinquirer.net/default.aspx?article=32171

    ... and have an epiphany on me.



    C.





    Rob Fahey 17:06 07/06/2006

    "They've got the wrong end of the stick grasped firmly in both hands."



    Senior developers working on PlayStation 3 titles have told GamesIndustry.biz that reports of serious problems with the system's Cell and RSX hardware are "misleading and uninformed."



    Earlier this week, technology news site The Inquirer reported that the PS3 was "slow and broken", with a correspondent for the site claiming that massive flaws with the console's hardware mean that it is "hobbled" compared to Microsoft's Xbox 360.



    The site based its assertions on a claim that the NVIDIA-designed RSX graphics unit has a slower triangle setup rate than the ATI-designed part in the Xbox 360, and on a slide from Sony's Devstation event a few months back showing the memory access speeds within the console.



    However, speaking to GamesIndustry.biz this week, several developers who are familiar with the PS3 hardware have rubbished the claims made by The Inquirer - describing both sets of figures as "entirely meaningless."



    Although our sources declined to be named due to the continuing secretive and NDA-laden nature of PlayStation 3 development, they were unanimous in claiming that the figures, while they may well be true, have been grossly misinterpreted.



    The contentious triangle setup figure, which The Inquirer claims to be 270 million triangles per second, compared to around 500 million per second in the Xbox, came under fire first.



    "It's just a pointless measurement," one programmer told us. "Where's the context? How were these numbers measured? There are loads of different ways you can measure tri performance, and just putting up headline figures like that tells you nothing."



    "In fact, the PlayStation 2 had better tri performance than the Xbox, on paper," he continued. "Everyone knows that the Xbox was more powerful at running real games, but if you just wanted to fill a screen with 2D, flat colour, unlit triangles, then the PS2 was much better at that, so it looked great in benchmarks. That just shows how meaningless this measurement is - it's really pointless."



    However, particular scorn was heaped upon the claim that the Cell is being "hobbled" by slow memory access - based on a Devstation slide which shows Cell having only 16Mb/s read access to "Local Memory", compared to the 10-25Gb/s access figures for other component and memory types in the PS3.



    "They've got the wrong end of the stick grasped firmly in both hands," said another source regarding this claim. "I'm not even sure if they're holding the right stick."



    Each developer concurred that the slide in question was referring to local memory on the RSX - the graphics memory, in other words, and not the local memory on the Cell processor which The Inquirer claimed was in question.



    "I didn't see that slide at Devstation, but all the numbers add up," one coder said, "and it's a total non-issue. You never, ever need to access that memory from the Cell - I can think of some useful debugging things you might do with that access in the testing stage, but that's about it. In fact, on the PS2 you couldn't access that memory from the CPU at all, and it was never really a problem!"



    "I can see a couple of reasons why you might want to use it," another developer told us, "but really, they're pretty obscure, and you could probably do them on the RSX anyway, since it's quite flexible. Besides, if you really need to access video memory from the Cell, you can use the RSX to copy it over into main memory really quickly - it's all there on the slide."



    "I doubt a single person in the room batted an eyelid when they showed that slide," continued the first source. "It's exactly what we'd expect, and the bits that we actually need to use to make games are perfectly fast."



    While dismissing The Inquirer's claims as entirely spurious - and pointing out that even if they were true, they would be flaws so serious that Sony would simply not be able to release the Cell chip in that state - at least one of our sources admitted that PS3 was taking some time to get used to, but perhaps not as much as some parts of the media have suggested.



    "I'd say PS3 was a challenge to work on," he said, "but every new platform takes a while to get used to. Put it like this, I worked on early PS2 games, and those were a real nightmare - we're getting code up and running on PS3 much faster than we did last time around."



    "Once people start doing really impressive stuff on PS3 and Xbox 360, they're both going to be much the same [in terms of difficulty]," he concluded. "Sony's giving us better tools this time around - they're still not great at communicating and there are some weird holes in their developer support, but they've learned a lot of lessons from PS2."



    Vinea
  • Reply 134 of 322
    carniphage Posts: 1,984 member
    So, in summary of that article:



    Unnamed developer says...

    ""

    It's not that much broken. Numbers mean nothing.

    Programming PS3 is a bit of a challenge.

    PS2 was a disaster but Sony have improved its tools a bit.

    It's easier now than PS2 (meaning we can get a triangle on the screen in less than a week)

    ""



    Massively convincing, that. Numbers mean nothing. Ignore the slide. War = Hate. Fast = Slow.



    Remember that the most significant Sony development is being done by teams working on Sony first-party titles. Such teams may be asked to act as a rebuttal unit - to defend the "superior hardware". And if I were being paid by Sony, I'd join the line to defend the honor of the company.



    Of course what they say in private might be different.





    Ok - rebut this...





    GPU

    Even ignoring the bandwidth limitations the PS3's GPU is not as powerful as the Xbox 360's GPU.



    Below are the specs from Sony's press release regarding the PS3's GPU.



    RSX GPU

    • 550 MHz

    • Independent vertex/pixel shaders

    • 51 billion dot products per second (total system performance)

    • 300M transistors

    • 136 shader operations per clock



    The interesting ALU performance numbers are 51 billion dot products per second (total system performance), 300M transistors, and the claim of being more than twice as powerful as the 6800 Ultra.



    The 51 billion dot products per second were listed on a summary slide of total graphics system performance and are assumed to include the Cell processor. Sony's calculations seem to assume that the Cell can do a dot product per cycle per DSP, despite not having a dot product instruction.



    However, using Sony's claim, 7 dot products per cycle * 3.2 GHz = 22.4 billion dot products per second for the CPU. That leaves 51 - 22.4 = 28.6 billion dot products per second for the GPU, which works out to 28.6 billion dot products per second / 550 MHz = 52 GPU ALU ops per clock.



    It is important to note that if the RSX ALUs are similar to the GeForce 6800 ALUs then they work on vector4s, while the Xbox 360 GPU ALUs work on vector5s. The total programmable GPU floating point performance for the PS3 would be 52 ALU ops * 4 floats per op * 2 (madd) * 550 MHz = 228.8 GFLOPS, which is less than the Xbox 360's 48 ALU ops * 5 floats per op * 2 (madd) * 500 MHz = 240 GFLOPS.



    With the number of transistors being slightly larger on the Xbox 360 GPU (330M) it's not surprising that the total programmable GFLOPs number is very close.





    The PS3 does have the additional 7 DSPs on the Cell to add more floating point ops for graphics rendering, but the Xbox 360's three general purpose cores with custom D3D and dot product instructions are more customized for true graphics related calculations.



    The 6800 Ultra has 16 pixel pipes, 6 vertex pipes, and runs at 400 MHz. Given the RSX's 2x better than a 6800 Ultra number and the higher frequency of the RSX, one can roughly estimate that it will have 24 pixel shading pipes and 4 vertex shading pipes (fewer vertex shading pipes since the Cell DSPs will do some vertex shading). If the PS3 GPU keeps the 6800 pixel shader pipe co-issue architecture which is hinted at in Sony's press release, this again gives it 24 pixel pipes * 2 issued per pipe + 4 vertex pipes = 52 dot products per clock in the GPU.



    If the RSX follows the 6800 Ultra route, it will have 24 texture samplers, but when in use they take up an ALU slot, making the PS3 GPU in practice even less impressive. Even if it does manage to decouple texture fetching from ALU co-issue, it won't have enough bandwidth to fetch the textures anyways.



    For shader operations per clock, Sony is most likely counting each pixel pipe as four ALU operations (co-issued vector+scalar) plus a texture operation per pixel pipe, and 4 scalar operations for each vertex pipe, for a total of 24 * (4 + 1) + (4*4) = 136 operations per cycle, or 136 * 550 MHz = 74.8 GOps per second.





    Given the Xbox 360 GPU's multithreading and balanced design, you really can't compare the two systems in terms of shading operations per clock. However, the Xbox 360's GPU can do 48 ALU operations (each can do a vector4 and scalar op per clock), 16 texture fetches, 32 control flow operations, and 16 programmable vertex fetch operations with tessellation per clock for a total of 48*2 + 16 + 32 + 16 = 160 operations per cycle or 160 * 500 = 80 GOps per second.



    Overall, the automatic shader load balancing, memory export features, programmable vertex fetching, programmable triangle tessellator, full-rate texture fetching in the vertex shader, and other well-beyond-shader-model-3.0 features of the Xbox 360 GPU should also contribute to overall rendering performance.



    Bandwidth

    The PS3 has 22.4 GB/s of GDDR3 bandwidth and 25.6 GB/s of RDRAM bandwidth for a total system bandwidth of 48 GB/s.



    The Xbox 360 has 22.4 GB/s of GDDR3 bandwidth and a 256 GB/s of EDRAM bandwidth for a total of 278.4 GB/s total system bandwidth.





    Why does the Xbox 360 have such an extreme amount of bandwidth? Even the simplest calculations show that a large amount of bandwidth is consumed by the frame buffer. For example, with simple color rendering and Z testing at 550 MHz, the frame buffer alone requires 52.8 GB/s at 8 pixels per clock. The PS3's memory bandwidth is insufficient to maintain its GPU's peak rendering speed, even without texture and vertex fetches.



    The PS3 uses Z and color compression to try to compensate for the lack of memory bandwidth. The problem with Z and color compression is that the compression breaks down quickly when rendering complex next-generation 3D scenes.



    HDR, alpha-blending, and anti-aliasing require even more memory bandwidth. This is why Xbox 360 has 256 GB/s bandwidth reserved just for the frame buffer. This allows the Xbox 360 GPU to do Z testing, HDR, and alpha blended color rendering with 4X MSAA at full rate and still have the entire main bus bandwidth of 22.4 GB/s left over for textures and vertices.



    CONCLUSION

    When you break down the numbers, Xbox 360 has provably more performance than PS3. Keep in mind that Sony has a track record of over promising and under delivering on technical performance. The truth is that both systems pack a lot of power for high definition games and entertainment.
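
    (A quick back-of-the-envelope check of the arithmetic quoted above, using the article's own figures. The 12-bytes-per-pixel breakdown - a colour write plus a Z read and write - is my assumption about how the 52.8 GB/s number was reached.)

    // Recomputing the article's headline numbers; all figures are the article's, not mine
    #include <cstdio>

    int main()
    {
        double rsx_gflops   = 52.0 * 4.0 * 2.0 * 0.550;  // 52 ALU ops * vec4 * madd * 550 MHz = 228.8 GFLOPS
        double xenos_gflops = 48.0 * 5.0 * 2.0 * 0.500;  // 48 ALU ops * vec5 * madd * 500 MHz = 240 GFLOPS
        double fb_gbps      = 550e6 * 8.0 * 12.0 / 1e9;  // 8 pixels/clock * 12 bytes/pixel at 550 MHz = 52.8 GB/s

        std::printf("RSX shader ALUs:   %.1f GFLOPS\n", rsx_gflops);
        std::printf("Xenos shader ALUs: %.1f GFLOPS\n", xenos_gflops);
        std::printf("Frame buffer:      %.1f GB/s\n", fb_gbps);
    }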



    The best sentence is this one...



    The PS3's memory bandwidth is insufficient to maintain its GPU's peak rendering speed, even without texture and vertex fetches.





    In all these articles, the PS3 is either "comparable" or "broken", but never better.

    In side-by-side comparisons of identical titles, you can see worse. You can see same.



    Dude where's my "better"?



    I can't believe it's not better?



    C.
  • Reply 135 of 322
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by Carniphage View Post


    Let's, for the sake of science, do a thought experiment.

    And, just for an instant, I will set aside my view that the Cell is good for videodisc playing and lousy for game logic, and assume that the Cell is the mightiest processor in Christendom.



    So let us think of game programming.

    Think of a game program as a string. (each part is connected to the next)



    Input->Game Logic & AI -> Simulation & Animation -> Transformation -> Rasterization



    Science shows us that metal chains are much stronger than weak string, right? And stronger is better right?



    So imagine we replace half of the string with a mighty cellumantium chain, it will be stronger right?



    Wrong. Because every kid knows that a chain is only as strong as its weakest point.



    And programmers are supposed to be good at logic. Perhaps analogies are where we fall down and cry.



    If you replace half the "string" with better capability, you have more string to use on the not-so-hot areas. If I have a machine that does transformation and rasterization well and I offload that work, then I have more CPU budget to spend on simulation and animation.



    Quote:

    The GPU on the PS3 was added too late (their engineers thought they could do everything on the Cell). The resultant design means the GPU is an afterthought. The limited memory bandwidth is really a serious design flaw - and in game terms it becomes the bottleneck. The almighty Cell can simulate every leaf on a tree - but the GPU can't draw 'em. Especially not at 1080p, when all of the available bandwidth is eaten up just servicing the frame buffer.



    While the 360's unified architecture and the EDRAM are a very nice advantage (essentially giving you HDR and 4xAA almost for free), the RSX isn't a horrid GPU. Also, on the PS3 side, the Cell isn't useless. Incognito did all their cloud rendering (volumetric raytracer) on the Cell, IIRC. The SPEs can help keep the GPU fed.



    Vinea
  • Reply 136 of 322
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by Carniphage View Post


    So, in summary of that article:



    Unnamed developer says...

    ""

    It's not that much broken. Numbers mean nothing.




    And the "proof" of that adage is that Oblivion looks as good or better on the PS3 (from previews) than the 360.



    Quote:

    Remember that the most significant Sony development is being done by teams working on Sony first-party titles. Such teams may be asked to act as a rebuttal unit - to defend the "superior hardware". And if I were being paid by Sony, I'd join the line to defend the honor of the company.



    Of course what they say in private might be different.



    Arguably there's no proof that anyone is paying you for anything.



    Quote:

    Ok - rebut this...



    I read that a while back. The rebuttal is that the numbers game on both sides is simply posturing and theory. The proof has been, and always will be, in the games. If the PS3 falls down on 1080p games then it will become obvious. On the other hand, for both consoles 720p is likely the sweet spot.



    Quote:

    In all these articles, the PS3 is either "comparable" or "broken", but never better.

    In side-by-side comparisons of identical titles, you can see worse. You can see same.



    The "broken" article is clearly wrong. The 16Mb/s to local memory is obviously to the RSX memory, as explained, or Sony might as well have not bothered. The PS3 just launched and it's too early to tell.



    Quote:

    Dude where's my "better"?



    I can't believe it's not better?



    C.



    Jeez...if you are a game dev you must be a Mac game dev.



    Vinea
  • Reply 137 of 322
    Quote:
    Originally Posted by Carniphage View Post


    Let's, for the sake of science, do a thought experiement.



    The only thing you show is that your workflow, as it is, may not be best suited to the PS3. Making half-cocked analogies doesn't do anything to advance your hypothesis that the PS3 has technical ceilings that won't be surpassed in the near future.
  • Reply 138 of 322
    carniphage Posts: 1,984 member
    Quote:
    Originally Posted by vinea View Post


    While the 360's unified architecture and the EDRAM are a very nice advantage (essentially giving you HDR and 4xAA almost for free), the RSX isn't a horrid GPU. Vinea



    Not horrid. Just not as good. You accept right?



    Rasterization is a weak(er) bit of the PS3 chain.

    Now let's look at the other end of the chain.



    All the games I have worked on had some kind of "interaction engine". It's pretty universal. Each entity in the game world usually has a script or a program fragment which determines how the entity behaves on a moment by moment basis. Just like Keanu said in Matrix 3.



    Some of these scripts would be trivially simple. Some would be much more complex, with A* route-solving and so forth. The vast majority are code that takes the form of state machines or case statements. Lots of "if"s and branching. Lots of hopping about in memory.

    The PS2 hated this stuff.



    In older games you'd have a few tens of these entities on the go at once. In newer games you could have hundreds. And that was a problem because "branches per second" is not something modern processors like to boast about. Cache coherency goes out the window.



    This game-entity logic is something that you need to run on a conventional processor. It's too bitty, integer-based and fragmented to offload onto an FPU. You need a conventional CPU to do it, because most of these entities are asking questions of the game database. (What's my nearest enemy? What material tag am I standing on? If I shoot out this ray, what do I hit?) This type of code is what makes games interactive, and this type of code does not map well onto vector processors.
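
    To make that concrete, here is roughly the shape of the code I mean - a hypothetical entity script sketched in plain C++. All the names (World, Entity, the query methods) are made up for illustration; the point is the density of branches and game-database lookups.

    #include <cstdio>

    enum State { IDLE, CHASE, ATTACK, FLEE };

    struct World {                                   // stand-in for the game database
        bool enemy_visible()   const { return true;  }
        bool in_attack_range() const { return false; }
        bool line_of_sight()   const { return true;  }
    };

    struct Entity {
        State state  = IDLE;
        float health = 100.0f;
    };

    void update_entity(Entity& e, const World& w)    // branchy, stateful, integer-ish logic
    {
        switch (e.state) {
        case IDLE:   if (w.enemy_visible())        e.state = CHASE;  break;
        case CHASE:  if (e.health < 25.0f)         e.state = FLEE;
                     else if (w.in_attack_range()) e.state = ATTACK; break;
        case ATTACK: if (!w.line_of_sight())       e.state = CHASE;  break;
        case FLEE:   if (e.health > 50.0f)         e.state = IDLE;   break;
        }
    }

    int main()
    {
        World w;
        Entity e;
        update_entity(e, w);
        std::printf("state = %d\n", e.state);
    }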



    Physics simulation, on the other hand, *is* something you could do on the Cell.

    Although again searching the collision database is gonna be a problem.



    So, to go back to the string (the PS3 architecture is a weaker performer at game logic, strong in simulation and weaker in rasterization): its inherent strength, in multiplying billions of numbers together, is a strong link in the middle of a weak string. No matter how strong that link is, it can't overcome the weakness at *both* ends of the string.





    C.
  • Reply 139 of 322
    sdw2001 Posts: 18,016 member
    Here we go again. Carniphage, you're descending into the techno-geek plane yet another time. You're trying desperately (and why you are doing so should be examined as well) to show the technical inferiority of the PS3, which is a dubious-at-best claim anyway. But even where you succeed in doing so, you miss the rather obvious point:



    IT DOESN'T MATTER





    Developers will write for the PS3 provided there is enough of an installed user base to do so. Right now, there's not a reason to think sales will be poor enough to cause a major falloff in development, gigaflops and attachment rates and triangle rendering data aside.
  • Reply 140 of 322
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by Carniphage View Post


    Not horrid. Just not as good. You accept right?



    Sure. Offset by the Cell.



    Quote:

    All the games I have worked on had some kind of "interaction engine". It's pretty universal.



    ...



    Some of these scripts would be trivially simple. Some would be much more complex, with A* route-solving and so forth. The vast majority are code that takes the form of state machines or case statements. Lots of "if"s and branching. Lots of hopping about in memory.



    ...



    And that was a problem because "branches per second" is not something modern processors like to boast about. Cache coherency goes out the window.



    This game-entity logic is something that you need to run on a conventional processor. It's too bitty, integer-based and fragmented to offload onto an FPU.



    Well, one, you still have a PPU, and two, the SPUs have been shown to work reasonably well with legacy code even with software (compiler-based) branch prediction, even in comparison to the PPU. When optimized to use the SPU local store they work 20% faster than the PPU. Those numbers come from a PS3 dev who hopefully benchmarked real game code on the SPU.



    Now, certainly, branch prediction on the SPU works better for loops than when faced with many conditional branches, because you need the branch hint to appear 11 cycles before the branch instruction (and you only get one hint). Even then, it's still only a 50-50 chance of an 18-cycle branch mispredict penalty (9 cycles on average). In some cases you can just execute both branches and do the branch late, suffering no time penalty at the cost of efficiency.



    Whether the current Cell compilers do that...I dunno. But even a missed prediction isn't horrible if you're branching inside the SPU's local memory (i.e. optimizing your algorithm to run on an SPU). If you are branching outside the local store then yes, that is the equivalent of a cache miss (worse, really), but most critical code runs in tightly bound loops.



    What you can't do well is handle large pointer-based structures and stacks, because those live in main memory, so for large structures the SPU is not great (it can load blocks of data well, but it can't test individual bytes in the blocks to see if the rest of the block SHOULD be loaded).



    The implication here is that search and other similar algorithms (sort, collision, etc.) need to be parallelized so that you work on part of the data structure in local store at a time, either in series or, better, in parallel across a few SPUs. For collision detection you can try to do that while you're doing the geo processing for animation, to avoid loading the data again just to do collision.
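
    Purely as a sketch of that chunking idea (plain C++ stand-in - the memcpy stands in for a DMA into local store, and every name here is made up):

    #include <algorithm>
    #include <cstddef>
    #include <cstring>
    #include <vector>

    struct Sphere { float x, y, z, r; };

    const std::size_t kChunk = 4096 / sizeof(Sphere);   // pretend local-store budget

    int count_hits(const std::vector<Sphere>& all, const Sphere& probe)
    {
        Sphere local[kChunk];                            // the "local store"
        int hits = 0;
        for (std::size_t base = 0; base < all.size(); base += kChunk) {
            std::size_t n = std::min(kChunk, all.size() - base);
            std::memcpy(local, all.data() + base, n * sizeof(Sphere));  // stand-in for a DMA
            for (std::size_t i = 0; i < n; ++i) {        // test only inside the local block
                float dx = local[i].x - probe.x, dy = local[i].y - probe.y,
                      dz = local[i].z - probe.z, rr = local[i].r + probe.r;
                if (dx*dx + dy*dy + dz*dz <= rr*rr) ++hits;
            }
        }
        return hits;
    }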



    Quote:

    You need a conventional CPU to do it, because most of these entities are asking questions of the game database. (What's my nearest enemy? What material tag am I standing on? If I shoot out this ray, what do I hit?) This type of code is what makes games interactive, and this type of code does not map well onto vector processors.



    SPUs are more than just vector processors and can run pretty much any standard C++ code (subject to memory constraints). While an SPU doesn't have hardware branch prediction, compare that with the P4's hardware branch prediction: if you blow that, it's a 35-cycle hit. You'll miss on the software branch prediction on the SPU far more often, but when you do it's an 18-cycle hit. And the compiler optimized for the SPU can try to convert some types of branches to a select or a min/max/and/or combo for you.



    The guidelines I've read for programming the Cell (as an academic exercise...I'm an MDX coder, so I'm more likely to do XNA than go do any PS3 coding) typically say avoid inline calls, try to use spu_select to get rid of branches, interleave blocks of unrelated code to reduce dependencies, and unroll loops to allow the compiler to move code around for you.



    That doesn't seem too horrid, and the point is the Cell has 128 registers and a 6-cycle dual pipeline that you can use effectively to offset some of the disadvantages. Computing both sides of a branch and using spu_select might seem a waste, but you can do it where you can't put a predict hint within 11 cycles.



    The spu_select method allows you to do branches like



    if (a < b) c = d; else c = e;



    without branches, at the cost of the cycles of doing an extra c = e or c = d. And if you can get rid of branches inside loops you get a speedup even if your CPU has zero branch penalty (because branching inside loops kills optimization).
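
    For what it's worth, the same trick in plain scalar C++ - no SPU intrinsics, just the idea of computing both sides and blending with a mask (the helper name is made up):

    #include <cstdint>
    #include <cstdio>

    // Branchless select: build an all-ones/all-zeros mask from the comparison,
    // then blend the two candidates. This is the scalar equivalent of what the
    // SPU select does across whole vectors.
    static std::int32_t select_lt(std::int32_t a, std::int32_t b,
                                  std::int32_t d, std::int32_t e)
    {
        std::int32_t mask = -static_cast<std::int32_t>(a < b);  // all ones if a < b, else zero
        return (d & mask) | (e & ~mask);                        // c = (a < b) ? d : e, no branch
    }

    int main()
    {
        std::printf("%d\n", select_lt(1, 2, 10, 20));   // prints 10
        std::printf("%d\n", select_lt(3, 2, 10, 20));   // prints 20
    }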



    Folks report that they can get a CPI of less than 1.0 on the SPU on their first compile, so it doesn't suck as badly as you say.



    Quote:

    So, to go back to the string (the PS3 architecture is a weaker performer at game logic, strong in simulation and weaker in rasterization): its inherent strength, in multiplying billions of numbers together, is a strong link in the middle of a weak string. No matter how strong that link is, it can't overcome the weakness at *both* ends of the string.



    And PS3 devs say differently, and the little I've read of IBM's docs (a while ago, admittedly, but it's not like it would have gotten worse in terms of compiler support) implies otherwise. Your analogy is broken, as folks have certainly found ways to use a 3.2GHz SPU to do stuff beyond multiplying numbers together. We can argue about the lack of OOO and rename registers, in addition to the lack of branch prediction, if you like, but the SPUs are reasonably capable little guys.



    The fact is both Xenon and Cell are nice hardware in comparison to the last gen.



    Vinea