Intel roadmap reveals quad-core Xeon details


  • Reply 61 of 87
    Quote:
    Originally Posted by melgross


    No. That has nothing to do with it. AMD mobos are the same. It's up to the machine manufacturer to decide what they want to put on the board, not the chip manufacturer. They both allow for enough lanes.



    nForce Pro has a lot more lanes.
  • Reply 62 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by Joe_the_dragon


    nForce Pro has a lot more lanes.



    And that has to do with a CPU in what way? Chipsets are not the CPU. As I said, there are mobos for both Intel and AMD that can do 16 lanes.
  • Reply 63 of 87
    Quote:
    Originally Posted by melgross


    And that has to do with a CPU in what way? Chipsets are not the CPU. As I said, there are mobos for both Intel and AMD that can do 16 lanes.



    Yeah. In any case, the chipsets are tied to the mobos. The boards for Intel and AMD built with ATI or nVidia chipsets, not Intel chipsets, have true dual PCI Express x16 lanes, plus a few more PCIe lanes to spare. The x16 + x16 arrangement is mainly about SLI and CrossFire tech/marketing.



    I have to check whether Intel's Conroe chipsets offer x16 + x16, but the Xeon chipsets certainly don't, from what we can see.



    Quote:
    Originally Posted by Joe_the_dragon


    So people who do high-end 3D work use AMD systems.



    That's kinda right, by association: the kickass nVidia 3D workstation motherboards are for Opterons. The NVIDIA nForce Professional 3600 and 3050 combination has 56 PCI Express lanes and supports ONLY Opterons.

    http://www.nvidia.com/page/pg_20060814366736.html
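    As a rough sketch of what 56 lanes buys you (the 56-lane total is the nForce Professional figure cited above; the per-device splits below are illustrative assumptions, not from a spec sheet):

```python
# Rough PCI Express lane budget for a dual-GPU workstation board.
# The 56-lane total is the nForce Professional 3600 + 3050 figure
# cited above; the per-device allocations are illustrative guesses,
# not from a spec sheet.
TOTAL_LANES = 56

allocations = {
    "GPU slot 1": 16,          # full x16 for the first graphics card
    "GPU slot 2": 16,          # full x16 for the second (SLI) card
    "storage controller": 8,   # hypothetical RAID/SAS card
    "dual NIC": 4,             # hypothetical network controller
}

used = sum(allocations.values())
spare = TOTAL_LANES - used
print(f"used {used} of {TOTAL_LANES} lanes, {spare} to spare")
```

    Even with two full x16 slots there is room left over, which is the "few more PCIe lanes to spare" point from earlier in the thread.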



  • Reply 64 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman


    Yeah. In any case, the chipsets are tied to the mobos. The boards for Intel and AMD built with ATI or nVidia chipsets, not Intel chipsets, have true dual PCI Express x16 lanes, plus a few more PCIe lanes to spare. The x16 + x16 arrangement is mainly about SLI and CrossFire tech/marketing.



    I have to check whether Intel's Conroe chipsets offer x16 + x16, but the Xeon chipsets certainly don't, from what we can see.







    That's kinda right, by association: the kickass nVidia 3D workstation motherboards are for Opterons. The NVIDIA nForce Professional 3600 and 3050 combination has 56 PCI Express lanes and supports ONLY Opterons.

    http://www.nvidia.com/page/pg_20060814366736.html






    Just a note: I recently read two things of interest.



    The first was that Quad SLI is a bust, with no real performance advantage while consuming a vast amount of power and dissipating an equal amount of heat.



    And second, that only a few thousand SLI machines and mobos have been sold.



    It doesn't seem to be too popular.
  • Reply 65 of 87
    Quote:
    Originally Posted by melgross


    Just a note: I recently read two things of interest.



    The first was that Quad SLI is a bust, with no real performance advantage while consuming a vast amount of power and dissipating an equal amount of heat.



    And second, that only a few thousand SLI machines and mobos have been sold.



    It doesn't seem to be too popular.



    Yes, the Quad SLI reviews were devastating. nVidia is still pushing it heavily:

    http://www.slizone.com/object/slizone_gf7950_gx2.html



    Hopefully the driver updates will really get some kickass Quad SLI going.



    Actual SLI and CrossFire systems have been niche, but their marketing power is very strong. It influences GPU purchasers even if they don't end up buying an SLI motherboard, IMHO.



    What IS interesting is nVidia's 7950 GX2 "dual-core" card: its performance is astonishing, and the price for that performance is excellent. It outstrips what you would get from two separate 7950 boards hooked up via SLI across two PCIe slots.



    Ironically, it is not SLI but nVidia's multicore offerings on ONE CARD that will carry them through from 2007 onwards.



    SLI and Crossfire will continue to be "wow factor" marketing for ATI and nVidia. It will also just be there for hardcore enthusiast bragging rights.



    nVidia offering two to four 65nm cores on a single board will be the next breakthrough that's needed for them to push forward in 2007 and 2008*. This might show up in the later part of the 8-series cards, alongside support for AA and OpenEXR HDR which should be in the 8-series cards from the get-go. Then there's also DirectX10 support...



    *The implementation of the "dual-core" 7950GX2 shows that they can take multicore GPUs beyond the 1.5x performance level for two GPUs - given NO multicore-GPU-optimisations in games...! Hopefully they can refine this to hit 1.8x performance for two GPUs and something like 3.6x performance in four-core cards -- again given NO multiGPU optimisations in most games.
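    Those scaling figures can be put into a quick back-of-the-envelope model (the per-GPU efficiency numbers are the post's own hypotheticals, not benchmarks):

```python
def multi_gpu_speedup(num_gpus, extra_gpu_efficiency):
    """Naive scaling model: the first GPU counts fully, and each
    additional GPU contributes only a fraction of its throughput."""
    return 1.0 + (num_gpus - 1) * extra_gpu_efficiency

# Today: two GPUs at ~1.5x total means each extra GPU adds ~0.5x.
print(round(multi_gpu_speedup(2, 0.5), 2))   # 1.5

# Hoped-for refinement: 1.8x for two GPUs means ~0.8x per extra GPU...
print(round(multi_gpu_speedup(2, 0.8), 2))   # 1.8

# ...but at that same efficiency, four GPUs only reach 3.4x, so the
# 3.6x target quietly assumes scaling improves again with more cores.
print(round(multi_gpu_speedup(4, 0.8), 2))   # 3.4
```

    Under this simple model, hitting 3.6x with four cores needs per-extra-GPU efficiency of about 0.87, a bit beyond the 0.8 hoped for with two.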
  • Reply 66 of 87
    SLI on nVidia 6-series and 7-series cards has been hit-and-miss across various games. I was looking at 6600GT SLI results and they were somewhat disappointing. I don't think I'll upgrade my video card anytime soon; the 7-series is kinda not worth it until the DirectX 10 kickass stuff arrives, which will require me to dump my AMD64 and go to Conroe (or what comes after that), 2(4?)GB RAM, and a 15,000rpm hard drive... Hmm... Anyway, I've been away from my gaming rig for exactly two months now. It's in Malaysia; I'm in Melbourne for possibly a few more months, maybe more if I can settle into a decent job, and OMFG THANK GOD I don't have to go back and live in Malaysia (I like white people Nah, I'm just confused from living in too many different cities in my 28 years of life). Heh.
  • Reply 67 of 87
    The reality is that SLI is beset with technical issues that make it impractical, and it fails to live up to its hype. In most cases you pay twice the money, heat, and power but get little-to-no additional performance. Next year's GPU will pretty much always beat this year's GPU in SLI form.



    This isn't entirely a given, but more design effort has to be put into the architecture than nVidia or ATI seem willing to do.



    Note: SLI is not the same as a multi-core single board implementation.
  • Reply 68 of 87
    If I do get back to my gaming rig by the end of 2006, though, my Christmas present to myself could be the 7950GX2 (hopefully a GT version). But then again, I'll probably leave the GPU alone and just do a heatsink upgrade on the 6600GT, clean up the AMD64's Zalman "copper-flower" heatsink, and get 1GB more of Kingston DDR400 ValueRAM to go to 2GB: run it at about 2.16GHz CPU with RAM at 392MHz (slow, but made up for by tight timings of 2.5-3-3-7; Intel boards' DDR2 runs at 800MHz or so, but with timings like 4-4-4-12)... I'll leave the hard disk at 7200rpm/80GB. Well, decisions, decisions... Anyway, first I gotta see if I can get a reasonable, sustainable job here in Melbourne!!
  • Reply 69 of 87
    Quote:
    Originally Posted by Programmer


    The reality is that SLI is beset with technical issues that make it impractical, and it fails to live up to its hype. In most cases you pay twice the money, heat, and power but get little-to-no additional performance. Next year's GPU will pretty much always beat this year's GPU in SLI form.



    This isn't entirely a given, but more design effort has to be put into the architecture than nVidia or ATI seem willing to do.



    Yeah, at the end of the day this seems to be the case.
  • Reply 70 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman


    Yes, the Quad SLI reviews were devastating. nVidia is still pushing it heavily:

    http://www.slizone.com/object/slizone_gf7950_gx2.html



    Hopefully the driver updates will really get some kickass Quad SLI going.



    Actual SLI and CrossFire systems have been niche, but their marketing power is very strong. It influences GPU purchasers even if they don't end up buying an SLI motherboard, IMHO.



    What IS interesting is nVidia's 7950 GX2 "dual-core" card: its performance is astonishing, and the price for that performance is excellent. It outstrips what you would get from two separate 7950 boards hooked up via SLI across two PCIe slots.



    Ironically, it is not SLI but nVidia's multicore offerings on ONE CARD that will carry them through from 2007 onwards.



    SLI and Crossfire will continue to be "wow factor" marketing for ATI and nVidia. It will also just be there for hardcore enthusiast bragging rights.



    nVidia offering two to four 65nm cores on a single board will be the next breakthrough that's needed for them to push forward in 2007 and 2008*. This might show up in the later part of the 8-series cards, alongside support for AA and OpenEXR HDR which should be in the 8-series cards from the get-go. Then there's also DirectX10 support...



    *The implementation of the "dual-core" 7950GX2 shows that they can take multicore GPUs beyond the 1.5x performance level for two GPUs - given NO multicore-GPU-optimisations in games...! Hopefully they can refine this to hit 1.8x performance for two GPUs and something like 3.6x performance in four-core cards -- again given NO multiGPU optimisations in most games.



    Back in the old days, when GPUs had no heat sinks, the high-performance boards of the day (all for Apple, by the way!) had two and four cores.



    My theory of progress is that it is like a compressed spring.



    As we go around the wire, we come back to what we did before, except it's a bit different and, hopefully, better. Fashion is like that too.
  • Reply 71 of 87
    nVidia and ATI need some of those sweet, sweet Intel performance-per-watt, low-nm multicore goodies shoved up their pipelines...!!!
  • Reply 72 of 87
    Quote:
    Originally Posted by melgross


    Back in the old days, when GPUs had no heat sinks, the high-performance boards of the day (all for Apple, by the way!) had two and four cores.



    My theory of progress is that it is like a compressed spring.



    As we go around the wire, we come back to what we did before, except it's a bit different and, hopefully, better. Fashion is like that too.



    Cool 8) That explains the bellbottoms you're wearing.
  • Reply 73 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by Programmer


    The reality is that SLI is beset with technical issues that make it impractical, and it fails to live up to its hype. In most cases you pay twice the money, heat, and power but get little-to-no additional performance. Next year's GPU will pretty much always beat this year's GPU in SLI form.



    This isn't entirely a given, but more design effort has to be put into the architecture than nVidia or ATI seem willing to do.



    Note: SLI is not the same as a multi-core single board implementation.



    Sometimes, it only takes six to nine months to reach the same performance level, at less cost.



    It always seemed to me to be a better idea to buy the best card, and hold on to it longer.
  • Reply 74 of 87
    Heh. What do you mean, no heatsinks!!! Everything has to have heatsinks on it nowadays!!! OMFG, a naked chip with no heatsink!!! ARHGHG SCARY!!
  • Reply 75 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman


    Cool 8) That explains the bellbottoms you're wearing.



    Er. USED to wear.
  • Reply 76 of 87




    Did I miss something?



    The multi-board GPU reviews I've read suggested that the FPS, at the high end, were limited at the CPU end and not the GPU end (i.e., the CPU is starving the GPU).



    I guess I need to revisit some review sites!
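    That starvation point can be sketched as a toy frame-rate model (all the numbers below are made up for illustration):

```python
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    """Whichever side is slower caps the frame rate: the GPU can't
    draw frames faster than the CPU can prepare them, and vice versa."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Hypothetical numbers: a CPU able to prepare 90 frames per second.
CPU_LIMIT = 90

single_gpu = effective_fps(CPU_LIMIT, gpu_fps_limit=70)   # GPU-bound
dual_gpu = effective_fps(CPU_LIMIT, gpu_fps_limit=130)    # CPU-bound
print(single_gpu, dual_gpu)   # 70 90

# Nearly doubling GPU throughput bought only 90/70, about 1.3x, because
# the bottleneck moved to the CPU -- exactly the reviewers' point.
```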



  • Reply 77 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by franksargent






    Did I miss something?



    The multi-board GPU reviews I've read suggested that the FPS, at the high end, were limited at the CPU end and not the GPU end (i.e., the CPU is starving the GPU).



    I guess I need to revisit some review sites!







    What you're saying is true. It's always been a back-and-forth issue.



    But you lose efficiency when you go to two or more boards as well.



    The other question is just how fast is fast enough?



    I often think they're going about it the wrong way. Higher rez doesn't make it more photorealistic.



    Even VHS, at 240 x 480, was more photorealistic than even the highest 3D game and board combos are. To get that photorealism, other paths must be taken. They will never get there on the one they are taking now.



    One of the most important obstacles on the road is physics. Without proper physics covering each and every pixel, we will never get realistic movement. We notice motion more than we register detail, which matters less.



    Another thing we need done for every pixel is source originated ray-traced light.



    Then, real texturing. Right now, texture isn't real; it's usually a bump map, even if it is generated on the fly. We need texturing that is surface-locked and determined. It must have height that exists from all angles. Notice that a ball may have a surface texture, but the edge is smooth? Fix it.



    One of the last things is transparency of real objects. Example: Light travels a short distance through our skin. It varies depending on the kind of light, the intensity, and the color. Who we are also modifies that.



    These things, plus some others, must be fixed before anything they do will look real.



    Resolution, antialiasing, and other techniques they use now won't do it. They help, but they aren't the main problems.
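    The skin-transparency point has a standard first-order model: Beer-Lambert absorption. A minimal sketch (the absorption coefficients are illustrative, not measured skin data, and real subsurface scattering also involves scattering, not just absorption):

```python
import math

def transmitted_fraction(absorption_per_mm, depth_mm):
    """Beer-Lambert law: the fraction of light that survives a straight
    path of the given depth through a purely absorbing material."""
    return math.exp(-absorption_per_mm * depth_mm)

# Illustrative absorption coefficients (not measured data): red light
# penetrates tissue more deeply than green or blue, which is why a
# backlit ear glows red rather than blue.
for name, mu in [("red", 0.5), ("green", 1.5), ("blue", 3.0)]:
    print(f"{name}: {transmitted_fraction(mu, 1.0):.0%} survives 1 mm")
```

    A renderer that evaluates something like this per wavelength, per pixel, gets the reddish translucency of skin that flat shading can't produce.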
  • Reply 78 of 87
    Quote:
    Originally Posted by melgross


    What you're saying is true. It's always been a back-and-forth issue.



    But you lose efficiency when you go to two or more boards as well.



    The other question is just how fast is fast enough?



    I often think they're going about it the wrong way. Higher rez doesn't make it more photorealistic.



    Even VHS, at 240 x 480, was more photorealistic than even the highest 3D game and board combos are. To get that photorealism, other paths must be taken. They will never get there on the one they are taking now.



    One of the most important obstacles on the road is physics. Without proper physics covering each and every pixel, we will never get realistic movement. We notice motion more than we register detail, which matters less.



    Another thing we need done for every pixel is source originated ray-traced light.



    Then, real texturing. Right now, texture isn't real; it's usually a bump map, even if it is generated on the fly. We need texturing that is surface-locked and determined. It must have height that exists from all angles. Notice that a ball may have a surface texture, but the edge is smooth? Fix it.



    One of the last things is transparency of real objects. Example: Light travels a short distance through our skin. It varies depending on the kind of light, the intensity, and the color. Who we are also modifies that.



    These things, plus some others, must be fixed before anything they do will look real.



    Resolution, antialiasing, and other techniques they use now won't do it. They help, but they aren't the main problems.







    IMHO, the game makers have a long, Long, LONG way to go on the physics end of things.



    Take textile modeling: you know, clothing.



    Try to do that as a single (static) frame: no motion, no friction, no wind, no fluid effects (i.e., rain and/or being partially/fully immersed), just gravity.



    Even that problem is too probabilistic in nature. Compare it to the real-world empirical dataset you're modeling: calculate 3D spectral moments (M0, M1, M2, ...), or close one eye and calculate 2D moments. Compare. You're doing really, Really, REALLY well if you can get good agreement on the mean position (M0). Now compare the higher moments. OUCH!



    Oh, did I mention we all have TWO eyes? No perspective, no depth.



    Your eyes/brain are marvelous 3D high resolution RT analog filters! And we all have had (or will have) years to calibrate and integrate that cognitive system!
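    The moment comparison above can be sketched in a few lines (the drape heights are hypothetical, and the M0/M1/M2 numbering follows the post, where M0 is the mean position):

```python
def sample_moments(samples):
    """Mean and the next two central moments of a 1-D sample set.
    Numbering follows the post: M0 is the mean position; M1 and M2
    are spread (variance) and asymmetry (third central moment)."""
    n = len(samples)
    m0 = sum(samples) / n
    m1 = sum((x - m0) ** 2 for x in samples) / n
    m2 = sum((x - m0) ** 3 for x in samples) / n
    return m0, m1, m2

# Hypothetical drape heights (cm) sampled at matching points on a real
# piece of cloth and on a simulated version of the same cloth.
real = [4.0, 4.2, 3.9, 4.1, 4.3, 3.8]
simulated = [4.0, 4.0, 4.1, 4.0, 4.1, 3.9]

real_m = sample_moments(real)
sim_m = sample_moments(simulated)
print("real:      M0=%.3f M1=%.4f M2=%.5f" % real_m)
print("simulated: M0=%.3f M1=%.4f M2=%.5f" % sim_m)

# The means (M0) agree closely, but the higher moments diverge: this
# simulated drape is "flatter" than the real fabric -- the OUCH above.
```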



  • Reply 79 of 87
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by franksargent






    IMHO, the game makers have a long, Long, LONG way to go on the physics end of things.



    Take textile modeling: you know, clothing.



    Try to do that as a single (static) frame: no motion, no friction, no wind, no fluid effects (i.e., rain and/or being partially/fully immersed), just gravity.



    Even that problem is too probabilistic in nature. Compare it to the real-world empirical dataset you're modeling: calculate 3D spectral moments (M0, M1, M2, ...), or close one eye and calculate 2D moments. Compare. You're doing really, Really, REALLY well if you can get good agreement on the mean position (M0). Now compare the higher moments. OUCH!



    Oh, did I mention we all have TWO eyes? No perspective, no depth.



    Your eyes/brain are marvelous 3D high resolution RT analog filters! And we all have had (or will have) years to calibrate and integrate that cognitive system!







    That's correct. It's a problem as well. But at least a camera, when using the correct lens, can offer appropriate perspective, which helps there. This has been known for a long time, and it's how "normal" focal lengths are determined.
  • Reply 80 of 87
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by melgross


    The other question is just how fast is fast enough?



    I often think they're going about it the wrong way. Higher rez doesn't make it more photorealistic.

    ...

    One of the most important obstacles on the road is physics

    ...

    Another thing we need done for every pixel is source originated ray-traced light.

    ...

    Then, real texturing.

    ...

    One of the last things is transparency of real objects.

    ...

    These things, plus some others, must be fixed before anything they do will look real.

    ...

    Resolution, antialiasing, and other techniques they use now won't do it. They help, but they aren't the main problems.



    Yes, but the real problem isn't answered by any of that stuff. Do those things really make a better game? Sure, it's a better-looking game, but games already look pretty good. Does getting all the way to "indistinguishable from reality" (if that is your goal) make the game better?