AMD Radeon 6000 'Barts' specs

Posted in Future Apple Hardware
AMD has quite a few aces up its sleeve for the next 12 months.



First off is the Radeon 6000, which doesn't bring a die shrink (TSMC manufactures these, and scrapped its 32nm node in favor of fast-forwarding to 28nm), but is instead a refined 40nm chip: the same process as the Radeon 5000 cards, but a more mature chip design. Mac users, you won't be seeing these for at least 10 months, but the rest of the world will see them in a few weeks.



http://www.fudzilla.com/graphics/gra...s-specs-leaked

Quote:

The new Barts series should succeed Juniper-based HD 5770 and HD 5750 boards, but it will deliver a significant performance boost. The biggest difference is the transition to a 256-bit memory bus, which will provide quite a bit more bandwidth than the 128-bit Juniper series. Both boards should feature 1GB of memory, but we’re still not sure about memory clocks, as we’re getting some mixed information.



Barts XT will feature 12 SIMDs, 960 shaders, 48 texture units and 32 ROPs. For comparison, Juniper XT packs 800 shaders and 40 texture units, so Barts will have 20 percent more of everything. The core should end up clocked at 850MHz and the TDP is rated at over 150W, which is quite a bit more than the HD 5770.



Barts PRO packs 800 shaders, 40 texture units and 32 ROPs, much like the current Juniper XT, but it will also have a wider 256-bit bus. It will be clocked between 700MHz and 725MHz and its TDP should end up below 150W.



AMD is pitting both boards against Nvidia’s GTX 460: Barts PRO will take on the 768MB version of Nvidia’s GTX 460, while Barts XT should fight the 1GB version. Judging by the spec, Barts boards should be able to outperform the GTX 460 and we wouldn't be surprised if they match HD 5830 or even 5850 performance in some scenarios, thanks to higher clocks and the 256-bit bus.



Note that these are sub-$200, midrange cards.
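
To put the bus-width change in perspective, here is a quick back-of-the-envelope bandwidth calculation. The article says memory clocks are still unconfirmed, so the 4800MT/s figure below is the HD 5770's GDDR5 clock used as a stand-in assumption; the formula is the point, not the exact numbers.

```python
def mem_bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Peak memory bandwidth: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_mt_s / 1000

# Juniper (HD 5770): 128-bit bus, GDDR5 at 1200MHz = 4800MT/s effective
print(mem_bandwidth_gb_s(128, 4800))  # 76.8 GB/s
# Barts on a 256-bit bus at the same (assumed) memory clock
print(mem_bandwidth_gb_s(256, 4800))  # 153.6 GB/s
```

Doubling the bus width doubles peak bandwidth at any given memory clock, which is why these boards could plausibly push into 5850 territory.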



Comments

  • Reply 1 of 13
    I hope you're wrong on the timing of when these will show up in Macs.



    It would be nice to see these start showing up in Sandy Bridge Macs starting next year, assuming Apple sticks with Intel... but who knows.



    I won't hold my breath, though.
  • Reply 2 of 13
    Quote:
    Originally Posted by backtomac


    I hope you're wrong on the timing of when these will show up in Macs.



    It would be nice to see these start showing up in Sandy Bridge Macs starting next year, assuming Apple sticks with Intel... but who knows.



    I won't hold my breath, though.



    I expect he said 10 months because of Apple's approximately annual refresh cycle.
  • Reply 3 of 13
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by 1337_5L4Xx0R


    AMD has quite a few aces up its sleeve for the next 12 months.



    It is good that they are firing on all cylinders right now.

    Quote:

    First off is the Radeon 6000, which doesn't bring a die shrink (TSMC manufactures these, and scrapped its 32nm node in favor of fast-forwarding to 28nm), but is instead a refined 40nm chip: the same process as the Radeon 5000 cards, but a more mature chip design. Mac users, you won't be seeing these for at least 10 months, but the rest of the world will see them in a few weeks.



    Don't be so sure about Apple; reports are that drivers are already being updated for the Mac.



    Sometimes I think Apple is becoming more aggressive with their hardware, and other times I think they lay back and enjoy the current brisk sales. If they brought the same enthusiasm to their desktop line that they seem to exhibit in their laptop line, we could see these in Macs soon.

    Quote:



    Note that these are sub-$200, midrange cards.



    The last issue is the one that stood out for me the most: the idea that a midrange graphics card these days is a 150-watt-plus device. This is pretty much out of place when the rest of the platform might be running on 75 to 150 watts. Is it reasonable to call it midrange when the GPU card may be sucking up more than 50% of the power used by the system?
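
    To put rough numbers on that 50 percent claim, here is a quick sketch using only the wattage ranges from the paragraph above; the specific pairings are my own illustration:

    ```python
    def gpu_share(gpu_w: float, rest_of_system_w: float) -> float:
        """Fraction of total system power drawn by the GPU alone."""
        return gpu_w / (gpu_w + rest_of_system_w)

    # A 150W midrange card against the 75-150W platform range cited above
    print(gpu_share(150, 150))  # 0.50 -> half the system's power
    print(gpu_share(150, 75))   # ~0.67 -> two thirds of the system's power
    ```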



    Hopefully AMD is looking at this power usage issue and can make significant strides when moving to the 28nm process. It looks like they have made great strides on the CPU/integrated-GPU side of things, so maybe they can filter some of that tech down to their GPU cards. On the other hand, they already do far better than Nvidia, so maybe they are on the right track. I just don't know how long high-energy-usage cards like this will be accepted for midrange GPU performance.



    Dave
  • Reply 4 of 13
    Quote:
    Originally Posted by wizard69


    The last issue is the one that stood out for me the most: the idea that a midrange graphics card these days is a 150-watt-plus device. This is pretty much out of place when the rest of the platform might be running on 75 to 150 watts. Is it reasonable to call it midrange when the GPU card may be sucking up more than 50% of the power used by the system?



    One thing to note is that the idle power consumption of modern high-end graphics cards is insignificant. They only use lots of power when you're loading the GPU.
  • Reply 5 of 13
    Awesome. The biggest criticism of Juniper was the crippled memory bus that really hampered performance when compared to the 5800 series.



    Going back to 256-bit should bring things close to the excellent 5850 standard.



    Hope the benches are good; we desperately need something below $200 that can do close to 5850-level graphics. The 5850 has been sitting at $300 or so for at least a year because of the lack of competition from Nvidia while they sorted out their Fermi troubles!
  • Reply 6 of 13
    Quote:
    Originally Posted by wizard69


    The last issue is the one that stood out for me the most: the idea that a midrange graphics card these days is a 150-watt-plus device. This is pretty much out of place when the rest of the platform might be running on 75 to 150 watts. Is it reasonable to call it midrange when the GPU card may be sucking up more than 50% of the power used by the system?



    Well, AMD has been the one leading the way since they came out with the 5000 series. They arguably make the best performance-per-watt cards for mainstream-to-enthusiast gaming GPUs. Nvidia only just came to the party with their 460. The AMD 5850, for example, had unheard-of low power and heat for how well it performed.



    It is still a challenge: for any mainstream-to-enthusiast gaming GPU, we're looking at the GPU alone taking up half the power draw at load, sometimes twice as much as the rest of the system.



    AMD is on the right track; it is more up to TSMC to deliver 28nm, and then we'll see some nice performance-per-watt.



    I just wish OpenCL was more widely implemented; right now everything is still so CPU-dependent, and Intel CPUs pack a real punch. Powerful GPUs are still marginalised to gaming PCs, which are competing with console gaming, where I think consoles are winning in terms of overall ease-of-use and number of titles available.



    Fusion will be interesting.



    Quote:
    Originally Posted by wizard69


    Hopefully AMD is looking at this power usage issue and can make significant strides when moving to the 28nm process. It looks like they have made great strides on the CPU/integrated-GPU side of things, so maybe they can filter some of that tech down to their GPU cards. On the other hand, they already do far better than Nvidia, so maybe they are on the right track. I just don't know how long high-energy-usage cards like this will be accepted for midrange GPU performance.



    Well, the thing is, "midrange" is the entry level for any kind of decent gaming performance on a PC. We're talking up to DX10 or DX11 at 1920x1080 (since these monitors are so commonplace). So the AMD 5700 series is what is considered "midrange", maybe the 5600 but less so.



    Anything 5500 or lower, or integrated, just won't cut it for modern 3D gaming unless you turn the settings way down.



    High-energy-usage cards may not be accepted for much longer, since there is such a big shift to laptops, and people may just get a console and play all of the latest titles rather than struggle with upgrading a big, hot desktop box all the time.



    I stopped playing PC games about 6 months ago (I had an AMD dual-core CPU and an ATI 4830 512MB) because I was increasingly frustrated with the cost of PC game titles and the lack of titles such as God of War and possibly the latest Need For Speed (Hot Pursuit or something?). I was also pissed off by the DX10 scam; most games to date are still DX9, and DX10 really kills frame rates. DX11 is supposed to be better because it actually speeds up some processes (eg. hardware tessellation), but DX11 titles will probably be loaded with so much other eye candy that you'll need a beyond-mainstream GPU.
  • Reply 7 of 13
    Quote:

    Hope the benches are good; we desperately need something below $200 that can do close to 5850-level graphics. The 5850 has been sitting at $300 or so for at least a year because of the lack of competition from Nvidia while they sorted out their Fermi troubles!



    Those actually debuted at lower prices, and the prices went up and up over time due to demand and lack of competition! I may buy a 2GB 5850 for my future hackintosh when the 6000 series comes out (because the Mac actually supports the 5xxx series).



    BTW, I mentioned Mac users won't see the Radeon 6xxx for at least ten months because that seems to be the minimum delta between when GPUs are introduced and when they ship in Macs. Note that the 5xxx Radeons just appeared in Macs and are weeks away from obsolescence. Note also the lack of Nvidia GTX 460s.



    I think it takes that long for Apple to vet AMD/Nvidia's drivers and port them to the Mac with support for sleep, etc.



    Dave: keep in mind these were designed for 32nm and 'backported' to 40nm. The 'real' Radeon 6000s debut next year at 28nm.
  • Reply 8 of 13
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by nvidia2008


    Well, AMD has been the one leading the way since they came out with the 5000 series. They arguably make the best performance-per-watt cards for mainstream-to-enthusiast gaming GPUs. Nvidia only just came to the party with their 460. The AMD 5850, for example, had unheard-of low power and heat for how well it performed.



    This is kinda my point. Nvidia lost sight of other important factors, one of which is power usage, at least for the many of us who won't be buying a high-end card anytime soon.



    Of course, winter is coming, so maybe a little space heating is in order.

    Quote:

    It is still a challenge: for any mainstream-to-enthusiast gaming GPU, we're looking at the GPU alone taking up half the power draw at load, sometimes twice as much as the rest of the system.



    That is not a problem for gaming, but if you look at midrange GPUs as mainstream graphics processors it is a different story. Many of us need respectable, cost-effective 3D support.

    Quote:

    AMD is on the right track, it is more up to TSMC to deliver 28nm and then we'll see some nice performance-per-watt.



    I'd like to see AMD make the chips themselves, instead of going with external suppliers. Rumors are that their new process is very nice.

    Quote:

    I just wish OpenCL was more widely implemented; right now everything is still so CPU-dependent, and Intel CPUs pack a real punch.



    GPU processing will only ever benefit certain classes of problems; that is the nature of the SIMD design. AMD has some interesting ideas here for future Fusion processors, but GPU resources will never become a place for general-purpose code.



    Beyond that, OpenCL is catching on and is being used where it makes sense. I think the problem is that OpenCL has never been marketed well to the public. It is a fine technology, but it is not a miracle.
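
    To make the SIMD point concrete, here is a minimal data-parallel sketch using PyOpenCL (the Python binding for OpenCL; the library choice and the toy vector-add workload are my own illustration, not something from the thread). A million work-items all run the same kernel body on different indices, which is exactly the class of problem GPUs handle well; branchy, serial code gains nothing.

    ```python
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # grab any available OpenCL device
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # One kernel, one million work-items: each adds a single pair of elements.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)
    ```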

    Quote:

    Powerful GPUs are still marginalised to gaming PCs, which are competing with console gaming, where I think consoles are winning in terms of overall ease-of-use and number of titles available.



    Your perspective seems to be gaming-oriented. The problem is that in a discussion about Macs the issues have always been driver-related. It looks like Apple is slowly correcting that.

    Quote:

    Fusion will be interesting.



    Yes it will be; I'm hoping AMD can score some design wins with Apple. Even here, expectations are out of control, with comparisons to chips that the Bobcat-based chips were never designed to compete with.

    Quote:

    Well, the thing is, "midrange" is the entry level for any kind of decent gaming performance on a PC. We're talking up to DX10 or DX11 at 1920x1080 (since these monitors are so commonplace). So the AMD 5700 series is what is considered "midrange", maybe the 5600 but less so.



    See, here is the problem: I think of midrange as the middle of the cost spread for GPU cards, no matter what they are used for. So if I'm running a CAD program at home or on a laptop, I expect reasonable performance. No, it won't be the CAD-like performance one gets from a dedicated card, but it should nonetheless be serviceable.

    Quote:

    Anything 5500 or lower, or integrated, just won't cut it for modern 3D gaming unless you turn the settings way down.



    High-energy-usage cards may not be accepted for much longer, since there is such a big shift to laptops, and people may just get a console and play all of the latest titles rather than struggle with upgrading a big, hot desktop box all the time.



    I'm not a console person at all; too many limitations and too high a cost.



    As to desktops, I actually think we might see a shift back to them, as the current trend of trying to stuff a desktop machine into a laptop is a bit boneheaded if you ask me. The iPad is an enabler here and is an ideal companion to a desktop machine, especially with remote access software.

    Quote:

    I stopped playing PC games about 6 months ago (I had an AMD dual-core CPU and an ATI 4830 512MB) because I was increasingly frustrated with the cost of PC game titles and the lack of titles such as God of War and possibly the latest Need For Speed (Hot Pursuit or something?). I was also pissed off by the DX10 scam; most games to date are still DX9, and DX10 really kills frame rates. DX11 is supposed to be better because it actually speeds up some processes (eg. hardware tessellation), but DX11 titles will probably be loaded with so much other eye candy that you'll need a beyond-mainstream GPU.



    Good to hear. Try something physical to keep yourself in shape.



    As a side note, you can get games, very creative games, on the iPad dirt cheap. There is something to be said for making it difficult to steal software, as the developers are doing very well without charging a lot.



    After thinking about it a bit, expense is likely one big reason I never got into gaming big time. It is a bit like an electronic drug for many. Yes, an addiction that drains wallets just as well as crack.
  • Reply 9 of 13
    Quote:

    The problem is that in a discussion about Macs the issues have always been driver-related. It looks like Apple is slowly correcting that.



    It does?!



    Quote:

    I'd like to see AMD make the chips themselves, instead of going with external suppliers. Rumors are that their new process is very nice.



    Are you referring to Global Foundries? AFAICT, AMD no longer fabs anything. GF is pouring $7 billion into a new fab in Abu Dhabi and has three existing fabs around the world (Singapore, Germany, NY IIRC).
  • Reply 10 of 13
    Quote:
    Originally Posted by 1337_5L4Xx0R


    Are you referring to Global Foundries? AFAICT, AMD no longer fabs anything. GF is pouring $7 billion into a new fab in Abu Dhabi and has three existing fabs around the world (Singapore, Germany, NY IIRC).



    Yeah, what's the state of Global Foundries? Are they making GPUs or just CPUs? How is their 28nm process?
  • Reply 11 of 13
    Good news: looks like these cards will be showing up for Macs pretty soon.



    http://www.electronista.com/articles...ng.october.22/
  • Reply 12 of 13
    Historically, Apple is at a minimum 9 months to a year behind in the graphics card arena. Example: right now.



    That said, 6XXX is a refinement of the 5XXX series, so drivers should be much less of an issue.
  • Reply 13 of 13
    emacs72 Posts: 356 member
    AMD has done a pretty good job of advancing Radeon technology. The 6000 series is no exception, and it looks great on paper.