AMD reveals plans to retire ATI graphics brand

Comments

  • Reply 21 of 54
    dualie Posts: 334 member
    Quote:
    Originally Posted by patrickwalker


    I don't really buy it. ATI does indeed have a strong brand name.



    Reminds me of the whole Coors-Molson takeover. Yet another example of how foreign ownership is destroying Canada.





    You need to do a little research before you opine.



    It's called the Molson Coors Brewing Company, not the other way around, and it was a MERGER, not a takeover. The new company has TWO HEAD OFFICES, one in Montréal and one in Denver.



    Quote:

    The company's binational nature has caused several sources in both the U.S. and Canada to claim the company is now controlled by interests in the other country.[5][6] In fact, while the company is nominally incorporated in the United States,[7] the company is traded on stock exchanges in both countries,[8] and control is equally shared between the Molson and Coors families.[9] The company is headquartered in Montreal, QC and Denver, CO.





    http://en.wikipedia.org/wiki/Molson_Coors
  • Reply 22 of 54
    ronbo Posts: 669 member
    Quote:
    Originally Posted by AppleInsider


    AMD said it conducted research that found ... that consumer preference toward ATI triples when they are aware of the ATI-AMD merger.



    Does anybody else find this fishy? I would very much like to see the (loaded) questionnaire that produced this tripling.



    This basically says that for every person who prefers ATI (presumably over nVidia), there are two other people who didn't already know AMD owns it, who initially preferred nVidia over ATI, but who like AMD so much they then switch their preference to ATI on the basis of that knowledge.



    It also implies that ATI is a disliked brand. The initial ATI-vs-nVidia preferences must have been hugely skewed in favor of nVidia, since ATI's virgin number has to have room to triple, and there's no way in hell it tripled from 33% to 99%, or else they'd have told us. It probably tripled from 20% to 60%, or 15% to 45%... could ATI really be that disliked?
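
    A quick back-of-the-envelope sketch makes the point (Python; the starting shares below are made up for illustration, since AMD didn't publish any numbers):

        # If ATI's preference share triples once respondents learn of the
        # AMD merger, what fraction of the people who did NOT initially
        # prefer ATI must have switched? Starting shares are hypothetical.
        for before in (0.15, 0.20, 0.25, 0.30):
            after = 3 * before                    # the claimed tripling
            switched = after - before             # share of all respondents who flipped
            flip_rate = switched / (1 - before)   # fraction of non-ATI respondents who flipped
            print(f"{before:.0%} -> {after:.0%}: {flip_rate:.0%} of the non-ATI crowd switched")

    At a plausible 20% starting share, fully half of the people who initially preferred something else would have had to flip on brand awareness alone.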



    I call shenanigans. You'd have to be a pretty serious AMD fan to switch preferences like that (unless the questionnaire was VERY craftily crafted), and what serious AMD fan didn't already know they owned ATI?
  • Reply 23 of 54
    ronbo Posts: 669 member
    Quote:
    Originally Posted by jayparry


    WTH is a watershed moment?? why do people talk like this!



    For much the same reason that some people don't. Vocabulary size has been shown to correlate fairly strongly with intelligence.
  • Reply 24 of 54
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by Newtron


    From the story:



    "AMD said it conducted research that found its brand is stronger than ATI, and that consumer preference toward ATI triples when they are aware of the ATI-AMD merger."



    Ask your parents about one of the most famous examples of market research in history--New Coke. Market research showed that people thought that New Coke tasted better than the old Coke. When Coca Cola made the switch, it turned into one of the greatest marketing disasters in history. In the minds of the buying public, AMD makes processors. AMD processors are the only alternatives to Intel. Many computer nerds argue that AMD processors are better than Intel processors. Nobody argues that AMD graphics chips are better than anything, because there aren't any AMD graphics chips.



    ATI and nVidia are peer rivals. In making this move, AMD switches its brand from one of the two recognized leaders in the graphics market to a brand that is unknown in the market for graphics processors. The move is both arrogant and unnecessary.



    There is another problem with this move:



    [FUD]

    AMD graphics chips don't work with Intel processors.

    [/FUD]



    You know it's coming. When it comes, the FUD will be corrosive to AMD's marketing efforts. A lot of Intel processor buyers will not purchase AMD graphics processors precisely because they fear that AMD is incompatible with Intel.



    While AMD combats the FUD, nVidia will step just a bit farther ahead.
  • Reply 25 of 54
    sheff Posts: 1,407 member
    Amazing. While I was a PC gamer I was an AMD/ATI fan. The PCs I had, custom and non-custom alike, were all ATI/AMD. That changed when Intel came out with Core, and especially Core 2, and I had no choice but to go that way. Apple made the right choice by going with the Core chips at the time.



    Now I feel like AMD is making a slight comeback with the newer Opterons and Athlon IIs. Not to say that there is complete parity, but it's much closer now, I feel, than a couple of years ago. We'll see if Apple and AMD join forces, but I will be pretty excited if they do.
  • Reply 26 of 54
    Quote:
    Originally Posted by xsu


    As far as graphics hardware is concerned, ATI has a heck of a lot more brand recognition than AMD. Especially in the discrete graphics card market, people look for ATI, not AMD. This is going to cause a great deal of unnecessary confusion and erosion of value. Another stupid marketing move from AMD.



    I would agree with you, but I think the Radeon brand is powerful as well. AMD's market research seems to make sense to me. With the Radeon brand, and the red logo, I think it will be very clear to consumers what happened to ATi and what they are getting.



    They also need a singular brand to fit their Fusion product strategy. How can you claim that your GPU, CPU, and chipset are integrated and designed to work together when you use two brand names for them?



    Enthusiasts who follow graphics cards are likely to be aware of the merger, especially if they have downloaded drivers in the last few years. This change mostly benefits non-enthusiasts who only know of AMD and Intel.



    At the end of the day, if AMD's new CPU/GPU combo reduces power consumption, you'll be seeing it in a lot of new computers and that in itself will aid the brand the most.
  • Reply 27 of 54
    docno42 Posts: 3,760 member
    Quote:
    Originally Posted by Mr. Me


    When Coca Cola made the switch, it turned into one of the greatest marketing disasters in history



    Really? The "cola wars" were at a pretty big stalemate at that point in time. The whole "New Coke" fiasco provided some much needed fodder to make soft drinks interesting again.



    Not that I think Coke was smart enough to plan it out, but it certainly didn't hurt them - they had a market share bump *after* the whole "New Coke" thing, and it let them swap cane sugar for cheaper HFCS when they brought back Coke Classic as well.



    Crazy like a fox...
  • Reply 28 of 54
    doc362 Posts: 43 member
    This just makes me sad.



    When I hear ATI I think of performance. I still think of low-rate Pentium upgrade chips every time I hear someone mention AMD. It doesn't inspire much confidence.



    Also, I don't think AMD and Apple will ever team up. Intel is the pioneer. AMD has been and always will be a step behind. AMD is for budget-minded consumers.
  • Reply 29 of 54
    docno42 Posts: 3,760 member
    Quote:
    Originally Posted by doc362


    When I hear ATI I think of performance. I still think of low-rate Pentium upgrade chips every time I hear someone mention AMD. It doesn't inspire much confidence.



    Only because you are ignorant of history. For example, Intel copied its scheme for grafting 64-bit support onto the x86 architecture from AMD. AMD was the first to have true multi-core CPUs. Without pressure from AMD, the entire Pentium 4 fiasco would probably have dragged on longer and the Core series wouldn't have come to market as quickly. AMD lit a real fire under Intel that snapped them out of some pretty heady arrogance!



    Quote:

    Also, I don't think AMD and Apple will ever team up.



    They already have and do. I'm anxiously awaiting the 5870 upgrade kit to be released for my Mac Pro!



    Quote:

    Intel is the pioneer. AMD has been and always will be a step behind. AMD is for budget-minded consumers.



    Intel has the marketing hype. AMD has been ahead of Intel many times over the years. Intel just has far more resources and lots more marketing, and hence mindshare - but AMD can and does hold its own on the high end. AMD's weakness (much like Motorola's and IBM's) has been low-power chips for laptops. This is probably the biggest reason we don't see Apple offering AMD chips - however, if that were to change, with AMD producing a good mobile CPU and Apple having less reason to worry about offending Intel, don't think Apple wouldn't start shipping Macs with AMD chips in an instant. Apple was the first major manufacturer using Intel chips not to go along with the whole "Intel Inside" marketing...
  • Reply 30 of 54
    sheff Posts: 1,407 member
    Quote:
    Originally Posted by doc362


    This just makes me sad.



    When I hear ATI I think of performance. I still think of low-rate Pentium upgrade chips every time I hear someone mention AMD. It doesn't inspire much confidence.



    Also, I don't think AMD and Apple will ever team up. Intel is the pioneer. AMD has been and always will be a step behind. AMD is for budget-minded consumers.



    I've had awesome gaming rigs with AMD inside. In some ways they were even faster than Intel, with HyperTransport, 64-bit, and more L2 cache, though often lower clock speeds (Intel had 32-bit P4s at the time).



    Only with the intro of Core 2 did Intel blow away AMD, because AMD did not have the money to move to 45nm and reduce the heat. AMD pushed dual-core and 64-bit before Intel (though Intel had some 64-bit processors before Core 2, they were not marketed as such).



    If AMD had a customer like Apple that worked with them, I think AMD could be quite the contender for the desktop CPU crown.
  • Reply 31 of 54
    I hate how graphics card model numbers increase and decrease without any apparent pattern. It is very counter-intuitive, and you have to know all the different families to tell whether one card is better than another!
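
    Just to show what I mean, here's a rough sketch (Python; the digit meanings follow ATI's HD-era convention of first digit = generation, second digit = market tier, and the example cards are mine, for illustration only):

        # Decode a Radeon "HD xyz0" model number into (generation, tier).
        # A bigger overall number does not mean a faster card: a newer
        # generation's mid-tier part can trail an older generation's
        # high-tier part.
        def parse_model(model):
            digits = "".join(ch for ch in model if ch.isdigit())
            return int(digits[0]), int(digits[1])  # (generation, tier)

        for card in ("HD 4870", "HD 5670"):
            gen, tier = parse_model(card)
            print(f"{card}: generation {gen}, tier {tier}")
        # The HD 5670 is newer (generation 5) but lower-tier (6) than the
        # HD 4870 (generation 4, tier 8) - and the 4870 is generally faster.

    So the 5670 beats the 4870 numerically, but not in performance.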
  • Reply 32 of 54
    This is probably a smart move. AMD should have done this a while ago.
  • Reply 33 of 54
    All I've ever known growing up was ATI vs. nVidia.



    I'm an nVidia fanboy and ATI was often known to have driver issues.



    I'm guessing that with the rise of laptops offering discrete graphics, sales of discrete graphics cards for desktop machines could be falling. If people aren't going out and looking for a graphics card, AMD has no reason to keep the ATI name.



    Just an uneducated guess though...
  • Reply 34 of 54
    sheff Posts: 1,407 member
    Quote:
    Originally Posted by urbansprawl


    All I've ever known growing up was ATI vs. nVidia.



    I'm an nVidia fanboy and ATI was often known to have driver issues.



    I'm guessing that with the rise of laptops offering discrete graphics, sales of discrete graphics cards for desktop machines could be falling. If people aren't going out and looking for a graphics card, AMD has no reason to keep the ATI name.



    Just an uneducated guess though...



    I was always an ATI fanboy, but yeah, outside of Windows their drivers were really shitty. Mostly because they were smaller and had fewer resources to code for Linux. I believe they've had their share of Mac driver problems with iMacs, but I think so did nVidia.
  • Reply 35 of 54
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by DocNo42


    Really? The "cola wars" were at a pretty big stalemate at that point in time. The whole "New Coke" fiasco provided some much needed fodder to make soft drinks interesting again.



    ...



    Crazy like a fox...



    I didn't know for a fact that there were people who thought the New Coke fiasco was a good thing, but I always believed those people existed. By this logic, AMD is switching its graphics card brand to AMD in a devious scheme to revitalize the ATI brand.
  • Reply 36 of 54
    Quote:
    Originally Posted by sheff


    I was always an ATI fanboy, but yeah, outside of Windows their drivers were really shitty. Mostly because they were smaller and had fewer resources to code for Linux. I believe they've had their share of Mac driver problems with iMacs, but I think so did nVidia.



    nVidia has done nothing but shoot blanks on the Mac platform. It wasn't long ago that $75 ATI cards were destroying $400 nVidia cards in basic Mac benchmark tests. Recently my brother showed me two tests of a 2009 Mac Pro with an nVidia GTX 285 versus a first-gen Mac Pro with a Radeon X1900. In gaming the new rig was way faster than the old one, as expected. But he then showed me a couple of benchmark tests in which the 2006 machine was faster. There has to be something seriously wrong with your drivers when a card that's three times as fast as the competition loses any test.
  • Reply 37 of 54
    Marvin Posts: 15,476 moderator
    Quote:
    Originally Posted by jayparry


    WTH is a watershed moment?? why do people talk like this!



    It's a turning point for them. AMD and ATI used to be separate; then AMD bought them, and the turning point - or watershed - for the company is that they are putting the CPU and GPU together into one package and rebranding as AMD.



    Ultimately, all computers will work this way. When integrated graphics get to the point where Crysis-level graphics can run at native resolution and high quality, that's good enough for pretty much everyone and dedicated chips will be relegated to the very small enthusiast crowd. The 320M and 5650 are fine at medium quality, so if AMD can equal that inside a quad-core CPU, possibly exceed it, it's only maybe 2 or 3 iterations away from being the ultimate low-end chip.



    Memory manufacturers will do the same, so SSD and RAM will be on the same chip, or even be the same chip, if they can get NVRAM in high enough densities at a low enough price.



    nVidia are done for in all this, though. Once they get shunted to the enthusiast crowd only, their revenue will plummet, and then in 5 years they will be bought up for a low price for their IP. It's a shame really, as they've done a lot for the industry.
  • Reply 38 of 54
    hattig Posts: 860 member
    Quote:
    Originally Posted by Mr. Me


    Ask your parents about one of the most famous examples of market research in history--New Coke. Market research showed that people thought that New Coke tasted better than the old Coke. When Coca Cola made the switch, it turned into one of the greatest marketing disasters in history.



    Let's assume that the Coca Cola company has no citrusey lemonadey type drink on the market.



    Coca Cola buys Lilt, and continues to sell it as Lilt. Brand recognition of Lilt remains fairly low, but high in the citrusey lemonadey fan area of the market.



    Coca Cola rebrands Lilt as "Lilt from Coca Cola". Market perception changes - Coca Cola makes Lilt, I might try this, I like Coke. The Lilt fanboys cry for a little while, but the larger market brand of Coca Cola is better for Lilt overall.



    Now, if the Radeon HD 6xxx series coming out later this year turns out to be rubbish (leaked benchmarks suggest otherwise), then it will be a New Coke level of failure.
  • Reply 39 of 54
    hattig Posts: 860 member
    Quote:
    Originally Posted by doc362


    This just makes me sad.



    When I hear ATI I think of performance. I still think of low-rate Pentium upgrade chips every time I hear someone mention AMD. It doesn't inspire much confidence.



    Also, I don't think AMD and Apple will ever team up. Intel is the pioneer. AMD has been and always will be a step behind. AMD is for budget-minded consumers.



    You appear to have missed out on the years 1999 to 2006, when AMD offered better products than Intel. Innovations included, but weren't limited to:



    HyperTransport (Intel stuck with a limited FSB until Nehalem).

    64-bit x86 (Intel forced to adopt, IA64 died as a result).

    Multi-core (true dual-core Opterons in 2005).

    Cheap scalable servers (enabled by HyperTransport, the use of commodity memory like DDR2, etc. - Intel chased expensive FB-DIMMs and the like instead).



    Intel did wake up eventually.



    "Low rate pentium upgrade chips", it's not 1997 anymore.
  • Reply 40 of 54
    ssquirrel Posts: 1,196 member
    Quote:
    Originally Posted by Marvin


    When integrated graphics get to the point where Crysis-level graphics can run at native resolution and high quality, that's good enough for pretty much everyone and dedicated chips will be relegated to the very small enthusiast crowd.



    Dedicated graphics are already relegated to a small enthusiast crowd. Lots of IGP solutions and low-end dedicated cards (although far fewer than IGP) from NVIDIA and ATI make up the video in most computer sales currently. Very few people go out and buy a new $600 video card. I tend to wait for a bit of trickle-down and get the best card I can for about $200 or so. I'm a little behind the curve that way, but that doesn't bother me. I make my video cards last at least 3 years.