AMD to acquire graphics chip giant ATI


Comments

  • Reply 81 of 146
    vinea Posts: 5,585 member
    Quote:

    Originally posted by Splinemodel

    Well, the electronic hardware industry rags don't concern themselves too much with software. It's the sentiment of mostly embedded software developers, although these days embedded design is more common than ever, and it results in more revenue and more code than high-level software.



    OO in the embedded environment is often a poor fit. To say that OO has failed because it is a poor tool for a specific purpose is as blind as saying Java sucks for everything.



    As for more revenue and more code, I would ask for a reference, as embedded is still a niche arena as near as I can tell. One with high growth, perhaps, but the software world is vast and highly profitable.



    Oh, and a source that ISN'T an embedded industry trade rag. Every industry inflates its own market, and the obvious examples of this self-deception were the telecom industry (whose bubble burst) and the internet industry (dot bomb).



    Quote:

    Increasing globalization of the labor force, coupled with increased focus on embedded devices, is only going to fuel the development of a more efficient paradigm.



    A more efficient paradigm for small embedded programs that solve smaller problems on embedded devices. Not more efficient for large projects. You aren't going to run your core services from an embedded device.



    Globalization is an issue only from the perspective of industrial/technical dominance rather than offshoring. Having worked with Wipro and others, I can say offshoring can have good ROI but must be managed carefully, as many companies have realized. And the benefits are not as large as one might hope (i.e., it's no silver bullet for the developer shortage in the US).



    It is when, say, the Indian or Chinese software industry is dominating the software world that the US software industry will be in danger of collapse (like, say, the US auto industry). Only when there is an Apple or Microsoft equivalent from a foreign nation that dominates software sales will we be in big trouble.



    This isn't to say it can't happen, but it hasn't yet. Japan was going to be our doom in the late '80s with its six-sigma software quality and software factories that churned out millions of lines of code. Didn't happen.



    Whether the US has a cultural advantage I don't know. So far we've done well.



    Quote:

    One of the analogies I remember reading was the comparison of the contemporary software paradigm to a "universal bolt." A lot of developers champion the idea of code re-use to a non-advantageous extent. Whereas a machine will include many types of bolts, each type selected for its suitability in a specific case, software development seems to favor the idea of using fewer total parts, but parts that are much more complex, much less reliable, and much less well suited to each individual task. Put simply, this approach has failed, and continues to be a risk for failure whenever it is used.



    Strawman. You can't read marketing literature (whether it's from a vendor, a methodology guru, or a process maven) that declares a certain technology or technique a silver bullet and then state that all software engineering is a failure because said technology/technique turns out not to be a silver bullet.



    Any technique that claims a 25%+ productivity improvement (cough...CMM...cough) is probably statistical BS. OO, for example, measures out to around a 6% improvement over SASD, and only for certain problem sets.



    Does that make OO a failure? No. 6% beats 0%.



    The contemporary software paradigm does NOT describe a universal bolt and hasn't since...mmm...the mid-'90s, or whenever "software reuse" stopped being the buzzword of the day. Component-based software development does advocate using pre-existing components to build a software system, but largely no one uses (or advocates) large "universal" components anymore; instead we use smaller building blocks like UI widgets, network stacks, etc., as you might find in .NET, DirectX, and so on.



    Little-COTS vs. big-COTS integration: the industry figured out by the mid-'90s that big-COTS integration typically works poorly because of interface complexity. I would say that large-component CBSD died in the late '90s/early '00s, if not earlier. Hard to say, but it sure isn't a hot topic anymore.



    The biggest takeaway from all this is that the fundamental concepts of software development still apply. Examples include coupling and cohesion as indicators of quality and efficiency in development. Ignore these fundamentals at your own risk. They exist. Software isn't totally a voodoo science. 90% perhaps.
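    To make the coupling and cohesion point concrete, here is a minimal, hypothetical Java sketch (the TemperatureSensor, AlertService, and Thermostat names are invented for illustration, not taken from any real codebase): each type has one focused job (cohesion), and Thermostat depends only on two narrow interfaces rather than on concrete sensor hardware or a concrete alerting stack (low coupling).

        // Hypothetical illustration of high cohesion and low coupling.
        // High cohesion: each interface/class has a single, focused responsibility.
        interface TemperatureSensor {
            double readCelsius();
        }

        interface AlertService {
            void raise(String message);
        }

        // Low coupling: Thermostat depends only on the two small interfaces above,
        // not on concrete sensor hardware or a concrete alerting stack, so either
        // side can change without touching this class.
        class Thermostat {
            private final TemperatureSensor sensor;
            private final AlertService alerts;
            private final double limitCelsius;

            Thermostat(TemperatureSensor sensor, AlertService alerts, double limitCelsius) {
                this.sensor = sensor;
                this.alerts = alerts;
                this.limitCelsius = limitCelsius;
            }

            void check() {
                if (sensor.readCelsius() > limitCelsius) {
                    alerts.raise("Over temperature limit");
                }
            }
        }

    Either side can be swapped or tested in isolation without touching the other, which is exactly the kind of development efficiency those fundamentals predict.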



    Quote:

    What's the next step? You don't seem to think there is a next step, which is probably a bad thing to assume since hardware is changing dramatically.



    Of course there's a next step. There's always a next step. However, as with OO, a parallel paradigm is not a silver bullet, nor can it ignore the fundamentals that seem to have survived (although often renamed) since the inception of software engineering.



    For certain problem domains I would expect a parallel-processing-based paradigm to net...oh, around a 6% improvement over OO or procedural methods. A double-digit improvement would be a fantastic rarity. Wonderful if we can get it, but I won't be drinking the Kool-Aid till I see the studies that show such "silver bullet" improvements.



    Quote:

    At a certain point, there's only so much that can be done in a compiler: if software developers don't want to learn new paradigms, that's fine -- the business will just move to India, China, and Eastern Europe, where the developers are more persuadable.



    If you are a software developer who has stopped learning, you are management.



    However, I call BS on the idea that the entire industry is going to move to a paradigm that is best suited for embedded, highly parallel applications and needs to be aware of the underlying hardware architecture.



    Parallel architectures come and go. It's not like the Cell is the first such incarnation. They have their place, solving certain problems more efficiently than other techniques. Likewise, monolithic architectures have their place, solving other problems more efficiently than other techniques.



    Anyone who poo-poos one or the other is limiting their own toolbox.



    Most problem sets will be solvable using "low-performance software," even on embedded devices. Frankly, what you call an embedded device and what I might call an embedded device are probably an order of magnitude apart in compute power and storage capability.



    Java works fine in the embedded realm today because embedded environments are nowhere near as constrained as when I was an RTOS developer. More and more solutions can be built on embedded Java, which is a wonderfully easier development environment than traditional RT development environments or your proposed low-level, Verilog-like environment, although the simulation and test capabilities in some of the VHDL environments would be nice to have in a more consistent form in traditional environments.



    Of course that is addressed in Agile methods that emphasize test frameworks.
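    As a rough illustration of that emphasis, here is a hypothetical JUnit 4 test against the Thermostat sketch above (the names are invented and JUnit 4 is assumed to be on the classpath): the hardware is replaced by tiny hand-rolled fakes so the behavior can be checked on a developer workstation instead of on the target device.

        // Hypothetical JUnit 4 test; assumes the TemperatureSensor, AlertService,
        // and Thermostat types from the earlier sketch are in the same package.
        import static org.junit.Assert.assertEquals;

        import org.junit.Test;

        public class ThermostatTest {
            // Hand-rolled fakes keep the test independent of real hardware.
            static class FakeSensor implements TemperatureSensor {
                double value;
                public double readCelsius() { return value; }
            }

            static class RecordingAlerts implements AlertService {
                int raised;
                public void raise(String message) { raised++; }
            }

            @Test
            public void raisesAlertWhenOverLimit() {
                FakeSensor sensor = new FakeSensor();
                RecordingAlerts alerts = new RecordingAlerts();
                Thermostat thermostat = new Thermostat(sensor, alerts, 80.0);

                sensor.value = 95.0;
                thermostat.check();

                assertEquals(1, alerts.raised);
            }
        }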



    Vinea
  • Reply 82 of 146
    vinea Posts: 5,585 member
    Quote:

    Originally posted by sunilraman

    If you (generic you, not you vinea specifically) have been involved with Win95, 98/Me, 2000, XP, Office, and ShiteOnTheVista, you need to put a paper bag over your head.



    Eh, it was Windows that drove the PC revolution...not Unix, not Mac OS, not OS 9, not AmigaDOS, not GEM, not CP/M, not OS/360, not VMS, etc.



    All the commodity hardware you see today, and where we are now, is due in large part to Microsoft (from the software perspective).



    I haven't seen recent statistics, but when Capers Jones evaluated MS software practices and defect rates, he found they were in the top 5% of all software developers (mmm...thinking mid-'90s).



    So odds are that 95% of the coders talking smack about how MS writes shitty code generate software with higher defect rates and work for companies with poorer software practices.



    Vinea
  • Reply 83 of 146
    melgross Posts: 33,510 member
    Quote:

    Originally posted by ReCompile

    I agree with you wholeheartedly. I was not trying to imply that Intel did not have a good roadmap. But as you reiterated, Apple cannot afford any hiccups. The "Gorilla" does have the strength to keep prices low for longer than AMD could, should it come to that. But I do believe that should AMD keep up the pace they are on, look for them in the future of Apple. Also, buying ATI would give them leverage in the price war. Apple buys both. If together they are cheaper than the purchase of a chip from Intel and graphics cards from ATI or Nvidia, then look out, Intel. Besides, they are now in a position to use both (down the road, I believe, not right now). This way Apple would not have to abandon Intel, just keep them on their toes.



    The deal that AMD just made for ATI might break their back, instead of reinforcing it. The market has taken a long, very unhappy look at it today. AMD's price has continued downwards because of it.



    AMD just borrowed $2.5 billion to do this deal. They just lowered the prices on all of their CPUs by almost 50% to match Intel's price/performance ratio, and most in the industry say that it wasn't enough. Bye-bye, profits. ATI's profits have been even lower than the erratic ones AMD has put on the board, and AMD will have to divest themselves of more businesses to help this deal. The debt load that AMD (not a huge company) will have to carry because of this could hurt their R&D down the road. It certainly will constrain any moves they need to make in the near future.



    While the concept seems good, history has shown that a company buying another large (in relation to their own size) company has issues that can take a year or two, if ever, to straighten out. During that time, the acquirer is distracted from their main business goals. This could cause AMD to flounder over the next year or so.



    This is what happened to Compaq after they bought DEC, and it happened to HP after they acquired Compaq.
  • Reply 84 of 146
    recompile Posts: 100 member
    Quote:

    Originally posted by melgross

    The deal that AMD just made for ATI might break their back, instead of reinforcing it. The market has taken a long, very unhappy look at it today. AMD's price has continued downwards because of it.



    AMD just borrowed $2.5 billion to do this deal. They just lowered the prices on all of their CPUs by almost 50% to match Intel's price/performance ratio, and most in the industry say that it wasn't enough. Bye-bye, profits. ATI's profits have been even lower than the erratic ones AMD has put on the board, and AMD will have to divest themselves of more businesses to help this deal. The debt load that AMD (not a huge company) will have to carry because of this could hurt their R&D down the road. It certainly will constrain any moves they need to make in the near future.



    While the concept seems good, history has shown that a company buying another large (in relation to their own size) company has issues that can take a year or two, if ever, to straighten out. During that time, the acquirer is distracted from their main business goals. This could cause AMD to flounder over the next year or so.



    This is what happened to Compaq after they bought DEC, and it happened to HP after they acquired Compaq.




    Actually, AMD stayed on a positive upward trend all day and ended $0.41 in the plus. But I do agree that the transition will be tough, as any assimilation of that size is bound to be. I do not think that it will break AMD's back; on the contrary, combining a major player in the all-essential graphics chip market with a major player in the CPU market spells opportunity in a big, big way. Although we are still human, and have the all-giving gift to screw about anything up. 8)
  • Reply 85 of 146
    SpamSandwich Posts: 33,407 member
    What might be the impact on Dell?
  • Reply 86 of 146
    melgross Posts: 33,510 member
    Quote:

    Originally posted by ReCompile

    Actually, AMD stayed on a positive upward trend all day and ended $0.41 in the plus. But I do agree that the transition will be tough, as any assimilation of that size is bound to be. I do not think that it will break AMD's back; on the contrary, combining a major player in the all-essential graphics chip market with a major player in the CPU market spells opportunity in a big, big way. Although we are still human, and have the all-giving gift to screw about anything up. 8)





    You should have read the articles in the Times and the WSJ Tuesday. I tend to agree with them.



    The market was up today.
  • Reply 87 of 146
    gene clean Posts: 3,481 member
    Quote:

    Originally posted by melgross

    The market was up today.



    You just said it was down.
  • Reply 88 of 146
    melgross Posts: 33,510 member
    Quote:

    Originally posted by Gene Clean

    You just said it was down.



    No, I didn't, Gene. I said that when the announcement of the deal was made, AMD's stock went down.



    I haven't seen you around for a while. Where have you been hiding?
  • Reply 89 of 146
    slughead Posts: 1,169 member
    Quote:

    Originally posted by meelash

    I'm thinking it will mean just that. Check this news out for example: Intel pulls ATI (AMD)'s Chip license.



    That's hilarious. I wonder if AMD bought ATI just to piss Intel off.
  • Reply 90 of 146
    jeffdm Posts: 12,951 member
    Quote:

    Originally posted by slughead

    That's hilarious. I wonder if AMD bought ATI just to piss Intel off.



    That would probably be the dumbest reason to take out a US$2.5B loan.
  • Reply 91 of 146
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by slughead

    That's hilarious. I wonder if AMD bought ATI just to piss Intel off.



    I don't think Intel really cares. NVidia is a far greater company than ATI on every level. I've NEVER understood why Apple chooses to go with ATI over NVidia.
  • Reply 92 of 146
    chucker Posts: 5,089 member
    Just off the top of my head?



    1) ATi gives Apple better access to driver specifications, allowing Apple to optimize themselves

    2) ATi makes far less power-hungry laptop GPUs.
  • Reply 93 of 146
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by Chucker

    Just off the top of my head?



    1) ATi gives Apple better access to driver specifications, allowing Apple to optimize themselves

    2) ATi makes far less power-hungry laptop GPUs.




    Yes, 1) is valid, but the better cards are still better cards. You can't compare an ATI 9800 to an NVidia 6800. BTW, can you prove that is true (Apple getting better access)?



    The problem I see with NVidia is that NVidia doesn't sell cards. ATI does. So I believe Apple has to find a company to manufacture the NVidia cards.



    2) And why do they insist on using them in the Mac Pros? (They finally switched to NVidia's with the last go-round.)
  • Reply 94 of 146
    jeffdm Posts: 12,951 member
    Quote:

    Originally posted by emig647

    Yes, 1) is valid, but the better cards are still better cards. You can't compare an ATI 9800 to an NVidia 6800. BTW, can you prove that is true (Apple getting better access)?



    The problem I see with NVidia is that NVidia doesn't sell cards. ATI does. So I believe Apple has to find a company to manufacture the NVidia cards.



    2) And why do they insist on using them in the Mac Pros? (They finally switched to NVidia's with the last go-round.)




    Isn't the 6800 a lot newer than the 9800?



    The PowerMac switched to the nVidia series eight months ago.
  • Reply 95 of 146
    Quote:

    Originally posted by emig647

    You can't compare an ATI 9800 to an NVidia 6800.



    No, but you CAN compare an ATI 9800 to an NVIDIA GeForce4 Ti, which launched about 4 months earlier.



    Let's just say, in the face of the 9800, the GeForce4 was TERRIBLE. It didn't even have DX9 support.



    The 6800 came out 2 years later than the 9800.



    Quote:

    BTW, can you prove that is true (Apple getting better access)?



    This is a forum, not a research paper. Google it yourself.



    It's a well-established fact that ATI gives Apple the source code for their drivers, while NVIDIA ports the drivers themselves.
  • Reply 96 of 146
    chucker Posts: 5,089 member
    Uh, the MX series are low-end models. The *800 ones are high-end models. So, no, that's not a fair comparison.



    The 4 MX was also essentially a rebadged 2 MX?
  • Reply 97 of 146
    Quote:

    Originally posted by Chucker

    Uh, the MX series are low-end models. The *800 ones are high-end models. So, no, that's not a fair comparison.



    The 4 MX was also essentially a rebadged 2 MX?




    Haha, I just quickly Wikipedia'd a card released around that time. I didn't follow graphics cards until I actually needed to program for them.
  • Reply 98 of 146
    chucker Posts: 5,089 member
    Ya, 4 Ti is a fairer match.
  • Reply 99 of 146
    emig647 Posts: 2,455 member
    No, in the PC world the 5900 XT (256-bit) came out about the same time the 9800 (256-bit) came out. The 5900 XT didn't perform well. But about six months after the 9800 came out is when the 6800 (256-bit) series from NVidia came out.



    The GeForce 4 MX ranged from 64-bit to 128-bit. Chucker is right... it was a remade GeForce 2 MX. That is like comparing an ATI 9200 to an ATI 9800, skipping over the 9600 completely.



    The Mac was extremely deprived of graphics cards at this time. I don't even know why we're talking about old history. Look at today's Power Macs: NVidia across the screen. But if you compare NVidia to ATI since the 6800 came out, you'll see NVidia smokes them in every test for every comparable card. You can do the 6800 vs. the X800, the 6600 vs. the 9600, the 7600 vs. the X1600, the 7800 vs. the X1800, the 7800 GT (or GTX) vs. the X1850 XT....



    I have it out for ATI because their drivers are complete trash. They are 2 binary digits short of spyware!
  • Reply 100 of 146
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by gregmightdothat

    This is a forum, not a research paper. Google it yourself.



    It's a well-established fact that ATI gives Apple the source code for their drivers, while NVIDIA ports the drivers themselves.




    Apparently you don't get the point of insider forums.



    Most of the time people call bullshit, so you call for proof. It's the nature of the boards. If you don't like being asked to prove something... gtfo.