AppleInsider › Forums › General › General Discussion › AMD to acquire graphics chip giant ATI

AMD to acquire graphics chip giant ATI - Page 2

post #41 of 147
Quote:
Originally posted by Splinemodel
It has actually done quite well in embedded devices as a AOT-compiled package. Most cable boxes run Java apps and Java display layers.

Embedded devices, yes. But not in computers that we can use.
post #42 of 147
Quote:
Originally posted by Chucker
Possibly because it offers nothing worth considering. Its syntax is a weak rip-off of C++ (which is in itself horrible), and its alleged "feature" (being platform-agnostic) is, when you really think about it, completely useless. For a console application (including server daemons), you won't want that for performance reasons. For a front-end, you won't want that because UIs are platform-specific for a reason. If every OS had the same UI, what would be the deciding factor of one OS over another? Exactly.

There is nothing Java can offer you that a clean separation between a platform-agnostic (but compiled!) framework/back-end, written in an efficient language (e.g. plain C) and a set of platform-specific (and not necessarily compiled; interpreted can be good enough) front-ends, written in high-level APIs (e.g. Cocoa, .NET) can't give you.

Actually, mostly because of Sun's failure to allow it to go open source, as they promised to do over time, and then drew back from again. Meanwhile, .NET has been overtaking it. I don't agree with your characterization, and I know any number of programmers who prefer it. For years it had been gathering developers, but, in the end, it was Sun's decisions that have held it back.
post #43 of 147
Quote:
Originally posted by Chucker
If every OS had the same UI, what would be the deciding factor of one OS over another? Exactly.

What? User interface isn't the only thing that matters in an OS. How about maintainability? Security? Stability? In my experience, the Java UI elements match the look of the host operating system pretty well. As for the operation, it is up to the developer to make it a sensible system. Programs like RSSOwl and Eclipse show that Java can be used very well for operation, and the fact that they are Java-based isn't apparent to the user; they behave like native programs.
post #44 of 147
Quote:
Originally posted by JeffDM
How about maintainability? Security? Stability?

Those are all things that have been converging. Most kernels, be it XNU, Linux, OpenSolaris or NT, have been more than robust/stable enough for years. Likewise, most operating systems have been maintainable and secure enough (yes, including Windows; XP's problems have quite little to do with the general security features).

Quote:
In my experience, the Java UI elements match the look of the host operating system pretty well.

You've got to be joking. I can spot a Java app from 10 feet away. Neither on OS X nor on Windows does a Java app look, feel or behave natively. Don't even bother giving me an example to the contrary, because there are none.

If you want to write a good app, write it in a native UI. Otherwise, I'll give you thumbs down for not respecting your users.

Quote:
Programs like RSSOwl and Eclipse show that Java can be used very well for operation, and the fact that they are Java-based isn't apparent to the user; they behave like native programs.

So tell me, why are those not popular on Mac OS X? Could it be because they aren't well-behaved Mac apps?
post #45 of 147
Quote:
Originally posted by melgross
Actually, mostly because of Sun's failure to allow it to go open source,

That's part of it. I don't think it's a significant part, however.

.NET has gained traction not only because Microsoft is behind it, but because it also happens to be really good. You get an excellent API worthy of being compared to Cocoa, you get the ability to write in whatever language you prefer (theoretically possible in Cocoa, e.g. through PyObjC, but practically not implemented well), and you get Bytecode-like treatment.

Quote:
For years it had been gathering developers, but, in the end, it was Sun's decisions that have held it back.

I just don't see anything in Java that's compelling. In its OOP concepts, it isn't ahead of Cocoa, and it's far, far behind Ruby, aka "Everything is an object". In its language syntax, it is virtually equivalent to C++, aka "Teh Uglay". Its bytecode paradigm is interesting but ultimately not very useful.
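(The "everything is an object" contrast is concrete: in Ruby, even integers respond to messages, while Java splits the world into primitives and objects that need wrapper classes. A minimal sketch; the class name here is invented for illustration:)

```java
// Java is not "everything is an object": primitives have no methods of
// their own and must be boxed into wrapper objects to act object-like.
public class PrimitiveDemo {
    public static void main(String[] args) {
        int n = 42;                         // a primitive: no methods on it
        Integer boxed = Integer.valueOf(n); // explicit boxing into an object
        System.out.println(boxed.getClass().getSimpleName()); // prints "Integer"
        // In Ruby, by contrast, 42.even? is a method call on the number itself.
    }
}
```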
post #46 of 147
Quote:
Originally posted by Chucker
Possibly because it offers nothing worth considering. Its syntax is a weak rip-off of C++ (which is in itself horrible)

Syntax doesn't make a language. C++ is neither fish nor fowl. Java and C# are high-level languages with better productivity than either C or C++.

Quote:
, and its alleged "feature" (being platform-agnostic) is, when you really think about it, completely useless. For a console application (including server daemons), you won't want that for performance reasons.

Which is why the j2ee stack is a complete failure with no one using it.

Quote:
For a front-end, you won't want that because UIs are platform-specific for a reason. If every OS had the same UI, what would be the deciding factor of one OS over another? Exactly.

There is nothing Java can offer you that a clean separation between a platform-agnostic (but compiled!) framework/back-end, written in an efficient language (e.g. plain C) and a set of platform-specific (and not necessarily compiled; interpreted can be good enough) front-ends, written in high-level APIs (e.g. Cocoa, .NET) can't give you.

Because those of us who have done cross-platform development understand that "platform-agnostic" C or C++ is an oxymoron, and still annoying to code relative to Java or C#. And while Java doesn't quite live up to "write once, run anywhere", it sure comes a lot closer.
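(A toy sketch of that point: the same compiled class runs unmodified on any JVM, and the residual platform differences reduce to runtime lookups. The class name is invented; the property keys are standard JVM system properties.)

```java
// The same .class file runs on Windows, OS X, Linux, etc.; the few
// platform-specific details surface only as runtime property lookups.
public class WhereAmI {
    public static void main(String[] args) {
        String os = System.getProperty("os.name");          // e.g. "Mac OS X"
        String sep = System.getProperty("file.separator");  // "/" or "\\"
        System.out.println("Running on " + os + " with separator '" + sep + "'");
    }
}
```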

That said, I prefer C#. Java still has some annoyances that really shouldn't exist after all this time.

Vinea

PS I forgot to note that native look and feel is supposedly to be addressed in 1.6.
post #47 of 147
Quote:
Originally posted by melgross
Actually, mostly because of Sun's failure to allow it to go open source, as they promised to do over time, and then drew back from again. Meanwhile, .NET has been overtaking it. I don't agree with your characterization, and I know any number of programmers who prefer it. For years it had been gathering developers, but, in the end, it was Sun's decisions that have held it back.

Java's biggest problem is that it sucked until 1.2 (arguably 1.3). For the better part of a decade, devs were drowning in the Java kool-aid. .NET works amazingly well for an MS framework, unlike the MFC and Win32 environments. .NET sucked a lot less coming out of the gate and got reasonably stable by 1.1.

Making Java open source would simply have allowed MS to extend the language in incompatible ways. The biggest problem of Java wasn't it not being open source, but the usual unix suspects shooting themselves in their collective feet. There was no reason to fork IDEs, yet we got both Eclipse and NetBeans, neither of which is, IMHO, as good as Visual Studio.

I mean, really: if not being open source didn't hamper .NET, why should any dev care that Java isn't open source? It's not like you couldn't look at the source if you wanted to.

Vinea
post #48 of 147
Quote:
Originally posted by vinea
Syntax doesn't make a language.

Syntax is the main aspect that makes a language. Modern languages aren't compared as much by their features, which are very similar anyway, but by their syntax, which isn't.

Quote:
Java and C# are high level languages with better productivity than either C or C++.

C and C# have a use. Java and C++ don't.

Quote:
PS I forgot to note that native look and feel is supposedly to be addressed in 1.6.

Yeah, and GNU/HURD is coming next year.
post #49 of 147
Quote:
Originally posted by Chucker

I just don't see anything in Java that's compelling.

What's compelling is that it's a cross-platform language that finally doesn't suck and meets most of the marketing hype of 1996. Of course, now that it's a decade old, it fails the sexiness test.

While Ruby is just as old, it is hot today because it wasn't hyped in the 90s. Still, Java strikes me as more mature.

Vinea
post #50 of 147
Quote:
Originally posted by vinea
What's compelling is that it's a cross-platform language that finally doesn't suck

Doesn't suck? Java? Bwahaha.

Quote:
While Ruby is just as old, it is hot today because it wasn't hyped in the 90s. Still, Java strikes me as more mature.

More mature and probably faster, but much less interesting as an emerging technology.
post #51 of 147
Quote:
Originally posted by Chucker
Syntax is the main aspect that makes a language. Modern languages aren't compared as much by their features, which are very similar anyway, but by their syntax, which isn't.

The differences between C, C++, Java and C# syntax are minimal enough that anyone who can read one can read any other. Differences in language features and philosophy result in folks effectively coding C in higher-level languages.

Also the libraries differ greatly (i.e. built in features).

If you think syntax is the primary difference, what can I say, except that all semi-competent coders can learn any of the C-derived syntaxes very rapidly.

Quote:
C and C# have a use. Java and C++ don't.

Nice to be a zealot. Keeps things nice and black and white, where you don't need to think anymore. Every language that you can draw pay writing in has a use. Heck, I'm sure there are still MUMPS devs out there.

Vinea
post #52 of 147
Quote:
Originally posted by vinea
The differences between C, C++, Java and C# syntax are minimal enough that anyone who can read one can read any other.

Yes, but see, the premise several posts ago was that Java was this great new thing that developers have been too lazy to catch on to. I don't believe it is.

Quote:
Differences in language features and philosophy result in folks effectively coding C in higher-level languages.

I'll stick to my idea that plain C is good for high-performance code and Ruby, .NET (with virtually any language, including obscure stuff like Boo) and Cocoa (with Objective-C) are good for high-level code. There is virtually nothing that can't be done with this combination. So, what niche does Java fill again? Or is it one of those cases where it tries to be many things but doesn't really stand out on any in particular?

Quote:
Also the libraries differ greatly (i.e. built in features).

You mean like glibc? That's not really language-specific. OS X has quite a different libc, for instance.

Quote:
If you think syntax is the primary difference, what can I say, except that all semi-competent coders can learn any of the C-derived syntaxes very rapidly.

Precisely. Except I don't find Java's syntax compelling. Ruby's I do. Objective C's I do (though it's very confusing at first). Java's? Nah.

Quote:
Nice to be a zealot. Keeps things nice and black and white, where you don't need to think anymore. Every language that you can draw pay writing in has a use. Heck, I'm sure there are still MUMPS devs out there.

Many languages have a niche. Perhaps asinine cellphone games are the "niche" for Java. On a PC/desktop/server/workstation/laptop/Mac/etc., however, I don't see a use.
post #53 of 147
Quote:
Originally posted by Chucker
Doesn't suck? Java? Bwahaha.

Nope, doesn't suck. Just like CORBA doesn't suck anymore.

Neither are silver bullets. Both are finally mature enough to go into mission critical apps.

Quote:
More mature and probably faster, but much less interesting as an emerging technology.

Because of Rails? Eh, Ajax is "emerging technology". Who the heck gives a crap about yet another C-like language with Smalltalk-like OO? Ruby is over a decade old. Creating new languages is passé.

"Emerging technology" is a synonym to "Technology that is overhyped and still sucks." Doesn't mean I won't learn them but its pretty much been there, bled a lot, have the coffee mug, thanks.

Vinea
post #54 of 147
Quote:
Originally posted by vinea
Because of Rails?

No, because, for instance, of higher order messaging.
post #55 of 147
Quote:
Originally posted by Chucker
Yes, but see, the premise several posts ago was that Java was this great new thing that developers have been too lazy to catch on to. I don't believe it is.

It is and it isn't. The rate of adoption from new technology to widely used is a decade. I can find a study that shows that if you like.

Quote:
I'll stick to my idea that plain C is good for high-performance code and Ruby, .NET (with virtually any language, including obscure stuff like Boo) and Cocoa (with Objective-C) are good for high-level code.

.NET is windows specific. Cocoa is Mac specific.

Quote:
There is virtually nothing that can't be done with this combination. So, what niche does Java fill again? Or is it one of those cases where it tries to be many things but doesn't really stand out on any in particular?

J2EE excels in the enterprise stack environment. The language and syntax itself is less important than the API and application stack around it.

Quote:
You mean like glibc? That's not really language-specific. OS X has quite a different libc, for instance.

No, I mean the infrastructure around a language. For example, C# as a language is not overly useful separated from the functionality of the .NET stack. Learning to be a good C# programmer has very little to do with syntax and a whole lot more to do with understanding what is in .NET.

No, it's not language-specific, and that's the point. The language itself and its syntax are almost noise in the equation.

Quote:
Precisely. Except I don't find Java's syntax compelling. Ruby's I do. Objective C's I do (though it's very confusing at first). Java's? Nah.

If you want to judge a book by its cover who am I to argue?

Quote:
Many languages have a niche. Perhaps asinine cellphone games are the "niche" for Java. On a PC/desktop/server/workstation/laptop/Mac/etc., however, I don't see a use.

Yes, this is because you are a zealot. Just like Mac zealots or Linux zealots who can't see any use for Windows.

Doesn't make you any less blind or foolish.

Vinea
post #56 of 147
Quote:
Originally posted by vinea
It is and it isn't. The rate of adoption from new technology to widely used is a decade. I can find a study that shows that if you like.

So what kind of technology makes Java unique?

Quote:
.NET is windows specific. Cocoa is Mac specific.

.NET isn't Windows-specific (and Cocoa isn't technically Mac-specific). But even if they were both platform-specific, I'd be quite happy with that.

Quote:
No, I mean the infrastructure around a language. For example, C# as a language is not overly useful separated from the functionality of the .NET stack. Learning to be a good C# programmer has very little to do with syntax and a whole lot more to do with understanding what is in .NET.

But that's the beauty of .NET. You pick a syntax that you like, and then you learn the .NET framework.
post #57 of 147
Quote:
Originally posted by Chucker
No, because, for instance, of higher order messaging.

It appeared in Smalltalk first (or perhaps Objective-C), I think, and can be implemented in most dynamically typed OO languages. "Emerging"? Eh, I bet Knuth would disagree and find something it's been called before.

You can also implement the pattern in Java, but since it's statically typed, it's messier and not as elegant. Likewise C++.
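(As a rough illustration of that "messier" Java version: one way, among several, to fake higher-order messaging is with java.lang.reflect.Proxy, broadcasting a single message send to every element of a list. All class and method names here are invented for the sketch.)

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.Arrays;
import java.util.List;

// each(list, Iface.class).someMethod(args) forwards the call to every
// element of the list -- a statically typed stand-in for a HOM "do each".
public class HomDemo {
    interface Greeter { String greet(String name); }

    @SuppressWarnings("unchecked")
    static <T> T each(final List<T> targets, Class<T> iface) {
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(),
                new Class<?>[] { iface },
                new InvocationHandler() {
                    public Object invoke(Object p, Method m, Object[] args) throws Throwable {
                        Object last = null;
                        for (T t : targets) {
                            last = m.invoke(t, args); // broadcast to each receiver
                        }
                        return last; // only the last return value survives
                    }
                });
    }

    public static void main(String[] args) {
        Greeter hi = new Greeter() { public String greet(String n) { return "hi " + n; } };
        Greeter hello = new Greeter() { public String greet(String n) { return "hello " + n; } };
        List<Greeter> all = Arrays.asList(hi, hello);
        // One "message send" hits both receivers:
        System.out.println(each(all, Greeter.class).greet("vinea")); // prints "hello vinea"
    }
}
```

Compare that reflective scaffolding with a one-line `collect`/`map` in Smalltalk or Ruby, and "messier" is about right.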

Vinea
post #58 of 147
Quote:
Originally posted by Chucker
Wha-what now? You speak of innovation and then mention "when the PC is dead", Java, and architecture-agnostic (read: low-quality) software?

Peh.

That's why I say Java will be limited to low-performance software. In twenty years, I'm actually going to go out on a limb and say that most high-performance software is going to be written in a new language that resembles VHDL or Verilog more than anything else. But Java still has its pluses, namely that it's fast to write and portable. For certain things, this is valuable. Plus, ARM chips with Jazelle can decode Java bytecode in hardware.

Other than that, I'm with you. I hate Java for 95% of the things it's used for.
Cat: the other white meat
post #59 of 147
Quote:
Originally posted by Splinemodel
That's why I say Java will be limited to low-performance software.

But if performance isn't relevant, then why even bother with Bytecode? Why not use an interpreted language, be it Boo, Ruby, Python, whatever?

Quote:
But Java still has its pluses, namely that it's fast to write and portable.

Ruby is extremely fast to write and portable. So, that can't be Java's strength either.
post #60 of 147
Quote:
Originally posted by Splinemodel
That's why I say Java will be limited to low-performance software. In twenty years, I'm actually going to go out on a limb and say that most high performance software is going to be written in a new language that resembles VHDL or Verilog more than anything else.

In 20 years the next language will have more levels of abstraction and not less. Eh, I just don't see moving to Verilog or VHDL...if you're going to go to that much trouble for "high performance" then burn the ASIC or load the FPGA.

"Low performance" software is replacing what traditionally was done by RT software because system performance is "fast enough".

Vinea
post #61 of 147
Quote:
Originally posted by vinea

1.
.NET is windows specific. Cocoa is Mac specific.

2.
J2EE excels in the enterprise stack environment. The language and syntax itself is less important than the API and application stack around it.



Y'all "wasted" an entire forum page arguing about Java. I've extracted the two points from vinea that are relevant. The Windows world will use .NET primarily. J2EE seems to do well in middleware. Cocoa, written in the Xcode IDE, is the best tool for creating quality Universal Mac software.

Where does that leave Java? As melgross said a bit back, embedded stuff and mobile phones. I will add, for the end user, Java applets (which used to suck real bad about 5 years ago but seem much better now).

For OS X 10.4 and 10.5, any up-to-date Java support is good for plug-ins in web browsers and some applications, if you really need to use it. In the Mac world, Cocoa in the Xcode IDE, out to Universal, native, beautiful-Mac-UI applications, is the obvious choice.
post #62 of 147
Quote:
Originally posted by mbaynham
ramblings of a mad man...

One of the best replies I've heard all month!
Citing unnamed sources with limited but direct knowledge of the rumoured device - Comedy Insider (Feb 2014)
post #63 of 147
Quote:
Originally posted by nagromme
Not to stray away from the conversation, but...

Worst-case possibilities:

* Intel keeps their lead, but AMD refuses to sell ATI graphics to PC makers (like Apple) who buy from Intel. They are forced to use nVidia or settle for lesser chips from AMD.

* Or Intel keeps their lead, but refuses to sell CPUs to PC makers (like Apple) who use ATI graphics. They are forced to use nVidia or settle for AMD.

* Or AMD develops new CPUs to rival anything from Intel, even for laptops, but Intel punishes severely or turns away companies who use both kinds of processors--they want only exclusive customers. Apple is forced to choose either one super-fast processor family or the other; they can't pick from both at once.

All of the above seem unlikely, especially since Intel is so vocal about WANTING to work with Apple. Bullying Apple is not something Intel can do lightly.

Best-case possibilities:

* AMD develops new CPUs to rival Intel. And, like other PC makers, Apple can choose to use CPUs from BOTH companies. Two suppliers are better than one. (Exclusive deals might be lost--which can mean higher prices. Competition would increase--which can mean lower prices.)

* Or AMD develops new ATI graphics technologies that nVidia can't touch, and Apple can use those new GPUs too.

* Or AMD develops low-cost integrated graphics with better performance than Intel. And Apple can use those too.

* Or AMD and ATI push Intel and nVidia to do even more, releasing even better products. The resulting competition helps everyone.

I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users. We don't need them any more than we needed IBM vs. Motorola flamewars.



.............................
Let's go chronologically:

2nd half 2006:
Intel CPUs are looking rock solid. Apple is using Intel chipset-motherboards. For the rest of this year, Intel won't mind Apple going with ATI or nVidia, whatever works best for Apple. Though I suspect the entire Mac Intel line will be ATI. (I hear onlooker screaming that FireGL sucks compared to nVidia's Quadros). Crossfire or SLI, I think highly unlikely in the Mac Pro. It's a PC-gaming enthusiast thing. And those that want SLI'ed top-of-the-line Quadros, I think you're better off with a nice dual Woodcrest PC-Windows workstation that supports that.

2007:
Assuming that by now the AMD-ATI merger is mostly approved and starting to gain momentum. The Intel pipeline still looks very strong. Intel may start to push Apple to move over to nVidia in their new models, which would be fine. Or Intel (25% chance IMO) will let Apple continue to ride with ATI or nVidia.

2008:
We start to see the first glimmer of what AMD-ATI are really starting to offer. Steve J keeps a close eye on things, but given the momentum of Intel with kickass CPUs in 2006 and 2007 and new developments for 2008, Apple-Intel-nVidia look fine.

If Steve J starts to see some really interesting stuff in the pipeline (AMD-ATI invites him for a nice magic show) he can start some internal Apple R&D to scope out what ATI-AMD's magic may deliver.

2009:
By now it will be clear if AMD-ATI really has anything solid to offer, and if Steve J was really impressed, through the 2nd half of 2008 he'd start loosening up from Intel and 2009 would look good for Apple having wide options for either Intel-nVidia or AMD-ATI, depending on models and stuff, and other new exciting Apple products.

For Apple, July 2007 onwards, they can easily start to steer the boat in a more platform-agnostic direction if needed. At the end of the day, AMD-Intel-nVidia-ATI all duking it out gives Apple the widest options available. And finally, the OSx86 transition will be heralded as Apple's smartest move in the 2nd half of this decade. The first half of the decade being marred by CPU challenges, but supremely boosted by sexy new, compelling products (in spite of all the CPU problems) and of course, birth of the iPod revolution.

I just hope that with Apple having to keep pace now with the breakneck PC component world, their Hardware Engineering will continue to focus on quality and long-lasting products that have been a hallmark, for the most part, of their G3, G4, and G5 stuff. Same on the OSx86 side, I hope the quality standards continue to be upheld/ refined.
post #64 of 147
Quote:
Originally posted by sunilraman
...

Sunil, this is all very interesting. But, while I hate to burst your bubble, none of this matters. What Intel is not going to be buying from ATI is their chipsets. Sometimes Intel does that to fill a need. Apple will have no problem continuing to use Intel's chipsets. It doesn't affect video if Apple uses separate GPUs. Where they won't, such as in the Mini and the MacBook, they will be using IG anyway.

As far as Crossfire (and SLI) goes, that's something we've been debating here for a while. I think the consensus has been that we won't see either, so it won't matter whether Intel's chipsets support it or not.

Apple is likely just where they were before it was announced.

I'm not happy about the whole thing just on principle. If there were four or five major GPU companies out there, it wouldn't matter much, but there aren't.

So, now what happens to Nvidia? And does this mean that the toe Intel has lately been putting back into the water of the high-end GPU market gets plunged all the way in? Or do they look to buy Nvidia?

If the AMD-ATI deal is approved, an Intel-Nvidia deal may have to be as well.

The analyst who saw this coming almost two weeks ago has to get some credit. Now, in retrospect, we can see AMD shedding assets for the purpose of getting enough cash to make this happen.
post #65 of 147
Quote:
Originally posted by vinea
In 20 years the next language will have more levels of abstraction and not less. Eh, I just don't see moving to Verilog or VHDL...if you're going to go to that much trouble for "high performance" then burn the ASIC or load the FPGA.

"Low performance" software is replacing what traditionally was done by RT software because system performance is "fast enough".

Vinea

Have you seen the code for Cell? Have you seen VHDL? The Process statement is very elegant, whereas the way Cell coding does parallelism is certainly not. Behavioral VHDL is actually quite easy to follow as well. RTL and structural HDL are much more confusing, and are required for programmable logic synthesis, but certainly not for parallel CPUs.

And let's face it, fast-clocked sequential OOE is deader than a doornail.

One of the topics often covered by the industry rags is how ever-more-complicated software has become less and less stable. Whereas a plane or car with a million parts works great, a million lines of code will always be full of lingering problems and take as much money to develop. The reason is that coding paradigms are too dated to be effective with modern technology. OO is hardly a step forward: more descriptive syntax, greater emphasis on varied subroutine/function usage rather than forcing things into classes, and a more inherently parallel coding structure are all features that seem to be agreed on as the necessary next steps.
Cat: the other white meat
post #66 of 147
On topic:

I just don't think there's any reason for Intel to buy nVidia when they could just spin something up themselves. A lot of the price AMD is paying for ATI will undoubtedly go towards portions of the company where there is a lot of overlap with AMD. For Intel, which does one hell of a lot more R&D than AMD does, probably 90%, if not more, of the cost of nVidia would be wasted on overlapping technology.

If Intel wants to counter AMD tit-for-tat (which I don't think they will), they will more likely invest the $600M from the XScale sell-off in some hot, new IP in the video/GPU world. The rest, they'd pull from existing projects. This wouldn't be a bad thing: 45nm GPUs while AMD-ATI is still debugging 90nm.
Cat: the other white meat
post #67 of 147
Quote:
Originally posted by Splinemodel
On topic:

I just don't think there's any reason for Intel to buy nVidia when they could just spin something up themselves. A lot of the price AMD is paying for ATI will undoubtedly go towards portions of the company where there is a lot of overlap with AMD. For Intel, which does one hell of a lot more R&D than AMD does, probably 90%, if not more, of the cost of nVidia would be wasted on overlapping technology.

If Intel wants to counter AMD tit-for-tat (which I don't think they will), they will more likely invest the $600M from the XScale sell-off in some hot, new IP in the video/GPU world. The rest, they'd pull from existing projects. This wouldn't be a bad thing: 45nm GPUs while AMD-ATI is still debugging 90nm.

Well, I do agree with that. But, sadly, business is not always 100% rational. There is the element of "if you do that, I'll do it as well." Intel could afford it better than AMD. I've seen many businesses buy others for a product line when it would have been far cheaper to do it themselves, but there are reasons for that as well. Sometimes it's the brand name, sometimes the customer list, sometimes the speed of getting into the business RIGHT NOW rather than next year, or the year after that. It depends on why it's being done.

I mentioned that Intel has been investigating getting back into high-end GPUs, an area in which they were NOT successful the last time they tried. That lack of success could be another reason why they might want to buy in. They have a lot on their minds now. It's hard to say which would be more distracting: buying a company, or starting up an entire division of R&D.

Before, they were just investigating. Now, they may feel it to be more urgent.
post #68 of 147
Quote:
Originally posted by Splinemodel
Have you seen the code for Cell?

Nope.

Quote:
Have you seen VHDL?

Yep.

Quote:
The Process statement is very elegant, whereas the way Cell coding does parallelism is certainly not.

Which is why it will all get hidden by the compiler eventually.
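As an aside, the kind of abstraction being argued for here — the programmer declares the parallelism, the toolchain handles the scheduling — can be sketched in a few lines. This is an illustrative sketch only, not Cell code; the `work` kernel and pool size are made up for the example:

```python
# Illustrative sketch: declare the data parallelism, let the runtime
# schedule it. This is the style of abstraction a mature toolset could
# layer over hardware like Cell; it is NOT actual Cell code.
from concurrent.futures import ThreadPoolExecutor

def work(x: int) -> int:
    # Stand-in for a per-element kernel.
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order regardless of how the pool schedules.
    results = list(pool.map(work, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point being that the awkward hand-scheduling is hidden behind the `map` call — which is what "hidden by the compiler" would mean in practice.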

Quote:
And let's face it, fast-clocked sequential OOE is deader than a doornail.

Which matters not at all to the .NET developer and only currently to console developers because the toolset isn't mature.

Quote:
One of the topics often covered by the industry rags is how, as software has become ever more complicated, it has become less and less stable. Whereas a plane or car with a million parts works great, a million lines of code will always be full of lingering problems and take as much money to develop.

The reason is that coding paradigms are too dated to be effective in modern technology. OO is hardly a step forward: more descriptive syntax, greater emphasis on varied subroutine/function usage rather than forcing things into classes, and a more inherently parallel coding structure are all features that seem to be agreed on as the necessary next steps.

Agreed by whom? Hardware engineers?

The reason a million lines of code has defects is that even at the highest levels of quality (<0.1 defects per KSLOC...which no one will achieve with a million SLOC) you still have 100 defects, some of which can be fatal.

A Boeing 777 has only 123K unique parts out of a total 3M. It has roughly 2.3M SLOC. Yea and verily, 777s do come off the line with defects...and that's manufacturing from a proven template.

Software development is a lot more like building the 777 design and prototype. You don't think there were defects during that process? The Aegis weapon system is 31M SLOC. How many Verilog projects have 31M SLOC?

And why SHOULDN'T a million lines of code cost as much as an object with a million parts? And can you name a car with a million parts? The average car has 3,800 unique parts and about 35,000 total items. Windows XP has 40 million SLOC. Linux is 30-100 million SLOC.

http://www.theautochannel.com/news/p...ess018372.html

And if you had a car with 40 million moving parts you'd see it fail three orders of magnitude more often than they do now.

Even with five 9s quality there are still 30K defects.
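The arithmetic behind these residual-defect counts is simple to sketch. The helper function below is hypothetical; the density and SLOC figures are the ones quoted in the post above:

```python
# Residual defects ≈ density (defects per KSLOC) * size in KSLOC.
# The helper is illustrative; the figures come from the post above.

def expected_defects(defects_per_ksloc: float, sloc: int) -> float:
    """Rough expected number of residual defects in a code base."""
    return defects_per_ksloc * (sloc / 1000)

# 0.1 defects/KSLOC over a million SLOC still leaves ~100 defects.
print(expected_defects(0.1, 1_000_000))    # 100.0

# The same density over Windows XP's ~40M SLOC would leave ~4,000.
print(expected_defects(0.1, 40_000_000))   # 4000.0
```

Even a "best in class" defect density never reaches zero once the code base is large enough — which is the whole argument here.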

You vastly underestimate the complexity of modern software, and what is more staggering is that each line of code today does more work than it did in the past (because higher-level, more abstract languages do more work per line than lower-level languages...like, say, Verilog).

Software developer productivity has kept pace with or exceeded the rapid advances in other disciplines. While we will continue to evolve and provide even greater productivity we certainly have nothing to be ashamed of as an industry.

Vinea
post #69 of 147
Quote:
Originally posted by vinea
Software developer productivity has kept pace with or exceeded the rapid advances in other disciplines. While we will continue to evolve and provide even greater productivity we certainly have nothing to be ashamed of as an industry....

If you (generic you, not you vinea specifically) made OSX, or perhaps the software for the space shuttle, yes, you have something to be very proud of. If you made Linux, yes, be very proud of that, though the end-user side needs a lot of work.

If you (generic you, not you vinea specifically) have been involved with Win95, 98/Me, 2000, XP, Office, and ShiteOnTheVista, you need to put a paper bag over your head.
post #70 of 147
Quote:
Originally posted by melgross
Well, I do agree with that. But, sadly, business is not always 100% rational. There is the element of "If you do that, I'll do it as well." Intel could afford it better than AMD. I've seen many businesses buy others for a product line when it would have been far cheaper to do it themselves. But there are reasons for that as well. Sometimes it's the brand name, sometimes the customer list, sometimes the speed of getting into the business RIGHT NOW rather than next year, or the year after that. It depends on why it's being done.

I mentioned that Intel has been investigating getting back into high-end GPUs, an area in which they were NOT successful the last time they tried. That lack of success could be another reason why they might want to buy in. They have a lot on their minds now. It's hard to say which would be more distracting: buying a company, or starting up an entire division of R&D.

Before, they were just investigating. Now, they may feel it to be more urgent.

This is, IMHO, a good take on where Intel stands in all this: pursue their own R&D, or knee-jerk into talking with nVidia.

Intel has CPUs, chipsets, the whole wireless Centrino platform. Intel integrated graphics is about 50% of the market, IIRC. If they get into high-end GPUs, particularly with their fab skillz at 65nm and 45nm down the line, Intel-nVidia could have something really interesting.

Think about Conroe demolishing Athlons at lower TDP. Now think about nVidia GPUs wiping the floor with ATI at lower TDP. Think about Intel integrated graphics where you could actually play a game not older than 3 years.
post #71 of 147
Fresh news here, but I am not sure I understand what this means.
post #72 of 147
Actually PB, someone mentioned this earlier already.

ATI makes the Radeon Xpress200 chipset that goes in various motherboards. It's a chipset with integrated graphics based on the Radeon X300 (pretty crap for games, but whatever...)
http://www.ati.com/products/radeonxp...tel/index.html

The motherboard is for Intel CPUs. When ATI's license runs out, they will discontinue making chipset/motherboard stuff for Intel. They will of course continue to sell the Radeon® Xpress 1100 Series for AMD Desktops chipset and Radeon® Xpress 200 Series for AMD Processors chipset/motherboard thingy.
http://www.ati.com/products/radeonxp...dsk/index.html
http://www.ati.com/products/radeonxp...eries/amd.html

In the long run, maybe within a year, we'll see some integrated ATI stuff on AMD's AM2 platform. And who knows what other goodies will come from "a transaction that will combine AMD's technology leadership in microprocessors with ATI's strengths in graphics, chipsets and consumer electronics... [resulting in] ...a processing powerhouse: a new competitor, better equipped to drive growth, innovation and choice for its customers in commercial and mobile computing segments and in rapidly-growing consumer electronics segments"
http://www.ati.com/companyinfo/about/amd-ati.html
post #73 of 147
AMD and ATI will have fierce competition. They're hoping synergy will emerge from this merger/buyout. Intel Core, Core2, etc. is killer in the desktop and laptop space. AthlonX2 loses out to Conroe, and Xeon Woodcrests take on Opterons hard and fast. AMD's Turion for laptops has not been that hot. Well, figuratively, not literally. Intel makes chipsets. ATI is trying to compete in the chipset-for-motherboards space. nVidia also makes chipsets - nForce for PC is pretty good.

Then we look at GPUs, ATI and nVidia neck and neck, with Intel Integrated graphics picking up and dominating the low-end.

AMD-ATI has its work cut out for it. But indeed, maybe we will see some fantastic stuff down the line from the underdogs.

I'm an nVidia fanboi, and now an ex-AMD fanboi (a Conroe descendant will be in my next PC, I suspect...). Still, best wishes to AMD-ATI.
post #74 of 147
Indulge me a little; again, I'll say it: what do you think will happen if nVidia got access to Intel's 65nm and 45nm manufacturing skillz? Think about it.
post #75 of 147
Quote:
Originally posted by PB
Fresh news here, but I am not sure I understand what this means.

Hardmac.com gave a good short explanation: "As one would have expected it, now that ATI belongs to AMD, Intel will not renew the license to ATI for having the right to develop and manufacture Intel-compatible chipsets.
For ATI, it is worth 100 million $, and it allows AMD to isolate Intel for the integrated chipset market. Of course Intel develops its own graphical chipsets, but their performance levels can not compete with dedicated solutions provided by ATI or nVidia. The previous scheme where Intel and ATI were partners to compete with the long partnership between AMD and nVidia might evolve in the coming months. Either Intel and nVidia will team up, or Intel will push R&D to develop more competitive integrated chipsets and/or GPUs. The latter has been rumored to be already in action; the forthcoming GMAX3000 being the first result of such efforts."
post #76 of 147
Quote:
Originally posted by sunilraman
Quote:
Originally posted by PB
Fresh news here, but I am not sure if I understand well what this means.

Hardmac.com gave a good short explanation:

Thanks sunilraman, I just saw this one. So, no clear impact for the Mac or nVidia's position in the new order as of yet.
post #77 of 147
Quote:
Originally posted by PB
Thanks sunilraman, I just saw this one. So, no clear impact for the Mac or nVidia's position in the new order as of yet.

Heh. It's kinda like, if the sun explodes, we'll only feel it 8 minutes later (excluding technical details of gravity and physics and sh*t). We can be sure Steve Jobs is meditating on this news for a little while today, but yeah, it's a long-term thing to see what really transpires.
post #78 of 147
Quote:
Originally posted by vinea
. . .
Agreed by whom? Hardware engineers? . . .

Well, the electronic hardware industry rags don't concern themselves too much with software. It's the sentiment of mostly embedded software developers, although these days embedded design is more common than ever, and it results in more revenue and more code than high-level software. Increasing globalization of the labor force, coupled with increased focus on embedded devices, is only going to fuel the development of a more efficient paradigm.

One of the analogies I remember reading compared the contemporary software paradigm to a "universal bolt." A lot of developers champion the idea of code re-use to a non-advantageous extent. Whereas a machine will include many types of bolts, each type selected for its suitability in a specific case, software development seems to favor using fewer total parts, but parts that are much more complex, much less reliable, and much less well suited to each individual task. Put simply, this approach has failed, and continues to be a risk for failure whenever it is used.

What's the next step? You don't seem to think there is a next step, which is probably a bad thing to assume since hardware is changing dramatically. At a certain point, there's only so much that can be done in a compiler: if software developers don't want to learn new paradigms, that's fine -- the business will just move to India, China, and Eastern Europe, where the developers are more persuadable.
Cat: the other white meat
post #79 of 147
Quote:
Originally posted by sunilraman
AMD and ATI will have fierce competition. They're hoping synergy will emerge from this merger/buyout. Intel Core, Core2, etc. is killer in the desktop and laptop space. AthlonX2 loses out to Conroe, and Xeon Woodcrests take on Opterons hard and fast. AMD's Turion for laptops has not been that hot. Well, figuratively, not literally. Intel makes chipsets. ATI is trying to compete in the chipset-for-motherboards space. nVidia also makes chipsets - nForce for PC is pretty good.

Then we look at GPUs, ATI and nVidia neck and neck, with Intel Integrated graphics picking up and dominating the low-end.

AMD-ATI has its work cut out for it. But indeed, maybe we will see some fantastic stuff down the line from the underdogs.

I'm an nVidia fanboi, and now an ex-AMD fanboi (a Conroe descendant will be in my next PC, I suspect...). Still, best wishes to AMD-ATI.

Intel is likely to hit the FSB wall with their quad-cores, and AMD's true quad-cores will smoke them; AMD is still a lot better with 4 CPUs than Intel is. HyperTransport-based co-processors and HTX cards may force Intel to start using HyperTransport.
post #80 of 147
Quote:
Originally posted by melgross
While I can agree with most of what you said, you left a couple of things out.

The most important is that Apple had a GOOD look at Intel's roadmap well before the deal was consummated. You can be sure of that.

Remember when Jobs was up on stage and talked about the performance/power situation? Many people were thinking, "What is he smoking, the Prescott, and the Xeons use so much power, and they are being killed by AMD, and IBM's G5 is pretty close, and uses less power?"

Going by that, even though Intel is the gorilla, the performance still sucked.

Now, we see otherwise. Apple knew what we didn't.

Apple isn't going to AMD. At this time, they would be fools to do so. And AMD is having many pricing problems, which are going to destroy their profits. Intel can afford it, but the still much smaller AMD may not be able to.

I agree with you wholeheartedly. I was not trying to imply that Intel did not have a good roadmap. But as you reiterated, Apple cannot afford any hiccups. The "Gorilla" does have the strength to keep prices low for longer than AMD could, should it come to that. But I do believe that should AMD keep up the pace they are on, look for them in Apple's future. Also, buying ATI gives them leverage in the price war. Apple buys from both: if together they are cheaper than the purchase of a chip from Intel and graphics cards from ATI or nVidia, then look out, Intel. Besides, they are now in a position to use both (down the road, I believe, not right now). This way they would not have to abandon Intel, just keep them on their toes.
-ReCompile-
"No matter where you go, There you are"
- Buckaroo Bonzai