
AMD to acquire graphics chip giant ATI - Page 3

post #81 of 147
Quote:
Originally posted by Joe_the_dragon
Intel is likely to hit the FSB wall with their quad-cores, and AMD's true quad-cores will smoke them; AMD is still a lot better with 4 CPUs than Intel is. HyperTransport-based co-processors and HTX cards may force Intel to start using HyperTransport.

They have something like HyperTransport in the works.
post #82 of 147
Quote:
Originally posted by Splinemodel
Well, the electronic hardware industry rags don't concern themselves too much with software. It's the sentiment of mostly embedded software developers, although these days embedded design is more common than ever, and it results in more revenue and more code than high-level software.

OO in the embedded environment is often a poor fit. To say that OO has failed because it is a poor tool for a specific purpose is as blind as saying Java sucks for everything.

As for "more revenue and more code," I would ask for a reference, as embedded is still a niche arena as near as I can tell. One with high growth, perhaps, but the software world is vast and highly profitable.

Oh, and a source that ISN'T an embedded industry trade rag. Every industry self-inflates its market, and the obvious examples of this self-deception were the telecom industry (whose bubble burst) and the internet industry (dot bomb).

Quote:
Increasing globalization of the labor force, coupled with increased focus on embedded devices, is only going to fuel the development of a more efficient paradigm.

A more efficient paradigm for small embedded programs that solve smaller problems on embedded devices, perhaps. Not more efficient for large projects. You aren't going to run your core services from an embedded device.

Globalization is an issue only from the perspective of industrial/technical dominance rather than offshoring. Having worked with Wipro and others, offshoring can have good ROI but must be managed carefully, as many companies have realized. And the benefits are not as large as one might hope (i.e., it's no silver bullet for the developer shortage in the US).

It is when, say, the Indian or Chinese software industry dominates the software world that the US software industry will be in danger of collapse (like, say, the US auto industry). Only when there is an Apple or Microsoft equivalent from a foreign nation that dominates software sales will we be in big trouble.

This isn't to say it can't happen, but it hasn't yet. Japan was going to be our doom in the late 80s with its Six Sigma software quality and software factories that churned out millions of lines of code. Didn't happen.

Whether the US has a cultural advantage I don't know. So far we've done well.

Quote:
One of the analogies I remember reading compared the contemporary software paradigm to a "universal bolt." A lot of developers champion the idea of code re-use to a non-advantageous extent. Whereas a machine will include many types of bolts, each type selected for its suitability in a specific case, software development seems to favor using fewer total parts, but parts that are much more complex, much less reliable, and much less well suited to each individual task. Put simply, this approach has failed, and continues to be a risk for failure whenever it is used.

Strawman. You can't read marketing literature (whether it's from a vendor, a methodology guru, or a process maven) that declares a certain technology or technique a silver bullet and then state that all software engineering is a failure because said technology/technique turns out not to be a silver bullet.

Any technique that claims a 25%+ productivity improvement (cough...CMM...cough) is probably statistical BS. OO, for example, measures out to around a 6% improvement over SASD, and only for certain problem sets.

Does that make OO a failure? No. 6% beats 0%.

The contemporary software paradigm does NOT describe a universal bolt, and hasn't since...mmm...the mid 90s, or whenever "software reuse" stopped being the buzzword of the day. Component-based software development does advocate using pre-existing components to build a software system, but largely no one uses (or advocates) large "universal" components anymore; the building blocks are smaller things like UI widgets and network stacks, as you might find in .NET, DirectX, etc.

Little COTS vs. big COTS: the industry figured out by the mid 90s that big COTS integration typically works poorly because of interface complexity. I would say that large-component CBSD died in the late 90s/early 00s, if not earlier. Hard to say, but it sure isn't a hot topic anymore.

The biggest takeaway from all this is that the fundamental concepts of software development still apply. Examples include coupling and cohesion as indicators of quality and efficiency in development (a sketch of what they look like in code follows below). Ignore these fundamentals at your own risk. They exist. Software isn't totally a voodoo science. 90% perhaps.
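Since coupling and cohesion get thrown around a lot, here is a minimal Java sketch of the idea (all names are hypothetical, made up purely for illustration): each class sticks to one job, and the two sides touch only through a narrow interface, so either can be swapped out without breaking the other.

Code:
// Hypothetical sketch of high cohesion and low coupling; names are made up.
interface TemperatureSource {
    double readCelsius();                // the narrow seam between the modules
}

// Cohesive: this class deals only with sensor I/O.
class Thermocouple implements TemperatureSource {
    public double readCelsius() {
        return 21.5;                     // stand-in for real hardware access
    }
}

// Cohesive: this class holds only control logic.
// Loosely coupled: it depends on the interface, not on Thermocouple.
class ThermostatController {
    private final TemperatureSource source;

    ThermostatController(TemperatureSource source) {
        this.source = source;
    }

    boolean heaterShouldRun(double setPointCelsius) {
        return source.readCelsius() < setPointCelsius;
    }
}

public class CouplingDemo {
    public static void main(String[] args) {
        ThermostatController c = new ThermostatController(new Thermocouple());
        System.out.println("heater on: " + c.heaterShouldRun(22.0));
    }
}

Swap Thermocouple out for a simulator and ThermostatController never notices; that's the whole point.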

Quote:
What's the next step? You don't seem to think there is a next step, which is probably a bad thing to assume since hardware is changing dramatically.

Of course there's a next step. There's always a next step. However, as with OO, a parallel paradigm is not a silver bullet, nor can it ignore the fundamentals that seem to have survived (although often renamed) since the inception of software engineering.

For certain problem domains I would expect a parallel-processing-based paradigm to net...oh, around a 6% improvement over OO or procedural methods. A double-digit improvement would be a fantastic rarity. Wonderful if we can get it, but I won't be drinking the Kool-Aid till I see the studies that show such "silver bullet" improvements.

Quote:
At a certain point, there's only so much that can be done in a compiler: if software developers don't want to learn new paradigms, that's fine -- the business will just move to India, China, and Eastern Europe, where the developers there are more persuadable.

If you are a software developer that has stopped learning, you are management.

However, I call BS on the idea that the entire industry is going to move to a paradigm best suited for embedded, highly parallel applications that need to be aware of the underlying hardware architecture.

Parallel architectures come and go. It's not like the Cell is the first such incarnation. They have their place, solving certain problems more efficiently than other techniques. Likewise, monolithic architectures have their place, solving other problems more efficiently than other techniques.

Anyone that pooh-poohs one or the other is self-limiting their toolbox.

Most problem sets will be solvable using "low-performance software" even on embedded devices. Frankly, what you call an embedded device and what I might call an embedded device are probably an order of magnitude apart in compute power and storage capability.

Java works fine in the embedded realm today because embedded environments are nowhere near as constrained as when I was an RTOS developer. More and more solutions can be built on embedded Java, which is a wonderfully easier development environment than traditional RT development environments or your proposed low-level, Verilog-like environment. That said, the simulation and test capabilities in some of the VHDL environments would be nice to have in a more consistent manner in traditional environments.

Of course, that is addressed in Agile methods that emphasize test frameworks (see the test sketch below).
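For anyone who hasn't seen one, here is a minimal sketch of the kind of automated unit test those frameworks emphasize, written in Java against JUnit 4 (the ring buffer under test is hypothetical, just the sort of structure you'd find in RT code):

Code:
// Minimal JUnit 4 sketch; the RingBuffer class is made up for illustration.
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class RingBufferTest {

    // Trivial fixed-size ring buffer that overwrites the oldest entry when full.
    static class RingBuffer {
        private final int[] data;
        private int head, size;

        RingBuffer(int capacity) { data = new int[capacity]; }

        void put(int v) {
            data[(head + size) % data.length] = v;
            if (size < data.length) size++;
            else head = (head + 1) % data.length;   // full: drop the oldest
        }

        int get() {                                 // caller checks size() first
            int v = data[head];
            head = (head + 1) % data.length;
            size--;
            return v;
        }

        int size() { return size; }
    }

    @Test
    public void overwritesOldestWhenFull() {
        RingBuffer rb = new RingBuffer(2);
        rb.put(1); rb.put(2); rb.put(3);            // 1 gets overwritten
        assertEquals(2, rb.get());
        assertEquals(3, rb.get());
        assertEquals(0, rb.size());
    }
}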

Vinea
post #83 of 147
Quote:
Originally posted by sunilraman
If you (generic you, not you vinea specifically) have been involved with Win95, 98/Me, 2000, XP, Office, and ShiteOnTheVista, you need to put a paper bag over your head.

Eh, it was Windows that drove the PC revolution...not Unix, not MacOS, not OS 9, not AmigaDOS, not GEM, not CP/M, not OS/360, not VMS, etc.

All the commodity hardware you see today, and where we are, is due in large part to Microsoft (from the software perspective).

I haven't seen recent statistics, but when Capers Jones evaluated MS software practices and defect rates, he found they were in the top 5% of all software developers (mmm...thinking mid 90s).

So odds are that 95% of all coders talking smack about how MS writes shitty code generate software with higher defect rates and work for a company with poorer software practices.

Vinea
post #84 of 147
Quote:
Originally posted by ReCompile
I agree with you whole-heartedly. I was not trying to imply that Intel did not have a good roadmap. But as you reiterated, Apple cannot afford any hiccups. The "Gorilla" does have the strength to keep prices low for longer than AMD could, should it come to that. But I do believe that should AMD keep up the pace they are on, look for them in Apple's future. Also, buying ATI gives them leverage in the price war. Apple buys both: if together they are cheaper than the purchase of a chip from Intel and graphics cards from ATI or Nvidia, then look out, Intel. Besides, they are now in a position to use both (down the road, I believe, not right now). This way they would not have to abandon Intel, just keep them on their toes.

The deal that AMD just made for ATI might break their back instead of reinforcing it. The market has taken a long, very unhappy look at it today. AMD's price has continued downwards because of it.

AMD just borrowed $2.5 billion to do this deal. They just lowered the prices on all of their CPUs by almost 50% to match Intel's price/performance ratio, and most in the industry say that it wasn't enough. Bye-bye, profits. ATI's profits have been even lower than the erratic ones AMD has put on the board, and AMD will have to divest themselves of more businesses to help this deal. The debt load that AMD, not a huge company, will have to carry because of this could hurt their R&D down the road. It certainly will constrain any moves they need to make in the near future.

While the concept seems good, history has shown that a company buying another large (in relation to its own size) company has issues that can take a year or two, if ever, to straighten out. During that time, the acquirer is distracted from its main business goals. This could cause AMD to flounder over the next year or so.

This is what happened to Compaq after they bought DEC, and it happened to HP after they acquired Compaq.
post #85 of 147
Quote:
Originally posted by melgross
The deal that AMD just made for ATI might break their back instead of reinforcing it. The market has taken a long, very unhappy look at it today. AMD's price has continued downwards because of it.

AMD just borrowed $2.5 billion to do this deal. They just lowered the prices on all of their CPUs by almost 50% to match Intel's price/performance ratio, and most in the industry say that it wasn't enough. Bye-bye, profits. ATI's profits have been even lower than the erratic ones AMD has put on the board, and AMD will have to divest themselves of more businesses to help this deal. The debt load that AMD, not a huge company, will have to carry because of this could hurt their R&D down the road. It certainly will constrain any moves they need to make in the near future.

While the concept seems good, history has shown that a company buying another large (in relation to its own size) company has issues that can take a year or two, if ever, to straighten out. During that time, the acquirer is distracted from its main business goals. This could cause AMD to flounder over the next year or so.

This is what happened to Compaq after they bought DEC, and it happened to HP after they acquired Compaq.

Actually, AMD stayed on a positive upward trend all day, and ended .41 in the plus. But I do agree that the transition will be tough, as any assimilation of that size is bound to be. I do not think that it will break AMD's back; on the contrary, combining a major player in the all-essential graphics chip market with a major player in the CPU market spells opportunity in a big, big way. Although we are still human, and have the all-giving gift to screw just about anything up. 8)
post #86 of 147
What might be the impact on Dell?

post #87 of 147
Quote:
Originally posted by ReCompile
Actually, AMD stayed on a positive upward trend all day, and ended .41 in the plus. But I do agree that the transition will be tough, as any assimilation of that size is bound to be. I do not think that it will break AMD's back; on the contrary, combining a major player in the all-essential graphics chip market with a major player in the CPU market spells opportunity in a big, big way. Although we are still human, and have the all-giving gift to screw just about anything up. 8)


You should have read the articles in the Times and the WSJ Tuesday. I tend to agree with them.

The market was up today.
post #88 of 147
Quote:
Originally posted by melgross
The market was up today.

You just said it was down.
post #89 of 147
Quote:
Originally posted by Gene Clean
You just said it was down.

No, I didn't, Gene. I said that when the announcement of the deal was made, AMD's stock went down.

I haven't seen you around for a while. Where have you been hiding?
post #90 of 147
Quote:
Originally posted by meelash
I'm thinking it will mean just that. Check this news out for example: Intel pulls ATI (AMD)'s Chip license.

That's hilarious. I wonder if AMD bought ATI just to piss Intel off.
post #91 of 147
Quote:
Originally posted by slughead
That's hilarious. I wonder if AMD bought ATI just to piss Intel off.

That would probably be the dumbest reason to take out a US$2.5B loan.
post #92 of 147
Quote:
Originally posted by slughead
That's hilarious. I wonder if AMD bought ATI just to piss Intel off.

I don't think Intel really cares. Nvidia is a far greater company than ATI on every level. I've NEVER understood why Apple chooses to go with ATI over Nvidia.
post #93 of 147
Just off the top of my head:

1) ATI gives Apple better access to driver specifications, allowing Apple to do the optimization themselves.
2) ATI makes far less power-hungry laptop GPUs.
post #94 of 147
Quote:
Originally posted by Chucker
Just off the top of my head:

1) ATI gives Apple better access to driver specifications, allowing Apple to do the optimization themselves.
2) ATI makes far less power-hungry laptop GPUs.

Yes, 1) is valid, but the better cards are still better cards. You can't compare an ATI 9800 to an Nvidia 6800. BTW, can you prove that is true (Apple getting better access)?

The problem I see with Nvidia is that Nvidia doesn't sell cards. ATI does. So I believe Apple has to find a company to manufacture the Nvidia cards.

And 2), why do they insist on using them in the Mac Pros? (They finally switched to Nvidia's this last go-round.)
post #95 of 147
Quote:
Originally posted by emig647
Yes, 1) is valid, but the better cards are still better cards. You can't compare an ATI 9800 to an Nvidia 6800. BTW, can you prove that is true (Apple getting better access)?

The problem I see with Nvidia is that Nvidia doesn't sell cards. ATI does. So I believe Apple has to find a company to manufacture the Nvidia cards.

And 2), why do they insist on using them in the Mac Pros? (They finally switched to Nvidia's this last go-round.)

Isn't the 6800 a lot newer than the 9800?

The PowerMac switched to the nVidia series eight months ago.
post #96 of 147
Quote:
Originally posted by emig647
You can't compare an ATI 9800 to an Nvidia 6800.

No, but you CAN compare an ATI 9800 to an NVIDIA GeForce4 Ti, which launched about 4 months earlier.

Let's just say, in the face of the 9800, the GeForce4 was TERRIBLE. It didn't even have DX9 support.

The 6800 came out 2 years later than the 9800.

Quote:
BTW, can you prove that is true (Apple getting better access)?

This is a forum, not a research paper. Google it yourself.

It's a well-established fact that ATI gives Apple the source code for their drivers, while NVIDIA ports the drivers themselves.
post #97 of 147
Uh, the MX series are low-end models. The *800 ones are high-end models. So, no, that's not a fair comparison.

The 4 MX was also essentially a rebadged 2 MX.
post #98 of 147
Quote:
Originally posted by Chucker
Uh, the MX series are low-end models. The *800 ones are high-end models. So, no, that's not a fair comparison.

The 4 MX was also essentially a rebadged 2 MX.

Haha, I just quickly Wikipedia'd a card released around that time. I didn't follow graphics cards until I actually needed to program for them.
post #99 of 147
Ya, 4 Ti is a fairer match.
post #100 of 147
No, in the PC world the 5900 XT (256-bit) came out about the same time the 9800 (256-bit) came out. The 5900 XT didn't perform well. But about 6 months after the 9800 came out is when the 6800 (256-bit) series from Nvidia came out.

The GeForce4 MX ranged from 64-bit to 128-bit. Chucker is right... it was a remade GeForce2 MX. That is like comparing an ATI 9200 to an ATI 9800, skipping over the 9600 completely.

The Mac was extremely deprived of graphics cards at this time. I don't even know why we're talking about old history. Look at today's Power Macs: Nvidia across the screen. But if you compare Nvidia to ATI since the 6800 came out, you'll see Nvidia smokes them in every test for every comparable card. You can do the 6800 vs. the X800, the 6600 vs. the 9600, the 7600 vs. the X1600, the 7800 vs. the X1800, the 7800 GT (or GTX) vs. the X1850 XT....

I have it out for ATI because their drivers are complete trash. They are 2 binary digits short of spyware!
post #101 of 147
Quote:
Originally posted by gregmightdothat
This is a forum, not a research paper. Google it yourself.

It's a well-established fact that ATI gives Apple the source code for their drivers, while NVIDIA ports the drivers themselves.

Apparently you don't get the point of insider forums.

Most of the time people call bullshit, so you show proof. It's the nature of the boards. If you don't like being asked to prove something... gtfo.
post #102 of 147
Quote:
Originally posted by emig647
Apparently you don't get the point of insider forums.

Most of the time people call bullshit, so you show proof. It's the nature of the boards. If you don't like being asked to prove something... gtfo.

It depends upon the nature of what you're bullshitting. This is a well-established fact.

This would be like me saying the G4 is pretty much a G3 but with a vector unit called AltiVec, and you saying, "OMG, get out of here." Well-established fact. I'm not going to scour Motorola's (well, now Freescale's) website for it.

Honestly, I don't even know where to look for the ATI/NVIDIA driver thing. I think I read it on lists.apple.com? I dunno.
post #103 of 147
Quote:
Originally posted by emig647
They are 2 binary digits short of spyware!

Who can argue with that?
post #104 of 147
Quote:
Originally posted by emig647
I have it out for ATI because their drivers are complete trash. They are 2 binary digits short of spyware!

Frankly, I have never seen any ATI-made driver collect or transmit information, so I wonder where you are getting that. I've never had any stability problem with any ATI product either. I do wonder if you are trying to commit an act of slander and libel.
post #105 of 147
All I wanted to know was the specifics, because I've heard multiple stories. This has been hashed out on these forums a few times, and the facts have never been consistent. I've heard Apple builds the drivers WITH ATI. I've heard ATI builds the drivers FOR Apple. I've heard ATI gives the source to Apple for them to figure out on their own. I've heard 3rd parties are hired to develop them.

All I was looking for was some real information. You shouldn't get hurt or mad when someone asks you to prove something. I've been asked to prove things on these boards dozens of times that I assumed others knew. I have a lot more posts than you and have been on for less time. What this COULD mean is that you aren't on these boards as much... which is fine, I'm not calling you a newb... but the point is that you, or anyone else for that matter, may not be up to speed on events as one may assume. It's the nature of forums: you are asked to prove your bullshit every time you post.
post #106 of 147
Quote:
Originally posted by JeffDM
Frankly, I have never seen any ATI-made driver collect or transmit information, so I wonder where you are getting that. I've never had any stability problem with any ATI product either. I do wonder if you are trying to commit an act of slander and libel.

Are you talking about the PC or the Mac?

On the PC, when you first install the Catalyst drivers, there is a packet that is sent out. But the saying... "2 binary digits short of spyware"... was meant to say it isn't spyware but COULD be.

What I was saying (and other DIY builders will agree with me) is that ATI's drivers/software are bloatware. They involve so much more software than Nvidia's drivers do.

I'm assuming you have used both... compare the two. Nvidia is a driver with settings done with Win32, whereas ATI has this slow, Flash-like application that consumes the majority of the screen. They install 5+ apps (depending on the version) that could easily be merged into one.

Don't even get me started on Linux. You go to Nvidia's website and install the drivers. No messing with depmod or anything of the sort. But when it comes to ATI drivers, you are almost always required to rebuild the kernel to get them to work.
post #107 of 147
Quote:
Originally posted by emig647
I'm assuming you have used both... compare the two. Nvidia is a driver with settings done with Win32, whereas ATI has this slow, Flash-like application that consumes the majority of the screen. They install 5+ apps (depending on the version) that could easily be merged into one.

You can just install the driver without having to run Catalyst. It is plainly available for download from the ATI site. The Catalyst software does install non-driver stuff, a video player, DVD software and other things, but nothing horrible, just nothing that I need.

I never let any program run or transmit without my deliberate approval, and I don't remember anything being transmitted during installation or operation.

And this is for my Windows computers.
post #108 of 147
Quote:
Originally posted by gregmightdothat
Who can argue with that?

I would. That's nonsense. That IS something you would have to prove.

There is a lot of trash talk about ATI's drivers, but I've never had any more problems with them than with Nvidia's.

The truth is that end users of software are very different from the programming community.

I haven't programmed for over ten years now, so I'm not up on all of the little arguments I see here, and other places. I keep out of the real techie stuff in that area for that reason.

But, I rarely care what a programmer has to say about software. I've seen good programming practices, and bad ones. What matters in the end is whether it works properly, or not. ATI's drivers have been fine.

And when comparing boards, use proper comparisons. The latest boards against the latest boards, model to model.

The one thing that's great about the PC board market is that, with the many manufacturers out there, they compete: they tweak their boards, add more memory, put better coolers on, overclock to different extents, etc.

We have none of that here. When some say that we have enough choice, it's BS. We have NO choice.

And, while some Nvidia boards are better, some ATI boards are better also. But, we'll never know, because we don't have the ability to find out.
post #109 of 147
Quote:
Originally posted by emig647
Are you talking about the PC or the Mac?

On the PC, when you first install the Catalyst drivers, there is a packet that is sent out. But the saying... "2 binary digits short of spyware"... was meant to say it isn't spyware but COULD be.

What I was saying (and other DIY builders will agree with me) is that ATI's drivers/software are bloatware. They involve so much more software than Nvidia's drivers do.

I'm assuming you have used both... compare the two. Nvidia is a driver with settings done with Win32, whereas ATI has this slow, Flash-like application that consumes the majority of the screen. They install 5+ apps (depending on the version) that could easily be merged into one.

Don't even get me started on Linux. You go to Nvidia's website and install the drivers. No messing with depmod or anything of the sort. But when it comes to ATI drivers, you are almost always required to rebuild the kernel to get them to work.

For this board, whatever MAY be true for PC and Linux drivers is totally irrelevant.

Compare Mac drivers; nothing else matters here. Even when running Boot Camp.
post #110 of 147
Quote:
Originally posted by melgross
For this board, whatever MAY be true for PC and Linux drivers is totally irrelevant.

Compare Mac drivers; nothing else matters here. Even when running Boot Camp.

It may matter, though.

If Nvidia (or ATI) develops one set of drivers for OS X like they have for Windows, Linux, and Solaris... we'll be able to use ANY aftermarket Nvidia board. I could walk into CompUSA, pick up an EVGA 7950 GTX, slap it in the Mac Pro (assuming it has SLI support), and away I go.

There is nothing stopping Nvidia from doing this. Unless Apple puts up a block... but why would they? It would only look bad on Apple's part. Our graphics card options will all of a sudden change when the Mac Pro comes out.

This is why it is important to look at what is available in the PC world. Mac boards will be PC boards. No more big-endian vs. little-endian stuff (a small sketch below shows what that difference actually means). For the most part the internal GPU code will be the same. The interfacing with OS X will obviously be different... but I foresee getting any graphics card we desire from ATI & Nvidia in the future.
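For anyone wondering what the endian fuss was about, here is a small illustrative Java sketch (class name made up): the same four bytes decode to different 32-bit values depending on whether they are read PPC-style (big-endian) or x86-style (little-endian), which is exactly the kind of mismatch cross-platform driver and firmware code had to account for.

Code:
// Illustrative only: one byte sequence, two interpretations.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        byte[] raw = {0x12, 0x34, 0x56, 0x78};  // e.g. a value in a ROM image

        int big = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN).getInt();
        int little = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN).getInt();

        System.out.printf("big-endian:    0x%08X%n", big);     // 0x12345678
        System.out.printf("little-endian: 0x%08X%n", little);  // 0x78563412
    }
}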
post #111 of 147
Quote:
Originally posted by gregmightdothat
Who can argue with that?

It was a joke, ffs. It came from a Lewis Black joke about the people who wrote the Bible being 2 hairs short of a baboon.

The point being, the ATI software installed with the drivers is bloatware. I've never noticed a driver-only install option if you have the Catalyst CD. ATI's software is a lot of overhead that could be dramatically simplified.
post #112 of 147
Quote:
Originally posted by emig647
It may matter, though.

If Nvidia (or ATI) develops one set of drivers for OS X like they have for Windows, Linux, and Solaris... we'll be able to use ANY aftermarket Nvidia board. I could walk into CompUSA, pick up an EVGA 7950 GTX, slap it in the Mac Pro (assuming it has SLI support), and away I go.

There is nothing stopping Nvidia from doing this. Unless Apple puts up a block... but why would they? It would only look bad on Apple's part. Our graphics card options will all of a sudden change when the Mac Pro comes out.

This is why it is important to look at what is available in the PC world. Mac boards will be PC boards. No more big-endian vs. little-endian stuff. For the most part the internal GPU code will be the same. The interfacing with OS X will obviously be different... but I foresee getting any graphics card we desire from ATI & Nvidia in the future.

Too many "ifs". I don't work with "if". There is no guarantee that Nvidia will do anything. there are very few Mac's with the capability of taking a video card. If Apple can't figure out a way to sell a ton of Towers, there won't ever be. At one time, Apple had most of their lineup that fit some video card. Even the Cube could. Not today.

ATI proved that with just a bit of a firmware tweak, a card could work in either a Mac or a PC. But they chose to have only one card that could do that, and not a very strong card at that. Nvidia has never shown ANY interest in doing that, so I have to give ATI some credit for doing it at all.

If anyone will do this, therefore, it would be ATI, going on their record over the years. But, don't hold your breath.
post #113 of 147
Quote:
Originally posted by melgross
Too many "ifs". I don't work with "if". There is no guarantee that Nvidia will do anything. there are very few Mac's with the capability of taking a video card. If Apple can't figure out a way to sell a ton of Towers, there won't ever be. At one time, Apple had most of their lineup that fit some video card. Even the Cube could. Not today.

Not going to bite on that one. There will be many more Mac Pros than Solaris boxes within a year's time. And Nvidia makes drivers for Solaris boxes (they keep updating them for the Quadro cards, too). The whole point about the new PC cards is that you wouldn't have to flash the firmware. You would be able to take a PC graphics card, put it in your Mac, and run it with the correct drivers! It's the same as running the same graphics card in FreeBSD, Linux, Windows, or Solaris without a flash. For Nvidia, a GPU sale is a GPU sale. If they can gain 20k more GPU sales... I think that is worth writing some OS X drivers. In fact, I guarantee Apple will be using Nvidia cards with the Mac Pro. So the code will already be written. Who gets it is the question.

Quote:
ATI proved that with just a bit of a firmware tweak, a card could work in either a Mac or a PC. But they chose to have only one card that could do that, and not a very strong card at that. Nvidia has never shown ANY interest in doing that, so I have to give ATI some credit for doing it at all.

If anyone will do this, therefore, it would be ATI, going on their record over the years. But, don't hold your breath.

Actually, you're wrong about that. Nvidia wasn't even the first company to have a flashable graphics card; 3dfx was the FIRST to do that, with the Voodoo4. I had one. I did it. It worked. Then Nvidia bought 3dfx, and then did the SAME with the GeForce 2 MX, the GeForce 3, and the GeForce 3 MX.

Here are some articles if you don't believe me:

Flash Geforce 2mx

Flash geforce 3

ATI did it a bit later with the 8500. (I couldn't find the article on it, but I know they did it because my friend did it with his Radeon 8500.)
post #114 of 147
Quote:
Originally posted by emig647
Not going to bite on that one. There will be many more Mac Pros than Solaris boxes within a year's time. And Nvidia makes drivers for Solaris boxes (they keep updating them for the Quadro cards, too). The whole point about the new PC cards is that you wouldn't have to flash the firmware. You would be able to take a PC graphics card, put it in your Mac, and run it with the correct drivers! It's the same as running the same graphics card in FreeBSD, Linux, Windows, or Solaris without a flash. For Nvidia, a GPU sale is a GPU sale. If they can gain 20k more GPU sales... I think that is worth writing some OS X drivers. In fact, I guarantee Apple will be using Nvidia cards with the Mac Pro. So the code will already be written. Who gets it is the question.

Actually, you're wrong about that. Nvidia wasn't even the first company to have a flashable graphics card; 3dfx was the FIRST to do that, with the Voodoo4. I had one. I did it. It worked. Then Nvidia bought 3dfx, and then did the SAME with the GeForce 2 MX, the GeForce 3, and the GeForce 3 MX.

Here are some articles if you don't believe me:

Flash Geforce 2mx

Flash geforce 3

ATI did it a bit later with the 8500. (I couldn't find the article on it, but I know they did it because my friend did it with his Radeon 8500.)

I can't speak about Sun boxes, but I doubt that in the industrial world Apple has the same standing or market power. There are likely far more Sun users that will buy cards than Mac users.

But you're talking about flashable cards. I'm not. There is an ATI card in the 9000 series, though I don't remember which one (I'm certain someone here will), that comes out of the box ready to run on either a Mac (PPC) or a PC.

Flashable cards are nothing new. I've flashed ATI cards, for others, for years.

The average customer isn't interested in flashable cards. They want a card, and they want it to work. That's it.

We are not the average consumer, and we shouldn't forget that. Companies won't flash cards either. If it's not factory standard, it has a small possible market. And, the Mac market is small enough already.
post #115 of 147
Quote:
Originally posted by melgross
I can't speak about Sun boxes, but I doubt that in the industrial world Apple has the same standing or market power. There are likely far more Sun users that will buy cards than Mac users.

But you're talking about flashable cards. I'm not. There is an ATI card in the 9000 series, though I don't remember which one (I'm certain someone here will), that comes out of the box ready to run on either a Mac (PPC) or a PC.

I'm pretty confident there wasn't such an animal. I've kept up on all of that... especially through the 9000 series. Nothing ATI shipped in a box would work in both. At least, it never said so on the box. They specifically had to release cards labeled as Mac all through the 9000 series.
post #116 of 147
Quote:
Originally posted by emig647
I'm pretty confident there wasn't such an animal. I've kept up on all of that... especially through the 9000 series. Nothing ATI shipped in a box would work in both. At least, it never said so on the box. They specifically had to release cards labeled as Mac all through the 9000 series.

You're wrong, I'm afraid...
post #117 of 147
Wow!!

I have never seen this on the market. That is pretty cool.
post #118 of 147
Quote:
Originally posted by Amorya
You're wrong, I'm afraid...

Thank you!
post #119 of 147
Perhaps that also paves the way for BIOS + EFI cards. It would be great for Apple and Mac users, and it wouldn't be that hard to convince companies, as they'll have to make that switch eventually; EFI is the next thing.
post #120 of 147
Quote:
Originally posted by emig647
Wow!!

I have never seen this on the market. That is pretty cool.

This is exactly why I have NEVER believed any of the theories about PPC vs. x86, or any such nonsense. We don't get the cards because we are part of a very small market, a market that is much too small for anyone to care about.

With ATI doing that with one card, which worked very well from what I remember reading, they could have done it for all of their cards. I asked one of the top people at ATI about that. I was told that if the card sold well to Mac people (they can tell by the warranty info people send in), then they would be happy to consider doing it for other cards. It was a sales test.

Unfortunately, we failed the test. No other cards like that were produced.

Nvidia, and its producers, weren't even interested enough to try.