Why, Apple, Why?! Evil ADC!

Posted in Current Mac Hardware
As mentioned in General Discussion, I got what I consider an absolute steal of a deal on a nice Quicksilver Mac.



I have had discussions with plenty of friends lately, both in real life and online, about how tired I am of waiting for Apple to get their act together. Apple is like a junior high school kid: big body, little brain, but with real flashes of intelligence and potential.



The Quicksilver I purchased has a GeForce2 MX card. I thought I could get by with that. Turns out I can't, so I wanted to "borrow" the Radeon 8500 I put into my wife's Sawtooth (formerly mine, until she stole it and wouldn't give it back).



Problem: the 8500 has a DVI connector, not ADC. A check on the Apple site shows a nice $100 converter. A check at OWC.com shows the Mac Radeon 9000 priced at over $160 with tax. For comparison, the PC version of the current Radeon 9000 (the Radeon 9200) is a whole $90 at my local PC club.
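
Just to put numbers on it, here is a throwaway sketch using the prices above (Python, purely illustrative; the option labels are mine):

```python
# Rough cost of each way to drive the ADC-only Apple LCD from this
# Quicksilver. Prices are the ones quoted above; nothing here is official.
options = {
    "borrow the wife's Radeon 8500 + Apple DVI-to-ADC adapter": 100,
    "buy the Mac Radeon 9000 from OWC (with tax)": 160,
    "PC Radeon 9200 at the PC club (comparison only; PC cards need Mac firmware)": 90,
}

# Print the options, cheapest first.
for name, cost in sorted(options.items(), key=lambda item: item[1]):
    print(f"${cost:>3}  {name}")
```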



The LCD is absolutely beautiful, and a deal considering Apple sells them at twice the cost of anyone else. But why does Apple refuse to grow up on this?! The monitor should have come with the DVI-to-ADC converter, since it is already so highly priced.



Why does Apple still have to gouge us after all these years?! I still remember the screams when they shipped machines with 2 MB of RAM alongside System 7, which realistically couldn't get by with anything less than 4. (Sort of like the machines they are shipping with 128 MB now, 12 years later.)



Is this the first sign of aged grumpiness? Is Apple really still refusing to get its act together? Or am I just getting too old for this crap?



Nick

Comments

  • Reply 1 of 49
    hmurchison Posts: 12,419 member
    No, not really. ADC isn't the best of Apple's ideas.
  • Reply 2 of 49
    dmband0026 Posts: 2,345 member
    Quote:

    Originally posted by hmurchison

    No, not really. ADC isn't the best of Apple's ideas.



    It's all in the name of simplicity, boys and girls. One cord carries power, video, and USB. That eliminates four or five cords and a USB hub (depending on how many USB devices you have). I already have enough cords running all over the place; I don't need four more. It would be a great idea if only there were more cards out with the connector, or if the cards that have it didn't cost so much more.
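
    Here's the cable math I'm talking about, as a quick sketch (the counts are just my guess at a typical desk, not a spec; Python, for illustration only):

    ```python
    # Cables behind a desk with one display and a USB hub for peripherals.
    # These counts are assumptions; your desk may differ.
    dvi_setup = ["DVI video", "display power", "USB hub uplink", "USB hub power"]
    adc_setup = ["one ADC cable (video + power + USB to the display's hub)"]

    saved = len(dvi_setup) - len(adc_setup)
    print(f"{saved} fewer cables with ADC, and no separate hub")  # -> 3 fewer
    ```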
  • Reply 3 of 49
    luca Posts: 3,833 member
    Quote:

    Originally posted by DMBand0026

    It's all in the name of simplicity, boys and girls. One cord carries power, video, and USB. That eliminates four or five cords and a USB hub (depending on how many USB devices you have). I already have enough cords running all over the place; I don't need four more. It would be a great idea if only there were more cards out with the connector, or if the cards that have it didn't cost so much more.



    It only eliminates one cord, the power cord. Unless, of course, you need the USB hub.



    By the way, Apple should have kept ADB and serial. There were a lot of serial modems that could draw power from the ADB port, so you didn't have to plug them into the wall. Now if you want an external modem, you have to get a USB modem and a power cord. Way to eliminate cable clutter, Apple.



    Seriously, sticking with ADC based entirely on the fact that it eliminates one cable is stupid. By using it, they're:



    1. Preventing the use of many graphics cards with their own displays

    2. Choosing to turn down sales to PC users who realize that Apple displays are not only beautiful but also excellent deals. If they used DVI, it would make a great Trojan horse. Nope, I guess Apple would prefer not to sell to PC users.

    3. Locking themselves into ADC forever. Think of it this way: a 20", 22", or 23" Cinema Display will still be a sweet display years after the PowerMac you bought it with is obsolete. You'll want to keep the display after selling the PowerMac, and you'll have to keep buying ADC-compatible graphics cards to use it.

    4. Making it nearly impossible for PowerMac owners to get a replacement power supply if they need one. Since ADC was introduced, Apple has used "special" power supply connectors to feed the monitor power through the graphics card. So if the power supply in your ADC-equipped G4 dies, you're locked into buying a G4-specific power supply instead of a standard ATX unit (which would probably be cheaper, quieter, AND provide more power).
  • Reply 4 of 49
    kenneth Posts: 832 member
    I kind of like the idea of ADC.



    It does eliminate some cables in most cases.
  • Reply 5 of 49
    jade Posts: 379 member
    Quote:

    Originally posted by Kenneth

    I kind of like the idea of ADC.



    It does eliminate some cables in most cases.




    Belkin's ADC-to-DVI adapter is half the price of Apple's.
  • Reply 6 of 49
    luca Posts: 3,833 member
    Quote:

    Originally posted by jade

    Belkin's ADC-to-DVI adapter is half the price of Apple's.



    You mean this?



    http://catalog.belkin.com/IWCatProdu...oduct_Id=92397



    Apple's $100 adapter allows you to connect an ADC display to a DVI port on a PowerBook or a graphics card. This $50 Belkin adapter does the opposite - it lets you connect a DVI display to an ADC port.
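
    To keep the two straight, here's a quick cheat sheet as a sketch (directions and prices as quoted in this thread):

    ```python
    # The two adapters solve opposite problems; don't mix them up.
    adapters = {
        "Apple DVI-to-ADC ($100)":
            "ADC display -> DVI port (includes a power supply for the display)",
        "Belkin ADC-to-DVI ($50)":
            "DVI display -> ADC port (the display brings its own power)",
    }

    for name, direction in adapters.items():
        print(f"{name}: {direction}")
    ```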
  • Reply 7 of 49
    jll Posts: 2,713 member
    Quote:

    Originally posted by trumptman

    The LCD is absolutely beautiful, and a deal considering Apple sells them at twice the cost of anyone else. But why does Apple refuse to grow up on this?! The monitor should have come with the DVI-to-ADC converter, since it is already so highly priced.



    The adapter is also the power supply, since the screen can't get power through DVI.



    And you can be 100% sure that the price of the display would be higher if they had to include a power supply with every monitor.
  • Reply 8 of 49
    jll Posts: 2,713 member
    Quote:

    Originally posted by Luca

    It only eliminates one cord, the power cord. Unless, of course, you need the USB hub.



    Which many people around me do.





    Quote:

    Originally posted by Luca

    By the way, Apple should have kept ADB and serial. There were a lot of serial modems that could draw power from the ADB port, so you didn't have to plug them into the wall. Now if you want an external modem, you have to get a USB modem and a power cord. Way to eliminate cable clutter, Apple.



    You complain that they're using ADC (an open standard created by IBM, by the way) AND complain that they dumped ADB.



    One serial interface is enough!





    Quote:

    Originally posted by Luca

    Seriously, sticking with ADC based entirely on the fact that it eliminates one cable is stupid. By using it, they're:



    1. Preventing the use of many graphics cards with their own displays




    If you aren't using ADC, the display needs its own power supply, and that same adapter/power supply is what lets you connect the display to a DVI port.



    Apple isn't preventing the use of DVI cards - they're saving users with ADC cards the need for a big external power supply (and the money it would cost).





    Quote:

    Originally posted by Luca

    2. Choosing to turn down sales to PC users who realize that Apple displays are not only beautiful but also excellent deals. If they used DVI, it would make a great Trojan horse.



    They do - with the optional power supply.





    Quote:

    Originally posted by Luca

    3. Locking themselves into ADC forever. Think of it this way: a 20", 22", or 23" Cinema Display will still be a sweet display years after the PowerMac you bought it with is obsolete. You'll want to keep the display after selling the PowerMac, and you'll have to keep buying ADC-compatible graphics cards to use it.



    No, you could buy the adapter/power supply.



    Look at it this way: the price of the 23" display is $2,099, but if you have a Mac with an ADC card, you save $100.
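
    Spelled out with the list prices above (a trivial sketch):

    ```python
    # Total cost of running the 23" Cinema HD, using the prices quoted above.
    display = 2099  # 23" display list price
    adapter = 100   # Apple DVI-to-ADC adapter (doubles as the power supply)

    print(f"Mac with ADC card: ${display}")            # no adapter needed
    print(f"DVI-only machine:  ${display + adapter}")  # $2199
    ```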
  • Reply 9 of 49
    luca Posts: 3,833 member
    Quote:

    Originally posted by JLL

    You complain that they're using ADC (an open standard created by IBM, by the way) AND complain that they dumped ADB.



    One serial interface is enough!




    I was joking. I was implying that Apple is clinging to standards that only they use. Yes, I am aware that ADC is an open standard. But guess what? No one uses it except Apple and a few peripheral companies that make devices for Macs. It was the same with ADB and Serial (Mac Serial was different from PC Serial).



    I don't think this would be a problem at all if ALL Mac-compatible graphics cards currently being made were ADC compatible. Unfortunately, ATI is a bitch and refuses to put ADC on many of their retail cards (8500, 9800). Funny... nVidia has put ADC on every one of their Mac-compatible graphics cards.



    When I think about it, ADC actually is very useful, and I can see why Apple likes it. But I think they need to make sure that ALL PowerMacs have it. They're doing a great job there, of course... every PowerMac since mid-2000 has come with an ADC graphics card. The next step is getting on ATI's ass and making them put ADC on all their Mac-compatible cards, so that upgrading to a new graphics card wouldn't have to mean buying a clunky $100 adapter. And as long as Macs also keep a DVI connector onboard, they'll stay totally compatible with any non-Apple monitor, preventing a repeat of the situation where Apple used DB-15 instead of VGA back in the beige era. Then the only shortfall would be that PC users would have to buy the adapter to use Apple displays, but I wonder how many of them would care even if there were DVI versions.
  • Reply 10 of 49
    gon Posts: 2,437 member
    IMHO ADC is a very bad idea. Apple has screwed itself enough with incompatibility.



    If you want to reduce the number of visible cables on the table and whatnot, by all means put power, DVI, and USB into one cable, but make the connectors at the end separate to retain compatibility! On most computers you never even see the connectors in everyday use. Personally, I don't have a single reason to prefer the one-size-fits-all connector.
  • Reply 11 of 49
    gon Posts: 2,437 member
    Just wanted to add:



    ADC keeps becoming a worse idea now that we have USB 2 and FireWire, displays keep growing bigger, and we're approaching the limits of DVI. There will be changes in a year or two, and I certainly hope the result is not 'ADC 2'.
  • Reply 12 of 49
    whisper Posts: 735 member
    Quote:

    Originally posted by Gon

    Just wanted to add:



    ADC keeps becoming a worse idea now that we have USB 2 and FireWire, displays keep growing bigger, and we're approaching the limits of DVI. There will be changes in a year or two, and I certainly hope the result is not 'ADC 2'.




    Why not? We'll need a 'DVI 2', so 'ADC 2' is perfectly logical.
  • Reply 13 of 49
    gon Posts: 2,437 member
    Quote:

    Originally posted by Whisper

    Why not? We'll need a 'DVI 2', so 'ADC 2' is perfectly logical.



    In my previous post, I addressed why they should scrap ADC and not do that again.



    ADC displays are geared toward a market that is maybe 0.5% of the total PC market, *and* a minority among Apple's own customers. Laptops are growing their share, which means ADC-compatible machines will become even more of a rarity. These facts alone should set alarm bells ringing in your head.



    Even if Apple were the market leader, they should still keep the connectors separate so their equipment interoperates seamlessly with everything else. When you actively suppress interoperability, you're digging your own grave for a short-term win. IBM found this out the hard way, and Microsoft is trying its best to be the next example. Unless Apple can get there first...



    As I said, personally I have not heard a single good reason for ADC to exist, and I doubt there is such a reason.
  • Reply 14 of 49
    whisper Posts: 735 member
    Quote:

    Originally posted by Gon

    In my previous post, I addressed why they should scrap ADC and not do that again.



    ADC displays are geared toward a market that is maybe 0.5% of the total PC market, *and* a minority among Apple's own customers. Laptops are growing their share, which means ADC-compatible machines will become even more of a rarity. These facts alone should set alarm bells ringing in your head.



    Even if Apple were the market leader, they should still keep the connectors separate so their equipment interoperates seamlessly with everything else. When you actively suppress interoperability, you're digging your own grave for a short-term win. IBM found this out the hard way, and Microsoft is trying its best to be the next example. Unless Apple can get there first...



    As I said, personally I have not heard a single good reason for ADC to exist, and I doubt there is such a reason.




    In your previous post you failed to convince me. There are two good reasons for ADC: less cable clutter and saving the cost of a power supply. All the Macs that have ADC come with an adapter that lets you use regular DVI or VGA monitors, so it's not like Apple is forcing you to use their system.
  • Reply 15 of 49
    gon Posts: 2,437 member
    Quote:

    Originally posted by Whisper

    In your previous post you failed to convince me. There are two good reasons for ADC: less cable clutter and saving the cost of a power supply. All the Macs that have ADC come with an adapter that lets you use regular DVI or VGA monitors, so it's not like Apple is forcing you to use their system.



    I gave a reason why the same reduction in cable clutter can be achieved without a special connector.



    Also, how do you "save" the cost of a power supply by choosing a different connector?
  • Reply 16 of 49
    jll Posts: 2,713 member
    Quote:

    Originally posted by Gon

    I gave a reason why the same reduction in cable clutter can be achieved without a special connector.



    Also, how do you "save" the cost of a power supply by choosing a different connector?




    ADC supplies power to the display - DVI doesn't.
  • Reply 17 of 49
    luca Posts: 3,833 member
    I remember a while back, before ADC, Apple put a female power port on the PowerMac power supplies right next to the male power port, so you could plug the monitor's power directly into the PowerMac instead of taking up two power outlets. They sold their PowerMacs with the proper power cable so you could plug any monitor into the PowerMac.



    How about a cable that splits at one end?







    The display has the USB (or USB 2, or FireWire, or both) connector(s), the DVI, and the power all lined up. You plug that end of the cable into the display, or the cable could simply be built in. Then the other end splits: you plug the USB into a USB port, the FireWire into a FireWire port, and the power into an outlet. The power lead would split off earlier so it can reach an outlet, while the USB and DVI split close to the end to reduce cable clutter. It's not QUITE as tidy as ADC, but it's a hell of a lot more compatible.
  • Reply 18 of 49
    jimzip Posts: 446 member
    My friends laughed at me when they saw me using a converter for my Sony monitor at a LAN party I hosted...

    Aw...



    Jimzip
  • Reply 19 of 49
    jll Posts: 2,713 member
    Quote:

    Originally posted by Luca

    I remember a while back, before ADC, Apple put a female power port on the PowerMac power supplies right next to the male power port, so you could plug the monitor's power directly into the PowerMac instead of taking up two power outlets. They sold their PowerMacs with the proper power cable so you could plug any monitor into the PowerMac.



    And those monitors accept 120/240 V and have their own power adapter built in - that's what you save by using ADC (you're using the computer's power supply to convert 120/240 V to whatever the monitor needs).
  • Reply 20 of 49
    luca Posts: 3,833 member
    Quote:

    Originally posted by JLL

    And those monitors accept 120/240 V and have their own power adapter built in - that's what you save by using ADC (you're using the computer's power supply to convert 120/240 V to whatever the monitor needs).



    What? I don't understand.