Latest G5s = Short EOL?

Posted in Future Apple Hardware, edited January 2014
With the latest PowerMac 'upgrades', might we have just witnessed the birth of another "PowerMac VX" and/or "B&W G3"? Both computers were kind of the 'last of their line' and/or very short-lived.



Why, you ask?



Well, these G5 v2 upgrades just SMELL FUNNY.



1 - Only one new CPU speed introduced (2.5 GHz); all the others are old news.



2 - No appreciable motherboard changes at all, and in the case of the 1.8 a downgrade from PCI-X to PCI.



3 - No memory upgrades at all, and in the case of the 1.8 a downgrade.



4 - No hard drive changes at all, and in the case of the 1.8 and 2.0 a downgrade.



5 - *** Not even a stitch of a change in video cards. In fact, the lack of video card updates was pointed out by most everyone right off the bat. A huge letdown to most, second only to not making the change to PCI Express.



6 - *** No display updates - still mismatched holdovers from the G4 days. Quite un-Apple-like if you ask most people.



Okay, so with what we have above and the ThinkSecret report about new displays in the works, it got me thinking.



1 - The new displays are said to be DVI instead of ADC - power usage being the main reason for dropping ADC.



2 - Apple had to push for or build their own card designs that supported ADC, since they are (for the most part) the only company that has adopted the Apple-designed standard.



3 - Designing special video cards that support ADC can't be 'cheap' to do, and unless you know you're gonna build a lot of them it just wouldn't be cost-effective.



So could it be that the "NEW" PowerMac line is still stuck with OLD video card options because it didn't make sense for Apple to redesign 'more modern' cards with ADC, knowing full well:



A - The 'new' hardware is short-lived

B - The 'current' displays are at their end of life

C - New PowerMacs and displays that are DVI-based will QUICKLY follow, and then Apple can utilize cards and/or card designs that are already being made for the rest of the PC industry (greatly increasing the video card options).



I dunno but this could help explain things...



Dave

Comments

  • Reply 1 of 56
    applenut Posts: 5,768 member
    the new G5s have DVI. there is no need to get rid of ADC if Apple chooses DVI for their next displays. In fact, it would make sense for apple to include ADC for the next generation or two so as to cater to people who already own cinema displays and want to use them on their new macs.
  • Reply 2 of 56
    zapchud Posts: 844 member
    I have no idea, but I surely hope so. But I agree, it actually does smell funny.
  • Reply 3 of 56
    davegee Posts: 2,765 member
    Quote:

    Originally posted by applenut

    In fact, it would make sense for apple to include ADC for the next generation or two so as to cater to people who already own cinema displays and want to use them on their new macs.



    As a person who purchased a MIDDLE END G4 450 the day it was announced (and long before the 450 was re-classified TOP END due to 500MHz CPUs being MIA for much longer than anticipated) I for one can say in NO UNCERTAIN TERMS that what you describe is NOT Apple's style! Not by a longshot.



    Being stuck with a NON-ADC G4 pissed me off to no end. Sure, they could have designed those 'too cool flat panel displays' that I wanted SO BAD in a way where I could have ordered one for my 'not too old' G4 - but I was simply SOL... Sorry, but without ADC Apple just didn't wanna know me... A DVI-->ADC adapter came out eventually, but for a good long while the initial adopters of the G4 were just screwed over....



    Ahhh okay I feel better now.



    In any case... I stand by my comment:



    "so as to cater to people who already own cinema displays " - that AIN'T Apples style!



    Dave
  • Reply 4 of 56
    existence Posts: 991 member
    Here's an interesting thought:



    ThinkSecret says the new 30" LCD display has a resolution of 2560 x 1600. This is above the capability of DVI at 60Hz! A single DVI connector cannot transmit that much data at 60Hz.
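
    A quick back-of-the-envelope check (my own numbers, assuming roughly 5% blanking overhead per axis and the 165 MHz single-link pixel-clock ceiling from the DVI 1.0 spec):

        # Does 2560x1600 @ 60 Hz fit in single-link DVI?
        # Assumes ~5% blanking per axis; DVI 1.0 caps one TMDS link at a 165 MHz pixel clock.
        H_ACTIVE, V_ACTIVE = 2560, 1600
        REFRESH_HZ = 60
        BLANKING = 1.05            # assumed 5% blanking overhead per axis
        SINGLE_LINK_MHZ = 165.0    # DVI 1.0 single-link pixel-clock limit

        pixel_clock_mhz = (H_ACTIVE * BLANKING) * (V_ACTIVE * BLANKING) * REFRESH_HZ / 1e6
        print(f"required pixel clock: {pixel_clock_mhz:.0f} MHz")          # ~271 MHz
        print("fits one link?", pixel_clock_mhz <= SINGLE_LINK_MHZ)        # False
        print("fits two links?", pixel_clock_mhz <= 2 * SINGLE_LINK_MHZ)   # True

    Roughly 271 MHz needed against a 165 MHz single-link ceiling - so something beyond a single link (2 x 165 MHz would cover it) is required.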
  • Reply 5 of 56
    leonard Posts: 528 member
    They will not be short-lived. They're here for the next several months. Get used to it. Quit making excuses or reasons to hope they won't be here long... they will be. In life, you don't always get what you want; life is unfair.



    You made an error in your list - the video cards did change slightly. They are now the XT versions of the cards (i.e. Radeon 9600XT and Radeon 9800XT vs Radeon 9600 and Radeon 9800) and as such have more RAM on the card.



    And as I see it the 1.8 wasn't downgraded - the low-end was upgraded from a 1.6 to a 1.8.
  • Reply 6 of 56
    occam Posts: 54 member
    DaveGee:



    I suspect (and hope) you're on to something.



    One of the reasons the G5 introduction was so well received was because Apple pulled out all the technology stops in their design: S-ATA, latest graphics accelerators, whisper-quiet operation, USB2, FW800, etc. After all the half-baked G4 updates, the G5 was a huge breath of fresh air and seemed to indicate Apple was rededicating itself to *all* leading-edge technology.



    These machines do not reflect that dedication and, even though they're decent, they don't continue the aggressive commitment of the original G5s. The graphics card issue really seems odd (unless you're correct) since Apple has really been tooling up for fast graphics on so many fronts... OS X in general, gaming, and professional (Motion et al.). Plus, I've been waiting for new monitors since the November 2003 "release date" rumors started.



    (Given WWDC is too soon at this stage) we still have SIGGRAPH in August to look forward to. Perhaps Apple'll introduce new screens, graphics cards, mobos there.



    Even AppleInsider's legendary (infamous?) Kormac77 thinks SIGGRAPH may be special:



    kormac77, Apple futures, & SIGGRAPH 2004



    I'm hoping DG and Kormac77 are on to something(s) and we find out by August 8, 2004. In the meantime, I'm waiting for new monitors (including hopefully 30") to take the G5 plunge.



    Cheers!
  • Reply 7 of 56
    occam Posts: 54 member
    Gak!



    The initial message by kormac77 is the one I meant to link. Here's a better link:



    Initial kormac77 post.



    Cheers.
  • Reply 8 of 56
    davegee Posts: 2,765 member
    Quote:

    Originally posted by Leonard

    They will not be short-lived. They're here for the next several months. Get used to it. Quit making excuses or reasons to hope they won't be here long... they will be. In life, you don't always get what you want; life is unfair.



    Huh?



    "Several Months" ***IS*** short lived!



    Dave
  • Reply 9 of 56
    rhumgod Posts: 1,289 member
    Quote:

    Originally posted by Existence

    Here's an interesting thought:



    ThinkSecret says the new 30" LCD display has a resolution of 2560 x 1600. This is above the capability of DVI at 60Hz! A single DVI connector cannot transmit that much data at 60Hz.




    Better read up on the DVI specifications.



    Specifically read page 7 (section 1.2 - Performance Scalability)



    A single TMDS link supports more than HDTV (1920x1080) resolution at 60Hz with 5% LCD blanking. And when one link cannot handle it, you just add another one. Not too much of an issue there. Besides with Selective Refresh, only screen changes are pushed to the display, so no need to push the entire resolution to the screen on each refresh.
  • Reply 10 of 56
    existence Posts: 991 member
    Quote:

    Originally posted by Rhumgod

    Better read up on the DVI specifications.



    Specifically read page 7 (section 1.2 - Performance Scalability)



    A single TMDS link supports more than HDTV (1920x1080) resolution at 60Hz with 5% LCD blanking. And when one link cannot handle it, you just add another one. Not too much of an issue there. Besides with Selective Refresh, only screen changes are pushed to the display, so no need to push the entire resolution to the screen on each refresh.




    That is why I said "A single DVI connector cannot transmit that much data at 60Hz..." If Apple is going to make a 30" display with 2560x1600 resolution, Apple needs a dual-link DVI connector and a graphics card to match. Neither ADC nor a single DVI connector can do this resolution.
  • Reply 11 of 56
    rhumgod Posts: 1,289 member
    Quote:

    Originally posted by Existence

    That is why I said "A single DVI connector cannot transmit that much data at 60Hz..." If Apple is going to make a 30" display with 2560x1600 resolution, Apple needs a dual-link DVI connector and a graphics card to match. Neither ADC nor a single DVI connector can do this resolution.



    So what's the problem? Dual link is still one connector, just another link on the connector.



    Here's one such dual-link cable.
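
    For what it's worth, a rough link-budget sketch (my own back-of-the-envelope numbers: 24 bits of pixel data per clock, the same 165 MHz per-link ceiling, ~5% blanking, and the 10-bit TMDS encoding overhead ignored):

        # Rough data-rate budget for 2560x1600 @ 60 Hz over dual-link DVI.
        LINK_PIXEL_CLOCK_HZ = 165e6   # per-link pixel-clock ceiling
        BITS_PER_PIXEL = 24           # 3 TMDS channels x 8 bits of pixel data

        per_link_gbps = LINK_PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9
        needed_gbps = (2560 * 1.05) * (1600 * 1.05) * 60 * BITS_PER_PIXEL / 1e9  # ~5% blanking

        print(f"one link : {per_link_gbps:.2f} Gbit/s")       # ~3.96
        print(f"needed   : {needed_gbps:.2f} Gbit/s")         # ~6.50
        print(f"two links: {2 * per_link_gbps:.2f} Gbit/s")   # ~7.92 - enough

    One link tops out around 3.96 Gbit/s of pixel data, 2560x1600 at 60Hz needs roughly 6.5 Gbit/s, so the second link on the same connector covers it.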
  • Reply 12 of 56
    kenaustus Posts: 924 member
    I don't think the new PMs are short-lived. IBM has made the move to the 90 nm process (after some pain) and Apple has made the move to the chip in terms of engineering changes. The issue now is to continually adjust what they have as faster chips in the 970FX line become available. I think Apple's engineers are capable of making that upgrade, and it should be an easy one for them.



    In terms of the speed of the 970FX, of course it will increase in speed over time - just not as fast as we would like. There are also chips on the Apple/IBM road map that will make the new PMs look like "old" technology. That's also a given. The reality is that Apple delivered what was available when it was available. While it sucks that IBM ran into an unexpected problem with the 90 nm fab process, I would guess that most people who have worked for a while have also run into unexpected problems that delayed schedules or made it necessary to address an issue in a different direction. I sure have - lots of times - and I've only got a small, one-man company.



    Maybe it's just that I'm getting old. When I took an Introduction to Computers course in college they passed around a cup with some small metal rings in it. It was memory - and it was hand-wired into place. I don't tend to take new technology for granted like those who were not around at that time, and Apple's release of the new PMs looks pretty good to me - just as my new 1.5 gig 15" PB does.



    I think that there are some very exciting things in Apple's future. Faster chips, 65 nm processing, Gx chips with dual cores - lots of things. What they have done this week is deliver products that made the most sense to them today.
  • Reply 13 of 56
    dobby Posts: 797 member
    I don't think they are short lived.

    Going from 2.0 to 2.5 GHz is a fairly good speed bump (an extra GHz in total, it being a dual).

    Intel doesn't do this type of speed bump, especially in the 64-bit arena.

    I still think the casing is too big.

    I don't like the "new" Dual 1.8 having its PCI-X demoted to PCI. That sucks. The graphics cards are lame as well. I would have expected at least 128MB cards, since a decent graphics card has 256MB.



    Dobby.
  • Reply 14 of 56
    kenaustus Posts: 924 member
    PCI works well for those with cards on hand, or who can't afford PCI-X cards after buying the 1.8 PM.



    $50 gets you a 128 MB video card - just like on the 15" PB I bought last month.



    It addresses a specific market that doesn't have the financial freedom to go for a topped out 2.5 - those folks are probably going to be glad Apple thought of them.
  • Reply 15 of 56
    safarix Posts: 18 member
    What type of cards use PCI-X?
  • Reply 16 of 56
    beigeuser Posts: 371 member
    Here's my take on the ADC connector conspiracy theory of the original post.

    PowerMac G5s continue to use ADC because:



    1. The current displays still use ADC. So G5s should have ADC in case customers want to buy a monitor at the same time.

    2. Many people are upgrading from older PowerMacs. Since the new G5s have ADC, they can continue using their current display.

    3. The current G5s also have DVI, so they will be compatible with the new DVI monitors when they come out.



    If Apple were to eliminate ADC on the computer side, many customers would suffer. Therefore, it is only natural that ADC disappears on the monitor side first. The G5s will probably continue to have ADC for a few months after the DVI monitors come out.



    The fact that the new G5 still has ADC doesn't mean that the new G5s are temporary measures.



    There are no new video cards because ATI and Apple are probably hesitant to develop a new ADC-based card when they both know that DVI LCDs are just around the corner. New video cards won't be sold until 1) PCI Express becomes mainstream and 2) Apple DVI LCDs are sold in quantity.
  • Reply 17 of 56
    hobbit Posts: 532 member
    DaveGee, as much as I would love to see a 'MacStation', I'm doubtful. But hey, hope springs eternal!



    If anything it would have to be a Power5 derived chip (975, 980, whatever the name), and that might not happen before September/October. Even if it is true that IBM plans to ship this chip in July, it'll still take 2-3 months to build enough inventory to actually introduce a product.



    Would Apple unveil it at SIGGRAPH mid August? Hard to tell. Maybe the whole SIGGRAPH presence wasn't planned like that? Those booths need to be booked many months in advance, perhaps at a time when Apple was still hopeful for better yields and earlier introductions?



    But it could all be completely different:



    New Power5 derived CPUs available, but 'slow'?

    These new chips might actually be available next month, but due to the same 90nm issues they might not go any faster than 2.5-odd GHz either, far below the magical 3GHz. However, since they are Power5-derived chips with hyperthreading, their performance is still a lot better than similarly clocked 970fx chips, hence worth Apple's while to use.



    Running hot?

    Their cooling issues might be even worse than with the 970fx. To a point where the cooling is physically still possible, but so expensive that Apple can no longer put them into PowerMacs without raising their prices substantially. At that point Apple might have decided to create a new 'MacStation' line for them which will have faster performance yet more expensive cooling.



    With new displays?

    And it could indeed be that the new displays, especially the 30" model, are intended for those 'MacStations'.



    Introduction at WWDC?

    PowerMac G5s got introduced at WWDC so developers could learn about the software changes required for this new type of processor. Perhaps the same is true for the Power5 derived chip (let's call it the 'G6')? Perhaps IBM's variant of hyperthreading (and other new hardware features planned) also require broad changes in application development?

    So perhaps Steve Jobs will introduce a 'MacStation' at WWDC - much to our surprise.





    But what seems a more tangible issue in all this is the lack of a cheap modular machine. If you already own a good display and want to trade up to a new Mac, your cheapest choice is a $1999 PowerMac G5, which isn't really available at that price, since we all know that its 256MB of RAM is not nearly enough. No matter how you look at it, the cheapest modular Mac is a $2000+ affair.



    The entry model price hike from $1799 to $2000+ opened the field for a 'PowerMac mini' ('NuCube', whatever).



    If I had to choose between a 'MacStation' at WWDC or a 'PowerMac mini', I think the latter is more likely.
  • Reply 18 of 56
    whisper Posts: 735 member
    Quote:

    Originally posted by Rhumgod

    Besides with Selective Refresh, only screen changes are pushed to the display, so no need to push the entire resolution to the screen on each refresh.



    I don't think that will work for games or fullscreen video.
  • Reply 19 of 56
    rhumgod Posts: 1,289 member
    Quote:

    Originally posted by Whisper

    I don't think that will work for games or fullscreen video.



    Yes, but with dual-link DVI you won't have to worry - besides, not too many games support that resolution anyway. In the future, perhaps...
  • Reply 20 of 56
    programmer Posts: 3,467 member
    I expect these machines will have to hold us over until Oct - Jan. At that point either IBM will have rev'd the 970FX to eke out a bit more speed (although it sounds like that will be less than another 500 MHz), or they will have readied & ramped its successor. I'd be surprised if the successor can manage a much better clock rate unless they've gone and stretched the pipelines further, but I don't think they'll play Intel's game. If IBM can get their new materials working acceptably then we might see a bit more improvement in heat/clock, but I'm not expecting much of a leap. 3 GHz might be The Limit for 90nm, and 65nm probably won't be any different since heat density has become a limiting factor and leakage current is dominating the equation. Some new optimizations in the design process may yield some improvements, but they are likely to be very incremental. As a result, future designs will probably make the chips larger at the same clock rate -- SMT, multi-core, larger caches, embedded RAM, and Cell-type designs.