Future PowerMacs?

Posted in Future Apple Hardware · edited January 2014
What would you like to see in future PowerMacs?



You can list realistic specs, unrealistic specs, and predictions.



My unrealistic setup for a PowerMac is:



Quad core, 3 GHz

2 GB of performance RAM standard (non-generic): G.Skill LAs or OCZ Platinums

2 G70s in SLI

Two 74 GB 10,000 RPM Raptors in RAID 0



And my realistic setup for a PowerMac is... well, I don't have one.
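The RAID 0 item in the wishlist above is easy to sanity-check: striping sums capacity and roughly doubles sequential throughput, but either drive failing takes out the whole array. A quick sketch (the per-drive survival figure is a made-up illustration, not a real reliability number):

```python
# Two 74 GB Raptors striped in RAID 0.
drives = 2
size_gb = 74            # capacity of each Raptor
p_survive = 0.99        # assumed per-drive survival probability (illustrative)

capacity_gb = drives * size_gb        # striping sums capacity
p_array = p_survive ** drives         # RAID 0 needs every drive alive

print(capacity_gb)                    # 148
print(round(p_array, 4))              # 0.9801
```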

Comments

  • Reply 1 of 29
    mynamehere · Posts: 560 · member
    ...and mindreading capabilities too...
  • Reply 2 of 29
    hmurchison · Posts: 12,259 · member
    Quad Core (2x chips), 3.2 GHz 980s with SMT and on-die memory controllers

    8 Rambus XDR slots

    HD-DVD recorder

    Dual Gigabit Ethernet with TOE, iSCSI, RDMA

    GBIC port on the back for 10G Ethernet

    Fibre Channel 2Gb

    SATA/SAS 3 Gbps (one external port)

    3 15k SAS drives in RAID-5 (HW accelerated)

    5 PCI-Express slots

    Accelerated audio with HDMI 1.2 outputs.

    Apple AirPort SR-71 pre-802.11n wireless card

    Bluetooth 2.0

    FW800 dual bus

    FW400 single bus

    6 USB 2.0 ports

    BT 2.0 keyboard & 3-button mouse (scroll wheel) with iTunes controls



    That would get me going for starters.
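The RAID-5 item in the list above trades one drive's worth of capacity for parity protection. A quick sketch of the usable space (the 73 GB drive size is an assumed example, since the post doesn't give one):

```python
# RAID-5 usable capacity: parity consumes one drive's worth of space.
n_drives = 3
drive_gb = 73           # assumed per-drive size for a 15k SAS drive of the era

usable_gb = (n_drives - 1) * drive_gb   # array survives any single-drive failure
print(usable_gb)                        # 146
```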
  • Reply 3 of 29
    pb · Posts: 4,233 · member
    Quote:

    Originally posted by hmurchison

    Quad Core (2x chips), 3.2 GHz 980s with SMT and on-die memory controllers





    Is this the fabulous POWER5-derived processor we are all waiting for and not yet seeing?
  • Reply 4 of 29
    cubist · Posts: 954 · member
    Smaller case, about the size of a Shuttle but perhaps a little taller, with room for either two optical drives and one hard drive, or two hard drives and one optical. One PCI-E graphics card slot and one PCI-X slot. Quiet.
  • Reply 5 of 29
    hmurchison · Posts: 12,259 · member
    Quote:

    Originally posted by PB

    Is this the fabulous POWER5-derived processor we are all waiting for and not yet seeing?





    Yessirree. It's rapidly becoming legend.
  • Reply 6 of 29
    onlooker · Posts: 5,252 · member
    Quote:

    Originally posted by hmurchison

    Yessirree. It's rapidly becoming legend.



    I don't believe the processor you're listing even exists. Not if you're saying there is a single-chip quad-core 980 processor with an on-die memory controller. Wouldn't IBM make a quad-core POWER5 first? Maybe? Then again, maybe they do.
  • Reply 7 of 29
    hmurchison · Posts: 12,259 · member
    No, I actually modified my post with a (2x) so that people know I'm talking about two dual-core procs.



    I have no idea where IBM/Apple are going to go, but I think Intel, IBM, and AMD all want to get the hell off of 90nm soon enough. I don't know if they'll rush to 65nm, but I don't see them trying to spend a lot of time at 90nm.



    This is good because 65nm is going to give us the extra space needed for SMT, and perhaps an on-die memory controller and larger cache.
  • Reply 8 of 29
    onlooker · Posts: 5,252 · member
    No, I actually thought you meant 2x quad-core chips. Doctor OctoMac.
  • Reply 9 of 29
    junkyard dawg · Posts: 2,801 · member
    Twin dual-core PPC 970MP, 2.6, 3.0, 3.4 GHz, w/ ODC or eBus clocked at 1/2 CPU clock speed.

    Support for up to 32 GB RAM.

    4 SATA HD bays, with motherboard RAID support.

    5 PCI-Express slots

    FW800/400 and USB2 ports, on both front and back of tower.

    Blu-ray SuperDrive EXTREME.



    $1999, $2499, $2999



    "Consumer" tower:



    Single dual-core PPC 970MP, 2.2, 2.4, 2.6 GHz.

    Support for up to 16 GB RAM.

    2 SATA HD bays.

    1 PCI-Express slot.

    FW800/400 and USB2 ports, on both front and back of tower.

    SuperDrive.



    $1399, $1699, $1999
  • Reply 10 of 29
    gugy · Posts: 794 · member
    Quote:

    Originally posted by Junkyard Dawg

    Support for up to 32 GB RAM.





    Dude, are you kidding me?

    Are you ready to spend close to $6k on RAM?
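That $6k estimate is roughly right for the era. A back-of-the-envelope check, assuming about $180 per GB for large-capacity DIMMs (the per-GB price is my assumption, not a quoted figure):

```python
# Rough cost of maxing out a hypothetical 32 GB PowerMac.
gb = 32
price_per_gb = 180.0    # assumed 2005-era ballpark for high-capacity DIMMs

total = gb * price_per_gb
print(total)            # 5760.0, close to the $6k quoted above
```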
  • Reply 11 of 29
    emig647 · Posts: 2,408 · member
    My buddy just got his dual 2.3. I used it for a few hours in Cinema 4D, Maya, Xcode with GCC 4, and FCP.



    I am amazed at how good the processing power is in this machine. I am seeing a noticeable difference between my dual 2.0 and this 2.3. I think the dual 2.7 would satisfy ANYONE for processing power. Now for the complaint...



    He has the Nvidia 6800 Ultra. Sadly, I cannot tell a huge difference between my 9600 XT and the 6800 Ultra. Yes, Cinebench and Maya have better performance, but there isn't a HUGE difference. You'd think spending 500 dollars on a graphics card would get you what you need. I think Nvidia has a lot of work to do with drivers. I would like to see the X800 in person as well; I have a feeling that card isn't up to par either.



    As far as the 9600s go, they are fairly close to their PC versions, while the 6800/X800 aren't as close to theirs. It seems the higher-end the card, the worse the performance drop-off.



    Anyway, I think the best thing Apple can do is improve these aspects. I think PCI-Express will be the answer. So even if we only see a dual 3GHz next rev, I believe it will AT LEAST have PCI-Express.
  • Reply 12 of 29
    onlooker · Posts: 5,252 · member
    Quote:

    Originally posted by emig647

    You'd think spending 500 dollars on a graphics card would get you what you need. I think Nvidia has a lot of work to do with drivers.



    You mean Apple has a lot of work to do on their drivers. Nvidia's drivers are fine. The best there are.
  • Reply 13 of 29
    emig647 · Posts: 2,408 · member
    Quote:

    Originally posted by onlooker

    You mean Apple has a lot of work to do on their drivers. Nvidia's drivers are fine. The best there are.



    Nvidia collaborates with Apple on their drivers; same with ATI. But yes, the two companies need to work something out.
  • Reply 14 of 29
    onlooker · Posts: 5,252 · member
    Quote:

    Originally posted by emig647

    Nvidia collaborates with Apple on their drivers; same with ATI. But yes, the two companies need to work something out.



    No, they don't. Nvidia gives Apple their source code, the exact same code the PC drivers are compiled from. Apple does what it wants with it.
  • Reply 15 of 29
    emig647 · Posts: 2,408 · member
    Quote:

    Originally posted by onlooker

    No, they don't. Nvidia gives Apple their source code, the exact same code the PC drivers are compiled from. Apple does what it wants with it.



    I have inside information that Nvidia and Apple work on the drivers TOGETHER. I had a very long talk with an Nvidia developer at WWDC last year. We talked about the 6800 and why there haven't been any other releases from Nvidia since the 5200. Nvidia and Apple BOTH worked on the drivers for the 6800 Ultra, and most likely the 6800 GT.
  • Reply 16 of 29
    onlooker · Posts: 5,252 · member
    Quote:

    Originally posted by emig647

    I have inside information that Nvidia and Apple work on the drivers TOGETHER. I had a very long talk with an Nvidia developer at WWDC last year. We talked about the 6800 and why there haven't been any other releases from Nvidia since the 5200. Nvidia and Apple BOTH worked on the drivers for the 6800 Ultra, and most likely the 6800 GT.



    I believe what you say, because it's obvious Apple and Nvidia both worked on the drivers: Nvidia wrote them, and Apple rewrote them for OS X. That doesn't mean they did it together. If he said they both worked on the drivers for the 6800, they did. But Apple did the OS X port.



    I can back up what I'm saying with a quote from Ujesh Desai, Nvidia's General Manager of Desktop GPUs.



    Quote:

    Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver. Apple will control the release schedules for drivers that provide even more performance, features and image quality enhancements.



    Here is a link. Read questions 5 and 8; read the whole thing, for that matter. It never mentions collaboration of any kind. And this is after the release of the 6800 DDL.
  • Reply 17 of 29
    emig647 · Posts: 2,408 · member
    Either way...



    My original point is that Apple and Nvidia need to do something, along with Apple and ATI. It doesn't all lie in Apple's hands; this is a team effort. It also doesn't just lie with Apple/ATI/Nvidia; it also lies with the OpenGL 2.0 libraries. What matters is that Apple, ATI, and Nvidia come up with something. Whoever writes the better drivers, Nvidia or ATI, sells more cards. Period.
  • Reply 18 of 29
    onlooker · Posts: 5,252 · member
    Well, I think it's more of an Apple problem than anyone else's. A bird told me they have integrated the core graphics so deep into the system that it's hurting overall third-party graphics application performance.

    It's also more difficult to port a card to work with it, for reasons I can't explain coherently for lack of specific knowledge. But wouldn't it be nice to just take the Nvidia drivers supplied by Nvidia and compile them for the Mac in one simple step? Any graphics card available on the PC would be usable from that point, after a ROM flash and firmware upload. Wouldn't it be nice to be able to flash the ROM of a Quadro (if you wanted one) and upload Mac-compiled Quadro firmware? There are a thousand cards out there that should simply be able to work in a Mac but cannot, because Apple has gone overboard. They have simply gone too far.

    Now they have even taken the old Microsoft approach of integrating applications into the OS (Dashboard), which is just a bad idea to begin with. Any integrated application that can gain root access is a crucial flaw. I can't say I saw this coming, but it doesn't surprise me one bit given the approach and path they have taken with graphics in the system. Yes, something must be done; Apple has lost their minds. Why complicate things that are so simple? They should have made the graphics cards more accessible, not more impossible.
  • Reply 19 of 29
    gregmightdothat
    Quote:

    Originally posted by onlooker



    Now they have even taken the old Microsoft approach of integrating applications into the OS (Dashboard) which is just a bad idea to begin with. Any integrated Application that can gain root access is a crucial flaw.




    ...which is why it's such a good thing that Dashboard can't gain root access



    Anyway, did that bird mention why X's graphics affect other graphics? I'm curious whether any of the window buffers Quartz uses waste GPU memory when you're running a game fullscreen. Although if that's all it is, it should be easy to fix.
  • Reply 20 of 29
    onlooker · Posts: 5,252 · member
    Quote:

    Originally posted by gregmightdothat

    ...which is why it's such a good thing that Dashboard can't gain root access







    Um... hello, yes it can.