Apple may drop NVIDIA chips in Macs following contract fight


Comments

  • Reply 81 of 93
    solipsismsolipsism Posts: 25,726member
    Quote:
    Originally Posted by bigdaddyp View Post


    Yes, Leopard and SL can be installed from an SD card, albeit very slowly. And there is no guarantee that an image on an SD card can be stored as long as a good-quality pressed DVD. I can see Apple taking a risk and releasing a future OS on an SD card, but I don't see them doing it with SL. IMHO.



    I don't know about the bit rot variances between flash and optical discs, but I can assure you that an OS installation via SD will be considerably faster than optical media while using a lot less power in the process.
  • Reply 82 of 93
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by hmurchison View Post


    Yes, Intel's GPU drivers have sucked, but Larrabee is different in that it's programmed like an x86 processor.



    That should really disturb people. It means Intel is putting a lot of x86 baggage into Larrabee. I can't help but think this means low performance and high power usage.

    Quote:

    Maybe Intel will actually be able to leverage their expertise in this avenue. I don't expect Larrabee to be a game changer but I hope it's at least competitive.



    See, I don't really want to see Larrabee succeed. Not to be mean, but if Intel gets a hammerlock on the industry, innovation will go out the window. Intel will get fat and sloppy, and the next thing you know some Chinese outfit will make a huge jump in performance and another American industry goes down the tubes.



    It is an interesting concept but Intel has to fail in order for the industry to remain strong.





    Dave
  • Reply 83 of 93
    bregaladbregalad Posts: 816member
    Quote:
    Originally Posted by ksec View Post


    Let's clear up a few points of confusion.



    That is like comparing apples to oranges. If core count means anything, even mid-range ATI has 320 "cores". Not to mention the GTX 280 and 8800 GT are desktop class; Apple mainly uses notebook graphics.



    NVIDIA's and ATI's implementations of stream processors are radically different, so you can only count "cores" within a manufacturer's own lineup.



    CUDA certainly has a lot of mindshare and real world experience, but I wouldn't count ATI out just yet. Let's remember that OpenCL is not CUDA.



    What little data exists for general-purpose calculation via AMD Stream shows that in raw performance, ATI cards like the 4850 are every bit as fast as "comparable" NVIDIA cards like the 9800 GT.
  • Reply 84 of 93
    mdriftmeyermdriftmeyer Posts: 7,502member
    Quote:
    Originally Posted by al_bundy View Post


    Going 64-bit is useless unless you have 4 GB of RAM or more, and that hasn't been affordable until very recently. Even when AMD's first-to-market consumer 64-bit chips came out, no one ran in 64-bit mode due to lack of OS support and lack of memory. Intel didn't bother to integrate AMD's 64-bit instructions until at least a year or two later.



    It's been affordable for 3 years.
  • Reply 85 of 93
    jeffdmjeffdm Posts: 12,951member
    Quote:
    Originally Posted by al_bundy View Post


    Going 64-bit is useless unless you have 4 GB of RAM or more, and that hasn't been affordable until very recently. Even when AMD's first-to-market consumer 64-bit chips came out, no one ran in 64-bit mode due to lack of OS support and lack of memory. Intel didn't bother to integrate AMD's 64-bit instructions until at least a year or two later.



    Quote:
    Originally Posted by mdriftmeyer View Post


    It's been affordable for 3 years.



    Not only that, but 64-bit helps with more than just memory; media encoding and other computation can benefit a lot from the wider registers.
  • Reply 86 of 93
    bigdaddypbigdaddyp Posts: 811member
    Quote:
    Originally Posted by solipsism View Post


    I don't know about the bit rot variances between flash and optical discs, but I can assure you that an OS installation via SD will be considerably faster than optical media while using a lot less power in the process.



    I think a lot of that depends on the card. Actual throughput can vary significantly between brands and even within the same brand of card. I have not seen any benchmarks, but booting from a USB key has usually been much faster. At least that has been my experience so far on my Hackintosh, which I realize is a bit of an apples-to-oranges comparison.
  • Reply 87 of 93
    mdriftmeyermdriftmeyer Posts: 7,502member
    Quote:
    Originally Posted by JeffDM View Post


    Heck, you can add two Express Card slots for extra drive space and still have more room. EC34 can be the new style multibay.







    I thought ATI was doing better, but that was with Core Image & Core Video. OCL is probably enough of a different beast.



    It seems to me that any possible spat can be worked out.



    ATI is doing better with their latest CAD/GPGPU card, the FireStream 9270, their latest embedded ATI Radeon E4690 discrete GPU, and the new ATI Radeon HD 4890.



    FireStream OpenCL:



    Quote:

    OpenCL Coming to a Computer Near You! (2009)



    Right after the release of the OpenCL 1.0 specification, AMD announced its intent to rapidly adopt the OpenCL 1.0 programming standard and integrate a compliant compiler and runtime into its free ATI Stream SDK.



    Even before the OpenCL 1.0 announcement, the ATI Stream Computing team had been busy implementing an OpenCL solution for its customers. AMD is currently working closely with OpenCL content developers and ISVs to provide them with a developer version of the ATI Stream SDK with OpenCL 1.0 support. A public release of ATI Stream SDK v2.0 with OpenCL 1.0 support will be available in the second half of 2009.



    Also in the first quarter of 2009, AMD released the AMD FireStream 9270 which satisfies the needs of the truly high-performance computing user. The FireStream 9270 delivers an impressive 1.2 TFLOPS of performance at a typical power consumption of 160W.



    ATI Radeon™ HD 4890 GPU



    AMD Delivers the World's First 1 GHz Graphics Processor



    − Nine years after launching the world's first 1 GHz CPU, AMD is again first to break the gigahertz barrier with the factory overclocked, air-cooled ATI Radeon™ HD 4890 GPU −



    SUNNYVALE, Calif. -- May 13, 2009 -- Building on the success of the recently launched ATI Radeon™ HD 4890 graphics card -- driven by the world's most powerful graphics processor1 -- AMD (NYSE: AMD) today announced availability of a factory overclocked graphics processor that is the first to break the 1 Gigahertz (GHz) barrier using standard air-cooling solutions.
    • With this product, AMD achieves a notable engineering milestone as the first graphics company to break the 1 GHz barrier.

    • The new ATI Radeon™ HD 4890 utilizes advanced GDDR5 memory and a 1 GHz clock speed to deliver 1.6 TeraFLOPs of compute power, 50 percent more than that of the competition's best single-GPU solution2. With this level of raw compute power, the 1 GHz ATI Radeon HD 4890 is set to deliver new levels of general purpose GPU-accelerated performance in ATI Stream applications such as video transcoding and post processing.

    • This new version of the ATI Radeon™ HD 4890 marks the latest addition to the award-winning ATI Radeon™ HD 4000 series delivered by AMD technology partners Sapphire, XFX, Asus and TUL.

    • The advanced design of the ATI Radeon HD 4890 delivers an amazing gaming experience in the latest games, including ground-breaking DirectX® 10.1 titles such as Ubisoft's Tom Clancy's H.A.W.X.™, Electronic Arts' BattleForge™ and SEGA's Stormrise™ released last month, as well as GSC Gameworld's S.T.A.L.K.E.R: Clear Sky. When compared to DirectX 10 game play, DirectX 10.1 games have proven to deliver higher game performance and an improved visual experience. In addition, these cards feature support for open standards like OpenGL3 with DirectX® 10-like hardware extensions, and the recently ratified OpenCL specification.

    • The ATI Radeon HD 4890 supports advanced game physics. At the 2009 Game Developers Conference in San Francisco, Havok and AMD demonstrated the first implementation of OpenCL running on AMD graphics processors. In the demonstration, Havok's physics technology delivered complex and realistic simulations of real-world materials like cloth, demonstrating the potential for increased realism in forthcoming games.

    • To date, the ATI Radeon HD 4890 card has won numerous awards, including the prestigious Editor's Choice Gold Award from HardOCP, the HotHardware Recommended Award and the Editor's Choice Award from Tweaktown, among others. The accolades speak to the excitement around the product and to the continued strength of the discrete graphics market overall, something analyst Dr. Jon Peddie of Jon Peddie Research predicts will continue to play a strong role in the computing industry.

    • As a result of the worldwide accolades from media, developers, enthusiasts and fans, AMD released an "inside look" at how the card was made and what it means for gamers. The card marks a new aspect to the AMD "Dragon" desktop platform technology, providing an even more powerful single GPU desktop graphics option to OEMs, channel partners, and do-it-yourself (DIY) consumers.

    -------------



    What's missing? AMD hasn't publicly released the implementation for these cards that would let us see them in action, outside of developers who have signed NDAs.



    I imagine some time around September or October we'll start seeing the fruits of their efforts. The same goes for NVIDIA, with Apple obviously releasing 10.6 with OpenCL implemented and leveraged throughout its applications, where possible.
  • Reply 88 of 93
    ghstmarsghstmars Posts: 140member
    Quote:
    Originally Posted by ksec View Post


    From the rumor mills, and Intel's investment, it seems Arrandale's graphics won't be based on previous Intel GFX. It will be based on PowerVR SGX.



    Interesting. Hmm, OpenCL compliant, and the architecture is already tested in the iPhone.

    Imagination did produce graphics chips for game consoles and desktops. Intel also owns 19% of the company.
  • Reply 89 of 93
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by mdriftmeyer View Post


    It's been affordable for 3 years.



    Quote:
    Originally Posted by al_bundy View Post


    Going 64-bit is useless unless you have 4 GB of RAM or more, and that hasn't been affordable until very recently. Even when AMD's first-to-market consumer 64-bit chips came out, no one ran in 64-bit mode due to lack of OS support and lack of memory. Intel didn't bother to integrate AMD's 64-bit instructions until at least a year or two later.



    64-bit is very useful to a number of people; some codes can see significant speed increases. This has very little to do with memory directly, even though there is a whole set of codes that do love lots of memory.



    As to affordability, that depends upon the purchaser.



    As to those early AMD chips: I had one of the earlier systems, with one of the first 64-bit Fedora releases on it. While average 64-bit performance might have been 5% better than 32-bit, some codes did really well. This, like I said, was on a brand spanking new 64-bit OS. Further, Fedora being Fedora, there were plenty of updates that slowly improved the system. All of this was on a system with far less than 4 GB of memory. Nowadays Linux is well established on 64-bit systems, and Apple isn't far behind with a major transition in two months.



    As to Intel: they were very reluctant to implement something designed elsewhere. Apparently there were some strong-arm tactics used to get them to go that route. The old "not invented here" syndrome is alive and well. Intel's reluctance to go with AMD's solution shouldn't be seen as slighting its efficacy, but rather as a desire to see their own baby succeed. Also, in part, AMD was having huge success with its 64-bit platform, which put additional pressure on Intel. AMD actually demonstrated success with its hardware without a mainstream OS; that is a nice endorsement of 64-bit right there.





    Dave
  • Reply 90 of 93
    vineavinea Posts: 5,585member
    Quote:
    Originally Posted by wizard69 View Post


    64-bit is very useful to a number of people; some codes can see significant speed increases. This has very little to do with memory directly, even though there is a whole set of codes that do love lots of memory.



    I find the use of "codes" as a replacement for "programs" to be rather odd and archaic. It's rather uncommon for coders to refer to programs as "codes".



    Some programs do see a speed increase due to the x86 register size and the increase in the number of registers and some other enhancements packaged in (but not necessarily 64 bit dependent). But for the most part, the primary benefit is memory.



    Quote:

    Nowadays Linux is well established on 64-bit systems, and Apple isn't far behind with a major transition in two months.



    There are many folks, even in 2009, still running 32-bit Linux. It wasn't until 2008 that Adobe released a 64-bit Flash player (a beta; the release came in Feb 2009). 64-bit codec support was weak or non-existent, and I don't think GE is 64-bit yet.



    Quote:

    As to Intel: they were very reluctant to implement something designed elsewhere. Apparently there were some strong-arm tactics used to get them to go that route.



    No. Intel was pushing IA-64 and the Itanium, and the "strong-arm" tactics were simply the market moving to x86-64 and the AMD processors kicking Itanium's butt. So Intel got something from AMD they wanted (x86-64) in exchange for not using the nuclear option they didn't really want to use anyway (killing AMD's license to make any x86 chips).



    The EU monopoly complaint thing was just the usual AMD/Intel posturing and gamesmanship. If Itanium had been doing well, Intel would have continued with it and ignored x86-64.



    For desktop usage, 64 bit hasn't really made much of a difference.
  • Reply 91 of 93
    randianrandian Posts: 76member
    Are you saying Nehalem is an Itanium replacement? I tend to agree, what with its focus on and huge improvement in floating point and vector arithmetic and NUMA memory architecture. In fact, it looks to be aimed squarely at SGI's last remaining market. The fact that Intel never really improved Itanium in any meaningful way once Alpha, PA-RISC, and SPARC were effectively neutered as potential competition (plus it's still stuck at well under 2 GHz) suggests it was never a real product.
  • Reply 92 of 93
    MarvinMarvin Posts: 14,581moderator
    Don't know if this was posted, but Nvidia are apparently denying this claim and saying their relationship with Apple is OK:



    http://www.electronista.com/articles...s.apple.split/

    http://www.fudzilla.com/content/view/14512/34/



    "We spoke to industry sources close to Nvidia and got clear confirmation that the Nvidia-Apple relationship is doing just fine.



    Apple is still buying Nvidia notebook chips and chipsets and nothing has changed in the last few weeks. Nvidia and Jensen himself have a lot of respect for fuss maker Apple, and therefore this relationship has top priority for Nvidia people.



    The report that started the whole fuss that Apple might drop Nvidia was originally posted by my fellow ex-Inqster Mr Demerjian on his new project SemiAccurate.com"



    Like I say, to get traffic to a new site, this is the kind of rumor you post. Once it's found out, though, it turns out to be the worst way to keep people coming back.
  • Reply 93 of 93
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by vinea View Post


    I find the use of "codes" as a replacement for "programs" to be rather odd and archaic. It's rather uncommon for coders to refer to programs as "codes".



    What can I say, I'm an uncommon man!! In any event, what does that have to do with anything?

    Quote:

    Some programs do see a speed increase due to the x86 register size and the increase in the number of registers and some other enhancements packaged in (but not necessarily 64 bit dependent). But for the most part, the primary benefit is memory.



    How is the use of larger registers not dependent on being 64-bit? The extra registers are part of the new ISA, thus tied to 64-bit too.



    In any event, you miss the point with memory: larger-than-4 GB memory is only useful for a limited number of "codes". A vast number of programs, or codes, really did leverage the 64-bit hardware even on machines with modest RAM allotments.

    Quote:







    There are many folks, even in 2009, still running 32-bit Linux. It wasn't until 2008 that Adobe released a 64-bit Flash player (a beta; the release came in Feb 2009). 64-bit codec support was weak or non-existent, and I don't think GE is 64-bit yet.



    Well, there are people still running Apple IIs. Besides, some of us just don't care about Flash.

    Quote:





    No. Intel was pushing IA-64 and the Itanium, and the "strong-arm" tactics were simply the market moving to x86-64 and the AMD processors kicking Itanium's butt.



    Nope, the reality is Microsoft was key in the process here. The rumor is they told Intel directly that they were not going to support another 64-bit platform beyond x86-64.



    Quote:

    So Intel got something from AMD they wanted (x86-64) in exchange for not using the nuclear option they didn't really want to use anyway (killing AMD's license to make any x86 chips).



    That is so wrong that I'd have to suggest you review the history a bit before going further.

    Quote:



    The EU monopoly complaint thing was just the usual AMD/Intel posturing and gamesmanship. If Itanium had been doing well, Intel would have continued with it and ignored x86-64.



    For desktop usage, 64 bit hasn't really made much of a difference.



    It has, and its importance continues to grow. Between 64-bit and SMP, we have a whole new generation of computers.





    Dave