I think Apple may have a trick up its sleeve -- "Q5"


Comments

  • Reply 21 of 33
g-news Posts: 1,107 member
[quote]Yikes, MacRonin, don't say "ÜberCard!" Don't you remember the trouble that ensued from the mere mention of the ÜberMac?!!?!! There are hypersensitive scholars of German language usage present![/quote]



Indeed, cut the pseudo-Germanisms and just say "Over-Mac" or "Super-Mac" and we'll still all know what you mean.



    G-News
  • Reply 22 of 33
mr.e Posts: 20 member
Thanks for the input, UserOne. As a visual artist, too, I see that a tablet would be even better than a laptop. The keyboard wouldn't get in the way; whenever you want the keyboard, you would just tap the icon. You could put it on an antique easel and "paint or draw" directly on it, while standing or sitting. And I guess I wasn't clear: this tablet has no CPU, just a GPU -- you would still need an AirPort-equipped Apple computer; this tablet would only mirror what's on the computer that's in your office. The GPU would hardwire directly to the AirPort card. You could also connect the iPod to this puppy and use its OS, perhaps. The tablet would be useless unless you had an Apple anyway.
  • Reply 23 of 33
userone Posts: 55 member
    I still would quite like to go outside with it. Maybe I can go for a walk and draw a leaf or sketch down an idea.



If the pad is dependent on the Mac, then I think it must have AV functions -- the GPU replays DVDs from your Mac, etc.



But if I go outside, where does the CPU data crunching for basic tasks come from? An ultra-portable iPod, perhaps?



    Ubiquitous computing indeed!



    userone

  • Reply 24 of 33
[quote]Originally posted by Programmer:

Computationally expensive scientific calculations and heavy audio processing are exactly the kind of thing that a fast SIMD unit is good at. That's why AltiVec has made such inroads in these areas. I wouldn't be the least bit surprised if Apple, faced with questionable CPU futures but happy with the success of AltiVec, turned around and equipped all of their machines with additional vector units.[/quote]
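(For anyone who hasn't written AltiVec code, here's a toy sketch of the kind of inner loop he means -- purely illustrative, nothing from Apple:)

[code]
/* Hypothetical example: multiply-accumulate over float arrays, the sort of
 * loop a SIMD unit chews through four elements at a time.
 * Assumes a PowerPC with AltiVec, compiled with -maltivec (or -faltivec on
 * Apple's gcc); arrays must be 16-byte aligned and n a multiple of 4. */
#include <altivec.h>

void madd_arrays(const float *a, const float *b, float *c, int n)
{
    int i;
    for (i = 0; i < n; i += 4) {
        vector float va = vec_ld(0, (float *)(a + i)); /* load 4 floats    */
        vector float vb = vec_ld(0, (float *)(b + i));
        vector float vc = vec_ld(0, (float *)(c + i));
        vc = vec_madd(va, vb, vc);                     /* c[i..i+3] += a*b */
        vec_st(vc, 0, c + i);                          /* store 4 results  */
    }
}
[/code]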



    I agree that this could be a fruitful approach, though I have a hard time seeing where these vector units would come from (surely not Motorola; they wouldn't design Apple a brown paper bag if it couldn't be used in an embedded product). What makes these mystery vector units more feasible than an Apple-nVidia Quartz GPU which simply builds on the expertise of both parties?



    Of course, who's to say Apple won't do both -- but in due time. While audio and scitech are areas Apple is interested in, we don't see Apple buying up sound software houses or scitech engineering concerns. What we do see are acquisitions in high-end compositing and titling/motion graphics.



    I believe Apple may have decided to knock these pins down one at a time to manage costs and risk. Hence -- high-end video and related media now, through Final Effects and the Q5; audio, scitech and others later once the current target niche is secured.
  • Reply 25 of 33
[quote]Originally posted by iconmaster:

While audio and scitech are areas Apple is interested in, we don't see Apple buying up sound software houses or scitech engineering concerns.[/quote]



    Monday, July 1 -- Apple makes a liar out of me.





Apple buys Emagic, makers of digital audio and MIDI tools: http://www.apple.com/pr/library/2002/jul/01emagic.html



    If Apple really is going to pursue these markets simultaneously, perhaps the Q5 is too narrow a strategy...
  • Reply 26 of 33
programmer Posts: 3,467 member
[quote]Originally posted by iconmaster:

I agree that this could be a fruitful approach, though I have a hard time seeing where these vector units would come from (surely not Motorola; they wouldn't design Apple a brown paper bag if it couldn't be used in an embedded product). What makes these mystery vector units more feasible than an Apple-nVidia Quartz GPU which simply builds on the expertise of both parties?[/quote]



I think Apple has enough hardware design talent to design and build a set of vector units into their core chipset. Heck, perhaps they just license the nVidia vertex shader pipeline. Or maybe all that talent and those patents they acquired with Raycer have something to do with this.



[quote]Of course, who's to say Apple won't do both -- but in due time. While audio and scitech are areas Apple is interested in, we don't see Apple buying up sound software houses or scitech engineering concerns. What we do see are acquisitions in high-end compositing and titling/motion graphics.

I believe Apple may have decided to knock these pins down one at a time to manage costs and risk. Hence -- high-end video and related media now, through Final Effects and the Q5; audio, scitech and others later once the current target niche is secured.[/quote]



    Given how fast Apple is acquiring key technologies, I think they've finally thrown up their hands in frustration at waiting for 3rd parties to equip their platform with optimized and integrated solutions to all of these problems. Can't convert 'em? Buy 'em! If they do indeed have some new compute-engine coming that needs to be coded for, then they may have just decided to do it themselves.



    Frankly this could be a decent solution, if they leave the door open to 3rd party support. By this I mean allowing 3rd parties to extend and build on their software -- make the Apple applications scriptable and plug-in'able.
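Something like this, just to make "plug-in'able" concrete -- a made-up sketch in plain C using the POSIX dlopen() API (on OS X you'd more likely go through CFBundle, but the shape is the same; every name here is invented for illustration):

[code]
#include <dlfcn.h>
#include <stdio.h>

/* The contract every effect plug-in must honour: one exported entry point
 * with an agreed-upon signature. */
typedef void (*apply_filter_fn)(float *pixels, int count);

int run_plugin(const char *path, float *pixels, int count)
{
    void *handle = dlopen(path, RTLD_NOW);   /* load the 3rd-party library */
    if (!handle) {
        fprintf(stderr, "could not load %s\n", path);
        return -1;
    }

    apply_filter_fn apply = (apply_filter_fn)dlsym(handle, "apply_filter");
    if (apply)
        apply(pixels, count);                /* hand the image data over   */

    dlclose(handle);
    return apply ? 0 : -1;
}
[/code]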



I'm still hoping that this new Apple technology is a new programming model like I blathered on about in the "Wolf" thread. Hand-coding to YAVU ("yet another vector unit") is just never going to catch on with anybody outside of Apple, at least not in a meaningful way. Coding in a new "vector unit" language and programming model, which can then be run on any vector unit in sight, or even be distributed across a cluster if the latency is permissible, would suddenly allow Apple a lot more flexibility and freedom in how they innovate in hardware.
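To make that concrete, here's the sort of thing I mean, sketched in plain C -- everything below is invented for illustration, no such Apple runtime exists:

[code]
/* You write the kernel once as a pure function over a well-defined slab of
 * data, and a runtime you never see decides whether it lands on AltiVec, a
 * GPU, or another box on the network. */
typedef struct {
    const float *input;   /* the kernel may only read this  */
    float       *output;  /* ...and only write this         */
    int          count;
} slab;

/* The "program": no globals, no pointers outside the slab, so the runtime
 * is free to ship it anywhere and run pieces of it in parallel. */
void scale_and_bias(slab s, float scale, float bias)
{
    int i;
    for (i = 0; i < s.count; i++)
        s.output[i] = s.input[i] * scale + bias;
}
[/code]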



Go read the nVidia Cg or OpenGL 2.0 specs to get some idea of what I'm talking about... I think these languages and their coding model are just the beginning, and in the not-too-distant future they will be applied to more than just graphics. I keep hoping Apple will take a leadership role here... there are enough hints at it, that's for sure.
  • Reply 27 of 33
b8rtm8nn Posts: 55 member
Did you notice how AppleScript is now a programming language for accessing OS X alongside Cocoa? Maybe you are onto something, speculative Programmer.
  • Reply 28 of 33
programmer Posts: 3,467 member
[quote]Originally posted by b8rtm8nn:

Did you notice how AppleScript is now a programming language for accessing OS X alongside Cocoa? Maybe you are onto something, speculative Programmer.[/quote]



While the development of AppleScript is definitely a good thing, it isn't suited to compute-intensive purposes. It might be interesting if it could generate jobs to be fed to a compute engine, however.
  • Reply 29 of 33
[quote]Originally posted by Programmer:

Go read the nVidia Cg or OpenGL 2.0 specs to get some idea of what I'm talking about...[/quote]



    Interesting -- it seems like Cg and OGL2 are competing solutions to the same problem? (Both intend to supply a C-based, high-level shading language to graphics programmers, in order to make real-time effects more accessible and compatible across a variety of hardware.)
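Just to give a concrete feel for it, here's roughly the guts of a typical vertex program, written as ordinary C rather than actual Cg/OGL2 syntax (a made-up illustration, not code from either spec):

[code]
/* Every vertex gets the same small routine run over it, which is exactly
 * why this maps so well onto parallel hardware. */
typedef struct { float x, y, z, w; } vec4;

/* m is a 4x4 transform matrix in row-major order. */
vec4 transform_vertex(const float m[16], vec4 v)
{
    vec4 out;
    out.x = m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w;
    out.y = m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w;
    out.z = m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w;
    out.w = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w;
    return out;
}
[/code]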



    It's curious that nVidia would pursue this as it seems to commodify their hardware -- games are less dependent on this or that GPU. On the other hand, it makes games more open to accepting the ongoing stream of GPU upgrades, which I suppose is nVidia's plan.



    I take it you are suggesting that, now that Apple has tied Quartz into one of these languages (OpenGL), it will proceed to provide dedicated "mini-processors" which are optimized for the most common function calls therein.



    This is just what I was getting at (I think), except in your model the "Q5" really becomes "Q5s." I wonder, however, what the ramifications are for an nVidia-Apple connection if Apple is casting its lot with OpenGL while nVidia pursues Cg?



[quote]Originally posted by Programmer:

I think these languages and their coding model are just the beginning, and in the not-too-distant future they will be applied to more than just graphics.[/quote]



These languages seem pretty specific to real-time CGI. What is it about the coding model which appears to you to have potentially wider application?



Thanks for the pointer -- I used to do 3D work and still have a soft spot for it, so I try to keep up on developments in CGI.
  • Reply 30 of 33
amorph Posts: 7,112 member
[quote]Originally posted by iconmaster:

I agree that this could be a fruitful approach, though I have a hard time seeing where these vector units would come from (surely not Motorola; they wouldn't design Apple a brown paper bag if it couldn't be used in an embedded product).[/quote]



    Fortunately for Apple, SIMD units have all kinds of uses in the embedded space.



[quote]Originally posted by iconmaster:

It's curious that nVidia would pursue this as it seems to commodify their hardware -- games are less dependent on this or that GPU. On the other hand, it makes games more open to accepting the ongoing stream of GPU upgrades, which I suppose is nVidia's plan.[/quote]



Exactly. As it stands, a new card gets released, the demos -- coded to exploit the card -- look great, but in actual use the card is usually an incremental improvement on the last generation. This especially hurt with the RADEON/GeForce3 generations. So it's one of those circumstances where it makes sense for nVidia to make it easier to exploit even ATi's latest and greatest -- the demonstrated alternative is that most game developers will exploit neither.
  • Reply 31 of 33
programmer Posts: 3,467 member
Those languages are targeted at graphics, but graphics really is just about computing arrays of numbers. Since the new graphics units can output these numbers with zero loss of precision (i.e. floating-point frame buffers), the obstacles to using them to compute numbers for non-graphical purposes are falling fast.



One nice thing about the execution model is that these are small programs with very well-defined inputs and outputs, and their data set can be described quite nicely. This is all because the data and code are bundled up and sent to a GPU, which figures out how to execute it on some piece of hardware whose exact capabilities are only known on the receiving end. This kind of black box with SIMD floating-point capability makes it possible to do all sorts of fun things, tossing around large amounts of data that need to have a specific program (or set of programs) run across them. This is what "compute-intensive" software is all about.
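To put a rough shape on that "black box", here's what I imagine the hand-off looking like, in plain C -- every type and name below is hypothetical, not any real Apple API:

[code]
/* The caller bundles code + data into a job and hands it over; which SIMD
 * unit or GPU actually runs it is the engine's problem. */
typedef void (*kernel_fn)(const float *in, float *out, int count);

typedef struct {
    kernel_fn    kernel;   /* the small program            */
    const float *input;    /* well-defined input data set  */
    float       *output;   /* well-defined output buffer   */
    int          count;
} compute_job;

/* A trivial stand-in for the real engine: run the job right here, right
 * now.  A real dispatcher could just as easily upload it to a GPU with a
 * floating-point frame buffer, or farm it out across a cluster. */
void submit_job(const compute_job *job)
{
    job->kernel(job->input, job->output, job->count);
}
[/code]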
  • Reply 32 of 33
liquidh2o Posts: 79 member
Just a thought (and my first post, so I hope it's a good one!)





If there's one piece of hardware that I see Apple and nVidia collaborating on for the benefit of both, I think it would be a motherboard. nVidia has been delivering its "nForce" motherboard to the PC market for some time now, and while it has some fierce competition, it ranks almost right up there with other big names like VIA; bringing the nForce to the Mac would reap big rewards for both companies.



The nForce is a powerful board; it offers all the features that Apple lacks on its current motherboard and would be the answer to what most Apple nuts are asking for.



Not only that, but this could possibly lead to another line of desktop Mac products, or pave the way for cheaper desktops.



Why cheaper? The nForce board comes in a flavor that has pretty much everything you need already on the motherboard -- GeForce2 MX graphics, quality audio, Ethernet, etc. -- and it comes at quite a savings over buying all these parts separately in addition to a motherboard.



Maybe it's just wishful thinking, but I definitely think it'd be a step in the right direction, and both nVidia and Apple would benefit from a deal like this, as would the consumers.
  • Reply 33 of 33
technologies mature (e-ink, OLED, etc.) we will see high-resolution screens EVERYWHERE, and they all need a GPU.



If this works, they'll be bigger than Intel by the end of the decade. Bonus: we all get wall-sized 200 dpi displays, and..