Nvidia: spilling exciting beans?

Posted in Future Apple Hardware, edited January 2014
First off, here's the link:

<a href="http://zdnet.com.com/2100-1103-896996.html"; target="_blank">Nvidia: Forget Microsoft--chip's ahoy</a>



It's basically a snapshot of where Nvidia stands in the marketplace right now, with a new product getting set to launch soon. It also touches on their relationship with Microsoft and the Xbox.



BUT, here's the bit that caught my eye:



Huang also noted the inroads the company has made in the Macintosh market, saying that the company expects to have the largest share of that market "very soon." Huang said that Nvidia's chips are in the eMac that Apple Computer introduced Monday.



"You are going to see many more exciting introductions in that space," Huang said.




Any speculation, gentle readers?



Drew <----going off to read the rest of the article now

Comments

  • Reply 1 of 48
    splinemodel Posts: 7,311 member
    I've always preferred ATi over Nvidia, but we'll surely see what's up their corporate sleeve soon enough.
  • Reply 2 of 48
    eugene Posts: 8,254 member
    nVidia is shipping defective GeForce4 MX cards to Apple and right now the Radeon 8500 is top dog. I don't think ATi has to be too worried since they still supply Apple with laptop graphics.
  • Reply 3 of 48
    amorph Posts: 7,112 member
    ATi is still the better choice for driving CRTs, for laptops, and for playing DVDs. I don't think they have that much to worry about yet.



    That and (I never thought I'd hear myself say this) their drivers are much better than nVIDIA's.
  • Reply 4 of 48
    OK, here's my speculation, a topic I have been thinking about for a while now. It fits nicely. As you know, Apple is a cofounder of the HyperTransport consortium, which probably means they are going to offer products implementing HyperTransport somewhere in the future. nVidia is not only another member of that consortium, but also the only company so far offering actual HyperTransport chipset hardware, namely the nForce chipset family.

    While the Northbridge of this chipset is tied to the x86 architecture, directly accessing the CPU, the Southbridge could theoretically be used in any HyperTransport-compatible layout, including PowerMacs. Further interesting components from nVidia could be a HyperTransport-connected GPU, e.g. for all Mac layouts with soldered GPUs, like the notebooks and iMacs. There is no "HyperTransport connector" yet that could serve as a replacement for AGP or PCI, but rumours are that one is in the works. So maybe we could eventually even see HyperTransport GPUs show up in the PowerMacs.
  • Reply 5 of 48
    derrick 61 Posts: 178 member
    [quote]Originally posted by drewprops:

    <strong>First off, here's the link:

    <a href="http://zdnet.com.com/2100-1103-896996.html"; target="_blank">Nvidia: Forget Microsoft--chip's ahoy</a>



    It's basically a snapshot of where Nvidia stands in the marketplace right now, with a new product getting set to launch soon. It also touches on their relationship with Microsoft and the Xbox.



    BUT, here's the bit that caught my eye:



    Huang also noted the inroads the company has made in the Macintosh market, saying that the company expects to have the largest share of that market "very soon." Huang said that Nvidia's chips are in the eMac that Apple Computer introduced Monday.



    "You are going to see many more exciting introductions in that space," Huang said.




    Any speculation, gentle readers?

    </strong><hr></blockquote>



    GeForce 5 anyone?
  • Reply 6 of 48
    cake Posts: 1,010 member
    Unlike Splinemodel, I've always preferred nVidia over ATi (for as long as we've had the choice). My GeForce3 (PC version flashed to work in my Mac) still keeps up with or outperforms the ATi 8500 - and the GF3 has been out, what, two years?



    In the long run I have more faith in nVidia than ATi, and HyperTransport sounds very interesting.
  • Reply 7 of 48
    crusader Posts: 1,129 member
    The abbreviation of the GeForce 5 is G5.
  • Reply 8 of 48
    noahj Posts: 4,503 member
    [quote]Originally posted by Cake:

    <strong>Unlike Splinemodel, I've always preferred nVidia over ATi (for as long as we've had the choice). My GeForce3 (PC version flashed to work in my Mac) still keeps up with or outperforms the ATi 8500 - and the GF3 has been out, what, two years?



    In the long run I have more faith in nVidia than ATi, and HyperTransport sounds very interesting.</strong><hr></blockquote>



    Which card did you buy/flash? Can you do this with any of them? I am looking to upgrade my video somewhat cheaply and this looks like a viable route...
  • Reply 9 of 48
    programmer Posts: 3,458 member
    [quote]Originally posted by Derrick 61:

    <strong>GeForce 5 anyone? </strong><hr></blockquote>



    It's coming, you know it is... if you look at how nVidia sequences product introductions, you can tell the future. If you look at the DirectX 9 spec, you can tell what the hardware behind DX9 will have to be able to do. If you know anything about graphics, I'll warn you now... get a towel to mop up the drool...
  • Reply 10 of 48
    airsluf Posts: 1,861 member
    We need more than a towel
  • Reply 11 of 48
    bigc Posts: 1,224 member
    [quote]Originally posted by AirSluf:

    <strong>We need more than a towel </strong><hr></blockquote>



    Yeah, need a diaper

  • Reply 12 of 48
    cake Posts: 1,010 member
    [quote]Originally posted by NoahJ:

    <strong>Which card did you buy/flash? Can you do this with any of them? I am looking to upgrade my video somewhat cheaply and this looks like a viable route...</strong><hr></blockquote>



    It's the original, plain-vanilla GeForce3 that used the original nVidia spec: none of the Ti series seem to be flashable so far, and it might be difficult to find the flashable version, but not impossible. (Specifically, I have a VisionTek GeForce3.) It's a great card, though.



    <a href="http://www.xlr8yourmac.com/Graphics/flashing_pc_geforce3.html"; target="_blank">Here's how to do it</a> and other related links.



    Now, back to our regularly scheduled thread....
  • Reply 13 of 48
    jasonpp Posts: 308 member
    Let's hope the next card can render 1600x1200x32 Final Fantasy: The Spirits Within clips in real time at full production rendering quality...



    Imagine games as good as FF:TSW?!!



    Even Nvidia has said this was a top priority for this year. We'll probably have to wait until 2003 to see it, but vid cards double performance every 6 months... so imagine games 4x better than the best a GeForce4 Ti 4600 128MB can do today (no one has games that can do this yet)...
  • Reply 14 of 48
    programmer Posts: 3,458 member
    [quote]Originally posted by JasonPP:

    <strong>Let's hope the next card can render 1600x1200x32 Final Fantasy: The Spirits Within clips in real time at full production rendering quality...



    Imagine games as good as FF:TSW?!!

    </strong><hr></blockquote>



    Some things just can't be done in real-time in any sort of a reasonable fashion... at least for the next few years. But in many ways it'll get close enough that the consumer won't be able to tell the difference.



    <strong> [quote]

    Even Nvidia has said this was a top priority for this year. We'll probably have to wait until 2003 to see it

    </strong><hr></blockquote>



    Right: they said it was top priority for this year.



    <strong> [quote]

    but vid cards double performance every 6 months... so imagine a games 4x better than the best a Geforce4 4600 128MB can do today (no one has games that can do this yet)...</strong><hr></blockquote>



    It isn't about increasing performance, necessarily, so much as it is about increasing capability. There will be performance increases, but consider that it might be better for future hardware to draw about the same number of pixels, but make each pixel look 4x better. That does represent an increase in computing performance, but it may not show up in pixel/sec or polygon/sec numbers. The new hardware might only double the pixel rate, but it will be able to do far more calculations per pixel.
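
    To put rough numbers on that: total shading throughput is the fill rate times the per-pixel work,

    \[ \text{ops/sec} \;=\; \text{pixels/sec} \times \text{ops/pixel} \]

    so hardware that merely doubles its pixel rate while quadrupling the calculations per pixel delivers 8x the arithmetic, even though the quoted fill-rate number only shows a 2x gain.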



    The real question in this forum should be how aggressively Apple will bring these new technologies to the Macintosh. What apps can do through OpenGL determines how much of the new capability can be leveraged, and currently almost none of it can be... the GeForce 3 and 4 are currently largely wasted on the Mac.
  • Reply 15 of 48
    amorph Posts: 7,112 member
    [quote]Originally posted by Programmer:

    <strong>The new hardware might only double the pixel rate, but it will be able to do far more calculations per pixel.</strong><hr></blockquote>



    And, if I've read things right, they will also not limit the types of calculations that can be done to a handful of options.



    I've heard of capabilities like (I don't know the technical term, so I'll borrow one from the database world) stored procedures, where the software can actually cache a custom function in the GPU and call it. That alone would immensely improve the capabilities of the software. It seems to my one-semester-of-OpenGL-seven-years-ago eyes that one of the major problems facing game programmers is how to disguise polygons. If the GPU makes it easier to disguise them, not as many are necessary, because the engine no longer needs to "throw polygons at the problem" of everything looking like a D&D die.



    I'm speculating here, though.
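
    To make the "stored procedure" analogy concrete, here is a minimal sketch in C against the ARB_fragment_program interface (the direction the OpenGL ARB is standardizing this idea toward). The extension entry points are real, but the little program is purely illustrative, and extension-pointer setup and error checking are omitted: you upload a per-pixel function to the card once, and the card then "calls" it for every fragment it draws.

    [code]
    /* Cache a custom per-pixel function on the GPU and let the card
     * call it for every fragment (ARB_fragment_program sketch). */
    #include <string.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* A tiny "stored procedure": sample the texture, then modulate
     * it by the interpolated vertex colour. */
    static const char *kPixelProc =
        "!!ARBfp1.0\n"
        "TEMP texel;\n"
        "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
        "MUL result.color, texel, fragment.color;\n"
        "END\n";

    GLuint install_pixel_proc(void)
    {
        GLuint prog;
        glGenProgramsARB(1, &prog);                       /* allocate a program name */
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);  /* make it current */
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,       /* upload ("cache") it */
                           GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(kPixelProc), kPixelProc);
        glEnable(GL_FRAGMENT_PROGRAM_ARB);                /* every pixel now runs it */
        return prog;
    }
    [/code]

    Once something like that is enabled, the engine doesn't have to throw polygons at surface detail; the per-pixel function can fake a lot of it.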



    [quote]<strong>What apps can do through OpenGL determines how much of the new capability can be leveraged, and currently almost none of it can be... the GeForce 3 and 4 are currently largely wasted on the Mac.</strong><hr></blockquote>



    I'm predicting a new OpenGL layer and QT6 in the next major release of OS X, regardless of what the MPEG-LA settles on for licensing terms, and regardless of whether they've settled. Apple can always withhold the streaming software if that's a sticking point, but I think they will need to release the rest.



    Apple might call the OpenGL layer a "beta" if the ARB hasn't finished OpenGL 2.0 yet, just so that developers won't be surprised when they change something to conform to a change in the standard. But I think they'll push it out.
  • Reply 16 of 48
    steves Posts: 108 member
    [quote]Originally posted by drewprops:

    <strong>First off, here's the link:

    <a href="http://zdnet.com.com/2100-1103-896996.html"; target="_blank">Nvidia: Forget Microsoft--chip's ahoy</a>



    It's basically a snapshot of where Nvidia stands in the marketplace right now, with a new product getting set to launch soon. It also touches on their relationship with Microsoft and the Xbox.



    BUT, here's the bit that caught my eye:



    Huang also noted the inroads the company has made in the Macintosh market, saying that the company expects to have the largest share of that market "very soon." Huang said that Nvidia's chips are in the eMac that Apple Computer introduced Monday.



    "You are going to see many more exciting introductions in that space," Huang said.




    Any speculation, gentle readers?



    Drew <----going off to read the rest of the article now</strong><hr></blockquote>



    Not to add fuel to the fire, but you might find the following link very interesting:



    <a href="http://news.com.com/2100-1040-896850.html"; target="_blank">http://news.com.com/2100-1040-896850.html</a>;



    Specifically:



    [quote]"Also on Monday, Nvidia raised its financial outlook for the just-ended quarter, and Huang said he sees continued market share gains this year leading to more growth. Some of that will come from a new graphics chip slated to arrive in August.



    The new chip will be manufactured on Taiwan Semiconductor Manufacturing Co.'s latest 0.13-micron manufacturing process, Huang said. Huang did not reveal the name or specific features of the chip, but did say it was a fundamentally new architecture from the GeForce 4 Titanium introduced earlier this year.



    "It is the most important contribution we've made to the graphics industry since the founding of this company," Huang said, speaking at the Merrill Lynch Hardware Heaven Technology Conference.



    Huang also noted the inroads the company has made in the Macintosh market, saying that the company expects to have the largest share of that market "very soon." Huang said that Nvidia's chips are in the eMac that Apple Computer introduced Monday.



    "You are going to see many more exciting introductions in that space," Huang said. "<hr></blockquote>



    Hmmm.... I wonder if this would make it in time for a late July announcement at MWNY?



    Steve



    Edit: Oops, I didn't realize the original link had basically the same information...



    [ 05-02-2002: Message edited by: SteveS ]
  • Reply 17 of 48
    lemon bon bon Posts: 2,383 member
    G5 and G5?



    :cool:



    Well, here's hoping...



    Lemon Bon Bon
  • Reply 18 of 48
    xmoger Posts: 242 member
    [quote]Originally posted by JasonPP:

    <strong>Let's hope the next card can render 1600x1200x32 Final Fantasy: The Spirits Within clips in real time at full production rendering quality...

    Imagine games as good as FF:TSW?!!</strong><hr></blockquote>

    [quote]Originally posted by Programmer:

    <strong>Some things just can't be done in real-time in any sort of a reasonable fashion... at least for the next few years. But in many ways it'll get close enough that the consumer won't be able to tell the difference.</strong><hr></blockquote>

    Nvidia already <a href="http://gamespot.com/gshw/stories/news/0,12836,2820060,00.html" target="_blank">demoed</a> this, although it was at 2 FPS and probably not at the insane resolution they use to print to film.
  • Reply 19 of 48
    aquatic Posts: 5,602 member
    Yeah, SteveS, but didn't nVidia recently have some "problems" with the SEC? I hear Merrill Lynch is having troubles, too...
  • Reply 20 of 48
    Hmm... I think that along with a new chipset will come a new name. GeForce 5? I think the numbers would just get too high after that. GeForce 10? I don't think so...