AMD making dual-core CPUs! What about Apple?


Comments

  • Reply 21 of 44
    programmer Posts: 3,458 member
    Quote:

    Originally posted by emig647

    But cinebench (I know this is a particular program and doesn't show all around results) shows that the x800pro does not have much of a performance gain over the 9800xt... From what I have read the x800pro and 6800 are about neck and neck... sometimes the x800pro edging it by a small margin.



    The latest cards are more about improved capability than improved base performance. Not only can they do more, they can also do it faster than previous cards. They show the least improvement when they are doing lots of less complex operations -- the 9800 is quite capable in that department. The new hardware will really outshine the 9800 when you run more complicated shaders on it, and most software isn't currently using very complex shaders... they're not even leveraging the 9600's capabilities yet!
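
    For anyone curious, here's a rough, untested sketch of how you'd ask the driver what fragment-program limits a card actually exposes. The enum and entry-point names come straight from the ARB_fragment_program extension spec; the GLUT window exists only to get a current context, and linking the extension functions directly assumes Mac OS X's OpenGL framework (elsewhere you'd fetch them with the usual GetProcAddress dance). As I recall, R3xx parts like the 9600/9800 report on the order of 64 ALU and 32 texture instructions, while the newest parts allow far longer programs.

    /* Rough sketch, untested: report the fragment-program limits the driver
     * exposes for the current renderer. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <GLUT/glut.h>
    #include <OpenGL/glext.h>
    #else
    #include <GL/glut.h>
    #include <GL/glext.h>
    #endif

    static void report(const char *label, GLenum pname)
    {
        GLint value = 0;
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB, pname, &value);
        printf("%-24s %d\n", label, (int)value);
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("fp-limits");   /* only here to make a GL context current */

        report("total instructions:", GL_MAX_PROGRAM_INSTRUCTIONS_ARB);
        report("ALU instructions:", GL_MAX_PROGRAM_ALU_INSTRUCTIONS_ARB);
        report("texture instructions:", GL_MAX_PROGRAM_TEX_INSTRUCTIONS_ARB);
        report("texture indirections:", GL_MAX_PROGRAM_TEX_INDIRECTIONS_ARB);
        return 0;
    }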



    Apple should be aiming for their next motherboard revision (6-9 months from now) to incorporate PCIe and ship with the recently announced ATI/nVidia GPUs. Holding those parts back until then won't actually hurt anyone.





    The lack of "pro" 3D graphics hardware is an interesting question. There are certainly higher-end and more feature-complete boards available for the PC, but what fraction of the pro market really needs those capabilities? Given proper software support the 9800 ought to do just fine for most pro users, with the only real issue being anti-aliased line drawing support (the equivalent nVidia GPU doesn't have that issue). For many pros even the 9600 does the job. Nonetheless, Apple would do well to get a proper "pro" 3D card into the Mac, if only to satisfy the small fraction of pro users who really need it, and the much larger number of them who just suffer from techno-lust.
  • Reply 22 of 44
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by emig647

    I'll bite onlooker... that could possibly work.



    But I still stand by my main point... PCI-E is coming... why produce all of these AGP 8x cards when the technology will be extinct soon? If these were PCI-E cards I'd say Apple should go for it. Someone also mentioned that PC sales need to drive prices down before Apple can really use it. Think about how overpriced the x800xt is right now... for not that much performance gain over the 9800xt. Either way, by the time PCI-E comes out for Apple machines, prices will be acceptable.




    I hate the fact that you keep using ATI as an example when their OpenGL is a known tragedy amongst the 3D community.



    You're right about PCIe though - it's coming, but right now there are still more AGP Macs than PCIe Macs that could use such a card, and even at this stage it would start to excite the 3D community with talk of acceptably 3D-capable Apple workstations.

    We also don't truly know when PCIe is going to be regularly integrated into PC motherboards, let alone Macs (well, I don't anyway), but the downside to PCIe exclusivity is that these cards would only work in new PowerMacs with PCIe. I think a great portion of the users who would make this viable would be unable to get a card at that point if PCIe cards came first. Because after buying Maya or Lightwave (or whatever you use) you may be too broke to buy a $3,500 PowerMac and a $2,000-$3,000 graphics card on top of that. And again, price at the "pro" level really isn't the issue, because these cards need to be at that level; bringing in lesser cards would just defeat the purpose.



    That is, if Apple wanted to actually break even or profit the first time they had high-end graphics cards ready, going PCIe-only could be a mistake. If they just went with PCIe, some users might have to wait until a second generation of 3D-ready Macs before they have enough money to afford such a thing. But on the other hand, those users-in-waiting may not matter, because Apple could possibly still sell enough to do just fine without selling to those who are not doing this professionally yet, or who just can't afford all three purchases combined.

    But if the card(s) are AGP, previous AGP PowerMac owners become potential buyers if they are into 3D.



    On a last note, I've read in a few places that when PCIe is adopted there will probably still be an AGP port on most of the early motherboards before they go PCIe-exclusive. If Apple were to do the same and not go PCIe-exclusive on the first motherboard, then deliberately making the first-generation pro 3D card an AGP version would be seriously beneficial in many ways, for reasons I already mentioned or alluded to, but I'll go over them again.



    #1 Older machines can use them, which in turn also applies to #2.



    #2 Make the second-generation pro 3D card a PCIe version for the second generation of PCIe motherboard in the PowerMac, which presumably would be a PCIe-graphics-exclusive board. That would give the last PowerMac with both AGP and PCIe ports an upgrade path, which makes it easier for 3D users to see the Mac as a viable 3D option in the future, and it also guarantees more sales of the second-generation 3D cards for Macs because the previous generation still has the option to upgrade to them. So you see... going PCIe-only on this is probably a mistake.



    With "that" in mind, "that" being two things:

    #1) AGP for the first generation of 3D cards, and

    #2) AGP and PCIe in the first generation of PCIe motherboards.



    I don't think the PCIe transition would be as successful as it could be, and any 3D graphics card would sell fewer units if it were done differently.



    Without "that" in both respects (singly and together), I think the move to PCIe is like entering a room with only one door, and that door only lets you in.



    I'm sure you can tell by my writing that I'm pretty tired and falling asleep at the keyboard, but I think (hope) my points came out clearly enough.
  • Reply 23 of 44
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by Programmer

    The latest cards are more about improved capability than improved base performance. Not only can they do more, they can also do it faster than previous cards. They show the least improvement when they are doing lots of less complex operations -- the 9800 is quite capable in that department. The new hardware will really outshine the 9800 when you run more complicated shaders on it, and most software isn't currently using very complex shaders... they're not even leveraging the 9600's capabilities yet!



    I meant to make this point earlier...



    Only a few applications really take advantage of the higher-end cards over the semi-high-end cards. If you're a gamer you won't see anything. In Cinema 4D you might see a little more. In Maya you will see even more. However, even in Maya, comparing the 9800 XT to the X800 Pro or XT you won't see a HUGE difference. Programmer is right that many programs still haven't tapped into the 9600's maximum performance.



    Again, he is right that they should be focusing on the next motherboard, when PCI-E is a true standard... or closer to becoming one.
  • Reply 24 of 44
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by onlooker

    I hate the fact that you keep using ATI as an example when their OpenGL is a known tragedy amongst the 3D community.



    So what should I be comparing against? Any other company besides NVIDIA will result in a $1k+ graphics card.



    Quote:



    ... Because after buying Maya or Lightwave (or whatever you use) you may be too broke to buy a $3,500 PowerMac and a $2,000-$3,000 graphics card on top of that. And again, price at the "pro" level really isn't the issue, because these cards need to be at that level; bringing in lesser cards would just defeat the purpose.



    ...



    With "that" in mind, "that" being two things:

    #1) AGP for the first generation of 3D cards, and

    #2) AGP and PCIe in the first generation of PCIe motherboards.



    I don't think the PCIe transition would be as successful as it could be, and any 3D graphics card would sell fewer units if it were done differently.





    How many users are going to shell out $6,500 JUST to use a Mac? Why not spend ~$2k on a PC and bear with Windows? Is a Mac really going to make your work go any faster?



    I do agree with you that Apple needs to do something about the graphics cards. What I don't understand is why Apple's cards cost so much more, and why you can't stick a PC card in and flash the ROM like you used to be able to on the ATI 8500, GeForce 2 and 3... oh, and the 3dfx Voodoo 5500. If Apple could get back to that stage we could easily have high-end graphics cards.
  • Reply 25 of 44
    zapchud Posts: 844 member
    Quote:

    Originally posted by emig647

    I meant to make this point earlier...



    Only a few applications really take advantage of the higher-end cards over the semi-high-end cards. If you're a gamer you won't see anything. In Cinema 4D you might see a little more. In Maya you will see even more. However, even in Maya, comparing the 9800 XT to the X800 Pro or XT you won't see a HUGE difference. Programmer is right that many programs still haven't tapped into the 9600's maximum performance.




    No, you're misinterpreting Programmer's post. Many programs might not have tapped into the 9600's max performance, but many programs, and especially games, have. The X800 is clearly superior in performance to the 9800 for a great many games. See AnandTech.



    In reference to the 9600, what Programmer is talking about is shader program length (because there are limits). Newer cards can run longer programs while sustaining higher performance.
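
    To make "program length" concrete, here's another rough, untested sketch showing what one of these shader programs looks like in the ARB_fragment_program assembly OS X exposes, and where it gets handed to the driver. A program that exceeds the card's instruction limits is rejected right at glProgramStringARB, and you can read back the error position and string as shown. Same assumptions as the sketch in Programmer's post above: a current GL context via GLUT and directly linkable ARB entry points.

    #include <stdio.h>
    #include <string.h>
    #ifdef __APPLE__
    #include <GLUT/glut.h>
    #include <OpenGL/glext.h>
    #else
    #include <GL/glut.h>
    #include <GL/glext.h>
    #endif

    /* Trivial two-instruction program: sample a texture, modulate by vertex colour. */
    static const char *fp_src =
        "!!ARBfp1.0\n"
        "TEMP tex;\n"
        "TEX tex, fragment.texcoord[0], texture[0], 2D;\n"
        "MUL result.color, tex, fragment.color;\n"
        "END\n";

    int main(int argc, char **argv)
    {
        GLuint prog;
        GLint errpos;

        glutInit(&argc, argv);
        glutCreateWindow("fp-load");            /* only here to get a context */

        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(fp_src), fp_src);

        /* -1 means the program was accepted; otherwise the driver reports
         * the offset of the offending token and a human-readable message. */
        glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errpos);
        if (errpos != -1)
            printf("rejected at %d: %s\n", (int)errpos,
                   (const char *)glGetString(GL_PROGRAM_ERROR_STRING_ARB));
        else
            printf("program accepted\n");
        return 0;
    }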



    Also, the 9800 is still a very powerful card for pro usage. Nonetheless, there's still room for improvement. There's a reason Apple lists the 9800 or better as the recommended card.



    But I'll stop here. This is way off topic.
  • Reply 26 of 44
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by Zapchud

    No, you're misinterpreting Programmer's post. Many programs might not have tapped into the 9600's max performance, but many programs, and especially games, have.



    We weren't talking about games, we were talking about high end graphics programs.
  • Reply 27 of 44
    zapchud Posts: 844 member
    Quote:

    Originally posted by emig647

    We weren't talking about games, we were talking about high end graphics programs.



    But you were still wrong.
  • Reply 28 of 44
    emig647 Posts: 2,455 member
    Quote:

    Originally posted by Zapchud

    But you were still wrong.



    Wrong about what? The shaders? Which programs take major advantage of them when comparing the 9800 XT and the X800 Pro?
  • Reply 29 of 44
    Apple fell behind the curve a long time ago and has stayed there, despairingly so. I doubt we'll see a dual-core desktop PPC chip in an Apple-branded computer until late 2005 at the earliest. Predictably behind the competition, as always.
  • Reply 30 of 44
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by emig647



    How many users are going to shell out $6,500 JUST to use a Mac?




    It costs the same to configure a PC with the same speeds/features, and how many users would pay would be determined by the pre-orders.



    Also, I found the Quadro FX 4000 for $1,995.00, which is much less than my original search turned up.



    ATI's best effort at a 3D card is the ATi FireGL X2-256t.



    Tom's Hardware did a comparison of the ATI FireGL X2-256t vs. the NVIDIA Quadro FX 1100.

    The NVIDIA Quadro FX 1100 is about 4x slower, or a bit less, in real-world performance than the Quadro FX 4000.

    Check out what Tom's Hardware's conclusion was on the Quadro FX 1100 and the FireGL X2-256t.



    BTW the 256t is more expensive than the 1100.



    (Tom's Hardware link)



    (taken from the Nvidia web site)



    The NVIDIA Quadro® FX 4000 sets a new bar for workstation graphics, shattering the limits of performance, programmability, precision, and quality for professional CAD, DCC, and scientific applications. Featuring a revolutionary new architecture with 2x the geometry and fill rate, 5x the hardware pixel read-back performance and 1.25x the memory bandwidth of previous generation workstation graphics, and with support for ultra-fast GDDR3 memory, the NVIDIA Quadro FX 4000 is the first ultra high-end workstation graphics solution on the planet. Implementation of rotated grid FSAA introduces far greater sophistication in the multi-sampling pattern, significantly increasing color accuracy and visual quality of edges and lines without compromising performance.



    Performance:

    Highest Workstation Application Performance

    Next-generation architecture enables over 2x improvement in geometry and fill rates with the industry's highest performance for professional CAD, DCC, and scientific applications.



    Precision:

    NVIDIA High-Precision Dynamic-Range (HPDR) Technology

    HPDR sets new standards for image clarity and quality through floating point capabilities in shading, filtering, texturing, and blending. Enables unprecedented rendered image quality for visual effects processing.



    Programmability:

    Next-Generation Vertex & Pixel Programmability

    NVIDIA Quadro FX 4000 GPUs introduce infinite length vertex programs and dynamic flow control, removing the previous limits on complexity and structure of shader programs. With full support for Vertex and Shader Model 3.0, NVIDIA Quadro FX 4000 GPUs deliver sophisticated effects never before imagined for real-time graphics systems.



    Quality:

    Rotated-Grid Full-Scene Antialiasing (FSAA)

    The rotated-grid FSAA sampling algorithm introduces far greater sophistication in the sampling pattern, significantly increasing color accuracy and visual quality for edges and lines, reducing "jaggies" while maintaining performance.



    Quadro FX 4000

    Memory Size = 256MB

    Memory Interface = 256-bit

    Graphics Memory Bandwidth = 32.0GB/sec

    Display Connectors = DVI-I+DVI-I+Stereo

    Dual-Link DVI = 2

    proe-02 = 45.1

    ugs-03 = 59.0

    3dsmax-02 = 29.2



    3D Primitive Perf

    Triangles per Second = 133 Million

    Texels per Second/Fill Rate = 4.5 Billion



    Relative 3D Application Performance

    SPEC PROE-02 = 4.3

    SPEC UGS-03 = 6.1
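
    (A quick back-of-the-envelope check of that bandwidth figure, my own arithmetic rather than anything from NVIDIA's page: a 256-bit interface moves 32 bytes per transfer, so 32.0 GB/sec works out to a 1 GHz effective data rate, i.e. roughly a 500 MHz GDDR3 clock double-pumped.)

    #include <stdio.h>

    int main(void)
    {
        const double bus_bits   = 256.0;   /* quoted memory interface width   */
        const double mem_clock  = 500e6;   /* Hz; inferred, not a quoted spec */
        const double ddr_factor = 2.0;     /* GDDR3 transfers twice per clock */

        double bytes_per_sec = (bus_bits / 8.0) * mem_clock * ddr_factor;
        printf("%.1f GB/sec\n", bytes_per_sec / 1e9);   /* prints 32.0 GB/sec */
        return 0;
    }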



    (Below taken from CGnetworks)



    The new NV40GL GPU contains significantly more hardware than previous Quadro generations: 16 pixel pipelines, six vertex shader units, and two fragment/pixel shader units per pipeline. NVIDIA claims 4-8 times faster pixel shader FP32 support, and twice the geometry and fill rate performance over the Quadro FX 3000. We'll look into these performance claims when the final board arrives for us to perform benchmarks.



    Compared to the Quadro FX 3000, the FX 4000 will feature:



    Support for Microsoft Vertex Shader 3.0 and Pixel Shader 3.0. FP32 format support is now a requisite.



    Support for High Dynamic Range (HDR) rendering. NVIDIA is claiming full hardware support for the OpenEXR 16-bit floating-point standard.



    Integrated programmable video processor (meant to support encoding and decoding of MPEG2/4)



    A new rotated grid sampling full-screen anti-alias scheme (RG-FSAA). NVIDIA claims that 4x RG-FSAA is superior in quality to 8x ordered grid FSAA in the Quadro FX 3000.



    Like previous performance-oriented Quadro products, the new model Quadros are for professional applications that frequently feature multiple rendering windows. The Quadro FX 4000 and Quadro FX 4000 SDI offer additional capabilities over and above the consumer-oriented GeForce 6800:



    Antialiased points and lines for wireframe display.



    OpenGL logic operations.



    Up to eight clip regions (GeForce 6800 supports one).



    Hardware accelerated clip planes.



    Memory usage optimization for multiple simultaneous graphics windows.



    Support for two-sided lighting.



    Hardware overlay planes.



    Support for quad-buffered stereo for shutter glasses.







    (Raves from high-end companies, and even Adobe)



    "Designers need their work to be as realistic as possible as early in the design stage as possible. CATIA® V5 accelerated by NVIDIA Quadro FX graphics allows designers to create and interact with complex 3D models with ultra-realistic material aspects in real time. New NVIDIA Quadro FX 4000 capabilities like extended programmability, infinite program lengths, and 32-bit shader precision promise to further increase CATIA user productivity and design quality."



    Jean-Luc Cuinier

    R&D manager at Dassault Systèmes



    www.3ds.com



    "The NVIDIA Quadro FX 4000 combines complete floating point performance, extended programmability, and high-precision dynamic range technology allowing mental ray® customers, for the first time, to leverage the GPU for dramatically accelerated photorealistic rendering of complex visual effects. To take full advantage of key new features of the just-released version 3.3 of mental ray, we recommend the NVIDIA Quadro FX 4000 as the graphics platform of choice."



    Rolf Herken

    CEO and CTO, Mental Images



    www.mentalimages.com



    "Professionals using Adobe software can now take full advantage of breakthrough computer graphics hardware to drive stunning 3D animations. The NVIDIA Quadro FX 4000 drives great new features in the Adobe Video Collection like Premiere Pro?s GPU Effects and After Effect?s real-time, high-quality 3D compositing."



    David Trescot

    Senior director of digital video products at Adobe



    www.adobe.com



    "With the performance that we?ve seen, Alias is very excited about the radically new graphics architecture within the NVIDIA Quadro FX 4000. With the introduction of many exciting new technologies, we anticipate professionals using Maya will be able to take digital artwork created for movies and games to significantly higher levels of realism."



    Rob Hoffmann

    Senior Maya product marketing manager, Alias



    www.alias.com



    "The new rotated-grid antialiasing of the NVIDIA Quadro FX 4000 makes NVIDIA?s product line quality a lot more realistic, even for the most demanding designer?s eye. NVIDIA is once again helping SolidWorks deliver industry-leading quality to engineers and designers without compromising performance."



    Antony Hervo, Hardware Partner Manager at SolidWorks Corporation.



    www.solidworks.com





    If Apple decides to do it, they might as well do it right and give us the real deal (NVIDIA), or it will probably be overlooked.
  • Reply 31 of 44
    @homenow Posts: 998 member
    Quote:

    Originally posted by utmostcertainty

    Apple fell behind the curve a long time ago and has stayed there, despairingly so. I doubt we'll see a dual-core desktop PPC chip in an Apple-branded computer until late 2005 at the earliest. Predictably behind the competition, as always.



    IBM and/or Freescale will make it happen or not; Apple does not make processors, they just use them. IBM has wanted a consumer-level multi-core processor since shortly after the G3 came out, when IBM and Moto released the "wish list" of features for the G4, which included multi-core chips. Apple and Moto chose to add the SIMD unit to the G4, not IBM. I would imagine that IBM will have a multi-core "desktop" processor as soon as the design is economically feasible.
  • Reply 32 of 44
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by utmostcertainty

    Apple fell behind the curve a long time ago and has stayed there, despairingly so. I doubt we'll see a dual-core desktop PPC chip in an Apple-branded computer until late 2005 at the earliest. Predictably behind the competition, as always.



    I agree with @homenow. It will happen, but you may be right in saying it may be that far off in the future.

    When the PowerMac G4 500 was introduced, the wait before any real upgrades was so long it made my skin crawl. I think the PowerMacs should be updated in at least some way every 6 months minimum, but we just waited 12 months for a bump that wasn't that bad; it should have been a more impressive update for a 12-month gap.

    So, going by how the G4 went, and after seeing what just happened over the last 12 months, I'd say you could be quite right that IBM and Apple may not have a dual-core CPU until late 2005 or later, which is a tragedy because IBM and Apple are within striking distance of the competition.

    They could each make great inroads in converting users of other systems if they could get their acts together where they are now lacking. It's probably easier for Apple to improve, by offering the better graphics needed for such a transition, because the parts and technology are already there; IBM, on the other hand, has to develop its technology and have it in quantity, and that is one of the issues they seem to be having a hard time with right now.
  • Reply 33 of 44
    jcg Posts: 777 member
    Quote:

    Originally posted by onlooker

    ...So, going by how the G4 went, and after seeing what just happened over the last 12 months, I'd say you could be quite right that IBM and Apple may not have a dual-core CPU until late 2005 or later, which is a tragedy because IBM and Apple are within striking distance of the competition. ...



    I wouldn't make the mistake of comparing the 970-family chips and the G5's descendants with the G4's development. Also, I think the driving force in adding multiple cores to these chips is going to be IBM, not Apple, because it will make their PowerPC blade servers more attractive.
  • Reply 34 of 44
    wizard69 Posts: 13,377 member
    I like that thought: "12 months to install a radiator." People should put this into perspective: waiting 12 hours to have a new radiator installed in a car is excessive. Here we have waited over 12 months for what amounts to a radiator install.



    That 2.5 GHz machine would have been a nice speed bump after 3-6 months. The current machine is a total insult to anybody with any brains at all. What does Apple want us to do, sit back for another year and watch them get farther and farther behind the technology curve while they milk the customer for every penny they have?



    I suppose one can always hope that WWDC will bring a machine with PCI Express and other modern features, but let's face it, it doesn't look good. So it is a reasonable question: what has Apple engineering been up to for the last year or so? The thought of a kick-ass iMac replacement does whirl around in one's mind, but I just have the feeling we will be waiting even longer for that machine.



    Thanks

    Dave







    Quote:

    Originally posted by onlooker

    I've been alluding to that in most of my arguments against these requests for ridiculous new products for a long time, but mostly what I see of the recent complaints is about neglecting existing products. I for one have been complaining about the lack of graphics features in PowerMacs, and that after a year's time there was only a speed bump. Even if you don't have an 80 billion dollar war chest, you have 12 months to make adjustments to your existing pro machine, but all you do is increase the MHz. Sure they added a water cooler, but IBM was working on the processors. What was the team dedicated to the PowerMac doing for 12 months? Installing a radiator?



    I think my personal biaching is valid. But that's my opinion.




  • Reply 35 of 44
    wizard69 Posts: 13,377 member
    What rev this machine is really has no bearing on the value of the machine as seen by the marketplace. Apple's memory allocations are always seen in the market as being stingy in the extreme. In some cases the allocations were so bad as to have a dramatic impact on the performance of their hardware.



    Further, if you really believe that delivering the machines with a reasonable allocation of memory would so grossly impact Apple's profit margin (the highest in the industry), then do take a close look at the retail and wholesale prices of memory. There simply is not enough difference in price to have a significant impact on profits. I mean, do you really believe that Apple's margins come solely from the RAM installed on its machines? If so, that thought is garbage.



    Thanks

    dave







    Quote:

    Originally posted by emig647

    This was REV B... which means updating... not re-inventing.

    I suppose they could have added more memory (RAM and HD)... but that just would have taken away from their profit margins big time. Personally I wish they would come without an HD and RAM, leaving those as BTO options.



  • Reply 36 of 44
    hmurchison Posts: 12,423 member
    Quote:

    That 2.5 GHz machine would have been a nice speed bump after 3-6 months. The current machine is a total insult to anybody with any brains at all. What does Apple want us to do, sit back for another year and watch them get farther and farther behind the technology curve while they milk the customer for every penny they have?



    Dave, only Intel is above 2.5GHz for their processors, and that's because of their superpipelining. Apple/IBM are right where they need to be. I thought the 3GHz goal was a little excessive for a company that was stuck at 500MHz for 18 months. If any good comes from it, it will be getting Steve to shut his mouth for once and let the engineers work.



    Apple simply is not going to rev the G5 motherboard to PCI Express in less than a year. We were ALL dreamin'. PCI Express is not ready in volume. The cards are not ready. People are already voicing displeasure at the need to wait until July for the 2.5s.



    The next update will contain the new goodies. I see some good stuff coming, and next spring definitely sounds plausible for incorporating recent tech. New OS... new HW features.
  • Reply 37 of 44
    onlooker Posts: 5,252 member
    I said this earlier, but I'll say it again: if you were Apple, it would be much wiser to offer both a PCIe and an AGP slot on your first motherboard that featured PCIe, and wait to go PCIe-exclusive until your second motherboard that featured PCIe.



    For reasons I already listed.
  • Reply 38 of 44
    wizard69 Posts: 13,377 member
    You seem to miss the whole point: the price of everything in these machines has dropped dramatically since last year. It would not have hurt Apple to enhance the machines a bit to offer better value to the customer. Frankly, it's a shame that Apple still charges top dollar for their hardware and then configures those machines with a limited amount of memory. It is certainly no favor to the customer, and it really hurts them in sales.



    As to PCI Express, if Apple waits another year to bring it out they will have been extremely foolish. It makes no difference how old the current rev of the motherboard is; it is a matter of keeping position in the market. Frankly, it would have been better to wait until July or August, even, for a machine that supports PCI Express. The current machines will have to be replaced shortly by anybody who expects and needs high performance. Investment-wise they just won't have the life to justify the price unless you are in a very performance-sensitive business.



    Many people are going to sit back and simply say nice try Apple but I'll wait until you get your act together again.



    Thanks

    Dave





    Quote:

    Originally posted by hmurchison

    Dave, only Intel is above 2.5GHz for their processors, and that's because of their superpipelining. Apple/IBM are right where they need to be. I thought the 3GHz goal was a little excessive for a company that was stuck at 500MHz for 18 months. If any good comes from it, it will be getting Steve to shut his mouth for once and let the engineers work.



    Apple simply is not going to rev the G5 motherboard to PCI Express in less than a year. We were ALL dreamin'. PCI Express is not ready in volume. The cards are not ready. People are already voicing displeasure at the need to wait until July for the 2.5s.



    The next update will contain the new goodies. I see some good stuff coming, and next spring definitely sounds plausible for incorporating recent tech. New OS... new HW features.




  • Reply 39 of 44
    Quote:

    Originally posted by wizard69

    ...Frankly, it's a shame that Apple still charges top dollar for their hardware and then configures those machines with a limited amount of memory...





    *sigh*



    Are you an Apple reseller? My guess is not. I am, and the reason that the boxes stay with minimal RAM is to allow me and others like me an opportunity to increase OUR margins, since we're often running at around 5% gross on top of DAC. RAM is not the deal-breaker you think it is.
  • Reply 40 of 44
    hmurchison Posts: 12,423 member
    Quote:

    Originally posted by Jellytussle

    *sigh*



    Are you an Apple reseller? My guess is not. I am, and the reason that the boxes stay with minimal RAM is to allow me and others like me an opportunity to increase OUR margins, since we're often running at around 5% gross on top of DAC. RAM is not the deal-breaker you think it is.




    Jelly I was in retail and corporate sales of Macs for years. I know exactly what you mean. Consumers don't understand the business aspect of computer sales. I'm not saying they have to care but reality is reality.



    I remember we loved the hockey-puck iMac mice. That was usually an instant new mouse sale, or we sold those cheap clip-on pieces for better grip. Cost us $2, sold 'em for $10. RAM was really the only way to make up profit.



    Don't consumers ever wonder why many printers don't come with cables? It's a concession to the stores. Bundling too much with your product leaves nothing for retailers, and without retailers there is no store to go see the product in.



    With Macs I only worry about what can't be changed easily. RAM, the HD, and the AGP card can be replaced; processors cannot.