Has Apple any hope?


Comments

  • Reply 61 of 78
    [quote]Originally posted by MarcUK:
    Likewise we've had pretty much the same MB for 4 years.[/quote]



    What is that supposed to mean!? Macs have shipped with 32MB to 512MB of RAM in the past 4 years. It's cheap to buy; what's the difference if you buy it yourself or they put it in for you?



    Capacity has been 1-2GB in that time; how much do you want?
  • Reply 62 of 78
    Umm, Clive, he means motherboard. Yeah, sorry about that... lol
  • Reply 63 of 78
    Hmm, I think he means motherboard. Which is just not true.
  • Reply 64 of 78
    I think Apple should surrender in the desktop war and focus 60-70% of R&D on laptops... where they actually have a small lead and a fighting chance...
  • Reply 65 of 78
    hobbes Posts: 1,252
    If we don't see even a hint of what's to come on October 14, that's a bit worrying.



    If we don't see even a hint of what's to come by January 2003, that's very worrying.



    I think we'll have a hint by Oct. 14, and the new chip by the middle of next year. But who knows.



    It's clear that Apple is cooking up their next-generation chip. The question that no one (perhaps not even Apple) can answer is when...



    [ 09-11-2002: Message edited by: Hobbes ]
  • Reply 66 of 78
    [quote]Originally posted by gumby5647:
    I think Apple should surrender in the desktop war and focus 60-70% of R&D on laptops... where they actually have a small lead and a fighting chance...[/quote]





    What's so great about PC hardware? Other than a DDR bus and evolutionary improvements like ATA-133, I don't really see many huge gains in mobo tech over the last 2 years.



    It's obvious that next-gen tech like HyperTransport, RapidIO and PCI Express are vying for their place on future mobos. What I don't understand is why Mac users view HW as something that cannot be updated quickly. I could see processor problems, but I'm fairly confident that Apple will be in a position to be competitive when next-gen HW starts shipping.



    Threads like this always devolve into bitching threads about Apple hardware. Perhaps AI should create a Doom and Gloom topic so that the worrywarts can congregate and moan in unison.
  • Reply 67 of 78
    zoso Posts: 177
    [quote]Originally posted by Clive:
    Because they will also have higher clockspeeds?[/quote]



    No. I'm sorry, but I don't think you're getting this totally right. Regarding the Hammer, it already is some 20-40% faster than an equally clocked Athlon running the same 32-bit code.

    I think CPUs are not so simple: just as raw frequency can be a performance myth, so can pipeline depth. The P4's pipeline is certainly huge, but it has its benefits (the P4 is arguably the most powerful desktop CPU around). Some people say the P4 architecture will not last long after 2004 because there isn't the technology to keep pushing the frequency up so much, but that's Intel's problem; it doesn't lie in the architecture itself.



    I don't have the knowledge necessary to prove this, but as far as I can tell there are other factors beyond pipeline depth in determining the IPC rate of a CPU.



    ZoSo
  • Reply 68 of 78
    stoo Posts: 1,490
    [quote]I just think Apple should have a long hard look at their business model. It's too rigid. Too rigid to reach 10%. I'd like to be proven wrong.[/quote]



    Could they support more BTO options? Methinks yes.





    Some ramblings about IPC and performance, correct me if I'm wrong.



    Instructions Per Second = Instructions Per Cycle x Clock Rate.



    To make up for the difference in clock rate between the G4+ and IA32, the G4+ needs a vastly better IPC. (Divide IPS by one million and you get MIPS, which ain't a good cross-platform benchmark anyway.)
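
    To make that concrete, here's a back-of-envelope sketch in Python (the IPC and clock figures are invented for illustration, not measurements):

    [code]
    # Hypothetical numbers only: instructions per second = IPC x clock rate.
    def ips(ipc, clock_hz):
        return ipc * clock_hz

    g4_plus = ips(ipc=3.0, clock_hz=1.0e9)  # made-up 1 GHz G4+ retiring 3 instructions/cycle
    p4 = ips(ipc=1.2, clock_hz=2.8e9)       # made-up 2.8 GHz P4 retiring 1.2 instructions/cycle

    print(g4_plus / 1e6)  # 3000.0 MIPS
    print(p4 / 1e6)       # 3360.0 MIPS: the G4+ would need IPC > 3.36 to match at 1 GHz
    [/code]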



    IPC has nothing to do (directly) with clock rate.



    Branch prediction accuracy affects IPC: the fewer cycles wasted in processing (and then discarding) mispredictions, the closer the actual IPC is to the processor's optimum. Beefing up the branch prediction unit (BPU) can increase IPC but has a transistor cost; this is probably partly the case with Hammer's increased IPC.



    The commonly mentioned advantage of a short pipeline is that it wastes fewer cycles on a misprediction, as there are fewer stages of work to throw away. (However, each of those fewer stages has to do more work.)
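
    A toy model of that tradeoff (every number here is invented): fold the misprediction penalty into cycles-per-instruction and watch effective IPC drop as the pipeline gets deeper.

    [code]
    # Effective IPC under branch mispredictions; all figures are illustrative.
    def effective_ipc(ideal_ipc, branch_freq, mispredict_rate, penalty_cycles):
        stall_cpi = branch_freq * mispredict_rate * penalty_cycles  # wasted cycles per instruction
        return 1.0 / (1.0 / ideal_ipc + stall_cpi)

    print(effective_ipc(3.0, 0.2, 0.05, 5))   # ~2.61: short pipeline, small penalty
    print(effective_ipc(3.0, 0.2, 0.05, 18))  # ~1.95: long pipeline, same predictor
    [/code]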



    The CPU will be working at its optimal IPC when instructions are being retired by all of the available units as quickly as possible. Adding more functional units (usually) increases IPC (probably Hammer's other IPC trick).



    All of the CPU's functional units have to be kept busy and fed with data. With caches having high hit rates (90% or so), this isn't as tricky as it might appear from the CPU/FSB clock rate difference. Cache size and speed therefore give another IPC tradeoff. Having more registers (PPC: 32 general-purpose registers, x86: 8) is also useful for keeping units fed.
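
    For instance, with a 90% hit rate the average cost of a memory access stays near the cache's latency, despite the slow bus (numbers below are made up):

    [code]
    # Average memory access time (AMAT) for a single cache level; illustrative numbers.
    def amat(hit_cycles, miss_rate, miss_penalty_cycles):
        return hit_cycles + miss_rate * miss_penalty_cycles

    # A 2-cycle cache with a 90% hit rate in front of a 50-cycle trip over the FSB:
    print(amat(2, 0.10, 50))  # 7.0 cycles on average
    [/code]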



    Out-of-order execution can increase IPC by running independent code while waiting for other code's results, but (no surprise) is complex and costs transistors.
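
    A minimal scheduling sketch of that idea (a toy single-issue machine, not any real CPU): the out-of-order version slips independent work under a slow load.

    [code]
    # Toy single-issue machine: one instruction may issue per cycle once its inputs are ready.
    # Each instruction is (name, inputs, latency); all numbers invented.
    program = [
        ("load_a", [],         3),  # slow load
        ("add1",   ["load_a"], 1),  # depends on the load
        ("mul1",   [],         1),  # independent work
        ("mul2",   [],         1),  # independent work
    ]

    def total_cycles(instrs, out_of_order):
        done, clock, pending = {}, 0, list(instrs)
        while pending:
            ready = [x for x in pending
                     if all(done.get(d, float("inf")) <= clock for d in x[1])]
            if out_of_order:
                pick = ready[0] if ready else None       # any ready instruction may issue
            else:
                pick = pending[0] if pending[0] in ready else None  # only the oldest may issue
            if pick is None:
                clock += 1                               # stall: nothing can issue this cycle
                continue
            done[pick[0]] = clock + pick[2]              # result ready after its latency
            pending.remove(pick)
            clock += 1
        return max(done.values())

    print(total_cycles(program, out_of_order=False))  # 6 cycles: the muls wait behind the load
    print(total_cycles(program, out_of_order=True))   # 4 cycles: the muls run under the load
    [/code]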
  • Reply 69 of 78
    keda Posts: 722
    In regards to the roadmap...



    Apple seems to be in a constant state of suspense. As a long-time Apple watcher and Mac user, I have never known what Apple will do next. It can be very frustrating when making purchasing decisions.



    I understand why Apple runs things this way. But it is hard on the consumer.



    In the Wintel world, it's easy to figure out what chips the machines will be running in 6 months and how fast they will be. But on the Apple side, we don't even know what kind of chip we will be offered in 6 months, or who will be making it. All we have is rumors.



    This makes businesses uneasy. I would like to see Apple announce some type of intention. Vague comments about 'having options' seem to imply that other chips besides the G4 are being considered, but I'd like Apple to give us more. I don't need them to tell me details, but is it asking too much for a general plan?



    Since rumors of the next-gen chip have been circulating, I have been holding off on a new machine as long as I can. But next year I'll be in the market. I hope there are signs of Apple's direction by then.
  • Reply 70 of 78
    jimmac Posts: 11,898




    [ 09-11-2002: Message edited by: jimmac ]
  • Reply 71 of 78
    nevyn Posts: 360
    [quote]Originally posted by MarcUK:
    There is no hardware roadmap. From this you can only look at previous performance. We have had a 750 MHz improvement in 3 years. We were stuck at 500 MHz for 18 months. Likewise we've had pretty much the same MB for 4 years. Consider.

    What happens if we get stuck at 1.25GHz until 2004/5?
    What happens if the speed only increases to 2GHz by 2006?[/quote]



    Moore's observation is called 'Moore's LAW' because it holds so, so true.



    Yes, there are aberrations - there are aberrations in _any_ real data-set.



    -> Intel had compelling reasons to push MHz _hard_ while allowing instructions-per-clock to be a secondary priority.



    -> Motorola (back in 1998, the year of the beleaguered Apple) didn't emphasize the 'high' end of their chip lines as much as we wish they would have (with 20-20 hindsight).



    -> It's also pretty clear that at least one _major_ chip project inside Mot had a Titanic-scale disaster.





    Now. Extrapolating from the MHz (or GFLOPS, or whatever you want) data from Mot _for 2001 & 2002_, we'll get minimal boosts. As you indicated. Moore's law is really about transistor counts, but observed speed is pretty well correlated. (Though, note: MHz is _not_.)



    Extrapolating from limited data is not the best plan. Particularly in the face of a strong model that indicates otherwise, and evidence that some of the recent data suck.



    This doesn't change any of your other reasons (or reasoning); I'm just pointing out fallacies in your extrapolation. Extrapolating from the stock market data in the latter half of 1929 would also be pretty pointless.
  • Reply 72 of 78
    There's always 'hope'.



    Lemon Bon Bon
  • Reply 73 of 78
    [quote]Originally posted by Lemon Bon Bon:
    There's always 'hope'.

    Lemon Bon Bon[/quote]



    We have hope that they will pull something out of the fire without turning into Microscum. However, this time Apple has right royally f*cked itself and every loyal Mac user into the bargain.



    For God's sake, Steve, style over substance was over by the mid-eighties. Get over it and deliver the kick-ass goods.



    This is a far more perilous time for Apple than at any point in its incredibly stupid past.



    Yet I'm still in awe. How does this company, which eventually ruins everything great it makes, which continually snatches massive defeats from the jaws of certain success, which dumped the Newton, HyperCard, Copland et al., still survive and occasionally wow us?
  • Reply 74 of 78
    amorph Posts: 7,112
    I'm going to give this thread a long overdue scoot into General Discussion.
  • Reply 75 of 78
    tht Posts: 5,452
    [quote]Originally posted by Nevyn:
    Moore's observation is called 'Moore's LAW' because it holds so, so true.

    Yes, there are aberrations - there are aberrations in _any_ real data-set.[/quote]



    Moore's Laws are observations about the economics of CMOS-based fabrication techniques. MHz increases naturally fall out of the physics of making CMOS transistors increasingly smaller, but the specifics of a processor's clock rate are left up to the designer. Moore was only observing that companies would find it economical, and necessary if they want to stay in the same business, to improve their fabs every 18 to 24 months and thereby double transistor counts for processors, because improving fabs (a 30% scale factor on gate widths, etc.) cuts the cost per transistor in half.
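
    As a quick sketch of what that doubling cadence implies (pure arithmetic; the starting transistor budget is just a placeholder, not any particular chip):

    [code]
    # Compound doubling every 18-24 months over a 5-year span; starting count is a placeholder.
    start = 33e6  # order of magnitude of a 2002-era desktop CPU, illustrative only
    for months in (18, 24):
        doublings = 60 / months
        print(f"doubling every {months} months -> ~{round(start * 2**doublings / 1e6)}M transistors in 5 years")
    # ~333M at 18 months, ~187M at 24 months
    [/code]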



    The observation obviously means that the most economically strong semiconductor company will have the best chance of advancing technology. And Intel is in the best position.



    [quote]Now. Extrapolating from the MHz (or GFLOPS, or whatever you want) data from Mot _for 2001 & 2002_, we'll get minimal boosts. As you indicated. Moore's law is really about transistor counts, but observed speed is pretty well correlated. (Though, note: MHz is _not_.)[/quote]



    The clock rates can easily be extrapolated. One just has to be sure of the processor microarchitecture and the fab technology.



    The 750 to 7400 transition wasn't that great because the 750 was fabbed on an optimized 0.25u Al process while the 7400 was on an immature 0.22u Cu process (copper fab yields were terrible at the time). IBM had the advantage of a more mature 0.20u Cu fab for their PPC 750, so naturally had slightly faster (by 50 MHz) PPC processors. Then after that, both IBM and Moto moved to a 0.18 micron process. That technology jump was small and would not produce large increases in MHz, as is evident. IBM and Moto simply wasted their time and money on 0.20 and 0.22 micron fabs.



    Contrast that with AMD and Intel, which went from 0.25u to 0.18u in one step, and they used 10+ stage pipeline processors to boot. It was very obvious who was going to have higher clock rates.



    As for the G4: the 7450-based G4 is a 7-stage-pipeline processor using typical voltages:



    0.67 GHz to 1.25 GHz at 0.18 micron

    1.25 GHz to 1.80 GHz at 0.13 micron



    They won't get 1.8 GHz right away, but after some fab optimization and circuit tuning on the processor, 1.8 GHz should be reachable.
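
    Those ranges are consistent with a crude first-order rule of thumb, clock ~ 1 / (feature size). This ignores voltage, circuit tuning, and pipeline changes, so treat it as a sketch only:

    [code]
    # Crude estimate: clock scales roughly with the inverse of the feature size.
    old_clock_ghz, old_um, new_um = 1.25, 0.18, 0.13
    print(round(old_clock_ghz * old_um / new_um, 2), "GHz")  # ~1.73 GHz, near the 1.8 GHz figure above
    [/code]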
  • Reply 76 of 78
    1.83GHz on an 11x multiplier!



    JOY!
  • Reply 77 of 78
    scott Posts: 7,431
    I'm looking to buy a new computer soon and I just can't justify an Apple right now. Price vs. performance == NOT APPLE. I'd love to have the 17-inch iMac but... can't upgrade it and it's too slow.





    The G4 is going to be the end of Apple. The CPU was a dog from The Great Speed Dump to today. The G4 sucks and Apple sucks right along with it.
  • Reply 78 of 78
    I would just like to say that people here get way too serious about this stuff and should lighten up and go with the flow a little more.



    Oh, and regarding "style vs. substance," or worse, "form vs. function": it's not like Apple can just take those designers and make faster CPUs with them, so the point is, well, pointless. You can have both; Apple is having problems with its CPUs, and that has little to do with its hardware and software design.



    [ 09-15-2002: Message edited by: BuonRotto ]