One more iMac G4 revision


Comments

  • Reply 121 of 149
    oldmacfanoldmacfan Posts: 501member
    Quote:

    Originally posted by Res

    You are totally wrong (and that was the nicest way I could say it). I get things done a lot faster on my 800 MHz G4 than I did on my 300 MHz G3. And when using a faster computer I can get even more things done in a day.



    Anyone who corrects and prints photos taken with their digital camera, plays games, or edits a home movie with their computer, to just name three uses, will truly benefit from more computing power.



    The idiots who say people don't need faster computers are like some of the fools at IBM in the 70s who thought that individuals would never need computers at all. They just don't get it.



    The truth of the matter is that, at this point in time, no company makes a computer that is fast enough for the average user. We all make do with what we can get, but computers are nowhere near as powerful as we want them to be. That's why people keep upgrading every few years.




    Well, when does it end? New software needs new hardware, and vice versa. I would like them to take two years and fix the problems first before moving on to the next latest and greatest thing (software or hardware), but that doesn't fit their business model.
  • Reply 122 of 149
    3.14163.1416 Posts: 120member
    Quote:

    Originally posted by oldmacfan

    Well, if the 975 is on the way to being 3.0 GHz, and Apple is going to announce that at WWDC, then they should, in the next couple of weeks, announce a 970FX-based iMac or replacement thereof.



    You're much more optimistic than me. I expect to see 2.0-2.6 GHz towers at WWDC, with iMacs around the same time at 1.6-2.0. Obviously, I like your scenario better.



    (As an aside, has anybody other than MOSR confirmed the *existence* of the 975?)
  • Reply 123 of 149
    resres Posts: 711member
    Quote:

    Originally posted by oldmacfan

    Well, when does it end? New software needs new hardware, and vice versa. I would like them to take two years and fix the problems first before moving on to the next latest and greatest thing (software or hardware), but that doesn't fit their business model.



    Eventually computer technology will plateau and there will be no more upgrades. Even before then, if computers get powerful enough to completely and instantaneously satisfy our desires, there would be no real reason to upgrade. (But it is going to be a long time before either of those happens.)
  • Reply 124 of 149
    Quote:

    Originally posted by Res

    Eventually computer technology will plateau and there will be no more upgrades. Even before then, if computers get powerful enough to completely and instantaneously satisfy our desires, there would be no real reason to upgrade. (But it is going to be a long time before either of those happens.)



    How so? In virtually no area of innovation has development plateaued (slowed, yes, plateaued, no). Cars continue to evolve/improve, as does hi-fi, structural engineering, even the printed word (in the UK our oldest newspaper has recently changed to a new 'innovative' page size). As we find more and more exciting ways to harness technology, computers will continue to be upgraded.



    Granted, 'power' may stop being the determining factor and other dimensions of capability may become more important, but upgrades there will always be. The pace will undoubtedly slow down, though. The printing press is perhaps the best example: after it was invented there followed rapid evolution, which slowed until electricity arrived and progress accelerated, as it did again with the arrival of desktop computers, which allowed better-designed machines and digital pre-press.



    It is the nature of human endeavour that we are never satisfied with the present. The prospect of betterment, be it financial or otherwise, is one of the key things that motivates us as a race. Without a crystal ball we cannot predict what developments will change the course of computer development, nor indeed what societal changes will affect the way we use computers.



    Just my 2c.



    J.
  • Reply 125 of 149
    oldmacfanoldmacfan Posts: 501member
    Quote:

    Originally posted by 3.1416

    You're much more optimistic than me. I expect to see 2.0-2.6 GHz towers at WWDC, with iMacs around the same time at 1.6-2.0. Obviously, I like your scenario better.



    (As an aside, has anybody other than MOSR confirmed the *existence* of the 975?)




    The 975, if that is what it will be called, is the name being used for Apple's single-core Power5 derivative. Last summer, I believe, after the release of the 970, I read somewhere that Apple and IBM were developing two new processor lines: the first a Power4-based stop-gap CPU, the other a Power5-based CPU. IBM has announced it will start shipping Power5-based systems in June.
  • Reply 126 of 149
    cubistcubist Posts: 954member
    Quote:

    Originally posted by James Cocker

    How so? In virtually no area of innovation has development plateaued (slowed, yes, plateaued, no). Cars continue to evolve/improve, ...



    Improvement in cars is on a pretty flat slope. Sure, incremental improvements continue to be made - tires have improved greatly since the 1960s, for example - but there is very little on a current-day automobile, from a user standpoint, that would surprise a time-traveller from 1960. This is what we could call "asymptotic improvement", and I would imagine it is true of any particular technology. Initially there are huge improvements, but as time goes on, the slope gets flatter and flatter as we approach, but never quite hit, the asymptotic limit for that technology.



    Personal computers of today are on a flattening slope, but if you try to do video stuff on your 800MHz G4, you'll find that there's still a lot of room for improvement. And I think there's still plenty of room for improvement in software user-friendliness, compatibility and stability.
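    The "asymptotic improvement" pattern described above can be modeled as each period closing a fixed fraction of the remaining gap to a technology's limit. A hypothetical Python sketch (the limit, rate, and function name here are invented for illustration, not data from this thread):

```python
# Asymptotic improvement: each period closes a fixed fraction of the
# remaining gap to the technology's limit, so early gains are large,
# later gains shrink, and the limit itself is never quite reached.

def improvement_curve(limit, rate, periods):
    """Return capability after each period, approaching `limit`."""
    capability = 0.0
    history = []
    for _ in range(periods):
        capability += rate * (limit - capability)  # close part of the gap
        history.append(capability)
    return history

curve = improvement_curve(limit=100.0, rate=0.5, periods=5)
# Per-period gains shrink: 50.0, 25.0, 12.5, 6.25, 3.125 -
# big leaps early, diminishing returns later, never reaching 100.
```

    The same shape fits the car example: huge leaps from 1920 to 1960, flatter gains after.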
  • Reply 127 of 149
    resres Posts: 711member
    Quote:

    Originally posted by James Cocker

    How so? In virtually no area of innovation has development plateaued (slowed, yes, plateaued, no). Cars continue to evolve/improve, as does hi-fi, structural engineering, even the printed word (in the UK our oldest newspaper has recently changed to a new 'innovative' page size). As we find more and more exciting ways to harness technology, computers will continue to be upgraded.



    Granted, 'power' may stop being the determining factor and other dimensions of capability may become more important, but upgrades there will always be. The pace will undoubtedly slow down, though. The printing press is perhaps the best example: after it was invented there followed rapid evolution, which slowed until electricity arrived and progress accelerated, as it did again with the arrival of desktop computers, which allowed better-designed machines and digital pre-press.



    It is the nature of human endeavour that we are never satisfied with the present. The prospect of betterment, be it financial or otherwise, is one of the key things that motivates us as a race. Without a crystal ball we cannot predict what developments will change the course of computer development, nor indeed what societal changes will affect the way we use computers.



    Just my 2c.



    J.






    While I agree with your sentiments, and your examples are good, I think that eventually the laws of physics will get in the way of further improvements. Once we reach the physical limits of information technology there will be no real difference between computers that are made ten thousand years apart.



    The same thing will happen to your examples of cars and printing presses: eventually all forms of technology will plateau. Of course, it is not going to happen anytime soon.
  • Reply 128 of 149
    @homenow@homenow Posts: 998member
    Quote:

    Originally posted by cubist

    Improvement in cars is on a pretty flat slope. Sure, incremental improvements continue to be made - tires have improved greatly since the 1960s, for example - but there is very little on a current-day automobile, from a user standpoint, that would surprise a time-traveller from 1960. This is what we could call "asymptotic improvement", and I would imagine it is true of any particular technology. Initially there are huge improvements, but as time goes on, the slope gets flatter and flatter as we approach, but never quite hit, the asymptotic limit for that technology.



    Personal computers of today are on a flattening slope, but if you try to do video stuff on your 800MHz G4, you'll find that there's still a lot of room for improvement. And I think there's still plenty of room for improvement in software user-friendliness, compatibility and stability.




    Actually, your example proves your point wrong. Cars have advanced immeasurably since the '60s, but the technology and advances are things that we take for granted or are not aware of. When was the last time you tuned up your car? Changed the points? Adjusted the carburetor? Changed the spark plugs? Stalled out in the rain due to a cracked distributor cap?



    A car now goes 100,000+ miles before a tune-up; back in the '60s you were rebuilding an engine when you hit that mileage (if I remember correctly, that was scheduled maintenance around 75,000 miles for the air-cooled VW Bug). Just the advances in motor-oil formulas and filter design and materials have made that a thing of the past. In 2000 or 2001, automakers started making engines for certain parts of the country that automatically sense the fuel make-up and adjust on the fly for a mix of up to 85% alcohol. I didn't even touch on the brakes, climate control, transmission, unit-body construction, safety, or electrical systems.



    As to how this all applies to computers, that all depends on how fast the software industry and applications eat up those FLOPS that we have or will gain. History has shown that software engineers use those gains up pretty fast with existing as well as new applications. If it does follow your analogy, then at some point in the future the advances will go pretty much unnoticed by the average user. I don't see that happening anytime soon, due to advances in software, eventual advances in internet bandwidth, new applications, and increased multitasking as computers take on more of a role in the home.
  • Reply 129 of 149
    cubistcubist Posts: 954member
    Quote:

    Originally posted by @homenow

    Actually, your example proves your point wrong. Cars have advanced immeasurably since the '60s, but the technology and advances are things that we take for granted or are not aware of. ...



    The example proves my point right. Compare a car from 1920, a car from 1960, and a car from 2000. The first 40 years show tremendous progress, in comparison with only incremental improvements in the second 40 years.



    Now compare a computer from 1964, a computer from 1984 (a 128K Mac, for example), and a computer from 2004. (I won't go back to 1944.) While there's less improvement in the last 20 years than in the first 20, there's still a great deal. We are still on a healthy slope of innovation.



    And guys, please don't quote an entire message when replying. Edit the quote. We can always scroll back to see the previous message.
  • Reply 130 of 149
    neumacneumac Posts: 93member
    Quote:

    Originally posted by cubist

    The example proves my point right. Compare a car from 1920, a car from 1960, and a car from 2000. The first 40 years show tremendous progress, in comparison with only incremental improvements in the second 40 years.



    Sorry, but I have to agree with @homenow; cars are a poor example. Your analogy would work if this were 1985, but other than 1900-1920, cars have advanced more in the last 20 years than they have over any other 20-year period in the history of the automobile. Individual examples are too numerous to list here, but safety, efficiency (not fuel efficiency, unfortunately; almost all of the gains have been poured into increasing HP/liter since the CAFE standards actually started to go down), emissions, handling, drivability, etc. have all made huge gains. Drive a 1962 Impala, a 1982 Citation, and a 2004 Malibu, and you'll find that the basic underpinnings of the '62 and '82 are remarkably unchanged, while the '04 is improved in every way over the '82.



    The analogy becomes interesting when you realize what made for the huge leap in cars over the last 20 years: the computer. Industries may stagnate but a new technology can completely revitalize or change them in an instant. I suspect that, at some point, the same thing will happen to the computer industry. Eventually we may run out of new technologies, but we aren't going to be around to see it.
  • Reply 131 of 149
    mac voyermac voyer Posts: 1,294member
    The car analogy simply does not work on any level, for this reason: cars serve the same purpose as they always have, so they do not need to make drastic leaps. Their purpose is to transport people or things from one place to another. No matter how they do it, that is all they will ever need to do. You can only believe that more power is not needed in computers if you believe that they will never be used for purposes other than what they are being used for right now. Absurd!



    This is not directed at anyone in particular. Please do not limit my future needs by your lack of imagination. For people who only want their computer to do a predetermined set of tasks limited to today's applications, the iMac in its current form is absolutely perfect. For everyone else, there's MasterCard. They're going to need it if they want to be able to afford a G5 tower.
  • Reply 132 of 149
    Quote:

    Originally posted by Mac Voyer

    The car analogy simply does not work on any level, for this reason: cars serve the same purpose as they always have, so they do not need to make drastic leaps. Their purpose is to transport people or things from one place to another. No matter how they do it, that is all they will ever need to do. You can only believe that more power is not needed in computers if you believe that they will never be used for purposes other than what they are being used for right now. Absurd!



    This is not directed at anyone in particular. Please do not limit my future needs by your lack of imagination. For people who only want their computer to do a predetermined set of tasks limited to today's applications, the iMac in its current form is absolutely perfect. For everyone else, there's MasterCard. They're going to need it if they want to be able to afford a G5 tower.




    Well, if fast computers are like fast cars I hope I don't get a speeding ticket in a construction zone.
  • Reply 133 of 149
    I believe that it will be a long time before we see a slowdown in computer development. When you look at processors you only have to look at IBM and Intel competing to see who is going to come out with a "better & faster" chip. That competitive drive is not going to slow down until one gives up - and I doubt that they will.



    The other factor is the benefit of economies of scale. While a 3 gig G5 will be nice, there is a limited number of people who will benefit significantly in terms of speeding up their work. By producing the 3 gig chip in large quantities, those who really need it will get it at a reasonable price because others will also buy it - even if they only do e-mail. The same goes for 4, 5, or 10 gig Gx's. The market will depend on Bubba buying a new computer in order to continue development of the next fastest chip. Bubba will benefit, but not as much as those who use their computer to make money.



    For the power users: if a 3 gig machine will bring a heavy processing task that took 25 minutes down to, say, 15 minutes, how interested do you think they will be in a computer that will bring it down to 5 seconds - even with significant software enhancements?



    Economies of scale (lots & lots of Bubbas) have paid the way for all of the development we have seen in personal computing for the last 20 years and it is only going to improve in the future.
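    The 25-minute/15-minute/5-second comparison above works out to roughly a 1.7x near-term speedup versus a 300x future one. A quick Python check (the variable names are illustrative; the times come from the post):

```python
# Speedup ratios for the hypothetical heavy processing task:
# 25 minutes today, 15 minutes near-term, 5 seconds someday.

baseline_s = 25 * 60    # 1500 seconds
near_term_s = 15 * 60   #  900 seconds
future_s = 5            #    5 seconds

near_term_speedup = baseline_s / near_term_s   # ~1.67x
future_speedup = baseline_s / future_s         # 300x
```

    Which is the gap between a nice incremental upgrade and a machine that changes how you work.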
  • Reply 134 of 149
    emig647emig647 Posts: 2,455member
    Quote:

    Originally posted by kenaustus

    I believe that it will be a long time before we see a slowdown in computer development.



    Don't count your chickens before they hatch...



    Data Speed Limit (storage)



    It could be completely possible for these things to happen eventually...
  • Reply 135 of 149
    amorphamorph Posts: 7,112member
    There's always a way around:



    1) This is a limit with the technique HDDs currently use to store data. So, let's see how long it takes someone to come up with a new one...



    2) Right now, many HDDs have multiple platters to increase storage capacity. An "internal RAID" setup that striped the platters would greatly speed up access - and with double-sided platters, you could implement RAID 5 in a notebook drive.



    3) If you don't have platter-level RAID (RAIP? Umm...), just do it the old-fashioned way and stripe data across HDDs.



    Option 3 is pretty much what's happening now in the CPU world. CPUs have gotten big and hot enough to be impractical in the sorts of machines people are buying, so there's a big push now to replace the One Big Core philosophy with lots of little cores working in parallel. And, of course, it's also been used with RAM - interleaving is an old trick, and dual-channel RAM appears in the G5.
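    Options 2 and 3 above both boil down to the same striping idea: deal fixed-size chunks out round-robin so reads can proceed in parallel. A toy Python sketch (the function names, chunk size, and disk count are made up for illustration; real RAID 0 operates on fixed-size blocks at the block-device level):

```python
# Toy RAID 0: split data into fixed-size chunks and deal them
# round-robin across the disks; each disk then holds every Nth chunk.

def stripe(data, num_disks, chunk_size):
    """Distribute `data` across `num_disks` in round-robin chunks."""
    disks = [[] for _ in range(num_disks)]
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    for i, chunk in enumerate(chunks):
        disks[i % num_disks].append(chunk)
    return disks

def unstripe(disks):
    """Reassemble the original byte string from the striped disks."""
    out = []
    for i in range(max(len(d) for d in disks)):
        for disk in disks:
            if i < len(disk):
                out.append(disk[i])
    return b"".join(out)

disks = stripe(b"abcdefghij", num_disks=3, chunk_size=2)
assert unstripe(disks) == b"abcdefghij"
```

    RAID 5 adds a rotating parity chunk on top of this so any one disk (or platter) can fail without data loss.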
  • Reply 136 of 149
    cubistcubist Posts: 954member
    Well, when I went from my 1GHz Cube to this 1.6GHz G5, everything became much faster. It's not just the clock, it's the faster bus, serial ATA hard drive, etc. A faster machine means I can get more work done, and I can enjoy my games more. A faster computer makes a more pleasant computing experience all around - even if you are just doing word processing, web sites and email. Even non-technical people can see that and feel it. They're not stupid for wanting a faster machine.
  • Reply 137 of 149
    carniphagecarniphage Posts: 1,984member
    I think that the desktop market is just incredibly tough right now. In order to make a dent in the market - you either have to have a low price and/or offer something quite unique.



    The iMac is cute - but in terms of being a major innovation - or a mass market computer - it has not really succeeded. I don't believe for a moment that making it a G5 will improve its chances.



    In my opinion, the best feature of the iMac also brings with it the worst aspect. The monitor arm is a brilliant piece of engineering - offering the "floating screen experience™". However, the downside is that this arm welds the display to the CPU. This is a downer for customers, who are denied the freedom of buying either part independently. It forces them to pay for both components at the same time - driving up the basic price.



    My take would be to keep the arm and drop the computer. Forget headless - the world would love a bottomless iMac: a beautiful, ergonomic floating display. With a DVI connection, it would attract new customers, including PC, iBook and PowerBook owners. (Remember, PowerBook owners have to pay a surcharge to use Apple displays!) TV-tuner functionality would not be a dumb idea.



    As I have suggested elsewhere - it would not be too hard for Apple to sell a compact CPU box which would "upgrade" this display into an all-in-one desktop - if the customer wanted to do such a thing.



    The only other way to go is to copy Sony's lead - and turn the iMac into a bedroom-style box. They sell a widescreen TV with an integrated DVD player - which just so happens to be a PC/PVR too. Not sure if Apple see themselves



    Carni.
  • Reply 138 of 149
    neumacneumac Posts: 93member
    Quote:

    Originally posted by Carniphage

    My take would be to keep the arm and drop the computer. Forget headless - the world would love a bottomless iMac: a beautiful, ergonomic floating display. With a DVI connection, it would attract new customers, including PC, iBook and PowerBook owners. (Remember, PowerBook owners have to pay a surcharge to use Apple displays!) TV-tuner functionality would not be a dumb idea.



    I've had my fingers crossed for months that Apple would do something like this. There has been a mock-up floating around these boards of a modular aluminum design: a stand-alone series of monitors, with the adjustability of the current iMac display, that would go nicely with the current G5. Coupled with these would be a plug-in "base" that could be used with the monitors or with your old CRT. Ideally you'd have three product-order options: monitor, CPU, or "all-in-one". There are certainly impracticalities with this approach, but it offers a tremendous improvement in choice over the current line-up, something consumers generally appreciate.
  • Reply 139 of 149
    Quote:

    Originally posted by Amorph

    There's always a way around:



    2) Right now, many HDDs have multiple platters to increase storage capacity. An "internal RAID" setup that striped the platters would greatly speed up access - and with double-sided platters, you could implement RAID 5 in a notebook drive.





    That's a very interesting idea. Why has no one thought of that?
  • Reply 140 of 149
    programmerprogrammer Posts: 3,461member
    Quote:

    Originally posted by TaoOfMars

    That's a very interesting idea. Why has no one thought of that?



    Because RAID stands for "Redundant Array of Inexpensive Disks". If you put more read heads into a disk, it is no longer inexpensive, and it would no longer be redundant. Heck, it wouldn't even be an array. Hmmm... for that matter, it would only be "disk", not "disks" (from the user's perspective).