One more iMac G4 revision


Comments

  • Reply 141 of 149
    amorph Posts: 7,112 member
    Quote:

    Originally posted by Programmer

    Because RAID stands for "Redundant Array of Inexpensive Disks". If you put more read heads into a disk it is no longer inexpensive, and it would no longer be redundant. Heck, it wouldn't even be an array. Hmmm... for that matter it would only be "disk", not "disks" (from the user perspective).



    The meaning of the "i" shifts around - but of course they wouldn't be "independent" either. So it would have to mean "internal" in this context.



    Programmer's point holds, though: Until RAID controllers are cheap enough to throw in with cache and firmware, and until per-platter density reaches the point where you can do this without gelding the HDD's capacity, this is not going to happen.



    If it happens at all, it won't be soon. I was just throwing out ideas for ways to get around a hard limit in storage density - and since we're not going to hit that soon, either, I wasn't concerned with whether the designs were immediately feasible.
  • Reply 142 of 149
    ZoSo Posts: 177 member
    Quote:

    Originally posted by emig647

    Don't count your chickens before they hatch...



    Data Speed Limit (storage)



    It could be completely possible for these things to happen eventually...




    I also read that the next CPU process switch, from 90 to 65 nm, could be the last one before chips start melting simply from the concentration of too much energy in such a small area. That's leaving aside signal integrity and things like that, which the current 90 nm processes from Intel and IBM (arguably two of the best processor makers around) are already struggling with! (Seen many 90 nm 970s or Prescotts around?)



    Things will slow down - hell, they already have! Look at the CPUs introduced in 2002, then in 2003, and now in 2004. Remember when a month wouldn't pass without a new CPU announcement from either Intel or AMD?



    Soon there'll be the need for a paradigm shift - quantum computing? Far off... Smaller cores, but in large numbers? More feasible, for sure... Solid state storage? It's another option. Who knows, I'm just speculating.



    To be sure, I don't expect technology to stop evolving; but in order to keep increasing available computing power and speed, a lot of things will have to change. Technology (singular, capital T) will always evolve, but many current technologies will eventually reach a dead end. Some sooner, some later.



    ZoSo
  • Reply 143 of 149
    Quote:

    Originally posted by Programmer

    Because RAID stands for "Redundant Array of Inexpensive Disks". If you put more read heads into a disk it is no longer inexpensive, and it would no longer be redundant. Heck, it wouldn't even be an array. Hmmm... for that matter it would only be "disk", not "disks" (from the user perspective).



    My understanding of how hard drives work is pretty limited, especially concerning the mechanics, but there must already be some kind of array-type calculation going on inside the hard drive. It seems logical to me that something could sit between a true RAID and a system that treats the platters as an array - possibly just extending whatever addressing scheme is used to spread the data across platters, but with a smaller, better brain and without needing a separate protocol to the bus or OS. It could be entirely encapsulated within the drive's own logic, as Amorph said.



    Anyway, I'm more interested in how possible it is and how it might work (a rough sketch of the idea follows at the end of this post) than in whether or not it has the potential to hit the market soon. Carry on, good sirs.



    TOM
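
    Purely to make that idea concrete, here is a rough sketch (in Python, with made-up geometry constants) of the kind of address striping being described: logical blocks are interleaved across platter surfaces the way RAID 0 interleaves them across separate disks, so a long sequential read keeps every head busy. The constants, names, and mapping below are assumptions for illustration only, not how any real drive's firmware lays out data.

    ```python
    # Hypothetical sketch: RAID 0-style striping applied to the surfaces of a
    # single drive. All geometry numbers and names are made up for illustration;
    # real firmware uses far more complex zone-based layouts.

    STRIPE_BLOCKS = 128      # logical blocks per stripe (assumed)
    SURFACES = 4             # platter surfaces, i.e. independent heads (assumed)
    BLOCKS_PER_TRACK = 1024  # blocks per track on one surface (assumed)


    def lba_to_geometry(lba: int) -> tuple[int, int, int]:
        """Map a logical block address to (surface, track, offset-in-track)."""
        stripe, block_in_stripe = divmod(lba, STRIPE_BLOCKS)
        surface = stripe % SURFACES             # round-robin stripes across heads
        stripe_on_surface = stripe // SURFACES  # which stripe on that surface
        track, offset = divmod(
            stripe_on_surface * STRIPE_BLOCKS + block_in_stripe, BLOCKS_PER_TRACK
        )
        return surface, track, offset


    if __name__ == "__main__":
        # A long sequential read touches every surface, which is what would let
        # independently positioned heads stream the data concurrently.
        surfaces_hit = {lba_to_geometry(lba)[0] for lba in range(4 * STRIPE_BLOCKS)}
        print(sorted(surfaces_hit))  # -> [0, 1, 2, 3]
    ```

    The point is only that the mapping itself is ordinary arithmetic; whether the heads can actually seek and read independently is the hard mechanical (and cost) problem the earlier posts are debating.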
  • Reply 144 of 149
    Res Posts: 711 member
    Quote:

    Originally posted by ZoSo

    I also read that the next CPU process switch, from 90 to 65 nm, could be the last one before chips start melting simply from the concentration of too much energy in such a small area. That's leaving aside signal integrity and things like that, which the current 90 nm processes from Intel and IBM (arguably two of the best processor makers around) are already struggling with! (Seen many 90 nm 970s or Prescotts around?)



    Things will slow down - hell, they already have! Look at the CPUs introduced in 2002, then in 2003, and now in 2004. Remember when a month wouldn't pass without a new CPU announcement from either Intel or AMD?



    Soon there'll be the need for a paradigm shift - quantum computing? Far off... Smaller cores, but in large numbers? More feasible, for sure... Solid state storage? It's another option. Who knows, I'm just speculating.



    To be sure, I don't expect technology to stop evolving; but in order to keep increasing available computing power and speed, a lot of things will have to change. Technology (singular, capital T) will always evolve, but many current technologies will eventually reach a dead end. Some sooner, some later.



    ZoSo




    I'm pretty sure that we will get well below 65 nm, but the processes and materials will change. Researchers have already made a transistor using a molecule only 2 nm long. Others are trying to grow molecular bio-computers out of DNA/RNA.



    Technology will continue to advance at a very rapid rate for quite some time. But somewhere down the road the advances will grind to a halt -- it is inevitable. Knowledge is finite, and eventually, if our species survives long enough, we will know everything there is to know, and our technology will flatten out.



    Of course, I'm not sure what all this has to do with "One more iMac G4 revision"
  • Reply 145 of 149
    oldmacfan Posts: 501 member
    Quote:

    Originally posted by Res

    Of course, I'm not sure what all this has to do with "One more iMac G4 revision"



    It doesn't matter, men will be extinct in 100,000 years.
  • Reply 146 of 149
    Speak for yourself....!
  • Reply 147 of 149
    oldmacfan Posts: 501 member
    Quote:

    Originally posted by fuzzylogic

    Speak for yourself....!



    You think I am kidding? Earlier in the week CNN interviewed an over-educated scientist who wrote a book - yes, someone paid him to write a book about his findings. His comment, as I remember it, was that the Y chromosome is defective. As for my gene pool, I am on my third wife and have six kids, so they can speak for me - I have done my work.
  • Reply 148 of 149
    beigeuser Posts: 371 member
    My opinion on the technology plateau:



    Every technology will plateau at some point. Not because of the laws of physics. Not because of cost. Not because of any technical reason. It will plateau because of user demand.



    The car example: the spec sheet of cars becomes more impressive every year, but to the average user a car is still just a mode of transportation from point A to point B. A three-year-old car does the same job as a brand-new one. It is getting to the point where only spec-whores see the difference.



    Computers are not at that point yet. A three-year-old computer cannot do some things that a modern computer can. But at some point a three-year-old computer will be able to do exactly the same things as a modern one. Sure, the modern computer will always do them better, but it will no longer do anything exclusively. That is when the technology plateaus.
  • Reply 149 of 149
    Regarding the car analogy, I don't think cars advanced much at all until the past few years - just incremental improvements. Gas mileage is a good example. At least now we have hybrid vehicles emerging and fuel cell vehicles in development. Manufacturers have only recently started putting a lot of effort into them, both in terms of performance and design. For a long time they just kind of dragged their feet.