Will the faith in Motorola be restored if G5 really comes?


Comments

  • Reply 21 of 38
    Well, those features that "bloated" Photoshop were added because the horsepower was there to run them. Given the power, developers like Adobe will have the freedom to implement features that we can't even imagine yet. The car analogy doesn't apply well, since it will never be safe to drive at much higher speeds than we do now, and being able to accelerate at g-forces that would make us black out wouldn't be a great idea either, so much more horsepower than most cars have now probably won't happen.
  • Reply 22 of 38
    ihxo Posts: 567 member
    [quote]Originally posted by Junkyard Dawg:

    So this same jackass doesn't care about what he NEEDS, it's about what he wants, and what he wants is based on IMAGE, ultimately, the image he thinks it takes to get laid.[/quote]



    In general, I totally agree with you, Junkyard, but are you sure you're the same Junkyard as before? It just feels weird...
  • Reply 23 of 38
    cowerd Posts: 579 member
    [quote]2 years ago, did Photoshop require 150+ megs? That would've been, what, OS 8.0? That used about 40 megs if I remember correctly. So you see my point: apps and the OS have bloated due to extra features, effects, drop shadows, etc. But I think we are close to being as far as we need to be.



    2 years from now, I don't think Photoshop will require 500 megs to run; if it does, then LOL, holy shit!



    You see what I mean? It's like now that we have cars with 200+ hp, we really don't need much more, or at least on average.[/quote]



    Then you don't use these apps to make a living. Anything to shave time when working on a 300+ meg PS file, or even getting DW to run at a decent speed. We won't even discuss 3D rendering times. There is no such thing as fast enough for Apple's pro market, and it had better be coming soon, because lots have already made the move to Win2K.
  • Reply 24 of 38
    Cowerd, if Apple comes out with a kick-butt machine, would the people you know come back to the Mac?
  • Reply 25 of 38
    [quote]Originally posted by craiger77:

    Well, those features that "bloated" Photoshop were added because the horsepower was there to run them. Given the power, developers like Adobe will have the freedom to implement features that we can't even imagine yet. The car analogy doesn't apply well, since it will never be safe to drive at much higher speeds than we do now, and being able to accelerate at g-forces that would make us black out wouldn't be a great idea either, so much more horsepower than most cars have now probably won't happen.[/quote]





    So narrow-minded! *kim kap sol shakes his head in disappointment*
  • Reply 26 of 38
    rickag Posts: 1,626 member
    If the spec numbers The Register used from their reliable mole are even close, the G5 wouldn't need to be revised much to stay way ahead of Intel/AMD for quite some time. What were they? Twice as fast?





    In Jan. I'm:

    Wishing for a G5 1.6

    Hoping for a G4 Apollo 1.3 w/ DDR RAM

    Expecting a G4 Apollo 1.1 w/ SDRAM
  • Reply 27 of 38
    marcuk Posts: 4,442 member
    Computers will never be fast enough for me. For instance, on a G4 400 I may set off a 3D radiosity scene that takes 5 hours to render, which is quite conservative. I would like 30 of these frames rendered per second if it were possible, and so would everyone else into serious 3D. Now, I don't know if faster processors or faster GPUs are the answer; I suspect both. But I don't see this kind of performance for at least 20 years.
  • Reply 28 of 38
    powerdoc Posts: 8,123 member
    [quote]Originally posted by MarcUK:

    Computers will never be fast enough for me. For instance, on a G4 400 I may set off a 3D radiosity scene that takes 5 hours to render, which is quite conservative. I would like 30 of these frames rendered per second if it were possible, and so would everyone else into serious 3D. Now, I don't know if faster processors or faster GPUs are the answer; I suspect both. But I don't see this kind of performance for at least 20 years.[/quote]

    You will certainly have sufficient power in ten years, but by then you will need more computing power, because you will be using a rendering process more sophisticated than radiosity and you will want higher resolution.
  • Reply 29 of 38
    amorph Posts: 7,112 member
    Mot Semiconductor has an axe over its head right now, so I imagine that they're trying to restore Motorola's faith in them as well.



    We'll see how well they do under the gun. Early indications look good.
  • Reply 30 of 38
    It amazes me how the guys who always say how much they like their Macs because the Mac automatically recognized their digital camera or their FireWire drive as soon as they plugged it in, unlike Windows where they would have had to install drivers, are the same ones who say "Well, we don't need faster Macs; if it weren't for OS and application bloat we wouldn't even have a use for what we already have" when some of us complain about Macs falling behind PCs in speed.



    I could easily use a processor that's a hundred times faster than what is available on the market today.



    I would love to render movies and 3D instantly, and play 3D games that look like Final Fantasy.



    We're still so far from decent Artificial Intelligence, how about that?



    How about higher-resolution screens at 300, or even 600 dpi, so everything looks like print? Wait, that requires more bandwidth and more GPU power, right? What if I wanna shoot a movie at that resolution, edit it, and play it back on my computer? Faster, bigger hard drives, a faster CPU, more RAM?
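    As a rough sketch of why print resolution implies that bandwidth (the panel size, color depth, and refresh rate below are illustrative assumptions, not figures from the thread):

```python
# Back-of-envelope bandwidth for a hypothetical print-resolution display.
# Assumed figures (not from the thread): 20" 4:3 panel (~16" x 12"),
# 24 bits per pixel, 60 Hz refresh.
dpi = 300
width_in, height_in = 16, 12
pixels = (width_in * dpi) * (height_in * dpi)   # 4800 x 3600 = 17,280,000 px
bits_per_pixel = 24
refresh_hz = 60

bandwidth_gbits = pixels * bits_per_pixel * refresh_hz / 1e9
print(pixels, round(bandwidth_gbits, 1))   # roughly 25 Gbit/s uncompressed
```

    At 600 dpi every figure quadruples again, which is the point being made: resolution multiplies into every other part of the system.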



    Are you telling me I shouldn't be able to do that? Ever?



    What if I wanna play an adventure game that's not retarded, that can really understand spoken English and has characters that can really respond to situations? How many terahertz would that take?



    [ 11-29-2001: Message edited by: timortis ]
  • Reply 31 of 38
    kidred Posts: 2,402 member
    [quote]Originally posted by cowerd:

    Then you don't use these apps to make a living.[/quote]



    Wrong. I use Photoshop and Dreamweaver every day to make my pretty good living.



    And the rest of you guys are missing the point. I don't think my mom is going to need a 10 GHz machine 5 years from now to check her damn email. That's my point!
  • Reply 32 of 38
    brussell Posts: 9,812 member
    [quote]Orig
  • Reply 33 of 38
    Consider this:



    Every product that ever came out went through the same product curve.



    Someone identifies a problem they're going to solve and builds a product that offers a significant benefit to the user, enough to separate his/her $ from pocket.



    As the product gains acceptance, other people start to copy it, and the capabilities of the product get "better" (they offer more and more benefit to the user); the product moves through a growth cycle. This is where cell phones are right now.



    In the third phase, the product (I mean all products that do the same thing) begins to mature, and it costs more and more to extract more benefit to offer the consumer. This is where the car is.



    Lastly, someone comes up with a totally new solution to the problem (maybe this is Ginger; we'll find out Monday!!) which makes the current one not attractive enough to sell, and it dies off.



    The computer is leaving the growth phase and is maturing. This means you've got to innovate or die. The watch did this a while ago; Swatch saw it and took advantage of it.



    I think the important thing to consider is the "digital hub" strategy. This may be the thing that moves Apple back into the mainstream.



    I don't see anyone caring how many MHz the iPod has! It's a new solution to a problem that was not solved very well by the old system (computer-type players).



    They won't have to "convince" anyone to buy a "digital hub device" if they get it right. Consumers will see the benefit and act.
  • Reply 34 of 38
    If Apple can ship 1-1.5 GHz PowerPC 8500-based PowerMacs a month or two after MWSF, then yes, my faith would be restored. If not, I question the future of the Macintosh platform. Are they really serious about performance? I don't want Apollo, I want the real deal G5.
  • Reply 35 of 38
    marcuk Posts: 4,442 member
    [quote]Originally posted by powerdoc:

    You will certainly have sufficient power in ten years, but by then you will need more computing power, because you will be using a rendering process more sophisticated than radiosity and you will want higher resolution.[/quote]



    umm, 10 years? I dunno, how do you work that out?

    Are you talking about specialist HW (ie SGI/Sun) or just desktops?



    To go from 1 frame in 5 hours to 30 fps is a speedup of 5*60*60*30, which in the case I stated (a 400 MHz G4) works out to 216,000,000 MHz of equivalent clock (or 216 thousand GHz). In the last 10 years we've seen processors go from 20 MHz to 2 GHz, about 100x, which applied to today's 2 GHz machines equals 200 GHz. Notwithstanding the fact that Intel are predicting 10 GHz by 2005, I'd say we'd be very lucky to achieve even 100 GHz by 2011. Perhaps Moto's patented silicon/light process may help, but I'd say 100 GHz chips won't be fabbed at all like today's chips, unless there are 100-stage pipelines.
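    That arithmetic can be checked with a quick sketch (taking the G4 400 and the 5-hour frame from the earlier post as the baseline):

```python
# Naive clock-scaling estimate: how fast a chip would need to be to turn
# a 5-hour radiosity render into 30 fps, assuming a 400 MHz G4 baseline
# and performance proportional to clock speed.
base_clock_mhz = 400
seconds_per_frame = 5 * 60 * 60   # 5 hours per frame today
target_fps = 30                   # desired real-time rate

speedup = seconds_per_frame * target_fps    # 540,000x faster
equivalent_mhz = speedup * base_clock_mhz   # 216,000,000 MHz
equivalent_ghz = equivalent_mhz / 1000      # 216,000 GHz

print(speedup, equivalent_mhz, equivalent_ghz)
```

    The assumption that rendering scales linearly with clock speed is of course crude; it ignores memory bandwidth, parallelism, and algorithmic improvements, which is part of what the replies below argue about.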



    Perhaps the performance could come from GPUs. But apart from OpenGL real-time rendering, GPUs really are not capable of, or being used for, any broadcast-quality final output from 3D programs at the moment; it's all done on the CPU.



    Just rambling...
  • Reply 36 of 38
    gilsch Posts: 1,995 member
    Sorry, but a computer won't be fast or powerful enough for me until it boots up and is ready to go instantly (like in a second or two).

    A computer won't be fast or powerful enough for me until every app opens faster than the eye can see.

    A computer won't be fast or powerful enough for me until every process or task, whatever it may be, no matter how complex, is done in 3 seconds or less.

    Sorry, but I'm picky. My point is that a computer will never be fast or powerful enough. There will always be something to strain its limits.

    "straining the limits of machine and man...."



    [ 11-30-2001: Message edited by: Gilsch ]
  • Reply 37 of 38
    powerdoc Posts: 8,123 member
    [quote]Originally posted by MarcUK:

    umm, 10 years? I dunno, how do you work that out?

    Are you talking about specialist HW (ie SGI/Sun) or just desktops?

    To go from 1 frame in 5 hours to 30 fps is a speedup of 5*60*60*30, which in the case I stated (a 400 MHz G4) works out to 216,000,000 MHz of equivalent clock (or 216 thousand GHz). In the last 10 years we've seen processors go from 20 MHz to 2 GHz, about 100x, which applied to today's 2 GHz machines equals 200 GHz. Notwithstanding the fact that Intel are predicting 10 GHz by 2005, I'd say we'd be very lucky to achieve even 100 GHz by 2011. Perhaps Moto's patented silicon/light process may help, but I'd say 100 GHz chips won't be fabbed at all like today's chips, unless there are 100-stage pipelines.

    Perhaps the performance could come from GPUs. But apart from OpenGL real-time rendering, GPUs really are not capable of, or being used for, any broadcast-quality final output from 3D programs at the moment; it's all done on the CPU.

    Just rambling...[/quote]

    I have to admit, MarcUK, I did not make any calculation. However, I think performance in 3D rendering will increase much faster than for other tasks; video cards progress at an incredible speed. 20 years seems very long to me for that, and perhaps there will be some extra-strong GPUs able to do this sort of calculation in a much better way (by increasing the size of the chip with multiple channels, rather than increasing the clock speed). Why not a 1024- or 2048-bit CPU?

    However, even if you have this kind of machine, and I hope we will live long enough to see them (and why not buy them), we will find that this computer is still slow for certain tasks.



  • Reply 38 of 38
    The better GPUs on the market today can do more calculations than the CPUs, plus they are specialized chips.