How long will Apple survive?


Comments

  • Reply 41 of 57
    ccr65 Posts: 59, member
    I don't necessarily dispute your numbers, but we are talking about Apple and about their "pro" towers. If you want to play MP3s and surf the net, then an iMac is perfect for you. That's the whole reason there is a consumer line and a pro line. Apple has built a large part of its business on catering to pro users. As a percentage of total computer users, professional users aren't very numerous, but as a block of Apple's business we are quite sizable. They need us like Microsoft needs IT/networking people.



    What would you have us do, tell Apple we want a third line of computers called the "really pro desktops", so that casual consumers who don't want an iMac can still have their towers and we can have our more powerful ones?



    A casual user doesn't need the PCI slots in a tower any more than they need the extra speed, so why get a tower? Gamers want to add a different graphics card, but that's about it.



    Apple has been getting users more and more into video with iMovie and iDVD, and those do require the extra CPU. If you choose to do home movies on your Mac, you need all the power you can get.
  • Reply 42 of 57
    airsluf Posts: 1,861, member
  • Reply 43 of 57
    airsluf Posts: 1,861, member
  • Reply 44 of 57
    alcimedes Posts: 5,486, member
    Just so you know what I'm talking about when I say that hardware increases have outstripped software needs:



    In 1996, some of the fastest processors topped out around 200 MHz.



    In 1999, 400-450 MHz.



    In 2002, 2,000 MHz (2 GHz).



    Processors are now running roughly ten times as fast as they were six years ago (a 900% increase).



    They are running roughly four to five times as fast as three years ago.



    In that time, the vast majority of software has not needed five or ten times the CPU to get the same job done.
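
    As a rough sanity check of those multiples (the clock speeds are just the approximate figures above, not careful benchmarks):

    ```python
    # Rough sanity check of the clock-speed multiples quoted above.
    # The MHz figures are the approximate ones from this post, not exact benchmarks.
    clocks = {1996: 200, 1999: 425, 2002: 2000}  # MHz (1999 taken as the middle of 400-450)

    base = clocks[2002]
    for year in (1996, 1999):
        multiple = base / clocks[year]
        print(f"2002 vs {year}: {multiple:.1f}x the clock speed "
              f"({(multiple - 1) * 100:.0f}% faster)")
    # 2002 vs 1996: 10.0x the clock speed (900% faster)
    # 2002 vs 1999: 4.7x the clock speed (371% faster)
    ```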



    The only real market that has been pushing hardware is gaming. Don't get me wrong, I think Apple should be pushing forward speed-wise; I just don't think it's as severe an issue as everyone likes to think.



    -alcimedes
  • Reply 45 of 57
    spooky Posts: 504, member
    If Apple's market share is not increasing, then speed and raw power become a really serious issue.



    Why the cr*p can't Apple produce a killer Power Mac to end all debates?
  • Reply 46 of 57
    If I played movie frames at 85 FPS and a series at 120 FPS then finally at 200 FPS, you would NOT be able to tell the difference. I don't know what eye doctor you talked to!
  • Reply 47 of 57
    powerdoc Posts: 8,123, member
    There is one thing we shouldn't forget when speaking of FPS: the screen.

    My screen runs at 1024 x 768 at 85 Hz, so the maximum frame rate the screen can deliver is 85 fps. If your screen is at 60 Hz, it's 60 fps max.



    30 fps is the maximum that an eye can see, because of the persistence of vision on the retina.

    Benchmarks would be better if they reported the minimum fps. The frame rate should not drop under 30 fps in any case; the maximum fps does not matter.
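
    A minimal sketch of that point, assuming the monitor's refresh rate is the only cap (ignoring vsync and tearing details):

    ```python
    def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
        """The screen can only show as many distinct images per second as it refreshes."""
        return min(rendered_fps, refresh_hz)

    # On an 85 Hz CRT, a card rendering 200 fps still only shows 85 distinct frames.
    print(displayed_fps(200, 85))   # 85
    print(displayed_fps(200, 120))  # 120
    print(displayed_fps(40, 85))    # 40 -- here the GPU, not the screen, is the limit
    ```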
  • Reply 48 of 57
    [quote]High latency connections add prediction and popping problems that are dealt with in many different ways, but none can give one high FPS player an advantage because absolute time and programmed physics are invariant between all players.[/quote]



    Bull. Players can jump farther in Quake 3 if they have greater fps. It's a fact. You do not know what you are talking about.



    And to settle this dumb FPS debate once and for all...



    You people need to learn the difference between "average fps" and "minimum fps". When you see a report of some Wintel getting 200 fps in Quake, that is an average. Sometimes they get more fps, and sometimes they get less. The problem is when they get less fps. A drop in fps from 200 to 40 is very obvious and it changes the control characteristics of the game, thus affecting gameplay. It is the minimum fps that gamers want to raise most, if they could get 80 fps all the time then sure it would be enough, but the way it works is that if one gets 200 fps when facing a wall, that will drop precipitously during heavy action.



    So it is irrelevant if someone needs 200 fps, or even 100 fps. The problem is that the minimum fps needs to remain acceptable, at least 60-80 fps is a nice number.



    Go back to your stats books and review what "average/mean", "maximum", and "minimum" describe. Then you will understand why gamers like fast Wintels. And please, nobody is that dumb: if you want a computer to play games, then you don't buy a Mac. And gaming performance benefits everyone, because it drives the advances in performance on computers. Because of gaming, Photoshop runs faster. Think about it.
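
    To make the average-versus-minimum point concrete, here is a tiny illustration with invented frame-rate samples (the numbers are made up for the example, not measured on any machine):

    ```python
    # Hypothetical per-second fps samples during a match: high while facing a wall,
    # dropping during heavy action. The values are invented purely to illustrate the point.
    fps_samples = [200, 210, 195, 180, 60, 45, 40, 55, 190, 205]

    average_fps = sum(fps_samples) / len(fps_samples)

    print(f"average: {average_fps:.0f} fps")   # the number benchmarks usually report
    print(f"maximum: {max(fps_samples)} fps")
    print(f"minimum: {min(fps_samples)} fps")  # the number that actually hurts gameplay
    ```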
  • Reply 49 of 57
    [quote]Originally posted by Riverfront Media:

    If I played movie frames at 85 FPS and a series at 120 FPS then finally at 200 FPS, you would NOT be able to tell the difference. I don't know what eye doctor you talked to![/quote]



    Ok, you and PowerDoc need to brush up on this. It's not as simple as sitting in a room, and having me play you videos in random order at 30fps, 85 fps, and 200 fps, and expecting you to be able to tell me how fast each one is.



    My post went into great detail to ensure you guys would have the scientific evidence to support my claims.



    The idea of having things stream at a higher fps is to get a greater effect of "suspension of reality". Therefore, a higher fps is going to heighten that effect, and it is especially important for digital imagery. You need to go re-read my post. Furthermore, if you guys don't believe me, you might want to actually go to an eye doctor or physiologist and ask them. Again, I'm not saying you can watch an animation and say, "Oh, that is xx frames per second." I'm simply saying that there is a definite increase in the quality of viewing that the brain can realize when watching higher-fps animations.



    It's been proven, in many, many studies, that we are capable of perceiving up to and beyond 200 fps. I think the Air Force study sort of sums that one up. In my post, I also discussed how we actually visualize reality, and if you guys actually read that, you'll see that we are indeed capable of what I'm claiming.



    So go ask a doctor if you don't believe me. I've been doing this a long time, I've hosted lectures on this, and I'd like to think I know what I'm talking about. I'm sorry, Riverfront Media (who should know better, given the industry you're in: HDTV is 60 fps, and if we couldn't see past 30 fps, why would we standardize on 60?) and PowerDoc, but you're both wrong. Show me evidence to support your claims, but until then, you've really got no evidence.
  • Reply 50 of 57
    And before someone tries to call me on the HDTV @ 60fps thing, it is indeed one of the proposed HD standards. If you look at the proposed papers, many of the standards include 60fps proposals.
  • Reply 51 of 57
    powerdoc Posts: 8,123, member
    [quote]Originally posted by M3D Jack:

    Ok, you and PowerDoc need to brush up on this. It's not as simple as sitting in a room, and having me play you videos in random order at 30fps, 85 fps, and 200 fps, and expecting you to be able to tell me how fast each one is.[/quote]



    I will not enter into the discussion about the persistence of vision on the retina, but 200 fps is useless, as I have already said, because most users have a CRT screen running at 85 Hz. So explain to me how you can tell the difference between 200 fps and 85 fps on a screen whose refresh rate only allows 85 images per second.

    And I would bet that on Quake 3 (the game I play), a machine that never drops under 30 fps is good. My computer averages 40 frames per second, but it drops to 1 fps in certain cases: it freezes.
  • Reply 52 of 57
    [quote]Originally posted by powerdoc:

    I will not enter into the discussion about the persistence of vision on the retina, but 200 fps is useless, as I have already said, because most users have a CRT screen running at 85 Hz. So explain to me how you can tell the difference between 200 fps and 85 fps on a screen whose refresh rate only allows 85 images per second.

    And I would bet that on Quake 3 (the game I play), a machine that never drops under 30 fps is good. My computer averages 40 frames per second, but it drops to 1 fps in certain cases: it freezes.[/quote]



    I completely understand and agree with that. This whole "maximum perceivable fps" debate spawned from Riverfront Media's comment that we can't perceive anything higher than 30 fps, and that it is a waste of time to try to view anything higher than that. Sure, ultimately you are still limited by your display... but if you had a display that was capable of refreshing itself 160 times a second (my three-year-old CRT is capable of 120 Hz @ 1152x870, so it isn't uncommon), you would definitely be able to perceive 160 fps.



    I think the reason most people believe the limit is around 24 or 30 fps is because of the idea of the "Least Noticeable Difference".



    The receptors in your eyes absorb energy, and we know that energy dissipates at a predictable rate. When a light flashes, it takes a while for that image to go away; that is the afterimage. So when your eyes receive an image, the visual receptors absorb the energy and pass it on to the visual cortex of the brain. If you can introduce a brand new image before the energy level of the last image fades below a certain level, you can't tell that the switch happened. That is the Least Noticeable Difference, and we estimate it to be something like 24 or 30 fps.
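
    Purely as an illustration of that reasoning: if you model the afterimage as a simple exponential decay, the question becomes whether the next frame arrives before the last one has faded past a threshold. The decay constant and threshold below are made-up numbers for the sketch, not physiological measurements.

    ```python
    import math

    # Toy model of the "new image arrives before the old one fades" idea.
    # TAU and THRESHOLD are invented illustration values, not measured constants.
    TAU = 0.015        # assumed decay time constant of the afterimage, in seconds
    THRESHOLD = 0.10   # fraction of the original "energy" below which the cut becomes visible

    def switch_is_masked(fps: float) -> bool:
        frame_interval = 1.0 / fps
        remaining = math.exp(-frame_interval / TAU)  # energy left when the next frame lands
        return remaining > THRESHOLD                 # still "glowing", so the cut is hidden

    for fps in (15, 24, 30, 60, 120):
        print(fps, "fps ->", "masked" if switch_is_masked(fps) else "visible gap")
    ```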



    The big misconception comes when people just assume that is the maximum number of fps we can see. This is where people grossly underestimate the power of our brain, and just how advanced our imaging system is. It's also terribly hard to place a limit on the number of fps you can see, but we've done studies that show humans can tell the difference between, say, 120 fps and 30 fps, because it's much easier to notice when frames are missing.



    There is a freeware program out for Windows that shows a rotating bar rendered in OpenGL, only it shows it in two panes. You can set the fps of each bar to be different, and believe me, you can notice a difference. If you have a 90 Hz display, set one of them to 90 and the other to 30. There is a difference. It is available at: http://sdw.arsware.org/FPSCompare/
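
    For anyone who can't run that program, here is a rough imitation of the same idea in Python (assuming the pygame package is installed; this is not the actual FPSCompare code): two bars spin at the same speed, but the right one is only redrawn about 30 times a second.

    ```python
    import math
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 320))
    clock = pygame.time.Clock()

    TURN_SPEED = 360.0       # degrees per second, the same for both bars
    SLOW_STEP = 1.0 / 30.0   # the right bar is only advanced about 30 times per second

    angle_fast = 0.0
    angle_slow = 0.0
    slow_accum = 0.0

    running = True
    while running:
        dt = clock.tick(0) / 1000.0   # uncapped: run as fast as the machine allows
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        angle_fast += TURN_SPEED * dt   # updated every displayed frame
        slow_accum += dt
        if slow_accum >= SLOW_STEP:     # the slow bar moves in coarse ~30 Hz steps
            angle_slow += TURN_SPEED * slow_accum
            slow_accum = 0.0

        screen.fill((0, 0, 0))
        for cx, ang in ((160, angle_fast), (480, angle_slow)):
            rad = math.radians(ang)
            dx, dy = 100 * math.cos(rad), 100 * math.sin(rad)
            pygame.draw.line(screen, (255, 255, 255),
                             (cx - dx, 160 - dy), (cx + dx, 160 + dy), 3)
        pygame.display.flip()

    pygame.quit()
    ```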



    Since the maximum fps varies from person to person, and since it is so difficult to measure, it is inherently hard to give a good number as a ceiling. Studies that try to pin down a number land around the 200 fps range. It's sort of like me asking you how many colours we can see. It's hard to even give a ballpark figure, and it is going to differ widely between people. Studies for that give estimates of about 16 million colours, and that's why your computer displays go up that high.



    At any rate, the point I'm trying to prove is that if you think 30 fps is the limit, you're wrong. 30 fps represents the Least Noticeable Difference, but don't underestimate the human brain. It's capable of much more, and if you run the software above, I think you'll see the difference. As for PowerDoc and others citing that in the end it all comes down to how fast your display can update, you're absolutely right, and I don't disagree with that. The bottom line of my argument has always been that we can see beyond 30 fps, and for digital images on digital displays with no natural motion blur, the optimal frame rate to create a good "suspension of reality" is about 85-120 fps. And we can see that many, and more, so long as your display will support it.
  • Reply 53 of 57
    airsluf Posts: 1,861, member
  • Reply 54 of 57
    <a href="http://www.burstnet.com/ads/ad5765a-map.cgi/1964?292,29"; target="_blank">http://www.burstnet.com/ads/ad5765a-map.cgi/1964?292,29</a>;



    I'll post the article on Quake FPS and game physics ASAP... I lost the bookmark and it will take a while to find it. But understand that I didn't pull that outta my arse; it is a FACT that jump distance in Quake is dependent upon fps.
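
    Until I can dig up the article, here is a deliberately simplified toy model of the usual explanation: the movement code integrates the jump one frame at a time and snaps the position each frame, so more frames means more round-ups and a slightly higher apex. This is my own exaggerated sketch, not Quake's actual movement code; only the jump velocity and gravity values roughly follow Quake conventions.

    ```python
    import math

    # Toy model: per-frame rounding makes jump height creep upward with frame rate.
    # NOT Quake 3's real code -- the 1/8-unit snap is invented to show the mechanism.
    JUMP_VELOCITY = 270.0   # units per second
    GRAVITY = 800.0         # units per second squared

    def max_jump_height(fps: int) -> float:
        dt = 1.0 / fps
        z, vz, peak = 0.0, JUMP_VELOCITY, 0.0
        while True:
            vz -= GRAVITY * dt
            z += vz * dt
            z = math.ceil(z * 8) / 8    # snap the position up to 1/8 unit each frame
            peak = max(peak, z)
            if z <= 0:
                return peak

    for fps in (30, 60, 125):
        print(f"{fps:>3} fps -> apex {max_jump_height(fps):.2f} units")
    ```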
  • Reply 55 of 57
    airsluf Posts: 1,861, member
  • Reply 56 of 57
    quarem Posts: 254, member
    M3D Jack, thanks for your awesome posts; that is a really great explanation of why FPS is important in 3D games.



    I have always explained why we need high FPS in 3D games by looking at angular velocity. In Quake 3 and other first-person shooters you tend to turn with high angular velocities, and without blurring of the image, the jumpiness between frames can be quite noticeable. I am sure that if you couldn't turn the camera quickly in Quake 3, then 30 FPS would be totally fine, but when you are spinning around really quickly the image changes more and more between two consecutive frames, so we need high FPS to account for this and create the sense of reality.
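
    To put rough numbers on that (the full-turn-per-second spin speed is just an example figure):

    ```python
    # How far the view rotates between two consecutive frames while spinning,
    # assuming an example turn speed of one full revolution per second.
    TURN_SPEED_DEG_PER_S = 360.0

    for fps in (30, 60, 120):
        degrees_per_frame = TURN_SPEED_DEG_PER_S / fps
        print(f"{fps:>3} fps: the image jumps {degrees_per_frame:.0f} degrees between frames")
    # At 30 fps that is a 12-degree jump every frame; at 120 fps it is only 3 degrees,
    # which is why the spin looks much smoother without any motion blur to hide it.
    ```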



    When they jerk the camera wildly in movies they have blurring effects so it doesn't look that bad, just like you pointed out.



    I thought I would add this since it seems to get the point across pretty well; people can visualize what's going on and then understand why more than 30 FPS is desperately needed.



    That being said, let's hope that the G5, when it arrives, will bring Macs up to the same FPS levels as PCs.
  • Reply 57 of 57
    One thing I am getting sick of is the "Apple has only X% of the market" arguments. Where do these numbers come from? I see 2.7%, 4%, and 5% bandied about, but there is NO way anyone can verify these numbers. I am not saying that Apple has 20% market share, but given that Macs stay in use a lot longer and that many sales don't show up in traditional surveys, who cares what the actual number is? Apple has MILLIONS of Macs out there and is selling more each day. The bottom line is that Apple has a stable installed user base, and it is not going away anytime soon.