How long will Apple survive?


Comments

  • Reply 21 of 57
    [quote]Originally posted by spooky:

    It doesn't matter how many people are happy with their current Macs. My experience of the last two years has been most of my Mac-using designer friends jumping ship, most of the colleges I know jumping ship, and Apple seemingly indifferent to its own demise. At least the new iMac can run iTunes while Rome burns.

    Why the f*ck can't Apple bring out a killer pro tower? Everyone else seems able to do it.[/quote]





    And those people who jumped ship have already been replaced by newcomers to the platform. Apple has really tried to market itself at the consumer level over the past few years, and a lot of the Mac faithful have bailed. If you're not happy with your platform, switch. This isn't any indication that Apple's future is doomed, because, as I've pointed out, they replace the users who bail, and there are plenty of the faithful who won't be switching anyway. They're a niche player, and they're not going anywhere for a while.



    As to your comment about having a kick-ass pro tower... what exactly are your requirements for a kick-ass pro tower? Faster clock speeds are all up to Motorola and IBM... and if you want Apple to switch to x86, which won't happen, go buy a Windows box. If you want DDR RAM, it's coming... but if you expect Apple to add new technology to its motherboards twice a year, it's going to cost you. Apple likes to build a motherboard that can be reused for a long time... that allows them to recover the cost of development over a longer period (there's a toy example of that math at the end of this post). If they rev the motherboard more often, they have to raise the price of their already expensive computers.



    At any rate, that's the way I see it. I could be wrong; it's been a long day, and I'm about ready to go to sleep.
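
    To put some rough numbers on the board-reuse point above, here's a toy amortization sketch in Python. Every figure is invented purely to show the shape of the math; none of them are Apple's actual costs or volumes.

```python
# Toy amortization of a fixed motherboard development cost over the number
# of machines that ship on that board. All figures are invented.

dev_cost = 10_000_000          # hypothetical cost to design and validate one board
units_per_quarter = 200_000    # hypothetical machines shipped on that board per quarter

for quarters_in_service in (2, 4, 8):
    per_unit = dev_cost / (units_per_quarter * quarters_in_service)
    print(f"board lives {quarters_in_service} quarters -> "
          f"${per_unit:.2f} of development cost baked into each machine")
```

    The longer one board design stays in service, the smaller the slice of development cost each machine has to carry, which is the whole argument for not revving the board twice a year.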
  • Reply 22 of 57
    Again, when will the Sun burn out? 4.x billion years from now? Just asking.
  • Reply 23 of 57
    applenut Posts: 5,768
  • Reply 24 of 57
    [quote]Originally posted by CodeWarrior:

    While it's possible that Apple won't be around by 2006, having been bought out by Sony or IBM, I wouldn't count on it. I'd expect Disney, especially if Mac OS X is the only commercial OS (Linux isn't commercial) that doesn't have DRM (Digital Rights Management) built in. That might be the only way they can lock down MP3s and DVDs on the desktop.[/quote]



    After the bashing that Apple took from Disney last week, I doubt Jobs would allow them to buy Apple.
  • Reply 25 of 57
    macaddict Posts: 1,055
    Sometimes I have wondered this. Apple is still making successful hardware products (I must say the Ti is the sexiest laptop I have ever seen in my life, and the iMac is doing very nicely), but they are still losing market share.



    I am not sure about this, but I believe their market share is at its lowest point since they started out. Apple is handling a small market share very well; they have quite a few developers, and pretty much all standard software is available (or has an equivalent) for the Macintosh, save for games.



    The thing is, Apple's market share is dipping a little bit every quarter. Right now they have about 2.7% of current shipments.



    Some discussion: http://arstechnica.infopop.net/OpenTopic/page?a=tpc&s=50009562&f=48409524&m=1610934343



    If it drops a little each quarter, eventually sales will dwindle enough that developers will pull out. However, it is happening so slowly that I think Apple will be around for at least 5 or 10 more years. They have some of the best experience in the computer industry and are here to stay. They also have a huge wad of cash keeping them up there.
  • Reply 26 of 57
    The human eye can only see so many frames per second... for example, a Hollywood feature is only 24 fps, and TV is 29.97 or "30" fps. If ya think you can see the difference between 80 fps and 100, you are wrong. As for processor speed, clock speed doesn't mean squat! I will run my dual 533 G4 against ANY dual 1500 or less (more than double) Insmell... I mean Intel Pentium, and kick its a** in rendering video... that is a fact! I had dual Windows boxes and they suck! It isn't just the CPU, it is the whole system. If Windows crashes, then your P2.2 will do you absolutely no good!
  • Reply 27 of 57
    applenutapplenut Posts: 5,768member
    [quote]Originally posted by Riverfront Media:

    The human eye can only see so many frames per second... for example, a Hollywood feature is only 24 fps, and TV is 29.97 or "30" fps. If ya think you can see the difference between 80 fps and 100, you are wrong. As for processor speed, clock speed doesn't mean squat! I will run my dual 533 G4 against ANY dual 1500 or less (more than double) Insmell... I mean Intel Pentium, and kick its a** in rendering video... that is a fact! I had dual Windows boxes and they suck! It isn't just the CPU, it is the whole system. If Windows crashes, then your P2.2 will do you absolutely no good![/quote]



    someone is smoking some damn good stuff

    :eek:
  • Reply 28 of 57
    powerdoc Posts: 8,123
    [quote]Originally posted by applenut:
  • Reply 29 of 57
    [quote]Originally posted by Riverfront Media:

    The human eye can only see so many frames per second... for example, a Hollywood feature is only 24 fps, and TV is 29.97 or "30" fps. If ya think you can see the difference between 80 fps and 100, you are wrong. As for processor speed, clock speed doesn't mean squat! I will run my dual 533 G4 against ANY dual 1500 or less (more than double) Insmell... I mean Intel Pentium, and kick its a** in rendering video... that is a fact! I had dual Windows boxes and they suck! It isn't just the CPU, it is the whole system. If Windows crashes, then your P2.2 will do you absolutely no good![/quote]



    Wow, OK, let's all sit back for a second while I go over how the human eye works and how many fps it is actually capable of seeing. This should be especially useful for anyone doing any sort of video work. I'll preface this by saying I grew up in the television world at Turner Broadcasting; I could use the CMX linear editor by age 13, and I spent a few years of my life actually working at Turner Broadcasting. In that time, I've learned a lot about video theory and, more importantly, about how the human eye perceives reality. So grab a chair....



    First we need to understand how the human eye interprets reality. Light comes in a steady stream (and I don't want to get into a physics debate about the nature of light, where it acts as both a particle and a wave, so roll with me here) and is focused onto the retina of your eye by the lens. Light, and therefore reality, is constantly streamed to us; your eye never stops receiving information; you can't turn it off. It does this through two types of receiving cells: rods and cones. Intensity, color, and position are transmitted by the retina to the optic nerve... and from there, that information is passed on to the visual cortex for us to consciously perceive.



    Of the two cell types, rods are simpler, and only interpret the position and intensity of light. They are color blind, and therefore only see in black and white. It isn't really a true B&W image... think of it as a mask in Photoshop... it's really just the intensity of light hitting the cells. The intensity of light hitting your rods determines how much neurotransmitter is released. More rods exist on the outer edge of your retina, and these cells are quite fast, due to the simple nature of measuring intensity and the fact that our neural capabilities are much greater than those of a computer.



    Cones are the more complex cells. They absorb different wavelengths of light and then release a corresponding amount of neurotransmitter. There are three types of cones, sensitive to red, green, and blue wavelengths, and each releases a differing amount of neurotransmitter based on the color, or wavelength, of the light that hits it. The information released from these cells is then passed on to the visual cortex and causes you to see a certain color. Note that these cells are slower to react to light changes because of the complexity of their interpretation of light. These cells largely make up the center and fovea of the retina.



    Your optic nerve passes the information received from your eye to the visual cortex of your brain. Nerve impulses travel at over 200 mph, and since the nerve is only about 2-3 cm long, you have an amazing amount of bandwidth here. Don't discount the human eye and the power of our brain; it's much more than any computer is capable of. We have an incredibly advanced vision system. Some animals may have enhanced vision in certain scenarios... an eagle has quick, sharp vision... but the eagle can only see in black and white, and an owl, for example, has the tradeoff of not being able to move its eyes within their sockets.



    So, moving right along, we'll touch on the video and film world, and those of you doing it professionally, pay attention. This is where we get the common misconception that we are only capable of perceiving 30 fps. It's just not true. This is where a property called 'motion blurring' enters the lecture.



    One of the tricks our mind uses to make up for the fact that our brain can only hold so much information is motion blur. This is terribly important to the way we perceive reality. Because the eye can receive only so much information, and our visual cortex can only process so much of that, we still have to be able to properly visualize the world. In other words....



    When you're watching a train go by at high speed, it blurs, right? Do you know why? Whenever something is moving in this universe, at any given point in time it will always have a fixed position, no matter how fast it is going. So long as it exists, it has a position (and no getting into relativistic physics). So let's take, for example, a bird flying along in the sky at about 10 mph. We can see the bird in excellent detail at this point. However, if we get into an airplane and fly by the bird at around 160 mph, we just see it as a streak. This is because our eyes cannot receive and process enough information to render that bird perfectly... and that is why we see it as a streak. If we couldn't produce motion blur, the bird would seem to pop in and out of reality: you would see the bird in one place, and then you would see it again several feet along in the direction it was going. Basically, we can't process enough information in our minds to account for an exact replication of the detail of the bird at every fixed position it occupies as we go by it, so we compensate by blurring its image.



    Motion blurring is common in analog mediums. It happens in film and video because those mediums are not fast enough to capture exact replications of fast-moving images. Motion blur in those mediums lets us better compensate for the low frame rates, thus assisting our eye in moving between the frames being projected. This is why motion blurring in After Effects and the like is so important when you're mastering back out to film or video.



    With pretty much anything digital, you're getting a perfectly clear image every frame. When I say digital, I don't mean DV cams; that is still video. I mean computer displays and video games. Every frame Quake III draws is perfectly detailed. There is no motion blurring, because it would be terribly problematic to determine where in the 3D space a particular object was at any given time. In order for digital images to look more natural, we have to bump up the frame rate. Anything under 60 won't really be good enough, and the optimal number we're looking for is between 85 and 120 fps. Once you get above 100 fps, the gains in perceptual quality diminish, and you only see minimal improvement in the "suspension of reality" effect that higher frame rates yield (there's a quick sketch of the frame-gap math at the end of this post).



    If the analog vs. digital thing isn't clear enough... think of it this way. Film is made up of tiny grains, which, when viewed from a distance, make up an image. It is by no means an exact representation of that image. If you work on a 3D animation in LightWave or Maya, render it out, and display it on a digital display (LCD, plasma), every frame is going to have exact detail. There will be no natural motion blur, and adding it stops being necessary once you bring your frame rates up.



    Anyhow, I know this is getting long-winded, but I just can't sit idly by when a comment like this appears. Basically, here's what you all want to know...



    How many FPS can we see?



    Our brain is capable of noticing up to and beyond 200 fps. This has been proven again and again. Ask ANY optometrist or physiologist.



    I read somewhere that the Air Force had done tests with their pilots where they would put them in a dark room and flash an image of an aircraft on a screen in front of them for 1/220th of a second. The pilots were consistently able to see the afterimage of the aircraft and identify it. Sure, it's a specific situation, but it goes a long way toward showing how sensitive our eyes and brain are to light.



    In conclusion, and to summarize: we can see a hell of a lot more than 30 fps. It's especially important to bump this number WAY up in digital scenarios like 3D gaming because of the nature of digital images.



    :: steps off the podium ::
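
    As a rough illustration of the frame-gap argument in the post above, here's a minimal Python sketch. The speeds and frame rates are made-up numbers, and the half-interval shutter is just a stand-in for how film smears motion; the point is only that an un-blurred digital frame "pops" less as the frame rate climbs.

```python
# How far a sharply rendered object "pops" between frames, versus the smear
# a film-style shutter would leave. All numbers are invented for illustration.

def frame_gap(speed_px_per_s: float, fps: int) -> float:
    """Distance a crisp, un-blurred object jumps between consecutive frames."""
    return speed_px_per_s / fps

def blur_length(speed_px_per_s: float, fps: int, shutter: float = 0.5) -> float:
    """Length of the smear left by a shutter open for half the frame interval
    (roughly a 180-degree film shutter)."""
    return speed_px_per_s * shutter / fps

if __name__ == "__main__":
    speed = 2000.0  # pixels per second, e.g. a fast pan
    for fps in (24, 60, 120):
        print(f"{fps:>3} fps: crisp frame pops {frame_gap(speed, fps):5.1f} px; "
              f"a blurred frame smears it over {blur_length(speed, fps):5.1f} px")
```

    At 24 fps the crisp object jumps over 80 pixels between frames, which is exactly the "pop in and out of reality" effect; at 120 fps the jump shrinks to under 20 pixels, which is why the higher rate reads as smoother even without blur.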
  • Reply 30 of 57
    willoughby Posts: 1,457
    ...and the award for longest post of the year goes to.....





    M3D Jack!!!!!



  • Reply 31 of 57
    bogie Posts: 407
    There are shortages because the new iMac is a hit; there are over 200,000 orders, and I expect some record-breaking sales if you compare it to past releases. The shortages do suck, but sales in the PC industry have been steadily declining, so according to the original logic of this post we should be asking "how long will the PC industry survive?"



    People asking me if Apple will die were at first offensive, then funny; now I just don't care. Apple is always seen as on their way out, yet the odd thing is they never go anywhere but up. In 1997 they were in a much worse situation, and their stock went from $12.75 to over $155.
  • Reply 32 of 57
    rok Posts: 3,519
    sorry, but if the mid-90's didn't kill apple, nothing will.



    they'll never take over the world, but they will never go away, either. :cool:
  • Reply 33 of 57
    alcimedes Posts: 5,486
    How many FPS can we see?



    that's all fine and good. now ask yourself how many can your computer monitor display?



    as i mentioned earlier, most people don't run their monitors above 85Hz. last i checked, that means their screen is refreshing at a rate of 85 times per second.



    enter your video card. if you are outputting 100fps consistently or 200fps consistently, there is no difference in what you see.



    along those lines, what a lot of people notice is that their video card will choke on complex, action-filled scenes, so that 200fps drops suddenly to 60fps. macs, however, seem to show around 100fps normally and stick there. any scene, any time, is still showing at 100fps.



    at which point the obsession with max fps is pointless and stupid. what people should be more concerned about is the minimum fps their setup will do in their favorite games. find settings that won't dip below 85fps or so and you're fine.
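
    A small sketch of the two points above, with example numbers only: a display refreshing at 85Hz can only present 85 distinct frames per second, and a good-looking average fps can still hide an ugly minimum.

```python
# A display refreshing at 85Hz can only present 85 distinct frames per second,
# so rendering above that is thrown away; dips below it are what you notice.
# All rates here are example numbers.

REFRESH_HZ = 85.0

def frames_presented(render_fps: float, seconds: float = 1.0) -> int:
    """Frames the display can actually show: capped by its refresh rate."""
    return int(min(render_fps, REFRESH_HZ) * seconds)

if __name__ == "__main__":
    for render in (200.0, 100.0, 85.0, 60.0):
        print(f"rendering {render:5.0f} fps -> {frames_presented(render)} frames shown per second")

    # average fps can look great while the minimum is what you actually feel
    per_scene_fps = [200, 180, 190, 45, 200]   # made-up numbers; 45 is the busy firefight
    average = sum(per_scene_fps) / len(per_scene_fps)
    print(f"average {average:.0f} fps, but the worst scene runs at {min(per_scene_fps)} fps")
```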
  • Reply 34 of 57
    ...and the award for longest post that is totally off-base goes to.....

    M3D Jack!!!!!







    Thanks, alcimedes, you beat me to it. It's a shame that people like this don't understand why it's important to push *average* FPS.
  • Reply 35 of 57
    I care zero about frames per second in game X, but they do seem to correlate with the work I do. I'm in the scientific community, and the simulations we do require a strong FPU. AltiVec is almost useless since its precision is limited to 6 or 7 significant digits.



    We need the G5 soon. We need its 64-bit architecture. We need increased cache and memory bandwidth with low latencies. We need a stronger FPU that is at least three times as fast as the G4's, or else we're turning to the Hammer.
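
    For anyone who hasn't hit the precision wall being described, here's a quick sketch using Python and NumPy (purely as an illustration, not the poster's toolchain): single precision carries roughly 6-7 significant decimal digits, double precision roughly 15-16, which is why a single-precision vector unit is no substitute for a strong FPU in simulation code.

```python
# Single precision (what the AltiVec vector unit computes in) carries roughly
# 6-7 significant decimal digits; double precision carries roughly 15-16.
import numpy as np

x32 = np.float32(1.0) + np.float32(1e-10)   # the 1e-10 is lost entirely
x64 = np.float64(1.0) + np.float64(1e-10)   # the 1e-10 survives

print(f"float32: ~{np.finfo(np.float32).precision} significant digits, 1 + 1e-10 = {x32}")
print(f"float64: ~{np.finfo(np.float64).precision} significant digits, 1 + 1e-10 = {x64}")
```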
  • Reply 36 of 57
    ccr65 Posts: 59
    [quote]There are a few high-end applications that benefit, but at this point, hardware is accelerating much faster than software can take advantage of.[/quote]



    I'm sorry, but you don't know what you're talking about. Just about every app used in professional graphics, video, and audio could use more processor power. When I'm working on either the PC at work or my Mac at home, very little time goes by that they aren't maxed out. Try converting 30 minutes of DV footage to MPEG some time and see what I mean.



    Memory bandwidth and disk performance are important too, but when you are trying to be creative it helps to have as close to a realtime environment as possible. We aren't there yet. Producing content for games, for instance, can tax a computer far more than a user will in playing it.
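
    For a rough sense of scale on that DV-to-MPEG example (using commonly quoted DV figures, give or take container overhead), a quick back-of-the-envelope in Python:

```python
# Approximate, commonly quoted DV numbers: ~25 Mbit/s video, about 3.6 MB/s on
# disk once audio and overhead are included, at NTSC's ~29.97 frames per second.

DV_MB_PER_SEC = 3.6
NTSC_FPS = 29.97

minutes = 30
seconds = minutes * 60
source_gb = DV_MB_PER_SEC * seconds / 1024
frames = int(NTSC_FPS * seconds)

print(f"{minutes} min of DV is roughly {source_gb:.1f} GB of source and about "
      f"{frames:,} frames, every one of which has to be decoded and re-encoded to MPEG")
```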
  • Reply 37 of 57
    I wouldn't say it is totally off base. The question about fps comes up all the time, and it would seem that nobody really knows exactly what they're talking about when they dive into it. I guess it was a little long; that's what my coffee does for me in the morning.



    What set me off was the comment that we can't see high frame rates... and it's difficult to measure that, because we don't view reality as frames. This does come full circle in that the higher frame rates gamers chase are justified, just not for the reasons they think. It's actually not uncommon to have a CRT that can do 120Hz+, and the point of the science lesson was that 120fps is probably a pretty good number for 3D games.
  • Reply 38 of 57
    alcimedes Posts: 5,486
    [quote]I'm sorry, but you don't know what you're talking about. Just about every app used in professional graphics, video, and audio could use more processor power.[/quote]



    and what percentage of the computers sold do you think are used for these purposes? 3%, 4%, maybe 6%?



    so you've got at least 94% of the world buying stuff they will never even come close to tapping. for those of you who are consistently maxing out your CPU, i'm sure the new 64-bit chips are going to be the way to go. maybe you should be looking at Power4 chips. point is, you are nowhere near the average consumer.



    stop and ask yourself when the last time was that your e-mail, mp3 player/ripper, word processor, or powerpoint crap came close to using your full CPU. well guess what, that's what basically everyone else in the world runs on their machines, period.



    if you can tell me with a straight face that the average user is going to see a benefit from switching from a 1.4 to a 1.8 GHz chip, you're full of crap.
  • Reply 39 of 57
    crusader Posts: 1,129
    Apple is always on the brink of ruin.
  • Reply 40 of 57
    quarem Posts: 254
    [quote]Originally posted by alcimedes:

    and what percentage of the computers sold do you think are used for these purposes? 3%, 4%, maybe 6%?

    so you've got at least 94% of the world buying stuff they will never even come close to tapping. for those of you who are consistently maxing out your CPU, i'm sure the new 64-bit chips are going to be the way to go. maybe you should be looking at Power4 chips. point is, you are nowhere near the average consumer.

    stop and ask yourself when the last time was that your e-mail, mp3 player/ripper, word processor, or powerpoint crap came close to using your full CPU. well guess what, that's what basically everyone else in the world runs on their machines, period.

    if you can tell me with a straight face that the average user is going to see a benefit from switching from a 1.4 to a 1.8 GHz chip, you're full of crap.[/quote]



    Processor-intensive tasks are becoming more common. Just look at all those new iMacs with SuperDrives. The people who bought them will be making iMovies and compressing MPEG video to burn DVDs.



    Software will always evolve to fit the capacity of the hardware. The problem with Apple lagging behind the PC is that it creates a differential in the hardware. The potential therefore exists for a 'killer' app to appear on the PC that requires the power of a 'consumer' PC but won't run on 'consumer' Macs because they're too slow.



    Ultra-fast hardware is always a luxury today but a necessity tomorrow.