Mac Pro Refresh in March


Comments

  • Reply 181 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Tallest Skil View Post


    So you're saying Apple is lying about selling more Macs every quarter than previous?



    Nooo, I'm saying the guys who publish these articles spin the numbers to look bigger or smaller. When they talk about desktop growth there, Apple is a relatively small player in that area compared to some of the others, and the acceleration of growth in that market segment, even within Apple, is winding down. Also note that when it's based on total Macs, the laptops significantly outpace everything else. Apple could spread the love around its line a bit, considering it's a small product line regardless for a company of that size. They seem to leave some of the older designs to coast on the momentum of the brand itself.
  • Reply 182 of 374
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by wizard69 View Post


    If Apple were to start building specialized GPU processing hardware, the cost of this hardware would make the Mac Pros look cheap.



    Apple wouldn't have to build that part themselves and it wouldn't be that expensive. Not much more than a Pegasus RAID system.



    Quote:
    Originally Posted by wizard69 View Post


    Remember, at this point if the Mac Pro is replaced, the new machine will likely be close to twice as fast CPU-wise and very interesting GPU-wise.



    Doubtful. We are only just getting Sandy Bridge now, so not even close to double after nearly 2 years.



    Quote:
    Originally Posted by wizard69 View Post


    If they really want to offer up something like this it has to fall out of a more general purpose architecture that can ship in volume.



    Fast GPU options will ship in far higher volumes than 12-core CPUs. The GPUs can at least be used for visual tasks.



    Quote:
    Originally Posted by wizard69 View Post


    So while amazing GFLOP numbers can be thrown around by NVidia and others, just realize that GPUs need the right sorts of data to work on to get those numbers.



    Like in Apple's Core software architecture. They can do it for encoding too - Core Compressor. This can be an API used by all apps to do batch image and movie compression. FCPX exports can be done in a fraction of the time.
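
    To make that idea concrete, here is a rough sketch of what a hypothetical "Core Compressor" style batch API could look like from an app's point of view. None of these names exist in OS X; they are purely illustrative of the kind of shared, GPU-backed engine being described, with the system deciding where the work actually runs.

    ```c
    /* Hypothetical "Core Compressor" interface -- illustrative only.
     * None of these types or functions exist in OS X; they sketch the kind
     * of batch, GPU-backed compression API being proposed above. */
    #include <stddef.h>

    typedef struct CCJob CCJob;                  /* opaque batch job handle */

    typedef void (*CCCompletion)(CCJob *job,     /* called when the batch finishes */
                                 int status,
                                 void *user_data);

    /* Create a batch job that transcodes a list of source files. The system
     * chooses whether the work runs on the GPU (e.g. via OpenCL) or the CPU. */
    CCJob *CCJobCreate(const char *const *src_paths, size_t count,
                       const char *codec,        /* e.g. "h264" */
                       int quality);             /* 0-100 */

    int  CCJobSubmit(CCJob *job, CCCompletion done, void *user_data);
    void CCJobCancel(CCJob *job);
    void CCJobRelease(CCJob *job);
    ```

    An app like FCPX would then just queue exports through the shared engine instead of rolling its own encoder.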



    Quote:
    Originally Posted by wizard69 View Post


    With the right hardware, a Mac Pro, you could just plug in a compute card and bypass the slow TB link.



    You can't fit 4 or more high-end GPUs in a Mac Pro, so you have to install a card that effectively connects to an external box anyway. You may as well use Thunderbolt. When it comes to computation, the link bandwidth only matters if you have a huge dataset that you are accessing on the host. The compute box will have its own storage so the link speed doesn't matter.
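
    For a rough sense of the bandwidth gap being discussed, the nominal peak figures for the 2012-era links (per direction, ignoring protocol overhead) work out to roughly:

    \[
    \underbrace{16 \times 500\ \mathrm{MB/s}}_{\text{PCIe 2.0 x16 slot}} = 8\ \mathrm{GB/s}
    \qquad\text{vs.}\qquad
    \underbrace{10\ \mathrm{Gbit/s}}_{\text{Thunderbolt channel}} \approx 1.25\ \mathrm{GB/s}
    \]

    So an external chassis sees roughly a sixth of the host bandwidth of an internal x16 slot, which is why the link only becomes the bottleneck when the working set lives on the host rather than in the compute box.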



    Quote:
    Originally Posted by wizard69 View Post


    Even better would be a future where every machine comes with a GPU with the resources to do compute.



    Yeah but you'll always be able to stick 4 or more on the outside of the machine.



    Quote:
    Originally Posted by wizard69 View Post


    The things that a GPU does in a system are highly parallel. That means one gets almost 100% out of every processor added. You can't get that sort of advantage from most CPU applications no matter what you do. The fact that GPU hardware can be applied to a narrow range of other problems is sort of icing on the cake.



    There is only a small set of very highly computationally intensive tasks though. That's highlighted in the Weta example. The final render is done in Renderman on CPUs. The heavy raw computation is done on the GPU.



    Quote:
    Originally Posted by wizard69


    Or to phrase it as a question how can such a product compete against a PC outfitted with a couple of compute cards?



    How many USB tuners would you expect Elgato to sell compared to PCI-based Win-TV tuners? When you have a product you can buy off the shelf and plug into a port, compared to opening up a machine and installing a card in a PCI slot, your audience is vastly bigger, not least because you can appeal to laptop owners.



    Quote:
    Originally Posted by hmm


    You aren't suggesting that they rendered Avatar on a single 1U server are you?



    Nope.



    Quote:
    Originally Posted by hmm


    I'm not sure what protocols Thunderbolt supports anyway. Apple likes to tell us that it supports everything, but there are already a bunch of fringe examples where it simply doesn't work.



    It's the same as PCI. What examples are there where it can't be used?



    Quote:
    Originally Posted by hmm


    As to Marvin's suggestion, not even a proof of concept exists at this point, much less a functioning rig. You're basically turning the mini into a thin client, which is pointless.



    Render farms with thin client controllers have a similar setup. The server manager isn't going to go round with a USB pen and copy files to each server separately. They are all hooked up to a central location. The setup I suggest is simply a way for a single workstation user to have a powerful main machine and a simple compute cluster for intensive tasks.



    Quote:
    Originally Posted by hmm


    The point of a workstation is for work that can't be centralized onto a server.



    It's not really on a remote server though. It's more like a co-processor. Think of the following scenario:



    You have a 6-core Xeon Mac Pro and are working in Maya. When it comes to rendering, you plug in your Thunderbolt S1070 compute box and start the heavy computation step. When it is finished, do your test renders on the CPU - it takes very little time as the heavy computation was done 25x faster than the CPU would have done it.



    Instead of paying $1500 for a second 6-core Xeon, pay $1500 for a GPU cluster. The results are shown in CPU vs GPU configurations:



    http://www.luxrender.net/luxmark/top/top20/Room/GPU

    http://www.luxrender.net/luxmark/top/top20/Room/CPU



    You can see the single CPU 6-core Mac Pro in the CPU list with a score of 360. This CPU costs $1200 from Apple.

    4 x Radeon HD7970 scores 3205. Those cost $480 each.



    Long story short, if you get a 12-core Mac Pro for rendering, it's not the best use of money by a long way. Same for scientific computation. What's the point in shipping a 12-core Mac Pro if heavy computation is far better suited to the GPU?
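
    Working the quoted LuxMark numbers through as a crude value-for-money figure (benchmark points per dollar, using the prices above):

    \[
    \frac{360}{\$1200} = 0.30\ \text{points per dollar}
    \qquad\text{vs.}\qquad
    \frac{3205}{4 \times \$480} \approx 1.67\ \text{points per dollar}
    \]

    so in this particular test the GPU setup delivers roughly 5-6x more throughput per dollar than the 6-core Xeon.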



    Quote:
    Originally Posted by hmm


    If it wouldn't have been feasible in the past to offload something via fibre channel, what makes thunderbolt a game changer there?



    Thunderbolt isn't just an IO protocol. You can't run a GPU over a Fibre Channel link.



    Quote:
    Originally Posted by MacRonin


    Inserting Renderman into OS X as a Core component would be pretty sweet?



    And the simple fact of it being a bundle-in with the older NeXT OS, which became Mac OS X, which became OS X; could be a sort of leverage?



    Petition to the Mouse??!? ;^p



    I would say that Renderman itself shouldn't be the core component though, as it needs to be improved constantly; rather, it should be a very specialised ray-casting engine that pre-computes lighting. This is by far the slowest process. It can then bake the data and allow any number of final render engines to use it. It would be like Core Image in a way. It's not a case of having a Core Photoshop, but rather Core filters that process specific effects and can be used by apps like Quartz Composer or Pixelmator without starting from scratch.
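
    For a flavour of what the GPU side of such a lighting pre-pass might look like, here is a minimal OpenCL C sketch that bakes simple Lambertian (diffuse) lighting for one point light into a per-sample buffer. It is purely illustrative, not anything from Renderman or Core Image, and assumes the host has already flattened sample positions and normals into float4 arrays.

    ```c
    /* Minimal illustrative OpenCL kernel: bake diffuse (Lambertian) lighting
     * for a single point light into a per-sample buffer. Positions and
     * normals are assumed to be pre-flattened float4 arrays from the host. */
    __kernel void bake_diffuse(__global const float4 *positions,
                               __global const float4 *normals,
                               const float4 light_pos,
                               const float  light_intensity,
                               __global float *baked)   /* one value per sample */
    {
        size_t i = get_global_id(0);

        float3 p = positions[i].xyz;
        float3 n = normalize(normals[i].xyz);
        float3 l = light_pos.xyz - p;

        float dist2 = dot(l, l);                 /* squared distance to light */
        float ndotl = max(dot(n, normalize(l)), 0.0f);

        /* Cosine term with inverse-square falloff, written once per sample. */
        baked[i] = light_intensity * ndotl / max(dist2, 1e-6f);
    }
    ```

    A Core-style framework would hide this sort of kernel behind a filter-like API, so any final render engine could consume the baked buffer without caring how it was produced.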



    Quote:
    Originally Posted by hmm


    You know it's something like $2000/seat, right? Yes... bundled into OS X.



    Yeah, $2000 per seat in the volumes they ship. Color used to cost $25000. Software is only priced in such a way that it makes a profit after paying the people who developed it. Apple sells 20 million Macs a year and before Pixar was bought by Disney, their earnings were:



    http://www.pixar.com/companyinfo/pre...307-189666.htm



    "In addition to film revenue, software licensing contributed $14.4 million to full year 2005 revenue."



    They could sell Mountain Lion for $31 with Renderman bundled and make more profit than that. Even though some companies have servers with thousands of machines, the license isn't a yearly cost. It would be nice if companies could do that without being anti-competitive. If Adobe convinced Microsoft and Apple to ship the Adobe CS Suite with the OS, they could charge less than $5 on top of the cost of the OS. It's not that much more anti-competitive than Apple bundling iMovie.
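
    The arithmetic behind that claim, reading the $31 as roughly a dollar on top of what the OS would sell for anyway (an assumption on my part):

    \[
    20{,}000{,}000\ \text{Macs} \times \$1 = \$20\ \text{million} \;>\; \$14.4\ \text{million of 2005 licensing revenue}
    \]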



    Still, I wouldn't advocate bundling software like that as it's too complex; I was talking about core computation engines that complex apps can link up to, like the Core Image or QuickTime frameworks. After Effects doesn't need to roll its own whole media API for rendering out to a movie like it would on Linux. The same could apply for rendering lighting, although the more complex the computation, the less reusable it generally is.
  • Reply 183 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Marvin View Post


    Apple wouldn't have to build that part themselves and it wouldn't be that expensive. Not much more than a Pegasus RAID system.



    It depends whether we're talking about generic GPUs or something comparable to Teslas, which remain quite expensive. Someone did a comparison on this. Also, in some of the applications that can harness GPU compute, one of the limitations seems to be VRAM, which could become less of an issue if this gains popularity.



    Quote:
    Originally Posted by Marvin View Post


    Doubtful. We are only just getting Sandy Bridge now, so not even close to double after nearly 2 years.



    Well, you can still buy real performance growth, but at comparable price points it has remained fairly static.



    Quote:
    Originally Posted by Marvin View Post


    Fast GPU options will ship in far higher volumes than 12-core CPUs. The GPUs can at least be used for visual tasks.







    Like in Apple's Core software architecture. They can do it for encoding too - Core Compressor. This can be an API used by all apps to do batch image and movie compression. FCPX exports can be done in a fraction of the time.



    That would be pretty cool.





    Quote:
    Originally Posted by Marvin View Post


    You can't fit 4 or more high-end GPUs in a Mac Pro, so you have to install a card that effectively connects to an external box anyway. You may as well use Thunderbolt. When it comes to computation, the link bandwidth only matters if you have a huge dataset that you are accessing on the host. The compute box will have its own storage so the link speed doesn't matter.




    You can't in a Mac Pro. There are apparently a couple of workstations that accommodate such a thing, but you will pay a lot. Just the power needed for the GPUs alone is pretty intense.







    Quote:
    Originally Posted by Marvin View Post




    It's the same as PCI. What examples are there where it can't be used?



    It supposedly supports DisplayPort 1.2, yet certain displays just don't work. Anandtech measured lower overall performance when the connection was maxed out with display traffic. 10-bit DisplayPort was only supported on one or two Mac Pro cards in Leopard, and it does not currently seem to be possible over Thunderbolt. This is one of my biggest irritations, as I would use it. Apple doesn't care because their display lacks this feature and so is unaffected.



    Quote:
    Originally Posted by Marvin View Post


    Render farms with thin client controllers have a similar setup. The server manager isn't going to go round with a USB pen and copy files to each server separately. They are all hooked up to a central location. The setup I suggest is simply a way for a single workstation user to have a powerful main machine and a simple compute cluster for intensive tasks.



    Again, I never suggested that a Mac Pro was primarily a render box. I said you could either set rendering as a low background priority to fill unused CPU cycles or run it via a job management application when you're not using the machine. It wouldn't be the primary use. It would just be something to maximize the machine's use.



    Quote:
    Originally Posted by Marvin View Post


    It's not really on a remote server though. It's more like a co-processor. Think of the following scenario:



    You have a 6-core Xeon Mac Pro and are working in Maya. When it comes to rendering, you plug in your Thunderbolt S1070 compute box and start the heavy computation step. When it is finished, do your test renders on the CPU - it takes very little time as the heavy computation was done 25x faster than the CPU would have done it.



    Instead of paying $1500 for a second 6-core Xeon, pay $1500 for a GPU cluster. The results are shown in CPU vs GPU configurations:



    http://www.luxrender.net/luxmark/top/top20/Room/GPU

    http://www.luxrender.net/luxmark/top/top20/Room/CPU



    You can see the single CPU 6-core Mac Pro in the CPU list with a score of 360. This CPU costs $1200 from Apple.

    4 x Radeon HD7970 scores 3205. Those cost $480 each.



    Long story short, if you get a 12-core Mac Pro for rendering, it's not the best use of money by a long way. Same for scientific computation. What's the point in shipping a 12-core Mac Pro if heavy computation is far better suited to the GPU?



    I'd never suggest a 12-core Mac Pro as a rendering box. That's a waste of money. Really, any Mac hardware dedicated to such a function is a complete waste of money. I need to find more details on GPU rendering, but it would have seen better growth if it didn't have limitations somewhere or require complex code revision to make it work. You've probably seen Maxwell Fire (not a big fan of Next Limit). They use the GPU for real-time preview work there, yet it has no real effect on final rendering times. There has to be a reason for that, and obviously that doesn't mean it will remain this way. Apple is still coasting on their current product lines anyway. I don't think they even handle Mac Pro revisions internally. I doubt there is any Mac Pro team. It's most likely just handled at Foxconn or in a sense spun off, but none of the current lines make real sense as successors.





    Quote:
    Originally Posted by Marvin View Post


    Thunderbolt isn't just an IO protocol. You can't run a GPU over a Fibre Channel link.







    Quote:
    Originally Posted by Marvin View Post


    I would say that Renderman itself shouldn't be the core component though, as it needs to be improved constantly; rather, it should be a very specialised ray-casting engine that pre-computes lighting. This is by far the slowest process. It can then bake the data and allow any number of final render engines to use it. It would be like Core Image in a way. It's not a case of having a Core Photoshop, but rather Core filters that process specific effects and can be used by apps like Quartz Composer or Pixelmator without starting from scratch.



    That makes more sense, although I don't understand the calculation methods there well enough to know how that would work. I noticed comments on some of the earlier GPU rendering implementations about it taking a certain amount of time to compile the scene for the GPU. Certain shaders and things were not supported as well. I can't find completely consistent information in terms of the limitations there.





    Quote:
    Originally Posted by Marvin View Post


    Yeah, $2000 per seat in the volumes they ship. Color used to cost $25000. Software is only priced in such a way that it makes a profit after paying the people who developed it. Apple sells 20 million Macs a year and before Pixar was bought by Disney, their earnings were:



    http://www.pixar.com/companyinfo/pre...307-189666.htm



    "In addition to film revenue, software licensing contributed $14.4 million to full year 2005 revenue."



    They could sell Mountain Lion for $31 with Renderman bundled and make more profit than that. Even though some companies have servers with thousands of machines, the license isn't a yearly cost. It would be nice if companies could do that without being anti-competitive. If Adobe convinced Microsoft and Apple to ship the Adobe CS Suite with the OS, they could charge less than $5 on top of the cost of the OS. It's not that much more anti-competitive than Apple bundling iMovie.



    Still, I wouldn't advocate bundling software like that as it's too complex; I was talking about core computation engines that complex apps can link up to, like the Core Image or QuickTime frameworks. After Effects doesn't need to roll its own whole media API for rendering out to a movie like it would on Linux. The same could apply for rendering lighting, although the more complex the computation, the less reusable it generally is.



    I'm aware of this stuff. PowerAnimator used to cost $30k or whatever. Smoke was close to $100k with the dedicated hardware. I get that; it's common practice with other stuff too. Mental Ray ships by default with several packages, although it has a lot of quirks. If standalone Mental Ray were the default, it would most likely cost more and see lower adoption rates. Personally I thought the lawsuit over Windows being bundled was dumb when they also offer Linux options.



    Actually that kind of development could be a good thing, but this is a departure from your previous statement that a mini was ideal. I really don't care what the box looks like. In the case of Thunderbolt, if the available IO is suitable for such a rig, it could be good. To do any kind of box with 4 GPUs you'd need a really beefy power supply and a lot of cooling right there (although according to my email, Boxx just created one that's probably over $10k fully configured). Then of course, in terms of IO, you still require enough lanes to pump out that much Thunderbolt bandwidth, as they subtract from the available PCI lanes.

    I wasn't speaking so much of rendering stages anyway, which are commonly done on a server or in a distributed manner (as in, workstations could be put to use at night for such a task if possible). I was saying that it's typical for functions that are run as close to real time as possible to remain localized, as opposed to split off, which generally works better for longer number-crunching sequences. The only times I've really seen people use their workstation as a primary rendering box is when they get a new one (they'll sometimes put the old one to that task).

    If you mainly deal with stills and stuff for print compositing (something I plan to move away from), you can do that part on almost anything. 6-8k stills beat up the RAM more than anything else. It's just that working in a high-poly scene without hiding a bunch of stuff, or working with simulations, can suck either way (even for stills you can still frame-grab and take snapshots).
  • Reply 184 of 374
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by hmm View Post


    It supposedly supports DisplayPort 1.2, yet certain displays just don't work.



    In general, I don't like that they put display output and PCI over the same cable. I would have preferred USB 3 and Thunderbolt to be combined. No going back now though.



    Quote:
    Originally Posted by hmm View Post


    I'd never suggest a 12-core Mac Pro as a rendering box. That's a waste of money. Really, any Mac hardware dedicated to such a function is a complete waste of money.



    I agree and it's why I think the dual processor models are unnecessary now. The real-time tasks are better served by fast storage and more RAM.



    Quote:
    Originally Posted by hmm View Post


    You've probably seen Maxwell Fire (not a big fan of Next Limit). They use the GPU for real-time preview work there, yet it has no real effect on final rendering times. There has to be a reason for that



    Maxwell Fire is CPU-based:



    http://www.maxwellrender.com/fire/mw2_fire_benefits.php



    "While other interactive render solutions providing GPU based interactive previews force you to buy expensive graphics cards to achieve the desired results, Maxwell Fire is CPU based, and no special hardware is needed.



    While GPU hardware has become more capable of handling some of the calculations a complex render engine requires, they are still not ready to efficiently accommodate all Maxwell Render features."



    Being assured that a big enough user base has the right GPUs and drivers is a problem at the moment, but it shouldn't affect modern Macs, and over time all computers will adopt the standards like they have with OpenGL.
  • Reply 185 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Marvin View Post


    In general, I don't like that they put display output and PCI over the same cable. I would have preferred USB 3 and Thunderbolt to be combined. No going back now though.



    Bleh, it's typical Apple. Consolidate things and claim it has zero disadvantages simply because the majority of their users will never encounter them. Routing it over USB would have made too much sense. I still think the job title "director of Thunderbolt technology" is hilarious (from the Intel videos).



    Quote:
    Originally Posted by Marvin View Post


    I agree and it's why I think the dual processor models are unnecessary now. The real-time tasks are better served by fast storage and more RAM.



    They definitely have a smaller market now. They'll either remain or they won't; partially, I would guess, it'll depend on upcoming software. RAM is commonly misunderstood. 32-bit applications meant that you couldn't take advantage of as much of it. Now silly people can't see a point in anything past 4-8GB of RAM, yet they feel an SSD makes everything so much faster. I wonder why. Fast storage is great too, especially if you can get away without using file compression, which in some applications remains a single-threaded process. Anyway, still waiting on a Mac Pro.



    The things, in terms of quality control and features, that would be required to make me even remotely interested in an iMac would probably push the price up considerably. If they still ignore 10-bit DisplayPort, I may actually switch. I thought I was safe buying this given that it showed signs of development at the time, but rather than continuing, that died with Snow Leopard and hasn't shown up in Lion. Things like that irritate me considerably on expensive hardware.



    Quote:
    Originally Posted by Marvin View Post


    Maxwell Fire is CPU-based:



    http://www.maxwellrender.com/fire/mw2_fire_benefits.php





    "While other interactive render solutions providing GPU based interactive previews force you to buy expensive graphics cards to achieve the desired results, Maxwell Fire is CPU based, and no special hardware is needed.



    While GPU hardware has become more capable of handling some of the calculations a complex render engine requires, they are still not ready to efficiently accommodate all Maxwell Render features."



    Being assured that a big enough user-base has the right GPUs and drivers is a problem at the moment but it shouldn't affect modern Macs and over time, all computers will adopt the standards like they have with OpenGL.



    Doh! You're right... I swore their marketing kool-aid said otherwise. Maxwell Studio always looked cool, but the idea of letting stills bake for 24 hours to clear the noise puts me off. It's not like it's that difficult to tune any of them these days anyway. When Maxwell came out initially it was extremely slow with limited features, but it could do some amazing stuff out of the box, at least from what I saw produced by others.
  • Reply 186 of 374
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by hmm View Post


    Maxwell Studio always looked cool, but the idea of letting stills bake for 24 hours to clear the noise puts me off. It's not like it's that difficult to tune any of them these days anyway. When Maxwell came out initially it was extremely slow with limited features, but it could do some amazing stuff out of the box, at least from what I saw produced by others.



    It was one of the first commercial engines to do unbiased rendering, which uses physical lighting algorithms. It's very slow but accurate. Luxrender has an experimental OpenCL version using the same technique:



    http://www.macupdate.com/app/mac/33632/smallluxgpu



    As they note on their benchmark site, it's not really a case of CPU vs GPU but rather CPU+GPU, which only OpenCL allows. OpenCL lets you use power you already have to almost double the compute power of your machine or more (depending on what GPU you have).
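
    For anyone wondering what "CPU+GPU" looks like in practice, here is a small host-side sketch in plain C against the standard OpenCL 1.1 API (Apple ships it as the OpenCL framework) that lists every device on the first platform. A renderer like SmallLuxGPU essentially creates a command queue on each of these and splits the work between them; the file name and build line are just examples.

    ```c
    /* List all OpenCL devices (CPU and GPU) on the first platform.
     * Build on OS X with, e.g.: cc list_devices.c -framework OpenCL */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id   devices[16];
        cl_uint        num_devices = 0;

        if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
            fprintf(stderr, "no OpenCL platform found\n");
            return 1;
        }

        /* CL_DEVICE_TYPE_ALL returns CPU and GPU devices alike -- the point
         * of OpenCL is that one API can drive both and share one workload. */
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 16,
                           devices, &num_devices) != CL_SUCCESS) {
            fprintf(stderr, "no OpenCL devices found\n");
            return 1;
        }

        for (cl_uint i = 0; i < num_devices; i++) {
            char    name[256];
            cl_uint units = 0;

            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(units), &units, NULL);
            printf("device %u: %s (%u compute units)\n", i, name, units);
        }
        return 0;
    }
    ```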



    Unbiased isn't really needed though - if it takes 24 hours now to get clean output, it will take 10 years to get it under an hour. Feature film effects just use scanline or raytracing:



    http://dl.acm.org/citation.cfm?id=1778774



    Compositing helps with the realism, and I'd say it's better to allow an artist to tweak an inaccurate picture quickly than allow a computer to generate an accurate image slowly.



    It all comes down to results. That's what I think will play the biggest role in deciding the future of the Mac Pro. Can an iMac allow Apple's customers to achieve the results they need? If not, the Mac Pro lives on.
  • Reply 187 of 374
    macronin Posts: 1,174 member
    My initial thought of RenderMan in OS X had to do with the history of it being included in the NeXT OS, from which OS X is derived…



    From a geek perspective, it would be neat if RenderMan were somehow a Core component, and an expensive App Store purchase could buy a full seat (your choice: RFM, Studio or Pro Server)…



    Always wished Apple could have become the new Silicon Graphics and taken over the DCC market…
  • Reply 188 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Marvin View Post


    It was one of the first commercial engines to do unbiased rendering, which uses physical lighting algorithms. It's very slow but accurate. Luxrender has an experimental OpenCL version using the same technique:



    http://www.macupdate.com/app/mac/33632/smallluxgpu



    As they note on their benchmark site, it's not really a case of CPU vs GPU but rather CPU+GPU, which only OpenCL allows. OpenCL lets you use power you already have to almost double the compute power of your machine or more (depending on what GPU you have).



    Unbiased isn't really needed though - if it takes 24 hours now to get clean output, it will take 10 years to get it under an hour. Feature film effects just use scanline or raytracing:



    http://dl.acm.org/citation.cfm?id=1778774



    Compositing helps with the realism, and I'd say it's better to allow an artist to tweak an inaccurate picture quickly than allow a computer to generate an accurate image slowly.



    It all comes down to results. That's what I think will play the biggest role in deciding the future of the Mac Pro. Can an iMac allow Apple's customers to achieve the results they need? If not, the Mac Pro lives on.



    I noted that Maxwell made a lot of band-aid fixes to their workflow so it can spit out multiple versions from a single render, so that getting a bunch of lighting setups for a product shot or something like that (which seems to be a popular area for it) doesn't have to take days or a massive render farm. The workflow still seems weird, and some of the cool features are extremely CPU-intensive. They even advise against using things like the physical sky rather than a skydome because of the amount of time required. The skydome is probably fine though; it can't be any less correct than lighting via spherical HDR imagery.



    The iMac and other machines really aren't designed to cater to such a market, and it's not just a matter of raw power. Serviceability of a couple of basic components, quality control, and durability aren't really there. It doesn't even save you on price if it means you have to buy all-new supporting hardware. In the case of the Z1, HP was aiming at different customers. I'm not comparing pricing here; it's quite expensive. My point is that the iMac was never designed with such a market in mind. Apple really ignores it quite a bit on some things. It's just been little stuff for quite a few years, like hard drives that make your machine unable to sleep (or cause problems when waking), or GPUs that don't support certain features that would be incredibly useful to some people. Their approach is to dangle a shiny object in front of the general population. You don't notice that you can't adjust the iMac's height until later. Display issues tend to pop up after the one-year mark.



    I get the issue that the workload isn't growing exponentially for the majority of their customer base, but it's unlikely that they'll fix some of these issues with the other machines just for that portion of their users.
  • Reply 189 of 374
    Marvin Posts: 15,322 moderator
    Quote:
    Originally Posted by MacRonin View Post


    My initial thought of RenderMan in OS X had to do with the history of it being included in the NeXT OS, from which OS X is derived…



    From a geek perspective, it would be neat if RenderMan were somehow a Core component, and an expensive App Store purchase could buy a full seat (your choice: RFM, Studio or Pro Server)…



    It would definitely be cool to have some Pixar software integrated into the Mac system given the history of the companies. Renderman came to OS X in 2003 and they even had Ed Catmull from Pixar in one of their keynote videos:



    http://www.youtube.com/watch?v=iwsn27J_tlo

    http://www.youtube.com/watch?v=AACvaTrdVHs

    http://www.youtube.com/watch?v=ygpmuTeJXzg

    http://www.youtube.com/watch?v=esBZBvmhD1w

    http://www.youtube.com/watch?v=1Di5yLkrFGU

    http://www.youtube.com/watch?v=YJcShAUrTlc



    The above keynote highlights a dramatic shift in Apple's focus. The full hour-long keynote was dedicated to the G5, with companies like Luxology (Modo) and Wolfram taking part. They benchmarked their computer on stage against Xeon PCs and went into all the technical details of the CPU (some people these days would claim that isn't Apple's way, but it very clearly was).



    There is quite a sad part where the guy from Wolfram talks about working with Steve for 15 years and wonders whether, in another 15 years, they'd be shuffling around introducing a 'nanotube' Mac.



    Jon Rubinstein was there too talking about the G5 system. It's amazing to see how exciting they were able to make the technology and enclosure of the high-end tower but they can't really do this any more. The excitement now comes from getting something so powerful into an ever smaller package.



    Even with a small package, it's hard to impress people. You can see it at the iPad launch where they showed off the flying game. It looked nowhere near as good as Ace Combat for the PS3, and people could tell. The reaction was just 'meh'. All the computing power of that awesome G5 tower in the palm of their hands in just 9 years, and the reaction goes from loud applause to 'meh' because now it's just incremental.



    Quote:
    Originally Posted by hmm


    Serviceability of a couple of basic components, quality control, and durability aren't really there.



    Their approach is to dangle a shiny object in front of the general population. You don't notice that you can't adjust the iMac's height until later. Display issues tend to pop up after the one-year mark.



    I get the issue that the workload isn't growing exponentially for the majority of their customer base, but it's unlikely that they'll fix some of these issues with the other machines just for that portion of their users.



    I agree to an extent. I think the iMac displays should have a 3-year warranty like Dell's, and they should improve the serviceability of the memory components, but when you look at the iPad, how often do you hear about display failures or storage failures? No beachballing, rarely a dead pixel, no failure to boot, no overheating GPU; you can name any of the issues on the desktop and they don't exist on the iPad. That's where computers are going.



    Serviceability introduces risk. Steve Jobs said it once about code - 'the least expensive, most bug-free line of code is the one you didn't have to write'. We simply won't always have to open computers up just like we don't open up microwaves, ovens, fridges, toasters. Their functions are basic but the parts have evolved to fulfil their purpose without service or improvement and this will be the same for computing.



    Computing hasn't reached that pinnacle yet but the idea is to cull the lines that have no growth at the right time because they're going to die anyway.



    Think of the fact that the same power has gone from the G5 tower in 2003 to a phone in 2012. By 2020, the same thing will have happened, possibly even more dramatic as manufacturers explore different materials. Imagine a Mac Mini in 2020 - 16x faster CPU, 32x faster GPU, 1TB SSD, 50-100Gbit Thunderbolt, up to 128GB RAM and cheap as dirt so serviceability doesn't matter. Sure, a tower could house 4x the performance and allow you to remove storage but no one would bother buying it when you can just toss a Mini out when it breaks just outside its warranty like you would a broken microwave.
  • Reply 190 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Marvin View Post


    It would definitely be cool to have some Pixar software integrated into the Mac system given the history of the companies. Renderman came to OS X in 2003 and they even had Ed Catmull from Pixar in one of their keynote videos:



    http://www.youtube.com/watch?v=iwsn27J_tlo

    http://www.youtube.com/watch?v=AACvaTrdVHs

    http://www.youtube.com/watch?v=ygpmuTeJXzg

    http://www.youtube.com/watch?v=esBZBvmhD1w

    http://www.youtube.com/watch?v=1Di5yLkrFGU

    http://www.youtube.com/watch?v=YJcShAUrTlc



    The above keynote highlights a dramatic shift in Apple's focus. The full hour-long keynote was dedicated to the G5, with companies like Luxology (Modo) and Wolfram taking part. They benchmarked their computer on stage against Xeon PCs and went into all the technical details of the CPU (some people these days would claim that isn't Apple's way, but it very clearly was).



    There is quite a sad part where the guy from Wolfram talks about working with Steve for 15 years and wonders whether, in another 15 years, they'd be shuffling around introducing a 'nanotube' Mac.



    Jon Rubinstein was there too talking about the G5 system. It's amazing to see how exciting they were able to make the technology and enclosure of the high-end tower but they can't really do this any more. The excitement now comes from getting something so powerful into an ever smaller package.



    Even with a small package, it's hard to impress people. You can see it at the iPad launch where they showed off the flying game. It looked nowhere near as good as Ace Combat for the PS3, and people could tell. The reaction was just 'meh'. All the computing power of that awesome G5 tower in the palm of their hands in just 9 years, and the reaction goes from loud applause to 'meh' because now it's just incremental.



    Heh... I actually predicted the iPad and iPod would be big. I figured the same with the iPhone, but I still underestimated how big. Looking at iPods, the non-removable battery did suck; I had several die. Gaming has never been a strong benchmark for Apple. They never put a lot of effort into working with the developers who release gaming engines and stuff like that. It was more an afterthought, something that would be nice. I'm surprised they used one in a keynote for reference.



    Regarding what they do with the towers, you should consider that we currently have three aging desktop designs. With the mini they definitely pushed it to the low end; it's below the spec of the available laptops and desktops and is clearly intended as their budget model. My issue with the iMac and serviceability is that it's not that reliable, and they do still make them today. With the Mac Pro the argument is that it's dated.

    If Apple wanted to refactor it in any way, especially given their tendency to make everything as compact as possible, Haswell seems like a more logical time, if Intel lives up to its hype. That would put it somewhere in late 2013 to 2014. If they release an updated unit this year, it probably takes virtually no development since Foxconn can handle it, it gives them a machine to address any pent-up demand, and Ivy Bridge can be used on the same board design like they did with Westmere. You'll see more grumbling, but if they intend to ride the same design for a decade again, that is most likely the logical choice given that none of their other machines are ready to pick up the slack yet.



    Quote:
    Originally Posted by Marvin View Post


    I agree to an extent. I think the iMac displays should have a 3-year warranty like Dell's, and they should improve the serviceability of the memory components, but when you look at the iPad, how often do you hear about display failures or storage failures? No beachballing, rarely a dead pixel, no failure to boot, no overheating GPU; you can name any of the issues on the desktop and they don't exist on the iPad. That's where computers are going.



    Serviceability introduces risk. Steve Jobs said it once about code - 'the least expensive, most bug-free line of code is the one you didn't have to write'. We simply won't always have to open computers up just like we don't open up microwaves, ovens, fridges, toasters. Their functions are basic but the parts have evolved to fulfil their purpose without service or improvement and this will be the same for computing.



    Computing hasn't reached that pinnacle yet but the idea is to cull the lines that have no growth at the right time because they're going to die anyway.



    Think of the fact that the same power has gone from the G5 tower in 2003 to a phone in 2012. By 2020, the same thing will have happened, possibly even more dramatic as manufacturers explore different materials. Imagine a Mac Mini in 2020 - 16x faster CPU, 32x faster GPU, 1TB SSD, 50-100Gbit Thunderbolt, up to 128GB RAM and cheap as dirt so serviceability doesn't matter. Sure, a tower could house 4x the performance and allow you to remove storage but no one would bother buying it when you can just toss a Mini out when it breaks just outside its warranty like you would a broken microwave.



    I'm going to add that server components go down too. Whenever you run something that hard, it puts more stress on the machine. They just don't really build anything designed for that outside of the Mac Pro right now (and again, if I'm running anything that goes for a really long time, I run it at the end of the day and let the machine go to sleep when it finishes).



    Three years or more is standard on displays and typical on every other workstation-class machine. It made more sense to limit warranty periods when Apple was a small company, given that they probably couldn't afford it back then. The problems you mention are generally much bigger complaints with the laptops and mini than on the enormous towers. Also, I don't fully agree on it being like a microwave. Apple likes to maintain higher price points wherever possible, and the top iPads are still quite expensive. The phones appear less expensive because the cost is worked into your mobile contract.



    My issue with minis, laptops, etc. isn't entirely one of performance. Much of it is just the ability to run the machine hard without failure or hiccups being likely. Machines don't necessarily just die. You start to experience problems or they run excessively hot and shut down randomly. It's just a case of wishing to deal with a line that is much less prone to this. When they are designing such machines, they get to make choices. My whole issue with their choices is that aesthetics and simplicity of software seem to trump things like functionality, ergonomics, and reliability. I dislike their typical priorities.



    Anyway, the 'X line will be killed' thing comes up all the time. It came up with the Xserve, where it was true. It came up with the mini. It came up with the Apple TV. Regarding reliability and the iPads, I still hear of backlight bleed issues. They could accept a slightly brighter black level and just use panel blocking to fix this, but that would also add to manufacturing costs due to the additional levels of testing needed in manufacturing and a method of calculation to adjust for drift and backlight degradation. I'm also not sure how hard the NAND is getting hammered in the iPad. Laptop SSDs experience completely different usage patterns. I don't think the iPad even caches to disk, which would explain the lack of the spinning wheel; memory paging was a really common cause of it. iOS seems to be more like earlier OS versions where, if you didn't have enough memory, you had to close something (people could learn from that today).



    Anyway, I'm not sure what direction they'll go. Consumer demand influences quite a lot of stuff. Games drove improvements in GPU technology starting in the '90s. If everyone owns a lighter laptop as their sole or primary computer, that will influence both the requirements of future games and the relative scale of game elements due to smaller average display sizes. That was just an example.
  • Reply 191 of 374
    mccrab Posts: 201 member
    Quote:
    Originally Posted by Lemon Bon Bon. View Post


    The 'desktop' is changing. Just not in the direction you want.



    The Mac Pro is becoming an embarrassment - 500+ days without any refresh/redesign. When does Apple throw in the towel and pull it from the lineup? And more importantly, how does Apple upsell its customers from the high-end iMac - or does it even bother? This is quite an important strategic question - I wouldn't give up on the high end. I like having a couple of good-looking screens on my desk, a good slug of RAM, and some grunt in the CPU/GPU area. And I'd rather give my money to Apple because I like their stuff.
  • Reply 192 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by McCrab View Post


    The Mac Pro is becoming an embarrassment - 500+ days without any refresh/redesign. When does Apple throw in the towel and pull it from the lineup? And more importantly, how does Apple upsell its customers from the high-end iMac - or does it even bother? This is quite an important strategic question - I wouldn't give up on the high end. I like having a couple of good-looking screens on my desk, a good slug of RAM, and some grunt in the CPU/GPU area. And I'd rather give my money to Apple because I like their stuff.



    *sigh* This argument comes up all the time. The lack of updated CPUs and stuff isn't really the issue. We had one weak GPU generation that only made it to about half of the PC side at the workstation level. Those same machines are still using 5500/5600-series CPUs too. Go compare. Right now the main sticking point I can see is that the newest GPUs aren't really shipping in volume.
  • Reply 193 of 374
    Quote:
    Originally Posted by McCrab View Post


    The Mac Pro is becoming an embarrassment - 500+ days without any refresh/redesign. When does Apple throw in the towel and pull it from the lineup? And more importantly, how does Apple upsell its customers from the high-end iMac - or does it even bother? This is quite an important strategic question - I wouldn't give up on the high end. I like having a couple of good-looking screens on my desk, a good slug of RAM, and some grunt in the CPU/GPU area. And I'd rather give my money to Apple because I like their stuff.



    Aye.



    I guess the next Mac Pro update will tell us a great deal. I think a Pro redesign is obviously overdue, but if a company with $100 billion of cash in the bank can't prioritise a redesign, it won't augur well for the Pro's future.



    We're still waiting on ANY update to the Mac.



    The fact that, three months in, we got an iOS product first in the iPad tells us a great deal about where Apple's priorities are focused.



    Lemon Bon Bon.
  • Reply 194 of 374
    Quote:
    Originally Posted by hmm View Post


    *sigh* This argument comes up all the time. The lack of updated CPUs and stuff isn't really the issue. We had one weak GPU generation that only made it to about half of the PC side at the workstation level. Those same machines are still using 5500/5600-series CPUs too. Go compare. Right now the main sticking point I can see is that the newest GPUs aren't really shipping in volume.



    Yes. But high prices, and the fact that none of the other components get updated in the meantime over nearly 2 years, can't help sales of a product that, politely put, is vastly marked up.



    While competitive PC towers have prices that constantly churn downwards.



    Who wants to pay outrageous prices for out of date kit? Who?



    Marv's comparison of the Apple display vs the Dell 27-inch display showed the markup. Almost £300+ of difference for the same thing?



    Lemon Bon Bon.
  • Reply 195 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Lemon Bon Bon. View Post


    Yes. But high prices, and the fact that none of the other components get updated in the meantime over nearly 2 years, can't help sales of a product that, politely put, is vastly marked up.



    While competitive PC towers have prices that constantly churn downwards.



    Who wants to pay outrageous prices for out of date kit? Who?



    Marv's comparison of the Apple display vs the Dell 27-inch display showed the markup. Almost £300+ of difference for the same thing?



    Lemon Bon Bon.



    Well, they aren't identical. They use the same panel, but the way they do their measurements and set the levels, the backlight design, secondary uniformity corrections, dithering method, internal processing, and overall supporting electronics make a significant difference. We're at a point of basically generic hardware with displays, so the difference between high end and low end isn't quite what it was, but it's still there.
  • Reply 196 of 374
    wizard69 Posts: 13,377 member
    Honestly Intel is late with USB 3, Ivy Bridge and Sandy Bridge E and everyone blames Apple. WTF!



    Quote:
    Originally Posted by Lemon Bon Bon. View Post


    Aye.



    I guess the next Mac Pro update will tell us a great deal. I think a Pro redesign is obviously overdue, but if a company with $100 billion of cash in the bank can't prioritise a redesign, it won't augur well for the Pro's future.



    The whole desktop lineup at Apple is an embarrassment. In football terms, they need to punt.



    All that cash does imply an issue with priorities, as they should have no trouble hiring entire staffs just to design each machine. Of course most of that money isn't in the US, so maybe it isn't too easy to get to. Still, it wouldn't hurt them to spend $100 million a year on Mac engineering. Mind you, $100 million ought to get you a state-of-the-art Mac with a whole bunch of custom electronics.

    Quote:

    We're still waiting on ANY update to the Mac.



    True but don't blame Apple.

    Quote:

    The fact that, three months in, we got an iOS product first in the iPad tells us a great deal about where Apple's priorities are focused.



    Lemon Bon Bon.



    Actually the iPad debut tells us nothing. If nothing else, it looks like they fitted it into an empty gap in the calendar. Really, if Intel is screwing the pooch instead of delivering new chips, what can Apple do? Even the latest news on IB is a little screwed up; it isn't even clear if we will have chips suitable for the Air before July.
  • Reply 197 of 374
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by wizard69 View Post


    Honestly Intel is late with USB 3, Ivy Bridge and Sandy Bridge E and everyone blames Apple. WTF!






    I get annoyed too at times, but people can be weird. There are little updates I would like to see, and I'd like to see them do a lot better with display drivers. It really makes no sense to launch an update without both CPUs and GPUs in place, though, especially considering how soon we should see new AMD cards and/or Kepler become available. Also, regarding Ivy Bridge, keep in mind that Intel isn't the only one having trouble keeping on schedule with 22nm fabrication. You actually mentioned that yourself last year on this same topic.
  • Reply 198 of 374
    Marvin Posts: 15,322 moderator
    IBM has announced new Xeon E5 servers:



    http://www.theregister.co.uk/2012/03...server_lineup/



    so an update can come in at any time now. Some of the E5 benchmarks are pretty good:



    http://www.tomshardware.com/reviews/...ew,3149-5.html



    but the majority of them are just 40% higher with the extra 2 cores. That would be an ok update over 1 year but not after nearly 2 years. I think they at least need to get the entry model up to 6-cores.



    Ivy Bridge chips for the iMac and 15" MBP are due late April/May.

    Ivy Bridge chips for the MBA, 13" MBP and Mini are due in June.



    There's a benchmark of Ivy Bridge CPU + kepler GPU that could make its way to the 15"MBP:



    http://forum.notebookreview.com/gami...-yknyong1.html



    I think the MBP will use the higher up CPU though.
  • Reply 199 of 374
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by hmm View Post


    I get annoyed too at times, but people can be weird. There are little updates I would like to see, and I'd like to see them do a lot better with display drivers.



    Apple really doesn't have any excuses anymore. I mean this totally: there is no reason why they can't hire the people they need to get drivers, OpenGL and other features up to snuff. When Linux has better drivers and support, you know something is wrong.

    Quote:

    It really makes no sense to launch an update without both CPUs and GPUs in place, though, especially considering how soon we should see new AMD cards and/or Kepler become available.



    The AMD GPUs are ready to go, so I don't think an update is being held up there. Well, other than the possibility that Apple will integrate the GPU onto the motherboard.

    Quote:

    Also, regarding Ivy Bridge, keep in mind that Intel isn't the only one having trouble keeping on schedule with 22nm fabrication. You actually mentioned that yourself last year on this same topic.



    Yes, I know, and that is why I object to the blame-Apple mentality. Like it or not, Apple can't ship new stuff if the processor isn't there to ship in the first place.



    In any event, Apple's problem with the Pro is that it targets too small a market considering it is Apple's only viable and configurable desktop. I see this as the primary driver for a refactored Pro.
  • Reply 200 of 374
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by Marvin View Post


    IBM has announced new Xeon E5 servers:



    http://www.theregister.co.uk/2012/03...server_lineup/



    so an update can come in at any time now. Some of the E5 benchmarks are pretty good:



    http://www.tomshardware.com/reviews/...ew,3149-5.html



    but the majority of them are just 40% higher with the extra 2 cores. That would be an ok update over 1 year but not after nearly 2 years. I think they at least need to get the entry model up to 6-cores.



    I would want to see production systems from Apple before getting too excited one way or the other. I've seen numbers all over the place, some indicating a 2x improvement in performance.

    Quote:

    Ivy Bridge chips for the iMac and 15" MBP are due late April/May.



    I can't wait and frankly I'm not even in the market. Today my intention is to hold off another year but hey you never know.

    Quote:

    Ivy Bridge chips for the MBA, 13" MBP and Mini are due in June.



    The interesting thing here is that they will likely be competing directly with Trinity from AMD. If that chip lives up to its billing, it would be a better choice for the Air and Mini. However, I was under the impression that the Mini already used 35-watt processors.

    Quote:

    There's a benchmark of Ivy Bridge CPU + kepler GPU that could make its way to the 15"MBP:



    http://forum.notebookreview.com/gami...-yknyong1.html



    I think the MBP will use the higher up CPU though.



    Frankly I hope they stay away from NVidia, mainly because AMD has changed for the better with respect to drivers and open source. Note I said better: their drivers have a way to go, but they are far better than past efforts.