Apple ready and waiting with redesigned iMac line


Comments

  • Reply 361 of 486
    outsider Posts: 6,008 member
    Quote:
    Originally Posted by backtomac View Post


    That's not true, IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD, but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and that it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.



    Oh, are they now? Well, considering they are mostly arrogant douche-nozzles, that doesn't mean much.
  • Reply 362 of 486
    addabox Posts: 12,665 member
    Quote:
    Originally Posted by Outsider View Post


    Oh, are they now? Well, considering they are mostly arrogant douche-nozzles, that doesn't mean much.



    Well, I don't know if I'd go quite so far as to call them mostly arrogant douche-nozzles, but there are quite a few guys over there who are completely wed to whatever language/process/platform they came up on, and of that crew a certain number exhibit the tell-tale signs of thundering arrogant douche-nozzlery, tech division, which has no peer for smug certainty.



    OTOH, the lengthy Ars article gives a good overview of the why and what of GCD, and a lot of the people on the forums dismissing it seem to be doing so by obstinately ignoring what it's doing that's different and useful, and stubbornly focusing on what's similar to existing tech. If you read through the threads, "different and useful" gets pointed out over and over, and the nay-sayers just keep running the same "LOL, did that years ago" line.



    Part of that, of course, is driven by Apple's penchant for presenting their stuff in fairly over-the-top terms, so a lot of people just want to knock down the hype. Doesn't change the fact that GCD is going to be pretty helpful in a lot of instances and that how Apple has gone about implementing it isn't trivial.
  • Reply 363 of 486
    benroethig Posts: 2,782 member
    Quote:
    Originally Posted by backtomac View Post


    That's not true, IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD, but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and that it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.



    I thought the point of GCD was so they wouldn't have to break it up into threads; the OS would do that for them.
  • Reply 364 of 486
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by BenRoethig View Post


    I thought the point of GCD was so they wouldn't have to break it up into threads; the OS would do that for them.



    Ben, I've read the Siracusa review a couple of times now, but since I've no background in programming, I'm not sure I fully understand it or how it differs from the Windows way of thread pooling.
  • Reply 365 of 486
    djrumpy Posts: 1,116 member
    Quote:
    Originally Posted by backtomac View Post


    That's not true, IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD, but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and that it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.



    Someone correct me if I'm wrong here, but the thread pool in Windows is an API, meaning a programmer would have to take advantage of it in their code. GCD eliminates that as a requirement.
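

    For reference, the Windows thread pool being compared against is driven by explicit per-application calls such as QueueUserWorkItem. A minimal sketch of that opt-in model (plain C against the Win32 API, untested, with a made-up do_work routine):

    #include <windows.h>
    #include <stdio.h>

    /* Hypothetical unit of work handed to the Windows thread pool. */
    static DWORD WINAPI do_work(LPVOID context)
    {
        printf("working on item %d\n", (int)(INT_PTR)context);
        return 0;
    }

    int main(void)
    {
        /* The application must explicitly queue each item; the pool itself
           is still scoped to this one process. */
        for (int i = 0; i < 4; i++)
            QueueUserWorkItem(do_work, (PVOID)(INT_PTR)i, WT_EXECUTEDEFAULT);

        Sleep(1000); /* crude wait so the queued items get a chance to run */
        return 0;
    }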
  • Reply 366 of 486
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by BenRoethig View Post


    I thought the point of GCD is so they wouldn't have to break it up into threads, the OS would do that for them.



    The basic idea of this is not that programmers won't need to multithread their programs. They still need to find the processes that could benefit from threads.



    What they don't have to do is define the maximum number of threads, etc. They can define the processes that could use a thread, no matter how many, and the OS takes the next step.



    The OS will know where threads are being used, and how many. So it can decide how many processes to pull into threads, and when they're finished.



    The problem with the old way is that if the programmer spec'd too many threads, the program could hang. If they didn't spec enough, then it would run more slowly than it could.



    Simplistically, this is a thread manager.



    I don't know exactly how Windows does it, but it's not the same thing.
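

    To put that in concrete terms, here's a rough sketch of the pattern (plain C against libdispatch, untested, with a made-up process_chunk function): the code only describes the work; it never says how many threads to create, and the system decides how much of it runs in parallel.

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    /* Hypothetical work item: process one independent chunk of data. */
    static void process_chunk(void *context, size_t index)
    {
        printf("processing chunk %zu\n", index);
        /* ...real work would go here... */
    }

    int main(void)
    {
        /* A system-wide concurrent queue; GCD owns the threads behind it. */
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* Submit 100 independent work items. Nothing here specifies a thread
           count -- the OS decides how many run at once, based on the cores
           available and whatever else is running on the machine. */
        dispatch_apply_f(100, queue, NULL, process_chunk);

        return 0;
    }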
  • Reply 367 of 486
    djrumpy Posts: 1,116 member
    Found a great in-depth look at GCD. A very good read if you have the time.



    http://arstechnica.com/apple/reviews...-x-10-6.ars/12
  • Reply 368 of 486
    palple Posts: 35 member
    Quote:
    Originally Posted by superd View Post


    So there have been some interesting developments as of late. Two Australian forums have taken down a number of posts showing the specs of the soon-to-be-released iMacs.



    http://forums.whirlpool.net.au...eplies.cfm?t=1275768



    User 275907, formerly known as iNerd, has been leaking info, reposted last night, and now the other forum posts that he created have mysteriously disappeared, as have the links in the above forum...



    Summary of pertinent details



    Summary of iNerd's predictions for the iMac:



    26", refined to 25.5" LED backlit display

    Up to 12GB RAM, new CPU, likely to be an Apple first (Core i7 mobile)

    Minor shape changes – slightly bigger than "minor" BUT not totally different

    Up to 2TB HDD

    New mouse design, backlit keyboard

    20" replaced with 21.5

    Blu-ray available



    To be released on October 13th, according to his now-deleted posts on aqua-soft.org



    Can't wait,

    Dave



    That's what I'm dreaming of; I won't buy a new iMac until an LED-backlit display and a Nehalem-based processor are featured.
  • Reply 369 of 486
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by superd View Post


    So there have been some interesting developments as of late. Two Australian forums have taken down a number of posts showing the specs of the soon-to-be-released iMacs.



    http://forums.whirlpool.net.au...eplies.cfm?t=1275768



    User 275907, formerly known as iNerd, has been leaking info, reposted last night, and now the other forum posts that he created have mysteriously disappeared, as have the links in the above forum...



    Summary of pertinent details



    Summary of iNerd's predictions for the iMac:



    26", refined to 25.5" LED backlit display

    Up to 12GB RAM, new CPU, likely to be an Apple first (Core i7 mobile)

    Minor shape changes – slightly bigger than "minor" BUT not totally different

    Up to 2TB HDD

    New mouse design, backlit keyboard

    20" replaced with 21.5

    Blu-ray available



    To be released on October 13th, according to his now-deleted posts on aqua-soft.org



    Can't wait,

    Dave



    Wow, that is a wet dream, man... It could happen. It just sounds too good to be true. Blu-ray and backlit keyboard and new mouse... Important stuff.



    One point, though: 12GB of RAM doesn't make sense whatsoever. Apple would not allow the iMac to encroach on the Mac Pro in this way. This 26" could be excellent for video editing even with 8GB of RAM max.
  • Reply 370 of 486
    nvidia2008 Posts: 9,262 member
    Oh that 25.5" and 21.5" will probably be 16:9
  • Reply 371 of 486
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by nvidia2008 View Post


    Oh that 25.5" and 21.5" will probably be 16:9



    I truly hope we DON'T get a 19:9 screen. It's too wide. These things are meant for a lot more than watching movies and TV shows.
  • Reply 372 of 486
    mjteix Posts: 563 member
    Quote:
    Originally Posted by nvidia2008 View Post


    This 26" could be excellent for video editing...

    Oh that 25.5" and 21.5" will probably be 16:9



    Quote:
    Originally Posted by melgross View Post


    I truly hope we DON'T get a 19:9 screen. It's too wide. These things are meant for a lot more than watching movies and TV shows.



    I believe you meant 16:9, not 19:9



    - IF the 21.5" is 1920x1080, then it's a improvement vs the current 20" (1680x1050)

    - AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)

    - I've seen some 27" displays capable of 2048x1152 (Samsung), that would be a small improvement vs the current 24"



    Most of the time, 16:9 is not an improvement over 16:10. As has been said many times in this forum, with 16:10 on the 24", for example, you can display all of the 16:9 footage and still have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.
  • Reply 373 of 486
    djrumpy Posts: 1,116 member
    Quote:
    Originally Posted by mjteix View Post


    I believe you meant 16:9, not 19:9



    - IF the 21.5" is 1920x1080, then it's a improvement vs the current 20" (1680x1050)

    - AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)

    - I've seen some 27" displays capable of 2048x1152 (Samsung), that would be a small improvement vs the current 24"



    Most of the time, 16:9 is not an improvement over 16:10. As has been said many times in this forum, with 16:10 on the 24", for example, you can display all of the 16:9 footage and still have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.



    If the screen is large enough, a 16:9 aspect ratio isn't that horrible. I recently bought a 1080p LCD display for my Mini. It's a bit wider than I like, but not a noticeable detraction. I also prefer 16:10, as it's a nice compromise between 16:9 and the old 4:3 standard of yore.
  • Reply 374 of 486
    benroethig Posts: 2,782 member
    Quote:
    Originally Posted by nvidia2008 View Post


    Wow, that is a wet dream, man... It could happen. It just sounds too good to be true. Blu-ray and backlit keyboard and new mouse... Important stuff.



    One point, though: 12GB of RAM doesn't make sense whatsoever. Apple would not allow the iMac to encroach on the Mac Pro in this way. This 26" could be excellent for video editing even with 8GB of RAM max.



    The iMac has been encroaching on the Mac Pro for years. It's the machine they want you to buy. They also need to do something with the RAM situation. With only two slots, you're either stuck with the same 4GB that's been the norm for a couple of years now or you have to pay a super premium for 8GB.
  • Reply 375 of 486
    Marvin Posts: 15,320 moderator
    Quote:
    Originally Posted by BenRoethig View Post


    The iMac has been encroaching on the Mac Pro for years. It's the machine they want you to buy. They also need to do something with the RAM situation. With only two slots, you're either stuck with the same 4GB that's been the norm for a couple of years now or you have to pay a super premium for 8GB.



    Yeah, I definitely think the iMac is what Apple want you to buy, because it's always featured on the compare pages. If you go to the Mini compare, it has an iMac; if you go to the Mac Pro compare, it has an iMac. If you go to the iMac compare, it has neither the Mini nor the Mac Pro.



    It's about control and probably profit. They control every single component you get, including the display, when you get an iMac - it's nearly impossible to even replace your own hard drive.



    That's the aspect I don't like about it, but it would be easier to look past if the design were better. No chin, no glossy, and that would remove a significant barrier to liking it. Clarksfield processors would be great, along with a more powerful Nvidia graphics chip or the Radeon 4830.



    If they put e-IPS into the lowest one instead of TN, that would help too but probably wouldn't be cost-effective.



    I really want them to put in a hard drive slot at the bottom, similar to what you get for RAM, so drives can be easily removed/upgraded. 2 x 2.5" slots would be nice, as that would accommodate SSDs in RAID, or a small SSD for boot and a 2.5" drive for storage.



    Apple only go up to 1TB BTO in the iMac anyway, so 2 x 500GB in 2.5" drives is the same storage, and because you can put them in RAID-0 it can be faster, or in RAID-1, more secure.
  • Reply 376 of 486
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by mjteix View Post


    I believe you meant 16:9, not 19:9



    - IF the 21.5" is 1920x1080, then it's a improvement vs the current 20" (1680x1050)

    - AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)

    - I've seen some 27" displays capable of 2048x1152 (Samsung), that would be a small improvement vs the current 24"



    Most of the time, 16:9 is not an improvement over 16:10. As has been said many times in this forum, with 16:10 on the 24", for example, you can display all of the 16:9 footage and still have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.



    I did mean 16:9. It was late, and I didn't catch the typo.



    I don't like the wider displays, because when putting a page on the screen, we'll be forced to have a significantly lower resolution, and less height, physically.



    Right now, I can get two pages, side by side, that are 100% 8.5 x 11 on a 24" monitor, which is why that size is very popular in publishing, and anything that has to do with publishing, because a full 11" x 17" spread is also seen, of course, in full size, at good resolution. It will also display the slightly narrower and longer A4 size equally well.



    But a 24" 16:9 screen will be too short to do that. It's a really bad idea, even though a few PC laptops have already gone that route.



    For video editing, it's also bad, because companies will in some cases have to re-do their GUI, as it depends on space for controls below the full screen playback.



    This is strictly an amateur solution for those doing little other than watching widescreen video.



    I don't mind separate monitors giving people that choice, but not if it's a requirement in an AIO.



    Going to a larger diagonal to make up for the height isn't useful either. You then begin to approach the size of a 30" unit without any of the benefits, though with lower cost. Then all the people already complaining about the "low" ppi count of present monitors will complain more about the even lower ppi of these.



    This would be a lose-lose situation.
  • Reply 377 of 486
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by DJRumpy View Post


    If the screen is large enough, a 16:9 aspect ratio isn't that horrible. I recently bought a 1080p LCD display for my Mini. It's a bit wider than I like, but not a noticeable detraction. I also prefer 16:10 as it's a nice compromise between the 16:9 and old 4:3 standard of yore



    Terrible.



    It will also make people scroll more while on the internet. I can't think of a single benefit to going 16:9.
  • Reply 378 of 486
    amorph Posts: 7,112 member
    Quote:
    Originally Posted by backtomac View Post


    That's not true IIRC. Windows has thread pooling. I don't know if they Windows solution is as good as GCD but it exists. In fact most of the programmers at Ars down play the significance of GCD for this reason. They state that thread pooling has existed with Windows for some time and it helps some but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.



    Surprising, considering that the greatest explication of the importance of GCD has come from one of their own, John Siracusa.



    Thread pooling is almost as old as BSD UNIX. In fact, there are ideas in GCD that go back to batch programming in the 1960s. The simple genius of GCD, as Siracusa observed, is that it puts the thread pool at the operating system level rather than the application level. This frees up applications from having to manage a thread pool, check for the number of available cores, and so on and on, only to have your attempts thwarted by other applications similarly trying to squeeze performance out of the same hardware. There is a clean, simple API for feeding code to GCD, which takes care of everything else and uses its centrality to ensure that the hardware is used efficiently in a way that no application in a multitasking environment possibly can. If that code contains Core Image calls, those can run on CPU or GPU cores, depending on availability. And it's all completely transparent to the application developer.



    Windows has APIs that make it easier to manage thread pools, which is great; but it's still a per-application solution. It can't wring as much performance out of the hardware as GCD can as long as there is even one other non-trivial process running concurrently.



    GCD does not solve threading in the general case. The attempts at comprehensive solutions, like PARLOG, are hardly better than the problem, because threading is tricky and involves a lot of judgment calls and edge cases. GCD pretty much sidesteps the hardest problem--what to thread--and simply ensures that once you know what to thread, it will take care of all the details with an efficiency that is impossible within the application space, and spare the developer some extra work in the process. Developers usually like that.
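

    A rough illustration of that division of labor (plain C against libdispatch, untested; generate_thumbnail and the file names are just made-up stand-ins): the developer decides that thumbnail generation is worth threading and hands the items over; how many threads run, and when, is left to the system.

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    /* Hypothetical: the piece of work the developer decided is worth threading. */
    static void generate_thumbnail(void *path)
    {
        printf("generating thumbnail for %s\n", (const char *)path);
    }

    int main(void)
    {
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        static const char *paths[] = { "a.jpg", "b.jpg", "c.jpg" };
        for (int i = 0; i < 3; i++)
            dispatch_group_async_f(group, queue, (void *)paths[i],
                                   generate_thumbnail);

        /* The "details" -- how many threads, when to spin them up or reap
           them, how to balance against other applications -- are GCD's
           problem, not the application's. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        printf("all thumbnails finished\n");

        dispatch_release(group);
        return 0;
    }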



    But the takeaway point is that GCD is not some mind-blowing new technological shiny thing. At root, it has one small but crucial improvement on everyone else: It moves the thread pool and the management and allocation code into a space where it can consider the entire context it's running in--hardware, applications, everything--when deciding what to allocate where. It's refinement, not revolution, but that doesn't make it any less significant. It accomplishes what it is meant to, which is to offer a simpler API with better performance.



    (Anyone who thinks that technology really embraces revolutionary change is welcome to explain to me why the idioms and syntax of the C programming language, codified in 1973, are still in nearly universal use 36 years later on all platforms despite great leaps forward in linguistics and parsers and compilers and run-time environments, and despite decades of research by many of our brightest minds. In fact, any truly revolutionary ideas will come under a relentless assault, and only those that survive this mob attack will have any chance of taking hold.)
  • Reply 379 of 486
    djrumpy Posts: 1,116 member
    Quote:
    Originally Posted by melgross View Post


    I don't like the wider displays, because when putting a page on the screen, we'll be forced to have a significantly lower resolution, and less height, physically.



    Right now, I can get two pages, side by side, that are 100% 8.5 x 11 on a 24" monitor, which is why that size is very popular in publishing, and anything that has to do with publishing, because a full 11" x 17" spread is also seen, of course, in full size, at good resolution. It will also display the slightly narrower and longer A4 size equally well.



    But a 24" 16:9 screen will be too short to do that. It's a really bad idea, even though a few PC laptops have already gone that route.



    For video editing, it's also bad, because companies will in some cases have to re-do their GUI, as it depends on space for controls below the full screen playback.



    Terrible.



    It will also make people scroll more while on the internet. I can't think of a single benefit to going 16:9.



    It's not that dramatic: 1920x1200 to 1920x1080, assuming they stayed at the current width. A whole 120 pixels in height would be lost. If they did go from 16:10 to 16:9, they would probably also up the resolution, given the rumors of larger displays.



    I swear, the drama queens in here are something else. Given a larger screen size and a higher resolution, you'd still have the same real estate, with more width and possibly more height depending on how much of a bump in resolution they went with. It also isn't necessary to display a video at the exact width or height of the video's output image. They can shrink it by any arbitrary amount to fit various toolbars and widgets; most video editing software already does this to make room for various editing tools. You should also know that a standard 11x17 page @ 93 PPI (the current pixel density on my '08 iMac) is only 1581 x 1023, already below the current 1920x1200 resolution. Your 8 1/2 x 11 pages are being scaled to fit your screen already, and they could just as easily be scaled to fit a widescreen monitor, as can any text, graphics, video or other media you're displaying from the web, locally, from a DVD player, or whatever.
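

    Just to spell out the arithmetic behind those numbers (a throwaway check in C, assuming the ~93 PPI figure quoted above):

    #include <stdio.h>

    int main(void)
    {
        const double ppi = 93.0;  /* quoted pixel density of the '08 iMac */

        /* An 11" x 17" spread rendered at 100% on that panel: */
        printf("17 x 11 inches @ %.0f ppi = %.0f x %.0f px\n",
               ppi, 17 * ppi, 11 * ppi);

        /* Height given up going from 16:10 to 16:9 at the same 1920px width: */
        printf("1920x1200 -> 1920x1080 = %d rows lost\n", 1200 - 1080);
        return 0;
    }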



    I agree that a true wide screen wouldn't be ideal, but I'm not going to start crying that the world will crumble if they go from 16:10 to 16:9.



    At this point it's all only rumor.
  • Reply 380 of 486
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by Amorph View Post


    Surprising, considering that the greatest explication of the importance of GCD has come from one of their own, John Siracusa.



    Thread pooling is almost as old as BSD UNIX. In fact, there are ideas in GCD that go back to batch programming in the 1960s. The simple genius of GCD, as Siracusa observed, is that it puts the thread pool at the operating system level rather than the application level. This frees up applications from having to manage a thread pool, check for the number of available cores, and so on and on, only to have your attempts thwarted by other applications similarly trying to squeeze performance out of the same hardware. There is a clean, simple API for feeding code to GCD, which takes care of everything else and uses its centrality to ensure that the hardware is used efficiently in a way that no application in a multitasking environment possibly can. If that code contains Core Image calls, those can run on CPU or GPU cores, depending on availability. And it's all completely transparent to the application developer.



    Windows has APIs that make it easier to manage thread pools, which is great; but it's still a per-application solution. It can't wring as much performance out of the hardware as GCD can as long as there is even one other non-trivial process running concurrently.



    GCD does not solve threading in the general case. The attempts at comprehensive solutions, like PARLOG, are hardly better than the problem, because threading is tricky and involves a lot of judgment calls and edge cases. GCD pretty much sidesteps the hardest problem--what to thread--and simply ensures that once you know what to thread, it will take care of all the details with an efficiency that is impossible within the application space, and spare the developer some extra work in the process. Developers usually like that.



    But the takeaway point is that GCD is not some mind-blowing new technological shiny thing. At root, it has one small but crucial improvement on everyone else: It moves the thread pool and the management and allocation code into a space where it can consider the entire context it's running in--hardware, applications, everything--when deciding what to allocate where. It's refinement, not revolution, but that doesn't make it any less significant. It accomplishes what it is meant to, which is to offer a simpler API with better performance.



    (Anyone who thinks that technology really embraces revolutionary change is welcome to explain to me why the idioms and syntax of the C programming language, codified in 1973, are still in nearly universal use 36 years later on all platforms despite great leaps forward in linguistics and parsers and compilers and run-time environments, and despite decades of research by many of our brightest minds. In fact, any truly revolutionary ideas will come under a relentless assault, and only those that survive this mob attack will have any chance of taking hold.)



    Much better explanation than mine.