
Apple ready and waiting with redesigned iMac line - Page 10

post #361 of 486
Quote:
Originally Posted by backtomac View Post

That's not true IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.

Oh, are they now? Well considering they are mostly arrogant douche-nozzles, that doesn't mean much.
post #362 of 486
Quote:
Originally Posted by Outsider View Post

Oh, are they now? Well considering they are mostly arrogant douche-nozzles, that doesn't mean much.

Well, I don't know if I'd go quite so far as to call them mostly arrogant douche-nozzles, but there are quite a few guys over there that are completely wed to whatever language/process/platform they came up on, and of that crew there certainly are a certain number who exhibit the tell-tale signs of thundering arrogant douche-nozzlery, tech division, which has no peer for smug certainty.

OTOH, the lengthy Ars article gives a good overview of the why and what of GCD, and a lot of the people on the forums dismissing it seem to be doing so by obstinately ignoring what it's doing that's different and useful, and stubbornly focusing on what's similar to existing tech. If you read through the threads, "different and useful" gets pointed out over and over, and the nay-sayers just keep running the same "LOL, did that years ago" line.

Part of that, of course, is driven by Apple's penchant for presenting their stuff in fairly over-the-top terms, so a lot of people just want to knock down the hype. Doesn't change the fact that GCD is going to be pretty helpful in a lot of instances and that how Apple has gone about implementing it isn't trivial.
They spoke of the sayings and doings of their commander, the grand duke, and told stories of his kindness and irascibility.
Reply
post #363 of 486
Quote:
Originally Posted by backtomac View Post

That's not true IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.

I thought the point of GCD was that they wouldn't have to break it up into threads; the OS would do that for them.
post #364 of 486
Quote:
Originally Posted by BenRoethig View Post

I thought the point of GCD was that they wouldn't have to break it up into threads; the OS would do that for them.

Ben, I've read the Siracusa review a couple of times now, but since I've no background in programming I'm not sure I fully understand it and how it differs from the Windows way of thread pooling.
post #365 of 486
Quote:
Originally Posted by backtomac View Post

That's not true IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.

Someone correct me if I'm wrong here, but the thread pool on Windows is an API, meaning a programmer would have to take advantage of it in their code. GCD eliminates that as a requirement.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #366 of 486
Quote:
Originally Posted by BenRoethig View Post

I thought the point of GCD was that they wouldn't have to break it up into threads; the OS would do that for them.

The basic idea here is not that programmers won't need to multithread their programs. They still need to find the processes that could benefit from threads.

What they don't have to do is define the maximum number of threads, etc. They can define the processes that could use a thread, no matter how many, and the OS takes the next step.

The OS will know where threads are being used, and how many. So it can decide how many processes to pull into threads, and when they're finished.

The problem with the old way is that if the programmer spec'd too many threads, the program could hang. If they didn't spec enough, then it would run more slowly than it could.

Simplistically, this is a thread manager.

I don't know exactly how Windows does it, but it's not the same thing.
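
As a rough illustration of that "thread manager" idea, here is a minimal sketch in C using the public libdispatch calls (the eight print "tasks" are just a stand-in workload; compile with clang's blocks support on OS X):

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    int main(void) {
        /* A global concurrent queue backed by the system-wide thread pool. */
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        /* Submit eight independent units of work. The program never says how
           many threads to use; the OS decides that based on the whole system. */
        for (int i = 0; i < 8; i++) {
            dispatch_group_async(group, queue, ^{
                printf("task %d done\n", i);
            });
        }

        /* Block until every submitted unit has finished. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        dispatch_release(group);
        return 0;
    }

Nothing in there specifies a thread count or checks how many cores are free, which is exactly the bookkeeping melgross describes the OS taking over.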
post #367 of 486
Found a great in-depth look at GCD. A very good read if you have the time.

http://arstechnica.com/apple/reviews...-x-10-6.ars/12
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #368 of 486
Quote:
Originally Posted by superd View Post

So there have been some interesting developments as of late. Two Australian forums have taken down a number of posts showing the specs of the soon-to-be-released iMacs.

http://forums.whirlpool.net.au...eplies.cfm?t=1275768

User 275907, formerly known as iNerd, has been leaking info, reposted last night, and now the other forum posts that he created have mysteriously disappeared, as have the links in the above forum...

Summary of pertinent details

Summary of iNerd's predictions for the iMac:

26", refined to 25.5" LED-backlit display
Up to 12GB RAM, new CPU, likely to be an Apple first (Core i7 mobile)
Minor shape changes – slightly bigger than "minor" BUT not totally different
Up to 2TB HDD
New mouse design, backlit keyboard
20" replaced with 21.5"
Available BluRay

To be released on October 13th, according to his now-deleted posts on aqua-soft.org

Can't wait,
Dave

That's what I'm dreaming of; I won't buy a new iMac until an LED-backlit display and a Nehalem-based processor are featured.
will be what it's going to be
Reply
post #369 of 486
Quote:
Originally Posted by superd View Post

So there have been some interesting developments as of late. Two Australian forums have taken down a number of posts showing the specs of the soon-to-be-released iMacs.

http://forums.whirlpool.net.au...eplies.cfm?t=1275768

User 275907, formerly known as iNerd, has been leaking info, reposted last night, and now the other forum posts that he created have mysteriously disappeared, as have the links in the above forum...

Summary of pertinent details

Summary of iNerd's predictions for the iMac:

26", refined to 25.5" LED-backlit display
Up to 12GB RAM, new CPU, likely to be an Apple first (Core i7 mobile)
Minor shape changes – slightly bigger than "minor" BUT not totally different
Up to 2TB HDD
New mouse design, backlit keyboard
20" replaced with 21.5"
Available BluRay

To be released on October 13th, according to his now-deleted posts on aqua-soft.org

Can't wait,
Dave

Wow that is a wet dream man... It could happen. It just sounds too good to be true. BluRay and backlit keyboard and new mouse... Important stuff.

One point though, 12GB RAM doesn't make sense whatsoever. Apple would not allow the iMac to encroach onto the Mac Pro in this way. This 26" could be excellent for video editing even with 8GB of RAM max.
post #370 of 486
Oh that 25.5" and 21.5" will probably be 16:9
post #371 of 486
Quote:
Originally Posted by nvidia2008 View Post

Oh that 25.5" and 21.5" will probably be 16:9

I truly hope we DON'T get a 19:9 screen. It's too wide. These things are meant for a lot more than watching movies and Tv shows.
post #372 of 486
Quote:
Originally Posted by nvidia2008 View Post

This 26" could be excellent for video editing...
Oh that 25.5" and 21.5" will probably be 16:9

Quote:
Originally Posted by melgross View Post

I truly hope we DON'T get a 19:9 screen. It's too wide. These things are meant for a lot more than watching movies and Tv shows.

I believe you meant 16:9, not 19:9

- IF the 21.5" is 1920x1080, then it's an improvement vs the current 20" (1680x1050)
- AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)
- I've seen some 27" displays capable of 2048x1152 (Samsung); that would be a small improvement vs the current 24"

Most of the time 16:9 is not an improvement vs 16:10. As has been said many times in this forum, with 16:10 in the 24", for example, you can display all of the 16:9 footage and have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.
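
Taking the figures above at face value, the density arithmetic is straightforward: ppi = sqrt(width^2 + height^2) / diagonal. That works out to roughly 99 ppi for the current 20" 1680x1050, about 102 ppi for a 21.5" 1920x1080, about 94 ppi for the current 24" 1920x1200, and only about 86 ppi for a 25.5" 1920x1080 -- which is why a 25/26" panel that stays at 1080 lines reads as a step backward rather than forward.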
post #373 of 486
Quote:
Originally Posted by mjteix View Post

I believe you meant 16:9, not 19:9

- IF the 21.5" is 1920x1080, then it's an improvement vs the current 20" (1680x1050)
- AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)
- I've seen some 27" displays capable of 2048x1152 (Samsung); that would be a small improvement vs the current 24"

Most of the time 16:9 is not an improvement vs 16:10. As has been said many times in this forum, with 16:10 in the 24", for example, you can display all of the 16:9 footage and have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.

If the screen is large enough, a 16:9 aspect ratio isn't that horrible. I recently bought a 1080p LCD display for my Mini. It's a bit wider than I like, but not a noticeable detraction. I also prefer 16:10, as it's a nice compromise between 16:9 and the old 4:3 standard of yore.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #374 of 486
Quote:
Originally Posted by nvidia2008 View Post

Wow that is a wet dream man... It could happen. It just sounds too good to be true. BluRay and backlit keyboard and new mouse... Important stuff.

One point though, 12GB RAM doesn't make sense whatsoever. Apple would not allow the iMac to encroach onto the Mac Pro in this way. This 26" could be excellent for video editing even with 8GB of RAM max.

The iMac has been encroaching on the Mac Pro for years. It's the machine they want you to buy. They also need to do something with the RAM situation. With only two slots, you're either stuck with the same 4GB that's been the norm for a couple years now or you have to pay a super premium for 8GB.
post #375 of 486
Quote:
Originally Posted by BenRoethig View Post

The iMac has been encroaching on the Mac Pro for years. It's the machine they want you to buy. They also need to do something with the RAM situation. With only two slots, you're either stuck with the same 4GB that's been the norm for a couple years now or you have to pay a super premium for 8GB.

Yeah, I definitely think the iMac is what Apple want you to buy because it's always featured on the compare pages. If you go to the Mini compare, it has an iMac, if you go to the Mac Pro compare, it has an iMac. If you go to the iMac compare, it has neither the Mini nor Mac Pro.

It's about control and probably profit. They control every single component you get including the display when you get an iMac - it's near impossible to even replace your own hard drive.

That aspect I don't like about it but it would be easier to look past if the design was better. No chin, no glossy and that would remove a significant barrier towards liking it. Clarksfield processors would be great and a more powerful NVidia graphics chip or the Radeon 4830.

If they put e-IPS into the lowest one instead of TN, that would help too but probably wouldn't be cost-effective.

I really want them to put in a hard drive slot at the bottom similar to what you get for Ram so they can be easily removed/upgraded. 2 x 2.5" slots would be nice as it will accommodate SSD drives in RAID or a small SSD for boot and a 2.5" drive for storage.

Apple only go up to 1TB BTO in the iMac anyway so 2 x 500GB 2.5" is the same storage and because you can put them in RAID-0, it can be faster or in RAID-1 more secure.
post #376 of 486
Quote:
Originally Posted by mjteix View Post

I believe you meant 16:9, not 19:9

- IF the 21.5" is 1920x1080, then it's an improvement vs the current 20" (1680x1050)
- AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)
- I've seen some 27" displays capable of 2048x1152 (Samsung); that would be a small improvement vs the current 24"

Most of the time 16:9 is not an improvement vs 16:10. As has been said many times in this forum, with 16:10 in the 24", for example, you can display all of the 16:9 footage and have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.

I did mean 16:9. It was late, and I didn't catch the typo.

I don't like the wider displays, because when putting a page on the screen, we'll be forced to have a significantly lower resolution, and less height, physically.

Right now, I can get two pages, side by side, that are 100% 8.5 x 11 on a 24" monitor, which is why that size is very popular in publishing, and anything that has to do with publishing, because a full 11" x 17" spread is also seen, of course, in full size, at good resolution. It will also display the slightly narrower and longer A4 size equally well.

But a 24" 16:9 screen will be too short to do that. It's a really bad idea, even though a few PC laptops have already gone that route.

For video editing, it's also bad, because companies will in some cases have to re-do their GUI, as it depends on space for controls below the full screen playback.

This is strictly an amateur solution for those doing little other than watching widescreen video.

I don't mind separate monitors giving people that choice, but not if it's a requirement in an AIO.

Going to a larger diag to make up for the height isn't useful either. You then begin to approach the size of a 30" unit without any of the benefits, though with lower cost. Then all the people already complaining about the "low" ppi count of present monitors will complain more with the even lower ppi of these.

This would be a lose-lose situation.
post #377 of 486
Quote:
Originally Posted by DJRumpy View Post

If the screen is large enough, a 16:9 aspect ratio isn't that horrible. I recently bought a 1080p LCD display for my Mini. It's a bit wider than I like, but not a noticeable detraction. I also prefer 16:10, as it's a nice compromise between 16:9 and the old 4:3 standard of yore.

Terrible.

It will also make people scroll more while on the internet. I can't think of a single benefit to going 16:9.
post #378 of 486
Quote:
Originally Posted by backtomac View Post

That's not true IIRC. Windows has thread pooling. I don't know if the Windows solution is as good as GCD but it exists. In fact, most of the programmers at Ars downplay the significance of GCD for this reason. They state that thread pooling has existed on Windows for some time and it helps some, but that the major challenge is figuring out how and when to break up your code into threads. They're pretty unimpressed with GCD for the most part.

Surprising, considering that the greatest explication of the importance of GCD has come from one of their own, John Siracusa.

Thread pooling is almost as old as BSD UNIX. In fact, there are ideas in GCD that go back to batch programming in the 1960s. The simple genius of GCD, as Siracusa observed, is that it puts the thread pool at the operating system level rather than the application level. This frees up applications from having to manage a thread pool, check for the number of available cores, and so on and on, only to have your attempts thwarted by other applications similarly trying to squeeze performance out of the same hardware. There is a clean, simple API for feeding code to GCD, which takes care of everything else and uses its centrality to ensure that the hardware is used efficiently in a way that no application in a multitasking environment possibly can. If that code contains Core Image calls, those can run on CPU or GPU cores, depending on availability. And it's all completely transparent to the application developer.

Windows has APIs that make it easier to manage thread pools, which is great; but it's still a per-application solution. It can't wring as much performance out of the hardware as GCD can as long as there is even one other non-trivial process running concurrently.

GCD does not solve threading in the general case. The attempts at comprehensive solutions, like PARLOG, are hardly better than the problem, because threading is tricky and involves a lot of judgment calls and edge cases. GCD pretty much sidesteps the hardest problem--what to thread--and simply ensures that once you know what to thread, it will take care of all the details with an efficiency that is impossible within the application space, and spare the developer some extra work in the process. Developers usually like that.

But the takeaway point is that GCD is not some mind-blowing new technological shiny thing. At root, it has one small but crucial improvement on everyone else: It moves the thread pool and the management and allocation code into a space where it can consider the entire context it's running in--hardware, applications, everything--when deciding what to allocate where. It's refinement, not revolution, but that doesn't make it any less significant. It accomplishes what it is meant to, which is to offer a simpler API that delivers better performance.

(Anyone who thinks that technology really embraces revolutionary change is welcome to explain to me why the idioms and syntax of the C programming language, codified in 1973, are still in nearly universal use 36 years later on all platforms despite great leaps forward in linguistics and parsers and compilers and run-time environments, and despite decades of research by many of our brightest minds. In fact, any truly revolutionary ideas will come under a relentless assault, and only those that survive this mob attack will have any chance of taking hold.)
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
Reply
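
To make the "clean, simple API" point concrete, here is a hedged sketch of the data-parallel case using GCD's dispatch_apply; process_row and process_image are made-up names for an arbitrary per-row workload, while the dispatch calls themselves are the real API:

    #include <dispatch/dispatch.h>
    #include <stddef.h>

    /* Hypothetical per-row workload; stands in for any independent unit of work. */
    static void process_row(float *pixels, size_t row, size_t width) {
        for (size_t x = 0; x < width; x++)
            pixels[row * width + x] *= 0.5f;
    }

    static void process_image(float *pixels, size_t rows, size_t width) {
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* dispatch_apply runs the block once per index, spreads the iterations
           across however many threads the system chooses, and returns only
           when every iteration has finished. */
        dispatch_apply(rows, queue, ^(size_t row) {
            process_row(pixels, row, width);
        });
    }

    int main(void) {
        enum { ROWS = 4, WIDTH = 8 };
        float pixels[ROWS * WIDTH];
        for (size_t i = 0; i < ROWS * WIDTH; i++)
            pixels[i] = 1.0f;
        process_image(pixels, ROWS, WIDTH);
        return 0;
    }

The application never asks how many cores are idle or what other programs are doing; that accounting is exactly what GCD centralizes at the OS level.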
post #379 of 486
Quote:
Originally Posted by melgross View Post

I don't like the wider displays, because when putting a page on the screen, we'll be forced to have a significantly lower resolution, and less height, physically.

Right now, I can get two pages, side by side, that are 100% 8.5 x 11 on a 24" monitor, which is why that size is very popular in publishing, and anything that has to do with publishing, because a full 11" x 17" spread is also seen, of course, in full size, at good resolution. It will also display the slightly narrower and longer A4 size equally well.

But a 24" 16:9 screen will be too short to do that. It's a really bad idea, even though a few PC laptops have already gone that route.

For video editing, it's also bad, because companies will in some cases have to re-do their GUI, as it depends on space for controls below the full screen playback.

Terrible.

It will also make people scroll more while on the internet. I can't think of a single benefit to going 16:9.

It's not that dramatic: 1920 x 1200 to 1920 x 1080, assuming they stayed at the current width. A whole 120 pixels in height would be lost. If they did go from 16:10 to 16:9 they would probably also up the resolution, given the rumors of larger displays.

I swear the drama queens in here are something else. Given a larger screen size and a higher resolution you'd still have the same real estate with more width and possibly more height, depending on how much of a bump in resolution they went with. It also isn't necessary to display a video in the exact width or height of the video's output image. They can shrink it by any arbitrary amount to fit various toolbars and widgets. Most video editing software already does this to make room for various editing tools. You should also know that a standard 11x17 page @ 93 PPI (the current pixel density on my '08 iMac) is only 1581 x 1023 and already below the current 1920x1200 resolution. Your 8 1/2 x 11 pages are being scaled to fit your screen already, and they could easily be scaled to fit your screen on a widescreen monitor, as can any text, graphics, video or any other media you're displaying from the web, locally, playing from a DVD player, or whatever.

I agree that a true wide screen wouldn't be ideal, but I'm not going to start crying that the world will crumble if they go from 16:10 to 16:9.

At this point it's all only rumor.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
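
As a sanity check on the numbers in that post: at the 24" iMac's roughly 94 ppi, an 11x17 spread needs about 1600 x 1035 pixels, comfortably inside 1920x1200, and the 120 rows that a move to 1920x1080 at the same physical width would cost correspond to roughly 1.3" of height. So the figures being traded back and forth here are all in the right ballpark; the disagreement is really about whether that 1.3" matters.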
post #380 of 486
Quote:
Originally Posted by Amorph View Post

Surprising, considering that the greatest explication of the importance of GCD has come from one of their own, John Siracusa.

Thread pooling is almost as old as BSD UNIX. In fact, there are ideas in GCD that go back to batch programming in the 1960s. The simple genius of GCD, as Siracusa observed, is that it puts the thread pool at the operating system level rather than the application level. This frees up applications from having to manage a thread pool, check for the number of available cores, and so on and on, only to have your attempts thwarted by other applications similarly trying to squeeze performance out of the same hardware. There is a clean, simple API for feeding code to GCD, which takes care of everything else and uses its centrality to ensure that the hardware is used efficiently in a way that no application in a multitasking environment possibly can. If that code contains Core Image calls, those can run on CPU or GPU cores, depending on availability. And it's all completely transparent to the application developer.

Windows has APIs that make it easier to manage thread pools, which is great; but it's still a per-application solution. It can't wring as much performance out of the hardware as GCD can as long as there is even one other non-trivial process running concurrently.

GCD does not solve threading in the general case. The attempts at comprehensive solutions, like PARLOG, are hardly better than the problem, because threading is tricky and involves a lot of judgment calls and edge cases. GCD pretty much sidesteps the hardest problem--what to thread--and simply ensures that once you know what to thread, it will take care of all the details with an efficiency that is impossible within the application space, and spare the developer some extra work in the process. Developers usually like that.

But the takeaway point is that GCD is not some mind-blowing new technological shiny thing. At root, it has one small but crucial improvement on everyone else: It moves the thread pool and the management and allocation code into a space where it can consider the entire context it's running in--hardware, applications, everything--when deciding what to allocate where. It's refinement, not revolution, but that doesn't make it any less significant. It accomplishes what it is meant to, which is to offer a simpler API that delivers better performance.

(Anyone who thinks that technology really embraces revolutionary change is welcome to explain to me why the idioms and syntax of the C programming language, codified in 1973, are still in nearly universal use 36 years later on all platforms despite great leaps forward in linguistics and parsers and compilers and run-time environments, and despite decades of research by many of our brightest minds. In fact, any truly revolutionary ideas will come under a relentless assault, and only those that survive this mob attack will have any chance of taking hold.)

Much better explanation than mine.
post #381 of 486
Quote:
Originally Posted by DJRumpy View Post

It's not that dramatic: 1920 x 1200 to 1920 x 1080, assuming they stayed at the current width. A whole 120 pixels in height would be lost. If they did go from 16:10 to 16:9 they would probably also up the resolution, given the rumors of larger displays.

I don't know about the "probably". The idea behind these 16:9 displays, from what I've seen so far, is that they mimic the 1920 x 1080 of the format. It's a going backward sort of thing.

After all, most movies aren't 16:9 at all, but more like 20:9. So how far are we going to go with this?

120 pixels is a lot. It can easily be ten lines of text, or more, at smaller sizes.

It's a very big difference when looking at an entire page. It's often the difference between an accurate representation of the page, and a garbled mess.

Quote:
I swear the drama queens in here are something else. Given a larger screen size and a higher resolution you'd still have the same real estate with more width and possibly more height, depending on how much of a bump in resolution they went with. It also isn't necessary to display a video in the exact width or height of the video's output image. They can shrink it by any arbitrary amount to fit various toolbars and widgets. Most video editing software already does this to make room for various editing tools. You should also know that a standard 11x17 page @ 93 PPI (the current pixel density on my '08 iMac) is only 1581 x 1023 and already below the current 1920x1200 resolution. Your 8 1/2 x 11 pages are being scaled to fit your screen already, and they could easily be scaled to fit your screen on a widescreen monitor, as can any text, graphics, video or any other media you're displaying from the web, locally, playing from a DVD player, or whatever.

I agree that a true wide screen wouldn't be ideal, but I'm not going to start crying that the world will crumble if they go from 16:10 to 16:9.

At this point it's all only rumor.

You're making assumptions you don't know will be true. Cost is still a factor, last I looked. Larger monitors with more pixels will still cost more. That's what will be required.

I'm not comparing a 24" with a smaller model, because that's not really in question. But if it were, then that smaller model would have the same problem, except it might possibly be worse because it already is less than ideal.

Scales to fit means more than one thing. If you read my post completely, then you know that I'm saying that on my screen an 8.5" x 11" page is 8.5" x 11" ON MY MONITOR. The scaling is, as I mentioned, 100%. So when I compare the page to the monitor page, it's exactly the same.

If you've done publishing or graphics, you know that seeing different sized pages gives different impact to the various elements on that page, which is why 100% scaling is considered to be ideal.
post #382 of 486
Quote:
Originally Posted by melgross View Post

I don't know about the "probably". The idea behind these 16:9 displays, from what I've seen so far, is that they mimic the 1920 x 1080 of the format. It's a going backward sort of thing.

After all, most movies aren't 16:9 at all, but more like 20:9. So how far are we going to go with this?

120 pixels is a lot. It can easily be ten lines of text, or more, at smaller sizes.

It's a very big difference when looking at an entire page. It's often the difference between an accurate representation of the page, and a garbled mess.



You're making assumptions you don't know will be true. Cost is still a factor, last I looked. Larger monitors with more pixels will still cost more. That's what will be required.

I'm not comparing a 24" with a smaller model, because that's not really in question. But if it were, then that smaller model would have the same problem, except it might possibly be worse because it already is less than ideal.

Scales to fit means more than one thing. If you read my post completely, then you know that I'm saying that on my screen an 8.5" x 11" page is 8.5" x 11" ON MY MONITOR. The scaling is, as I mentioned, 100%. So when I compare the page to the monitor page, it's exactly the same.

If you've done publishing or graphics, you know that seeing different sized pages gives different impact to the various elements on that page, which is why 100% scaling is considered to be ideal.

Actually it's an inch and a half. About 3-5 lines of text at typical font sizes. Negligible.

I make no assumptions. You're implying that just because it's larger it will cost more. PC components and electronics components in general become cheaper with wider adoption. It's happening as we type. You are the one making assumptions. LCD technology gets cheaper every year. It's also reasonable to expect that a 26" display at a higher resolution will cost the same as a 24" display used to, which may be why they are looking at larger displays. Who knows? We weren't even discussing cost.

If you were in publishing you would know that a piece of 'paper' as displayed on your monitor has no bearing on how it would look on real paper. It is all about pixels per inch, and the current iMac monitor has about a 93 PPI rating, meaning your 8.5x11 'paper' @ 93 PPI is actually being upscaled if you're using the 24" display and it's filling your screen.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #383 of 486
Quote:
Originally Posted by Amorph View Post

Surprising, considering that the greatest explication of the importance of GCD has come from one of their own, John Siracusa.

Thread pooling is almost as old as BSD UNIX. In fact, there are ideas in GCD that go back to batch programming in the 1960s. The simple genius of GCD, as Siracusa observed, is that it puts the thread pool at the operating system level rather than the application level. This frees up applications from having to manage a thread pool, check for the number of available cores, and so on and on, only to have your attempts thwarted by other applications similarly trying to squeeze performance out of the same hardware. There is a clean, simple API for feeding code to GCD, which takes care of everything else and uses its centrality to ensure that the hardware is used efficiently in a way that no application in a multitasking environment possibly can. If that code contains Core Image calls, those can run on CPU or GPU cores, depending on availability. And it's all completely transparent to the application developer.

Windows has APIs that make it easier to manage thread pools, which is great; but it's still a per-application solution. It can't wring as much performance out of the hardware as GCD can as long as there is even one other non-trivial process running concurrently.

GCD does not solve threading in the general case. The attempts at comprehensive solutions, like PARLOG, are hardly better than the problem, because threading is tricky and involves a lot of judgment calls and edge cases. GCD pretty much sidesteps the hardest problem--what to thread--and simply ensures that once you know what to thread, it will take care of all the details with an efficiency that is impossible within the application space, and spare the developer some extra work in the process. Developers usually like that.

But the takeaway point is that GCD is not some mind-blowing new technological shiny thing. At root, it has one small but crucial improvement on everyone else: It moves the thread pool and the management and allocation code into a space where it can consider the entire context it's running in--hardware, applications, everything--when deciding what to allocate where. It's refinement, not revolution, but that doesn't make it any less significant. It accomplishes what it is meant to, which is to offer a simpler API that delivers better performance.

(Anyone who thinks that technology really embraces revolutionary change is welcome to explain to me why the idioms and syntax of the C programming language, codified in 1973, are still in nearly universal use 36 years later on all platforms despite great leaps forward in linguistics and parsers and compilers and run-time environments, and despite decades of research by many of our brightest minds. In fact, any truly revolutionary ideas will come under a relentless assault, and only those that survive this mob attack will have any chance of taking hold.)

Nice post. Even though we may have different opinions on this topic I can understand your points clearly.

I'm not a programmer so I can't speak as to the significance of GCD. It sounds like a programming breakthrough to people like me who have a superficial understanding of programming.

As you probably know Apple have open sourced GCD. Here's the thread at Ars on the subject. While the general tone of the discussion doesn't 'slam' GCD as a useless tool, it isn't considered a breakthrough either.
post #384 of 486
Quote:
Originally Posted by backtomac View Post

Nice post. Even though we may have different opinions on this topic I can understand your points clearly.

I'm not a programmer so I can't speak as to the significance of GCD. It sounds like a programming breakthrough to people like me who have a superficial understanding of programming.

As you probably know Apple have open sourced GCD. here's the thread at Ars on the subject. While the general tone of the discussion doesn't 'slam' GCD as a useless tool, it isn't considered a breakthrough either.

Thanks for posting this. I was curious about this article after someone (you?) mentioned it above.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #385 of 486
Quote:
Originally Posted by DJRumpy View Post

Actually it's an inch and a half. About 3-5 lines of text at typical font sizes. Negligible.

I make no assumptions. You're implying that just because it's larger it will cost more. PC components and electronics components in general become cheaper with wider adoption. It's happening as we type. You are the one making assumptions. LCD technology gets cheaper every year. It's also reasonable to expect that a 26" display at a higher resolution will cost the same as a 24" display used to, which may be why they are looking at larger displays. Who knows? We weren't even discussing cost.

If you were in publishing you would know that a piece of 'paper' as displayed on your monitor has no bearing on how it would look on real paper. It is all about pixels per inch, and the current iMac monitor has about a 93 PPI rating, meaning your 8.5x11 'paper' @ 93 PPI is actually being upscaled if you're using the 24" display and it's filling your screen.

No, it's about ten lines, often more, as text is often ten points, not twelve as you're imagining. There is often smaller text as well, sometimes down to eight point, or even six in some Ads.

Cost is relative, as bigger monitors get cheaper, so do smaller models.

What we used to think of as a reasonable cost for a 1280 x 1024 31" monitor, $3,000, 20 years ago, is now considered, even with inflation taken into account, to be the price for a high-end monitor. As expectations change, so does the willingness to spend. When Apple's first HQ LCD 23" monitor came out, its $4,000 was considered to be low for such a thing, but now people are complaining about a 30" for less than half that.

Big monitors will always cost more, and for most people, and unfortunately today, even for businesses that should know better, cost is a primary concern. So if going to a larger monitor with more horizontal resolution in order to get a proper vertical size is required, then that will cost more, and people won't be happy about it.

I was in publishing and photography for almost 40 years, so yes, I know what it looks like. Since you've shown that you don't know that the industry has been moving to "soft proofing" for the past ten years, I'm not so certain how much you know.

More often, what you see on your, professional, properly calibrated monitor, is what you go to press with. The approved image on yours is sent to the printer to view on his, and you go to press. More rarely are "proofs" being used. 3M, Kodak, Agfa and others are discontinuing their proofing systems, and with digital printing systems, and even with ink presses, press proofs are becoming a thing of the past.

When I print at home, I've got a 5,000K Graphlite viewing box, just as I used to use at my lab. My Eizo monitor is calibrated with my i1 X-Rite equipment, and I use calibrated profiles for my Canon IPF5100 printer.

What I see on my monitor VERY closely matches my print.

I'm well aware that 90 to 100 ppi monitors are not at the rez of a sheet of printed paper. The fact that you would think to say that is puzzling. It doesn't matter. For proofing, it's considered to be fine, and it is.
post #386 of 486
Quote:
Originally Posted by backtomac View Post

Nice post. Even though we may have different opinions on this topic I can understand your points clearly.

I'm not a programmer so I can't speak as to the significance of GCD. It sounds like a programming breakthrough to people like me who have a superficial understanding of programming.

As you probably know Apple have open sourced GCD. Here's the thread at Ars on the subject. While the general tone of the discussion doesn't 'slam' GCD as a useless tool, it isn't considered a breakthrough either.

I read that thread, and the people complaining about it obviously don't understand it.
post #387 of 486
Quote:
Originally Posted by melgross View Post

No, it's about ten lines, often more, as text is often ten points, not twelve as you're imagining. There is often smaller text as well, sometimes down to eight point, or even six in some Ads.

Cost is relative, as bigger monitors get cheaper, so do smaller models.

What we used to think of as a reasonable cost for a 1280 x 1024 31" monitor, $3,000, 20 years ago, is now considered, even with inflation taken into account, to be the price for a high-end monitor. As expectations change, so does the willingness to spend. When Apple's first HQ LCD 23" monitor came out, its $4,000 was considered to be low for such a thing, but now people are complaining about a 30" for less than half that.

Big monitors will always cost more, and for most people, and unfortunately today, even for businesses that should know better, cost is a primary concern. So if going to a larger monitor with more horizontal resolution in order to get a proper vertical size is required, then that will cost more, and people won't be happy about it.

I was in publishing and photography for almost 40 years, so yes, I know what it looks like. Since you've shown that you don't know that the industry has been moving to "soft proofing" for the past ten years, I'm not so certain how much you know.

More often, what you see on your, professional, properly calibrated monitor, is what you go to press with. The approved image on yours is sent to the printer to view on his, and you go to press. More rarely are "proofs" being used. 3M, Kodak, Agfa and others are discontinuing their proofing systems, and with digital printing systems, and even with ink presses, press proofs are becoming a thing of the past.

When I print at home, I've got a 5,000K Graphlite viewing box, just as I used to use at my lab. My Eizo monitor is calibrated with my i1 X-Rite equipment, and I use calibrated profiles for my Canon IPF5100 printer.

What I see on my monitor VERY closely matches my print.

I'm well aware that 90 to 100 ppi monitors are not at the rez of a sheet of printed paper. The fact that you would think to say that is puzzling. It doesn't matter. For proofing, it's considered to be fine, and it is.

It is relevant because the complaint was that you simply couldn't see two 'sheets' of 8.5x11 documents side by side in a 16:9 aspect at a 1920x1080 resolution, which is not true. Scaled to 100% at 72 DPI you can actually stack three of them side by side, plus some, even at 16:10. Oh, and the default font size on my iMac 24 in Firefox was 16. I believe it's 14 or 16 on Safari as well.

This whole argument about cost is irrelevant. You seem to be the only one working that angle. People are fine with a larger option for screen size like 26" or 30". Hell, we even agree that 16:10 would be better than 16:9, but stating they will lose too much real estate to properly display two docs side by side is silly.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
post #388 of 486
Quote:
Originally Posted by DJRumpy View Post

It is relevant because the complaint was that you simply couldn't see two 'sheets' of 8.5x11 documents side by side in a 16:9 aspect at a 1920x1080 resolution, which is not true. Scaled to 100% at 72 DPI you can actually stack three of them side by side, plus some, even at 16:10.

I have no idea what you're talking about, and I'm not sure you do either.

Do you know what 100% scale means?

It doesn't mean what you seem to think it means. It doesn't mean that the computer thinks it's at 100%.

It means that an 8.5" x 11" sheet is exactly 8.5" x 11" on the monitor.

A 16:10 monitor has a screen that's about 12.625" high. After you take out the room for the menu bar at the top of the screen, the program's menu bar and the border, you have a bit more than 11" left for the display of the sheet. Only on a 24" monitor is 100% in the program also 100% of the actual sheet. For smaller monitors, 100% is smaller than the actual sheet.

This is what matters.
post #389 of 486
Quote:
Originally Posted by melgross View Post

I have no idea what you're talking about, and I'm not sure you do either.

Do you know what 100% scale means?

It doesn't mean what you seem to think it means. It doesn't mean that the computer thinks it's at 100%.

It means that an 8.5" x 11" sheet is exactly 8.5" x 11" on the monitor.

A 16:10 monitor has a screen that's about 12.625" high. After you take out the room for the menu bar at the top of the screen, the program's menu bar and the border, you have a bit more than 11" left for the display of the sheet. Only on a 24" monitor is 100% in the program also 100% of the actual sheet. For smaller monitors, 100% is smaller than the actual sheet.

This is what matters.

Aspect ratio has NOTHING to do with the height of a monitor in inches. It only describes the width to height ratio. Nothing more.

I actually have a 16:9 24" display on my Mini. It's 12 inches high on the actual display area. More than enough to preview an 8.5x11 document with menu bar.
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
Reply
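
For what it's worth, the geometry supports both figures: a panel's height is diagonal x h / sqrt(w^2 + h^2), so a 24" 16:10 panel is about 240 / sqrt(356), roughly 12.7" tall, while a 24" 16:9 panel is about 216 / sqrt(337), roughly 11.8" tall -- close to the 12.625" and 12" numbers quoted in these two posts, and about an inch of vertical glass given up at the same diagonal.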
post #390 of 486
Quote:
Originally Posted by Marvin View Post

Yeah, I definitely think the iMac is what Apple want you to buy because it's always featured on the compare pages. If you go to the Mini compare, it has an iMac, if you go to the Mac Pro compare, it has an iMac. If you go to the iMac compare, it has neither the Mini nor Mac Pro.

It's about control and probably profit. They control every single component you get including the display when you get an iMac - it's near impossible to even replace your own hard drive.

That aspect I don't like about it but it would be easier to look past if the design was better. No chin, no glossy and that would remove a significant barrier towards liking it. Clarksfield processors would be great and a more powerful NVidia graphics chip or the Radeon 4830.

If they put e-IPS into the lowest one instead of TN, that would help too but probably wouldn't be cost-effective.

I really want them to put in a hard drive slot at the bottom similar to what you get for Ram so they can be easily removed/upgraded. 2 x 2.5" slots would be nice as it will accommodate SSD drives in RAID or a small SSD for boot and a 2.5" drive for storage.

Apple only go up to 1TB BTO in the iMac anyway so 2 x 500GB 2.5" is the same storage and because you can put them in RAID-0, it can be faster or in RAID-1 more secure.

I would agree as long as they have enough space for three-platter 12.5mm drives. Sooner or later the 2.5" form factor will be the norm as SSDs become cheaper and more common.
post #391 of 486
16:9 displays are becoming more and more prevalent in notebooks and computer monitors because of cost. LCD panel makers want to standardize on 16:9. This will drive down the cost of fabrication. I would be shocked if the new iMac, new MacBook, and I can only presume new MacBook Pros, don't switch to this ratio with the very soon-to-be-announced redesign.
"You think I'm an arrogant [expletive] who thinks he's above the law, and I think you're a slime bucket who gets most of his facts wrong." - Steve Jobs
Reply
post #392 of 486
Quote:
Originally Posted by DHagan4755 View Post

16:9 displays are becoming more and more prevalent in notebooks and computer monitors because of cost. LCD panel makers want to standardize on 16:9. This will drive down the cost of fabrication. I would be shocked if the new iMac, new MacBook, and I can only presume new MacBook Pros, don't switch to this ratio with the very soon-to-be-announced redesign.

I would be shocked if they do it this soon (Apple likes 16:10 displays and they're usually slow to adopt new display tech), but sooner or later 16:10 displays are going to be pretty hard to find.
post #393 of 486
The rumors of a thinner design, new keyboard, mouse, etc. work in favor of them also incorporating new displays.
"You think I'm an arrogant [expletive] who thinks he's above the law, and I think you're a slime bucket who gets most of his facts wrong." - Steve Jobs
Reply
post #394 of 486
Quote:
Originally Posted by mjteix View Post

I believe you meant 16:9, not 19:9

- IF the 21.5" is 1920x1080, then it's an improvement vs the current 20" (1680x1050)
- AFAIK, 25/26" displays are "just" 1920x1080, and are not an improvement vs the current 24" (1920x1200)
- I've seen some 27" displays capable of 2048x1152 (Samsung); that would be a small improvement vs the current 24"

Most of the time 16:9 is not an improvement vs 16:10. As has been said many times in this forum, with 16:10 in the 24", for example, you can display all of the 16:9 footage and have room for some tools at the bottom, which is better for video editing. Of course, a 27" or 30" (2560x1600) display would be even better.

They would standardise the iMac on 16:9 at 1920x1080 and not any higher resolution, if they go to 16:9, because of video content and so on, TV shows especially. Of course, if there was BluRay they would "have to" put in a 16:9 1080p format screen. Though 16:9 doesn't help with movies anyway, because they're wider than 16:9, so you're getting fewer vertical pixels than 1080 anyway...

I think this 16:9 could be done on the next refresh, in other words possibly within a few weeks' time. Less chance if no BluRay is incorporated into the iMac, higher chance if it is. My prediction.
post #395 of 486
Quote:
Originally Posted by BenRoethig View Post

I would be shocked if they do it this soon (Apple likes 16:10 displays and they're usually slow to adopt new display tech), but sooner or later 16:10 displays are going to be pretty hard to find.

16:9 screens are staggeringly cheaper than 16:10 screens. In the PC world, anything 19" and up on the cheaper side is predominantly 16:9.

It could be a cost saving on components that Apple simply would not be able to resist.
post #396 of 486
If Apple were to do something truly radical in regard to its Mighty Mouse -- it would take it out behind the barn and shoot it. A more radical solution for the iMac line, and for production streamlining, would be to replace those keyboards with laptop-style keyboards with a trackpad built in instead, even going so far as to include backlighting. This would allow Apple to focus and save on a single engineering format, as well as to put its attention squarely onto gesture-based computing, which is where it needs to be as things move into a multi-touch world.
post #397 of 486
Quote:
Originally Posted by nvidia2008 View Post

16:9 screens are staggeringly cheaper than 16:10 screens. In the PC world, anything 19" and up on the cheaper side is predominantly 16:9.

It could be a cost saving on components that Apple simply would not be able to resist.

That depends on what they want to emphasize. 16:9 is cheap because HD video is 16:9. If you're viewing HD video, it's perfect. If you're authoring video, it's not. You'd like some space for the editing interface. Apple, of course, has skin in that game.

Apple has been using unorthodox screen sizes since the Titanium Powerbook's odd 3:2 ratio. If they cost more? Apple's a premium brand. If Apple does adopt 16:9 it will be on the consumer machines only. The professional machines can soak the extra cost, especially in the quantities that Apple orders displays.
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
Reply
post #398 of 486
I don't care if it stays 16:10 or goes 16:9. Matters not to me. But it does matter a lot to others. Apple has hopefully learned from the matte vs. glossy debate on its pro notebooks what professionals want. Then again, maybe they haven't. It wouldn't surprise me if they went 16:9 as a cost-saving measure and to standardize on that. I would be shocked if they did some and not all. It may start next week or so with the newly designed MacBooks and iMacs, but eventually come to the pro lines. Think SD card slot on everything; so too shall 16:9 displays spread. Just my hunch.
"You think I'm an arrogant [expletive] who thinks he's above the law, and I think you're a slime bucket who gets most of his facts wrong." - Steve Jobs
Reply
post #399 of 486
Quote:
Originally Posted by nvidia2008 View Post

16:9 screens are staggeringly cheaper than 16:10 screens. In the PC world, anything 19" and up on the cheaper side is predominantly 16:9.

It could be a cost saving on components that Apple simply would not be able to resist.

The one problem is that while they're prevalent in the TN ranks, you don't see a lot of IPS displays with a 16:9 panel yet. The only one I can think of is the brand new 23" from NEC. It does come with a pretty considerable price decrease versus 16:10 IPS screens, though. Then again, it's lower resolution than the new 23" TN screens.
post #400 of 486
Quote:
Originally Posted by monstrosity View Post

i wonder if we will see any 'secret' SL features...


Like steam-power or a solar energy pod?
"Run faster. History is a constant race between invention and catastrophe. Education helps but it is never enough. You must also run." Leto Atreides II
Reply