
Intel, Nvidia show off next-gen silicon potentially bound for Apple's future Macs

post #1 of 112
Computing industry observers got a glimpse of the future from two processor makers on Tuesday, as Nvidia revealed its GPU product roadmap and Intel showed off benchmarks of the forthcoming "Haswell" microarchitecture.

[Image: nvidia]


Speaking at the GPU Technology Conference in San Jose, Nvidia CEO Jen-Hsun Huang said the company's 2016 release, dubbed Volta, will leverage stacked DRAM technology to deliver 1 terabyte per second of memory bandwidth, according to Forbes. Beyond that 2016 window, Nvidia didn't provide a firm price tag or release date at the event.

"Volta is going to solve one of the biggest challenges facing GPUs today," Huang said, "which is access to memory bandwidth. We never seem to have enough."

Volta will be able to reach these speeds by stacking DRAM on top of the GPU, separated by a silicon substrate; vias drilled through the silicon connect the two layers. The result is a GPU capable of moving the equivalent of a full Blu-ray disc in 1/50th of a second. By comparison, the company's current high-end graphics card, the Titan, can handle about 288GB per second, a bit more than a quarter of Volta's projected capability.
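Those figures are easy to sanity-check. Assuming a single-layer 25GB Blu-ray (the article doesn't say which disc size Nvidia meant), 1TB/s moves the disc in about 1/40th of a second, so the quoted 1/50th implies a somewhat smaller payload or generous rounding. A minimal back-of-envelope in C:

```c
/* Back-of-envelope check of the bandwidth claims above.
   The 25GB single-layer Blu-ray size is an assumption. */
#include <stdio.h>

int main(void) {
    const double volta_gbps = 1000.0; /* 1 TB/s, claimed for Volta */
    const double titan_gbps = 288.0;  /* current GTX Titan */
    const double bluray_gb  = 25.0;   /* single-layer disc (assumption) */

    printf("Blu-ray at 1 TB/s: %.4f s (about 1/%.0f s)\n",
           bluray_gb / volta_gbps, volta_gbps / bluray_gb);
    printf("Titan vs. Volta: %.1f%% of projected bandwidth\n",
           100.0 * titan_gbps / volta_gbps);
    return 0;
}
```

That last figure comes out to 28.8 percent, which matches the "a bit more than a quarter" framing above.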

Meanwhile, Tom's Hardware recently acquired an early sample of Intel's upcoming Core i7-4770K Haswell chip and put the silicon through its paces. The site found Haswell to boast significant improvements over existing Sandy Bridge-based processors in floating-point performance, as well as in integer tests.

[Image: haswell1 benchmark chart]


Other performance measures, such as an OpenCL-enabled workload in Photoshop CS6 or compressing a 1.3GB folder of mixed files in WinZip, showed Haswell slightly outperforming its predecessor, a result attributed to additional execution units (EUs), more bandwidth, and higher instructions per clock (IPC).

[Image: haswell2 benchmark chart]


The Haswell architecture, according to a document leaked late last year, is expected to show up in iMacs later in 2013. On the high end, the Core i7-4770K processor will run at a base frequency of 3.5GHz, with 8MB of shared L3 cache.

The 2013 Haswell lineup includes a total of six "standard power" desktop processors, with two of them being more powerful Core i7 models. Also listed are eight "low power" processors that include three Core i7 variations.
post #2 of 112
Let's hope that Apple comes out with a Haswell-equipped Mac Pro with unbuffered RAM, Thunderbolt, USB 3 and a lower price tag than what they are offering now. Oh, and having more GPU selections would be good. Graphic designers, photographers, and audio/video specialists don't need every bit to be as correct as scientists do.
post #3 of 112
Quote:
Originally Posted by zoffdino View Post

Let's hope that Apple comes out with a Haswell-equipped Mac Pro with unbuffered RAM, Thunderbolt, USB 3 and a lower price tag than what they are offering now. Oh, and having more GPU selections would be good. Graphic designers, photographers, and audio/video specialists don't need every bit to be as correct as scientists do.

I think you're out of luck with unbuffered RAM.

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #4 of 112
What's the benefit of unbuffered RAM?
post #5 of 112
Quote:
Originally Posted by SolipsismX View Post


I think you're out of luck with unbuffered RAM.

 

And the price... I would love to see us return to the days of sub $2000 Mac Pro (at that time, PowerMac), but I'm not holding my breath.

 

I love the iMac, minus the tangled mess of wires you have if you have a few external hard drives, media reader, etc. Also, I would much rather have two 22" screens vs. one 27" screen.

post #6 of 112
Quote:
Originally Posted by zoffdino View Post

Let's hope that Apple comes out with a Haswell-equipped Mac Pro with unbuffered RAM, Thunderbolt, USB 3 and a lower price tag than what they are offering now. Oh, and having more GPU selections would be good. Graphic designers, photographers, and audio/video specialists don't need every bit to be as correct as scientists do.

Haswell Xeons will probably be released in 2015. Never expect the newest generation of chips in Mac Pros. They'll always hit portables and regular desktops first.

post #7 of 112
Quote:
Originally Posted by 65C816 View Post

What's the benefit of unbuffered RAM?

Less expensive RAM and a less expensive accompanying logic board are the biggest benefits. Unbuffered (non-ECC) RAM may also perform better than ECC RAM because the error-correction step is removed. Error correction sounds like a good thing, but you have plenty of other ways to verify data integrity. If we're talking about a server or workstation it can make sense, but not so much for a consumer system.
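"Other ways to verify data integrity" generally means software-level checks: file checksums, archive CRCs, network checksums. As an illustration only (nothing from this thread uses it), here is a minimal CRC-32 over a buffer in C:

```c
/* Minimal CRC-32 (reflected, polynomial 0xEDB88320), the same kind
   of software integrity check used by ZIP archives and PNG files. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

static uint32_t crc32(const uint8_t *data, size_t len) {
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int k = 0; k < 8; k++)          /* one bit at a time */
            crc = (crc & 1) ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
    }
    return ~crc;
}

int main(void) {
    const char *msg = "payload to verify";
    printf("CRC-32: %08X\n",
           (unsigned)crc32((const uint8_t *)msg, strlen(msg)));
    return 0;
}
```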

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #8 of 112

Let's hope Apple does not screw up the iMac refresh again; if they do, it could mean we have to wait until the end of 2014 for an update. As for me, my 2010 iMac will be due for a refresh at the end of the year; if they release a new model, I'll pre-order straight away.

post #9 of 112
Quote:
Originally Posted by SolipsismX View Post

Less expensive RAM and a less expensive accompanying logic board are the biggest benefits. Unbuffered (non-ECC) RAM may also perform better than ECC RAM because the error-correction step is removed. Error correction sounds like a good thing, but you have plenty of other ways to verify data integrity. If we're talking about a server or workstation it can make sense, but not so much for a consumer system.

True. But when did the Mac Pro become a consumer system? (zoffdino was referring specifically to the Mac Pro).

The Mac Pro is a high end workstation - and unbuffered RAM is a fairly standard feature at that level. And if they drop it, we'll get even more whining about how it's not a real workstation.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #10 of 112
Quote:
Originally Posted by jragosta View Post

True. But when did the Mac Pro become a consumer system? (zoffdino was referring specifically to the Mac Pro).

The Mac Pro is a high end workstation - and unbuffered RAM is a fairly standard feature at that level. And if they drop it, we'll get even more whining about how it's not a real workstation.

I have no idea why you'd have a "but" to disagree with what I wrote, when I clearly responded to zoffdino that I think unbuffered RAM is unlikely in the next Mac Pro, and concluded my comment to 65C816 (which you quoted) by saying that buffered RAM is used for workstations.

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #11 of 112

Sounds like a big yawn to me.  Great for gamers (a fading breed) and no one else really.  I certainly wouldn't hold off buying a computer waiting for this. 

post #12 of 112
Quote:
Originally Posted by Gazoobee View Post

Sounds like a big yawn to me.  Great for gamers (a fading breed) and no one else really.  I certainly wouldn't hold off buying a computer waiting for this. 

For average users, that has been true for years. Even a low end system has more power than most users need - which is a major factor in declining PC sales.

Computer developments today are mostly for the very small group of people who push their systems to the limit. For everyone else, it's just not that big a deal.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #13 of 112
Quote:
Originally Posted by Gazoobee View Post

Sounds like a big yawn to me.  Great for gamers (a fading breed) and no one else really.  I certainly wouldn't hold off buying a computer waiting for this. 

It could make the Mac Mini quite a bit better (due to the HD4600 integrated graphics). Better integrated graphics are a reason for non-geeks to care about new Intel chips.

post #14 of 112
Quote:
Originally Posted by Gazoobee View Post

Sounds like a big yawn to me.  Great for gamers (a fading breed) and no one else really.  I certainly wouldn't hold off buying a computer waiting for this. 

 

I’m waiting for Haswell because of the new FMA instructions (fused multiply-add: a multiplication and an addition in a single instruction). But, you’re right, that’s a CS whim.
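What FMA buys is worth a quick illustration: the multiply and add execute as one instruction with a single rounding step, which is faster and also more precise than separate operations. A minimal sketch using the standard C fma() from math.h (link with -lm; -mfma on a Haswell-class CPU lets the compiler emit the hardware instruction):

```c
/* fma(a, b, c) computes a*b + c with one rounding instead of two. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double a = 1.0 + ldexp(1.0, -52);  /* 1 + 2^-52 */
    double b = 1.0 - ldexp(1.0, -52);  /* 1 - 2^-52 */
    double c = -1.0;

    double separate = a * b + c;    /* a*b rounds to 1.0, so this is 0 */
    double fused    = fma(a, b, c); /* exact: -(2^-104) survives */

    printf("a*b + c    = %g\n", separate);
    printf("fma(a,b,c) = %g\n", fused);
    return 0;
}
```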

post #15 of 112
Quote:
Originally Posted by Gazoobee View Post

Sounds like a big yawn to me.  Great for gamers (a fading breed) and no one else really.  I certainly wouldn't hold off buying a computer waiting for this. 

 

More than gamers. Think about scientists who need to run physics simulations, or a video editor who needs to cut the latest feature film. There are people who need desktop towers, with their expandability and massive amounts of RAM and HDD space, etc.

 

Quote:
Originally Posted by SolipsismX View Post


I have no idea why you'd have a "but" to disagree with what I wrote, when I clearly responded to zoffdino that I think unbuffered RAM is unlikely in the next Mac Pro, and concluded my comment to 65C816 (which you quoted) by saying that buffered RAM is used for workstations.
 

What I meant was that a workstation with unbuffered RAM would be suitable for some usage scenarios. As a video editor, I can easily live with a few bad pixels in a 1080p render if it means I can save hundreds off a tower (multiply that by 6). Since the kind of errors ECC corrects for are random memory errors, I will get different bad pixels on the next frame. Movies run too fast for you to notice that there are a few bad pixels out there, so I’m totally okay with that. Combine the lower cost of non-ECC RAM with non-ECC CPUs and you can have decent savings, with a slight speed advantage to boot. ECC is a feature for those who need it. It is not a prerequisite for a workstation. In fact, both Dell and HP offer non-ECC workstations in their lineups. Not that I use them, but sometimes I feel envious of my clients who have them for their production staffs.

post #16 of 112
Quote:
Originally Posted by jragosta View Post


For average users, that has been true for years. Even a low end system has more power than most users need - which is a major factor in declining PC sales.

Computer developments today are mostly for the very small group of people who push their systems to the limit. For everyone else, it's just not that big a deal.

 

Yeah, but I would argue that this *needn't* be the case and that while the computer has enough "power" for the average user, their experience could certainly be improved in many ways.  It's been known for a long time now that the gaming industry "drives" the improvements in chip technology, but this just means that the type of improvements we get are aimed at the needs of that group.  

 

The average user's experience could be vastly improved by focussing on other things like concurrent execution and multi-processing, but that wouldn't help games at all, so nothing is ever really done about that.  I mean even on a multi-core Mac Pro with the best graphic cards and gigs and gigs of memory, it can still basically only do one intensive task at a time.  Sometimes I think it's funny when I'm importing some shows into iTunes and the whole computer just dies for a minute or two while it does it.  It's hardly any different to when I was using my IBM XT and waiting for a print job to execute all those years ago.  

 

The average computer user shouldn't have to experience these kinds of waits (especially on what is essentially a background task), but no one has ever devoted serious resources to solving those kinds of problems.  Instead it's always … "better graphics!"  As a lifelong computer user who doesn't give a crap about gaming, I find it annoying. 
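In fairness, the pattern being asked for is long established; the complaint is that applications don't always use it. A minimal sketch with POSIX threads, where a slow "import" runs on a worker thread while the foreground loop keeps responding (the names are illustrative, not from iTunes or any real app):

```c
/* Sketch: push the long job onto a worker thread so the foreground
   ("UI") loop never blocks. Compile with -pthread. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *import_job(void *arg) {
    (void)arg;
    sleep(2);                      /* stand-in for a slow media import */
    puts("import finished");
    return NULL;
}

int main(void) {
    pthread_t worker;
    pthread_create(&worker, NULL, import_job, NULL);

    for (int i = 0; i < 4; i++) {  /* foreground keeps responding */
        puts("UI still responsive");
        usleep(600 * 1000);        /* 0.6 s between "events" */
    }
    pthread_join(worker, NULL);    /* collect the worker at the end */
    return 0;
}
```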

post #17 of 112
Quote:
Originally Posted by zoffdino View Post

 

More than gamers. Think about scientists who need to run physics simulations, or a video editor who needs to cut the latest feature film. There are people who need desktop towers, with their expandability and massive amounts of RAM and HDD space, etc. ...

 

Well I have a giant Mac Pro with Gigs and Gigs of RAM and a super fast video card but I don't play games.  

 

What I mean is that there are still improvements that would help me and could make my computer faster and more useful, but they aren't related to … "better graphics!"

post #18 of 112
Originally Posted by zoffdino View Post
…unbuffered RAM…

 

It's a workstation. Unless they decide to make the next one cater to no professionals at all, that's a dream.


Oh, having more GPU selections would be good.

 

THIS is a dream no matter what they make. If they turn the Mac Pro into a mainframe, you can bet there will be two GPUs. Maybe not even that. And maybe not even removable at all.


Originally Posted by Gazoobee View Post
Great for gamers (a growing breed)…

 

Fixed.

post #19 of 112
Quote:
Originally Posted by Gazoobee View Post

Well I have a giant Mac Pro with Gigs and Gigs of RAM and a super fast video card but I don't play games.  

What I mean is that there are still improvements that would help me and could make my computer faster and more useful, but they aren't related to … "better graphics!"

If your tasks use OpenCL, then a faster graphics card would help.
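To make that concrete, here is a toy OpenCL kernel of the sort OpenCL-enabled apps dispatch to the GPU; the kernel name and gain parameter are invented for illustration, and the host-side setup is omitted. Each work-item handles one pixel, and a faster card simply runs more work-items at once:

```c
/* Toy OpenCL kernel (illustrative): brighten an image in parallel.
   One work-item per pixel; the host enqueues this over the image. */
__kernel void brighten(__global const float *in,
                       __global float *out,
                       const float gain)
{
    size_t i = get_global_id(0);
    out[i] = fmin(in[i] * gain, 1.0f);  /* scale, clamp to white */
}
```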
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #20 of 112
Quote:
The 2013 Haswell lineup includes a total of six "standard power" desktop processors, with two of them being more powerful Core i7 models.

Unfortunately, Apple doesn't sell mid-range 'desktop' machines. Everything is built with either low-power laptop parts or Xeons in the Mac Pro.

post #21 of 112
Originally Posted by davida View Post
Unfortunately, Apple doesn't sell mid-range 'desktop' machines. Everything is built with either low-power laptop parts or Xeons in the Mac Pro.

 

Yep, let's just go ahead and ignore that iMac, shall we?

post #22 of 112
Performance in the past few years has already gone past the level any consumer would care about, especially on SSD-equipped machines. And with stacked-DRAM GPUs coming in a few years' time, that should finally take care of the minority of us who still want more performance. So even for prosumer/professional usage, I think Apple is waiting for time to fix the problem itself.

For the rest, who need all the computation they can get, that is where the Mac Pro fits in, and I think ECC RAM would be extremely important as the amount of memory grows. It would still need a two-socket CPU, and hopefully 2x Tesla K20 GPUs with 2x SSDs in RAID. (And hopefully all of this will fit into a cube shape.)
post #23 of 112
Quote:
Originally Posted by zoffdino View Post

What I meant was that a workstation with unbuffered RAM would be suitable for some usage scenarios.

I know exactly what you meant. How is it that my comments are unclear to both you and jragosta?

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #24 of 112
Quote:
Originally Posted by ksec View Post

... I think ECC RAM would be extremely important as the amount of memory grows.

Exactly. It should be standard by now.
post #25 of 112
The only reason I want a new computer is for better graphics performance. Everything else I do with my computer could be done satisfactorily with a ten year old machine. I would love to have one of those super Nvidia chips in a portable machine with a video out port.
post #26 of 112
All signs are pointing to Apple returning to the G5 PPC processor for their Mac Pro line of computers. Since they can't seem to make any steps forward, steps backward are the logical conclusion.
post #27 of 112
Quote:
Originally Posted by dysamoria View Post

Exactly. It should be standard by now.

I don't see why more RAM would require ECC. Too many cons and not enough pros for consumer devices. Where do you draw the line? Android phones are shipping with 2GB of RAM, which is many multiples of the RAM amounts of the '70s, so should those get ECC, too?

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #28 of 112
Originally Posted by Mike Fix View Post
Since they can't seem to make any steps forward…

 

Knock off the FUD or add an /s to this sort of thing.

post #29 of 112
Quote:
Originally Posted by SolipsismX View Post


I don't see why more RAM would require ECC. Too many cons and not enough pros for consumer devices. Where do you draw the line? Android phones are shipping with 2GB of RAM, which is many multiples of the RAM amounts of the '70s, so should those get ECC, too?


This gets posted from time to time. The concern is the continually increasing density. If bit flipping were a common problem in properly functioning RAM, ECC would probably become the standard. Even then, ECC correction is specifically aimed at single-bit errors, so, like I said, it just prevents bit-flipping anomalies. It isn't the only point of concern when it comes to data. I will say that ECC pricing is relatively similar to non-ECC variants today. Ten or more years ago that was not the case; IIRC, workstations at that time didn't always ship with ECC by default due to the higher cost.
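For the curious, the single-bit correction described here can be sketched with a toy Hamming(7,4) code. Real ECC DIMMs use wider SECDED codes over 64-bit words, but the principle is the same: recompute the parity bits on read, and a nonzero syndrome names the flipped bit. A minimal, illustrative sketch in C:

```c
/* Hamming(7,4): 4 data bits, 3 parity bits, corrects any 1-bit flip. */
#include <stdio.h>

/* Pack 4 data bits into a 7-bit codeword (bit i-1 = position i). */
static unsigned encode(unsigned d) {
    unsigned b[8] = {0};
    b[3] = d & 1; b[5] = (d >> 1) & 1;
    b[6] = (d >> 2) & 1; b[7] = (d >> 3) & 1;
    b[1] = b[3] ^ b[5] ^ b[7];   /* parity over positions 1,3,5,7 */
    b[2] = b[3] ^ b[6] ^ b[7];   /* parity over positions 2,3,6,7 */
    b[4] = b[5] ^ b[6] ^ b[7];   /* parity over positions 4,5,6,7 */
    unsigned w = 0;
    for (int i = 1; i <= 7; i++) w |= b[i] << (i - 1);
    return w;
}

/* Recompute parities; a nonzero syndrome is the flipped position. */
static unsigned correct(unsigned w) {
    unsigned s = 0;
    for (unsigned p = 1; p <= 4; p <<= 1) {
        unsigned parity = 0;
        for (int i = 1; i <= 7; i++)
            if (i & p) parity ^= (w >> (i - 1)) & 1;
        if (parity) s |= p;
    }
    if (s) w ^= 1u << (s - 1);   /* flip the bad bit back */
    return w;
}

int main(void) {
    unsigned sent = encode(0xB);        /* data nibble 1011 */
    unsigned hit  = sent ^ (1u << 4);   /* a "cosmic ray" flips bit 5 */
    printf("sent %02X, corrupted %02X, corrected %02X\n",
           sent, hit, correct(hit));    /* 55, 45, 55 */
    return 0;
}
```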

post #30 of 112
Quote:
Originally Posted by Tallest Skil View Post

 

Yep, let's just go ahead and ignore that iMac, shall we? 1oyvey.gif


I realize this is my first post, and I will be looked at as an automated troll for a while. Hey, I've been away so long, I forgot my old user name and password, so all is new. Yeah, let's ignore the iMac as a desktop box. It's an AIO, and I feel that is a category of its own. I may have lost my mind, but years back, the Pro could be downgraded in cost to something like $1899. Anyone remember that? I want to select my own screen, not an AIO. So, that leaves me with the mini or the Pro. The mini is more than enough for my needs. I realize others want/need more. No matter what you own, it will become outdated in about 5 years, right? Just because the iMac is neat and tidy doesn't make it the best choice. Why replace both computer and screen? I would use the same monitor for as long as possible, not pitch the whole thing for a refresh.

post #31 of 112
Originally Posted by JCM722 View Post
Yeah, let's ignore the iMac as a desktop box. It's an AIO, and I feel that is a category of its own.

 

It's not, though.


I want to select my own screen, not an AIO.

 

What's wrong with the H-IPS panel the iMac uses?

 

The mini is more than enough for my needs.
 

But it's "laptop parts", as was said, so it's "invalid"…


Why replace both computer and screen?


Because you can use an iMac as a display anyway. So use it for seven years, buy a brand new Mac Mini, and plug it in.

post #32 of 112
Quote:
Originally Posted by Gazoobee View Post

The average computer user shouldn't have to experience these kinds of waits (especially on what is essentially a background task), but no one has ever devoted serious resources to solving those kinds of problems.  Instead it's always … "better graphics!"  As a lifelong computer user who doesn't give a crap about gaming, I find it annoying. 

Graphics are one obvious element to improve, as there is an easy target to focus on: making visuals lifelike.

But it is not as if other aspects of computing haven't improved. From the power-hungry, overheating pig that was the Intel P4 to the Core i7. From the 128MB of RAM my first laptop had when I bought it back in 2001 to 4GB, with 8GB soon likely to be standard. From 40GB 4200rpm IDE HDDs to 1TB 7200rpm SATA3 spinners. From CD readers to Blu-ray burners. From 14" CRT screens to 20+" LED displays. Every aspect of personal computing has improved.

The problem is, most people do not care about, or even notice, many of those improvements. A five-year-old Core 2 Duo feels as fast in everyday tasks as the latest i7. In fact, the Core 2 Duo is already more than fast enough for the majority of those tasks, which is one reason low-powered tablets became so popular.

Technology evolves much faster than mankind. We are the bottleneck right now. We do more Internet, more emailing, more social networking, but we don't do it faster than we did before. We still type more or less as fast as we ever did, and read as fast as we ever did, and a 20-minute YouTube video still takes 20 minutes to watch. My current hardware with my current broadband speed could probably stream dozens of videos at once, but to what end? I can still only really focus on one.

So, the graphics: one of the areas where we can still see improvement. True, not for non-gamers and others who do not benefit from faster graphics hardware; but then, those people will not notice improvement from almost anything IT can bring. Those who do benefit, and they are not only gamers but everyone who gains from the hardware acceleration modern graphics cards bring to data processing, are at present the market segment willing to pay for better. And manufacturers are only answering market demand.

I happen to be a gamer, but I also edit lots of photos and videos, so I'm definitely interested in better graphics and all the benefits they bring. I've been playing computer games since the Sinclair ZX81 and have developed an interest in observing how that part of technology evolved. It has been an amazing voyage, from Scramble on the ZX81 to Battlefield 3 and Far Cry 3 on a modern PC. I'm really looking forward to what tomorrow brings.

But even if I weren't: I mean, I don't race cars, but I'm still not annoyed when Ferrari releases a new supercar.
post #33 of 112
Quote:
Originally Posted by JCM722 View Post

I forgot my old user name and password, so all is new.

http://forums.appleinsider.com/u/56644/WPLJ42
Quote:
Originally Posted by JCM722 View Post

Just because the iMac is neat and tidy doesn't make it the best choice. Why replace both computer and screen? I would use the same monitor for as long as possible, not pitch the whole thing for a refresh.

It's not just about what the buyer wants but what works best for the seller. Intel and Nvidia could easily double performance every single year if they wanted to, but instead they do it every other year if we're lucky, so they stay in business twice as long. If they don't have any competition, what do they care?

Apple doesn't want to sell you a machine and a display separately because all you'd do is go and buy a display from someone else that doesn't cost $1000. For some people a machine gets outdated more quickly, for others the display does. With an AIO, it doesn't matter, they both have to upgrade more often and Apple makes the margins on both parts.

It's not an important issue, because laptops are by far the highest-selling PCs in the world, and they are AIOs. That doesn't make iMacs laptops; it means they follow the same upgrade model, and the PC industry has woken up to the fact that it might actually be a profitable way to do business: it increases the average selling price, allows unique selling points outside of raw performance, and cuts down on manufacturing, as they aren't building two boxes with two power supplies.
post #34 of 112
Quote:
Originally Posted by Gazoobee View Post

Sometimes I think it's funny when I'm importing some shows into iTunes and the whole computer just dies for a minute or two while it does it. I find it annoying. 

Oh, that's a good point! You are really making sense here by sharing this experience. iTunes is really faulty; no matter how fast your Mac is, iTunes simply does not update quickly. It stalls when renaming, updating info, adding lyrics. It even happens with scrolling. I have the media on an HDD and the rest is on ~, which is on an incredibly fast PCIe SSD. I know that info gets written into the header of .mp3 files and such, so the HDD is obviously the slowdown in the chain. But just from using iTunes you can tell the software is slow as well. Watching Activity Monitor, iTunes still uses only one CPU core, not even with HT.
Quote:
Originally Posted by SolipsismX View Post

I know exactly what you meant. How is it my comments are unclear to both you and jragosta?

It simply has to be a misread, forgetting the "un-" part. There's no other explanation.
"See her this weekend. You hit it off, come Turkey Day, maybe you can stuff her."
- Roger Sterling
Reply
"See her this weekend. You hit it off, come Turkey Day, maybe you can stuff her."
- Roger Sterling
Reply
post #35 of 112

2016! That's 10 years away!

 

OK, 3, but it always seems like forever when you are waiting for awesome tech.

post #36 of 112
Sadly, it doesn't look like it will be as big an improvement as many of us hoped for. We still have the HD 5000 to come. I suspect, though, that even geeks care about better graphics in Intel's processors. That is really what makes things like the Air so attractive today.
Quote:
Originally Posted by ascii View Post

It could make the Mac Mini quite a bit better (due to the HD4600 integrated graphics). Better integrated graphics are a reason for non-geeks to care about new Intel chips.
post #37 of 112

Marvin ... Well, I have a unique low-vision situation, and cannot use any of Apple's monitors. I also can't use zoom/magnification, as blurry gets blurrier. So, I am presently using a W7 AIO with an 18.5" screen @ 1366 x 768. The standard text in the AI forum is tiny and hard to see. I have sight and am not ready to surrender to VoiceOver. Also, Apple is rich, while I am poor. I don't care what Apple wants; I am the customer, and will do what I want and must do for my visual needs. If the 27-inch iMac could double down, for lack of a better term, to 1280 x 720 with no loss in clarity, Apple just might get me.

 

Tallest Skil ... I was not aware the iMac could be used as a monitor.

post #38 of 112
Quote:
Originally Posted by JCM722 View Post

Marvin ... Well, I have a unique low-vision situation, and cannot use any of Apple's monitors. I also can't use zoom/magnification, as blurry gets blurrier. So, I am presently using a W7 AIO with an 18.5" screen @ 1366 x 768. The standard text in the AI forum is tiny and hard to see. I have sight and am not ready to surrender to VoiceOver. Also, Apple is rich, while I am poor. I don't care what Apple wants; I am the customer, and will do what I want and must do for my visual needs. If the 27-inch iMac could double down, for lack of a better term, to 1280 x 720 with no loss in clarity, Apple just might get me.

Tallest Skil ... I was not aware the iMac could be used as a monitor.

Wait a second. You can't use a 27" Apple AIO, but you can use an 18.5" W7 AIO?

Considering that Apple has typically offered far better-quality screens on its AIOs than the competition, that's pretty hard to believe. Or were you simply unaware that you can change the resolution on an iMac, too? The newest 27" iMac will certainly handle 1366 x 768 - it may go lower, as well. And considering the quality and size of the screen compared to your 18.5" AIO, the iMac would be FAR more readable than what you have.

Head into an Apple Store or Best Buy and set the resolution to its minimum to see how much better it is.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #39 of 112
Enough baloney to feed every hungry person in the world!!
Quote:
Originally Posted by Gazoobee View Post

Yeah, but I would argue that this *needn't* be the case and that while the computer has enough "power" for the average user, their experience could certainly be improved in many ways.  
One of the reasons the average user upgrades is to get better performance. Admittedly, many of those users are running Windows, with all the performance issues there.
Quote:
It's been known for a long time now that the gaming industry "drives" the improvements in chip technology, but this just means that the type of improvements we get are aimed at the needs of that group.  
It is a driver for GPU cards; processors (CPUs) in general are designed to serve many purposes. CPUs advance when technology allows. This is why Haswell is seeing modest integer gains while benefiting from significant floating-point increases. It could be a long time before Intel engineers find a way to boost integer performance significantly.
Quote:
The average user's experience could be vastly improved by focussing on other things like concurrent execution and multi-processing,
You have heard of multi-core processors, haven't you?
Quote:
but that wouldn't help games at all, so nothing is ever really done about that.  
That depends upon the game and the state of the toolkit used to build the game.
Quote:
I mean even on a multi-core Mac Pro with the best graphic cards and gigs and gigs of memory, it can still basically only do one intensive task at a time.
Baloney!! If the user knows what he is doing, a Mac Pro can easily handle many processes running at once. Frankly, that is one of the reasons to have workstation-class machines.
Quote:
 Sometimes I think it's funny when I'm importing some shows into iTunes and the whole computer just dies for a minute or two while it does it.  It's hardly any different to when I was using my IBM XT and waiting for a print job to execute all those years ago.  
That is called using all available resources to get the job done!
Quote:
The average computer user shouldn't have to experience these kinds of waits (especially on what is essentially a background task), but no one has ever devoted serious resources to solving those kinds of problems.  
Again, this is baloney, and it makes me wonder if you have any sense of history. I have on many occasions run a compiler, iTunes, and Safari at the same time on an old 2008 MBP, with the machine remaining functional if slow. That is with a machine containing only 2GB of RAM. The fact that the machine can do this at all highlights the strength and reliability of Apple's Mac OS.

Now, I sure would love better performance, and I'm sure I would get that with an upgrade, but who doesn't expect better performance when upgrading a 5-year-old computer?
Quote:
Instead it's always … "better graphics!"  As a lifelong computer user who doesn't give a crap about gaming, I find it annoying. 

I find you out of touch. First, GPUs are used in a number of ways these days, so a good GPU doesn't imply a gaming platform. Second, how did you miss all the improvements that have gone into Intel's CPUs over the last few years?
post #40 of 112

jragosta ... You missed the part where I said blurry gets blurrier. I am also using W7 and not OS X. So far, I cannot use any monitor that is not at its native resolution. I have a 20-inch iMac gathering dust, since I cannot see it. Nope, cannot drop the resolution either. Some people can see through the slight out-of-focus left behind at a non-native resolution. I can't. I might be able to use a mini and a 27-inch/1080p monitor. If not, my next choice is a 26-inch HDTV at 720/768 resolution. Trust me, I've been messing around with this ever since the demise of the CRT, where you could alter the resolution with no loss in clarity.

 

My point ... Apple could use desktop CPUs if the mini weren't so small. The Pro is the only other true desktop from Apple. OK, fine, everyone seems to want notebooks anyway, so to hell with desktops and choice. The iMac is a fixed-resolution display to me, and as a result, useless. Same goes for everything but the iPad.

 

Also, how does the "Site Only (no email)" feature work? How does that differ from not subscribing?  Yes, I've been away from the AI forum a while.
