
Inside Apple's rumored 'new MacBook' vs. updated MacBook Pro - Page 3

post #81 of 115
Quote:
Originally Posted by landilevente View Post

Games?

An Intel 3770K CPU with the HD 4000 GPU can barely handle World of Warcraft at 1920x1200: about 25 fps on the lowest settings in a deserted area built five years ago. And that's a desktop CPU, more powerful than the mobile versions. To be honest, we all know WoW isn't that demanding a game. Just imagine what frame rates we'd get at 2560x1800.
http://www.youtube.com/watch?v=UnYpLZGOMQc

OK, now tell us how many MacBook Pro users play World of Warcraft. Darn few, I'd venture.

You don't seem to understand - people who buy Macs are generally not looking for the latest, greatest gaming experience.

So, in the end, you don't really have anything to back up your claims.

Quote:
Originally Posted by ngmiller View Post

The HD4000, while a "big step up", is still a fair bit less than twice as fast as the HD3000 in real-world tests. If we're being generous we can go with Intel's 2x benchmarks. Fine. The Retina display doubles the pixel dimensions of the screen, quadrupling the number of pixels. So that's half the graphics power per pixel of the previous-generation laptop. I think it's fair to say that won't cut it.

IOW, you don't have any evidence to back up your claim.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #82 of 115
Quote:
and it's hard for me to see a 15" pro which won't really be any more than 1440 x 900 - and therefore not 1080p capable like many 15" Win machines

 

The 15" range really needs to go to 1080 as the base res, with all the 13" range going to 900. There is no technical or economic excuse for not doing it (far as I know).

post #83 of 115
Quote:
Originally Posted by Amti View Post

The 15" range really needs to go to 1080 as the base res, with all the 13" range going to 900. There is no technical or economic excuse for not doing it (far as I know).

You're right that it's not technical or economic; I'd call it human factors. Using the current interface at such a tight dot pitch makes it harder to use. The current UI is very raster-heavy, and resolution independence wasn't available to keep text from getting too small: because points are rendered as pixels, text is generally rendered at half the size it should be, making it harder to read. It sounds like Apple may finally be shipping resolution independence, so we can get a nice, crisp display without making things harder to read and use.
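
To make the point-versus-pixel distinction concrete, here is a rough TypeScript sketch. It is purely illustrative: the type, the numbers and the scale-factor model are assumptions for the example, not Apple's actual APIs.

// Toy model of "points vs. pixels". A UI described in points keeps the same
// physical size on a denser panel, provided the framework multiplies by the
// display's scale factor.
interface Display {
  pixelsPerInch: number; // physical pixel density of the panel
  scaleFactor: number;   // 1 on a normal panel, 2 on a HiDPI/"Retina" panel
}

// Resolution-independent rendering: points map to more device pixels on a
// denser panel, so the physical size on screen is unchanged.
function pointsToDevicePixels(points: number, display: Display): number {
  return points * display.scaleFactor;
}

function physicalInches(points: number, display: Display): number {
  return pointsToDevicePixels(points, display) / display.pixelsPerInch;
}

const standard: Display = { pixelsPerInch: 110, scaleFactor: 1 };
const hiDpi: Display = { pixelsPerInch: 220, scaleFactor: 2 };

console.log(physicalInches(13, standard)); // ~0.118" for 13 pt text
console.log(physicalInches(13, hiDpi));    // ~0.118", same physical size
// Without the scale factor (points treated as raw pixels), the same text
// shrinks to half the physical size on the dense panel, which is the
// readability problem described above.
console.log(13 / hiDpi.pixelsPerInch);     // ~0.059"
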
post #84 of 115
Quote:
Originally Posted by JeffDM View Post


You're right that it's not technical or economic; I'd call it human factors. Using the current interface at such a tight dot pitch makes it harder to use. The current UI is very raster-heavy, and resolution independence wasn't available to keep text from getting too small: because points are rendered as pixels, text is generally rendered at half the size it should be, making it harder to read. It sounds like Apple may finally be shipping resolution independence, so we can get a nice, crisp display without making things harder to read and use.

A few questions I think many of us don't quite understand:

 

1. Does what you're saying about rez independence apply only to programs running ON the computer - or to all content elements of web sites (which seems unlikely to me, but I'm only a nerd, not a real techie) - anyway, so might you not have tiny website text even if the native OS is resolution independent?   (When I scale browser windows up and down now - all elements scale simultaneously). 

 

(1a. And if programs themselves, will they all have to be re-optimized to take advantage of RI, or will this be a service provided by Mountain Lion to all native Mac apps?)

 

2. Does the browser's technology itself play a role in this?  

 

3. What about the tools used to create the sites themselves?  

 

4. And finally, might there be an awkward transition period in which some stuff is new age (rendered in proportion to visual elements at a good perceptual size) and a declining percentage is old?  And if so, can Apple drive the transition?

An iPhone, a Leatherman and thou...  ...life is complete.
post #85 of 115
Quote:
Originally Posted by JeffDM View Post

It sounds like Apple may finally be shipping resolution independence, so we can get a nice, crisp display without making things harder to read and use.

Yes please. :)

post #86 of 115
Quote:
Originally Posted by bigpics View Post

A few questions I think many of us don't quite understand:

 

1. Does what you're saying about rez independence apply only to programs running ON the computer - or to all content elements of web sites (which seems unlikely to me, but I'm only a nerd, not a real techie) - anyway, so might you not have tiny website text even if the native OS is resolution independent?   (When I scale browser windows up and down now - all elements scale simultaneously). 

 

(1a. And if programs themselves, will they all have to be re-optimized to take advantage of RI, or will this be a service provided by Mountain Lion to all native Mac apps?)

 

2. Does the browser's technology itself play a role in this?  

 

3. What about the tools used to create the sites themselves?  

 

4. And finally, might there be an awkward transition period in which some stuff is new age (rendered in proportion to visual elements at a good perceptual size) and a declining percentage is old?  And if so, can Apple drive the transition?

 

The way I understand it, with HiDPI mode the need for resolution independence becomes seriously diminished, probably to the point of being moot. Turn HiDPI on and you get beautiful Retina graphics; turn it off and you can select much more screen real estate if that suits your fancy. So, let's assume the new 15" has a native screen resolution of 2880x1800, qualifying it as Retina. If you turn off HiDPI mode you could run the screen at 1920x1200, or at any resolution below that, proportionally increasing or decreasing the size of everything displayed on your screen. On a 15" screen you probably won't want to go higher than 1920x1200, and anything at or below 1440x900 should run with HiDPI mode on to take advantage of Retina graphics for the apps and websites that have been coded to make use of that feature (unless you need to take it easy on your GPU).
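
To put rough numbers on that GPU caveat, here is a quick TypeScript sketch of the pixel arithmetic for a hypothetical 2880x1800 panel. It only illustrates the pixel counts under discussion, not how OS X actually implements its scaling modes.

// How many pixels the GPU has to fill in a few modes, relative to 1440x900.
const native = { w: 2880, h: 1800 };

function pixelCount(w: number, h: number): number {
  return w * h;
}

const modes = [
  { label: "HiDPI on (looks like 1440x900)", w: native.w, h: native.h },
  { label: "Scaled 1920x1200", w: 1920, h: 1200 },
  { label: "Plain 1440x900", w: 1440, h: 900 },
];

const baseline = pixelCount(1440, 900);
for (const m of modes) {
  const px = pixelCount(m.w, m.h);
  console.log(m.label + ": " + px.toLocaleString() + " px, " +
              (px / baseline).toFixed(1) + "x the pixels of 1440x900");
}
// HiDPI mode fills 4x the pixels of 1440x900, which is why the "take it
// easy on your GPU" caveat above matters.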

 

As far as transitions, it'll be like with the iPhone and iPad, all non-retina apps/websites/etc. will look like crap relative to the new stuff, but they will look exactly like they do now.

post #87 of 115
Quote:
Originally Posted by johndoe98 View Post

 

The way I understand it, with HiDPI mode the need for resolution independence becomes seriously diminished, probably to the point of being moot. Turn HiDPI on and you get beautiful Retina graphics; turn it off and you can select much more screen real estate if that suits your fancy. So, let's assume the new 15" has a native screen resolution of 2880x1800, qualifying it as Retina. If you turn off HiDPI mode you could run the screen at 1920x1200, or at any resolution below that, proportionally increasing or decreasing the size of everything displayed on your screen. On a 15" screen you probably won't want to go higher than 1920x1200, and anything at or below 1440x900 should run with HiDPI mode on to take advantage of Retina graphics for the apps and websites that have been coded to make use of that feature (unless you need to take it easy on your GPU).

 

As far as transitions, it'll be like with the iPhone and iPad, all non-retina apps/websites/etc. will look like crap relative to the new stuff, but they will look exactly like they do now.

 

I pretty much like the sound of that, and it's a long way from the worst case if you're right on the particulars. There's been a dearth of clear explanation of this new age of rez on the "normal Apple geeks" sites; even a quick search of AnandTech and Ars Technica didn't turn up anything that helpful.

An iPhone, a Leatherman and thou...  ...life is complete.
post #88 of 115

I believe there will be a consolidation of the lines. It makes much more sense to eliminate lines like the current MacBook or MacBook Pro in favor of MacBook Air-style machines with varying screen sizes: 11, 13, 15 and maybe 17 inches. No optical drive, USB 3, Thunderbolt and maybe an SD or mini SD slot.

post #89 of 115
I personally don't see this happening like that. But *IF* they create a lineup with these prices, they can practically kill the "classic" MBP instantly. I mean: Who's gonna pay the same amount of money for a thicker machine with a much worse screen - only because of the SuperDrive?? How could Apple say "Pro" means "+SuperDrive -Screen" in their right mind? It makes absolutely no sense at all. I understand that some people might still want an internal SuperDrive, but people drawn to the "Pro" moniker would, imho, prefer to have it all. They want the better graphics card with the better display plus the SuperDrive and longer battery life. They'll also gladly _pay_ a little more for having what they want. With that many models, I sure hope customers can mix things up and have the lighter version with the cheaper screen as well as the thicker model with the good display. If a good screen simply isn't possible alongside the SuperDrive, then just kill the SuperDrive, Apple. They've done it with the Mac mini and the MacBook Air already; it's f*!?ing time to kill the classic MBP. Sorry for the anger, I needed an outlet. :)
post #90 of 115
Quote:
Originally Posted by fryke View Post

I personally don't see this happening like that. But *IF* they create a lineup with these prices, they can practically kill the "classic" MBP instantly. I mean: Who's gonna pay the same amount of money for a thicker machine with a much worse screen - only because of the SuperDrive?? How could Apple say "Pro" means "+SuperDrive -Screen" in their right mind? It makes absolutely no sense at all.

I agree with you that it's odd to call a thicker machine with obsolete tech and less modern tech overall a Pro machine and the one with the latest advancements simply a MacBook, but I think the prices are in line with what Apple should do.

Remember that one of Apple's great methods for turning a profit is economies of scale. With the old-style machines Apple will sell considerably fewer of them. They will have the latest CPUs, USB 3.0 as it comes with the chipset, and probably a few other advancements, but they will probably just continue these for a couple of cycles until interest wanes so much that even this isn't worth keeping around*. On top of that, Apple doesn't want you to buy the old machines. They want to push their new ones.

Besides pushing down costs as they sell more of these, there aren't many vendors that can compete with Apple on hardware and manufacturing costs. Google did a "first" with the new 3D mapping, and two years ago MS and HP did a "first" with the Slate, and those were just rumours. We have actual evidence in OS X, and in the iPhone, iPod touch and iPad, that Apple will be pushing 2x resolution, and yet I've seen no vendor able to beat Apple to the punch.

I hope Apple has a new Mac campaign at the ready, because looking at their HW, their OS, and the confusing version of Windows MS is bringing to the table, I think they have a chance to sell a lot more Macs than they did when Vista and 64-bit Windows were pulling people to Macs several years ago.



* This is based on the current rumours being true and what I'd do, not any statement of fact.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #91 of 115
Quote:
Originally Posted by jragosta View Post


OK, now tell us how many MacBook Pro users play World of Warcraft. Darn few, I'd venture.
You don't seem to understand - people who buy Macs are generally not looking for the latest, greatest gaming experience.
So, in the end, you don't really have anything to back up your claims.
IOW, you don't have any evidence to back up your claim.

 

It's not about hardcore gamers; they usually don't use notebooks at all.

I don't think having to play on the lowest settings at a low resolution just to be usable is the user experience Apple wants to sell.

On Steam, 50% of Mac users play on a MacBook Pro and 15% on a MacBook, so 65% of Mac users on Steam play on a notebook.

 

http://store.steampowered.com/hwsurvey

 

To be honest, I don't know why you are so obsessed with proving that I'm wrong. What I said is that I can't imagine Apple making a Mac with a Retina display and an Intel integrated GPU, because I don't think it's powerful enough. It works OK with 1M pixels but probably not with 4.6M when the user wants to do more than basic web browsing, word processing, spreadsheets...

post #92 of 115
Quote:
Originally Posted by johndoe98 View Post

The way I understand it, with HiDPI mode the need for resolution independence becomes seriously diminished, probably to the point of being moot. Turn HiDPI on and you get beautiful Retina graphics; turn it off and you can select much more screen real estate if that suits your fancy. So, let's assume the new 15" has a native screen resolution of 2880x1800, qualifying it as Retina. If you turn off HiDPI mode you could run the screen at 1920x1200, or at any resolution below that, proportionally increasing or decreasing the size of everything displayed on your screen. On a 15" screen you probably won't want to go higher than 1920x1200, and anything at or below 1440x900 should run with HiDPI mode on to take advantage of Retina graphics for the apps and websites that have been coded to make use of that feature (unless you need to take it easy on your GPU).

That is not correct. The screen will always be sharpest at its native resolution. When you go to a different resolution, you will lose some sharpness. Probably less than you would with current screens, but noticeable nonetheless.

But that doesn't answer the question that you were responding to. bigpics asked "1. Does what you're saying about rez independence apply only to programs running ON the computer - or to all content elements of web sites (which seems unlikely to me, but I'm only a nerd, not a real techie) - anyway, so might you not have tiny website text even if the native OS is resolution independent? (When I scale browser windows up and down now - all elements scale simultaneously). "

The answer is that the web page is normally controlled by the web developer. You can set the size of the image, but the image elements would not change. If the developer has an image that's 200 pixels by 100 pixels, it would be tiny if you double the screen resolution: it would still be 200 pixels by 100 pixels, but each pixel would cover a quarter of the area, so the entire image would appear at a quarter of its former size. To get around this, you would either have to manually enlarge the image on your screen or have the browser automatically display things at twice their original dimensions (four times the area).
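
To make that concrete, here is a small TypeScript sketch using the browser's standard window.devicePixelRatio property. The "hero" element id and the @2x file-naming convention are made up for the example; they are not something any particular site is guaranteed to use.

// On a 2x display, a bitmap declared at its pixel size covers a quarter of
// the physical area it did before, unless something scales it back up.
const ratio = window.devicePixelRatio || 1; // 2 on a HiDPI/"Retina" screen

// A 200x100-pixel image, shown 1:1 in device pixels, shrinks physically:
const cssWidth = 200 / ratio;  // 100 CSS px on a 2x screen
const cssHeight = 100 / ratio; // 50 CSS px on a 2x screen
console.log("Unscaled, the image occupies " + cssWidth + "x" + cssHeight + " CSS px");

// The usual fix: keep the layout size the same and swap in an asset with
// twice the pixel dimensions when a dense display is detected.
const img = document.getElementById("hero") as HTMLImageElement | null;
if (img && ratio >= 2) {
  img.src = img.src.replace(/\.png$/, "@2x.png"); // assumes a @2x file exists
}
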
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #93 of 115
Quote:
Originally Posted by bigpics View Post

A few questions I think many of us don't quite understand:

1. Does what you're saying about rez independence apply only to programs running ON the computer - or to all content elements of web sites (which seems unlikely to me, but I'm only a nerd, not a real techie) - anyway, so might you not have tiny website text even if the native OS is resolution independent?   (When I scale browser windows up and down now - all elements scale simultaneously). 

(1a. And if programs themselves, will they all have to be re-optimized to take advantage of RI, or will this be a service provided by Mountain Lion to all native Mac apps?)

2. Does the browser's technology itself play a role in this?  

3. What about the tools used to create the sites themselves?  

4. And finally, might there be an awkward transition period in which some stuff is new age (rendered in proportion to visual elements at a good perceptual size) and a declining percentage is old?  And if so, can Apple drive the transition?

There is a HiDPI scheme offered for web sites, and the information on how to do it is out there. I expect the browser will detect that a site isn't HiDPI and scale it up accordingly. As for the rest, it's a wide open world; I don't know the specifics.
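
For the curious, here is a minimal TypeScript sketch of one way a page can opt into HiDPI drawing for a canvas rather than letting the browser upscale a 1x bitmap. The "chart" element id is made up; devicePixelRatio and the canvas APIs are standard browser features.

// Keep the CSS (layout) size fixed, but back the canvas with more pixels.
const canvas = document.getElementById("chart") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
const dpr = window.devicePixelRatio || 1;

const cssW = 300;
const cssH = 150;
canvas.style.width = cssW + "px";
canvas.style.height = cssH + "px";
canvas.width = cssW * dpr;   // 600 device pixels on a 2x display
canvas.height = cssH * dpr;  // 300 device pixels on a 2x display

// Scale the context so drawing code can keep using CSS-pixel coordinates,
// while the output uses every physical pixel and stays sharp.
ctx.scale(dpr, dpr);
ctx.font = "16px sans-serif";
ctx.fillText("Sharp on HiDPI, same layout on a 1x display", 10, 30);
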
Edited by JeffDM - 6/9/12 at 7:44am
post #94 of 115

Slimmer is a dumb move. These are laptops, not bread knives. I'd rather have them a bit thicker, with the extra space taken up by a larger battery. Having used the new iPad since it came out, I'm not interested in replacing my aging MacBook until there are models with 10+ hour batteries. I like being able to go several days without recharging.

post #95 of 115
Quote:
Originally Posted by Inkling View Post

Slimmer is a dumb move. These are laptops, not bread knives. I'd rather have them a bit thicker, with the extra space taken up by a larger battery. Having used the new iPad since it came out, I'm not interested in replacing my aging MacBook until there are models with 10+ hour batteries. I like being able to go several days without recharging.

Sure, but just as there is too thin (for me that's the MBA, because the battery life isn't long enough), there is also too thick, too light (again, for me the MBA), too heavy, etc. Once we get 24 hours on one charge I'll be wanting a week. After we get that I'll be wanting a month.

This is one area where I've come to trust Apple. They balance performance, usability and portability well.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #96 of 115
Quote:
Originally Posted by jragosta View Post


OK, now tell us how many MacBook Pro users play World of Warcraft. Darn few, I'd venture.
You don't seem to understand - people who buy Macs are generally not looking for the latest, greatest gaming experience.
 

Gamers for whom video power matters generally aren't Mac users, whether or not a given game is a decent experience on Apple hardware.  OTOH, the 15" MBP is THE most popular computer for pro photographers who use Macs, in my personal experience.  Since Aperture runs entirely off the GPU (except for exporting, where it doesn't use it at all), graphics performance does matter to a lot of these current and potential MBP buyers, though unlike in gaming, in Aperture/Lightroom it's a question of waiting two seconds instead of six for a preview to update.  Someone who has sometimes been waiting 10-15 seconds for updates (as on 2008 MBPs) might be thrilled to have that cut to 5 in a new laptop, but there is a huge community of users in the graphics world counting on graphics power at least competitive with what else is out there, as it applies to their time spent waiting.  There are some blazingly fast non-Apple laptops in use for graphics work (obviously not Aperture, but LR and PS) which beat the currently available MacBook Pros fairly handily.

 

I have no idea exactly how the new cards will stack up in real world use for Photoshop and Aperture, but the relative graphics power does matter to a lot of people and it will affect their decision even if they don't play any games.

post #97 of 115
Quote:
Originally Posted by JeffDM View Post

There is a HiDPI scheme offered for web sites, and the information on how to do it is out there. I expect the browser will detect that a site isn't HiDPI and scale it up accordingly. As for the rest, it's a wide open world; I don't know the specifics.

Yes, HiDPI is OFFERED for web sites, but not all web sites use it. In fact, I would venture that the majority do not use it - since there are not that many devices that will use it.

So, the argument that web sites will automatically be viewed in all their HiRes glory is not true for most sites.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #98 of 115
Quote:
Originally Posted by jragosta View Post
Yes, HiDPI is OFFERED for web sites, but not all web sites use it. In fact, I would venture that the majority do not use it - since there are not that many devices that will use it.
So, the argument that web sites will automatically be viewed in all their HiRes glory is not true for most sites.

 

The point is that it will be once Apple adopts it.

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #99 of 115
Quote:
Originally Posted by landilevente View Post

 

Games?

 

An Intel 3770K CPU with the HD 4000 GPU can barely handle World of Warcraft at 1920x1200: about 25 fps on the lowest settings in a deserted area built five years ago. And that's a desktop CPU, more powerful than the mobile versions. To be honest, we all know WoW isn't that demanding a game. Just imagine what frame rates we'd get at 2560x1800.

http://www.youtube.com/watch?v=UnYpLZGOMQc

I don't claim to know much about the inner workings of resolution independence or video game display technology but I don't think what you're outlining is how a HiDPI machine would work with a game.  Something like WOW probably has no 2560 x 1600 setting (someone who has it can comment, I'm sure).  I envision the game running at 1280 x 800 but each specific pixel in the game is displayed by four pixels on the screen to fill the screen up.  It can't take 4x the horsepower to do such a thing.

 

It doesn't make sense to me that Apple is going to come out with a display technology and associated hardware that together equals a massive fail.  These guys didn't fall off a turnip truck yesterday and decide to build an OS and computer.
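
Here is a minimal TypeScript sketch of the pixel-doubling idea above, purely for illustration; it says nothing about how WoW or any actual driver scales its output.

// Toy nearest-neighbour 2x upscale: every source pixel becomes a 2x2 block
// of identical pixels, i.e. "four screen pixels per game pixel".
// RGBA data, one byte per channel.
function pixelDouble(src: Uint8ClampedArray, w: number, h: number): Uint8ClampedArray {
  const dst = new Uint8ClampedArray(w * 2 * h * 2 * 4);
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      const s = (y * w + x) * 4;
      // The four destination positions covered by this source pixel.
      const targets = [
        ((2 * y) * 2 * w + 2 * x) * 4,
        ((2 * y) * 2 * w + 2 * x + 1) * 4,
        ((2 * y + 1) * 2 * w + 2 * x) * 4,
        ((2 * y + 1) * 2 * w + 2 * x + 1) * 4,
      ];
      for (const d of targets) {
        dst[d] = src[s];
        dst[d + 1] = src[s + 1];
        dst[d + 2] = src[s + 2];
        dst[d + 3] = src[s + 3];
      }
    }
  }
  return dst;
}
// Copying pixels like this is cheap compared with rendering the scene at the
// full native resolution, which is the point being made above.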

post #100 of 115
Quote:
Originally Posted by Hudson1 View Post

I don't claim to know much about the inner workings of resolution independence or video game display technology but I don't think what you're outlining is how a HiDPI machine would work with a game.  Something like WOW probably has no 2560 x 1600 setting (someone who has it can comment, I'm sure).  I envision the game running at 1280 x 800 but each specific pixel in the game is displayed by four pixels on the screen to fill the screen up.  It can't take 4x the horsepower to do such a thing.

It doesn't make sense to me that Apple is going to come out with a display technology and associated hardware that together equals a massive fail.  These guys didn't fall off a turnip truck yesterday and decide to build an OS and computer.


OK, but that way you're playing the game at 1280x800 on a much higher-res screen. It would just look like it does on the current 13" MacBook Pro. You can play anything at 800x600 too, but what's the point?
post #101 of 115
Quote:
Originally Posted by Tallest Skil View Post

The point is that it will be once Apple adopts it.

Oh, sure. Every web site and app in the world uses Apple technologies. No one would ever use anything like Flash, for example, because it isn't supported by Apple.

Where have you been living for the past 3 decades?
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #102 of 115
Quote:
Originally Posted by jragosta View Post
Oh, sure. Every web site and app in the world uses Apple technologies. No one would ever use anything like Flash, for example, because it isn't supported by Apple.
Where have you been living for the past 3 decades?

 

Mockery aside, do you believe for a second that the number of websites covered in Flash ads today would be as low as it is had it not been for Apple five years ago?

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #103 of 115
Quote:
Originally Posted by Tallest Skil View Post

Mockery aside, do you believe for a second that the number of websites covered in Flash ads today would be as low as it is had it not been for Apple five years ago?

No. But your statement that everyone will use technologies that Apple favors is absurd.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #104 of 115
Quote:
Originally Posted by jragosta View Post
No. But your statement that everyone will use technologies that Apple favors is absurd.

 

And it would be, had I said that.

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #105 of 115

Does nobody else posting on here actually use their existing MBP to make a living with fairly demanding applications? 

I use a BTO 2010 15" MBP for photography, 3D rendering, photo stitching and video editing. Much of which is used 'in the field', either freelancing in other agencies or on a shoot somewhere difficult to get to. Clients still want to take away copies of images or footage shot on location, and I'm damned if I'm going to cart around a bunch of 4Gb USB thumb drives to give away, when I can burn a DVDR for peanuts and give it to them… or burn a copy of the day's work to save my ass in case anything happens to my HD(s). If you can't redo a photoshoot then redundancy is essential… and I'd rather not carry around an external DVD burner as well as cables and an external FW800 HD…

 

I think there's still a demand for a Pro 15" model, with 16Gb RAM to keep up with 'heavy lifting' eg rendering; large PSDs/PSBs etc. I know Apple seem to be neglecting the pro market a bit these days, but you've gotta be living in cloud cuckoo land to think you can depend on just Wifi and iCloud; rather than Ethernet, FW800, a DVD burner, and perhaps a custom screen & GPU.

 

I may sound like an old greybeard, but most of the kids posting in this thread don't seem to understand the need to push a laptop to the max in order to make a living… some of us do. I think (hope!) Apple still realise there's a proportion of their users who need a powerful, fast, flexible portable laptop to derive their income. It's niche, but not nada…

PS I had a 17" MBP years ago. Frickin' enormous, bulky and heavy… I was much happier to replace it with a BTO MBP with a higher res matte screen… none of that glossy screen for me if possible. Yeah, I paid more but I got a great, hi res screen, and a powerful GPU. So if it goes from the lineup I won't miss it…

Nothing but sunshine, it's all sunshine ....
post #106 of 115
Quote:
Originally Posted by jobes View Post


PS I had a 17" MBP years ago. Frickin' enormous, bulky and heavy… I was much happier to replace it with a BTO MBP with a higher res matte screen… none of that glossy screen for me if possible. Yeah, I paid more but I got a great, hi res screen, and a powerful GPU. So if it goes from the lineup I won't miss it…

You had the pre-unibody in other words? To me the unibody 17" was a nice machine though I guess even it was too heavy?

post #107 of 115
Quote:
Originally Posted by Winter View Post

You had the pre-unibody in other words? To me the unibody 17" was a nice machine though I guess even it was too heavy?

Yups, pre-unibody. Still, it was pretty damn big… fine for a desk, but too big and heavy for travelling comfortably.

I don't mind a trade-off between size and specs, and the power difference between 15" and 17" became less obviously pronounced once the unibody design appeared, IMO…

Nothing but sunshine, it's all sunshine ....
post #108 of 115
Quote:
Originally Posted by Tallest Skil View Post

And it would be, had I said that.

You did.

I said:
"So, the argument that web sites will automatically be viewed in all their HiRes glory is not true for most sites."

Your response was:
"The point is that it will be once Apple adopts it."

The implication is that you thought that all web sites would adopt Apple's technology - which hasn't happened with existing technologies, so there's no reason it would happen with new technologies.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #109 of 115
Quote:
Originally Posted by jragosta View Post
The implication is that you thought that all web sites would adopt Apple's technology - which hasn't happened with existing technologies, so there's no reason it would happen with new technologies.

 

Except it has, with most sites, as you said.

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #110 of 115
Quote:
Originally Posted by 1984 View Post

Hopefully because the sticker is fake.  Otherwise we will get 1280x800 effective screen real estate on a 15" screen.

It doesn't work like that! The resolution is exactly what is implied. GUI elements are now drawn based on physical dimensions, not pixels, and apps can use those pixels any way they please.
post #111 of 115
Games are a good reference when it comes to discussing GPUs. You don't have to be a game player to benefit from knowing where Intel's GPUs stand performance-wise.
Quote:
Originally Posted by jragosta View Post

OK, now tell us how many MacBook Pro users play World of Warcraft. Darn few, I'd venture.
It makes little difference. Games simply give people a tangible feel for the discussion at hand.
Quote:
You don't seem to understand - people who buy Macs are generally not looking for the latest, greatest gaming experience.
That is very true, but it doesn't mean they don't need really good GPU acceleration. It is an extremely weak position to say people don't need good GPU performance because most don't game on the Mac platform. The reasons to want that sort of performance vary widely, even setting gaming aside.
Quote:
So, in the end, you don't really have anything to back up your claims.
He has performance comparisons, and they are valid. The issue is how these chips will perform if they have to drive Retina displays. The reality is such hardware may come soon, so there's no sense getting into a long discussion. I will say this, though: I wouldn't be surprised to find the Airs running the same screens as they do now, simply because staying with Intel would lead to a performance regression.
Quote:
IOW, you don't have any evidence to back up your claim.

As big an improvement as the new Ivy Bridge GPU is, it is not bleeding-edge performance. It doesn't even beat last year's AMD Fusion processors, much less this year's Trinity. There are good reasons to question the viability of driving high-density displays with Intel GPUs. Obviously we won't know what Apple's machines deliver until they are in users' hands. What we do know is that unbiased reviews show a good but not overwhelming performance increase over old Intel hardware. Note that the baseline there is old Intel hardware.
post #112 of 115
Quote:
Originally Posted by jobes View Post

Clients still want to take away copies of images or footage shot on location, and I'm damned if I'm going to cart around a bunch of 4Gb USB thumb drives to give away, when I can burn a DVDR for peanuts and give it to them… or burn a copy of the day's work to save my ass in case anything happens to my HD(s). If you can't redo a photoshoot then redundancy is essential… and I'd rather not carry around an external DVD burner as well as cables and an external FW800 HD…

 

I think there's still a demand for a Pro 15" model, with 16Gb RAM to keep up with 'heavy lifting' eg rendering; large PSDs/PSBs etc. I know Apple seem to be neglecting the pro market a bit these days, but you've gotta be living in cloud cuckoo land to think you can depend on just Wifi and iCloud; rather than Ethernet, FW800, a DVD burner, and perhaps a custom screen & GPU.

 

The internet's kind of past the CB radio stage - that is, it's here, becoming more ubiquitous and ain't going anywhere.  So, it is  part of the equation from now on.  Meaning....

 

.....No giveaway flash drives or ODD's (or iCloud itself) required - not with free services like You Send It - and further, creating and emailing downloadable links in DropBox is quicker than burning a DVD (yes I know only 2 GB is free - so use a SugarSync free 5GB account if that's the size you need).  I imagine you can use SkyDrive (7 GB) or Google Drive (5 GB?) as well.  And I'm fairly sure that if encryption's an issue, at least one of these services can handle uploading and downloading an encrypted file.  

 

There simply is NO real use case for "needing" DVD's in the field any more - unless you're out in the woods with no wi-fi or cellular hot spot and the client HAS TO HAVE THE FILES RIGHT NOW - as what pros are serving clients that lack internet and email?? So granting you everything else you mentioned, your next MBP can go on a diet by shedding its superannuated ODD - with little practical impact on 98%+ of users. Even if it means a few of the 2% are going to have to carry a light-weight $40 burner around in their gear pack (with a total weight likely no more than an ounce or so beyond current MBP's).

Apple may surprise me though, and keep it.  But I will be surprised.  

On the other hand, given the immaturity and cost of TB devices, cables and everything else TB - and the pro community's investment in FW800 gear, I DO hope that port sticks around for at least one more rev.  But I'm not totally confident on that.

An iPhone, a Leatherman and thou...  ...life is complete.
post #113 of 115

That's true… I use DropBox extensively, as well as other cloud services, but to do so 'in the field' requires connectivity I just haven't experienced. Mebbe in Silicon Valley, with an always on LTE signal and no data restrictions, but here in large parts of the UK we can't even get a 3G signal (and from what I know, that's also true in large swathes of the US and other countries too). Anyway I deal in gigabytes rather than megabytes of data for my jobs…


Your statement "There simply is NO real use case for "needing" DVD's in the field any more - unless you're out in the woods with no wi-fi or cellular hot spot and the client HAS TO HAVE THE FILES RIGHT NOW" amplifies my point… it's not an either/or, but using the best tool for the job, and what can be used…


If you're lucky enough to have an unlimited data plan, a good 3G or wifi signal then it's an option… but that kind of connectivity doesn't exist everywhere, especially for location-based work. Even when wifi does exist in hotels, bars and restaurants often the speed isn't great, and they stiff you for data usage. Burning a DVD costs me pennies, and I can do it anywhere.

Plus, it takes a lot less time to burn 4.4GB onto a DVDR than it does to upload it anywhere… and the same goes for backing up my work as a safety measure. Never mind potential data security concerns. Some clients I have would freak if they thought I was saving their data onto the cloud… even if it is secure, encrypted and so forth, it may be seen as an NDA/contract breaker.

 

Yes, you can use an external burner, but it's another bit of kit to shift about, and right now I like convenience. Plus, I reckon an internal burner is less likely to get damaged inside a unibody MBP than on its own :)

 

Don't get me wrong, I'm not hung up on 'old' tech, and I'm looking forward to being able to retire optical storage; but we're some way from having always-on superfast wireless data, an ecosystem of Thunderbolt devices, and cheap SSDs everywhere. I don't mind how Apple sometimes force users to move on from old kit (goodbye ADB, hello USB!) but don't let them strand power users in the process!

Let's see what gets announced later today… and whether they'll tempt me to get the credit card out, or bang my head against my desk :)

Nothing but sunshine, it's all sunshine ....
post #114 of 115
Quote:
Originally Posted by jobes View Post

That's true… I use DropBox extensively, as well as other cloud services, but to do so 'in the field' requires connectivity I just haven't experienced. Mebbe in Silicon Valley, with an always on LTE signal and no data restrictions, but here in large parts of the UK we can't even get a 3G signal (and from what I know, that's also true in large swathes of the US and other countries too). Anyway I deal in gigabytes rather than megabytes of data for my jobs…


Your statement "There simply is NO real use case for "needing" DVD's in the field any more - unless you're out in the woods with no wi-fi or cellular hot spot and the client HAS TO HAVE THE FILES RIGHT NOW" amplifies my point… it's not an either/or, but using the best tool for the job, and what can be used…


If you're lucky enough to have an unlimited data plan, a good 3G or wifi signal then it's an option… but that kind of connectivity doesn't exist everywhere, especially for location-based work. Even when wifi does exist in hotels, bars and restaurants often the speed isn't great, and they stiff you for data usage. Burning a DVD costs me pennies, and I can do it anywhere.

Plus, it takes a lot less time to burn 4.4GB onto a DVDR than it does to upload it anywhere… and the same goes for backing up my work as a safety measure. Never mind potential data security concerns. Some clients I have would freak if they thought I was saving their data onto the cloud… even if it is secure, encrypted and so forth, it may be seen as an NDA/contract breaker.

 

Yes, you can use an external burner, but it's another bit of kit to shift about, and right now I like convenience. Plus, I reckon an internal burner is less likely to get damaged inside a unibody MBP than on its own :)

 

Don't get me wrong, I'm not hung up on 'old' tech, and I'm looking forward to being able to retire optical storage; but we're some way from having always-on superfast wireless data, an ecosystem of Thunderbolt devices, and cheap SSDs everywhere. I don't mind how Apple sometimes force users to move on from old kit (goodbye ADB, hello USB!) but don't let them strand power users in the process!

Let's see what gets announced later today… and whether they'll tempt me to get the credit card out, or bang my head against my desk :)

 

FWIW 9 to 5 Mac is now reporting we're both going to get some of our wishes this am - two form factor MBP's (and new Mac Pros) - one with and one without a Superdrive.  If so, the transition seems set to continue for at least one more rev. And new MP's also indicate that Apple's still thinking about its harder core pro base.  Which I think makes sense for company image reasons - if they go all consumer because that's where all the volume is, even people who prefer Apple will think twice about whether or not there's a growth path for them if they move up to more demanding uses.  Losing that aura could cost them not only current frustrated pros, but "aspirational" users as well - and therefore some of Apple's buzz factor.  As in "Sure, kid, you can make great home or semi-pro movies on a Mac, but forget about high-end work.  You'll need....   ...Windows big boy hardware for that."  And I don't think that's a message they want to send.  

 

Apple is famous for margins and a focus on products that achieve them in volume - but there are other intangible factors they have to take into account that matter in the long run, and not losing users in your class is one of those - even if, as a percentage of Apple sales you're now fairly far down on the list.  Whereas dropping X-Serve was not nearly as much a threat to any strategic part of their business - since they're finally finding their way through the front door of the Fortune 1000 with droves and droves of iDevices - outflanking rather than infiltrating IT.  

Also, back to the matter at hand, you make some good points, thanks - and we didn't even discuss local storage - one thing that tempers my enthusiasm for a new superslim MBP.  I only work with stills and not huge ones (video is hobby only stuff) - and a likely practical max of 256 GB of SSD this year - with the still, yes, immature cloud options available is not totally appealing to me for a main machine - despite the performance advantages.

An iPhone, a Leatherman and thou...  ...life is complete.
post #115 of 115

Well, they're playing an interesting game with the new line-up announced today. As Bigpics said, they're certainly managing the transition by maintaining the two tier approach for now: catering for the bleeding edge and the incremental updates at the same time.

Interesting developments indeed. Think I'll have to wait and digest all the news from today… as with so many Apple announcements, the devil's in the detail…

Nothing but sunshine, it's all sunshine ....