Apple's A8 SoC likely carrying new 6-core PowerVR GPU, clocked at 1.4GHz with 1GB RAM


Comments

  • Reply 181 of 269
    v900 Posts: 101 member
    I used to have an Amiga 1200 with 2MB RAM, a 14MHz CPU and a tiny 40MB hard drive. It was a great gaming machine, and was awesome at multitasking: browsing BBSes while downloading files and unpacking them in the background, like it's nobody's business!

    A year later I traded it in for a 486 PC. It had FOUR times as much RAM, a CPU that at 66MHz was over four times as fast, and a staggering 420MB HD.

    However: While the graphics were much better, it didn't feel a lot faster, and any time I tried to throw two tasks at it at once, it completely choked, despite having quadruple the resources.

    RAM isn't just RAM, and you can't directly compare hardware across different CPUs, architectures (ARM v7 vs ARM v8) and operating systems.

    And Android is a resource hog, unlike iOS and WP8. While WP8 runs just fine on a 512MB phone, Android is an awful experience on hardware with less than 1GB.

    So besides marketing and spec-whoring, Android manufacturers have a good reason to pack their phones with RAM. They need 2GB or more to get the kind of performance people expect, while an iPhone runs excellently on 1GB.

    I'm sure that they have prototypes at Infinite Loop with all sorts of RAM combinations. And apparently, in their estimates, an upgrade to 1.5 or 2GB of RAM isn't worth it.

    We can only guess at their thinking, but it's possible that while the benefits of doubling the RAM might be marginal, the drawbacks aren't.

    Besides increased power usage, increasing the RAM would lead to increased fragmentation. It would also encourage developers to write less efficient code and be sloppier with system resources, which in turn means that many new apps wouldn't be compatible with popular models like the iPhone 4/4S and iPad 2.

    That's the major issue I believe: The benefits are marginal, but the drawbacks are increased fragmentation. Apple always puts the customer and customer experience first, and will go far to avoid the kind of fragmentation and quick obsolescence you see on Android.
  • Reply 182 of 269
    Using mobile DDR4 would prevent another new truth mantra from becoming the norm.
    We all just fed the bull some RAM. The horses have the trots as per usual.


    When there is a problem we send the people to be retrained in 1GB RAM appreciation.
    And obviously that retraining machine also acts as an Orgasmatron.
  • Reply 183 of 269
    melgross Posts: 33,600 member
    plovell wrote: »
    I'm not sure of your question. It will compile source to either x86 or to ARM. But it won't convert, for example, x86 binary to ARM binary. Is that what you meant?

    But that's what Rosetta2 would be for :)

    What I meant, and I should have been clearer about this, was: would it convert an app that was specifically designed to use the features of one CPU family to take advantage of the very different features of another CPU family? This is where things get hung up. I saw plenty of software being described as "taking just a weekend" to port over, and seeing it on stage, but then not coming out for six to eight months afterwards, because of all the work that needed to be done after the basic port was finished.

    I remember Rosetta as being slow. Also, it wasn't a universal solution, as this bit from the Wiki shows:

    Rosetta is part of Mac OS X for Intel operating system prior to Lion. It translates G3, G4, and AltiVec instructions; however, it does not translate G5 instructions. Therefore, applications that rely on G5-specific instruction sets must be modified by their developers to work on Rosetta-supported Intel-based Macs. According to Apple, applications with heavy user interaction but low computational needs (such as word processors) are well suited to translation via Rosetta, while applications with high computational needs (such as raytracers, games, or Adobe Photoshop) are not.[5] Pre-existing PowerPC versions of Apple "Pro" media-production applications (such as Final Cut Pro, Motion, Aperture, and Logic Pro) are not supported by Rosetta and require a "crossgrade" to a universal binary version to work on Rosetta-supported Intel-based Macs.
  • Reply 184 of 269
    melgross Posts: 33,600 member
    65c816 wrote: »
    You do remember the A7 was introduced as a mobile chip with desktop class performance...?

    I don't remember it being said to have Desktop class performance. I remember them calling it a Desktop class chip, which isn't exactly the same thing.
  • Reply 185 of 269
    melgross Posts: 33,600 member
    wizard69 wrote: »
    I'm shocked that you are so negative here!
    They barely budged the clock rate and are getting 25%; if you believe Apple's numbers, that is pretty damn good.
    Power reduction was rumored to be the primary focus for a good part of 2014. It is a good move by Apple frankly, as it looks like it allowed them to drastically improve GPU performance with little if any hit to the power profile.
    Which might be easy. The key point here being that we don't know what the maximum clock rate is on these chips. Based on other similar hardware I suspect Apple could take this chip well past 2GHz. I wouldn't be surprised to see double the performance as an easy target for Apple.
    Or Apple could say no emulation here and support running iOS apps in a window to cover themselves for the switch-over period. The iPad clearly has proved that most of Apple's customers have no need at all for emulation.
    It certainly could work but I don't see Apple wasting time on such emulation hardware.
    You are assuming here that x86 is a big deal for the majority of Apple users; it isn't. Mind you, I'm one that has to have Windows machines at work because of software and hardware ties. That isn't the majority of Apple users however.
    How much they can raise the clock rate is unknown, but we can take guesses at what might be possible. I would suggest that based on similar hardware 2GHz would be easy. How much faster beyond 2GHz remains unknown. If Apple can realistically double the clock rate to something like 2.8GHz and not sustain huge performance issues, then you have doubled your performance right there.

    However, clock rate isn't everything; by far the biggest throttle with respect to performance on APU-type chips is the interface to RAM. Apple has several options there to improve RAM performance.
    It isn't a pipe dream, Apple could simply demand that apps support both architectures for continued inclusion in the App Store. That right there is a powerful incentive.
    Xcode already supports cross-compiling, so that is not a problem.
    If the vast majority of apps follow Apple's guidelines, running on ARM will not be a problem. Sure, some debugging will take place, but if the apps use Apple's APIs as recommended, that will be a minor issue. We may need to look at the various attempts to get apps to run on ARM-based Linux systems as a hint as to how serious the app situation will be. For the most part apps aren't a problem. Getting the OS and libraries working is, though, but that would all be on Apple.

    In the end the big question about A8 in a Mac comes down to performance, in this respect we don't really know how the chip will perform at laptop power levels. I wouldn't count it out though as Apple could triple wattage and still have a very viable laptop chip. I'm guessing triple the wattage here would result in a 15 watt max chip.

    By the way they don't need to exceed i5 performance levels for this to work. All they need to be able to do is to offer a machine that comes close and is $300 cheaper.

    Sorry, but I just have to disagree.

    Last year, along with the year before, they barely nudged the clock and achieved double the performance. While I didn't expect to see that again (though I kept my fingers crossed), I did expect to see 50% on the CPU, and 75% on the GPU, so yes, I'm disappointed.

    As for clock rate, I already went through that. There are two types of CPU designs. One sends two, or at most three, instructions through at once, and needs, and uses, a high clock. The other is wider, sending four or more instructions through at once, and uses a lower clock. You can't simply assume that the clock can be raised that much. We saw a lot of predictions for the A8 that stated that not only would the clock be above 2GHz this year, but that it would be 2.7GHz. I spent a lot of time on those sites disabusing them of that notion, and stated very definitely that the clock would be boosted just a bit, if at all, and I was right. Raising clock speeds à la Intel's NetBurst was a failed concept. The only reason Qualcomm is getting away with it is because their performance isn't all that high yet. When clocks get to the mid-3GHz range, they will have the same problems that Intel had. By staying with a wider CPU, Apple is steering very clear of those problems. Remember that Intel's NetBurst Prescott chip, the last in the line, had just a 3.8GHz clock, though with extreme cooling solutions it could go a bit higher than 4GHz. Even today, the highest performing chips are just about back to that clock. And they use 140 watts. Keeping clock speeds low is a key to keeping power use low, and heat down.
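    The wide-core versus high-clock trade-off above is easy to see with a back-of-the-envelope peak-throughput calculation. All figures below are hypothetical, just to illustrate the arithmetic:

```python
def peak_gips(clock_ghz: float, issue_width: int) -> float:
    """Rough peak throughput in billions of instructions per second:
    clock rate multiplied by how many instructions issue per cycle."""
    return clock_ghz * issue_width

# Hypothetical figures: a wide, low-clocked core vs a narrow, high-clocked one.
wide = peak_gips(1.4, 6)    # a 6-wide core at 1.4 GHz
narrow = peak_gips(2.7, 3)  # a 3-wide core at 2.7 GHz
print(wide, narrow)  # the wide design matches the narrow one at half the clock
```

    Real-world throughput depends on how full the pipeline can be kept, but the sketch shows why a wide design doesn't need a NetBurst-style clock to compete.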

    It's not that easy to just remove heat on smaller processors. While the overall heat is less, the heat density is higher, in other words, coming from a smaller area. While I do think that Apple is doing the right thing, I'm still surprised at the performance increase.
  • Reply 186 of 269
    melgross Posts: 33,600 member
    wizard69 wrote: »
    It looks like more than a tick. Of course with Apple's secrecy it is hard to tell. However, you don't double transistors on a tick. Beyond that, the A series highlights the importance of specialized execution units in modern SoC design. Performance is no longer measured by simply profiling the CPU. Even with Intel's hardware, the die space dedicated to the CPUs is rather tiny these days.

    Honestly I'm sometimes more excited to wait for the tear downs than the actual device. It will be very interesting to see where all of those extra transistors go to.
    This is possible. For example, a quad-core chip should be fairly easy with this architecture; they could deliver such a chip for a laptop on this year's process and shrink it next year for tablets and phones. The cores themselves wouldn't add a lot to the power profile of the chip, but supporting them might. In the end a lot of power is dissipated in the RAM interfaces and caches keeping lots of cores running. I don't think this reality is lost on Apple.

    Yes you can double transistors on a tick. It depends on what they are used for. A tock, Intel style, is a major architectural design change, and unless we get evidence of that from Anandtech when they get to do their usual very complete job, we can't say that Apple did that. Specialized execution units, if they aren't part of an architectural design change, are just additional units.

    As usual though, we'll never know what all those transistors will be for. We still don't know what a third of the A7 die is used for, and it's been out for a year.
  • Reply 187 of 269
    melgross Posts: 33,600 member
    wizard69 wrote: »
    When was the last time Intel delivered 25%?
    Melgross, I'm shocked that you are taking this attitude and are so focused on CPU performance. For one there is a substantial increase in GPU performance with this chip. We are getting a new camera processor and apparently new encode and decode hardware for video. It actually looks like an entirely new chip.

    Now I'm not saying that CPU performance isn't important in a Mac OS device. The problem is that isn't the whole equation these days. The initial feel here is that this chip would deliver video conferencing capabilities well beyond what the CPU numbers might imply. Maybe it isn't a usage you have but this chip ought to be able to provide video conferencing support that would be hard to approach on a quad core if you didn't have the dedicated hardware. In effect the chip can deliver better than quad core performance in some of today's more popular uses.
    Apple has the hottest selling cell phone made; they have lost very little. This time next year will provide us with real data about the sustainability of the large cell phone market. I really don't think it is as hot as some imply. If the 6 Plus becomes a huge hit we are wrong, but I don't see the majority of people wanting to carry around these huge cell phones all the time. A year's worth of sales figures ought to highlight whether 6 Plus-sized phones have a long-term future.

    By the way, I approve of Apple's largish cell phone, as some people can justify the big device. I just don't see it as a major portion of Apple's long-term sales. I can also see people dumping the 6 Plus after realizing that it is more of a problem to carry around than it is worth. In the end I do wonder why it took Apple so long to offer up a real lineup of phones, and I'm frustrated that an updated iPhone 4-sized device hasn't arrived. Size isn't Apple's problem; it is rather the lack of effort to maintain a portfolio of high-performance phones.

    I don't know why you are so shocked. This is a considerable decrease in performance advancement. I think that just admitting that will get us through a lot, and then we can get to talking about what Apple's thinking about this might be.

    I'm already seeing some early performance tests, and these phones aren't nearly as far ahead of the newest competing phones as were Apple's 5 and 5S. With competing devices getting features that Apple isn't bothering with, performance is one major area which we could still talk about as being Apple's to lose. I don't want to see them lose it.

    And yes, Intel doesn't get 25% performance out of ticks or tocks, and that's what I've been excited about with the "A" series. With Apple moving their chips so rapidly, they have been gaining on Intel in big leaps. The A7 is close to the highest performing Atom Bay Trail. If Apple were able to double performance this year, it would be competing with lower-end ultra-low-power i3s. That would have really been something! Using two of these chips together would give performance about the level of lower-series ultra-low-power i5s, and that would really be something! And that's why I'm disappointed. It looks like the new Atom line might pull away again, with Intel claiming close to a 50% performance increase.

    As far as size goes, you must know by now that 5"+ phones are the biggest selling phone category in China, are moving up to that in India, and have become pretty popular here over the last two years, so yes, I do think Apple needs that. In fact, I'm considering buying one myself early January when our two years is up. I would never have believed I would do that two years ago, but the reasons, for me, are becoming very compelling. I'm surprised Apple didn't up the clock another 100MHz on the 6 Plus, as cooling isn't as much of a problem, and the bigger screen could use the extra boost in performance, as I'm sure we'll see with the iPad, as usual. Unless Apple does an A5X-style variant and adds more GPU units.
  • Reply 188 of 269
    melgross Posts: 33,600 member
    wizard69 wrote: »
    When you post like this all you are doing is displaying to the entire world your ignorance about computer systems.
    Sure it can be because RAM's impact remains the same. As long as RAM remains the same size the issues will remain the same. Oh by the way yes some of us are on iOS 8 betas and know very well what works well in those betas and what has been improved. Safari has been improved in many ways actually but still suffers from RAM problems.

    If you had any understanding of computer systems you would realize that even if Apple put a 12GHz CPU in the machine, it would still suffer from not having enough RAM. The performance of the various processors in the machine does not make up for the lack of RAM.

    I keep saying that I'm going to join the developer program just to get my hands on Xcode and the betas, but I'm too busy, unfortunately.

    What I would like to have seen is memory compression, as we have it with Mavericks. It seems to work well enough, though it doesn't solve all the problems of too little RAM. But as no one has mentioned it for iOS, I'm assuming that it isn't there. It would have been interesting if it were, though I suppose Apple would have mentioned it during the intro.
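    The idea behind memory compression is simple enough to sketch. This is a toy illustration using zlib, not how the actual OS X compressor works: instead of paging an inactive region out to storage, keep a compressed copy in RAM and decompress on the next access.

```python
import zlib

# Toy illustration of OS memory compression: an inactive "page" is
# compressed in place rather than written out to slow storage.
page = b"user data " * 400            # a ~4 KB page of fairly repetitive data
compressed = zlib.compress(page)      # inactive: keep a much smaller copy in RAM
print(len(page), len(compressed))     # compressed copy is a fraction of the size
restored = zlib.decompress(compressed)  # active again: restore on demand
assert restored == page               # lossless round trip
```

    Decompression costs some CPU time, but it's far cheaper than flash I/O, which is why the technique effectively stretches a fixed amount of RAM.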
  • Reply 189 of 269
    melgross Posts: 33,600 member
    wmsfo wrote: »

     


    What about the specialized GPU in the A8 chip? You stated that Apple is very good at using specialized processors. Why would Apple put a specialized GPU in a device if it did not help them with the "user experience"?

    Does it not stand to reason that Apple may have evaluated several "points of emphasis" and decided that 1GB of system RAM was sufficient for the points it wants to emphasize?

    Everyone commenting about the amount of system RAM has not provided specifics on where more system RAM would be desirable; just saying Safari would work better is NOT a specific. A specific would be "more system RAM allows the CPU to retain more system calls without ...".

    As you have not given specifics, rather a general statement ("RAM directly impacts the performance of Apple's iOS devices; that is the only concern Apple should have"), I will ignore your other comments.

    If you have some specifics, please enlighten us.

    We've had a number of threads about this. Too bad you weren't there.

    So no, it's not just Safari. And there, it's not just RAM that's a problem. But when some of us run video editing, photo editing, and 3D CAD, performance is a big issue. More RAM allows bigger files to be used, and more of them.

    The same thing is true for games. Big 3D games use a lot of RAM. Since the GPU shares RAM space, that limits not only CPU access, but GPU access. That means textures, normally held in RAM, are of lower quality. It means that shaders and antialiasing aren't as good. It means a lot of things, including frame rates.

    It means that Apple's more inclusive multitasking that we see year after year, and the big boost this year, will require more RAM.

    So yes, RAM can be an issue. But 1GB is likely enough for most people most of the time, particularly for the 6. But I'd like to have seen 2GB for the 6+, just like I would for the iPads.

    Despite what Apple says, they only make the Mac Pro for high-end users. For the rest of their lineup, it's for mainline users. The same thing is true here. Apple is balancing what THEY think most people will be satisfied with. They're likely right, for most users. But for those of us that do more, it's just barely acceptable.
  • Reply 190 of 269
    melgross Posts: 33,600 member
    It's called an iPad.

    Glad that Apple could make your dreams come true!

    Uh, no. Sorry Benji, but I use a Mac Pro, and my iPad is nowhere near my Mac Pro. In fact, it's nowhere near my MacBook Pro either.
  • Reply 191 of 269
    melgross Posts: 33,600 member
    No it isn't, and Apple would agree with me. An iPad and a Mac are for very different uses, with a lot of obvious overlap. I don't think I will be editing one of my 4K videos, with many TBs of Thunderbolt RAID storage and three screens, on my iPad anytime soon, even though I read email, Netflix (used as a verb) and web browse on both.

    Sorry to disagree, but as I have stated earlier, my reason for asking about an Apple foray into desktop CPUs and GPUs (not just mobile chips) is purely based on the premise that Apple should now control everything. Tim just said Apple controls its own hardware and software, but in fact that isn't totally true as long as the GPUs and CPUs in a Mac are not made by Apple. I say screw Intel, Nvidia and AMD and the ability to run VMware with Winblows; it's time to drop legacy crap.

    It's interesting that you mention Apple controlling their own technology, because supposedly that's exactly what Apple claims to do: control their own technology, software and hardware.

    But really, they haven't done a very good job of it, have they? They've had a lot of opportunities to buy companies with technology they need, including software. Yes, those companies would have cost billions. But for ten years now, Apple could have afforded it.

    So we can start with Navigon, which produces maps. Apple could have bought them going into the first iPhone, but went to Google instead. They could have bought Skype, but Microsoft did later, for much more money than Apple could have picked it up for. They could have bought Nuance when Siri first came out, but didn't.

    In fact, before Google went public, they were going around trying to sell themselves for $5 billion. They approached Apple but...

    Imagine what would have been if Apple had bought Google!
  • Reply 192 of 269
    melgross Posts: 33,600 member
    Melgross was asking for an iPad; I gave it to him.

    If you want a heavier, thicker, bigger, less portable computer with a fan, then you want a Mac.

    No, I was not, and you know I was not. You're just trying to be a wise guy here. You know very well what we're talking about.
  • Reply 193 of 269
    melgross Posts: 33,600 member
    Good to know but does that preclude Apple from having developed their own CPU capable of running OS X ?

    It doesn't. I would be very surprised if Apple isn't at least working on one on paper, so to speak.

    It's very possible that the A8 is the second generation of a design that is eventually, with some possibly major modifications, intended for a low-cost Mac. But Apple takes its time. Sometimes too long.

    It's possible that they're trying to get needed performance and power draw in balance. Some compromise that would be acceptable to that particular user. As I've been saying for a while, if they could include the instructions that give the biggest emulation problems, this could work without needing Adobe, Microsoft and others to rewrite their software.
  • Reply 194 of 269
    melgross Posts: 33,600 member
    Couldn't agree less. He's a pretentious, condescending pissant, who I would bet any amount of money doesn't know 1% as much as he'd like you to think he does.

    This crap about accusing everybody but him of "not knowing how RAM is used" really pisses me off. Apparently he, the great expert, is not aware of how RAM works. Kind of a remarkable lapse of knowledge for the Great Guru!

    Hey, Wiz! RAM is absolutely the only thing in your phone that you can't turn off. It has to be refreshed thousands of times a second! You can stop the CPU, the GPU and the backlight, but the second you stop refreshing that RAM you've lost everything and have to reboot the phone from scratch.

    In a desktop or laptop, the more RAM the better, but with the extremely limited energy available to a phone, you need to use as little as you can possibly get away with. Kind of amazing that one of your enormous knowledge wouldn't be aware of this.

    IOW, get bent.

    I just gave you an infraction for this post. Refrain from doing this again.
  • Reply 195 of 269
    melgross Posts: 33,600 member
    v900 wrote: »
    Well, that's certainly an interesting rewriting of recent history.
    But hardly based in reality, unless the only benchmark for "better" is "more" and "faster", which of course is nonsense.

    Look at the CPUs Apple typically put in their cellphones. You typically see CPUs with more megahertz and with more cores in Android handsets. Bigger and more is better, right?

    Nope. Apple's CPUs get much more done per MHz, so an A6 CPU at 1.2GHz is faster than most of the CPUs clocked at 2GHz in other phones. Phone workloads are also badly suited to multicore execution, which is why Apple sticks to a dual-core design instead of the quad-core Krait/Exynos/etc. CPUs.

    They could easily have stuck a quad-core CPU at over 2GHz in the iPhone 6, but the only advantage would have been in terms of marketing. Apple chose differently, in order to make a faster, more efficient phone with longer battery life.

    The same can be said about iPhone screens. Apple could easily have stuck an HD or Full HD screen in the iPhone years ago. Or made a 5- or 6-inch screen. (Heck, Dell released 4- and 5-inch phones in 2010; big-screen phones are not a recent development.)

    So why didn't they? Because the trade-offs were too big. Battery life and one-handed ease of use would have to be sacrificed, and that for quickly diminishing returns. Very few people can tell the difference between an HD and a Full HD screen on a cellphone, and once screen PPI gets above 320, as in the iPhone, it's hard to tell the difference between 350 and 400 PPI.

    Competitors like Samsung, LG or Motorola don't have Apple's advantage of controlling the whole ecosystem, from OS to store to hardware. That means it's harder for them to make a good, solid product and distinguish themselves from the competition.

    Instead, they're forced to play the spec game and fake progress by throwing specs and gimmicks around. They can't make a better phone from year to year, so instead they throw a slightly bigger screen and a 2.2GHz CPU in last year's 2GHz phone, tweak the design a little bit and call it a day.

    As for Apple losing sales because of screen size, that's very dubious. You forget the huge part of the market that prefers smallish, pocketable phones, which is just as big, if not bigger, than the market that craves huge screens.

    After all, the phablet is primarily a hit in Asia, and the only company making profits there is Samsung. (Many other companies chase that market though, since the premium phone market mostly belongs to Apple.)

    Phones above 4 inches, and especially phablets are still just a small part of the market.

    Really? Show where it's a rewriting. It's not, you should know. What I said is exactly correct. If you don't know that, then you should look back. You seem to have a very limited way of looking at things. You really think that Samsung's phones haven't cut into Apple's sales? How naive is that? Right now, one third of the market here, in the USA is in phones larger than 4.7".
  • Reply 196 of 269
    melgross Posts: 33,600 member
    popinfresh wrote: »
    You forget the things Apple has put in place to support such a move. Xcode can already compile for x86 and ARM, so it's not a stretch that they could make something write-once, compile-for-both-architectures. Also, let's not forget Swift. This may be something they designed around when making the new language, with foresight toward an ARM and Intel OS X product mix. Gatekeeper and the push toward Mac App Store distribution is another piece that would tend to support this.

    I wouldn't say "not even close," as I'm sure they could design in parallel a different series of chip that is based on a similar core but is more powerful and designed around OS X. This could happen relatively soon for something like the MacBook Air, which is more suited for consumption rather than production. I would, however, be very surprised by an entire transition to ARM like we saw with the move to Intel. I don't see Apple making an ARM-powered Mac Pro any time soon.

    The potential is there, but it all depends on whether they can get "good enough" performance for things like the new Photos app, iMovie, etc., whether this performance can be achieved for less cost than with Intel, and/or whether it provides a significant boost to efficiency resulting in significant battery-life improvements.

    -PopinFRESH

    I see what Apple is doing. And some of it is pretty good. But I was talking about this SoC in particular. And as this SoC goes, it's not even close! Understand that Apple uses an ultra low power i5 for their cheapest, and lowest performing Macbook Air. The A8 isn't even in the same continent in performance.

    Are they working on another chip that could be? Sure. But we're not talking about some mythical chip. We're talking about one that exists.

    People have to stop defending Apple while using products that don't exist as an excuse to defend them. Apple may not have any intention of ever using ARM for OS X, and we have to keep that in mind. Just assuming that they do is disingenuous.

    So we talk about it here because it's fun to do so, without any pretense that any of this discussion has any basis in reality. But we are using real chips to base our views upon. If someone wants to come into the discussion with secret chips that Apple has in their labs, then the discussion veers off into unknown territory, and ends up in the realm of fantasy.

    So I'd rather discuss this with what we know Apple has, and how it can be modified realistically, rather than to talk about chips that very well may never appear.
  • Reply 197 of 269
    melgross Posts: 33,600 member
    Dumb question???

    These A8 chips (and A8X) are very inexpensive compared to Intel chips. Would anything be gained by using 2, or even 4 A8s instead of a single multicore Intel chip? Isn't that what Intel does on its Xeons?

    When this publication, and others, were discussing this new chip, well before it was announced, as being a four core design, I disagreed, and stated that it would, again, be a two core design. I'm on record with that.

    The reason I gave, and I'm still giving, is that per-core performance is very important. We've seen that with Apple the past couple of years, where their iOS products perform very well, beating out their competitors, though losing a bit in multicore performance, which seems to have less of an effect on device performance as a whole. Last year, Anandtech called the A7 CPU core "a computing monster."

    My feeling was that if they did come out with four cores, per core performance would suffer. Seeing the performance boost this year, I think that opinion is justified. But, when we look at most computers running OS X or Windows, how many cores do we see? Simple, either two, or four. Only high performance computers use more.

    A good reason is that most software still doesn't perform significantly better on more than two cores. Some does, on four cores, but it's very rare, and only with specialized software, that we see real improvements on more than four cores. Even multitasking does well enough on four cores.

    So where does that leave us? If Apple had come out with a four-core SoC, then that would be it. But coming out with a two-core SoC leaves open some interesting possibilities. Binding two together would double performance, and still use just four cores, the maximum for most computers I was just talking about.
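    The diminishing returns from extra cores described above follow directly from Amdahl's law: the serial fraction of a program caps the overall speedup. A quick sketch, using a hypothetical workload that is 60% parallelizable:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup is capped by the serial fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical workload that is 60% parallelizable:
for cores in (1, 2, 4, 8):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
# Going from 1 to 2 cores helps a lot; going from 4 to 8 adds comparatively little.
```

    With a 40% serial fraction, speedup can never exceed 2.5x no matter how many cores you add, which is why fewer, faster cores often win for typical software.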
  • Reply 198 of 269
    melgross Posts: 33,600 member
    nolamacguy wrote: »
    a 25% increase is hardly "measly". want proof? if your boss were to offer you a 25% salary bonus, would you call it measly? nope.

    the fact is, as mobile processing matures the low-hanging fruit will be gone, and increases are going to get slimmer, just as they have w/ desktop class processors.

    I'm not sure what one thing has to do with the other. Fortunately, I WAS the boss.

    But did you really read my post? I'm talking in reference to the fact that for two years running, Apple doubled performance of their SoC, and the year before that, it was a 50% improvement for the CPU. When compared to those very significant improvements, yes, 25% is measly.

    We have to look at this in reference to what they wanted to do this year. In the past, they kept the power draw about the same, even increasing it a bit YOY. Performance was the parameter they were going for. This year, they went for efficiency as well, for the first time. That prevented more performance increases.
  • Reply 199 of 269
    tht Posts: 5,608 member
    Quote:

    Originally Posted by CanukStorm View Post

     



    But what if they moved to 2GB of RAM using the more power-efficient LPDDR4 RAM, which is supposed to be about 40% more power efficient than the RAM they're currently using? Wouldn't that help Apple achieve their objectives?


     

    I'm not saying Apple isn't adding another GB of DRAM because of the power draw from the DRAM. That's peanuts. 10s of milliwatts.

     

    I am saying that by having another GB of DRAM, users will end up running more background processes resulting in more usage of the CPU and wireless radios. 100s of milliwatts. The benefit of running all those extra background processes is questionable for Apple's mass market customers versus having the system last longer.

     

    More RAM is an inevitable thing. Not in this year's iPhones, but come 2015 there will be more and more pressure on Apple to add RAM, because of system software improvements, larger-memory-footprint software being used on smartphones more and more, and heavier user workloads.

  • Reply 200 of 269
    melgross Posts: 33,600 member
    nolamacguy wrote: »
    when we reach that point, sitting on them likely wont do any damage. probably less than sitting on a 6+ in your back pocket.

    You really shouldn't be putting your phone in your back pocket in the first place.