Bugs in Qualcomm's 64-bit Snapdragon 810 may force Samsung to use its own Exynos chips in Galaxy S6


Comments

  • Reply 81 of 94
    richl wrote: »
     
    Well I'm glad you finally learned something from reading an AI article RichL!

    I've learned many, many things from reading AI. It's only your articles that I find misleading to the point of being worthless. It doesn't help that you defend them with insults under a pseudonym.

    So worthless that you find the time to respond to them regularly.
  • Reply 82 of 94
    pk22901 Posts: 153
    Quote:

    Originally Posted by cnocbui View Post

     

    ...

    The already-released Samsung Note 4 comes in two variants: one with Samsung's own Exynos 5433 64-bit processor, running in 32-bit mode (probably because 64-bit Lollipop isn't out yet), and one with Qualcomm's Snapdragon 805. In benchmarks, the Exynos 5433 variant seems to outperform the Snapdragon 805.

     


     

    What does it mean for the processor to be running in "32 bit mode"? Can the 5433 execute 32 and 64 bit code side by side or do you need to run it with one or the other?

  • Reply 83 of 94
    cnocbui Posts: 3,613
    Quote:

    Originally Posted by pk22901 View Post

     

     

    What does it mean for the processor to be running in "32 bit mode"? Can the 5433 execute 32 and 64 bit code side by side or do you need to run it with one or the other?




    AFAIK it is a 64-bit chip being run in 32-bit mode because KitKat isn't 64-bit. I don't know whether you could run the two simultaneously, but I doubt it.

    http://anandtech.com/show/8537/samsungs-exynos-5433-is-an-a57a53-arm-soc

  • Reply 84 of 94
    melgross Posts: 33,599



    See the blue-highlighted responses made by @Corrections in the posts preceding yours. @Corrections is a sock puppet for DED, through which he is allowed to attack and denigrate anyone who challenges DED's agenda or version of the facts.

    When he sticks to the facts, I appreciate DED's writing as much as anyone. But when he goes off on a rant, or attacks the legitimate disagreements of other posters (using an alias) ... it sullies the whole reason for the article and destroys the discussion.

    I can understand giving an author some freedom to express his biases and his agenda in an article. But I do not believe that gives the author the right to anonymously attack those who disagree in the discussion portion of the thread. Normally, after a few of these @Corrections attacks in a thread, I will just move on to another article or another web site. Sometimes, though, I will complain (like this post) in the hopes that the Editors at AI will clean this up -- it does not meet the otherwise high standards of AI discussions.

    Ok, at least now I understand what's going on there. How does one know that he's really DED? I've argued a couple of times with DED on his website some time ago, and yes, he's really closed off to anything he didn't come up with himself (sheesh, he's even worse than me!), but unless someone's doing some IP snooping, I don't know how you can be sure they're the same person.

    I can imagine that someone here is such a fanboy that the fanboy writing of DED is something (s)he really wishes were all true.
  • Reply 85 of 94
    melgross Posts: 33,599
    qo_ wrote: »
    Re champing vs chomping, I see it's been abused enough within the last few years that chomping is winning through sheer volume of misuse. So, in a 64-bit world, it appears champing has been usurped.

    Language changes over time. The first one I really didn't like was when people began saying "so fun". I was hoping it would go away, but when my daughter's teachers began using it when she was in elementary school in the mid-to-late '90s, I just knew it would stick around. I still don't like it.

    But I'd never heard "champing at the bit" until I read, somewhere, that chomping was wrong. Since it's really just an idiomatic expression, it doesn't matter either way. The truth is that champing sounds like British English, while chomping sounds like American English.
  • Reply 86 of 94
    melgross Posts: 33,599
    So worthless that you find the time to respond to them regularly.

    Benj, let's not expand this please. If something is going on here that's not correct, we'll do something about it.
  • Reply 87 of 94
    noivad Posts: 186
    Quote:
    Originally Posted by Dachar View Post



    Don't have all of your eggs in one basket. It seems that Apple's strategy of flipping between various key part suppliers and spreading manufacturing amongst a number of companies will give it greater long-term protection than other phone companies who appear to rely on one supplier for key parts. On top of this, if Apple's plan B is development of key chips in the background, they would seem to have outmaneuvered the competition. My only concern is that if this strategy eliminates competition, will Apple lose the impetus to innovate, or take advantage of its monopoly position by reducing production and raising prices?



    Considering that Apple doesn't try to compete on price with anyone (instead relying on iOS's and OS X's quality to command a premium), and that Apple's customers are in it for the looks/prestige of the devices, the performance, or both, it already charges what the market will bear. Also, given Apple's history of setting price points and iterating new products within those price points every year, I would say there is no chance they would change their strategy now, even if they gain a monopoly position in the high end. The other manufacturers are not going to go away, and new competition in the low end will rise from Chinese companies; and despite the dynamically different target market, Apple will still have to stay ahead of the curve not only in performance but in design.

     

    The only way any of this would change is if the Apple board turned over and ousted Tim, which is not likely considering he has continued the upward climb that Jobs started without faltering. No company has ever switched gears while firing on all cylinders and still accelerating with headroom to spare. More likely, Apple will continue to set the trends that other companies follow by moving into or creating new viable markets, as it has been doing for the past decade-plus. Before Apple came out with the iPod, MP3 players were a niche; before the iPhone, smartphones were a niche; and before the iPad, tablets were a niche. While Apple didn't invent these product categories, it did show the industry how to make them appeal to the public, which is why it maintains its lead as far as having a solid architecture and continued profit margins.

     

    Android might sell 50%–80% more, but most of these companies are in the red on their mobile sales, and only a fraction of that majority are using Qualcomm's higher-end Snapdragon processors. This means that the industry will continue to push the mediocre while Apple further erodes the profitable cutting-edge sales. The thing is, none of these companies can give up despite their losses, because that would cede the market to Apple. What will probably happen is that Apple will continue to make inroads toward setting the de facto standards for mobile content, and the other companies will chase their tails and keep funneling in resources from profitable sectors (advertising and other consumer markets). The press will report Apple's minuscule market share like it always does, citing Android's dominance, but neglect to compare profitability and capability 1:1 (like every review that has to cite 5 different Android phones to show how they are better at the 5 things Apple's one phone does). This won't mean the competitors will all die, but it does mean that they will make less and less of a difference to Apple's profitability without some major shift or real innovation. Considering how Droid makers think, valuing quantity over quality, this environment won't change any time soon.

  • Reply 88 of 94
    melgross wrote: »
    qo_ wrote: »
    Re champing vs chomping, I see it's been abused enough within the last few years that chomping is winning through sheer volume of misuse. So, in a 64-bit world, it appears champing has been usurped.

    Language changes over time. The first one I really didn't like was when people began saying "so fun". I was hoping it would go away, but when my daughter's teachers began using it when she was in elementary school in the mid-to-late '90s, I just knew it would stick around. I still don't like it.

    But I'd never heard "champing at the bit" until I read, somewhere, that chomping was wrong. Since it's really just an idiomatic expression, it doesn't matter either way. The truth is that champing sounds like British English, while chomping sounds like American English.

    "for free"

    and

    "floundering"


    Like Wow!

    We communicate on so many different levels/threads :D
  • Reply 89 of 94
    melgross wrote: »
    Ok, at least now I understand what's going on there. How does one know that he's really DED? I've argued a couple of times with DED on his web site some time ago, and yes, he's really closed to anything he doesn't come up with (sheesh, he's even worse than me!) but unless someone's doing some ip snooping, I don't know how you can be sure they're the same person.

    I can imagine that someone here is such a fanboy that the fanboy writing of DED is something (s)he really wishes is all true.

    It's circumstantial, but it's a matter of style. I followed DED on Roughly Drafted and DED/Prince McLean on AI for years. After a while, you recognize the style and can predict with high accuracy what is coming ... like Rush Limbaugh, Glenn Beck, Hannity, Chris Matthews et al.

    If you go back through @Corrections' posts -- AFAICT, they are only made to attack posters criticizing DED articles ... And the style of using embedded bold instead of quoting makes it difficult to respond in context to @Corrections' posts -- this is the same way DED responded to criticism on Roughly Drafted.
  • Reply 90 of 94
    Quote:
    Originally Posted by Dick Applebaum View Post





    It's circumstantial, but it's a matter of style. I followed DED on Roughly Drafted and DED/Prince McLean on AI for years. After a while, you recognize the style and can predict with high accuracy what is coming ... like Rush Limbaugh, Glenn Beck, Hannity, Chris Matthews et al.



    If you go back through @Corrections' posts -- AFAICT, they are only made to attack posters criticizing DED articles ... And the style of using embedded bold instead of quoting makes it difficult to respond in context to @Corrections' posts -- this is the same way DED responded to criticism on Roughly Drafted.



    It's been well documented that Daniel is Corrections and Corrections is Daniel. He's stated as much previously and Corrections's posting history indicates much the same:

     

    http://forums.appleinsider.com/forums/posts/by_user/id/134541/page/970

     

    He also likes to sockpuppet under the name "Prince McLean"

     

    http://tracks.ranea.org/post/524619951/dan-dilger-exploding-head

     

    Rather sleazy really...

  • Reply 91 of 94
    It's circumstantial, but it's a matter of style. I followed DED on Roughly Drafted and DED/Prince McLean on AI for years. After a while, you recognize the style and can predict with high accuracy what is coming ... like Rush Limbaugh, Glenn Beck, Hannity, Chris Matthews et al.


    If you go back through @Corrections' posts -- AFAICT, they are only made to attack posters criticizing DED articles ... And the style of using embedded bold instead of quoting makes it difficult to respond in context to @Corrections' posts -- this is the same way DED responded to criticism on Roughly Drafted.


    It's been well documented that Daniel is Corrections and Corrections is Daniel. He's stated as much previously and Corrections's posting history indicates much the same:

    http://forums.appleinsider.com/forums/posts/by_user/id/134541/page/970

    He also likes to sockpuppet under the name "Prince McLean"

    http://tracks.ranea.org/post/524619951/dan-dilger-exploding-head

    Rather sleazy really...

    From your second link:
    Submitted for your consideration: Daniel Eran Dilger, proprietor of the aptly-named Roughly Drafted. (He uses all three names to distinguish himself from all those other Daniel Dilgers out there.) I passed through a few stages in my reading at RD after discovering it:

    • Long, well-written and well-researched articles! Cool!
    • Hmm. Maybe just long and well-researched.
    • Just down to “long,” aren’t we?
    • This diagram looks like an explosion at the flow chart factory.
    • Wait, what orifice did he pull that out of?
    • My brain is crawling out my ear to get away

    The above mirrors my experience, almost exactly ... Because DED was mostly stating things I wanted to believe, it took me longer than I like to admit to realize what he was about.

    DED is knowledgeable, intelligent, and capable of writing excellent, concise articles. But too often they deteriorate into a tirade about every real or imagined slight DED believes wronged him.

    I can usually guess, by the wording of the headline, when an article is written by DED. Usually, I try to give the article a fair reading.

    @Corrections is in my block list so I don't see the sock puppet attacks unless someone quotes him.

    However, when there are too many OT rants, too many self-supporting links to other articles pushing his agenda (also written by DED), or too many @Corrections attacks, I just move on, because I have better things to do.

    Sadly, sometimes, I see other well-reasoned members enthralled with DED's writings -- traveling somewhere along the bulleted path quoted above ...
  • Reply 92 of 94
    tht Posts: 5,605
    Quote:

    Originally Posted by melgross View Post



    There's a problem with benchmarking which might explain some of the variability of the Android benchmarks. If you use Geekbench, for example, you will notice that they advise not running anything else at the same time. You should turn your phone off completely after making sure that everything that turns on when you turn the phone back on will remain off.



    Due to Android using a desktop multitasking model, many apps remain on when you turn the phone completely off and back on again. While many Android geeks claim to be knowledgeable, I find that not to be so. They know a few minor picky things, but don't understand the system.



    Because of that, most Android phones will have some apps running in the background which slow down various aspects of the benchmarks. It's much easier to control this on an iOS device. Once you turn the phone off completely and turn it back on again, this problem doesn't exist, so the scores are much closer, subject only to the usual slight deviations from manufacturing tolerances of the components, of about 5% or so.

     

    For this particular Note 4 Geekbench plot, there are several things to explain the variations in the plot.

     

    1. The Note 4 has 2 architectures: a 1.9 GHz Cortex-A57, 1.3 GHz Cortex-A53 big.LITTLE CPU and a 2.7 GHz Krait CPU. This explains some of the variation in frequency in the plot. All the scores >2.5 GHz are likely Krait scores. All the scores at 2 GHz are the Cortex-A57 scores (run in 32-bit mode), and they seem a bit low to me actually.

     

    2. The scores at 1.4 GHz are maybe the Geekbench process being run on either the 1.3 GHz Cortex-A53, or maybe the Cortex-A57 for who knows what reason (it should never do this; it totally defeats the purpose of big.LITTLE). The 2.27 and 2.46 GHz scores are maybe downclocked or slow ramp-ups of the CPU from an idle state on the Krait Note 4s.

     

    3. The 2.9 GHz scores, and there are lots of them, are who knows what. There are certainly some people who root and overclock, so that may explain some of those results. It might explain them all, who knows. I do know that there are people who love to try to get the highest benchmark score possible. So they root, strictly control the processes running on their Android device to chase that highest score, and submit it to the Geekbench database.

     

    4. The RAM reporting is just FUBARed. Who knows. I thought all Note 4s had 3 GB, but you see 0.8, 2 and 3 GB being reported.

     

    5. Concentrating on the large variation of the Krait scores at 2.65 GHz, which is the as-advertised frequency of the CPU, I don't think I buy that multitasking explains that variation. It's probably part of it, but not all of it.

     

    Android doesn't use a desktop multitasking model, at least from what I can gather. It implements a multitasking model suitable for a handheld that's more flexible than iOS's, but desktop multitasking it is not. On a PC, background apps run "whole"; there's very little difference between foreground and background (except for scheduler priorities), with the UI updating as it needs to. For Android, apps are designed on a client-server style architecture, with the UI in one process and non-UI parts in another thread or process. When an app is in the background, the UI thread is stopped, while the non-UI thread can keep on going. It's up to the app developer to architect their apps this way; otherwise their apps will likely be stopped when in the background. Then, apps in the background can be stopped if RAM is starved or if it's been a long time since they were touched. This is why Samsung had to specially implement their feature of running 2 to 4 apps simultaneously.

     

    I agree with you that it generates some variation in the benchmark scores at 2.7 GHz, but not all of it. Some of it surely involves esoteric details of how Android uses cache, how the task scheduler distributes processes and threads across 4 cores, what the CPU governor will allow in terms of which CPUs can ramp and how fast they can ramp, and what the state of the garbage collector is. Geekbench could be running, and if the garbage collector decides to run for 50 ms, that'll probably hurt the Geekbench score.
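
As a footnote to the variation argument: the spread in a cluster of scores at one clock can be summarized with a coefficient of variation. A rough sketch, using invented score values (these numbers are hypothetical, not taken from the actual Geekbench database):

```python
from statistics import mean, stdev

# Hypothetical single-core scores for one device model at one advertised
# clock; real values would come from the Geekbench result browser.
scores = [1060, 1085, 990, 1120, 870, 1102, 1015, 940]

avg = mean(scores)
sd = stdev(scores)
cv = sd / avg  # coefficient of variation: spread relative to the mean

print(f"mean={avg:.0f}, stdev={sd:.0f}, CV={cv:.1%}")
```

A CV of a few percent would be consistent with the "manufacturing tolerance" explanation; a much larger spread is what points to background tasks, governors, and overclocking.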

  • Reply 93 of 94
    tht Posts: 5,605
    Quote:

    Originally Posted by pk22901 View Post

     

    What does it mean for the processor to be running in "32 bit mode"? Can the 5433 execute 32 and 64 bit code side by side or do you need to run it with one or the other?


     

    The 5433 should be able to execute 32-bit and 64-bit instructions simultaneously, but the OS (really the kernel) has to be 64-bit to do that. That's "should be". Who knows, maybe Samsung messed it up.

     

    Since Android 4.x is a 32-bit OS, people use the term "32-bit mode" when a 64-bit system is running a 32-bit OS. All the instructions are 32-bit in such a case; more specifically, all the instructions are part of the 32-bit ARMv7 instruction set, and none are from the 64-bit ARMv8 instruction set.
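
To make "32-bit mode" concrete: the difference software actually sees is the word size of the running process. A minimal sketch in Python (used purely for illustration; the pointer-size check is the general idea, not a Samsung- or Geekbench-specific API):

```python
import platform
import struct

# Pointer size of the running process: 4 bytes means a 32-bit process,
# 8 bytes means a 64-bit one.
pointer_bits = struct.calcsize("P") * 8

# What the kernel reports, e.g. an ARMv7-style string when a 64-bit-capable
# chip is booted with a 32-bit kernel (the Note 4 case), versus "aarch64"
# in true 64-bit mode.
machine = platform.machine()

print(f"process word size: {pointer_bits}-bit, machine: {machine}")
```

On the 5433 under KitKat, the equivalent native check would report a 32-bit process even though the cores themselves are ARMv8 silicon.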

  • Reply 94 of 94
    melgross Posts: 33,599
    tht wrote: »
    For this particular Note 4 Geekbench plot, there are several things to explain the variations in the plot.

    1. The Note 4 has 2 architectures: a 1.9 GHz Cortex-A57, 1.3 GHz Cortex-A53 big.LITTLE CPU and a 2.7 GHz Krait CPU. This explains some of the variation in frequency in the plot. All the scores >2.5 GHz are likely Krait scores. All the scores at 2 GHz are the Cortex-A57 scores (run in 32-bit mode), and they seem a bit low to me actually.

    2. The scores at 1.4 GHz are maybe the Geekbench process being run on either the 1.3 GHz Cortex-A53, or maybe the Cortex-A57 for who knows what reason (it should never do this; it totally defeats the purpose of big.LITTLE). The 2.27 and 2.46 GHz scores are maybe downclocked or slow ramp-ups of the CPU from an idle state on the Krait Note 4s.

    3. The 2.9 GHz scores, and there are lots of them, are who knows what. There are certainly some people who root and overclock, so that may explain some of those results. It might explain them all, who knows. I do know that there are people who love to try to get the highest benchmark score possible. So they root, strictly control the processes running on their Android device to chase that highest score, and submit it to the Geekbench database.

    4. The RAM reporting is just FUBARed. Who knows. I thought all Note 4s had 3 GB, but you see 0.8, 2 and 3 GB being reported.

    5. Concentrating on the large variation of the Krait scores at 2.65 GHz, which is the as-advertised frequency of the CPU, I don't think I buy that multitasking explains that variation. It's probably part of it, but not all of it.

    Android doesn't use a desktop multitasking model, at least from what I can gather. It implements a multitasking model suitable for a handheld that's more flexible than iOS's, but desktop multitasking it is not. On a PC, background apps run "whole"; there's very little difference between foreground and background (except for scheduler priorities), with the UI updating as it needs to. For Android, apps are designed on a client-server style architecture, with the UI in one process and non-UI parts in another thread or process. When an app is in the background, the UI thread is stopped, while the non-UI thread can keep on going. It's up to the app developer to architect their apps this way; otherwise their apps will likely be stopped when in the background. Then, apps in the background can be stopped if RAM is starved or if it's been a long time since they were touched. This is why Samsung had to specially implement their feature of running 2 to 4 apps simultaneously.

    I agree with you that it generates some variation in the benchmark scores at 2.7 GHz, but not all of it. Some of it surely involves esoteric details of how Android uses cache, how the task scheduler distributes processes and threads across 4 cores, what the CPU governor will allow in terms of which CPUs can ramp and how fast they can ramp, and what the state of the garbage collector is. Geekbench could be running, and if the garbage collector decides to run for 50 ms, that'll probably hurt the Geekbench score.

    The way Android does multitasking is very similar to desktop versions. All apps on desktop systems suspend UI functions when in the background. By that, I don't mean behind another window. Of course, the difference in mobile is that, excepting a very few devices, only one app can be in focus at once, just as only one app can be in focus on the desktop at once. But the way they function in the background is very much the same.