iPhone 6 rivals from Samsung, LG, HTC suffering delays in Qualcomm's 64-bit Snapdragon answer to Apple


Comments

  • Reply 121 of 134
    crowley Posts: 10,453 member
    solipsismy wrote: »
    How is it obvious when there are over 100 previous comments in the thread?
    Common sense and standard practice. If you're responding to the most recent post then quotes are unnecessary.
    solipsismy wrote: »
    Just because your comment could line up with the previous comment doesn't mean it's part of the same conversation.
    In the absence of any evidence to the contrary that's exactly what it means.
    solipsismy wrote: »
    especially when you write something ambiguous for a forum, like "Obviously neither", without including anything about the comment to which you're replying that could lead the reader to know your comment is specifically referring to Brooklyn or Queens.
    Do you always repeat questions in real life too? Else how will someone know you're not answering a question from last week?

    Come on, this is ridiculous.
    solipsismy wrote: »
    You also have no idea if there will be other comments that will be sent before you make your reply thereby separating your comments in the thread even further.
    At the time of posting that is true. However, I'll notice straight away once it has been submitted, and if there is a lack of clarity I'll edit the quote in. It doesn't happen that often, but I have done that.
    solipsismy wrote: »
    Despite what you think the "quote" option isn't feature rot there to make innumerable conversations within the same thread more difficult.
    I don't think it's feature rot, but thanks for telling me what I think. I use the quote button all the time, when I think it's helpful or necessary.
  • Reply 122 of 134
    solipsismy Posts: 5,099 member
    crowley wrote: »
    Do you always repeat questions in real life too?

    For the sake of clarity, to make sure I understood the question, or if there are multiple questions thrown at me at once, then absolutely. However, verbal conversations typically don't have dozens of people all talking at once, which is the effective outcome of an internet thread. Have you never been on a conference call? Even with 3 people on the line you need to acknowledge to whom you're referring, and once you get to 5 it becomes a clusterfuck if several people are speaking at once. Your post in this thread is number 112.


  • Reply 123 of 134
    crowley Posts: 10,453 member
    If you're answering a question in the middle of a stream of questions then by all means clarify which one you're answering. If you're answering the last one there's absolutely no need.

    And pertinently to this thread there weren't any outstanding questions addressed to me. It was blindingly obvious that I was responding to the one post that had been made since my last post.

    What a colossal waste of time.
  • Reply 125 of 134
    crowley wrote: »
    If you're answering a question in the middle of a stream of questions then by all means clarify which one you're answering. If you're answering the last one there's absolutely no need.

    And pertinently to this thread there weren't any outstanding questions addressed to me. It was blindingly obvious that I was responding to the one post that had been made since my last post.

    What a colossal waste of time.

    You're thinking of threads too linearly.

    Sometimes I skim through threads and miss out posts. Often, several people ask questions that go unanswered. Or one can leave a thread and come back to it days later.

    So yes, Solip is right in saying that not using quotes is a pain in the arse. Seeing as it takes precisely the same time to quote as not quoting, there is no reason not to.
  • Reply 126 of 134
    You're thinking of threads too linearly.

    Sometimes I skim through threads and miss out posts. Often, several people ask questions that go unanswered. Or one can leave a thread and come back to it days later.

    So yes, Solip is right in saying that not using quotes is a pain in the arse. Seeing as it takes precisely the same time to quote as not quoting, there is no reason not to.

    ...or you read a thread then stop and then come back hours or days later, but would like to pick up anywhere in the thread when skimming.

    ...or your reply is at the top of a new page (see screenshot above) and you have no idea what he's referring to if you're not familiar with the very last post on the previous page, which may require a reader trying to follow along to hit the back button and scroll to the bottom just to find out what they're talking about.

    ...or you don't even know your comment was ever replied to, because without quoting a poster you don't get a notification.
  • Reply 127 of 134
    crowley Posts: 10,453 member
    Quote:
    Originally Posted by SolipsismY View Post

    ...or you read a thread then stop and then come back hours or days later, but would like to pick up anywhere in the thread when skimming.

    I'm not replying to the general public, I'm replying to an individual.  Other people entering into the conversation mid-stream isn't something I have any interest or incentive to make more convenient.

     

    Quote:
    Originally Posted by SolipsismY View Post

    ...or your reply is at the top of a new page (see screenshot above) and you have no idea what he's referring to if you're not familiar with the very last post on the previous page, which may require a reader trying to follow along to hit the back button and scroll to the bottom just to find out what they're talking about.

    Cry me a river. If the post is interesting enough to make you want to know what it's in reply to, then go back. If it's not, don't.

     

    Meanwhile the quote-happy make me have to scroll the page a great deal more than I would otherwise need to.  What a liberty!

     

    Quote:
    Originally Posted by SolipsismY View Post

    ...or you don't even know your comment was ever replied to, because without quoting a poster you don't get a notification.

    Maybe I should stop quoting you then, this is incredibly tedious.  Learn to follow the flow of a conversation.  If it gets confusing then go find another conversation to butt into.  I have no interest in dealing with that.

  • Reply 128 of 134
    solipsismy Posts: 5,099 member
    crowley wrote: »
    I'm not replying to the general public, I'm replying to an individual.  Other people entering into the conversation mid-stream isn't something I have any interest or incentive to make more convenient.

    Cry me a river. If the post is interesting enough to make you want to know what it's in reply to, then go back. If it's not, don't.

    Meanwhile the quote-happy make me have to scroll the page a great deal more than I would otherwise need to.  What a liberty!

    Maybe I should stop quoting you then, this is incredibly tedious.  Learn to follow the flow of a conversation.  If it gets confusing then go find another conversation to butt into.  I have no interest in dealing with that.

    Translation: You're selfish and with no desire to improve your communication.

    Got it.
  • Reply 129 of 134
    nht Posts: 4,522 member
    Quote:
    Originally Posted by SolipsismY View Post

    Translation: You're selfish and with no desire to improve your communication.

    Got it.

    Says the guy that misquotes people.

    Got it.

  • Reply 130 of 134
    solipsismy Posts: 5,099 member
    nht wrote: »

    Say, that guy that molests people.

    Got it.

    What?!
  • Reply 131 of 134
    crowley Posts: 10,453 member
    solipsismy wrote: »
    Translation: You're selfish and with no desire to improve your communication.

    Got it.
    I'm not the one saying that you should change your ways to meet my desires. So you can take selfish and go boil your head.

    And in terms of improving communication I refer you to the Latin For Dummies* exchange the other day.

    * the Latin was mostly you. I had a moment of dummy myself.
  • Reply 132 of 134
    Wow, did you actually just say that? How utterly ridiculous. You might as well have said that since the Android NDK uses C/C++ and most software is written in C/C++, that it's possible to take any piece of software and easily port it to Android. This is simply not the case.

    Both Apple and Microsoft have made significant changes to their development environments to make it easy to port code over from their desktop software to mobile. This requires far more than just using a common language.

    Besides, none of the truly useful desktop software is written in Java; it is almost all written in C/C++. The people who use Java are those who need to crank out quick versions of software (for example, within an organization) or software that doesn't require high performance. You won't see Photoshop in Java, for example. Or popular browsers like Chrome.



    If you had a 6 cylinder 3.0 litre engine that produces 500HP and a 10 cylinder 8.0 litre engine that also produces 500HP, which one would you consider as being technically more advanced? Sure they both produce the same HP, but one uses brute force while the other uses advanced engineering. I'd say a much larger engine that produces the same amount of HP is a piece of crap.

    The K1, Snapdragons and Exynos are crap because they use the brute force method. The A7/8/8X are superior because they use advanced technology to get their results.

    Or a better way to look at it: You can't make a racehorse out of a pig, but you can still have a pretty fast pig.
    Just an FYI on the clock speed: after compiling my own Nexus 9 kernel (it's basically stock) I've been doing some testing and benchmarks. At 1.7 GHz I'm still scoring 1700+ on Geekbench 3 single-core and right at 3000 multi-core. So even underclocked by quite a bit, it still performs very well. My bone-stock scores were right above 2000 single and around 3500 multi. So where is the line where it's no longer brute force? If I take it down to 1.4 and still match A8 scores, will you be happy? I've also overclocked it and taken it to 2.5 (not really overclocked, since the code was already there in the kernel, but 2.5 wasn't active). Close to 3.0 should be perfectly stable; I will try it soon. So what's really more advanced? So far Denver can be clocked fairly low and still match and possibly exceed the A8 and come close to the A8X in single-core CPU performance, or clock at 3.0 and absolutely demolish it.
    Also, I had to dig through the kernel code to see where this thing throttles, because I had to try VERY hard just to get it to throttle a little. I had to do about 15 minutes of the performance governor with 100% CPU load to hit 70°, at which point it throttles to 2.0 GHz from 2.29 GHz (the stock clock speed).
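A quick sanity check on those numbers (the Geekbench 3 scores and clock speeds are the ones reported in the post above; the points-per-GHz comparison is my own framing, not anything Geekbench itself reports):

```python
# Rough per-GHz efficiency from the Geekbench 3 single-core numbers above.
# Scores and clocks come from the post; treating score as roughly
# proportional to clock is an assumption used only for comparison.

def score_per_ghz(score, ghz):
    return score / ghz

stock = score_per_ghz(2000, 2.29)   # stock clock, "right above 2000" single
under = score_per_ghz(1700, 1.7)    # underclocked run, "1700+" single

# The underclocked run gets slightly more work done per clock, which is
# consistent with the chip throttling less at the lower frequency.
print(f"stock: {stock:.0f} pts/GHz, underclocked: {under:.0f} pts/GHz")
```

If anything, this suggests the score scales close to linearly with clock in this range, so the 1.4 GHz experiment mentioned above is a reasonable extrapolation.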
  • Reply 133 of 134
    di11igaf wrote: »
    Wow, did you actually just say that? How utterly ridiculous. You might as well have said that since the Android NDK uses C/C++ and most software is written in C/C++, that it's possible to take any piece of software and easily port it to Android. This is simply not the case.

    Both Apple and Microsoft have made significant changes to their development environments to make it easy to port code over from their desktop software to mobile. This requires far more than just using a common language.

    Besides, none of the truly useful desktop software is written in Java; it is almost all written in C/C++. The people who use Java are those who need to crank out quick versions of software (for example, within an organization) or software that doesn't require high performance. You won't see Photoshop in Java, for example. Or popular browsers like Chrome.



    If you had a 6 cylinder 3.0 litre engine that produces 500HP and a 10 cylinder 8.0 litre engine that also produces 500HP, which one would you consider as being technically more advanced? Sure they both produce the same HP, but one uses brute force while the other uses advanced engineering. I'd say a much larger engine that produces the same amount of HP is a piece of crap.

    The K1, Snapdragons and Exynos are crap because they use the brute force method. The A7/8/8X are superior because they use advanced technology to get their results.

    Or a better way to look at it: You can't make a racehorse out of a pig, but you can still have a pretty fast pig.

    Just an FYI on the clock speed: after compiling my own Nexus 9 kernel (it's basically stock) I've been doing some testing and benchmarks. At 1.7 GHz I'm still scoring 1700+ on Geekbench 3 single-core and right at 3000 multi-core. So even underclocked by quite a bit, it still performs very well. My bone-stock scores were right above 2000 single and around 3500 multi. So where is the line where it's no longer brute force? If I take it down to 1.4 and still match A8 scores, will you be happy? I've also overclocked it and taken it to 2.5 (not really overclocked, since the code was already there in the kernel, but 2.5 wasn't active). Close to 3.0 should be perfectly stable; I will try it soon. So what's really more advanced? So far Denver can be clocked fairly low and still match and possibly exceed the A8 and come close to the A8X in single-core CPU performance, or clock at 3.0 and absolutely demolish it.

    May I just add one thing to the equation, and that is heat dissipation. This is what makes the Apple "A" series such a powerhouse of a mobile chip. Certainly the Denver chip is a fast chip, as your trials have shown, but it creates way too much heat to be dissipated inside a small enclosure. This is also supported by tests. So you are correct in theory, but not in practical mobile situations.

    This is why short bench tests do not show the full superiority of the A8/A8X in small devices where heat conditions are a factor. Apple does not need to back off performance under heavy-use conditions, while competitors' products have to quickly dial back performance due to heat build-up. The performance superiority of Apple's hardware is also due to several things Apple has done in software to further contain heat generation on the CPU/GPU SoC.

    I suspect this is what Eric may have been thinking of when he was writing about brute force vs. better design. Apple has used its ability to influence the design of hardware and software to squeeze more performance out of the SoC than any specs or benchmarks may reveal. As devices become smaller, thinner, and lighter, Apple's integrated design direction will have even more of an effect in differentiating its products.
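For what it's worth, the step-down behaviour described in this thread (2.29 GHz stock, dropping to 2.0 GHz once the SoC hits 70°) can be sketched as a toy governor. The trip point and the two frequencies come from the posts; the single hysteresis-free step is my own simplification, and a real kernel thermal governor is considerably smarter:

```python
# Toy model of the throttling step reported for the Denver core in this
# thread: 2.29 GHz stock clock, capped to 2.0 GHz at a 70 C trip point.
# The threshold/frequencies are from the posts; everything else is assumed.

STOCK_GHZ = 2.29
THROTTLED_GHZ = 2.0
TRIP_C = 70

def target_clock(temp_c: float) -> float:
    """Return the allowed clock (GHz) for a given SoC temperature (C)."""
    return THROTTLED_GHZ if temp_c >= TRIP_C else STOCK_GHZ

for t in (45, 69, 70, 75):
    print(f"{t} C -> {target_clock(t)} GHz")
```

Under this model the chip runs at full clock right up to 69°, which matches how hard the poster says it was to trigger throttling at all.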
  • Reply 134 of 134
    I'm guessing you didn't see the edit I added; I didn't know I had been quoted. Anyway, here is what I added:
    "Also, I had to dig through the kernel code to see where this thing throttles, because I had to try VERY hard just to get it to throttle a little. I had to do about 15 minutes of the performance governor with 100% CPU load to hit 70°, at which point it throttles to 2.0 GHz from 2.29 GHz (the stock clock speed)."
    I won't deny it runs a little warm, but all the reports of it getting hot are 1) with Chrome (there is a bug with Chrome right now on this device on 5.0) and 2) a little blown out of proportion; all CPUs get hot, it just so happens you can feel the heat through the back of this one where the CPU is. As far as heat dissipation goes: I'm not sure how fast the A8X dissipates heat, but I've never in my life of hacking on tablets and phones seen a CPU that cools off faster than this one. It was very hard for me to even get a screenshot of this thing at 69° (another developer and I are changing some things in the kernel's thermal governor), pulling the info from /d/soc-therm, because literally as soon as the tests were done the CPU dropped 20° almost instantly; by the time I could even click the buttons for a screenshot, the CPU temp was back down to ~50°, and seconds later it was back in the mid 40s. The other developer couldn't even hit 60°.
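The kind of temperature reading described above (the poster pulls values from /d/soc-therm) can be approximated with a small helper. The sysfs path and the millidegrees-Celsius file format below are device-specific assumptions on my part, not confirmed details of the Nexus 9; many Linux thermal interfaces do expose millidegrees, so the converter handles that:

```python
# Sketch of reading an SoC temperature the way the post describes.
# Assumptions: the file contains a single integer in millidegrees Celsius
# (common for Linux thermal sysfs); the path below is hypothetical.

def millic_to_c(raw: str) -> float:
    """Convert a millidegree-Celsius string (e.g. '69500') to degrees C."""
    return int(raw.strip()) / 1000.0

def read_soc_temp(path: str = "/sys/class/thermal/thermal_zone0/temp") -> float:
    # Hypothetical zone; the thread's readings come from /d/soc-therm instead.
    with open(path) as f:
        return millic_to_c(f.read())

print(millic_to_c("69500"))  # 69.5
```

Polling this in a loop once a second would be enough to reproduce the rapid cool-down the poster describes (a ~20° drop within moments of the load ending).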
    Now anyone can say this CPU sucks, and that's fine, that's your opinion, but I am FULLY skeptical of any claims by Nvidia, and I had told myself I would never buy a Tegra device again after owning the Atrix and the Tegra 3 Prime; those CPUs sucked, and I was the one who discovered the Prime shipped with a locked bootloader (not Nvidia's fault). No NEON support in the Tegra 2, and the Tegra 3 was not that impressive for a quad core. The first Nexus 7 was decent when I owned it, but I never got into messing with it; I have always done work with Snapdragons. My single-core Snapdragon Inspire was much faster and smoother than the dual-core Atrix.
    I must say this Denver SoC has surprised me thoroughly. I was expecting the same results as with the T2 and T3, but so far I am impressed, and it has got me back to wanting to read through kernel code and tweak things to my liking. I haven't really compiled anything since the HTC Inspire, but this has got me back into it.
    All heat tests were done at 2.3 GHz; at 1.7 or lower, you couldn't get this thing to throttle if you wanted to in a tablet. It's impossible to say what would happen in a phone.