iPhone 6 rivals from Samsung, LG, HTC suffering delays in Qualcomm's 64-bit Snapdragon answer to Apple


Comments

  • Reply 101 of 134
    paul94544 Posts: 1,027 member
    You are totally winning this argument, and he is now in full-on damage-containment mode. Don't worry about it, though; it's typical of a man who made absolute statements that are almost impossible to substantiate when challenged by objective reasoning. You have cornered him, and he doesn't have the character to say he was incorrect in his original assertions.
    relic wrote: »
    I keep writing over and over because you keep saying that every chip out there is garbage, over and over. It's simply not true; you've lost your awe, your sense of wonder. This technology is wondrous, regardless of who is faster. You look at the A8 and deem it king of all kings. That's great and all, but that chip is only available in one product. Do you expect others just to give up? It's hopeless, Apple will always be better than us, might as well turn in our pocket protectors. Have you actually seen the code that's out there for the K1, or even played with a device that contains this chip? Not just the K1, but any of these chips that you keep calling garbage. My Nokia 2520 with a Qualcomm 800 is fantastic; never once have I thought to myself, man, this thing is slow. My Kindle HDX, also with its Qualcomm 800, is one of the most lag-free gizmos I have ever owned. Your idea of a good chip is one that beats the Apple A8X; everything else is pure crap. Why? Because they're not 64-bit. This article is full of crap too; it states that Qualcomm and Samsung have been scrambling to get a viable 64-bit chip to market. Not true: both of these companies, after Nvidia released their timetable, stated as far back as 2011 that 2015 would be the year they would start releasing their next-gen 64-bit chips. Any article about malfunctions or problems is just conjecture at this point; let's actually wait a month or so to see if they deliver.

    I'm not downplaying Apple's achievements either, I just can't stand reading these silly comments about how bad the competition's chips are when the majority of users here can't even fathom, let alone know what to do with, the computing power that's in an A6, never mind an A8X. Yet somehow it's so important that some arbitrary benchmark beats the rest. I only use these benchmarks to try to show that these chips aren't bad. If you saw just a quarter of the things I'm doing with my K1 devices, not only would you be a convert, you would probably want to join in on the fun. My Lenovo 4K touch monitor has an Nvidia K1; I've never seen a faster dumb terminal slash multimedia hub. My two Jetson K1 development boards serve as everything from good desktop computers to nodes in a render cluster, all completely wondrous. I'll keep having the time of my life rediscovering the great things about computer technology over and over again, and you can keep shaking your finger at others, calling their gear crap. We'll both be happy.
  • Reply 102 of 134
    tmay Posts: 6,470 member
    Quote:
    Originally Posted by Relic View Post





    Why is what he said bad? Why did you feel it necessary to belittle him? He probably read this somewhere and decided to share it. You could have politely replied to him. You strike me as a person who just reads every bit of tech highlights they can get their hands on without actually getting their hands dirty, talking instead of doing. Java is one of the most used in-house programming languages there is for businesses. Entire infrastructures are built upon it, a lot more than on compiled code.



    Car analogies are silly, but can you honestly tell me you wouldn't rather have the V10 over the V8? That 3.0-liter V8 would also be at its max limit of what is possible, whereas the V10 could be pushed to uncharted new heights. I would also choose the engine that I can work on myself instead of outsourcing it to a garage. I'm also limited to a single platform using the A8, whereas the K1 is open to a lot more. Not saying one is better than the other; they both have their purposes and do them well, but calling one lesser than the other is just silly fanboy crap. The K1 is a very advanced chip; there is nothing else like it on the market right now. Brute force, what does that even mean? The K1 is running at its optimal speed and thermal envelope. Nvidia didn't overclock the chip or push it, which, by the way, you can do: almost 300MHz past its shipping spec'd clock speed before you start seeing glitches, and the GPU by 150MHz.



    Spend just a week with a Jetson K1 development board and I guarantee that you will be singing a different tune. You won't, though, and I'm sure even if you had one you wouldn't know what to do with it. You spew your copied-and-pasted snippets from tech blogs and declare them as gospel without ever actually using the technology you so passionately hate. The things you can do with this technology are mind-blowing: there are 192 CUDA cores in the K1, about as fast as a GeForce 8800GT, with open libraries and code available to manipulate that power to your heart's content. You don't find that amazing? Oh, I forgot, it's not as fast as the Apple A8 or as conservative with its power consumption. I see a supercomputer and you see just another piece of tech that doesn't live up to your expectations, which amount to nothing.

    Maybe I'm missing the big picture, but your many defenses of the K1 as a processor have little to do with the question at hand, which is: where are these OEMs going to get a 64-bit processor to compete with the now second-generation A series, will said 64-bit processor even be available before the release of the next generation of A-series processors, and will it even be competitive then?

     

    It is apparent to me that K1 and K1 Denver design wins are few, which indicates that there are in fact some issues with the K1 keeping adoption low: probably not raw performance, but perhaps cost and power efficiency, the lack of an integrated baseband, or maybe just past disappointments with Nvidia. Apple is probably set for 200 million A8 and 50 million A8X processors over the next year; I can't imagine that the K1 will displace many Qualcomm or Exynos wins, so I'm not sure your argument for the K1 is even relevant to the conversation. It just isn't going to end up being a processor of choice for mobile.

     

    Perhaps, as you state, the K1 Denver will end up being popular outside of mobile. I could see that.

  • Reply 103 of 134
    relic wrote: »
    Why is what he said bad? Why did you feel it necessary to belittle him? He probably read this somewhere and decided to share it. You could have politely replied to him. You strike me as a person who just reads every bit of tech highlights they can get their hands on without actually getting their hands dirty, talking instead of doing. Java is one of the most used in-house programming languages there is for businesses. Entire infrastructures are built upon it, a lot more than on compiled code.

    Car analogies are silly, but can you honestly tell me you wouldn't rather have the V10 over the V8? That 3.0-liter V8 would also be at its max limit of what is possible, whereas the V10 could be pushed to uncharted new heights. I would also choose the engine that I can work on myself instead of outsourcing it to a garage. I'm also limited to a single platform using the A8, whereas the K1 is open to a lot more. Not saying one is better than the other; they both have their purposes and do them well, but calling one lesser than the other is just silly fanboy crap. The K1 is a very advanced chip; there is nothing else like it on the market right now. Brute force, what does that even mean? The K1 is running at its optimal speed and thermal envelope. Nvidia didn't overclock the chip or push it, which, by the way, you can do: almost 300MHz past its shipping spec'd clock speed before you start seeing glitches, and the GPU by 150MHz.

    Spend just a week with a Jetson K1 development board and I guarantee that you will be singing a different tune. You won't, though, and I'm sure even if you had one you wouldn't know what to do with it. You spew your copied-and-pasted snippets from tech blogs and declare them as gospel without ever actually using the technology you so passionately hate. The things you can do with this technology are mind-blowing: there are 192 CUDA cores in the K1, about as fast as a GeForce 8800GT, with open libraries and code available to manipulate that power to your heart's content. You don't find that amazing? Oh, I forgot, it's not as fast as the Apple A8 or as conservative with its power consumption. I see a supercomputer and you see just another piece of tech that doesn't live up to your expectations, which amount to nothing.

    Still rambling on I see?

    Java is popular because it's great for rapid prototyping and development. That's the reason it's used by organizations. As an actual language, it's inferior to all the variants of C. For example, you can't hand-optimize Java code to take advantage of things like additional processor registers, because it's a high-level language. That's why you wouldn't pick Java for applications that require performance (photo/video editing, audio production, games, encryption, or an actual OS), and it's also why no Java app will ever fully utilize the power of a 64-bit processor (like you can with Objective-C for iOS or the NDK for Android).
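    To illustrate the layering I'm talking about, here is a minimal Java sketch (purely illustrative; the sun.arch.data.model property is HotSpot-specific, so treat that as an assumption): the compiled bytecode is architecture-neutral, and it is the JVM underneath, not the application, that is a 32-bit or 64-bit build.

        // ArchCheck.java - minimal sketch: the application code is identical on
        // 32-bit and 64-bit JVMs; only the runtime underneath changes.
        public class ArchCheck {
            public static void main(String[] args) {
                // Standard property: CPU architecture as reported by the JVM.
                System.out.println("os.arch = " + System.getProperty("os.arch"));
                // HotSpot-specific property (assumption: may be absent on other JVMs):
                // reports 32 or 64 depending on the JVM build, not on the .class file.
                System.out.println("data model = "
                        + System.getProperty("sun.arch.data.model", "unknown"));
            }
        }

    Compile it once and run it under a 32-bit or a 64-bit JRE and the source never changes; any use of wider registers happens inside the VM, out of the developer's hands.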

    Overclock by 300MHz? Now are we talking about the Denver K1? Or is this like last time, when you talked about using a K1 development board while we were discussing the Denver K1 and we found out you were using the Jetson board, which is NOT using the Denver? And how you were using results from the Jetson to make ASSUMPTIONS about what the Denver K1 would be like? Honestly, it's impossible to keep track as you seem to like to cherry pick numbers/data from all over the place. So I'll ask again: did you overclock a Denver K1 or not? And if you did, how is 300MHz impressive when the processor is already running at 2.5GHz?

    192 cores? How come a GPU with 192 cores is about the same in performance as the A8X's GPU, which only has 8 cores? Oh right, because it doesn't actually have 192 GPU cores. This is just Nvidia playing games with terms to make it appear to be some monster GPU. Perhaps Apple and others should count individual shader units and other subsystems within the GPU so they can also claim to have "several hundred core" GPUs?


    You still keep skirting the real issue here, which I can understand since you have nothing to counter my argument. I'll break it down for you:

    - Nvidia claimed their Denver K1 was going to offer the highest performance per clock when compared to other ARM processors.
    - They touted their Dynamic Code Optimization, huge instruction buffers (which you also brought up in a previous discussion), and their 7-wide superscalar design (all Apple 64-bit processors are only 6-wide) as the reasons for the processor's performance.
    - In the end, the Denver K1 isn't even as fast as the Apple A7 from a year ago. It gets its performance from a high clock speed, not from all their BS about code optimization or other "technology." So, like previous Tegra chips, Nvidia was long on promise and short on delivery.
  • Reply 104 of 134
    Quote:

    Originally Posted by Paul94544 View Post



    You are totally winning this argument, and he is now in full-on damage-containment mode. Don't worry about it, though; it's typical of a man who made absolute statements that are almost impossible to substantiate when challenged by objective reasoning. You have cornered him, and he doesn't have the character to say he was incorrect in his original assertions.

     

    Anytime you want to start actually discussing the topic at hand instead of patting others on the back, feel free. I doubt you will.

  • Reply 105 of 134
    Did you hear about the lawsuit against Apple? Iwasbitcom.com has the information needed to see if you qualify to join it. They have been using old or refurbished parts instead of new ones in replacement phones, phones they say are new.
  • Reply 106 of 134
    paul94544 Posts: 1,027 member
    This is turning into quite a flame war. And no, I'm not going to make it a threesome as you want.

    Hysterical
    Anytime you want to start actually discussing the topic at hand instead of patting others on the back, feel free. I doubt you will.
  • Reply 107 of 134
    tmay Posts: 6,470 member
    Quote:
    Originally Posted by Paul94544 View Post



    You are totally winning this argument, and he is now in full-on damage-containment mode. Don't worry about it, though; it's typical of a man who made absolute statements that are almost impossible to substantiate when challenged by objective reasoning. You have cornered him, and he doesn't have the character to say he was incorrect in his original assertions.

    She's lost the argument.

     

    She's now deployed her long list of skill sets as a programmer/developer, the wide array of devices she uses, and anecdotal evidence of performance in an effort to bolster her case. I find it unconvincing.

     

    Here's the thing. Every processor is a compromise of many factors, and the K1 Denver, whatever its unusual architecture, certainly has compromises and decisions that make it, in my opinion and based on reviews of its use in the Nexus 9, less than a stellar tablet processor, and certainly not suitable for smartphones.

     

    It's a one-trick pony: graphics performance at the price of power consumption, which is to be expected given the clock rate. I'm not seeing any other design wins, perhaps because the Android tablet market is decidedly driven by low-cost devices, though in fairness it may find some wins in Chromebooks. Qualcomm will not see any long-term impact from the K1, and Samsung will certainly not be using the K1 in any of its devices, so Nvidia's market for the K1 is really a niche.

     

    (Some reviews note stutters and lag, a device that runs notably hot in heavy use, and battery life that is average at best and poor at worst. I won't even talk about the build, which is very poor relative to the cost of the device and not even in the same league as the iPad 2.)

  • Reply 108 of 134
    Quote:

    Originally Posted by SolipsismY View Post



    1) .... 2) .... 3) .... 4) ....

     

    Crowley's sidetrack here strikes me as an Al Bundy / Tim Allen type TV trope: a person grousing over multiculturalism and people not speaking 'what everyone else speaks' -- Uhmurican!

  • Reply 109 of 134
    nht Posts: 4,522 member

    Quote:

    Originally Posted by EricTheHalfBee View Post

     

    Wow, did you actually just say that? How utterly ridiculous. 


     

    I say that as someone who has ported Java apps to Android in the past.  The business logic ports pretty easily unless you're dependent on a library that isn't available on Android.  You replace the Swing bits with Android UI bits.
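    As a minimal sketch of what that kind of port looks like (class and method names here are invented purely for illustration), the business logic is a plain Java class with no Swing or Android imports, so the same source compiles for the desktop JVM and for Android; only the UI layer differs:

        // PriceCalculator.java - pure business logic, no Swing or Android imports,
        // so the identical source builds for a desktop app or an Android app.
        public class PriceCalculator {
            private final double taxRate;

            public PriceCalculator(double taxRate) {
                this.taxRate = taxRate;
            }

            // Called from a Swing JLabel update on the desktop,
            // or from an Android TextView update on a phone.
            public double totalWithTax(double subtotal) {
                return subtotal * (1.0 + taxRate);
            }
        }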

     

    That Google leveraged Java to capture a large programmer ecosystem is the basis of a rather contentious lawsuit.

     

    Quote:


    You might as well have said that, since the Android NDK uses C/C++ and most software is written in C/C++, it's possible to take any piece of software and easily port it to Android. This is simply not the case.


     

    It's moderately true.  More true than for iOS.  For iOS it's best to start with an iOS mindset and not a cross-platform one.  The times this isn't true are when you are using something like Unity.

     

    Quote:

    Both Apple and Microsoft have made significant changes to their development environments to make it easy to port code over from their desktop software to mobile. This requires far more than just using a common language.

     

    Java has always been easier to make cross-platform, even if "write once, run anywhere" was never really true.

     

    Android is just much better than J2ME so Google was able to embrace, extend and extinguish J2ME.

     

    Quote:


    Besides, none of the truly useful desktop software is written in Java. It is almost all written in C/C++. The people who use Java are those who need to crank out quick versions of software (for example, within an organization) or software that doesn't require high performance. You won't see Photoshop in Java, for example. Or popular browsers like Chrome.


     

    We control satellites using software written in Java.  There is Eclipse.  And OpenOffice.  HyperSQL.  Derby. 

     

    In any case, there are a boatload of J2EE developers, and mobile apps are often connected apps.  Chrome might not be Java, but Tomcat is.

     

    For the most part, all the major Java desktop/Swing developers now work for Google on Android.

     

    Quote:

    If you had a 6-cylinder 3.0-litre engine that produces 500 HP and a 10-cylinder 8.0-litre engine that also produces 500 HP, which one would you consider technically more advanced? Sure, they both produce the same HP, but one uses brute force while the other uses advanced engineering. I'd say the much larger engine that produces the same amount of HP is a piece of crap.

     

    Can't say without also knowing the amount of torque.  As far as which is technically more advanced it's equally hard to say.

     

    NASCAR and Formula One engines are vastly different but both are pinnacles of engine technology.

     

    Quote:

    The K1, Snapdragons and Exynos are crap because they use the brute force method. The A7/8/8X are superior because they use advanced technology to get their results.

     

    Or a better way to look at it: You can't make a racehorse out of a pig, but you can still have a pretty fast pig.



     

    One could argue that the A8X with three cores is the brute-force approach vs. the K1's dual cores, but that's a fallacious argument, just like claiming that using higher clock speeds is a brute-force approach to better single-core performance, or that using a process shrink to gain a performance advantage is brute force vs. a more efficient architecture (x86 vs. ARM).

     

    All are widely accepted engineering approaches to more performance.

     

    I agree with Anandtech:

     

    "Though I don’t want to reduce this to a numbers war between A8X and NVIDIA’s TK1, it’s clear that these two SoCs stand apart from everything else in the tablet space."

     

    For me, it's more about the superiority of iOS over Android than which SoC is slightly better.

  • Reply 110 of 134
    nht Posts: 4,522 member
    Quote:

    Originally Posted by SolipsismY View Post



    Ceteris paribus, sure, but all other things are never equal when it comes to more cores or more cylinders.

    Really? You're going to get a car that has no complex computers and sensors? You take your phone apart to work on the processor?

     

    I know you have a man crush on me but Relic wrote that and not me.

     

    Of course, being a guy, car analogies don't faze me a bit.  Especially with someone who doesn't understand that torque is just as important as horsepower to real-world performance, and that peak HP and torque are deeply interrelated with displacement.  That 3.0L engine that revs very high to get astronomical HP is great for a very light car but not so happy in a heavier car.  Nor is it very pleasant to drive on a daily basis in comparison to a higher-displacement 6.0L engine that revs lower to achieve the same HP.

     

    Everything is a tradeoff in both engine and processor design.

     

    Apple and nVidia have made those design tradeoffs in different areas.  Both have advantages and disadvantages.

  • Reply 111 of 134
    nht Posts: 4,522 member
    Quote:

    Originally Posted by EricTheHalfBee View Post



    Java is popular because it's great for rapid prototyping and development. That's the reason it's used by organizations. As an actual language, it's inferior to all the variants of C. For example, you can't hand-optimize Java code to take advantage of things like additional processor registers, because it's a high-level language. That's why you wouldn't pick Java for applications that require performance (photo/video editing, audio production, games, encryption, or an actual OS), and it's also why no Java app will ever fully utilize the power of a 64-bit processor (like you can with Objective-C for iOS or the NDK for Android).

     

    Java apps fully utilize the power of 64-bit processors because the 64-bit JRE fully utilizes the 64-bit processor.  There is a performance penalty for JIT vs. AOT, but you gain flexibility and speed of development in exchange.

     

    Java AOT compilers exist (Excelsior JET), and of course Android is moving toward ART, which does AOT compilation at install time and can (in theory) be more deeply optimized for the specific processor in your device than a typical JIT compilation.

     

    The Java HotSpot compiler vastly improved register allocation in Java 6.  Annotating bytecode to optimize register allocation is possible, but nobody has bothered to build an AJIT (annotation-aware JIT) compiler outside of academia.

     

    Quote:


    192 cores? How come a GPU with 192 cores is about the same in performance as the A8X's GPU, which only has 8 cores? Oh right, because it doesn't actually have 192 GPU cores. This is just Nvidia playing games with terms to make it appear to be some monster GPU. Perhaps Apple and others should count individual shader units and other subsystems within the GPU so they can also claim to have "several hundred core" GPUs?


     

    The A8X has a dual PowerVR GX6450 variant; the imgtec site states the GX6450 has 128 cores in 4 clusters.

     

    http://www.imgtec.com/powervr/series6xt.asp

     

    Everyone uses the same terminology.  For nVidia it's vaguely legitimate since they call it 192 CUDA cores in a single cluster.

     

    So, counted the same way, the A8X has 256 FP32 GPU cores in 8 clusters.
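    Worked out with the figure Imagination publishes of 32 FP32 ALUs per Series6XT cluster (an assumption on my part that Apple's variant matches the stock design): one GX6450 is 4 clusters × 32 ALUs = 128 "cores," so the A8X's dual-GX6450 arrangement is 8 × 32 = 256, against the K1's single SMX of 192 CUDA cores. Counted the same way, the A8X lists more "cores" than the K1, not fewer.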

     


    You still keep skirting the real issue here, which I can understand since you have nothing to counter my argument. I'll break it down for you:



    - Nvidia claimed their Denver K1 was going to offer the highest performance per clock when compared to other ARM processors.

    - They touted their Dynamic Code Optimization, huge instruction buffers (which you also brought up in a previous discussion), and their 7-wide superscalar design (all Apple 64-bit processors are only 6-wide) as the reasons for the processor's performance.

    - In the end, the Denver K1 isn't even as fast as the Apple A7 from a year ago. It gets its performance from a high clock speed, not from all their BS about code optimization or other "technology." So, like previous Tegra chips, Nvidia was long on promise and short on delivery.


     


    On a performance-per-watt basis (which has far more merit than performance per clock), the K1 lies somewhere between the A7 and A8, which for a 28nm design is decent.


     


    While nVidia does often promise more than it can deliver, in the case of the K1 it seems that they have a very nice tablet/netbook SoC and a significant design win, which has escaped them in the past.


     


    Too bad Android Lollipop is still IMHO meh on a tablet.   

     

    Claiming that the K1 is slower than the A7 is idiocy.

  • Reply 112 of 134
    crowley Posts: 10,453 member
    31 flavas wrote: »
    Crowley's sidetrack here strikes me as an Al Bundy / Tim Allen type TV trope: a person grousing over multiculturalism and people not speaking 'what everyone else speaks' -- Uhmurican!
    I don't know who Al Bundy is, I don't know what grousing means, I live in one of the most multicultural boroughs in one of the most multicultural cities in the world and love it, and I don't speak Uhmerican.
  • Reply 113 of 134
    dasanman69 Posts: 13,002 member
    crowley wrote: »
    I don't know who Al Bundy is, I don't know what grousing means, I live in one of the most multicultural boroughs in one of the most multicultural cities in the world and love it, and I don't speak Uhmerican.

    Brooklyn or Queens?
  • Reply 114 of 134
    crowley Posts: 10,453 member

    Obviously neither.  I'm not sure what your point is.

  • Reply 115 of 134
    dasanman69 Posts: 13,002 member
    crowley wrote: »
    Obviously neither.  I'm not sure what your point is.

    How is it obviously neither? I took a guess; not many places are called boroughs, at least that I know of. By the way, can you please quote the people you're replying to? You've been around long enough to know how to do that.
  • Reply 116 of 134
    crowley Posts: 10,453 member
    Why would I quote? Yours is the only post I could possibly be responding to.

    And re obviously, my location is on my profile. Though I've just realised that it only appears under my name on desktop, not mobile, so not quite as obvious as I thought. Sorry for that. I'm in London. Tower Hamlets to be exact.
  • Reply 117 of 134
    dasanman69 Posts: 13,002 member
    crowley wrote: »
    Why would I quote? Yours is the only post I could possibly be responding to.

    And re obviously, my location is on my profile. Though I've just realised that it only appears under my name on desktop, not mobile, so not quite as obvious as I thought. Sorry for that. I'm in London. Tower Hamlets to be exact.

    99.9% of the time I'm on the mobile version. I did remember you being a Brit, but I run into so many Brits who recently moved to NYC that I thought you might have been one of them.
  • Reply 118 of 134
    crowley wrote: »
    Why would I quote? Yours is the only post I could possibly be responding to.

    And re obviously, my location is on my profile. Though I've just realised that it only appears under my name on desktop, not mobile, so not quite as obvious as I thought. Sorry for that. I'm in London. Tower Hamlets to be exact.
    Tower Hamlets.... poor you
  • Reply 119 of 134
    crowley Posts: 10,453 member
    Not at all, it's quite lovely near the river.
  • Reply 120 of 134
    solipsismy Posts: 5,099 member
    crowley wrote: »
    Why would I quote? Yours is the only post I could possibly be responding to.

    And re obviously, my location is on my profile. Though I've just realised that it only appears under my name on desktop, not mobile, so not quite as obvious as I thought. Sorry for that. I'm in London. Tower Hamlets to be exact.

    How is it obvious when there are over 100 previous comments in the thread? Just because your comment lines up with the previous comment doesn't mean it's part of the same conversation, especially when you write something ambiguous for a forum, like "Obviously neither," without including anything from the comment to which you're replying that would let the reader know you're specifically referring to Brooklyn or Queens. You also have no idea whether other comments will be posted before you make your reply, thereby separating your comments in the thread even further. Despite what you think, the "quote" option isn't feature rot; it's there so the innumerable conversations within the same thread are less difficult to follow.