
Posts by THT

 Yup. Most of the CPU and GPU benchmarks are natively compiled. The web-browser tests have to pass through an HTML and JavaScript engine, so those tests are really CPU+JIT tests.  Huh? 5.0 is 32-bit only? (I don't follow Android closely.) And a 64-bit release is waiting on a 5.0.x or 5.x release in 2015? For GPU, 32-bit vs 64-bit won't make much difference. Then, with Denver, I kind of wonder what the performance will really be like with 64-bit instructions. The code...
Oh Daniel, you love to deliver blowback, don't you. Michael Wolfe should be your next target! That dude is plying his Apple doomsaying stick without even a hint of knowing that he's lying. At least Hruska is only skeptical about Apple and will admit when he's wrong, sometimes. At least, I think so. He'll accept data.   It's ok to be skeptical of Apple's performance claims, but there's no excuse for swallowing Nvidia's marketing spiel. They've gone through too many shame-on-me...
Those SoCs have the Cortex-A53 in them as opposed to the Cortex-A57 core. The Cortex-A53 core is simpler, lower performance, and more suitable to be fabbed on 28 nm nodes. So, it matured faster and got to market faster than the higher performance Cortex-A57 parts. The higher performance Cortex-A57 will really need a 20/22 nm fab to fit inside a smartphone or tablet TDP. The Samsung Exynos 5433, which is in the international or maybe just Korean Note 4, is supposedly a...
 Are people really confused by this? All this says is that the memory bus has 2 channels of LPDDR3 memory that are each 32 bits wide running at a clock of 800 MHz. Math: 2 channels x 32 bits per channel x 800 million cycles per second x 2 transfers per cycle (DDR) = 12.8 GB/s of theoretical memory bandwidth. It says nothing about the CPU ISA and instruction size. The memory bus width sizes can vary quite a bit. Apple shipped SoCs with 128 bit memory buses in the...
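The math above is just multiplication; here's the same calculation spelled out (the channel count, bus width, and clock are the figures from the spec being discussed):

```python
# Theoretical bandwidth of a dual-channel 32-bit LPDDR3-1600 memory bus.
channels = 2
bus_width_bits = 32           # per channel
clock_hz = 800_000_000        # 800 MHz I/O clock
transfers_per_cycle = 2       # DDR: data moves on both clock edges

bits_per_second = channels * bus_width_bits * clock_hz * transfers_per_cycle
gb_per_second = bits_per_second / 8 / 1e9
print(gb_per_second)  # 12.8
```

Note none of these inputs involve the CPU's register width, which is the point: a 64-bit ISA and a 64-bit memory bus are unrelated things.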
 Yeah. Everything is a tradeoff. They'll go to 2 GB sooner or later. But it seems today they believe the user-experience negatives of 1 GB are smaller than the negatives of 2 GB.
 I'm not saying Apple isn't adding another GB of DRAM because of the power draw from the DRAM. That's peanuts. 10s of milliwatts. I am saying that by having another GB of DRAM, users will end up running more background processes resulting in more usage of the CPU and wireless radios. 100s of milliwatts. The benefit of running all those extra background processes is questionable for Apple's mass market customers versus having the system last longer. More RAM is an...
FWIW, my speculation on why Apple is sticking with 1 GB of RAM for the iPhone is really from an uptime perspective (battery blah blah blah), but not due to the power draw from another GB of DRAM.   The power draw from 1 GB of DRAM is peanuts in the grand scheme of things. The increased power consumption comes from powering the CPUs and wireless radios that more RAM would afford, i.e., more processes, more applications running require the SoC and radios to run more...
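A back-of-envelope calculation makes the argument in the last two posts concrete. All the numbers here are assumptions for illustration (an iPhone-class ~6 Wh battery, ~0.5 W average system draw, ~30 mW for an extra GB of DRAM, ~300 mW for the extra background CPU/radio activity that RAM would enable):

```python
# Estimated runtime impact: DRAM power itself vs. the activity it enables.
battery_wh = 6.0          # assumed battery capacity, watt-hours
baseline_draw_w = 0.5     # assumed average system draw, watts

def runtime_hours(extra_draw_w):
    """Battery runtime given additional average power draw."""
    return battery_wh / (baseline_draw_w + extra_draw_w)

base = runtime_hours(0.0)     # baseline: 12.0 h
dram = runtime_hours(0.030)   # +30 mW for another GB of DRAM: ~11.3 h
busy = runtime_hours(0.300)   # +300 mW of background CPU/radio: 7.5 h
print(round(base, 1), round(dram, 1), round(busy, 1))
```

Under these assumptions the DRAM itself costs well under an hour of runtime, while the extra background activity costs several hours, which is the distinction being made.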
 No, it makes sense. It was all but inevitable. There was only one thing left in my mind, but it is at best a compromised measure, and I wasn't sure 20 nm would get them there. All the low hanging fruit for CPU performance have been implemented. ARM marched up the path that server/desktop processors took. They did it in 6 years versus about 2 decades. Superscalar, deep pipelining, SIMD, OOOE, branch prediction, multi-core, 64-bit, etc. It took close to 20 years for the PC...
 You are talking about far-field wireless power transmission, right? That sort of thing just isn't going to work as long as the device has a display to power. You are talking microwatts and picowatts with incredibly bad efficiency rates. I can't imagine this thing working in any scenario whatsoever. If Apple has a secret sauce, it's going to be a super low power display, super low power wireless radios and super low power sensors and SoCs.
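The microwatt claim falls straight out of the Friis transmission equation for free-space propagation. The numbers here are illustrative assumptions (a 1 W transmitter at 2.4 GHz, unity-gain antennas, 3 m range), not any real product's specs:

```python
import math

# Friis transmission equation: P_rx = P_tx * G_tx * G_rx * (lambda / (4*pi*d))^2
p_tx = 1.0                  # assumed transmit power, watts
freq = 2.4e9                # assumed carrier frequency, Hz
wavelength = 3e8 / freq     # ~0.125 m
distance = 3.0              # assumed range, meters
g_tx = g_rx = 1.0           # isotropic (unity-gain) antennas

p_rx = p_tx * g_tx * g_rx * (wavelength / (4 * math.pi * distance)) ** 2
print(p_rx)  # ~1.1e-5 W, i.e. about 11 microwatts
```

So even a full watt radiated across a room delivers on the order of tens of microwatts, while a small display needs tens to hundreds of milliwatts: a gap of roughly four orders of magnitude.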
 Pebble uses an e-Ink or e-paper display, right? That means anything involving color or animation is not possible on the Pebble? Most of the Android Wear watches use LCDs or OLEDs, which require a lot of power and therefore have runtimes like cell phones or other devices that use those types of displays. As for Apple's iWatch: wait and see. I could see them delivering anything from a device that needs to be charged every night to one that can last a month. It all depends on...