misa

About
- Username: misa
- Joined
- Visits: 34
- Last Active
- Roles: member
- Points: 270
- Badges: 1
- Posts: 827

Reactions
Breaking the trend: why Apple is likely to release both an 'iPhone 7s' and 'iPhone 8' this...
jkichline said:
1983 said: The S8 multi-core performance is actually very slightly better than the iPhone 7's, even though the difference is tiny, so you could say they're on equal footing there. It's single-core performance where the iPhone 7 is still the king by quite a considerable margin.
https://www.xda-developers.com/benchmark-cheating-strikes-back-how-oneplus-and-others-got-caught-red-handed-and-what-theyve-done-about-it/
Even though they may claim not to be cheating now, you will never know unless the benchmark changes its benchmarking algorithm with every version.
Which brings me to the other half of the performance claim. iOS software is 100% C/C++/Obj-C/Swift, all compiled to native code. Android devices run a mixture of Java and native code. Even if you could run Android on an iOS device, it would perform worse than iOS running the same software, because there is this entire Java problem to deal with.
You can see this with games developed in Unity. The game on iOS works like a charm: low loading time, etc. The same game on Android, on equivalent hardware, has long loading times, lots of render stutter, slow and ugly OS-native UI, etc.
Like, there is no way in hell I would buy an Android device for that reason alone. You need 20% more "phone" just to make up for Android inefficiency, and that is reflected in synthetic benchmarks, which perform non-UI computations. If you were to actually test the responsiveness of a phone on the same software, the iOS software is actually responsive, whereas Android is sluggish. There is just no getting around this. I've never seen a game perform identically on an iOS phone and a Samsung phone; the Samsung phone is always kneecapped by Android.
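For context, the "non-UI computations" a synthetic benchmark performs boil down to timed kernels along these lines. This is a minimal sketch in Swift; the workload and iteration count are arbitrary illustrations, not any real benchmark's actual kernel:

```swift
import Foundation

// Minimal sketch of a synthetic, non-UI benchmark kernel:
// time a tight floating-point loop and report how long it took.
// Workload and iteration count are arbitrary illustrations.
func timeKernel(iterations: Int) -> Double {
    var accumulator = 0.0
    let start = Date()
    for i in 1...iterations {
        // Simple FP work the compiler can't trivially discard.
        accumulator += sin(Double(i)) * 0.5 + sqrt(Double(i))
    }
    let elapsed = Date().timeIntervalSince(start)
    // Print the accumulator so the loop isn't optimized away.
    print("checksum: \(accumulator)")
    return elapsed
}

let seconds = timeKernel(iterations: 10_000_000)
print(String(format: "10M iterations in %.3f s", seconds))
```

A kernel like this measures raw compute throughput only; it says nothing about UI responsiveness, which is exactly the gap between synthetic scores and the sluggishness described above.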
Apple ditching Imagination Technologies GPU technology, moving design in-house
AppleInsider said:...
The move away from Imagination may be part of an attempt by Apple to take more control over the design of its hardware. Apple was said to be in talks to acquire Imagination early last year, though ultimately no such deal was made.
Apple has also taken time to poach a number of Imagination's staff over the last two years, including GPU architects and designers. These employees could help Apple to produce its own graphics architecture, potentially saving it from having to pay royalty fees to Imagination for using its intellectual property.
According to Imagination's statement, Apple has asserted that it has been "working on a separate, independent graphics design in order to control its products."
Imagination is also seemingly suggesting there could be a legal fight in the future over the in-house graphics architecture move, declaring Apple has not presented any evidence to substantiate its assertion that it will no longer require Imagination's technology, without violating Imagination's patents, intellectual property, and confidential information. While evidence has been requested by Imagination, Apple has declined to provide any to the company.
Imagination believes that it would be extremely challenging to design a brand new GPU architecture from basics without infringing its intellectual property rights, so in the statement about the matter, Imagination does not accept Apple's assertions. The company has also attempted to discuss potential alternative commercial arrangements with Apple for the current license and royalty agreement.
And yes, GPU tech has pretty much plateaued for mobile designs. If you want more GPU power you need to move up to the iPad/tablet platform and have a much larger battery. There's probably still some innovation left in the GPU pipe, but like the CPU tech, it's going to be refinements, not leaps.
Hence, Apple will probably just go with its own IP core for the GPU of the mobile devices. Don't expect to see this in laptops or desktops, where Apple can just buy CPUs and GPUs that are suitable. Apple's end-game appears to be to eliminate the desktop/laptop space entirely by making the iPhone/iPad your one-and-only computer, where you just drop the iPhone into a docking station to get the full iMac/Mac Pro experience *shudder*
I just don't see how neglecting their professional users has done anything but push Mac users away from the Apple ecosystem.
MCX sells one-time Apple Pay challenger CurrentC to JPMorgan Chase
9secondkox2 said: Lol. Deduced this back when Apple Pay was being blocked while waiting for CurrentC to get its act together. If MCX actually cared about security, convenience, and ease of use like they say, they'd simply throw full support behind Apple Pay.
The only solution that actually protects the consumer instead of tracking the consumer.
This is what happens when merchants get greedy. Credit cards have been a staple of American commerce since 1959. Making payment harder than the original card was doomed to fail. The reason Apple Pay works while bad alternatives like Google's and Samsung's have repeatedly failed is that you can use Apple Pay everywhere there is an NFC-enabled card terminal, which is pretty much everywhere. The alternatives either don't work, are insecure (MST), or require special equipment not already present. So the end result is that you're better off using the plastic card you already have than flipping through apps or figuring out how to hold the phone for it to work.
As an example of poor design: many "loyalty cards" still use barcodes, and if you have a screen protector on your phone, most barcode readers can't read through it.
Microsoft claims Windows licensing gains are chipping into Apple's 'premium' computer mark...
xzu said: The Surface Studio looks like such an amazing design; it's the first time I have been tempted to purchase an all-in-one desktop since my iMac debacle(s). Too bad it runs Windows. Hoping the competition will make Apple rethink what they are doing.
It's still a weak system; it's basically the tablet internals with a larger surface area. The tablet systems are less than 50% as powerful as a quad-core system.
And no, Apple will never use its own A-series CPU/GPU parts as long as Intel's parts are still 4X as powerful. AMD is putting out 8-core CPUs and 4-core iGPUs that are weak (but better than Intel's). Apple needs those A-series parts for the iPhones, iPods, iPads, and Apple TV units, and the ARM parts don't scale up in power as efficiently as Intel's parts, while the Intel x86 parts don't scale down to "mobile" devices very well.
It's been calculated out several times: in order to use the A-series parts, Apple would need CPU performance parity with a desktop at 1/4 the power. It's not hard to squeak out a win over the Intel GPU parts, because Intel's GPUs have always been rubbish, but the CPU side cannot do the same. The per-thread performance is nowhere near where it needs to be.
The iPhone 7 (https://browser.primatelabs.com/v4/cpu/387526):
- Single-core A10 @ 2.34 GHz: 3360

vs. Intel parts:
- Intel Core i7-4790K @ 4.80 GHz: 6664
- Intel Core i5-7600K @ 3.80 GHz: 6665
Those are all quad-core parts. Even assuming you could straight up double the A10's clock speed without a penalty in power usage, these are still different architectures, and clock-for-clock they are not the same.
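To make the clock-scaling point concrete, here is the back-of-the-envelope arithmetic as a runnable sketch, using only the Geekbench 4 numbers quoted above. The naive linear scaling at the end is precisely the unrealistic assumption being warned against:

```swift
import Foundation

// Back-of-the-envelope comparison using the scores quoted above.
// Naive linear clock scaling is an illustration only; real chips
// don't scale this way across architectures or power envelopes.
let a10 = (score: 3360.0, ghz: 2.34)      // iPhone 7, single-core
let i7_4790k = (score: 6664.0, ghz: 4.80) // desktop, single-core

// Score per GHz (a rough "per-clock" figure of merit).
let a10PerGHz = a10.score / a10.ghz           // ~1436
let i7PerGHz = i7_4790k.score / i7_4790k.ghz  // ~1388

// What the A10 would score if it scaled linearly to 4.80 GHz,
// which no mobile part can actually do within its power budget.
let naiveScaled = a10PerGHz * i7_4790k.ghz    // ~6892

print(String(format: "A10 per GHz: %.0f", a10PerGHz))
print(String(format: "i7-4790K per GHz: %.0f", i7PerGHz))
print(String(format: "A10 naively scaled to 4.80 GHz: %.0f", naiveScaled))
```

The gap is in absolute single-core score (3360 vs. 6664), and the per-clock figures only look comparable because the arithmetic ignores that the A10 cannot sustain desktop clock speeds or power draw.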
Apple switched from PPC to Intel, and unless Intel stops making viable parts, they have no reason to switch away from it. If they ever decide to switch, there will be a directive to Apple software developers to go back to "Fat Binaries," and so far that has not been done.
Apple says hidden Safari setting led to flawed Consumer Reports MacBook Pro battery tests
macxpress said: Why the hell would CR mess with any settings? Just take the damn thing out of the box and test it as is. I don't trust CR anyways. I'm sure they'll cover their ass and just blame it on Apple. I've always said these issues are most likely software issues with the battery. I seriously doubt Apple would release a laptop that's very well noted for its battery life when it actually gets half or worse that amount. Apple is one of the few you can actually rely on for accurate battery life.
There are really three ways to test battery life:
- Movies/media: create a playlist of assorted music and movies roughly 16 hours long, while a background application time-stamps every minute and records the battery meter's reported value (a sketch of such a logger follows this list).
- Standby: literally run only the time-stamp app, with no other software running, and see how long the machine idles. This is where the "maximum battery life" figure comes from.
- Conventional usage: run two applications, one that logs the battery level and another that cycles through the web browser, media player, and so forth using macro playback, configured to use a testing proxy server that feeds consistent data for the web tests. Basically, you see how far the machine gets in a 16-hour "work/home cycle" of use.
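A minimal sketch of the kind of time-stamp logger described above, assuming macOS, where the stock `pmset -g batt` command reports battery status; the one-minute interval and log path are arbitrary choices:

```swift
import Foundation

// Minimal battery logger sketch (macOS): once a minute, shell out to
// the stock `pmset -g batt` command and append its output, with an
// ISO 8601 timestamp, to a log file.
let logURL = URL(fileURLWithPath: "battery-log.txt")
let formatter = ISO8601DateFormatter()

func batteryStatus() -> String {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/pmset")
    task.arguments = ["-g", "batt"]
    let pipe = Pipe()
    task.standardOutput = pipe
    try? task.run()
    task.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    return String(data: data, encoding: .utf8) ?? "read failure"
}

while true {
    let entry = "\(formatter.string(from: Date()))\n\(batteryStatus())\n"
    if let data = entry.data(using: .utf8) {
        if let handle = try? FileHandle(forWritingTo: logURL) {
            handle.seekToEndOfFile()
            handle.write(data)
            handle.closeFile()
        } else {
            try? data.write(to: logURL) // first run: create the file
        }
    }
    Thread.sleep(forTimeInterval: 60)
}
```

Diffing consecutive entries in the log gives the discharge curve, which is the raw data all three tests above reduce to.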
So you would typically turn off caching if you are testing the web browser's speed, but turning it off for a battery-life test takes it pretty far from how users normally use a web browser, so an inconsistent reported battery life means there is a problem with the testing setup. For it to be a hardware/software issue, the problem would have to replicate on every device running the same software and hardware, which is something CR didn't expand its research into.