
Posts by auxio

Oh, sorry, I thought you were being sarcastic about the fact that iOS apps may not be able to do low-level Bluetooth packet sniffing due to sandboxing restrictions (thus promoting Android).  There's so much FUD being spread around these days by young technical elitists, whose ignorance of the history of the technology their beloved Android is built on is staggering.
But it's hard to resist when some people's data smells so nice.
 I'd imagine that the Android developer community invented packet sniffing.  How far back do you want to go with this pointless techno-elitist nonsense?
I was just about to post the same thing.  Another question: will it also work with Macs? So basically, they need to support a myriad of Android devices connecting to a myriad of PCs with varying capabilities and Windows versions, as well as Macs.  What could go wrong?
 I think some people just need to be against the top company in any industry -- trying to set themselves apart from the masses or something.  I mean, yeah sure, there are plenty of people who buy Apple products to be fashionistas.  The same people who buy a Louis Vuitton handbag and whatnot -- mindlessly buying expensive items to show off and brag.  So this does create a certain level of animosity towards all Apple customers. However, if you're a technically minded person...
 Wow, talk about coming full circle.  I was just looking up the history of SGI (and the MIPS architecture) and had no idea Imagination was involved with it now.  Makes perfect sense that with the modern mobile incarnation of CPU+GPU on the same die, we'd be revisiting MIPS. Very interesting indeed.  Thanks for the info.
 Well, Apple came into the mobile GPU space at a time when there were relatively few off-the-shelf options.  So they really had no choice but to work closely with a partner (Imagination) on designs and gain experience. Compare that with the desktop space, where the GPU market was much more mature by the time they started really leveraging GPUs in OS X.  That said, I wouldn't be surprised if they had some desktop GPU experience back in the NeXT days (though it was more SGI's thing...
 The GPU is becoming far more relevant to performance than the CPU these days with the high density displays on mobile devices and 4K/5K displays coming to desktops.
 Right.  Everything uses GCD under the hood; Swift just does it more transparently than Objective-C did (where some things, like animation effects, would use it intrinsically, and other things had to be told to use it manually).  Now if only Swift could interoperate with C/C++ as nicely as Obj-C did (for those of us who need to work across many different platforms)...
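For anyone who hasn't used it directly: the dispatch-queue model GCD provides can be sketched in a few lines of Swift. The queue label and the chunking below are purely illustrative, not from any real app:

```swift
import Dispatch

let group = DispatchGroup()
let worker = DispatchQueue.global(qos: .userInitiated)
let lock = DispatchQueue(label: "sum.lock")  // serial queue used as a lock
var total = 0

// Split the work into chunks and sum each one on a background queue.
for chunk in [1...250, 251...500, 501...750, 751...1000] {
    worker.async(group: group) {
        let partial = chunk.reduce(0, +)
        lock.sync { total += partial }  // serialize updates to shared state
    }
}

group.wait()  // block until every chunk has finished
print(total)  // 500500
```

Using a serial queue to guard shared state (instead of a mutex) is the idiomatic GCD pattern -- the same machinery the frameworks use intrinsically when they dispatch work for you.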
 As long as developers build their apps with newer versions of Xcode, those apps will take advantage of optimizations for each CPU type.  Xcode will build a different version of the app for each CPU type you tell it to, glue them together, and then the right version is dynamically chosen based on the device the app is running on.  It's the same technology that was used during the PowerPC-to-Intel transition on OS X (re: universal binaries). That, combined with use of Grand...
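You can poke at that fat-binary mechanism yourself with Apple's `lipo` tool (macOS only; the app and file names below are hypothetical):

```shell
# List which CPU slices a binary contains
lipo -info MyApp.app/Contents/MacOS/MyApp

# Glue two single-architecture builds into one universal binary
# (this is essentially what Xcode does for you at build time)
lipo -create MyApp-x86_64 MyApp-arm64 -output MyApp-universal

# Extract a single slice back out
lipo -thin arm64 MyApp-universal -output MyApp-arm64-only
```

At launch, the loader picks the slice matching the device's CPU, so the per-architecture optimizations come for free.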