
Posts by auxio

 I'd imagine that the Android developer community invented packet sniffing.  How far back do you want to go with this pointless techno-elitist nonsense?
I was just about to post the same thing. Another question: will it also work with the Mac? So basically, they need to support a myriad of Android devices connecting to PCs with various capabilities and Windows versions, as well as Macs. What could go wrong?
I think some people just need to be against the top company in any industry -- trying to set themselves apart from the masses or something. I mean, yeah sure, there are plenty of people who buy Apple products to be fashionistas. The same people who buy a Louis Vuitton handbag and whatnot -- mindlessly buying expensive items to show off and brag. So this does create a certain level of animosity towards all Apple customers. However, if you're a technically minded person...
Wow, talk about coming full circle. I was just looking up the history of SGI (and the MIPS architecture) and had no idea Imagination was involved with it now. Makes perfect sense that with the modern mobile incarnation of CPU+GPU on the same die, we'd be revisiting MIPS. Very interesting indeed. Thanks for the info.
Well, Apple came into the mobile GPU space at a time when there were relatively few off-the-shelf options, so they really had no choice but to work closely with a partner (Imagination) on designs and gain experience. That's in contrast to the desktop space, where the GPU market was much more mature by the time they started really leveraging GPUs in OS X. That said, I wouldn't be surprised if they had some desktop GPU experience back in the NeXT days (though it was more SGI's thing...
The GPU is becoming far more relevant to performance than the CPU these days, with high-density displays on mobile devices and 4K/5K displays coming to desktops.
Right. Everything uses GCD under the hood; Swift just does it more transparently than Objective-C did (where some things, like animation effects, would use it intrinsically, and other things had to be told to use it manually). Now if only Swift could interoperate with C/C++ as nicely as Obj-C did (for those of us who need to work across many different platforms)...
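To make that concrete, here's a minimal sketch of the Swift side (Swift 3-era Dispatch API; the queue label and the toy workload are placeholders of my own, not anything from the original discussion):

```swift
import Foundation

// Push work onto a background GCD queue, then hop back to the main queue.
// The Obj-C equivalent was dispatch_async(dispatch_get_main_queue(), ^{ ... });
let worker = DispatchQueue(label: "com.example.worker")  // placeholder label

worker.async {
    let result = (1...1_000).reduce(0, +)  // stand-in for real work
    DispatchQueue.main.async {
        print("Back on the main queue with \(result)")
        exit(0)
    }
}

dispatchMain()  // keeps a command-line process alive so the async blocks run
```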
As long as developers are building their apps with newer versions of Xcode, it will take advantage of optimizations for each CPU type. Xcode will build a different version of the app for each CPU type you tell it to and glue them together, and the system then dynamically chooses the right version based on the device you're running the app on. It's the same technology that was used during the PowerPC to Intel transition on OS X (re: universal binaries). That, combined with use of Grand...
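To illustrate (my own sketch, not from the post): each architecture listed in Xcode's ARCHS build setting gets its own compiled slice, lipo glues the slices into one binary, and the loader picks the one matching the device's CPU at launch. Architecture-conditional Swift code ends up in the corresponding slice:

```swift
// Conditional compilation: each branch is compiled only into the slice
// for that architecture. You can inspect which slices ended up in a
// built binary with `lipo -info`.
#if arch(arm64)
print("Running the arm64 slice")
#elseif arch(x86_64)
print("Running the x86_64 slice")
#else
print("Running some other slice")
#endif
```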
Competitors already do this sort of analysis. Investors should be interested in the fact that Apple has the expertise to customize chip designs to their needs rather than just blindly using off-the-shelf designs like most of their competitors do. This gives them an edge, especially when combined with the software expertise to ensure iOS and apps are highly optimized for this. Even if investors don't understand or care about the technical details, they should understand that...
 It could be in the contrast at the edges of objects in the scene (for example: a dark object on a light background or vice versa).  If you have more pixels in the video image, you can average/smooth those edges better when you downsample the larger image to fit on the screen.
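A toy example of that smoothing (my own sketch): 2x downsampling a grayscale image by averaging each 2x2 block turns a hard black/white edge into a softer gray step:

```swift
// Box-filter downsample: each output pixel is the average of a 2x2 block.
func downsample2x(_ pixels: [[Double]]) -> [[Double]] {
    let h = pixels.count / 2, w = pixels[0].count / 2
    var out = [[Double]](repeating: [Double](repeating: 0, count: w), count: h)
    for y in 0..<h {
        for x in 0..<w {
            out[y][x] = (pixels[2*y][2*x]   + pixels[2*y][2*x+1]
                       + pixels[2*y+1][2*x] + pixels[2*y+1][2*x+1]) / 4
        }
    }
    return out
}

// A hard 0/1 edge inside a block averages out to 0.5 (a gray pixel).
let source: [[Double]] = [
    [0, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
]
print(downsample2x(source))  // [[0.0, 0.5], [0.0, 0.5]]
```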