
Posts by macarena

Link please? This is the first I am hearing of such a conspiracy theory!
Motorola must be regretting that bluster by their lawyer. This is the sort of statement that paints the company as a reckless aggressor, abusing Standard Essential Patents. Google might have managed to wriggle out of the Oracle mess (at least for now), but this noose looks like a tight one. Motorola is going to see the value of its patents evaporate pretty soon. If a $15 chip from Qualcomm comes fully licensed to the entire 3G stack, how much money could Motorola...
There could be multiple reasons for this: - First, Apple has done a great job of alerting users when a WiFi network is available - you just select the network, enter the password, and you're done. On Android, you have to do this manually, and you don't get alerted about WiFi availability. - Secondly, iPhone users typically use their phones more, whether on WiFi or on 3G. So chances are they are more sensitive to usage limits and use WiFi wherever they can. - Thirdly, the...
This isn't Nokia's work. It's clearly MS's style. But even for MS this is stupid. Windows Phone is not competing against the iPhone; it is competing against Android. Attempting to compete against the iPhone makes them look pathetic, because the only issue they can come up with is a non-issue - one that is almost 2 years old and was ignored by everyone even then! Why would anyone care about this issue today? If they instead focused on Android, it would position Windows Phone as a better...
"Apple's proprietary solution" - indeed. Nokia should know that Apple's design is pin-compatible with both SIM and MicroSIM, so simple physical adapters are all it takes to convert a nanoSIM to a Micro or regular SIM. The point is not about violating ETSI's tech specs - that is just pulling wool over our eyes. The real issue is when Nokia says Apple is devaluing others' IP! That is the true reason. This is a master stroke by Apple. They have nothing to lose - worst case, status quo...
How many tasks currently being handled by servers really need 64-bit support? There are a few use cases - like Database servers, massive number crunching, etc - where sheer performance is the only metric that matters. Those use cases will not be suitable for ARM at the moment. But there are dozens of use cases that are extremely suitable for ARM... For instance - the biggest deployments of servers today are for supporting the "Cloud". For Cloud purposes, there is no point...
Don't be afraid to dream - it costs nothing, and invariably, it is the most improbable dreams that lead to the most exciting breakthroughs! Maybe you don't know what the major issues impacting data centers today are - it is not raw power that is the issue, it is performance per watt. Not just in terms of the power consumed by the server itself, but also other costs like the cost of cooling the data center, etc. An ARM based server that can be clocked at higher...
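To make the performance-per-watt point concrete, here is a minimal back-of-the-envelope sketch. All the numbers (wattages, request rates, electricity price) are hypothetical and purely illustrative; the PUE (Power Usage Effectiveness) factor is a real industry metric that captures facility overhead such as cooling.

```python
# Illustrative sketch (hypothetical numbers): why performance per watt,
# not raw power draw, drives data-center cost. PUE = total facility power
# divided by IT equipment power; it captures cooling and other overhead.

def annual_energy_cost(server_watts, pue, price_per_kwh, hours=24 * 365):
    """Yearly electricity cost for one server, including facility overhead."""
    kwh = server_watts * pue * hours / 1000
    return kwh * price_per_kwh

def cost_per_unit_work(server_watts, pue, price_per_kwh, requests_per_sec):
    """Dollars per billion requests served -- the metric that actually matters."""
    yearly_requests = requests_per_sec * 3600 * 24 * 365
    return annual_energy_cost(server_watts, pue, price_per_kwh) / (yearly_requests / 1e9)

# Hypothetical comparison: a 95 W x86 box vs. a 20 W ARM box that serves
# fewer requests per second -- the ARM box can still win on cost per request.
x86 = cost_per_unit_work(95, pue=1.8, price_per_kwh=0.10, requests_per_sec=10_000)
arm = cost_per_unit_work(20, pue=1.8, price_per_kwh=0.10, requests_per_sec=4_000)
print(f"x86: ${x86:.2f} per billion requests, ARM: ${arm:.2f}")
```

Even with the slower (hypothetical) ARM box serving well under half the requests, its far lower draw leaves it cheaper per unit of work once cooling overhead is factored in.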
This and several similar cases are examples of Apple being held to a higher standard than other comparable companies. But honestly, I genuinely think it is fair to hold Apple to a higher standard. This is one of those cases where Apple slipped up. Other companies can maybe rely on fine print and disclaimers, but Apple should not resort to such a defense. "Think Different" applies here too. This just is not Apple's way of doing things, and someone somewhere slipped...
That's where Apple's core competency lies... I suggest Apple modify its Apple TV solution to make it the world's best server hardware package. - Add a Thunderbolt port, including Power over Thunderbolt, so one cable handles power, data, and everything required. It should support the full Thunderbolt spec of 20Gbps. - 8 GB of Flash for local storage, with all units able to access petabytes of data on external SANs via the Thunderbolt port - at faster speeds than they can...
Just checking - was it actually possible to spot the dead pixel? On a retina display, where the eye is not supposed to be able to see the pixels? Or did you have to hold it closer to the eye to see it? I guess what Apple really means is that we can't distinguish between the pixels. Imagine if we actually have tech where we can't even spot a single bad pixel! Maybe another 4-5 years of Apple innovation!
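The "can't see the pixels" claim can actually be checked with a little trigonometry: normal visual acuity is roughly 1 arcminute, so a pixel is nominally unresolvable once its angular size at the viewing distance drops below that. The sketch below assumes the iPhone 4-era Retina figure of 326 ppi and typical viewing distances of 10-12 inches.

```python
import math

# Back-of-the-envelope check (assumed numbers): is a single pixel resolvable?
# Normal visual acuity is roughly 1 arcminute; the "Retina" claim is that at
# a typical viewing distance the pixel pitch falls below that threshold.

def pixel_arcminutes(ppi, distance_inches):
    """Angular size of one pixel, in arcminutes, at the given viewing distance."""
    pixel_size = 1.0 / ppi  # pixel pitch in inches
    radians = math.atan2(pixel_size, distance_inches)
    return math.degrees(radians) * 60

# 326 ppi display held ~10-12 inches from the eye:
print(pixel_arcminutes(326, 10))  # ~1.05 arcmin -- right at the limit
print(pixel_arcminutes(326, 12))  # ~0.88 arcmin -- below the limit
```

Note the distinction this highlights: being unable to *resolve* two adjacent pixels is not the same as being unable to *detect* one high-contrast point, which is why a single dead pixel can still catch the eye on a display whose pixel grid is otherwise invisible.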