shaneg said:
Rayz2016 said:
Interestingly negative take on the decision. I agree that Apple shouldn't have cut developer support for the Unreal Engine, however, even if it was warranted. Harming your own customers is a dick move worthy of Epic. Apple shouldn't be doing it. Hopefully the judge's words have opened their eyes to this.
If the app in question was a piece of malware masquerading as a useful app, and the developer had multiple apps under multiple accounts, would you not want Apple to apply the same process?
Please also recognise that no individual or organisation on this planet is capable of being 100% consistent with what they say they will do and what they actually do. A parent will apply the rules of the household on a child inconsistently, a business will serve customers inconsistently, even the judicial system will apply penalties inconsistently. There is plenty of evidence that Apple applies the rules consistently in the majority of cases, even while it occasionally treats some developers (and customers) differently. We hear about the "special" cases only because they are unusual; if everything was a special case then it wouldn't be worthy of attention.
I think it's extremely interesting that the judge felt that after due consideration it was more in the public interest to allow continued development of the Unreal Engine for iOS while the case progresses. As mentioned in an earlier comment, this has implications for Apple's right to exclude certain behaviours from its store. It's quite a thorny argument; should an SDK developed by a third party always be exempted from the store rules if that same third party violates them? I expect Apple will be devoting a lot of attention to this situation and refining the policies applicable to third party SDKs.
Correct me if I'm wrong, but more RAM = greater power consumption, and one of the enduring advantages of iOS is that it operates very well on devices with less RAM than ships in devices running Android.
linuxplatform said:
Google tried lower-RAM devices with "software optimizations" for no reason other than sheer arrogance
linuxplatform said:
Apple wanted to charge Samsung a $50 per device fee over TRADE DRESS and you are upset over Qualcomm wanting about $2 per device for VITAL FUNCTIONALITY WITHOUT WHICH THE DEVICE WOULD NOT WORK AT ALL, AND THAT THEY INVESTED BILLIONS OF R&D IN DEVELOPING? Man, you people are sick ...
verne arase said:
gatorguy said:
I never seriously doubted QC would be successful with their appeal. Now will come the appeal of the appeal ruling.
It doesn't matter to Apple anyway, as they already came to an agreement with Qualcomm that both parties are satisfied with. No doubt QC was already aware of Apple's plans to build its own chipset to replace them, and getting a six-year revenue contract with Apple was a good move considering they wouldn't be needed by then anyway, except for essential patent licenses.
Apple should never have withdrawn its FTC complaint; it should have testified to Qualcomm's extortion in the smartphone industry. Qualcomm has gotten away with its abuse of standards bodies and FRAND terms for much too long.
This will come back to haunt Apple later on when they attempt to manufacture and distribute phones using their own chips.
- Qualcomm had already been paid for the "vital functionality" by the actual manufacturer of the chip, who then sold it to Apple. If a portion of the sale price of the end product (manufactured by someone else!) is included in the licensing fee for an individual component, which seems an extraordinary provision in any licensing agreement, then realistically that money should be included in the price of the chip to Apple, and then forwarded by the chip manufacturer to Qualcomm.
- Trade dress is extraordinarily valuable because it plays a pivotal role in the decision to buy for the average consumer (who is assumed to be as thick as two short planks). Duplication of trade dress can destroy the value capture of the original creator, who is pricing their goods so as to recoup more than the cost of research and development on top of the cost of manufacture, distribution and sales. This dramatically reduces the incentive to develop something new while increasing the incentive to just copy more of what already exists.
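The royalty-base point in the first bullet above can be made concrete with arithmetic. All figures below are hypothetical, chosen only to illustrate how much the choice of base (whole-device price vs. component price) changes the royalty owed:

```python
# Hypothetical illustration: the same percentage rate applied to two
# different royalty bases yields very different per-unit payments.

def royalty_on_device(device_price: float, rate: float) -> float:
    """Royalty computed as a fraction of the end product's sale price."""
    return device_price * rate

def royalty_on_chip(chip_price: float, rate: float) -> float:
    """Royalty folded into the price of the component itself."""
    return chip_price * rate

device_price = 1000.0   # assumed handset retail price, USD
chip_price = 20.0       # assumed modem chip price, USD
rate = 0.05             # assumed 5% royalty rate

print(royalty_on_device(device_price, rate))  # 50.0 per handset
print(royalty_on_chip(chip_price, rate))      # 1.0 per chip
```

With a device-price base the licensor captures value from parts of the product it had no hand in creating, which is exactly the "downstream use" objection developed below.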
So, there are two harms in play. The first involves a creator trying to get money from "downstream" use of something they created - something they already got paid for by the person they directly sold it to - which in the majority of cases causes more harm than good to society. There are some things, like a musical recording, where society has said, OK, if I buy a recording for personal use that's different to buying a recording for hosting a dance party where I charge people for entry, so I should pay more for the recording. But by and large it's recognised that once property is sold it is entirely under the control of the purchaser. IP is nowadays licensed, not sold, but it still gets treated as property, and most people agree that once something is paid for, that's the transaction done. THAT is why people are upset over Qualcomm's actions.
The second harm involves a creator being denied financial reward for taking a risk. It's all very well to argue that the iPhone was obvious in hindsight and that Apple sought compensation for coming up with a rounded rectangle, but nobody denies that the iPhone was new and exciting to the vast majority of consumers. Society broadly agrees that the creator of something new deserves to profit from that invention (whether or not they actually do profit is subject to the vagaries of business), and to the average consumer it was Apple who invented the device with the whole-face touchscreen that runs apps and is called an iPhone. Given the incredibly subtle differences in trade dress (not just the hardware, but the UI that Samsung in particular adopted) between the iPhone and the various "knock-offs", the widely-adopted practice of salespeople telling consumers "[This] is exactly like [that], only cheaper", and people calling their Android device an "iPhone", you can see why people broadly agree that Samsung owed Apple money for copying an original invention.
For as long as people ascribe more value to the look of something than to its functionality, there will be a difference in price between the perceived types of harm involved here - so I don't see anything "sick" about the reasoning of the people on this forum.
As for Taiwan, no, it doesn't belong to China, no matter how much the CCP states otherwise, and though China will ultimately weigh whether to invade Taiwan, most Western countries support Taiwan's right to exist independently of Mainland China. The question is whether those Western nations will support Taiwan militarily.
Maybe, just maybe, Mainland China belongs to the democracy that we know as Taiwan.
Taiwan is part of China and belongs to China.
However, the history being described here is Hong Kong's, not Taiwan's. Hong Kong's society developed under more than 150 years of British rule and British law, and the people who live in that society do not want to be part of the society that exists on mainland China. The legal agreement in place at the end of the lease - the Sino-British Joint Declaration, honoured at the 1997 handover - guarantees Hong Kong's separate legal and economic systems for 50 years, until 2047. The CCP is trying to erode that agreed-upon timeframe, and the people of Hong Kong are resisting that attempt.
It is a bad situation for all concerned. I wonder what would have happened had the British not gone to war with China in the 1800s?
jimh2 said:
elijahg said:
ARM is great for power saving, and in some tasks ARM chips exceed the slower Intel CPUs. But there are other tasks they're much slower at, and generally even the best ARM CPU is miles off the mid-range Intel ones. I don't really see the point in switching architecture again; people aren't complaining about the battery life on MacBooks, which is really the only advantage ARM in a laptop would bring.
Moving from x86 not only means switchers to the Mac lose the "safety net" of running Windows, whether natively or in a VM; the vast numbers of utilities for x86 Linux would also become incompatible, and people who want to dabble in the occasional game couldn't reboot to Windows either. We used to dual-boot Macs at the school network I ran for various Windows apps. Switching would mean developers would need fat binaries again (apart from MAS distribution), and no doubt it would be another chance for Apple to apply even more OS restrictions. There are a lot of downsides for essentially no upsides. Don't get me wrong - x86 is a crap architecture, and if it wasn't for AMD bodging 64-bit support on we'd probably be back to a form of RISC architecture now, like Itanium - but ARM's disadvantages far outweigh the advantages imo.
There are various cloud providers that can spin up a VM in less time than I can do it on my local machine, that manage the Windows licensing for me, and that only charge me for the active use... for the money that I spent on RAM, I could have had almost three years of an always-on, dual core 8GB RAM 160GB SSD remote "machine" with unmetered data transfer at https://buyvm.net
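The "almost three years" claim above is easy to sanity-check. The figures below are assumptions for illustration only, not buyvm.net's actual pricing:

```python
# Back-of-the-envelope break-even: a one-off hardware spend vs a
# monthly VPS rental. All figures are assumed for illustration.

def months_of_rental(hardware_cost: float, monthly_rent: float) -> float:
    """How many whole months of rental the hardware money would buy."""
    return hardware_cost // monthly_rent

ram_upgrade = 245.0   # assumed one-off RAM spend, USD
vps_monthly = 7.0     # assumed monthly price of a small VPS, USD

print(months_of_rental(ram_upgrade, vps_monthly))  # 35.0 months, ~3 years
```

The comparison ignores the residual value of the hardware and any data-transfer or latency costs of the remote machine, so it flatters the rental side somewhat.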
I know that not everyone has access to fast, reliable Internet - but I do, and that changes the way I look at things. Renting resources on someone else's server is economically advantageous and takes a whole swathe of activities and requirements off my plate. When the Mac went to Intel 15 years ago (!) I stopped having to worry about CPU architecture; I suspect I won't have to start worrying about it again.