cloudguy

About
- Banned
- Username: cloudguy
- Joined:
- Visits: 21
- Last Active:
- Roles: member
- Points: 1,149
- Badges: 1
- Posts: 323

Reactions
MacBook Air with M1 chip outperforms 16-inch MacBook Pro in benchmark testing
All right. I will eat crow. I have long claimed that there was no way that Apple Silicon would match the Core i7 at launch and would probably be in line with the Core i3 or at best Core i5. I was wrong. I shall go sit in the corner with my dunce cap on now.
But while on my way to the corner I will protest:
Apple did not reach this performance with the 4 and 6 core iPhone and iPad chips as people were claiming previously. Apple only reached this performance with an octacore chip that was specifically designed for use in personal computers - not mobile devices - one that requires more cores, draws more power and dissipates more heat. We have always known that this was possible, as modern (meaning an ARM Holdings design base, not the Sun SPARC and other early RISC servers that go back to the 1980s) Linux-based ARM workstations and servers have existed since at least 2011 (the year after the A4 was released). Ubuntu has had official ARM releases since 2012, and HP - the venerable Wintel manufacturer - has been selling ARM servers to data centers since 2014.
So I was absolutely right about Apple not being able to build a MacBook Pro or iMac with a 6 core chip that had 128KB/8MB caches (the M1 is octacore with 192KB/12MB caches). Lots of people on this site and elsewhere were indeed claiming that the 4 and 6 core low power/low heat iPhone chips could absolutely be put in a MacBook Pro and work as well as or better ... those people were as wrong as I was, and even more so.
Now into the corner of shame I go, sucking my thumb in the process. But you folks who claimed that this would have been possible with the iPhone chips need to go to corners of your own. -
Apple's claims about M1 Mac speed 'shocking,' but 'extremely plausible'
h4y3s said: Don't overlook the unified memory architecture that Apple can deploy, (as they own the whole stack) this will save 2x on a lot of common functions!
So for workloads that are both CPU and GPU intensive (right now this is handled by giving, say, 16 GB of RAM to the CPU and 4 GB of dedicated memory to the GPU, as on the 16-inch MacBook Pro), UMA will only work for Apple Silicon if enough RAM is provided. Of course, Apple knows this. And they know that users of, say, vector video editing or 3D animation - examples of heavy peak simultaneous CPU/GPU workloads - are going to be willing to pay a lot for that RAM. A rough sketch of what a shared pool looks like in practice follows.
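Here is a minimal, illustrative Swift sketch using Metal's shared storage mode, which is how one allocation becomes visible to both the CPU and the GPU on Apple Silicon. The buffer size and the fill loop are made up for the example; this is a sketch of the idea, not production code.

```swift
import Metal

// On Apple Silicon, a .storageModeShared buffer is backed by one physical
// allocation that both the CPU and the GPU can touch, so there is no separate
// "CPU RAM" copy and "GPU VRAM" copy of the same data.
// (With a discrete GPU, the data would typically be blitted into the GPU's
// own memory instead.)
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let elementCount = 1_000_000
let byteLength = elementCount * MemoryLayout<Float>.stride

// One allocation, visible to both processors.
guard let sharedBuffer = device.makeBuffer(length: byteLength,
                                           options: .storageModeShared) else {
    fatalError("Could not allocate buffer")
}

// The CPU writes directly into the same memory a compute or render pass
// can later read; no staging copy is required.
let values = sharedBuffer.contents().bindMemory(to: Float.self,
                                                capacity: elementCount)
for i in 0..<elementCount {
    values[i] = Float(i)
}
```

The flip side, as noted above, is that CPU and GPU work now compete for the same pool, which is exactly why generous RAM configurations matter.
-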
MacBook Air with M1 chip outperforms 16-inch MacBook Pro in benchmark testing
joguide said: kpom said: DuhSesame said: I wonder how much difference we’ll see for the Air vs. the Pro.
I assume it’s just the long-term performance.
But as for right now, there are already plenty of fanless Windows 10 laptops out there - and I mean real Windows, not Windows on ARM, which tries and fails to emulate x86! Consider the Acer Switch 7: 16 GB of RAM and an Intel Core i7 processor. There are also a couple of Dell XPS fanless laptops and a couple of Asus ones, in addition to more Acer ones.
Get this: folks are kicking around the idea that the new Intel Tiger Lake CPUs with integrated Iris Xe graphics will allow fanless gaming laptops to become a thing (because Tiger Lake is Intel's low heat/low power design and the Iris Xe GPU - which is integrated in all Tiger Lake Core i5 and higher chips - is supposed to provide gaming performance on par with the Nvidia MX350).
So seriously, you guys need to pay more attention to the wider tech world. If you are thinking that Apple Silicon is going to result in magical devices that the rest of the tech world can't comprehend, let alone compete with, and that Apple will quadruple or more its market share and influence as a result, prepare to be sadly mistaken. The tech media might not know this - as Apple devices are all that they use and, as a result, truly cover - but actual consumers do.
-
ProtonMail CEO says Apple strong-armed adoption of in-app purchases
lightvox88 said: I'm not claiming to speak for the millions of iOS device owners, but I personally buy Apple devices for the sand-boxed and curated App Store. I don't want any alternate app stores. If you want alternate, and willy nilly then go buy an Android device, and sell your wares on those platforms. Leave our walled infrastructure alone. If you don't like it, then leave...
"Pork is bad"
with
"I like chocolate"
Your liking chocolate - while fascinating on its own merits - has nothing to do with the health effects of eating pork or the ethics of raising animals for the purpose of slaughtering them. Likewise, your desire for a single method of installing apps on iOS has nothing to do with the way that Apple chooses to manage that method. The ProtonMail CEO isn't saying that he wants an alternative app store so he doesn't have to offer IAPs. He is saying that he doesn't want to be forced to offer IAPs on the App Store that he is already in. -
Microsoft releases M1-native Visual Studio Code for developing apps
VS Code isn't "Microsoft's long-standing app development software." That would be Visual Studio. VS Code is a free, lightweight, open source editor covering a tiny subset of what Visual Studio does, released in order to stop the bleeding of scripting programmers - i.e., JavaScript and Python developers - from Visual Studio to competing free and open source tools. The older IDEs - Visual Studio, NetBeans, Eclipse, etc. - were designed around full-blown programming languages like C++ and Java. But for scripting languages, full-blown IDEs were overkill. In addition, in some instances the IDEs were proprietary software that cost a ton of money and weren't available on all platforms (see Visual Studio Enterprise).
Basically, JavaScript is replacing Java for a ton of client-side (Angular) and server-side (Express on Node.js) applications ... the MEAN stack is now supplanting the LAMP stack - all the rage 10 to 15 years ago - for ecommerce sites. (C/C++ was never widely used for web servers and applications, though it is possible, and Microsoft has made some attempts to push it with their IIS web servers.) And then you have Python and R used for data science. Microsoft was losing a huge chunk of the next generation of programmers, so they created and open-sourced VS Code to get them back. Fortunately for them, VS Code is excellent software, so that plus the Microsoft name worked like a charm: it is the de facto standard, including for people who are now using it for Java and C++ instead of Visual Studio. -
ARM deal nears closure with Nvidia mulling $40B purchase from SoftBank
killroy said:Too many legal land mines in this sale. An IPO would be better and keep things out of court.
Apple doesn't do this because - see above - Apple isn't into basic R&D. They do R&D for their own products, and it isn't in their interest to license that work.
As for the IPO thing ... SoftBank investigated that, largely because it was the stated preference of ARM Holdings' current employees, previous owners/stakeholders and many of the licensees. The problem is that there is absolutely no way that ARM Holdings is going to generate anywhere near $32 billion in an IPO. ARM Holdings doesn't make or sell products. They are what a lot of folks on this board would not hesitate to call a patent troll were ARM ever to sue Apple, or vice versa. Their only value is to an existing company that wants to bring their R&D in-house.
Nvidia is pushing to create an ARM-based edge computing platform (something about parallel processing on GPUs to take advantage of architectural capabilities that do not exist in CPUs because of the way CPUs are designed to handle instructions) to sell to data centers and cloud companies. They are satisfied with the software, but right now they are basically running it on commodity hardware. They want to create their own custom ARM-based data center GPU hardware that is designed specifically for, and optimized for, their platform software and data center workloads. If they are able to buy ARM and dedicate their R&D resources to this design problem, they are going to dominate this market - which is lucrative and on the verge of exploding, but also niche, because it requires specialized hardware and software that is very difficult and expensive to build and that few companies have the expertise to provide or any realistic way of acquiring anytime soon - and the $40 billion will pay for itself many times over. But if they are not able to, that will give the competition - which does exist - time to catch up by coming up with a similar platform with better hardware (or software), or another approach to edge computing altogether.
But the bottom line is that Nvidia's reasons for buying ARM have nothing to do with consumer devices like phones, tablets, PCs and smartwatches. Nvidia's competitors in the ARM-based enterprise hardware space have some reason to be concerned, but that is about it. -
Apple now blocking new installs of sideloaded iOS apps on M1 Macs
Of course, you all know by now that I am not an Apple apologist. But I am not anti-Apple either! So consider my point of view. Take cloud streaming. The single most promising cloud streaming effort - in my opinion - is Nvidia GeForce Now. Why? Because it allows you to play the games that you have already purchased - Steam and Epic games - on a Microsoft Windows PC instance in the cloud. Or at least it did. The publishers of many of these games refuse to allow them on GeForce Now. Despite the fact that purchasers have already bought a copy, the people who created and published the games stated that those licenses were only good for downloading them to and running them on PCs, and that they didn't have a licensing model to cover cloud use. So GeForce Now was very reluctantly forced to delist most of its catalog.
This is the same situation. When these publishers put these apps on the App Store, they licensed them only for iOS and iPadOS. That is a valid, legally binding contract. Those apps can't be put on tvOS, watchOS or macOS without the publishers agreeing to license them there. Why wouldn't they? Who knows. But ultimately the apps are owned by the publishers. So long as they meet Apple's terms of service, they have the unilateral right to decide which platforms their apps appear on, and even to pull them from the App Store entirely. Apple's hands are completely tied here. There is literally nothing they can do legally. If they allow, or do not act to prevent, license violations, these app developers can and will sue them and will absolutely win, because Apple will not have a valid defense. At all.
Remember: iOS is not Android. It is Android that has always sold itself as an open ecosystem, and app developers who embrace Android do so knowing full well that their licenses aren't going to be honored: people are going to obtain their apps and install them on whatever devices they want, often without paying. It is for this reason that lots of developers avoid Android like the plague. They chose Apple instead because Apple promised them a secure ecosystem that would protect their investment as developers by enforcing their licenses and not allowing their apps to run anywhere they aren't authorized.
I repeat: this is a good thing. A very good thing. Security, privacy and control: the very reasons why developers choose Apple over Android and Windows, and why users choose Apple over Android and Windows in the first place. Well, users other than me. I prefer maximum control, which is why I prefer Linux and Android. And developers other than me. If I ever get back into the software development game - it has been a while and things have, er, changed lol - I will choose Android, because I would want my apps to be distributed as widely as possible, and there are the "freemium/free with ads" models to monetize those who sideload.
But I want people who have different wants - indeed different business needs than my own - to have a choice. Apple offers them that choice. Good grief, think about work environments that require security clearances. Would you even want a computer with the ability to install unauthorized apps in the first place? Of course not. That is why, if I had a business or contract with those constraints, everyone would be required to use iPhones, iPads and Macs, and they would all carry the strictest MDM profiles possible to lock them down. Despite my personal preference for Linux, Android and ChromeOS, business needs are business needs. OK?
As for you folks who said "why did I buy an M1 Mac in the first place if it can't run mobile apps!" ... I am sorry, but buying a PC to run mobile apps makes absolutely no sense at all. That is not what PCs are for. I bought my first two ChromeOS devices long before Google added Android and Linux app capability to them and was perfectly fine with both. Your M1 Mac is just the same as your Intel Mac was, except a good bit faster.
Also, can you wait a bit PLEASE? The M1 Mac platform isn't even 3 months old yet. Give more developers time to update their licensing terms. It will take less than a year. Even better: wait the 2-3 years it will take for developers to embrace writing apps with iOS, iPadOS and macOS versions (a small sketch of what that kind of shared code looks like follows this paragraph). Lots of them are going to do it, but it is just going to take time. They can't do it right now because they don't have M1 Macs suitable for this development work yet. Don't roll your eyes: the only M1 Macs released so far have been entry level devices: Mac minis, MacBook Airs and MacBook Pros with up to 16 GB of RAM and rudimentary GPUs. Wait until the 32 and 64 GB RAM workhorse machines with GPUs capable of taking on the best of Nvidia and AMD become available, so the developers of your favorite iPad apps will actually be able to tweak their existing apps and make new ones. These M1 Macs that you bought today will still be good in 2023, right? And you didn't just throw your previous, perfectly functional Intel Mac in the trash when you bought your M1 model, did you? (If you did, that's on you.)
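For what "apps with iOS, iPadOS and macOS versions" means in practice, here is a minimal, hypothetical sketch: the same SwiftUI code can be built for all three platforms from one codebase. The view and app names are made up for illustration; this is a sketch of the approach, not anyone's actual app.

```swift
import SwiftUI

// Illustrative only: this same SwiftUI code compiles for iOS, iPadOS and
// macOS targets from a single codebase. All names here are made up.
struct GreetingView: View {
    @State private var name = ""

    var body: some View {
        VStack(spacing: 12) {
            TextField("Your name", text: $name)
            Text(name.isEmpty ? "Hello!" : "Hello, \(name)!")
                .font(.title2)
        }
        .padding()
    }
}

@main
struct GreetingApp: App {
    var body: some Scene {
        WindowGroup {
            GreetingView()
        }
    }
}
```

The point is simply that much of the code can be shared, so shipping a Mac version alongside the iPhone and iPad versions becomes far less work than a rewrite.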
I can't believe that I am actually defending Apple products from Apple fans on an Apple site, but here we are. If you are going to get upset over stuff like this, well, Windows and ChromeOS beckon. You can run all the Android apps you want on a Chromebook or Chromebox, and with BlueStacks and its competitors, running Android apps on Windows is easy too. -
Apple's Tim Millet discusses A14 architecture, future chip designs
"The expansion of the Neural Engine to 16 cores instead of 8 cores brought up the question of why Apple elected to devote transistors there and not budgeted for more GPU or CPU performance, which Millet suggests is down to how Apple views the feature."
Two things.
1. There are a lot of AI features on mobile and this will increase in the future. You offload those AI features to their own chips for the same reason that floating point math was offloaded at one time and graphics and security functions are offloaded now. Separating the specialized functions allows both the specialized and the generalized functions to run better. (A sketch of how apps opt into this split on Apple platforms follows at the end of this comment.)
2. The other companies involved in AI - Microsoft, Amazon, Google - are using the cloud for AI. Apple lacks their cloud prowess and infrastructure so they are trying to compete using hardware. Apple is going to either aim to provide more AI functionality locally than the competition can provide using the cloud - which should be possible in theory so long as Apple can come up with applications for it - or provide similar AI functionality but with more "speed, security and privacy" than shipping AI queries and results back and forth over a 5G connection.
While it isn't this way on PCs - yet - go ahead and accept that going forward there are going to be three main processing units in mobile: CPU, GPU and AI.
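As promised under point 1, here is a minimal, illustrative Swift sketch of how an app opts into that CPU/GPU/Neural Engine split today via Core ML's compute-units setting. The model file name is hypothetical; only the Core ML calls themselves are real API.

```swift
import Foundation
import CoreML

// Core ML lets an app declare which processing units a model may run on:
// .cpuOnly keeps inference on the CPU cores, .cpuAndGPU adds the GPU, and
// .all also allows the Neural Engine, leaving the general-purpose cores
// free for other work.
let config = MLModelConfiguration()
config.computeUnits = .all   // allow CPU, GPU and Neural Engine

do {
    // "SomeImageClassifier.mlmodelc" is a hypothetical compiled model.
    let modelURL = URL(fileURLWithPath: "SomeImageClassifier.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print("Model loaded with compute units: \(config.computeUnits.rawValue)")
    _ = model   // predictions would then go through model.prediction(from:)
} catch {
    print("Could not load model: \(error)")
}
```

Note that Core ML, not the app, decides where each layer actually executes; the computeUnits setting only states what the app is willing to allow.
-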
Supreme Court rules in favor of Google in Oracle Java fight
Please recall that the original judge to rule on this case was a programmer - as a hobby - and he called out Oracle's nonsense for what it was. Had the subsequent judges, juries, etc. been required to take even an online programming course, Oracle's years-long attempt to profit off Google's hard work would have been avoided. Keep in mind: Oracle is the same company that spent years claiming the cloud would fail. Now Azure and Amazon are both making a mint on tools that convert Oracle's byzantine legacy databases into much more modern cloud databases for free.
Oracle bought Sun - who agreed to let Google use the APIs and said so at trial - because they thought that they could make hundreds of billions off Java licenses. They didn't know - because I guess all their programmers were still using PL/I and COBOL - that virtually none of the people who used Java paid for it. Sun gave Java away to nearly everyone for free because they wanted to create a standard Internet programming platform for the front end, middleware and back end. Legacy companies like Microsoft, Oracle and Apple couldn't wrap their heads around the need for such a thing at the time. Even though Sun would have preferred to get paid through licensing, they were fine with Google not paying them because Sun wanted Android to succeed too. Yes, Sun wanted Android to succeed as an open mobile platform, because their own attempt at an open mobile platform had already failed. If Android hadn't succeeded, mobile would have been split between proprietary, incompatible platforms from Apple, Microsoft and Nokia.
Oracle made a bad purchase and tried to sue Google in order to recoup some of their bad investment. It didn't work. Plus, software development has moved on anyway. Java is now a legacy platform. The MEAN stack - MongoDB plus Express, Angular and Node.js - has replaced the LAMP (Linux, Apache, MySQL and PHP/Python/Perl) stack. Also, the people who would have been learning Java 10 years ago are now learning Python, Golang and Rust. Even Google has essentially replaced Java on Android with Kotlin (a language that compiles to the JVM and can also transpile to JavaScript).
Google delayed pushing ARM on ChromeOS because they didn't want Oracle to cite it as a talking point. Now they are moving full steam ahead, bringing ChromeOS to Qualcomm chips (before, ARM Chromebooks were mainly MediaTek) and also designing their own ARM SoC for use with ChromeOS and Android. They knew after the Supreme Court argument that Oracle was going to lose. All Oracle did with this nonsense was enrich their lawyers. Instead of wasting 8 years suing Google, they should have spent that time and money developing their own next generation cloud database and programming language platform. Instead they bought the formerly independent, open source MySQL, resulting in pretty much everyone who used MySQL dumping it for any alternative they could find. -
Google Project Zero security researcher moves to Apple
chasm said:The Google Project Zero team have ironically been excellent at discovering security issues within iOS (et al), so it is great news that one of them will be joining Apple directly. I wish the people in that team were allowed to take charge over the security of Android and the Google Play Store, the major sources of malware and security issues in the mobile world ...
A) Virtually no Android exploits actually impact real-world end users.
B) Nearly all Android vulnerabilities are due to bypassing Google Play and sideloading apps - which means that what Apple fans who call Android a monopoly and demand that governments break it up are asking for would result in more security issues, not fewer. Or do Apple fans want this precisely so the increased security problems force customers who would otherwise be happy and satisfied with Android into iOS unwillingly?
C) The open nature of Android makes accomplishing real security impossible. Google learned from this and has since made every other platform - i.e. ChromeOS, Wear OS and Android TV - much more locked down. You cannot so much as obtain an image of ChromeOS to create a virtual machine or use with Boot Camp. As a result, virtually no security issues - even the type that result in no end-user exploits and are easily avoided, as on Android - exist on ChromeOS and the others.
D) If your core complaint is Google having Project Zero at all, well, there was nothing preventing Apple from investing the massive resources and leadership that it took Google to build the best private cybersecurity research team in the world. You should bash Apple for not creating their own team instead of bashing Google for having this sort of initiative. Instead, Apple is reduced to acqui-hiring someone whom Google put in the hard work of identifying and training.