Since the M1 blocks you from running Windows, and not being able to run Windows means you can't run proprietary Windows programs, it doesn't matter how fast either machine is. That is EXACTLY what it is: Windows vs macOS. As I said, but you ignored:
It doesn't matter how fast it is if it can't do what it needs to do. Would you take a Porsche to pick up a yard of mulch?
For me, it brings back memories of a project I was handed to transfer data from a DB2 database into a proprietary system and, the only way to do that while retaining data integrity was to type it into the receiving system. I had both a Mac and a Windows machine sitting on my desk at work but neither would do it. Instead I had to use my home computer running OS/2 to read a record from the DB2 database and then type it into the proprietary system using a keyboard emulator. In that context, both the Mac and Windows machines were worthless.
M1 doesn't "block" Windows. If Microsoft wants to make Windows on ARM available for retail sale, or to Parallels and VMware as "OEMs," then it will run on the M1 Macs with a little bit of work (I'm running a beta of Windows on ARM on the Parallels technical preview, and it mostly works).
Try realizing this thread is solely about performance & benchmarks. Spend all day saying how Intel has still got its own thing going, I don't care. Bringing that into an unrelated topic is misleading, or I guess someone doesn't understand the context.
I am SO very sorry that I introduced the fact that, if the tool can't do the job at all, then its performance is zero.
I'm just wondering how "Stock Option Pricing" and "Online Homework" are measurable metrics?
Intel is desperately misleading customers.
Intel has been misleading people (probably investors most of all) for a while. After years of underdelivering in the real world, people are starting to see what they've been doing. The following video says they even go above their power spec (1:30):
They measured as high as 40W. The reviewer there wondered why they'd do this but Samsung has done the same to inflate benchmark results.
These power levels Intel uses will be closer to the 16" MacBook Pro's. Plus they are using non-native apps without hardware acceleration. They can probably fool their investors with tests like this, but they can't fool everyone. The real-world experience of Intel machines vs ARM Macs is clear to see.
GeorgeBMac said: For me, it brings back memories of a project I was handed to transfer data from a DB2 database into a proprietary system and, the only way to do that while retaining data integrity was to type it into the receiving system.
This just might be the dumbest thing I've read in a long time. If you think the only way to have accomplished your task was to manually type the contents of one database into another, then you should have been fired for incompetence. All commercial databases are proprietary, and exporting to flat files is effectively the lowest-common-denominator solution. At least that can be automated. I've never seen a proprietary system of any type that did not have import capabilities, and at the enterprise level, I've never seen a vendor that wouldn't allow this type of import into their system. Seriously, manually re-keying information is far more prone to errors, not to mention that such a solution is hardly scalable.
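For what it's worth, the flat-file route being argued for here can be sketched in a few lines. This is only a hypothetical illustration using Python's built-in sqlite3 as a stand-in for DB2 (a real DB2 export would go through a driver such as ibm_db, or DB2's own EXPORT utility), and the table and column names are invented:

```python
import csv
import sqlite3

# Stand-in source database (DB2 in the original story).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Ada"), (2, "Grace")])

# Dump every row to a flat file, the lowest-common-denominator
# format that almost any proprietary system can import.
with open("customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])  # header row
    writer.writerows(conn.execute("SELECT id, name FROM customers"))
```

The point of the sketch is just that the export step is trivially automatable; whether the receiving system's importer preserves the integrity checks is a separate question, which is exactly what the reply below disputes.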
Apple isn’t really the competition he needs to worry about.
Your company lost Apple’s business. Get over it already.
I get your point about AMD and Qualcomm being the primary competition. In terms of "direct" competition, I agree with you. However, Apple is setting the bar now. They have become the new standard by which all others will be measured. In that respect, Intel knows it has to compare what it is doing to what Apple is doing.
I'm just wondering how "Stock Option Pricing" and "Online Homework" are measurable metrics?
Intel is desperately misleading customers.
Intel has been misleading people (probably investors most) for a while. After years of underdelivering in the real world, people are starting to see what they've been doing. The following video says they even go above their power spec (1:30):
They measured as high as 40W. The reviewer there wondered why they'd do this but Samsung has done the same to inflate benchmark results.
These power levels Intel uses will be closer to the 16" MacBook Pro's. Plus they are using non-native apps without hardware acceleration. They can probably fool their investors with tests like this, but they can't fool everyone. The real-world experience of Intel machines vs ARM Macs is clear to see.
Oh yeah, they've been there for a long time; it's not even a secret if you know a thing or two about laptops.
You'll be laughed at to your face for still believing TDP. Every 15-watt chip goes above 30 W, while every 45-watt chip goes to 60-70 W (and now even 80).
GeorgeBMac said: For me, it brings back memories of a project I was handed to transfer data from a DB2 database into a proprietary system and, the only way to do that while retaining data integrity was to type it into the receiving system.
This just might be the dumbest thing I've read in a long time. If you think the only way to have accomplished your task was to manually type the contents of one database into another, then you should have been fired for incompetence. All commercial databases are proprietary, and exporting to flat files is effectively the lowest-common-denominator solution. At least that can be automated. I've never seen a proprietary system of any type that did not have import capabilities, and at the enterprise level, I've never seen a vendor that wouldn't allow this type of import into their system. Seriously, manually re-keying information is far more prone to errors, not to mention that such a solution is hardly scalable.
LOL... Perhaps it is you who should be fired. Has anybody ever taught you what "data integrity" means and how to achieve and maintain it? Yeah, I could have bypassed all the data edits that ensured that integrity and loaded the database myself. But then data integrity may or may not have been intact. I understood the foolishness of that, so I elected to take each new record through the full process of data edits before it was loaded into the proprietary database.
Yours is the opinion of the so-called expert who knows nothing.
But another piece you did not understand is that, in the narrative I gave, there was no manual retyping: I used an application on OS/2 to automate that part; the computer did all the "typing". It would read a record from the original relational database and then "type" it into the proprietary database with no human intervention. Any records with errors were recorded in an error file and later corrected and reprocessed the same way.
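The loop described there (read a record, run it through the receiving system's edits, send it automatically, log failures for later reprocessing) can be sketched like this. It is only an illustration: the original used an OS/2 keyboard emulator, and the validate and send_keystrokes functions here are hypothetical stand-ins for the real system's data edits and input mechanism.

```python
def validate(record):
    # Stand-in for the receiving system's data edits; here we just
    # require a positive id and a non-empty name.
    return record["id"] > 0 and bool(record["name"])

def send_keystrokes(record):
    # Placeholder for the keyboard-emulator step that "types" the
    # record into the proprietary system.
    pass

def load(records):
    errors = []
    for rec in records:
        if validate(rec):
            send_keystrokes(rec)   # the computer does the typing
        else:
            errors.append(rec)     # recorded in the error file
    return errors                  # corrected later, reprocessed the same way

bad = load([{"id": 1, "name": "Ada"}, {"id": -1, "name": ""}])
```

The design point being argued is that every record still passes through the full edit path of the receiving system, unlike a raw bulk load that bypasses those checks.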
You'll be laughed at to your face for still believing TDP. Every 15-watt chip goes above 30 W, while every 45-watt chip goes to 60-70 W (and now even 80).
Some famous examples include: 4th-gen, 8th-gen, 9th-gen, 10th-gen, 11th-gen...
heh - revisiting this topic next year is going to be fun