Geekbench 6 released to better benchmark modern hardware
Geekbench has been updated to version 6, with the cross-platform benchmark revised to better handle the specifications of modern smartphones and other devices.

Geekbench 6
Geekbench is a widely used benchmark tool, providing quick ways to compare performance between similarly specified devices. On Tuesday, Primate Labs finally updated the app to version 6, bringing with it quite a few improvements.
The new version is designed to better accommodate changes in the tech world since the release of Geekbench 5. Machine learning, larger smartphone cameras, and higher core counts have significantly changed the way we use hardware, necessitating the update.
Part of the changes is a set of new "real-world tests," which go beyond simply crunching numbers at high speed and instead measure how long actual tasks take to complete. For example, the suite times how long it takes for an example website to load, for filters to be applied to a photo, or for a PDF to be rendered.
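As a rough illustration of what timing a complete task (rather than a synthetic loop) can look like, here is a minimal Swift sketch that measures how long it takes to render the first page of a PDF. The file path and the use of PDFKit are illustrative assumptions; this is not Geekbench's actual workload code.

```swift
import Foundation
import PDFKit

// Minimal sketch of a "real-world" timing measurement: render the first page
// of a PDF and report the elapsed wall-clock time. The path below is a
// placeholder, and PDFKit is just one way to rasterize a page.
let url = URL(fileURLWithPath: "/tmp/sample.pdf")

let start = DispatchTime.now()
if let document = PDFDocument(url: url), let page = document.page(at: 0) {
    let pageSize = page.bounds(for: .mediaBox).size
    _ = page.thumbnail(of: pageSize, for: .mediaBox) // rasterize the page to an image
}
let elapsed = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000_000
print("PDF render took \(elapsed) seconds")
```

A benchmark built this way reports task completion time directly, rather than an abstract throughput figure.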

Geekbench 6 on macOS
New tests in this area include blurring backgrounds for video conferencing, removing background objects from images, and processing text "within a development workflow."
"These tests are precisely crafted to ensure that results represent real-world use cases and workloads," the app's description states.
The CPU benchmark also adds new tests, covering application areas including artificial intelligence, augmented reality, and machine learning. The GPU benchmarks add support for machine learning workloads and aim for more uniform GPU performance measurement across different platforms.
Available for iOS, macOS, Windows, and Android, Geekbench 6 is free for non-commercial personal use. A paid Pro edition is also available, adding features such as command-line operation and offline results management, with a limited-time 20% discount reducing the price to $79.
AppleInsider will be updating the most recent Mac reviews with Geekbench 6 numbers in the coming days.
Read on AppleInsider

Comments
Sorry to break it to you, but Geekbench states they optimize for all platforms and work with processor manufacturers to make sure results are consistent.
Take a look at that Primate Labs GB6 Metal score screenshot. It's of an M2 MBA scoring 45k in GB6 Metal, while the best score for an M2 MBA in GB5 Metal is around 31k. The values being different doesn't mean anything, because they aren't comparing the same thing.
The relative difference between CPUs, GPUs and ML cores may shift from GB5 to GB6 though, because the sub-tests probably stress different parts of the CPUs, GPUs, ML units and memory subsystems. So a particular architecture may do better or worse than it did before, relative to other CPUs, GPUs and ML units.
Of course you can get it now for the special discount price of $79 - but it's still nuts for a benchmarking tool (and still more than five times the old $14.99 price).
Unless you're MAX TECH, this isn't a tool you use every day - and it's waaayyy overpriced for what it does.
https://www.phonearena.com/news/snapdragon-8-gen-2-chip-test-gpu-gaming-vs-apple-a16-iphone-14-pro_id144427
https://www.gizchina.com/2022/12/31/snapdragon-8-gen2-may-defeat-the-apple-a17-bionic-chip/
https://www.notebookcheck.net/Snapdragon-8-Gen-2-vs-A16-Bionic-vs-Dimensity-9200-Qualcomm-s-Adreno-740-wins-the-GPU-race-in-style.682905.0.html
And this is from an Apple-centric site, MacWorld: https://www.macworld.com/article/1507269/samsung-galaxy-s23-snapdragon-8-gen-2-a16-iphone.html
Yes, MacWorld shows the Samsung Galaxy S23 Ultra having a lower multicore score than the iPhone 14 Pro. The problem: Samsung never has the fastest Android devices. Samsung doesn't prioritize speed; they prioritize "light and thin." The Chinese phones, especially gaming phones like the Asus ROG Phone and the ZTE Red Magic, always beat Samsung in benchmarks. (Why do people buy Samsung if the Chinese phones are faster? Because those phones provide a terrible user experience for everything other than gaming.)
But the article didn't mention the free tier at all. Nor did it mention that one of the main motivators for changing the formula was the increased significance of big.LITTLE-style designs (not just in mobile SoCs - Apple, Intel, Qualcomm and MediaTek use them in PC CPUs) and integrated graphics (again, chiefly motivated by the need to compare Apple M1, M2 and soon M3 systems to x86 workstations with discrete graphics).
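As a concrete aside on those heterogeneous core layouts, the sketch below reads the performance- and efficiency-core counts on an Apple Silicon Mac via sysctl. The hw.perflevel keys are, to my knowledge, only present on Apple Silicon systems, so the helper simply returns nil elsewhere; the function name is just for this example.

```swift
import Foundation

// Read an integer sysctl value by name; returns nil if the key does not
// exist (e.g. on Intel Macs or non-Apple platforms).
func sysctlInt32(_ name: String) -> Int32? {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return nil }
    return value
}

// On Apple Silicon Macs, perflevel0 is the performance cluster and
// perflevel1 is the efficiency cluster.
let pCores = sysctlInt32("hw.perflevel0.physicalcpu")
let eCores = sysctlInt32("hw.perflevel1.physicalcpu")
print("Performance cores: \(pCores ?? 0), efficiency cores: \(eCores ?? 0)")
```

Scheduling work across clusters like these is exactly what makes a single multi-core number harder to interpret than it was on homogeneous CPUs.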
Not that it was an upgrade - it was a repurchase of the whole shebang with a tiny discount.
That was fine when the price was $14.99 - not so fine when it's a $99 product.
It wasn't until later, when I started to think it was really expensive, that I checked back on my old Geekbench purchases and found they had traditionally been $14.99.
Anyway, my fault for not looking more closely at the situation - but good to know if you haven't pulled the trigger yet.
Instant is instant whichever way I look at it.
That applies to older iPhones too.
Your iPhone is faster than his iPhone but he never knew he had a problem because his is lightning fast as well.
That said I've seen lots of Android phones doing things faster than iPhones.
But of course you are only talking about benchmarks and while they may have some value, instant is still instant whichever way you look at it.
But then you look beyond the benchmark numbers and see iPhones performing worse in areas where there is a noticeable difference.
Which numbers are more important? The ones you notice or the ones you don't?