flyway said: GPUs such as the AMD Radeon Pro 5500M use GDDR6, and the 5600M uses HBM2, in the 16" MacBook Pro.
Is DDR5 mainly for the CPU and how does it compare to GDDR6 and HBM2?
2 channels of DDR4: 50 GB/s
2 channels of DDR5: 100 GB/s
2 channels of GDDR6: 250 GB/s
2 channels of HBM2: 1000 GB/s
DDR5 is for system or main memory in PCs, servers and such, but it usually comes down to cost. If HBM were cheap, there would be systems using it for main memory, but it is too expensive to be used as system memory for a regular PC. It will be interesting if Apple uses it as main memory for high-end Apple Silicon Macs though. I'm almost half expecting it.
There are latency differences that can sway usage of one type of RAM over the other depending on primary application as well.
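The round numbers in the list above fall out of simple peak-bandwidth arithmetic: bus width times transfer rate. A quick sketch, where the bus widths and transfer rates are illustrative configurations I'm assuming, not the exact parts in any particular Mac:

```python
# Peak memory bandwidth = bus width (bits) x transfer rate (GT/s) / 8 bits-per-byte.
# The configurations below are illustrative assumptions, not exact Mac specs.
def peak_bandwidth_gbs(bus_width_bits: int, gigatransfers_per_sec: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits * gigatransfers_per_sec / 8

configs = {
    "2 channels DDR4-3200 (2 x 64-bit)": (128, 3.2),
    "2 channels DDR5-6400 (2 x 64-bit)": (128, 6.4),
    "GDDR6, 128-bit bus @ 16 GT/s": (128, 16.0),
    "HBM2, 4 stacks (4 x 1024-bit) @ 2 GT/s": (4096, 2.0),
}
for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.1f} GB/s")
```

That yields roughly 51, 102, 256 and 1024 GB/s, which rounds to the 50/100/250/1000 figures quoted above. The big HBM2 win comes almost entirely from the extremely wide bus, not a faster transfer rate.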
cogitodexter said:
waveparticle said:
cogitodexter said: Serious question:
What does an Intel Core i9 do that requires it to be as power inefficient, in the same processing circumstances, as an AS M1 Max? Presumably there's a reason why it draws so much more current to achieve the same ends? Are there features in it that are not replicated in the M1 Max?
I'm assuming the architecture is radically different, but what stops Intel from changing to that architecture?
First and foremost, Intel's fabrication technology, i.e., how small they can make transistors, was effectively broken for close to 4 years. Two CEOs and their executive teams were sacked because of it. The smaller the transistors, the more of them you can put in a chip and the less power it takes to drive them. Intel was the pre-eminent manufacturer of computer chips for the better part of 40 years, with 75 to 80% market share for most of those years. It took a lot of mistakes for them to lose their fab lead.
Two things enabled TSMC, Apple's chip manufacturer, to catch and lap Intel in how small a transistor they could make. First, the smartphone market became the biggest chip market, in both units and revenue; it's bigger than PCs and servers. This let TSMC make money, a lot of it fronted by Apple, and invest in making smaller and smaller transistors. Second, Intel fucked up on both ends. They decided not to get into the smartphone market (they tried once it became obvious, but failed), and they made certain design decisions for their 10nm fab that ended up not working, hence the 4-year delay that allowed TSMC to lap them. Even Samsung caught up and lapped them a little.
More transistors mean more performance. Apple's chips have a lot more transistors than Intel's, probably by a factor of 2. If it isn't economical to use a lot of transistors, you can instead increase performance with higher clock rates. Higher clock rates take more power, and it's not a linear relationship: power grows superlinearly with clock speed, because voltage has to rise along with frequency. So Apple's chips have transistors about 2x smaller than Intel's, there are more of them, and consequently Apple can design its chips to run at relatively lower clock rates, consuming less power.
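A toy model makes the clock-versus-power tradeoff concrete: dynamic CMOS power is roughly P = C * V^2 * f, and pushing frequency usually requires pushing voltage too. All numbers here are made up for illustration, not real chip data:

```python
# Toy CMOS dynamic-power model: P = C * V^2 * f.
# Capacitance and voltages below are illustrative, not real chip data.
def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    return cap * volts**2 * freq_ghz

base = dynamic_power(1.0, 0.8, 2.0)   # a chip running at 2.0 GHz and 0.8 V
boost = dynamic_power(1.0, 1.0, 3.0)  # 1.5x the clock, assuming it now needs 1.0 V
print(f"1.5x clock bump costs {boost / base:.2f}x the power")  # ~2.34x
```

In other words, a 50% clock increase can cost well over double the power once the voltage bump is included, which is why a wide-and-slow design (more transistors, lower clocks) wins on efficiency.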
Intel can theoretically design chips with the same number of transistors as Apple, but the chips would be 2x as large. They would not be profitable doing this. Well, really, they would not enjoy their traditional 60% margins doing it this way, i.e., not profitable "enough". So smallish chips with higher power consumption is their way. Apple hates high-power-consumption chips and does it the opposite way (big chips, lower power consumption), and you end up with an M1 Pro having about the same performance as an Alder Lake i9-12900H, but with the M1 Pro needing 30 W and the i9 needing 80 to 110 W. And Apple's on-chip GPU is 2x to 3x more performant than Intel's. They could not have done this if TSMC hadn't become the leader in chip manufacturing.
Intel has plans to regain the fab lead and once again fab the smallest transistors, but we will see about that. They might, or they might not.
blastdoor said: Apple is certainly taking their time on this. It was a strategic error to discontinue a branded Apple monitor+dock. They should have shipped an Apple Thunderbolt 5K display in 2018. They really should have done it in 2016, but I digress.
I can understand the wait for XDR miniLED versions, but a 27" 5K monitor, sourced straight from the iMac, should have been shipping 2 years ago.
Would love to hear how their product marketing and finance folks made all these decisions. There had better be a book someday. It would be a horror book, but those are fun to read too. Maybe it was a bargaining chip with LG for monitor development?
An apple branded monitor is a marketing tool. Marketing-wise it’s nuts to have Mac users staring at a Dell logo all day. If they’re going to do that, then might as well put “intel inside” stickers on Macs too.
Apple sells 20+ million Macs per year that could use an external monitor. With a take-up rate of 5% for a $1,000 Apple monitor, that's 1M units per year, or $1B per year in monitor sales alone. That's huge! And that's without counting the branding value at all.
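The back-of-envelope math, spelled out (all inputs are the rough assumptions above, not Apple's actual figures):

```python
# Revenue sketch using this thread's rough assumptions, not real Apple data.
macs_per_year = 20_000_000   # Macs sold per year that could use an external monitor
take_up_rate = 0.05          # assumed 5% attach rate
monitor_price = 1_000        # assumed $1,000 monitor price
revenue = macs_per_year * take_up_rate * monitor_price
print(f"${revenue / 1e9:.0f}B per year in monitor sales")  # $1B per year
```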
GG1 said:
acheron2018 said: This is probably the most interesting article AI has ever published. Fascinating.
And to state the obvious, SOMEONE has to pay for all those man-years of research. The $5K price is not all profit.
sjworld said: It’s air cooled. This machine is very much likely to start thermal throttling once it reaches 80C during heavy workloads.
I doubt Apple will reveal the upper limit of thermal capacity of this design, but there must be MUCH design margin after Apple admitted this shortcoming in the previous Mac Pro design.
Can anybody estimate how much of the 1400 Watt power supply can actually be in use with all options and RAM installed?
But a technical deep dive (from Apple) would be fascinating.
- Maximum continuous power:
- 1280W at 108–125V or 220–240V
- 1180W at 100–107V
Electrical and Operating Requirements
If these parameter ranges form a rectangular envelope, that is, every combination of extremes is allowed, the design case is running the machine at 16,400 ft altitude, at an ambient temperature of 95 °F, with sustained power delivery of 1280 W. The Dual Vega II card looks like it can hit about 470 W, and the Xeon CPUs have TDPs of up to 205 W. 12 banks of RAM is probably less than 50 W. 1280 W only just covers the maximum power consumption of a fully loaded set of components. Cooling-wise, it will depend on the upper limit of how fast the fans can spin. If they designed it right, the fans will be fast enough to remove 1400 W of heat at 95 °F at 16,400 feet altitude.
- Line voltage: 100–125V AC at 12A; 220–240V AC at 6A
- Frequency: 50Hz to 60Hz, single phase
- Operating temperature: 50° to 95° F (10° to 35° C)
- Storage temperature: –40° to 116° F (–40° to 47° C)
- Relative humidity: 5% to 95% noncondensing
- Maximum altitude: tested up to 16,400 feet (5000 meters)
It’s a bit esoteric though, as it is pathological to max out both the CPU and GPUs at the same time. The machine isn’t limited by thermals. They chose to design it for the typical 110 V, 12 A circuits found in the vast majority of places in the world, so it is limited by the typical power circuits in buildings. Higher power would mean some places have to add higher-power circuits, like the ones for a dryer or an oven. And if you think about it, you don't want any more than that for a machine that sits on or beside your desk. It'll heat up the room, and you'll need to give some consideration to air conditioning.
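To put numbers on both limits discussed above: the component figures and the assumption of a two-GPU-card maximum build are rough estimates from this thread, not measured values:

```python
# Component budget for an assumed maxed-out build vs. the 1280 W continuous rating,
# then the wall-circuit limit that rating has to fit under.
# All component figures are rough estimates, not measured values.
components_w = {
    "2 x Dual Vega II cards (est.)": 2 * 470,
    "Xeon W CPU (TDP)": 205,
    "12 DIMMs of RAM (est.)": 50,
}
subtotal = sum(components_w.values())
print(f"component subtotal: {subtotal} W")            # 1195 W
print(f"headroom under 1280 W: {1280 - subtotal} W")  # 85 W left for SSDs, fans, USB, losses

# Why 1280 W and not more: the rated input is 100-125 V at 12 A.
for volts in (100, 110, 125):
    print(f"{volts} V x 12 A = {volts * 12} W at the wall")
# 100 V x 12 A = 1200 W -> consistent with the 1180 W rating at 100-107 V
# 110 V x 12 A = 1320 W -> consistent with the 1280 W rating at 108-125 V
```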
tenthousandthings said:
dewme said:
runswithfork said: I was so excited to get a Mac Studio until I saw the GPU benchmarks. Very disappointed and the wait continues for a Mac with great 3D performance. They should really clarify that their performance graphs are for video editors only. This is nowhere near the performance of a 3090 for 3D and I didn't think it was, but even if it was 70% of the performance of a 3080 I would have got one. Will there ever be a Mac with comparable GPU performance? Probably not, it seems their focus is solely on the video side of things.
I’m very interested to hear about directly comparable machines (as defined above) that demonstrate that Apple has somehow missed the mark on the graphics performance of the M1 Ultra. I don’t discount the possibility of their existence. I’m also expecting that Apple will deliver a new Mac Pro that breaks through some of the constraints imposed by the Studio’s very small form factor and user friendly operating characteristics, i.e., no earplugs required to operate the system 12 inches from your keyboard.
At the same time, I do agree that Apple has some 'splaining to do regarding their M1 launch presentation material that shows their wonderchip running neck and neck with competitive graphics platforms. This includes ones like the RTX 3090 that require small fission reactors and cooling towers to supply them with sufficient power to max out those critical benchmarks and 3D applications that drive some people's purchase decisions. This is where even diehard Apple supporters are feeling like we’re not getting the whole story from Apple. What exactly did they mean by those graphs and what assumptions were they making? The review numbers don’t add up and we need to know why.
I’d like to see Apple do a better job of defending their design decisions and their pricing, but I suspect they see it as a sort of no-win situation. Better not to get drawn into it.
The M1 Ultra has a 21 single-precision TFLOPS GPU. It will have some wins over the 3090 because of its memory architecture and TBDR, but yes, it isn't going to compete on most GPU compute tasks against a 35 TFLOPS 3090. The rumored 32+8+128 Apple Silicon package for the Mac Pro will have a ~42 TFLOPS GPU, hopefully at the end of 2022, running at about 250 W. So they are getting there. They might even have ray-tracing hardware in it.
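Those TFLOPS figures come from the standard ALU count x 2 (a fused multiply-add counts as two ops) x clock formula. A sketch, where the ALU counts and clocks are commonly reported figures treated here as assumptions:

```python
# FP32 TFLOPS ~= ALU count x 2 (an FMA counts as two ops) x clock (GHz) / 1000.
# ALU counts and clocks below are commonly reported figures, treated as assumptions.
def tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000

print(f"M1 Ultra, 64 cores x 128 ALUs @ ~1.3 GHz: {tflops(8192, 1.296):.1f} TFLOPS")   # ~21.2
print(f"RTX 3090, 10496 CUDA cores @ ~1.7 GHz:    {tflops(10496, 1.695):.1f} TFLOPS")  # ~35.6
```

Note this is a peak throughput figure; delivered performance depends heavily on memory bandwidth and how well the workload maps to each architecture.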
The GPU really is a platform issue for Apple. Virtually all GPU compute code is optimized for CUDA, and hardly any for Metal. It's a long road ahead. They need to get developers to optimize their GPU compute code for Metal. They need to sell enough units for developers to make money. Those units have to be competitive with the competition. So, a Mac Pro with Apple Silicon needs to be rack mountable, able to hold 4 or 5 of those 128-core GPUs, and consume 2x less power than Nvidia and AMD, and Apple needs to have its own developers optimize code for Metal, like with Blender, but for a lot of other open source and closed source software too. Otherwise, it is the same old same old: Apple's high-end hardware is for content creation (FCP, Logic, etc.), and not much else.
A variation of this article shows up virtually every time prior to an iPhone press event. It used to happen like clockwork come September: the new iPhone is running into this or that production problem and could be delayed, everyone should panic. It's like a template where they just replace the names. Then there is a tail-end production-rumor article sometime in January saying Apple has cut iPhone production by 50%, everyone should panic.
It's the nature of any complex mass-production endeavor. Problems arise. People work to get them fixed. Maybe they will make it on time, maybe they will be 4 weeks late. That type of delay is of minor consequence. Missing the holiday shopping season would be a big mistake, but that is quite doubtful. It's not like Apple hasn't been doing this for 40+ years, including at worldwide supply-chain, worldwide mass-production scale for the past 10 years.
The timeline in the article doesn't make sense to me. I would think pilot production (small-scale production) was done in July, so perhaps the reports of problems are older than stated. The point of pilot production is to find the problems with assembly before mass production starts, fix them, and move on to mass production. So finding problems in pilot production is a "no shit!" observation. They work around the clock on every product to get it out the door.
knowitall said: Very interesting, nice info.
Is Gassée former Apple?
I have seen him at an Apple session in Amsterdam.
I know he's someone with software expertise, one of the best I think.
After that, he tried to push BeOS as an alternative PC operating system. Obviously that failed: to succeed as a PC operating system you have to have MS Office, or be free like Linux or Unix. Be then tried its hand at being an Internet appliance operating system, which also failed. After that, Palm bought them out and BeOS tech was going to be in the next-gen PalmOS, but they could never pull legacy PalmOS apps along and were never able to get any OEMs to license PalmOS Cobalt. I'm guessing Rubinstein didn't like it, because Palm didn't use it for webOS. It died inside Palm, or maybe I should say it lies dormant in Access, the Japanese company that bought PalmSource.
Palm was basically the poster child for the mistake of following pundit-class advice. They did everything that people were advising Apple to do: allow Palm clones, license the software, split the company into separate hardware and software companies, find a buyer, who knows what else.
tmay said:
DAalseth said:
lkrupp said:
DAalseth said:
lkrupp said: And if the climate change radicals get their way this is the future for the U.S. Learn to live one or two days a week without power... to save the planet of course.
That's a completely clueless comment.
The climate radicals won’t accept that fact. So yes, if they have their way, energy production will not be able to keep up with demand. If they get their way. Hoping for more rational minds to prevail.
At one time whale oil was the fuel of choice. We transitioned to something better. When that happened there were howls of protest from people that said it would never work. But after a few years the transition was over and things got better. We are going through such a transition now. There are howls of protest and people insisting that it won’t work. They are simply wrong. We can’t afford to not make this change.
Still, the easiest solution is to improve our power grid, making it more efficient and robust, so that excess energy generated in the Southwest can be cheaply distributed through the grid.
Ultimately, a good chunk of residences, business, and places will have enough renewables+storage to become grid independent. It's going to be a vicious cycle for the power companies. It's going to get weird. Like, $100/mo charges just to be connected to the grid. The politics are going to be brutal. I'm a house battery away from being grid independent. With a bidirectional EV, it would be a no-brainer.
I think hydrogen isn't going to make it. It has lost a big use case with cars. Where it can fit in will be interesting to see. Air-to-gas will be available to make jet fuel. Methane (basically natural gas, propane) made from air-to-gas processes can be used for backup or peak electricity generation, but I don't see how that is cheaper than batteries will be. For colder places, backup gas storage used for heating could be a thing, but these places should use geothermal.
crowley said:
dewme said: As inelegant as this is, it's still far nicer than the previous generation so-called "smart battery case (SBC)." If you need the extra battery capacity, you can slap on one of these magnahumpergizers and go to town. If you don't need the extra capacity and don't want to have to cinch up your belt a couple of notches to keep the combined weight from auto-dropping your trousers on you, just leave the thing at home. You don't have to swap out the entire case like the SBC.
If you've ever owned a SBC for a jumbo phone and if you keep your phone in your pants pocket, you will most definitely be adding Jony to your Christmas card list for keeping the base configuration iPhones as light and slim as possible. Adding a big hunkwad of additional heft and girth to your iPhone by slapping on a SBC or a magnahumpergizer is a nice choice that people who really need that kind of added runtime should be happily making on their own. If Apple built all of that heft & girth into the base phone, at least with current battery technology, the appeal of jumbo phones would take a big hit. Hopefully the battery technology will evolve to keep Apple from inflicting that kind of suffering on the masses. This thing is so much better than the SBC.
Speaking of the SBC ... I have one for my Xs Max, but for some reason, mine ain't so smart. In fact, it is rather stupid. Big and stupid. This model got recalled and replaced but the replacement was no smarter than the recalled one. It would not infrequently try to discharge my phone, especially when recharging wirelessly. My understanding was that the SBC would discharge its own battery to keep the phone's battery charged. If I put the SBC and phone on a wireless charger with the SBC at 50% and the phone at 100% I'd expect to come back later and see both batteries at 100%. This was not always the case. Often I'd come back, hours later, and the case would be at 60% and the phone would be at 30%. Not sure why.
I thought the recall would fix it but it did not. What earned the SBC its golden ticket to the junk drawer was discovering that it was interfering with the magnetometer and keeping the compass from working. If I really need a massive amount of runtime, I can always strap the SBC back on and charge it up. Hopefully, Apple learned from the SBC mistakes and the new magnetic humpy energizer thing will be a pure delight.
The MagSafe Battery for the iPhone 12 series has an 11.1 Wh battery and costs $100. Since it charges by induction, it does lose some energy to transmission inefficiency. I would like to see that loss quantified one of these days.
So, the MagSafe battery has about the same capacity as the SBC. Whether people will like the quick attach, detach features of MagSafe versus an always on case is up to them. Pluses and minuses per the usual.
I confess my electrical knowledge is fairly limited.
Watt-hours (Wh) measure the energy available to the whole power circuit and are the most reliable number. It is just amp-hours times voltage: 1460 mAh x 7.6 V = 11.1 Wh. People should only talk about battery capacity in terms of Wh. The mAh figure is some weird nerd, gadget-journalist term of art that needs to die.
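The conversion in one line, using the figures above:

```python
# Wh = (mAh / 1000) x nominal pack voltage.
def watt_hours(milliamp_hours: float, volts: float) -> float:
    return milliamp_hours / 1000 * volts

print(f"{watt_hours(1460, 7.6):.1f} Wh")  # 11.1 Wh, the MagSafe Battery's rating
```

This is also why mAh comparisons across products mislead: a 1460 mAh pack at 7.6 V stores far more energy than a 1460 mAh pack at 3.7 V.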
DaringFireball had a link to an old OS X engineer telling the story of the "shrinkydink" UI from 2006, of which Stage Manager is a modern rendition. It definitely lit up some light bulbs.
Around the 2010 time frame, Apple had several OS X UI patents describing a UI where elements were aligned along the walls of a 3D perspective. When I saw them, my reaction was basically WTF, how could that work? Well, lo and behold: shrinkydink and Stage Manager. That's what those patents were all about.
It's basically a dock for spaces, though it appears shrinkydink had a separate perspective dock for windows within an app. So docks for everything and they used a perspective skew to communicate that it's a dock.