tht

About

Username: tht
Joined:
Visits: 124
Last Active:
Roles: member
Points: 4,244
Badges: 1
Posts: 4,439
  • New DDR5 SDRAM standard supports double the bandwidth of DDR4

    flyway said:
    GPUs such as the AMD Radeon Pro 5500M use GDDR6 and the 5600M uses HBM2 in the 16" MacBook Pro.
    Is DDR5 mainly for the CPU and how does it compare to GDDR6 and HBM2?
    It depends on how many channels of each memory type.

    2 channels of DDR4: 50 GB/s
    2 channels of DDR5: 100 GB/s
    2 channels of GDDR6: 250 GB/s
    2 channels of HBM2: 1000 GB/s

    DDR5 is for system or main memory for PCs, servers and such, but it usually comes down to cost. If HBM were cheap, there would be systems using it for main memory, but it is too expensive to be used as system memory for a regular PC. It will be interesting to see if Apple uses it as main memory for high-end Apple Silicon Macs though. I'm almost half expecting it.

    There are latency differences that can sway usage of one type of RAM over the other depending on primary application as well.
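
    Roughly speaking, peak bandwidth is just channels x (bus width / 8) x transfer rate. Here is a minimal sketch of that math in Python; the speed grades and widths are nominal picks for illustration, not the only parts shipping, and GDDR6/HBM2 configurations are usually counted in chips or stacks rather than channels:

        # Peak bandwidth in GB/s = channels * (bus width in bytes) * transfer rate.
        def peak_gbs(channels: int, bus_width_bits: int, mt_per_s: float) -> float:
            return channels * (bus_width_bits / 8) * mt_per_s / 1000

        print(f"2ch DDR4-3200:  {peak_gbs(2, 64, 3200):7.1f} GB/s")    # ~51 GB/s
        print(f"2ch DDR5-6400:  {peak_gbs(2, 64, 6400):7.1f} GB/s")    # ~102 GB/s
        print(f"256-bit GDDR6:  {peak_gbs(8, 32, 16000):7.1f} GB/s")   # 8 x 32-bit chips, ~512 GB/s
        print(f"4 stacks HBM2E: {peak_gbs(4, 1024, 2400):7.1f} GB/s")  # ~1229 GB/s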
  • Benchmarks show that Intel's Alder Lake chips aren't M1 Max killers

    Serious question:

    What does an Intel Core i9 do that requires it to be as power inefficient in the same processing circumstances as an AS M1 Max?

    Presumably there's a reason why it draws so much more current to achieve the same ends? Are there features in it that are not replicated in the M1 Max? 

    I'm assuming the architecture is radically different, but what stops Intel from changing to that architecture?
    Power is consumed when a transistor switches from 0 to 1 or 1 to 0. Switching is driven by clock cycles; the more switching, the more power is consumed.
    Well, that is presumably a given, and possibly at a slightly lower level than I was alluding to. More specifically, is there some processing step or overall design feature that Intel gets wrong? Or does it do more 'stuff' that the M1 doesn't do? Is it required to support legacy ways of doing things that the M1 is free from?
    It basically all comes down to economics. There's a lot of hoo-ha about instruction set architecture (RISC vs CISC), but it's not a big deal imo. It comes down to the economics of how many transistors you can have in a chip, what power budgets the OEM is willing to design for, and whether it is profitable in the end.

    First and foremost, Intel's fabrication technology - how small they can make the transistors - was effectively broken for close to 4 years. Two CEOs and their executive teams were sacked because of it. The smaller the transistors, the more you can put in a chip and the less power they take to run. Intel was the pre-eminent manufacturer of computer chips for the better part of 40 years, with 75 to 80% market share for most of those years. It took a lot of mistakes for them to lose their fab lead.

    Two things enabled TSMC, Apple's chip manufacturer, to catch and lap Intel in how small a transistor it could make. First, the smartphone market became the biggest chip market, in both units and money; it's bigger than PCs and servers. This let TSMC make money, a lot of it fronted by Apple, and invest in making smaller and smaller transistors. Second, Intel fucked up on both ends: they decided not to get into the smartphone market (they tried once it became obvious, but failed), and they made certain design decisions for their 10nm fab process that ended up not working, hence the 4-year delay that allowed TSMC to lap them. Even Samsung caught up and lapped them a little.

    More transistors mean more performance. Apple's chips have a lot more transistors than Intel's, probably by a factor of 2. If it isn't economical to use a lot of transistors, you can instead increase performance with higher clock rates. Higher clock rates take more power, and the relationship isn't linear: power grows superlinearly with clock speed, because higher clocks also require higher voltage. So Apple's chips have transistors that are about 2x smaller than Intel's, there are more of them, and Apple can consequently design its chips to run at relatively lower clock rates, consuming less power.
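
    The textbook relation here is dynamic power P ≈ alpha x C x V² x f, and since higher clocks generally require higher voltage, power climbs much faster than clock speed. A toy illustration; the activity factor, capacitance, voltage, and clock values are made up for the example, not measurements of any real chip:

        # Dynamic power: P = alpha * C * V^2 * f (activity factor, switched
        # capacitance, supply voltage, clock). Illustrative values only.
        def dynamic_power(alpha: float, cap_f: float, volts: float, hz: float) -> float:
            return alpha * cap_f * volts**2 * hz

        base = dynamic_power(0.2, 1e-9, 0.8, 3.2e9)  # modest clock, low voltage
        fast = dynamic_power(0.2, 1e-9, 1.1, 5.0e9)  # higher clock needs higher voltage

        print(f"clock: {5.0/3.2:.2f}x, power: {fast/base:.2f}x")
        # ~1.56x the clock costs ~2.95x the power, which is why wide-and-slow
        # designs win on performance per watt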

    Intel can theoretically design chips with the same number of transistors as Apple, but the chips would be 2x as large. They would not be profitable doing this. Really, they would not enjoy their traditional 60% margins doing it this way, i.e., it would not be profitable "enough". So smallish chips with higher power consumption is their way. Apple hates high-power chips and goes the opposite way (big chips, lower power consumption), and you end up with an M1 Pro having about the same performance as an Alder Lake i9-12900H, but with the M1 Pro needing 30 W and the i9 needing 80 to 110 W. And Apple has a 2x to 3x more performant on-chip GPU than Intel. They could not have done this if TSMC hadn't become the leader in chip manufacturing.

    Intel has plans to regain the fab lead and once again make the smallest transistors, but we will see about that. They might, or might not.
  • LG working on Pro Display XDR successor & 2 other high-end monitors, reportedly for Apple

    blastdoor said:
    tht said:
    Apple is certainly taking their time on this. It was a strategic error to discontinue a branded Apple monitor+dock. They should have shipped an Apple Thunderbolt 5K display in 2018. They really should have done it in 2016, but I digress. 

    I can understand the wait for XDR miniLED versions, but a 27" 5K monitor, sourced straight from the iMac, should have been shipping 2 years ago.

    Would love to hear how their product marketing and finance folks made all these decisions. There had better be a book someday. It would be a horror book, but those are fun to read too. Maybe it was a bargaining chip with LG for monitor development?
    I’d say tactical marketing error rather than strategic error, but otherwise I agree.

    An Apple-branded monitor is a marketing tool. Marketing-wise it’s nuts to have Mac users staring at a Dell logo all day. If they’re going to do that, then they might as well put “intel inside” stickers on Macs too.
    Who knows what the difference between tactics and strategy is here, but Apple left billions in revenue on the table by discontinuing monitors. A typical desktop setup in modern times, and I'm going back about 10 years here, is a laptop with an external monitor or two connected to it. When the 4th-gen MacBook Pros came out in 2016, that type of setup should have been mature: no dock or dongle needed, because it would have been built into the monitor. Plug in the TB cable, everything lights up, plug-n-play, and theoretically more "reliable" coming from Apple. This is more or less how it works with my LG UF27, except that it only has 3 USB-C ports in back. It could have had Ethernet, an SD card slot, the usual.

    Apple sells 20+ million Macs per year that could use an external monitor. With a take-up rate of 5% for a $1000 Apple monitor, that's 1 million units per year, or $1B per year in monitor sales alone. That's huge! And that's before even considering the branding purposes.
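
    The back-of-envelope math, using those same assumptions:

        # Hypothetical monitor revenue: ~20M Macs/yr, 5% attach rate, $1,000 price.
        macs_per_year = 20_000_000
        attach_rate = 0.05
        price = 1_000

        units = macs_per_year * attach_rate       # 1,000,000 monitors per year
        revenue = units * price                   # $1B per year
        print(f"{units:,.0f} units -> ${revenue / 1e9:.1f}B per year")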
  • Apple engineers reveal how they prevent Mac Pro overheating

    GG1 said:
    This is probably the most interesting article AI has ever published.  Fascinating. 

    And to state the obvious, SOMEONE has to pay for all those man-years of research. The $5K price is not all profit.
    sjworld said:
    It’s air cooled. This machine is very likely to start thermal throttling once it reaches 80 °C during heavy workloads.
    I doubt Apple will reveal the upper limit of the thermal capacity of this design, but there must be MUCH design margin after Apple admitted this shortcoming in the previous Mac Pro design.

    Can anybody estimate how much of the 1400 Watt power supply can actually be in use with all options and RAM installed?

    But a technical deep dive (from Apple) would be fascinating.

    Edit: grammar
    Apple says:

    Power Supply

    1.4 kilowatts

    • Maximum continuous power:
      • 1280W at 108–125V or 220–240V
      • 1180W at 100–107V

    Electrical and Operating Requirements

    • Line voltage: 100–125V AC at 12A; 220–240V AC at 6A
    • Frequency: 50Hz to 60Hz, single phase
    • Operating temperature: 50° to 95° F (10° to 35° C)
    • Storage temperature: –40° to 116° F (–40° to 47° C)
    • Relative humidity: 5% to 95% noncondensing
    • Maximum altitude: tested up to 16,400 feet (5000 meters)

    If these parameter ranges all apply simultaneously, the design case is running the machine at 16,400 ft altitude at an ambient temperature of 95 °F with sustained power delivery of 1280 W. The dual Vega II card looks like it can hit about 470 Watts, and the top Xeon option has a 205 W TDP. 12 banks of RAM is probably less than 50 W. 1280 W just barely covers the maximum power consumption of the components. Cooling-wise, it will depend on the upper limit of how fast the fans can spin. If they designed it right, the fans will be fast enough to remove 1400 W of heat at 95 °F and 16,400 feet of altitude.
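
    A rough budget from those estimates; the line for storage, I/O, fans, and logic board is my own guess, not an Apple figure:

        # Peak component draw for a maxed 2019 Mac Pro (TDP-class estimates).
        budget_w = {
            "28-core Xeon W (TDP)": 205,
            "2x Vega II Duo MPX modules": 2 * 470,   # ~470 W per dual-GPU card
            "12 DIMMs of RAM": 50,
            "SSD, I/O, fans, logic board (guess)": 100,
        }
        total = sum(budget_w.values())
        print(f"estimated peak draw: {total} W vs 1280 W continuous PSU limit")
        # ~1295 W, i.e. right at the PSU's continuous limit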

    It’s a bit esoteric though, as it is pathological to max out both the CPU and GPUs at the same time. The machine isn’t limited by thermals. They chose to design it for the typical 110 V, 12 A circuits found in the vast majority of buildings, so it is limited by typical building power circuits. Higher power would mean some places would have to add higher-power circuits, like the ones for a dryer or an oven. And if you think about it, you do not want any more than that for a machine that sits on or beside your desk. It’ll heat up the room, and you’ll need to give some consideration to air conditioning the room.
  • Mac Studio with M1 Ultra review: A look at the future power of Apple Silicon

    dewme said:
    I was so excited to get a Mac Studio until I saw the GPU benchmarks. Very disappointed, and the wait continues for a Mac with great 3D performance. They should really clarify that their performance graphs are for video editors only - this is nowhere near the performance of a 3090 for 3D. I didn't think it would be, but even if it were 70% of the performance of a 3080 I would have got one. Will there ever be a Mac with comparable GPU performance? Probably not; it seems their focus is solely on the video side of things.
    While I think there may be more GPU performance in the M1 Ultra than what we’ve seen in current benchmarks there is a point where you have to recognize that physics still do apply. The Mac Studio with M1 Ultra is still a very small form factor machine that you can run very comfortably right on your desktop next to or underneath your monitors. I’m not a gaming PC person but I can only imagine it would be somewhat difficult to find an RTX 3090 equipped machine with the same form factor, power supply requirements, operating thermals, and audible characteristics (fan noise) as Apple’s Studio systems without losing something on the performance side. 

    I’m very interested to hear about directly comparable machines (as defined above) that demonstrate that Apple has somehow missed the mark on the graphics performance of the M1 Ultra. I don’t discount the possibility of their existence. I’m also expecting that Apple will deliver a new Mac Pro that breaks through some of the constraints imposed by the Studio’s very small form factor and user friendly operating characteristics, i.e., no earplugs required to operate the system 12 inches from your keyboard. 

    At the same time, I do agree that Apple has some 'splaining to do regarding their M1 launch presentation material that shows their wonderchip running neck and neck with competitive graphics platforms. This includes ones like the RTX 3090 that require small fission reactors and cooling towers to supply them with sufficient power to max out those critical benchmarks and 3D applications that drive some people's purchase decisions. This is where even diehard Apple supporters are feeling like we’re not getting the whole story from Apple. What exactly did they mean by those graphs and what assumptions were they making? The review numbers don’t add up and we need to know why.
    This, plus the “modularity” spat earlier in this thread, reflects something about the Peek Performance presentation: it’s like the people who designed the Studio handed responsibility over to Marketing and didn’t look back. I guess it probably reflects an attitude that the product speaks for itself, that truly tech-savvy people will recognize a functional marvel when they see it. A good example of this dynamic is the Ars Technica dismantling of the foolish (the nicest word I can think of) YouTube posturing of Luke Miani (mentioned earlier in this thread in a post that has apparently been deservedly vaporized): https://arstechnica.com/gadgets/2022/03/explaining-the-mac-studios-removable-ssds-and-why-you-cant-just-swap-them-out/

    I’d like to see Apple do a better job of defending their design decisions and their pricing, but I suspect they see it as a sort of no-win situation. Better not to get drawn into it. 
    That Luke Miani video has got to be an SEO-engineered YouTube ASMR shit piece. There isn't any honesty in it; it's all acting. It seems ludicrous that any Apple rumorologist (and Miani's channel trades in Apple rumors) doesn't know how Apple's NAND storage works. Apple hasn't used a normal SSD since the T1/T2 Macs came out. Both the 2017 iMac Pro and the 2019 Mac Pro have the same basic storage design as the Mac Studio: dumb NAND on a daughter card with the storage controller in the T2. Other T2 Macs have the dumb NAND soldered onto the logic board instead of a daughter card. With Apple Silicon, the storage controller is in the SoC itself. It's not like Apple hasn't illustrated this multiple times in their videos.

    The M1 Ultra has a 21 TFLOPS (single precision) GPU. It will have some wins over the 3090 because of its memory architecture and TBDR, but yes, it isn't going to compete on most GPU compute tasks against a 35 TFLOPS 3090. The rumored 32+8+128 Apple Silicon package for the Mac Pro would have a ~42 TFLOPS GPU, hopefully at the end of 2022, running at about 250 W. So they are getting there. They might even have ray tracing hardware in it.
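
    Those theoretical numbers fall out of cores x ALUs x 2 FLOPs (a fused multiply-add counts as two) x clock. A quick sketch; the clock speeds are approximate public figures, not official spec-sheet values:

        # Theoretical FP32 throughput in TFLOPS = ALUs * 2 FLOPs * clock.
        def fp32_tflops(shader_alus: int, ghz: float) -> float:
            return shader_alus * 2 * ghz / 1000

        print(f"M1 Ultra (64 cores x 128 ALUs): {fp32_tflops(64 * 128, 1.3):.1f} TFLOPS")  # ~21
        print(f"RTX 3090 (10496 CUDA cores):    {fp32_tflops(10496, 1.7):.1f} TFLOPS")     # ~36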

    The GPU really is a platform issue for Apple. Virtually all GPU compute code is optimized for CUDA, and hardly any for Metal. It's a long road ahead. They need to get developers to optimize their GPU compute code for Metal, they need to sell enough units for developers to make money, and those units have to be competitive with the competition. So a Mac Pro with Apple Silicon needs to be rack mountable, able to take 4 or 5 of those 128-core GPUs, and consume 2x less power than Nvidia and AMD, and Apple needs to have its own developers optimize code for Metal, like with Blender, and for a lot of other open and closed source software too. Otherwise, it's the same old same old: Apple's high-end hardware is for content creation (FCP, Logic, etc.) and not much else.

  • 'Apple Watch Series 7' complexity causing production delays [u]

    A variation of this article shows up virtually every time prior to an iPhone press event. It used to happen like clockwork come September: the new iPhone is running into this or that production problem and could be delayed, everyone should panic. It's like a template where they just replace the names. Then there is a tail-end production rumor article sometime in January saying Apple has cut iPhone production by 50%, everyone should panic.

    It's the nature of any complex mass production endeavor. Problems arise; people work to get them fixed. Maybe they will make it on time, maybe they will be 4 weeks late. That type of delay is of minor consequence. Missing the holiday shopping season would be a big mistake, but that's quite doubtful. It's not like Apple hasn't been doing this for 40+ years, and they have been doing it at worldwide supply chain, worldwide mass production scale for the past 10 years.

    The timeline in the article doesn't make sense to me. I would think pilot production (small scale production) was done in July, so perhaps the reports of problems are older than stated. The point of pilot production is to find problems with assembly before mass production starts, fix them, and move on to mass production. So finding problems in pilot production is like, "no shit!" They work around the clock on every product to get it out the door.
  • ARM Mac Pro coming sooner rather than later, says Jean-Louis Gassee

    knowitall said:
    tht said:
    knowitall said:
    Very interesting, nice info.
    Is Gassee former Apple?
    JLG is indeed ex-Apple, in the late 80s and early 90s, but he doesnotknowitall, knowitall. He doesn’t have any real sources inside Apple nor its supply chain, and is just shooting the breeze here.
    I thought so, but thanks for the confirmation.
    I have seen him at an Apple session in Amsterdam.
    I know he's someone with software expertise, one of the best I think.
    JLG has a pretty storied history vis-à-vis Apple. He was the guy who guided the features of Apple Macs post-Steve Jobs, starting in 1985. The Mac II, IIci, IIfx, all the Mac hardware of that era, were his babies. He left Apple to form Be, Inc. and push the heavily multithreaded BeOS in the 90s, which was the OS Apple was going to buy before deciding on NeXTSTEP. Be thought they had Apple by the huevos and pushed for a higher buyout price, but Apple bought NeXT instead.

    After that, he tried to push BeOS as an alternative PC operating system. Obviously that failed: you have to have MS Office to succeed as a PC operating system, or be free like Linux. They tried their hand at being an Internet appliance operating system after that, and obviously failed. Then Palm bought them out, and BeOS tech was going to be in the next-gen PalmOS, but they could never pull legacy PalmOS apps along and were never able to get any OEMs to license PalmOS Cobalt. I'm guessing Rubinstein didn't like it, because Palm didn't use it for webOS. It died inside Palm, or maybe I should say it lies dormant inside Access, the Japanese company that bought PalmSource.

    Palm was basically the poster child for the mistake of following pundit-class advice. They did everything people were advising Apple to do: allow clones, license the software, split the company into separate hardware and software companies, find a buyer, who knows what else.
  • China increases power cuts, 'scared' suppliers look to leave country

    tmay said:
    DAalseth said:
    lkrupp said:
    DAalseth said:
    lkrupp said:
    And if the climate change radicals get their way this is the future for the U.S. Learn to live one or two days a week without power... to save the planet of course.
    No. 
    Thats a completely clueless comment. 
    Yeah, and how is it clueless? Energy needs are growing exponentially, not shrinking. Climate radicals insist that wind and solar will fill the need. No need for hydrocarbons or nuclear. Abject nonsense. My oldest son is director of engineering at a company deep in the power industry. He was part of the team that designed and built a solar power plant in the Mojave desert that uses liquid sodium to store energy. When he tells me hydrocarbons and nuclear will be around for a very long time I believe him. Solar and wind will never be able to provide a stable base load supply of electricity. The energy density of hydrocarbons far surpasses that of solar and wind. Add to that the problem of storing the energy produced by sources that are not 24/7/365 available.  

    The climate radicals won’t accept that fact. So yes, if they have their way, energy production will not be able to keep up with demand. If they get their way. Hoping for more rational minds to prevail. 
    It is clueless because you obviously know nothing about the subject. Do you ever wonder WHY it’s so hard to move over to renewables? Because the industry and those of us dealing with the issue are working very hard to NOT cause precisely the problems you so blithely say we want. Heck it would be simple to just turn off the power plants and let everyone sit in the dark and cold. But we’ve been struggling for decades to prevent exactly that outcome. As far as hydrocarbons being more energy dense, well that’s true. That’s why we are working so hard to advance alternatives that won’t kill the environment, and as a result billions of people. Moving off of fossil fuels is an absolute necessity. It’s just a matter of how. It’s not something we can wait another hundred years to figure out. We’re already fifty years too late to start.

    At one time whale oil was the fuel of choice. We transitioned to something better. When that happened there were howls of protest from people that said it would never work. But after a few years the transition was over and things got better. We are going through such a transition now. There are howls of protest and people insisting that it won’t work. They are simply wrong. We can’t afford to not make this change.
    Hydrogen, generated via solar energy, is a magic bullet for replacing fossil fuels. It also allows decentralized power production and energy storage, which improves the resilience of the power grid.

    Still, the easiest solution is to improve our power grid, making it more efficient and robust, so that excess energy generated in the Southwest can be cheaply distributed through the grid.
    Yes, the grid needs to be updated to be like the Internet: distributed, adaptable, robust. It should be capable of rerouting power around a failed distribution point, making up for lost power from generators that trip offline, and stopping little power failures from cascading into big ones. It will take a 1950s-style big engineering program, like building the interstate highway system. There is considerable inertia against doing this in the USA, as it's hard to break down the entrenched, incumbent fiefdoms of the public utility companies and state governments. They want to survive, after all, so it is unlikely anything will be done; it will have to happen in a manner that doesn't arise from public policy.

    Ultimately, a good chunk of residences, business, and places will have enough renewables+storage to become grid independent. It's going to be a vicious cycle for the power companies. It's going to get weird. Like, $100/mo charges just to be connected to the grid. The politics are going to be brutal. I'm a house battery away from being grid independent. With a bidirectional EV, it would be a no-brainer.

    I think hydrogen isn't going to make it. It has lost a big use case with cars. Where it can fit in will be interesting to see. Air-to-fuel synthesis will be available to make jet fuel. Methane (basically natural gas) made from power-to-gas processes can be used for backup or peak electricity generation, but I don't see how that will be cheaper than batteries. For colder places, stored gas for backup heating could be a thing, but those places should really use geothermal.
  • Apple releases MagSafe Battery Pack for iPhone

    crowley said:
    tht said:
    crowley said:
    dewme said:
    As inelegant as this is, it's still far nicer than the previous generation so-called "smart battery case" (SBC). If you need the extra battery capacity, you can slap on one of these magnahumpergizers and go to town. If you don't need the extra capacity and don't want to have to cinch your belt up a couple of notches to keep the combined weight from auto-dropping your trousers, just leave the thing at home. You don't have to swap out the entire case like with the SBC.

    If you've ever owned a SBC for a jumbo phone and if you keep your phone in your pants pocket, you will most definitely be adding Jony to your Christmas card list for keeping the base configuration iPhones as light and slim as possible. Adding a big hunkwad of additional heft and girth to your iPhone by slapping on a SBC or a magnahumpergizer is a nice choice that people who really need that kind of added runtime should be happily making on their own. If Apple built all of that heft & girth into the base phone, at least with current battery technology, the appeal of jumbo phones would take a big hit. Hopefully the battery technology will evolve to keep Apple from inflicting that kind of suffering on the masses.

    This thing is so much better than the SBC.

    Speaking of the SBC ... I have one for my Xs Max, but for some reason, mine ain't so smart. In fact, it is rather stupid. Big and stupid. This model got recalled and replaced but the replacement was no smarter than the recalled one. It would not infrequently try to discharge my phone, especially when recharging wirelessly. My understanding was that the SBC would discharge its own battery to keep the phone's battery charged. If I put the SBC and phone on a wireless charger with the SBC at 50% and the phone at 100% I'd expect to come back later and see both batteries at 100%. This was not always the case. Often I'd come back, hours later and the case would be at 60% and the phone would be at 30%. Not sure why. I thought the recall would fix it but it did not. What dealt the SBC its golden ticket to the junk drawer was when I discovered the SBC was interfering with the magnetometer and keeping the compass from working. If I really need a massive amount of run time, I can always strap the SBC back on and charge it up.

    Hopefully, Apple learned from the SBC mistakes and the new magnetic humpy energizer thing will be a pure delight.
    The SBC does have twice the capacity of this MagSafe battery though.  And doesn't lose as much charge in the transmission.  And there's no chance of it falling off.
    The Smart Battery Case for the iPhone 11 has a 10.9 Wh battery, and it costs $130. The SBC for the iPhone X series, I would have to go back and look, but it should be similar.

    The MagSafe Battery for the iPhone 12 series has an 11.1 Wh battery, and it costs $100. Since it uses induction, it does lose some energy to transmission inefficiency. I would like to see the loss quantified one of these days.

    So, the MagSafe battery has about the same capacity as the SBC. Whether people will like the quick attach/detach of MagSafe versus an always-on case is up to them. Pluses and minuses, per the usual.
    Hmm, I did look it up before posting, and the MagSafe appears to have a 1460 mAh battery while the iPhone 11 SBC had 2 x 1430 mAh cells. I wasn't totally making it up, but I didn't notice the wattage. I guess the MagSafe is putting out a higher voltage?

    I confess my electrical knowledge is fairly limited.
    The Smart Battery Case and the MagSafe battery are quite similar in top-level battery design: two cells connected in series. Obviously a lot different in other respects. When cells are connected in series, their voltages add while the amp-hour rating stays the same. The label shows 1460 mAh at 7.6 V. Li-ion chemistry outputs about 3.8 V, give or take, and 7.6 just happens to be 3.8 x 2: therefore, two Li-ion cells in series, giving 1460 mAh at 7.6 V.

    Watt-hours (Wh) measure the total energy in the pack and are the most reliable number: just amp-hours times voltage. So 1.46 Ah x 7.6 V = 11.1 Wh. People should only talk about battery capacity in Wh. A mAh figure by itself is some weird nerd, gadget-journalist term of art that needs to die; it's meaningless without the voltage.
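
    Doing the math for both packs, with the cell counts and ratings discussed above:

        # Pack energy in Wh = amp-hours x voltage. Two Li-ion cells (~3.8 V each)
        # in series double the voltage while the mAh stays that of one cell;
        # series vs parallel wiring doesn't change the total energy.
        def watt_hours(mah: float, volts: float) -> float:
            return mah / 1000 * volts

        print(f"MagSafe pack:  {watt_hours(1460, 7.6):.1f} Wh")       # ~11.1 Wh
        print(f"iPhone 11 SBC: {watt_hours(2 * 1430, 3.8):.1f} Wh")   # ~10.9 Wh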
  • Apple's macOS Ventura beta review: great new features, but some concerns

    DaringFireball had a link to an old OS X engineer who told the story of the "shrinkydink" UI from 2006, of which Stage Manager is a modern rendition. It definitely lit up some light bulbs.

    Around the 2010 time frame, Apple had several OS X UI patents describing a UI where elements were aligned along the walls of a 3D perspective. When I saw them, my reaction was basically WTF: how could that work? Well, lo and behold: shrinkydink and Stage Manager. That's what those patents were all about.

    It's basically a dock for spaces, though it appears shrinkydink had a separate perspective dock for windows within an app. So, docks for everything, with a perspective skew used to communicate that it's a dock.