Detnator
About
- Username
- Detnator
- Joined
- Visits
- 44
- Last Active
- Roles
- member
- Points
- 620
- Badges
- 1
- Posts
- 287
Reactions
-
Reddit breaks down the math on how the new MacBook Pro saves them money
IreneW said: sflocal said: IreneW said: Well, let's just say that if their engineers are spending 45 minutes per day, just waiting for a compilation to finish, doing nothing else, they are doing it wrong.
Even if they cut that in half. Cutting down compile times like that is huge.
My IDE tells me in real time if there are any syntactical errors or obvious logical problems in my code. The build server under my desk continuously builds, lints, and checks, including all unit tests, before I even think about committing anything. The build farm in our server room does the heavy lifting of rebuilding Yocto and target images.
When a build, for some reason, breaks my workflow, there are always closely related tasks to do, like adding unit tests, requirement linking, or writing docs.
However your compiling is being done, each compile is still an event, even if it's 2,700 one-second compiles per day (one of many possible numerical interpretations of "continuous" -- 2,700 seconds being that same 45 minutes; a quick breakdown follows below). And you can't claim you don't spend any of those seconds either monitoring or assessing the result of the compiling.
Still... instead of using one computer (like these guys you're criticizing) to handle your development, you have two... plus a build farm. Got it.
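For what it's worth, here's a minimal sketch of the arithmetic behind that "2,700 compiles" figure. The only input is the 45 minutes a day from the quoted Reddit estimate; the per-compile durations are purely illustrative, not anything measured in this thread.

```python
# Illustration of the point above: 45 minutes of daily compile waiting is the
# same total time however it's sliced, but the number of "events" (builds you
# end up monitoring or checking) changes with how long each compile takes.
# The 45 minutes comes from the quoted Reddit estimate; the per-compile
# durations below are purely illustrative.

TOTAL_WAIT_SECONDS = 45 * 60  # 45 minutes per day

for compile_seconds in (1, 30, 5 * 60, 45 * 60):
    events_per_day = TOTAL_WAIT_SECONDS / compile_seconds
    print(f"{compile_seconds:>5} s per compile -> {events_per_day:6.0f} compiles/day")

# Output: 2700, 90, 9, and 1 compiles/day respectively.
```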
-
Reddit breaks down the math on how the new MacBook Pro saves them money
GeorgeBMac said:Detnator said:GeorgeBMac said:AppleInsider said:
However, he says he used a common estimate of $150 per hour for a software engineer -- based not just on salary, but also "recruiting, office leases, support staff," and much more.
Read on AppleInsider
Bullshit!
They are correct -- but only from a mathematical standpoint. But they are wrong from a cost standpoint based on reality.
The costs that they folded in ("recruiting, office leases, support staff") are fixed costs that won't change regardless of how fast the computer is. They certainly won't change by saving the 20 minutes a day they are estimating. So, if those costs don't change, they should not be included in the "salary".
So, how does that work out with a more realistic estimate of, say, $50 an hour ($400 a day)? Saving 20 minutes out of a 480-minute day would reduce costs 4% -- from $400 a day all the way down to $384 -- a savings of $16 a day. So, it would take 220 days to recover the cost of a $3,500 machine. That's roughly a year.
But, to be fair, you can't just use a cost of $3,500. You really need to use the difference in cost of the M1 Max vs whatever else they would have bought -- assuming the existing machine would have been replaced at all.
On the flip side, it could even be considered that, if the engineer is paid on a salary basis, that cost is fixed as well -- so the savings would be zero (assuming the engineer spends the 20 minutes bullshitting around the proverbial water cooler). Or, to put it another way, the savings of 4% would only be realized if you had a staff of 100 engineers and were able to lay off 4 of them.
... To put it another way: if costs don't change then all you did was spend money, not save it.
Cost accounting (like all statistics) can be used to prove almost anything you want. So, it has to be done honestly, correctly, and in context with the existing conditions and how they might change.
... Reddit needs to hire a cost accountant -- or at least find a capable one.
No, the time saved is put towards getting the product out the door sooner, or some other productive outcome that adds to the business in some way…. Unless, if there really is absolutely nothing else those devs can add to the business with their extra time, then they would have to lay off 1 in 25 (they don't have to have 100 devs and lay off 4 for that to be a saving) to offset the MacBook costs -- the difference, not the $3,500, as you say. In which case the $150 is still valid.
But the more likely scenario is they're constantly growing and are constantly balancing what needs to get done, what income each dev ultimately generates (even if indirectly), and how much those devs cost per hour to produce what they produce -- in other words, if the team is now 4% more productive, that's 4% more time the hiring decision maker can wait before spending $150 per hour more on the next much-needed newly added dev. One way or another, that $150 is valid.
The claim was that a $3,500 outlay would save money based on a 20-minute-a-day savings at $150 an hour. Without going back through all the detail I already laid out, the synopsis still says it all: if costs do not change then no money was saved. And most of the costs they included in that $150 an hour would not change. There would be some savings, but nowhere near what they claimed. The other things you cite, such as quicker turnaround, still may (or may not) be beneficial. But the claim did not include them, and neither did I, since I was responding to that claim.
Ok... you want to nitpick... that's ok (really, I'm not saying that to be snarky). Let's do it!
There are two parts here:
- The correct cost for each dev.
- Whether the 20 minutes per dev time saving has any value, and if so how?
Your comment started with "bullshit", quoting the article's $150-per-hour cost of an employee. Really, that's not BS at all. That's what they've figured out each dev costs them (regardless of whether the Mac translates to saving them any of that or not).
You asserted at first that the non-salary costs are fixed, and therefore invalid because they're not "saved". You eventually (correctly) noted that the salary is also fixed, therefore it's not validly "saved" either. But you still did all your calcs based on the $50/hour. That doesn't really add up (pun intended). Regardless of whether or why we do or don't care about the dev cost, it's still everything, not just the salary, so that $150/hour as the cost of the employee is valid...
So then... if we're going to calculate the value of that 20 minutes, it's all or nothing. It needs to be done with either the $150/hour (= $1,200 per 8-hour day) or zero, depending on how the dev and/or company uses that 20 minutes.
So... if 60 minutes per dev costs $150 per dev, then 20 minutes per dev costs $50 per dev. If it's 20 minutes out of a 480-minute day, then it's still 4%, but its cost is still $50 per dev per day (not $16).
Ok... but is it "saved"?
"The claim was that a $3,500 outlay would save money based on a 20 minute a day savings at $150 an hour."
That statement is not quite correct. The Reddit guy -- Jameson Williams -- started with the basic premise of weighing the outlay against the opportunity cost of not spending it, and concluded (claimed) that the $3,500 outlay will pay for itself (both are different to "saving"). He went into detail about the value of the time saved (the $150), without going into detail about how he does or will turn that time saving into the $ value that will pay for the outlay. It could be by laying off 4% of the dev team, or (unlikely but hypothetically) cutting their cost by 4% (which would be mostly salary, as the most liquid part of their cost) and letting them have the time back for themselves, or possibly other ways of turning it into saving money. Or it could be by making more money (the most likely outcome).
Now, granted, the AI article's title and first line refer to "saving" money. That's entirely this article's author's interpretation/extrapolation, and I'm pretty sure he's not a cost accountant, so it's not an unreasonable interpretation given that you've acknowledged at least one way that it is possible (laying off 4% of the devs). I gave a couple of other possibilities too (e.g. more growth possible before needing to hire again).
Regardless, the absence of the "how" in his claim doesn't mean it isn't happening. Of course it's happening one way or another, and it's not unreasonable to assume so (otherwise he wouldn't be in business), and that is all that is required for his claim to be valid. As you noted, about the only way that 20 minutes won't have value to the company is if it's spent goofing off instead of doing something productive.
(As for the $3,500, not that you were calling BS on this part, but just to be thorough, since I'm highlighting the "pays for itself" bit: $3,500 / $50 per day = 70 days = 14 weeks.* So, as long as that 20 minutes is converted to $ value one way or another, then in just over three months the Mac pays for itself in its entirety. Although that's before you count the cost of whatever they'd have instead, so it's really only the difference that matters (as you noted). If it's a $3,500 machine instead of, say, a $2K one, then the difference: $1,500 / $50 per day = 30 days = 6 weeks.* There's a quick sketch of this arithmetic at the end of this post.)
So at worst, some extrapolation is required, if it even is, but either way:
- The $150 cost of the dev is valid, and
- The idea that 20 minutes of time saved per day will turn into $ value one way or another is hardly a stretch (nor "bullshit").
(*at 5 days per week.)
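Since the thread keeps re-deriving the same numbers, here's a minimal sketch of the payback arithmetic using only the figures quoted above (the $50/hour and $150/hour rates, 20 minutes a day, the $3,500 machine, and a hypothetical $2,000 alternative). It's just the back-of-the-envelope math from these posts, not Reddit's actual analysis.

```python
# Back-of-the-envelope payback arithmetic using only figures quoted in this
# thread -- not Reddit's actual analysis.

MINUTES_SAVED_PER_DAY = 20
DAYS_PER_WEEK = 5

def payback_days(outlay: float, hourly_cost: float) -> float:
    """Working days for `outlay` to pay for itself, valuing the 20 minutes
    saved per day at `hourly_cost` dollars per dev-hour."""
    value_saved_per_day = hourly_cost * MINUTES_SAVED_PER_DAY / 60
    return outlay / value_saved_per_day

# Salary-only figure ($50/hr): ~$16.67 saved per day, ~210 working days
# (the post above rounds this to $16/day and ~220 days).
print(payback_days(3500, 50))

# Fully loaded figure ($150/hr): $50 saved per day.
print(payback_days(3500, 150))                  # 70 days = 14 weeks
print(payback_days(3500 - 2000, 150))           # 30 days = 6 weeks vs. a $2,000 machine
print(payback_days(3500, 150) / DAYS_PER_WEEK)  # in weeks, at 5 days per week
```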
-
FAA forced 5G rollout delays despite no proof of harm, claim trade bodies
darkvader said: The FAA does their job, wants to be sure planes don't fall out of the sky. The telcos whine about it.
The standard in aviation isn't "proof of harm" - it's "as close as possible to proof that there is no harm". And if there's a chance, even a small chance, that the frequencies used for cell phone data are going to interfere with older altimeters (I'm assuming radar altimeters; I doubt pressure altimeters could be affected), then the FAA did exactly what they're supposed to do - put the brakes on and demand testing.
The telcos need to calm down. 5G isn't a big deal for the vast majority of people; not having planes fall out of the sky is.
-
Developers get day in court over 'tyrannical greed' of Apple's App Store
OutdoorAppDeveloper said: darelrex said: OutdoorAppDeveloper said: I don't like their chances. Their arguments are weak. Their demands are ridiculous.
If you want to go after Apple's monopoly, you have to use the customer's right to choose what apps they can use on their iPhones. That is where Apple is the weakest.
For example, why can't I mine cryptocurrency on my iPhone if I want to?
Why can't I use BitTorrent if I want to?
Why can't I display a list of the WiFi networks around me?
Why can't I run a Windows or game emulator?
Why can't I choose any kind of Apple Watch face?
Yep, it's your device, sure. And as long as you don't use any of the firmware, OS, and software that are Apple's IP and NOT yours, then sure, you CAN do whatever you like with it. It makes a pretty good paperweight. In exactly the right conditions it can almost work as a mirror.
I suppose if you can figure out how to install BitTorrent or mine cryptocurrency on it without using Apple's IP to do so, then have at it. But if you can't, then that's about as much Apple's responsibility as it is GE's responsibility that you can't do those things with your toaster.
So far as I can see, NO ONE who is spouting this "it's my device" argument has yet responded to the "sure, but it's not your OS and other IP" response. Perhaps you'll be the first? Go on. Surprise us…?
-
Apple Store down ahead of 'Unleashed' new MacBook Pro event
darkvader said: Detnator said: Can't wait to see what @darkvader and a couple of others have to say about how terrible and outdated the M1 Pro and M1 Max are.
I never said the M1 was outdated. I said it was a stupid thing to do. I also said that its real world performance is nowhere close to the hype, and I absolutely stand by that. I've used several more computers with the M1 since I said that, and it's still true. It seems that Windoze 11 ARM will run on Parallels, and that ARM Windoze has Intel emulation, so that may not be as much of an issue as I'd expected, but of course there will be a speed penalty.
I have yet to touch a max/pro MBP, and neither have you. So I have no real world performance comparison to comment on yet, and neither do you. I can say that the camera notch is absolutely idiotic, and will be a major distraction for a lot of people when they use these computers.
darkvader said (July 18, 2021 10:54AM): It was obsolete, deficient, underpowered, and overpriced when it was released. I mean, seriously, a processor released in 2020 that couldn't address more than 16GB RAM? A processor released in 2020 that could only handle two Thunderbolt/USB ports? A processor that can't even handle a dedicated GPU so you get nothing but its underpowered integrated graphics? It's fine for a toy like the iPad, but it doesn't belong in a computer. Apple shouldn't have used that junk in last year's computers. It definitely doesn't belong in computers that haven't even been released yet.
I and a number of others replied to that message arguing the same point: that you're talking as if it's junk because it can't compete with the Mac Pro. (Which is pretty funny, really, because AppleInsider's Andrew temporarily replaced his $6,000 Mac Pro with a $1,200 M1 MBA and it did all his video editing etc. with virtually the same performance as the Mac Pro.) Still, the M1 is a first-generation chip for the base entry-level Macs, and with the M1 in them, those Macs smoke the previous entry-level Macs and every PC in the same class, while keeping up with all but the fastest Macs and PCs. You make a lot of claims but never seem to reply when facts are being discussed.
Your issue with "integrated graphics" is equally misguided. Just because Intel can't get integrated graphics right doesn't mean it's the "integrated" part that's the problem. Apple's now managed to prove that integrated vs. discrete is meaningless. In fact, part of why Apple's chips' graphics are fast is BECAUSE they're integrated. Saying graphics is weak because it's integrated is as idiotic as saying a Porsche isn't a "real" car because its engine is in the "wrong" end.
Real world? Ok... As I've said in more than one reply to your excessively negative posts about the M1s, including the above-quoted one -- none of which you've replied to -- I have a maxed-out (at original release) Intel 2.4GHz i9 16" MBP (64GB RAM, 8GB 5500M) alongside a maxed-out 13" M1 MBP (16GB RAM), both right here. They're basically clones of each other (in what they're running), and the M1 does most tasks faster -- except it can't fry an egg like the Intel one can. There's absolutely no question in my real-world use (along with nearly everyone else's) that the M1 lives up to all of the sensible "hype". If it doesn't do it for you, then YOU'RE doing something wrong. Again, you make a lot of claims but never seem to reply to, or with, facts.
So I don't need to touch a Max/Pro MBP to have a lot of confidence it will live up to what Apple says it can do. So far, every report from the few who have "touched" them says they do. And spec-wise, these chips double to quadruple nearly everything in the M1 except clock speed.
But ok... let's wait until a lot more people have "touched" them... and see. I can't wait to hear how obsolete, deficient, underpowered and overpriced they are then.
PS. For the record, the hype isn't all about how much faster they are. That's almost just the bonus. The hype is that they're faster while using a fraction of the power/energy and generating a fraction of the heat. That's what allows these chips to go into the low-end machines, machines that have never seen this kind of performance till now. Meanwhile, the M1 Max is wiping the floor with server-grade Xeons and desktop graphics cards, from inside a 3.5 lb laptop with 17 hours of battery.