Reddit breaks down the math on how the new MacBook Pro saves them money

Comments

  • Reply 21 of 44
GeorgeBMac Posts: 11,421 member

However, he says he used a common estimate of $150 per hour for a software engineer -- based not just on salary, but also "recruiting, office leases, support staff," and much more.


    Read on AppleInsider
    Bullshit!
    They are correct -- but only from a mathematical standpoint.
    But they are wrong from a cost standpoint based on reality.

The costs that they folded in ("recruiting, office leases, support staff") are fixed costs that won't change regardless of how fast the computer is.   They certainly won't change by saving the 20 minutes a day they are estimating.

    So, if those costs don't change, they should not be included in the "salary".
So, how does that work out with a more realistic estimate of, say, $50 an hour ($400 a day)?
Saving 20 minutes out of a 480 minute day would reduce costs about 4% -- from $400 a day all the way down to $384 -- a savings of $16 a day.   So, it would take about 220 days to recover the cost of a $3,500 machine.  That's roughly a year of working days.

    But, to be fair, you can't just use a cost of $3,500.  You really need to use the difference in cost of the M1 Max vs whatever else they would have bought -- assuming the existing machine would have been replaced at all.

    On the flip side, it could even be considered that, if the engineer is paid on a salary basis, that that cost is fixed as well -- so the savings would be zero (assuming the engineer spends the 20 minutes bullshitting around the proverbial water cooler).  Or, to put it another way, the savings of 4% would only be saved if you had a staff of 100 engineers and were able to lay-off 4 of them.
    ... To put it another way:  if costs don't change then all you did was spend money, not save it.
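
For concreteness, here is a minimal sketch of the payback arithmetic above in Python, using only this comment's hypothetical figures (a $50/hour variable cost, 20 minutes saved per day), not Reddit's actual numbers:

```python
# Payback period for a faster machine, per this comment's hypothetical numbers.
HOURLY_VARIABLE_COST = 50    # $/hour: only the cost that could actually change
MINUTES_SAVED = 20           # the claimed daily saving from faster compiles

daily_saving = HOURLY_VARIABLE_COST * MINUTES_SAVED / 60   # ~$16.67/day ($16 above)

def payback_days(outlay: float) -> float:
    """Working days needed for the daily saving to cover the outlay."""
    return outlay / daily_saving

print(payback_days(3500))  # full price: ~210 days (~220 with the rounded $16/day)
print(payback_days(1500))  # premium over a hypothetical $2,000 machine: 90 days
# If the salary is treated as fixed too, daily_saving is 0 and there is no
# payback at all -- the "all you did was spend money" case above.
```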

    Cost accounting (like all statistics) can be used to prove almost anything you want.  So, it has to be done honestly, correctly, and in context with the existing conditions and how they might change.
    ... Reddit needs to hire a cost accountant -- or at least find a capable one.


edited November 2021
  • Reply 22 of 44
GeorgeBMac Posts: 11,421 member
    MplsP said:
    jpellino said:
    IreneW said:

Well, let's just say that if their engineers are spending 45 minutes per day, just waiting for a compilation to finish, doing nothing else, they are doing it wrong.

    Even if they cut that in half.
Not so much doing nothing else as trying to do something else while really looking over your shoulder to see if it compiles or throws a piston...
I don't have 20 minute compiles to worry about, but the bigger problem for me is programs like CAD software or MS Word that take forever to load. I end up going and checking my email and getting distracted by other stuff in the meantime. ADD's a bitch!

Yes, ADD/ADHD is a bitch.  But I bet you make up for the lack of focus with hyper-focus when something engages you.   (From what I've seen, attention swings from 0% to 1,000% based on the interest level.)

Ned Hallowell describes it as having a "Ferrari brain with bicycle brakes".
    edited November 2021
  • Reply 23 of 44
The Reddit CEO is correct. A compilation is not an isolated event. Reducing the time by half means the product can be released faster. The user will get the better product faster. They will be happy and more productive.
  • Reply 24 of 44
sflocal Posts: 6,096 member
    IreneW said:
Well, let's just say that if their engineers are spending 45 minutes per day, just waiting for a compilation to finish, doing nothing else, they are doing it wrong.

    Even if they cut that in half.
    Do you even develop software?  I do.  Compiling does not mean sitting on one’s thumb with nothing to do.  It means monitoring the compile in the event of compile-time errors which then need to be addressed.  If that means getting up for a few minutes to get coffee and come back, great. 

    Cutting down compile times like that is huge.
  • Reply 25 of 44
Beats Posts: 3,073 member
    In other words “duh”.

Every time an iKnockoff moron says “Macs are overpriced!”, all logic and common sense are thrown out.
  • Reply 26 of 44
GeorgeBMac Posts: 11,421 member
    Beats said:
    In other words “duh”.

Every time an iKnockoff moron says “Macs are overpriced!”, all logic and common sense are thrown out.

    Generally speaking -- if you only consider the hardware -- they are overpriced.   Or, more correctly, they were overpriced when running essentially off-the-shelf Intel processors and such -- but the M series processors are changing that equation.

    But, even without an M series processor, when you fold in that which people typically ignore -- Apple's ecosystem -- the equation tends to even out or fall in Apple's favor.

    When you buy an Apple product you get much more than a chunk of hardware. 
  • Reply 27 of 44
dws-2 Posts: 276 member
    It's not just the time; it's the distraction. It interrupts your flow when things take a long time.
  • Reply 28 of 44
    sflocal said:
    IreneW said:
Well, let's just say that if their engineers are spending 45 minutes per day, just waiting for a compilation to finish, doing nothing else, they are doing it wrong.

    Even if they cut that in half.
    Do you even develop software?  I do.  Compiling does not mean sitting on one’s thumb with nothing to do.  It means monitoring the compile in the event of compile-time errors which then need to be addressed.  If that means getting up for a few minutes to get coffee and come back, great. 

    Cutting down compile times like that is huge.
    Yes. Developing software is my day job, together with leading a team of programmers and testers.
My IDE tells me in real time if there are any syntactical errors or obvious logical problems in my code. The build server under my desk continuously builds, lints and checks, including all unit tests, before I even think about committing anything. The build farm in our server room does the heavy lifting of rebuilding Yocto and target images.
When a build, for some reason, breaks my workflow, there are always closely related tasks to do, like adding unit tests, requirement linking or writing docs.
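
As an illustration only (IreneW gives no code, and the directory and make targets below are hypothetical), a local loop that "continuously builds, lints and checks" can be as simple as a polling watcher that reruns the build steps whenever a source file changes:

```python
# continuous_build.py -- illustrative sketch of a rebuild-on-save loop.
import subprocess
import time
from pathlib import Path

WATCHED = Path("src")              # hypothetical source tree
COMMANDS = [                       # hypothetical build, lint and test steps
    ["make", "-j8"],
    ["make", "lint"],
    ["make", "test"],
]

def snapshot() -> dict:
    """Map every file under WATCHED to its last-modified time."""
    return {p: p.stat().st_mtime for p in WATCHED.rglob("*") if p.is_file()}

def run_checks() -> None:
    for cmd in COMMANDS:
        if subprocess.run(cmd).returncode != 0:   # stop at the first failure
            print("FAILED:", " ".join(cmd))
            return
    print("all checks passed")

if __name__ == "__main__":
    last = snapshot()
    while True:
        time.sleep(2)              # cheap polling; a real file-system watcher would be nicer
        current = snapshot()
        if current != last:        # something changed: rebuild
            last = current
            run_checks()
```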
  • Reply 29 of 44

GeorgeBMac said:
Bullshit! [...] So, if those costs don't change, they should not be included in the "salary". [...] If costs don't change then all you did was spend money, not save it. [...] Reddit needs to hire a cost accountant -- or at least find a capable one.


    Umm…. The cost alone is not the point. If the only thing that matters is the cost then why are they in business at all?

    No, the time saved is put towards getting the product out the door sooner or some other productive outcome that adds to the business in some way…. 

If there really is absolutely nothing else those devs can add to the business with their extra time, then they would have to lay off 1 in 25 (they don’t have to have 100 devs and lay off 4 for that to be a saving) to offset the MacBook costs - the difference, not the $3500, as you say. In which case the $150 is still valid. 

But the more likely scenario is they’re constantly growing and constantly balancing what needs to get done, what income each dev ultimately generates (even if indirectly), and how much those devs cost per hour to produce what they produce - in other words, if the team is now 4% more productive, that’s 4% more time the hiring decision-maker can wait before spending $150 per hour more on the next much-needed newly added dev. 

    One way or another that $150 is valid. 
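
To make the head-count arithmetic in this reply concrete (the team sizes are hypothetical), 20 minutes out of a 480 minute day is the same ~4% whatever the team size:

```python
# 20 minutes saved out of a 480-minute day is ~4% of each dev's time, so the
# same claim scales: ~1 dev in a team of 25, or ~4 devs in a team of 100.
MINUTES_SAVED, MINUTES_PER_DAY = 20, 480
gain = MINUTES_SAVED / MINUTES_PER_DAY             # ~0.042

for team_size in (25, 100):
    print(team_size, round(team_size * gain, 1))   # 25 -> 1.0, 100 -> 4.2
```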
  • Reply 30 of 44
GeorgeBMac Posts: 11,421 member
    Detnator said:

GeorgeBMac said:
Bullshit! [...] The costs that they folded in ("recruiting, office leases, support staff") are fixed costs that won't change regardless of how fast the computer is. [...] Reddit needs to hire a cost accountant -- or at least find a capable one.

Umm…. The cost alone is not the point. [...] One way or another that $150 is valid.

    The claim was that a $3,500 outlay would save money based on a 20 minute a day savings at $150 an hour.
Without going back through all the detail I already laid out, the synopsis still says it all:   If costs do not change then no money was saved.  And most of the costs they included in that $150 an hour would not change.  There would be some savings, but nowhere near what they claimed.

The other things you cite, such as quicker turnaround, still may (or may not) be beneficial.
    But the claim did not include them and neither did I since I was responding to that claim.
  • Reply 31 of 44
This video by a professional young lady, after one month of using the MacBook Pro 2021, is pretty informative.
    https://www.youtube.com/watch?v=drbSYEg8dY0
  • Reply 32 of 44
GeorgeBMac said:
The claim was that a $3,500 outlay would save money based on a 20 minute a day savings at $150 an hour. [...] If costs do not change then no money was saved. And most of the costs they included in that $150 an hour would not change. There would be some savings, but nowhere near what they claimed. [...]

    Ok... you want to nitpick... that's ok (really.  I'm not saying that to be snarky).  Let's do it!

There are two parts here:
    1. The correct cost for each dev.
    2. Whether the 20 minutes per dev time saving has any value, and if so how?
    So a few points: 

Your comment started with "bullshit", quoting the article's $150 cost of employee.  Really, that's not BS at all. That's what they've figured out each dev costs them (regardless of whether the Mac translates to saving them any of that or not). 

You asserted at first that the non-salary costs are fixed, therefore invalid because they're not "saved". You eventually (correctly) noted that the salary is also fixed, so it's not validly "saved" either. But you still did all your calcs based on the $50/hour. That doesn't really add up (pun intended). Regardless of whether or why we care about the dev cost, it's still everything, not just the salary, so that $150/hour as the cost of the employee is valid...

    So then... if we're going to calculate the value of that 20 minutes, then it's all or nothing. It needs to be done with either the $150/hour (= $1200 per 8hr day), or zero, depending on how the dev and/or company uses that 20 minutes.

    So... If 60 minutes per dev costs $150 per dev, then 20 minutes per dev costs $50 per dev. If it's 20 minutes out of a 480 minute day then it's still 4% but its cost is still $50 per dev per day (not $16).

    Ok... but is it "saved"?
    The claim was that a $3,500 outlay would save money based on a 20 minute a day savings at $150 an hour.
That statement is not quite correct.  The Reddit guy -- Jameson Williams -- started with the basic premise of weighing the outlay against the opportunity cost of not spending it, and concluded (claimed) that the $3500 outlay will pay for itself (both are different from saving). He went into detail about the value of the time saved (the $150), without going into detail about how he does or will turn that time saving into the $ value that will pay for the outlay. It could be by laying off 4% of the dev team, or (unlikely but hypothetically) cutting their cost by 4% (which would be mostly salary, as the most liquid part of their cost) and letting them have the time back for themselves, or possibly other ways of turning it into saving money.  Or it could be by making more money (the most likely outcome).

Regardless, the absence of the how in his claim doesn't mean it isn't happening. Of course it's happening one way or another, and it's not unreasonable to assume so (otherwise he wouldn't be in business), and that is all that is required for his claim to be valid. As you noted, one of the only ways that 20 minutes won't have value to the company is if it's spent goofing off instead of doing something productive.

Now, granted, the AI article's title and first line refer to "saving" money. That's entirely this article's author's interpretation/extrapolation, and I'm pretty sure he's not a cost accountant, but it's not an unreasonable interpretation given that you've acknowledged at least one way that it is possible (laying off 4% of the devs). I gave a couple of other possibilities too (e.g. more growth possible before needing to hire again).

(As for the $3500, not that you were calling BS on this part, but just to be thorough, since I'm highlighting the "pays for itself" bit: $3500 / $50 per day = 70 days = 14 weeks.*  So, as long as that 20 minutes is converted to $ value one way or another, then in just over three months the Mac pays for itself in entirety.  Although that's before you count the cost of whatever they'd have bought instead, so it's really only the difference (as you noted).  If it's a $3500 machine instead of, say, a $2K one, then the difference: $1500 / $50 per day = 30 days = 6 weeks.*)

So at worst some extrapolation is required, if even that, but either way:
    1. The $150 cost of the dev is valid, and
    2. The idea that 20 minutes time saved per day will turn into $ value one way or another is hardly a stretch (nor "bullshit").
    ----

    (*at 5 days per week.)
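
A quick sketch of the same payback math with the all-in $150/hour figure, again using only the thread's hypothetical numbers:

```python
# Payback period using the all-in $150/hour cost this reply argues for.
ALL_IN_HOURLY_COST = 150     # $/hour: salary plus recruiting, leases, support staff
MINUTES_SAVED = 20
DAYS_PER_WEEK = 5

daily_value = ALL_IN_HOURLY_COST * MINUTES_SAVED / 60   # $50 per dev per day

for outlay in (3500, 1500):  # full price, or premium over a $2,000 alternative
    days = outlay / daily_value
    print(f"${outlay}: {days:.0f} days = {days / DAYS_PER_WEEK:.0f} weeks")
# -> $3500: 70 days = 14 weeks;  $1500: 30 days = 6 weeks
```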

edited November 2021
  • Reply 33 of 44
IreneW said:
Well, let's just say that if their engineers are spending 45 minutes per day, just waiting for a compilation to finish, doing nothing else, they are doing it wrong. [...] The build server under my desk continuously builds, lints and checks, including all unit tests, before I even think about committing anything. [...] When a build, for some reason, breaks my workflow, there are always closely related tasks to do, like adding unit tests, requirement linking or writing docs.
Well, neither the article nor the source said it was ONE 45 minute compile each day. More likely it's somewhere from 15x three minute compiles to 45x one minute compiles (now 15x 90s to 45x 30s instead).
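
Spelling out that slicing arithmetic (the compile counts are the reply's hypotheticals):

```python
# The same 45 minutes/day of compiling, sliced two ways, before and after halving.
for count, minutes_each in ((15, 3.0), (45, 1.0)):
    print(f"{count} x {minutes_each:g} min = {count * minutes_each:g} min/day; "
          f"halved: {count} x {minutes_each * 30:g}s")
```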

    However your compiling is being done, each compile is still an event, even if it's 2700 one second compiles per day (one of many possible numerical interpretations of "continuous").  And you can't claim you don't spend any of those seconds either monitoring or assessing the result of the compiling.

    Still... instead of using one computer (like these guys you're criticizing) to handle your development, you have two... plus a build farm.  Got it.
edited November 2021
  • Reply 34 of 44
IreneW Posts: 303 member
Detnator said:
[...] However your compiling is being done, each compile is still an event, even if it's 2700 one second compiles per day. [...] Still... instead of using one computer (like these guys you're criticizing) to handle your development, you have two... plus a build farm.  Got it.
Note, as I already pointed out, that I definitely support buying the best computers money can buy for your developers. That, and two or three large screens, adjustable desks, comfy chairs, whatever... Keep the developers happy, or they will leave. That's a fact. 

But, please, do not claim a decent engineer wastes time waiting for a compiler to do its work! 

The original article sounds more like it is describing a code factory, measuring productivity in LoC per hour, and believing 75-80% of the time should be spent hacking away on the keyboard. While that has been a popular view, at least in some parts of the industry, the success rate has been low (and, when it has worked, mainly attributed to low-cost offshoring, allowing a brute force approach).

    The most valuable developer time is the time spent thinking.
edited November 2021
  • Reply 35 of 44
GeorgeBMac Posts: 11,421 member
Detnator said:
[...] So at worst some extrapolation is required, if even that, but either way:
1. The $150 cost of the dev is valid, and
2. The idea that 20 minutes time saved per day will turn into $ value one way or another is hardly a stretch (nor "bullshit").


Nope, not nitpicking at all.
The part you (and they) missed is that the $150 is the total cost per employee.  But that is worthless for the purpose they are using it for.   Instead they needed to break that down into the fixed and variable costs per employee -- because the fixed costs WILL NOT CHANGE (particularly over a paltry savings of 20 minutes a day).
So, as I said, if costs don't change, you didn't save anything.  In this case, they saved some, but nowhere near what they claimed.  So yeh:  BULLSHIT!
  • Reply 36 of 44
GeorgeBMac said:
[...] The part you (and they) missed is that the $150 is the total cost per employee. [...] So, as I said, if costs don't change, you didn't save anything.  In this case, they saved some, but nowhere near what they claimed.  So yeh:  BULLSHIT!
No, I didn’t miss it. I addressed it in detail in my post. So you either didn’t read that, or you missed where I clearly addressed it. 

Also, it seems the same is true of the article.

So once again:

The Reddit guy didn’t claim anything about saving on costs.
  • Reply 37 of 44
GeorgeBMac Posts: 11,421 member
Detnator said:
[...] No, I didn’t miss it. I addressed it in detail in my post. [...] So once again: The Reddit guy didn’t claim anything about saving on costs.

    Sorry -- but fixed is still fixed.  And when costs don't change you can't count them as savings.
    It's just how it works.

(And, to be honest, it is YOU who ignored the points in my original post and are trying to avoid the facts.)
    edited November 2021
  • Reply 38 of 44
    Detnator said:
    Detnator said:
    Detnator said:

    However, he says he used common estimate of $150 per hour for a software engineer -- based not just on salary, but also "recruiting, office leases, support staff," and much more.


    Read on AppleInsider
    Bullshit!
    They are correct -- but only from a mathematical standpoint.
    But they are wrong from a cost standpoint based on reality.

    The costs that they folded in (""recruiting, office leases, support staff,") are fixed costs that won't change regardless of how fast the computer is.   They certainly won't change by saving the 20 minutes a day they are estimating.

    So, if those costs don't change, they should not be included in the "salary".
    So, how does that work out with a more realistic estimate of, say, $50 an hour ($400 a day) ?
    Saving 20 minutes out of a 480 minute day would reduce costs 4% -- from $400 a day all the way down to $384 -- a savings of $16 a day.   So, it would take 220 days to recover the cost of a $3,500 machine.  That's roughly a year.

    But, to be fair, you can't just use a cost of $3,500.  You really need to use the difference in cost of the M1 Max vs whatever else they would have bought -- assuming the existing machine would have been replaced at all.

    On the flip side, it could even be considered that, if the engineer is paid on a salary basis, that that cost is fixed as well -- so the savings would be zero (assuming the engineer spends the 20 minutes bullshitting around the proverbial water cooler).  Or, to put it another way, the savings of 4% would only be saved if you had a staff of 100 engineers and were able to lay-off 4 of them.
    ... To put it another way:  if costs don't change then all you did was spend money, not save it.

    Cost accounting (like all statistics) can be used to prove almost anything you want.  So, it has to be done honestly, correctly, and in context with the existing conditions and how they might change.
    ... Reddit needs to hire a cost accountant -- or least find a capable one.


    Umm…. The cost alone is not the point. If the only thing that matters is the cost then why are they in business at all?

    No, the time saved is put towards getting the product out the door sooner or some other productive outcome that adds to the business in some way…. 

    Unless, if there really is absolutely nothing else those devs can add to the business with their extra time then they would have to lay off 1 from 25 (they don’t have to have 100 devs and lay off 4 for that to be a saving) to offset the MacBook costs - the difference not the $3500. As you say. In which case the $150 is still valid. 

    But the more likely scenario is they’re constantly growing and are constantly balancing what needs to get done, what income each dev ultimately generates (even if indirectly), and how much those devs cost per hour to produce what they produce - in other words if the team is now 4% more productive that’s 4% more time the hiring decision maker can wait before spending $150 per hour more on the next much needed newly added dev. 

    One way or another that $150 is valid. 

    The claim was that a $3,500 outlay would save money based on a 20 minute a day savings at $150 an hour.
    Without going back through all the detail I already laid out, the synopsis still says it all:   If costs do not change then no money was saved.  And, most of the costs they included in that $150 an hour would not change.  There would be some savings, but no where near what they claimed.

    The other things you site such as quicker turn around still may (or may not) be beneficial.
    But the claim did not include them and neither did I since I was responding to that claim.

    Ok... you want to nitpick... that's ok (really.  I'm not saying that to be snarky).  Let's do it!

    There's two parts here:
    1. The correct cost for each dev.
    2. Whether the 20 minutes per dev time saving has any value, and if so how?
    So a few points: 

    Your comment started with "bullshit", quoting the article's $150 cost of employee.  Really, that's not BS at all. That's what they've figured out the dev costs (regardless of whether the Mac translates to saving them any of that or not). 

    You asserted at first that the non-salary costs are fixed, therefore invalid because they're not "saved". You eventually (correctly) noted that the salary is also fixed therefore it's not validly "saved" either. But you still did all your calcs based on the $50/hour. That doesn't really add up (pun intended). Regardless of if or why we do or don't care about the dev cost, it's still everything, not just the salary, so that $150/hour as the cost of the employee is valid...

    So then... if we're going to calculate the value of that 20 minutes, then it's all or nothing. It needs to be done with either the $150/hour (= $1200 per 8hr day), or zero, depending on how the dev and/or company uses that 20 minutes.

    So... if 60 minutes per dev costs $150, then 20 minutes per dev costs $50. If it's 20 minutes out of a 480-minute day then it's still 4%, but its cost is still $50 per dev per day (not $16).
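    (Here's that conversion as a trivial Python sketch, using only the figures already on the table -- the $150/hour fully loaded rate, the 20 minutes, and the 480-minute day:)

    # Value of 20 saved minutes per dev at the fully loaded rate
    loaded_rate_per_hour = 150      # the article's fully loaded cost per dev
    minutes_saved_per_day = 20
    workday_minutes = 480           # 8-hour day

    value_per_day = loaded_rate_per_hour * minutes_saved_per_day / 60
    print(value_per_day)                            # 50.0 -> $50 per dev per day
    print(minutes_saved_per_day / workday_minutes)  # ~0.042 -> still about 4% of the day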

    Ok... but is it "saved"?
    The claim was that a $3,500 outlay would save money based on a 20-minute-a-day saving at $150 an hour.
    That statement is not quite correct.  The Reddit guy -- Jameson Williams -- started with the basic premise of weighing the outlay against the opportunity cost of not spending it, and concluded (claimed) that the $3,500 outlay will pay for itself (both of which are different from saving). He went into detail about the value of the time saved (the $150), without going into detail about how he does or will turn that time saving into the $ value that pays for the outlay. It could be by laying off 4% of the dev team, or (unlikely, but hypothetically) cutting their cost by 4% (which would be mostly salary, as the most liquid part of their cost) and letting them have the time back for themselves, or possibly other ways of turning it into saved money.  Or it could be by making more money (the most likely outcome).

    Regardless, the absence of the how in his claim doesn't mean it isn't happening. Of course it's happening one way or another, and it's not unreasonable to assume so (otherwise he wouldn't be in business), and that is all that is required for his claim to be valid. As you noted, one of the only ways that 20 minutes won't have value to the company is if it's spent goofing off instead of doing something productive.

    Now, granted, the AI article's title and first line refer to "saving" money. That's entirely this article's author's interpretation/extrapolation, and while I'm pretty sure he's not a cost accountant, it's not an unreasonable interpretation given that you've acknowledged at least one way that it is possible (laying off 4% of the devs). I gave a couple of other possibilities too (e.g. more growth possible before needing to hire again).

    (As for the $3,500, not that you were calling BS on this part, but just to be thorough, since I'm highlighting the "pays for itself" bit: $3,500 / $50 per day = 70 days = 14 weeks.*  So, as long as that 20 minutes is converted to $ value one way or another, then in just over three months the Mac pays for itself in its entirety.  Although that's before you count the cost of whatever they'd have bought instead, so it's really only the difference that counts (as you noted).  If it's a $3,500 machine instead of, say, a $2K one, then the difference: $1,500 / $50 per day = 30 days = 6 weeks.*)
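    (And the payback arithmetic from that parenthetical as the same kind of Python sketch -- remember the $2K alternative machine is purely my hypothetical:)

    # Days (at 5-day weeks) for the recovered dev time to cover the outlay
    value_per_day = 50              # from the calc above
    full_price = 3500
    alt_price = 2000                # hypothetical cheaper alternative

    for outlay in (full_price, full_price - alt_price):
        days = outlay / value_per_day
        print(outlay, days, days / 5)  # 3500 -> 70 days (14 weeks); 1500 -> 30 days (6 weeks)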

    So at worst some extrapolation is required, if even that, but either way:
    1. The $150 cost of the dev is valid, and
    2. The idea that 20 minutes time saved per day will turn into $ value one way or another is hardly a stretch (nor "bullshit").
    ----

    (*at 5 days per week.)


    Nope, not nitpicking at all.
    The part you (and they) missed is that the $150 is the total cost per employee.  But that is worthless for the purpose they are using it for.   Instead they needed to break that down into the fixed and variable costs per employee -- because the fixed costs WILL NOT CHANGE (particularly over a paltry savings of 20 minutes a day).
    So, as I said, if costs don't change, you didn't save anything.  In this case, they saved some, but nowhere near what they claimed.  So yeah:  BULLSHIT!
    No, I didn’t miss it. I addressed it in detail in my post. So either you didn’t read that, or you missed where I clearly addressed it. 

    Also, it seems the same is true for the article.

    So once again:

    The Reddit guy didn’t claim anything about saving on costs.

    Sorry -- but fixed is still fixed.  And when costs don't change you can't count them as savings.
    It's just how it works.

    (And, to be honest, it is YOU who ignored the points in my original post and are trying to avoid the facts)
    Fixed or not it’s irrelevant. What part of “he never said savings” do you not understand???
    edited November 2021
  • Reply 39 of 44
    Detnator said:
    Fixed or not it’s irrelevant. What part of “he never said savings” do you not understand???

    The part where it said:
    "Following an initial claim that the fully-loaded new MacBook Pro will save Reddit money and engineering time, the company has broken down the figures to show when the shift will pay off."

    And then went on to [incorrectly] calculate how much those savings would be.
  • Reply 40 of 44
    GeorgeBMac said:
    The part where it said:
    "Following an initial claim that the fully-loaded new MacBook Pro will save Reddit money and engineering time, the company has broken down the figures to show when the shift will pay off."

    And then went on to [incorrectly] calculate how much those savings would be.
    You originally made a couple of good points, but you've contradicted yourself and taken quotes and comments out of context.  Therefore your conclusions are basically wrong.

    I've written multiple paragraphs to show how you've done that, but you won't discuss or counter any of that. You know... if I'm wrong you could point out how, instead of just regurgitating your position.

    So it seems there's no further conversation to be had here.