JustSomeGuy1

About

Banned
Username
JustSomeGuy1
Joined
Visits
60
Last Active
Roles
member
Points
1,172
Badges
1
Posts
330
  • How we ended up with the 'Pregnant Man' Emoji

    leighr said:
    It probably should be called “pregnant woman dressed as a man” (which is effectively male misappropriation) as we all know that it is scientifically impossible for a male to become pregnant.
    This is a perfect demonstration that many arguments like this are really, at least in part, about semantics.

    Your statement is true if and only if you (and those who agree with you) are the sole arbiter of the meaning of the word "male". However, in the real world there is obvious disagreement. You have a very narrow view based mostly on genetics (but not entirely, as there are rare people born "obviously female" who have XY chromosomes). Many others - possibly a majority, possibly not, but in any case a very large number - have a different definition. By theirs, it is entirely possible (though still quite rare) for males to be pregnant. Since there's no final arbiter, the disputes continue.

    This is identical to the argument about gay marriage. Many of those who opposed it claimed that it imposed upon their religion by damaging religious marriage. This failed to account for the fact that one word, "marriage", signified at least two concepts - a religious one and a completely secular one that encompassed legal rights and obligations. Laws enabling gay marriage had only secular consequences, but this was generally ignored in the debate.

    In both of these instances, conservatives seemed quite incensed at the notion that the definition of the word can be other than they imagine. Among other things they want to force on everyone else, they want to be the arbiters of semantics. But sadly for them, they are not. Languages evolve, generally in rough accord with relevant social customs. And so we see that today, gay marriage is becoming widely accepted. The next generation may be only dimly aware that it was ever a political issue.
  • How we ended up with the 'Pregnant Man' Emoji

    On a forward-looking note... I think most people fail to recognize just where the technology curve is taking us, and how fast it's moving, *and* how fast it's accelerating.

    If you are bothered by people who ID as male getting pregnant, just wait until significant body mods move from science fiction to reality. It's coming, and it's going to be here in most of our lifetimes... unless the right-wing luddites crash our entire society, anyway. (Yes, there are lots of left-wing luddites too, and they're generally morons, but most of them are less inclined to violence.)

    It may not happen for 20 or 30 years, maybe a little more, but I guarantee, someday you'll see a news article about some male-from-birth celebrity becoming pregnant. They'll stay male, but they'll have an embryo-support system (be it an actual biological womb, or something else) installed for the duration of the pregnancy. There will be outrage, and despair, and then 20 years after that nobody will care.
  • How we ended up with the 'Pregnant Man' Emoji

    twlatl said:
    There is nothing logical about a pregnant man. Nice attempt to write thousands of words to legitimize it, but a pregnant man emoji is as useful as an emoji of a fish riding a bicycle. Both are pure fantasy.
    That's a great argument! I'm sure you protested against the unicorn symbol as well!

    Pathetic.

  • Apple's 2019 Mac Pro is now three PCIe revisions behind

    DuhSesame said:
    DuhSesame said:
    DuhSesame said:
    DuhSesame said:
    The server market is really distinct from the workstation market. And why do you think TR isn't selling well?

    But again, I say it's not relevant, because we're not talking about some random risk-averse IT manager, or the tech team at Amazon's EC2 division. We're talking about Apple, which is supposed to have some vision, and to "skate where the puck is going to be, not where it is". The puck was obviously headed toward AMD in 2019.
    “Vision,” okay… I’m sure they had planned the switch way back then, scheduled during or even before the current design.  Whatever this product is aimed at, it should primarily cover this transition period, not fight spec wars.  A 28-core chip is enough for a couple of years; it may not be the best, but it does the job, and far more reliably.

    That’s the other issue: you want a serious production system to work 100% of the time, and IIRC Zen 2 did tend to glitch if someone maxed out their PCIe lanes.  This is why there’s now a TR Pro to cover this segment.
    You keep claiming that Zen 2 had reliability problems. Source?

    But not relevant anyway, as you seem to be saying this about the desktop chip. That's not what we're talking about! The chip in the Mac Pro is a Xeon, providing >40 PCIe lanes and 6 channels of DRAM. If they'd used a Zen 2 chip, it would have been the EPYC, in order to provide as many PCIe lanes (or more; EPYC has 128 PCIe 4 lanes) and 8 channels of DRAM. The desktop chip would have been a nonstarter at two DRAM channels, never mind the smaller PCIe config. (A rough bandwidth sketch at the end of this post puts numbers on this.)
    I think everyone’s favorite techtubers (LTT) did an episode on this.  I’m just too lazy to find it.  And I doubt Apple is interested in full-blown server chips.
    Then you're not paying attention. What do you think is in the Mac Pro?

    Intel's product line doesn't match up with AMD's exactly, but the W32xx matches up roughly to the single-chip EPYCs (the "P" processors, like the 7502P). In positioning, at least, though definitely not in performance.
    EPYC is meant to compete with Xeon Scalable, not W-series.  This is why they have Threadripper Pros now.

    Well, now that I've got some time, I did some digging. I'm still searching for that video (it was a long time ago), but here are some articles that point in my direction:

    1). https://www.tomshardware.com/news/hardware-reliability-puget-systems-2021
    2). https://www.quora.com/Are-data-centers-still-prefer-the-Xeon-processors-because-they-consume-less-power-or-do-they-prefer-AMD-s-EPYC-processor-for-better-performance-but-more-power-consumption
    3). https://www.reddit.com/r/buildapc/comments/f8ofxg/amd_threadripper_vs_intel_xeon_for_reliability/

    So... yes, TR does fail a bit more, but we're still talking single digits; the biggest issue is the trust factor.  While it makes sense for enthusiasts to praise them, workstation users want as reliable a product as possible, perhaps more than performance or price.  We know what the 2019 Mac Pro is built around.  Now I understand all the complaints I got about it.

    Granted, it seems like AMD has finally caught up, but this is the last x86 workstation for Apple, as it's likely scheduled to be replaced this year. Whatever chip they put in only needed to cover this short period of time, and it needed to be reliable; so, Intel.
    "EPYC is meant to compete with Xeon Scalable, not W-series." That's just not true. Remember, as you pointed out earlier, we're talking about 2019 here, not 2021. That's exactly what the P-series EPYCs were for, at that time: Competing with the W series. The TR does not have enough RAM channels to play in this segment. The TR Pro that you mentioned does have that, but those didn't exist back then.

    None of the links provided above are relevant. Puget's info is of interest, but it's two years too new. The others are random answers from randos, most of whom are obviously clueless - for example, any answer that talks about Intel having better power consumption can be ignored with prejudice. That was a common misconception, but I don't know why, as the AMDs had enormously better performance per watt.
    1. There are Xeon Scalable workstations out there too, so what?

    2. "The more expensive and conservative, the better": that sums up a lot, doesn't it?  With that in mind, AMD does have some reliability issues, even in this day and age.

    But isn't the biggest point to serve what's needed most?  Your customers want something reputable that just works, and AMD was still building that reputation.  It works for the time period; then the Mac Pro will shift toward Apple Silicon.  The 2019 Mac Pro simply isn't trying to compete in the segment you'd hope for, which could change with Apple's own chip.  Sounds terribly unexciting, I know.

    Oh, and as of mid-2019, we hadn't seen Zen 2 EPYC yet either, so there wasn't even a huge gap to fill, and the P-series didn't offer anything below 16 cores.
    The Zen 2 EPYCs were released 8/7/19, so Apple could easily have used them. And of course, the official release date isn't the real constraint, as large customers and OEMs had them well before that date.

    The P series has an 8-core model (7232P), but with pricing so much lower than Intel's (Apple was paying for Intel's pricey "M" versions of the chips), Apple could have shipped a 16-core model at the same price as the Intel 8-core and made just as much money.

    I think that most of our disagreement is about what we think Apple was trying to do with that unit. I don't think, as you do, that they were selling to a completely risk-averse crowd that still thought AMD was dangerous.

    If you think that many studios weren't starting to buy AMDs at that point, you're mistaken. However, people don't buy Macs because of the chip inside, though they do care to some extent about performance. If Apple said the AMD chips were good, Mac Pro buyers would be happy to buy them. And they would be good: if Puget can catch bad chips, so can Apple.

    However, some of our disagreement stems from the fact that you know a bit less about this hardware than you think you do. I own a LOT of machines with both Intel and AMD chips, and I started buying them before 2019. (I = my company; we're in the computing infrastructure business.) I know what was on the market when Apple's Mac Pro shipped, and what it was competing against, because I was considering buying them, and with real regret declined.
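
    Since the channel and lane counts in the exchange above carry most of the argument, here is the promised back-of-the-envelope sketch of what they mean in raw bandwidth. The inputs are spec-sheet figures as I recall them, so treat them as assumptions: 6-channel DDR4-2933 and 64 PCIe 3.0 lanes for the Xeon W-3200 series, 8-channel DDR4-3200 and 128 PCIe 4.0 lanes for EPYC 7002, with roughly 0.985 GB/s and 1.969 GB/s per lane per direction for PCIe 3.0 and 4.0 respectively. This is arithmetic over published numbers, not a benchmark of either machine.

    ```python
    # Peak-bandwidth sketch for the two platforms discussed above.
    # Inputs are spec-sheet numbers (assumed; see note above), not measurements.

    DDR4_BYTES_PER_TRANSFER = 8  # one 64-bit channel moves 8 bytes per transfer

    def dram_gbs(channels: int, megatransfers: int) -> float:
        """Theoretical peak DRAM bandwidth in GB/s."""
        return channels * megatransfers * DDR4_BYTES_PER_TRANSFER / 1000

    def pcie_gbs(lanes: int, per_lane_gbs: float) -> float:
        """Theoretical peak PCIe bandwidth in GB/s, one direction."""
        return lanes * per_lane_gbs

    platforms = {
        # Xeon W-3200 (2019 Mac Pro): 6-ch DDR4-2933, 64x PCIe 3.0
        "Xeon W-3200": (dram_gbs(6, 2933), pcie_gbs(64, 0.985)),
        # EPYC 7002, e.g. the 7502P: 8-ch DDR4-3200, 128x PCIe 4.0
        "EPYC 7002": (dram_gbs(8, 3200), pcie_gbs(128, 1.969)),
    }

    for name, (dram, pcie) in platforms.items():
        print(f"{name:12s} DRAM ~{dram:.0f} GB/s, PCIe ~{pcie:.0f} GB/s")

    # Output:
    # Xeon W-3200  DRAM ~141 GB/s, PCIe ~63 GB/s
    # EPYC 7002    DRAM ~205 GB/s, PCIe ~252 GB/s
    ```

    Neither peak is reachable in practice, but the ratios are the point: under these assumptions the EPYC platform offered roughly 45% more memory bandwidth and about four times the I/O bandwidth, which is the gap described above.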