Goldman Sachs will reevaluate Apple Card credit line limits after claims of gender bias

Comments

  • Reply 21 of 50
    revenant said:
    you mean apple will, right? because apple is the one being sexist here. it is their algorithm that they tailored to be sure to give women a subpar credit limit. all goldman sachs does is issue the card on apple's behalf, apple is the one doing all the work, they are a giant bank.
    This was sarcasm, wasn't it? You just forgot to set '/s'. Or thought that it was obvious. Otherwise a very uneducated comment.
      0 Likes 0 Dislikes 0 Informatives
  • Reply 22 of 50
    Ah the effort to try to find discrimination in absolutely everything. Why the attention here? Because it's click-bait. High profile. Apple and Goldman. Headline grabbers.

    News flash: Banks extend credit based on your credit score, credit history, employment, debt-to-income ratio, and the calculated likelihood of you being able to pay back the loan - on an individual basis. Your race, gender, marital status, community property, and tax returns have nothing to do with it.
     StrangeDays, sarthos
      2 Likes 0 Dislikes 0 Informatives
  • Reply 23 of 50
     flydog Posts: 1,147 member
    revenant said:
    you mean apple will, right? because apple is the one being sexist here. it is their algorithm that they tailored to be sure to give women a subpar credit limit. all goldman sachs does is issue the card on apple's behalf, apple is the one doing all the work, they are a giant bank.
    Wrong. 
      0 Likes 0 Dislikes 0 Informatives
  • Reply 24 of 50
     MacPro Posts: 19,873 member
     Clarity after all the finger-pointing at Apple. Can't believe this story has received so much traction, considering that if you have an issue with your bank you go directly to them and seek a review. Also, about time Woz returned to the pasture he came from. 
     Agreed on both points.  My guess is this is actually revealing something that is built into the system already, and only because Apple is involved has it risen to the level of national attention.  In a country where women are still not paid equally for the same work, people are surprised the credit ratings are gender biased ... really?
    lkrupp
      1 Like 0 Dislikes 0 Informatives
  • Reply 25 of 50
    Time to move on. This problem has peaked. It will go away in a week. 

    From here on, this is all just free advertising for the AppleCard. 
      0 Likes 0 Dislikes 0 Informatives
  • Reply 26 of 50
     MacPro Posts: 19,873 member
    Time to move on. This problem has peaked. It will go away in a week. 

    From here on, this is all just free advertising for the AppleCard. 
     While I agree it's time to move on from the implication that Apple is somehow involved, it isn't time to move on from unequal pay and gender credit bias.  I hope this does have some effect on the ludicrous situation in the USA.  How many civilized first-world countries still refuse to codify equal pay in 2019? 
      0 Likes 0 Dislikes 0 Informatives
  • Reply 27 of 50
     mike1 Posts: 3,489 member
    revenant said:
    you mean apple will, right? because apple is the one being sexist here. it is their algorithm that they tailored to be sure to give women a subpar credit limit. all goldman sachs does is issue the card on apple's behalf, apple is the one doing all the work, they are a giant bank.
    Wow. That is simply wrong on every level.
      0 Likes 0 Dislikes 0 Informatives
  • Reply 28 of 50
    I think it's funny that people try to take swipes at "woke" et cetera when the reality is that Apple's card offer was only interesting to consumers specifically because the credit card industry was already generally viewed as an unethical dumpster fire. 
      0 Likes 0 Dislikes 0 Informatives
  • Reply 29 of 50
     dewme Posts: 6,023 member
     I think a lot of people are getting foolishly hung up on the word "bias" because their default reaction to data-driven algorithms that spit out results they don't agree with is to assume the algorithms are purposely tainted by nefarious actors. It used to be very easy to see what people claimed was "gender bias" in the word-prediction logic on your iPhone. If you typed something like "The doctor" or "The engineer" followed by a space, the next-word prediction candidate list would offer up the word "he" but not "she" as one of the choices. The reason for this was that the prediction logic, driven by frequency of association, would make "he" a more likely match. This was strictly an artifact of the probabilistic processing and not a consequence of someone at Apple assuming that all doctors and engineers are men. Of course this data-driven rationale didn't placate the overly sensitive reactions from people who are acutely tuned to assume nefarious behavior by Apple.

     So now, in our newest iPhones, Apple has to specifically trap data-driven results that may be misinterpreted as biased and post-process them to sanitize what the user sees. You'll now see both "he" and "she", and a similar range of emoji, in response to the same text input and next-word predictions. I don't have a problem with this, because you never want to offend people, intentionally or unintentionally, but I don't believe it was ever fair to blame Apple for intentionally skewing the algorithmic results to promote their own gender biases, which never existed in the first place. 
     StrangeDays, h2p, applesnoranges
      1 Like 0 Dislikes 2 Informatives
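The frequency-driven next-word behavior described in the reply above can be illustrated with a toy bigram model. This is a sketch only: the corpus and counts are invented for illustration, and real keyboard prediction models are far more sophisticated than raw bigram frequency.

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count word-pair frequencies: model[w1][w2] = times w2 followed w1."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for w1, w2 in zip(words, words[1:]):
            model[w1][w2] += 1
    return model

def predict_next(model, word, k=3):
    """Return the k words most frequently seen after `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

# A corpus skewed the way historical text often is: "he" simply follows
# "said" more often than "she" does, so frequency alone ranks "he" first.
corpus = [
    "the doctor said he would call",
    "the doctor said he was busy",
    "the doctor said she would call",
]
model = build_bigram_model(corpus)
print(predict_next(model, "said"))  # "he" outranks "she" purely by count
```

No one coded a gender rule here; the ranking falls out of the counts, which is the commenter's point about probabilistic artifacts.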
  • Reply 30 of 50
    dewme said:
     I do not believe for a millisecond that Apple or Goldman Sachs is being intentionally sexist here. I do know that algorithms that are fed with real world data that already has an inherent bias will produce biased results unless the raw results are intentionally post processed to compensate for the bias that’s been baked into the data.

    Yeah it sucks that unbiased scientific algorithms that act on historical real world data have to post-compensate for the traditional biases that are baked into the historical data that is feeding the algorithms.  But until the data that is feeding the algorithms is no longer tainted by bias the owners of the algorithms will have to resort to using compensation to remove the bias that passes through the algorithms. 

    In this case, the algorithms are totally free of bias but the biases that are in the data are leaking into the results produced by the algorithms. 
     Precisely.  Neural nets are very good at learning whatever is included in the data (even if not included by design, and even if the data is curated to try to prevent such bias). So whatever credit data is being obtained from the client and scraped from the internet or credit rating services, it may well (almost certainly does) contain information sufficient to identify the gender, race, marital status, and education of the client.  Just as a deep-learning vision system has identifiable features (edges, shapes, textures, convolutions) at a lower level, it is possible that the Apple/Sachs deep-learning system has identifiable (though unintended, not purposeful) categories for gender, race, marital status, and education level (and maybe also conservative, liberal, Brexit, or other political leanings).
      0 Likes 0 Dislikes 0 Informatives
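The proxy-variable leakage described in the reply above can be demonstrated with synthetic data. Everything here is invented for illustration (the "spend_category" proxy, the distributions, the threshold): the point is only that a model which never receives gender as an input can still recover it from a correlated feature.

```python
import random

random.seed(0)  # deterministic toy data

# Synthetic applicants: gender is NOT given to any downstream model, but a
# hypothetical proxy feature (a "spend_category" score) correlates with it,
# drawn from overlapping but shifted distributions.
def make_applicant():
    gender = random.choice(["F", "M"])
    proxy = random.gauss(0.3 if gender == "F" else 0.7, 0.15)
    return gender, proxy

data = [make_applicant() for _ in range(1000)]

# A trivial "model" that thresholds the proxy recovers gender far above
# chance -- this is how bias leaks through even when the protected
# attribute is excluded from the inputs.
def infer_gender(proxy, threshold=0.5):
    return "F" if proxy < threshold else "M"

accuracy = sum(infer_gender(p) == g for g, p in data) / len(data)
print(f"gender recovered from proxy alone: {accuracy:.0%}")
```

A neural net trained on such features would learn the same correlation implicitly, which is why curating the inputs alone does not guarantee unbiased outputs.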
  • Reply 31 of 50
    MacPro said:
    Time to move on. This problem has peaked. It will go away in a week. 

    From here on, this is all just free advertising for the AppleCard. 
     While I agree it's time to move on from the implication that Apple is somehow involved, it isn't time to move on from unequal pay and gender credit bias.  I hope this does have some effect on the ludicrous situation in the USA.  How many civilized first-world countries still refuse to codify equal pay in 2019? 
    Sorry, but WTF does "codify equal pay" mean? That's what communist countries do. (Except for what those in charge, their family, and their pals pay themselves in those countries).
    edited November 2019
      0 Likes 0 Dislikes 0 Informatives
  • Reply 32 of 50
     GeorgeBMac Posts: 11,421 member
     As this unfolds, it reminds me of back in the '60s, when there was systemic bias against blacks in many areas (employment, housing, banking and credit, etc.).   Even after the civil rights laws were passed, the bias persisted: not explicit bias, but implicit bias from various rules and such.

    The result was the affirmative action programs to help bring black people up to the level needed to meet necessary criteria -- such as giving them special rights in education to help build their abilities and special rights in employment to help build their social status and income.   It was needed and ultimately the whole country benefited.

    This so called bias by GS is increasingly looking to be a "bias" against all people with low incomes.

    The question is:  Do we want to force them to make allowances for people with low incomes?   Females, displaced workers, young people, retirees, immigrants and others all say "YES!"   But:
    -- Is it the smart thing to do?
    -- Is it the right thing to do?

     I say no to both (even though I am in one of those demographics affected by this).
    h2p
      1 Like 0 Dislikes 0 Informatives
  • Reply 33 of 50
    entropys said:
    steveau said:
    tundraboy said:
    revenant said:
    you mean apple will, right? because apple is the one being sexist here. it is their algorithm that they tailored to be sure to give women a subpar credit limit. all goldman sachs does is issue the card on apple's behalf, apple is the one doing all the work, they are a giant bank.
    Are you kidding?  You realize if someone defaults on their Apple Card balance, it's Goldman Sachs that eats the loss, not Apple, right?  You really believe that Goldman Sachs, or any bank for that matter, would allow someone else to make the lending decision for them?  Assessing creditworthiness is THE INDISPENSABLE, MOST IMPORTANT SKILL NEEDED to run a bank.  You know, a bank --those companies that turn a profit by extending credit.  If you have no ability to evaluate a loan applicant's creditworthiness, you have no business being in the banking business.
    Yes, he's kidding. It's called irony and it is officially a form of humour.
    Actually it’s sarcasm. Irony is a foreign concept beloved by the English and self depreciating Australians.  What is even better is the number of upvotes received by those that piled on oblivious to the sarcasm, and then there was a completely uncalled for Trump link! 
    If it wasn’t sarcasm, it was masterful trollery. 
    Since the comment wasn’t written in such a way as to be obvious sarcasm to anyone, a sarcasm tag is needed, which wasn’t provided. Therefore, no assumption can be made. And considering some of the stupid opinions I’ve read here, I’d give it even odds. 
    h2p
      1 Like 0 Dislikes 0 Informatives
  • Reply 34 of 50
    MacPro said:
     Clarity after all the finger-pointing at Apple. Can't believe this story has received so much traction, considering that if you have an issue with your bank you go directly to them and seek a review. Also, about time Woz returned to the pasture he came from. 
     Agreed on both points.  My guess is this is actually revealing something that is built into the system already, and only because Apple is involved has it risen to the level of national attention.  In a country where women are still not paid equally for the same work, people are surprised the credit ratings are gender biased ... really?
     It’s not gender biased. I’ve developed credit application processing software for Cap One; the things that are evaluated as part of your credit rating are credit score, utilized credit, credit accounts, payment history, income, etc. None of that is based on gender. Plenty of rich women in the world with good credit. Main takeaway: credit is personal; there is no reason to believe spouses share credit ratings. 
     GeorgeBMac, h2p, applesnoranges
      2 Likes 0 Dislikes 1 Informative
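The kind of rule-based limit calculation the reply above describes, driven only by financial inputs, can be sketched as follows. The factors, weights, and thresholds here are entirely hypothetical, not Cap One's or Goldman Sachs's actual rules; the point is that demographics never appear as inputs.

```python
def credit_limit(credit_score, annual_income, utilization, missed_payments):
    """Hypothetical limit rule using only financial inputs, no demographics."""
    if credit_score < 600 or missed_payments > 3:
        return 0.0  # declined outright
    base = min(annual_income * 0.15, 25_000)    # cap exposure vs. income
    score_factor = (credit_score - 600) / 250   # 0.0 at score 600, 1.0 at 850
    util_penalty = max(0.0, 1.0 - utilization)  # heavy revolvers get less
    return round(base * (0.5 + 0.5 * score_factor) * util_penalty, -2)

# Two applicants with identical finances get identical limits, whatever
# their gender -- gender is simply never an input to the function.
print(credit_limit(750, 90_000, 0.2, 0))
```

Of course, as other commenters note, a purely financial rule can still produce gender-skewed outcomes if the financial inputs themselves (income, credit history) are unevenly distributed by gender.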
  • Reply 35 of 50

    dewme said:
     I do not believe for a millisecond that Apple or Goldman Sachs is being intentionally sexist here. I do know that algorithms that are fed with real world data that already has an inherent bias will produce biased results unless the raw results are intentionally post processed to compensate for the bias that’s been baked into the data.

    Yeah it sucks that unbiased scientific algorithms that act on historical real world data have to post-compensate for the traditional biases that are baked into the historical data that is feeding the algorithms.  But until the data that is feeding the algorithms is no longer tainted by bias the owners of the algorithms will have to resort to using compensation to remove the bias that passes through the algorithms. 

    In this case, the algorithms are totally free of bias but the biases that are in the data are leaking into the results produced by the algorithms. 
     Precisely.  Neural nets are very good at learning whatever is included in the data (even if not included by design, and even if the data is curated to try to prevent such bias). So whatever credit data is being obtained from the client and scraped from the internet or credit rating services, it may well (almost certainly does) contain information sufficient to identify the gender, race, marital status, and education of the client.  Just as a deep-learning vision system has identifiable features (edges, shapes, textures, convolutions) at a lower level, it is possible that the Apple/Sachs deep-learning system has identifiable (though unintended, not purposeful) categories for gender, race, marital status, and education level (and maybe also conservative, liberal, Brexit, or other political leanings).
     AI doesn’t exist like it does in the movies. A programmer like me would have to write instructions to parse any of those signals, look for gender/race/etc., and then enforce business rules based on the results. Doing so would be illegal. Which is why we don’t do that. 

    Good material for your next screenplay about run-away AI, but that’s about it. 
    edited November 2019
    GeorgeBMac
      1 Like 0 Dislikes 0 Informatives
  • Reply 36 of 50

    MacPro said:
    Time to move on. This problem has peaked. It will go away in a week. 

    From here on, this is all just free advertising for the AppleCard. 
     While I agree it's time to move on from the implication that Apple is somehow involved, it isn't time to move on from unequal pay and gender credit bias.  I hope this does have some effect on the ludicrous situation in the USA.  How many civilized first-world countries still refuse to codify equal pay in 2019? 
    Sorry, but WTF does "codify equal pay" mean? That's what communist countries do. (Except for what those in charge, their family, and their pals pay themselves in those countries).
     Clearly what he means is that, within an org, equal pay for equal work should be required policy. That isn’t communist. What on earth do you mean? Do you know what communism is?
      0 Likes 0 Dislikes 0 Informatives
  • Reply 37 of 50
    This was the dumbest Apple "scandal" ever. Woz and that one developer on twitter need anger management classes.
      0 Likes 0 Dislikes 0 Informatives
  • Reply 38 of 50
     Rayz2016 Posts: 6,957 member
    An interesting response from GS. 

    They will look at individual cases and make an evaluation based on extra information provided.

    What they won't do is panic and demand their credit score partners make changes to their algorithms because a programmer swore on Twitter.
    GeorgeBMac
      1 Like 0 Dislikes 0 Informatives
  • Reply 39 of 50
     hentaiboy Posts: 1,252 member
    entropys said:
    steveau said:
    tundraboy said:
    revenant said:
    you mean apple will, right? because apple is the one being sexist here. it is their algorithm that they tailored to be sure to give women a subpar credit limit. all goldman sachs does is issue the card on apple's behalf, apple is the one doing all the work, they are a giant bank.
    Are you kidding?  You realize if someone defaults on their Apple Card balance, it's Goldman Sachs that eats the loss, not Apple, right?  You really believe that Goldman Sachs, or any bank for that matter, would allow someone else to make the lending decision for them?  Assessing creditworthiness is THE INDISPENSABLE, MOST IMPORTANT SKILL NEEDED to run a bank.  You know, a bank --those companies that turn a profit by extending credit.  If you have no ability to evaluate a loan applicant's creditworthiness, you have no business being in the banking business.
    Yes, he's kidding. It's called irony and it is officially a form of humour.
    Actually it’s sarcasm. Irony is a foreign concept beloved by the English and self depreciating Australians.  What is even better is the number of upvotes received by those that piled on oblivious to the sarcasm, and then there was a completely uncalled for Trump link! 
    If it wasn’t sarcasm, it was masterful trollery. 

    I think you’ll find that’s deprecating 😉
      0 Likes 0 Dislikes 0 Informatives
  • Reply 40 of 50
    MacPro said:
     Clarity after all the finger-pointing at Apple. Can't believe this story has received so much traction, considering that if you have an issue with your bank you go directly to them and seek a review. Also, about time Woz returned to the pasture he came from. 
     Agreed on both points.  My guess is this is actually revealing something that is built into the system already, and only because Apple is involved has it risen to the level of national attention.  In a country where women are still not paid equally for the same work, people are surprised the credit ratings are gender biased ... really?
    The “wage gap” has been repeatedly debunked, but its backers refuse to accept reality:
    https://m.huffpost.com/us/entry/us_2073804
    edited November 2019
      0 Likes 0 Dislikes 0 Informatives