You mean Apple will, right? Because Apple is the one being sexist here. It is their algorithm that they tailored to make sure women get a subpar credit limit. All Goldman Sachs does is issue the card on Apple's behalf; Apple is the one doing all the work. They are a giant bank.
This was sarcasm, wasn't it? You just forgot to add the '/s'. Or thought it was obvious.
Otherwise a very uneducated comment.
Ah, the effort to find discrimination in absolutely everything. Why the attention here? Because it's click-bait. High profile. Apple and Goldman. Headline grabbers.
News flash: Banks extend credit based on your credit score, credit history, employment, debt-to-income ratio, and the calculated likelihood of you being able to pay back the loan - on an individual basis. Your race, gender, marital status, community property, and tax returns have nothing to do with it.
Clarity after all the finger-pointing at Apple. Can't believe this story has received so much traction, considering that if you have an issue with your bank you go directly to them and seek a review. Also, about time Woz returned to the pasture he came from.
Agreed on both points. My guess is this is actually revealing something that is built into the system already, and only because Apple is involved has it risen to the level of national attention. In a country where women are still not paid equally for the same work, people are surprised the credit ratings are gender biased ... really?
Time to move on. This problem has peaked. It will go away in a week.
From here on, this is all just free advertising for the AppleCard.
While I agree it's time to move on from the implication that Apple is somehow involved, it isn't time to move on from unequal pay and gender credit bias. I hope this does have some effect on the ludicrous situation in the USA. How many civilized first-world countries still refuse to codify equal pay in 2019?
I think it's funny that people try to take swipes at "woke" et cetera when the reality is that Apple's card offer was only interesting to consumers specifically because the credit card industry was already generally viewed as an unethical dumpster fire.
I think a lot of people are getting foolishly hung up on the word "bias" because their default reaction to data-driven algorithms that spit out results they don't agree with is to assume the algorithms were purposely tainted by nefarious actors. It used to be very easy to see what people claimed was "gender bias" in the word prediction logic on your iPhone. If you typed something like "The doctor" or "The engineer" followed by a space, the next-word prediction candidate list would offer up the word "he" but not "she" as one of the choices. The reason for this was that the prediction logic, driven by frequency of association, would make "he" a more likely match. This was strictly an artifact of the probabilistic processing and not a consequence of someone at Apple assuming that all doctors and engineers are men. Of course this data-driven rationale didn't placate the overly sensitive reactions from people who are acutely tuned to assume nefarious behavior by Apple.
So now, in our newest iPhones, Apple has to specifically trap data driven results that may be misinterpreted as being biased and post process them to sanitize what the user sees. You'll now see both "he" and "she" and emojis of similar choice in response to the same text input and next word predictions. I don't have a problem with this because you never want to offend people, either intentionally or unintentionally, but I don't believe it was ever fair to blame Apple for intentionally skewing the algorithmic results to promote their own gender biases, which never existed in the first place.
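The frequency-driven prediction and the post-processing step described above can be sketched in a few lines. Everything here is a toy: the corpus, the ranking rule, and the gendered-pair table are all invented for illustration, not Apple's actual implementation.

```python
from collections import Counter

# Toy corpus: "he" follows "the doctor" more often than "she" in the
# training text, so a purely frequency-ranked predictor surfaces only "he".
corpus = [
    ("the", "doctor", "he"), ("the", "doctor", "he"),
    ("the", "doctor", "he"), ("the", "doctor", "she"),
]

def predict_next(context, corpus, k=1):
    """Rank candidate next words by how often they follow the context."""
    counts = Counter(w for *ctx, w in corpus if tuple(ctx) == context)
    return [w for w, _ in counts.most_common(k)]

def debias(candidates, pairs={"he": "she", "she": "he"}):
    """Post-process: if one word of a gendered pair appears, surface both."""
    out = list(candidates)
    for w in candidates:
        if w in pairs and pairs[w] not in out:
            out.append(pairs[w])
    return out

raw = predict_next(("the", "doctor"), corpus)  # frequency alone -> ["he"]
print(debias(raw))                             # -> ["he", "she"]
```

The point of the sketch: `predict_next` contains no opinion about doctors, yet reproduces the skew in its data; the fix lives in a separate post-processing pass, exactly as the comment describes for the newer iPhones.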
I do not believe for a millisecond that Apple or Goldman Sachs is being intentionally sexist here. I do know that algorithms fed with real-world data that already has an inherent bias will produce biased results unless the raw results are intentionally post-processed to compensate for the bias that's been baked into the data.
Yeah, it sucks that unbiased scientific algorithms acting on historical real-world data have to post-compensate for the traditional biases baked into that data. But until the data feeding the algorithms is no longer tainted by bias, the owners of the algorithms will have to resort to compensation to remove the bias that passes through.
In this case, the algorithms are totally free of bias, but the biases in the data are leaking into the results the algorithms produce.
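The pass-through effect described here can be made concrete with a minimal sketch: the "algorithm" is a perfectly neutral average, but because the historical records it consumes already contain a gap between two groups, the gap reappears in its output until a compensation step is applied. All numbers and group labels are made up.

```python
# Invented historical records: group "B" was historically granted lower limits.
history = [
    {"group": "A", "limit": 10000}, {"group": "A", "limit": 12000},
    {"group": "B", "limit": 6000},  {"group": "B", "limit": 7000},
]

def recommend_limit(group, history):
    """Neutral rule: recommend the historical average for similar applicants."""
    limits = [r["limit"] for r in history if r["group"] == group]
    return sum(limits) / len(limits)

print(recommend_limit("A", history))  # 11000.0 -- the old gap passes through
print(recommend_limit("B", history))  # 6500.0

def compensate(limit, group, offsets={"B": 4500}):
    """Post-processing: correct for the measured historical gap (offset invented)."""
    return limit + offsets.get(group, 0)

print(compensate(recommend_limit("B", history), "B"))  # 11000.0
```

Nothing in `recommend_limit` mentions the group in a prejudiced way; the bias lives entirely in `history`, which is the comment's point.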
Precisely. Neural nets are very good at learning whatever is included in the data (even if not included by design, and even if the data is curated to try to prevent such bias). So whatever credit data is being obtained from the client and scraped from the internet or credit rating services, it may well (almost certainly does) contain information sufficient to identify the gender, race, marital status, and education of the client. Just as a deep learning vision processing system has identifiable features (edges, shapes, textures, convolutions) at a lower level, it is possible that the Apple/Sachs deep learning system has identifiable (though unintended, not purposeful) categories for gender, race, marital status, and education level (and maybe also conservative, liberal, Brexit, or other political leaning).
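The "proxy variable" idea above can be illustrated without any neural net at all: even when the protected attribute is never given to a model, a correlated feature can reconstruct it. The feature, values, and rows below are all invented for the sketch.

```python
from collections import Counter, defaultdict

# Invented rows: gender is never a model input, but a correlated
# feature (a made-up shopping category) predicts it fairly well.
rows = [
    {"shop": "boutique", "gender": "F"}, {"shop": "boutique", "gender": "F"},
    {"shop": "hardware", "gender": "M"}, {"shop": "hardware", "gender": "M"},
    {"shop": "boutique", "gender": "M"},
]

def proxy_accuracy(rows, feature, target):
    """How well the majority target per feature value recovers the target."""
    by_val = defaultdict(Counter)
    for r in rows:
        by_val[r[feature]][r[target]] += 1
    correct = sum(c.most_common(1)[0][1] for c in by_val.values())
    return correct / len(rows)

print(proxy_accuracy(rows, "shop", "gender"))  # 0.8 -- gender is recoverable
```

A deep net given hundreds of such features can stitch proxies together far more effectively than this one-feature majority rule, which is why simply dropping the gender column does not guarantee a gender-blind model.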
Sorry, but WTF does "codify equal pay" mean? That's what communist countries do. (Except for what those in charge, their family, and their pals pay themselves in those countries).
As this unfolds, it reminds me of back in the '60s, when there was systemic bias against blacks in many areas (employment, housing, banking & credit, etc.). Even after the civil rights laws were passed, the discrimination that remained came not from explicit bias but from implicit bias built into various rules and practices.
The result was the affirmative action programs to help bring black people up to the level needed to meet necessary criteria -- such as giving them special rights in education to help build their abilities and special rights in employment to help build their social status and income. It was needed and ultimately the whole country benefited.
This so called bias by GS is increasingly looking to be a "bias" against all people with low incomes.
The question is: Do we want to force them to make allowances for people with low incomes? Females, displaced workers, young people, retirees, immigrants and others all say "YES!" But:
-- Is it the smart thing to do?
-- Is it the right thing to do?
I say no to both (even though I am in one of those demographics affected by this).
Are you kidding? You realize if someone defaults on their Apple Card balance, it's Goldman Sachs that eats the loss, not Apple, right? You really believe that Goldman Sachs, or any bank for that matter, would allow someone else to make the lending decision for them? Assessing creditworthiness is THE INDISPENSABLE, MOST IMPORTANT SKILL NEEDED to run a bank. You know, a bank -- those companies that turn a profit by extending credit. If you have no ability to evaluate a loan applicant's creditworthiness, you have no business being in the banking business.
Yes, he's kidding. It's called irony and it is officially a form of humour.
Actually it’s sarcasm. Irony is a foreign concept beloved by the English and self depreciating Australians. What is even better is the number of upvotes received by those that piled on oblivious to the sarcasm, and then there was a completely uncalled for Trump link!
If it wasn’t sarcasm, it was masterful trollery.
Since the comment wasn’t written in such a way as to be obvious sarcasm to anyone, a sarcasm tag is needed, which wasn’t provided. Therefore, no assumption can be made. And considering some of the stupid opinions I’ve read here, I’d give it even odds.
It’s not gender biased. I’ve developed credit application processing software for Cap One; the things evaluated as part of your credit rating are credit score, utilized credit, credit accounts, payment history, income, etc. None of that is based on gender. Plenty of rich women in the world with good credit. Main takeaway is that credit is personal; there is no reason to believe spouses share credit ratings.
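A rule built only from the factors this comment lists would look something like the sketch below. To be clear, the weights, thresholds, and formula are invented for illustration; no real issuer's model is being described, only the shape of a gender-free one.

```python
def credit_limit(score, utilization, on_time_rate, income):
    """Toy limit rule using only the listed factors: credit score,
    utilized credit, payment history, income. Not a real model --
    every constant here is made up."""
    if score < 600 or on_time_rate < 0.9:
        return 0                            # hard decline on weak repayment signals
    headroom = 1.0 - utilization            # penalize already-maxed-out credit
    return round(income * 0.2 * headroom * (score / 850), -2)

print(credit_limit(score=750, utilization=0.3, on_time_rate=0.98, income=80000))
# -> 9900.0
```

Note the sketch's inputs contain no demographic fields at all, which is the comment's claim; the earlier replies in this thread argue the subtler point that such fields can still leak in through correlated inputs.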
AI doesn’t exist like it does in the movies. A programmer like me would have to write explicit instructions to parse any of those signals, look for gender/race/etc., and then enforce business rules based on the results. Doing so would be illegal. Which is why we don’t do that.
Good material for your next screenplay about run-away AI, but that’s about it.
Clearly what he means is that, within an org, equal pay for equal work should be required policy. That isn’t communist. What on earth do you mean? Do you know what communism is?
They will look at individual cases and make an evaluation based on extra information provided.
What they won't do is panic and demand their credit score partners make changes to their algorithms because a programmer swore on Twitter.
I think you’ll find that’s deprecating 😉
https://m.huffpost.com/us/entry/us_2073804