I don't think it's unreasonable to allow platforms to moderate content. Personally I'd rather they not censor anything and let me decide after a warning (and being able to disable "warnings" would be better still - the amount of ideological "warnings" is getting nuts on YouTube).
What's unreasonable is the completely opaque and inconsistent way they moderate content, often with guidelines that are non-existent or unpublished.
Sunlight is the best disinfectant. I think if they want to keep their Section 230 safe harbor, companies should be required to:
1) Publish their specific rules publicly. No more "we reserve the right for any reason". You want to hide behind "any reason"? No problem - no more 230 for you. 2) Publish all moderation decisions publicly, citing specifically how the rules from #1 produced the moderated outcome.
It won't be perfect, but it should lead to a LOT more consistency. Twitter is rife with left-wing violence, but if any other group posts violent content it's removed immediately. Either it should all be removed, or none of it.
Nonsense. You don’t like that private platform owners are exercising their own freedom of expression by removing content that violates their private platform rules. It’s their billboard; they get to decide what goes on it. They don’t have to publish material that breaks their rules, intentionally spreads misinformation, advocates violence, etc. It’s in their interest not to do so. The right-wing QAnon believer who shot up a pizza parlor thankfully didn’t kill anyone, but just as easily could have. It’s in a private platform’s interest not to assist such lunatics and their plans.
Twitter is not rife with left-wing violence. It’s rife with video of protests, the vast majority of which are peaceful. It also documents police violence and government violations of the freedom to peaceably assemble. Equating videos of racist right-wing violence against people with videos of property damage is a fallacy.
The private platform part is what makes this so challenging from a regulation perspective. When you boil Facebook, Twitter, et al. down to their core, they really are no more than brokers that provide a way to connect their only real customers - advertisers - with a monetizable asset: subscriber eyeballs. From a business perspective, the only thing Facebook or Twitter cares about is selling access to its subscribers' eyeballs to its advertiser customers. Everything else is just noise and overhead. As long as the number of eyeballs stays steady or keeps increasing, life is good for Facebook and Twitter.
Sure, to keep all those eyeballs showing up to consume advertising, Facebook and Twitter have to provide something in return; that's their overhead cost. If subscribers start getting nasty toward one another, or if advertisers start crafting their ads to attract a specific audience or promote self-interests beyond simply making more money, negative side effects start to take place. The problem is that the fundamental business model - connecting advertisers to eyeballs - has no inherent control mechanism to avoid negative side effects like social disruption. Facebook and Twitter can continue to profit while the negative side effects emitted from the platform run amok, so something must be done. In control system theory the solution to an unconstrained response is always the same: a negative feedback loop.
The only question is who gets to apply the negative feedback. In my mind the beneficiaries of the business model should be the ones responsible for keeping their system under control, just like the operator of a nuclear power plant is responsible for keeping the plant operating safely while deriving a financial profit from operating it. Government regulation in the Facebook and Twitter case would really be directed at mitigating the negative side effects, not controlling the system itself.
Bottom line: I would place full responsibility on the owners of the system (the social platform business) - in other words, require them to moderate their content, with strict penalties for not doing so or for letting negative side effects leak onto the general public, like a radiation cloud. They should not be allowed to take the position that they don't care if their system is abused or causes public suffering. They are getting off way too easy.
First, all the platforms started out allowing any content (and I am not talking about illegal content - I think everyone agrees that should not be available), and everything was fine and good.
Then came a group of people who decided that what they were reading - or worse, what others were reading - was not acceptable to them, and that they did not want others to read or see it. They pressured other companies to stop spending money with the tech companies, money that in part worked its way to the content creators. Before we knew it we had content being removed, shadow-banned, or demonetized. Now we have a group of people (accountable to no one) deciding what content is acceptable and who is allowed to make money from their content.
The companies have a couple of choices here: go back to the original model where all content is welcome, or come up with a set of standards and hold all content to those standards - so the loudest voices do not get to decide - or else become liable for everyone's content. They tried playing the middle ground, responding only to the outrage crowds while claiming they are not responsible for people's content, but in reality they were acting as editors, deciding what people could read and see.
I am sorry, but in the US individuals do not get to decide what others can say and do, or who is allowed to make money for what they say and produce. In this country people vote with their own patronage; they do not get to force others to see things their way.
The solution is even simpler ... just start treating private social media platforms as businesses. That's exactly what they are, no different than the Quicky Mart across the street. If you walk into a Quicky Mart and cause a disruption, start an argument with another patron, or cause damage to the property or surrounding properties, you can be trespassed from the business, arrested for returning to the premises, and potentially charged with a crime. The Quicky Mart is not a public forum for the expression of free speech; it's a business. Twitter and Facebook are not public forums; they are businesses. As such they can remove you as a user, remove you as a vendor, delete your posts, filter your posts, and do any number of things that many would call censorship, as long as they act in a non-discriminatory way (in a legal sense) when enforcing rules that apply to everyone who uses their business.
Likewise, if the Quicky Mart starts selling tainted meat or germ-laden Mega Gulps, it should be held responsible for the damage inflicted on those affected.
Yes, I get that it always comes down to who gets to decide what's considered "damaging" to the public, and that words and deception are not the same as radiation leakage from a non-compliant power plant. But I am comfortable with us having a clearly defined set of roles and responsibilities established between regulatory bodies and businesses, much like we have with many other businesses, and with mass media businesses like TV and radio in particular. I'm okay with regulators establishing what is acceptable, preferably with public input, and with punishing the businesses involved for non-compliance, even if the non-compliance originates with one of their subscribers. Yes, they'll have to do more work and spend some real money on real solutions rather than turning a few folks into billionaires 500 times over. These social media companies have been getting a free ride by bamboozling regulators who are naïve and overly intimidated by the technology behind social media businesses. If regulators just treat social media businesses like real businesses and make them play by the same rules that any other business must play by, we'd at least be on a resolvable path. Settling two variables - 1) who sets the rules and 2) who is liable for damage caused by the business - would at least be a step in the right direction.
Had big tech left Trump alone, none of this would have happened. This is fact, not a defense of the President. And it's very sad too because FREEDOM is preferred over draconian freedom-limiting changes proposed to limit 230 protections. It's also interesting to consider that the President is against the US being policeman of the planet (as am I), and yet policing user content is what these 230 changes are all about. Sometimes it's best not to rock the boat.
Now that I've read more from the DOJ, I don't think it intended to do what I previously thought it intended to do - i.e., remove the immunity which providers have for content they allow if they choose to censor some material in objectionable ways.
The wording of the proposed changes wouldn't have that effect and it seems that wasn't the intended effect. These changes aren't as substantial - or as bad - as I had originally thought. Even if this proposal was enacted, providers (and users, for that matter) would still enjoy unconditioned immunity for (i.e. not be treated as publishers of) the speech of others which they left up.
EDIT: I should be clear, the immunity provided by Section 230(c)(1) wouldn't be unconditional. But it wouldn't be conditioned on the provider not censoring certain material. The proposed changes actually place some meaningful conditions on Section 230(c)(1) immunity.
They take away a little of the right to free speech at a time, until you're arrested for thinking.
This is like pudding left with no cover in the refrigerator for a week and the thick skin on top has a crevasse in the middle and is pulled away at the edges and the whole thing has a gross brown liquid sloshing around.
I frequently get censored online... I don’t curse or attack anyone, my comments just disappear.
I stopped commenting (or going to) sports.yahoo.com because the posts would just randomly disappear. Frequently, I’m just saying the same thing as 100 other people.
Censorship lives! ...and they want to make it worse?
I suspect most of these social media companies have AIs removing content. I wouldn’t be surprised if it’s all outsourced to India, etc. Who are they going to sue? This is politicians being politicians... they’re probably catering to the people who can barely use a computer.
“Tough on crime!”
“War on drugs!”
It all sounds good, until the policies are actually implemented. Then, it’s a bureaucratic mess and a money pit...
The problem I see is people - really dumb, easily manipulated people, too eager to believe anything they read on the Internet, who then run with it, spreading it like wildfire.
I'm getting to the point where I think things need to be vetted. As long as it's based on facts, truth, and real science, then yes, don't censor anything. But when any nut-job (including #45) is allowed to post nonsense as fact, then I'm beginning to think that platforms like Facebook and Twitter need to start taking responsibility. It's hard, and I have zero idea how anyone, or anything, can pull that off.
And hasn't this always been the problem with humans? There have always been smart, wily, ethically challenged people who used the easily manipulated to do their dirty work. In ancient times there was the town square. You could stand up and shout out your views. If you said something silly you might get a tomato in the kisser. If you said something crazy you might get some more damaging projectiles hurled your way. If you said something dangerous you might get the stocks for a few days, or worse. This kept things in balance and kept crazy, dangerous people on the fringes. Throughout the technological development of communications, mass comms tended to stay in the hands of the rich and civic-minded. The fringe still had no effective voice in society. Then the Internet happened, and a bunch of Silicon Valley libertarians with limited understanding of human nature gave mass communications to everyone. Consequences were not considered. Knowing that human intelligence exists on a bell curve, and that at any moment in time 50% of the population has an IQ of 100 or less, is giving everyone access to unfettered mass comms a good idea?
I honestly have no idea what you are saying other than your own cryptic attempt to disagree. In my eyes, your analogy is akin to getting upset with another person and then voicing your disagreement to that person in a foreign language. Sometimes simple is best. Please just say, "I disagree" or "I disagree because of X, Y and Z."
It is a fact that if big tech did not flag Trump's post, that lion would not have roared back against them. Passion for social justice can and often does backfire. Sometimes we need to leave things alone and just try to live in harmony with others despite disagreements. And toward that end, I will not write anything further on this subject. All that really needs to be said has been said.