'Safe Tech Act' could strip Section 230 user content protections from websites


Comments

  • Reply 21 of 26
sdw2001 Posts: 18,016 member
    command_f said:
    I support free speech* but the internet has proved that there need to be some constraints. Paedophilia, terrorism, misogyny and crazy levels of bullying, to name but a few, cannot be left unchecked and unchallenged.

There is always a problem of where to draw a line. We have, at least in the UK, laws against media (TV, newspapers and others) publishing the gross offences (those that we pretty much all agree are wrong). There are time-tested mechanisms for handling borderline cases by taking publishers to court and letting judges decide.

    I fail to see why we don't test social media in the same way: I simply don't believe that the likes of Google and Facebook can extract the last iota of marketable profiling from a user's post yet fail to detect non-borderline unacceptable content (and, in fact, Google and others have actually done admirable, and successful, work against child pornography). Such capabilities can be made available to all forum hosts.

    If a website collects and disseminates information, how is that different from a news site? What distinguishes them from publishers? If it looks like a duck and quacks like a duck...

    *Free speech should admit that you can discuss any subject. It's when it becomes damaging that it's a problem - hard to describe but I think most grown-ups can recognise that when they see it.
I think that I can address some of that.  First, in the U.S., we do have laws that limit speech.  They cover pedophilia, terrorism, threats, harassment, conspiracy, etc., and they apply to all forms of communication.  Secondly, we do have laws regarding the media.  If the New York Times libels someone with false and malicious reporting, it can be sued civilly.  Ditto for pretty much any other outlet, including book publishers and authors.  

The difference with social media is precisely why Section 230 exists.  It aims to indemnify platforms that host speech rather than publish it.  Media sources and publishers are just that...publishers.  They decide what they want to cover and how.  Nothing gets on their platform without their consent.  Social media is different, because there are millions of users posting all manner of things.  For example, Facebook shouldn't be responsible if some random crazy makes a threat or brags about voter fraud.  The individual should be responsible.  This policy was meant not only to encourage expression, but also the development of these platforms themselves.  As has been stated many times, if Section 230 were completely repealed, comment sections would probably disappear from the Internet entirely.  

Facebook and Twitter in particular have become the new public square in many ways.  Instead of just policing obvious criminal or heinous activity/speech, they are acting as publishers in deciding what kind of content they want to host.  These are openly liberal/progressive companies whose employees donate 95% of their political contributions to one party.  They have the power to influence elections through targeting turnout efforts in certain areas and amongst certain audiences.  In other words, they are making editorial decisions like publishers.  However, they are immune from civil litigation under 230.  The law explicitly gives them immunity when it comes to "otherwise objectionable" content.  They use this as an umbrella to limit freedom of expression.  But don't worry, it gets worse.  These same companies donate hundreds of millions of dollars to various causes, often having to do with voter turnout, election law, etc.  Those private organizations then turn around and fund---you guessed it---one political party.  The entire thing is a giant scam.  This is why you see members of one party and major media publishers openly advocating for censorship.  

    Getting away from the political issue for a moment:  I'm not sure what the solution is.  One idea I have heard is that based on their size and reach, these companies should essentially be declared Common Carriers, like the railroads and telecoms were.  That would mean they could not censor or otherwise limit speech unless there was criminal activity involved, including criminal harassment, illegal content, etc.  The reasoning behind this is that the government cannot outsource the limiting of Rights to private corporations (it can't have corporations do what it is not allowed to do), which under 230, seems to be happening now.  That's just one idea though, and I haven't fully examined it.  
  • Reply 22 of 26
sbdude Posts: 261 member
    These rules apply to every periodical published for public consumption in the US. This is hardly a limitation on free speech, not to mention the fact that the Supreme Court has ruled that not all forms of speech are protected. For instance: libel, slander, hate speech during the commission of a crime, speech that incites violence.

Just as newspapers can be taken to task for the "opinions" of their writers and other editorials that don't rise to a journalistic standard (see: Gawker), it's time for social media to take responsibility for the dumpster fires they've created.
  • Reply 23 of 26
Mike Wuerthele Posts: 6,861 administrator
    sbdude said:
    These rules apply to every periodical published for public consumption in the US. This is hardly a limitation on free speech, not to mention the fact that the Supreme Court has ruled that not all forms of speech are protected. For instance: libel, slander, hate speech during the commission of a crime, speech that incites violence.

Just as newspapers can be taken to task for the "opinions" of their writers and other editorials that don't rise to a journalistic standard (see: Gawker), it's time for social media to take responsibility for the dumpster fires they've created.
    Section 230 of the CDA does not apply to "every periodical published for public consumption in the US" in any way.

    You are correct about a publication being responsible for opinions of writers published by the venue, but that has nothing to do with 230. In the case of AppleInsider, libel/slander laws apply to some extent to an Editorial that we publish from time to time, and perhaps, with the widest interpretation possible, what staff posts here in the forums.

    However, as it stands now, 230 applies to places like the AppleInsider forums, where there is non-staff content on display. I don't disagree that there are many dumpster fires caused by social media, but this legislation won't fix that.
edited February 2021
  • Reply 24 of 26
    command_f said:
    It would be sad to lose these columns. We can't see, of course, how much work you do behind the scenes but the appearance is that this is generally a sensible set of people who argue their differences without very often descending to simple abuse or worse. The odd person gets banned and that is presumably the result of your patience becoming exhausted.
Moderating these forums is an incredible amount of work, because everything generates a significant number of unacceptable posts, and so far, automation is useless.

On the aggregate, we know who you are, and you aren't children. Simple abuse and threats are way too common for what is, on average, a significantly older population than, say, Reddit's.
I'm sorry that moderation costs you so much time; the only consolation is that users get to see a reasonable discussion without all the nastiness. Thank you.

    I hope that there is legislation but that it can be crafted to allow you to continue with these forums.
  • Reply 25 of 26
maestro64 Posts: 5,043 member
    MplsP said:
    maestro64 said:
    Just another example, you think you have problem now wait until you see the solution the government comes up with.
The truly ironic part of it all is that some of the people pushing the hardest for 'reform' of Section 230 because of perceived 'infringement of freedom of speech' are the very people that Twitter, Facebook, etc. would be canceling first.
Agree, there are people on all sides who either want it changed or removed totally. Those who think removing the liability clause will make it better do not realize that only the big companies, who already have the lion's share of the market, have the resources to mitigate any risk of liability, and liability threats can be used against smaller competitors. They just need to simplify it all.  They want to treat it like a phone line: you cannot hold the phone company liable for what criminals plan using a phone. But the government thought it was being smart to make platform companies liable for someone planning a bad act. The problem is that the definition of a bad act has expanded to include insulting someone's intelligence. Simple words these days are considered violent if someone thinks they are offensive. There is a whole generation of people whose parents told them their entire lives that they are perfect, and that if someone says something mean to them they should go call the police. It should not surprise anyone that these are the same people trying to shut down people who do not agree with them.
    edited February 2021
  • Reply 26 of 26
Mike Wuerthele Posts: 6,861 administrator
    command_f said:
    command_f said:
    It would be sad to lose these columns. We can't see, of course, how much work you do behind the scenes but the appearance is that this is generally a sensible set of people who argue their differences without very often descending to simple abuse or worse. The odd person gets banned and that is presumably the result of your patience becoming exhausted.
Moderating these forums is an incredible amount of work, because everything generates a significant number of unacceptable posts, and so far, automation is useless.

On the aggregate, we know who you are, and you aren't children. Simple abuse and threats are way too common for what is, on average, a significantly older population than, say, Reddit's.
I'm sorry that moderation costs you so much time; the only consolation is that users get to see a reasonable discussion without all the nastiness. Thank you.

    I hope that there is legislation but that it can be crafted to allow you to continue with these forums.
    For me, at least, the discussion without the nastiness is worth the labor. I don't write the checks, though.