If you can't convert a file into a PDF, maybe you shouldn't be writing laws about technology.
Not everyone can be trusted with the almighty power of a sudo command in the Terminal. Yet politicians who know nothing about the inner workings of computers, software, or the internet at large have declared themselves superusers, proposing dangerously misinformed policies and laws that could have disastrous effects.
In Washington's latest extended facepalm session, lawmakers have taken issue with Section 230 of the Communications Decency Act of 1996, which shields tech companies from lawsuits over the content users post on their sites. In a congressional hearing held last week, politicians on both sides of the aisle made it abundantly clear they have no idea what they are talking about.
During questioning, Rep. Gus Bilirakis of Florida asked Facebook Chief Executive Mark Zuckerberg if he had concerns about content posted on YouTube. In case you've been living under a rock, you already know that YouTube is a video hosting platform owned by Google, not Facebook.
"Congressman, are you asking me about YouTube?" Zuckerberg asked, with Bilirakis responding yes, he wanted to talk about YouTube. Perhaps Congress should have Elon Musk, the CEO of electric carmaker Tesla, give his thoughts on the safety of Huffy electric bicycles. Close enough, right?
Make no mistake -- this is not an endorsement of Facebook, YouTube or Twitter, all of which are cesspools of misinformation and hate speech that have served to further sow division in the U.S. and around the world. When asked whether they felt their platforms contributed to the attack on the U.S. Capitol on Jan. 6, only Twitter CEO Jack Dorsey admitted his platform played a part, while Zuckerberg and Google CEO Sundar Pichai deflected any blame or accountability.
However, unless politicians ask the right questions and hold these online platforms accountable in appropriate and effective ways, real reform will never happen. And it's hard to ask the right questions when you don't even know what it means to click those like and subscribe buttons on YouTube, let alone who owns YouTube.

It wasn't always this way. From 1972 to 1995, the Office of Technology Assessment provided Congress with objective analysis of complex issues related to technology and science.
And that's the way it should be. Politicians are not expected to cut together their own TikTok dance videos, nor should they be elected because of no-filter Instagram pics of their plated lunch. They have interns to handle that for them.
The OTA was ultimately dismantled because of -- you guessed it -- politics. Republicans in Congress viewed the office as wasteful and counter to their own interests.
Now, nearly three decades later, being baffled by technology is a bipartisan issue. Neither party has shown that it understands complex technology issues well enough to legislate on their own, without some sort of guidance from experts.
The irony of it all is that Congress is indeed on the correct path here -- online platforms should be held accountable for misinformation and dangerous speech from users, particularly when their technology disseminates, promotes, and amplifies such content.

The algorithms of Facebook, YouTube and Twitter are designed to increase engagement, encouraging users to spend more time on their pages, thus earning the companies more money. But studies have repeatedly shown that misinformation is among the most engaging content shared on social media. The platforms serve up extremism because visitors consume it, fueling a dangerous cycle that spreads dangerous and outright false claims from groups like QAnon, anti-vaxxers, and white supremacists.
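To make that dynamic concrete, here is a toy sketch of engagement-weighted ranking -- the field names, weights, and scoring function are entirely hypothetical, not any platform's actual code -- showing why content that provokes the most reactions floats to the top of a feed regardless of whether it is true.

```python
# Toy sketch of engagement-weighted feed ranking (hypothetical example;
# the fields, weights, and scoring below are invented for illustration,
# not any platform's real algorithm).
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    comments: int


def engagement_score(post: Post) -> float:
    # A feed that optimizes purely for engagement ranks a post higher
    # the more reactions it provokes, with no term for accuracy.
    return 1.0 * post.clicks + 3.0 * post.shares + 2.0 * post.comments


def rank_feed(posts):
    # Sort descending by score: the most provocative content rises to the top.
    return sorted(posts, key=engagement_score, reverse=True)


feed = rank_feed([
    Post("Measured, accurate report", clicks=120, shares=5, comments=8),
    Post("Outrageous false claim", clicks=300, shares=90, comments=150),
])
print([p.text for p in feed])  # the false claim ranks first
```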
Asking Facebook, Google and Twitter to self-regulate on these issues would be like trusting an alcoholic to captain a booze cruise. And the heart of Section 230 is just 26 words -- a law older than all three of the companies that have been called to testify before Congress.
Our politicians should absolutely be examining the full complexity of these issues -- but they should also be doing their homework first instead of aimlessly flailing away and posturing for the camera. Instead, the CEOs of Facebook, Google and Twitter were asked last week whether they had seen "The Social Dilemma" documentary on Netflix, and whether they have been vaccinated against Covid-19. Real pressing questions.
At one point, Arizona Representative Tom O'Halleran lobbed out a question for a "Mr. Zuckerman." If they ever find Mr. Zuckerman, perhaps they'll ask him if he's seen the 1998 film "You've Got Mail." Seems relevant enough to a law written in 1996, being critiqued by people who probably only have the ability to read their AOL email addresses.
Comments
Sure, you can say they can't figure it out themselves, or that they'll only talk to people who tell them what they want to hear -- either way you wind up back in the same place: an uninformed lawmaker deciding how things should work.
However, there's also some chance they might actually figure something out if they went out and tried to understand it themselves, perhaps by listening to both sides... which is probably rare.
What you folks say here in the forums, as long as we make a good-faith moderation effort, we are not responsible for. And if we choose to moderate the forums according to an ideological agenda, that STILL doesn't make us a publisher of what THE USER says in the forums.
If you want to convince the Supreme Court that companies don't have First Amendment rights and overturn three decades of precedent, hey, go ahead, because that's what it's going to take to force an ideologically neutral standpoint. But until they lose those rights, they may moderate as they see fit.
And the proposals for 230 reform that both parties have put forth? They would mandate MORE moderation, not LESS. Anything even remotely questionable or in the slightest bit untrue would need to be purged.