This case against TikTok might spur the Section 230 reform we desperately need
Section 230 needs fixing, and this case may be the catalyst.
Section 230 of the Communications Decency Act is the law that allows the Internet to function as it does today. Without it, your favorite website would either cease to exist or change in ways that would make it unrecognizable. We need these protections because, without them, we would have no way to express ourselves online unless we agreed with whoever is tasked with moderating the content.
But it's also a very broad law that needs to be reformed. When it was written in 1996, nobody could predict the power that a few tech firms would wield or how much influence social media sites would have on us all. As situations change, the laws governing them must do the same.
The US Court of Appeals for the Third Circuit recently ruled that ByteDance, TikTok's parent company, can be held responsible for distributing harmful content even though Section 230 shields it as that content's publisher. The case is a tragic one: a 10-year-old girl tried the "blackout challenge" she saw in a TikTok video and died of asphyxiation as a result.
The child's mother sued for negligence and wrongful death, and the case worked its way through the courts to the Third Circuit. The next stop is the Supreme Court. While the case is a terrible one, the Third Circuit's ruling may be what's needed to revamp Section 230 so that it holds Big Tech "accountable" while still shielding it.
Android Central has reached out to TikTok for a statement and will update this article when we receive one.
There's a difference between a publisher and a distributor. If I write a post on X or make a video on TikTok encouraging illegal activity, X or TikTok is only publishing it. Once its algorithm picks it up and forces it upon others, it is distributing it.
You really can't have one without the other, but the Third Circuit has decided that Section 230, which states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," does not protect the publisher from the consequences of distributing the content.
I don't agree with the Third Circuit's reasoning here, simply because content is only distributed as a result of being published. Then again, I have no say in the matter because I'm just some dude, not a circuit court judge. The ruling does point out that social media giants need some incentive to better police their content, or the law needs to be changed.
No, I'm not calling for censorship. We should be able to say or do any dumb thing we want as long as we are willing to deal with the consequences. But the Metas and ByteDances of the world don't have to like what we say or do and can yank it down any time they like as a consequence.
Without Section 230, they would do it a lot more often, and that's not the right solution.
I have no idea how to fix things, but I don't need to know how to fix them to know that they're broken. People collecting much larger salaries than mine are responsible for that.
I know a 10-year-old child should not be enticed to asphyxiate herself because TikTok told her it was cool. I know nobody working for ByteDance wanted her to do it. I also know that no amount of parental control could prevent this from happening 100% of the time.
We need legislation like Section 230 to exist because there is no way to prevent terrible content from slipping through even the most draconian moderation. But it needs to be looked at again, and lawmakers need to figure it out. Now could be the right time to do it.
Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.
Mooncatt: I think Section 230 itself is OK. The problem is that social media companies want to be distributors with the protections of publishers, i.e., picking and choosing what to moderate and remove on a whim, even when it's not in good faith. Like when Facebook partnered with the US government to suppress certain posts like a distributor, but claimed it was protected as a publisher.
There needs to be accountability when those companies don't follow their own rules. I have been reporting adult content on Facebook right and left that shouldn't be there (and that I never wanted to see), but it's never removed. Maybe I should "accidentally" let one of my kids see it and sue them for providing such content to minors?
Now, to the case at hand with the 10-year-old girl trying that challenge: that is an age where it's still the parents' primary responsibility to supervise what she watches. It's no secret that TikTok is full of garbage, and I question why she was allowed to watch it in the first place. It's a tragedy that it happened, but that doesn't mean TikTok should automatically be the bad guy here.