Section 230

Phoebe
Posts: 4029
Joined: Thu Nov 26, 2020 2:57 pm

Section 230

Post by Phoebe »

Apparently the Supreme Court is hearing a challenge on this, and nobody really knows what the justices think or might do. This is the portion of the Communications Decency Act (I think) shielding internet providers from lawsuits if they platform objectionable content that might otherwise get them sued. I.e., lower court interpretations of this provision have protected social media companies and internet service providers from being sued when their users create objectionable content.

But things have changed since the advertising model shifted to putting as many things in front of viewers' eyes for as long as possible. Nina Totenberg did a nice segment on this earlier this morning, and her example was a family suing YouTube for promoting ISIS videos, which they contend ultimately led to the death of their daughter, who was killed in an attack in Paris.
She also quoted representatives of the pro and con sides before landing on what the Biden administration had to say. Yet again Biden is making sense to me: 230 should give companies protection for hosting until they start making choices that manipulate the content by promoting certain things. At that point they've done something that goes beyond merely hosting content for which someone else is responsible, because they're deliberately promoting and recommending it.

I feel like making this the legal position would solve a lot of the things I think are bad about social media. But I'm leery of embracing anything that's going to turn out to dampen the freedom to put things online. Maybe that's the correct interpretation of what the companies are doing wrong, but if those companies overreact by strictly policing everything that gets posted, far beyond what would be necessary merely to comply with the law, then it won't be a good outcome. So I'm not sure what to make of it.

Re: Section 230

Post by Phoebe »

Correction: the family in that example is actually the plaintiff in one of the cases.

Re: Section 230

Post by Phoebe »

Two other recent examples that temper my inclination to be absolutist about free speech and open platforms in this legal context:
First is that when you go on YouTube, they have a new thing where short videos will play, TikTok-style, as you scroll down from the video you were watching. I don't know if I'm accidentally touching it as I scroll or if it just starts playing, but YouTube chooses to put it there. I really dislike this because I find it difficult to look away if they show me a cute dog or an animal doing something interesting, and I don't want to have to fight against that kind of distraction just because I'm looking at a video elsewhere. But I've noticed that after letting a video like this play and continuing to scroll, it takes you to more short videos, and now you're out of the place you were before with the regular videos.
Every time this has happened to me, I've been given a video of Ben f****** Shapiro. And I don't watch that dude at all, or even hate-watch him, so there's no reason to give me this other than that YouTube thinks it'll keep me watching, much like the cute animal videos. The content is basically offensive to me; in the most recent example he was arguing for questioning the judgment of sexual assault victims. So how is YouTube not making a decision to promote that content to me? Even if they haven't specifically picked out "promote b******* about sexual assault," they've created a system in which that content is being promoted to me when I never chose it and never would have chosen it.

This is even worse than the fairly passive practice of suggesting other videos based on the one you're currently watching. I think that is also a manipulation, but we're clearly into worse territory when I'm no longer making a choice to click on the recommended content - it's just given to me if I move my finger across the screen, even inadvertently.
Given that it's on my account, I can assume the same thing is now happening to my kids. And honestly, I think YouTube should be accountable for it. That's not about free speech or a free marketplace of ideas.

The same is true when it comes to Twitter and Elon. Although we let advertisers get away with a great deal of lying, some limits on that dishonesty still exist. A clear example: if I asked a broker whether they'd receive any kind of financial incentive should I purchase an investment they're recommending, they can't legally lie to me about that. There's a lot of wiggle room in how things are represented, but they can't lie about that.

Meanwhile, when Musk was asked whether the Twitter algorithm is promoting his tweets, he allegedly claimed it may have happened once for some specific reason but that it's no longer happening. Pure B.S. This may not work if you already have a Twitter account, but I do not, so when I go to Twitter to follow up on a news story and scroll down under the tweet that was linked, the very first thing I see, every single time, is a tweet from Musk. He can put his tweet there if he feels like it, but can he lie about it, pretending his product works a certain way that it doesn't? Maybe he can lie to users, but can he lie to his advertisers about this type of thing? I don't know, but the door is open for regulation of the kind the Supreme Court might decide to step in and perform, and I no longer think that being on the side of free speech means preventing that regulation. Maybe there are things that companies providing online services are going to have to be honest about, and maybe when they make deliberate decisions to promote certain kinds of content, whether directly or indirectly, they're responsible for it.

I don't know where the line exists between facilitating the speech of others and manipulating it in a way that makes you responsible for that manipulation. But there is a line somewhere, imo, because at some point that manipulation is no longer okay and should not be legally unfettered.