EPISODE 103: Section 230 – Mend It or End It, with Klon Kitchen


If you’ve ever had a post flagged on or removed from social media, you need to know about the arcane-sounding Section 230 of the Communications Decency Act.

Klon Kitchen, director of the Heritage Foundation’s Center for Technology Policy, joins me to take an in-depth look at Section 230, called “the 26 words that created the Internet,” and at how the Big Tech and social media companies have squandered the public’s trust by abusing its privileges.

#Sec230 #FreedomOfSpeech #MarkZuckerberg #JackDorsey #Censorship




EPISODE 103 TRANSCRIPT

Episode 103: Klon Kitchen – Section 230 – Mend It or End It

Bill Walton (00:08):

Welcome to the Bill Walton Show. Last week I joined a long and growing list of people and organizations who’ve had their ideas censored by one of the social media companies. YouTube decided that a show that I did with Dr. Jay Richards about COVID-19 did not meet its, quote, “community guidelines.” Well, I could cite chapter and verse about how the show presented a well-researched and reasoned argument about the social, economic, and emotional costs of the lockdowns, and especially about better alternatives to protect Americans. Today, I’d rather dig into the larger issue. What gives YouTube and the other social media companies the right to choose what should and should not be part of public debate? Why do they have the power to stand between us and our First Amendment rights? The answer is complex, and understanding it gets right at the heart of their power.

It’s something called Section 230 of the Communications Decency Act. We all need to understand this thing called Section 230 and what it should or should not be doing to protect or interfere with our rights of free speech. With me to explain Section 230 and how to fix it is my frequent guest and friend Klon Kitchen, director of the Center for Technology Policy in the National Security and Foreign Policy Institute at the Heritage Foundation. Welcome, Klon.

Klon Kitchen (01:45):

Hey, it’s great to be here.

Bill Walton (01:46):

So you just posted something on the Heritage site, which I highly recommend everybody read, entitled Section 230: Mend It, Don’t End It. First, let’s dig into it. What is Section 230?

Klon Kitchen (02:00):

Yeah, I’ll do this as quickly and with as little nerdiness as I can. It’s a part of the statute, as you mentioned, called the Communications Decency Act, and Section 230 is a particular portion of that act that lays out liability protections for internet companies. The brief background on it is that in the early days of the internet, in the mid ’90s, Congress decided that it wanted to free websites to remove some of the worst things on the internet from their sites, things like pornography and all kinds of defamatory language. And so they created a protection called Section 230 that provided liability protection for these companies if they removed that content from their platforms, so that they wouldn’t be in fear of being sued for abridging people’s free speech rights.

So the original intent of Section 230 is laudable, and it’s easy to appreciate: let’s keep the internet from becoming the worst part of itself. However, in the subsequent decades, multiple courts at the state and federal level have interpreted those protections very, very broadly and have essentially equated them with free speech in and of itself. So a piece of law that was intended to help websites remove awful material from the internet has since been used to protect, just as a couple of examples, a revenge pornography website that was devoted to posting nude images without the consent of those in the pictures.

Message boards have successfully defended themselves using Section 230 when they knowingly facilitated illegal activity. Websites have facilitated, or at least made easier, the spread of child sexual exploitation material. All of these things have been litigated, and ultimately these websites were protected by arguing that Section 230 allowed them to pursue these practices. So it’s been a real problem.

Bill Walton (04:28):

This is 1996, it was part of the Telecommunications Act. And it was Chris Cox and Ron Wyden, Republican and Democrat, who were concerned about, I guess it was a lawsuit, Stratton Oakmont versus Prodigy Services. And Stratton Oakmont, AKA the firm in The Wolf of Wall Street, sued Prodigy and won $200 million. I don’t remember the basis for the claim and the award, but somehow Prodigy lost, Stratton Oakmont won, and they wanted to do something about it.

So they wrote something in that says you’ve got to filter… What’s their language? They talk about blocking offensive materials online, and there’s something very specific about “obscene, lewd, lascivious, filthy, excessively violent, harassing.” Absolutely, we want to block that. And then there’s this phrase, “or otherwise objectionable.”

Klon Kitchen (05:38):

Exactly.

Bill Walton (05:38):

And it’s those three words that people have driven a truck through.

Klon Kitchen (05:41):

Yeah, that’s right. As well as a second part about taking actions to enable or make available to information content providers, or others, the technical means to restrict access. So that second part has also played a key role in much of the current political conversation. You opened at the beginning regarding how you’ve been treated on one of these platforms, and it’s that “otherwise objectionable” language that really has enabled that type of increasingly politically motivated content moderation. And that’s what’s been a real problem here recently.

Bill Walton (06:18):

So I’ve been confused about this, because I thought Section 230 basically said, and you’ve amplified it in your paper, that if the platforms, the social media companies, don’t weigh in one way or another, they’re not going to be treated as publishers, and therefore they can’t be sued. And they did that in ’96 because, well, the average American wasn’t on the internet, but if he was, it was what you said, an average of 30 minutes a month or something like that?

Klon Kitchen (06:50):

Yeah. About 27 minutes a month in 1996, that’s the average time an American spent online.

Bill Walton (06:56):

And so they had a good idea they wanted to… And by the way, we all lived happily back then without being online, it was pretty great. But the idea was that they wanted to promote this technology and allow these companies to grow and grow. They did, and now we’re at a point where, I think you point out, 500 hours of content are uploaded to YouTube every minute.

Klon Kitchen (07:23):

Every minute of every day.

Bill Walton (07:25):

Wow. So I thought they were supposed to stay above the fray and not opine. Yet Section 230 explicitly says, “No, you’re supposed to weigh in and censor objectionable material.” But it really refers to pornography, sex trafficking, really obviously bad stuff that anybody, left or right, could agree ought to be blocked.

Klon Kitchen (07:50):

Yeah. And the key point about Section 230 is to recognize that it’s the offering of a privilege. It’s not a constitutional right or anything like that. The government decided that it’s in the national interest to free websites up, if they choose to remove this type of lewd content, to be able to do so without fear of being sued into oblivion, and so we created the statute. Now, in one sense it’s forgivable; in 1996, we had no clue about what the internet was going to evolve to be, certainly not the idea of social media.

But the main thrust of my paper is that it’s 2020. The underlying purpose of this statute and the ancillary benefits it’s provided to innovation and the industry in general are all good, but it clearly needs to be updated. And that’s why, while some have argued that it should just be removed outright, we would say that it’s best to keep it, but it needs to better reflect modern requirements and needs.

Bill Walton (09:00):

Well, who can change it? It was enacted by Congress, signed into law by the president. Does that mean to change it, we’ve got to go back through Congress and again have it signed into law by whomever the president is?

Klon Kitchen (09:13):

That is our preferred outcome. It’s something that could easily be done. In the paper we make specific language recommendations, take this out, add this, that kind of detail, and Congress could easily pass it. There is actually bipartisan support. There are different motivations for that support, but there’s bipartisan support for reform of Section 230. And we actually think that something like this is a far more preferable way forward than just having executive agencies like the FCC start reinterpreting it. Because at the end of the day, we believe the intent of Section 230 continues to be valid, but it needs to be updated. And the best way to update a statute’s language is not through executive interpretation, but through congressional action.

Bill Walton (10:02):

Now you explicitly don’t think some of the other remedies that people have proposed are a good idea, for example, declaring the social media companies to be public utilities.

Klon Kitchen (10:17):

Yeah. What I say specifically is that it’s easy to empathize with those. There are some who feel very frustrated with their treatment by social media and feel like maybe social media is having a really bad impact on our society. And they see Section 230 as a kind of political club to hit these guys over the head, get them back in their box, and maybe humble them a little bit. I am entirely sympathetic to that feeling; I know how you come to that sense. But when people begin arguing that these companies are public utilities and reach for Section 230 as a way of getting back at them, from a pure policy analysis standpoint, we think that’s not the best way forward.

One, because in any normal use of that terminology, “public utility” doesn’t really fit. A public utility typically refers to some type of government-imposed monopoly. And these companies are very powerful; there are some clear dominators in different sectors of the industry, but it is still true that all of them have multiple competitors. And to the degree that they enjoy any kind of decisive position, it’s not government-imposed. So it’s just not an appropriate use of that terminology, and therefore I don’t think it’s the best basis upon which to argue for some of these changes.

Bill Walton (11:47):

And that leads me toward the other thing, which is to break them up, use antitrust. The idea that they’ve got so much market power, that they’re too big and they’ve got to be broken up. I can think of a lot of reasons why I don’t like that. Why don’t you like it?

Klon Kitchen (12:08):

Well, number one, I think antitrust is just an entirely separate issue. I think the statute itself should stand or fall on its merits. And if we still think that its intended consequences are good, which I do, then we should update it so that it’s relevant, and so that we prevent some of these potential abuses of the protections it provides. But holding Section 230 over the head of the industry as a kind of weapon or threat to break them up under antitrust is just conflating two issues that aren’t relevant to one another. And look, there are some very good questions about what fair competition in the tech market space looks like right now. I’m very open to some of those conversations, but again, I think that conversation should proceed on the merits and not get confused and conflated with this other issue.

Bill Walton (13:06):

Well, can we break it down like this? Can you say, well, there’s an economic issue, which is market power, benefits to the consumer, too much pricing control on behalf of the monopolist, if you will? And then there’s the other piece of it, which is speech. And I don’t think antitrust laws apply to issues involving speech. It seems like Section 230 is focused on the speech issue. Is that a fair way to think about it?

Klon Kitchen (13:38):

Yeah, I think it is. And I do want to be clear. This is a complex issue, not just because it’s law and there’s language and that kind of thing, but because the reality is that these companies also enjoy free speech. So when we talk about the First Amendment, it’s good and right that conservatives fight for and push for free speech; that makes perfect sense. But these companies also enjoy speech protections. And it’s also important to understand that Section 230 doesn’t only protect these companies. For example, Section 230 also protects the Heritage Foundation’s website, what we can and cannot put on there, and our choice not to post certain [inaudible 00:14:21].

Bill Walton (14:23):

How’s it do that? I didn’t know that you guys, you-

Klon Kitchen (14:26):

Because Section 230 applies to all online content platforms. Any business that’s operating online also enjoys these protections, and that would include The Daily Signal and those kinds of things. That’s why this gets complex very quickly: we have to understand that any major muscle movements we make in regard to Facebook or Twitter or YouTube can have very significant unintended consequences for essentially anybody who’s operating online. That’s why, in the paper, I have that section called A Word of Warning, because conservatives have to think about that very carefully.

Bill Walton (15:09):

Well, I love your Word of Warning, but I’m really annoyed, because I had a show that was a very, very carefully reasoned alternate point of view to lockdowns, that there are lots of other ways to protect Americans, and yet YouTube, in its wisdom, took it down. And then we switched to plan B and put it on Vimeo. And we got an even stronger notice from Vimeo that what we were doing violated its, what do they call it? You cannot upload videos that depict or encourage self-harm, falsely claim that mass tragedies are hoaxes, or perpetuate false or misleading claims about vaccine safety. Well, that’s just crazy. I mean, our show had none of that. We weren’t even on that planet. So when they do that, though… You and I talked before we got started here about the public trust issue. You want to speak to that? Because I think that’s why this has become such an important issue.

Klon Kitchen (16:06):

Precisely. So the point you’re raising is the point that features prominently in this paper, and that is that these companies have squandered the public trust, and it’s a bipartisan frustration with them. Having rules that we can all argue about, as to whether or not they should have that rule and what should or should not be allowed on the platform, that’s one thing. But the underlying concern is that these rules are not being applied fairly, and that it’s particularly corrosive and suppressive against conservatives. Throughout the paper I have a couple of stats from reputable polling, like Pew and Gallup, showing that three-quarters of US adults believe that social media companies, quote, intentionally censor political viewpoints that they find objectionable. So three-quarters of Americans believe that.

Bill Walton (17:06):

And 55% of Democrats.

Klon Kitchen (17:08):

Yeah, exactly. More than 50% of Democrats think that’s true. And 80% of Republicans also have little to no confidence that social media companies have the ability to determine which posts on their platforms should be labeled as inaccurate or misleading. And again, a majority of Democrats as well.

Bill Walton (17:28):

Well, I mean, how do we fix this? You say Mend It, Don’t End It; you’ve got some language changes that you think would be a deft way to fix it.

Klon Kitchen (17:40):

Yeah. So the bottom line is there’s some opaque language in the statute that just needs to be clarified. For example, one of the things we need to do is strike that “otherwise objectionable” line. We need to take those two words out, because it’s just too big of a gap that you can drive a truck through. And I think doing that helps us begin to narrow the scope and application of Section 230 much closer to its intended purpose. Beyond that, there’s something called the good faith provision. At the beginning of the text, it says these companies are going to be protected if they act in good faith, but good faith isn’t defined. And so we think that, particularly in an effort to cut down on some of this biased application of these rules, we need to further refine and explain what good faith actually means.

And that would preclude any type of biased application of whatever rules they set, anything that’s intended to hurt any particular political view, and we think that’s essential. We also think that… And this is the nub of it, and this is going to be a heavy lift for Congress and something that we’ll be engaged in further, but we need to clarify the line between the normal editing that goes on with content online versus what actually makes you a publisher who no longer enjoys Section 230 protections.

So in the run-up to the election, Twitter and others have started attaching labels to tweets and other content where they say, “Get more facts,” or, “This is contested.” Now, under the current reading of Section 230, that doesn’t violate the publisher rule; if they were ruled to be publishers, they would lose those protections. We think that those types of labels clearly affect how a piece of content is interpreted, how it spreads, and how it’s shared. And so Congress needs to think a little more clearly about when you’ve crossed the line into becoming a publisher.

Bill Walton (19:50):

Well, they’ve gotten very cute on this. I mean, you don’t have to cut and paste to be an editor. They do things like “your information is missing context.” You’ve got PolitiFact and you’ve got everybody now gaming the fact checkers. A “Get the facts” label was applied to a Donald Trump tweet, but then Senator Elizabeth Warren, one of my favorites, has a tweet up that has not been taken down saying, “Racism isn’t a bug of Donald Trump’s administration, it’s a feature. Racism is built into his platform.” And yet Jack Dorsey’s left that up, as far as I know, to this day.

Klon Kitchen (20:36):

Yeah. It’s still up with no label, no context, nothing. And that’s where people are legitimately frustrated, because… They were before Congress recently, and several policy makers made the point of, okay, wait a minute, you’re labeling Donald Trump, but you’ve still got the Ayatollah of Iran denying the Holocaust and you’re not doing anything about that.

Bill Walton (20:58):

I think Jack Dorsey admitted to that in his congressional [crosstalk 00:21:02].

Klon Kitchen (21:02):

Jack Dorsey did not do well in that conversation. He did not do well.

Bill Walton (21:10):

Well, now, I think nose rings and a Ho Chi Minh beard don’t help either.

Klon Kitchen (21:14):

Well, it certainly plays to a type.

Bill Walton (21:17):

So I just put a human face on it; that was Jack Dorsey. But explain to me, who are these social media companies? We all know them by name, we all use them: Twitter, Facebook, Google, Instagram, on and on. How many people, for example, at Google or YouTube or Facebook would be monitoring posts and videos and things like that for content? And is it people, or is it an algorithm, or a combination?

Klon Kitchen (21:47):

It’s a combination everywhere, and it varies by company. So YouTube has thousands; Facebook has thousands; I’m not sure about Twitter. The way this typically happens is the first line of content moderation is algorithmic. They have ways of looking at how a piece of content is being shared or spreading and determining, okay, there’s something going on here. And then it gets kicked into a process of further review and refinement, ultimately up to people. Now, as you mentioned before, we talked about 500 hours of YouTube video being uploaded every minute of every day. There’s a lot of marketing around these companies, and they like to portray themselves as these kind of omniscient, omni-capable institutions. But the reality is that they are completely unprepared to deal with the scope and scale of activity on their platforms.

And it’s a couple hundred thousand pictures uploaded every minute of every day on Instagram, for example. And so the way this gets gamed is a group of typically left-aligned individuals will find a piece of conservative content that they don’t like and they’ll flag it. And if a piece of content gets enough flags, it automatically, quickly gets quarantined and moved out until it can be reviewed. And then a reviewer will make an assessment. Well, that tends to happen to conservatives more, or at least it feels like it happens to conservatives more than it does to liberal groups. In part that’s because typically conservatives aren’t sitting around flagging liberal content; we’re living our lives, doing our stuff. But this type of gaming of the system has become-

Bill Walton (23:40):

You couldn’t pay me enough to sit around and flag liberal content.

Klon Kitchen (23:44):

It sounds awful, no, it sounds awful. But even the fact checkers, though. You mentioned PolitiFact, and we came out pretty aggressively on this recently. There was a series of conservative political ads being put up on Facebook. And one of the fact checkers in Facebook’s fact-checking program, PolitiFact, which is left-leaning, couldn’t rate them false, and so instead they rated them as “needs more context.” Their justification was, “Well, we don’t know what the future is, and so all the bad things that you say are going to happen, it’s impossible to know. So people need more context.” What that effectively did was still prevent those ads from being run as political ads. And so PolitiFact gamed the system; they didn’t have to rate the ads as false, they just had to say, “It needs more context,” and that killed the ads. And that type of gaming of the fact-checking system is now rampant on the left side of things and something that we are being very aggressive about calling out.
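The flag-then-quarantine flow Klon describes above can be sketched in a few lines of code. This is a minimal illustration only: the threshold value, the names, and the review-queue design are assumptions, since the platforms don’t publish their actual moderation logic.

```python
from dataclasses import dataclass

FLAG_THRESHOLD = 100  # hypothetical value; real platforms don't disclose this


@dataclass
class Post:
    post_id: str
    flags: int = 0
    quarantined: bool = False


def flag(post: Post, review_queue: list) -> None:
    """Record one user flag; auto-quarantine once the threshold is crossed."""
    post.flags += 1
    if post.flags >= FLAG_THRESHOLD and not post.quarantined:
        post.quarantined = True      # content hidden pending human review
        review_queue.append(post)    # a human moderator decides later


# A coordinated burst of flags is enough to pull a post out of circulation
# before any human has judged it, which is the gaming Klon is describing.
queue: list = []
post = Post("example-video")
for _ in range(FLAG_THRESHOLD):
    flag(post, queue)
print(post.quarantined, len(queue))  # True 1
```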

Bill Walton (24:46):

Well, the thing I find particularly troubling is it’s beginning to feel like we’ve got Pravda. Because one of the things that happened with us is, I think there are legitimate arguments and alternative ways to protect people against this virus, and we’ve moved from a virus pandemic to a pandemic of fear. And my concern [inaudible 00:25:09] show is we’ve got to do something about the fear that’s gripping America, the fear that’s gripping the world.

And so we’ve got to be less draconian and more thoughtful about how we deal with this. And yet, you read what happened to us with Google. They said we violated community guidelines. Well, we looked at the content guidelines for COVID-19, and you run across this line: you can’t make any claim that contradicts local health authorities or the World Health Organization. Now think about that. That means all the great thinkers, all the great scientists, all the great medical practitioners are either local health authorities or the World Health Organization. I happen to believe a lot of smart people don’t want to go to work in either one of those places. And you’ve got a lot of smart guys out of Stanford and all over the world coming out against some of these proclamations. And yet that’s what YouTube is hanging its hat on. Thoughts?

Klon Kitchen (26:11):

Yeah. I mean, this is difficult, because part of what they’re trying to do with that policy is specifically work against foreign influence operations on COVID-19 and the like, and we get swept up in that. And that’s why the commingling of American freedom of speech issues with the very real concerns about foreign activities and manipulation online is really difficult. The reality is, we do know that Russia and others are actively sowing all kinds of misinformation about COVID-19, in terms of its origin with the US military and all kinds of other things. And look, here’s the context that a lot of people just haven’t had the privilege to know. This goes back to, I spent 15 years in the US Intelligence Community. When the government first started engaging with these companies, it was asking them for help in cutting down on terrorist propaganda and recruitment material online. We came to them and said, “Look, we can’t remove this stuff on Facebook. Facebook, you’ve got to remove it.”

And they responded, and they’ve actually gotten really good, it’s not perfect, but they’ve gotten really good at identifying and removing that content. So we kind of created this monster, and now it’s turned on us, and it’s a real problem. And there’s enough of a legitimate justification for some of these actions that it makes decisively calling them out a little trickier. And so that’s one of the reasons, again, why we’re trying to reframe and refine the legal protections that they enjoy, to move us in a better direction.

Bill Walton (28:00):

Mm-hmm (affirmative). Well, the Russians are small bit players. The Chinese are really the ones I think that are all over-

Klon Kitchen (28:06):

Certainly on COVID.

Bill Walton (28:08):

I’ve done seven or eight shows on various aspects of China, and in my YouTube comment section, I might as well just put it up in Chinese, because I’ve gotten so much interest in these shows. They don’t quite agree with what I’m saying. But also, the censorship: we think we got taken out because the show was so popular. All of a sudden we’d gone from a few thousand to many, many, many thousands of viewers in a very short period of time, a couple of hours, a day maybe. [inaudible 00:28:42]. And we think it was just the sheer popularity of what we put out that triggered the censorship.

Klon Kitchen (28:50):

Yeah. I mean, it’s entirely possible. Again, let’s say that was the case, that you started growing by leaps and bounds in a couple of hours. That likely would have run across some of these left-leaning groups who see content like that spreading, don’t like it, and flag it, [crosstalk 00:29:09] that gets it kicked up the chain, and then they take action.

Bill Walton (29:12):

That’s what happened. We’ve got a [inaudible 00:29:16] minutes left. Let’s talk about why it’s in the interest of the social media companies to help us do something about this.

Klon Kitchen (29:24):

Mm-hmm (affirmative). Well, the bottom line is that these companies have clearly squandered the public trust. And what we’re offering in this paper, and more broadly as Heritage engages on tech policy issues, is a rational and coherent way forward where the market and the public dialogue get to remain free and fair. That’s what we’re trying to drive at. Up until this point, these companies have kind of stubbornly refused to engage in this conversation seriously. I think on both sides of the political aisle, the patience for that conversation is running out. This kind of middle road of mending, not ending, Section 230 is, I think, going to be one of the last opportunities for us to get this right. And if we don’t get it right, if there’s still a kind of stubborn resistance, if we continue to get the Heisman from the tech industry, then even worse outcomes are going to become much more politically viable. And we all need to take a second and think about this very, very carefully.

Bill Walton (30:32):

Klon, thank you. We’ve just been talking with Klon Kitchen, director of the Center for Technology Policy at the Heritage Foundation. And he’s penned, I think, a very, very smart approach to this whole social media censorship issue. It’s called Section 230: Mend It, Don’t End It, and it’s on the Heritage website. Klon, as always, great talking with you, to be continued.

Klon Kitchen (30:58):

Always, thank you for the time.

Bill Walton (31:00):

All right. Thanks, Klon. And thank you for joining me on The Bill Walton Show. We’ll be talking with you next time. Take care.

Speaker 3 (31:07):

Thanks for listening. Want more? Be sure to subscribe at thebillwaltonshow.com or on iTunes.
