The Internet is Hard
First, a preface.
We live in complicated and somewhat perilous times. I hate to be alarmist, but the reality is that we are quickly approaching several points from which there just isn't a practical return. Some of these things are existential threats to the species. Some only risk collapsing the nation into chaos. A few have weird side effects, like trapping us permanently in a pre-Industrial Revolution world that might be okay but would kill a staggering number of people along the way. There are also endless variations on dystopia just around the corner if we aren't careful.
I want to say that because there are a few ways to read what I’m about to start talking about and I want to be clear on my intentions. The future is not certain and at the end of the day it’s going to be people that decide it. We are some of those people. This is, in no small way, on us. We have to get this right.
So I'm going to advocate for prudence, intention, and no small degree of caution. I'm not advocating for dithering and freezing, but we also need to take a step back and really think about what we're doing, because what we do next will have consequences, and striding blindly through those decisions is just as bad as doing nothing. This is not to extol some kind of "Both Sides™" nonsense, but to say that the problems we're dealing with are complicated, and that if you truly think the answer starts with "Can't we just" and fits on a bumper sticker, you're almost certainly wrong and you should think about it some more.
So, with all that said, let's talk about internet regulation.
This is Section 230, which, if you exist, you've probably heard about. Here is the specific part of it that's relevant to the news these days:
(c) Protection for "Good Samaritan" blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of —
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
And then also, from the definitions section:
(2) Interactive computer service
The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
(3) Information content provider
The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
(4) Access software provider
The term "access software provider" means a provider of software (including client or server software), or enabling tools that do any one or more of the following:
(A) filter, screen, allow, or disallow content;
(B) pick, choose, analyze, or digest content; or
(C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.
Now, very importantly, I am not a lawyer. I’m a humble boat captain. Everything I’m going to say from this point on is based on other things I’ve read and listened to. I just want to make that very clear. If you are a lawyer and I’ve got something wrong, by all means correct me.
Now, the way this keeps being explained to me is that the whole debate over Section 230 comes down to whether or not social media sites like Facebook and Twitter are publishers. In the law it's the distinction between the "interactive computer service" (not a publisher) and the "information content provider" (publisher). The analogy that keeps being given is a newspaper vs. a bookstore, and all of this comes down to liability. A newspaper has an editorial board and decides what to print, so it is responsible for what it prints. A bookstore doesn't have that liability. It just sells books, and expecting a bookstore owner to be aware of everything in every book is frankly absurd, so it isn't liable if a book contains something objectionable or even illegal. The question is whether or not a site can be sued over content that people post on the site. A bunch of case law following the passage of this regulation has come down to "Websites that allow public comment are more like bookstores than newspapers."
Where this gets weird is that clause about moderation in good faith. They are allowed to moderate things, and as long as it fits into that description they're still covered: "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." So they've got that covered, but the argument goes that if they're moderating over political beliefs, or because they hate recipes that include basil, or want you to leave Britney alone, then they have crossed the line and have become more like "information content providers" (read: newspapers), and now they can be held legally and civilly responsible for what gets posted.
Ok, so, now we have the claims that these companies are censoring things that they aren’t allowed to censor and so they need to have their section 230 protections revoked. In fact, there are some folks saying we need to get rid of section 230 entirely and just gut the bill.
YOU DO NOT ACTUALLY WANT THIS
Nobody wants this. Nothing good comes from this. Repealing Section 230 destroys the entire internet as we know it. The uncensored corners of the internet are all trash. They're just awful places that you shouldn't want to hang out in. They follow incredibly predictable paths of ugliness that usually start with racist memes and pictures of women of questionable age, and then they just become more and more extreme until you have people being unironic Nazis and posting the most upsetting pornography they can find. That's what the internet does if you just let it go without any moderation. It really doesn't go well for trying to run a mainstream social media site, which means sites like Facebook just aren't going to do that. They can't.
They're going to moderate, and because there are two billion people on it and apparently 350 million **photos** are posted daily (I couldn't find the total number of posts), they use an algorithm to do the first-pass moderation. The machine is watching to see if you're doing something you shouldn't. (They do also have people moderating, and everything about that story is awful.) The machine is going to get things wrong. It's going to block things it shouldn't, and it will let things through that should be blocked. That's a reality that can't be escaped, and moderating it all by hand would be impossible because of the sheer volume of content. The photos alone come out to just under 250,000 images per minute. So they're going to use the algorithm, and it's going to make mistakes. Those mistakes arguably mean that Facebook is a newspaper, not a bookstore.
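As a sanity check on those numbers, here's a quick back-of-the-envelope sketch. Only the 350-million-photos-a-day figure comes from above; the per-photo review time and shift length are made-up assumptions, just to show the scale of the problem:

```python
# Back-of-the-envelope check of the moderation numbers above.
PHOTOS_PER_DAY = 350_000_000      # figure quoted in the text
MINUTES_PER_DAY = 24 * 60

photos_per_minute = PHOTOS_PER_DAY / MINUTES_PER_DAY
print(f"{photos_per_minute:,.0f} photos per minute")  # ~243,056 -- "just under 250,000"

# Hypothetical: a human moderator clears one photo every 5 seconds,
# working an 8-hour shift with no breaks (wildly optimistic).
SECONDS_PER_REVIEW = 5
SHIFT_SECONDS = 8 * 60 * 60
reviews_per_shift = SHIFT_SECONDS // SECONDS_PER_REVIEW   # 5,760 photos per shift

moderator_shifts_needed = PHOTOS_PER_DAY / reviews_per_shift
print(f"~{moderator_shifts_needed:,.0f} moderator-shifts per day, photos alone")
```

Even with those generous assumptions, you'd need tens of thousands of full-time shifts every day just for the photos, before a single text post, comment, or video gets looked at.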
What that all comes to say is that you would not be able to post or comment on Facebook anymore; at least not the way you're used to doing it. Nothing could be publicly posted without someone putting eyes on it first. Everything would need a human moderator to give it the OK, because otherwise the company could be liable for you posting things you shouldn't post, and given the sheer volume of posts we're talking about, that means they need several armies of moderators, and it's still going to take forever for anything to publish. This ends social media. But that's not all. Do you like reading reviews on travel sites or recipe sites? Those will also need to be cleared. How about the spam filter on your email? That's also editorializing and moderating, and it's going to be gone if they get rid of Section 230. By making companies liable for everything that crosses their servers, you're telling them that they need to be very aware and very careful about everything that goes across their servers. That's how that works.
You'll notice that Section 230 does specifically talk about making a good-faith effort to moderate, which is why they're okay where they are. You can certainly argue that they're being biased, but the evidence genuinely doesn't seem to bear that out based on what I've seen (though I'm always happy to have more information). The functional reality is that if we want to change it, we need to consider what will happen when we change the rules.
By contrast, Australia recently decided that Facebook and Google needed to pay news organizations money, since the news sources are providing value to Facebook and Google. In response, Facebook blocked the pages of all Australian news sites, and also most of the obvious US news sources if you are in Australia. They also blocked a bunch of emergency services and government sites, but we'll come back to that in a bit.
Now, if I'm being generous, I'll say that the Australian government realizes that news organizations, especially smaller ones, are being absolutely gutted by the modern world and social media, and is trying to set up a revenue stream for them. That sounds like a good idea, but it does seem kind of silly that Facebook should have to pay the ABC for the ABC voluntarily posting on Facebook. There are also some very real questions about how much good this is going to do for small and independent media versus the massive Murdoch machine that owns a majority of Australian news, but I digress.
Functionally, all this does is leave us at constant risk of making it harder to find the information we want on the web. Like I said above, it's not just Facebook; it's Google too. I don't personally love how some of Google works, but I'm not sure I really want Google being forced to pay for the right to list things. That seems like a really clever way to make sure Google de-lists a bunch of small sites it doesn't think are worth paying for. That seems less than ideal to me and pretty much only beneficial to companies that are already really big and well known.
Circling back to the whole "lots of sites that weren't news sites got caught up in it" thing. It's certainly fair to say that Facebook decided to cast a very wide net specifically to try and punish Australia, but let's imagine for a moment that it was a good-faith effort to go along with the proposed regulations. This is the problem with new internet regulation in a nutshell. Anything we do is going to have some weird ripples as people either make mistakes or maliciously comply with the new rules. Everything we do is going to impact things that we didn't intend to impact, and this is a problem because the internet has become so deeply woven into the fabric of our lives that those little hiccups could have drastic results.
We passed a law nominally trying to deal with sex trafficking and ended up putting a whole lot of sex workers into dangerous situations. We keep talking about banning types of encryption in order to prevent criminals and terrorists from being able to use it to hide from law enforcement, but that would also be the end of online banking. There’s talk about banning cryptocurrency without any concept of how that would even work, and at the same time we keep hearing things about building a new “smart grid” which puts our critical infrastructure on internet connections.
I don't have any answers here. Facebook is a genuine evil. The media is complicated. Encryption can be used to hurt people. Google is… just a lot. All these things need to be addressed, and we need to find a way to rein in a lot of what we have going on, but it needs to be done carefully and intentionally. We can't just crank out some laws that sound good without looking at what the actual impacts are going to be, and we need to pay very close attention to who actually benefits from the laws.
Follow the lobbyists.