Who should police content on the Internet?

The beauty, and the danger, of the internet is that it’s open to everyone.  Anyone can put up a website, about pretty much anything.  This “open platform” is an amazing thing, and means that innovation can come from all corners, without barriers or gatekeepers.  It also introduces new challenges for how to deal with the inevitable bad things that come along with the good.

This past week, this question has come back to the foreground with the Charlottesville riots and the associated far-right websites that helped organize them.  Particularly in focus has been the website “The Daily Stormer”, one of the most vocal/violent/awful neo-Nazi sites on the internet.  In recent days, all of the infrastructure providers that served the Daily Stormer have dropped it, and it has relocated to a Russian domain.  As of this writing, it appears that Anonymous has already DDoS’d dailystormer.ru and it is offline.

One of the companies that initially resisted dropping the Stormer, but ultimately did, was (USV portfolio company) Cloudflare. Cloudflare has taken heat for some time now for its refusal to drop the Stormer, dating back to this ProPublica article from May.  In Cloudflare’s response to that article, CEO Matthew Prince included the following:

“Cloudflare is more akin to a network than a hosting provider. I’d be deeply troubled if my ISP started restricting what types of content I can access. As a network, we don’t think it’s appropriate for Cloudflare to be making those restrictions either.

That is not to say we support all the content that passes through Cloudflare’s network. We, both as an organization and as individuals, have political beliefs and views of what is right and wrong. There are institutions — law enforcement, legislatures, and courts — that have a social and political legitimacy to determine what content is legal and illegal. We follow the lead of those organizations in all the jurisdictions we operate. But, as more and more of the Internet sits behind fewer and fewer private companies, we’re concerned that the political beliefs and biases of those organizations will determine what can and cannot be online.”

This is a difficult line to walk, but it’s actually really important to the underpinnings of the Internet.  To understand why, you have to think about all of the bad things that happen on the internet every day — from really bad things like neo-Nazi genocide organizing (I am writing this as someone whose great-grandfather was murdered for being a Jew) and child exploitation, all the way to marginally bad or arguably not-so-bad things like, “I don’t like what this person wrote on this website and I want it taken down”.

So, from the perspective of someone operating internet infrastructure, you are constantly bombarded with requests to take down things that people don’t like, for one reason or another.  This is unsustainable for two reasons: 1) the sheer scale of it, especially for larger properties handling millions or billions (or, in Cloudflare’s case, trillions) of pageviews, and 2) platforms are almost never in the best position to make a just determination about whether a given piece of content is legal or illegal.  So the position of most large web platforms has been to delegate decisions about the legality of (user-generated) content to law enforcement, the courts, or other actors “at the edges” who are in the best position to make those determinations.

From the user/customer perspective, if you think about it, you really don’t want your ISP, or DNS provider, or hosting provider making arbitrary decisions about what speech is acceptable and what is not.

To further codify this general approach to handling content, we have something called Section 230 of the Communications Decency Act, which limits the liability of internet intermediaries for the internet traffic and user-generated content (i.e., the speech of others) they handle.  Generally speaking (and I am not a lawyer), this means that companies are legally insulated from content that someone else publishes on their platform.  If this were not the case, then it would be impossible, from a risk perspective, to operate any website that handled the speech or content of others (think Facebook, Dropbox, GoDaddy, etc.).  If you needed to be 100% certain that every piece of information that any user published on your platform didn’t violate any laws anywhere, you would simply not let anyone publish anything.  Or you’d need to have some very draconian/slow editorial & approval process, so we’d have no Twitter, no Instagram, etc.

Over the years, every time a new wave of bad activity emerges on the web, there is the inevitable battle about who should be responsible for stopping it. This is what the Stop Online Piracy Act (SOPA) of 2011 was about — it would have made internet platforms directly liable for any user-generated content that might have copyright violations in it (as opposed to the current situation, where sites must comply with valid takedown notices in order to keep their immunity).  This has come up again in 2017 with the introduction of the “Stop Enabling Sex Traffickers Act of 2017”, which seeks to limit CDA 230 protections in the name of addressing sex trafficking and child exploitation on the internet.

The really hard thing here, whether we’re talking about piracy, or child exploitation, or neo-Nazis, is tailoring a law that addresses those problems without having broader implications for free speech on internet platforms. And what we don’t want is a world where, rather than an environment of due process, we end up with either platforms making arbitrary, unilateral decisions about the validity of content, or the vigilante justice of DDoS attacks knocking websites offline.

Cloudflare has done the hard work of defending due process and freedom of expression online.  It’s not easy to do this, and it is often unpopular (depending on who is doing the speaking). But in the end, they decided to drop the Daily Stormer from the Cloudflare platform.  Matthew Prince explained why he decided to make this call in an email to the Cloudflare team:

“This was my decision. Our terms of service reserve the right for us to terminate users of our network at our sole discretion. My rationale for making this decision was simple: the people behind the Daily Stormer are assholes and I’d had enough.

Let me be clear: this was an arbitrary decision. It was different than what I’d talked with our senior team about yesterday. I woke up this morning in a bad mood and decided to kick them off the Internet. I called our legal team and told them what we were going to do. I called our Trust & Safety team and had them stop the service. It was a decision I could make because I’m the CEO of a major Internet infrastructure company.

Having made that decision we now need to talk about why it is so dangerous. I’ll be posting something on our blog later today. Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet. No one should have that power.”

This is intentionally provocative, and meant to help everyone understand why it’s dangerous to encourage large internet **infrastructure** providers to take editorial control.  While it may seem obvious that this is the right call in this case, there are literally millions of other cases every day that aren’t so clear, and for which we really should be aiming to have due process guide decisions.

I would encourage you to read the follow-up piece on the Cloudflare blog discussing why they terminated the Daily Stormer – in it, Matthew details all of the kinds of players in the internet infrastructure space, what roles they play, and how they impact free speech online.

In all of this, there is an important distinction between what platforms are **legally required** to preemptively take down, and what they are **within their rights** to remove.  One tension in the industry is that platforms hesitate to exercise their right to remove content for fear of sliding towards a legal regime in which they have a positive obligation to remove it — and it is that kind of obligation that introduces the greatest risks to free speech and due process.

Another key point, which is raised in the Cloudflare post, is the different roles played by various types of internet providers. There is a difference between low-level providers like DNS servers and backbone transit providers, and high-level applications like social networks, marketplaces, and other more narrowly focused services.  Generally speaking, the higher up in the stack you go, the more competition there is at that layer, and the more specific your application or community, the more it makes sense to have community guidelines that limit or direct what kinds of activities can take place on your platform.

Lastly, none of this is to say that platforms don’t and shouldn’t partner with law enforcement and other authorities to remove illegal content and bad actors.  This is actually a large part of what platforms do, every day, and it’s critical to the safe functioning of the internet and of social platforms.

But perhaps the big takeaway here is that, as we continue to discuss where enforcement and censorship should take place, we should fall back on the underlying belief that transparency, accountability and due process (and not arbitrary decisions by powerful companies or outside groups) are critical components of any solution.

Learning by doing

I had lunch yesterday with someone who has been investing in the crypto / token space recently — having pooled together a small “fund” from friends and family.  It’s a short-term vehicle (like, 6 months), and a large part of the goal is simply to become hands-on familiar / capable investing in token sales /… Read more »

Keeping it simple

We recently had our daughter’s birthday party, and we held it in a public park near our house, where there’s an old parks department building.  The sunny-day plan was outdoors, but of course it thunderstormed and we didn’t have a back-up plan.  So we called an audible and asked if we could use the back… Read more »

What’s your medium?

Yesterday, I caught up with my old friend Gary Chou.  Gary was the first General Manager of the USV Portfolio Network (predating Brittany and Bethany), and has since been running Orbital, a community space and “studio for building networks” (which happens to be in the original Kickstarter building on the Lower East Side).  We got… Read more »

Speaking page

I’ve been doing more public speaking recently, and finally assembled videos into a single place: https://www.nickgrossman.is/speaking/ As I look at that list, I realize that I’ve been doing a ton of speaking in Europe. Of course I know this, because I was there, but didn’t quite realize the pattern that the majority of my recently… Read more »

Getting Help

I’m on vacation this week, and we have some old friends and their family staying with us.  Last night we got to talking about therapy (like psychotherapy) and how valuable it has been for me over the past few years. Maybe four years ago I started seeing a therapist on a bi-weekly basis.  There were… Read more »

The joy of fixing things up

I am on a plane right now, watching home renovation shows on HGTV, thinking about how much fun it is to fix things up. Doing projects around the house (last year I built an exterior staircase and made new kitchen countertops, the year before that I built a mudroom), coding and building apps, and working… Read more »

Getting in over your head

I was out last night with some of the little league coach dads, and we got to talking about whether it’s better for our kids to be bumped up a level (but be at the lower end of skills/experience) or stay back a level and have a chance to really excel.  The consensus was that… Read more »

For web platforms considering a token strategy: cryptocurrency vs. dollars?

A lot of founders / teams have been asking if they should be adopting a cryptocurrency strategy.  This is understandable given the frenzy of fundraising recently and the ongoing dialogue about the potential for cryptocurrencies as an alternative business model for web platforms. As “traditional” web & mobile platforms explore this option, there are a… Read more »

A little better every day

I just got done coaching my son’s baseball practice. It has been amazing to watch this group of 7- and 8-year-olds improve over the course of the season – learning the fundamentals and now starting to make some pretty great plays. I had a great baseball coach as a kid.  I’ll never forget the… Read more »

Entering the world of smart contracts

One thing that’s interesting about yesterday’s Basic Attention Token sale is how quickly it went – $36M transacted in 30 sec. Lots of people were surely disappointed as they attempted to buy into the token sale only to have their orders canceled for missing the sale window. I haven’t nailed this down for certain, but I suspect… Read more »

Mechanics of the token sale

In case you missed it, today Brave raised $36M for the Basic Attention Token.  They had allocated 30 days for the token sale, but sold out of 1B BAT in 24 seconds. The Basic Attention Token (BAT) ICO just raised 30 million dollars in 24 seconds. VC’s didn’t even have time to put on a sweater… Read more »

Open source leadership vs. corporate leadership

As cryptocurrencies and blockchains have continued to gain steam (and attract capital), a common question in the air is, what type of leader does it take to be successful in this space? A common variant on that question is: “will [leader] need a grownup in the room once they get ahold of all that money from… Read more »

Regulating source code

As more areas of our economy become computerized and move online, more and more of what regulators need to understand will be in the source code. For example, take the VW emissions scandal: These days, cars are an order of magnitude more complex, making it easier for manufacturers to hide cheats among the 100 million… Read more »

Cryptocurrencies: the native business model of attention

There has been lots of attention this week on cryptocurrencies and blockchains, what with Consensus conf and the Token Summit and lots of related announcements. And as with lots of new things (thinking back to Twitter circa 2010), I find myself spending a lot of time explaining to people what blockchains and cryptocurrencies are, and… Read more »

The Service Recovery Paradox

I’m writing this from a plane.  I’ve been in the air for an hour and everything is fine, but for a few minutes before the flight, things weren’t fine.  At roughly the time we were supposed to board (on an already late in the evening flight), the gate attendant came over the mic to announce… Read more »

Complicity

I had an interesting experience today.  As I was in the air on my way to San Francisco, I got a text from my Airbnb host saying that they had made a mistake and accidentally double-booked my room.  I ended up taking their offer to cancel and booked a hotel room (at a steep increase… Read more »

Flexing the platform for good

Over the past few weeks, I’ve been touching base with many companies and individuals in the tech sector to understand how they are reacting to the current political environment. Every company and community (of users, customers) is different, with its own sensitivities, priorities, and goals.  So it’s been really interesting to understand the very wide… Read more »