posted on Sun, Oct 27 '19 under tag: tech-policy

Should online services like Facebook be held responsible for the content that is distributed through them? What about free speech?

I was listening to Mike Hoye’s State of Mozilla (fall 2019) talk on YouTube, and at the very end of the Q&A Mike gets deeply philosophical about regulating software. I found his argument quite convincing: regulation is required to protect human rights from the harms of software. To be frank, I used to believe that despite all the bad things companies like Facebook do, there should never be regulation on the internet, as it stifles innovation and free speech. But I have effectively changed my opinion today.

Intermediary Liability

Intermediary liability is the broad, technical term that forms the foundation for regulating software and online services. There is a lot to read about what “intermediary” means, and I find this report from March ‘19 to be a comprehensive overview. Essentially, services like Facebook, Twitter, and Google Search should all fall under the definition of an intermediary.

At the moment, though, services like Uber seem to be forgotten by people thinking about intermediary liability, as the focus is on user-generated content, free speech, and information. These services should also fall under the broad ambit of “intermediary”. In fact, the Motor Vehicles (Amendment) Act, 2019, which introduced heavy penalties for traffic violations, also defined “aggregators” as digital intermediaries. The word “intermediary” should probably stand for all software-based services.

Duty of Care

The UK government ran a consultation on its Online Harms white paper, which kicked off some conversations about using “duty of care” as an approach to regulating intermediaries. The paper itself might not have done a good job of it, but the approach is interesting.

Imagine software makers (including companies that provide software as a service) being responsible for the software that they make. Instead of providing software “as is”, they would have to be accountable for its shortcomings. That, very briefly, is the duty of care approach. I will not try to reflect on the specific approaches that have been proposed, because that would be repetitive and boring. I am, instead, going to draw an analogy.

One conceptual leap I had to make to reach where I am now is the idea that software exists. Software is not just an abstract idea; it is also the machine that runs it and the things that it does. This is absolutely fundamental to realizing why software needs regulation. You may think, “but software is just code, just logic. And logic is abstract. Logic doesn’t exist”, and that is why I offer this analogy.

I am a doctor in primary care. I see people with illnesses, I talk to them, I may lightly touch them (to examine them), and then I give them a set of instructions to follow so that they can get better soon. I do not cut them open, I do not inject drugs into their bodies. I only give instructions. But if I give them the wrong instructions, I am responsible. And if I am negligent, I am criminally responsible.

Think about that. I give instructions that set in motion things with real-life consequences. But all I do is give instructions; I don’t harm anyone directly. Do you see the parallel? Software makers are also just giving instructions. They are instructing computers what to do - what to display, what not to display, what to allow, what not to allow, what to enable, what to disable. And there are real-life consequences of these “instructions”. Software makers should be held responsible for their software.

Free Software

There are many consequences of that statement. What happens to free software? Gratis software (free as in free water) is straight away off the hook: there is no duty of care when no “consideration” is paid in return. But is that it? Are Google and Facebook not largely “free”? This is a conundrum that will have to be solved. Free services that make money from ads are not really “free”; the consideration being paid is the user’s attention. But that creates another problem. What about offline services where sponsors gain the attention of users in return for a free service being provided? Would a duty of care exist in such situations? (I don’t have all the answers here and will appreciate your comments.)

Libre software (free as in freedom) will now have to consider whether it is gratis or not. If it is paid for (which it never is), there will be a duty of care. If it is gratis, the above paragraph applies. There is an added complexity with open source contributions: how would liability be distributed when the code is written by a hundred people? Code maintainers will have to assume even more ethical obligations, and code review will become ethical review as well.

Free Speech

Intermediary liability by itself does not harm free speech; it is the implementation that causes the problem. (Well, if you can’t implement something correctly, should you make laws about it? That’s another question.) Regressive governments the world over will block free speech one way or another, and they will definitely use intermediary liability as a weapon.

But then democracies should not elect such governments in the first place. It may be a catch-22, and it may become even more difficult going forward. That is why we may have to act quickly: regulate intermediaries before countries become too polarized and unstable, so that a healthy democracy survives and keeps good governments in power. (I don’t believe 100% in this paragraph, so feel free to disagree.)


What about lawmakers? Do lawmakers have a duty of care? Should they be held accountable for their service? Should they be punished for the harms of what they did?

Like what you are reading? Subscribe (by RSS, email, Mastodon, or Telegram)!