
Whether robots deserve human rights isn't the correct question. Whether humans really have them is.

By Don Howard

We need to figure out how to fairly and respectfully share our world with artificial friends and neighbors.
[Image: A futuristic military cyborg surveils a street. We probably ought to figure out what rights robots will have before they reach sentience. gremlin / Getty Images]

While advances in robotics and artificial intelligence are cause for celebration, they also raise an important question about our relationship to these silicon-steel, human-made friends: Should robots have rights?

A being that knows fear and joy, that remembers the past and looks forward to the future and that loves and feels pain is surely deserving of our embrace, regardless of accidents of composition and manufacture — and it may not be long before robots possess those capacities.

Yet there are serious problems with the claim that conscious robots should have rights just as humans do, because it’s not clear that humans fundamentally have rights at all. The eminent moral philosopher Alasdair MacIntyre put it nicely in his 1981 book, "After Virtue": "There are no such things as rights, and belief in them is one with belief in witches and in unicorns."

So, instead of talking about rights, we should talk about civic virtues. Civic virtues are the features of well-functioning social communities that maximize their members' potential to flourish, including the habits of action that help everyone lead the good life.

After all, while the concept of "rights" is deeply entrenched in our political and moral thinking, there is no objective grounding for the attribution of rights. The Declaration of Independence says: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness."

But almost no one today takes seriously a divine theory of rights.

Most of us, in contrast, think that rights are conferred upon people by the governments under which they live — which is precisely the problem. Who gets what rights depends, first and foremost, on the accident of where one lives. We speak of universal human rights, but that means only that, at the moment, most nations (though not all) agree on some core set of fundamental rights. Still, governments can revoke rights just as quickly as they grant them. There simply is no objective basis for the ascription of rights.

[Image: SpotMini, the Boston Dynamics robot dog. Boston Dynamics]

We further assume, when talking about rights, that the possession of rights is grounded in either the holder's nature or their status — in the words of the aforementioned declaration, that people possess rights by virtue of being persons and not, say, trees. But there is also no objective basis for deciding which individuals have the appropriate nature or status. Nature, for instance, might include only sentience or consciousness, but it might also include something like being a convicted felon — which means, in some states, that you lose your right to vote or to carry a gun. For a long time, in many states in the U.S., the appropriate status included being white; in Saudi Arabia, it includes being male.

The root problem here is the assumption that some fundamental, objective aspect of selfhood qualifies a person for rights — but we then have to identify that aspect, which lets our biases and prejudices run amok.

Still, since we will share our world with sophisticated robots, how can we do so fairly, and with due respect for our artificial friends and neighbors, without speaking of rights? The answer is that we turn to civic virtues.


In a famous 1974 essay, the political theorist Michael Walzer suggested there are at least five core civic virtues: loyalty, service, civility, tolerance and participation. This is a good place to start imagining a future lived together with conscious robots, one in which the needs of all are properly respected and one in which our silicon fellow citizens can flourish along with us carbonaceous folk.

Focusing on civic virtues also forces us to think more seriously about how to engineer both the robots to come and the social communities in which we all will live. What norms of public life should be built into our public institutions and inculcated in the young through parenting and education? The world would be a better place if we spent less time worrying, in a self-focused way, about our individual rights and more time worrying about the common good.

A final noteworthy consequence of this suggested shift of perspective is that it highlights the challenge of designing the optimal virtues for the robots themselves. A task for roboticists will be figuring out how to program charity and loyalty into a robot or, perhaps, how to build robots that are moral learners, capable of growing new and more virtuous ways of acting just as (we hope) our human children grow in virtue.

The robots will have an advantage over us: They can do their moral learning virtually and, thus, far more rapidly than human young. But that raises the even more vexing question of whether humans will have any role in the robotic societies of the future.

Don Howard is professor of philosophy at the University of Notre Dame. He is also a fellow and former director of Notre Dame's Reilly Center for Science, Technology and Values.
