Will AI have the same rights as people?

Gary Goodwin
3 min read · Sep 18, 2020

Immortals shall soon walk among us. They may also crawl, roll and perhaps hover. Yes, definitely hover. By ‘immortals’ I mean artificially intelligent persons (AIPs), and by ‘us’ I mean natural persons.

The European Parliament Committee on Legal Affairs’ recent report recognized that humankind stands on the threshold of an era of sophisticated robots and other manifestations of artificial intelligence (“AI”). The Committee saw the need to legislate in this area relatively quickly, as self-driving cars are already making their appearance. The fundamental question is what sort of legal status should be granted to AIPs. Natural persons want to avoid any “Battle of the AIPs” future scenarios.

Mary Shelley’s Frankenstein, the modern Prometheus, dramatically starts off the Committee’s report. The Committee thought that by addressing people’s real concerns up front, it could then deal with the more substantive issues. The Committee recognizes that people have long fantasized about the possibility of building intelligent machines and of achieving potentially unbounded prosperity. The Committee does not mention drones with laser cannons, but you just know they were all fantasizing about that instead.

In describing artificial intelligence, the Committee outlines how present legislation does not encompass machines that become autonomous and self-aware. A machine can be built, loaded with software, and then go on to learn from its environment. This capacity for environmental learning suggests that an AIP can determine its own actions and learn from its experiences and failures. AIPs have an advantage here, since the majority of natural persons still struggle with learning from failure.

If an AIP can decide its own actions and causes harm, then legal liability can shift from the builder to the teacher who provides the environment. If an AIP can operate independently within its environment and be held accountable for its own actions, then it could be held strictly liable. Strict liability requires only that a plaintiff show that damage occurred and a causal link between the AIP’s conduct and that damage. This differs from negligence in that there is no need to establish a duty of care, the standard of care and a breach of that duty. Strict liability would then be allocated between the builder and the eventual teacher, with the teacher and the surrounding environment determining how far liability shifts from one to the other. This allocation would be extremely difficult to establish: it may take a village to raise a child, but it takes a vast social media network environment to raise an AIP.

The Committee suggests an ethical framework of beneficence, non-maleficence and autonomy, together with fundamental rights such as human dignity and human rights, equality, justice and equity, non-discrimination and non-stigmatisation, autonomy and individual responsibility, informed consent, privacy and social responsibility. Whether these ethics and fundamental rights will be extended to AIPs remains unclear, but sauce for the goose is sauce for the gander.

The Committee suggests the need to include a kill switch (an opt-out mechanism). I will shorten this to ‘OOM’. The OOM euphemism provides something of a guilt release. Humanity can delude itself into believing it has control over any situation, as Kurt Vonnegut wrote: “The only controls available to those on board were two push-buttons on the center post of the cabin — one labeled on and one labeled off. The on button simply started a flight from Mars. The off button connected to nothing. It was installed at the insistence of the Martian mental-health experts, who said that human beings were always happier with machinery they thought they could turn off.” If you have difficulty OOMing your faithful Roomba, think how much harder it might be if it asked you to reconsider.

To alleviate this OOM situation, I would recommend that readers take their favorite mind/body relaxant and consider the following: what if, instead of immortality, AIPs lived a limited number of years? Science fiction covers both ends of the spectrum, from planned obsolescence of the most brutal kind to the inability to self-terminate. If we incorporated a pre-determined life span, would we tell our AIPs the exact date? We could leave the date determination to a random number generator entitled Final Actual Time Expiry, or FATE (sketched below). Perhaps, again, sauce for the goose is sauce for the gander.
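For the morbidly curious, FATE would not require much machinery. The Python sketch below is purely hypothetical: the function name, the lifespan bounds and the idea of printing the date at all are my own inventions, not anything the Committee actually proposes.

```python
import random
from datetime import date, timedelta

def fate(activation: date, min_years: int = 5, max_years: int = 50) -> date:
    """Final Actual Time Expiry: pick a random decommission date.

    The lifespan bounds are invented for illustration; the Committee
    proposes nothing of the sort.
    """
    lifespan_days = random.randint(min_years * 365, max_years * 365)
    return activation + timedelta(days=lifespan_days)

# Whether we would ever show the AIP this output is the harder question.
print(fate(date.today()))
```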
