Rights for Robots

Published 05/11/2018

When considering whether robots might one day have rights, it’s relevant to ask not just what those rights will mean for them but what they will mean for us.

“Do we want to prohibit people from doing certain things to robots not because we want to protect the robot, but because of what violence to the robot does to us as human beings?” asks Northeastern University professor Woodrow Hartzog. If we treat human-like robots as things, might we increasingly come to see people as objects?

One of the key points of friction between AI and humans today is the fear that robots will “take our jobs”. All too often, though, we’re looking at this from the wrong perspective.

“When robots replace jobs in an office, people are expecting there to be a robot called Fred, and Fred is going to be replacing so-and-so, but it’s not like that,” says Lydia Gregory, Co-Founder at FeedForward AI. 

“It’s software and the software is not going to be an entity that you can define as a robot, most of the time. It’s hard to define it as one robot that will take the job of one human or one robot that will take the job of ten humans,” she says. “It’s software that will do things that mean a certain number of humans are not required for that particular job.”

What’s more, while there is scope for automating tasks that are highly ordered and repetitive, AI tools will more often augment the abilities of humans rather than replace them (think Garry Kasparov playing chess with AI assistance following his defeat by Deep Blue).

AI needs human oversight

Where AI does work autonomously, it remains desirable for humans to have oversight of precisely what it does and why.

For example, it was recently revealed that Amazon had attempted to automate its hiring processes with a machine learning algorithm – but the weighting of the system’s training data meant that it actively discriminated against female applicants.

In another case, it was reported that an algorithm used in US courts to assist sentencing based on the likelihood of reoffending demonstrated racial bias. Findings suggested that black defendants were predicted to have a higher incidence of recidivism than was actually the case – and that the opposite was true when it came to white defendants.

Can algorithms be trusted to operate without human interaction?

This is perhaps the key ethical dilemma for AI today – how far can algorithms be trusted to operate without human interaction? And how far should humans trust their output? There’s then the question of what interacting with simulations of life will do to us.

Mark Goldfeder, a law professor and rabbi, offered the following advice. “[If I see something that for all intents and purposes looks human], I cannot start poking it to see if it bleeds,” he wrote. “I have a responsibility to treat all that seem human as humans, and it is better to err on the side of caution from an ethical perspective.”

In the meantime, we might do well to be polite to Alexa (regardless of Roko's basilisk).

In our first piece on the philosophical impact of AI, we discussed the bigger picture – and what a worst case scenario might look like.

About Empiric

Empiric is a multi-award-winning business and one of the fastest-growing technology and transformation recruitment agencies, specialising in data, digital, cloud and security. We supply technology and change recruitment services to businesses looking for both contract and permanent professionals.


Read more (pdf download)

Empiric are committed to changing the gender and diversity imbalance within the technology sector. In addition to Next Tech Girls, we proactively target skilled professionals from minority groups, which in turn can help you meet your own diversity commitments. Our active investment within the tech community allows us to engage with specific talent pools and deliver a shortlist of relevant and diverse candidates.

For more information contact 02036757777.

To view our latest job opportunities click here.
