An academic at the University of Warwick is warning that communities need to know their digital human rights so they can influence the governance of future technologies and AI.

This comes ahead of the annual UN Human Rights Day (Sunday 10th December) and the 75th anniversary of the Universal Declaration of Human Rights. As technology continues to evolve and permeate every aspect of our lives, from how we find homes and jobs to how we find partners or track our health, concerns around privacy, autonomy and other ethical considerations have become increasingly pressing.

If left unchecked, AI systems can:

  • Deny loans or job opportunities for reasons that are incorrect or discriminatory. 
  • Produce misdiagnoses when the underlying AI is faulty or trained on incomplete data that excludes women or other groups. 
  • Put human rights at risk through facial recognition and crime prediction tools used by police and other law enforcement bodies.   

Academics say with future advances in neurotechnology, even our right to think freely may come into question. 

Professor of Digital Health and Rights at the University of Warwick, Meg Davis, said: “When the Universal Declaration of Human Rights was adopted in 1948, the internet hadn’t yet been invented.

“Today there are over 200 ethical standards to guide digital technologies and AI but little to no actual governance. Many of these standards are contradictory and even well-meaning government officials don’t see how to apply them in practice. 

“AI systems lack transparency and there is very little human oversight or accountability for the harms these systems can cause. What’s more, the agenda for tech and AI is being set by the private sector in very few countries, yet these companies are shaping the impacts AI has on people globally.  

“We need to turn this power dynamic around and demand that human rights are put at the centre of AI and digital governance.” To do this, Professor Davis says that communities need to know their AI human rights, and grassroots movements are needed to identify what applications, tools and governance society needs. 

“It is these communities that should inform researchers, governments and technology, not the other way around,” she continues.

To address the governance gap, the UN Secretary-General has called for a Digital Rights Advisory Mechanism to bring together independent experts to untangle the hundreds of standards and norms and interpret them, providing consistent, practical advice to governments, businesses and others on how to apply human rights and ethical standards to digital technology and AI.  

“This would be a great step in the right direction,” says Professor Davis, “but it needs to be backed by a transformation in which people learn about their rights and form a grassroots movement to set the agenda, to hold the UN, governments and the private sector accountable for the results.” 

What are our digital human rights?

Professor Davis says all the human rights we have offline also apply online, including:

  • Right to non-discrimination 
  • Right to freedom of opinion 
  • Right to freedom of assembly and association (that is, the right to organise groups independently and to gather freely) 
  • Right to privacy 
  • Right to autonomy 
  • Right to effective remedy 
  • Economic, social and cultural rights, such as the right to the highest attainable standard of health (including health goods, information and services), the right to education and the right to work. 
  • Transparency, accountability, equality, and other human rights principles.

Professor Davis is putting this theory into practice through the Digital Health and Rights Project, of which she is principal investigator.

The project involves academics, human rights activists and community-led networks studying how young adults experience the digital transformation in practice, and it offers training in digital human rights. The team is conducting research in Colombia, Vietnam, Kenya and Ghana, engaging young adults and local civil society groups as partners and developing a digital empowerment hub with online training resources on human rights, technology and AI.