The moral imperative for a human rights-based data revolution
On 22 June 2021, I organised an event entitled Dissecting Digital Power Inequities: Reflections from Digital Rights Experts for Development Practitioners as part of my engagement with the Data Values Project. This was an event I’d wanted to organise for a while, and one that was quite personal to me. While I’d encourage anyone reading this piece to watch or listen to the whole event by following the hyperlink above, for those who may not have the time to watch a 90-minute event, I wanted to take a few moments to share why this conversation was so important to me. I hope that this piece provides colleagues and friends in my professional circles with some understanding of why I believe so strongly that human rights need to be at the heart of the Data Revolution for Sustainable Development.
When the UN Secretary-General called for a Data Revolution for Sustainable Development in 2014, the world felt to me like a very different place from the one we live in now. Within the sustainable development sector, an optimistic atmosphere prevailed as the nations of the world gathered at the UN General Assembly in 2015 to approve a science-based, ambitious, universal Agenda for Sustainable Development – the 2030 Agenda.
At around the same time in my personal life, I made my first visit to Palestine, where I have family. To get to the West Bank, I travelled through Tel Aviv in Israel. Upon arrival, I was interrogated by Israeli security for around four hours. They wanted to know who I was and who my relatives were. I was shown pictures of family members and read telephone numbers, addresses and other information. The Israeli state clearly knew more about my family than I did, as I’d never met any of my relatives before. In addition to personal questions about my own life, I was also asked questions about my parents – where they lived, where and how they met, what they did and the like. Once the Israeli state had worked out how I fit into the profile it held of my family and was satisfied that I did not pose a risk to security, I was allowed to enter.
I didn’t really think about this experience again for a couple of years. However, shortly after Donald Trump became the 45th President of the United States in 2017, I found myself thinking about it again. When the so-called ‘Muslim Ban’ was brought into force, I was organising a work trip to New York. I’d heard horror stories from Arab American friends of mine about how the travel ban was being implemented – it focused less on religious beliefs and more on ethnic background and nationality. Before departing, I wrote up a contingency plan for my wife, including the names and numbers of lawyers to call should I be detained. As I packed, I made sure I had a change of clothes and toiletries in my cabin luggage in case I had to spend a night in an airport or holding cell. As the airplane sat on the tarmac preparing for take-off, I deleted text messages, emails and social media chats. I then deleted my social media apps.
Why was I so afraid? I’m not a criminal. I’d done nothing wrong. I was afraid because I remembered all the information that Israel held about me. I was afraid because I instinctively suspected that Israel would have shared that information about me with US authorities. As it turned out, my fears were not entirely justified and I wasn’t detained or interrogated. I did however spend four years being subjected to the opaque Secondary Security Screening Selection (SSSS) process every time I travelled to the US, meaning that I had to go through enhanced security checks.
Finding myself in this situation - of being afraid for my physical safety at the hands of the world’s superpower for no reason other than my ancestry - was a very uncomfortable feeling. It also changed the way I perceived digital data about myself. I no longer viewed it as something innocuous that floated around in cyberspace. I started to see it for what it is – a representation of how others choose to classify me based on assumptions that they have drawn from my physical characteristics, ethnic and cultural heritage, political views and opinions, and other ‘data points’. This data holds real power. It holds power that can easily be turned against me if outside of my control - in the hands of powerful people whose inferences derived from my ancestry are based on their own biased, racist assumptions for instance.
“I started to see it for what it is – a representation of how others choose to classify me based on assumptions that they have drawn from my physical characteristics, ethnic and cultural heritage, political views and opinions, and other ‘data points’. This data holds real power.”
I recognise my privilege in all of this. I came to no real harm, although I did feel great pressure to self-censor details of my heritage in the Trump years – for instance making sure that my beard looked more ‘hipster’ than ‘Arab’ when travelling. Many aren’t as fortunate or privileged as I am. Within my professional life, I can think of numerous examples where people face far greater disenfranchisement, fear and loss of rights, all enabled by ‘data’:
What if I were a Muslim resident in the Indian state of Assam or Indian-administered Kashmir, who has had my citizenship stripped from me under the pretext of the Aadhaar digital identity system being rolled out?
What if I were a member of the Nubian minority in Kenya, also afraid that the roll-out of a national ID system there might further disenfranchise me and render me invisible?
What if I were an African American challenging the use of racist facial-recognition algorithms by the police and justice system in America?
What if I were a much-persecuted and extremely vulnerable Rohingya refugee in Bangladesh, who, traumatised, destitute and desperate, entrusted the UN Refugee Agency (UNHCR) with my most personal biometric data, only to later discover that it had been handed over to my torturers and tormentors in Myanmar?
What of these people? As development professionals, do we not owe them a duty to at least consider the risk of harm and threats that they face when we work with governments and private companies to develop data and digital infrastructure, ostensibly for sustainable development?
In the past eighteen months or so, the COVID-19 pandemic has turbocharged the blurring of the lines between cyberspace and the physical world. In many parts of the world, whole populations are being tracked digitally to ensure that they abide by lockdowns, curfews and the like. Examples of abuses of power abound, and all of this surveillance is now digitally enabled.
What does all this mean for the community of dedicated, optimistic, science-driven organisations and individuals who make up the Data Revolution for Sustainable Development, of which I am a part? I think it means that we have to make the moral case for the Data Revolution to have human rights at its heart a lot more powerfully. We need to listen to, and ally with, digital rights activists who know the risks. We need to rely less on ‘quick wins’ and focus on designing data infrastructures that are transparent, participatory, inclusive and accountable from the outset. We need to make sure that our interventions do not propagate harmful, extractivist digital economy business models that undermine trust in digital data and cause enormous damage to the cause of sustainable development.
“We have to make the moral case for the Data Revolution to have human rights at its heart a lot more powerfully. We need to listen to, and ally with, digital rights activists who know the risks. We need to rely less on ‘quick wins’ and focus on designing data infrastructures that are transparent, participatory, inclusive and accountable from the outset.”
I’m delighted that I’ve been able to be a part of taking the first step in this direction by organising Dissecting Digital Power Inequities: Reflections from Digital Rights Experts for Development Practitioners. I am grateful to the amazing activists and thinkers who volunteered to share their knowledge and experience with my community of practice. I’m grateful to the Global Partnership for Sustainable Development Data for providing a space for this discussion to happen. I hope that it is the first of many and that the outcomes of the Data Values Project are not just performative, but contribute to a change in how we do business in the data revolution.
- Tom Orrell is Managing Director of DataReady and co-leads the Data Values Project track on ‘Data that is well-governed.'