Introduction

On 19 February 2020, the European Commission released a significant white paper, “On Artificial Intelligence - A European approach to excellence and trust”. Much of the initial public reaction focused on the potential AI regulation it proposed. The white paper also mentioned “requirements to take reasonable measures aimed at ensuring that use of AI systems does not lead to outcomes entailing prohibited discrimination.” However, very few commentators focused on these points about gender discrimination and the ethical guidelines surrounding intersectionality. Indeed, this must not remain a purely theoretical approach to discrimination. Tech policies and regulations should focus on practical implementations that enable and safeguard those who are systematically oppressed. If artificial intelligence is built on “bad” data sourced predominantly from patriarchal and majority views, terrible things can happen.

Human Enveloped in Data

As an example of the importance of data inclusivity in research, consider seatbelts, headrests, and airbags. Cars have been designed mainly on data collected from crash tests using dummies modelled on the male physique. As a result, women are 47% more likely to be seriously injured and 17% more likely to die than a man in a similar accident, since women’s body shapes and pregnant bodies do not match the “standard” measurements.

The root of these problems is not only technological – it is social. Technology built on this flawed social foundation only amplifies the worst injustices already happening.

Technological scientists, developers, and tech stakeholders can be among the most dangerous people in the world, because the field sustains an illusion of meritocracy and a search for objective truth. Technology must be situated in an understanding of the social dynamics of the world, because radical change happens at the social level.

Currently, science is taught from a naïvely objective viewpoint that often enables destructive tech creations. Hence, there is a strong need to reform education and enable interdisciplinary work. This also calls urgently for governing bodies, consensual voting, and strong policies centred on intersectionality, pushing for checks and diversity in the creation of such algorithms.

Of course, people talk about bias in the sense of equalizing outcomes across groups. But I believe we need to be asking more important questions: how is the data collected, should this task even exist in the first place, who is allowed to create it, who will deploy it on which demographic, and most importantly, who will own the data?

What about India and us?

Business models of today embrace the concept of rationality, self-interest and maximizing efficiency. Indeed, one would judge that running a company based on such principles is good. But is it truly a good goal?

On the other hand, a government must balance efficiency with equity, liberty, security, and justice. Public policy is devoted to ensuring that the impact of government programs addresses these principles and their trade-offs so as to benefit the citizens of the nation. Coming back to our earlier point on business models, I would now argue that such a model is not enough for the society we live in. It is crucial for everyone to realize the intersectionality of issues and how they permeate the creations of everyday life.

For instance, Amazon built an AI recruitment tool to automate its hiring process, designed mainly around the principle of maximizing efficiency. Years later, Reuters published a report on how Amazon's so-called "experimental" AI recruitment tool penalized resumes containing the word "women's". This happened because the computer models were trained on resumes submitted to the company over a 10-year period – a time when men dominated the tech industry.
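The mechanism is easy to demonstrate on toy data. The sketch below (entirely hypothetical resumes, not Amazon's system) scores each word by how strongly it is associated with past hiring outcomes. Because the historical "hires" skew male, a word like "women's" ends up with a negative score even though it says nothing about ability:

```python
# Hypothetical illustration: a screening score learned purely from
# historical hiring outcomes reproduces the bias baked into those outcomes.
from collections import Counter
import math

# Toy "training data": past resumes and whether they led to a hire.
# Past hires skew male, so gendered words correlate with the outcome.
hired = [
    "software engineer chess club captain",
    "software engineer systems programming",
    "backend developer chess club",
]
rejected = [
    "software engineer women's chess club captain",
    "women's coding society lead software engineer",
]

def word_scores(hired, rejected):
    """Smoothed log-odds of each word appearing in hired vs rejected resumes."""
    h = Counter(w for doc in hired for w in doc.split())
    r = Counter(w for doc in rejected for w in doc.split())
    vocab = set(h) | set(r)
    nh, nr = sum(h.values()), sum(r.values())
    return {
        w: math.log((h[w] + 1) / (nh + len(vocab)))
           - math.log((r[w] + 1) / (nr + len(vocab)))
        for w in vocab
    }

scores = word_scores(hired, rejected)
# "women's" carries no information about job ability, yet the model
# learned to penalize it because of who was hired in the past.
print(scores["women's"] < 0)                      # → True
print(scores["engineer"] > scores["women's"])     # → True
```

No one wrote "penalize women" anywhere in this code; the discrimination emerges entirely from the historical data, which is exactly why "just train on what we did before" is not a neutral design choice.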

This clearly shows that maximizing efficiency alone is not enough – and this is precisely why policy is imperative today. Be it you, me, her, him, or them – learning policy through the lens of intersectionality is crucial for us to make the right decisions and take the right actions.

Often, government and civil society have taken a careful and humble approach when making policy about technology, and with good reason. The benefits of technical innovation have been real, and the risks from heavy-handed intervention are high. Yet just as the need is growing, our tech-policy capacity remains low. For example, India has a reputation for invasive data usage and security vulnerabilities in its own citizen-facing technology, such as Aarogya Setu or Aadhaar. Tech-policy expertise is hard to find among our cabinet secretaries or senior political leadership, and key parliamentary science and tech discussions remain naïve and chronically discriminatory.

This calls for reform and for setting up proper policies ensuring that neither our government nor corporates overstep the boundaries of invasiveness and violate basic constitutional human rights. A noteworthy example of the need for tech policy is the encryption debate in America. Encryption is an essential tool for protecting privacy and security online, but it also raises challenges for law enforcement investigations – as starkly revealed in the debate about Apple iPhone encryption in the wake of the San Bernardino shootings. What we learned is that engineers cannot make these decisions alone, without an informed understanding of policy trade-offs; and policymakers cannot make choices without a clear understanding of the technology's ramifications.

After years of debate on encryption, the USA has developed a corps of technology policy experts in the field. But in India, how will we develop similar capacity on issues like AI, cybersecurity, and fake news, which are front and centre today?

What did we do in this course?

The course ran for 1.5 months, with multiple homework assignments and takeaway sessions. It was a rigorous learning course, with each session spanning 2.5 hours.

The entire course that I led was split into 4 sessions:

Session 1 - Everyday encounters with data

In this session we focused on introducing the participants to the notion of data, the history of data, and what makes data important from a Global South perspective. We also looked at what intersectionality and equity mean. We concluded by looking at the seven principles of Data Feminism given by Catherine D'Ignazio and Lauren Klein.

Session 2 - All things Data!

The All things Data session dealt with the analytical perspective on how to curb bias – we looked at the right way of doing visualizations, as well as how to structure a data project so that power balances are kept in check.

Session 3 - India through data

The India through data session had everyone read various case studies in India where wrong data metrics and wrong data analysis have led to bad outcomes. One example that intrigued us all is how the global definition of malnutrition may not necessarily apply in our context, due to which the numbers reported are erroneous.

Session 4 - Data IRL

In Data IRL, participants each took one case study and presented their frameworks for how they would have gone about the project and dealt with the data at hand. Many interesting cases were presented, such as Duterte's war on Facebook, the data privacy hack on fertility apps, and the problem of women and tuberculosis.

You can find the module PPTs at this link. Please contact me for access.

The end of the course saw a lot of joy and excitement in learning new topics. In fact, no course like this has ever been done before! The average feedback score received from the participants was 9.5/10. Three of the 15 students who took the course are now working exclusively in this field through corporate entities or the non-profit sector. It has given me much joy to enable such motivation and action in the right direction for the future of the country.

For more information on this course and if you are interested in organizing this with your team or organization, feel free to contact me!