Intersectionalities of Automated Decision-Making and Race/Ethnicity
17 November 2021

Prof Paul Henman, UQ node, ADM+S
Willie Pepper, UQ
Assoc Prof Safiya U. Noble, UCLA
Prof Bronwyn Carlson, Macquarie University
Dr Karaitiana Taiuru, University of Otago
Duration: 1:05:12


Prof Paul Henman: 

Welcome everybody. Welcome to today’s seminar, or symposium, on examining intersectionalities of automated decision-making and race/ethnicity. My name is Paul Henman. I’m from the University of Queensland and part of the Australian Research Council’s Centre of Excellence for Automated Decision-Making and Society. This event has been organised by that centre, and so I want to acknowledge the Australian Research Council. This promises to be a wonderful symposium, and I’ve been waiting all year to be able to hold it, so I’m here ready to learn, and I hope everybody else is too. It’s been set up as a Zoom meeting, so please make sure you’ve got your microphones off, apart from our panel members, and there will be plenty of time at the end of the presentations to engage with our esteemed colleagues.

So, before we begin, I want to hand over to Willie Pepper for the welcome to country. Willie Pepper is representing the Boon Warrung Land and Sea Council. He comes from Victoria, and the welcome to country is based there, even though we’re meeting from all different places, because that is the location of RMIT, the home base of the Centre of Excellence for Automated Decision-Making and Society. So I’ll hand it over to you, Willie.


Willie Pepper: 

Thank you very much, Paul. Yes, my name is William Pepper. I am a Gunaikurnai man. My father was born on an Aboriginal mission down at Lake Tyers. My mother is Yorta Yorta, born at Cummeragunja on the border between Victoria and New South Wales, up near Shepparton. I will commence. Wominjeka. As a representative of the Boon Warrung, Melbourne’s first peoples, I am pleased to be able to welcome you all here this morning on behalf of N’arweet Carolyn Briggs to this Intersectionalities of Automated Decision-Making and Race/Ethnicity event. I would like to pay my respect to the Boon Warrung ancestors and elders, past, present, and emerging, and to acknowledge all of you who are meeting on different lands. We are especially pleased to recognise the commitment the ARC Centre of Excellence for Automated Decision-Making and Society at RMIT University has made in paying respect to the spirit of this land and its first peoples.

Through this you’ve shown the willingness to honour sacred ground. It’s important for all Victorians and Australians all around our nation, to understand and appreciate the history and culture of the Indigenous people of Melbourne who have played a significant role in the development of Melbourne both before and after European arrival, as they are unknown to many people who live on this beautiful land.

The struggle to preserve their culture and traditions began with the ancestors back in the 1830s. One of the lessons we should take from this struggle is the way in which the elders and leaders forged alliances that led to many of the achievements that we take for granted today. And as Australians, whilst we may all have descended from different clans and language groups and countries from across the world, we can all learn from this lesson.

The word welcome in Boon Warrung is Wominjeka, and it translates to “come with purpose”. It is a contract between the people, as the custodians of this land, and yourselves, to ensure our laws are adhered to and to guarantee safe passage for those who ask. According to tradition, this land has always been protected by the creator Bunjil, who travels as an eagle, and by Waang, who protects the waterways and travels as a crow.

Bunjil taught the Boon Warrung people to always welcome guests, but he always required the Boon Warrung to ask all visitors to make two promises, which I will ask of you this morning: to obey the laws of Bunjil, and not to harm the children or the land of Bunjil.

This commitment was made through the exchange of a small bough dipped in the water and the word spoken, “Wominjeka”. Thank you for your time and good luck with your day.

Prof Paul Henman:

Thank you very much, Willie Pepper, for your warm welcome to country, and to our event. We appreciate your presence and your welcome.

Willie Pepper:

Okay, Paul, thank you very much. Thanks everyone, bye.

Prof Paul Henman: 

Of course, Willie has just welcomed us to country, but I think it’s also important for non-Indigenous people to acknowledge the traditional owners of the lands where we’re all sitting in joining this event. We pay our respects to their ancestors and descendants and recognise their continuing cultural and spiritual connections to the land. We recognise this always was, and always will be, Indigenous land, and has never been ceded.

So, for today, now that we’ve done our acknowledgements: we are recording this event, but the recording will cover only the presentations. The discussion will not be recorded. So, we’ll have about 25 minutes with Safiya Noble, then ten minutes in which Bronwyn Carlson will engage with Safiya’s presentation and also talk a bit about her own work. And Dr Karaitiana Taiuru will also be speaking, from New Zealand. That leaves us about 30 minutes for questions and discussion.

As I already acknowledged, this event is being held by the Australian Research Council Centre of Excellence for Automated Decision-Making and Society, which was established last year as a cross-disciplinary centre to create the knowledge and strategies necessary for responsible, ethical, and inclusive automated decision-making.

This is the first of a number of events looking at the intersectionalities of automated decision-making and society, and I’m really pleased to begin with the questions of race and ethnicity. I think it’s well known to most people who have been interested in the question of automated decision-making, or artificial intelligence, or digital technology, that significant concerns have been raised in the past about the way in which race and ethnicity discrimination and racist behaviours and content have been reproduced, from Microsoft to, more recently, the criminal justice system with the COMPAS system.

So, to engage with those questions we’ve got a really great panel of internationally renowned speakers. Associate Professor Safiya Noble probably does not need any introduction. Her book Algorithms of Oppression came out in 2018, based on work over many years. It was one of the seminal pieces of work that really structured and captured the imagination of the way in which algorithms in today’s world create significant problems of discrimination, inequality, and so on. So, it is a real pleasure to have you here, Safiya.

Safiya is at the University of California, Los Angeles, where she serves as the co-founder and director of the UCLA Center for Critical Internet Inquiry. She is the author of Algorithms of Oppression, but she has also contributed, as editor, to The Intersectional Internet and Emotions, Technology, and Design. So, I’ll hand it over to you, Safiya, and welcome you on behalf of the rest of the group.

Assoc Prof Safiya Noble: 

Thank you so much, Paul, and thank you to all of the people who have put so much energy and time into organising this amazing session today. I’m really so honoured to be here. I’m joining you from Los Angeles, which is truly the Gabrielino and Tongva people’s land, which they have not ceded. I am both a descendant of colonised people who were brought to this land from the shores of Africa, shores which are not known to us in any specific way because of the journeys of our ancestors to these lands, and also a descendant of colonisers who participated in the brutality of all that we experienced in North America. So, I just want to say that I’m in solidarity with both the people who steward this land and the people who steward the lands where this conference is being held, and I’m so grateful for this opportunity to learn and be in conversation with my co-panellists today.

So, let me just start by saying that I have just a handful of slides, not very many, but just enough to get us going in a bit of a conversation. So, while I’m working to get these started here, can you see just the slides, and not the whole computer background too? All right, great.

I think it’s so important that we’re having these conversations around the world, in different places, on different lands, by different people, because the specifics of the way in which so many of the technologies and automated decision-making systems that we’re engaging with are applied differ across racial and ethnic constructs and contexts. But I also believe that the struggles to resist the way in which we are being further captured and dehumanised by automated systems are some of the most important and pressing civil, human, and sovereign rights issues in the world. And for those of us who feel compelled to respond to these concerns and to talk about these things, I just want to acknowledge how incredibly difficult it is to do that work in the face of the trillions of dollars and currencies that are poured around the world by the tech sector into narratives about the liberatory possibilities and promise of these technologies. And I think that many of us in our respective communities see different things, and this is one of the reasons why scholars of race and ethnicity, and of Indigenous studies, gender studies, and sexuality, really must continue to be at the forefront of these conversations, because I think we ask different questions, coming out of the historical, political, social, and economic context of our work, than makers of technology or people who are techno-solutionists, embroiled in techno-utopianism. So, it’s from that vantage point that I’m joining this conversation with you today.
So let me also share with you: last summer in the United States a man named George Floyd was murdered by police officers in Minnesota, and he was one of many, many thousands, hundreds of thousands, of Black people, African Americans, who have been murdered by the state, or whose lives have been disregarded by the state in extremely brutal ways. And the uprising in the United States and around the world, and we are grateful for the solidarity of people around the world who still recognise the struggle of Black people in the United States, brought about a moment where, for us in our centre at UCLA, there were many questions from people in the tech sector in particular, who, even outside the specifics of their jobs, were asking about their responsibility and what they needed to know, or what they should be thinking about, at the intersection of race and technology. So, I put this slide together in memoriam of not only Mr Floyd but all of the scholars of colour, Black scholars in particular in the United States, who’ve been writing at the intersection of race and technology for many years.

Indeed, years before I started writing about algorithms specifically, these scholars were trying to re-narrate and animate a different set of concerns and conversations about the stakes of power, and the uneven ways in which power is enacted upon vulnerable people around the world and in this country. So these are people that, you know, I invite you, if you are interested, to think about what you can pick up from their work to take into your own. These are people working in a variety of different spaces, from sociology to Black studies, communications, computer science, African American studies, and American studies, but who are all thinking about race and power and technology. We’re situated in different eras in history, but I think our work, and this collective body of work that’s emerging, what it signals to me is that no one person is really carrying all the water on all of this. Together we are trying to build a paradigm, and shift a paradigm, at this moment. So, I just invite you to know about these scholars as well.

All right, so what are some of the stakes? I’m really just going to contour some of the things that I think could be a bit of a frame for my colleagues to hang more evidence upon from your own work as well: to think about the pointed ways in which we see race and ethnicity used and weaponised in hierarchical power structures of control over people of colour, continuously, in an incredibly normative way that goes unquestioned.
Of course, let me just say definitionally, as a Black studies professor I need to say that with these constructs around race, at least in the US context, when we think about race we’re talking about a system of power. The way that system of power in the United States works is that it’s a hierarchical power based on a binary, and that binary is predicated upon Europeanity at the top. You could think of it as all the ways in which a construct of whiteness gets made, and people petition to be recognised within it. So many people from many parts of the world who are ethnically connected to different nations, or ethnically defined, also seek to locate their ethnicity within this racial binary of power in the United States that puts Europeanity, or whiteness, at the top, with people petitioning to be recognised through that power lens. And, you know, it relies upon Blackness, Black people, Africanness, as its polar opposite in that power structure, in order to determine the distribution of goods and services: who labours and who receives the benefit of labour, who is enslaved and who controls.

These legacies of race as it’s constructed here are really important, and they are often conflated with ethnicities. So, for me, I would say my ethnicity is African-American: a descendant of enslaved Black and African people who are not connected to a specific nation-state, because of the transatlantic slave trade, and whose ancestors are really only known to us on these lands. This is important because we have many kinds of Black people and African people in the United States, from the Caribbean, from the continent, from Europe, from all kinds of places. So when we think about race as a lens to study technology, often what we’re talking about is that power system: who benefits and who loses within that power binary. And some of the most important work that points at that includes Joy Buolamwini’s famous study with Timnit Gebru and Deb Raji about how facial recognition, for example, is built, deployed, and embraced, and doesn’t recognise Black women’s faces. So, you can see here an example from their study, where they found that the efficacy of something like a facial recognition technology, right out of the box from Amazon, from Microsoft, from large multinational companies who develop and deploy those systems, is often made in the image of the most powerful people in our society. So, you can see lighter-skinned males and men in the United States are more likely to be recognised, known, understood, to have those technologies perfected upon them, while Black women, darker-skinned women, see the least efficacy.
And this is very important because when we look at the deployment of these technologies, they are mostly pointed at Black and brown faces, through things like predictive policing technologies, or other kinds of surveillance technologies that get deployed into communities of colour and that mis-recognise us. And of course there are famous studies in the US of things like pointing facial recognition technologies at Black or African-American members of Congress, of our federal government, and the facial recognition technology marking Black Congress members as criminals, right, matching them up with criminal database mugshots. In my own work I have really tried to point out numerous examples, and Algorithms of Oppression is just like a study in all the failures of different kinds of technologies. Here’s one example from that book, about doing a Google search in Google Images looking for, quote, “unprofessional hairstyles for work”, and being fed back almost exclusively Black women with natural hair, who have not straightened or chemically treated their hair. I wear my hair natural 99 percent of the year, so I would be one of these pictures; I could just see myself right there. And of course then looking for “professional hairstyles for work” and seeing almost exclusively white women, white blonde women, right.

So, these kinds of systems are always interacting with, and marking, race and racial categories, even when they are touted as neutral, touted as simple tools, right. And these are the kinds of discourses that we hear constantly deployed around all kinds of technologies. Of course, we know there’s not only, Paul, as you mentioned, the deployment of things like recidivism prediction software, software that determines whether you spend your life in jail or in prison, or whether you’re let out on bail. We also have these kinds of technologies matching and sorting people in and out of opportunity. Here we had Robert Williams, this gentleman in Detroit, an African-American man who had never missed a day of work, who was just like a model human being, if only we could be as good as him. And he’s mismatched by a facial recognition system, and police come and arrest him in his home in front of all of his neighbours, his two small children, and his wife, and he’s told that he has stolen a watch from a high-end store. Only to see, while he is now missing that first day of work, that he is clearly not the person; his photograph is not a match in any way to the photograph. And yet the officers arrest him because the system has made a match. It is so much of an erroneous match, but the law enforcement officers do not even feel empowered to declare it a mismatch, and are still compelled to arrest him and hold him.

So, when we think about these things, I think about things going back a decade now. It’s just so hard to believe that for 10 years I’ve been arguing that computer code is a human construct, that technology is a social construct like race and gender. We understand things like race and gender as social constructs, but it’s just so difficult still to dislodge this idea that technology is somehow neutral, somehow apolitical, that it has no uneven application. And I take this back, still to this day, to the first study that I did, the one that launched the book Algorithms of Oppression, because I still find so many egregious searches like this. In the case, for example, of doing a search on “black girls”: this was at a time when my daughter, who’s grown now, was a tween and a middle schooler, and I was thinking about her and my nieces and just curious to see how Black girls were represented in Google Search. And I saw that almost all, you know, 80 percent of the search results that came back on Black girls, a decade ago, were pornography or hyper-sexualised content. And of course, what does this mean for girls, for children, for people who are vulnerable, for people who are not the majority in the population, who can’t control, influence, or even use gamification strategies? Who don’t have the resources, can’t search-engine-optimise, can’t optimise themselves into a better state of being understood. Writing about that and studying that for 10 years, one of the things that I’ve learned is that there are so many myths about what these technologies are, and so much obfuscation of how the algorithms work in large-scale tech companies. But there’s also something even more pernicious that I think about, and that is this idea that we will somehow solve these problems of racism and discrimination and oppression and colonisation and occupation by holding up a phone and tapping on a piece of glass.
I mean, the ludicrous notion that we would somehow solve these problems with technology, when the technology companies themselves are reticent to even take seriously the kinds of conversations we’re having today. In fact, we see this in the case of Google, which recently fired our friend and colleague Dr Timnit Gebru, one of the scholars who worked on that Gender Shades study on facial recognition. While she was heading up Google’s ethical AI research group, she found that Google’s current products in development, its new natural language processing network technology, are fraught with racist and sexist bias and discrimination, and also of course have a tremendous ecological footprint and will have devastating effects for the planet in terms of energy use. When she does the very thing that she’s hired to do within the company, she’s summarily fired for not retracting her research, her paper, and her findings. And these things tell us so much about the positionality of Black women in particular, and I underscore the role of Black women in our field, and women of colour more broadly, Indigenous women, because I’ve seen over and over how many women, people of colour, LGBTQ+ people, people whose communities are in the crosshairs of so many of these different kinds of technologies, often see the harm first and yet are often the least believed, the least listened to, not in charge, and when we do our work, are also vulnerable to firings or other kinds of pressures.

So, I just want to acknowledge, especially for my co-panellists today, you know, the stakes of our work, of our shared work, the stakes of our commitment, and doing that under these kinds of intense pressures that we feel in our communities. The other thing I want to say, and I gave myself a timer so I would be mindful of the time, one other thing that I’ll just say here as I start to close, is that I wrote a piece last year during Covid-19 for Noema Magazine here in the US. It’s a small magazine, I’m sure that you haven’t seen it, but I’ll tell you what’s in it. Which is that while these companies are operating in these ways, where there’s a resistance to engaging with the harms, and indeed we could even argue the crimes in many cases, tied to a variety of these different types of technologies and companies, we are also losing our public institutions, our public democratic institutions, that could respond and could be strengthened as the counterweight to so many of these corporatist envisionings of the world. You know, in the US, where I work in California, I can tell you that I work in the flagship public university system, the University of California, and we are just right down the street from all of the silicon corridors, Silicon Valley, Silicon Beach. And these companies don’t pay taxes for the most part; they offshore their profits, they gentrify neighbourhoods, they skim the cream of the crop of the best thinkers and students and activists and artists, people who could really help us reclaim our communities and our institutions. They use our airports, our roads, our public transportation systems, all of it, and give nothing, or very little, back, except more extractive business models and products.

And so, we’re also losing our public goods in education, libraries, public health, and public media, all the things we need, through the depletion of resources from the shared tax bases and spaces that help those public institutions flourish. And our institutions are in crisis; of course, we really saw this during Covid-19, when Black and Latinx and Indigenous communities were so powerfully harmed by Covid-19 and our communities didn’t have employer-based health care. I know everybody who ever talks to anyone in the US can’t understand how terrible our healthcare system is. It’s like a crime against humanity. The largest of the resources of these companies and these industries should have gone back into the cities and the communities where they do business. Instead, by withholding those resources and offshoring, engaging in corporate tax evasion, they leave these institutions vulnerable, and then they rush in with their products and services to somehow shore up our institutions. So instead of getting funding for graduate students and for faculty, and lowering the fees and the cost to go to school, we get Slack, and we get Zoom licences, and we get more software licences and more hardware, right. And so, it’s like starving society of the resources and then using that to exploit and make more commercial opportunity. These are the kinds of things that I think we must be thinking about and caring about.

So I’ll leave it here, and just say: the promise was that we would get more liberation, more freedom, more connectivity, more democracy with technology and the internet. And I know, because I’ve been on the internet since, I’m going to say, maybe ’88, ’89, ’90, ’91? I’ve definitely been on a computer since the late 80s, and been on the internet. So, it’s just a lot of hair dye, it’s fine, leave it. But I will tell you that these promises are so confronting, because we really have more data and technology than ever, and we also have more global social, political, and economic inequality and injustice to go with it. And I argue in my work every day that technology and data are implicated in escalating global inequality, and we have to actually think about the role of these systems and companies in exacerbating the kinds of conditions that we’re also so profoundly committed to changing. So, thank you so much. I’m going to stop there and just say I’m really thrilled and excited to get to speak with you, Professor Carlson and Dr Taiuru.

Prof Paul Henman: 

Thank you very much, Safiya. It’s wonderful to hear from you, and I think you’ve just given a really good introduction to, and positioning of, the whole symposium today, because of the way in which you’ve engaged with the questions of intersectionality: you’ve talked about race and ethnicity, you’ve talked about gender, you’ve talked about sexual minorities, and so on, but you’ve also located that in the broader political economy and its structural significance. So thank you very much for that.

I want to now turn to Professor Bronwyn Carlson, and I think the important thing that I want to bring into this engagement about automated decision-making and race and ethnicity is that the importance of Safiya’s work, and that of other African-American scholars, in demonstrating the significance of race and technology in the US is made more complex in the experience of Indigenous people, with the significance of the connection to the land and of Indigenous knowledges. And that’s why we’re bringing in Professor Bronwyn Carlson and Dr Karaitiana Taiuru. So, I’ll hand over to Bronwyn now.

Bronwyn is an Aboriginal woman who was born and lives on Dharawal country in New South Wales, and is professor and head of Indigenous Studies at Macquarie University. She is also founding and managing editor of the Journal of Global Indigeneity, and the director of the Centre for Global Indigenous Futures. Bronwyn is the author of the recently released Indigenous Digital Life: The Practice and Politics of Being Indigenous on Social Media, and is also author of The Politics of Identity: Who Counts as Aboriginal Today? So, with that, I will hand over to you, Bronwyn.

Prof Bronwyn Carlson: 

Thanks, Paul. I didn’t have any slides, because I think with 10 minutes I’ll just get going from the get-go. Thank you, Dr Noble. I have to say that your work has been a big influence on my own, as has that of a couple of others that you put up in one of your slides: Ruha Benjamin, who fabulously did a lecture for one of my classes last year, and of course Andre Brock, whose work has been quite influential. And you’ll find yourselves in that new publication.

So, I would just like to take a quick moment to acknowledge where I stand. I’m on Dharawal country in Wollongong, which is south of Sydney in New South Wales. I’d just like to pay my respects to the old people, the elders who’ve come before me, to Aboriginal and Torres Strait Islander people across this continent who continue that work, and to our future ancestors, who hopefully will learn from those who’ve come before and think about a new world, a different kind of world, where colonialism isn’t the biggest, key, most significant thing in our lives.

Look, some of the stuff that you were saying, Safiya, definitely resonates with the work that I’ve been doing here. For the last eight or nine years I’ve been working on research projects that look specifically at the everyday lives of Indigenous people here and around the world, in relation to how they play out on social media platforms specifically. I’ve gone to every state and territory within this continent that we now call Australia and talked to people in all sorts of geographical locations, and more than 99% of the participants in my research studies have spoken about the racism, discrimination, oppression, and violence that they’ve experienced online. And so, to back up that research, I decided to take a look at the media and how they report on Indigenous people and social media.
And what I found quite significantly, in the over 1,500 articles that I looked at, in coding this archive of news media specifically on Indigenous people and social media, is that the categories of racism, hatred, and abuse were far and away the most populated.

So, from police officers using Facebook profiles under fake names, to people who had this intersectional hate of homophobia and racism towards Indigenous people, to harassment and denigration of our significant individuals who are entering into political life. We have a number of Indigenous people, and particularly Indigenous women, who are now in the political space, who have received death threats, threats of rape, and other similar kinds of messages from the Australian public. And the reason this takes place is really that they have the audacity to claim their Indigenous identity in a public way, and they have the audacity to still be here in 2021. The whole way through Australian history, we can see how Indigenous people have been written out, not thought of as a people who would still be here, and I think this is really one of the key issues when we talk about technology. Andre Brock points this out in their book, where they say that brown and Black people are generally coded as people without technology, and we see that same thing here in Australia. So, the fact that we are actually online and participating in this digital life: this is a place that was never built for us, or thought of, or envisioned with us in it. And yet here we are. One of the reasons I say this, and I was thinking about it when you were saying that together we build and shift the paradigm, and about the important work being done by your colleagues as well as yourself, is that here in Australia I start to think about how many Indigenous people are afforded the privilege, right, of having some research funds to be researchers of technology, or of Indigenous people’s digital lives.
And I have been very fortunate to have received three Australian Research Council grants that look specifically at Indigenous digital lives in Australia, and that intersection of technology and everyday lives; I'm still one of the very few people who look at this.

So, I sit and think about that, because I'm often called to talk about it, and I wonder who the other people are, like who are the other Indigenous people here who do that work? And it's so few people, because they don't have funding and they don't have the privileged platform to be able to engage in that kind of research. So this stuff goes unnoticed. You know, the kind of depth of our experiences online is largely missing from the scholarly work that's written, and it's generally the same couple of voices that get to speak on it. And then I think about who the Indigenous people in the tech industry are, right, who are part of influencing how this stuff operates. And so I was invited, probably because of the work that I do, to a couple of the tech companies here in Sydney. So I went along, and I was looking around at these flash buildings and all these very important people; you know, they've got their own cafe and everything in there, it's pretty schmick. And they give you lots of cool stuff as you leave, fancy pens and notebooks and the like, and there's a snack bar straight out of some sort of movie about Google, or something from the US. So, you go there and I say, so how many Indigenous staff do you have here? And then there's crickets chirping, like, what? Yeah, how many Indigenous staff do you have here? And then they tell me, we're really committed to, you know, building diversity within the workplace. We're really committed to the Indigenous people of the lands we're on. Yeah, so, right: how many Indigenous people work here? Well, we don't have anyone right now, but we've hired somebody to paint a mural. And I'm like, 'oh, awesome'. And then I go through all the gates to get out, and on every gate there's a black body guarding that gate, and these are all people from the US who are guarding the gates as you go through this building. So that's really interesting to me.

So there are no Indigenous people who work there, and I've been invited into these tech companies for their reconciliation breakfasts and to join their reconciliation action plan groups and all this kind of stuff, but my research and my intellectual contribution to this space is never a consideration; I'm invited as an Indigenous person to help make a fancy document called a reconciliation action plan, which you might not have in the US. They're documents that say this company is doing things that are culturally respectful to Indigenous people, and they're fancy documents. And largely they're ignored. They just look good; you know, they have a lovely dot painting on them, so they look fabulous. So, there's very little happening. So then I ask other tech companies, who are more interested in a pipeline of people, more in the behind-the-scenes tech stuff like Google etc.: you know, how do we create things? How do we operate algorithms, all of these things? So the behind-the-scenes stuff, and again there are no Indigenous people. So when people talk about algorithms and bots as if these are, you know, actual entities that are racist in and of themselves, people forget that these are actually produced by, generally, cisgender white men, who create them in their own image, and then they go forth and create havoc and hell upon the lives of Indigenous, black, and brown people. And in Australia the most targeted groups online are Indigenous women and Indigenous LGBTQIA community members. They're the most targeted, and then there's an intersectional targeting based on their gender and based on their identification as an Indigenous person.

So, with things such as facial recognition, the conversation between black and white here is a little bit more complicated, as it is in the US too. Here, Indigeneity is one of the most hated identities in these online spaces. We are targeted constantly, and it has nothing to do with how we look; it has everything to do with that audacity of speaking up and claiming to be Indigenous in 2021, when we live in a space, in a continent, that has written us out of that history, that future. And so we're not welcome in this space, and so the internet has now become a hunting ground for us; these are very real experiences of Indigenous people in these spaces. I don't know, how am I going for time, Paul? I didn't set myself a timer, sorry. Okay, about two minutes, excellent. Okay, so for us it's really difficult. So, behind the scenes there's all of that, where these things are built and produced and thought about outside of our domain, outside of thinking of us being included; and then there are the actual everyday lives of ourselves engaging on social media particularly. And I want to just say that it's also a space that we have taken back and taken some power from, where we can come together and we can speak to each other. And where we can gather in larger numbers than ourselves and join a global network of Indigenous people. But that comes at a huge cost, as Safiya was talking about. That cost on our lives is huge. We see high rates of self-harm and even suicide associated with large amounts of time on social media platforms, and that's because every single day you're dealing with a hatred that is so intense in this place that it is hard for me to even explain to you what that feels like. And I've experienced that myself. And then we ask the tech giants how they can better work to ensure some sort of sense of safety for us online, and all of that work gets shipped offshore to people who have no understanding of the kinds of hatred and racism we face as Indigenous people in this country.
And I'll use a really simplistic explanation: in this country Indigenous people are often referred to as primates, as apes, as animals, in terms that would not be picked up by any kind of algorithm or any kind of program that is looking for elements of dehumanising a particular people. So those kinds of things are not picked up.

So, if the regulation of these tech giants is not, you know, in the place in which they are producing this technology, and doesn't include the people who will be most impacted, then it will continue. But the truth of it is, it's about money. It's not about our welfare, it's not about our safety, and it's not about our lives. It's about collecting clicks for money, and the more controversial the content, the more money is actually made. So that is the bottom line of it. Humans and people who are considered others, or 'othered', in the space are not necessarily the focus. Thank you.

Prof Paul Henman: 

Thank you very much Bronwyn. I appreciate those insightful words, drawing on your own research about the experiences of yourself and other Australian Indigenous people. It's obviously a harrowing experience to be in that online space and to try to capture that, and I thank you for sharing those words.

Our last speaker is Dr Karaitiana Taiuru.

Karaitiana is an interdisciplinary Maori academic and activist. He has several Iwi affiliations and originally worked in the ICT industry. He has been an advocate and proponent on digital Maori rights, cultural appropriation, data sovereignty, digital colonialism, language revitalisation with technology, and Maori representation in intellectual property rights. He has recently completed his PhD thesis regarding Maori and international intellectual property rights with gene technology, identifying genetic data as a taonga, and a data sovereignty right. Karaitiana Taiuru is currently a research fellow at the University of Otago, Christchurch, so welcome to you from over the ditch.

Dr Karaitiana Taiuru:

Well, thank you. First of all, from my tribe in the South Island of New Zealand, I just want to acknowledge the Boon Warrung people, their spirits and their ancestors. It's amazing listening to my co-speakers; their stories just mirror what happens in New Zealand. It's shocking in New Zealand. I mean, I'm privileged in the fact that I can pass as Caucasian or as Maori, and I'm male. My female colleagues who are Maori are basically targeted, systematically targeted, by white supremacist women in encrypted chats. They have all their personal information gathered, they're targeted in their homes and at their workplaces, and more often than not the New Zealand police can't do anything about it, because there's a fine line between breaking the law and being a nuisance. A group of Maori activists here in New Zealand essentially had to create their own underground group and ask for funding to buy security cameras and to attend events with physical security protecting them. And I heard the comments about Maori being targeted online, especially in media articles when we talk about anything that's pro-Maori, or about equity and equality. I think in the last couple of years the hatred has just grown and grown, and since COVID-19, since New Zealand's been in lockdowns this year, we've seen white supremacist groups trying to partner up with radical Maori groups or pro-Maori groups and promote through social media that Maori should not get the COVID vaccine. Yet statistically Maori are more likely to die from it. So there's a whole lot of issues I see happening all around the world with minority peoples, and it's hard to find a solution.

So just in New Zealand here, I guess as an Indigenous people we have some protections that other Indigenous peoples don't have. New Zealand has two founding documents. One is the Declaration of Independence, and that flag up behind me is one of those flags; it is a flag that represents Maori independence and states that Maori should govern themselves. And we also have the Treaty of Waitangi, which is recognised by the New Zealand government, and that basically says that Maori did not cede sovereignty and that we do have certain rights. Some of the key aspects of the Treaty of Waitangi in terms of digital AI, algorithms, biases, etc., are that the treaty itself gives Maori the right to control their own taonga, or treasure. We've identified that data is a taonga; therefore the New Zealand government is bound by the Treaty of Waitangi with data. Furthermore, the treaty offers three key principles that the New Zealand government is bound by: the right for Maori and the government to co-design, co-govern, and co-manage projects and anything of benefit to Maori, and the government is also required to ensure equity for Maori. Of course, a lot of the racists and non-Maori say, well, how can data be a taonga? So, our traditional stories talk about all the data of the world being taken by one of our deities from the higher realms of the spirit world and brought down to earth. So that gives all data a genealogy back to our creators. That's the first aspect of acknowledging data as a treasure. From a Maori point of view, anything that I touch, this Zoom conference, a photo, has a part of my life essence in it; so from a Maori point of view, everyone who's interacting with me right now has a part of my mauri, part of my life force. And that's also sacred. So that's also a treasure. So our data, regardless of whether it's been anonymised or not, has a mauri, a life force. This has led us to be staunch advocates for Maori data sovereignty.

Very similar to Indigenous data sovereignty, or countries' sovereignty over their own data, we've got a movement pushing for Maori data sovereignty. So, wherever we have any data that's either biological or digital or in any format, any information that is about Maori, by Maori, or for Maori should fall under Maori data sovereignty principles. And those Maori data sovereignty principles rely on the two founding documents of New Zealand, including the Treaty of Waitangi, and of course the United Nations Declaration on the Rights of Indigenous Peoples. It considers all of those and forces our government to accept that Maori data sovereignty must be considered.

So, at the moment we've got some government departments that recognise Maori data sovereignty; we've got a number of other government departments saying, well, how do we sort this out, what do we do; and we've got other government departments who recognise that data is a taonga. So there's a significant step going forward to address Maori data sovereignty. Now, one of the key benefits for Maori: as in America and Australia, the New Zealand police force has been researched, and we know that the police force has biases against Maori. That's no question. We also now have facial recognition systems being brought over here that are not tested on our Pacific peoples; they are tested on European Caucasian men and women. So, using the Treaty of Waitangi and Maori data sovereignty principles, we're applying pressure on the New Zealand police to actually seriously consider those facial recognition systems, and the New Zealand police have actually been quite proactive in addressing those biases and working with academics and with Maori to try and figure out how to stop them. We also have very strict privacy laws in New Zealand compared to, say, the States or Australia. So, a principle of Maori data sovereignty is that our data should not leave our shores. Unfortunately, at the moment the key rebuttal to that is that it's too expensive to host data in New Zealand, but we're seeing Amazon and Microsoft bringing data centres over here, and we're seeing a New Zealand company trying to set up data storage here as well.

So, back in 2002 we Maori realised that the internet was designed for non-Maori; it totally ignored Maori in everything. So, a group of us petitioned to have our own representative domain name, .maori.nz. The reason we did this is because, unlike a lot of other Indigenous peoples in the world, we didn't have an identifier on the internet to make the internet ours. And the bias and the abuse and the bullying by senior members of the New Zealand tech community was absolutely horrifying. We were blocked at every step, and I guess it's probably only been up until about maybe a year ago that that organisation actually started to recognise Maori obligations and rights, and the fact that we have a lot of online abuse of our Indigenous peoples.

In New Zealand we also have a largely government- and community-funded organisation that deals with online abuse, and we also have New Zealand legislation that makes online abuse illegal. The problem with that legislation is that it wasn't co-designed with Maori. So, it was very Eurocentric, and it's almost impossible to claim or to prove that you've been abused or harassed online. And it also doesn't consider any race laws: you can be racist, you can make any sort of comments or send any images you want, but it's not against the law. So, in one way, I guess a lot of Indigenous peoples could look at New Zealand and, you know, think we are very progressive and we've got lots of protections. But I think from the ground here, we've still got a long way to go, and I guess, like with everyone else, it will involve proper partnerships and co-design with Indigenous peoples to identify how best to prevent those biases. I think that was 10 minutes.

Prof Paul Henman: 

Thank you very much Karaitiana Taiuru. That's been really insightful, and it demonstrates the differences between Australia and New Zealand in having those treaties, those founding documents that give some leverage that we don't actually have. I want to hand it back to Bronwyn and Safiya. Are there any comments that you would like to make in light of the other presentations?

Assoc Prof Safiya Noble: 

Well, I'll just say one thing quickly. One thing that is so apparent to me is how desperately we need sovereign rights models to proliferate into spaces where sovereign rights are profoundly contested, like in North America and other places. And I think for many colonised people, for example, who are displaced around the world and are in a kind of diaspora, it has been very difficult to reach for sovereign rights models. In the United States, for example, black people, LGBTQ people, people who are not Indigenous, reach for civil rights models or human rights models for rights-based protections, but there are profound limits to those as well. And what I think the possibility is for is so much more: the kind of cross-continental research and possibilities and frameworks that we need to help stand up. Because as I look out, I see so many different kinds of commissions and boards and people trying to talk about rights-based models and protections for the public, but they really don't centre the kinds of principles that the two of you are talking about. And so, how do we take the evidence of the profound disproportionate harms that come to our communities, and then negotiate and implement the kinds of much broader, paradigm-shifting protections that I think, you know, we're talking about today?

Prof Paul Henman: 

Thank you Safiya. Bronwyn, did you have any words that you wanted to say?

Prof Bronwyn Carlson: 

Just to thank both of the other speakers. Their comments make me think about how some of that could impact, or is useful, here in Australia. One of the issues for us, and there's a growing movement here for treaty and treaty negotiations, is that 'Aboriginal people' is a category that doesn't actually exist. There are no 'Aboriginal people'; there are nations of people within this continent, and engagement with those sovereign peoples needs to be had in terms of treaty. And so this idea of asking the white Australian government, and the Crown, to engage in treaty is really an interesting one here. That is not actually considered a model that people really want to pursue. It really ought to be the other way around, where non-Indigenous people need to engage with local peoples in terms of a treaty of being on country, and of sovereignty of place. So it's interesting, but I do see some movement there, and there are a lot of Indigenous people fighting in spaces for treaty here. Being one of the only places that doesn't actually have a treaty with colonisers sets Indigenous people here at a massive disadvantage compared to many other nations around the world. Also, here Indigenous rights are not enshrined anywhere at any level of government, and so we are really quite powerless in that sense. We of course have agency and exert that power in other ways, but within this legal system it leaves us at a major disadvantage, and when we're talking about the tech world, we have no protections there.

Assoc Prof Safiya Noble: 

Yeah, it's really important what you're saying, because so many of the tech companies really are operating as if they are nation states themselves, right? And so how will we contend with, curb, contain, roll back, and redress their posture and positions, and their impositions? And, you know, that could be a real opportunity, in the sense that there are so many struggles around doing that with different nation states; but if the tech companies assume the posture of a nation state, then that seems like something really worth talking about more, and thinking about more.