EVENT DETAILS

Policing Insecurity: Debt, Fraud, Data and the Automated Welfare State
8 November 2022

Speakers:
Prof Paul Henman, UQ
Prof Virginia Eubanks, University at Albany
Watch the recording
Duration: 1:34:01

TRANSCRIPT

Paul Henman

Welcome everybody on this bit overcast and rainy day in Brisbane, and welcome to our Zoom attendees. My name’s Paul Henman and I’m from the University of Queensland School of Social Science, and a member of the ARC Centre of Excellence for Automated Decision-Making and Society. And it’s a really great pleasure to introduce this event and to have our esteemed guest speaker, Virginia Eubanks, join us to talk to us about policing insecurity: debt, fraud, data, and the automated welfare state.

But first I want to acknowledge the traditional owners of the land on which we are meeting today, the Turrbal and Jagera people. I acknowledge their ancestors past, present, and emerging, and recognize that this space was always a space of learning and coming together. And it’s only apt that we continue that tradition of learning and coming together.

Today’s program is going to be rather simple. I’m just giving acknowledgments and recognizing that we are recording this event. Then I’ll hand over to Professor Greg Marston, who will give a welcome, and then Virginia will give our lecture. We’ll have time for questions and discussion. And afterwards we will have some drinks, some light nibbles and snacks, and an opportunity to mingle with each other, and also to talk to Virginia and get her signature on any books that you have brought. And hopefully this will be an opportunity to really explore these issues in a way that does justice to Virginia’s work and to the issues that she’s really given visibility to. But first of all, I just want to acknowledge also that this is Anti-Poverty Week. This actually wasn’t planned; it just happened to be Anti-Poverty Week. And in the midst of the Robodebt Royal Commission that we’ve just had, and the ending of the Indue cashless debit card, the questions of digital technology come through, and I think that’s really important. Greg will explore that a bit further, and Virginia will speak to her research about that.

Before I hand over to Greg, I do want to acknowledge that this event is being supported and organized by the ARC Centre of Excellence for Automated Decision-Making and Society. It’s based at RMIT with nodes at UQ and eight other universities. This event is also supported by the UQ Digital Cultures and Societies initiative, and Greg will speak to that. I also want to acknowledge particularly Amy Boike, who’s sitting at the back, for all her hard work, and the stress and grey hairs that have emerged from organizing this event. So, thank you very much, Amy.

So, with that I want to hand over to Professor Greg Marston. Greg is the Deputy Dean of the Faculty of Humanities and Social Sciences at the University of Queensland. I’ve known Greg for about 20 years, and we’ve detoured in and out of the institution together, so it’s a pleasure to have you come up. And I welcome everybody to the event.

Greg Marston

Are we able to move on from that unflattering photo of me up there, though? Well, that’s better. Virginia, that’s a much better photo. I think Paul and I worked out that the division of labour was that Paul was going to say a few words about how fantastic Virginia is, and I’m meant to say a few words about how fantastic the faculty is in this space. As Paul mentioned, there are a few exciting things going on that are adding capacity and capability, particularly around digital research, both as an object of study and also in terms of digital research methods, which is fantastic. And I’ll say a few things about that. But also I want to mention and acknowledge Virginia and her work. Virginia was here before, in 2019 I think, which followed her award-winning book, Automating Inequality. And she gave a fantastic, passionate address in relation to that work. And Virginia continues a lot of that work in relation to narrating people’s histories and experiences around a whole range of technologies. And that’s part of what Virginia is doing here on this trip. We’re certainly working her hard. I was in a workshop this morning with Virginia with a bunch of engaged HDR students and staff, thinking about how to write publicly and how to do that in a way that engages a range of publics. I was there for the beginning and the end of that discussion, and it was really terrific to see the tools and tips and techniques and wisdom that Virginia was able to impart to people in the room. So, I know that everyone got quite a lot of value out of that. And then Virginia’s doing other things for ADM+S in Melbourne, and doing some drone fieldwork here in Australia, but also in New Zealand, while she’s here. Virginia, as I said, makes a contribution that’s important in thinking about the impact of policies, thinking about the impact of digital technologies, and their unevenness – particularly around the distribution of the burdens and benefits of those technologies, the way in which some of those technologies liberate but also, you know, surveil and make people’s lives very difficult in lots of ways.

So, it’s really important that the work of ADM+S and Digital Cultures and Societies pays attention to that. I think it’s one of the things that can make UQ’s contribution unique, in the way that we can bring the humanists and artists and social scientists together on those questions of power and inequality. And Virginia’s work has, as I said, been really important in tracking those impacts, particularly for people at the intersection of class, race, gender, and ethnicity. Also, just about Digital Cultures and Societies and ADM+S, as I said: what’s also exciting at the moment in the Faculty is that we’re able to not just bring together researchers across disciplines within the faculty and across the university, but we’re also able, with the support of the university and the faculty, to employ new staff, which has been fantastic. Digital Cultures and Societies, which is led by Nick Carah and which had a launch earlier this year, has just made five appointments – five postdoc appointments – across a range of disciplines, some of whom have just started this week. I think Luke’s here. Welcome, Luke. And others will be arriving over the coming months.

ADM+S also works with a team of researchers and PhD students to bring that work together as well. And so it’s fantastic to see that capability, as I said, coming together, but also to acknowledge that it really is humanists, as I said, and social scientists and artists, that need to be at the centre of these discussions. Putting people at the centre and giving people agency, rather than leaving the technical future and the digital future in the hands of tech entrepreneurs and engineers. Because that’s really important in terms of shaping the sort of society we want to live in.

The other thing I’d say about UQ entering this space in which we’re making a contribution is that UQ has also updated all of its graduate attributes – which I’m sure every student in the room is aware of, because these are the things we pay attention to in ECPs. One of those has been around digital literacy, which obviously is important in terms of thinking about not just critical thinking, but what it means to think critically about these new technologies. And one of the ways in which we’ve tried to shape that capability is to say, well, it’s actually not so much about digital literacy in the present age – you know, what digital literacy used to mean was thinking about how to empower people to use technologies like computers well. But really, if you think about what our disciplines are making a contribution to, and about the students and their capabilities, it’s about an algorithmic literacy. And there’s no better person, I think, than Virginia to speak to that. Because it’s not so much about how we use technologies, but about how technologies are being used on people. And that algorithmic – struggling to say the word now, you’d think I’m the one with jet lag, but it’s actually Virginia – that algorithmic literacy is about how to empower people to think about algorithms: how to use them, how to work around them, and also obviously to know when it’s important to put a human in the loop. Particularly decision-making loops where the stakes are very high, in cases like child protection, policing, housing decisions, and social security – which is where Paul works a lot, in social services, and which is what the node at UQ that Paul leads is very much focused on. So, I think it’s really important for the university to acknowledge that it’s not just about digital literacy; it’s about algorithms, and making them much more transparent, and making people much more engaged in those decisions.

So that’s all I really wanted to say. As I said, I’m really looking forward – excited, delighted – to have Virginia here. Again, as Paul and I and others have acknowledged, we’re working you very hard. This is late in the day, on a long day already for you. So, without any further ado, am I handing back to you, Paul? Or straight to Virginia? Over to you, Virginia. Thank you, welcome.

Virginia Eubanks

Thank you so much for all the incredible words about the work, and for the work that you guys are doing here. And look at how many of you there are here, that’s fantastic. Hey, thank you and hello to everybody in Zoom land. I will try to remember you and acknowledge you as well – where’s the camera? There it is, there it is. So, yeah, there’s so much to say.

I want to get started by telling you just a little bit about me and about where I find myself in this work right now. I actually probably won’t speak a lot about the book that everybody so generously said wonderful things about, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. I’m going to talk more today about what has happened in my work since then. Something that’s really important to understand about me and my career, and how I come to this work, is that I started out as a welfare rights organizer, which I did for about 20 to 25 years. In the middle of my life I became a researcher and an academic, and got my PhD in science and technology studies in Upstate New York, in a small city called Troy, which is where I live and my favourite place in the world – though Brisbane is also lovely. And since then I have engaged in a number of different ways of researching, thinking, and talking about the things that are most important to me, including technology and economic inequality. So, I have worked not just as an academic researcher, but increasingly as a long-form investigative reporter as well. And I’m actually going to talk mostly about my reporting over the last couple of years today.

So, some of it may sound different from the way academics usually talk about their research, and I’m actually really happy to talk about that if it’s something that interests you, as we get to questions and answers and conversation at the end. Because I do think, actually, how you come to this work matters for the kind of work you do. And I’m super happy to talk to you about why I’ve made the choices I’ve made and what it’s meant for the way I think about my work and the world.

So, I just – good evening to everybody. It’s rainy, it’s the end of the semester, and I’m sure you have lots of other things you could be doing with your time, also all of you in Zoom land. So, I really appreciate you being here, and I appreciate your attention and your interest in these things that are really important to me. I also want to thank the ARC Centre of Excellence for Automated Decision-Making and Society for the opportunity to be here. Particularly Paul Henman and Amy Boike, who did a lot of really hard work to get me here, to be with you safely in these extraordinary times. This event has been rescheduled many times, as you might imagine, and I’m so glad that we’ve been able to find time to be here together. And thank you, Greg, for the great introduction as well. Thanks to all of you, too, for spending time here today, whether in person or online, to talk about automated social services, human rights, and what I’m going to call the left hand of policing. That won’t make any sense now, but hopefully by the end it will.

What that means is a conversation that I think has gotten more and more urgent over the last three years. And weirdly, my understanding of the subject really began, in many ways, with a trip I took to Australia back in March of 2019. So, I’m coming full circle with this trip, in a way, which is really interesting and fun for me. But more on that later. We’ll get back to that.

I want to tell you a couple of stories first. I want to tell you three stories about people and the impact that automated decision-making has had on their lives.

So, the first story I want to tell you is about Dreama Richardson. This is Dreama. This comes from work I did on a long-form piece for the Guardian called ‘Zombie debts are hounding struggling Americans. Will you be next?’ Some of this will sound familiar to you, as a sort of cross-national comparison. So, in September of 2018 the Illinois Department of Human Services sent Dreama Richardson, here, a letter claiming that she had been overpaid $2,500 in cash assistance and $1,632 in food stamp benefits over a period of 10 months, 30 years ago, in 1988 and 1989. The state, in this letter, argued that she had been given the benefits in error. They charged that Dreama had failed to declare income that her then-teenage daughter, Star, had made working part-time at a Taco Bell. And Dreama had 60 days to pay up, the letter said, or the debt would be referred to the federal Treasury Offset Program, TOP – I’ll talk about TOP a lot today – which could withhold a portion of her social security check, her only source of income, until the alleged overpayment was satisfied.

Is this sounding familiar to anyone? Yeah, something a little familiar. There was, however, a problem. This will also sound familiar. Star hadn’t been living with her mom, or contributing financially to the household, in 1988 or 1989. The alleged overpayment was miscalculated, and the state and federal government were basically threatening a 72-year-old woman with destitution over a debt that never existed. So, when Dreama received the letter she almost agreed to pay. At the time I talked to her, in 2019, she was beginning to suffer from some memory problems. She’s on a fixed income, and the notice really frightened and confused her. So she said to me in 2019: I didn’t understand. They explained it to me and I still didn’t understand. But how do you argue with government? How do you prove something from 30 years ago? So, after reading a piece in the Chicago Times by another reporter describing other women in their 70s and 80s throughout the state receiving these kinds of notices about long-overdue public assistance overpayments, Dreama’s daughter Star got her a pro bono attorney, who filed an appeal and submitted a number of documents supporting her case, including a signed affidavit from Star’s ex-husband attesting that Star was not living with her mother during the time in question – because she was, in fact, sort of on the down-low, living with him. She said this was a rather uncomfortable conversation to have, and not something she would have chosen to sign an affidavit about, but she did. The hearing was held. The Illinois Department of Human Services sent a casework manager to testify at the hearing, but they didn’t submit any evidence at all, because the state didn’t have paperwork from that far back. All the same, hearing officer Sheila King found in favour of the state. She wrote in her decision that Dreama had failed to file her challenge within 90 days of the original decision in 1989, which is when she supposedly first received notice of the department’s action, and that because of that the Bureau of Hearings did not have jurisdiction. She dismissed Dreama’s appeal, allowing federal debt collection to move forward.

So, to her daughter Star, the result felt a lot less like lawful debt collection and a lot more like intimidation. Star said to me: you don’t have the means to defend yourself. And she noted that it seemed like the government was extorting money from some of the most vulnerable members of society, from our elders. And she said, look, they’re just big bullies. Okay, that’s story one.

Story two. This one is not about a person so much as about a company. This is a story about a company called Pondera Solutions, and its explosive growth during the pandemic. So, a couple of years later, in December of 2020, in California, the Employment Development Department, EDD – basically just unemployment with a fancied-up name – was under pressure to address looming fears of Covid-era fraud in the state’s unemployment programs. So the agency hired this private company, Pondera Solutions, a California company, to review 10 million claims that had been paid since the pandemic began. Pondera’s software – you’re seeing screenshots from the software there – is called FraudCaster, combining fraud and forecaster, of course. And it flagged 1.1 million claims as suspicious, and the agency stopped payment on all 1.1 million of those accounts without notice, potentially breaching the Social Security Act. The company, Pondera, claims that FraudCaster – this is a sort of quote from their promotional materials – utilizes Google’s enhanced analytics and geospatial technologies to prevent improper payments and save taxpayer dollars by focusing on collusive activity to detect, predict, and prevent public benefits fraud.

So, the company explains that its software, quote, analyses individual claimants and filers to detect individual anomalies, and makes extensive use of social media analytics to produce hot lists.

We’re just going to love and live with the door sound. It’s just like a symbol. Just like, okay. So, in fact, the company claims that it can rank-order every single applicant to a public benefits program based on their likelihood to commit fraud. Like: first most fraudulent person in California, second most fraudulent person in California, etc. The software, FraudCaster, produces a scorecard, and you can see that up the top left – that’s their fake one. Gerald Ford – he was a president of the United States – is like an 89 on the fraud scale, out of 100, for example. He was a Republican president, I just would like to point out. So it’s interesting; it might say something about them. So it produces the scorecard based on basically the past three years of state program data, which generates a fraud risk score for everyone in the system, for their likelihood of committing fraud, waste, or abuse. Pondera, the company, suggests that for, quote, programs with large numbers of participants, FraudCaster will likely generate thousands of flags each time the data is run. Meaning that the software will rank-order vulnerable families for investigation, for benefit suspension, and potentially even for criminal or legal action.
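To make the mechanics concrete, here is a minimal sketch of what rank-ordering a caseload by ‘fraud risk’ amounts to. The feature names, weights, and flag threshold are invented for illustration; Pondera’s actual model, data sources, and cutoffs are not public.

```python
# Minimal sketch of risk-scoring and rank-ordering a caseload.
# Feature names, weights, and the flag threshold are invented;
# Pondera's actual model and data sources are not public.

FLAG_THRESHOLD = 40.0  # invented cutoff

def risk_score(claimant: dict) -> float:
    """Return a 0-100 'fraud risk' score from arbitrary weighted features."""
    weights = {
        "ip_shared_with_other_claims": 40.0,  # invented feature
        "miles_travelled_to_retailer": 0.5,   # invented feature
        "prior_overpayment_on_file": 25.0,    # invented feature
    }
    raw = sum(w * claimant.get(feature, 0) for feature, w in weights.items())
    return min(100.0, raw)

claimants = [
    {"id": "A", "ip_shared_with_other_claims": 1, "miles_travelled_to_retailer": 12},
    {"id": "B", "prior_overpayment_on_file": 1},
    {"id": "C", "miles_travelled_to_retailer": 3},
]

# Scoring plus sorting produces the "first most fraudulent, second most
# fraudulent..." list; the cutoff turns scores into investigation flags.
for c in sorted(claimants, key=risk_score, reverse=True):
    score = risk_score(c)
    print(c["id"], score, "FLAGGED" if score >= FLAG_THRESHOLD else "ok")
```

The design choice worth noticing in any system of this shape is that the sorting always succeeds: someone is always first and second most fraudulent, and a fixed cutoff will keep generating flags every time the data is run, whether or not any actual fraud exists.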

So, in July 2021 – this is just half a year after they hired Pondera – the Governor, Gavin Newsom is his name, became really frustrated because the state had a huge backlog in unemployment: 1.7 million applications that had gone unprocessed for months during the pandemic. So he formed a Strike Team to investigate how this backlog had gotten so bad. And the team released an exhaustive report. They wrote, in part: in interviews and observations, stories and anecdotes about fraud and/or suspected fraud were often used to explain why the Employment Development Department could not act quickly to avoid the growth of the backlog.

There has developed at EDD a culture of allowing fear of fraud to trump all other considerations. So, the Strike Team also investigated these 1.1 million potentially fraudulent claims that Pondera had identified, and they found that in fact 600,000 of those were legitimate claims. That’s a false positive rate for the software of 55 percent. This is not a good false positive rate, right. Though Pondera has since been bought by a new parent company, Thomson Reuters, it continues to promise that it can efficiently and effectively risk-rank every applicant to public programs. In fact, the system turned out to be less accurate than flipping a coin. Right, worse than flipping a coin. Summing up the report, the Strike Team wrote: while it’s certainly EDD’s job to fight fraud, it’s also EDD’s job not to allow the fight against fraud to interfere with the delivery of benefits to legitimate claimants.
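To spell out the arithmetic behind that coin-flip comparison, here is a worked check using only the two figures from the Strike Team report as described above:

```python
# Worked check of the 55 percent figure, using the numbers above.
flagged = 1_100_000    # claims FraudCaster marked as suspicious
legitimate = 600_000   # of those, later found to be valid claims

false_positive_rate = legitimate / flagged
print(f"{false_positive_rate:.0%}")  # -> 55%
# A flagged claim was more likely to be legitimate than fraudulent,
# which is what "worse than flipping a coin" means here.
```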

That’s story two. How are we doing? We’re all okay, still good? Twinkle fingers if you’re good. Okay, good. Alright, third story.

So, late in Barack Obama’s second term, the administration passed legislation requiring that a new tool called electronic visit verification, or EVV, be implemented to manage all in-home personal care services paid for by Medicaid, which is the program in the United States that helps poor and working families afford health insurance or provides for health care costs. Electronic visit verification is usually an app that’s downloaded to a home care worker’s smartphone. It collects information about when and where they clock in and out of shifts, and it confirms which tasks they completed on their shifts. So, in April of 2021, under a federal deadline to implement electronic visit verification or lose a percentage of its Medicaid funding, Arkansas – which is a state sort of in the mid Deep South of the United States, so a largely rural state – began to roll out electronic visit verification. Six weeks later, in May, I journeyed to Arkansas to visit with people with disabilities, their loved ones, and their care workers, to see how it was going. And I wrote about this in another piece for the Guardian; I think the headline begins ‘We don’t deserve this…’ – I don’t remember the second half of the title. In Arkansas I met Dante Walker, who had paid more than a thousand dollars out of his own pocket to pay his care worker, when the implementation of the new technology went so badly that caregivers’ pay checks were delayed for weeks or even months.

I met Nancy Morrell, a family caregiver who saw electronic visit verification as a dismissive and frustrating attempt to quantify her labour, requiring her to account for minute portions of the eight hours a day for which she is paid, even though she provides 24/7 care for her older sister in her own home.

And I met this couple here, Melissa Harvill and Kevin Hoover, in Greenbrier, which is about 40 miles north of Little Rock. From the front porch of the log cabin that Kevin and his dad built with their own hands, Melissa pointed to an invisible boundary around their property that limits their movements. She said to me – this is a quote – ‘the metal building down there, we know we can go that far. And to the end of the driveway.’ And she and Kevin joked that they were living inside an invisible dog fence for humans, right, those ones that you put around your yard if you don’t want your dog to run into traffic.

So, tech companies and lawmakers promise that electronic visit verification will increase efficiency and accountability in in-home care, and of course they promise it will reduce fraud, waste, and abuse in Medicaid-funded programs. But the tool has really been a catastrophe for a number of people in Arkansas and across the nation. Advocacy groups warned from the start that EVV would erode client autonomy, make home care work more difficult, and threaten the progress of the disability rights and independent living movements. And for Melissa and Kevin, the new system, they told me, felt like being under house arrest. The EVV application incorporates GPS to verify Melissa’s location whenever she’s providing paid care services. Again, this is only for about six to eight hours of her day, even though obviously they’re partners, they live together, and she’s providing care 24/7 to Kevin.

Arkansas’s electronic visit verification also includes a feature called geofencing, which basically establishes a maximum distance around a client’s home inside which a care worker is allowed to clock in and out. And if you clock in or out outside that sort of invisible fence, you get flagged as non-compliant. Despite the experiences of people like Dante and Nancy and Melissa and Kevin, Arkansas and its private contractor, a company called Palco, claim that the federal law requires that any EVV system record the location of services as they are provided, so the system must identify when those services are being provided away from the client’s home. As for the exception notices, the non-compliance notices that people are getting, and that caregivers, clients, and agencies often perceive as errors, the state argued – and this is basically from a prepared statement sent to me when I wrote them a letter saying, I’m writing about you and this is the story I’m going to tell, would you like to respond? this is part of their response – that these messages are simply informational messages, and they don’t mean that time outside the home will not be paid. But the National Council on Independent Living conducted a survey of home care recipients and their care providers across 36 states in 2020 and found that a third of their respondents said that they, quote, stay at home more often than prior to EVV use, due to fear that geofencing limitations will flag a visit as fraud or cause delay in, or loss of, provider wages.
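Mechanically, a geofence of this kind is a very simple test. Here is a minimal sketch of the kind of check an EVV app could run at clock-in; the 300-metre radius is an invented value, and Palco’s actual implementation and thresholds are not public.

```python
# Minimal sketch of an EVV-style geofence check at clock-in.
# The 300-metre radius is an invented value.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

GEOFENCE_RADIUS_M = 300  # invented threshold

def clock_in(worker_lat, worker_lon, home_lat, home_lon):
    """Flag the visit when the clock-in happens outside the invisible fence."""
    dist = haversine_m(worker_lat, worker_lon, home_lat, home_lon)
    if dist > GEOFENCE_RADIUS_M:
        # Recorded as an exception / "non-compliance" notice for review.
        return f"flagged: clock-in {dist:.0f} m from client home"
    return "ok"

# A clock-in from a store across town is flagged as non-compliant,
# even though care is genuinely being provided there.
print(clock_in(35.0025, -92.3900, 34.9900, -92.3900))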

So, EVV has really eroded both Melissa and Kevin’s autonomy. Before the app they were always on the go. Melissa could take Kevin to therapy, grocery shopping, to see friends. She could log an hour to drop him off at a medical appointment and also run errands while he was in with the doctor. And she said to me: and now we can’t. You have to be at home to clock in and clock out. So, Karen Willison, who’s the disability editor at a health website called The Mighty, sums up the couple’s frustration. She said, “electronic visit verification is the equivalent of putting an ankle monitor on people with disabilities and telling us where we can and can’t go.”

So, those are our three stories. We’re about a third of the way through our time. How are you feeling? So far so good? Twinkle fingers if you’re okay. Alright, fantastic, okay.

So, great stories, but let’s talk about my debt to Australia, right. This is me, really sweaty, in Melbourne in March of 2019. That’s why I’ve given up on jackets and official clothes. I actually think you can literally see the sweat on my shirt.

So, this is me in Melbourne in 2019. When I first visited Australia it was right after the publication of the book that Greg introduced my talk with, Automating Inequality. And it was just a couple of years after a whistle-blower warned the Guardian that there was this new automated system in Australia being rolled out to assess and recover debts from hundreds of thousands of Australia’s most vulnerable families. And the whistle-blower warned that the system itself was fatally flawed, that there were some very deep mistakes baked into the code.

By the time I got here there was really extraordinary work already happening around Robodebt, and a significant chunk of the time I spent in Australia was spent speaking with the Australian Unemployed Workers’ Union and ACOSS, members of the Not My Debt campaign, Victoria Legal Aid, and path-breaking journalists like Asher Wolf, and talking to them about Robodebt – often over beer, which was great. And when I got home I started to wonder: is it happening here? And in October, after about a year of interviews and research and public records requests, and all that stuff, I could show that it was.

So, I wrote in that piece about Dreama Richardson and her family for the Guardian that predatory policy changes and high-tech tools had sort of turbocharged what I call government zombie debt collection – partially because the debts tend to be so old, two to three decades old sometimes – and that public service agencies across the country had really ramped up this practice. In Illinois, for example, they were sending out an average of 23,000 notices a year just in the Department of Human Services, so just for straight welfare. Here’s something that’s important to understand: in the United States, public benefits are federally funded but locally administered, meaning that even in a single state you can have multiple systems, and then multiple agencies within those systems. So when I say 23,000 notices, it doesn’t sound like a lot compared to, like, the 800,000 letters that Australia sent out. But the difference is that Australia’s is a federal system, so all of the programs were coming through one office. In the United States, Illinois – just in, basically, food stamps – is sending out 23,000 notices a year. And unemployment in Illinois is sending out tens of thousands of notices a year.

And Chicago has a different system, because it’s a really big system. So Chicago is sending out notices too, right. And that’s one state among 50 states. So, it’s important to understand that the numbers sound quite a bit lower, but if we add up all the states, they’re at least equivalent – I believe they’re at least equivalent – to Robodebt. Iowa, for example, when I was doing this research, had just recently sent out 20,000 notices in unemployment benefits alone, in two months in 2018. So it was averaging 10,000 notices a month, which is pretty significant if you multiply it by 50, right, 50 states. So, my visit to Australia really helped me understand things like Dreama Richardson’s story, the work I did on Pondera that followed that and why they expanded so rapidly during the pandemic, and things like electronic visit verification. Because it really helped me see that they’re all three examples of what happens when we allow the fear of fraud, and also underlying assumptions about the criminality of poor families and low-wage workers, to trump our commitment to providing care with integrity, equity, and dignity. So, Covid did not begin this trend, and I’m going to talk about that in a moment. This is merely one instance in a very long American history of profiling, policing, and punishing care. But the pandemic has intensified how the country polices insecurity, in these sort of deeply consequential, lasting ways.
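To lay out that back-of-envelope multiplication, here is the sum; the assumption that all 50 states match Iowa’s rate is purely illustrative.

```python
# Back-of-envelope version of the scaling argument above; the assumption
# that every state looks like Iowa is purely illustrative.
iowa_notices_two_months = 20_000          # unemployment notices, Iowa, 2018
per_month = iowa_notices_two_months // 2  # 10,000 notices a month
print(per_month * 50)                     # 500,000 a month if all states matched Iowa
```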

So, I want to talk for a moment about what these three stories have in common and what we can learn from them. First:

Just a reflection on what I’m seeing in the United States: a kind of anti-fraud hysteria that has emerged during Covid, the way that hysteria is driving mass surveillance, and how, in the end, it can weaken systems of publicly funded care if we allow it to. And just so you know I’m not making it up – literally, I was like, oh, I need some proof that the panic is happening, and so this morning I just Googled ‘terrible unemployment fraud’, and like 700 articles came up, with all these increasingly panicked headlines. ‘Half of Covid unemployment benefits stolen – is it true?’ 36 billion, 45 billion – right. So, there’s this very panicked narrative about fraud during Covid. This is a bit of an aside – can you stick with me for like a 30-second aside, because I think it’s important, is that okay? You guys are so generous, okay.

There’s something really interesting in how the media talks about fraud during the pandemic, because the great majority of fraud that did happen during the pandemic was undertaken by organized criminal entities, and it was directed, in the United States, against brand-new federal programs – things like Pandemic Unemployment Assistance, PUA, or the Paycheck Protection Program, PPP, of the Small Business Administration. But the attempts to then react and respond to that fraud have been really interesting. They focused not on these new threats, but on existing state programs like unemployment insurance, like SNAP or food assistance – programs whose fraud rates have historically been quite low. And they also focused on individuals, rather than these syndicates that have actually been doing much of the crime around fraud during the pandemic. So that’s the aside.

But companies like Pondera, who we talked about before, are obviously seeing this panic, this moment, as a growth opportunity for their products. And moderately liberal government administrations see crackdowns on fraud as a politically safe way to limit the care state without explicitly appealing to the racist and sexist and classist and xenophobic narratives that more hardline right parties tend to use when they do that – narratives that tend to have to do with welfare cheats and lazy care workers, who are often racialized or made foreign in these stories. So, Jon Coss, for example, who’s the CEO of Pondera, has said that he believes 75% of applications for public benefits during the pandemic were fraudulent, which is just insanely inflated.

In March of this year, President Biden announced that he has an executive order forthcoming on preventing identity theft in public benefits programs, and that we should expect it in the next few months. So, this anti-fraud hysteria is really producing opportunities for mass networked surveillance of poor and working-class families. Pondera, for example, collects just enormous amounts of very sensitive information to train and run its prediction models. It draws data from commercial data brokers: Dun & Bradstreet, LexisNexis, TransUnion, Social Intelligence Corp, which is like a big social media scraper. It also offers the capacity to integrate state agency data – so, all sorts of data sources like DMV records (that’s the Department of Motor Vehicles), the National Directory of New Hires, and incarceration records. And finally, since the company was recently purchased by Thomson Reuters, which is one of the biggest data brokers in the world, the software includes the capacity to search Thomson Reuters’ CLEAR database. Which, according to an August 2021 ruling in a pending class action lawsuit against the CLEAR database, is, quote, an all-encompassing invasion of plaintiffs’ privacy, whereby virtually everything about them – including their contact information, partially redacted social security number, criminal history, family history, and even whether they got an abortion, to name just a few data points – is transmitted to strangers without their knowledge, let alone their consent.

So, the CLEAR database is currently the focus of several class action lawsuits; this is one of them. There’s another that specifically addresses the fact that Thomson Reuters is selling data to immigration enforcement in the United States through the CLEAR database. Further, even though Pondera basically promises to predict and prevent fraud, in practice what they actually do most often is identify overpayments like Dreama Richardson’s. In their contract with the Nevada Department of Human Services, for example, 16 of the 56 data elements in their model have to do directly with debt – variables like eligibility overpayments, responsible person debtor, and dozens of others. So, the system that Pondera makes, at least, is fundamentally not predictive. It is backwards-looking, right, in more ways than one.

Efforts to build automated decision-making systems into publicly funded care are motivated by a genuine desire to mend fissures in a strained and fragmented system. The devastation wrought by the Covid pandemic in the United States and elsewhere has, I think, made our need for better care clear – not just in hospitals and clinics, but in our homes and in our schools and in our streets. And as National Domestic Workers Alliance director Ai-jen Poo has urged us to acknowledge for decades, the care industry in the United States was, quote, a house of cards on the point of collapse, long before the pandemic arrived on our shores.

Okay, second thread to pull: a kind of anti-fraud tech-washing is actually having the effect of rolling back very difficult, hard-won rights. So, these three cases that I’m looking at right now – what I think of as American Robodebt, or these zombie debts; digital fraud prediction; and care-tracking apps – represent a stealthy rollback of rights that were fought for and won by social movements in the United States in the 1960s and 70s, under the cover of providing supposedly more objective, more neutral, mathematical decision-making. That’s what many writers call tech-washing or math-washing: hiding political decisions behind a veneer of the cleanliness of mathematics. So, in the late 60s and early 70s, the national welfare rights movement in the United States fought hard to achieve three legal victories that affirmed that people receiving public benefits should basically enjoy the same constitutional rights as everyone else. This is, honestly – 1970 was the first time that people receiving public benefits in the United States had the same rights as anyone else. That’s only 50 years ago, right. So in my opinion – this is my op-ed voice – it is too early to stop insisting that people receiving public benefits should have the same constitutional rights as everyone else.

So, there are three victories – I won’t talk about them at any length. King versus Smith, in 1968, overturned what was known as the substitute father rule. Basically, what the substitute father rule said was that any person you’re having sex with, anyone you’re romantically involved with, is financially responsible for your children. What that rule did was allow care workers – I’m sorry, caseworkers, not care workers – all kinds of latitude to sort of pry into your life, into your sexual history, and look under your bed, that kind of stuff. Often in collaboration with police, right – just show up at your door in the middle of the night with police and look for a man in your bed.

So, King basically said that this was utterly illegitimate, and it guaranteed basic rights of personal and sexual privacy to people receiving public benefits. In the second victory, Shapiro versus Thompson, in 1969, the Supreme Court agreed that residency restrictions – which are basically eligibility rules that limit your access to public benefits if you’ve recently moved; like, you can’t move across state lines and get public benefits, you have to live there for six months or ten years or whatever before you can get benefits – the Supreme Court ruled that those were unconstitutional restrictions on a person’s right to mobility. Goldberg versus Kelly, the year after, in 1970, enshrined the principle that public assistance recipients have rights to due process and that, importantly, their benefits cannot be terminated without a fair hearing – an administrative process that includes basic substantive rights that you would get in criminal court. For example, the right to examine evidence, the right to legal representation – though not help paying for it – and the right to a fair and impartial decision-maker. Similarly, during the same era, the disability rights and disability justice movement asserted the right to autonomy over one’s life and the ability to live, work, and receive services within the broader community. And that organizing resulted in both the Americans with Disabilities Act in 1990, after many years of struggle, and the independent living movement, which we’ll talk about a little later.

So, what impact are these automated decision-making tools having on these rights?

First, let’s talk about privacy. So, fraud prediction software and care-tracking apps both threaten these hard-won rights, I believe. For example – this on the left is, oh no, they’re both from Pondera public records requests that I did – for example, Pondera’s FraudCaster software does social media analysis, meaning that it allows caseworkers to delve into applicants’ and recipients’ lives much more deeply than caseworkers could before. And importantly, FraudCaster actually analyses the whole family, and that family’s whole social network, rather than just the applicant. So, there’s also this kind of surveillance that happens to people who are connected to people who are applicants for or recipients of public benefits. Which means that surveillance is being conducted on networks of people, on families, on neighbourhoods, on communities – not just on individuals.

Like, often I say, you know, the one thing that Orwell got wrong is that Big Brother is not watching you, Big Brother’s watching us – watching groups of people, not individuals. That’s something that’s new about digital surveillance, in some important ways.

Mobility – let’s talk about mobility. I’ll go back to this. So, this picture on the right is from Pondera’s fraud prediction model. One of the flags it produces, that it believes is an indicator of fraud, is anyone who travels a significant distance to do their grocery shopping. That’s what you’re seeing here: there’s a blue dot that is covered in all of those green dots, that’s the retailer, that’s the store. And then the red lines are where recipients have travelled to get to that store. Data collected by electronic visit verification similarly includes care workers’, and therefore recipients’, longitude and latitude. It includes alerts like this one you see at the right: alert, provider is 15 minutes, 15 miles from expected clock-in location. And this is really built on a premise that contradicts the Americans with Disabilities Act quite explicitly, as well as the premises of the independent living movement. It assumes that people like Kevin Hoover, who you saw before – who loves to fish and work on cars, and play music with his friends, and go out – are homebound. That they’re not living active and vibrant lives. And in fact, it enforces that on them. I had many people with disabilities in Arkansas tell me that they felt like what they were forced to do under electronic visit verification was turn their homes into nursing homes, turn their homes into institutions.

So, proponents of electronic visit verification argue that geofencing only tracks the movements of workers; it doesn’t track the movements of clients. But that distinction is obviously and patently meaningless, because care workers and recipients tend to be in close proximity when services are exchanged. That is the nature of the work: if you’re tracking the worker, you’re also tracking the recipient of benefits. And finally, about due process: fraud prediction and care tracking also really have the potential to degrade fair hearings and other sorts of administrative law processes. This is a document from Dreama’s case that they basically sent to her. If you look up the top, the only thing that’s not blacked out there at the top is the date that this original notice supposedly went out: October 5th, 1989. And they sent it to her in 2018. And she said, I’ve never seen this before, and they’re like, yep, it’s in the system, so you definitely got it. Which is actually not what noticing requirements for due process require, under Goldberg versus Kelly. The fact that this thing appears in a database doesn’t mean that there was a meaningful effort to contact the recipient, or that the recipient actually received that message. So, there’s really the possibility of weakening notification requirements here, which is a problem because, while it’s not the most important fundamental right in due process, it’s the one that advocates and legal aid representatives often win on. That’s what they usually win cases on: failure to notify. So, when you erode those protections it really can make a big difference for folks who are trying to defend themselves legally.

Third – quickly – a thread to pull out of these three stories is what I think of as: neoliberalism is dead, long live neoliberalism. So, during Covid there was a moment of loosening of austerity measures, which I think was a really hopeful moment in political time. Program changes that had been considered politically impossible for decades and decades suddenly seemed inevitable, right. They raised the Newstart rate in Australia; they did away with mutual obligations in Australia. In fact, I knew activists working on this four years ago, and it was not something they ever thought they could accomplish. They were working on it, there were campaigns, but they were like, oh, that’s never going to happen. And then, just sort of overnight, it’s like: nope, let’s raise the rate. Nope, let’s do away with mutual obligations. In the United States, we provided specialized unemployment benefits to gig workers, which is something that a year before the pandemic everyone said was impossible, it can’t be done. Oh look, suddenly done. And around the world, a number of countries established minimum living incomes – in places like Spain, Colombia, and Mexico, among others. Again, something that a few years ago seemed politically infeasible. But sadly, I think the moment has been short-lived. And even as we start to address the rollbacks and the clawbacks that we’re seeing in the wake of that loosening, there’s so much work that didn’t get done in the moment of loosening that we still need to do. So, we generally think that the problem with neoliberalism is the withdrawal of the state from its duty to care. And that’s true – in the United States, for sure, the state has withdrawn more and more since the 1970s, leaving schools and libraries and community organizations and individuals – meaning, generally, women and other traditional community caretakers – to pick up the slack for abandoned, underfunded, stigmatized public service programs. But ask Dreama, from the beginning there, if she feels like the state is withdrawing from her life. And I would – I did not ask her specifically this question – but I feel confident saying she does not feel like the state is withdrawing from her life.

So, this is a really interesting moment. When we think about neoliberalism, we think about it as this withdrawal. But when we look at how it actually operates in the moment, it is both withdrawing and intruding at the same time, in this really complicated way. And that’s what I’m going to talk about in the last third, the left hand of policing. Because what’s happening is that the left hand of the state, the social provision piece of the state, is being more densely interwoven with the right hand of the state, which most people think of as the forces of social control – things like the military, law enforcement, and the criminal courts. So, we’re going to talk about that for just a couple more minutes, depending on how our energy is. People are starting to look a little squinchy – squinchy meaning, like, you’re moving like this a lot, so that means you might want things to move along at a little faster pace. So, up twinkles if you’re still doing really good; middle – not middle fingers, but middle twinkles – if you’re starting to get a little tired, and it’s okay, yeah, some middle twinkles is fine; and down twinkles if you want me to shut the heck up already. Oh, nobody’s that brave, bless your hearts, you’re very polite people.

Okay, so very quickly, I want to talk about the left hand of policing and what it has to do with data and predation. When we talk about policing, we usually have a very particular image in our heads. We generally think it means activities undertaken by law enforcement and/or the military, focused on crime deterrence, punishment, defence, and social control. What we hear less about is what the French theoretician Pierre Bourdieu famously called the left hand of the state – social workers, family counsellors, youth leaders, rank-and-file magistrates, and secondary and primary teachers – and the fact that what they do can also follow a logic of policing. So, the left hand of policing – this social benefits policing, beneficiary policing, we’ll find a way to name it in a moment – seeks to exert social control and contain social unrest by treating both recipients and providers of care as cases to be investigated, as if they were criminal defendants, and solved through behaviour modification, deterrence, and the threat of economic or physical violence. This creates a system that Dorothy Roberts, in a somewhat different context, calls benevolent terrorism. So, this is kind of the benevolent hand of policing.

Policing, whether it’s via the left hand of the state or the right hand of the state, importantly also produces considerable opportunities for economic exploitation. This is something that Ta-Nehisi Coates’ really, really moving essay, The Case for Reparations, talks about specifically as a logic of plunder. So, it’s not just punishment that the left or the right hand of policing does, though the two practices of punishment and plunder are clearly linked.

So, what do we gain when we talk about the kinds of stories I’ve been telling as the left hand of policing? The first thing that we realize, I think, is that automated debt and care tracking are forms of predatory inclusion. Several different scholars have used that phrase – probably most famously Keeanga-Yamahtta Taylor, in a wonderful book called Race for Profit, and Joshua Page and Joe Soss have also used it. As they describe it, predatory inclusion exists when the conditions of poverty and distress become excuses for granting entry into the conventional market on different and more expensive terms. That’s Keeanga-Yamahtta Taylor’s definition; Soss and Page talk about predatory inclusion as the practice whereby the marginalization of a community leaves it open and vulnerable to targeted resource extraction.

So, Keeanga-Yamahtta Taylor writes about this practice in real estate. She writes about how in the 1970s the federal government and private partners ended the traditional practice of real estate redlining, which had historically, in the United States, restricted mortgage lending in African-American neighbourhoods, really robbing generations of African-American families of the opportunity to have safe homes and build wealth. But they turned instead to programs that actually encouraged poor people of colour to buy homes under conditions that left them vulnerable to predatory financial practices: predatory mortgages, unanticipated expenses, cratering home values. So, she writes that rather than being excluded from participation in home ownership, poor or working-class black families were specifically sought out for inclusion – and a lot of this was algorithmic targeting, early, early versions of algorithmic targeting – and this is a quote from Taylor, ‘because they were poor, desperate, and likely to fall behind on their payments.’ New financial instruments, things like mortgage-backed securities, and perverse incentives from federal programs that guaranteed loans for low-income people even if they ended up being foreclosed on, basically guaranteed that banks and predatory mortgage lenders still turned a profit. Which changed the mechanism of exploitation from one of exclusion to one of inclusion with predatory conditions.

Page and Soss talk about slightly different practices. They talk specifically about the targeted and exploitative use of fees and fines, and the aggressive collection tactics within the criminal justice system that prop up failing municipal budgets by going specifically after poor and working-class communities of colour – for things like traffic citations, quality-of-life crimes, and other things where, particularly when people actually can’t afford the fines, these turn into long-term means of revenue generation over long periods of time. So, Soss and Page write, ‘deprivations are more than just hardships endured at the bottom of the social order. They also supply the basis for predatory projects that enrich advantaged beneficiaries at the top.’ This is in a 2021 Science article called ‘The predatory dimensions of criminal justice’. They write that the repressive effects of policing and punishment find their counterpart in practices that produce and position targets for resource takings, right. So this is the logic of plunder, not just punishment – they’re connected, but they’re not the same. The logic of plunder is a logic that says: through punishment and through targeting, we can position people as long-term streams of revenue, because they can’t afford it. So, it is actually this predatory inclusion. And as in law enforcement and in housing, I believe the public assistance system creates barriers – to employment, to housing, to education, to medical treatment, to family integrity – which create needs and vulnerabilities in poor and working-class communities that can be turned into profits.

So, automated debt and care tracking produce similar opportunities. We often talk about them – and I have done this – as ways to cut costs, but in fact they are ways to create revenue, by including poor and working-class families in systems that typically exploit them. So that kind of predation is happening both to those people who receive care, who are vulnerable to debt raising like Robodebt – or even to debt invention, like Robodebt – and to those who provide care, which is a little harder to see. But that micro-tracking of the labour of care workers means that they can no longer provide services that they can’t measure. So, time spent engaging a client or a family member in conversation, or just, like, you know, shooting the [] and having a moment when you’re not actively doing a task – that’s very hard to measure. Bathing, medication – you can say, I started at 12:15, I ended at 12:45, and click those boxes. But how do you capture these really important dimensions of care that are harder to quantify? What happens when you micro-track the activities of care workers is that they end up doing that work anyway and not charging for it, because they’re not able to charge for it. So, I think it’s a form of wage theft, this kind of micromanagement of caregivers’ time. It is a very similar kind of revenue generation, but through a different mechanism.

Okay, we’re getting very close to the end. Everybody still okay? You’re so patient. Okay, point two: what can the left hand of policing do for us, as a concept?

I think it helps us see that automated debt recovery and care tracking are not revolutionary; they’re really part of a long system that has changed in some really important ways. They’re not revolutions in practice, they are moments in a long history of profiling, policing, and punishing care. Attempts to automatically claw back benefits, or to quantify and regulate the way that care is produced, aren’t new – they go back, in the United States, at least to the beginning of slavery and probably before. They are what I’ve called in other contexts the digital poorhouse. And it’s a project of both continuity and change. So, part of the benefit of being the keeper of a poorhouse – this is a 19th-century institution; you guys are probably familiar with this practice already, but it was a physical institution where poor people were incarcerated – part of the benefit of being the keeper of one of these poorhouses was that you had unlimited use of the labour of the inmates, which you could kind of turn into lucrative side hustles, right. You could hire out inmates from the poorhouse as farm workers. Generally, poorhouse inmates cared for keepers’ children; they cooked their meals, they cleaned their houses. Female inmates were also regularly targets of sexual predation and exploitation.

So, none of this is new. In fact, keepers of poorhouses in my town, in Troy, New York, actually hit the papers because it was discovered that they were selling the cadavers of poorhouse inmates for medical experimentation. So, a sort of birth-to-grave exploitation of the inmates of poorhouses. But there are, of course, important differences. We’re not living in 19th-century America; we are in 2022, I think, and in Australia. And database integration and administrative centralization – the sort of forever, everywhere case file that now accompanies people who receive public benefits – and the speed and the scale of these systems certainly could, and can still, lower barriers to getting benefits, if we allow them to. But they can also provide, as I’ve shown, this rich sea of information about families and caregivers that can be plumbed to identify vulnerabilities that are ripe for exploitation.

So, economic predation by algorithm is an increasingly important mechanism for revenue generation, both for governments and for private companies. The Treasury Offset Program, which I talked about earlier, has netted an average of 300 million dollars a year in unemployment alone – and that’s just one program, which covers a number of different benefit systems in the United States. And that doesn’t take into account the state and local programs, and the nickel-and-diming that those offices do with folks.

So, as Page and Soss argue about fines and fees in the criminal justice system, quote, ‘by imposing debts, legal authorities lay claim to future resources, enhancing their ability to take what poor communities, by definition, lack.’ So remember that when we talk about the right hand of the state today, we tend to use it to refer to law enforcement and the military, but when Bourdieu was originally writing, he actually argued that the right hand was technocrats – not the military, and not police. And I’d like us to kind of think about that.

Last point, and then I’m going to wrap up. Which is, I think understanding the left hand of policing can help us respond to the political moment that we’re in right now. The pandemic really calls us, I think, to think more broadly about solutions to automated inequality, to mass surveillance, and ultimately to what is an international crisis of care. So, if automated decision-making in publicly funded care is a form of predatory inclusion, if it’s an extension of these historical practices of social control, then our current solutions – which tend to centre around opening up opportunity in the pipeline of tech design, or eliminating bias in the design and implementation of these systems – will not work. Because we’re saying, let’s remedy exclusion, and what we’re actually looking at is predatory inclusion, and that calls for really different strategies and tactics.

So fair or non-discriminatory automated decision-making systems can still cut millions of people off from needed benefits. They can do that in an equitable way – like, let’s just cut everybody off. That’s a perfectly equitable and just way to cut people off from benefits that they need to survive. You can set people up for economic predation in an equitable way. You can include more people in that project, and you can imperil the quality and ethos of care in a way that is not necessarily beset by bias.

So, in the last decade in the United States we’ve really witnessed a crucial reckoning with the racial logic and the racial violence that is endemic to American law enforcement, and I think we can take some lessons from that moment. Organizing to save Black lives has focused on big, hopeful, structural solutions. The Movement for Black Lives does not ask for more Black police officers; they don’t ask for improved community oversight of law enforcement. They ask for defunding and abolishing the police. They ask to invest in education, restoration, employment, and, interestingly, universal social services – so there’s a conversation to be had there as well about the right and the left hand of the state. But the call, fundamentally, is to build political power and to end the war on Black people. So, in the welfare rights movement and in the militant struggle for genuine care, our solutions have to catch up with this political moment. Our questions shouldn’t be things like, how do we get fairer automated decision-making systems in public benefits? Or, how do we eliminate fraud in home-based health care while maintaining today’s service levels? They should be better questions. Questions like, how do we end the war on the poor? And how do we ensure and reward care in ways that centre workers’ and recipients’ health, dignity, and autonomy?

So, clearly, we have to start by stopping private firms and public agencies from using widespread human pain as an opportunity to push untested technical tools into our country’s most vulnerable communities. That feels like step one. And we also have to acknowledge that fraud is a problem in public programs, but it is much smaller than people think, even now. If we go back to that California example from the beginning of the talk, they paid out, at most, half a million illegitimate claims – and, holy, that sounds like such an incredibly big number, but it’s out of 10 million claims. So the fraud rate there, at most – and this is probably an overestimate – is five percent. And that might sound unsettlingly high, but proportionately it’s actually just about right. It’s just about normal. It puts the agency on par with U.S. businesses, which, according to the Association of Certified Fraud Examiners, lose an average of five percent of their gross revenues to fraud. That is kind of the baseline for fraud. So, instead of using predictive analytics to root out – and sometimes even invent – fraud in publicly funded care, governments could instead use good old public information campaigns and social work shoe leather to identify and enrol those who are not receiving benefits for which they are eligible. And that is actually the rule, not the exception. The Bureau of Labor Statistics in the United States, for example, found that in 2018 only 26% of Americans who lost their jobs even applied for unemployment benefits. So 74% of people who lost their jobs, and probably were eligible for unemployment, did not even bother applying. The problem is not over-utilization or targeting of these programs, it’s under-utilization of these programs. The problem is not weeding out the chisellers, it’s building a state that doesn’t stigmatize, shame, and dehumanize people for seeking, or for providing, care – all the while setting them up as targets of automated debt and digital wage theft, as sources of revenue for years to come. This kind of algorithmic pocket-picking is accruing a debt that our democracies can’t afford. That’s it – thank you for sticking with me. I’m really excited to talk to you and hear your questions. Thank you so much.

Oh, we’re waiting for a mic, because we want to make sure that folks who are on Zoom can hear your questions. So please don’t let the mic make you shy – just take it and be an MC. You can do it. Anyone have a question? Something they want to talk about? Yeah, we’ve got one right up here. And do you mind sharing your name – just your first name – so we know who you are?

Participant 1:

I’m Rishi, I’m studying physics and maths.

Virginia Eubanks:

It’s nice to meet you.

Participant 1:

You too. Thank you so much for the talk – that was incredible, and hearing those examples was honestly shocking. What you were saying at the end about the algorithms – as someone who’s interested in maths, to what extent should we be using algorithmic technology to assist with problems like this? Or is there too much of a risk around transparency or privacy to even consider using it?

Virginia Eubanks:

Yeah, so that is kind of the billion-dollar question, right. I appreciate you asking that. Can I respond in this weird way, by just telling you a story? So, the story I want to tell is from that last trip in March of 2019 – I also went to Aotearoa New Zealand. And there I was in what I didn’t really understand was a debate until I got there, a debate with a philosopher – a professional ethicist, let’s say – who was involved in a project that I had written about in Automating Inequality. One of the projects I write about in that book is called the Allegheny Family Screening Tool, and it’s basically a predictive model that is supposed to be able to guess which children might be victims of abuse or neglect in the future. I took quite a critical stance towards it in my book, but it’s been, you know, quite a controversial program in general. They actually started by trying to roll it out in New Zealand. New Zealand said no – New Zealand chose no on that – and then the folks who designed it brought it over to Pittsburgh, Pennsylvania and started it up there. So, I end up on this panel with this guy who was one of the ethical reviewers of that system. And so it’s already a little tense, as you might imagine. And I also think folks in New Zealand are really polite on the surface, and we had some hostility going on, and I think it was making the audience super uncomfortable. So, one of the moments in that talk is, you know, somebody asks a very similar question. And I said, you know, I think I’m going to say something that is going to make my colleague here really uncomfortable, which is that I think there are situations where we should never, ever use algorithmic decision-making, because the data is just too dirty. Like, the data we have – there’s no way to clean it, and there’s no way not to reproduce the sort of bias and harm that might already be in that system. And my colleague said, I’m going to surprise my colleague and say I agree with her. And I was like, fascinating – so tell me where you think we shouldn’t be using these tools. And he was like, well, in the United States you definitely shouldn’t be using them in law enforcement, for that reason. The data is just way too racist; because of the way it was collected, and because of policing practices over time, you cannot get a clean result from that data – you should not be using it. And I said, oh, that’s so interesting, because you’re saying it’s okay to use it in child protective services but not in policing. And I was like, do you think that system – this is something that we’re talking about in the United States – do you think that system is somehow less racist than policing? And he was like, yeah, it’s a supportive system, it’s not a system that’s based in punishment. And that really is the crux of it – particularly for people who are directly impacted by these systems – this is a left hand of the state thing. These are programs that are supposed to be benevolent. But they can only be benevolent under conditions that make them quite terrifying for the people who are in them.

So, I don’t think there’s anyone who has dealt with a child protection investigation who wouldn’t say two things: one, yes, they got really important benefits from this system; and two, it was the most terrifying thing that had ever happened to them, because it feels like if you do anything wrong, once their eye is on you, you’re going to lose your kids. So it is both a benevolent and a terrorizing organization. There are folks in the United States currently calling for the abolition of child protection for just this reason. And that’s actually where these pictures come from – from that movement. So, yeah, drawing attention to the way that policing happens in the child protective system, and particularly the way racist, or racially motivated, policing happens in child protection.

It’s not a clear answer to your question, but I do think what it shows is that it’s a really interesting question, and an important one. People’s answer to it tends to depend on whether they see the system as being punitive or as being supportive. And often we need to spend a little more time with these systems to understand how complex even the more directly benevolent ones are. Does that make sense? Yeah. Rishi, thank you for the question.

Do we have another one, anyone else? All the way at the back. Oh wait, we’re going to go behind you first and then we’re going to come back to you. We’re going boy, girl, boy, girl. Go ahead.

Participant 2:

Thank you, Professor Eubanks. My name is Bernadette Highland-Wood. I’m utterly depressed now after your talk, on top of all the world events. I’d like to preface this by saying I’ve been involved with linking government data for 20 years, starting in the U.S. for the federal government, and the effort to bring together data for the purposes of government transparency and accountability has been completely hijacked by many of these private vendors, who’ve been given carte blanche. I’m curious if you’ve looked at, and have any thoughts about, the fact that government has effectively outsourced all of the linking of data – and it’s happened in an environment of no data legislation, no curtailment, no regulation of these private vendors, including Reuters – to stitch together disparate data sources and really come up with some things that are, as you’ve articulated in great detail, pretty atrocious for society and arguably the economy. But it’s occurred in a world with no data regulation. And the other thing I would add is, you know, we all use U.S.- and Chinese-based platforms, and many of us use platforms created and run by Chinese-owned companies and don’t even realize it, because they’re not branded as anything Chinese. They’re AI-driven systems. So yeah, I’m just curious whether you’ve looked into the issues related to the unfettered way that private entities – very big ones like Palantir – have stitched together government data for profoundly detrimental purposes. Because I actually don’t think it’s the government organizations, half the time, that have any clue as to what the clever private companies, often based in the Washington DC area, are doing. I have lots of thoughts as to why they’re doing it, but I’m curious what your thoughts are.

Virginia Eubanks:

Yeah. That’s a great question – thank you. So, in Automating Inequality, in the book, I actually very intentionally chose public systems – not systems that had been produced by private contractors, but systems that were actually produced by government agencies. That was partly to balance what I was reading at the time as a tendency to treat the stepping in of private industry as the only part of the problem. Because part of what that framing does is not give you space to think about the ways that these technologies are means of social control. They’re not just mechanisms for producing profit – though I do talk about that economic exploitation now – they are also ways of exerting social control. So, I would argue that private industry is certainly happy to take advantage of the opportunities that governments leave open to them, to take on some of the responsibilities the state has chosen to step away from, including responsibility for care. And so I think it is a really complicated relationship between the corporations and the government. On the left, at least, we tend to be more comfortable pointing at the corporations as the moustache-twirlers in the dark room. But if you actually look at who’s part of these organizations, there’s a lot of traffic across that boundary. And what I found in Automating Inequality is that even when systems were built with a large degree of transparency, and were accountable to democratic process to some degree – less than we might like, but to some degree – particularly when you’re talking about systems that regulate the lives of poor and working-class people, they all kind of turn out not great. And that is less about the profit motive and more about social control, which has been with us for a really long time. I’ve done less work on private companies, but I do think there’s really important work happening now, for example around processes of procurement – thinking better about how to manage procurement so that you’re doing some kind of work to stand in the way of the worst outcomes, the kinds that I talk about. I think that’s incredibly important work. I think there’s also direct regulatory work that’s really important – around, for example, demanding that you only collect the barest minimum of data that is essential to the practice at hand, rather than the general practice of just collecting whatever you want because you can. There’s a ton of work for good regulation, and there’s a ton of really important legal work to be done, too. That tends not to be the kind of work that I focus on, just because my thing is storytelling, and I like to do work with and around people who are directly impacted by these systems. But it’s really important. Thanks, Bernadette. Thank you for being patient.

Participant 3:

Sorry, I didn’t realize there was somebody behind me. Thanks so much. My question sort of leads on from that, and it might be one of those too-myopic questions instead of the big ones you’re saying we should be asking. But if a lot of these sources scrape social media, do you think at least a starting point could be more effective regulation of social media, and/or laws like the GDPR, and/or public education about what to post publicly on social media? Basically, would anything to do with social media be, I don’t know, an upstream way to at least start, if these sources are getting a lot of their information from just scraping social media?

Virginia Eubanks:

I think it certainly wouldn’t hurt to think more, and better, about that. I can say, as an advocate and an organizer, that the first thing we tell folks I’ve worked with who have dealt with interactions with child protective services is: erase all of your social media right now. And it’s not just what you’ve done, or what there are pictures of – it’s what somebody who’s linked to you, who’s linked to you, who’s linked to you has done. Just immediately get off social media. So, certainly there is education work happening, in organizing spaces, about how to use the benefits of social media in a way that doesn’t make you further vulnerable. But again, it just tends not to be what I work on. There’s a lot of really great work on Twitter and Facebook and that stuff. And for me, the reason that I don’t do that work is not that I think it’s not important. It’s because many academics who do this work tend to come to it from their own experience, as all of us do, right – we come to what we think about and research based on our own experience. And because of the way academia works in the U.S., as a sort of engine of class advancement, or as a perpetuator of class privilege, the folks who study these things tend to study them from a professional middle-class point of view. Which means we tend to see these concerns as consumer concerns – how should you consume this thing or not consume this thing. And one of the things I was really interested in doing in Automating Inequality, and in all the work that’s come since then, is to pose the question: well, what if you don’t really have a choice about whether you consume this or not? I think we’ve done a lot more work on that in the last five or six years than when I first started working on Automating Inequality, which was probably 2012 – probably 10 years ago now. But again, because I’m so interested in social control as a topic, I tend to steer away from the things that are more based in consumer choice, and towards the things that people have at least an ambivalent relationship with. Like: is there ever a meaningful way to say that there is informed consent around being involved in, say, the welfare management system, if the only choice you have is give up all of your information and maybe get benefits, or don’t give up all of your information and get nothing? And then, if you don’t have enough food, that opens you up to a child protective services investigation – there are all these cascading effects. So, I’m really interested in working in these spaces where the question of choice is a little bit more complicated. I think a lot of people would argue that that’s true about social media right now anyway – that we only have limited choice in whether or not we interact with it – but I like these really uncomfortable intersections where it’s a little harder to say, well, just don’t use Twitter then. A little harder to say, choose not to engage with it. I think a lot of people aren’t in a position to choose whether they engage with these systems or not. Yeah, thanks. Other questions? Right down here. Oh, right back there – hi, you have the mic? Please go.

Participant 4:

Hi, I’m Jesse. I study journalism and psychology. Thank you so much for your incredible talk. I was just wondering – I remember you talked about redlining, and I know that had intergenerational effects that are still going on now. Do you foresee something similar for this wage theft from care workers? What’s the sort of future you see for that practice, especially if we allow other types of practices to come in unfettered? Like, how does that compound?

Virginia Eubanks:

Yeah, that’s a great question, and it’s one of the reasons that I chose to report that story in Arkansas. So, Arkansas is a super interesting state, because it is, I think, the third-poorest state in the United States – maybe the second? Like, go Alabama. Alabama and Arkansas and Mississippi are always fighting it out for poorest state in the U.S. And it is an aging state, and there is huge out-migration, leaving these rural areas really untethered from support. And one of the reasons why what happens with EVV – electronic visit verification – is so important is that one of the big fears people with disabilities had about their caregivers is that caregivers in Arkansas are already paid so little for their work that any form of wage theft would basically make the work completely unsustainable. So, the great fear that people like Dante Walker had – he’d literally had this relationship with his caregiver for 10 years, but he knew that she was always just on the line of not making it. And that’s why he had literally been saving that thousand dollars for years: because he knew that if she missed one two-week pay check, he could lose her. And it’s so hard to find caregivers in Arkansas that he could potentially end up in a situation where he can’t get out of bed on his own, right. He could literally wake up one morning and not be able to leave his house, not be able to go to the bathroom, not be able to get out of his bed. So that’s why he was tucking away those five dollars every time he had them, to provide that for his caregiver. But he was also really clear when I talked to him. He said, look, I know the situation I’m in – he’s fully employed, he’s also getting some support from disability, he had the ability to get money, and he also works as an advocate in an independent living centre for other people with disabilities, so he really has a lot of knowledge about the way these systems work, too. So, the fear here is that of course there is an intergenerational lack of wealth if you’re getting your money stolen, right – it’s a similar logic to redlining, that kind of plunder. But there’s also this issue that if you’re already only making twelve dollars an hour and you lose a dollar an hour, or your day suddenly goes from eight hours to 14 hours and you’re still only getting paid for eight, then you are unlikely to continue in that industry, even if you really care about it.

One of the things I want to say, to address the sort of ‘now I’m depressed’ thing, is that what I found in Arkansas was that caregivers and care recipients went to extraordinary lengths to support each other. I met so many caregivers who were putting in 12 or 14 or 16 or 20 hour days, who were sleeping on their clients’ couches to make sure that they got what they needed – and who were not getting paid for that time. They did it because of their relationship with God, or because of their relationship with the person, or because they just thought it was the right thing to do, but they were doing that work anyway. And that’s one of the things that gives me extraordinary hope in this work: that’s what I see over and over again – people really going above and beyond for each other.

That said, I would rather we not put those folks in a situation where that kind of generosity leaves them in this extraordinary economic vulnerability – open to predation, and open to all of the other sorts of bad outcomes they can face because of it. So, yeah, I do think wage theft – this is something that’s new to me, trying to put these things together. In the past I’ve talked about systems that profile, police, and punish the poor, and I’ve changed how I talk about it. I now also talk about systems that profile, police, and punish care, and I think that’s a really interesting shift, because I want to be thinking not just about people who receive support but also about people who give it. And about how we’ve set both of those sets of people up in ways that actually stand in the way of us doing the thing we do as people, which is – I hope, at least part of the time – really care for each other.

And so what I’ve seen in my reporting, at least, is how incredibly vigorous that spirit of helping each other is, even in the face of these sorts of difficulties. And it’s the thing that keeps me going in work that can be, as Bernadette pointed out, a little bit depressing sometimes.

Yeah, my mom is always like, I don’t know how you can do that, aren’t you just sad all the time? And I’m like, actually, no – these people are amazing. I feel so grateful and so lucky to talk to the people I get to talk to. So yeah, I’m not depressed.

Tell the truth, but tell it in a way that lets you see the real greatness of the stuff happening around you too. Yeah. Other questions? Or do we want – oh, Paul looks like he’s winding up, because we have snacks and drinks!

Paul Henman:

Thank you very much, Virginia. That was a wonderful lecture – you’ve given us a lot of food for thought. We started off this whole process focusing on digital technology, but the story has also helped remind us that it’s not just about digital technology. It’s about the systems that digital technology gets embedded in, and reproduces, and expands. And that’s an important lesson we need to go back to, because it’s not just a technology problem, it’s also a political and social problem. I think that also speaks to what you ended with: yes, there’s a lot of devastation, a lot of destruction, but there are also opportunities to see otherwise. And we hope that the stories you tell are giving visibility to the need for change – leading to some of the changes we see, possibly in Australia, but hopefully in other places too. We’ve seen the Dutch cabinet resign over the SyRI case, a similar case to Robodebt. So, there are opportunities, and we hope that as researchers at UQ, in the Centre of Excellence for Automated Decision-Making and Society, and across the world, by giving visibility to these issues – issues you have so globally helped to propel – we are able to put in place structural changes: to policies, to legal settings, and so on, so that we don’t keep reproducing those problems.

So, thank you very much, Virginia, for bringing our attention to all of these things, and for really helping us work through, analytically, some of the dynamics of these complex elements. Thank you very much.
