Everyday Data Cultures
29 July 2022
Prof Mark Andrejevic, Monash University
Prof Jean Burgess, QUT
Prof Kath Albury, Swinburne University
Prof Anthony McCosker, Swinburne University
Listen on Anchor
Duration: 36:47


Mark: Welcome to the Automated Decision-Making in Society podcast. I’m Mark Andrejevic, Professor of Media Studies in the School of Media, Film and Journalism at Monash University, and Chief Investigator in the ARC Centre of Excellence for Automated Decision-Making and Society.

It’s a pleasure today to be speaking with three of the authors of the brilliant new book Everyday Data Cultures: Professors Jean Burgess, Kath Albury and Anthony McCosker. Jean is Associate Director of the ADM+S Centre and Professor of Digital Media at QUT. Kath and Anthony are both Professors of Media and Communication at Swinburne University of Technology. Rowan Wilken, who unfortunately couldn’t be with us today, is Associate Professor in the School of Media and Communication at RMIT University. Thank you all for joining me.

Let’s start by talking about datafication. What does it mean to you?

Jean: Well, datafication is a term that has come up through the new field of what is called critical data studies, which comes to us through sociology, and it has become the centre of a lot of concerns about the way that information appears to be extracted from ordinary people, the way that people seem to be subject to data-driven decision making.

Anthony: Yeah, it’s more and more dimensions of everyday life being captured by data, or produced in some form of data, and offering new forms of data.

Jean: Yeah, I suppose the term datafication stands in for the many ways that data is collected or generated through and by our everyday activities, our engagement with everyday digital media technologies: social media, visits to the bank, travels around the city. You name it, this seems to be kind of a generic condition of everyday life in the 21st century.

Anthony: Yeah, and I think it’s also just becoming more and more important in terms of the way that we engage with digital services and services that have been digitized, which is increasingly, especially since COVID, you know, capturing all parts of our life, not just health and banking and finance and shopping, but everything, including our cultural interactions as well.

Mark: So you spend some time in the setup for the book talking about the ways in which we as a society talk about technology and you draw on Vincent Mosco’s work on the digital sublime. Can you say a little bit about what that term means and why you invoke it and how we should respond when we hear people speaking in the register of the Digital Sublime?

Jean: I guess the term the digital sublime goes back quite a way in philosophy, particularly thinking about the way that American imperialism worked. It goes back to the kind of majesty of the purple-headed mountain and all that being part of the manifest destiny of America, through to building great railroads and bridges and things, and this becomes the kind of natural inherited domain of Silicon Valley once we get to the Internet and everything like that. And we see it today with stuff like the captains of industry taking off to space in phallic spaceships and so on. So that’s at the really dramatic level, but there’s a kind of idea about the magic of technology that I think imbues very everyday and ordinary technologies that are pitched to us as consumers, and we have to be careful, I guess, about buying into any of that idea of magical properties for technology, whether they’re good or evil.

Anthony: It’s also something that’s been talked about in terms of data and data visualization. So, following on from digitization and, you know, the imagination of what the digital can do and how it can transform life in society, it also carries over into data and how we can capture, you know, the granular detail of life, but then visualize it. Some of the critical work around data visualization has addressed this idea of data visualization as sublime, as kind of beyond interpretation or intelligibility, somehow revealing something to us that’s bigger than what we can see before us. And so I think that’s an important thing to consider when you think about how data integrates with our everyday life, particularly when we’re, you know, confronted every day with numbers around COVID outbreaks or spikes and the visualizations that come with that, and the way that we have to make sense of that.

Mark: Great. I thought we’d also talk a little bit about the framework of the book. It’s called Everyday Data Cultures, and it draws on a tradition of the critique of everyday life. Can you say a little bit about why you took that approach to the book and what that means for you, the critique of everyday life?

Kath: Look, in some ways it was a response to popular conversations about data and academic critiques of data that were, you know, operatic, almost the notion of a kind of grand battle between good and evil, where, you know, we powerless mortals were cringing beneath the might of datafication. And all of us, in different ways and in different projects, have looked at and been really interested in the mundane and ordinary ways that we use data, we become data, we work with data, from spreadsheets to kind of understanding our bodies and our genders and our sexualities through, you know, very classic but also kind of very ordinary and mundane norms about what men are and what women are and how many partners we’re supposed to have and how long sex is supposed to last, and all of those kind of weird, you know, Cosmo top tips.

These modes of datafication seemed so common, but so absent from a lot of the literature that we were reading, that it was something we all became very interested in.

Jean: And yeah, as you’ve suggested, Mark, there’s a really deep tradition of thinking about everyday life not as the inconsequential, or the stuff left over from politics, but actually as the site where most of us live through experience and can contest those politics most deeply. And at the same time, it’s the space where datafication is thought to be having the most insidious effects and impacts on us.

Anthony: For me, the everyday is the site of struggle. That’s where we work these things out and have to navigate and negotiate, you know, any interaction with digital technology, or with the systems that we’re forced to use: with our data plans, with our mobile phones, with the services, with the state.

Mark: Yeah, one of the many strengths of the book is the way you draw on concrete examples of what you’re describing as the datafication of everyday life, and you look at the details of those struggles or those pleasures. Can you bring up some of the examples that meant a lot to you, or that really helped you think about everyday data cultures? What are some key examples that jump out to you that would be useful to think about?

Jean: So, one of the most everyday examples of datafication or algorithmic culture you could possibly think of is the way that we’re invited to use Spotify to soundtrack our lives, to wallpaper our mundane existence with algorithmically curated audio content, and there’s a lot of scholarship looking at the ways that platforms like Spotify algorithmically individuate us.

And so you’re sitting in the bath listening to a podcast, or you’re walking the dog, or you’re driving the car, and you’re just immersed in kind of data relations. So that’s one narrative. Another narrative is what happens when the platform tries to represent your use of it through something like the Spotify Wrapped exercise.

Yeah, users push back and laugh at, mock, or point out the tensions in the way that the platform misunderstands the way that they use it. So, for example, telling you that your favorite song of 2020 was a white noise track, which is actually just something you play on loop so that you can stay asleep at night, or that you are absolutely obsessed with this one nursery rhyme, which is actually the only thing that stops the kid from crying in the car.

So, there’s this tension, I guess, between the materiality and the messiness of everyday uses of the platform and the way the platform tries to kind of enroll you in, as we were talking about at the beginning, this sublime idea of visualizing your year of listening in a dashboard. And all of this takes place in public, because the platform pushes you to use a hashtag to celebrate your year in review.

Mark: Great. There’s some really interesting examples too about how platforms or apps have kind of normative expectations that people are expected to conform to, but people find ways of subverting or challenging or resisting those. I wonder if you might speak to some examples there? I was thinking of some of the ones that you talked about with dating apps.

Kath: So Anthony and I both worked on a dating apps project that was looking at users’ perceptions of safety, risk and well-being in relation to dating, and one of the things that people talked about a lot was the ways that the apps tried to sort them into particular categories, very binary categories in relation to gender: am I seeking men? Am I seeking women? Or am I seeking both? And people who were gender fluid or gender diverse didn’t find those categories particularly helpful, but developed workarounds for the big platforms like Tinder, where the majority of matches might be found, so where they had a better dating pool, particularly in a rural environment.

And one of the people that we interviewed was a trans man in a regional town, which he described as a shallow pond for dating, and he was bi, and the way that he navigated the platform was he would log in as a man looking for men, then he would log out and set up a different profile as a woman looking for women. In that way he could ensure that he was never seen by cis heterosexual people in his town and never outed as being a genderqueer person, because that was important for his safety: that he was able to not be seen by certain people, but he could be seen by the people that he wanted to make friends with, or that he wanted to establish some kind of intimate connection with. And the app did not make that easy for him, but he had developed a workaround for himself.

Mark: This gets, in some sense, to the question of literacies, you know, the ability to read what these systems are doing and then to turn them to your own ends if you’re able to. In the book you talk about vernacular literacies when it comes to data cultures. Can you say a little bit about what you mean by vernacular literacy and give us maybe some more examples?

Anthony: Yeah, I’ll start with data literacy. It’s usually defined as the ability to read, analyze and work with data, or even argue with data. And so if you take that as a starting point and think about, well, what is it that we do with data, how do we encounter it, understand it, how do we engage with it? I think, you know, we can open up the idea of data literacy beyond the statisticians and the data scientists and computer scientists who traditionally have worked with data. And so we’re trying to get at this idea of vernacular or everyday data literacies.

And it’s really about those ways that, you know, when we encounter systems that are counting us or measuring us in some way, we have to work with that, we have to negotiate it, we have to navigate it. But some people are actually really amazingly skilled at that process. So we were trying to understand, first of all, those everyday contexts for the way that we might read and work with and even create data, and then think about how, you know, those highly skilled people might navigate and negotiate those situations.

Jean: I guess another example of the idea of vernacular data literacies would be the way that, say, social media entertainers, content creators on popular platforms, have very detailed experiential knowledge of the way that recommender systems impact on the visibility of their content and their careers. They share that knowledge; there’s a lot of social learning involved. It may not count as the same kind of technical expertise that a data scientist or a computer scientist building those systems might have, but it’s a really important aspect of data literacy nonetheless.

Anthony: And another good example that we’re seeing more and more evidence of is people working in the gig economy, for example driving for Uber or driving for other companies, where they’re balancing the kind of rewards, the economic rewards that they get, and the costs: the labor costs, the time costs. And, you know, some of them give really detailed accounts of the way that they weigh up the rewards of taking a ride, and they’ll use, you know, almost an algorithm themselves, or a way of thinking about it that is, you know, computational, but they’re working in their own way and responding to the app and to the algorithm and to the situation on the ground. And a lot of that is shared knowledge, so it’s really knowledge that’s built up among cohorts of drivers in locations, in places, in response to the work that the app does.

Kath: And in a kind of more grim perspective, again in relation to the dating apps, there are kinds of vernacular data sciences around optimizing matches, optimizing likes, or gaming the process of meeting and closing the deal with potential matches. These can be simply, you know, people who’ve worked out a system that works for them, but in some cases it can be much darker and manipulative. It doesn’t always work particularly well; although people kind of share these tips online, it doesn’t always work successfully for them, and then it can kind of flip into a resentment when potential matches refuse to follow the predestined gender algorithm, if you like, for, you know, what men are meant to like and what women are meant to like and how people are meant to respond to a particular pickup line. And we’ve all talked about that in relation to the dating app work we’ve done: that it’s not just about the technical properties of the app, but that dating itself has always kind of had a calculative quality to it. You know, if you think of the 90s advice book The Rules, about the best way to get a husband, or the pickup artists, you know, sharing the right lines or the right ways to get a woman, there is a kind of calculative property to intimate relationships that predates digital platforms, but has been translated through into certain kinds of platform culture.

Mark: So, in a sense, vernacular data literacies tap into a deeper tradition of vernacular literacies: the practices that people have used historically to address situations where they feel they might be facing disempowerment and turn those around, or to engage in forms of calculation for whatever end they’re seeking.

Anthony: I think the other important thing to say about our idea of using literacy in the book is that we’re not thinking about this in terms of the individual and the cognitive, you know, some sort of individual ability. It’s really a shared ability, and it’s relational, so it’s really about how we speak to each other and how we learn together. And that’s the vernacular: the language that you might pick up in a particular situation.

So, if you are an Uber driver, for example, or working in an Amazon factory, you have to learn to speak the language of that particular situation and place, and it’s the same with all technology that we use. The more that we can speak the language of the technology and understand the data and data flows, the more we can respond to it, and the more that people do push back. And I think the other thing that we want to say here is that, you know, we’re trying to acknowledge that people aren’t passive data subjects, which is sometimes how people are described in relation to big technology companies. People do find their way, and they do shape their profile, and they do adjust the way that they act in relation to those systems.

Mark: Maybe this gets us to the distinction between what you describe as “known” and “knowing” publics. Do you want to say a little bit about that distinction and how it intersects with your thoughts about everyday data cultures?

Kath: We should say we didn’t come up with it; that’s Helen Kennedy’s line, Helen Kennedy from the University of Sheffield, and it’s about those collective ways of thinking about or negotiating data. It was thinking about the ways that we might know ourselves as a collective group, and then how we might make change in that space through particular kinds of understanding of data. So one of the examples that we talked about was the EveryBODYVisible campaign, which was an Instagram-based activist campaign that was instigated over a number of months, where a very wide range of Instagram users were feeling that their content was no longer getting the engagement that they felt it previously had gotten. And this ranged from doulas to breastfeeding advocates, to pole dancers, to sex workers, and the thing they had in common was that they showed unclothed bodies in some respect. But Instagram was claiming that there was no, what was called, “shadowbanning” in relation to their content. The group worked together to do their own kind of vernacular A/B testing of different kinds of content, and to test whether other people could see it when they posted it on their pages, and then they came together on a particular day to share hashtags and images, and also to flood what we would now call Meta executives with unclothed images that were tagged to them, so their tagged pages would be entirely these kind of naked images. And following this event, after this period of testing, the Meta staff did admit that there had been some suppression of particular kinds of images in search results or in tag surfacing and those kinds of things. But it was only after this testing process had gone on, and the group had organised as a public, that Instagram admitted that it had been doing what they had observed.

Jean: I suppose a really conventional or formal example of “known” and “knowing” publics would be the way that data invites us to see ourselves as part of some kind of public. Census data does that, or some of the data visualisations about COVID cases, as an example we use in the book. But they also invite us to become knowing about the data, and so that connects back to the literacy element of some of those public data visualisations.

Anthony: And I think what’s amazing about that concept and that we were really interested in thinking about is the way that different groups can then take that and work with it.

So, if you take the example of Indigenous data sovereignty, it’s a really good situation where Indigenous groups and Indigenous people are starting to think about the way that they’ve been included as data subjects, or excluded. So part of it is about, you know, sometimes being part of data sets and being part of population census data in a way that’s more visible, and sometimes it’s about, you know, having more control and taking themselves out of that sort of surveillance, I guess. And I think in that sense, the kind of “knowing” publics and “known” publics is a bit about that play and dynamic around control over how we appear within data and as data, and what we can do with data.

Mark: It’s interesting, some of the examples of vernacular data literacy, and also the resistance to Instagram, both highlighted a kind of collective media knowledge or media activity, and you invoke the tradition in media studies of looking at media rituals. I wonder if you might say a little bit about the role that media rituals play in everyday data cultures, in our everyday data lives. Where would you look for rituals? What would you describe as rituals, and why is it important to think about what they do?

Jean: The concept of media rituals is, I think, super rich and important for thinking about the multiple scales at which our everyday engagements with media matter. So Lee Humphreys has written really powerfully about the very mundane and micro practices of what she calls media accounting, which go back centuries, but which together combine to add up to a kind of public record of everyday life, and are just meaningful to people in their everyday lives: practices that might seem like being complicit in datafication. And at the other scale we’ve got the tradition of kind of big media rituals, or media events, or what Dayan and Katz called the “high holidays” of media, like royal weddings and so on.

And thinking about how they are being transformed in the more kind of datafied age. So, you know, perhaps a slightly trivial example is that Spotify Wrapped annual event, where this platform tries to kind of confect a worldwide media ritual out of very mundane uses of the Spotify platform and the data that represents them. So it’s a really rich concept that has a lot more life in it, I think.

Anthony: I like the aspect of that idea that’s about the kind of commune and communion that comes with, you know, connecting around media, and I think we can extend that to connecting around data as well. So, just taking that example of Spotify, it’s a more, you know, active way of shaping your identity in relation to other people, in relation to music and popular culture.

But if you think about it in another circumstance, say in terms of COVID data, you know, here in Melbourne we would have been looking at that data daily and talking about it and communing, in that ritual sense: waking up with the data, waiting for it to be updated, and the meaningfulness of that in our life at that point in time. It actually now seems quite distant, but at the time it was really, really intensely felt, I think.

Jean: It really helps you get away from the sort of over-freighting of everything in the media environment in terms of information, right? Like, that’s not what media means in our everyday lives. It’s not what it means in terms of the way that we relate to each other, the way that we experience ourselves as part of that kind of communal experience of the world. So that’s why it’s powerful.

Mark: It sounds like we’re moving in the direction of imagined data communities as well. The book also provides examples of ways that data can be used in abusive and discriminatory ways, a kind of counterweight to some of the more salutary uses of data in everyday data cultures, and this is just as much by ordinary people as by large global corporations. Do you want to speak to some of the examples that you bring up in the book?

Kath: Yeah, I mean, one of the things that we talked about a lot was that everyday life is not always great, and, you know, an abusive household is still a mundane, everyday environment. So in thinking about things like the ways that, you know, Alexa or other kinds of smart home devices might be used not just as kind of helpful household friends, but as surveillant devices, or devices of control or abuse, it was quite important for us to acknowledge that. And also to think about, you know, the ways that devices are being imposed, often by marketing, but sometimes too by kind of public discourse or media discourse about crime, as a site where surveillance equals care or equals safety, in ways that can’t actually be guaranteed. So, you know, the fact that your chat is captured by a dating app, for example, makes you no safer than if it were not, and in fact it might make you less safe if, for example, you’re an Indigenous person and you’re concerned about your conversations being seen by the police at some stage, or if you’re a sex worker, or if you’re a man who has sex with men in a place where, you know, men having sex with men is criminalized and in fact attracts the death penalty. So these kinds of safety devices are often very, very unsafe for particular people, and yet there’s this kind of sense in which data collection is equated to care very often, or equated to a kind of safety net.

Jean: You know, this is kind of a flip side of that problem with the digital sublime. Now, this is something that didn’t make it into the book, but under that framework, where there’s big tech that has all of this power, and then there’s all the nice innocent people, Big Tech is like Mordor, stay with me here, and all the rest of us live in Hobbiton, and nothing bad would ever happen behind the door of a Hobbit house, right? So that kind of framework is actually not only wrong and kind of disempowering to ordinary people trying to work their way through all of this stuff, but actually quite dangerous.

Mark: Against the background of the struggles and pleasures, and sometimes pitfalls, of everyday data cultures, what would an improvement in the conditions of life in everyday data cultures mean? What types of things do you think it might be useful to strive for as we try to adjust to the conditions under which we’re currently living?

Anthony: I think there’s probably a couple of angles to this. One is that interesting things happen when we work collectively, when we work together, when we share knowledge and we share understanding, and where we work outside of the boundaries of, say, organisations, technology companies, even government. I think we have to acknowledge that there needs to be, you know, a conversation and dialogue across all of those levels and across society, and that’s really hard to do, but I think it’s happening in different ways.

One of the really good examples that we mentioned before is around Indigenous data sovereignty, which is a kind of global movement, but it’s really bringing together a diverse range of Indigenous groups in different countries who have a mission to rethink what data is and what it does for them. One of the exciting things about that movement, to me, is that it opens up new ways of thinking about the benefits of data, but also what we need to avoid and move away from. So, you know, if a bank is collecting data around financial practices and can see different ways that it could adjust its financial practices for particular groups who could benefit from different kinds of services, then that would be, I think, a really positive use of that information.

It doesn’t tend to be the case, though; we tend to, you know, be working in a sometimes quite punitive fashion with data, looking for problems or trying to identify anomalies or outliers or whatever, and trying to stamp them out. So I think there’s a lot of possibilities in that sense.

Jean: And one other idea that we mentioned briefly in the conclusion to the book is really building out the idea of what citizen science would look like for the contemporary digital environment: something that would really take seriously everyday experiences, and go beyond the sort of passive data donation model to really enrolling people in translating their lived experience into our broader understanding of how datafication’s actually working on the ground.

Kath: And the notion of consent to sharing data, or to aggregating data, or using data for specific purposes, has been defined, you know, from the top down in very blunt terms: tick the box, you consent or you don’t. And one of the things that we thought about when we were looking at marginalised or minority sexual cultures is the richness of the negotiations of consent that have emerged from cultures that don’t operate, you know, around binary gender or binary notions of sexuality. Similarly, they don’t have an on/off button for consent in those cultures; there are very dynamic notions of consent and very contextual notions of consent. And there’s a big push among certain people in the tech community to kind of build an app to help us understand consent, and in a way, it’s like: no, the tech people actually need to understand the ground-up cultures of consent as they play out in sexual subcultures and relationships, and maybe redesign the way the apps or the platforms are working to take account of all of the everyday dynamics of people’s understandings of data and what they want to do with data and what sharing data is for.

Mark: Maybe this will also help them rethink what consent means when we tick a box to have them collect all of our information. And one of the really wonderful things about the book, I think, is the way in which you find empirical evidence and theoretical arguments to navigate the Scylla and Charybdis of “tech is horrible, it’s going to destroy us all” and “tech is wonderful, it will save us all”. This is a book about complexity, and I think in complexity you find the resources for hope. I wonder if you might want to point to some directions that you find hopeful, if there are some in addition to what you’ve already discussed.

Anthony: I think, look, we need to be pragmatic, and I think that’s the middle ground between the optimism and pessimism. Being pragmatic is about understanding what’s happening with people in their situation, in their everyday life; that’s what we keep coming back to. I think the work that we’re doing in the Centre of Excellence for Automated Decision-Making and Society is really about looking at those practical and everyday contexts and trying to build an evidence base about what’s happening and how we can intervene, and we’re actually working with lots of organisations who are doing this on the ground.

So to me, that’s where there’s real glimmers of hope. It’s not to say that everything works out all the time, but it’s actually the striving towards new methods, new ways of building participation. So, getting people involved, and partly that’s the citizen science thing, you know, empowering people to understand how data works in their everyday life and to be part of research projects, or part of, you know, new initiatives with their local council or with a community organisation in their area, and we’re seeing that happen around the world in different places.

The thing that we need to be careful of is that all of this work is often very under-resourced; it doesn’t have the same, you know, venture capital backing as the big tech companies or the startups have. But I think if we can change the culture around the value, the social value and human value, that goes with data, not just the commercial value, then that’s where we find new directions.

Jean: I think I would distinguish between kind of cheery optimism and hope, which I see as a kind of deeper emotion that is also tied up with things being a bit messy and a bit hard, and living through that regardless. So, well, we know the Leonard Cohen quote, right, about there being a crack in everything, and that’s how the light gets in? So I would hope that a person reading our book, or hearing about our book, who’s seen The Social Dilemma and thinks, “oh, it’s true, big tech is controlling my brain”, reads our book and goes, “oh, actually, I just did this thing with this app so that it couldn’t read what I was doing, because I didn’t want my mum to see it”, and realises that they’re actually living with agency in the many, many cracks between the apparently seamless and perfect and magic face of big tech.


Mark: And so here we are in the ARC Centre, working the cracks, and of course without that type of hope I don’t think we’d have a mission, so it’s great to hear about the resources for hope that you’ve provided in the book. Thank you so much for putting that book together and providing us with those resources for the research that we’re all doing now and in future research.

And congratulations on a really engaging collaboration that will provide all of us working in the Centre and beyond with ample resources, reference points and inspiration.

Jean: Thanks, Mark.

Kath: Thank you

Anthony: Thanks Mark.

Mark: You’ve been listening to a podcast from the ARC Centre of Excellence for Automated Decision-Making and Society. For more information on the Centre, go to admscentre.org.au. That’s admscentre.org.au.