News and Media Symposium – Overview of Critical Challenges and Opportunities in News & Media Around Automated Decision Making
6 October 2021
Prof Patrik Wikstrom, QUT node, ADM+S
Prof Axel Bruns, QUT node, ADM+S
Prof Jean Burgess, QUT node, ADM+S
Prof Julian Thomas, RMIT node, ADM+S
Dr James Meese, RMIT node, ADM+S
Prof Patrik Wikstrom:
Welcome everyone, to the first session of the symposium. The title of this session is Overview of Critical Challenges and Opportunities in News and Media around Automated Decision Making, and we have a stellar panel for you with the two of them sitting to the right of me, and Julian and James online. And I will take the liberty to introduce all four of them.
Very briefly, starting with Julian. Julian is, as you know, director of ADM+S and a distinguished professor in the School of Media and Communication at RMIT. He works on histories of new communications and information technologies, digital inequality and inclusion, and internet regulation and policy. Professor Axel Bruns is an Australian Research Council Laureate Fellow and a professor in the Digital Media Research Centre at QUT. He's also, of course, a chief investigator in ADM+S. His current work focuses on the study of user participation in social media spaces and its implications for our understanding of the contemporary public sphere, drawing especially on innovative new methods for analysing big social data. James Meese is a senior lecturer at RMIT and holds an early career research fellowship from the Australian Research Council to study the algorithmic distribution of news. James is an associate investigator in ADM+S, and within the Centre he is collaborating on a project that will explore how advances in telecommunications infrastructure will inform the future development of automated decision-making systems. Jean Burgess is Associate Director of ADM+S, a co-leader of the data program, and convener of the QUT node of the Centre. She's a professor of digital media at QUT in the School of Communication and a current member of the ARC College of Experts. Her research focuses on the social implications of digital media technologies, platforms, and cultures, as well as new and innovative digital methods for studying them.
So with that we- oops, it's all falling to the floor. We will be starting with Julian Thomas, who will give us an introduction and set the stage for this session, and that will be followed by statements and presentations by Axel, James and Jean, in that order. So Julian, over to you.
Prof Julian Thomas:
Thank you very much Patrik. Yes, we thought it would be useful to start with an overview, a general discussion of the sorts of challenges and opportunities that arise across news and media, especially, you know, in the light of our work on automated decision making in general – to situate that work across the Centre and across our broader research concerns. I'm coming to you today from Wurundjeri country in Melbourne, and I'd like to acknowledge the traditional owners of the land on which I'm working. I'm very sorry that I can't be with you today. I would love to be there – I've had the pleasure of visiting QUT and the Digital Media Research Centre many times, and I am very much looking forward to being able to do so again. But I'd like to thank you all at QUT, and at the Centre, for doing so much to make this really important first event for us possible in Brisbane. And we're also showing what we can do in the circumstances that we're in, so it's wonderful to be able to have this conversation. I thought it would be good to say a little bit about where this area of news and media, and what is going on there with the new technologies that are emerging, sits in the context of the Centre's broader research agenda – why we see news and media as such an important area for us. So why are we starting here? Since this is the first of our big public events, let me just say a little bit about the Centre.
ADM+S, as we call it – the ARC Centre of Excellence for Automated Decision-Making and Society, as the long name goes – is really about bringing together researchers from many disciplines, from the humanities, the social sciences, and the technological sciences, to address the social challenges of a whole wave of new decision-making technologies that we are seeing appearing right across many areas of social life and the economy. These technologies are quite diverse. Sometimes we talk about AI, but when we look more closely we find a whole array of different sorts of new devices – machines, software systems, technologies from machine learning to the blockchain – being applied in all kinds of different ways across all of these different domains of everyday life. So we know that automation is moving very rapidly across many sectors, and it's been accelerated, of course, in the last year and a half as a result of the pandemic that we've been living through. We know that it's been proliferating particularly rapidly in the areas of news and media, so this is one of the main reasons why we are starting here. It's not just news and media: in the Centre we are also interested in a number of other critical empirical domains, from health to social services, mobility, and transport, but we know that a great deal has been happening in this space. And we have the advantage of an enormous amount of expertise and research that's already been going on in this field – not least, of course, at QUT in the DMRC, and elsewhere across all our partner organizations. So when we look at news and media we're not starting from nothing when we're thinking about automation; we're really building on a very rich vein of scholarship that goes back some time. But we know that there are serious immediate challenges that we want to be thinking about.
When you see new decision-making technologies appearing at the pace at which they are across all of these key sectors, we know that they're likely to produce both spectacular successes and spectacular failures. That's the pattern we've seen, because they are so new. So the question our Centre is keen to address is: how do we mitigate those risks – the risks of spectacular failures, which we've all seen? How do we help ensure that these new systems are, as we often say in the Centre, in our language, responsible, ethical, and inclusive? Whether we do that through design from the outset, through regulatory strategies of one kind or another, through legal interventions, or through different kinds of economic incentives – we are sure there are many ways in which it can be achieved, and we are seeking to explore all of those. Our starting point – and this is one of the things that makes this Centre, I think, distinctive, and the conversations and discussions that go on within it unusual – is that we see these new modes of automation, these technological systems, as social as well as technological. They arise, in our view, not just from new machines but from the interaction between machines and people, and from the interaction between information or data and institutions. And so we're very interested in the dynamic relationships between all of those elements. We spend a lot of our time going very deeply into how all that works. So it's all the more important as a Centre that we get together, as we're doing now, to think about how that plays out in relation to a particular domain – in this case, of course, news and media. So we can bring our knowledge together from across all of our disciplines – not just media studies and communications, but also computer science, sociology, anthropology, economics and so on – and focus on the particular challenges that arise. And there are many of them.
As for those particular challenges: we think of news and media as a hot zone for automation, from end to end. If you think about just the sort of work that journalists do, what we find is human tasks being transformed into automated systems, as I say, from end to end. Identifying and sourcing news information is now the work of search engines and automated news discovery systems. Developing and creating news stories is also automated, through automated content generation systems in many cases. The programming and sequencing of content for discovery by readers or audiences, which was once the work of editors, is now the work of recommender systems. Engaging with readers and audiences now happens through content moderation systems. Selling and placing advertising, so central to media businesses, now happens through what we call programmatic advertising – automated advertising markets. Marketing and pricing of media products now happens through automated paywall managers. So this is why we think of news and media, as I say, as a hot zone for automation. We have the enormous advantage of the extraordinary resources of the research that has been done over many years at the DMRC, and at other partner organizations within our Centre. We're very lucky to have those things, but we also think – and this is the last comment I suppose I'd make – that we can learn a lot from the work that's been done around the extraordinary transformation of this field when we look at other areas of life. So if we think, for example, about the work that's been done on content moderation – how does that change the way we may think about how some of those sorts of tools might be used in, say, interactions between citizens and government? That's very important, but it may also be that expertise from other domains can help us understand better what is going on in media, communication, news and journalism.
So that's the bet we're placing as a Centre: getting people together to have the sorts of conversations, and to undertake the sorts of collaborative projects, which we haven't had the opportunity to do before. That, I think, is what's really exciting about it, and that's why I think it's very fitting, and a great chance for us, to start our public work as a Centre in this place and with this topic. Thank you.
Prof Patrik Wikstrom:
Thank you so much Julian. That was a great overview and start to this session. So now we turn to our next speaker, Professor Axel Bruns, who's sitting at the other end of the podium here in Brisbane.
Prof Axel Bruns:
Thank you very much, and thank you Julian for that great introduction, both to the Centre itself and to the news and media focus area. As Julian said, this is the first of the four focus areas that the Centre of Excellence will address. Of course, when we planned all of this, when we developed the application for the Centre, we couldn't have anticipated some aspects of what was to come – the way that particularly this damn pandemic would heighten the importance of some of these questions. Particularly when it comes to the dissemination of information about COVID-19, to the challenges of mis- and disinformation, and to the questions of what information recommendation systems like search engines would provide, and so on. So all of this, I think, has only become more important as we've gone through these last couple of years or so of the pandemic.
We do have problems here…
Thank you, let's see if that works better. I do want to make clear that this isn't just about news – the focus area is very distinctly called news and media. So there's plenty more that this focus area is working on beyond journalism and beyond the news as well. And as Julian said, in addressing news and media we are addressing the full range of these fields: from the practices of the industry itself, and the technological developments and institutional settings that exist there, through to questions of policy, regulation and legislation in this field, all the way through to user activities and user experiences, and everything else that happens on the user side – the user input into all of these developments and technologies as well. So we're covering the full range; we're not focusing simply on the industry, or industries, of news and media. There is a really broad range of projects now underway. Again, I encourage you to have a look at the interactive wall downstairs, as well as our website – we've posted quite a few podcasts and other bits of information about what's going on. But I wanted to highlight maybe a handful of these, to give you an idea of the breadth of activities that are going on. So when we talk about industry and policy developments, we've got projects like the work of one of our postdocs, Silvia Montaña-Niño, who is investigating the emergence of ADM technologies, of AI, of automated news writing and all these sorts of technologies in the news industry, via interviews with newsroom workers – so really to understand what is actually happening in Australia. There's been lots of talk about AI, there's been lots of talk about automation, there's been lots of talk about algorithms, but how are they actually used in practice? To what extent is the hype actually being realized, or perhaps debunked, in the industry?
As well, we've got colleagues like CI Mark Sanderson, who's working on considerate and accurate recommender systems – the next generation of the sorts of search engines, recommender systems, and so on, that we have today. We've got colleagues in law and policy, like Megan Richardson and Christine Parker, who work on the governance of ADM systems. We've got colleagues like Jean and Julian who are working on data ethics, together with quite a broad range of others across the Centre as well. So there's a whole bunch of different aspects of the industry side and the policy side that this news and media focus area covers. And then on the user-centred side – and I don't want to say 'end users', because these are not end users, they're participants, active participants in these processes – we've got work that looks at the public conversation about ADM systems, AI, algorithms, and so on, and Jean's leading this. We've got colleagues like Tim Graham, who's working on understanding and identifying automated agents in social media – bots, for short, but there's much more going on than just bots in these spaces. We've got work that Julian's leading on mapping the digital gap – understanding the level of, and the problems with, the digital inclusion of Indigenous communities – and that's with support from Telstra. And as I finish here, I want to focus on a couple of projects that are probably particularly close to my heart, because I'm also quite involved with them, but that I think are also very innovative in what they do. These are data donation projects that are very actively engaging the general Australian community, involving them as citizen scientists by donating data about their user experiences – and of course some of that is also very much cross-linked with these questions of data ethics.
So you'll hear quite a bit more this afternoon from me and others about our Australian Search Experience project, which assesses the extent and the impact of search engine personalization on the user experience of these platforms, and on the information that people encounter as they use common search engines. We have a project that CIs Mark Andrejevic and Daniel Angus are leading, the Australian Ad Observatory, that monitors the ads that Australian users of social media encounter – again using data donations, asking users to donate data on the ads that they're encountering. Now, we're doing this kind of work particularly because data donations are at this point really crucial and critical for the independent critical review of the power of platforms, from a public interest perspective. There is really no other way to do this. Major industry players can be valuable partners in this kind of research – and I do want to acknowledge that, for instance, Google Australia is a partner in this – but even when platform data are available they should be scrutinized, and we must, I think, follow the principle of 'trust, but verify'. We cannot simply rely on the platforms alone to be doing the right thing; we need to check that what they tell us is in fact correct, for instance. And this is not an example from within this Centre, but one of our colleagues in Italy, Fabio Giglietto from the University of Urbino, recently showed that the data that Facebook provided on misinformation were incomplete – that they covered only about 50 percent of the US population, rather than, as advertised, the entire US population. And it was an independent outside researcher pointing this out, making it all the way into the New York Times for doing so. Only with that kind of independent scrutiny, that independent verification, can we be sure that the data and the information that platforms release are ultimately correct and complete.
Sadly, sometimes the platforms do frustrate such critical, independent, public interest scrutiny by blocking data access or obfuscating information – almost as if they've got something to hide, but we're not going to go that far, of course. And when researchers persist and produce valuable insights into the platforms themselves and the way that they operate, they may even sometimes be attacked personally, which is deeply problematic, of course. We saw that just last month with the suspension of the Facebook accounts of the people behind the New York University Ad Observatory. These are problematic developments, and they show that the kind of work that we and others in this field are doing – trying to scrutinize these major operators in our digital and social environments from the outside – is incredibly important. So the work that we do in ADM+S, and the work that we do in this particular focus area, is even more critical because of these kinds of cases and these kinds of observations. ADM systems ultimately are too important to be left to the technologists alone; they must be developed and scrutinized in dialogue with society, and we are an institution that is trying, respectfully but firmly, to further that kind of dialogue between the different stakeholders that are involved in this. And ultimately that's what these two days are also fundamentally about – furthering that dialogue, understanding what has been done, what can be done, what must be done, and taking this work further. So I might leave it there and hand over to Jean, I think. Or James.
Prof Patrik Wikstrom:
Thanks Axel. James it is, but first thank you Axel.
I should just remind everyone before we hand over to James that please post your questions on Slido and we’ll pick them up at the end of the session. Over to you James.
Dr James Meese:
Thank you Patrik, and thank you Axel for that fantastic overview. Just before I start, like Julian, I want to recognize that I'm on Wurundjeri country, and I want to pay my respects to elders past, present and emerging. So what I want to talk about is what we might think of as the second tranche of work being conducted. These are projects that have come about as a result of the Centre's research training and interdisciplinary collaboration. So we're already seeing emerging work from a new generation of Australian scholars, and knowledge sharing across disciplines, which Axel and Anjalee have already spoken to, and this is going to help us better understand automation across the news and media sector. The Centre is lucky enough to have some of Australia's leading dis- and misinformation researchers devising tools and conducting analyses in a critical area for the country. We've already mentioned Tim Graham; alongside Tim we've got Damiano Spina and Danula, who are exploring information retrieval and fact-checking on voice assistants. Axel has his own team conducting work in this area, and several PhD students are researching misinformation, hate speech, and bots. So we look forward to the results from these projects, and other similar ones, in the future. We also have a growing team of researchers exploring how automation is affecting journalistic practice. Again, Silvia Montaña-Niño's work on newsroom experiences and the adoption rate in Australia has already been discussed, but we're hoping to support this work with targeted studies of specific publishers and news genres. So this sort of research helps us all better understand the risks and opportunities that automation presents for the Australian media industry more generally.
We also have fantastic PhD scholars like Sam Kininmonth, who's doing a deep dive into how automation is actually deployed across the programmatic advertising industry, which is particularly valuable in light of the ACCC's recent release of its ad tech report. Other members – and this is an area I'm really excited about – are exploring how automated distribution is shaping access to content across different platforms. So we have students like Louisa Bartolo researching recommendation systems across Amazon and Twitch, and more senior researchers devising unique methods and tools to explore questions like: how does YouTube recommend content to us? How do smart TV systems work? While worries about filter bubbles might be overstated, we still know the power of platforms: they can shape content delivery in certain ways, and as has already been mentioned, the question of whether we can implement responsible recommendation is an active issue for regulators, industry, and consumers. Other researchers are focusing on the increasingly automated process of content moderation. One strand, which Jean will talk to more and you'll hear about later in the program, is the removal of borderline content. This is content that's kind of 50/50 – it may or may not be in breach of the platform's terms of service – and it comes up particularly around the removal of consensual sexual material and the shadow-banning of its producers from platforms. So a critical issue here is: can we introduce due process and transparency around some of these decisions? All of these areas clearly have implications for industry and policy makers, and it's really pleasing to note that, already at this early stage, researchers across the Centre are engaging with stakeholders, disseminating results, and in many cases actively collaborating with stakeholders to solve problems.
We look forward to continuing these sorts of engagements into the future. So I'll leave it there, but I hope this short overview gives you a sense of the sort of work we're starting to focus on, and an introduction to some of the fantastic researchers who are working across the Centre – and I think Jean is going to introduce a few more. So thanks, Patrik.
Prof Patrik Wikstrom:
Thank you so much James. That was a great overview and as you said, next is Jean. Thank you.
Prof Jean Burgess:
Thank you, Patrik. Thank you, Julian, Axel and James. You might think there's not much left to say, but I've got more shout-outs to give to early career people, and perhaps it's my traditional role to make a special appeal to the importance of the media part of news and media – and perhaps an enriched idea of what that might mean. It's of course very understandable, particularly at this moment, that there's so much focus on how various forms of automation and ADM might both promote and help us deal with misinformation and disinformation, and other threats to the integrity of the information environment or to digital inclusion. But there is of course much more to media than news and information, for better or worse. In societies like ours – which is to say, mediatized societies – the media are the means by which we come to know, understand, and give meaning to the world and our place in it, and to our most intimate relationships with others. And digital media are also the means through which we help to shape, and struggle over, those meanings and experiences. They may also, believe it or not, be sources of joy and pleasure, connection and belonging. We should probably try not to forget that amongst everything that's going on with platforms and regulation. So, well beyond news and information, algorithmic content curation and moderation are also playing a major role in the characteristics and dynamics of our shared or common culture in all its diversity – and in who gets to participate in that culture. And of course, as well as representing a substantial sector of the creative economy, popular culture and entertainment matter deeply to people and communities. You might even say these are the forms of media that make life worth living – perhaps even more so right now during the pandemic, as our colleagues in lockdown may or may not agree. That's why it's so good to see new strands of work emerging around recommender systems in Australian screen culture.
As in Kylie Pappalardo's ARC early career fellowship work, and in music, as in Patrik Wikstrom's work. And as James already mentioned, some of our early career researchers are looking at the challenges of moderating humour – that's Ariadna Matamoros-Fernández – and sexual expression – that's Zahra Stardust. And, moving up the career chain, presenting sexual health information on social media – that's Kath Albury, with her recently awarded ARC Future Fellowship. In these areas there's so much more at stake for communities than just balancing free speech with harm – those kinds of rational, deliberative models of what matters in media. And as James noted, all these researchers are already working directly with impacted communities and with industry stakeholders, to try to chart a path forward through this very complex territory. Digital media are also increasingly mobile and situated – in the home, in our cars, and on our bodies. So it's important to recognize here that work on everyday and domestic technology use is in scope for us, in thinking about ADM and automation with respect to news and media. There's important foundational work, particularly by the group at RMIT – Rowan Wilken, Yolande Strengers and Jenny Kennedy – on domestic digital media, and particularly Jenny and Yolande's book on smart home devices and the politics of gender. This stuff is really going to be central, I think, as the news and media environment, including the distribution of news, merges and converges with our everyday domestic lives. And thinking about our colleagues, also at RMIT, and all their fantastic work on the recommender systems of the future: what would it mean to think about responsible, ethical and inclusive personalization and recommendation in the context of a smart speaker, if we think beyond fact-checking, as important as that is?
What do trust and safety look like in that context, if we consider the dynamics of coercive control in domestic violence situations, for example, and the well-documented role of smart home technologies in those situations?
But the platformization and convergence of the broader media environment also means that, as Axel was just saying, to study any of this we really have to develop rich, hybrid new methods and techniques – quite creative ones in some cases – to actually understand, and even observe, the data operations of platforms and their many corporate partners, advertisers, and intermediaries. That raises major challenges for researchers with respect to data access, and some of these new methods, including those based on data donation, may bring us into direct conflict with the tech companies that provide these platforms. So that actually provides an opportunity for a major piece of knowledge work and leadership among researchers in the Centre, to help guide our institutions through the implications. And a group of us in the Centre are already collaborating on a project to broker a national conversation on the issues around data ethics and public oversight research on platforms, in an increasingly risk-averse institutional environment. So as Julian said – and as I hope you're getting by now – we need far more than media studies people to address these kinds of challenges. We really need work across the Centre's four programs, and perhaps across disciplines that have traditionally worked on other focus areas like health or social services, to work on all this stuff. And I'm really looking forward to the next few years, doing it with James and all of you. Thank you.
Prof Patrik Wikstrom:
Thank you so much Julian, Axel, James and Jean. It's fascinating to hear about all the projects and the fantastic people in this focus area and across the Centre, and what is going on at the moment. So please bring your questions to Slido – put them in there and they should pop up on my screen. While you're contemplating that, I have a couple of questions that I'd like to put to the panel.
I can start with something really broad – I don't know if this is something for Julian, or Jean, or for everyone. But look, your Centre will be around for a very long time, for seven years, and the world will look very different at the end of the Centre than it does now. How will you be able to stay on top of things? How will you be agile and not stick to what you're doing right now, which probably won't look that great in seven years' time?
Prof Julian Thomas:
Shall I have an initial go at that, Jean? I'm sure you have things to say as well.
Patrik, this is actually like one of those questions we encountered in our interview with the ARC when we were applying for the Centre. But I think it is a really critical question, because we mustn't mortgage our research programs and all of our planning entirely to what we see around us right now. So I suppose one way in which we've done this is to frame our research programs quite broadly, and to make sure that, as we plan our resources and develop our research plans, we're leaving some funds aside so that we can identify new areas and respond to the sorts of questions that our industry partners and others raise. We can respond to the ideas that arise from our students and from our early career researchers, because we know that things will change, and it's essential that we haven't locked in seven years of research just on the sorts of things that you've been hearing about right now. It's going to have to evolve, as you say.
Prof Jean Burgess:
I'll just add to that, Julian, that in a humanities and social science driven research centre we spend most of our money on people, and the current cohort of early career researchers are already driving our research into new areas, including through the kinds of contestable funding that you mentioned. But there'll be a new cohort coming on board in another three years, and it's really important – and I hope everyone here who's on a postdoc agrees – that we've really tried to make sure there's plenty of creative agency and autonomy in the research programs they're pursuing, in alignment with those broad program areas.
I do feel like we’re at the interview again.
Prof Patrik Wikstrom:
Okay, I'll take another one, which won't be exactly like an interview question then. So I'm really interested in what you mentioned at the end, Jean, about the fact that we very much focus on the critical aspects of automated decision making and how we can address those problems, and you wanted us to make sure that we also focus on the positives. So if I may be a bit provocative and say: isn't that what the Googles and the commercial tech giants are already doing? Isn't that their job? Aren't we supposed to be critical and just focus on the problems that they don't care enough about?
Prof Jean Burgess:
No. Certainly, critical social theory, critical data studies, and critical platform studies are all really core scholarly activities for many of our disciplines. But even within the humanities and social science part of our Centre, we have many people who work in participatory ways to envision future technologies, in collaboration with people who build things, and that's a really important part of the overall Centre's mission. It is the Centre's mission, phrased in positive terms: what would responsible, inclusive, and ethical ADM look like? So it's actually our job to help envision those more positive futures, and that's really important for maintaining intellectual energy as well – otherwise you just become purely reactive to whatever terrible thing is in the news about Facebook this week.
Prof Patrik Wikstrom:
Axel, James, Julian – do you want to comment on that?
No? In that case I have a question from Tim Graham, so I'm reading from my laptop here. It says: 'wondering if you could comment on how science fiction films such as Blade Runner have shaped, or translated into, our perspectives about bots and AI today.'
Prof Axel Bruns:
I can take a stab if you like.
I think one of the aspects of science fiction is, of course, that it tends to take things quite a bit further from where we are today, and so everyone's expecting, you know, a fully functional general AI, as in the film Her, or as in Terminator, for that matter – something that is entirely self-aware and self-programming, and able to respond to any situation. We're very far away from that, from all I know, but specialized AI – and below that, more specialized algorithms and automated systems, of course – are very much here. So I think the danger in looking only at the science fiction side of things is that we kind of look for bots everywhere – as you're very familiar with as well, from your work – when perhaps what we should be looking at is something that's much more hybrid, much more in development, much more breakable, and much more malfunctioning. And not malfunctioning in the way that the Terminator malfunctions, but malfunctioning just because it's really poorly programmed, and some programmer has a very limited life experience and therefore doesn't anticipate how other users might actually engage with their systems. So that, to me, is actually a much more interesting question: looking at these early stages, the really existing stages of automated decision-making systems and everything else around them, rather than focusing only on that kind of ideal model, or dystopian model, of a fully functioning, fully independent, generalized AI system.
Prof Patrik Wikstrom:
That’s great, and it actually takes us to the end of this session. So with that I’d like to thank Julian and James and Jean and Axel, for your excellent insights, and for giving an overview of the critical challenges and opportunities in this focus area. So thank you so much everyone.