EVENT DETAILS
Disciplining The Market Or Being The Market – Alternative Data Governance // Economies
23 April 2021
Speakers:
Prof Kimberlee Weatherall, University of Sydney node, ADM+S
Prof Julie Cohen, Georgetown Law Centre
Assoc Prof Kean Birch, York University
Prof Seth Lazar, Australian National University
Watch the recording
Duration: 1:28:52
TRANSCRIPT
Prof Seth Lazar:
Good morning. My name is Seth Lazar, I'm a professor of philosophy at the ANU, where I lead a project on the morality, law and politics of data and AI. I'm delighted to welcome you to this, the second in the Automated Decision-Making and Society seminar series on alternative data governance and alternative data economies. The series is hosted by the ARC Centre of Excellence for Automated Decision-Making and Society, in collaboration with the Centre for AI and Digital Ethics at the University of Melbourne, and the ANU's Humanising Machine Intelligence project. Our various universities and centres are based throughout Australia, and on behalf of us all I would like to acknowledge the traditional custodians of country throughout Australia, and recognize their continuing connection to culture, community, land, sea, and sky, and I'd like to pay my respects to their Elders past, present and future.
So, this seminar series organized by Jake Goldenfein got off to a fantastic start two weeks ago with Jake's interview with Katharina Pistor and Salomé Viljoen. What I appreciate about it especially is how it moves forward from the correct framing of – if you like – critical studies, to think about what modes of political and economic governance we can imagine to find a path forward from where we are now. In two weeks we're going to focus on organizations and intermediaries. In four weeks we'll consider democratic politics. Today, however, our topic is disciplining the market or being the market. Our discussion leader for today is Professor Kimberlee Weatherall of the University of Sydney. Among her many leadership roles in the Australian community working on AI and society, she's a chief investigator of the ARC Centre of Excellence for Automated Decision-Making and Society. Kim, if you would please take the proverbial mic.
Prof Kim Weatherall:
Thank you Seth. Thank you for all those introductions. So, it is really exciting to have Professor Julie Cohen and Associate Professor Kean Birch with us today for this second in the series, Disciplining the Market or Being the Market. So, as Seth was saying, last time we were talking about data and data governance; in this session we're going to turn our attention to platforms and the way that they aren't just a spot where market transactions happen to occur, but effectively can become the market and can entirely restructure the way that people transact socially, economically, politically, so across all of the important domains in life. There's been a lot of discussion publicly around the use of antitrust and competition law to deal with what's going on with platforms and their multi-sidedness, network effects, etc. But we wanted to explore what the trajectory of these developments has been, where platforms are going, how they are reconfiguring our relationships and transactions, and whether there's scope for thinking about other ways – because, important as competition law is, there are other ways to think about how we deal with platforms and their role – and about what kind of different regulatory thinking we need for this new environment. Because in a lot of ways it seems like many of our old tools aren't working very well, and we couldn't have a better couple of speakers to think about these things. We are extremely fortunate to have Professor Julie Cohen from the Georgetown Law Center, who is a leading thinker, has long been a leading thinker in the space of data, with a background in intellectual property.
You know, I first encountered her work in this space through her book Configuring the Networked Self, where she thought very deeply about questions of privacy in this new environment, and she is the author of the book Between Truth and Power, which examined the whole trajectory of information capitalism and was an important legal response to the discussions that we've had since Shoshana Zuboff's book The Age of Surveillance Capitalism. We also have Associate Professor Kean Birch from York University in Toronto, from the Science and Technology Studies program there. He's the author, too, of many books in this space. You know, I'm looking at his publication list right now. In 2020 we have the edited collection Assetization: Turning Things into Assets in Technoscientific Capitalism. What this market now looks like with this technology in this environment, with all of these models, is precisely the space that he's been working in. So, I'm utterly delighted to have both of these speakers with us today and looking forward to the discussion. The way we're going to structure today is we're going to have short presentations, not great long ones, from Kean and then from Julie. And then I have a series of questions that I wanted to explore more deeply or, you know, bring up to an Australian audience, for a lot of the themes that they've been dealing with. Then we'll throw to our awesome panel of people that you can see on the screen at the moment. A terrific bunch of researchers from our three centres: from ADM+S, from HMI, and from CAIDE, who have promised to be very engaged and interactive, and ask very intelligent questions. And then we'll build in an opportunity for questions from the broader audience. So, that's the plan, that's the running order. Looking forward to the discussion, and at this point I'd really like to just hand over to Kean to start us off.
Assoc Prof Kean Birch:
Okay, thank you very much, Kimberlee. So, it's great to be here and, you know, my thanks to Jake, Loren, Kimberlee and staff for their roles in organizing this. I'm going to briefly talk about this idea of automated neoliberalism today, and I look forward to questions afterwards. So, to start I have to admit that I sometimes struggle with the concept of neoliberalism. I've been using it analytically for, you know, a good 15 years now, but I've found that over time it's become less and less useful, and I think that there are other tools emerging that are becoming more helpful. That being said, I still engage with it, and this is a talk where I'm trying to engage with neoliberalism within contemporary capitalism, whether we want to call it technoscientific or informational or surveillance, or what have you. I just want to thank Jathan Sadowski, whose article 'The Internet of Landlords' was a kind of inspiration for this paper and, you know, helped me to think a little bit about this kind of notion of the automation of neoliberalism. So, to start with, I'm going to discuss neoliberalism a little bit. That's my kind of starting point, and then I'll talk about this idea of the automation of neoliberalism and markets, and then I'll finish with a discussion of reflexivity. I'm trying to keep my eye on the time, so I will try and keep myself to ten minutes and, yeah, I will get on with it now. So, my starting point then is that neoliberalism is generally thought of as this kind of notion of the installation of markets – market competition as the key institution in society – but the problem arises when we think about this era, you know, big tech, where we're at the minute, where you have market concentration, market power, monopolies and the wholesale replacing of markets with digital platforms that are then regulated by these large organizations.
There seems to be some incompatibility here, but I think that we can match up what's happening with this notion of neoliberalism. If we unpack neoliberalism a little bit more and think about the kind of specific ways that it is understood by neoliberals themselves, they have multiple notions of what a market is, the social role of the market, and the ways that markets organize society. And I think there are at least four different conceptions of markets within neoliberal thought. The kind of Hayekian notion of markets as evolutionary information processors that lead, really, to a better future, a kind of greater freedom. A Knightian notion of markets as this kind of idealistic starting condition for the efficient allocation of resources in society. The kind of Gary Becker position, which I would characterize as the idea that society is already a market.
We can understand society already as a market, but perhaps the most interesting one I want to focus on here is the perspective of Richard Posner. So, the legal scholar Richard Posner, and in a great book by Sonja Amadae, Prisoners of Reason, she kind of talked about this perspective of Richard Posner, who, like Becker, assumes that society is already a market. And so we just need to work out, you know, what markets should do and then we can just get on with, you know, assigning through legal processes and so on the rights that reflect the most suitable market outcome. You know, the most efficient kind of outcome. And so we can make decisions on the basis of these kind of as-if markets without needing actual market operations. And I think the Posnerian vision is reflected in some of the key changes around notions of market design that have emerged in policy and political thinking.
There's a great book by Philip Mirowski and Edward Nik-Khah where they talk about the ascendance of market design. Basically, the shift of economists away from looking at markets towards agents, and the kind of jettisoning of the idea that markets are good mechanisms for identifying individual preferences and allocating resources, and rather taking up this idea that, you know, with market design we can design markets in whatever ways we want in order to achieve the outcomes we want. And so you end up with the construction of these kinds of quasi-as-if markets. So, in thinking through some of these issues around, you know, this kind of notion of these as-if quasi-markets, I started thinking about digital technology, the dominance of big tech, and the ways that we can think about the automation of markets, basically through this emergence of market design and the marrying together of that with digital technologies. And I sort of thought about this on at least three levels – I'm sure there are other ways of thinking about this: one around the automation of supply and demand through platforms, one around the automation of prices and pricing through the collection of personal data and data analytics, and the final one around the automation of contracts through distributed electronic ledger technologies – blockchain, essentially. So, just trying to be aware of time here, I'll briefly discuss each of these and then go on to the final kind of issue, which I think complicates some of this discussion around data governance.
So, the first one is the automation of supply and demand through platforms. You know, we all kind of agree that platforms are some form of multi-sided market. We have intermediaries managing the supply and demand on their platforms. We can see different versions of this play out in different kinds of platforms, and what we end up with here is the mediation of supply and demand by these platforms, which set the rules of the game. Essentially, it's a form of private regulation where, you know, you might have a platform controller being a participant in the market, but also being the one that sets the regulations, or the rules, for the operations of that platform. And there are some really good examples of this that come out of the US congressional report on digital markets and competition from last year. So, things like Amazon, you know, self-preferencing on its platform. Facebook controlling access to its social graph. Apple using its privacy policies in particular ways to disadvantage app developers, and such. And the end result of this is you end up with a series of kind of enclaves. You know, a platform becomes an enclave, collecting data from users, charging access to this enclave, and such. The second one then is this notion of automating prices and pricing through data collection and data analytics, and this follows from the intensification of data collection and analysis, largely done through forms of contractual arrangements like terms and conditions. And essentially experimentation on us without our consent, in the sense that they collect data and then use data analytics to make certain particular decisions, et cetera. And this is supposed to enable forms of direct targeted marketing and dynamic pricing, as well as the kind of end point of working out our willingness to pay for particular things.
And you end up with a situation in which certain people benefit while others don't.
So, Marion Fourcade and Kieran Healy talk about people who have what they call this kind of higher digital status than others, and that leads to them receiving perks, essentially. The final one then is around this automation of contracting, through blockchain distributed electronic ledgers, through the implementation of smart contracts. So, the idea that you can do away with incomplete contracting by rolling out, you know, complete contracts automatically. And it's underpinned by the assumption that the future is all knowable, and it ignores any notion of uncertainty, which is a point that Katharina Pistor makes in her book The Code of Capital. And she points out that this then ends up meaning you can game these smart contracts, and you end up reinforcing the kind of informational power and asymmetries that exist through this automation of contracting. So, the final point then is about reflexivity. So, how do we understand data governance around this kind of automation of platforms, pricing and contracts when we start thinking about reflexivity? Reflexivity being the idea that the knowledge claims, or the knowledge we produce, end up changing the world and the behaviours of the things we learn about. And so, I was thinking about this in relation to data. The more data we collect, the more it changes the behaviours and the actions of the people it's collected from. And so you end up with this constantly changing, fluid situation in which, you know, the more the data gets collected, the more that we change, and so on. And we end up with a kind of interesting situation in which it becomes harder and harder to tell whether the data that gets collected is actually useful, as a consequence of the accumulation of all the little lies we tell about ourselves on a daily basis on social media, and such.
And there are attempts as well here by companies to try and sidestep this reflexivity: so, firms that refuse to reveal their algorithms, like Google; firms that have a kind of opaque rule system, like Amazon; and such. And then this leads to the creation of a mini-industry of optimization experts, or experts who know how to deal with the kind of big tech vagaries. So, I'll finish there. Thank you very much and I look forward to the questions.
Prof Kim Weatherall:
That's a terrific start, Kean. There's a lot to unpack in there and I'm looking forward to unpacking it soon. Julie, I'll turn to you and ask you to give us a brief rundown on some of the big ideas that you have in your work.
Prof Julie Cohen:
Okay great. Thank you, and I’m going to just get this slide deck rolling. Oh goodness, hang on. Resume share. Okay good, I’m sharing at the moment, you can see the slides? Thumbs up. Yes, okay. So, thank you so much uh for inviting me to participate and to Kean for that very interesting set of remarks. I’m going to just speak about the question of disciplining the market or being the market from the perspective of this 2019 book in which I took a Polanyian perspective on the question of platforms and what they are doing, and what they’re about. And I just need to ask for a thumbs up again because if you can see the slides, because I’m getting a message that suggests that you can’t see them.
Prof Kim Weatherall:
We can. We can see your slides, but we can see the full page. It's not in viewer mode at the moment.
Prof Julie Cohen:
Okay, hang on just one second. This is not good PowerPoint. Technology is not my friend sometimes.
So, I find it useful to think about – as I said – the question of what platforms are about and what they're doing in terms of a process of transformation in political economy that has been underway for some time, and that fits pretty neatly within kind of a Polanyian frame of sort of political economy slash economic sociology, by analogy to some of the points that Polanyi made about the evolution of industrial capitalism. And so, in his classic study he described a lot of very interesting sort of entailments of the transformation of the British system of political economy from an agrarian one to an industrial one, and talked about the appropriation of inputs, but also about kind of a conceptual reorganization. A process of learning to see the inputs to production as commodities, which was sort of a new thing to be doing, and in particular learning to see labour, land and money, inputs to production, as commodities. And as I'm sure you all know, he thought they were fictitious commodities because in fact, when pressed to behave like commodities, they would not, and all sorts of very interesting and calamitous failures ensued precisely from the failure of labour, land and money to behave like commodities. But nonetheless the transformation to an industrial and capitalist political economy demanded that these inputs be treated as commodities, and so they were. And there was also kind of an organizational change, in which patterns of barter and exchange that were previously very easy to understand as local and contextualized and situated in their material places became sort of understood as detached and re-embedded in a fixed fictitious construct called the market, which made it easy to develop entire theories about the market in terms of abstraction, right. In terms of sort of a fictional place where the forces of supply and demand encounter one another.
And then they transact over goods and we don't need to worry too much about those pesky and heterogeneous and hybrid places and materialities. And it struck me then, and it strikes me now, that the parallels are just really good between that analysis of the emergence of industrial capitalism and the transformation that has been underway during what some have called the long 20th century, and now into the 21st, from industrial capitalism to informational capitalism. Informational capitalism is a term that I have borrowed from Manuel Castells, and it refers to sort of the continuation of capitalism as a mode of production, but a shift toward informationalism as the preeminent mode of development. So, knowledge accumulation, knowledge enclosure so that surplus can be extracted from it in accordance with the dictates of the capitalist mode of production, increased complexity in information processing, etc. And you see three shifts that parallel those that Polanyi chronicled in the prior transformation: a great emphasis on the appropriation of new resources that are needed as inputs to informational development, and now there's a lot of stress laid on intangible resources. The inputs to production that everyone had formerly had to learn to see as commodities remain commodified, but now they become datafied as well. They become re-envisioned in terms of streams of datafied inputs to production, even land. And then there's a fourth one, which is data flows extracted from people. So, people become datafied and commodified inputs to production as well. And then we see the emergence over decades, right – so, it's not just social media, but over decades – of platforms, which are interesting creatures because they are not abstractions.
The way the market is, right. Patterns of barter and exchange that had previously become detached and re-embedded in the market now become sort of re-materialized and re-embedded in platforms, because platforms have protocols and platforms require conformity to their protocols. You know, the visible ones that we all love to argue about today, and that are in our disputes about disinformation and competition policy, are ever present in the mind. But if you look back over the last several decades, platformization has been everywhere. It's been in terms of platforms for financial trading. Platforms have sort of infested all stages of the labour market and all stages of sort of transnational supply chain planning. Everything has become platformized, and platforms excel at taking these streams.
So, just a little bit of a deeper dive into datafication and platformization in case there are sceptics out there. I think a lot of the behaviour that we've seen in labour markets, in terms of sort of outsourcing of human resources and just-in-time flexible scheduling of workers, a lot of what's going on in financial markets, a lot of what's gone on in terms of securitization of interests in land.
So, alright. Here we go. So, platforms take these inputs – and again, not just our social media platforms, not just our Amazon and Palantir, but across all these different sectors where we've seen platformization – and what they do is they sort of perform and instantiate and entrench a new kind of infrastructure-based disintermediation of previously central market actors, and a re-intermediation for their own benefit. And they rearrange the flows of data for access, right. So, in terms of the theme of disciplining the market or being the market, I suppose you could say that as they rearrange the flows of information, they are being the market. They're rearranging the flows of information to make them more legible to themselves, the platform intermediaries, so they're disciplining the market. They are extracting a whole lot of surplus from those flows. But at the end of the day, I would have to say they're neither disciplining nor being the market, but rather transforming the market, right. And that's the point back here: the market in a sense was always only ever a fictionalized construct, and it mapped well to the discourses of industrial capitalism. It's still, you know, it's still performing a discursive role in the narratives of neoliberalism that Kean spoke about, and that plenty of other people have written about. But the organizational logic of informational capitalism is not the market. It's the platform. And I think it's important then to understand platforms as transforming markets, more than either of the two options in the seminar series title.
Now, I have no idea for how long I've been speaking, but I'll just do a little bit more. So, my own sort of point of entry into this was a specific curiosity about the role that law and legal institutions play in producing this transformation. A specific frustration with a way of thinking about that encounter between law and the political economic transformation to informational capitalism that sees law as sort of the passive thing that has to respond or get out of the way, or do something in reaction to the transformation that is going on. And that's very clearly not the case, right. And it was very clearly not the case in the first transformation from agrarianism to industrial capitalism. You couldn't have done the things that were done in the name of inventing and entrenching industrial capitalism without using legal institutions to do those things. You had to constitute corporations and sort of ravage the natural resources of the world and appropriate them and invent capital – and, you know, go read your Katharina Pistor for some of that, and your Kieran Healy. And definitely read your Polanyi. Law doesn't just respond or get out of the way. Law is mobilized to produce not just the protective counter-movement, but to produce the transformation in the first place – the thing that makes the protective counter-movement necessary. And in fact, what we see, and what I then sort of delved into at great and gory book length detail, is all of the ways in which law is mobilized and deputized to engage in legal entrepreneurship around making this transformation happen, and making it happen in a way that benefits interested parties. And so you do see the large platform firms and other large and powerful tech economy actors again mobilizing legal tools, re-optimizing structures that will aid their sort of surplus-extractive projects. Particularly around intellectual property rights, particularly around certain forms of corporate law that are more efficient for those purposes.
Particularly around sort of, not just reorganizing the built environment for ease of data harvesting, but propagating narratives that forward the project of data harvesting: by sort of constituting all of our data as a public domain that is available for interested parties to appropriate, and then by using other narratives of enclosure and appropriation to justify the way that they lay claim to the data and, again, work to extract surplus from it. We see other narratives being mobilized around baseline relations of accountability and disempowerment. Generally, and unsurprisingly, the disempowerment sort of redounds to ordinary people, and the immunity from accountability redounds to the benefit of platforms and data brokers and the like. But it's contested and performative, right. There are these narratives of responsibility: state actors say you must remove terrorist speech, or you must remove hate speech, or you must remove child pornography. And platforms agree that some of it should be removed, but there is then a tussle, a contestation, over who gets to control how that is done. And at the end of the day, when the dust settles, very often you have these sort of performative accountability structures like the Facebook Oversight Board, but a lot of self-governance being done internally without a whole lot of accountability for how that happens. And these are all sort of processes of legal entrepreneurship that are going on. And then there's a whole other side of the story to tell, which is sort of outside the frame of this seminar session, so I'm not going to trouble you with it unless people want to talk about it more. That has to do with sort of reshaping forms of legal institutions themselves to make these other projects work more smoothly. So, I think that's probably enough from me, for purposes of getting the conversation going.
So, I’m going to stop the share and I’m going to revert the hosting back to whoever gave it to me in the first place, and I’ll turn it then over to you Kimberly.
Prof Kim Weatherall:
That's terrific. Thank you, Julie. And I think we've already taken the hosting away from you, so don't worry, that's great. There is so much that I want to unpack and discuss in all of that with both of you. I'll start with a question about the platforms: you both talked about the way that the platforms are like market designers, getting to set the rules and design the rules. If they are designing the rules, if they are effectively setting the rules of these quasi-markets to get the outcome they want, they can redesign those rules at any time, I assume. So, what's the impact of that on participants in these markets? I might turn to you first, Kean.
Assoc Prof Kean Birch:
Sure, thanks for the question. So, I think that this is something that we can see play out in, like, the US congressional hearings that I already mentioned, where you had the hearings on big tech and digital markets competition, and so there was some really interesting testimony that we got to hear there from a range of different people, including the CEOs of the big tech companies, who were, you know, kind of asked to testify, basically. And I think that some of the examples of this kind of ability to redesign the rules as they wish play out in pretty problematic ways when you look at the impacts that these large companies – well, especially the large platform companies – can have. So, things like the withdrawing of access: I think one case was a discussion of Facebook and its withdrawing of access to its social graph, for example. That was seen as a big issue. So, the social graph is kind of all the information on people that connects all these people together in this big kind of social network. And so it's a really useful resource for companies who want to develop new products, services, etc. But in order to access it they need to, you know, they need to sign over certain things so they don't end up competing with Facebook. And Facebook can withdraw access at any moment. So, it's a problem if you are a company that's reliant on it. Another example would be Amazon. So, you know, third-party sellers on Amazon, whose livelihoods depend upon what seems to them to often be a very arcane, opaque kind of system of regulation of their actions. And so they can suffer. Again, the same sort of thing, where they are blocked from selling on Amazon and they may not actually understand the reason why, and then they have to go through a whole process of trying to get back onto the platform.
A final one would be Apple and app developers, and their need to follow Apple's policies or get kicked off the App Store.
Prof Kim Weatherall:
That’s great, thanks. Julie, did you have a comment on that? I know that you’re also interested in the power that platforms have to shape these interactions.
Prof Julie Cohen:
Yeah, I mean, I think it's important not to lose sight of the fact that they're playing a much more sophisticated game than is suggested by the narrative that they can just make the rules and change the rules whenever they want. The game they're playing is about sort of imbuing their authority over all things transactional and algorithmic with sort of an aura of being natural and organic, and a matter of their entitlements. And there are categories that get naturalized, right. So it matters a lot. There is information that flows within data-driven algorithmic processes, internal to the platform, about which sellers or buyers get prioritized, and so on – at present largely completely opaque to the rest of us, except for the occasional journalist who gets lucky doing black box testing. And it is important to understand that the narratives that are told about those processes are under construction, right. Are the processes themselves knowable? Are the processes proprietary? Are the processes expressive? Are the processes the organic result of people, buyers and sellers, being matched with what they want? – which is sort of that narrative of the market, the one we're now in the process of transcending, being used to do work. None of those people is dumb enough to show up in Congress and just say: we get to make the rules the way we want. Even though I'm sure they might wish to. And there are incredible sorts of structures of performative accountability being created in certain domains, right. So: let us put up a transparency report, let us show you the many thousands of items of information that we have – or not show you, because we're not going to show you anything. Let us tell you that we have taken down 12,000 pieces of information about hate speech and, you know, 18,573 pieces of information about child pornography. We're not going to tell you the denominator. We're not going to tell you the hash function that we use to search for them.
We’re not going to really tell you how they circulated within groups or pages. We’re not going to tell you how our algorithms up-rank them for engagement. But we are going to have a whole bunch of narrative around transparency reporting, to make it seem that we are putting rules in place.
Now, I recognize I’m answering you in a way that blurs competition policy and disinformation policy concerns, and I’m doing that deliberately, because they are all the same on the back end. But it is very much more complicated than just, you know, we get to make the rules however we want. They are invested in creating narratives within which their processes represent a kind of rule by law, even if not rule of law, in a way.
Prof Kim Weatherall:
Wow. Thank you for that. That’s terrific. Just going to those questions of data and information capitalism. One thing that struck me in reading some of your work, Kean, was that you talked about data needing to flow. You know, that data was flowing around and people were making money from that; it needed to flow for platform capitalism to work. But one of the interesting trajectories we’re seeing at the moment, I think, is that the bigger platforms are taking a lot of steps to limit the way that data flows, or to shape the way that data is flowing. And usually, to turn to Julie’s comment about narratives, they talk about this as a privacy thing: we’re going to protect privacy by not allowing people access to all of this data of ours. So, is this shift still about data flow for profit? Or are we automating that away as well? Is this a new stage where firms are trying to cut off other people’s, or other firms’, access to data?
Assoc Prof Kean Birch:
Okay, yeah. I left my mute on. Yeah, so it’s an interesting question. I think there’s an emerging conflict here, and I think this is something that was highlighted in a recent Wired article by Gilad Edelman, which is basically about this conflict between privacy and competition. So, on the one hand, according to Edelman, Facebook’s being sued for reducing privacy, and on the other hand Google’s being sued for increasing privacy. So you have this kind of conflict going on around what the extension of privacy means when it comes to competition. And I think there are some people who think that the opening up of the data would lead to a burgeoning of competition and, essentially, the erosion of market power. I’m not sure how you go about doing that; I think this relates specifically to issues of interoperability and how you’d go about forcing companies to enable you, you know, to transfer data from one platform to another platform, and so on. And so there are all sorts of issues around data governance here that are fascinating, that require particular kinds of intervention, which I don’t know whether states or governments are either willing to pursue, or have the wherewithal, maybe, to pursue at the minute. And so I think there are some really interesting possibilities that could emerge over the next few years. I’m keeping an eye on Europe at the minute, because Europe’s got the new Digital Services Act, the Digital Markets Act, this new AI regulation, all these sorts of things coming out that look very interesting. They have a very different way of regulating the market. You know, if we go back to neoliberalism, we could see their approach as more the ordoliberal side of neoliberalism, which is about setting the ground rules, the framework conditions, for markets to operate.
You have a very strong framework there, and then you enable firms to operate within those framework conditions. And that’s their idea of competition, versus the North American, I guess Chicago-style, form of neoliberalism, which is a little bit less state-managed, if you like. So, I’m not sure if that actually answers your question about flows, but I think there’s this conflict going on that we’ll see play out in very interesting ways.
Prof Kim Weatherall:
Yeah look, I absolutely agree with that. Australia is another interesting one to look at with some of this, because we had the Digital Platforms Inquiry, and that talked a lot about privacy and how we needed to improve it, very much following a basic notice-and-consent model of privacy. But sitting beside that, we have the same regulator, the ACCC, the Australian Competition and Consumer Commission, doing this digital advertising services inquiry, where the paper they issued earlier in the year suggested that in order to solve some of the problems in the advertising market, you know, the fact that advertisers were getting screwed over by what was going on in some of the platforms, the agency seemed to want to give more commercial parties access to information, but didn’t quite join the dots on how that could be done in a privacy-protective way. So there are these sorts of discussions about how you reconcile privacy with making markets work better going on piecemeal in Australia through some of these initiatives. And then there’s the EFF white paper on ways to open up the social graph in Facebook, which again is trying to reconcile that: let’s create competition by opening up some of this data. Although that takes the perspective that, by creating competition, you will enable more privacy-protective social media networks to exist. Which seems to exhibit a faith in the market I’m not entirely convinced by.
Julie, did you have a comment on that question of this shift we have towards parties wanting to control their data, or keep hold of access to it, in the name of privacy?
Prof Julie Cohen:
So, it’s very complicated, right. It’s not even clear we all mean the same thing when we use the word privacy. It is quite clear that privacy is being weaponized in competitive battles among different platforms, right. And so, before we even get to what happened in Australia, the tug of war between Apple and Facebook over eliminating the capacity to be tracked in apps is not necessarily something that would hit Apple the same way it would hit Facebook. Google’s move to FLoC tracking, Google Chrome putting people in population cohorts and telling them that protects their privacy, is a super interesting and, I dare say, super cynical manoeuvre, because it really plays up that notion of privacy as being entangled with individual assertion of control rights. But since it is all about profiling populations, or perhaps, in that classic Deleuzian sense, dividuals, that’s what is going on, right. And so the Google situation is not necessarily any more privacy protective; it just takes a different spin on how people’s data are used. So we’re clearly in a situation in which the tech companies have sat up and recognized, with varying degrees of cynicism, that it behoves them to make privacy claims with varying degrees of seriousness, and to adjust their particular environments and their relationships within their own supply chains, their app stores, and so on and so forth, around privacy. It’s not clear that any of that should be called privacy protective in any meaningful sense, and it is, I think, important to disentangle the question of benefit and the question of control. It’s not necessarily going to further your privacy in any meaningful sense to allow you to check boxes yes or no. And we’re still in quite rudimentary stages, I would suggest, in terms of thinking about how privacy might be furthered through design, in ways that none of the big firms are really seriously doing.
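[Editor’s note: FLoC, mentioned above, grouped users into cohorts derived from browsing history using a SimHash-style locality-sensitive hash. The sketch below is an illustrative toy of that general idea, not Google’s actual algorithm, parameters, or domain lists; all names and inputs are invented for demonstration.]

```python
import hashlib

def simhash_cohort(domains, bits=8):
    """Toy SimHash-style cohort ID: users with overlapping browsing
    histories tend to land on similar cohort IDs, so the browser can
    report a population cohort rather than an individual identity."""
    weights = [0] * bits
    for d in domains:
        # Hash each visited domain to a large integer.
        h = int(hashlib.sha256(d.encode()).hexdigest(), 16)
        # Vote each bit position up or down based on the hash bits.
        for i in range(bits):
            weights[i] += 1 if (h >> i) & 1 else -1
    # Majority vote per bit yields the cohort ID.
    return sum((1 << i) for i, w in enumerate(weights) if w > 0)

a = simhash_cohort(["news.example", "sports.example", "weather.example"])
b = simhash_cohort(["news.example", "sports.example", "recipes.example"])
# a and b tend to agree in most bit positions because the histories overlap.
```

The point of the construction, as discussed above, is that the identifier describes a population rather than a person, which is what makes the "this protects your privacy" framing contestable.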
Assoc Prof Kean Birch:
Okay, can I follow up on that? Privacy is really interesting because, as Julie was saying, you can look at the way that it is weaponized in how the big tech platforms understand privacy and worry about privacy. So, I’ve been doing some research with some colleagues, Callum Ward and Troy Cochrane, where we were looking at the earnings calls of the five big tech companies: Microsoft, Amazon, Apple, Google, and Facebook. An earnings call comes out every quarter; we took 10 years of them and did keyword searches, a kind of quantitative textual analysis if you like. And there’s a really interesting trend over time, in which you see privacy comes up about 20 times a year between 2010 and 2017 across all five companies. And then there’s a jump, where it jumps up to 80 times, and then 160 times. So it’s kind of doubling as they get more and more concerned with privacy, as a consequence of Cambridge Analytica, basically, I think. And it’s interesting to see who’s concerned about it. As Julie was saying, it’s being weaponized: Google and Facebook are being particularly targeted with issues of privacy, and then Apple is standing back saying, you know, we’re okay with all this. So there’s a quote from one of these earnings calls where Tim Cook, CEO of Apple, says: if you look at our model, if we can convince you to buy an iPhone or iPad, we’ll make a bit of money. You’re not our product. They’re literally saying this; it’s part of their argument to investors and so on. They’re trying to avoid this kind of issue, but it’s also about trying to set themselves apart with a certain kind of narrative.
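[Editor’s note: the keyword-frequency method described above can be sketched roughly as follows. This is an illustrative reconstruction, not the authors’ actual code; the sample transcripts and company name are invented for demonstration.]

```python
import re
from collections import Counter

def keyword_counts_by_year(transcripts, keyword):
    """Count whole-word, case-insensitive occurrences of `keyword`
    per year across a corpus of earnings-call transcripts.

    transcripts: dict mapping (company, year) -> transcript text.
    Returns a dict of {year: total count}, sorted by year."""
    counts = Counter()
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    for (company, year), text in transcripts.items():
        counts[year] += len(pattern.findall(text))
    return dict(sorted(counts.items()))

# Invented sample data, standing in for a decade of quarterly calls.
transcripts = {
    ("ExampleCorp", 2016): "Our focus is growth. Privacy was not discussed.",
    ("ExampleCorp", 2018): "Privacy matters. We invest in privacy tooling.",
}
print(keyword_counts_by_year(transcripts, "privacy"))  # {2016: 1, 2018: 2}
```

Run over real transcripts, a yearly series like this is what surfaces the jump from roughly 20 mentions a year to 80 and then 160 that the speaker describes.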
Prof Kim Weatherall:
Julie, I think you’ve pointed out that the kinds of regulation that come with a neoliberal mindset create new entry points for economic power. And it struck me that that was a useful way to think about some of the debates we’re having in Australia at the moment, right. We had this big public fight between Google and Facebook and the Australian government recently, about payment for news, similar to the big fight that they’ve had in Europe. The government came out and said it wanted Google and Facebook to pay; if there was going to be no commercial deal, the government would set things like price. And then we had all sorts of behind-the-scenes lobbying and deal-making around this public policy stoush, actually shaping what came out. So is this just the way things work now? When the state tries to regulate, does that decrease its power, because it just creates new opportunities for firms to exercise their power in new and different ways and to shape that regulation?
Prof Julie Cohen:
So, I think clearly, with a caveat which I’ll get to in a moment, many states are getting played royally right now. And then there’s India, which threatened to throw Twitter employees in jail; this is the caveat. There’s a new social media law: you must have people on the ground, and we can jail them if you don’t do what we want, right. And there’s Turkey: you must have people on the ground, and we’ll do x, y and z to them if you don’t do what we want. And then there’s China; I won’t go there at present. And then there’s Europe, where they issue very draconian competition rulings and large fines, but I have yet to hear of one of them actually being paid. It just seems to devolve into endless negotiation. You know, I think it is interesting to game out the Facebook Australia debacle and ask whether there is some way it could have spun out differently for the Australian government. And I’m not suggesting it should have tossed all the Facebook employees in jail; that’s not what I’m trying to advocate for. But it’s interesting to ask what could have been done more proactively. And there are places, I think, that are pressure points that have not received enough systematic attention. They’re the places where all of those disparate threads come together, right: the privacy scandals and the disinformation scandals and the competition scandals. All the threads come together around digital advertising, around design for optimizing engagement, and around matching up everyone in the long tail with their happy-place content. Which means the white supremacists get their happy-place content, and so on and so forth.
There’s not been, I think, enough of a concerted effort across all of the governments that want to do something in a more rule-of-law-friendly way to really bear down on the digital advertising business model in the way that I think it richly deserves. And I still don’t think I’ve really told a tale of a way that Australia, acting alone, could have proceeded to cause things to unspool differently. But maybe somebody who is more knowledgeable about that particular situation disagrees, which would be interesting to hear about.
Prof Kim Weatherall:
As both Kean and Julie know, I have more questions. But we have a terrific set of panellists here who are likely to have their own questions about everything that we’ve been talking about here. So, I’m going to throw it open to the panellists in the webinar.
So, Damian, Christine, Jake, Mark, Jeannie, Megan, Julian, and Sam, you may just have to put up your questions or jump in, in order to progress this conversation some more. Who would like to jump in first? Mark? Thank you.
Participant 1:
Thanks to Kean and Julie for really interesting and provocative presentations. Really appreciate you taking the time. I was just thinking about something that Julie said, a kind of observation, but I thought it might be interesting to pick up on it and see where it leads, and that was about journalists kind of stumbling across something that’s happening inside the black box. And it gets at that question of accountability. You know, as somebody who’s been trying to do some work on accountability, of course the challenge is that all the data and all the information about the algorithms and the outcomes are stored by the owners, the platforms. And I’m wondering if either of you have thought about other possibilities for accountability, for looking at the outcomes that have been the object of concern and critique, and also for inspecting the details of how the algorithms and the sorting systems are working. You know, rather than relying on some ad hoc academics who are trying to scrape some data, or some journalists who are pulling together little tools that give them tiny peepholes into what’s happening. Is there a system that might be the alternative? I guess, you know, short of banning the platforms or making them public or something. Is there some institutional mechanism that might be possible for providing the type of accountability that would be more satisfactory, and for addressing some of the concerns that have been the object of critique?
Prof Julie Cohen:
So, this was in my slide deck; this was the part where I had one line that said the back half of the book talks about legal institutions and I’m not going to talk about that here. But two things, right. Number one, we kind of find ourselves in a situation like that confronted by the British parliament in seventeen-hundred-whatever, eighteen-hundred-whatever, where nobody had ever invented protective labour regulation yet, and they had to imagine and then invent it. There are forms of governance and regulation that don’t port well to this, and they’re not working. Competition regulators are not getting enough traction. Data protection authorities are not getting enough traction, and there is definitely a set of imaginative leaps and institutional transformations that I think are urgently necessary. But it is also true that there is a ton of governance happening right now, mostly internal to platforms and shielded from the external world. And it is also true that there are little nuggets of potential littered across the regulatory landscape that probably aren’t being used enough because they’re siloed off. So, to take one example, at the risk of getting myself in trouble a little bit because my knowledge is only this deep: money-centre banks now have to submit to audits for capital adequacy. In the wake of the 2008 financial crisis, and even to some extent before, there is a set of tools for that, still emergent and evolving, and there are contests over keeping those proprietary too, yes. But what we have in the interbank financial trading space are exactly the sort of platformized algorithmic processes that can go off the rails, and their amplitudes can get so large that they can cause enormous wreck and ruin. And there are emergent protocols for auditing and benchmarking and stress testing, and we so badly need exactly that in the disinformation space, right.
And in the privacy space. And, I think, to take that idea and join it up, and to pry out of platforms some of the knowledge about how they’re already governing and benchmarking and stress testing; it’s just that the end goals they’re doing all this for are different from the ones we might want.
I do think that there are building blocks that could be used to get to a space that would be sort of more pro-social and that’s a challenge that we are confronting right now.
Assoc Prof Kean Birch:
Can I follow up?
So, I don’t know if I have a definite answer, I guess; a speculative answer, I suppose. Following up on what Julie was saying about stress testing and banks: is there an opportunity to integrate some form of AI responsibility, algorithmic decision-making responsibility, into the investment community? They’re getting very on board with ESG, environmental, social and governance risks, and so on. So if stress testing is part of that, you could incorporate it as part of the banks’ own internal processes, or the investment and asset managers’ internal processes, where they become responsible for what the companies they invest in do. And, you know, that would be another way of doing it, potentially. I also have my personal kind of thing I would like to see happen, and I don’t know whether it’s actually possible; Julie, as a legal scholar, is probably a better person than me to say. But I’d like to see some form of modular contracts, where instead of having just generic, you know, boilerplate standard-form contracts that we tick through with data collection, we actually have modular contracts from which we can take out clauses if we want. So we could take out the clauses which enable the use of our data forever, or that force us to use mediation when we have disagreements with the companies themselves.
Now, I don’t know the practicalities of that; I’ll throw it out there anyway. But I also had a kind of, I’m not sure it’s a facetious comment, but if we’re thinking in a Polanyian sense, what is the Speenhamland law for the digital age? Have we seen that already, or are we going to see it in the future? Something like the ACCC decision, is that something that could be equivalent, or, you know, has that yet to happen?
That’d be another kind of interesting thing to think about.
Prof Kim Weatherall:
Yeah, I’d really love to pursue that. So, what does the counter-movement look like? But I don’t want to dominate the floor here, and I know we have some questions lined up. Seth, you had a question?
Prof Seth Lazar:
Yeah, thank you. That was a really interesting couple of presentations, and it’s been fascinating to come and read your work. I’m a philosopher; this is an area that’s new to me. I’m really interested in these questions of power and, in particular, in the questions of the legitimacy of the power that is being exercised. In reading and thinking about your work, I have this question, and I’d like you, if you could, to tell me whether I’m interpreting things the right way.
So, both of you describe the way in which the markets of the long twentieth century, as you’re saying, Julie, were always kind of fictions in one sense: legally constructed, the as-if markets you talked about, in that sort of Posnerian sense. And so, bearing that in mind, to what extent do you think the ways in which markets are being recreated in these platforms constitute a very significant change from what existed before? In particular, does it constitute a significant change with respect to the degree of control that a particular set of individuals, or a small group of individuals, has over the contours of those markets? Do we have significantly more empowered market designers than we had before? And is it significantly more removed from, if you like, the political process than it was before? So, do you think we’ve always been in one version of this dystopia or another, where power over the construction of markets has been concentrated in a few unaccountable, extra-political hands? Or do you think the shift to platform markets constitutes a significant change with respect to those questions of degree of control, concentration of power, and political accountability?
Prof Julie Cohen:
So, I guess I want to push back on how you’re invoking the term markets, as though we all already knew what that means. It’s not really clear, and it depends a whole lot on the context. So, you know, if you are an investment bank doing credit default swaps trading, there’s a platform on which you can do that, and it’s managed as what an economist would call, I think, a club good; there are processes for being admitted to the club. If you are a Fortune 500 corporation wanting to outsource your human resources, you’re going to hire salesforce.com; or, for your cafeteria workers or your janitorial workers, you’re going to bring them in through an outsourced special-purpose provider. I’m just saying that and thinking, God, what I wouldn’t give to be back in my office with the cafeteria workers and everybody. But the fact remains that to ask what platforms are doing to markets sort of begs the question of what we’re holding constant when we talk about markets, as though we already knew what those were. And those are very different, right. So, I actually tried to write, I think it was chapter one of my book, the section that described what platforms were becoming, without using the word market that much. Because, after all, my argument is that platforms are superseding the market, with all that that entails, in terms of bringing the abstract construct of the market back down to earth in these protocols and boilerplate agreements and everything, which are very contextualized depending upon which set of economic exchanges we are talking about. And it’s a very instructive exercise to engage in, and I commend it to you all, because it forces you to make concrete exactly what you mean when you say market at all.
I think that the word market retains a great deal of power in the political spaces, in what I’ve called the narrative spaces and the discursive spaces, where we are told this and such can’t happen because innovation comes from the market, or some other just BS thing that seems to work on legislators a lot. But it’s not so. I’ve gone on for too long already, and I see Kean kind of gesturing to weigh in, so why don’t I stop.
Assoc Prof Kean Birch:
Sorry, I wasn’t gesturing, I was just nodding along with my pencil. So, alright. I became interested, for one reason or another, in contracts and contract law and its relationship to neoliberalism five or six years ago, when I started reading around contract law. I’m not a lawyer but, you know, it became interesting and I wrote a little bit about it, because I think there’s an important missing piece when it comes to discussing neoliberalism in relation to contract. And I think contract has implications here for your question, because you can think of platforms as being managed through contract, where contract is a form of privately made law. And this is different from a non-platform context, let’s not say market, but a non-platform context, where you might have other regulatory pressures, or what have you, from, say, a state, that would change the balance of power, if you like. But with the platform, it’s the owner of the platform that’s making the rules, as I said in my brief talk, and they’re the ones managing it through these contractual arrangements, which are one-sided, standard-form boilerplate contracts that we have very little control over.
Prof Kim Weatherall:
Thanks Kean. And Damian, you had a question?
Participant 2:
Yeah. Thanks very much for the talks, I found them really interesting. I suppose my question goes back to the point that you were making about companies controlling the narratives around privacy, because I find that very compelling and, I suppose, it matches what I’ve been thinking myself, and I want to get your reflections on what that means for future regulatory efforts. So Kean, you mentioned that there are plenty of draft regulations in Europe at the moment; there’s also the platform-to-business regulation, which was adopted, I think, last year. So there are plenty of regulatory attempts to start to counteract some of the issues that both of you have pointed out, particularly in that jurisdiction. But to my mind, I think Europe’s attempt to regulate data privacy, or data protection, is probably the ultimate example of how companies have tried to control narratives. So I’m starting to think a little bit cynically, or maybe I’m getting a little bit depressed, about the potential for a regulator to actually regulate this space. So I’m just wondering about your own reflections on how we actually go about doing this. You know, we have a regulator in Europe that is trying to do something. Is it merely a question of enforcement, or are they bad attempts at rules? I understand it’s a very open-ended and quite difficult question, but hopefully there’s some sense within it that you can find.
Prof Julie Cohen:
So, I have a little essay that got posted on the Knight First Amendment Institute website at Columbia Law School recently, called How Not to Write a Privacy Law, that might be interesting on these points. There’s a frame I have become fond of using: Maslow’s hammer, the idea that if all you have is a hammer, everything looks like a nail. And we’ve got a whole lot of regulators running around with hammers now. We have competition law; we have data protection law, which presumes that individual agents exercising their autonomy can make choices about their own data, right. We have, in the US, this bizarre construct called the information fiduciary, which some folks, including some very good friends of mine, have been trying to sell on the Hill. And it’s just beyond bizarre, to me, to think how one could take the inherently relational idea of the fiduciary and blow it out to the entire tech industry, but that’s another conversation. What we need to do, I think, if regulators are to have something that isn’t a hammer, is to look at the contours of the problem. We have networked products operating at scale that are governed by standards, and control over standards produces effects at scale. My colleague Paul Ohm calls them order-of-magnitude effects. So regulators have to learn how to manage order-of-magnitude effects that are mediated through protocols and dashboards and indicators. There’s an opacity problem, and, you know, you can never devolve to consumers, or users, or whatever you want to call them, the authority to govern, because the consumer is always at the disadvantage of having their interaction mediated by the dashboard. And so you have to just start, I think, with the contours of the problem, and come up with some new tools. And again, I don’t think it’s a blank-slate situation.
I think inside a lot of these companies that deal with networked products slash services operating at scale, that produce order-of-magnitude effects accessed via interfaces and dashboards, there is a lot of thinking about how to use those to produce effects, how to govern. It’s just that that doesn’t yet map in any coherent or recognizable way to the regulatory toolkit. But there is no particular reason why that couldn’t change.
Assoc Prof Kean Birch:
And I’ll just follow up, possibly in a different direction. I think one of the issues here really is that there’s a significant conflict, within government, the state, between privacy and data protection on the one hand and economic concerns on the other. So, you know, innovation: that comes up a lot when people talk about data and digital economies and so on. There’s the example of Canada, which introduced a piece of legislation implementing its Digital Charter in November last year. And in reading the legislation, you can see the tension between attempts to secure privacy and data protection and the desire to enable companies to exploit data so that they can innovate. I think this tension comes up a lot, and I think the same tension comes up in the EU context, potentially, even though they fall more on the data protection and privacy side. So that’s part of the issue: you don’t have a simple kind of, we want privacy and that’s it. But I do think there are some countries we could really learn from on this front, and some of those are the countries that have had really strong data-collecting state structures for a while. So, Nordic countries like Sweden and Denmark, Finland, you know. These are countries that have been collecting personal data, including sensitive health data, for decades, and they’ve been collecting it as part of their welfare state. So it’s quite extraordinary if you go to a country like Denmark, where you have one number, whatever they call it there, but it’s like your social insurance number, and it connects to everything.
It connects to all sorts of welfare services, and the private sector companies have also connected to it, so there are some real benefits, in the sense that when you move house, you update your information against your social insurance number and it filters out. So it’s not like you’re contacting a dozen companies to change your details and such. But they have the accountability structures in place for access to that data, for what you can use it for, and for recording who’s accessed it, all these sorts of things. And, as I said, it has taken decades to build up to the point where you have this system in place.
Prof Kim Weatherall:
Thanks for that. Actually, it’s interesting, because we were focusing on that privacy question, and I thought I might follow up with one of the questions from the Q and A. Janet was asking: we seem to be focusing on privacy, but not on questions about how data becomes a form of value, revenue stream, or price, and wondering whether the panellists could say something about how their research helps us understand those questions. I just wonder whether you might have a response to that.
Assoc Prof Kean Birch:
Sure, I can start if Julie doesn’t want to. So, I just finished a paper looking at data as an asset. It’s called data as an asset: the measurement, governance and valuation of personal digital data by big tech, and essentially it’s trying to understand how big tech understands personal data. The starting point was to say, oh, you know, big tech is turning our data into this asset – but let’s look at what they’re actually doing. In exploring this question we started looking at the balance sheets of companies, and you can’t identify the data on the balance sheets, because they’re not allowed to call it an asset according to current accounting standards. So we’re trying to think through what is being valued here, and in order to understand that we have to understand how big tech makes personal data: how it measures it and makes it legible as something that has value. And this process is really about trying to understand the way that big tech sees its users, understands its users, measures its users, and makes its users and what those users do legible as something that can be valued. So it’s about this transformation of user engagement into user metrics: big tech trying to measure those things to make them legible as something that’s valuable, that its investors can understand as well. And so – again, going back to the research we did on earnings calls – you don’t actually see the companies talk about personal data. In ten years, a decade’s worth of earnings calls, I think there were two mentions of personal data, which is quite extraordinary. What they’re talking about is user engagement, monetization, user monetization, ecosystems, all these sorts of things that are very different from personal data.
And so big tech understands us as users, as people who want to click on things and view things, and that’s what they’re collecting. So, we related this to James Scott’s notion of seeing like a state, and we called it tech craft: this is what they’re doing, tech craft, not statecraft – seeing like big tech, where they see us in a particular way that has value to them. But as a consequence, you have this situation in which lots of problems are emerging around whether the data they’re collecting is actually collected from people, or collected from bots, or from click farms, content farms, all these sorts of things that disrupt this collection and the value of what they’re collecting. So there are some really interesting dynamics worth unpacking there.
Prof Julie Cohen:
So, just a couple of thoughts. First of all, I have a paper in Philosophy and Technology called The Biopolitical Public Domain – you might want to take a look at it. It applies the Callon and Muniesa framework of markets as calculative exchanges to data. But I think the moral of the story is that they don’t talk about the value of data as such because that’s not the way the digital advertising business model works, right. What is singularized, to use that framework, for buyers is eyeballs, right: tranches of individuals or populations or whatever you want to call them. And the value in the data is simply to optimize the ability to get the messaging in front of the eyeballs. The bot situation is interesting in a variety of ways because, among other things, it does, as you note, have the potential to kind of wreak havoc with that game plan. But I think narratives are in play here too, right. When you see the message, your data has value, take control of your data, I think we are being encouraged to see our data as something that has value. But if you want to understand the way the markets work – to use that word market, which I scolded Seth for using, so sorry about that Seth – if you want to understand the way platformized economic exchange works over digital advertising, you need to pay attention to what is being, again, singularized for prospective purchasers. And it’s not particularly you, Seth, or you, Kimberley, or you, Mark, or you, Jake, right. It’s population tranches.
So, now I cannot resist pointing out that investor calls may not be the only place to look, and that this problem does not begin with personal data. To return to my Polanyian obsession: we are very good at knowing how to have disclosure of the widgets and bits and pieces of the industrial economy, so that it may be reported to securities regulators and inform the so-called ability of capital markets to discipline themselves. We are just terribly awful at devising mechanisms to have disclosure of the value of the informational economy to securities regulators and others, so that it may inform investor behaviour and, again, empower the capital markets to discipline themselves. And it doesn’t begin with data. There are these terms that appear in investor calls and in securities disclosures, and they’re like magic: patents, trade secrets. But you never really know, from reading any of the documents, exactly what is done with that stuff. You never really know from reading any of the documents what is done with any other kind of data that the firm may happen to have. You never really know anything about the algorithms. And as long as that is the situation, it is an exercise in fantasy to imagine that investors could be making decisions that are actually informed by anything other than their belief in magic, or their trust in the VCs, or what have you. And it is a systemic problem that extends not just across the sectors of the economy that principally rely on personal data, but across everything that relies on intangible informational assets, which is basically everything at this point. I’m not going to invent words on the spot here, but I was thinking of seeing like a state: we have fictions about how our capital markets are supposed to work that entail the ability to see into them in different ways.
We don’t have that with personal data, and we don’t have it with pretty much anything else that’s informational, as far as I can determine, and that is a huge systemic problem. So, in addition to noting that words appear in investor calls, or that words appear in securities disclosures, I think it has been, for decades now really, important to understand what is actually known about what those words signify.
Prof Kim Weatherall:
The tools for dealing with all of that are going to be a project, and it’s fortunate that we have amazing minds thinking about it. I can’t even begin to unpack all of the thoughts that we’ve had today, but I’m very conscious that we’re now at 10:30. So, unfortunately we haven’t been able to get to all of the questions, but that’s the nature of the beast. These are big questions and they are going to go on for some time, and fortunately we still have two more seminars in this series to keep talking about them, which is terrific. But I’ll hand back to Seth now to close things off.
Prof Seth Lazar:
Yeah, so there’s not much more to say besides thanks to Kim for hosting, to Julie and Kean for your presentations, and to Jake for bringing this whole thing together. It’s been a pleasure to see you all and it’s been a wonderful conversation, and I look forward to continuing it in a fortnight’s time. So, thank you everybody for coming and thank you very much, keep paying attention.