By Eric Laine
In his new book, “Terms of Disservice: How Silicon Valley is Destructive by Design,” Dipayan Ghosh, Ph.D. ’13, a computer scientist turned policymaker, offers technical analysis, recommendations for economic reform and practical ideas for using technology to create an open and accessible world that protects all consumers and civilians.
Ghosh is the co-director of the Digital Platforms & Democracy Project at the Harvard Kennedy School and a lecturer at Harvard Law School. He previously worked at Facebook, leading strategic efforts to address privacy and security issues, and he was a technology and economic policy advisor at the White House during the Obama administration. He received a Ph.D. in electrical engineering and computer science from Cornell and an MBA from the Massachusetts Institute of Technology, and conducted postdoctoral research at the University of California, Berkeley.
At Cornell, Ghosh was advised by Stephen Wicker, professor of electrical and computer engineering and a member of the graduate fields of computer science, information science and applied mathematics. They share an interest in digital privacy, artificial intelligence, disinformation and internet economics.
Ghosh and Wicker chatted by email about the new book; their conversation, with minor edits for space and clarity, follows.
Wicker: It was a pleasure to see your new book, “Terms of Disservice.” What was your main motivation in writing it?
Ghosh: As you know very well, I have a strong interest in data privacy, in large part because of the work we did together when I was your student. Building on work with you, in which we studied problems of information and privacy economics, I got a chance to dive headfirst into the field of technology policy. This was right after the [Edward] Snowden disclosures about the [National Security Agency] in 2013.
Later that year I had the opportunity to join the White House to help the team address privacy reform and technology policy more broadly. I gained perspective on how companies and regulators alike think about striking the balance between regulation and innovation.
In late 2015 I joined Facebook’s privacy team. Gradually I began to see that the business model underlying the company – and in my view, the consumer internet industry more broadly – was corrosive. I realized that there was a bigger story to tell concerning the nature of Silicon Valley, its interface with society, and the regulatory regime necessary to diminish societal harms like the disinformation problem. That was the origin of “Terms of Disservice.”
Wicker: As you know, technology companies collect a huge amount of our personal data. In some cases, individuals have amassed billions in personal wealth without actually making a product. How did we get to this point?
Ghosh: This question gets to the heart of why this industry – which in the book I called the “consumer internet” sector – is so prevalent in the modern media landscape.
Consider the fact that many consumer internet firms don’t make a profit at first; instead, their investors tell the startups to grow their platforms as quickly as possible by getting lots of users and collecting lots of data on those users. These are the two resources – attention and personal information – that enable the high-margin targeted advertising regime that has become the dominant mode of monetization in the consumer internet.
The United States lacks a fundamental, baseline privacy law – a law at the federal level that protects a citizen’s personal data from the corporation, no matter who the corporation is or what kind of data it is. There is a long list of reasons why this might be the case, but the result, in my view, is that it enables this corporate culture of unchecked data collection for profit maximization as a default.
Sometime in the past 25 years, we passed a new technological threshold that enabled the rise of a new kind of business model that now underlies all of the consumer internet. With ongoing advances in computing and data storage capacity, it suddenly became cost-effective to harvest and hoard personal information, analyze it to derive behavioral insights on individuals, keep them engaged on the platform through algorithmic curation, and target ads at them.
Wicker: It is fascinating to think of our current problems with privacy as “superstructure” that arose naturally from a foundation of basic economic relations. We seem to have moved from large industries that make, for example, steel and automobiles, to corporations whose goal is to peel away hundreds of dollars from each of several hundred million people. I think they call this “poverty capitalism.” Do you think that the internet and app-based consumer culture has accelerated this trend?
Ghosh: The way you put this – describing the economic situation at hand – is precisely correct in my view.
Society has moved online. Digital advertising [recently] surpassed traditional advertising in terms of revenue, in part because of the social distancing policies forced upon us by the COVID-19 pandemic. I see no great problem with our having moved our attention to digital platforms over traditional media in and of itself; society always has a technological circumstance on which it sits, and which in part determines its expression of culture.
The fact that we now have the technology that enables the consumer internet business model is no surprise. Given the advances in computing described by Moore’s Law, we were always going to get to this point in a radically capitalistic society. Our constant participation in this consumer culture, as you put it, is part and parcel of the problem. But that said, I see very little we can do about diminishing that culture. Consumer education of the masses is a difficult task, especially when dealing with the complex business strategies thrust upon us by the likes of Facebook and Google.
The internet firms have benefited from that move online. When I say they are natural monopolies that extract monopoly rents, what I am attempting to suggest is that they are extracting from us a new form of digital currency, all the time, everywhere. That currency is a complex combination of our personal data and attention.
Wicker: How were your ideas and concerns about technology shaped or formed by your academic experience?
Ghosh: This really comes down to the interdisciplinary approach Cornell (and you) encouraged throughout my doctoral studies. Early in my first year, we established together that I would work on studying issues around privacy. It was a hot topic at the time, but not quite in the way it is today. It was still more of a technical issue, and the problems of corporate surveillance we experience all the time today were only taking shape.
That was my foundation – to first understand how we could technically design systems and networks from what we called a “privacy-aware” perspective, and next to determine what kinds of economic and regulatory conditions would enable the adoption of such non-invasive systems at equilibrium.
Throughout my time in the world of public policy, I have tried to approach issues using this same lens of analysis – which is why I often tend to feel that regulatory rather than market-based approaches are needed to generate the kinds of incentive systems that would change the economic terms around consumer privacy.
Read an expanded version of this conversation on the Cornell Engineering website.