The adversarial persuasion machine: a conversation with James Williams
James Williams may not be a household name yet in most tech circles, but he will be.
For this second in what will be a regular series of conversations exploring the ethics of the technology industry, I was delighted to be able to turn to one of our current generation’s most important young philosophers of tech.
Around a decade ago, Williams won the Founder’s Award, Google’s highest honor for its employees. Then in 2017, he won an even rarer award, this time for his scorching criticism of the entire digital technology industry in which he had worked so successfully. The inaugural winner of Cambridge University’s $100,000 “Nine Dots Prize” for original thinking, Williams was recognized for the fruits of his doctoral research at Oxford University, on how “digital technologies are making all forms of politics worth having impossible, as they privilege our impulses over our intentions and are designed to exploit our psychological vulnerabilities in order to direct us toward goals that may or may not align with our own.” In 2018, he published his brilliantly written book Stand Out of Our Light, an instant classic in the field of tech ethics.
In an in-depth conversation by phone and email, edited below for length and clarity, Williams told me about how and why our attention is under profound assault. At one point, he points out that the artificial intelligence which beat the world champion at the game Go is now aimed squarely — and rather successfully — at beating us, or at least convincing us to watch more YouTube videos and stay on our phones a lot longer than we otherwise would. And while most of us have sort of observed and lamented this phenomenon, Williams believes the consequences of things like smartphone compulsion could be much more dire and widespread than we realize, ultimately putting billions of people in profound danger while testing our ability to even have a human will.
It’s a chilling prospect, and yet somehow, if you read to the end of the interview, you’ll see Williams manages to end on an inspiring and hopeful note. Enjoy!
Editor’s note: this interview is approximately 5,500 words (about a 25-minute read). The first third has been ungated given the importance of this subject. To read the whole interview, be sure to join the Extra Crunch membership. ~ Danny Crichton
Introduction and background
Greg Epstein: I want to know more about your personal story. You grew up in West Texas. Then you found yourself at Google, where you won the Founder’s Award, Google’s highest honor. Then at some point you realized, “I’ve got to get out of here.” What was that journey like?
James Williams: This is going to sound neater and more intentional than it actually was, as is the case with most stories. In a lot of ways my life has been a ping-ponging back and forth between tech and the humanities, trying to bring them into some kind of conversation.
I spent my formative years in a town called Abilene, Texas, where my father was a university professor. It’s the kind of place where you get the day off school when the rodeo comes to town. Lots of good people there. But it’s not exactly a tech hub. Most of my tech education consisted of spending late nights, and full days in the summer, up in the university computer lab with my younger brother just messing around on the fast connection there. Later when I went to college, I started studying computer engineering, but I found that I had this itch about the broader “why” questions that on some deeper level I needed to scratch. So I changed my focus to literature.
After college, I started working at Google in their Seattle office, helping to grow their search ads business. I never, ever imagined I’d work in advertising, and there was some serious whiplash from going straight into that world after spending several hours a day reading James Joyce. Though I guess Leopold Bloom in Ulysses also works in advertising, so there’s at least some thread of a connection there. But I think what I found most compelling about the work at the time, and I guess this would have been in 2005, was the idea that we were fundamentally changing what advertising could be. If historically advertising had to be an annoying, distracting barrage on people’s attention, it didn’t have to anymore because we finally had the means to orient it around people’s actual intentions. And search, that “database of intentions,” was right at the vanguard of that change.
The adversarial persuasion machine
Greg: So how did you end up at Oxford, studying tech ethics? What did you go there to learn about?
James: What led me to go to Oxford to study the ethics of persuasion and attention was that I didn’t see this reorientation of advertising around people’s true goals and intentions ultimately winning out across the industry. In fact, I saw something really concerning happening in the opposite direction. The old attention-grabby forms of advertising were being uncritically reimposed in the new digital environment, only now in a much more sophisticated and unrestrained manner. These attention-grabby goals, which are goals that no user anywhere has ever had for themselves, seemed to be cannibalizing the design goals of the medium itself.
In the past advertising had been described as a kind of “underwriting” of the medium, but now it seemed to be “overwriting” it. Everything was becoming an ad. My whole digital environment seemed to be transmogrifying into some weird new kind of adversarial persuasion machine. But persuasion isn’t even the right word for it. It’s something stronger than that, something more in the direction of coercion or manipulation that I still don’t think we have a good word for. When I looked around and didn’t see anybody talking about the ethics of that stuff, in particular the implications it has for human freedom, I decided to go study it myself.
Greg: How stressful a time was that for you, when you were realizing that you needed to make, or might be making, such a big change?
James: The big change being shifting to do doctoral work?
Greg: Well that, but really I’m trying to understand what it was like to go from a very high place in the tech world to becoming essentially a philosopher critic of your former work.
James: A lot of people I talked to didn’t understand why I was doing it. Friends, coworkers, I think they didn’t quite understand why it was worthy of such a big step, such a big change in my personal life to try to interrogate this question. There was a bit of, not loneliness, but a certain kind of motivational isolation, I guess. But since then, it’s certainly been heartening to see many of them come to realize why I felt it was so important. Part of that is because these questions are so much more in the foreground of societal awareness now than they were then.
Liberation in the age of attention
Greg: You write about how when you were younger you thought “there were no great political struggles left.” Now you’ve said, “The liberation of human attention may be the defining moral and political struggle of our time.” Tell me about that transition, intellectually or emotionally or both. How good did you think the world was back then, and how concerned are you now?
James: I think a lot of people in my generation grew up with this feeling that there weren’t really any more existential threats to the liberal project left for us to fight against. It’s the feeling that, you know, the car’s already been built, the dashboard’s been calibrated, and now to move humanity forward you just kind of have to hold the wheel straight and get a good job and keep recycling and try not to crash the car as we cruise off into this ultra-stable sunset at the end of history.
What I’ve realized, though, is that this crisis of attention brought on by adversarial persuasive design is like a bucket of mud that’s been thrown across the windshield of the car. It’s a first-order problem. Yes, we still have big problems to solve like climate change and extremism and so on. But we can’t solve them unless we can give the right kind of attention to them. In the same way that, if you have a muddy windshield, yeah, you risk veering off the road and hitting a tree or flying into a ravine. But the first thing is that you really need to clean your windshield. We can’t really do anything that matters unless we can pay attention to the stuff that matters. And our media is our windshield, and right now there’s mud all over it.
Greg: One of the terms you either coin or use for the situation we find ourselves in now is the “Age of Attention.”
James: I use this phrase “Age of Attention” not so much to advance it as a serious candidate for what we should call our time, but more as a rhetorical counterpoint to the phrase “Information Age.” It’s a reference to the famous observation of Herbert Simon, which I discuss in the book, that when information becomes abundant it makes attention the scarce resource.
Much of the ethical work on digital technology so far has addressed questions of information management, but far less has addressed questions of attention management. If attention is now the scarce resource so many technologies are competing for, we need to give more ethical attention to attention.
Greg: Right. I just want to make sure people understand how severe this may be, how severe you think it is. I went into your book already feeling totally distracted and surrounded by totally distracted people. But when I finished the book, and it’s one of the most marked-up books I’ve ever owned, by the way, I came away with a sense of acute crisis. What is being done to our attention is affecting us profoundly as human beings. How would you characterize it?
James: Thanks for giving so much attention to the book. Yeah, these ideas have very deep roots. In the Dhammapada the Buddha says, “All that we are is a result of what we have thought.” The book of Proverbs says, “As a man thinketh in his heart, so is he.” Simone Weil wrote that “It is not we who move, but images pass before our eyes and we live them.” It seems to me that attention should really be seen as one of our most precious and fundamental capacities, cultivating it in the right way should be seen as one of the greatest goods, and injuring it should be seen as one of the greatest harms.
In the book, I was interested to explore whether the language of attention can be used to talk usefully about the human will. At the end of the day I think that’s a major part of what’s at stake in the design of these persuasive systems, the success of the human will.
“Want what we want?”
Greg: To translate those concerns about “the success of the human will” into simpler terms, I think the big concern here is, what happens to us as human beings if we find ourselves waking up in the morning and going to bed at night wanting things that we really only want because AI and algorithms have helped convince us we want them? For example, we want to be on our phone chiefly because it serves Samsung or Google or Facebook or whomever. Do we lose something of our humanity when we lose the ability to “want what we want?”
James: Absolutely. I mean, philosophers call these second-order volitions, as opposed to just first-order volitions. A first-order volition is, “I want to eat the piece of chocolate that’s in front of me.” But the second-order volition is, “I don’t want to want to eat that piece of chocolate that’s in front of me.” Creating those second-order volitions, being able to define what we want to want, requires that we have a certain capacity for reflection.
What you see a lot in tech design is essentially the equivalent of a circular argument about this, where someone clicks on something and then the designer will say, “Well, see, they must’ve wanted that because they clicked on it.” But that’s basically taking evidence of effective persuasion as evidence of intention, which is very convenient for serving design metrics and business models, but not necessarily a user’s interests.
AI and attention
Greg: Let’s talk about AI and its role in the persuasion that you’ve been describing. You talk a number of times about the AI behind the system that beat the world champion at the board game Go. I think that’s a great example: that AI has since been deployed to keep us watching YouTube longer, and billions of dollars are literally being spent to figure out how to get us to look at one thing over another.