It’s no secret that tech companies struggle to balance profit and ethical responsibility. Tristan Harris, Co-Founder of the Center for Humane Technology, observed this firsthand during his time working at Google.
During Solve at MIT’s “Diving into Data for Good” plenary on May 8, Harris discussed the complexities of this challenge with MIT Media Lab Professor Deb Roy. The two speakers explored how tech giants can change their practices to do more good than harm.
At the Center for Humane Technology, Harris is working to demystify the “incoherent agenda of grievances” leveled against the tech industry. Device addiction, diminished attention spans, adolescent mental health, polarization, and outrage culture—these complaints, according to Harris, are “all actually coming from one source, which is the race to the bottom of the brain stem to extract attention.”
Harris calls this race and its effects on society “human downgrading,” and he believes that human downgrading deserves publicity proportionate to its mammoth impact. "We think it deserves a spot on the global agenda that's as big as climate change, because more than 2 billion people use smartphones,” he said. Holding his own phone aloft, Harris continued, “This is the choke point for sense-making and decision-making for everybody."
Harris went on to explain how tech companies work to pry attention from their users. "These are not just neutral tools where you walk in, do what you want, and walk out,” he said. “There's this whole army of engineers and supercomputers that are tilting the playing field.” A major driver of this dynamic is that tech company incentives often don’t align with people’s best interests.
According to Harris, technology’s quick global expansion has opened a Pandora’s box. “People massively underestimate the scale and uncontrollability of these digital Frankensteins. They’re influencing elections and cultures, but they don’t even speak the world’s languages,” Harris said.
For example, YouTube’s recommendation feature exists to guide viewers towards content that keeps them glued to the platform. Harris pointed out that while YouTube can hire 10,000 content monitors to remove inappropriate videos, “how many engineers at YouTube speak the 22 languages of India?”
Roy asked Harris what individuals can do to self-regulate when it comes to social media and device addiction. Harris contended that the dynamic between major tech companies and users is “so asymmetrically powerful that we need the companies themselves to change their behavior—because we actually inhabit this environment. It’s our new social environment.”
Harris compared big tech companies to the Federal Reserve of the attention economy. Their control of app stores, notification systems, and home screen interfaces makes them uniquely positioned to detoxify the relationship between humans and technology. "When it comes to changing incentives, Apple and Google are actually in a fantastic position to change the currency," he said.
To combat human downgrading, Harris proposed the concept of a “regenerative attention economy.” Under this model, tech companies would leverage their data and power to identify solutions to people’s problems, as opposed to creating apps that vie for attention.
Harris used the example of dating apps to illustrate this idea. Current dating apps want users to keep swiping as long as possible. In a regenerative attention economy, companies would instead empower users to kindle romantic opportunities through their existing social connections: a user might be connected with a friend who likes to host dinners, or with a group of friends who go to salsa class.
"This is like flipping the competition around,” Harris said optimistically. “They’re still competing, but they're competing to help, not hook you.”
Hear all thoughts from Harris and Roy by watching their full conversation here.
Solve intern Silvia Curry contributed to this article.
Harris and Roy in discussion on stage during Solve at MIT. Photo: Adam Schultz/MIT Solve