Your Undivided Attention

By Center for Humane Technology


Category: News & Politics


Subscribers: 74
Reviews: 1


Description

Technology companies are locked in an arms race to seize your attention, and that race is tearing apart our shared social fabric. In this inaugural podcast from the Center for Humane Technology, hosts Tristan Harris and Aza Raskin will expose the hidden designs that have the power to hijack our attention, manipulate our choices and destabilize our real world communities. They’ll explore what it means to become sophisticated about human nature, by interviewing hypnotists, magicians, experts on the dynamics of cults and election hacking and the powers of persuasion. How can we escape this unrelenting race to the bottom of the brain stem? Start by subscribing to our new series, Your Undivided Attention.

Episodes
Down the Rabbit Hole by Design
00:54:29
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
Jul 10, 2019
With Great Power Comes...No Responsibility?
00:55:41
Aza sits down with Yael Eisenstat, a former CIA officer and a former advisor at the White House. When Yael noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed as danger at home increased, her public sector experience could help fill a gap in Silicon Valley’s talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn’t go as planned. Yael shares the lessons she learned and her perspective on government’s role in regulating tech, and Aza and Tristan raise questions about our relationships with these companies and the balance of power.
Jun 25, 2019
Should've Stayed in Vegas
00:39:11
In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are often really after, and it's not money. It's the same thing we're looking for when we mindlessly open Facebook or Twitter. How can we design products so that we're not exploiting these universal urges and vulnerabilities but using them to help us? Tristan, Aza and Natasha explore ways we could shift our thinking about making and using technology.
Jun 19, 2019
What Happened in Vegas
00:40:51
Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound in an endless loop of play. She never imagined that the addictive designs she first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, she offers a prescient warning to users and designers alike: How far can the attention economy go toward stealing another moment of your time? Farther than you might imagine.
Jun 10, 2019
Launching June 10: Your Undivided Attention
00:03:16
Technology has shredded our attention. We can do better.
Apr 16, 2019