Hello friends,
In my last email I drew comparisons between the rituals of Communion and Instagram, and how both rituals ring hollow without remembering the heart behind them.
For a brief moment last year, Silicon Valley's hottest new product was an email app called Superhuman. The email service costs $30/month (!), and their marketing sounds like the first draft of a voiceover for a luxury car commercial:
"Superhuman is not just another email client. We rebuilt the inbox from the ground up to make you brilliant at what you do. We specifically designed it for those of you who want the best…Superhuman is so fast, delightful, and intelligent - you'll feel like you have superpowers."
(Side note: I love how they couldn't resist explaining the punchline of their own product name even though it's the most obvious thing in the world.)
Superhuman started their marketing blitz last June: venture capitalists evangelized the app on Twitter, and the New York Times published an article called "Would You Pay $30 a Month to Check Your Email? One of Silicon Valley's buzziest start-ups, Superhuman, is betting its app's shiny features are worth a premium price."
One of those "shiny features" is what Superhuman calls "Read Receipts." While the New York Times failed to mention any details about the feature, early-access Superhuman user (and former VP of Design at Twitter) Mike Davidson wrote a 4,000+ word blog post about it: Superhuman is Spying on You.
In Mike's article (which is one of the most nuanced, thoughtful reflections I've ever read on how product decisions get made in Silicon Valley), he explains everything that's wrong with Superhuman's "read receipts" feature:
"You've heard the term 'Read Receipts' before, so you have most likely been conditioned to believe it's a simple 'Read/Unread' status that people can opt out of. With Superhuman, it is not. If I send you an email using Superhuman (no matter what email client you use), and you open it 9 times, this is what I see: a running log of every single time you have opened my email, including your location when you opened it."
Meaning: if I use Superhuman to send you an email, I can see when, where, and how many times you opened it, regardless of what email app you use. Without you knowing. And to make matters worse:
"Superhuman never asks the person on the other end if they are OK with sending a read receipt (complete with timestamp and geolocation). Superhuman never offers a way to opt out."
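(A quick technical aside: read receipts like this are typically built on a "tracking pixel," a unique, invisible 1x1 image embedded in the email's HTML. Below is a minimal sketch of that general technique in Python, not Superhuman's actual code; the server name, route, and function names are invented for illustration. Notice how little it takes: every time your mail client loads the image, the sender learns when you opened the email, and your IP address gives away your rough location.)

```python
# A minimal sketch of the tracking-pixel technique (not Superhuman's
# actual code; the route and names here are invented for illustration).
# The sender embeds a unique, invisible image in the email's HTML:
#
#   <img src="https://tracker.example.com/pixel/msg-abc123.gif">
#
# Every time the recipient's mail client renders that image, the
# request below fires, whether or not the recipient ever consented.

from datetime import datetime, timezone

from flask import Flask, Response, request

app = Flask(__name__)

# The classic 1x1 transparent GIF (43 bytes), served as the "pixel."
PIXEL = bytes.fromhex(
    "47494638396101000100800000000000ffffff"
    "21f90401000000002c00000000010001000002024401003b"
)

@app.route("/pixel/<message_id>.gif")
def pixel(message_id: str) -> Response:
    # One log entry per open: the timestamp says when the email was
    # read, and the IP address geolocates to roughly where.
    print(f"{message_id} opened at {datetime.now(timezone.utc).isoformat()} "
          f"from {request.remote_addr}")
    return Response(PIXEL, mimetype="image/gif")

if __name__ == "__main__":
    app.run(port=8080)
```

About the only defense on the receiving end is telling your mail client not to load remote images, which is exactly why that setting exists.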
In his post, Mike imagines three short stories to highlight the potential for abuse enabled by this feature. Here are the first sentences of two of them:
"An ex-boyfriend is a Superhuman user who pens a desperate email to his former partner."
"A pedophile uses Superhuman to send your child an email. Subject: 'Ten Tips to Get Great at Minecraft'."
😳
I'm sure that no one at Superhuman wanted their email product used in that way, which raises two questions:
Did anyone at Superhuman think about the potential problems of exposing location data without consent?
If someone did raise concerns, why were they ultimately overruled?
Unfortunately, Superhuman's oversight isn't an anomaly. There are countless examples of how Silicon Valley's blindly optimistic view of tech (among other attributes) creates "unintended consequences" that can cause irrevocable harm.
Perhaps the most famous example: content recommendation algorithms. What was originally designed to help you "discover" (to use Valley terminology) things you're interested in also enables the rapid spread of misinformation and the development of extremist views.
Or ad targeting. What was originally designed to show you the perfect pair of shoes also enables the spread of political propaganda and fear-mongering to the most susceptible demographics.
In Superhuman's case, there are simple solutions to the problem they introduced: they could require users to opt in to sharing their location, or remove the feature altogether. (Superhuman did change the feature in response to Mike's post, though Mike still felt the changes were inadequate.)
But as more and more of the core technology woven into our lives becomes software-oriented and algorithmically driven, finding solutions to "unintended consequences" gets thornier. It's sometimes literally impossible to understand why things happen the way they do. The algorithms are black boxes.
As Abeba Birhane, a PhD candidate in Cognitive Science at University College Dublin, writes in The Algorithmic Colonization of Africa:
"Data and AI seem to provide quick solutions to complex social problems. And this is exactly where problems arise. Around the world, AI technologies are gradually being integrated into decision-making processes in such areas as insurance, mobile banking, health care and education services."
The issue with this change is that
"Society's most vulnerable are disproportionately affected by the digitization of various services. Yet many of the ethical principles applied to AI are firmly utilitarian. What they care about is 'the greatest happiness for the greatest number of people,' which by definition means that solutions that center minorities are never sought."
"But their voice needs to be prioritized at every step of the way, including in the designing, developing, and implementing of any technology, as well as in policymaking. This requires actually consulting and involving vulnerable groups of society, which might (at least as far as the West's Silicon Valley is concerned) seem beneath the 'all-knowing' engineers who seek to unilaterally provide a 'technical fix' for any complex social problem."
Birhane's suggestions for a more inclusive creative process sound a lot like what I've been reading about Jesus in the Gospel of Luke. Jesus says that he came
"to proclaim good news to the poor…to proclaim freedom for the prisoners and recovery of sight for the blind, to set the oppressed free."
That does sound like good news. If I could do those things, I'd like to do them. But it also sounds lofty.
(I'll refrain from making an "if Jesus were making apps he wouldn't collect your location data" joke…oops, I just made it.)
I've spent this email pointing the finger outward. It's time to point the finger back at myself. Superhuman, algorithms, Birhane, Jesus…
How am I pushing myself to the margins: to consider minorities, to involve the vulnerable groups of society?
I'll end each newsletter with a question that's been placed on my heart while writing. If it resonates, please reply to this email with your reflections! I'll share some of the responses (anonymously) at the bottom of the next email. I hope we can share this journey together.
Today's question:
When have you been marginalized? Did someone fight to include you?
Last email's question:
What kind of rituals, spiritual or secular, do you take part in, and why?
I try to start every morning by writing down three things from the day before and thanking God for them. It's hard to know how that small thing has changed my thoughts, but I hope it has!
I think a lot about my intentions in doing the things I do. I definitely feel that the vast majority of instagramming is done for others, not for ourselves. I don't post nearly as much as I used to because I can't get past the thought that maybe I'm only posting because it'll make others think of me a certain way. Of course I still have my camera roll, but the fancy highlight reel will not be there. Kinda makes me wish I posted…but for myself, not for others.
One practice that I've implemented in a more intentional way is communal prayer. We've been gathering every Monday, Wednesday, and Friday morning at 5am for an hour of prayer. I used to pride myself (so to speak) on not being a morning person, but now mornings fulfill me and fuel me because Jesus keeps pouring in as He meets me there!