This transcript is generated with the help of AI and is lightly edited for clarity.
REID:
I’m Reid Hoffman.
ARIA:
And I’m Aria Finger.
REID:
Too often when we talk about the future, it’s all doom and gloom.
ARIA:
Instead, we want to sketch out the brightest version of the future and what it will take to get there. We’re talking about how technology and humanity can come together to create a better future.
REID:
This is Possible.
ARIA:
We love the community that we’re building with this show. And we hear you: You’ve asked for more of Reid’s takes. So starting this season, every other week, Reid will be in the hot seat. I get to ask him a few questions in the spirit of the previous week’s episodes and get his thoughts on the latest on AI technology and the future. So Reid, I’d love to know what, in your view, is the primary role and responsibility of a technology reporter like Kara Swisher? Where are the lines between sharing personal opinion, being highly critical of those with power, but also thinking positively about the future?
REID:
Well, it’s a great question. And you know, one of the reasons why I’ve been a fan of Kara’s for many years is I do think that it’s important to always have people speaking truth to power. And obviously, you know, major technology founders, companies, et cetera, have a lot of power these days. And so I think it’s important to have people criticizing them. Including, as has happened over the years, criticizing me, which I think is a very good thing. And I think people who get overly bent out of shape about the criticism, you know, are really being thin-skinned, very narrowly self-interested, and a bunch of other things. So I think it’s a good thing to have the criticism. Now, that being said, I also share a frequent frustration within the tech industry that when all you have is criticism, it actually does a lot of harm.
REID:
It says “just bad, just bad,” as opposed to, “well, good and bad, and let’s steer towards more of the good and less of the bad.” And that doesn’t mean it needs to be puffed up or something, but it’s like, you know, a ton of things, like people say, “well what has Google ever given me for its use of my data?” And the answer is, well, free search [laugh]. Right? Which is pretty handy for a lot of people. And similarly, it’s like, well, I’m recording the evils of big tech on my iPhone as I’m doing it. And you’re like, well, okay, you know, the iPhone is actually, in fact, a pretty instrumental thing for navigating a whole bunch of places in the world. And, for example, a classic one that’s a very du jour topic amongst a lot of areas is how technology creates, you know, kind of mental health issues, loneliness, et cetera.
REID:
And I think there are some issues that are real there and need to be improved. But also, on the other hand, technology creates a lot of connection. You know, you can FaceTime with people and you can share pictures and a bunch of other things. And it’s like, well, but we see these other mental health issues. Yeah, yeah, yeah. That’s the reason why it’s a complicated landscape. It’s kind of like if you said, you know, there’s a thousand people and 900 of them get better connection and five of them have mental health issues, that’s one issue. And if a hundred of them get better connections and a hundred of them have worse mental health issues, that’s a different issue. And people are imprecise in that, because they get hung up on that one thing. And that leads into some of the challenges with journalism, because it’s like telling that negative story of that one person who, you know, got crushed by this thing.
REID:
And look, that’s—it’s bad. It’s like, well, it really resonates with us as Homo narrativus, you know, kind of storytellers. But it actually in fact leads us to very wrong conclusions. And that’s one of the things where I worry—that journalists do have to hold the critical role, but sometimes cloak themselves in it in a way that they go, look, it’s not about judgment. And of course it’s convenient if it’s not about judgment. Because if everything you’re saying is really negative, it’s just like covering the burning buildings or the plane crashes, and you get more clickthrough, and you get, you know, more attention, and so it tells you that you’re doing the right thing. Whereas really, just like, for example, there’s a role for holding technology leaders accountable to saying, “look, what are the things you’re doing well, and what are the things you’re doing wrong?”
REID:
I think it’s also holding, you know, tech journalists accountable for what are the things you’re doing right and what are the things you’re doing wrong? And part of what I think too often goes wrong with tech journalists—or maybe journalists generally, but certainly tech journalists—is it’s all about the wrong, and not about the balance of it. For example, when I look at a lot of the subjects of journalistic ire—take, you know, Meta / Facebook—you know, if you read through it, it’s like it can’t do anything right. Everything it’s doing is wrong. And you’re like, well, but a billion people are using it every day, very voluntarily [laugh], right? And when I talk to a lot of people who are using it, there’s a lot of good things that are coming out of it.
REID:
And so, you know, you almost by definition know that the press coverage is very slanted and stilted and directional for some set of purposes. Now that being said, I myself have been a critic of Meta on its civic responsibilities. And kind of, how do you interpret freedom of speech, and how does that, you know, play out—it’s a virtue, but how do you manifest it within the media ecosystem of a democracy? But I also see all of the wonderful things that it brings. That balance—helping us navigate and kind of form a zeitgeist set of opinions and ideas about the current state of things, about where things might move to, where things could possibly move to—is actually, in fact, I think the thing that tech journalism needs to do more of, while it of course still occupies its critical role.
ARIA:
Yeah, it’s really interesting. One of our close colleagues—I won’t call him out on the pod—but he has a really interesting point about how technology is often disrupting itself. So when people talk about the music industry and, you know, Spotify’s terrible because artists used to be able to make money by selling CDs, it’s like, well, a hundred years ago artists couldn’t make money anyway because there was no CD, there was no cassette, there was no record player, there was nothing. It’s like we needed that technology to give these people a living and a way to reach the masses, which was really amazing. And that technology has been progressing and, you know, we hope again that journalists will hold people accountable but will also see sort of the amazing new things that can happen. And so you, you know, you brought up Facebook, Meta—another social media company that’s obviously super in the news is TikTok.
ARIA:
And, you know, as a mom of kids, I’m pretty moved by the fact that we’ve seen teenagers say, “I would pay money every month if no one could use TikTok. Like, I feel compelled to use it because I need to, you know, keep up with the Joneses and keep up with my friends.” That doesn’t seem to me to be a reason to ban something. Maybe there’s regulation, maybe there are worries about young people using it and it should be 18-and-over, something like that. But I think what is more interesting is the law we have around foreign ownership. And so it seems to be sort of that framing that legislators are talking about when they’re talking about TikTok—you know, divesting, being banned, it has to change ownership. I would love to know what you think. Is this the way you think we should be going with TikTok—not allowing foreign ownership?
REID:
Well, there’s a whole stack of TikTok questions. Let’s start with the simplest one, which is, you know, China implements regulation against any foreign tech companies, including US ones, to say that they can’t be, you know, foreign-majority-owned and have license to operate within China. And so I think the baseline is it’s quite fair for any other country to then say, “well, for Chinese companies we will then do the same. If you’re going to do that to our companies and prospective companies, we will do the same to you.” Cutting through all of the noise of discussions of shoulds and censorship and preference and all the rest, I think that’s a baseline okay thing for any country to do, right? Including the US. And that also then makes it not, you know, hypocritical of the US to say, “well, actually, in fact, you shouldn’t be regulating or banning Meta, Twitter, LinkedIn, et cetera, in these different countries.”
REID:
Because we only do that responsively, and we prefer that kind of general open ecosystem. So that’s kind of point one. Now point two is, I do think that one of the things in terms of all online kind of media spaces is we’re learning new things. The presumption that the old world order is the same one that should apply to the new world order—when we have these different things of “everybody on all the time,” you know, through their mobile phones and through these services, and how it extends, you know, the Lord of the Flies atmosphere of a school and the prison yard, you know, into the home and all the rest—what that means is, I think, something that needs to be thought of anew.
REID:
It isn’t just like, well, we have these old principles and that’s it. We’re done. We made those decisions. It’s like, no, no, what actually in fact happens matters. And so I’m actually, in fact, pretty sympathetic—as I am in many things—to saying we make decisions about what children can do. You know, it’s one of the reasons why it’s like, well, you can’t drive until a certain age and can’t, you know, have guns and can’t drink and a bunch of other things, to say, look, they’re children. Obviously it’s a broad-brush measure. Some people are mature enough to make decisions by the age of 14 or 16, and other people aren’t mature enough by the age of 30 or 40. But you know, we kind of say, okay, 18 or 21 in order to do that. And that’s the classic kind of broad-brush public policy.
REID:
And I think the notion of saying, “look, how should we engage with and be the right level of both protective and selectively opening up that protectiveness to have children decide their own things” is I think a very good thing. Now the question is, when you get to TikTok, you’re like, well, but there is a good thing about people expressing their voices. There is a good thing about the ability to kind of form community. There’s a bunch of good things that come out of that. Actually on TikTok, just like on YouTube, there’s a whole bunch of how-to information and other kinds of things that are in it. And so you’re like, okay, so what does that mean for how we should balance it? Now, having restrictions for children makes total sense, and then you have to have some consideration for what that should be.
REID:
As to your question, I think a lot of parents would also go, “I try to ban TikTok and then my kids hate me. I’d kind of rather the government do it.” And it’s like, ah, you know, it’s like, I’m just doing that. And so, you know, there may be some responsibility, you know, kind of there. And you know, maybe it’s also kind of a question of—as opposed to just kind of idiosyncratic—it’s like, well, what are the attributes that we should say, you know, when can children be, you know, part of freeform communities, and what are the attributes of the freeform communities? Like, you know, are there things where you say, well, like TikTok, it’s fine from 3:00 PM to 6:00 PM [laugh]. Right? You know, and it’s great, and that’s what it is. Or something like, you know, there are timing variables, there’s, you know, kind of what kind of content, what kind of regulation, what kind of adult participation, parental participation, guidance, all of that stuff.
REID:
I think we should be principled about it versus, you know, ad hoc. And then now of course is when someone smart goes, “well, Reid, what universe are you living in? We don’t seem to do that anywhere else.” And it’s like, “yeah, it’s an aspirational goal [laugh], right? It’s, you know, what we’re trying to get to.” And so, as you know, and, I think, share, it’s kind of this question of, look, it’s not just as it is—you know, hit-it-with-a-stick, ban it—but how do we shape it so it’s actually much more positive? I am very attentive to the importance of, you know, mental health issues and everything else, and how we shape these so that they’re more positive. I think one of the fundamental points, whether it’s TikTok—but also Meta and Twitter—is not to say, you know, it’s like, look, are you saying there’s nothing you can do?
REID:
There’s no low-hanging fruit to make small modifications that increase the level of, you know, kind of positive civic discussion, mental health, et cetera? You’re saying there’s nothing you can do, and you should allow the people who are out there running around going, you know, “kill the minority X or whatever else,” and that’s the more important thing to allow? Really? Let’s have this discussion on, like, what are some—and, you know, nothing will be perfect—but what are some small things we could do to improve that and figure that out? And that’s part of the reason why the discussion of criticism of these things I find frustrating when it’s like, “shut it all down,” as opposed to both the critics and the proponents saying, “look, what are the specific things we would do to improve it?”
REID:
Possible is produced by Wonder Media Network. It’s hosted by Aria Finger and me, Reid Hoffman. Our showrunner is Shaun Young. Possible is produced by Katie Sanders, Edie Allard, Sara Schleede, Adrien Behn, and Paloma Moreno Jiménez. Jenny Kaplan is our executive producer and editor.
ARIA:
Special thanks to Surya Yalamanchili, Saida Sapieva, Ian Alas, Greg Beato, Ben Relles, Parth Patil, and Little Monster Media Company.