Andrew Ross Sorkin (45:00):I want to pivot and talk about AI for a moment. We had Jensen Huang here, who’s a big fan of yours, as you know.

Elon Musk (45:05):Yeah. Jensen’s awesome.

Andrew Ross Sorkin (45:06):He talked about bringing you the first box, by the way, with Ilya, interestingly enough, back in 2016, I think.

Elon Musk (45:14):Yes. There’s a video of Jensen and me unpacking the first AI computer at OpenAI.

Andrew Ross Sorkin (45:22):

So I’m so curious what you think of what’s just happened over the past two weeks. While you were dealing with this other headline, series of headlines, there was a whole other series of headlines at OpenAI. What did you think? You founded it, co-founded it.

Elon Musk (45:46):

Co-founded it, yeah. Well, the whole arc of OpenAI, frankly, is a little troubling, because the reason for starting OpenAI was to create a counterweight to Google and DeepMind, which at the time had two-thirds of all AI talent and basically infinite money and compute. And there was no counterweight. It was a unipolar world. And Larry Page and I used to be very close friends, and I would stay at his house, and I would talk to Larry into the late hours of the night about AI safety. And it became apparent to me that Larry did not care about AI safety. I think perhaps the thing that gave it away was when he called me a speciesist for being pro-humanity, as in a racist, but for species. So I’m like, “Wait a second, what side are you on, Larry?” And then I’m like, okay, listen, this guy’s calling me a speciesist. He doesn’t care about AI safety. We’ve got to have some counterpoint here, because this is no good.

(47:01)
So OpenAI was actually started, and it was meant to be open source. I named it OpenAI after open source. It is, in fact, closed source, super closed. It should be renamed super-closed-source-for-maximum-profit AI, because this is what it actually is. Fate loves irony. In fact, a friend of mine says the way to predict outcomes is that the most ironic outcome is the most likely. It’s like his Occam’s razor: where the simplest explanation is most likely, my friend Jonah’s view is that the most ironic outcome is the most likely. And that’s what’s happened with OpenAI. It has gone from an open-source foundation, a 501(c)(3), to suddenly being a $90 billion for-profit corporation with closed source. So I don’t know how you go from here to there.

Elon Musk (48:01):I don’t know how you get … Is this legal? I’m like, “That’s legal?”

Andrew Ross Sorkin (48:06):So you saw Sam Altman get ousted by somebody you know, Ilya. Ilya was a friend of yours, you brought him there, and your relationship with Larry Page effectively broke down over you recruiting him away, I think.

Elon Musk (48:19):That’s correct. Larry refused to be friends with me after I recruited Ilya.

Andrew Ross Sorkin (48:25):

And so here’s Ilya, apparently, saying something is very wrong.

Elon Musk (48:30):I think we should be concerned about this, because I think Ilya actually has a strong moral compass. He really sweats it over questions of what is right. And if Ilya felt strongly enough to want to fire Sam, well, I think the world should know what that reason was.

Andrew Ross Sorkin (48:52):Have you talked to him?

Elon Musk (48:54):I’ve reached out, but he doesn’t want to talk to anyone.

Andrew Ross Sorkin (48:57):Have you talked to other people behind the scenes as this is all happening?

Elon Musk (49:01):I’ve talked to a lot of people. I’ve not found anyone who knows why. Have you?

Andrew Ross Sorkin (49:12):I think we are all still trying to find out.

Elon Musk (49:15):It’s one of two things: either it was a serious thing, and we should know what it is, or it was not a serious thing, and then the board should resign.

Andrew Ross Sorkin (49:24):What do you think of Sam Altman?

Elon Musk (49:28):

I have mixed feelings about Sam. The ring of power can corrupt, and he has the ring of power. So I don’t know. I think I want to know why Ilya felt so strongly as to fire Sam. This sounds like a serious thing. I don’t think it was trivial. And I’m quite concerned that there’s some dangerous element of AI that they’ve created.

Andrew Ross Sorkin (50:09):Discovered.

Elon Musk (50:09):Yes.

Andrew Ross Sorkin (50:10):You think they’ve discovered something.

Elon Musk (50:12):That’d be my guess.

Andrew Ross Sorkin (50:15):Where are you with your own AI efforts relative to where you think OpenAI is, where you think Google is, where you think the others are?

Elon Musk (50:31):

On the AI front, I’m in somewhat of a quandary here because I’ve thought AI could be something that would change the world in a significant way since I was in college, 30 years ago. Now, the reason I didn’t go build AI right from the get-go was because I was uncertain about which edge of the double-edged sword would be sharper, the good edge or the bad edge.

(50:58)
So I held off on doing anything on AI. I could’ve created, I think, a leading AI company, and OpenAI actually is that, because I was just uncertain: if you make this magic genie, what will happen? Whereas I think building sustainable energy technology is much more of a single-edged sword that is single-edged good. Making life multi-planetary, I think, single-edged good. Starlink, mostly single-edged good, giving better connectivity to people that don’t have connectivity or where it’s too expensive, I think is very much a good thing. Starlink was instrumental, by the way, in halting the Russian advance, and the Ukrainians said so. So I think, with AI, you’ve got the magic genie problem. You may think you want a magic genie, but once that genie’s out of the bottle, it’s hard to say what happens.

Andrew Ross Sorkin (52:09):How far are we away from that genie being out of the bottle, do you think, or you think it’s already out?

Elon Musk (52:15):The genie is certainly poking its head out.

Andrew Ross Sorkin (52:17):

The AGI, the idea of artificial general intelligence, given what you now are working on yourself, and you know how easy or hard it is to train, to create the inferences, to create the weights, I hope I’m not getting too far in the weeds of just how this works, but those are the basics behind the software end of this.

Elon Musk (52:41):

It’s funny, all these weights, they’re just basically numbers in a comma-separated values file, and that’s our digital god, the CSV file. I found that funny, but that’s literally what it is. So I think it’s coming pretty fast.
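[To make the “digital god in a CSV file” quip concrete: a trained network’s weights really are just arrays of numbers, and nothing stops you from serializing them as comma-separated values. A minimal sketch in Python; the toy two-layer network and file name are illustrative assumptions, not anything OpenAI or xAI actually does:]

```python
import csv
import numpy as np

# A toy "model": two weight matrices, randomly initialized.
rng = np.random.default_rng(0)
weights = {
    "layer1": rng.normal(size=(4, 8)),  # 4 inputs -> 8 hidden units
    "layer2": rng.normal(size=(8, 2)),  # 8 hidden -> 2 outputs
}

# Dump every parameter, one row per entry: tensor name, indices, value.
with open("weights.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["tensor", "row", "col", "value"])
    for name, mat in weights.items():
        for (i, j), v in np.ndenumerate(mat):
            writer.writerow([name, i, j, float(v)])

# Loading is the same trick in reverse: the "model" is nothing but
# these numbers, which is the point of the joke.
restored = {name: np.zeros_like(mat) for name, mat in weights.items()}
with open("weights.csv") as f:
    for rec in csv.DictReader(f):
        restored[rec["tensor"]][int(rec["row"]), int(rec["col"])] = float(rec["value"])

assert all(np.allclose(weights[k], restored[k]) for k in weights)
```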

Andrew Ross Sorkin (53:05):

You famously have admitted to overstating how quickly things will happen, but how quickly do you think this will happen?

Elon Musk (53:16):If you say smarter than the smartest human at anything?

Andrew Ross Sorkin (53:20):Yep.

Elon Musk (53:22):

It may not then be quite smarter than all humans, well, all machine-augmented humans, because people have got computers and stuff, so there’s a higher bar. But if you say it’s more than any … it can write as good a novel as, say, J.K. Rowling, or discover new physics, or invent new technology, then I would say that we’re less than three years from that point.

Andrew Ross Sorkin (53:47):

Let me ask you a question about xAI and what you’re doing, because there’s an interesting thing that’s different, I think, about what you have relative to some of the others, which is you have data, you have information, you have all of the stuff that everybody in here has put on the platform to sort through, and I don’t know if everybody realized that initially. What is the value of that?

Elon Musk (54:21):Yeah. Data is very important. You could say data is probably more valuable than gold.

Andrew Ross Sorkin (54:29):But then maybe you have the gold in X in a different way, in a way, again, that I don’t know if the public appreciates what that means.

Elon Musk (54:42):

Yes, X might be the single best source of data. People click on more links to X than anything else on earth. Sometimes people think Facebook or Instagram is a bigger thing, but actually, there are more links to X than anything. This is public information. You can Google it.

Andrew Ross Sorkin (55:06):Okay, let me ask you a-

Elon Musk (55:08):

So it is where you would find what is happening right now on earth at any given point in time. The whole OpenAI drama played out, in fact, on the X platform. Google certainly has a massive amount of data, so does Microsoft, but it is one of the best sources of data.

Andrew Ross Sorkin (55:35):

Can I ask you about an interesting IP issue, which I think is actually something I can speak to, as somebody who’s in the creator business and journalistic business and whatnot, where I care about copyright? So one of the things about training on data has been this idea that these things are not being trained on people’s copyrighted information. Historically, that’s been the concept.

Elon Musk (55:59):Yeah, that’s a huge lie.

Andrew Ross Sorkin (56:00):Say that again?

Elon Musk (56:02):These AIs are all trained on copyrighted data, obviously.

Andrew Ross Sorkin (56:06):So you think it’s a lie when OpenAI says that this is … None of these guys say they’re training on copyrighted data.

Elon Musk (56:13):That’s a lie.

Andrew Ross Sorkin (56:14):It’s a lie, straight up.

Elon Musk (56:15):Yeah, straight up lie.

Andrew Ross Sorkin (56:16):Okay.

Elon Musk (56:16):100%. Obviously, it’s been trained on copyrighted data.

Andrew Ross Sorkin (56:21):Okay. So let me ask a second question, which is, all of the people who have been uploading-

Elon Musk (56:28):It’s one every minute here.

Andrew Ross Sorkin (56:28):

All of the people who have been uploading articles, the best quotes from different articles, videos to X, all of that can be trained on. And it’s interesting because people put all of that there and those quotes have historically been considered fair use, right?

Elon Musk (56:47):Yeah.

Andrew Ross Sorkin (56:47):

People are putting those quotes up there. And individually, on a fair use basis, you’d say, “Okay, that makes sense,” but now, there are people who do threads, and by the way, there may be multiple people who’ve done an article that has 1,000 words. Technically, all 1,000 words could’ve made it onto X somehow. And effectively, now, you have this remarkable repository, and I wonder how you think about that, again, and how you think the creative community and those who were the original IP owners should think about that.

Elon Musk (57:20):I don’t know, except to say that, by the time these lawsuits are decided, we’ll have digital God, so I’d ask digital God at that point. These lawsuits won’t be decided on a timeframe that is relevant.

Andrew Ross Sorkin (57:34):Is that a good thing or a bad thing?

Elon Musk (57:41):

There’s that … I don’t know if it’s actually a real Chinese thing or not, but “may you live in interesting times” is apparently not a good thing. But personally, I would prefer to live in interesting times, and we live in the most interesting of times. For a while there, I was really getting demotivated and losing sleep over the threat of AI danger, and then I finally became fatalistic about it and said, “Well, even if I knew annihilation was certain, would I choose to be alive at that time or not?” And I said, “I probably would choose to be alive at that time, because it’s the most interesting thing, even if there’s nothing I could do about it.” So then basically, a fatalistic resignation helped me sleep at night, because I was having trouble sleeping at night because of AI danger.

(58:43)
Now, what to do about it? I’ve been the one banging the drum the hardest, by far the longest, or at least one of the longest, on AI danger, and these regulatory things that are happening, the single biggest reason they’re happening is because of me.

Andrew Ross Sorkin (59:02):

Do you think they’re ever going to get their arms around it? We talked to the Vice President this afternoon. She said she wants to regulate it. People have been trying to regulate social media for years and have done nothing, effectively.

Elon Musk (59:13):

Well, there’s regulation around anything which is a physical danger, a danger to the public. So cars are heavily regulated, communications are heavily regulated, rockets and aircraft are heavily regulated. The general philosophy about regulation is that, when something is a danger to the public, there needs to be some government oversight. So, in my view, AI is more dangerous than nuclear bombs, and we regulate nuclear bombs. You can’t just go make a nuclear bomb in your backyard. I think we should have some kind of regulation with AI. Now, this tends to cause the AI accelerationists to get up in arms, because they think AI is heaven, basically.

Andrew Ross Sorkin (01:00:04):

But you typically don’t like regulation. You’ve pushed back on regulators, for the most part, in the world of Tesla and in so many instances where we read articles about you pushing back on the regulators. I’m so curious why, in this instance, now that you own one of these businesses.

Elon Musk (01:00:19):

As I said a moment ago, one should not take what is viewed in the media as being the whole picture. There are literally, and this is not an exaggeration, probably 100 million regulations that my companies comply with, and there are probably five that we don’t. And if we disagree with some of those regulations, it’s because we think a regulation that is meant to do good doesn’t actually do good.

Andrew Ross Sorkin (01:00:53):But that’s an interesting thing because-

Elon Musk (01:00:53):It’s not defying regulations for the sake of defiance.

Andrew Ross Sorkin (01:00:55):

The question is, if there are laws and rules, whether you’re making the decision that the law and the rule shouldn’t be the law and the rule, and then … right?

Elon Musk (01:01:05):

No, I’m saying you’re fundamentally mistaken, and it should be obvious that you’re mistaken. My companies’ automotive business is heavily regulated. We would not be allowed to put cars on the road if we did not comply with this vast body of regulation. You could fill up this stage, literally six feet high, with the regulations that you have to comply with to make a car. You could have a room full of phone books. That’s how big the regulations are. And if you don’t comply with all of those, you can’t sell the car. And if we don’t comply with all the regulations for rockets or for Starlink, they shut us down.

(01:01:46)
So in fact, I am incredibly compliant with regulations. Now, once in a while, there’ll be something that I disagree with. The reason I would disagree with it is because I think the regulation, in that particular case, in that rare case, does not serve the public good. And therefore, I think it is my obligation to object to a regulation that is meant to serve the public good but doesn’t. That’s the only time I object, not because I seek to object. In fact, I’m incredibly rule-following.

Andrew Ross Sorkin (01:02:14):Let me ask you a separate question, a social-media-related question. We’ve been talking about TikTok today ahead of the election.

Elon Musk (01:02:21):TikTok is-

Andrew Ross Sorkin (01:02:22):What do you think of TikTok? Do you think it’s a national security threat?

Elon Musk (01:02:30):I don’t use TikTok.

Andrew Ross Sorkin (01:02:32):Say that again, you don’t?

Elon Musk (01:02:33):

I don’t personally use it, but teenagers and people in their 20s seem almost religiously addicted to TikTok. Some people will watch TikTok for two hours a day. I stopped using TikTok when I felt the AI probing my mind, and it made me uncomfortable. And in terms of antisemitic content, TikTok is rife with that. It has the most viral antisemitic content by far.

Andrew Ross Sorkin (01:03:20):But do you think the Chinese Government is using it to manipulate the minds of Americans?

Elon Musk (01:03:26):No.

Andrew Ross Sorkin (01:03:26):Is that something that you think we should worry about? You have different states that are trying to ban it.

Elon Musk (01:03:32):

I don’t think this is some Chinese Government plot, but the TikTok algorithm is entirely AI-powered, so it is really just trying to find the most viral thing possible. It’s what is going to keep you glued to the screen. That’s it. Now, on sheer numbers, there are on the order of two billion Muslims in the world and, I think, a much smaller number of Jewish people, 20 million or so, two orders of magnitude fewer. So if you just look at content production, on a sheer numbers basis, it is going to be overwhelmingly antisemitic.
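[For scale, the arithmetic behind the population figures he cites, taking two billion and 20 million at face value:]

\[
\frac{2 \times 10^{9}}{2 \times 10^{7}} = 10^{2}
\]

[That is a hundred-to-one ratio, two orders of magnitude.]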

Andrew Ross Sorkin (01:04:21):Let me ask you a political question, and I’ve been trying to square this one in my head for a long time. In the last two or three years, you have moved decidedly to the right, I think.

Elon Musk (01:04:34):Have I?

Andrew Ross Sorkin (01:04:34):

Well, we can discuss this. I think that you have been espousing and promoting a number of Republican candidates and others. You’ve been very frustrated with the Biden Administration over, I think, unions and feeling like they did not respect what you’ve created.

Elon Musk (01:04:56):

Well, we did nothing to provoke the Biden Administration, and they held an electric vehicle summit at the White House and specifically refused to let Tesla attend. This was in the first six months of the administration. And we inquired. We’re like, “We literally make more electric cars than everyone else combined. Why are we not allowed? Why are you only letting in Ford, GM, Chrysler, and the UAW, and specifically disallowing us from the EV summit at the White House?” We had done nothing to provoke them. Then Biden went on to add insult to injury and publicly said that GM was leading the electric car revolution. This was in the same quarter that Tesla made 300,000 electric cars and GM made 26. Does that seem fair to you?

Andrew Ross Sorkin (01:05:45):But tell me this, then. It doesn’t seem fair. And I’ve asked repeatedly, you’ve probably seen me-

Elon Musk (01:05:53):And by the way, I had a great relationship with Obama. So this was not a-

Andrew Ross Sorkin (01:05:57):But then there’s this.

Elon Musk (01:05:57):

I voted for Obama. I stood in line for six hours to shake Obama’s hand. Okay?

Andrew Ross Sorkin (01:06:04):

Okay. So let me just ask on a personal level, I can see it in your face, this hurt you personally.

Elon Musk (01:06:11):And it hurt the company, too, and it was an insult to … Tesla has 140,000 employees. Half of them are in the United States. Tesla has created more manufacturing jobs than everyone else combined.

Andrew Ross Sorkin (01:06:26):So let me ask this, then. You’ve devoted at least the last close to 20 years of your life, if not more, to the climate, climate change, trying to get Tesla off the ground, in part to improve climate. You’ve talked about that.

Elon Musk (01:06:40):Yeah, a real right-wing motive.

Andrew Ross Sorkin (01:06:42):Repeatedly.

Elon Musk (01:06:44):God, far right, if anything.

Andrew Ross Sorkin (01:06:45):No, I understand that.

Elon Musk (01:06:46):It’s a reverse psychology, next level.

Andrew Ross Sorkin (01:06:51):Well, no, but here’s then the question, which is how do you square the support that you have given … I believe you were at a fundraiser for Vivek Ramaswamy, for example, who says that the climate issue is a hoax, right?

Elon Musk (01:07:10):Yeah, I disagree with him on that.

Andrew Ross Sorkin (01:07:12):

But I would think that that would be such a singular issue for you. I would think that the climate issue would be such a singular issue for you that, actually, it would disqualify almost anybody who didn’t take that issue seriously.

Elon Musk (01:07:25):

Well, I haven’t endorsed anyone for president. I wanted to hear what Vivek had to say, because some of the things he says, I think, are pretty solid. He’s concerned about government overreach, about government control of information. The degree to which old Twitter was basically a sock puppet of the government was ridiculous. So it seems to me that there’s a very severe violation of the First Amendment in terms of how much control the government had over old Twitter, and it no longer does.

(01:08:04)
So there’s a reason for the First Amendment. The reason for the First Amendment, for freedom of speech, is that the people who immigrated to this country came from places where there was not freedom of speech. And they were like, “You know what? We’ve got to make sure that that’s constitutional,” because where they came from, if they said something, they’d be put in prison or something bad would happen to them. And with freedom of speech, you have to ask, when is it relevant? It’s only relevant when someone you don’t like can say something you don’t like, or it has no meaning. And as soon as you throw in the towel and concede to censorship, it is only a matter of time before someone censors you. And that is why we have the First Amendment.

Andrew Ross Sorkin (01:08:58):

Could you see yourself voting for President Biden, if it’s a Biden-Trump election, for example?

Elon Musk (01:09:09):I think I would not vote for Biden.

Andrew Ross Sorkin (01:09:12):You’d vote for Trump.

Elon Musk (01:09:14):I’m not saying I’d vote for Trump, but this is definitely a difficult choice here.

Andrew Ross Sorkin (01:09:23):Would you vote for Nikki Haley? Nikki Haley, by the way, wants all social media names to be exposed, as you know.

Elon Musk (01:09:31):

No, I think that’s outrageous. I’m not going to vote for some pro-censorship candidate. Like I said, I think you have to consider that there’s a lot of wisdom in these amendments, I mean, the Constitution, and a lot of things that we take for granted here in the United States don’t even exist in Canada. There’s no constitutional right to freedom of speech in Canada, and there are no Miranda rights in Canada. People think you have the right to remain silent. You don’t, actually, in Canada. I’m half Canadian, I can say these things, I suppose. So freedom of speech is incredibly important, even when people … And like I said, it’s actually especially important, in fact, it is only relevant, when people you don’t like can say things you don’t like. [inaudible 01:10:40] they’re meaningless.

Andrew Ross Sorkin (01:10:41):

You think, right now, the Republican candidates or the Democrats are more inclined … This is where you go to, I assume, to woke and anti-woke and the mind virus issue that you’ve talked about. Which party do you think is more pro freedom of speech, given all the things you’ve seen? Because we also see DeSantis preventing people from reading certain things. Maybe you think that’s correct.

Elon Musk (01:11:10):

Look, we actually are in an odd situation here where, on balance, the Democrats appear to be more pro-censorship than the Republicans, and that used to be the opposite. It used to be that the left position was freedom of speech. I believe, at one point, the ACLU even defended the right of someone to claim that they were a Nazi or something like that. So for the left, freedom of speech was fundamental. My perception, perhaps it isn’t accurate, is that pro-censorship sentiment is more on the left than the right.

Elon Musk (01:12:00):

… We certainly get more complaints from the left than the right, let me put it that way. But my aspiration for the X platform is that it is the best source of truth, or the least inaccurate source of truth. And while I don’t know if you will believe me or not, I think honesty is the best policy, and I think that the truth will win over time.

(01:12:25)
And we’ve got this great system, and it’s getting better, called Community Notes, which I think is fantastic at correcting falsehoods or adding context. In fact, we make a point of not removing anything, but only adding context. Now, that context could include that this is completely false and here’s why. And no one is immune to this. I’m not immune to it. Advertisers are not immune to it. In fact, we’ve had Community Notes that have caused us some loss in advertising, speaking of loss in advertising revenue. If there’s false advertising, the Community Note will say, “This is false, and here is why.”

(01:13:12)
There’s one specific example that is public knowledge, so I’ll mention it, which is that at one point Uber had this ad which said, “Earn like a boss.” And it was community noted: “If by boss you mean $12.47 an hour.” This did cause at least a temporary suspension of advertising from Uber.

Andrew Ross Sorkin (01:13:32):

I got to ask you a question that might make everybody in the room uncomfortable or not uncomfortable. It goes to the free speech issue. The New York Times Company and the New York Times newspaper appeared over the summer to be throttled.

Elon Musk (01:13:45):What did?

Andrew Ross Sorkin (01:13:46):The New York Times?

Elon Musk (01:13:47):

Well, we do require that everyone has to buy a subscription, and we don’t make exceptions for anyone. And I think if I want the New York Times, I have to pay for a subscription, and they don’t give me a free subscription. So I’m not going to give them a free subscription.

Andrew Ross Sorkin (01:14:05):But were you throttling the New York Times, relative to other news organizations, relative to everybody else? Was it specific to The Times?

Elon Musk (01:14:13):They didn’t buy a subscription. By the way, it only costs like $1,000 a month. So if they just do that, then they’re back in the saddle.

Andrew Ross Sorkin (01:14:25):But you are saying that it was throttled?

Elon Musk (01:14:27):No, I’m saying-

Andrew Ross Sorkin (01:14:28):I mean, was there a conversation that you had with somebody where you said, “Look, I’m unhappy with The Times. They should either be buying the subscription, or I don’t like their content or whatever”?

Elon Musk (01:14:37):Any organization that refuses to buy a subscription is not going to be recommended.

Andrew Ross Sorkin (01:14:46):But then what does that say about free speech? And what does that say about amplifying certain voices-

Elon Musk (01:14:52):Free speech is not exactly free, it costs a little bit.

Andrew Ross Sorkin (01:14:57):But that’s an interesting-

Elon Musk (01:14:58):Yeah. It’s like in South Park where they say, “Freedom isn’t free, it costs a buck-oh-five,” or whatever. But it’s pretty cheap. Okay? Low-cost freedom.

Andrew Ross Sorkin (01:15:13):I got a couple more questions for you. You’re headed back to Texas after this-

Elon Musk (01:15:17):Freedom-

Andrew Ross Sorkin (01:15:18):To launch the Cybertruck.

Elon Musk (01:15:20):Yeah.

Andrew Ross Sorkin (01:15:21):

It’s going to be a big launch, but I wanted to ask you right now more broadly just about the car business and what you see actually happening, and specifically, the government put in place lots of policies, as you know, to try to encourage more EVs. And one of the things that’s happened uniquely is you now have a lot of car companies saying, “Actually, this is too ambitious for us. These plans are too ambitious.” 4,000 dealers, I don’t know if you saw, just yesterday sent a letter to the White House saying, “This has gone too far, you’re going too far.”

Elon Musk (01:15:53):This is EV?

Andrew Ross Sorkin (01:15:56):

It was, “This is going too fast, too far,” and that there’s not enough demand. Underneath all this is this idea that maybe there’s not enough demand for EVs, that the American public has not bought into … I mean, they bought into it with your company, but they haven’t bought into it broadly enough.

Elon Musk (01:16:13):

Well, I think if you make a compelling electric car, people will buy it. No question about it. I mean, electric car sales in China are gigantic. That’s by far the biggest category, and I think that would be the case … I mean, it’s worth noting, probably the best refutation of that is that the Tesla Model Y will be the best-selling car of any kind on earth this year, of any kind, gasoline or otherwise.

Andrew Ross Sorkin (01:16:43):

Is there another car company that you think is doing a good job with EVs?

Elon Musk (01:16:49):

I think the Chinese car companies are extremely competitive. By far our toughest competition is in China. So I mean, there are a lot of people out there who think the top 10 car companies are going to be Tesla followed by nine Chinese car companies. I think they might not be wrong. China is super good at manufacturing, and the work ethic is incredible. So if we consider different leagues of competitiveness at Tesla, we consider the Chinese league to be the most competitive. And by the way, we do very well in China, because our China team is the best in China.

Andrew Ross Sorkin (01:17:31):

How worried are you that the unionization effort that just took place, well, I shouldn’t say effort, but the new wages and the like at GM and Ford, that they’re coming for you? And they are coming for you. What is that going to mean to you and your business?

Elon Musk (01:17:50):

Well, I mean, I think it’s generally not good to have an adversarial relationship between people on the line, one group at the company, and another group. In fact, I disagree with the idea of unions, but perhaps for a reason that is different than people may expect, which is I just don’t like anything which creates kind of a lords-and-peasants sort of thing. And I think unions naturally try to create negativity in a company and create a sort of lords-and-peasants situation.

(01:18:25)
There are many people at Tesla who have gone from working on the line to being in senior management. There is no lords and peasants. Everyone eats at the same table. Everyone parks in the same parking lot. At GM there’s a special elevator only for senior executives. We have no such thing at Tesla.

(01:18:45)
And the thing is that I actually know the people on the line, because I worked on the line, and I walked the line, and I slept in the factory, and I worked beside them. So I’m no stranger to them. And there are actually many times where I’ve said, “Well, can’t we just hold a union vote?” But apparently a company is not allowed to hold a union vote; it has to be somehow called for by the union, the company can’t do it. So I said, “Well, let’s just hold a vote and see what happens.”

(01:19:18)
The actual problem is the opposite. It’s not that people are trapped at Tesla building cars. The challenge is, how do we retain great people to do the hard work of building cars when they have six other opportunities that are easier? That’s the actual difficulty: building cars is hard work, and there are much easier jobs. And I just want to say that I’m incredibly appreciative of those who build cars, and they know it.

(01:19:51)
So I don’t know, maybe we’ll be unionized. I say, if Tesla gets unionized, it will be because we deserve it and we failed in some way. But we certainly try hard to ensure the prosperity of everyone. We give everyone stock options. We’ve made many people who were just working the line, who didn’t even know what stocks were, we’ve made them millionaires.

Andrew Ross Sorkin (01:20:21):We’re going to run out of time. Final couple of quick questions. When do you have the time to tweet or to post? I actually think about it all the time. As I said, I use it-

Elon Musk (01:20:33):Well, I have to go to the bathroom sometimes.

Andrew Ross Sorkin (01:20:36):I use it all the time. Meaning, if we were to open up our phones and look at the screen time, what does yours look like?

Elon Musk (01:20:45):Well, about every three hours, I make a trip to the lavatory.

Andrew Ross Sorkin (01:20:50):And that’s the only time you do this? Seems like you’re on there a lot.

Elon Musk (01:20:58):No, I mean there’ll be brief moments between meetings. I mean obviously, I have 17 jobs and no, I guess technically it’s work at this point.

Andrew Ross Sorkin (01:21:15):It is, but I’m thinking just in terms of your mind share. I mean, by the way, there’s a lot of people who should be working who are on this app.

Elon Musk (01:21:22):Technically posting on Twitter or X is work. It does count as work. So there’s that. But no, I mean I think I’m on… Well I guess usually, probably I’m on for longer than I think I am.

Andrew Ross Sorkin (01:21:38):I know, but do you think that’s five hours a day, four hours?

Elon Musk (01:21:40):If you look at the screen time, the number of hours per week, sometimes that’s a scary number. It’s probably, I don’t know, a little over an hour a day, or something like that.

Andrew Ross Sorkin (01:21:51):Just an hour a day? If we really looked at this together. Do you have your phone with you?

Elon Musk (01:21:56):Yeah.

Andrew Ross Sorkin (01:21:57):You want to look?

Elon Musk (01:21:59):Okay.

Andrew Ross Sorkin (01:22:03):Okay, here we go. You ready? Screen time in general?

Elon Musk (01:22:07):Yeah, screen time. Sometimes this is a scary number though.

Andrew Ross Sorkin (01:22:11):I know, that’s why I thought…

Elon Musk (01:22:22):I just got a new phone, so I think this is not accurate. It’s one minute. Pretty sure it’s more than that. Wait, it’s over the week. There we go.

Andrew Ross Sorkin (01:22:31):Yeah, go to the week.

Elon Musk (01:22:35):Okay, so it’s still wrong. It’s more than four minutes. I just got a new phone, so this is not accurate. It literally says four minutes.

Andrew Ross Sorkin (01:22:42):New phone? Tim Cook sent you that phone?

Elon Musk (01:22:44):New phone. [inaudible 01:22:45].

Andrew Ross Sorkin (01:22:47):I should ask, by the way I just mentioned Tim Cook. Do you feel like you’re going to have to have a battle with him eventually? Is that the next fight over the app store?

Elon Musk (01:22:56):The idea of making a phone… What do you mean like-

Andrew Ross Sorkin (01:22:58):No, no, no.

Elon Musk (01:22:59):The app store?

Andrew Ross Sorkin (01:23:01):The app store. Are you going to make a phone? Sam Altman’s apparently thinking about making a phone with Jony Ive.

Elon Musk (01:23:05):

I mean, I don’t think there’s a real need to make a phone. I mean, if there’s an essential need to make a phone, I’ll make a phone, but I’ve got a lot of fish to fry. So I mean I do think there’s a fundamental challenge that phone makers have at this point because you’ve got basically a black rectangle. How do you make that better?

Andrew Ross Sorkin (01:23:31):So do you want to do that? What does that look like in Elon’s head?

Elon Musk (01:23:36):No, that’s literally… Yeah, good phrase, in the head. A Neuralink.

Andrew Ross Sorkin (01:23:42):Well there we go. We need to touch that before it’s over.

Elon Musk (01:23:46):The best interface would be a neural interface directly to your brain. So that would be a Neuralink.

Andrew Ross Sorkin (01:23:52):How far are we, do you think, from that, and how exciting or scary should that seem? And we read these headlines, obviously, about monkeys who died, as you know. What should we think about that?

Elon Musk (01:24:03):

Yeah, actually, this is… The USDA inspector who came by Neuralink facilities literally said that in her entire career she has never seen a better animal care facility. We are the nicest to animals that you could possibly be, even to the rats and mice, even though they caused the plague and everything. So it is like monkey paradise.

(01:24:35)
So the thing that gets conflated is that there were some terminal monkeys, where, this is actually several years ago, the monkeys were about to die, and we’re like, okay, we’ve got an experimental device. It’s the kind of thing which you would only put on a monkey that’s about to die. And then when the monkey died, it didn’t die because of the Neuralink; it died because it had a terminal case of cancer or something like that.

(01:24:57)
Neuralink has never caused the death of a monkey, to the best … Unless they’re hiding something from me, it’s never caused the death of a monkey. And in fact, we’ve now had monkeys with Neuralink implants for two, three years, and they’re doing great. And we’ve even replaced the Neuralink twice, and we’re getting ready to do the first implants, hopefully in a few months. The early implementations of Neuralink, I think, are unequivocally good. Speaking of the double-edged sword, I think these early implementations are a single-edged sword, because the first implementations will be to enable people who have lost the brain-body connection to operate a computer or a phone faster than someone who has hands that work. So you can imagine, if Stephen Hawking could communicate faster than someone who had full-body functionality, how incredible that would be. Well, that’s what this device will do, and we should have proof of that in a human, hopefully in a few months.

(01:26:07)
It already works in monkeys, and it works quite well: monkeys can play video games just by thinking. The next application, after dealing with tetraplegics and quadriplegics, is going to be vision. Vision is the next thing. So if somebody has lost both eyes, or the optic nerve has failed, basically where they have no possibility of some ocular correction, that would be the next thing for Neuralink: a direct vision interface. And in fact, then you could be like Geordi La Forge from Star Trek. You could see in any frequency; you could actually see in radar if you want.
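[The cursor control Musk describes, operating a computer “just by thinking,” rests on a long-standing idea from the brain-computer-interface literature: learn a decoder that maps recorded neural firing rates to intended cursor velocity. A minimal sketch in Python with synthetic data; the plain least-squares linear decoder here is an illustrative assumption, not Neuralink’s actual pipeline:]

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training data: firing rates of 64 neurons (Hz), recorded while
# the intended 2-D cursor velocity is known (e.g., during a guided task).
n_samples, n_neurons = 5000, 64
true_mapping = rng.normal(size=(n_neurons, 2))  # unknown in a real experiment
rates = rng.poisson(lam=10.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_mapping + rng.normal(scale=5.0, size=(n_samples, 2))

# Fit a linear decoder W minimizing ||rates @ W - velocity||^2.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# At runtime, each new window of firing rates becomes a cursor velocity.
new_rates = rng.poisson(lam=10.0, size=(1, n_neurons)).astype(float)
vx, vy = (new_rates @ W)[0]
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")
```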

Andrew Ross Sorkin (01:26:57):Two final questions, and then we’re going to end this conversation, which I think has taken everybody inside the mind of Elon Musk today-

Elon Musk (01:27:05):Not as well as Neuralink will though.

Andrew Ross Sorkin (01:27:12):

It actually goes to self-driving cars and vision and everything else. And I asked this question of Pete Buttigieg, the Transportation Secretary; it’s actually something you retweeted. So I wanted to ask you the same question. There’s a big question about autonomous vehicles and the safety of them, but there’s also a question about when it will be politically palatable in this country for people to die in cars that are controlled by computers. Which is to say, we have 35,000 to 40,000 deaths every year in this country. If you could bring that number down to 10,000, or 5,000, that might be a great thing. But do we think that the country will accept the idea that 5,000 people, that your family, might have perished in a vehicle as a result, not of a human making a mistake, but of a computer?

Elon Musk (01:28:13):

Yes. Well, first of all, humans are terrible drivers. So people text and drive, they drink and drive, they get into arguments, they do all sorts of things in cars that they should not do. So it’s actually remarkable that there are not more deaths than there are. What we’ll find with computer driving is, I think probably an order of magnitude reduction in deaths.

(01:28:45)
And the US has actually far fewer deaths per capita than the rest of the world. If you go worldwide, I think there’s something close to a million deaths per year due to automotive accidents. So I think computer driving will probably drop that by 90% or more. It won’t be perfect, but it’ll be 10 times better.
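[Taking his own figures at face value, the claimed order-of-magnitude reduction works out as:]

\[
10^{6} \ \text{deaths/year} \times (1 - 0.9) = 10^{5} \ \text{deaths/year}
\]

[Roughly a million annual worldwide road deaths falling to about 100,000.]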

Andrew Ross Sorkin (01:29:09):And do you think that the public will accept that? Do you think the government will accept that?

Elon Musk (01:29:13):Well, in large numbers, it will simply be so obviously true that it really cannot be denied.

Andrew Ross Sorkin (01:29:21):

And what do you think, I know we’ve talked about the timeline before, and I know people have criticized you for putting out timelines that may not have come true just yet, but what do you think it really is? And by the way, do you ever say to yourself, oh, I shouldn’t have said that?

Elon Musk (01:29:36):

Sure, of course. Wait, I shouldn’t have said that. So yeah, I’m optimistic about … I think I’m naturally optimistic about time scales, and if I was not naturally optimistic, I wouldn’t be doing the things that I’m doing. I mean, I certainly wouldn’t have started a rocket company or an electric car company if I didn’t have some sort of pathological optimism, frankly.

(01:30:04)
So as you pointed out, many people said that it would fail, and in fact, actually, I agreed with them. I said, “Yes, it probably will fail.” And they’re like, “Hmm, okay.” But I thought SpaceX and Tesla had less than a 10% chance of success when we started them. So yeah, anyway. But the self-driving thing, I’ve been optimistic about it, and we’ve certainly made a lot of progress. If anybody has been using the Full Self-Driving beta, the progress every year has been substantial.

(01:30:42)
It’s really now at the point where, in most places, it’ll take you from one place to another with no interventions. And the data is unequivocal that supervised full self-driving is somewhere around four times safer, or maybe more, than a human driving by themselves. So I can certainly see it coming. Actually, really-

Andrew Ross Sorkin (01:31:12):But do you think it’s another five or 10 years? I mean, people say-

Elon Musk (01:31:14):No, no, no, definitely not. Definitely not.

Andrew Ross Sorkin (01:31:17):Do you feel like investors have invested in something that hasn’t happened yet? Is that fair to them? And that’s the other question that people have about that.

Elon Musk (01:31:25):

Well, I mean, I think they’ve all, with rare exception, thought it wasn’t happening. So they were investing despite thinking that; they’re very clear that they don’t think it’s real. So they’re not saying, “Oh, we just believe everything Elon says, hook, line, and sinker.” But the thing is, I mean, it would be a fair criticism of me to say that I’m late, but I always deliver in the end.

Andrew Ross Sorkin (01:31:53):Let me ask you a final question. I took note of this. It was November 11th, and you took to Twitter and you wrote only two words. You said, “Amplify empathy.”

Elon Musk (01:32:03):Right.

Andrew Ross Sorkin (01:32:05):

I was taken aback by that, given all the things that have been going on in the world. Do you remember what you were thinking?

Elon Musk (01:32:13):Well, I think it’s quite literally-

Andrew Ross Sorkin (01:32:15):I understand it, but what was going on? Why did you write that?

Elon Musk (01:32:21):Well, I was encouraging people to amplify empathy, literally. I tend to be quite literal.

Andrew Ross Sorkin (01:32:28):

But was there something that had happened, that you had seen that you said to yourself, I want to say that?

Elon Musk (01:32:35):I think I was talking to some friends, and we all agreed that we should try to amplify empathy. And so I wrote, “Amplify empathy.”

Andrew Ross Sorkin (01:32:46):If you wanted an unvarnished look inside the mind of Elon Musk, I think you just saw it.

Elon Musk (01:32:50):Look, sometimes it’s pretty simple.

Andrew Ross Sorkin (01:32:54):Elon Musk, thank you very, very much for the conversation.

Elon Musk (01:32:57):All right. Thank you.

Andrew Ross Sorkin (01:33:00):Appreciate it very, very much. Thank you. Thank you so much. Here, take that with you for a second.

Elon Musk (01:33:03):[inaudible 01:33:04].

Andrew Ross Sorkin (01:33:06):

I’m just going to say a thank you to everybody who stuck around for what has been a remarkable day. We are so appreciative of everybody who has been with us for so many years coming back to this every year. So thank you, thank you, thank you. I hope you had a great day, and I hope we have an opportunity to do this again. Elon Musk, everybody. Thank you.