Elon Musk talks Twitter, Tesla and how his brain works — live at TED2022

In this unedited conversation with head of TED Chris Anderson, Elon Musk — the head of Tesla, SpaceX, Neuralink and The Boring Company — digs into the recent news around his bid to purchase Twitter and gets honest about the biggest regret of his career, how his brain works, the future he envisions for the world and a lot more. (Recorded at TED2022 on April 14, 2022)

This live interview includes an excerpt from another exclusive, extended conversation recorded a few days earlier at Tesla’s Texas Gigafactory. Watch that full video here: https://youtu.be/YRvf00NooN8

1:22 Tesla Texas Gigafactory interview clip
11:34 Elon Musk live at TED2022
11:48 Twitter, free speech and open-sourcing the algorithm
27:40 Tesla and short sellers
32:02 If you could go back in time and change one thing, what would it be?
33:44 Electric vehicles, manufacturing and sleeping on the floor of the Tesla factory
38:04 “At this point I think I know more about manufacturing than anyone currently alive on Earth.”
39:15 Elon’s “Master Plan” and accelerating a sustainable energy economy
42:36 SNL, Asperger’s, childhood and how Elon’s brain works
46:48 “I was absolutely obsessed with truth.”
49:23 Philosophy, the meaning of life and the “why” of things
51:43 What the future will look like
52:37 “We want to do our absolute best to make sure the future is … something you can look forward to and not feel sad about.”

Transcription

0:02 hello so in just a few minutes um elon musk will 0:08 be joining us here live on stage for a conversation uh 0:14 rumor has it there are a few things to talk about with him um 0:20 we we we will see but um before that i just want to show you something 0:26 special i want you to come with me to tesla's huge gigafactory in austin texas 0:35 so the day before it opened last week the evening before i was allowed to walk around it 0:40 no one else there and what i saw there was honestly pretty mind-blowing 0:46 this is elon musk's famous machine that builds the machine and his view the 0:51 secret to a sustainable future is not just making an electric car it's making a system that churns out 0:58 huge numbers of electric cars with a margin so that they can fund further 1:03 growth when i was there um none of us knew whether elon would actually be able to make it here today so i took the chance 1:10 to sit down with him and record an epic interview and i just want to show you 1:16 a nine an eight minute excerpt of that interview so here from austin texas elon 1:23 musk i want us to switch now to think a bit about artificial intelligence i i'm curious about your timelines and how you 1:30 predict and how come some things are so amazingly on the money and some aren't so when it comes to predicting 1:36 sales of tesla vehicles for example i mean you've kind of been amazing i think in 2014 1:41 when tesla had sold that year 60 000 cars you said 2020 i think we will do 1:48 half a million a year yeah we did almost exactly half a million five years ago last time you came today we um i asked 1:55 you about for self-driving and um you said yep this very year where i am 2:00 confident that we will have a car going from la to new york uh without any 2:06 intervention yeah i i don't want to blow your mind but i'm not always right um 2:12 so talk what's the difference between those two why why why has full self-driving in particular 2:17 been so hard to predict i mean the thing 
that really got me and i think it's going to get a lot of other people is 2:23 that there are just so many false stones with with self-driving um where you think you 2:30 think you've got the problem have a handle on the problem and then it nope uh it turns out uh you just hit a 2:37 ceiling um and and uh uh because what happened if you if you were 2:42 to plot the progress the progress looks like a log curve so it's like yeah a series of log curves so 2:49 uh most people don't like cookies i suppose but it shows the show it goes it goes up sort of a you know 2:55 sort of a fairly straight way and then it starts tailing off right and and and you start there's a kind of ocean getting diminishing returns you know in 3:02 retrospect they seem obvious but uh in in order to solve uh full self-driving uh properly you actually just you have 3:08 to solve real-world ai um you you you know because you said what are the road networks designed to 3:15 to work with they're designed to work with a biological neural net our brains and with uh vision our eyes 3:24 and so in order to make it work with computers you basically need 3:31 to solve real world ai and vision because because we we need 3:38 we need cameras and silicon neural nets uh in order to have to have 3:45 self-driving work for a system that was designed for eyes and biological neural nets 3:50 it you know when you i guess when you put it that way it's like quite obvious that the only way to solve full 3:56 self-driving is to solve real-world ai and sophisticated vision what do you 4:01 feel about the current architecture do you think you have an architecture now where where there is 4:06 a chance for the logarithmic curve not to tail off any anytime soon 4:11 well i mean admittedly these these uh may be an infamous uh last words but i i actually 4:18 am confident that we will solve it this year uh that we will exceed uh you're like what the probability 4:25 of an accident at what point should you exceed that of the 
average person right um i think we will exceed that this year 4:31 we could be here talking again in a year it's like well yeah another year went by and it didn't happen but i think this i think this is 4:37 the year is there an element that you actually deliberately make aggressive prediction timelines to 4:44 drive people to be ambitious and without that nothing gets done so it's it feels like at some point in 4:50 the last year seeing the progress on understanding you that you're that the 4:57 ai the tesla ai understanding the world around it led to a kind of an aha moment 5:02 in tesla because you really surprised people recently when you said probably the most important product 5:08 development going on at tesla this year is this robot optimus yes 5:13 is it something that happened in the development of fourself driving that gave you the confidence to say you know what we could do something special here 5:21 yeah exactly so you know it took me a while to sort of realize that that in order to solve self-driving you really 5:28 needed to solve real-world ai um at the point of which you solve real-world ai for a car which is really 5:34 a robot on four wheels uh you can then generalize that to a robot on legs as 5:40 well the thing that the things that are currently missing are uh enough intelligence enough to tell 5:47 intelligence for the robot to navigate the real world and do useful things without being explicitly instructed it 5:53 is so so the missing things are basically real world uh intelligence and uh scaling up manufacturing um those are 6:00 two things that tesla is very good at and uh so then we basically just need to 6:05 design the the uh specialized actuators and sensors that are needed for a humanoid robot 6:11 people have no idea this is going to be bigger than the car um but so talk about i mean i think the 6:18 first applications you you've mentioned are probably going to be manufacturing but eventually the vision is to to have 6:23 these 
available for people at home correct if you had a robot that really 6:28 understood the 3d architecture of your house and knew where every object in that house 6:36 was or was supposed to be and could recognize all those objects i mean that 6:42 that's kind of amazing isn't that like like that the kind of thing that you could ask a robot to do would be what like tidy up yeah um 6:50 absolutely or make make dinner i guess mow the lawn take take a cup of tea to grandma and 6:57 show her family pictures and exactly take care of my grandmother and 7:02 make sure yeah exactly and it could recognize obviously recognize everyone in the home yeah could play catch with 7:08 your kids yes i mean obviously we need to be careful this doesn't uh become a dystopian situation 7:14 um like i think one of the things that's going to be important is to have a localized rom chip on the 7:21 robot that cannot be updated over the air uh where if you for example were to 7:26 say stop stop stop that would if anyone said that then the robot would stop you know type of thing 7:31 and that's not updatable remotely um i think it's going to be important to have safety features like that 7:37 yeah that that sounds wise and i do think there should be a regular free agency for ai i've said this for many 7:42 years i don't love being regulated but i you know i think this is an important thing for public safety do you think 7:47 there will be basically like in say 2050 or whatever that like a a robot in most homes is what they will 7:54 be and people will probably count you'll have your own butler basically 7:59 yeah you'll have your sort of buddy robot probably yeah i mean how much of a buddy 8:05 do like do you do how many applications you thought is there you know can you have a romantic partner 8:10 lot of a sex inevitable i mean i did promise the internet that i would make cat girls we'll have we could 8:16 make a robot cackle how are you because yeah you know 8:24 so yeah i i guess uh it'll 
be what whatever people want really you know so what sort 8:30 of timeline should we be thinking about of the first the first models that are actually made and 8:37 sold you know the the first units that that we tend to make are 8:43 um for jobs that are dangerous boring repetitive and things that people don't want to do and you know i think we'll 8:49 have like an interesting prototype uh sometime this year we might have something useful next year but i think 8:56 quite likely within at least two years and then we'll see rapid growth year over year of the usefulness of the 9:02 humanoid robots um and decrease in cost and scaling out production help me on the economics of this so what 9:09 what do you picture the cost of one of these being well i think the cost is actually not going to be uh crazy high 9:15 um like less than a car yeah but but think about the economics of this if you can 9:21 replace a thirty thousand dollar forty thousand dollar a year worker 9:26 which you have to pay every year with a one-time payment of twenty five thousand dollars for a robot that can 9:32 work longer hours doesn't go on vacation i mean that could it could be a pretty rapid replacement 9:40 of certain types of jobs how worried should the world be about that i wouldn't worry about the the sort of 9:46 putting people out of a job thing um i think we're actually going to have and already do have a massive shortage of labor so i 9:53 i i think we'll we will have um uh 9:59 not not people out of work but actually still a shortage labor even in the future uh but 10:06 this really will be a world of abundance any goods and services uh will be available to anyone who wants 10:13 them that it'll be so cheap to have goods and services it'll be ridiculous 10:22 so that is part of an epic 80 minute interview 10:29 which we are releasing to people members of ted 2022 right after this 10:35 conference um you should be able to look at it on the ted live 10:41 website um there's public 
interest in it we're putting that out to the world on sunday 10:47 afternoon i think sunday evening but uh but if you're into this kind of stuff um definitely a good thing to do over the 10:52 weekend um now then hearing from elon live there's there's huge public interest in that we have 11:00 opened up this segment to live stream and so we're joined right now by i think quite 11:07 a few people around the world um welcome to vancouver welcome to ted 22 you're joining us on the last day of our 11:12 conference here in a packed theater and 11:17 we've been hearing all week from people with dreams about what the next era 11:23 of humanity is going to be and now arguably the biggest visionary of them all elon 11:30 musk [Music] 11:40 hey elon welcome 11:46 so elon um a few hours ago you made an offer to buy twitter 11:56 why [Laughter] 12:03 how'd you know little bird tweeted in my ear or something i don't 12:08 know by the way have you seen the movie ted about the bear i i i have i have a movie 12:19 so um yeah yeah so was there a question 12:25 why why make that offer oh so um well i think it's very important for 12:31 uh there to be an inclusive arena for free speech 12:37 where all yeah so uh yeah um 12:43 twitter has become kind of the de facto town square um so 12:48 uh it's just really important that people have the both the uh the reality and the 12:54 perception uh that they're able to speak freely within the bounds of the law um 13:01 and you know so one of the things that i believe twitter should do is open source 13:07 the algorithm um and make any changes uh to people's tweets you know if 13:12 they're emphasized or de-emphasized uh that action should be made apparent so you anyone can see that 13:19 action has been taken so there's there's no sort of behind the scenes manipulation either algorithmically or 13:27 manually um but last week when we spoke elon um i asked 13:34 you whether you were thinking of taking over you said no way said i i do not want 
to own twitter it is a recipe for 13:40 misery everyone will blame me for everything what on earth changed no i think i think everyone will still blame 13:46 me for everything yeah if something if if i acquire twitter and something goes wrong it's my 13:52 fault 100 i i think there will be quite a few arrows uh yes um it will it will be 13:58 miserable but you still want to do it why i mean i hope it's not too miserable uh but 14:03 um i i just think it's important to the fun like 14:09 uh it's important to the function of democracy 14:14 it's important to the function of uh the united states uh as a free 14:19 country and many other countries and to help actually to help freedom in the world more broadly than the u.s 14:26 and so i think it's uh it's a 14:32 you know i think this there's the risk civilizational risk uh is decreased if twitter 14:39 the more we can increase the trust of twitter as a public platform and so 14:45 i do think this will be somewhat painful and i'm not sure that i will actually be able to to acquire it 14:50 and i should also say the intent is is to retain as many shareholders as is 14:57 allowed by the law in a private company which i think is around 2000 or so so we'll it's not like it 15:03 it's definitely not not from the standpoint of letting me figure out how to monopolize or maximize my ownership of twitter 15:09 but we'll try to bring along as many shoulders as we right as we're allowed to you don't necessarily want to pay out 15:15 40 or whatever it is billion dollars in cash you you'd like them to come come with you in in 15:20 i mean i mean i could technically afford it um 15:28 what i'm saying is this this is this is uh this is not a way to sort of make money 15:34 you know i think this is it's just that i think this is um this could 15:40 my strong intuitive sense is that uh having a public platform that is maximally 15:46 trusted um and and and and broadly inclusive 15:52 um is extremely important to the future of 
civilization but you've described 15:58 yourself i don't care about the economics at all okay that's that's core to hear you this is not about the economics it's for the 16:05 the moral good that you think will achieve you you've described yourself elon as a free speech absolutist 16:11 but does that mean that there's literally nothing that people can't say and it's okay 16:17 well i i i think uh obviously uh twitter or any forum is 16:23 bound by the laws of the country that it operates in um so 16:28 obviously there are some limitations on free speech uh in in the us and and of 16:34 course uh twitter would have to abide by those uh right rules so so so you can't incite 16:40 people to violence like that that the like a direct incitement to violence you 16:46 know you can't do the equivalent of crying fire in a in a movie theater for example no that would be a crime yeah 16:52 right it should be a crime but here's here's the challenge is is that it's it's such 16:57 a nuanced difference between different things so there's there's excitement to violence yeah 17:04 that's a no if it's illegal um there's hate speech which some forms of hate speech are fine you know i hate spinach 17:12 um i mean if it's a sauteed in a you know cream sauce that would be quite 17:18 nice but so so but the problem is so so so let's say someone says okay here's one tweet i 17:24 hate politician x yeah next tweet is i wish polite politician x wasn't alive 17:30 as we some of us have said about putin right now for example so that's legitimate speech 17:36 another tweet is i wish politician x wasn't alive with a picture of their head with a gun sight over it 17:43 or that plus their address i mean at some point 17:48 someone has to make a decision as to which of those is not okay can an algorithm 17:54 do that well surely you need human judgment at some point no i think the like i said 18:01 in my view uh twitter should um match the laws of the of the country of 18:07 and and and really 
you know that there's an obligation to to do that um 18:14 but going beyond going beyond that um and having it be unclear who's making what changes to who 18:21 to where uh having tweets sort of mysteriously be promoted and demoted 18:27 with no insight into what's going on uh having a black box algorithm uh promote some things and other not not other 18:33 things i think this can be quite dangerous so so so the idea of opening the 18:38 algorithm is a huge deal and i think many people would would welcome that of understanding exactly how it's making 18:44 the decision and critique it and critique like i want to improve what wondering is like like i think like the 18:50 code should be on github you know so then uh and so people can look through it and say like i see a problem here i 18:57 don't i don't agree with this um they can highlight issues right um suggest changes in the same way that you 19:04 sort of update linux or or signal or something like that you know but as i understand it like at some point right 19:11 now what the algorithm would do is it would look at for example how many people have flagged a tweet as obnoxious 19:19 and then at some point a human has to look at it and make make a decision as to does this 19:25 cross the line or not that the algorithm itself can't i don't think yet um tell 19:30 the difference between legal and okay and and definitely obnoxious and so the question is which 19:36 humans you know make make that core i mean do you have do you have a picture of that right now twitter 19:43 and facebook and others you know they've hired thousands of people to try to help make wise decisions and the trouble is 19:50 that no one can agree on on what is wise how do you solve that 19:55 well i i i think we would want to er on this if if in doubt uh 20:01 let let the speech that let it exist uh it would have you know if it's a 20:07 you know a a gray area i would say let let the tweet exist 20:12 um but obviously you know in a case where 
there's perhaps a lot of controversy uh 20:19 that you would not want to necessarily promote that tweet if uh you know so the 20:25 i'm not i'm not saying this is that i have all the answers here um but i i do think that we want to be just 20:33 very reluctant to delete things and and have um just just be very cautious with with 20:39 with permanent bands uh you know timeouts i think are better or uh than 20:45 sort of permanent bands and um but just just in general like i said 20:53 uh how how it won't be perfect but i think we wanted to really uh have 20:59 like so the possession and reality that speech is as free as reasonably possible 21:04 and a good sign as to whether there's free speech is is 21:10 is someone you don't like allowed to say something you don't like 21:15 and if that is the case then we have free speech and it's it's damn annoying when someone you don't like says 21:21 something you don't like that is a sign of a healthy functioning uh 21:26 free speech situation so 21:31 i think many people would agree with that and look at the reaction online many people are excited by you coming in and the changes you're 21:37 proposing some others are absolutely horrified here's how they would see it they would say wait a sec we agree that 21:43 that twitter is an incredibly important town square it is a it is you know where the world exchanges opinion about life 21:49 and death matters how on earth could it be owned by the world's richest person that can't be 21:54 right so how how do you i mean what's the response there is there any way that you 21:59 can distance yourself from the actual decision-making that matters on content 22:05 at in some very clear way that is convincing to people 22:10 well like i said i think the it's it's very important that like the 22:15 the algorithm be open sourced and that any manual uh adjustments be uh identified like so if this tweet if 22:23 somebody did something to a tweet it's there's information attached to it that 
this that action was taken and i i i i 22:30 won't personally be uh you know in their editing tweets um 22:36 but you'll know if something was done to promote demote or otherwise affect uh a 22:42 tweet um you know as for media sort of ownership i mean you've 22:48 got you know um mark zuckerberg owning facebook and instagram and whatsapp um 22:54 and with a share ownership structure that will have mark zuckerberg the 14th still 23:01 controlling those uh entities so 23:08 literally um what's that need we won't have that on twitter 23:13 if if you commit to opening up the algorithm that that definitely gives some level of confidence um talk about 23:20 talk about some of the other changes that you've proposed so you at the edit button that's that's 23:26 definitely coming if you if you have your way yeah yeah and how do you i mean i i think 23:32 i mean one frankly um 23:37 the top priority i have i would have is is eliminating the the spammings and scam 23:44 bots and the bot armies that are on twitter um 23:51 you know i think i think these these fun influence that they're not they're they're they they 23:57 make the product much worse um if i see if you know if i had a dogecoin for every crypto 24:04 scam i saw [Laughter] 24:10 more you know 100 billion dollars do you regret sparking the sort of storm 24:16 of excitement overdose and you know where it's gone or i mean i think deutsche is fun and you 24:23 know i've always said don't bet the form of dogecoin uh fyi yeah 24:29 but i i think i think it's it's i like dogs and i like memes and uh it's got 24:34 both of those and but just on the on the edit button how how do you get around the problem of so 24:40 someone tweets elon rocks and it's tweeted by two million people um and um 24:46 and then then after that they edit it so i'm elon sucks and um and then all those 24:51 retweets they're all embarrassed and how how do you avoid that type of 24:56 changing of meaning so that retweeters are exploited 25:02 well 
i think uh you know you'd only have the edit capability for a short period 25:07 of time and probably the thing to do at upon the edit would be to zero out 25:12 all retweets and favorites okay i'm open to ideas though you know 25:19 so in one way the um algorithm works kind of well for you right now i just i wanted to show you this this is so 25:25 this is a typical tweet of of mine kind of lame and wordy and whatever and look 25:30 at and the amazing response it gets is this oh my god 97 likes 25:36 um and then i tried another one um and uh 25:45 29 000 likes so the algorithm at least seems to be at the moment you know if 25:50 elon musk expanded the world immediately um not bad right 25:58 yeah i guess so i mean that was cool i mean you but but you've 26:03 so help us understand how it is you've built this incredible um following on twitter yourself when 26:10 i mean some of the people who love you the most look at some of what you tweet and they they they think it's somewhere between um 26:18 embarrassing and crazy some of it's amazing i mean [Laughter] 26:24 is that actually why it's worked or why why has it worked i mean i don't know i mean i i'm 26:31 you know tweeting more or less stream of consciousness you know it's not like let me think about some grand plan about my 26:36 twitter or whatever you know i'm like literally on the toilet or something i'm like oh this is funny and then tweet 26:42 that out you know that's that's like most of them [Laughter] 26:48 you know over sharing but um but you are obsessed with getting the most out of every minute of your day 26:55 and so why not you know um so i don't know i just like try to tweet 27:02 out like things that are interesting or funny or you know and then people seem to 27:07 like it so if if you are unsuccessful actually before i ask that let me ask this if i 27:13 don't yeah so how can i say is uh funding secured 27:19 [Music] i i have sufficient uh assets to 27:26 complete the uh it's not a 
forward-looking statement blah blah but 27:33 i have to i mean i can do it if possible right um so um 27:38 and um i mean i should say actually even in the in originally 27:44 the uh with with tesla back in the day funding was actually secured 27:50 i want to be clear about that um in fact this may be a good opportunity to to to clarify that um 27:56 if funding was indeed secured um and uh i should say like why why do i do not 28:02 have respect for the sec in that situation and i don't mean to blame everyone at the sec but certainly 28:08 the san francisco office um it's because the sec knew that funding was secured 28:15 but they pursued the an active public investigation nonetheless at the time tesla was in a 28:21 precarious financial situation and i was told by the banks that if i did not agree to settle with the sec 28:27 that they would the banks would cease providing working capital and tesla would go bankrupt immediately so that's like having 28:34 a gun to your child's head so i was forced to concede to the sec 28:39 unlawfully those bastards and and and now that they they say 28:47 it makes it look like i lied when i did not in fact lie i was i was forced to admit that i lied for to save tesla's 28:53 life and that's the only reason given what's actually happened 28:59 given what's actually happened to tesla since then though aren't you glad that you didn't take it private 29:07 yeah i mean it's difficult to put yourself in the position at the time tesla was under the 29:13 most relentless short seller attack in the history of the stock market uh there's something called short and 29:20 distort um where the barrage of negativity that tesla was experiencing from short sales 29:26 wall street was beyond or belief tesla was the most shorted stock in the history of stock markets 29:33 this is saying something so you know this was affecting our ability to hire people it was affecting our 29:39 ability to sell cars it was uh they were yeah it was terrible um 
29:47 yeah they wanted tesla to die so bad they could taste it well most of them have paid the price 29:54 yes where are they now um 30:01 so so that was a really strong statement i mean obviously a lot of people who who support you i thought would say 30:08 you have so much to offer the world on the upside on the vision side don't don't waste your time 30:14 getting getting distracted by these these battles that bring out negativity and and and make people feel that you're 30:20 being defensive or like people don't like fights especially with with powerful government authorities they'd rather they'd rather buy into your to a 30:27 dream do do you like aren't you encouraged by people just just to edit that 30:33 in that you know temptation out and uh go with the bigger story 30:41 um well i mean i i would say like you know i'm sort of a mixed bag you know i 30:48 mean well you're a fighter and you you don't you don't you don't you don't 30:54 you don't like to lose and and you you you are determined that you don't basically i i mean you are sure i don't 31:00 like to lose i'm not sure many people do um but the truth matters to me a lot really 31:05 like sort of pathologically it matters to me okay so so you don't like to lose if in 31:12 this case you are not successful in you know the board does not accept your offer you've said 31:18 you won't go higher is there a plan b 31:24 there is i i think we i think we would like to 31:31 hear a little bit about plan b 31:37 for it for another time i think another time yeah all right [Applause] 31:44 i that that's a nice tease all right so um 31:50 i i would love to try to understand this brain of yours 31:55 more ilan i i if with your permission i'd like to just play this this is the oh actually before we do that 32:02 um here was one of the of the thousands of questions that people asked i thought this was actually quite a good one um if 32:08 you could go back in time and change one decision you made along the way do 
your own edit button 32:13 which one would it be and why do you mean like a career decision or something 32:18 just any decision over the last few years like your decision to invest in twitter in the first place or your 32:26 anything um i mean the the worst business decision i ever made 32:32 was um not starting tesla with just jb straval 32:38 by far the worst decision i've ever made is not just starting tesla with jb 32:43 that that that's the number one by far all right so jb strabo was was the visionary co-founder who who who was 32:49 obsessed with and knew so much about batteries and your your decision to go with tesla the company as it was meant 32:56 that you got locked into what you concluded it was a weird architecture now this this 33:01 there's a lot of confusion tesla tesla did not exist in any 33:07 tesla was a shell company with no employees uh no intellectual property when i invested but the 33:13 a false narrative has been created by um one of the other co-founders uh martin everhard and i don't want to get into 33:19 the nastiness here but uh i didn't invest in an existing company 33:24 we created a company yeah and ultimately the creation that company uh 33:30 was was done by uh jv and me um and unfortunately there's a someone else and 33:37 another co-founder who has made it his life's mission uh to make it sound like he he created 33:42 the company which is false wasn't there another issue right at the heart of the development of 33:48 the tesla model 3 where tesla almost went bankrupt and i i think you have said that part of the reason for that 33:54 was that you overestimated the extent to which it was possible at that time to 33:59 automate a a factory a huge amount was spent kind of over automating and it didn't 34:05 work and it nearly took the company down is that fair 34:11 uh i mean first of all it's important to understand like what what has tesla 34:18 actually accomplished that is that is most noteworthy um it is not the 34:23 
creation of an electric vehicle or creating electrical vehicle prototype or 34:29 low volume production of a of a car that they've been 34:35 uh hundreds of cars startups over the years hundreds and uh in fact at one 34:40 point um bloomberg counted up the number of electric vehicle startups and they i think they got to almost 500. yeah so 34:47 the hard part is not creating a prototype or going into limited production the the the absolutely difficult thing 34:54 which has not been accomplished by an american car company in 100 years is reaching volume production without going 35:00 bankrupt is the actual hard thing um the last company american company to 35:07 reach volume production without going bankrupt was chrysler in the 20s right 35:12 and and and it nearly happened to tesla yes it but it's not like oh geez i guess 35:18 if we just done more manual stuff things would have been fine of course not uh that is definitely not 35:24 the case uh so we basically messed up 35:29 almost every aspect of the model 3 production line from 35:36 from cells to packs to driving voters motors 35:42 body line the paint shop uh final assembly 35:47 um everything everything was messed up um and i lived in that fa i lived in the 35:54 fremont and and nevada factories for three years 35:59 fixing the that production line running around like a maniac through every part of that factory 36:06 living with the team i slept on the floor 36:12 so that the the team who was going through a hard time could see me on the floor 36:19 uh that they knew that i was not in some ivory tower 36:25 whatever pain they experienced i was i had it more and some people who knew you 36:30 well actually thought you were making a terrible mistake that you were driving yourself you were you were driving yourself to the edge of 36:37 sanity almost and yeah and and that you were in danger of making 36:42 bad choices and in fact i heard you say last week elon that that you because of tesla's huge value now and 
and, you know, 36:50 the significance of every minute that you spend, that you are in danger of sort of 36:56 obsessing over spending all this time to the edge of sanity. 37:01 that doesn't sound super wise. isn't your 37:09 completely sane, centered, rested time and decision making 37:14 more powerful and compelling than that sort of i-can-barely- 37:19 hold-my-eyes-open state? so surely it should be an absolute strategic priority to look after yourself. 37:28 i mean, there wasn't any other way to make it work. there were three years of hell, 37:35 2017, '18 and '19. those three years were the longest period of excruciating pain 37:42 in my life. there wasn't any other way, and we barely 37:47 made it, and we were on the ragged edge of bankruptcy the entire time. 37:52 so it's not like i want pain, i don't like it, um, 37:58 those were three years of so much pain, but it had to be done or tesla would be 38:03 dead. when you looked around the gigafactory that we saw images of earlier, 38:08 last week, and just see where the company has come, i mean, do you feel that this challenge of 38:16 figuring out the new way of manufacturing, that 38:21 you actually have an edge now? that it's different, that you've figured out how to do this, and from 38:27 those three years what won't be repeated, that you've actually figured out a new way of manufacturing? 38:36 at this point i think i know more about manufacturing than anyone currently alive on earth. 38:45 yeah, i can tell you how every 38:51 damn part in that car is made, which, basically, you can if you just live in the factory for three 38:56 years. that was a poignant note or something, 39:03 someone wants to compose a symphony to that expression of confidence, something like that. i have no idea what 39:09 that is. anyway, yeah, every aspect of a car, six ways to sunday, i know it. 39:15 i mean,
you talk about scale. right now you're in the middle of writing your new master plan, 39:21 and you've said that scale is at the heart of it. why does scale matter? why are you 39:26 obsessed with that? what are you thinking? yeah, well, in order to accelerate the 39:32 advent of sustainable energy there must be scale, because we've got to transition a vast 39:39 economy that is currently overly dependent on fossil fuels to a sustainable energy economy, one where the 39:46 energy is, yeah, i mean, we've got to do it. 39:54 so the energy's got to be sustainably generated, with wind, solar, hydro, 40:00 geothermal. i'm a believer in nuclear as well, i think, as we've talked about. 40:05 and then, since solar and wind are intermittent, you have to have stationary storage batteries, and then we're going to 40:12 transition all transport to electric. if we do those things, we have a 40:18 sustainable energy future. the faster we do those things, 40:24 the less risk we put to the environment, so sooner is better. and so 40:31 scale is very important. you know, it's not about press releases, it's about tonnage: what 40:38 was the tonnage of batteries produced, and obviously done in a sustainable way. 40:44 and our estimate is that approximately 300 terawatt hours of 40:50 battery storage is needed to transition transport, electricity, and heating and cooling 40:58 to a fully electric situation. there may be some 41:03 different estimates out there, but our estimate is 300 terawatt hours. yeah, 41:09 so we dug into this a lot in the interview that we recorded last week, and people can go in and hear more of that, but i mean, the context is that that's i 41:15 think about a thousand times the current installed battery capacity. i mean, the scale-up needed is 41:21 breathtaking, basically. yeah. and, um, so your vision is to commit 41:27 tesla to try to deliver on
a meaningful percentage of what is needed, yeah, and to call on others to do the rest, 41:34 that this is a task for humanity, to massively scale up our response to climate change, change the energy 41:40 grid? yes, it's basically: how fast can we scale, and encourage others to 41:49 scale, to get to that 300 terawatt hour installed base of batteries? right. 41:57 and then of course there'll be a tremendous need to recycle those batteries, and it makes sense 42:02 to recycle them because the raw materials are like high-grade ore. so people shouldn't think there'll be 42:08 this big pile of batteries; they're going to get recycled, because even a dead battery pack is worth about a thousand dollars. 42:15 so this is what's needed for a sustainable energy future, so we're going to try to take the set of actions that 42:21 accelerate it and bring the day of a sustainable energy future sooner. 42:27 okay, there's going to be huge interest in your master plan when you 42:34 publish that. meanwhile, i would just love to understand more 42:39 what goes on in this brain of yours, because it is a pretty unique one. i want to play, with your permission, this 42:44 very funny opening from snl, saturday night live. can we have the volume there, actually, please, sorry. 42:50 it's an honor to be hosting saturday night live. i mean that. sometimes after i say something i have 42:57 to say "i mean that" [Music] so people really know that i mean it. 43:02 that's because i don't always have a lot of intonational variation in how i speak, 43:09 which i'm told makes for great comedy. i'm actually making history tonight as 43:14 the first person with asperger's to host snl [Applause] 43:21 and i think you followed that up with "at least the first person to admit it." 43:27 but i mean, 43:32 so this was a great thing to say, but i would love to understand 43:38 whether, you know, how you think of
asperger's, like whether you can give us any sense of, even you as a boy, what 43:44 the experience was, or as you now understand it with the benefit of hindsight. 43:51 can you talk about that a bit? well, i think everyone's experience is going to be somewhat 43:56 different, but i guess for me the social cues were not intuitive, so 44:05 i was just very bookish, and i didn't understand things i guess 44:11 others could sort of intuitively understand: 44:17 what others meant by something. i would just tend to take things very literally, as if the words 44:24 as spoken were exactly what they meant, but that turned out to be wrong. 44:31 people are not simply saying exactly what they mean; there's all sorts of other things that are meant. it took me a while to figure 44:37 that out. so i was, you know, bullied quite a lot, 44:42 so i did not have a sort of happy 44:49 childhood, to be frank. it was quite rough. but i read a lot of books, i read lots 44:56 and lots of books, and so, you know, gradually i sort of understood more from 45:03 the books that i was reading, and i watched a lot of movies, and, 45:09 you know, it took me a while to understand things that most 45:17 people intuitively understand. so i've wondered whether it's possible 45:23 that that was in a strange way an incredible gift to you, and indirectly to many other people, 45:30 inasmuch as brains, you know, are plastic and they 45:36 go where the action is, and if for some reason the external world 45:41 and social cues, which so many people spend so much time and mental energy obsessing over, if that is partly 45:47 cut off, isn't it possible that that is partly what gave you the ability to 45:54 understand inwardly the world at a much deeper level than most people do? 46:01 i suppose that's certainly possible. um, i think there may be some value also from 46:08 a technology
standpoint, because i found it rewarding to spend all 46:13 night programming computers, just by myself, and i think most people 46:20 don't enjoy typing strange symbols into a computer by themselves all night. they think that's 46:26 not fun, but i thought it was, i really liked it. so i just programmed all 46:32 night by myself, and i found that to be quite enjoyable, 46:37 but i think that is not normal [Music] 46:43 so, i mean, you know, i've thought a lot about this. 46:48 it's a riddle to a lot of people, how you've done this, how you've repeatedly innovated in these different 46:53 industries. every entrepreneur sees possibility in the future and then acts to make that real. 47:01 it feels to me like you see possibility just more broadly than 47:06 almost anyone, and can connect it. you see scientific possibility, based on 47:11 a deep understanding of physics, knowing what the fundamental equations are, what the technologies are that are based 47:18 on that science and where they could go. you see technological possibility, and then, really unusually, you combine that 47:24 with economic possibility, like what it actually would cost: is there a system 47:29 you can imagine where you could affordably make that thing? and 47:35 sometimes you then get conviction that there is an opportunity here; put those pieces together and you could do 47:41 something amazing. yeah, i think one aspect of whatever 47:46 condition i had was that i was just absolutely obsessed with truth, 47:51 just obsessed with truth. and the obsession with truth is 47:58 why i studied physics, because physics attempts to understand 48:04 the truth of the universe. physics is just: what are the provable truths of the universe, 48:11 truths that have predictive power? 48:16 so for me physics was a very natural thing to study. nobody made me study it; it was 48:23 intrinsically
interesting, to understand the nature of the universe. and then computer science, 48:29 or information theory, also, to understand logic. 48:35 and, you know, there's also an argument that information 48:42 theory is actually operating at a more fundamental level than even physics. 48:48 um, so, yeah, 48:53 physics and information theory were really interesting to me. so when 48:58 you say truth, what you're talking about is the 49:04 truth of the universe, like the fundamental truths that drive the universe. it's like a deep curiosity about what this universe is, why we're 49:11 here, simulation or not. you know, we don't have time to go into that, but i mean, you're just deeply curious 49:17 about what this is, what this whole thing is for. yes, i mean, i think the why 49:24 of things is very important. actually, when i was, i don't know, 49:31 in my young teens, i got quite depressed about the meaning of life, 49:36 and i was trying to sort of understand the meaning of life, reading religious texts and 49:44 reading books on philosophy, and i got into the german philosophers, which is definitely not wise if you're a 49:49 young teenager, i have to say. quite dark. 49:54 [Music] much better as an adult. and 50:00 then actually i ended up reading the hitchhiker's guide to the galaxy, 50:05 which is actually a book on philosophy, just sort of disguised as a silly humor 50:11 book; it's actually a philosophy book. 50:16 and adams makes the point that it's actually the 50:21 question that is harder than the answer. you know, he sort of makes a joke that the answer was 42. that number 50:28 does pop up a lot. and 420 is just ten 50:34 times more significant than 42.
okay, you know, you can make a 50:40 triangle with 42 degrees and two 69s, 50:46 so there's no such thing as a perfect triangle, or is there? 50:52 but even more important than the answer are the questions; that was the whole 50:58 theme of that book. i mean, is that basically how you see meaning then, the pursuit of questions? yeah, so i have 51:04 a sort of, you know, proposal for a world view, or a motivating philosophy, which 51:10 is to understand what questions to ask about the answer that is the universe, and to the degree 51:17 that we expand the scope and scale of consciousness, biological and digital, 51:22 we will be better able to ask these questions, to frame these questions, 51:28 and to understand why we're here, how we got here, what the heck is going on. 51:34 and so that is my driving philosophy: to expand the scope and scale of consciousness, that we may 51:39 better understand the nature of the universe. elon, one of the things that was most 51:44 touching last week was seeing you hang out with your kids. here's, if i may, 51:52 it looks vaguely like a ventriloquist dummy there [Laughter] 51:57 i mean, how do you know that's real? so that's X, and it 52:04 was just a delight seeing you hang out with him. and what's 52:09 his future going to be? i mean, i don't mean him personally, but the world he's going to 52:16 grow up in. what future do you believe he will grow up in? 52:22 well, i mean, a very digital future, 52:32 a very different world than i grew up in, that's for sure. but i think we obviously want to do our 52:38 absolute best to ensure that the future is good for everyone's children, and 52:44 that, you know, the future is something that you can look forward to and not feel sad about. 52:51 you know, you want to get up in the morning and be excited about the future, and we should fight for the 52:56 things that make us
excited about the future. you know, the future 53:01 cannot just be one miserable thing after another, solving one sad problem after another; there have got 53:07 to be things that get you excited, that make you want to live. 53:12 these things are very important. and it's not as if it's a done deal; 53:19 it's all to play for. the future may still be horrible; there are scenarios where 53:24 it is horrible. but you see a pathway to an exciting future, both on earth and 53:32 on mars and in our minds through artificial intelligence and so forth. i mean, in your heart of hearts, do you really 53:39 believe that you are helping deliver that exciting future, 53:44 for X and for others? i'm trying my hardest to do so. 53:53 i, you know, love humanity, and i think 53:58 that we should fight for a good future for humanity, and i think we should be optimistic about the future and fight to 54:04 make that optimistic future happen. 54:09 [Music] i think that's the perfect place 54:15 to close this. thank you so much for spending time coming here, and for the work that you're doing, and good luck 54:21 with finding a wise course through on twitter and everything else. all right, 54:26 thank you. hey guys. 54:34 [Music]
