NVIDIA's Q3 Earnings Prove That the Bubble Isn't Popping

Josh:
[0:00] Over the last few weeks, markets have been crushed, and people are not only

Josh:
[0:03] asking, are we in a bubble, but they're asking, well, has it already popped? All of the stocks are down. All of crypto, if you invest in that, is down. Everything is down only, and it's becoming a little concerning, until just yesterday, when NVIDIA reported absolute blockbuster earnings results. They added $200 billion to their market cap. They had sales numbers through the roof, and I think it's a very optimistic data point that we're going to get into in this episode. We're also going to talk about a huge partnership between Anthropic and Jensen Huang and a whole cast of characters, including Microsoft. We have Nano Banana Pro, which is the new best image generation model in the world. Also coming out of Google is a new weather prediction model, which is able to predict weather accurately up to 15 days in advance. And we have Meta doing some really cool stuff with 3D generation and really world building that's very impressive. So, super jam-packed week, starting with NVIDIA earnings. Ejaaz, give us the lowdown. What happened? And how big of a blowout really was this?

Ejaaz:
[0:56] It's a big week, with NVIDIA's earnings this week. They announced a record quarter: $57 billion of booked revenue, which is absolutely insane. I think it marks about a 10% jump from last quarter. But that's not the news. Of course, the stock price pumped 5% to 10% on the news, adding $205 billion to their market cap. I mean, listen, you couldn't even see the bottom of this candle. I can't.

Josh:
[1:22] Until I scroll down.

Ejaaz:
[1:24] I mean, that is just an insane thing to see. $200 billion is just kind of insane to think about for a single company. That's like the GDP of multiple countries, if you kind of stack-rank them. But I want to talk about a general theme that's been running through all this, which you highlighted, Josh, which is that people have been super worried that the bubble is bursting, right? Stocks have kind of been down, Mag Seven included. And the idea is, well, maybe we're running out of steam. Maybe people don't want these GPUs anymore. Maybe people don't actually care. And I'm here to tell you that I think NVIDIA is here to stay, and a stock market without NVIDIA succeeding is a stock market that's going to crash. But fortunately, I don't see that on the horizon at all. Now, I've pulled up this tweet from Akash Gupta here, Josh, and I'm going to walk you through the claims people had for the bubble. They were basically saying, listen, whenever anyone buys an NVIDIA GPU, it's only good for about two to three years. Do you recognize what I'm saying here, Josh? It was a similar thesis from...

Josh:
[2:18] Oh, I believe we recorded an episode earlier this week on Michael Burry, who is Mr. Big Short himself and got blown out on the idea that he could short this.

Ejaaz:
[2:26] Exactly. So Michael Burry, famous for his short trade in the 2008 financial crisis, where he bet against the real estate market and won pretty big. He came back last week and announced, hey, NVIDIA GPUs aren't as good as they say. Their life cycle isn't actually that long, and people are marking the GPUs way higher on their balance sheets than they should, which is pumping their stock up falsely.

Josh:
[2:48] The next day, he announced that he is closing his fund because his position was so down bad.

Ejaaz:
[2:53] And this week's earnings report from NVIDIA just kind of proves that there's a lot more muscle behind the words they're using. So one clear thing from NVIDIA's earnings report is this, Josh: they have six-to-eight-year-old GPUs that are being 100% utilized and booked out six months in advance. Now, we saw this for a fact the week before last, when CoreWeave announced their earnings report and actually showed that they had customers booking out NVIDIA's old six-year-old GPUs six months in advance. So the point being made here is, a lot of people think that we're in a bubble, that people are overbuying GPUs without enough demand and just getting caught up in the hype. But the truth is, there aren't enough GPUs, and NVIDIA's revenue this quarter proves that. You know what Jensen's projecting for next quarter, for Q4's revenue, Josh? $65 billion. This is just insane.

Ejaaz:
[3:47] He's not putting these numbers out unless he's absolutely confident. And Josh, one final thing I'll say about this: typically in a quarterly earnings report, you want to be very strict about what you say. You don't make promises. You say, ah, we estimate this, we estimate that. Jensen did the complete opposite, which is not what a CEO normally does. He goes, you don't ever have to worry about this. We are nowhere near this bubble ever forming or popping or whatever the speculated rumors are. NVIDIA is good. Our GPUs are good, and they are booked out months in advance. We're not worried. He was using very casual language, which kind of insinuates that he's super confident. So why shouldn't we be?

Josh:
[4:25] Yep. And I knew this was happening when I saw him out last week with the CEO of like TSMC in Taiwan. They were all getting

Ejaaz:
[4:31] beers and they were.

Josh:
[4:32] having a good dinner. And you don't do that a week before an earnings report, not if you weren't completely certain. One of my favorite lines from the report, he mentioned it within the first 60 seconds: he goes, we have line of sight to half a trillion in revenue in 2026, which is just an unbelievable number. I mean, right now the market cap is $4.5 trillion. So to have a clear line of sight to half a trillion, that is so obviously the right direction that we want to be going in. And some other quotes: sales are off the charts. GPUs are totally sold out. AI is going everywhere, doing everything all at once. We've moved away from industrialization and manufacturing for so long, and so I really love the fact that at this moment, we're re-industrializing the United States. So he's doing it in the US. He's doing it in China. He's doing it all around the world, and they're building all these chips for everyone. They're sold out proactively. They're sold out retroactively, against the Michael Burry short. It's just a total blockbuster report. I mean, there is no end in sight for the growth of NVIDIA, and I think that's what the market's reflecting, where we are now pushing that ceiling. Ejaaz, we talked about the ceiling of where market caps can actually reach. We're creating new industries. We're pushing that roof higher. The most valuable company in the world is continuing to be the most valuable company in the world.

Ejaaz:
[5:37] When I think about that, I'm reminded of that phrase about how humans aren't really good at judging exponential progression. They kind of just think linearly; they can't fathom the numbers being that big until it actually is that big. If anyone's going to tell us, or be a good indicator of, what the truth is, it's going to be Jensen. He's literally selling the picks and shovels to the people who are claiming that they're getting a lot of demand. And if his GPUs aren't being used, he's not going to be getting purchase orders from any of these customers.

Josh:
[6:05] And it was really funny. This week, they had a Saudi investment meeting, and I think there were a lot of funny pictures going around of all the AI leaders in the world getting together. Donald Trump was there. Cristiano Ronaldo was there. Elon was there. Jensen was there. And Elon and Jensen actually went on stage to have a talk. And it's funny that it takes a Saudi gazillionaire coming in to get them both to go on stage and talk. But Jensen was asked, is there an AI bubble? And his answer was three minutes long. We're showing it on screen, but I'll summarize, which is basically: no, because if you believe that AI is going to be everywhere, that means the entire CPU architecture of the world needs to switch over to GPUs, and that transition from CPU to GPU is not even a fraction of a percentage of the way there. There's still a huge amount of growth required to get AI into everything around us at a scale where it's fully immersive. And I think that's kind of where Jensen's at. He's like, we're still so early in this journey of deploying AI that currently it just exists in these black boxes on our computers, in these models. It hasn't even reached out into the real world yet. So we're going to see much more permeation of these AI models as we go. And of course he's going to be bullish. You've got to take it with a grain of salt, because, I mean, all of his incentives align to make it seem like things are great. But based on these numbers, it really does seem like things are pretty great over at NVIDIA. Yeah.

Ejaaz:
[7:18] And if you look at who's sitting next to him, Elon is there. I think Satya was out there as well. If you're wondering why all these American CEOs were in Saudi Arabia, it's because the Saudis just announced that they're going to invest $1 trillion, up from the $600 billion they formerly said they would. That's an extra $400 billion into the US, specifically for AI and data center infrastructure, which means that there's a really bullish sentiment around

Ejaaz:
[7:43] the AI models that are coming out of the US. But Josh, listen, we've just spoken about how bullish Jensen is, but I have this thesis, right? I have this thesis that he's worried about another particular company breathing down his neck. And that company starts with a G. Google, who had a pretty crazy week, Josh. They released their new Gemini 3 model, which is the new state-of-the-art model, best in the game, and by a wide, wide margin.

Ejaaz:
[8:10] And the reason why I think Jensen might be slightly concerned is that it was trained entirely on Google's own TPUs. It didn't use a single NVIDIA GPU. So you can imagine, if you were the king, you'd be kind of worried. And in the breaking news this week that I'm highlighting in this tweet right now, with Satya speaking, they announced a new kind of alliance

Ejaaz:
[8:33] between Anthropic, Microsoft, and NVIDIA. I'm just going to give you a quick breakdown of this, and I want to get your take on it, Josh. So basically, Anthropic is one of the leading frontier model providers, and they recently signed a contract with Google to buy $50 billion of Google's TPUs. Now, that's a pretty big number to spend on a non-NVIDIA chip. So presumably Jensen is kind of like, hey, what's going on here? How can we work closely with you so that you use NVIDIA GPUs? So in this new partnership, they, being NVIDIA, have agreed to invest up to $10 billion in Anthropic in return for them purchasing NVIDIA GPUs exclusively for their future models. But it's not just him.

Ejaaz:
[9:13] Satya Nadella, who runs one of the biggest users and purchasers of NVIDIA GPUs, also wanted to get into the fray and say, hey, I'll invest up to $5 billion in you, Anthropic, if you continue to buy NVIDIA GPUs via our Microsoft Azure cloud compute service. So there's this whole web kind of circling around Dario, who you see on the screen right now, to focus him on purchasing NVIDIA GPUs alone and not segueing off to use Google's TPUs. What are your thoughts? Any thoughts?

Josh:
[9:43] Yeah, it's becoming very clear who the players are, and it's becoming very clear where the linchpins and points of leverage are. We have Anthropic, Google, xAI, OpenAI, and Microsoft as well. And then beneath that, we have NVIDIA, who makes all the chips for all of these people. And you're starting to see, we're going to have to make a visual chart to show on screen for the people who are listening, to kind of show the dynamics between them, because

Josh:
[10:06] OpenAI is kind of working with everybody. Google is existing in their own silo with TPUs. Anthropic, Microsoft, they're all starting to work on these exclusive agreements where they're kind of locked into some things. And we're starting to see, I mean, again, we always talk about Game of Thrones, but this is very much the Game of Thrones: people positioning themselves, being strategic about where they're giving exclusivity, who they're giving it to, for how long, for how much. And we're going to see how these play out. It seems to me like Google really is positioned so strongly in this, because you're starting to see these alliances between all these big companies, but they come on terms of exclusivity, which is kind of a little scary when the world moves so quickly. I'm sure everyone wants their own chips; they're just difficult to make. And if you lock into NVIDIA and suddenly there's a new chip maker with a breakthrough, it just creates, you know, a little bit of turmoil. So it's funny. It's like Jensen is trying to further embed himself into the architecture, the mesh of these companies, the fabric of these companies. And I think that's probably what we're going to continue to see happening: just further embedding himself into this process so you can't leave. These are golden handcuffs, man. The chips are great, they're the most cost-effective, they're cheap, but it's like, you're stuck with me whether you like it or not. Here we go.

Ejaaz:
[11:15] I mean, Dario Amodei, the CEO of Anthropic, is kind of sitting pretty, isn't he? He's got all this money, he's got everyone chasing after him. It's kind of a sweet position to be in. And I think there are rumors that he's raising at a $400 billion valuation for Anthropic in the next round. We'll see if that breaks, which would put it basically on par with OpenAI. Another trend that I find interesting in this, Josh, is just how relaxed these CEOs are about spending tens to hundreds of billions of dollars on investments purely in other companies throughout this year. You know, Jensen alone, I was just looking this up, has spent $50 billion this year just investing in different frontier AI model companies, either through GPU infrastructure deals or through direct investments at the behest of the US government. Trump basically telling him to invest in Intel is one primary example, right? I think he put in like a $15 billion check into that.

Ejaaz:
[12:06] I've never seen people, or CEOs rather, being so relaxed about spending such copious amounts of money so aggressively in a single year, which tells me that this is a really important market for them to win. Jensen's not just sitting pretty at the top of the chain right now, being like, hey, I'm the richest man in the world, we have the most valuable company by a mile. He's thinking, whoever has the winning AI model is going to make us so, so, so much money, and right now I can't pick who that's going to be. I can't go to Google, because Google already has the best model, but it's trained on chips that aren't ours. So I need to go aggressively for all the other bets. And that's what he's doing. Satya, on the other side, has the most golden-handcuffs position ever, because he's, sorry, not golden handcuffs. He's sitting on the easiest, comfiest sofa ever, because he's just broken out of his contract with OpenAI. So he doesn't have to exclusively invest in ChatGPT and OpenAI. He can invest in other companies, and he's making it pretty clear that he wants to.

Josh:
[13:01] Yep. All right. Well, we've talked about Google winning. They're winning more, and they have some cool new announcements that I want to get into this week, starting with Nano Banana Pro. For those unfamiliar, or for those who haven't watched our previous episode on the Nano Banana release, that was the best image generation model in the world, bar none. It's incredible. It is so accurate at text. It is accurate at real-world understanding, the physics we always talk about. It just feels like the first time that you could actually engage with an image generation model that truly understands what you want and can customize itself in a way that really makes sense. Nano Banana Pro seems like the next step in that. Ejaaz, I understand that the text rendering in particular is amazing, where you can actually get text. A lot of people, if you're not familiar, in the early image generation model days,

Ejaaz:
[13:45] Text was very difficult and it often looked like gibberish.

Josh:
[13:48] So it understood what a face looked like and it could make it, but a lot of times the hands would have like six fingers, and the words would kind of look like squiggly lines; it didn't quite make sense. Nano Banana Pro seems to have solved basically all of this. What are the interesting highlights between Nano Banana and the Pro version?

Ejaaz:
[14:03] Yeah, so I think what Nano Banana really wowed people with was the quality of the images it would produce, right? You mentioned that you can kind of add text, and yeah, you're right, you could create a bunch of really cool product images with whatever brand you want on them. But it was really the high fidelity of the image that had people going, what, you're telling me that's AI? That person is AI? No way, right? But it was lacking something, and what it was lacking was that full end-to-end experience, where you could start with an image but then say, hey, I want to create another image that's almost identical, and maybe a sequence of images, can you help me do that? It wasn't really good at that; it was pretty manual, pretty arduous. With Nano Banana Pro, which actually uses the Gemini 3 Pro model on the back end, you now get this cohesive experience when you're creating imagery with Nano Banana. So there are two main features that unlock here. Number one, which you mentioned, is the text rendering: not only can you add text, be it large or minute, anywhere in the image, you can translate it directly into whatever language you want while keeping the brand consistency throughout all the different images, which is really important if you are, say, a product designer and you want to create different concepts for the different markets or countries your product is in. You're able to do that super simply. And then there's this thing called world knowledge, which basically connects Nano Banana, this text-to-image generation model, to all of Google Search's data.

Ejaaz:
[15:29] And the reason why this is important, Josh, is that sometimes the intuition behind what you're asking for doesn't really translate very well. You might be like, hey, can you create a brand aesthetic in the style of, I don't know, what's a cool fashion brand? Oh, God, you've put me on the spot. I don't know, Louis Vuitton. It's probably a terrible, terrible suggestion. Before, it couldn't quite gauge what you were saying if you said a Louis Vuitton or a Dior 2025 type

Josh:
[15:53] Kind of like aesthetic,

Ejaaz:
[15:54] But now you can state that, and it'll get exactly what you mean, because it's trained on Google Search's data, both trending and historical. So those are the two main things.

Josh:
[16:03] I want to see what this looks like. Can we scroll down and see some examples? Oh, yeah. From my understanding, it is like significantly better, particularly at portraits. I saw one that looked really good.

Ejaaz:
[16:11] Yeah. So here we have a side-by-side of a portrait of a man. And as you can see, it maintains the black-and-white aesthetic, but you have different lightings: on the left-hand side of the image, you've got the light coming in from the left; on the right-hand side, you've got it coming from the right. But it maintains the facial composition and structure of this man, which I thought was super cool. Josh, I know you're really, really into photography, so you're probably noticing all these nuances and thinking, hell yeah, if I could just prompt this, it would be easy. Am I right? Yeah.

Josh:
[16:37] I'm pixel-peeping, because the left image is Nano Banana and the right one is Nano Banana Pro. And as I'm zooming in and looking at these, there is a clear difference in detail and clarity in the new image that there was not

Ejaaz:
[16:49] In the original. The right is so much better. Yeah, wow.

Josh:
[16:52] So as I look at these, I see the left clearly looks like AI, because it's kind of smooth; there are a lot of inconsistencies that you would notice against a real photo. The second photo is very real, even down to the pilling you'll notice on the sweater the man's wearing. It's very highly detailed.

Ejaaz:
[17:10] It's with high fidelity. It's very impressive.

Josh:
[17:12] Like that, to me, if I were to see that out in the wild with no context, that looks like a real picture. And I think that's the difference Nano Banana Pro makes: it has crossed the threshold of, okay, this actually looks real. And yeah, here's a second example of another portrait from the side. I'm assuming they used the same prompt. Again, the Nano Banana Pro image has so much more fidelity, so much more detail, and the lighting seems more accurate, whereas the other one is softer. So maybe, just looking at these two examples, if I had to put my finger on one distinction, it's the detail and clarity. That feels like it's been leveled up to the point where it has high enough fidelity to pass as a real picture. And that's kind of a scary thing. It's like, man, this is getting good. Yeah, it's going to get super good.

Ejaaz:
[17:51] And it's probably going to get embedded in a bunch of social media soon, and then we're going to have to figure out what's real and what's not. Anyway, putting that aside, there was another... well, I'm going to use your words on this, because you thought it was really cool. I come from the land of England, where the weather is the worst ever.

Ejaaz:
[18:07] And so I do my very best to just ignore the weather, because I know it's going to be miserable. But for some reason, Josh, you are excited about Google's new weather model, WeatherNext 2. Can you tell me a bit about it?

Josh:
[18:18] This is the world's best weather model. Ejaaz, I have trust issues with nobody more than the weather app on my phone. It always lies to me. It's always inconsistent. It's always wrong. And Google is trying to fix it, and for that, I commend them. What they have done is create the best weather model in the world, and it's all powered by their Gemini 3 AI. You could think of weather like a planet's mood: it's always changing, and small things can lead to huge swings. So for decades, we've used these giant supercomputers that crunch physics to try to give us a more accurate representation of what happens. Well, Google DeepMind learned that you can model how weather behaves just by matching a ton of data together. So they take these patterns over time and run them through a machine learning algorithm, and then they can predict, based on the current state, what the future state will look like. A lot of the way it worked in the past is that it's using this big, weird, mysterious number-crunching algorithm to come up with the probability. What this does is just recognize patterns. It ingests all of those patterns, because it has so much data and so much context, and it pops out what is most likely to happen based on all of the historical weather patterns, with remarkable accuracy. So this to me feels like a really cool improvement, a really cool quality-of-life thing that I like to see from Google and AI in general. It's just improving the things that improve quality of life on a regular basis, right? It's like, I want to know what it's like

Josh:
[19:36] for days out, weeks out. For example, I'm going to Patagonia next week. That's going to be really cool, but I have no idea what the weather looks like, because I can't trust a forecast seven days in advance. And if Google is going to give us a weather-predicting algorithm that lets us see that, that's awesome. And not only that, but predictions for hurricanes and natural disasters. You can anticipate these much further out than a lot of other companies can. And that to me just felt really exciting. It's like, okay, nice, my weather app is getting better. Sick. That's a good use of AI.
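The approach Josh describes, learning a mapping from today's conditions to tomorrow's out of historical data instead of simulating physics, can be sketched with a toy example. To be clear, this is just an illustration of the pattern-matching idea; the synthetic data and the simple nearest-neighbor method here are invented for the demo and have nothing to do with Google's actual WeatherNext 2 architecture.

```python
# Toy sketch of pattern-based forecasting: instead of crunching physics,
# look up what historically happened next when conditions looked like today.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "historical" record: daily temperature with a seasonal cycle plus noise.
days = np.arange(2000)
temps = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1, days.size)

# Training pairs: (conditions today) -> (conditions tomorrow).
X, y = temps[:-1], temps[1:]

def predict_next(current_temp: float, k: int = 50) -> float:
    """Find the k most similar past days and average what happened the day after."""
    nearest = np.argsort(np.abs(X - current_temp))[:k]
    return float(y[nearest].mean())

forecast = predict_next(temps[-1])
print(f"today: {temps[-1]:.1f} C, predicted tomorrow: {forecast:.1f} C")
```

Real systems learn far richer patterns (full 3D atmospheric fields, ensembles of possible futures), but the core idea is the same: replace hours of physics number-crunching with a learned mapping over past data, which is why the forecast comes out in minutes instead of hours.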

Ejaaz:
[20:05] Yeah, I think the natural disaster prevention, or rather preparation, is the really cool thing about this for me. If I can accurately predict that a hurricane is going to come in in a couple of hours, that's a couple of hours of time you didn't know you had to prep against it, get people out, evacuate, all that kind of stuff. So safety-wise, I think this is an awesome, awesome tool. On the consumer side of things, where you were talking about the weather app being kind of crap, I agree with you. And maybe this is kind of naive of me, but I've always wondered, why don't we have this technology yet? Why wasn't this here yesterday? Why do I have to go out when it says it's not going to rain, and then it starts raining? Maybe it's just England. That's probably why I left.

Josh:
Well, the understanding is that it actually takes a lot of compute. It would normally take hours to generate a weather forecast.

Josh:
[20:52] And because of the efficiency improvements brought to us by AI, it takes minutes. And that efficiency improvement leads to a cost improvement, which means it's not reserved just for the high-end weather stations; it's available to everyone, because the cost and time to generate it are much lower. But there's more news, Ejaaz. You were particularly excited about this Meta news that came out.

Ejaaz:
[21:12] I'm excited by this. Can I introduce you to SAM 3D? SAM stands for Segment Anything Model. Josh, have you heard of these SAM models?

Josh:
[21:22] No, this one I missed. We've had a very busy week. I haven't been online much and I don't know anything about it. So please fill me in.

Ejaaz:
[21:28] Okay. So this model basically allows you to take, you're seeing a video on the screen, any 2D image and pick any object within that image. It could be a person. It could be a ball. It could be a bicycle. It could be these buildings you're seeing on your screen right now. And it creates an individual 3D object from it. And you can imagine doing some really funky things: you can move the chair around in your living room, you can switch the buildings around to create a new, different landscape, you can change the color of the chair, you can change the position of the person.

Ejaaz:
[22:01] I just think it's super cool, because something that you and I are super passionate about, Josh, and think will be a big deal in the future, is something called world models, kind of AI plus simulated worlds. It's cool for many different reasons. For example, you could train AI on various scenarios in the physical world so it can understand the physics of the real world without having a body. But then, when it eventually does have a body in the form of humanoid robots, they'll also be able to be trained off these types of models. So this is derived from the same family as those world models. But with these particular Meta models, these new SAM models, you can apply it to any kind of creative pursuit that you have. So if you create a video where you want to promote a new piece of furniture, or a video where you just want to change things around, think Adobe Photoshop, but on steroids for video and images, you can now do that. The really cool part about this technically, Josh, is the astounding resolution you get from the 3D object. You can literally pick any object from a picture, and it will create something almost identical to its size and dimensions in real life. You're looking at a cheese spinning on a plate here, showing exactly what it looks like on the other side. We have never been able to do this before.

Ejaaz:
[23:10] Actually, I lie. You have been able to do this before. It would just take you roughly anywhere between six weeks and six months in a full SFX studio in Hollywood. It's just super awesome to have this kind of thing at your fingertips in an app from Meta. Very cool.

Josh:
[23:28] Yeah, this is sick. If you're not watching this, you need to be watching this: find the video version of the podcast, because it's very impressive. I'm watching this for the first time and thinking, wow, okay, I'm glad Meta is doing something cool. This is the first time I'm watching a video of theirs and going, oh, I like what I'm seeing. This is actually interesting. This is unique and novel. The detail, the fidelity, it's really impressive. As I'm seeing this, I guess my first reaction is that I'm very impressed at how quickly it's able to generate 3D models, which I assume are then fully interactive and embeddable into a virtual space. So this to me feels like a really fun step in the world of creating metaverses, where you can extract lots of elements from the real world seemingly instantly, and do so with high enough fidelity that it actually feels real, feels like part of this digital space. This is huge for game designers. This is one of those things, Ejaaz, honestly, where it's just a tool that I'm sure we're going to see amazing use cases from that I'm not even going to be able to predict on this episode. But this is a really powerful tool, because previously, like you said, it took a long time to generate these 3D models. And even the fastest version, a lot of people don't know this, is the LiDAR scanner on the back of your iPhone, which shoots lasers at objects and can actually scan them in the real world. That takes a long time, and it requires a lot of physical access to actually fully scan the thing. This seems to be doing a lot of predictive work, and that predictive work seems very accurate. So this is cool. I'm

Ejaaz:
[24:53] Excited to see what.

Josh:
[24:54] Types of applications and use cases are done with this. All right.

Ejaaz:
[24:57] For the final bit of news, I've got something kind of controversial for you, Josh.

Josh:
[25:02] What if I,

Ejaaz:
[25:03] What if I told you, you could now use AI to talk to your dead relatives.

Josh:
[25:10] Oh, God. I knew this day was coming. It's just a matter of time. I guess it's today the day. Are we going to do it now?

Ejaaz:
[25:18] Today might be the day. So Callum Worthy, an ex-Disney Channel star, announced his new company called 2wai, spelled 2-W-A-I, which allows you to talk to your loved ones. I kind of don't want to talk about this; I want to show you it. So I'm going to play this quick clip.

Josh:
[25:40] He's getting bigger. See? Oh, honey, that's wonderful. Kicking like crazy.

Ejaaz:
[25:46] He's listening. Put your hand on your tummy and hum to him. You used to love that.

Josh:
[25:56] It feels like he's dancing in there.

Ejaaz:
[25:58] Oh, honey.

Josh:
[26:00] Mom, would you tell Charlie that bedtime story you always used to tell me?

Ejaaz:
[26:03] Once upon a time, there was a baby unicorn. So I'm not going to play the whole video, but basically, what you're listening to is a young mom who is pregnant, about to have a child, and she's FaceTiming her mom, saying, hey, listen, I know you can't be here right now, but, you know, I'm having this kid. Do you have any advice? And she's like, you know, put your hand on your tummy, hum to him, just like I used to do to you as a child. The crazy part is, her mom isn't alive in this hypothetical scenario. She is talking to an AI version of her mom, trained on her mom's memories and thoughts, which says the things she thinks she needs to hear. And they show the progression of her life with this kid, with her mom, who's deceased, talking to the kid and growing up with them. Josh, I just want to get your reaction to this before I talk.

Josh:
[26:52] Yeah, I mean, again, I knew this day was coming. It's weird. I don't love it. I actually don't think this day is here, because this seems very much like a theoretical video. This is not a real, production-ready thing. And as I'm resisting this, I'm coming up with excuses for why it won't work. One of them is the context: this implies that the AI has knowledge of how her mother treated her as a child, and that is hyper-specific, not easily accessible or even known by more than maybe one or two people in the world. So there are a lot of gaps in this, but directionally, it feels correct. I mean, if you watch Black Mirror, or if anyone listening has watched Black Mirror, this very much feels like an episode of that, where I'd love to kind of...

Josh:
[27:35] That's exactly what happens. That's right. Yeah, it is a plot line of one of those episodes. And I'm not really sure how I feel about it. I don't know. It's one of those weird, esoteric questions: if you can stay connected with, I guess, the AI clone of a loved one, is that valuable? Is that really them? Probably not. How much value is there in that versus just leaning on this weird nostalgia that makes you feel connected? I don't really know where I stand on it, but it's really freaking weird. And this feels directionally like the place we're heading soon: as these AI models get more capable, as they collect more data about you, as they have more context, you start to unlock some really weird things. This is one of them. And you know, I'm happy it's not here today. I'm happy this is just a rendering. But this is a very real thing that I think we're going to need to expect soon. And I'm not sure what the societal implications of that are.

Ejaaz:
[28:29] So I think the people that are bullish on this kind of technology would point to therapeutic uses for it. If it allows you to process your trauma and better yourself, then it's good. I think they would argue that if it helps you raise a better-adjusted kid, however they measure that, it'll be good. Obviously, the bad side of this is that it's just not real. It's kind of fake, and death is part of life itself, right? You go through those losses, grieve them, and grow in different ways. This puts technology meddling in some really, really important life stuff, which concerns me; my gut clenches when I see a video like this. The other thing I think about is, if we play out this future over the next decade, more and more kids much younger than us, Josh, are going to have AI companions. We already see Meta pushing AI companions. We see X, and Elon Musk, pushing them through Grok companions. So if I envision a world where all these kids, and eventual adults, are so used to talking to AI companions, what's the difference between talking to a virtual AI companion and a virtual AI version of your mom? You know what I mean? So maybe this is a cultural thing that shifts. I don't like it. I'm not saying I like it. I'm just thinking maybe future generations are going to be more amenable to this than you and I are.

Josh:
[29:49] Yeah, like every technology, it is a tool, and it cuts both ways. The example they're using here is an interesting one: people who have passed away. But I'm sure there will be many examples of people you just do not have access to. Maybe you have a celebrity crush and you want to clone that person, and you can start to see how that gets weird quickly. This instance is seemingly okay, a little creepy, a little weird, but it can get very weird, very quick. So again, this is a tool, and we're going to see how it gets used. This is one narrow example that's going to be interesting to watch play out. But I think that's everything for this week. That covers the gist of it. It was a big week in AI, for the better and for the concerned. But again, a very optimistic outlook: NVIDIA posted amazing numbers, everyone is trying to get more chips, spend more money, and grow these AI models, and it's actually working. We are continuing to scale with the spend that we're doing. Therefore, it seems like we are not quite done with this run yet, even though markets might suggest otherwise. So we're going to have to watch and see. But I think that just about wraps it up for the week.

Ejaaz:
[30:55] Just like NVIDIA's quarterly earnings, which they blew out of the water, so did the Limitless podcast. We had our highest-performing month last month, and you know what we're going to do this month? Have an even higher one. You know what's going to help us get there? If the people listening to this show, you guys, wherever you're listening or watching, subscribe, turn on notifications, and give us a five-star rating, if you think we're five stars. I personally think we are. The holiday season is coming up, Josh. We've got to spread the holiday joy, the prosperity. If you're listening to this and you enjoy us, please help us out by rating, subscribing, even commenting. We love reading your comments. The feedback you give is invaluable, and it actually feeds back into a lot of the types of episodes and ideas that we create in the future.
