On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:
- AMD Advancing AI Event
- NVIDIA & MediaTek Getting Into AI PCs?
- Amazon Generative AI Summit
- Apple M4 Performance Leaks
- Tesla Robotaxi Event
- Commvault SHIFT Event
For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.
Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.
Transcript:
Patrick Moorhead: It's Friday morning and that means The Six Five weekly show is back. Dan and I may have landed at midnight last night. We are up, fully caffeinated, partially hydrated. I got all-time low scores on sleep and readiness, but I feel a lot better than my devices say I should feel, Dan.
Daniel Newman: Yeah. That’s one of the things I always tease you about. It’s like if the thing says you feel bad but you feel okay, are you supposed to feel bad? It’s kind of like if you can see the restaurant on the right but the navigation tells you to keep driving, do you turn or do you just keep going, Pat? And I don’t know. That’s why I struggle with these things.
Patrick Moorhead: I know. So yeah, you and I were on the road. We were at the AMD event. We have a lot of travel coming up, but a lot of really interesting conversations out there. We picked six topics from, I don't know, we probably looked at 10 or 15 we could have potentially discussed, but we are going to be talking about AMD's Advancing AI event, Nvidia and MediaTek potentially getting into the AI PC SOC space. Amazon had a generative AI summit and made some announcements. This wasn't an AWS event, but it was about what Amazon is doing related to gen AI, probably delivered by a lot of AWS technologies. There were some performance leaks of the Apple M4 chip inside of a MacBook Pro from some Russian YouTubers. We're going to-
Daniel Newman: That sounds safe. That sounds safe. Let’s-
Patrick Moorhead: Pretty much. Tesla had its Robotaxi event last night in Los Angeles at a studio. We're going to talk a little bit about the implications of that, and then finally we're going to talk about Commvault's SHIFT event. We had Commvault's CEO on to do a Six Five episode, check that out. But they also made some… We're going to break down the announcements, talk about the big picture.
Daniel Newman: Hey, Pat, I just caught this, but about half of our topics this week, we did a CEO Six Five.
Patrick Moorhead: I know we did. Yeah, we talked to AWS CEO Matt Garman, the CEO of Commvault, and we talked to Lisa Su, right.
Daniel Newman: Yeah.
Patrick Moorhead: Sanjay was a great conversation. I think he was actually in London when we were talking.
Daniel Newman: Yeah, around the world. Around the globe. The Six Five, we are there for all of you-
Patrick Moorhead: We are.
Daniel Newman: … right. I mean that’s what we do. I mean, this isn’t stock advice, but we are there for everybody.
Patrick Moorhead: Yeah. And don’t take anything we say as investment advice. In fact, seek a professional if you want to do that.
Daniel Newman: Just like Pat, even if the ticker indicates the stock is up, you should probably check for yourself.
Patrick Moorhead: That’s right.
Daniel Newman: ‘Cause it’s like the watch or the ring or the other watch or the other ring. Do you wear a band too? One of those-
Patrick Moorhead: I do. I’m going to work out in a couple of hours and I do that just ’cause it’s precise heart rate. Wrist worn stuff is off by 5, 10 beats sometimes.
Daniel Newman: Okay. Is the ring better or the wrist?
Patrick Moorhead: Wrist.
Daniel Newman: Really? Okay.
Patrick Moorhead: Yeah. Yep, yep, yep.
Daniel Newman: Okay. I was thinking about having an implant.
Patrick Moorhead: We can get that. You might want to talk to the folks over at Neuralink, another company by Elon Musk.
Daniel Newman: He’s pretty cool.
Patrick Moorhead: I’m sure they could figure that out.
Daniel Newman: Despite popular sentiment these days, he’s doing some pretty cool stuff.
Patrick Moorhead: He really is. Hey, let’s dive in. Dan, you and I went to the event out in San Francisco at Moscone. Big intersection. I mean, everybody wants to talk about data center AI, we got that and a lot more.
Daniel Newman: Yeah, I think there's so much here. I'll chat a little bit about some things that caught my attention, and I'm sure there's going to be plenty of oxygen left in the room for The Six Five, so look out for all of those to drop, because we talked to the heads of the PC business, the client business, we talked to the heads of the data center business, the GPU business, and of course, as we mentioned, we talked to Lisa Su, so there'll be a lot of in-depth stuff there. But let me maybe back off and give a little bit more of a broad observation. Coming into this Advancing AI event, the second year of it, you could absolutely be certain that all eyes were on the GPU. Everybody wanted to know what's coming, which had already kind of been out there, it's been talked about for several months. But are there any changes? Are there any material updates, any new cloud partner wins? The stock sort of went a little negative yesterday. I think people, and Pat, I think you said this very astutely, "People wanted to maybe hear about a big AWS win."
Of course, Lisa came right out and this was the tweet. The tweet was, "$500 billion TAM." She just went from, I think, 400, which everybody had said was sort of the most bullish forecast on Wall Street, to 500, and now a 60% CAGR from '23 to '28 for AI accelerators. This is just chips, people. This is the volume. So when you hear things like TSMC is sold out for two years, Blackwell is sold out for all of 2025, there's no HBM3 or HBM3E memory out there, this is what's going on. There is an insatiable amount of demand, but there's also still this kind of, well, who's going to get market share? Because we all kind of know right now where Nvidia sits, and Nvidia's got 90 to 93 percent, depending on which data set you look at.
And there’s even some speculation that with Blackwell they’re gaining market share at least as a matter of revenue because of pricing. We know Intel is starting to shift down to three then. So AMD had a lot of success with their MI300, and in fact, she even broke some very interesting news about what AMD has been able to accomplish with Meta running the 405B model on the MI300 exclusively, by the way, another big breaking moment from the event. But as the event went on, what it was all about was the 325 and then the upcoming 355, the new architecture, how this is basically providing more memory ’cause memory is the big need, especially for training.
And then the comparisons came out, which of course everybody's eyes were on too. And this was a really interesting juxtaposition for AMD, because you've got the Hopper series, two years old, the H200 announced a year ago and released earlier this year, and you've got this new part which is just starting to ship in Q4, and at the same time you've got Blackwell coming out. So you've got this kind of comparison going on right now: should they compare it to the part that came out earlier this year? This is such a fast-moving market, Pat, with the annual cadences now, is a part from early this year the right thing to compare to? You and I both had some pretty blown up tweets that got a lot of responses. I was getting a lot of pushback for making that comparison. But also at the same time, what else can they do?
Right now, that is what is out there. That is what is being used. So that's interesting. But I think, Pat, this leaves a lot of work for Signal65 over the next year to start doing some really significant assessments. The MI part was really interesting. Pat, I'm going to just talk about one more thing and I'm going to leave the rest. I'm going to leave networking and client to you, plus anything else you want to cover. But I thought the EPYC announcements were really promising. The company was able to really make clear that they have been the undisputed winner in cloud. I mean, they just marched up Google, they marched up Microsoft, they had Meta up on stage, and these companies were up there talking about how they've really gone all in on AMD. I mean, Meta is probably the most symbiotic partnership of all of them. They seem to be co-developing, working very closely, and with Meta, I think I'm hearing as well, this is speculation y'all, but maybe as high as 80% of the data center CPU at Meta is AMD.
Huge wins, but of course it's also just having that overall hyperscaler business. Satya was incredibly bullish about the company. He came on in a remote interview. I don't know if he ever shows up anywhere. He's the anti-Jensen; Jensen shows up everywhere, but Satya never in person. But the EPYC business has just been really, really strong, and some of the things I learned yesterday about the head node, about the impact it can have on higher throughput and efficiency, that was really interesting too. I mean, we heard about double-digit performance gains from the EPYC head node on some of these AI systems, so that was great. There's a ton more, but 34% market share at this point, which basically means they're crushing it in the cloud, single digit in the enterprise. So I see a big opportunity in enterprise for AMD, but that's a totally different muscle. So I need to see it, but I'll pass this over to you, 'cause like I said, there was way too much news to try to do in a five-minute bit.
Patrick Moorhead: Yeah, this will be our longest topic and I think it deserves it. So let me fill in some of the cracks here. AMD came out with CPU, GPU, and AI client solutions for business, but they also entered a new market, and that's the back-end network. So there's the front-end network and the back-end network. The back-end network is connecting all of the GPU nodes together, and you need a different type of performance and different types of protocols to make that happen. One of the biggest reasons training runs don't complete, and that's bleeding over into inference latency, is a breakdown in the network. By the way, the second reason stuff doesn't work is GPUs burning up. And what's interesting, this market has traditionally been dominated by companies like Broadcom. Interestingly enough, the tweet that got the most views was the tweet about the AI networking card that they brought out.
I didn't expect this one. Even though I know they have Pensando with the DPU, that's for front-end networks, and I didn't think that that type of architecture could be high performance enough. Of course it's programmable, that's what it is from the start, but it's an ASIC-based design, so you're going to get performance. We're going to have to see how this one pans out. This could be a sleeper, but look at, first of all, the market need for more reliable solutions, and the fact that the biggest reason training runs fail is a breakdown in networking. There are known-knowns and known-unknowns that you try to solve for clients. This is a known-known issue and AMD is coming into that market. We had a very interesting conversation about that on The Six Five, so we're going to have to see how that pans out. I'm going to talk a little bit competitively.
What do we know? What don't we know? Let's do AMD versus Intel in the data center. So like you said, AMD is rocking 34%. That's a peak. When I was at AMD, Opteron was 27%, and that was the peak. There really wasn't a cloud business then, there was a web business. In fact, Google was our biggest customer for Opteron, followed by HPE, Cisco, and a little bit of Dell at the end. So I believe when the smoke clears, AMD will have put a little bit more distance between itself and Intel. Now, when it comes to, let's say, doing inference, Intel has the clear lead with its AMX accelerators. AMD's biggest challenge is going to be in the enterprise. They have single-digit enterprise share, so they're doing really well in the hyperscalers, they're doing really well in enterprise SaaS, they're doing really well in tier two CSPs.
But when it comes to enterprises choosing instances in the cloud, there's not a ton of that. And with single-digit market share in the enterprise, with what looks like a vastly superior product, AMD has to get on the stick and invest in enterprise sales and enterprise marketing, in terms of collateral, in terms of POCs, because if they give Intel some oxygen, Intel is going to come roaring back in enterprise. They have, what, 91% market share there? But it establishes them, gives them a point to pivot off of. And, Dan, we haven't seen the lift in enterprise from AI yet; when that hits, and if Intel has an even more competitive product going into Aetina, it could spell an issue for AMD. On AMD GPU versus Nvidia GPU, it's murky. I mean, Nvidia came out with new numbers for more finely tuned software stacks that they dropped. Nvidia came out with their numbers, and their numbers were not done by a third party, no third-party attribution, by the way, but that is Nvidia. So I don't know, right. I have no idea. What I can say is that what AMD showed with its MI355X, which would be out closer to the second half of '25, is a vastly superior product to its predecessors in the MI300 series, including the 325.
So first of all, it's higher efficiency, it's higher performance, and it supports lower bit-rate models a lot better. There are some pretty impressive four- and six-bit numbers that came out of there. One thing I think AMD did a great job on is showing the type of scale that it operates at. One of the head turners for me, and I had 45,000 people tune into this on X, was that there are 1.5 million EPYC processors inside of Meta, and Meta Llama 405B runs exclusively on MI300X for all live traffic. And I think what they mean by live traffic is not training, so that would be inference. Makes sense, but it shows the scale. I was on Yahoo Finance yesterday and the first question they asked me was, "Why did AMD's stock go down?" You had addressed this a bit; people were looking for a knockout kill. They were looking for a new customer like an AWS or a Google. First of all, AWS, if it's going to do anything with AMD, it would be at re:Invent in December, which you and I are going to be-
Daniel Newman: I agree.
Patrick Moorhead:… attending. And Google, they’re putting a ton of effort into TPU and NVIDIA GPU, they might not have the resources. But I think at the end of the day, all the hyperscalers will be an MI customer. Nobody wants-
Daniel Newman: If nothing else, for merchant silicon, right? Just to have the offering, 'cause if AMD starts to get traction, don't they want to make it available? I mean, that would be my take. Plus-
Patrick Moorhead: Yeah, and also there's concern about the power that NVIDIA has, and NVIDIA is making all the money here. Now, the hyperscalers are the second-biggest beneficiaries, where for every dollar they invest in GPU, they can charge $8, compared to a dollar in for CPU, where they can charge $3. So they are making money, but they're not making it hand over fist like NVIDIA is. So anyways, a lot of conversations. Check out the videos; they come out today and next week. We think you'll enjoy them. We asked some pretty tough questions and I feel like we got some really good answers.
Daniel Newman: Just before we jump back, 'cause like you said, I wanted to put this on the record: I think there is a significant opportunity for AMD just based on what we heard yesterday about the CPU-GPU combo. And I think you said something really poignant there, "Hedging is going to be a trend in '25," and these companies are hedging in two ways. They either hedge with Gaudi or AMD, or they hedge with building their own. And frankly, my take is they're going to do both. There's absolutely no way this sort of monopolistic AI control is going to remain. That doesn't mean Nvidia's not doing great things. It just means these companies have to diversify, the same way they did with compute. It's going to be the same. It's just going to move faster with AI. Sorry, I had to put that on the record on 10/11, my mom's birthday, happy birthday mom, because I think that's what a lot of people might be missing about why this is an opportunity.
Patrick Moorhead: Yeah. And by the way, Nvidia needs to be very careful in the way it handles its customers and its ecosystem. The industry is hoovering up evidence against Nvidia and shoveling it into the DOJ as we speak. I mean, I haven't personally seen the threats. Nobody from Nvidia has ever said, "You better not say that," or anything like that. But Nvidia needs to be very careful in how it's handling its competitors and its customers even more. Hey, great topic. Let's move into the next topic. There's a rumor going around out there; Tom's Hardware wrote an article about the potential for Nvidia and MediaTek getting into AI PCs. Now, first off, Nvidia is already in the PC ecosystem in the form of GPUs for notebooks, desktops, and workstations. But what this is is the Copilot+ space for Windows. I do believe this is absolutely true, but it has not been substantiated by either Nvidia or MediaTek. MediaTek did say that they were working on something, but with no details out there.
How could this stack up? First of all, it's a little murky on whether it's an Nvidia-plus-MediaTek part only, or whether it's an Nvidia-and-MediaTek solution plus a MediaTek-only solution. But here's how I can see this playing out. I can see Nvidia targeting, first of all, a huge GPU, okay: all-day battery life gaming, all-day battery life workstation, 'cause today if you want to do gaming and workstation, aside from the casual stuff, this thing lights up like a Christmas tree. It's sucking an inordinate amount of power, has a power brick bigger than my head and ego, it's gigantic. But if you can tightly couple CPU, GPU, memory, and NPU together, you can do some amazing stuff. I do believe that there is an industry effort afoot to attack these two markets differently. And let's just say it would be a MediaTek-plus-Nvidia combo. You could have MediaTek as the SOC aggregator. They're pretty good at that, by the way. They crank out a ton for smartphones and tablets. And on the MediaTek-only side, doing more of a vanilla design with a lot of ARM IP in there, an ARM GPU, pretty much an entire ARM solution, could be a lower end.
And it's interesting, if we dial back, what would that mean? That would mean that there could be four, maybe five different solutions for the Windows Copilot+ market. And it's interesting, I talked to an OEM who will remain nameless, and I'm like, "How on earth is the industry going to support all of these?" He said, "Pat, I don't care how it ends up. I'm just glad to have more than one player out there, which is what I had for years." And my final comment, the thing that I'm thinking through is, is there enough? There are 250 to 300 million units out there in the market. How can five players afford to be out there, right? The OEMs certainly can't afford to do designs from everybody, so they're going to have to pick different types of platforms. So I think it could be a very bloody second half of '25 and '26 as AMD, Intel, Qualcomm, MediaTek, and Nvidia are banging each other in the head. Now, there are different segments, but the question is, are those segments profitable enough? And I think the people who will win in the end are the ones that can leverage IP and leverage designs from what they were doing somewhere else and then come up with a complete solution.
Daniel Newman: So do you think, just want to double click on something you said. Do you think there’s a Nvidia MediaTek kind of combo and then there’s an all in Nvidia coming later, or how do you rationalize that?
Patrick Moorhead: Well, Nvidia branded, but with overall SOC design from MediaTek, could be the combo. You’ve got a giant Nvidia GPU with tensor cores in there like an NPU, right. But might be called a tensor NPU or something like that. So it’s like a collaboration.
Daniel Newman: Yeah, that’s how I saw it too. I was just making sure I understood ’cause it really becomes kind of four bigger options. I guess you could sort of see how this could splinter even into five at some point. But it’s interesting Pat, you start doing the number crunching and you see a situation in which all the biggest players could actually lose share and the market could grow.
Patrick Moorhead: Right.
Daniel Newman: It's really interesting, 'cause I don't think we've seen an inflection like this in a long time. I mean, the Qualcomm wrinkle is already there and it's already interesting. I think the biggest winner of this is ARM, if I had to give a winner in this, because they're going to now have two very competitive players in the market using ARM IP in a 300-million-unit type of volume, where they're going to start taking share. Now, Rene talked about, in three or five years, 50%, I think he threw out a number, I don't remember the exact quote, but 50% of AI PCs will be ARM-based. I think this was in the back of his mind when he was giving this number. It's not that Qualcomm is not going to take some share. They've done a good job. I mean, I'm on a Qualcomm device right now as we're doing this.
Patrick Moorhead: Me too.
Daniel Newman: So this is not an indictment in any way of what Qualcomm is doing. It's just that to make a meaningful change in this market, there's going to need to be a lot of blunt force. And it's funny, but I think Nvidia has a lot of power. We talked about this in the last one, but Pat, I mean, look at companies that need other Nvidia stuff. By the way, like I said, this is not a sort of indictment, but how could they kind of bundle together people's demand for one thing to push people to buy the other thing? I hope they wouldn't do that. But you kind of see how that can play out. I had a few comments yesterday about Satya getting on stage with AMD and people saying, "Well, I guess they're not going to get any more Nvidia." That's the perception though in the market right now, the loose sentiment in the market, which is super interesting.
But as I see it, I mean, you could see how this would be very powerful, and they could probably really accomplish something that has both that AI PC functionality and gaming, because that's something that Nvidia really knows how to do, and they could sort of lean into both without so much compromise. Where I still see the strongest market case right now near term for AI PCs is these lightweight white-collar enterprise devices. You know, I'm on the road traveling. What I like about my device is I go four or five days without having to worry about plugging it in. That's kind of the big shift so far. We're still dealing with feature creep. Is there enough feature in the new devices that would really make an average worker or an average company go out and buy all these and replace all these? Lisa Su said it well yesterday when she was talking about, really, you said this too, "future proofing." It's not so much about knowing what you're going to do with it now, it's about what you might do with it in the future.
So that's another kind of interesting thing. But, Pat, we knew this was happening. Like I said, I'll leave it to… I don't know how much weight the MediaTek part carries in all of this. I mean, I think over in Asia they have a lot more clout than they do here domestically. But I do think putting an NVIDIA badge on a device right now, at the absolute peak of Nvidia insanity, could create some definite demand pull. And of course, like I said, there's a little bit of a debate still, I think you hear this too, about kind of NPU versus GPU. And of course, if they can, like you said, use tensor cores to kind of make the GPU still a little bit more of the center. A lot of stuff still, like, the NPU can only do so much, and that's why there's a lot of debate about how impactful 45 versus 48 versus 50 TOPS really is, versus whether a more efficient GPU could do a lot of these things too. So these are all-
Patrick Moorhead: Those are great points. I do want to close this out just by saying Nvidia created AI on the desktop and on the laptop, and they did it on a GPU, and there are hundreds of applications that leverage it already. I just want to make that clear. The only thing we're debating here is an integrated SOC that gets the Copilot+ stamp. So great conversations here. Let's jump into the next topic. Amazon made some pretty cool disclosures, particularly across shopping and logistics. And while it wasn't related to that, we did have a great interview with AWS CEO Matt Garman, and while the company tries to say AWS is separate from the rest of Amazon, AWS technologies are 100% being used across Amazon.
Daniel Newman: Yeah, I think we could cover this as kind of a what's up with Amazon, what's up with Amazon and AWS and AI? Maybe I'll talk a little bit about the announcements. We could chat a minute about the conversation with Matt Garman; I think that, to some extent, needs to be shared so people can see it. But first of all, maybe something worth noting is that these big companies are probably the best customer zeros on the planet for a lot of this AI technology. And so when you look at Amazon, of course you have the AWS business and what they're building that enables other companies, and then there's what they're building that can be deployed across their own Amazon footprint. That's really interesting. So yesterday they made a bunch of AI disclosures. One is about vision-assisted package retrieval; they're calling it VAPR.
It's basically something that they think can reduce a route by 30 minutes, and they're going to, I believe, deploy it in a thousand vans to start off. But this is the number one growth challenge: when you've gone from two-day Prime to same day, to a few hours, to minutes, with the exception of these packages dropping out of the sky, how do you get more efficiency? Now we're down to the point where it's literally minutes that make a difference, because they're trying to get stuff to us so quickly. So they're working on using AI to basically optimize routes using vision assistance, and they're going to deploy that out to about a thousand vehicles this year and try to save, on average, about 30 minutes per route. Now, when you're doing millions and millions and millions of deliveries, you've got to think about the efficiencies that this gains.
That's super interesting. And of course, Pat, you and I have been, I think in San Marcos together, we went to one of their fulfillment centers. The amount of technology in these places is astronomical. Now, it's not the humanoid robots that we'll talk about later walking around these facilities, but the way they use robots for pick, pack, and ship. Like, Amazon, while it does employ about a million people and they have tons of seasonality and you still need a lot of people, they have been on the forefront of investing in technology to basically enable this expedited service. So all of us sit on our couch and when we need something, we just click, we order it, and voila, it shows up at our door. There's a lot of technology behind it. And right now, these new fulfillment centers, robot-driven, state-of-the-art AI, new logistics systems, they're calling it their advanced fulfillment center, and they believe they can take out 25% of costs.
So efficiency, as I've said, is the prune-to-grow first part of AI, and this is another great example of how it's being done in factories. And then of course they talked about their new AI shopping tools. So this is not surprising, and by the way, it ties very nicely to Amazon's ads business. I've been sharing a little bit of content about this on my tweeter lately, is it tweeter? Can I call it tweeter? This is a 40-plus billion dollar business now. Ads are growing, but they're also able to use AI to put the right product in front of people and get the best pricing. So Amazon kind of pushed out that they're using more visual shopping tools, more personalized recommendations, and they are saying that they helped customers save a billion dollars. Now, I don't know if this exactly becomes the Elon Musk office of government efficiency here for corporations; if they're saving you money, the more you buy, the more you save. I think this is the new CEO verdict of saving you more, as long as you spend more.
But they're using tools, implementing their technology, and of course you can be certain this is all built on AWS's cloud, which takes me quickly to, look, this week Patrick and I had the opportunity, which is rare, to sit down with AWS CEO Matt Garman. This was a really candid conversation, Pat. I think it was great because Matt let us really ask him the hard questions. As analysts, we have to be the arbiters of what's going on in the market. We can't just accept the talking points. We have to kind of get underneath it a bit. And there was a lot of speculation about Amazon being behind in gen AI, Bedrock coming late, models coming later, not having the technology, and then of course growth slowing and not having much comment in any of their recent earnings about AI's impact.
We hit Matt on all these things-
Patrick Moorhead: That’s right.
Daniel Newman: And then of course, you hit him up on our favorite topic, build the silicon or buy the silicon, and there's a lot on that one too. Matt gave us some great feedback there. I'm going to leave it a little bit opaque here 'cause I want people to watch the episode, but it was a great conversation, and I know you're going to have some more time with Matt soon. But thanks, Matt. Thanks, AWS, for making the time, and everyone out there, look for us to drop that pod. Pat, I'm going to pass this topic back to you.
Patrick Moorhead: Yeah, you pretty much drained this topic, which is fine. That’s why I gave you first-
Daniel Newman: There wasn’t that much there. It wasn’t a big one.
Patrick Moorhead: You pretty much hit everything. But some of the announcements that came out: vision-assisted package retrieval, the advanced fulfillment center technologies, AI shopping tools, again, all delivered by AWS technologies. And one of the things that I think is important is that, you know, AWS, first of all, did not partner with OpenAI at the outset. And I think when it comes to how AWS has historically made its decisions, it's all about scale. What do we have to create where we can have the ultimate scale? And while I do think that Microsoft got the upper hand on this thing, I do think that it's a marathon and not necessarily a sprint, and the big competition is who's going to get the share-shift customers. And this is never binary. Everybody uses at least two cloud providers, but it's more on an application basis. Those new generative AI infused applications, who's going to get those? And I think time will tell if there is share shift. Competition's good.
Daniel Newman: Yeah, no, totally.
Patrick Moorhead: Yeah. So let’s move to the final topic here-
Daniel Newman: No, we got three more.
Patrick Moorhead: Oh, sorry.
Daniel Newman: I’m just telling you because we’re not that fast. We just aren’t. But I want to keep you awake for the whole thing.
Patrick Moorhead: No, I appreciate that. So a couple of Russian YouTubers came out with what were supposedly benchmarks from an M4-powered MacBook Pro. It came in full packaging, and all of the packaging looked like older packaging, so there's some dispute about it, but the figures pass muster. Right now, there are only CPU benchmarks out there. It does not look like there is anything related to the GPU or the NPU yet. I don't know if they're slowly trying to torture people with this to get them to tune in. I guess if you're a YouTuber, it's a smart thing to do to jack up your subs and views. But what it shows is nothing extraordinarily surprising: it does very well on the CPU, with single-threaded scores higher than AMD, Qualcomm, and Intel. By the way, the M3 performed well on single-threaded Geekbench as well as Cinebench.
Where it gets, I guess, interesting is the multi-threaded benchmarks, Geekbench and Cinebench. Apple got an extraordinary lift here, where, just to be brutally honest, they were somewhat of a… I mean, they were a complete laggard, right. You had Qualcomm and AMD pretty much running circles around the M3, and with the score of 971 for Cinebench, AMD is still in the lead by a sizable margin. Go AMD. It's very similar to the Qualcomm X Elite, and what we know about the Core Ultra 288V: it shines mostly in single-threaded, but it gets hit pretty substantially in multi-threading because it's an eight-core solution versus 12 with AMD. So nothing surprising. I mean, I think for me, the MT scores were a lot better. And if I dial out, what does this mean? I think what it means is that AMD, Qualcomm, and Intel are still very competitive. AMD still has superior multi-threaded performance. Apple does put some distance between itself and the Intel Core Ultra 7 Lunar Lake, and less distance-
I mean, they're almost the same as Qualcomm, but it's what you would expect. Let's just say that customers are paying a 200, 300, $400 premium for this chip and this design over everything else. It better perform well. But I think it also shows the durability of the Qualcomm and AMD designs. And the funny part is Qualcomm is ARM and AMD is x86. But hey, we'll have to see the GPU, we're going to have to see the NPU. But just the fact that we're talking about anybody else being in the zone with Apple, I think, is a big freaking deal, because Apple has been running circles around everybody on the laptop side when it comes to the CPU. And I think it's important, too, that companies like Signal65, one of our sister companies, get in and do the hardcore testing to get underneath the reality.
Daniel Newman: Yeah, you hit the leaks in the data side. I just want to have a little fun with this, by the way, because the market perception is that Apple has this sort of ability to keep its launches and new products airtight, and over the years it’s gotten harder, and of course in a connected world. I pulled up some data points, Pat, on this. They’re saying that this was potentially the worst product leak since… You remember what happened with the iPhone 4?
Patrick Moorhead: Yeah, yeah. Legal got involved, investigators, police, all that stuff.
Daniel Newman: The prototype was apparently left in a bar in 2010-
Patrick Moorhead: Yeah, exactly.
Daniel Newman: … found that. There's a rumor also that 200 of these units may have been stolen from a Chinese warehouse. That's one of the other rumors out there right now, and obviously they were selling them on the black market on Russian classified sites. I just thought that was super interesting, how this happens. I mean, Apple has historically been incredibly good at containing this kind of stuff. Now, everything you said is interesting, but didn't it all kind of fall in line? Were there any surprises in these numbers to you?
Patrick Moorhead: I mean, the MT was a little bigger than I thought it would be. And by the way, the latest ARM instructions are definitely playing a role in this. Although Apple won't give ARM credit, it should, but you're seeing a boost just with Armv9.2.
Daniel Newman: Yeah, yeah, exactly. No, I mean, the results are positive, and I think you hit it. We're going to need to, one, see this actually get launched. Two, we're going to need to put it in the lab and see how it compares. With all these advancements and these new AI PCs, the Mac has sort of been on the outside. No one's really even counting it in, and maybe it needs to be. So of course, when it comes to these ARM-based designs, they were first to do it, and they were first to show it. And you and I saw, from M1 to M2, the generational improvements. Pat, you had some of the best documentation in the world on how crappy the M1 was, but they did figure it out, and they did sort it out quickly. And by this next version, it looks like it's going to be pretty darn compelling.
Patrick Moorhead: So let’s move to the next topic here. We’re doing pretty well. We got two more topics and 10 or 11-
Daniel Newman: Yeah, these are short. No, I mean the first one, we can go as long or short as you want. This was the Robotaxi day.
Patrick Moorhead: Yeah, let's drive in. Are you going to do the intro or am I?
Daniel Newman: I'll do it. Sorry, dude, check your calendar, it was blue-coded that I was going to intro this one. I don't want to let you fall off. Sorry. I jumped the gun. No, I mean, this was really anticipated, you know, coveted. Journalists were begging for an invite, people wanted to be there. They went to, I think it was the Warner Brothers studio, and they did it there. This was not just an average tech event, this was a Hollywood moment. And Elon Musk is so controversial. I mean, people right now either kind of love him or hate him.
Patrick Moorhead: Yeah.
Daniel Newman: I guess this is just reflective of who I am. I'm somewhere in between. I find him really fascinating, I'm not sure. My biggest… So here's my TLDR. This wasn't just a Robotaxi event. This was kind of a paint-the-future event. There are Waymos and Ubers and there's a lot of self-driving stuff out there. The difference in what I think Musk is showing with his Cybercab is that these things evolve from being cars that look like cars as we know them, with seats, steering wheels, gears, buttons, controls, to literally moving environments for people to function in. He talks about going to sleep and waking up at your destination, but the seats don't look like car seats. You've got these Robovans, they're like yachts. You're inside them. They've got working spaces. It's like being on a train. And so these cars, unlike the Waymos with all the lidar all over them and all the gears and the vision and the cameras and the sensors, look just like ordinary vehicles, but they're actually being designed ergonomically for people to be able to function while they are mobile.
So it's a very different future that he's painting. And I thought that was interesting. And of course, Pat, these humanoid robots. I don't know if you've watched some of these videos. Like, when people were like, "Oh, this was a nothing burger. It was just…" I watched a video where this guy was standing in the middle of it, just having a convo with this humanoid robot. And again, it goes back to the Amazon product, VAPR. Is this vapor, is it real, is it programmed? How far along is this? But we know what large language models are doing. It wouldn't be surprising to be able to insert a very logical conversational chatbot inside of a physical… But the idea that this thing can go collect packages for you, bring you beverages. I mean, can you see a future where every home has a robot? I totally could see it. This is like Jetsons stuff.
So when people kind of call it nothing, I get a little bit like, are people too small-minded? Do they not see the size of the TAM? I think Jensen came out and said, "More chips." He said, "Of course, this is great. More chips." All these devices will need more silicon content, so that's always great. But we're moving really fast, and we're at the point now where we're starting to build toward realistically experiencing the future of mobility, the future of robotics, and the consumerization of all of this. I will say, final thought, 'cause I could talk about this for a while, is that Musk, my biggest beef with him is that his timelines are never right. They're not even close to right. We were supposed to have done an FSD drive what, six, seven years ago now? It was like 2018. I mean, it was sometime at the end of the last decade that we were supposed to be doing a cross-country driverless experience. And here we are going into '25 and we're still not really there at scale. We are seeing FSD in regions and markets. So all the things he says have to be taken with a bit of a grain of salt. I think he'll do everything he says. He's going to put rockets out there, he's going to put links in people's brains, and he is helping people recover from brain… I mean, he's doing amazing things. I get –
Patrick Moorhead: Amazing, amazing things.
Daniel Newman: When people pick on him, it’s like, I understand why he doesn’t care because you don’t even deserve to say his name, most of us, including me. But the net is just if you’re an investor, when is this real? When does this become TAM? When does this become dollars and when does this make the stock worth more? Because right now we’re still just talking about Model Ys and Model 3s and shipment volumes, we’re not really looking at how much this could add. And I think that’s what people wanted to see more of.
Patrick Moorhead: Yeah, you did a great job covering this topic, Dan. What we're seeing here is a glimpse of the future, and not some unattainable type of thing. Musk does stuff that people say can't be done, like reusable rockets by a startup and not a huge government contractor. We've seen Boeing basically embarrass itself trying to replicate what SpaceX has already done. And when Musk built Tesla, people were saying nobody had been able to create a new car company in 50 years 'cause of the scale involved in manufacturing and quality control and servicing. And he did it. He had to sleep in the factory. He had to… I mean, by the way, read Musk's biography. It's amazing how they did this.
Daniel Newman: Is it the Isaacson one?
Patrick Moorhead: Yeah. He was teetering on bankruptcy multiple times. He put all the money that he made from Zip2 and PayPal right into it; he rolled everything in here. I mean, the guy sells his houses, and not for financial reasons; it's just that they take up too much brain space that he wants to be putting in here. So yeah, he does not hit his schedules. I believe that he puts out schedules to put pressure on his engineering team. If you read his methodology of how he thinks, in his brain, he does believe that it's attainable, and he's waiting for somebody to give him a credible reason to tell him no. Had he accepted what all his engineers told him all the time, SpaceX wouldn't be around and Tesla would not be around.
And his leadership style, too, is when he makes the call, he will take responsibility if something screws up. I mean, there was a major SpaceX splash and crash that he gave the thumbs up on, and there was a tile or something that was off and it got cooked. Heck, he changed materials to fricking stainless steel when that hadn't been done before, for different thermal reasons. So yeah, he's a great inventor of our time. He's imperfect like all humans, and what's interesting, if you read biographies of, like, Thomas Edison, the Gettys, the Spruce Goose, stuff like that, these inventors are kind of bonkers, as some people would characterize them. And I mean, my gosh, look at Palantir, look at Anduril, a defense contractor creating defense weaponry that is 1/100th the cost with 10X the lethality. I mean, these are the inventors of our time, and they are perfect in their imperfection.
Daniel Newman: Yeah, I love that, Pat. That’s so true. I mean, they just don’t understand inventors like Musk, Jeff Bezos, Patrick Moorhead, these guys are changing the world with their podcast… Oh, wait, sorry. I’m conflating all these things.
Patrick Moorhead: That’s so funny.
Daniel Newman: … conflate a lot last night. No, you’re absolutely right. I’m so glad you said that though.
Patrick Moorhead: Yeah. So hey, let’s go into our last topic here. Commvault had their big tent event in London. We had the chance to interview CEO, Sanjay Mirchandani-
Daniel Newman: Mirchandani.
Patrick Moorhead: … and had a really big conversation. So let me dial out here. You have, I'll call it, the fractalization of data and the fractalization of infrastructure, where you have enterprise data pretty much everywhere. You have five flavors of clouds. You have traditional infrastructure, and then you have data that is attached to that infrastructure and those applications, and you have enterprise SaaS. You cross-index that with the risk of people stealing your data, taking over your data, essentially coming in and ceasing all of your business operations, and the need is absolutely there. It's a super hyper-competitive market out there. So some of the key announcements: one of them was Cloud Rewind, and this came through the acquisition of Appranix.
It recovers data, but also all of the infrastructure settings, to a pre-breach state. When we think of that, it's kind of, "Oh, we do that on our iPhones. We do that on our iPads." It's a heck of a lot harder as it relates to applications and infrastructure. There was also some talk about Clumio, a recent acquisition there, protecting AWS data and workloads. So you've got AWS, you've got Google Cloud, you've got Azure. Now, not all features, but a lot of the features. And finally, there was a big collab between Commvault and Pure related to protecting data to comply with DORA, and DORA is the EU's Digital Operational Resilience Act: scanning, storage, rapid recovery. It's a very interesting collab. You see the overlap between storage and other types of data sources. Pure offers some of their own, but here we have a collaboration between these two companies.
Daniel Newman: Yeah, Pat, I mean, everybody, I recommend checking out the show notes. Great conversation with the CEO. This data protection space has been really interesting 'cause there's been a lot going on in it. Some people probably look at it as a bit of a "redundant, kind of boring tech," but it's really important. I mean, if you can't get access to your systems and your data, if you can't build this cyber resiliency, it can literally take companies down. And that's why we've seen the rise of Commvault, the rise of Rubrik, which was a big IPO this year. We saw that Cohesity and Veritas are supposed to finish tying up a merger, and these companies are doing interesting things with AI, they're doing interesting things with acquisitions and multi-cloud, and of course, continued acquisitions.
Rewind is really interesting. God, it feels, recall, rewind. In the end, keeping your business up and running is the coolest thing you can do in IT, and so that’s where companies like this are really important, and I think that’s why they’ve gotten a bit of a boost this year. So it was a successful event, Pat. It was great to have him on the show. Let’s definitely make sure we share the notes. I think you also wrote a great Forbes piece, if I’m not mistaken, you or one of the folks on your team.
Patrick Moorhead: Yeah, Robert was lead on that article. I added two paragraphs, I think, about one of my favorite topics, which is the hybrid multi-cloud. And what Sanjay is trying to do, like a lot of other companies, is create this hybrid multi-cloud approach that says, "Wherever your data is, whatever state it's in, whatever cloud, whatever device, how do I protect that data? How do I recover it?" This went from backup to DR, and from DR to cyber resiliency, and it's interesting, big companies are putting front ends on this to query all that data with generative AI. That's kind of the latest and greatest out there.
Daniel Newman: Absolutely. So there you have it, Pat.
Patrick Moorhead: Great. Great episode here. Thanks everybody for tuning in. Dan, where are we off to next week?
Daniel Newman: Well, I think we're kind of together, then kind of apart, and then back together, and then back apart. But Monday we're heading up to Lenovo Tech World. Then I'm going to jet across the country for Bill McDermott's CEO Summit, and then an IBM event, and then back home for the race, which is a busy one. And then we're off to Hawaii for Qualcomm. So it's bang, bang, bang, bang for the next couple of weeks.
Patrick Moorhead: Yeah, I come back. I’ve got a wedding to do in Florida for the weekend, so I’m pretty much going to be on the road for almost two weeks. So I will be giving you health tips the entire way. And I know, Dan, you’re questioning whether people just tune into my channel for health advice, but we’ll see. I think-
Daniel Newman: Patty Attia-
Patrick Moorhead: I figure if I crash and burn in tech, I can always do… Sorry about that. Dogs.
Daniel Newman: No worries.
Patrick Moorhead: Yeah, the old man health thing. But hey, thanks everybody for tuning in. Check out all The Six Five coverage that we had for AMD’s Advancing AI event. Hit us up on the socials to compliment, whine, ask us hard questions. Challenge us on some of our commentary and analysis. We love to learn. I always like to say I’m not right on everything, but I’m right on a lot of the big topics.
Daniel Newman: Yeah. Just don’t be an a-hole.
Patrick Moorhead: Yeah.
Daniel Newman: Other than that, we're down. We're up for debate. We like it with real people, not a big fan of trolls. I like people that actually have real accounts. Those are my kind of people.
Patrick Moorhead: I hear you. Peace out. Thanks.