The Six Five: Talking Vast Data, Robotic Ports, Nvidia-Accenture, iPhone, GenAI, AMD, Qualcomm, Intel, Microsoft

By Patrick Moorhead - October 7, 2024

On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. Vast Data COSMOS
  2. Automating Ports
  3. NVIDIA-Accenture Tie Up
  4. Weak iPhone Sales?
  5. GenAI Weekly: NotebookLM, NVIDIA LLM, OpenAI, Microsoft
  6. Signal65 Runs The Copilot+ PC Processor Gauntlet

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.


Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey, everybody. It’s Friday, and we are on episode 234. Welcome to The Six Five. Patrick Moorhead, fully sweating, broader shoulders, bigger hips, working out. What’s going on, my friend?

Patrick Moorhead: You pretty much said it. Yeah, hitting the trainer, two months in. And did some measurements, and yeah, I’m up three quarters of an inch on the chest, oddly enough two inches on the hips and booty. I don’t know where that’s coming from, but I pretty much lost my booty during the massive weight loss. And I don’t want to be up on waist, but I’m up an inch. I need to figure that one out. I don’t know if I’m getting fat again, but I need to drill. But no it-

Daniel Newman: You conveniently left that news out when you told me earlier.

Patrick Moorhead: Well, you’re like my parent. I just want to give you the best news possible, and then you can ask me questions and probe on the negatives. So anyways dad, it’s great to be back. It’s my favorite part of the week. It’s been a little bit of a challenging week. Even though we’ve been home, a lot of stuff going on, a lot of stuff to cover. We’re back on the road. I’m on the road for four weeks starting next Tuesday.

Daniel Newman: Well, I appreciate the effort that you’ve made to committing yourself to being healthy. It makes me feel better when I’m eating pizza. Lots of pizza, buddy. But listen, when I’m your age, when I turn 46 like you, I’m going to get really serious about my fitness once again. But no, it’s good to be back. It’s good Friday. Got a busy weekend, traveling, getting on the road. Next seven weeks or so, Pat, it’s going to be pretty relentless from what I can gather. And this is that fun, busy time of year, so strap in because here we go. But yeah, we got a busy week here, Pat. A bunch of stuff to cover. Started off with a big event for VAST Data called COSMOS. We’ll talk about that. We had some big news. I don’t know that sometimes people are able to put together just how important it is to be able to get goods in and out of our country, but our ports shut down for a couple of days. People were hoarding toilet paper. I don’t know if that was the thing they needed to hoard, but it does have a big impact.

Now, they’ve pushed that out. We’ll talk more about that. But what about the technology at the ports? We’re going to take a look into that a little bit, Pat. You went on CNBC and you talked about this new NVIDIA-Accenture tie-up. Why would those two be tying up? And what are they doing together? There was also a report from Barclays that hit the street saying that iPhone 16 sales are not hitting the mark. Of course, I have comments on that, and I’m sure you do too. It was a pretty busy Gen AI week: a $157 billion valuation for OpenAI, plus updates to NotebookLM and OpenAI. NVIDIA has a big LLM, a frontier-size model. And then of course, Microsoft had some things going on as well. I think we can split and conquer that, Pat. And of course, the world’s most successful and important performance testing house ran some new tests, Pat, on the new Copilot+ PCs. And we’re going to share a little bit about what’s going on there. So a lot of news, a lot going on. Now, this is a reminder for all of you out there. The show is for information and entertainment purposes only. And while we are going to be talking about publicly traded companies, don’t take anything we say here on the show as investment advice. Before I dig into topic one, Pat, anything I didn’t get in the warmup here? Nothing in the preamble?

Patrick Moorhead: You got it. We’ve got a lot to talk about here.

Daniel Newman: We do have a lot to talk about. I do want to send a little shout-out to all the people that were affected by Hurricane Helene. You’re in our prayers. Sending best. And hopefully, the dollars being donated to these various causes are getting to you. I know it’s a tough time out there. All right, Pat, let’s go ahead and talk first about VAST Data. Company is on fire, over $9 billion valuation. They had a big launch, a big event called COSMOS this week. What was going on there?

Patrick Moorhead: Yeah, so every four or five years, a hot storage company comes along and really resets everything, right? I think NetApp falls into that category even though they’re a lot older, but they redefined a lot of things like Filer as an example. And then you had Pure Storage that came out with a bladed architecture really focused on software first and the ability to create a chassis that you could plug into, but most of the magic was about the software. And then you have VAST Data who entered the scene. And if you can imagine the different stacks from storage to file systems to database, their effort was to collapse that stack. And by the way, they don’t sell storage. They are a software company. A very disruptive company out there.

And what they did is they came out at their COSMOS event, and the biggest announcement was called Insight Engine. And this is in collaboration with NVIDIA. There was a nice interview between Jensen and VAST Data CEO Renen Hallak, and it was pretty good. And what it’s about is a real-time data operating system for AI. That’s the tagline. I think it’s fair. It’s one of the operating systems that you’ll be dealing with. But what it does, and this integrates with all the NIM microservices, is it embeds the semantic data, the semantic meaning, into the data, right? And the current process, again, there are people more technical than I, like Matt on my team, and you’ve got a lot of great analysts, so I’m sure we will dig deeper, but essentially you typically ETL this data off into a database. And it’s a two-step process to create all the vectors, right? What they’ve done is they’ve collapsed that. So essentially it includes a vector database, so it’s one step versus two.

And the cool part about the security side is it keeps all the security embeddings that are attached to the data, as opposed to ETLing that out and having to reapply security policies. This is focused on the enterprise, not the hyperscalers. Although, VAST Data is very tied into hyperscalers like xAI and Grok, a platform that we spend way too much time on, and they are likely part of this as well. The other announcements were Cisco and Equinix. Cisco is integrating the entire VAST Data stack into UCS. We don’t hear a lot about Cisco UCS; it’s not the number one priority at Cisco, which is more software, services, and networking, but they very much are a major factor in blades for enterprises. There’s also a deal with Equinix. The stack’s going to be integrated into Equinix data centers for their clients.
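To make the one-step-versus-two-step distinction Patrick describes concrete, here is a minimal, hypothetical Python sketch. It is an illustration only, not VAST’s Insight Engine API or NVIDIA NIM code; the embed() helper, the record shapes, and the policy names are invented placeholders. The point it shows is that a traditional ETL-then-vectorize flow tends to strip the source access policy and force it to be reapplied, while a single-step ingest keeps the embedding and the original ACL attached to the same record.

```python
# Hypothetical illustration only: contrasts a two-step "ETL into a separate
# vector database, then reapply security policy" pipeline with a one-step
# ingest where the embedding and the source ACL stay attached to the record.
# embed() is a toy placeholder, not any vendor's embedding model or API.

import hashlib
from typing import TypedDict


class Record(TypedDict):
    doc_id: str
    text: str
    acl: list[str]       # access policy attached to the source data
    vector: list[float]  # semantic embedding


def embed(text: str, dims: int = 8) -> list[float]:
    """Placeholder embedding: deterministic floats derived from a hash."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255.0 for b in digest[:dims]]


def two_step(source_rows: list[dict]) -> list[Record]:
    """Step 1: ETL into a staging copy (ACLs dropped). Step 2: vectorize and
    reapply a policy by hand in the separate vector store."""
    staged = [{"doc_id": r["id"], "text": r["text"]} for r in source_rows]
    vector_db: list[Record] = []
    for row in staged:
        vector_db.append(Record(doc_id=row["doc_id"], text=row["text"],
                                acl=["default-policy"],   # original ACL was lost in the ETL
                                vector=embed(row["text"])))
    return vector_db


def one_step(source_rows: list[dict]) -> list[Record]:
    """Embed at ingest; the original ACL never leaves the record."""
    return [Record(doc_id=r["id"], text=r["text"], acl=list(r["acl"]),
                   vector=embed(r["text"]))
            for r in source_rows]


if __name__ == "__main__":
    rows = [{"id": "a1", "text": "Q3 revenue summary", "acl": ["finance-readers"]}]
    print(two_step(rows)[0]["acl"])  # ['default-policy']  (policy had to be reapplied)
    print(one_step(rows)[0]["acl"])  # ['finance-readers'] (source policy preserved)
```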

And finally, this was more of a community initiative. It’s called COSMOS, right? A lot of people came out: practitioners, researchers, tech vendors. And if you follow the storage market, every company has, I’ll call it, a vast following of this. We’re going to have to see what this means competitively. We don’t know yet. It’s very provocative, and we haven’t heard anything from Dell. We really haven’t heard anything from Pure. You attended the NetApp event out there, and again, it’s different, but it appears they’re trying to accomplish the same thing. We’re going to have to see.

Daniel Newman: Yeah, there is a lot going on, Pat, it feels a little bit like I’m going to be repeating some of what I said after NetApp. There seems to be a new paradigm that is going to be taking place in the storage industry. Now what’s fun about this is none of them want to call themselves storage anymore, which is understandable because the capacity and capabilities that these companies have is becoming much more prolific in terms of how these companies can be utilized in the AI computer. So you had the historic of right storage being multi-tier, lots of data that’s relatively inaccessible in real time from compute memory. Now, we are entering an era where we want to effectively make all data available at any time and in compute in order to process, inference, train, whatever you’re trying to do. And to your point, NetApp, Pure, all these companies are working on creating a compute and storage alignment where there’s less of this sort of what I call middle layers, by reducing the need for certain softwares by making files and data of all types instantly accessible by compute.

So basically, point the application at the VAST system, at the Insight Engine, as opposed to needing to point it at some sort of data lake, right? This is where it’s all heading: where it can then understand, is it object, is it file, is it vector? What kind of data is it? What kind of metadata is attached to that data? And then allowing an application to instantly access that for, say, inference. That’s where we’re heading. All these companies are on the same trajectory in trying to do this. VAST went to an over-$9 billion valuation by really beating this drum for a few years, and they were beating this drum, by the way, well before Gen AI was really a thing. They’ve been saying for a long time that we need to de-complexify, if that’s even a word, the whole way storage is architected and the way data is made available.

And of course, we know with RAG and generative AI that vectors, the ability to treat many different types of data consistently and make it very accessible without it all needing to be organized in constantly updated databases, that’s the holy grail, that’s the holy grail of where we’re trying to head. So that’s what’s going on. To me, the company has made the right alignments, it’s gotten the right partners, it’s in with xAI, it’s in with NVIDIA. Clearly they understand where they want to be connected, where they want to be aligned. And of course, a rare on-stage appearance-

Patrick Moorhead: That’s good man.

Daniel Newman: Be a CEO Jensen Huang only further reiterated, but it was a solid event. The momentum continues. Of course, seeing these things shipping and in the real world and of course, seeing companies large, companies doing big AI work, being able to basically say that they have effectively maybe moved on from certain needs for different hardware software that were typical abstractions in this whole data pipeline could get really interesting. So, definitely a watch list company, Pat, and I got to imagine an IPO is coming, I just have to imagine. All right, let’s hit topic two. This one’s a little bit different, Pat, than something that maybe we talk about all the time, but there’s a technology band and a real time world news band. And I always enjoy topics like this. So, I don’t know if you out there if everybody was paying attention, but the International Longshoremen’s Association and the US Maritime Alliance, basically there was a strike that was it, for the East Coast and Gulf Coast ports. Okay, just to be clear.

What comes in at these ports? You’ve got fruits, you’ve got automobiles, you’ve got some electronics goods. By the way, not really toilet paper, from what I understand. I understand that we actually did not need to go stock up on toilet paper, but there was a lot going on there. And effectively, this group was looking for something over a 60% increase in pay. And this is the part that’s really interesting, hopefully, for our audience, Pat: they wanted the guarantee that there would be no significant automation investments being made at our ports. Now, let’s think about this. Let’s back out for a minute and say, “Okay, so technology can meaningfully improve our supply chain,” right? This is really what’s going on here. In China, for instance, they use 5G, IoT, AI, and they have almost fully automated parts of their ports.

Goods at that volume coming in, it can be done with very minimal human interaction; it can keep ports moving, keep the containers loaded and unloaded, goods coming off. Pat, they want us to guarantee we’re not going to do that. And their president, Harold Daggett, was asking for a huge increase. And to be clear, okay, these folks are making, they say, an average of about $150,000 a year. These are not low-paid workers. So in the tentative agreement, by the way, they’re going to take it up by 61%. So these folks are going to be making potentially multiple six figures as longshoremen. And of course, it’s a really important job, Pat. What I wanted to zero in on with you was just to get your take on this automation thing. My feeling is there is absolutely no way the US can keep up as one of the most technologically advanced nations in the world and at the same time run our ports manually.

I mean, you can draw the parallel here, but this would be like never building gigafactories for automotive manufacturing. This would be like still canning and bottling beer by hand. I mean, the idea of doing things this way seems to me incredibly antiquated and highly risky, and it puts the US at a massive disadvantage. And so that seems to be the item with this tentative agreement that has not been fleshed out: whether or not this automation thing gets through, Pat. But just between us, knowing all the goods, all the GPUs that need to come in and out of the ports, everything else we’re trying to build, containers full of automobiles. We cannot have a world in the future that doesn’t automate this stuff. What do you think is going to happen here?

Patrick Moorhead: So, I like history and we saw very similar things, I mean, on everything. Buggy makers, buggy tenders, right? We saw this in horses, we saw this in light bulbs. There used to be a group of people who would go in the city and they would light gas lamps, okay? And when you brought electricity in there, they didn’t need to do that anymore, and they went on strike and the whole city turned out. We saw this with elevators.

Daniel Newman: Don’t gaslight me.

Patrick Moorhead: No, I wonder if that’s part of that.

Daniel Newman: Actually, I wrote about it in my book, Human Machine. I could tell you more about that.

Patrick Moorhead: Nice, nice. Is it related to this topic?

Daniel Newman: Well, it was related to innovation and disruption and people-

Patrick Moorhead: No, let’s hear about it. I want to hear about gaslighting.

Daniel Newman: Oh, no. I mean, there was a period of time where every night there was gaslighters, gas lamp lighters, they would literally climb on ladders and they would light the lamps. And the belief at the time was that if we automated, or sorry, by putting in electricity, Pat, this would take away their jobs and they would have no opportunity to be employed. So, it goes back hundreds of years in technology. The belief has been by automating, innovating, advancing that we would somehow eradicate work. And so this is literally a parallel to that, and the term gaslighting has been suggested to have some historic context to that particular movement.

Patrick Moorhead: I love it. I never saw you as the history guy. I was always pretending to be one with all the books I read, but…

Daniel Newman: You just write books.

Patrick Moorhead: No, that’s good. We should write a book together.

Daniel Newman: Cheer, yeah.

Patrick Moorhead: We should do that. I think you and I need more things to do. So anyways, a lot of historical context, nobody should be surprised. And by the way, you’re going to see a lot more of disruption as soon as even more Gen AI kicks in. You might see some new types of unions that are kicked off between administrative assistant union or accounting union. This is the type of things that come with technological disruption. But it’s a very good example. I think what nobody appreciated was we had a major storm coming in with hundreds of people dead and they’re going to cut off port support in Florida. That does not give unions a good name out there. And that really, I think made an impact on the common person.

Daniel Newman: By the way, have you seen the compound of Harry Daggett? They’ve spent all over the internet. This guy makes a million dollars a year, he drives a drop-top Bentley, he’s got a 74-foot yacht. By the way- I’m not pure capitalist. I’m not upset for people being successful by any means. That’s feels like a lot of the politics of the world today with really, really wealthy people telling less wealthy and how they’re supposed to feel about things.

Patrick Moorhead: It is interesting, for sure.

Daniel Newman: But yeah, I mean look, I think in the end from a historical context, Pat, the US cannot afford to play and take a back seat and allow technology to surpass us. We need these good services to move. And again, this is by the way the ability for a group to strike and take down the system. To me, I don’t know about you Pat, but that’s a reason to automate. I mean, when people use a symmetrical power to potentially bring down society. I mean, did you hear… I heard comments like, “By the second week, people won’t be able to get this and by…” There was some very sort of-

Patrick Moorhead: Threatening.

Daniel Newman: Yeah. I mean, it was very teamster or whatever, and it’s sort of mechanics. But anyway, all right buddy.

Patrick Moorhead: You remember what happened when the airline folks went on strike, Reagan came in and fired them all and brought in the military to run it. And the president does have the powers to do that. One final thing, Screen Actors Guild in their contracts wanted to talk about likenesses rights about that. So, just a different variation of it.

Daniel Newman: Absolutely. Look, I’m trying very hard to automate myself, so Danbot2. I just got to figure how to monetize that. So anyways, are you doing an advertisement? Is this a commercial break? What’s going on?

Patrick Moorhead: Pretty much, if anybody out there needs some advisory services, some research, some white papers, some testing, some TEI.

Daniel Newman: Are we this desperate?

Patrick Moorhead: Not at all.

Daniel Newman: Crawl straight into the middle of the show. No, we’re pretty busy. Pat, you’re on with John Fortt, Morgan Brennan, CNBC, Overtime Plus, you were talking about a new tie-up. And maybe to people right away it seemed to make sense, and if you backed out it didn’t. But NVIDIA Accenture is some huge amount that Accenture plans to spend standing up a practice. What did you have to say?

Patrick Moorhead: Yeah, so let me spit out the facts real quick. So Accenture is standing up a new business group called the NVIDIA Business Group. They’re going to train 30,000 employees on NVIDIA AI technologies. I wasn’t able to parse actually what those 30,000 employees do. Are they consultants? Are they operators? My guess is knowing Accenture, it’s combination of both. And the training, it covers AI Foundry, which an offering that Accenture has, AI enterprise and NVIDIA Omniverse. So, this goes all the way from back office to front office to the industrial edge. And this whole AI refinery from Accenture, it’s all about agentic AI journeys, which is the hot topic, particularly around enterprises. So, let’s dial back a little. You have trillion-dollar investments that are going on in the picks-and-shovels, the NVIDIAs, the Broadcoms, AMDs, the Marvells and the hyperscalers who are offering IS services like Microsoft, Amazon, Google, Oracle, and then tier twos with Supermicro and Dell.

But we really haven’t seen the big push, right? We’ve got a 10X’er on the picks-and-shovels and the picks-and-shovels services, or actually most of the picks-and-shovels, but what we haven’t seen is that downstream enterprise lift as measured by the software companies, right? The SAPs, the Adobes, Salesforce, everybody like that. So, this is to accelerate that in the enterprise and get that going. My research suggests that we’re 18 to 24 months from mass enterprise adoption. Some say it’s longer, and if this alliance can shrink that gap, it can make a really, really huge difference. This isn’t the first type of alliance that NVIDIA has done. In fact, six months ago they did a deal with IBM as a practice called IBM Consulting Advantage, and it does similar things and works in conjunction with Adobe, AWS, Palo Alto Networks, Microsoft, Salesforce, and SAP. But overall, this is leaning into that potential risk that you and I talk about often here of trillion-dollar investments making a difference with the enterprises and also with consumers.

Daniel Newman: Yeah. Pat, I think one of the big sort of gaping holes that NVIDIA had risk was in the enterprise. We know that with the largest hyperscalers in the world, NVIDIA is powering almost all the training, barring some of the Google stuff they’re doing on TPU. We know that there is a lot of consumption of NVIDIA happening from these hyperscale cloud providers. We also know tier twos are building a lot of capacity. There’s these specialty data centers for doing applications. We know ISVs are building handsomely on NVIDIA’s platform, but enterprises at some point there’s going to be this opportunity to build out their own AI platforms and there’s going to require a lot of expertise. Now, when we did our sort of study early in ’24, we found that there was a huge volume of multimillion dollar POCs for AI, but there was a couple of things that also we were finding out. One, is most of them weren’t successful. You were talking about less than a third.

We were finding out that there was a huge lack of competency. So if you’re a mid-sized enterprise, even a small-cap or mid-cap company, and you have limited resources to put towards AI, you’re going to be calling in an SI. And Accenture, by the way, was the biggest of the SIs out there in terms of volume. You mentioned IBM; they were another big one on the consulting side, but companies were going to turn to organizations like Accenture. So while you might use a hyperscaler to consume AI, or you might actually work directly with NVIDIA if you’re a massive company, if you’re like Elon Musk and you’re building a supercluster, you may have that direct interaction. But even Musk was building with Dell, through partnerships, or Supermicro. But the bottom line, Pat, is expertise is a problem here. You’ve got to get these POCs up, you’ve got to be able to stand up these applications and build these applications. And most companies don’t have the expertise.

On the other side, for Accenture, they also have to think about their own modernization into the future. How are they going to stay relevant in an era of AI? I think in ’23 they did a hundred million in revenue from Gen AI. They booked about 300. From what CEO Julie Sweet said, they’ve booked $3 billion in revenue this year for Gen AI projects. So, they are making substantial headway in the Gen AI space in terms of growth. And why wouldn’t you partner in a big way with NVIDIA? Now, this will tie together when we get to some of the broader generative AI notes. NVIDIA is building these frontier, these massive models, now open source, to compete. The way I see it too is that was sort of a missing piece in the sandbox, but you could take what they’re doing with NIMs, what they’re doing with industry-specific work, what they’re doing with CUDA and their developer ecosystem, and suddenly the NVIDIA story becomes really a one-stop shop. Like, do you need anyone outside?

Patrick Moorhead: Yeah.

Daniel Newman: So what I’m saying is, so a lot of these OEMs have become big resellers of Dell, but if you’re Accenture, you can literally do the entire stack top to bottom for healthcare, for financial services. And if I’m Jensen it’s like, “Well, who’s selling big SI or big integration work to these customers that they want to get more and more in deep with Accenture?” So now the question is, at least I’ll ask, Pat, is do all the other big SIs follow and make large commitments? So, we’ve seen like a cascade. First wave was the hyperscalers, second wave were the big OEMs. Third wave has been all the ISVs partnering with NVIDIA. Is the fourth wave, the SIs now will all make big partnership announcements tying their sled to NVIDIA to make sure that they can monetize the NVIDIA demand that’s coming in this next era.

And remember, Jensen said this week, “The demand for Blackwell is unbelievable. The more you buy, the more you’ll save.” So keep buying, everybody. All right, Pat, we’re going to talk more about it, at least I will a little bit, when we get to some Gen AI topics. Let’s go to the fourth topic. Another fun one, only because of their great success do we sometimes enjoy picking on Apple a little bit. But I think you and I both said this, I know I said this for sure: Barclays came out with a report that said that there are some production cuts, some forecast cuts, on the iPhone 16; basically, it’s not that hot. So we’ve heard the cycle, super cycle, the next wave commentary. My first take when I saw the report was, “Duh.” Okay, I’m just going to be candid. Besides the little button on the side, do you have one of these yet, the 16?

Patrick Moorhead: No, no. We’ll talk about that as we get in there.

Daniel Newman: Yeah. I just picked up my Pro Max 13. I’m a techie, what can I say?

Patrick Moorhead: I love it.

Daniel Newman: But in all serious, people aren’t that convinced. And look, I can anecdotally say, people I’ve talked to aren’t that convinced. I can also say, less anecdotally that the numbers now that are coming in suggest the same thing. Even on the day of the announcement, Pat, we heard the lines weren’t as long, that they were typically around a certain corner in New York and they did the big spiel. And Pat, here’s what I think is going on is we’ve entered the first cycle where software is more important than hardware. You could argue that’s been the case for a long time, but in terms of what was being promised to the market, it was software driven. Apple intelligence from the 15 pro forward will work on all those devices. And so if you’re sort of a spectator and you have a 14 or a 13 or even a 15 regular edition to run out and buy the 16, really besides an incrementally better camera, nominally different battery life, which most of us have not really found to be a real thing in the real world. We’re waiting for some Apple intelligence killer app to hit, something that we would be like, “Oh my gosh, I will use this every day.”

And the thing is, none of the Apple Intelligence features really shipped out of the gate. So, I’m thinking there’s probably a somewhat meaningful part of the market, Pat, that’s sitting idly waiting to see Apple Intelligence features in the wild being used and then finding that thing where they’re like, “I really want to do that. I really need the device that’s capable of doing that.” So I’m thinking maybe an elongated cycle rather than that sort of instantaneous one. Now, the holidays of course might tell another story. We’re heading into the holidays, Pat, but interestingly enough, we said it was a nothing burger, and it turns out that we might be right.

Patrick Moorhead: Yeah, Dan, there were a lot of decent features that I think enthusiasts would like the four x microphone, which by the way, that could be the one thing that could pull me in if it does better recordings. So when I’m listening to a keynote for instance, I’m sitting there, I want to do speech to text, I want to do transcriptions. And if that can do a better recording, I will probably buy that. There’s 4K 120 capture, which again really awesome camera and video aficionado analyst of my natural SOG is like, “I am all in, this is important to me. Let’s do that.” Keith on your side was really stoked about the four microphones so he can do his video podcasts every day. So yeah, there’s a lot to really like. The challenge though is for the normies, right? They want better battery life and I would’ve really respected Apple if it came out and really leaned into better battery life than the iPhone 15, which by the way, that battery life blows, okay? I can tell you it’s nowhere near where it was before. And Dan, I know you’re not a boomer, but you got the Boomer phone on the 13. If you want better battery life, do not upgrade because that is an amazing battery life system there. And then there is the AI part, the Apple intelligence that I think you articulated it well in terms of where we are. I want to bring this back to… When I did consumer products. It’s called the law of if-then.

If you have to have too many if-then statements to get the benefit, it’s not going to land. And I think what I learned is that if it’s greater than two if-thens, if you do this, if you do that, then you get the benefit, it’s not going to land without some radical form of marketing and promotion. And what are some of the if-thens, right? If you have this handset, if you go to the beta, then you can do this. If you can actually find that AI feature. There are some analysts that I do respect, and I agree with them that it’s hard to actually find the AI features that are in there already, but we’ll see. I don’t think it’s going to have a long-term effect on Apple. Where this really matters is outside of North America and Western Europe, where Android is very much king, and that is where this conversation will matter. A company like Apple, they have to goof up or screw up for five straight years. There’s so much equity banked into their brand and into the experience and the lock-in they have on all of the add-ons. And quite frankly, Apple Watch and AirPods are a lot more compelling and differentiated than iPhone.

Daniel Newman: Yeah, I think the watch is pretty cool. It’s not my style, so I haven’t really broken it down yet. It’s not automatic. I can’t wind it, but sometimes I like the… You wear that orange band, right? That’s you.

Patrick Moorhead: No, I’m not doing that.

Daniel Newman: No?

Patrick Moorhead: No.

Daniel Newman: Thought you were really cool because now you’re like a fitness granola guy.

Patrick Moorhead: I am as much as I can be, as much granola as I can pound down.

Daniel Newman: By the way, I want you to see my really healthy breakfast. It’s 16 ounces of steak and 12 eggs.

Patrick Moorhead: Excited. I did not get my protein breakfast yet.

Daniel Newman: Did you get some protein though after you lift? You got to hit it right away. You can’t wait.

Patrick Moorhead: I don’t know about that.

Daniel Newman: You can’t wait. It’s like your synthesis. Ask your trainer guy. Don’t you have like five trainers now? You have one that works on biceps, one that works on abs.

Patrick Moorhead: I do have multiple trainers. I have the witch doctor, I have a nutritionist and now have a trainer.

Daniel Newman: Well, I’m just glad you’re going to be around longer than me. Someone’s going to have to take this ship over when I go. I got a couple of years left though, I think.

Patrick Moorhead: I’m banking on me to kick it first.

Daniel Newman: No way. Not with you what you’re doing. I mean, you’re like drinking baby’s blood.

Patrick Moorhead: Not yet. No. I’m just injecting it.

Daniel Newman: Injecting it. Okay.

Patrick Moorhead: Please don’t make up anything about me that’s not true.

Daniel Newman: I mean, if I read it on X, it’s true. Sorry. I believe everything I’m told and I’m fed. Anyway. So Pat, big week-

Patrick Moorhead: Huge.

Daniel Newman: … a lot going on in Gen AI. It’s like every time you turn around there’s something cool. You want to play with RAG, there’s NotebookLM. You want another massive mega LLM out and video was here to make news. OpenAI and what I think is a somewhat ridiculous evaluation, but you know what, the money’s piling in, so what the hell do I know? Copilot. Take your pick man. Go ahead and I’ll fill in the blanks. But what caught your eye in the Gen AI news of the week?

Patrick Moorhead: Yeah, we didn’t catch this last week, but Google brought out a service called NotebookLM, and essentially what it is an almost infinite RAG model where you can do up to 2 million tokens, up to 50 different data types, and even have cool buttons you can press. “Hey, you want a summary, you want a table of contents? Do you want a blog style?” And this is Wild, Daniel, is it will actually do a podcast between two people, a man and a woman. And you can just imagine where this can go in the future. First of all, when it comes to personal RAG for a lot of use cases that even people like us do, you want to pile in all this content from disparate sources and be able to query it, summarize it, do all that stuff. And it’s been a criticism that I’ve had of some of the big makers here, because quite frankly between Google and Microsoft, their personal RAG capability just was non-existent.

You had to go to OpenAI, you had to go to Perplexity to do that, right? Perplexity has a cool feature that says, “Don’t get anything from the internet, just take whatever is in the content that you’ve thrown me.” The context window isn’t high, but this context window is insane, like 2 million from Google. Of course, it can improve: it does not support video content, does not support Word, does not support XLS, any of the Microsoft formats. You can put in a link to YouTube, but only if YouTube has had time to go in and do a transcript. But I’m going to be playing around with it a lot more in the future, and I’m super excited about this. Microsoft did a major drop on two fronts. So first of all, overall Copilot, and again, I’m still piling through it right now, but to me it’s the AI assistant we had always envisioned, right? There’s more personalization, there’s voice, there’s vision. And by the way, vision not in the way that you might think. And there’s also a much more simplistic interface about what you can do. There’s a daily readout that essentially is the top news stories. It doesn’t look personalized yet, but it’s read as if it’s an NPR episode. So I’ve kicked the tires a little bit on it. And then, I think the biggest one is essentially you just have a conversation with it, right? It’s like ChatGPT Advanced Voice. And by the way, Gemini cranked theirs out, Google did, they cranked theirs out the same week, and I think Meta even came out with theirs. But I would say, this was the consumer launch for Microsoft, and it was very provocative, better than I thought. By the way, more risky for them, but big. And you can get this on your smartphone, Android, iOS, and you can get this just in a web browser.

You can get this on Edge, you hit the Copilot key, and it automatically comes up with the new capabilities. Super exciting. There’s no RAG capability on that; I think Copilot is the last service to not have RAG capabilities. Here’s the second thing that I’m super excited about, and these were Copilot+ features, okay? And Copilot+ PCs have the integrated NPU, amazing battery life, amazing performance. But I am so super excited about the features that are coming out, because they’re features that we all use. Do you use Microsoft Photos? Do you use File Explorer? Everybody who uses Windows uses File Explorer to go in and do searches. And now you can do a semantic search: “Hey, show me all my files that I’ve created about a client, an SOW, in the last 12 months.” And this sucker pops up. “Show me a photo of me on vacation with my three kids and wife and give me those photos.”

For Photos: upscaling photos up to 8X. You have a low-quality photo, upscale it. You want to do generative fill and erase, a lot of the features that we’ve seen through a paid Adobe subscription, it does that too. One of the most popular Android features right now getting a lot of use, and this came out through Samsung, is Circle to Search. Microsoft has their version called Click to Do. On vision, vision is like this: I’m on a web page and I just start querying it, and I literally ask the questions or type in the questions and it’s looking at everything. To me it’s a real-time Recall, okay, to be able to query, and then it’s using data from the internet to come in and make it even better. Finally, just to make a long story longer, good progress on VPNs for ARM. It looks like they’re going to fill in that gap in the first quarter of ’25. Super excited. And of course, Recall. They’re bringing it to Windows Insider; it’s not there yet. I’m excited.
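The semantic file search Patrick describes, asking File Explorer something in natural language and getting the right files back, boils down to ranking files against a query. Here is a small, hypothetical Python sketch of that ranking mechanic, not Microsoft’s implementation or any Windows API: it represents the query and each file’s name plus a text snippet as bag-of-words vectors and ranks by cosine similarity, where a real Copilot+ feature would presumably use learned embeddings from an on-device model instead. The file paths and snippets are invented.

```python
# Hypothetical sketch of natural-language file search: represent the query and
# each file's name + snippet as bag-of-words vectors, then rank by cosine
# similarity. Real semantic search would swap these vectors for learned
# embeddings; this only illustrates the ranking mechanics. Paths are made up.

import math
import re


def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z0-9]+", text.lower())


def to_vector(tokens: list[str], vocab: list[str]) -> list[float]:
    """Bag-of-words count vector over a shared vocabulary."""
    counts = {term: 0 for term in vocab}
    for tok in tokens:
        if tok in counts:
            counts[tok] += 1
    return [float(counts[term]) for term in vocab]


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def search(query: str, files: dict[str, str], top_k: int = 3) -> list[tuple[str, float]]:
    """Rank file paths by similarity between the query and the name + snippet."""
    docs = {path: tokenize(path + " " + snippet) for path, snippet in files.items()}
    vocab = sorted(set(tokenize(query)) | {t for toks in docs.values() for t in toks})
    q_vec = to_vector(tokenize(query), vocab)
    scored = [(path, cosine(q_vec, to_vector(toks, vocab))) for path, toks in docs.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]


if __name__ == "__main__":
    corpus = {  # hypothetical local files and short text snippets
        r"C:\docs\acme_sow_2024.docx": "statement of work for the Acme client engagement",
        r"C:\photos\beach_vacation.jpg": "family vacation photo, three kids at the beach",
        r"C:\docs\q3_budget.xlsx": "quarterly budget spreadsheet",
    }
    for path, score in search("the SOW I wrote for a client this year", corpus):
        print(f"{score:.3f}  {path}")  # the SOW document ranks first
```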

Daniel Newman: Yeah, there’s so much going on, Pat. We’ll talk a little bit more about some of the interesting stuff with Microsoft too when we get to the signal, what we’re doing there in the lab and the testing stuff that we’re doing. But you hit on a lot of them, Pat. I think a couple of the most exciting things for me is one is just the very accessible RAG technology with what they’re doing with Notebook. I’ve seen some reports, I’ve played with it a little bit. I’ll be candid, I got to play with it some more. I’d like to start putting some of our large data sets from our intelligence platform through there and some of the big reports and start querying it, just to see what’s available now. You can basically ingest the data, use language, and get really thoughtful responses with very little effort. Pat, I mean, I think about it a year and a half ago, what it would’ve taken to build these capabilities and how it’s been basically dropped at our doorstep now. We have what’s going on with o1 and these reasoning engines. I mean, you have systems now that basically have the capacity to reason like an executive team. Remember when Benioff talked about having AI in your boardroom? I mean, the problem is these things are way too inefficient power-wise right now to actually be used all day every day.

But if you could use o1 the way we’re using search and ChatGPT, Pat, you may not even need me anymore. You could just literally talk to the reasoning engine when you’re trying to work your way through a modest dilemma. Like, how do I get my biceps over 18 inches? The other thing I think that was really interesting to me this week too is this sort of pile-on thing that’s going on. We talked a little bit about the OpenAI valuation last week, but this NVIDIA large language model got me rethinking now, Pat. We’ve got all these different players contributing here to these frontier models and now you have a company like NVIDIA that’s actually just added one more layer to the stack. It’s always been able to do sort of specialty models. It’s been in this space a long time, and of course, with its own hardware, it can train pretty much anything it wants.

But you’ve got these really, I’d say, critical two lanes running: open source, with Meta and NVIDIA really pushing and barreling down, and then, of course, you’ve got these black-box models that are gaining in popularity. And it sort of makes me wonder why this NVLM thing didn’t get more attention this week. It got some, but why didn’t it get more? Because if I’m an investor, I mean, you heard Apple pulled out. And obviously, there’s been a ton of brain drain at OpenAI. But if I’m an investor and I’m investing 6-plus billion dollars as a group at a $157 billion valuation, I would want to know what makes it sticky in the long run. And when you’ve got companies like Meta with Mark Zuckerberg, I think he’s now the second-wealthiest person on the planet, and you’ve got Jensen saying, “Hey, we can open source this, deliver a product,” I think this NVLM one outperformed GPT-4o in a number of different benchmarks.

Not all of them, but some of them. And then you start to think how you could integrate that with NIMs. You could integrate that with CUDA, you could integrate that with hardware. And gosh, then you start to see why Accenture wants to double down. But I don’t know, Pat, I just look at this and I go, “This is going to get crowded. This is going to get noisy.” And unless there’s something in the back room that’s so exciting and so… I’m not sure there’s an advantage. And then on top of that, for ChatGPT and OpenAI, I’m not sure there’s an exit other than going public. It’s really hard to go public at a valuation in the hundreds of billions when you haven’t made a dollar in profit yet. So there’s a lot going on, Pat. I think you hit a lot of the tech well. This space is evolving and changing so quickly. It’s really, really exciting times. All right, let’s finish with a little bit of chest pounding. Our team is doing some amazing work. Signal65, that’s one of the children you and I had together. And we ran the processor gauntlet. We’ve looked at AMD, we’ve looked at Intel, we’ve looked at Qualcomm, and these Copilot+ PCs. Pat, what are we finding at this point?

Patrick Moorhead: Yeah. So yeah, Signal65, we do data center testing and we do device testing and everything in between. We even test LLM. So, it’s exciting to put a numbers behind everything. So first and foremost, what’s clear, so Ryan and his team doing a great job. We published three reports. He published three reports, one on Qualcomm, one on Intel, and one in AMD, showing the generational comparisons between all three processors, even going back to N-3. And N-3, meaning three generations ago. And on every single element, whether it’s battery life, graphics, CPU, NPU. I say battery life, superior on all fronts. Now, N-1 N-2, you have areas where it’s vastly superior to anything. So for instance, pre-NPU, vastly superior, right? Zero, 10 or 11 up to 50, okay? Battery life, delivering that. There are platforms that it’s vastly better battery life than N-1, but most of the install base is N-3, N-4. So, that really shows why people should go out and buy this new class of PCs.

Second thing, on battery life, Qualcomm still rules the roost. It’s the most power efficient and gets the best battery life across a range of systems and use cases. No surprises there. AMD has the best multi-threaded performance out of the lot, but comes in the lowest on battery life. It makes sense. By the way, they also have the biggest NPU at 50 TOPS. It doesn’t look like that’s making a difference yet, potentially because all of the Microsoft algorithms were pegged at 40; we will see. Lunar Lake, right, as we talked about, I think on the last show or the one before, Intel wrote a lot of checks about battery life, and it’s closer than we expected to Qualcomm. It closed the gap on single-threaded performance per Geekbench and Cinebench.

It has a really good integrated GPU. Well, how about Apple, right? The company that basically put mega-distance between itself and the x86 folks, who had to put a high-end desktop against their notebook on performance. Pretty much the M2 and the M3 lost to every one of the Windows laptops. M4 is coming; it’s going to be better. We have some indications on where we think it’s going to be better, but that’s likely based on Apple’s new way of literally putting out a new chip versus, let’s say, binning, even between devices. In fact, even between the Pro and the Max it’s a different chip on the smartphone side. It’s likely going to have some boost there, and we just need to be able to… Once those machines come out, supposedly they’ll be announced in the fourth quarter, we’re going to have to retest and evaluate pretty much everything.

Daniel Newman: Yeah, there’s a continuation here, Pat, and I know we’re running into time and I’ve got to get running to a meeting, but what’s been super fun to watch is just how quickly things are evolving and how competitive things are. As you sort of suggested from part to part, there’s definitely some disparity. I think for instance, I think surprise so far has been how well Intel did with Lunar Lake Battery Life. I think there was a lot of expectation that there would be a bigger Gulf with the ARM part, that was really successful. To your point, I think what AMD has is a potential to… I think it is early, so there’s a potential with that MPU that so much more offloading could be done and battery life could be extended, but it’s not necessarily showing up yet in the testing numbers. But that’s also the fun part of what we get to do, right? Tearing these things down and playing with them as we really get to see. And in a world where there’s very few sort of impartial testing and performance sites to gather. I really do think our team has done just a world-class job of breaking this stuff down. By the way, are all these reports published yet?

Patrick Moorhead: They are. And they’re on the signal site. You and I have sent them out over social media as well, and you can go in and download them and check out all the other tests that we have in there. It’s called signal65.com.

Daniel Newman: Yeah, I’m glad you said that. I just wanted to do a little promotion without having to do it all by myself. So this commercial is brought to you by Signal65. Pat, it’s been fun, man. It’s Friday. It’s been nice to be home. Getting on the road here, you and I both. I think what we’re going to be out at AMD next week.

Patrick Moorhead: That’s right. Got there for the big AI event, we’re going to be there as analysts. We’ll also be interviewing all senior leadership, including Lisa Su out there.

Daniel Newman: The Six Five we’ll be there, look for the socials. We appreciate everybody tuning in, join us this week. Hit that subscribe button. Join us for all of our coverage here on our weekly show, episode 234, and of course all of our interview conversations. But Pat and I, and of course, our extended team for this week, for this show, it’s time to say goodbye. Pat, go enjoy your steak and eggs, 12 protein shakes, and I’m going to go get another cup of coffee because I am an unbearably cool dad.

Patrick Moorhead: Oh, that’s so cute. I just want to give you a big hug. By the way, Baylor Bears, his young- middle goes to Baylor.

Daniel Newman: Baylor Bears. See you all.

Patrick Moorhead: I love it.

Daniel Newman: See you all until next time- or wait until next time. See you all later. Bye-bye.

Patrick Moorhead

Patrick founded the firm based on his real-world technology experiences with the understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and in “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience including 15 years of executive experience at high tech companies (NCR, AT&T, Compaq, now HP, and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.