Moor Insights & Strategy https://moorinsightsstrategy.com/ MI&S offers unparalleled advisory and insights to businesses navigating the complex technology industry landscape. Tue, 28 Jan 2025 21:14:25 +0000 en-US hourly 1 https://wordpress.org/?v=6.7.1 https://moorinsightsstrategy.com/wp-content/uploads/2020/05/cropped-Moor_Favicon-32x32.png Moor Insights & Strategy https://moorinsightsstrategy.com/ 32 32 Antonio Neri Discusses Hewlett Packard Enterprise’s Future – The View From Davos https://moorinsightsstrategy.com/the-six-five/antonio-neri-discusses-hewlett-packard-enterprises-future-the-view-from-davos/ Tue, 28 Jan 2025 21:14:25 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45410 Antonio Neri, President & CEO at Hewlett Packard Enterprise, joins Daniel Newman and Patrick Moorhead to share his insights on leading HPE into the future with a focus on innovation, sustainability, and strategic partnerships.

The post Antonio Neri Discusses Hewlett Packard Enterprise’s Future – The View From Davos appeared first on Moor Insights & Strategy.

]]>
There’s no question – AI is top of mind for businesses and governments alike and the focus is shifting from infrastructure to implementation & value derivation. Patrick Moorhead and Daniel Newman were joined by Hewlett Packard Enterprise‘s President & CEO, Antonio Neri, last week in Davos. In their conversation, Neri emphasizes HPE’s commitment to making technology equitable and sustainable, their strategic positioning in AI,
and the role of hybrid cloud in AI applications.

Tune in as they cover 👇

  • Sovereign AI initiatives by various governments
  • HPE’s edge-to-cloud platform, GreenLake and partnerships with companies like NVIDIA and Juniper to deploy and scale AI solutions with ease
  • Fostering a culture of experimentation and learning to unlock the full potential of AI
  • Antonio Neri’s perspective on global tech trends and their impact on the enterprise sector
  • The role of sustainability and ethical considerations in HPE’s corporate strategy
  • Key insights into HPE’s collaboration and partnership strategies for the future

Learn more at Hewlett Packard Enterprise.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five On the Road with a View from Davos. We are having great conversations here. World Economic Forum is really a unique event, a melding of technology, governments, regulation, a lot of discussion about sustainability. And of course, I think at least for us, the headliner here has been about AI.

Daniel Newman: Yeah, we knew that coming here, this was going to be sort of an inflection year. A lot of the research that we are all doing on our side of the house is looking at, okay, we’ve had these kinds of infrastructure boons over the last couple of years, but now we’ve got to figure out how to implement this stuff. Start to derive value. How do governments get value? Enterprises get value? And then of course in Davos here you’re going to have that challenge of how do we keep building AI at the scale and address the energy needs?

Patrick Moorhead: That’s right. It is definitely a balance. I mean engineering and a lot of engineering to go in there and get the right balance to deliver that enterprise value for AI. So I’d like to bring in Antonio Neri. Great to see you again. Welcome back to The Six Five.

Antonio Neri: Yeah, thank you for having me. Welcome to Davos, an amazing place in an interesting time.

Patrick Moorhead: It is an amazing place, and HPE is, multinational provider to businesses, but a lot of governments. You guys coined “edge to cloud,” and you were right. And now I think the industry believes that hybrid AI is the way to go, but more importantly, generally what do you like to achieve in the show? What are your objectives here?

Antonio Neri: Well, as always, it is a good opportunity to check in every 12 months coming here, and meet our customers. Pretty much all of them are here. And also, key government officials to understand their agendas, their objectives. And as you said, AI is top-of-agenda for all governments because they have to control the destiny with this amazing technology. So for me, it’s an opportunity to connect with everyone in one place, over a three days period. So logistically it makes it very, very easy, but also to tell our story. And HPE has a unique point of view when it comes down to some of these technologies and what they can do for business, but also for society. And I think people are realizing that companies like HPE, who has been investing over the years in hybrid cloud in AI at scale, in networking, in making sure these technologies, equitable, accessible, inclusive, and sustainable. These are messages that resonate here.

Patrick Moorhead: Makes sense.

Daniel Newman: Yeah. Antonio, it feels good to be right occasionally, and I think both of us can say we really did see what you were doing early on with GreenLake. We both were believers. And I still really do believe, I actually think as companies are starting to see the cost of doing AI, you heard me in the beginning talking about deriving value. I mean, you seem to be in such a good position right now between what you’re doing on the compute side, your partnership with NVIDIA, the Juniper deal, and now the networking scale that you’re able to offer. How is it coming together? Because all this infrastructure investment has to start to bear fruit. I imagine a lot of the conversations you’re having here about that.

Antonio Neri: Yeah, I think you guys have been great advocates of what we’ve been doing, but also be able to tell it in a way that people understand both from a business and a technology perspective. And takes a unique set of skills, because sometimes you can be very technology-oriented or very high level and then people don’t connect the dots. But I will say, our strategy has been consistent and curated over the almost seven years of being as a CEO. And each component is a building block for the next one. And when I think about preparing us for this moment, we said seven years ago, the enterprise in the future will be edge-centric, cloud enable and data-driven. That gives us the vision to build an edge-to-cloud experience and architecture underneath. And that starts with the networking. And we start the first at the edge with the campus and branch and IoT and private IG. And we were very successful with that.

But now with the Juniper, we complement that to the cloud and inclusive of the service providers. And I believe that’s the core foundation of what Hewlett Packard will look like once the transaction closes. That’s the core tenant by which you can deliver these hybrid experiences, whether it’s cloud or AI. You need hybrid cloud to deliver hybrid AI. And that’s true particularly for enterprises which have data everywhere. But then when you go to the AI, it’s not created equal, right? So you have people that are racing to build these models at massive scale, although we are moving to where are more a mix of experts and more agentic type of approaches to stitch together the best model, to deliver the best outcome for a specific task for a workflow. That’s true for enterprise. But these companies are in the business to build these models and they need enormous amount of computational power. And for them it’s all about cost, scalability, and sustainability. And HP has unique point of view when it comes down to that, because we are the market leader in supercomputing. And we have shown the last 18 months some of the amazing systems including 10 days ago with the El Capitan and really the largest supercomputer ever built.

But then when you go to enterprise as a different story, and you guys understand that it is all about simplicity of Deployment and adoption. And that’s where the partnership model is important. But our technology and our platform, GreenLake is a core tenant of that. And now we have more than 40,000 customers to give us the ability to allow them to go from whether it’s the poor networking or just cloud and hybrid cloud, to going through AI. And that’s why we conceive our offer to be so simple in three clicks and less than 30 seconds to deploy.

Patrick Moorhead: Yeah. I’ve had a lot of conversations here about sovereign AI clouds, and you have the easy button where it can be a full stack of HPE, you can mix and match with your partner, particularly on the software side. Question, there’s been a lot of action obviously in hyperscaler, the training of it. What’s your sense from the conversations here, but also for the last couple of months on the timing of, let’s say the surge of government AI and enterprise AI? I mean, it’s starting, the curve is swift, but it’s still a little bit in its infancy.

Antonio Neri: It is, but in fact, after this interview, I’m going to meet a few governments that are thinking about deploying sovereign AI clouds and-

Daniel Newman: We’re just warming you up.

Antonio Neri: Yeah. Thank you. But again, when I think about AI, think about the segmentation of AI, there is three key segments in my mind. There is the model builders and the hyperscalers and the service providers alike. That they are in the business to either build a model, train the model, or provide large infrastructure for people to do their own experimentation, or in some cases just to inferencing. And that’s, think about in that space, maybe 10, 20 customers, large CAPEX deployments and the tens of billions dollars, they soon will be consuming million GPUs. They already are in the tens of thousands, in some cases, hundreds of thousands. And that’s what a lot of the action is today because there is the race to continue to make this model better and more accurate. And then obviously there is a proliferation of these models, like we said before.

Then you have the service providers that either are there to fulfill a need with enterprises on co-locations and hosting, or some of them to provide capacity to do this model builders because they have enough. But also, many of them actually are in the business to provide a sovereign cloud. However, the second segment is this sovereign AI, and I think there are 15 countries that are taking the lead. Obviously United States is one. In Europe, think about UK, France, Germany. In the last six months, we have seen more action there. Last week we opened the HLRS system in Munich. We are building the UK Bristol, which is the UK AI cloud. We are building the Japan AIST for Japan. And then, now, Middle East is very active, both the Saudi and the UAE. And obviously we’re talking here with both of them, including the G-42 entity with a lot of CAPEX available and energy available, which is very important.

So I think they are about three different curves, one very steep, both consumption and speed. The other one is ramping up fast. And then you have the enterprise, by the way, on the sovereign, it will be tens of thousands GPUs at the time deployments. But when you go to enterprise, there will be hundreds of thousands of customers, but they deployed very small, maybe hundreds of GPUs. Because they’re in the business, do a racking or just a fine-tuning. And that’s where I’m very excited because, over time, that’s where the value is. You talked about early on, it’s all great, but in the end it has to deliver value whether it’s productivity or differentiation. And I believe that’s where ultimately we will achieve the full potential. And then with governments, obviously to make sure they address the needs culturally in the geo, and then fundamentally to provide better services to their citizens.

Daniel Newman: Makes sense. So Antonio, just a couple of minutes left here. It’s an exciting day. We’re actually launching the largest CEO research panel of its kind about AI. We talked to 213 CEOs that run companies over a billion of revenue. And we asked them about AI. You’ve always been very culture-driven in your company. One of the interesting findings was 80% of companies where CEOs are driving the AI initiatives are getting less value than they are when they actually properly… I guess I’m just kind of curious of what kind of advice do you give, given that you’re involved with so many enterprises, so many governments, what kind advice are you giving to the market about success with AI now that you’re really getting into the meat of seeing these companies being able to deliver?

Antonio Neri: Well, first of all, you have to be on the loop as soon as you can, because otherwise you’re going to be left behind. Secondly, you have to have a culture of fail fast and improve. The reality, you’re not going to figure all this out immediately. You’re going to have some quick wins and a lot of failures, but the failures are important to learn and then improve the next cycle. And you have to have this culture of iterative, until we get the right balance of doing the right thing, which obviously responsibility, inclusion, and all those core values are important, with the business results. And I think about AI, it’s less about the technologies, more about the business, and everybody plays a role in the entire enterprise. In our case, we have a group of people that drive the student team for us. We have more than 250 cases that we go through the process. Some of them are already in full production, some of them are in ideation. Some already failed and said, okay, forget it. Move on.

And so I think that’s what you have to establish. And I think that’s an opportunity in my mind for every enterprise, to rethink the culture of the company because it’s not the technology, it’s how you do business going forward. And I think this technology allows you to really take a step back, reimagine your enterprise using this technology in the ways that we haven’t imagined before. But again, if you don’t get on that train, you are going to be left behind. And that’s a problem, right? Because we saw that when people went through the Covid, those who were ahead on the digital transformation and they had digital processes in place, they were able to move quickly in adopting whatever they needed to be done, working from home or engaging customers in digital interactions. And that’s what’s going to happen here. AI is going to drive a lot of those interactions on that digital platform.

Daniel Newman: Antonio, great insights, as always. Thanks so much for spending a little time with The Six Five with Pat and I here. Appreciate your friendship and we appreciate all the work that you’re doing at HPE.

Antonio Neri: And thank you for coming here and covering the event. I think you bring a unique point of view and trying to make it simple for people to understand-

Patrick Moorhead: Thank you.

Antonio Neri: … What happens here, sometimes they think this is a big political thing, actually, is way more than that, I think is the intersection of social issues with technology and governmental agendas. Ultimately, when the right focus is in place, a lot of things can happen.

Daniel Newman: And public partnership. Thank you everybody so much for joining The Six Five On the Road with a View from Davos. Hit Subscribe. Join us for all of the content here on the ground at Davos 2025, the World Economic Forum. Patrick, you gotta say goodbye now.

Patrick Moorhead: See you later.

The post Antonio Neri Discusses Hewlett Packard Enterprise’s Future – The View From Davos appeared first on Moor Insights & Strategy.

]]>
Exploring the Future of Technology with LogicMonitor’s CEO – The View From Davos https://moorinsightsstrategy.com/the-six-five/exploring-the-future-of-technology-with-logicmonitors-ceo-the-view-from-davos/ Tue, 28 Jan 2025 19:44:47 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45407 Christina Kosmowski, CEO at LogicMonitor, joins our hosts to share her insights on leading a tech company towards innovative horizons and addressing the challenges of digital transformation.

The post Exploring the Future of Technology with LogicMonitor’s CEO – The View From Davos appeared first on Moor Insights & Strategy.

]]>
With an upcoming raise close to $1B LogicMonitor must be doing a few things right, right? Get the download straight from LogicMonitor’s Chief Executive Officer Christina Crawford Kosmowski and hosts Patrick Moorhead and Daniel Newman at the WEF in Davos. They discuss the future landscape of technology and how LogicMonitor is prioritizing innovation while making sure systems work as they should.

Get their take on:

  • AI as the dominant theme at this year’s WEF
  • LogicMonitor’s recent $800M funding round and an exciting new partnership with OpenAI
  • How LogicMonitor is helping organizations monitor and manage the performance and cost of LLMs
  • Discussing the crucial role of observability and infrastructure monitoring in the age of AI and hybrid cloud
  • Christina’s vision for LogicMonitor in the coming years

Learn more at LogicMonitor.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. It has been an incredible event. Actually, it’s been my first World Economic Forum, and it’s really been a really nice blend of tech, politics, regulation, a lot of government officials here. A lot of really good discussions, particularly around the economic opportunities and potential risks with AI.

Daniel Newman: Yeah. We knew that AI was going to be in focus at this event, Pat. And every year there’s always thematically something that’s going on here. And you as a novice and me as a wily second year here at Davos, I’m glad I was able to implore my wisdom upon you. But it’s been a great week, and this is always a moment for that kind of public-private partnership. But it’s also a really interesting moment because this is also a year where you have major administration changes. The US obviously always drives a lot of the world, especially in technology. And now you’re starting to kind of see how does the rest of the world adjust to it? So we knew it would be a big week. It’s been a big week, and having a lot of fun here.

Patrick Moorhead: Yeah. And one of the big elements of the new AI economy, as we’ve seen IT has been fragmented.

Daniel Newman: Yes.

Patrick Moorhead: And being able to monitor whether it’s the devices, the applications, infrastructure, pretty much everything, what’s going on has become paramount. And I can’t imagine a better person to have a discussion with than Christina of The LogicMonitor.

Christina Crawford Kosmowski: Hello.

Patrick Moorhead: Welcome back to The Six Five.

Christina Crawford Kosmowski: Thank you.

Patrick Moorhead: We’re not in Las Vegas, we’re here in Switzerland.

Christina Crawford Kosmowski: It’s a little different vibe, but loving it.

Patrick Moorhead: Slightly, right.

Christina Crawford Kosmowski: Love talking to you guys no matter where we are.

Patrick Moorhead: Thank you. Thank you, same.

Christina Crawford Kosmowski: Yeah. Fun.

Patrick Moorhead: So hey, maybe a good place to start. You and I are both first-time attenders of the event.

Christina Crawford Kosmowski: Yeah.

Patrick Moorhead: Can we talk a little bit about, we’re about halfway in, a couple days in.

Christina Crawford Kosmowski: Yeah.

Patrick Moorhead: What kind of conversations are you having? Why did you come in the first place? And what are you getting out of it?

Christina Crawford Kosmowski: Yeah, absolutely. It’s definitely been an incredible experience, a little overwhelming, especially coming here with all there is to do. So we were looking for three things. One was just an opportunity to meet with all of our customers and partners in one place has just been incredible. And to be able to really have these strategic conversations with the top level folks has been extremely beneficial. The second is those who know us love us, but not everybody knows us.

Patrick Moorhead: Right.

Christina Crawford Kosmowski: And so, just being able to get our name out and people to learn about what we’re doing is very exciting. And then third, just the conversations we’re having. It’s just really insightful for me as we even think about our strategy and where we go next-

Patrick Moorhead: That’s right.

Christina Crawford Kosmowski: … has been unbelievably eye-opening this week.

Patrick Moorhead: Yeah, the concentration has definitely been one of these things. Daniel, you bring that up time and time again and I agree with it.

Daniel Newman: Well, I always say that the amount of meaningful meetings you can have in such a short period of time, it’s hard to replicate that at any other event because of just the density of executive leaders, government officials, consultants, policy makers. It’s pretty great. Christina, you had a pretty big few months. First of all, you raised a little money.

Christina Crawford Kosmowski: Just a little bit.

Daniel Newman: Just a little bit, several hundred million at a nice…

Christina Crawford Kosmowski: $800 million.

Daniel Newman: I said several hundred.

Christina Crawford Kosmowski: Yeah, yeah, yeah. Exactly the same, yeah.

Daniel Newman: Effectively, but I didn’t have it memorized.

Patrick Moorhead: I think the correct word is almost a billion dollars.

Christina Crawford Kosmowski: Almost a billion. Yeah, yeah, yeah.

Daniel Newman: It was almost a billion dollars at a valuation north of that.

Christina Crawford Kosmowski: Yes.

Daniel Newman: Seeing great support later stage.

Christina Crawford Kosmowski: Yeah.

Daniel Newman: And of course, this week you also said, “I want to top that here at the World Economic Forum.” And I’m going to announce a partnership with probably, if not the most prolific, one of the most prolific names in LLMs, in AI, OpenAI.

Christina Crawford Kosmowski: Right.

Daniel Newman: Talk a little bit about the progress you’re making and share a little bit more about that announcement.

Christina Crawford Kosmowski: Yeah. Oh, my gosh. We’re having so much fun right now at LogicMonitor and doing lots of great stuff. And the raise is really exciting because it’s a testament to the work we’re doing, and it gives us an opportunity to get the work we’re doing in the hands of more customers and more markets. And that’s really important for us to accelerate that. And then, this OpenAI announcement is just really important to us. I think there’s three areas that we’re partnering with them on. And the first is obviously we ingest over a trillion records a day, and so we’ve got this tremendous amount of data. And so, AI is a huge part of our strategy ourselves. We launched Edwin in June last year, so six months ago. And that’s our AI ops kind of agentic product. And we’re building off of OpenAI’s LLM, so that’s super important to have that framework. And then two, our customers have asked us to monitor LLM.

Patrick Moorhead: Right.

Christina Crawford Kosmowski: So this opportunity to be able to monitor the LLMs for the performance and the cost and the sustainability impacts is another big area. And then finally, internally, we also are reskilling, upskilling our team to think about how we take advantage of AI in our internal operations. And ChatGPT for the enterprise is a big part of that as well, so it’s super exciting.

Daniel Newman: So I may say real quickly here, I would like to suggest that the keynote speakers you brought over the last few years to open up your sales kickoff.

Christina Crawford Kosmowski: Let’s do it.

Daniel Newman: And all this success, I don’t know, but I’m seeing a pretty big connection there.

Patrick Moorhead: Somehow I knew Dan would get in a plug for himself.

Daniel Newman: Wait, didn’t you do it?

Patrick Moorhead: I did.

Daniel Newman: Okay. So I said-

Patrick Moorhead: You can do that one.

Christina Crawford Kosmowski: You guys were very super high rated.

Daniel Newman: That last few years.

Christina Crawford Kosmowski: Yes. Both of you were.

Daniel Newman: Yeah. I appreciate that.

Christina Crawford Kosmowski: I think you’re the highest rated.

Daniel Newman: Nice.

Christina Crawford Kosmowski: So yeah, I’m going to give you that.

Daniel Newman: Well, thank you.

Christina Crawford Kosmowski: So let’s get you back.

Patrick Moorhead: I think she was looking at me, but anyways.

Christina Crawford Kosmowski: Yeah.

Patrick Moorhead: No, as you’re having these conversations, we talked a little bit about, “Hey, I want your help to monitor these LLMs.” As we’re moving into this world of generative and agentic AI, I know we’re here already in certain areas, particularly on the training side and a lot of consumer applications. But it’s really just the start when it comes to enterprises and governments.

Christina Crawford Kosmowski: Yeah.

Patrick Moorhead: What requests are you getting that you can fulfill or you’d like to put a solution against?

Christina Crawford Kosmowski: Yeah. Well, I think first and foremost and the heritage of our company is ensuring the performance of the infrastructure and ensuring that it stays up and running, right? And that’s critical, and it’s even more critical now that everyone’s rushing to do all of these different use cases in AI. If you’re building your businesses and your workflows on it and it’s not up and running and there’s latency, that can be a problem, right?

Patrick Moorhead: Latency matters. So many different connective tissue.

Christina Crawford Kosmowski: Right.

Patrick Moorhead: It’s not just the compute and the GPUs.

Christina Crawford Kosmowski: Right. Right.

Patrick Moorhead: It’s the connectivity between the networking.

Christina Crawford Kosmowski: Right. It is.

Patrick Moorhead: The storage system.

Christina Crawford Kosmowski: Yeah. Exactly.

Patrick Moorhead: Pretty much everything. And they’re so interconnected. Even these reasoning systems are systems that are talking to each other that are across the data center.

Christina Crawford Kosmowski: Right. Definitely. Yeah, it’s all interconnected, and I think that’s the theme here this week of World Economic Forum, is around this connected collaborative intelligence, right? And I think one, we’re seeing this connectedness of devices, and that’s really where we play, is there’s this hybrid world which you think of as on-premise and in the cloud. But there’s also these connected devices. And so, we monitor connected devices for our companies as well. And so, being able to pull all that together is critical. And then, as you mentioned, the stack is getting more complicated. And I was in a roundtable panel on chips, and everyone’s thinking, “What does LogicMonitor have to do with chips?” Well, no. The chips are coming into these data centers, and they’re having to be more efficient. And all of the customers are saying, especially in the IT departments, they’re like, “I have got to deliver all this innovation, but I also have to think about the long-term sustainability of both cost and energy at compute. So how do I think about managing that?”

Daniel Newman: Interesting.

Christina Crawford Kosmowski: And so, we’re able to kind of bring all that metadata together as well and give those recommendations around how do you think about where those workloads go? How do you think about fulfilling this demand? And then, we also have our collector architecture is on the edge, so we can also provide the efficiency. So while the chips themselves are getting more efficient, we also can do some of that balancing where we don’t have to bring all the data in, pass it all through it all the time. We can say, It’s not necessary to pull all 24 by seven. And so, there’s lots of ways I think we’re bringing that connectedness from the devices through the stack in the technology and just trying to kind of bring it all together.

Daniel Newman: Yeah. So Christina, as we sort of wrap up here, I’d love to get a little bit of your outlook. So we talked about all the success that’s been built up, a combination here a little bit of this experience. But we’ve got this big transition going on. You’re hearing a lot about deregulation.

Christina Crawford Kosmowski: Yeah.

Daniel Newman: That should be exciting for your own possibilities with M&A.

Christina Crawford Kosmowski: Right.

Daniel Newman: Continued investment growth and IPO, which finally the market might actually support.

Christina Crawford Kosmowski: Yes.

Daniel Newman: I’m not suggesting you do that, I’m also not suggesting you don’t.

Christina Crawford Kosmowski: Maybe not.

Daniel Newman: But what do you think the outlook for ’25, what are some of your big objectives, and what are some of the things you’re really paying attention to?

Christina Crawford Kosmowski: Yeah. I think this word collaboration I think sometimes gets overused, but it is really true. It’s about how can you collaborate across the data so you’re bringing the data in a unified view. And that’s critical because data is kind of going everywhere. As you said, it’s fragmented. There’s all these different devices, there’s connectedness in the supply chain and the tech stack, and then there’s connectedness in regulation and policies and frameworks. And so, I think collaboration’s one word. I think interoperability is going to be really important.

Patrick Moorhead: Right.

Christina Crawford Kosmowski: And so, all of that, and we’re certainly looking for seeing here, coming out with these frameworks. But then our technology ourselves, we’re saying, “where can we build these partnerships so that we can have that interoperability with the entire stack that companies are looking for?”

Patrick Moorhead: Yeah, it makes sense. That interconnectedness is literally everything. Just when you think things could get simpler and we figured this things out, we add another layer and more complexity.

Christina Crawford Kosmowski: That’s Right.

Patrick Moorhead: So absolutely company adding value in the stack, it’s great to hear.

Daniel Newman: Christina, I just want to thank you so much for joining us. It’s great to meet with you here.

Christina Crawford Kosmowski: Yes.

Daniel Newman: I’m sure we’ll see you a lot this year. As you continue your journey, you will show up more and more on our radar.

Christina Crawford Kosmowski: Yes. Great.

Daniel Newman: And we’re wishing you all the best and success, and enjoy the rest of your event.

Christina Crawford Kosmowski: Thank you. Thanks for having me.

Patrick Moorhead: Thank you very much.

Christina Crawford Kosmowski: And yeah, we’re ready to crush 2025, for sure.

Daniel Newman: And thank you everybody for being part of The Six Five On the Road with A View from Davos. Hit subscribe, join us for all of our content coverage and conversations here on the ground. It’s been a great event right here on the way in the middle of the World Economic Forum with the backdrop of the Magic Mountain behind us. See you all later.

The post Exploring the Future of Technology with LogicMonitor’s CEO – The View From Davos appeared first on Moor Insights & Strategy.

]]>
The View From Davos with Ericsson’s Åsa Tamsons https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-ericssons-asa-tamsons/ Tue, 28 Jan 2025 18:40:51 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45403 Åsa Tamsons, Senior Vice President at Ericsson, joins Daniel Newman to share her insights on the innovative edge of enterprise wireless solutions and Ericsson's role in shaping the future of global connectivity.

The post The View From Davos with Ericsson’s Åsa Tamsons appeared first on Moor Insights & Strategy.

]]>
Daniel Newman had a fascinating “walk and talk” at Davos with Ericsson’s Head of Enterprise Wireless Solutions, Åsa Tamsons. ❄ AI & tech innovation demands that we push connectivity to the edge. Get their take on cutting-edge developments in enterprise wireless solutions and the future of connectivity. Tune in as they discuss 👇

  • The evolving landscape of enterprise wireless solutions
  • The importance of investing in network infrastructure for global competitiveness
  • Responsible resource management, even as networks scale to accommodate growing data demands
  • The potential of edge computing to power real-time applications like video analytics and AI-driven solutions
  • Ericsson’s collaborations with partners to unlock the full potential of edge connectivity

Learn more at Ericsson.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey, everyone. The Six Five On the Road here with a View from Davos. Actually, I should say The Six Five is on the move, because we’re moving here.

Åsa Tamsons: We’re moving.

Daniel Newman: We’re walking and talking.

Åsa Tamsons: We’re moving.

Daniel Newman: And we’re heading down the promenade. Asa, it’s so good to have you back. Last year, we stood in the snow. It was Six Five in the snow. And we talked a little bit about what was going on in Davos. This year, you and I have had a few chances to get together. You sat on a panel with me at Kearney where we talked about this new CEO research. Thank you so much for that. And then I moderate a panel for you.

Åsa Tamsons: Thank you so much.

Daniel Newman: So now we’re all even. Now I owe you one, because now you’re coming back and helping me. Look, as we walk and talk here down the promenade, one of the things I want to ask executives like you where you’re leading Ericsson is when you come to Davos, what is the big sort of goal? You know, You see some of these firms have big houses. They’re making huge investments. And I think Ericsson has a big presence, big delegates, a number of delegates, but not on the promenade necessarily. What do you come here? What do you hope to achieve in the five days you’re here?

Åsa Tamsons: Well, we’re a relatively small delegation from Ericsson and we have a very small, very thin back office. We’re basically just of us with a clear purpose, our CEO, our CTO, our head of government and political affairs, our head of sustainability, our head of Europe and Latin America, and myself who runs the enterprise wireless solutions business.

Daniel Newman: All right.

Åsa Tamsons: So I think overall, I mean, one is to drive the awareness and influencing understanding of how important investments in the network infrastructure is critical, not only for companies but also for industries and at, you could say, at the core of it, nation’s competitiveness. So that’s one part. Another part, there are a lot of tech firms here, so that prominent folks. So we have a lot of partners that we take the opportunity to meet while we’re here together. On my-

Daniel Newman: Like at Qualcomm last night.

Åsa Tamsons: Like Qualcomm last night.

Daniel Newman: We had a nice evening.

Åsa Tamsons: It’s nice and I think it’s a great opportunity to actually spend some quality one-on-one time while we’re all here. But there also sustainability, I think it’s a topic we have not talked as much this year. But I think the planet hasn’t changed. We still have to move that topic. So we still believe in energy efficiency. We do believe we need to be mindful of our resources, whether that’s energy capacity, if that’s other type of resources. So that’s one. And for me, being a traditional in the heart, we’re a telecom industry and telecom company, here, you have a much broader industry. And for us, networks of the future are here to not only provide our cell phones with connectivity but all important enterprise and industry applications. So personally, I spend a lot of time meeting companies in different sectors to understand, both drive awareness, but also how we can work closely together to connect to factories, to warehouses, to solve problems where they don’t have connectivity at the edge, but also make them aware about what competence, capabilities we have and build relationships.

Daniel Newman: Absolutely. Yeah. No. I mean, great. It was like a chance-and-stance meeting last year and now we’ve been able to build a rapport over a year and spend more time together and learning a lot about what you’re doing. It’s great. You mentioned something that I think has been sidelined and I think with the new administration in the US and knowing the world tends to follow the lead in the US in many things, there’s some things we do that are not always appreciated and I understand that, but like when it comes to technology, it’s really a driving force. We’re seeing regulation change. We’re seeing a bit of a pull away from some of this. Ericsson being a Swedish company and thinking… I’m not trying to get too political, but I just wonder how are you guys thinking about bringing that back into the light? Because at Davos here, it’s been much less of a subject, but to your point, planet’s not changing.

Åsa Tamsons: Yeah. I think for us it’s being… Running technology that actually relies on global standardization. So we work with all our competitors to set the standard for the foundation, our technology, which I think is pretty unique. So we actually can provide a better consistent service around the world. So at the heart of it, we’re a global company.

Daniel Newman: Yeah. Absolutely.

Åsa Tamsons: Sweden’s the whole market is tiny. So for us, being present in the large markets that drive momentum. And obviously US is a big home market for us. India is a growing market for us. And frankly, we have a mission where we really want to power the network and the next wave of growth for industries and the world. So I think for us, that’s our mission and now we have to adapt for how we need to operate.

Daniel Newman: Yeah. And one of the really practical things I would say is the edge provides a really big opportunity to be more thoughtful about where… You and I did a panel together, Where Computation Takes Place. Something else I’d love to talk to you a little bit about is Ericsson’s story. You and I have had the chance, a little bit on the sidelines, but not necessarily on the record, to talk about a company in transition. Still a prominent brand, known around the world, clearly driving the telco, the mobile, the network industry, but also much bigger than that. I mean, when you think of this stuff, it all connects to data centers. It all provides an avenue to running LLMs. How are you describing it for those that maybe didn’t come up in the first large wave of Ericsson’s growth and are now seeing this new sort of version of Ericsson? And by the way, a lot of great history, a lot of great pedigree. I’m not saying it’s not. But I’m kind of… I want to hear more about how it’s evolving.

Åsa Tamsons: Where we pushing now is really how we can drive limitless connectivity to improve people’s lives, redefine businesses and pioneer sustainable future, and at the heart of it, a really programmable open network where you can start access the resources like a network API, just as we have been doing with a lot of other capabilities in the software stack. But many of these capabilities are built down deep in a core network. So here we’ve done an amazing job, I would say, over the last couple of years to make sure we can actually start, build network capabilities that are accessible and also open so developers can access it in a different way than before. The other part is of course we see we play an important role in driving energy efficiency. So how do we continue while we bring more capacity, lower latency, better capabilities, and we actually increased the scaling of our services. How do we do that while becoming 10X for every jump, more energy efficient so we don’t consume more energy just because we’re using more? And I think that’s a question that is not only important for us, but also as we scale gen AI applications. And there I think AI, but also using the right type of connectivity or process the data where it’s really needed or more relevant will be an important topic as we continue to scale automation AI at the edge and at the core.

Daniel Newman: Absolutely. So maybe we can finish up here. You and I sat on a panel together in the bubble, The Dome with MIT, Forbes, and the topic was connectivity at the edge. I’d love to get your take on that particular conversation, that opportunity. We know a lot of what you just said about where computation takes place, but what is the sort of perspective that you and Ericsson have on connectivity at the edge? And what do you see happening in the next few years?

Åsa Tamsons: Look, we think, we always thought, connectivity and distributed connectivity is critical. I think what is interesting now, many best effort application and where it’s not critical get the data back, if you send a message, for instance, maybe it’s okay that it gets processed a minute later or whenever it’s available. I think where we’re quite excited is the possible when you can start to manage voice and video, not only data, but exchange at the edge. So think about many of these gen AI solution right now. We had this company, VoiceBrain. They’re actually powering, empowering, the staff at the front at airports to get real-time information just by analyzing and assessing unstructured voice data real-time. There are a lot of interesting applications. So we see that in law enforcement, fire responders, but also frankly in inventory in hospitals where you now actually start to see the commercialization of video analytics powered by our connectivity. And I think it’s quite… You will not need cellular connectivity everywhere, but you actually now have the possibility to do that but at much competitive cost with the delivery and quality that you need at edge to start actually scaling a lot of exciting applications that bring better safety, improve health, but also improve productivity and energy efficiency. I think that’s exciting. But I think also back to the question or conversation we had in IoT, we’ve been talking about it. I said 20 years, probably 40 years, or something like that, and now we start to see some of the real value that we’ve been talking about actually being enabled.

Daniel Newman: Absolutely. Hey, Asa, I want to thank you so much. It was great spending time with you here in Davos. We’re right… The Magic Mountain is right here behind us. Or is it that one? I don’t know which one it is, but it’s always so wonderful to spend time with you. Congratulations on the success so far and look forward to doing many more things together and having you back on the show with me.

Åsa Tamsons: Wonderful.

Daniel Newman: So have a great rest of your Davos.

Åsa Tamsons: Thank you.

Daniel Newman: And thank you, everybody, for joining this little walk and talk. I’m Daniel Newman. This is The Six Five. We are On the Road or we’re on the street here in Davos. Appreciate you tuning in. We’ll see you all later.

The post The View From Davos with Ericsson’s Åsa Tamsons appeared first on Moor Insights & Strategy.

]]>
MI&S Weekly Analyst Insights — Week Ending January 24, 2025 https://moorinsightsstrategy.com/mis-weekly-analyst-insights-week-ending-january-24-2025/ Tue, 28 Jan 2025 00:19:22 +0000 https://moorinsightsstrategy.com/?p=45339 MI&S Weekly Analyst Insights — Week Ending January 24, 2025. A wrap up of what our team published during the last week.

The post MI&S Weekly Analyst Insights — Week Ending January 24, 2025 appeared first on Moor Insights & Strategy.

]]>
MI&S Logo_color

Welcome to this edition of our Weekly Analyst Insights roundup, which features the key insights our analysts have developed based on the past week’s events.

I’m not surprised that 2025 has started right where 2024 left off, with AI dominating conversations across tech. The Trump administration has only added to this with the announcement of the $500 billion Project Stargate for AI, which our analysts Paul Smith-Goodson, Matt Kimball, and Will Townsend evaluate from different angles in this week’s updates. Look for more about Project Stargate from us in the days to come.

Chuck Robbins at Cisco AI Summit
Cisco CEO Chuck Robbins at the opening session of the Cisco AI Summit (Photo by Will Townsend)

Last week, Will also published his thoughts on the Cisco AI Summit, which he called “the best AI event I have attended to date.” (I’m not surprised; earlier this month I wrote about how thoughtful Cisco’s approach to enterprise AI has been.) Meanwhile, Anshel Sag wrote about NVIDIA’s latest graphics card for gaming PCs, the AI features of which make it, he says, “without a doubt the fastest graphics card in the world.” It’s a good reminder that even though datacenter GPUs are now the biggest financial engine for NVIDIA, the company has never abandoned its roots — nor lost its dominance — in PC graphics.

The team published a lot of research last week and did some travel as well. While I was in Davos, Anshel attended Samsung Galaxy Unpacked in San Jose and the MIT Reality Hack in Boston. This week, Robert is in Las Vegas for Acumatica Summit and then NYC for Microsoft’s AI Tour. February, March and April are already shaping up to be busy travel months for the MI&S team. Look for our thoughts on these events in upcoming installments of the MI&S Weekly Analyst Insights.

Hope you have a great week,

Patrick Moorhead

———

Our MI&S team published 25 deliverables:

This past week, MI&S analysts have been quoted in top-tier publications such as Network World, Yahoo Finance, and Venture Beat with our thoughts on Databricks, Intel, Samsung, and Starlink. Robert was a guest on the WBSRocks Analysts Gone Wild Podcast to discuss enterprise software

MI&S Quick Insights

When I was spending time with a client last week, it really stood out to me how far we’ve come lately in clarifying AI offerings. Thanks to the rampant pace of AI development over the past couple of years, sometimes the fundamentals of product management and product marketing have been deprioritized. For example, plenty of product messaging and feature descriptions have been released when they’re still at a notoriously high level across the board. But that is starting to change. I take this as a reminder that early adopters are far more tolerant of a product’s rough edges than the general market. It’s a positive sign that we are now seeing more maturity and wider adoption of enterprise AI software products. I believe it’s also a sign that things will slow down somewhat as pilots and prototypes move towards production.

I have been spending more time with NVIDIA agentic blueprints, and I can say that the detail and effort taken to document how to get the blueprints up and running is pretty impressive. This also stands out because these agents can be deployed on-premises or in a cloud. By contrast, most agentic efforts so far have been limited to a specific cloud or a SaaS platform. I am hoping that this level of deployment choice is a sign of things to come, rather than an exception.

2025 has started off with a lot of AI investing. Whether it’s ServiceNow’s acquisition of Cuein, new VC rounds at AI startups, or even the massive commitment from the Trump administration for U.S. AI data centers, it seems like investors are finally getting off the sidelines. That’s a good thing in general, but it will signal a shift in priorities for existing product teams. If you are a developer, I would expect more focus on ease of use, consumability, and samples versus net-new innovations and APIs.

Lastly, while this news is a bit old, I did want to address the hoopla that was made when Satya Nadella of Microsoft made his comments about the future of SaaS. It seems that in the tech world there is nothing we like more than declaring things dead. However . . . that’s not what he said. What will really happen is that the business-logic and user-experience layers of SaaS will be massively changed by agents. But the overall value proposition of SaaS platforms will likely remain intact.

When I was spending time with a client last week, it really stood out to me how far we’ve come lately in clarifying AI offerings. Thanks to the rampant pace of AI development over the past couple of years, sometimes the fundamentals of product management and product marketing have been deprioritized. For example, plenty of product messaging and feature descriptions have been released when they’re still at a notoriously high level across the board. But that is starting to change. I take this as a reminder that early adopters are far more tolerant of a product’s rough edges than the general market. It’s a positive sign that we are now seeing more maturity and wider adoption of enterprise AI software products. I believe it’s also a sign that things will slow down somewhat as pilots and prototypes move towards production.

I have been spending more time with NVIDIA agentic blueprints, and I can say that the detail and effort taken to document how to get the blueprints up and running is pretty impressive. This also stands out because these agents can be deployed on-premises or in a cloud. By contrast, most agentic efforts so far have been limited to a specific cloud or a SaaS platform. I am hoping that this level of deployment choice is a sign of things to come, rather than an exception.

2025 has started off with a lot of AI investing. Whether it’s ServiceNow’s acquisition of Cuein, new VC rounds at AI startups, or even the massive commitment from the Trump administration for U.S. AI data centers, it seems like investors are finally getting off the sidelines. That’s a good thing in general, but it will signal a shift in priorities for existing product teams. If you are a developer, I would expect more focus on ease of use, consumability, and samples versus net-new innovations and APIs.

Lastly, while this news is a bit old, I did want to address the hoopla that was made when Satya Nadella of Microsoft made his comments about the future of SaaS. It seems that in the tech world there is nothing we like more than declaring things dead. However . . . that’s not what he said. What will really happen is that the business-logic and user-experience layers of SaaS will be massively changed by agents. But the overall value proposition of SaaS platforms will likely remain intact.

In the opening days of his new term, President Trump announced a massive $500 billion AI project called Stargate. Supercomputers and AI datacenters will be built by key partners that include OpenAI, Oracle, SoftBank, and MGX. The project will use technology created by Arm, Microsoft, NVIDIA, Oracle, and OpenAI. SoftBank is the lead financial manager, OpenAI will handle model development and training, and Oracle will manage the data aspect. The project’s objective is to maintain U.S. leadership in AI and to create advanced AI in the form of artificial general intelligence. AGI will be able to perform a wide array of tasks with human-like intelligence, and potentially revolutionize fields like material science, medicine, and environmental science.

With an initial investment of $100 billion, construction has already begun in Abilene, Texas, where Microsoft is building an AI supercomputer. The investment is slated to ramp up to $500 billion by 2029. The plan is to establish 10 datacenters of 500,000 square feet each, with intentions to expand to another 10 across the U.S. once further site evaluations are complete. This project should create significant economic and security benefits for the U.S., especially because it emphasizes national and military security, aiming to enhance capabilities in data analysis, surveillance, and cybersecurity to safeguard against strategic threats.

To truly evaluate the project, we need more details. It appears that Microsoft initially planned to build a supercomputer exclusively for OpenAI with a $100 billion price tag, and that plan morphed into the much larger $500 billion national plan complete with supercomputer and multiple datacenters. We will need more information to fully understand the plans, how they evolved, and how they are being implemented.

Brightcove is smartly leveraging Amazon Q to address the complex technical queries from its global client base. (If you need a refresher on Amazon Q, take a look at this writeup we did last year.) By using Amazon Q, Brightcove aims to empower its support team to reduce research time and significantly improve the customer service experience, particularly for intricate issues like video embedding.

While acknowledging the potential of generative AI, Brightcove wisely emphasizes a cautious approach with rigorous testing and expert scrutiny to ensure accuracy and build trust. This focus is increasingly important in today’s AI landscape, where many companies get caught up chasing cost cuts and automation. Brightcove seems to recognize that the true potential of generative AI often lies in augmenting human capabilities and fostering deeper customer understanding.

Equipping its support team with Amazon Q will likely speed up response times and enable more effective problem-solving and stronger customer relationships. This is a strong way to leverage AI for a competitive edge. Brightcove’s strategy, with its emphasis on accuracy and human-centered implementation, highlights how accuracy, trust, and scalability can be key to maximizing the benefits of this technology to improve CX.

Adding on to my colleague’s contributions about Project Stargate, here are a few of my thoughts:

  • It didn’t take long after this historic announcement for the questions to raise. If Stargate is going to be an AI venture, what exactly is the product or service being offered? Is it an AI cloud? Purely an R&D platform? Ten separate 500,000-square-foot datacenters racked with AI-specific infrastructure sounds an awful lot like a cloud to me, but maybe it isn’t.
  • Is this more about creating 100,000 high-paying jobs, with the understanding that the market will find use for these datacenters in quick order? While Oracle’s Larry Ellison spoke at a high level about being able to create cancer therapies and vaccines, I’m a little confused about how this plays out specifically.
  • While the investment of $500 billion over five years is incredible, what is the projected time until the first customer, partner, user, or consumer is actually using Stargate? It seems likely that the ROI on this investment will be a bit further out.
  • While we know that Oracle, OpenAI, and Softbank are working in partnership with NVIDIA and Microsoft — what does this environment actually look like?
  • What does Stargate mean for the cloud market? Anything? Is Stargate reserved only for the largest of large use cases that would typically require an on-premises cluster? Or is the net being spread wider to enable a real incubation across the spectrum — from the largest-of-large to the smallest-of-small companies?

I am a big fan of the government recognizing the need for the United States to stay far ahead of its competitors in any area of technology, and certainly AI is hugely important. As a country, we are already investing far more than the rest of the world combined. However, as a person who is inherently skeptical of anything the U.S. government does in the longer term, I would like to understand better the how, why, what, when, and where of Project Stargate.

When HPE launched silicon root-of-trust back in 2017, it was a game-changer in the server market. By examining and responding to the millions of lines of code that execute before a server even boots an operating system, the company provided what was the most secure server in the industry. And by integrating this with its integrated lights out (iLO) management, HPE created servers that could not only detect malware at the lowest levels, but also take corrective actions to mitigate the impact.

Since 2017, the threat landscape has evolved considerably. Quantum lockers and AI-driven malware kits create a new set of challenges that require a new way of securing platforms. Here’s the question: is the infrastructure supporting our most critical workloads evolving to meet these challenges? We’ve seen the silicon vendors respond. Now as server platforms prepare to refresh with the newest CPUs, I’ll be curious to see whether the baseboard management controllers and hardware-based security mechanisms deliver the required protection.

Customer data platforms (CDPs) are like superheroes for enterprises looking to delight their customers. They collect data from various sources and organize it in a centralized location, ensuring that everyone has access to the same insights about each customer. This collaborative approach provides better teamwork and deepens customer understanding. Armed with this comprehensive data, marketing teams can deliver personalized campaigns that make customers feel valued and encourage repeat business. CDPs serve as treasure troves of customer data, simplifying data sharing and utilization for teams.

But CDPs aren’t without their challenges. They can be tricky to scale up, especially for big companies with lots of data. And if the data isn’t accurate, it can mess up everything. Many CDPs don’t have the best analytics tools, which can make it hard to figure out what’s working and what’s not. And integrating CDPs with old systems can be a real pain.

Despite these challenges, the CDP market is growing fast. It’s expected to reach $72 billion by 2033, which is a huge increase from $7.82 billion in 2024. This growth is happening because businesses want to do a better job of engaging with their customers and making them feel valued. They also want to be able to use data to make smart marketing decisions. And they’re starting to use AI, automation, and machine learning to make CDPs even better.

If you’re a business considering CDPs, consider your needs and how you’ll use them. Ensure the CDP can handle your data, integrate with your systems, and provide analysis tools for smart decisions. CDPs are promising solutions for business growth and success. Look for my upcoming Forbes article on the state of enterprise data, which will highlight CDPs.

Ericsson recently integrated large language models into its NetCloud management platform. Thanks to this, AI agents can process network data and technical documentation to generate configuration recommendations. What is unique is that the system performs this functionality without exposing sensitive information to third parties, and thus — by design — provides a higher degree of security and control for datacenter deployments. It is also worth noting that the architecture is agentic in nature and employs multiple agents to solve complex tasks, including troubleshooting connectivity issues, automating infrastructure provisioning, and translating business intent and requirements into network policies. Such tasks have required manual intervention in the past, and if Ericsson can successfully execute in this area, it could lead to incremental enterprise networking revenue opportunities for Ericsson’s customers.

It looks like manufacturers are getting serious about AI in 2025. They’re increasing their AI budgets to become more efficient and competitive. The good news here is that they’re mainly focused on using AI to help their employees, not replace them, aligning with the principles of Industry 5.0. To leverage AI, I suggest manufacturers modernize their ERP systems, improve their data management strategies, and upgrade management processes.

Transforming your business to cloud-based ERP systems is a key for taking advantage of AI. By modernizing, manufacturers can optimize their investments and reap the benefits of the new technology.

Last week I published a case study that highlights this ERP–AI connection and the importance of modernization for making it work: “Hearst Corporation Modernizes Oracle ERP with Strong Change Management and Data Management Practices.” As enterprises adopt AI-driven solutions, it’s crucial to balance the technology advancements with addressing the human and organizational aspects of transformation. Two key pillars — change management and data management — are essential for achieving actionable outcomes. Change management focuses on organizational and human factors, while data management ensures data completeness and quality, enabling accurate and timely insights. Without both, enterprises may struggle to modernize, integrate workflows, or make informed decisions.

For this case study, I had the chance to sit down with David Hovstadius, senior vice president of finance operations at Hearst Corporation, who emphasized the importance of these principles during Hearst’s transition to Oracle Cloud ERP some years ago — which continues to pay dividends as the company embraces generative AI today. By prioritizing change management and data management, the company laid a foundation that not only facilitated its ERP implementation, but also enabled continuous technological and process improvements as AI technologies emerged. For more details, check out the article linked above.

The new Samsung Galaxy S25 smartphone launch happened last week, and it demonstrated how Google, Samsung, and Qualcomm are working in lockstep not only in mobile but also in XR with Project Moohan. Witnessing the interplay of Gemini with the depth of the Moohan experience clearly demonstrates how the three companies are working together to deliver the best AI experience in mobile. For more context on this partnership, see my coverage of last month’s launch of the Android XR spatial OS.

NXP has announced the EdgeLock A30 secure authenticator. It’s a standalone chip compatible with many MCUs and MPUs, including NPX’s MCX and i.MX products. Its minuscule size (“smaller than a grain of rice”) and standard I2C interface make it easy to fit into small devices, and NXP’s comprehensive EdgeLock 2GO certificate services ease the commissioning process. Developers need integrated solutions that conform with new and upcoming security and privacy regulations — and customer concerns. For example, the EU’s Batteries Regulation (2023/1542) requires using a Digital Product Passport by 2027, including supply chain provenance, and the EdgeLock A30 is the basis for a scalable solution. The chip has a RISC-V processor and 16 kB of NVM for credential storage, is Common Criteria EAL6+ certified, and is available now.

Last July, I reported that IBM acquired two Software AG properties – StreamSets and webMethods. Software AG’s streamlining continues with the sale of Alfabet and a management buyout of Cumulocity, the company’s IoT division. Cumulocity, founded in 2012, is once again independent after eight years under Software AG. Founder and CEO Bernd Gross told WirtschaftsWoche, “We are moving towards independence as a scale-up,” and “The big IoT boom is still to come.” I’m expecting strategic changes that better align the company with physical AI trends, making it more of a solution enabler than a solution provider.

Verizon’s new AI strategy leans on its strengths in 5G with mobile edge compute (MEC) and fiber. This creates an opportunity for businesses and even cloud providers to move their AI applications as close to the edge as possible using available compute for inference and low-latency applications. I like to see Verizon leaning in this direction because the company has struggled to differentiate its offerings from those of AT&T and T-Mobile.

Cybersecurity researchers at Sophos have discovered that threat actors have exploited Microsoft Teams to spread malicious links and files, potentially leading to ransomware infections. These attackers use AI for social engineering, making the attacks harder to detect. Microsoft has acknowledged the issue and is working on a solution. While these findings highlight specific threats to Teams, they serve as a broader warning about the increasing risk of similar attacks across all collaboration platforms. The problem is likely not isolated to Microsoft and emphasizes the need for heightened vigilance and robust security measures across the board.

My review of the RTX 5090 graphics card found that NVIDIA continues to innovate in AI. While the 5090 is a very large and power hungry card compared to the 4090, its performance is also considerably higher in 4K with DLSS 4 and 4x frame generation turned on. I also found that the AMP (AI management processor) is a RISC-V core, which is programmable and shows the allure of RISC-V for such applications.

For the first time, all the subsystems necessary to implement universal and fault-tolerant quantum computation have been combined in a photonic architecture. Xanadu has created a photonic quantum computer named Aurora that is a scale model for universal, fault-tolerant quantum computing. Aurora incorporates 35 photonic chips, 84 squeezers, and 36 photon-number-resolving detectors.

The system achieves 12 physical qubit modes per clock cycle, which means it can handle 12 qubits for each processing step, and it has synthesized a cluster state with 86.4 billion modes — reflecting the vast number of different ways 12 qubits can interact with each other. For error correction, it uses a distance-2 repetition code with real-time decoding. Aurora’s architecture is divided into three stages:

  1. Preparing photons to create quantum states
  2. Adjustment of quantum states or entangling the qubits
  3. The QPU performs the computations

The Aurora operates at room temperature and uses fiber-optic networking, which facilitates scalable quantum computing. Xanadu’s design is focused on fault tolerance and scalability. Compared to other photonic quantum computing efforts from makers such as PsiQuantum and Photonics, Aurora stands out with its comprehensive system design, error correction, and scalability. Shared challenges among photonic platforms remain optical loss and high qubit error rates.

Last week the Trump administration instructed the Department of Homeland Security to disband all advisory committees within the agency, including the Cyber Safety Review Board. The CSRB was created under the Biden administration in 2022 and, interestingly, played a role in investigating China-sponsored cyberattacks against U.S. telecom providers. The clean sweep of all advisory committees may simply be a resetting of the guard and a change in policy direction, but it will be interesting to see whether it impacts cyber defense negatively in the short or long term.

Something subtle that I think has been mostly overlooked: Qualcomm’s Snapdragon 8 Elite for Samsung comes with more than just a frequency bump; it also includes customizations in the Qualcomm DSP for some of the new Samsung imaging features, plus an integrated display controller on the SoC for lower power consumption. This is something probably only Samsung could achieve, but it still clearly grows out of Qualcomm’s understanding that Samsung needs something different and custom.

Technology continues to transform sports in more and more ways. One example is the TGL indoor golf league, a tech-enhanced golf league cofounded by Tiger Woods, Rory McIlroy, and Mike McCarley in partnership with the PGA Tour. Recently launched after a year-long delay due to storm damage at its SoFi Center facility in Palm Beach Gardens, Florida, TGL combines virtual and traditional golf. Matches feature six teams of four players competing in a mix of simulator-based and on-course play, including a morphing 3,800-square-foot green. The league’s unique format includes nine-hole team matches, head-to-head play, and overtime closest-to-the-pin contests, with scoring determining playoff seeding. Matches will air live on ESPN and ESPN+.

This is a pretty interesting way to showcase golf with advanced simulators, mechanically altered greens, and innovative visuals. I believe that TGL does a good job of bringing technology together while creating a unique spectator and player experience.

Meanwhile, other sports continue to try out new tech, such as soccer using semi-automated offside technology (SAOT) to make video assistant reviews (VAR) for offsides clearer and faster. But fans aren’t always on board with these changes; VAR in particular has created significant concerns among fans of the Premier League and other top leagues about transparency and how the technology affects the flow and fairness of the game. As I’ve said before, it will always be important for sports to integrate new tech while keeping important traditions alive.

5G mobile and fixed wireless access could play a pivotal role within Project Stargate, the ambitious AI effort announced in the early days of the new Trump administration. As covered elsewhere in this update, the initiative aims to invest $500 billion in infrastructure to build out AI datacenters in the United States. As gen AI becomes more hybrid from the cloud to network edges, mobility could become instrumental in the processing of smaller language models hosted in smaller edge data nodes. 5G has been searching for its killer application beyond fixed wireless access consumer services, and given 5G’s low latency, fast throughput, and massive device support advantages — it may have found it in the rollout of AI.

Podcasts Published

The Enterprise Applications Podcast (Melody Brue, Robert Kramer)

DataCenter Podcast (Will Townsend, Paul Smith-Goodson, Matt Kimball)

Don’t miss future MI&S Podcast episodes! Subscribe to our YouTube Channel here.

Citations

Databricks / Funding and Partners / Patrick Moorhead / Opentools 
Databricks Scores Massive $15.25B Financing to Elevate AI Innovations

Intel / Ways to improve in 2025 / Patrick Moorhead / Network World
What Intel needs to do to get its mojo back

Intel / New CEO / Anshel Sag / Yahoo Finance
Intel races to find its next CEO, but insiders say no clear frontrunners yet

Samsung / Android XR / Anshel Sag / Venture Beat
Samsung teases Android XR devices coming later this year

Starlink / Growth under Trump Administration / Patrick Moorhead / Issues & Insights
Elon Musk’s Starlink Likely To Boom Under Trump Administration

Cohesity / Veritas Acquisition / Robert Kramer / Security Buzz 
Cohesity Acquires Veritas to Become World’s Largest Data Protection Provider

New Gear or Software We Are Using and Testing

  • Kindle Colorsoft (Anshel Sag)
  • Google Pixel Buds 2 Pro (Anshel Sag)
  • XREAL One AR Glasses (Anshel Sag)
  • Google Pixel Watch 3, 41mm (Anshel Sag)
  • Insta360 Link2 4K AI Webcam (Anshel Sag)
  • Google TV streamer – Matter and Thread features (Bill Curtis)
  • Various Matter devices (Bill Curtis)
  • ASUS Zephyrus G16 Gaming Laptop (Anshel Sag)

Events MI&S Plans on Attending In-Person or Virtually (New)

Unless otherwise noted, our analysts will be attending the following events in person.

  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead) 
  • Samsung Galaxy Unpacked, January 22, San Jose (Anshel Sag) 
  • MIT Reality Hack, Boston, January 24-17 (Anshel Sag) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead) 
  • Samsung Galaxy Unpacked, January 22, San Jose (Anshel Sag) 
  • MIT Reality Hack, Boston, January 24-17 (Anshel Sag) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • ZohoDay25, February 3-5, Austin (Robert Kramer, Melody Brue)
  • Oracle NetSuite SuiteConnect, February 6, New York City (Robert Kramer)
  • Cisco Live EMEA, February 10-13, Amsterdam (Will Townsend)
  • SAP Analyst Innovation Council, February 11-12, New York City (Robert Kramer)
  • RingCentral Analyst Summit, February 24-26, Napa (Melody Brue)
  • Arm Analyst Summit, February 18-21, San Francisco (Matt Kimball)
  • Microsoft Threat Intel Summit, February 25, Redmond (Will Townsend)
  • Siemens Datacenter Analyst Summit, February 25-27, Zug, Switzerland (Matt Kimball)
  • EdgeAI Austin, February 25-27, Austin (Bill Curtis is a speaker)
  • Mobile World Congress, March 2-7, Barcelona (Will Townsend)
  • Susecon, March 10-14, Orlando (Matt Kimball)
  • Fastly Accelerate, March 12, Los Angeles (Will Townsend)
  • Synopsys Panel Moderation, March 15, San Jose (Matt Kimball)
  • Adobe Summit, March 18-20, Las Vegas (Melody Brue)
  • Extreme Networks Connect, May 19-22, Paris (Will Townsend)
  • Zendesk Analyst Day, March 25, Las Vegas (Melody Brue)
  • Oracle Database Summit, March 25, Mountain View (Matt Kimball)
  • IBM Infrastructure Analyst Summit, March 25, NYC (Matt Kimball, Melody Brue)
  • Microsoft FabCon March 31–April 2, Las Vegas (Robert Kramer)
  • Canva Create & Analyst Day, April 8-10, Los Angeles (Melody Brue)
  • Infor Analyst Innovation Summit, April 8-9, NYC (Robert Kramer) 
  • NTT Upgrade, April 9-10, San Francisco (Will Townsend)
  • Google Next, April 9-11, Las Vegas (Robert Kramer)
  • Appian World, April 27-30, Denver (Robert Kramer)
  • RSA Conference, April 28-May 1, Las Vegas (Will Townsend)
  • Nutanix.NEXT May 6-9, Washington DC (Matt Kimball)
  • Informatica World, May 13-15, Las Vegas (Robert Kramer)
  • Fastly Accelerate, May 14, Los Angeles (Will Townsend)
  • Dell Tech World, May 19-22, Las Vegas (Matt Kimball)
  • Zscaler Zenith Live, June 2-5, Las Vegas (Will Townsend)
  • Snowflake, June 2-5, San Francisco (Robert Kramer)
  • Cisco Live US, June 8-12, San Diego (Will Townsend)
  • HPE Discover, June 23-26, Las Vegas (Will Townsend)
  • Techritory, October 22-23, Riga (Will Townsend)

Subscribe

Want to talk to the team? Get in touch here!

The post MI&S Weekly Analyst Insights — Week Ending January 24, 2025 appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Cisco’s Chuck Robbins https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-ciscos-chuck-robbins/ Mon, 27 Jan 2025 20:41:35 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45335 Chuck Robbins, Chair and CEO at Cisco joins hosts Daniel Newman and Patrick Moorhead, sharing his insights on global economic trends, Cisco's sustainability efforts, and the transformative power of technology post-Davos.

The post The View from Davos with Cisco’s Chuck Robbins appeared first on Moor Insights & Strategy.

]]>
Cisco CEO Chuck Robbins gives us his state of the market from Davos. Learn what he is saying below ⬇

Patrick Moorhead, Daniel Newman, and Chuck spoke about, wait, you guessed it…. #AI, but Chuck connected the dots on its transformative nature from infrastructure to security. Plus, we got the scoop on the future of Agentic AI and Cisco’s ongoing transformation.

Specifically learn:

  • How AI is driving Cisco’s evolution: From infrastructure to security, Cisco is integrating AI across its portfolio
  • The next frontier of Agentic AI: Imagine a network administrator that’s an AI agent
  • Cisco’s transformation continues: The company’s pivot to recurring revenue and software, with AI driving further growth
  • The role of technology in addressing global challenges
  • Key takeaways from the Davos meeting for businesses and governments

Watch now and stay tuned for another Cisco segment with their Product Chief Jeetu Patel.

You can also subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. We are here at the World Economic Forum having some great conversations. WEF is a unique event. It’s the combination that pulls together, technology, finance, government, regulation, all of that, and of course the biggest conversation so far is artificial intelligence.

Daniel Newman: Yeah, Pat, it is really great, and we’re right here on the promenade and the action is all around us. You know, I just took my eye off you for just a second because some car drove by that I had to point to Chuck and say, “Hey, Chuck, look at that thing.” But, you know, it is really an interesting moment here in Davos. You know, they call it the Magic Mountain, and it really is scenic. It’s hard to actually get on camera how beautiful it is here, but people descend on this location to really see this opportunity to bring the public private conversation forward. Some of the biggest inflections in history are happening right now. It’s been a really big week in politics. It’s a massive shift for almost every business on the planet because of AI and being able to meet so many leaders in one place, the density, as a couple chip guys, the density here is awesome.

Patrick Moorhead: Yeah. It really is. So Chuck, thanks for coming on the show.

Chuck Robbins: Great to be here.

Patrick Moorhead: No. It is.

Chuck Robbins: This weather’s terrible too.

Patrick Moorhead: No. No. They tell me it’s always like this.

Chuck Robbins: It’s always like this. If you’re the Chamber of Commerce, no, you probably want snow actually.

Patrick Moorhead: So maybe a great place to start is why do you come to Davos? And what do you want to get out of it, or what observations have you seen so far?

Chuck Robbins: Dan just described why I come to Davos.

Daniel Newman: It’s cars?

Chuck Robbins: The great cars. Exactly. No. Look, it’s an incredible place where you can meet CEOs. I can meet government leaders. You can actually spend a lot of time talking to companies about next generation technologies, what’s going on, the geopolitical dynamics that are happening around the world. It’s just a great convening and a lot of great people to spend time with. It’s highly efficient for a lot of meetings.

Patrick Moorhead: Right. It really is right here on the promenade.

Chuck Robbins: Not efficient getting here. It’s highly efficient once you’re here.

Patrick Moorhead: Exactly. Well said.

Daniel Newman: It’s really interesting. But I said that, you know, when I convinced him to come, he had to do his whole mea culpa because all those years where he said, why he didn’t come.

Patrick Moorhead: Exactly.

Daniel Newman: Now he’s here. He is like, oh my gosh. I said, by 11 o’clock we’d had six meetings that would’ve taken us three months to have.

Chuck Robbins: Exactly.

Daniel Newman: Chuck, Cisco’s in a transformation of its own. You’ve completely evolved the company. AI, of course, is probably the biggest conversation that’s on the ground here. Talk a little bit about how Cisco is thinking about it, how it’s impacting the business. Some of the recent announcements, you had a huge summit a week ago.

Chuck Robbins: Yep.

Daniel Newman: I’d just kind of love for the market out there to hear the story because Cisco’s never been more relevant than it is right now.

Chuck Robbins: Yeah. I think the way to think about us is we plan in three areas. Number one, we provide infrastructure for AI. So if you’re a cloud player, we’re selling high performance ethernet underneath these GPU clusters. To help drive the connectivity, we’re selling optics to connect the clusters. We’re selling optics to connect data centers. And it’s just a huge opportunity for us in that space. Secondly, we’re building integrated technology for the enterprise. So integration of GPUs or NPUs or LPUs with traditional CPUs, but also with storage, networking, and an orchestration layer to really help our customers deploy AI apps more easily. And then the final area we plan in a big way is security.

And the launch you talked about last week, we had an AI summit where we launched AI defense, which is an incredible suite of products that really help our customers put guardrails around their models. And the great news is it applies to all models. So you don’t have to have a different security posture for every model. The team came out with it. We had a great turnout last week from both customers and analysts and press, and the feedback’s been very positive. Most customers that came want to get into trials with us pretty quickly. So that’s all going well. And then obviously we’re looking now at Agentic and what we can do there. And there’s some great thinking going on right now inside the company about some new offers that we think we can bring forward.

Patrick Moorhead: So Chuck, since you became leader, I mean it really has been transition after transition after transition. I mean, you successfully parlayed the networking opportunity, which still is an opportunity into security, and then you added another element which was data with the acquisition of Splunk and you’re layering, what should we expect over the next, I mean, I know there’s a lot of execution around AI, but what should we be looking for the next transformation?

Chuck Robbins: Well, I think two things I would call out. Number one, you’re going to see very tight integration across all those things that you just mentioned, because the reality of the power of all of them being together at Cisco is us leveraging them to give our customers more immediate insights about what’s going on in their infrastructure. So real quickly, application has a problem. Is it the application itself? Is it infrastructure? Is it storage? Is it networking? Is it a cyber attack? They don’t know. It takes too long to figure that out. We think we can help them get to that. So bringing all this stuff together is very important. But the second is this whole notion of Agentic AI, you think about what we could do to potentially create a network administrator that’s actually an agent…

Patrick Moorhead: Sure.

Chuck Robbins: … and actually an integrated solution in that space, so I think stay tuned. We’re looking at bringing some of that stuff forward as well.

Patrick Moorhead: Yeah. So it sounds a lot, I mean, you’ve been known for the easy button for a long time and you’re essentially extending the easy button. I’ve never had a CIO ever say, hey, I’d like more vendors to work with and I want it to be more complex. So I think with all of this, you’re really addressing that. And I, at least, our research suggests it’s very unique.

Chuck Robbins: Well, if you think about it, even when we acquire a company, you have a management console and then a management console. AI just allows you to put an abstraction layer on top of it and not go through the complex long integration that you typically have to do to get those things to work together.

Patrick Moorhead: Right.

Chuck Robbins: And you can put a layer of AI on top and solve the problem immediately. It’s incredible what this technology’s doing for us.

Patrick Moorhead: Yeah. It really is.

Daniel Newman: Yeah. There’s not a technology inflection in history, Chuck, that is going to move more quickly and it is going to abstract away more complexity and change the dynamic of these- a lot of these companies, I think there’s going to be a lot of change of who we see, what we see here. Now we’ve only got about a minute, but I did want to ask you one thing. You guys have been through, you know, Pat sort of alluded to this massive pivot. You’ve pivoted to more than half your revenue recurring. You’ve pivoted to a lot of your revenue in software. I mean, over the next few years, is Cisco going to, you know, I mean, how much more of your business can you turn to, that sort of high value EV shifting business? It should be a great market play.

Chuck Robbins: Well, the great news is that as long as we’re selling a lot of infrastructure for AI, that percentage isn’t going to go up significantly because we sell this stuff to our customers, and it’s a traditional buying method, but we’re okay with that. So we’re designing on silicon. We’re building great hardware, and the teams are doing a phenomenal job. And I think there’s going to be a lot of that. And over time, you know, I think over time we still look at consumption options where customers want to consume anything as a service and we’re going to deliver on it where we need to.

Daniel Newman: So it’s kind of an and you grow all the sides.

Chuck Robbins: Yeah. We like both.

Daniel Newman: And it keeps catching up. By the way, I love that he mentioned LPUs.

Patrick Moorhead: Me too. Me too.

Daniel Newman: So interesting.

Chuck Robbins: Isn’t that what you told me to say?

Daniel Newman: I couldn’t have paid you to say it, Chuck. Thank you so much. Thank you everybody for being part of this Six Five On the Road. It’s a View from Davos. We enjoy you being part of our community. Hit that subscribe button. Check out all the coverage from Patrick and myself here at Davos. For now, though, we got to say goodbye. We’ll see you all later.

The post The View from Davos with Cisco’s Chuck Robbins appeared first on Moor Insights & Strategy.

]]>
Six Five On The Road: Samsung Launches Flagship Phone Positioned as “A True AI Companion” – at Galaxy Unpacked https://moorinsightsstrategy.com/the-six-five/samsung-launches-flagship-phone-positioned-as-a-true-ai-companion/ Mon, 27 Jan 2025 20:00:20 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45328 Brad Haczynski, SVP at Samsung, joins Olivier Blanchard and Anshel Sag to share insights on deploying AI effectively in businesses with the new Galaxy S25.

The post Six Five On The Road: Samsung Launches Flagship Phone Positioned as “A True AI Companion” – at Galaxy Unpacked appeared first on Moor Insights & Strategy.

]]>
You could be carrying an AI agent in your pocket. 📲

Olivier Blanchard, Research Director & Practice Lead, AI, Futurum, and Anshel Sag, Principal Analyst, Moor Insights & Strategy, are at the Samsung Galaxy Unpacked event where they’re joined by Samsung Electronics America’s Brad Haczynski, SVP and General Manager for B2B Mobile eXperience. They share a conversation on Samsung’s reveal of the, Galaxy S25 series, packed with powerful AI capabilities for this episode of Six Five On The Road.

Tune in for details on:

  • Brad Haczynski’s insights into the Galaxy S25
  • Samsung’s AI strategy to empower businesses & professional with productivity and efficiency
  • How Samsung’s newly created AI agent will influence businesses
  • Approaches adopted by Samsung to ensure privacy and create a secure environment for businesses utilizing mobile technology, with a special focus on AI implementations

Learn more at Samsung.

Watch the video below at Six Five at Galaxy Unpacked, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Olivier Blanchard: Hi, Six Five is On the Road at Galaxy Unpacked in San Jose. I’m Olivier Blanchard with Anshel Sag, and we’re going to dig a little bit deeper into the announcements that were made today by Samsung.

Anshel Sag: We are joined today by Brad Haczynski. Welcome to Six Five. I know it’s been a while, but we’re happy to have you. And can you maybe tell us a little bit about your role at Samsung?

Brad Haczynski: Sure. So first of all, thank you for having me. It’s great to be here again. Always enjoy our conversations. So my role at Samsung is I’m responsible for our mobile B2B business in the United States. That means full revenue responsibility for growing the business. It’s across the entire portfolio from phones, tablets, watches, buds, rings, all the way to software and services and our incredible Knox platform. Our team is really responsible for engaging with customers, having a deeper understanding of what their requirements are for businesses, in particular with our mobile products, and then ultimately funneling that back to Suwon in Korea. So a lot of what we’ll talk about today is sort of what we’ve learned since we’ve launched technology such as Galaxy AI, and then how we quickly pivot and give that voice to the customer back so that we can make changes within our product groups at R&D to serve our customers best and continue the evolution of how we work with them.

Olivier Blanchard: So lots of announcements today. It’d be great if you could just run us through some of the main ones, the most important ones, and especially with the Galaxy S25 announcements. And it would be great also if you could give us a little bit of context on how moving from just the consumer market also how those are going to help business people and the enterprise and the commercial sector.

Brad Haczynski: Yeah. So today we launched, as you know, we do typically two launches a year. In the summer we launch our foldable series, and here we launched our exquisite S-series flagship devices, the S25 family. And what we really talked about today is really the evolution of Galaxy AI and the capabilities and what we’ve heard from customers across the world. So we’ve seen a big uptake in really how consumers and businesses are using the products. So we talk a lot about really making it a more personal native experience. So there was a lot of talk today about the One UI 7, which is more of a native agent capability. There’s a lot of incremental improvements at a device level where we talked about some of the new capabilities such as the camera, the wide lens camera capabilities, new power dissipation capabilities.

So really just it’s really improving upon that experience, but then working even more closely with a company like Google on how do we really define what they’re doing with their Gemini capabilities with what we’re doing with Galaxy AI and how we’re expanding upon that. And when it comes to businesses, we’re going to have a lot more of a discussion here. We can get into some of the specifics of what we learned. But it’s really about how do businesses take advantage of the AI capabilities on a device, what kind of outcomes can they create, and in different vertical industries, depending whether it’s a frontline worker or it’s someone in a backend distribution center. But also improvements. Today, we talked about security, the personal device engine. And then there’s a lot of concerns with AI around privacy and how can we ensure the users of these devices that their information is protected. Of course, the Knox hardware security platform, Knox Vault, is at the center of everything that we do at Samsung. So really it was a wonderful event and lots of evolution, I’d say, and more to come in the future.

Anshel Sag: And we’ve obviously talked a lot about AI, but I’d love to know about how you guys are helping your customers deploy AI to their employees and what that journey looks like.

Brad Haczynski: Yeah, so we’ve learned a lot. I will say last year when we launched Galaxy AI at the Unpacked event right here in San Jose with the S24 Series family, there were a multitude of questions that came from customers. How do I set it up in an enterprise environment? What is on device? What is not on device? So there’s a lot of things, a lot of questions that we didn’t have our answers to. But as we continue to explore, I mean we’ve learned quite a few things in the last year. The first is how the business use cases really work. So we’re seeing, we were building out throughout 2024 many use cases for different personas in different verticals. Some of those use cases scale horizontally. For instance, a frontline worker in a retail store might be the same as a frontline worker in a manufacturing plant, et cetera. How would they utilize the capabilities of the device, whether it’s increasing productivity or in a case of interacting with customers?

The second thing we’ve learned is that we’ve had over one million daily users on the communications capabilities of Galaxy AI. So every day, over a million people around the world are using the functions, whether it’s Live Translate or Note Assist, et cetera. And we’ve also seen that in the last year, we did a survey in July of 2024 that basically shows that effectively we’ve almost doubled the usage of mobile users who are now taking advantage of using AI on a mobile device. So it was about 16% and it’s gone up to 27% and continues to grow. And some of the feedback is when you ask those folks, 1.5x increase in productivity and a feeling of about 2x increase in creativity. So there’s a lot to unpack here. And again, it’s really about how can we take these capabilities, drive them into businesses, and then of course you have SMBs. The small medium business customers maybe require different things or have different concerns than large enterprises. So we really have to service that whole entire ecosystem of customers.

Olivier Blanchard: So speaking of AI, agentic AI is kind of the big topic for 2025. And you’ve made a decision, or Samsung’s made a decision to create its own agents. And I was wondering if you could talk a little bit about the thinking behind that, and also again looking at it from the perspective of consumers versus commercial applications and SMBs and the enterprise, how that plays across the entire ecosystem.

Brad Haczynski: Well, one is, again, the partnership we have with Google is really about driving Android as the predominant mobile experience in the world. And with generative AI and AI capabilities becoming fast and furious, it’s still a very nascent technology. So we would even have customers asking, well, you got Copilot over here and Gemini over here and Galaxy over here. So really it’s about that multimodal approach to build a native AI agent that’s sort of sitting at the phone level. One, it’s got the connectivity to Knox Vault. So when you talk about security, et cetera, you have security. But also the ability to have multiple applications really working seamlessly together. And it’s really about creating a more human-like AI experience. Because as you see, people want to move from this transactional calling of AI, whether it’s with a ChatGPT or even on a mobile device, they want it to be more like a personal assistant, a true personal assistant, that is knowledgeable, that can work across the different applications.

And so what we’re seeing is, again, for businesses, they think, wow, this is really cool, I could really utilize this. But that’s where the evolution of what we need to do to work with our customers around how do we ensure that security and productivity and really the security of those applications in a trusted environment for them to be able to truly take advantage of that agent. But it’s really about being first and most, and I call it, it’s my own words, it’s really being native. It’s about being a native AI mobile device versus a mobile device that has AI agents sort of just plopped on top as an application.

Olivier Blanchard: Makes sense.

Anshel Sag: Great. And you did talk a lot about this a little bit, but businesses have lots of concerns when it comes to security and privacy. And how are you guys helping to address those around Knox and then all the other components that you have with security?

Brad Haczynski: No, that’s a great question. Well, first of all, it’s incredible that in the last year, we’ve had Galaxy AI deployed on over 200 million Samsung devices. It’s quite a large number. So we’re seeing really the saturation into the marketplace across, again, the entire product portfolio. We’ve had active engagements. I talked a little bit earlier about how we’ve learned a lot through conversations. So customers know, especially when it comes to security and compliance, that Samsung really is one of the best in breed. As a matter of fact, the US government trusts us quite well because there’s a lot of our very public tactical edition products, et cetera, that have extra security requirements that are being brought into the battlefield with some of our soldiers and special operations. But when you get into these regulated industries, this is where the beauty of Knox comes in. Because when people say Knox, when we say Knox today at Unpacked, we’re really talking about the underpinning of the hardware security of the platform, which is a separate security chip.

So your biometrics, your passwords, your most sacred things are going to kind of be cordoned off into a separate enclave so that you can’t get any root of trust attacks, et cetera. So that’s the base. However, what we learned last year was, for instance, in some of the more regulated environments, there was questions around, well, I can’t have anything going to the cloud. Therefore, Samsung, how can you assure me that I… If I have to set this device to have the AI processing on device only, you have to help me set that through policy. So one of the things we did last year is we created a, it’s in the Knox, it’s called the Knox Services Plugin, which basically you can take an MDM such as Intune or MobileIron or we have our own Knox Manage, and you can give access to create policy to make sure that you can go in a corporate liable environment to ensure that the on-device toggle switch is always on. Right? You don’t want your employees doing that. You want to be able to control that.

Well, we’ve expanded upon that now. So what we also launched this year, actually today as part of the S25 launch, is now we’re making it so that it’s not just about toggling what’s on device versus off device, we can actually go individual applications. So now we’re giving that next level of control to the Knox Services Plugin, where now you can say, okay, I’m going to turn on Live Translate, I’m going to turn off Note Assist, I’m going to turn off this, so now you can go through the entire and give that more robust capabilities, and we’re continuing to build upon that. The other one that was another big challenge for us to solve is in some enterprise environments, they actually don’t allow employees to have a personal email address. So to activate Galaxy AI, you either have to have a Google account or a Samsung account.

So we actually had CIOs and IT decision makers come to us and say, “I’m not allowed to have that. By the way, if something happens and my employee loses a password, I have no way to reconcile that.” So launching later this year, we announced it today, but launching later this year, we’ll have the capability for enterprises to create a Samsung business user account, and you can tie that to their corporate email and identity, and then they can then have access to Galaxy AI in a much more controlled fashion, again, allowing IT decision makers to do what they do best. So those are just a couple of examples of we’ve pivoted that quickly in the last 12 months by getting the feedback from customers and, again, funneling it upstream. And we have one of the world’s most prestigious R&D departments, and they can move very quickly on implementing new capabilities as we go across to launching new devices.

Olivier Blanchard: Yeah. The ways that you are prioritizing security and data security on these devices for the whole ecosystem is really solid. That’s one of the things that I took out of today’s briefings and announcements. The AI stuff and the camera things, all these enhancements are super solid, but your emphasis on privacy was surprising, as good as it is.

Brad Haczynski: Yeah, no, and again, it’s the privacy of the user data, but also the security of where’s the data going? And we have full encryption standards. And as a matter of fact, we have some new capabilities for even for Quantum-

Olivier Blanchard: In a post-Quantum.

Brad Haczynski: Yeah, post-Quantum. So having just the absolute lockdown security capabilities that really keep out the threats that enterprise is worried about.

Anshel Sag: And I was going to say, I really like the granularity that you have too. It’s not super coarse. You can go by application. I think that’s super nice for enterprises that have different needs. So I think it’s good that you’re able to adjust to the enterprise needs.

Brad Haczynski: And that’s effectively what we’re trying to do. I mean, we really pride ourselves on being customer first. This is why these discussions are important because it’s really about giving them the tools and the capabilities to make their lives easier, and for them to have peace of mind that any device they’ve put in their environment, they’ll be able to serve their customers, which are their employees, but at the same time making sure they’re compliant with all the company protocols around compliance and security.

Anshel Sag: So Samsung has been growing quite a bit. And are there any new verticals or customers that you have that you might want to talk about?

Brad Haczynski: Yeah, so I mean we focus on verticals, but in one key particular, we had really big win in late 2024 with a large, very large retail sports company. And what they did is they took actually the Fold6 and deployed it to all of their store managers, and they generated a lot of excitement. But why did they choose the Fold6? The first thing we saw was they had the ability to, what I call device consolidation. They took the tablet and mobile phone and said, I have now one device for my store managers to carry. The second thing is they got store managers to come out from the back of the store into the front of the store, interacting with customers and interacting with their employees, what they call teammates. I think most importantly though is they are actively using the Galaxy AI features and Knox Security to make sure that they have the security capabilities on separating enterprise applications from personal.

But they’re using Live Translate with customers who come into the store who are not native speakers of English. And they’ve gotten a wonderful, it’s been a wonderful experience. They talked a lot about how incredible that capability is. And then as we move forward into the future, one of the things that we talked about today was the voice call recording capability, which I didn’t mention in my earlier answer, which is now if you have a customer on the phone, you can actually record that conversation through their permission, get a transcript and a summary of that, and then send out. So they’re seeing massive gains in productivity and really utilizing the AI on device in ways that are changing the customer experience.

Olivier Blanchard: Well, that’s all the time we have today, unfortunately. We’re going to have to have you come back on in a few months and talk about what’s next. But for the time being, thanks a lot for spending time with us. And for us, thanks for tuning into Six Five On the Road here at Galaxy Unpacked. Hit that subscribe button, follow us on socials, and also follow us and watch all of the other videos on SixFiveMedia.com. On behalf of Anshel and myself, have a great day.

The post Six Five On The Road: Samsung Launches Flagship Phone Positioned as “A True AI Companion” – at Galaxy Unpacked appeared first on Moor Insights & Strategy.

]]>
Insights from Infosys’s EVP Anand Swaminathan – The View From Davos https://moorinsightsstrategy.com/the-six-five/insights-from-infosyss-evp-anand-swaminathan-the-view-from-davos/ Sat, 25 Jan 2025 19:22:19 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45318 Anand Swaminathan, EVP at Infosys, joins host Patrick Moorhead, shares his insights on leveraging innovation and sustainability. An engaging overview that touches on the effective integration of emerging technologies and sustainable practices in business strategies.

The post Insights from Infosys’s EVP Anand Swaminathan – The View From Davos appeared first on Moor Insights & Strategy.

]]>
2024 was a great year for AI awareness building. Infosys EVP Anand Swaminathan, Infosys EVP, believes 2025 will be a pivotal year for organizations to unlock the true potential of AI to drive meaningful impact. Host Patrick Moorhead gets his thoughts on this and more on The View From Davos. Tune in as they discuss ⤵

  • AI driving product reimagination and process simplification across industries
  • Infosys’s approach to innovation and how it’s shaping the future of business
  • Small Language Models (SLMs) are gaining traction for business-specific purposes
  • Creating a culture of innovation and embracing AI within large organizations
  • Utilizing AI to drive tangible revenue growth

Learn more at Infosys.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. Been having some incredible conversations here. At WEF, it’s really a unique combination of finance, government, technology, everything wrapped into one, and I like to say the density of meeting with senior executives and government officials is truly amazing. A lot of the conversations about AI and how governments and enterprises need help to get ROI from their investments, and I can’t imagine a better person to have a discussion with about that than Anand with Infosys. Welcome to the show.

Anand Swaminathan: Thank you, Pat. Thanks for having me. Appreciate it.

Patrick Moorhead: Absolutely, super… Maybe a great place to start now. This is my first time here. Shockingly, it seems like all I do is attend events, but I’m curious, you’ve been coming here for a few years now. Why do you come here? And we’re halfway into the event, what are you hearing out there?

Anand Swaminathan: First, this is my 8th year, and I’m happy to meet you for the first time here in World Economic Forum, and thanks for having me on the show. For me, first is its efficiency. Most of my clients are here, partners are here. So within two or three days, you’re able to connect with most of these people together in one place. The second is it creates a level of energy when you have so many people across the world coming together to discuss both business as well as the social impact items. So, that gives me a certain sense of direction as we go into the next year.

Patrick Moorhead: It’s incredible. And this concentration part has, I think every CEO, senior executive I’ve talked to, they’ve said the same thing. So I don’t know why it took me this long to get here, but I’m here and I’m happy to be here. Usually, I ask a guest what they do first, but I wanted to get that first part out of the way. You have a very extensive practice at Infosys, technology, telecom, and media. Can you tell me what you do for clients?

Anand Swaminathan: So, I have three roles at Infosys. So I run the telecom, media and the technology business. I also manage all of our alliances and partnerships, which is a big part of our growth and a channel into the market. The third thing is I also oversee all strategic large deals, which is deals $50 million and above across the company. I am responsible for putting in place the right support systems and the right partnerships and pricing mechanisms to go after these large transactions. Now, as far as my telecom media and the tech responsibility, in the tech space, it’s basically all the hyperscalers, the semiconductor companies, contract manufacturing, tech distribution, as well as the OEMs and cyber security and other software companies. In the telecom space, it’s pretty much every telecom and media is media as generally the world defines it.

Patrick Moorhead: You’re a busy guy. That sounds like a lot. I think I interact with a lot of the same people you do. Really interesting thematic themes coming out of the show. Are you getting a sense at this year’s World Economic Forum, a sense of optimism, dread? I mean, with optimism sometimes, like AI for instance, it offers an incredible opportunity economically and societal, but there’s also the, “Oh my gosh, this is spooky and there are things…” But what’s the general sense that you’re getting from the show?

Anand Swaminathan: So, I think in relation to last year when AI or GenAI was just making its inroads into the boards and the C-suite conversations, it was more awareness building at that point in time. But I think in the last 12 months, a lot has been tried, particularly in the enterprise space, applying GenAI and now agentic AI in terms of how we could crack more complex, sophisticated problems. I think it’s ready to take off on the enterprise scale. Enough proof of concepts exist out there, and a few clients have actually also embarked on it. So, the conversations now is how do I apply AI into the core of my business, enhance my products, improve customer experience, bring cost efficiency, and it’s happening at a much more core level. So, that’s one big change.

Second is the general sense of optimism this year seems much higher. For a variety of reasons, people feel that the business climate is going to be much better than what it was last year, to what degree we have to wait and see in what sectors is something to be wait and watched. So, those are two very palpable things. The third one I would say is the continuous uplifting discussions we have had on the social agenda. I think clearly that’s always been on the rise and today and this year it’s gone to another level in terms of how companies that exist for profits should also have a societal thinking towards the impact we make. So, all three are the broad observations I see right now.

Patrick Moorhead: It feels like 2025 is going to be better than 2024, and maybe we always say that, I don’t know, but I think we really mean it this time. Let’s talk a little bit about AI. I think you expressed a really good status of where governments and enterprises are. They did their experiments, they’ve done POCs, and at least our research suggests that 2025, the expectation is they’re going to see some real ROI. I’m curious, what are types of the services that you can bring to the table to help enterprises and governments increase the ROI on AI?

Anand Swaminathan: So, I think the first is clearly unlocking value. So if you, for instance, look at the number of M&A happening in the world, clearly there is an opportunity to deploy AI at scale, to drive synergies much sooner to unlock the value and drive efficiency of the payouts that companies are expecting, and massive scale AI really offers that. Then there are specific areas in which we are driving far more advanced discussions. As a company, our strength is our industry understanding and our vertical understanding. So we are also focused in creating small language models for healthcare, for manufacturing, for telecom, and within those verticals specific processes. So we tie in nicely to the large language models, big platforms, but we also are fine-tuning specific models that are creating much more ROI for our clients. So there is a sense of urgency this year to drive value, and therefore increasing amount of focus in terms of how do we make this mainstream? And I’m excited about that sense of urgency because the moment you agree to a finish line, then the pace and the energy is totally different.

Patrick Moorhead: I’m glad to hear you talk about the investments you’re making as small language models. I know the hype was all around LLM, and we are using that for world types of answers and queries. But when it comes to an ERP or an SCM system or CRM system, HRM, all the Ms, getting value out of those requires a more tailored and less expensive model system to run. One of the things I’m also hearing from… CIOs are executives, CIOs is, “Hey, I’m getting the right output on what I’ve put in place, but I can’t get my people to use it.” Essentially, and we saw this when ERP systems first came out, we saw this when CRM systems first came out, getting people to change the way they work. Any thoughts about this human interaction with AI systems to get companies over the hump and actually people accepting technology that works?

Anand Swaminathan: I think that’s a great question because most of us think when you move towards natural language of interaction with the machines, it must be much easier. But sometimes if you’re conditioned do something in a certain way, it’s really hard to change and think that things could be so easier. So, a lot of investment around training and development and also coaching employees about the sensitivity and the responsibility they have when they use the AI platforms and systems. So, it’s a very holistic thinking about how do we drive context-sensitive skills and education training, and coming from emphasis where we think we are a learning company, we are extending our own learning platform with AI modules tailored to our clients to actually help them on this journey.

Patrick Moorhead: That’s very insightful. There’s also a recognition that maybe we need to do some business process, BPO type stuff to re-architect it. Something that may have taken 20 steps in serial fashion can become two or three, just getting employees up-skilling, re-skilling, to get them to do that. So great conversation so far.

One thing I wanted to ask you though is let’s say next year, Davos 2026, we’re having the same conversation. What is it that you would’ve wanted to achieve in 2025 for you to be satisfied?

Anand Swaminathan: I think I would say the first thing, which will give me gratification, is if we have enabled at least a few of our large enterprise clients to reimagine the products using AI. I think that’s where the value unlock and the revenue optic is clearly possible. So, we are working with many of them today to help them with it. I think we clearly see the possibilities now it’s about execution and getting it done. So, that’s one thing I would love to see.
The second is how do we simplify the complex set of tools and processes out there? So we make the enterprise world far more simpler, at least in one or two processes that we have chosen to work with our clients. It could be on the customer support supply chain, it could be on anything, but how distinctly it can be different with AI compared to what it is in the current world.

Patrick Moorhead: Oh, that’s great. Love the insights. It sounds like you’re on a solid track, a lot of work that needs to be done. But listen, you’re in the business of simplification. When your clients don’t want to do it themselves or need help doing it, this is where you come in to play, and every time I think that AI can make things easier, but getting there is a lot of complexity.

Anand Swaminathan: I agree. I agree. Once when things are simplified, it looks magical, that it could be simplified, effort may not be understood. But then I think as you rightly said, to solve the enterprise problem, you have to go through a lot of grind. Whatever is the simplified output can be effective as well.

Patrick Moorhead: That’s great. Hey, thanks for coming on the show.

Anand Swaminathan: No, thank you, Pat. Thanks for having me. Really appreciate it. Thank you.

Patrick Moorhead: Absolutely. So, this is Six Five On the Road with A View from Davos. We are having great conversations about economic opportunity, social issues, regulation, but a lot of talk about AI. Tune into all of the content that we have here in Davos. Thanks again. Hit that subscribe button. Take care.

The post Insights from Infosys’s EVP Anand Swaminathan – The View From Davos appeared first on Moor Insights & Strategy.

]]>
The View from Davos with AccuWeather’s Dr. Joel N. Myers https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-accuweathers-dr-joel-n-myers/ Fri, 24 Jan 2025 23:29:41 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45315 Dr. Joel N. Myers, Founder and Executive Chairman at AccuWeather, joins Daniel Newman and Patrick Moorhead, sharing his insights on the advancements in weather forecasting technology and its essential role in addressing climate change challenges.

The post The View from Davos with AccuWeather’s Dr. Joel N. Myers appeared first on Moor Insights & Strategy.

]]>
AI is accelerating everything…including energy demand and climate change?

Patrick Moorhead and Daniel Newman are at WEF25 where they’re joined by AccuWeather‘s Dr. Joel N. Myers, Founder and Executive Chairman for this installment of The View From Davos. Dr. Myers shares insights on the critical role of weather forecasting in global decision-making and climate change adaptation strategies.

Tune in for these key takeaways:

  • How much we can expect the rise of AI to increase energy demand in the next 10+ years
  • Fusion energy? Why it could be the ultimate solution but its widespread availability remains uncertain
  • The future of weather prediction and its implications for the global community.
  • The evolution of weather forecasting technology and its impact on global business and safety.
  • Fostering a culture of innovation for successful AI implementation

Learn more at AccuWeather.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. We are having some incredible conversations here about tech, about finance, maybe a little politics put in here, but let’s put politics aside for a second. I want to introduce Dr. Myers. Dr. Myers, great to see you.

Dr. Joel N. Myers: Glad to be here.

Patrick Moorhead: Yeah, it’s been fun. So we’re all here at Davos. What are you trying to get out of this conference? What are the things that you’re going to be looking for?

Dr. Joel N. Myers: Well, I always learn a lot. Lots of creative people, brilliant people here, and so I always come back with new ideas and connections. And of course, business connections lead to business for AccuWeather, and ultimately it’s a business trip.

Patrick Moorhead: I love it. Same here. This is a business trip for us as well.

Dr. Joel N. Myers: Yeah.

Daniel Newman: It’s an interesting time though. We’re in the middle of a pretty significant day today, the inauguration of a new president in the United States. I know you’ve been in this business, I think, what, over 60 years now. So lots of change going on here in Davos. There’s always a bit of impact on how, not just impact on the weather, impact on climate, but there’s impact with different administrations on how this is all being sort of sought and seen. Just kind of interested, any thoughts on this big day, national championship football game, inauguration day, and of course first day of Davos, about how things are going to play out over the next few years?

Dr. Joel N. Myers: Well, clearly there’s going to be changes, different view of climate and business, and hopefully it’ll be positive. But you’ll have to wait and see the changes. I think one thing we’re going to see is greater government efficiency, but a big change obviously is coming.

Daniel Newman: But in terms of your world, let’s zero in a bit on climate and weather. We know this event has historically been very climate-centric, but it’s a little different now. It’s been very AI-centric. And AI is almost diametrically opposed to some of the sustainability conversations we’ve had. We know as tech analysts the amount of energy and power that’s going to be required. I think they’re probably going to be talking about everything from fusion to how we can grow in nuclear power because we certainly can’t keep doing it the way we’re doing it today. Do you have any thoughts on that conflict? Because I know you’re big on AI, you’ve been doing machine learning for decades now. We’re at a pretty big inflection.

Dr. Joel N. Myers: Yeah, well, humanity’s at a big inflection because of AI and because of the acceleration of technology as we all know. And that acceleration is accelerating. And so we have less ability to predict the future except for AccuWeather forecasts, of course, than ever before. And you’re right, I haven’t heard many people talk about it as a conflict, but I guess it is because AI is going to demand more and more energy. And I think a lot of the forecasts that we’re reading of how much energy we’ll need over the next 10, 12 years are too low. I could see the demand for energy doubling over the next 12 years. And clearly, where’s that power going to come from? And most of it’s going to come from the traditional sources, oil and gas, and even coal, even as rapidly as the alternatives are accelerating. And the only answer ultimately for humanity is fusion, because fusion produces no greenhouse gases and will be a real breakthrough. But how far off it is and is it going to be available on a huge scale, that remains to be seen. But AI may help us solve that problem.

Patrick Moorhead: Yes, sir. Dr. Myers, first of all, congratulations on your 63 year-old startup.

Dr. Joel N. Myers: Thank you. And that’s what it is. That’s how we call it, and that’s how we act at AccuWeather, always focused on creativity, innovation, and moving the boundaries forward.

Patrick Moorhead: So you’ve been doing this for 63 years.

Dr. Joel N. Myers: Yeah.

Patrick Moorhead: I haven’t, I think I’m just a mere 35 years in the game here. But I look at weather and analysis as one of the first big data exercises. I’m curious, we see all these generative AI startups, but it’s all about data. Are there any words of inspiration, words of wisdom, that you can give the generative AI startups today that you’ve learned in your 63 years at AccuWeather?

Dr. Joel N. Myers: Well, the data’s important, and obviously that’s one thing at AccuWeather we focus on. The data has to be the best, the cleanest, the most reliable, greatest depth, more time, minute by minute, hour by hour, more points on the planet than any other anybody else, 300 parameters. So it’s the depth and breadth and the cleanliness of the data. The old saying, “Garbage in, garbage out. So we get better results for the companies we work with. We work with thousands of businesses around the world to develop and train algorithms that are going to allow better prediction of whatever the weather correlates to. And it correlates to a lot of things.

Patrick Moorhead: Just generative AI, does it help anything in your universe of weather?

Dr. Joel N. Myers: Yeah, we used AI before people called it that.

Patrick Moorhead: Sure.

Dr. Joel N. Myers: But it’s not just the computer. But we found the best, the most accurate forecasts, are a combination of the computer outputs and taking the best of all the computer models. And we have 192 forecast models that cover the globe that we bring in from all, Canadian Weather Service, Europe, the US and private sources. But it’s taken the best of those in an AI way. But then also using our meteorologists, over 100 forecast meteorologists and their experience. I mean hurricane experts and tornado experts that can beat any model. And given the best model combination. And then the humans who are experts produce the best and most accurate forecast. And part of that is communications. In the end, a forecast is only as good as the decision people make based on it.

Patrick Moorhead: Sure.

Dr. Joel N. Myers: And so a lot of the output on things don’t take that into account. You can issue a warning that causes people to say, “Oh, I’ve lived through that before.” Or a warning that causes them to say, “I better get the hell out of my house because it’s going to burn down.” And that’s the key in the end to any kind of a forecast.

Daniel Newman: Yeah. Well, speaking of burning down, we’ve had some pretty chaotic weather climates. Of course, there’s always a continuum of people’s assessment of what’s happening, but just for instance, LA, we’ve seen a just unprecedented set of wildfires. Any sort of thoughts on the impact of climate and what are some of the things that with weather that someone like yourself with all this data, all this experience, is sort of suggesting and recommending? I’m sure you talk to leaders, and I’m sure you talk to politicians and different advocacy groups. What are you recommending to try to avoid this or at least be more prepared for situations like this so they don’t keep happening?

Dr. Joel N. Myers: Well, last year at Davos, I launched my book, Invisible Iceberg: How Weather and Climate Shaped History. And of course the weather and climate have shaped history and have impacted humanity from the beginning, from going way back what things got set up for humanity by the comet that hit the earth and destroyed the dinosaurs. And we were in an ice age just 12,000 years ago where the ice was two and a half miles thick over Montreal, Canada and the ocean levels were 450 feet lower than they are today, just 12,000 years ago. So the Earth has been warming since then. It was a little ice age up to 1850. So naturally the Earth has been warming coming out of the ice age anyway. But humans are certainly contributing to that warming and impacting the climate and there are more and more of these disastrous effects. And we’re going to have to continue to deal with those. And in some parts, the statistics that we use, talking about the database that we talked about earlier, that AccuWeather has, suggests that the warming is going to, in some places is happening faster than the climate models suggest.

So everybody’s using the same climate models and can make general forecasts, but our statistical analysis show that the warming in some places is happening faster than those models suggest. And you have to keep that in mind too. Some parts of the world are warming very slowly, other parts much more rapidly. So the individual forecasts long term have to take that into account. Now, getting to California, the things humans did or didn’t do on top of the climate change that caused this fire, it sort of shocks me that we depend on the government for so many things supposedly, that they were not better prepared to deal with the fires. I mean, a few small planes flying overhead, dropping water? I mean, it was like a war against the environment. That’s a political comment, I guess, and maybe I don’t know enough, but it just seemed to me, knowing how, I mean now, we estimated the total damage and economic loss, not only from the fires, but long-term from jobs loss, businesses gone, and all the effect that governments are going to have to spend and the health effects over years from the bad air, $250 to $275 billion, that was AccuWeather’s estimate. It’s got a lot of press as you know. That’s 1% of the GDP. And on top of the hurricane effects this year, we’re talking about 2% or more of the GDP lost due to environmental disasters.

Patrick Moorhead: I was going to say Dr. Myers, so I think, I believe, that the reaction to these fires is one of the worst government reactions we’ve seen in a while. I’m curious, and we talked about there’s garbage in, garbage out, there’s really good data and garbage response. And I’m curious, what is one of the best responses that you’ve seen due to weather where your data came in, gave an indication of something, and people made changes because of it?

Dr. Joel N. Myers: Well, the companies we work for, now with Hurricane Milton that hit Western North Carolina, we said five and a half days in advance for extreme Northern Georgia, Western North Carolina, extreme eastern Tennessee, there would be catastrophic flooding. Evacuate. You can’t be any more dramatic than that.

Patrick Moorhead: Right.

Dr. Joel N. Myers: Five and a half days in advance, constantly saying that. So some people left because of that and saved their lives. And other people unfortunately didn’t. But Hurricane Sandy that hit New Jersey. The company we worked for, we worked for a lot of businesses, told them eight days ahead. They told us that forecast, that one forecast, was worth $65 million to them. They had water and batteries and all the things people need in Pennsylvania, New Jersey, and Maryland. Before the hurricane hit, their competitors were out. And so this is the kind of help that we provide to businesses and to the public through our superior forecast. And that’s why we say AccuWeather has superior accuracy and it’s proven.

Daniel Newman: Well, Dr. Myers, maybe a final question. We did a really significant research study that’s actually going to be released here in Davos with Carney, and the focus of the study was really about CEOs and how they’re thinking about AI, implementing it, managing it, governing it. And one of the really interesting findings was that companies that were succeeding, now, again, it’s still early with AI. We could laugh because you’ve been doing it for four decades, but for a lot of companies and with really the disruption that’s happening, still pretty early. But when CEOs kind of tried to hold too tight, the CEOs and the boards held too tight to AI and sort of tried to joystick it into the company, those initiatives were struggling to be successful. It seemed more about the ability to delegate to your teams, the lines of businesses, the technologists. As someone who’s been doing this a long time, by the way, now you’ve even appointed a CEO for AccuWeather. You’ve taken that founder chairman role. I mean, what do you sort of think about that and how would you suggest to companies like yours, 60-year-old startups and new companies, that they can really get the most out of their AI projects and be successful?

Dr. Joel N. Myers: Well, as you said, we’ve used AI for a long time, and I did appoint somebody CEO, Steve Smith, almost two years ago, and I joked that that allowed me to cut back from 70 hours a week to 62. But I’m having a lot of fun. I’m doing more of the things I like to do. But that study doesn’t surprise me at all. You have to empower people. But I did appoint a couple of people to be sort of the leaders and the advocates across the company, but we brought a consultant in and trained everybody in AI and gave them the tools and encouraged those who wanted to, to play an active role. And five to 10% of people really embrace it and then become leaders in their area. So you got to empower people, give them the tools, and let them go. But we’ve always been, our motto is ICE, innovation, creativity, and entrepreneurship, and that’s how AccuWeather operates.

Daniel Newman: Sounds like every technological revolution ever. AI is just faster and a bit more.

Dr. Joel N. Myers: Right, yes.

Daniel Newman: It’s a bit more in our face right now, but it’s the same challenges about empowering your organization. Was it Steve Jobs, “You don’t hire smart people and tell them what to do thing.” It sounds like there’s a lot of that here. Dr. Myers, I want to thank you so much for sitting down with us.

Dr. Joel N. Myers: Sure, my pleasure, yeah.

Daniel Newman: Great conversation.

Patrick Moorhead: Thank you.

Daniel Newman: Congratulations on all the success.

Dr. Joel N. Myers: Thank you.

Daniel Newman: And we’ll hopefully catch up with you again soon.

Dr. Joel N. Myers: Yep. I look forward to it.

Daniel Newman: And thank you everybody for joining us for this episode of Six Five On the Road, a View from Davos. For Patrick Moorhead and myself, it’s time to say goodbye. Stick with us for all of our coverage here at the World Economic Forum 2025. We’ll see you all later.

The post The View from Davos with AccuWeather’s Dr. Joel N. Myers appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Groq’s Jonathan Ross https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-groqs-jonathan-ross/ Fri, 24 Jan 2025 21:49:08 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45312 Jonathan Ross, CEO & Founder at Groq, joins Daniel Newman and Patrick Moorhead to share his insights on pioneering developments in computing technology and AI's critical role in the future.

The post The View from Davos with Groq’s Jonathan Ross appeared first on Moor Insights & Strategy.

]]>
🔥 Did you know? Groq is rivaling giants like Google Gemini and Amazon Bedrock in LangChain’s API usage!

Daniel Newman & Patrick Moorhead catch Groq CEO Jonathan Ross to get his take on AI’s exponential growth and why Groq is poised to dominate the inference market.

🎧 Tune into this segment of The View From Davos for more on:

  • AI’s Exponential Growth: AI is evolving at an incredible pace, driven by the doubling of chips and transistors. Groq is uniquely positioned to capitalize on this with their data center-level approach
  • Energy Efficiency: Groq’s architecture uses 1/3 the energy of traditional GPUs, a major advantage in a world increasingly focused on sustainability
  • Massive Capacity: Groq is ramping up production to 2 million chips this year, rivaling NVIDIA’s output
  • Global Inference Network: With deployments in Saudi Arabia and beyond, Groq is building a truly global inference network
  • Jonathan also hinted at a potential 4nm chip in the works… 👀

Learn more at Groq.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. We are here at the World Economic Forum having some incredible conversations. It’s one of the few places where you’ve got the intersection of tech, politics, regulation, sustainability, all wrapped into one. Dan, it’s been a good show so far.

Daniel Newman: It has been. It is the morning and the Magic Mountain looks beautiful, but you really can’t get through a day here without just having some of the most inspirational and interesting conversations. But look, AI, Pat, is of course, well, every year there’s a theme. There was years where it was sustainability. There’s years where it’s been diversity and inclusion. This year is really the year I think it’s coming together. The last few years have been a lot about infrastructure and build up. This year it’s about seeing it in action, seeing AI work and driving enterprise and business and government value.

Patrick Moorhead: Very true ROI. One of the keys to this is obviously the inference, the processing of all that data. I can’t imagine a better person to have a conversation with than the CEO of Groq, Jonathan Ross. Good to see you.

Jonathan Ross: Sorry, did someone say inference? We’re doing LPU chips. They’re the fastest for inference and we’re going to ramp up quite a bit this year. Some of the numbers are unbelievable, but we’ve been hitting them.

Patrick Moorhead: Well, it’s incredible. I mean, we saw you over the holidays loading 747s off to a far end country, super exciting though. But I have to ask you though, what do you want to get out of this event? Is it a customer event for you? Are you listening to officials? Is it all the above?

Jonathan Ross: It’s meetings. We’ve been pretty heads down trying to deliver on what we had to deliver. Now I’m just bumping into people in the streets. In the coffee line this morning, someone recognized me and these are people who we need to get back in touch with because we’ve been so focused on delivering for this customer. But we now measure our deliveries in 747s.

Daniel Newman: I think that’s actually great. I mean, look, both Pat and I have been supporters of your journey from the very beginning. We watched the pivot. We both are on your cap table. Full disclosure, we’re invested and fully believe in what you’re doing, but we’ve also seen a pretty substantial pivot. Remember in the beginning you sort of had a chip that could do a few different things and then at some point you kind of said, we think it’s inference. You went to the LPU and then you kind of went to the cloud. You’re like, “Look, if people aren’t going to buy this and build it, we’ll build it. We’ll deploy it.” I mean, talk a little bit about some of these, what you can. I know you can’t share everything, but I mean these 747s are going somewhere. Customers are starting to consume this stuff. I mean you raised money recently. There must be some evidence now that you can maybe share out there about how this is sort of evolving.

Jonathan Ross: Well, actually, one of the biggest ones, LangChain recently announced their top API users or whatever are providers, and of course, OpenAI. Then a local thing, which isn’t really an API, then Anthropic, then OpenAI again on Azure and then Groq, and then Google Gemini, and then Amazon Bedrock. We’re actually above those in terms of usage. What we’ve lacked is total amount of capacity for AI and we’ve just added a whole bunch more. We just delivered a whole bunch of 747s worth of our systems to the Kingdom of Saudi Arabia. That’s our second region, but it’s almost on the opposite side of the planet, which has brought our latency down almost no matter where you are. Now we’re looking to deploy all over the world and build out a global inference network. But as for the pivot, we actually always wanted to sort of build this and make it easy for people to use from an API point of view and cloud. We never thought we would be able to get the capital to start doing this ourselves. We started selling systems, but we found it was just easier to get people using us when we provide an API. We launched in March and we’re now at over 800,000 developers.

Daniel Newman: Wow.

Patrick Moorhead: No, that is a staggering number. Every time I do a double take, every one of the emails I get from you guys or you or Sonny is kind of flaunting it on there. I mean, congratulations there. One of the questions I get a lot that it’s funny. I do think I know the answer to this, but I think our viewers need to hear is architecturally, I mean everybody’s like NVIDIA, NVIDIA, NVIDIA, NVIDIA, training, training, training. What is it about the Groq chip in services that makes it special just for the layperson?

Jonathan Ross: Well, for the layperson, I’ll give it a shot, a little complicated, but all right. Think of it this way. Everyone’s heard of Moore’s Law and so the number of transistors doubles roughly every 18 to 24 months. What we realized was that AI was actually getting better, faster than exponential. But if you look at the amount of data being created, it wasn’t exponential. The algorithms, they were improving, but it was big jumps. It wasn’t this sort of exponential growth. Where was this double exponential coming from? What we realized when we started Groq was that the number of chips was also doubling every 18 to 24 months. If you’re going to get an advantage, why focus at the chip level? Why not focus at the data center level? What we did that’s very different than GPUs. GPUs have a lot of external memory. They will do part of the problem, bring stuff in from memory and then do part of the problem and bring stuff in from memory. It’s very slow, very energy intensive. We use about a third of the energy versus a GPU because what we do is we’ll take hundreds or thousands of our chips, lay the problem out completely in those chips so we don’t touch any external memory. It’s like an assembly line. We’ll just go through that very quickly. We take advantage of that double exponential and we’re the first to do this.

Daniel Newman: I know we only have a minute with you Jonathan. By the way, thank you so much for taking the time. Now I do want to point out for everybody out there, you’ve been doing this on 14 nanometer, which you don’t have to, I mean you got to be a little technical to understand what that actually means, but there’s a lot of opportunity to improve. You’ve been these incredible speeds on like 14 nanometer. You could go to four, three, two and start to even get more processing power, transistor density. It’s pretty exciting stuff. As you kind of finish up your first journey to Davos, I know what is sort of the one thing you want people to be thinking about Groq as we head into the full, what are?

Jonathan Ross: Let me give you two. I’ll give you two for the price of one.

Daniel Newman: Two?

Jonathan Ross: The first one is going to the 14 nanometer. To give a sense of that, most of you will have heard of what a Blackwell is or an H100. Blackwell is four nanometer. Then you get the H100, which is older. Then you have, I think it was the Volta and then you have the Pascal. Now you’re at somewhere around 12 nanometer. We’re 14 nanometer and we’re running circles around these four nanometer chips. Rumors are, we may have taped out a four nanometer chip that may be coming soon. I’m not going to confirm or deny. That’s one. Number two, the amount of capacity, because we’re not capacity constrained because we’re not using HBM, we’re going to ramp and build an enormous number of chips this year. Our intention is to get close to 2 million this year. If we do that, keep in mind that NVIDIA is trying to do two to 3 million this year. Next year we plan to do a lot more.

Daniel Newman: That’s incredible and makes me really proud to just sit there, let the shares appreciate Jonathan. I think I’m glad that we bought early and believed early. I continue to believe. I’m really glad we had the chance to sit here in Davos. I know we’ve done it in other parts of the country in the US. It’s just been great to watch your explosive growth, dimensions on All-In. We’re all fans here. Pat and I actually, we went to the original All-In Summit. We actually did it the first year. Those guys are great. They’re super interesting. I love how the kind of truth telling and that they just say it straight. I know you’re a guy that always likes to say it straight. Have a great Davos, thanks for joining us and let’s talk again soon.

Jonathan Ross: Thanks. All right.

Daniel Newman: Thank you everybody for tuning in. Stay part of our Six Five On the Road, a View from Davos. So many great conversations here. This is a great way to start day number two. Hit subscribe. We’ll talk to you all later. Bye-bye.

The post The View from Davos with Groq’s Jonathan Ross appeared first on Moor Insights & Strategy.

]]>
The Enterprise Applications Podcast, Episode 2: Salesforce, Microsoft, SAP, IBM, & More https://moorinsightsstrategy.com/enterprise-app-podcast/the-enterprise-applications-podcast-episode-2-salesforce-microsoft-sap-ibm-more/ Fri, 24 Jan 2025 21:43:54 +0000 https://moorinsightsstrategy.com/?post_type=enterpriseapp&p=45324 This week on the MI&S Enterprise Apps Pod, hosts Melody Brue and Robert Kramer talk Salesforce, Microsoft, SAP, IBM, Active Directory and AI Agents ERP.

The post The Enterprise Applications Podcast, Episode 2: Salesforce, Microsoft, SAP, IBM, & More appeared first on Moor Insights & Strategy.

]]>
This week on the MI&S Enterprise Apps Podcast, hosts Melody Brue and Robert Kramer discuss Salesforce, Microsoft, SAP, IBM, Active Directory and AI Agents ERP.

Their discussion covers:

  • Salesforce has halted engineer hiring, sparking discussions about the impact of AI on software development jobs.
  • Microsoft is boosting workplace productivity with its new AI-powered assistant, Microsoft 365 Copilot Chat.
  • SAP and IBM are teaming up to help businesses transition to cloud-based ERP systems.
  • Commvault is making it easier for organizations to recover from Active Directory disasters with their new cloud-based solution.

Watch the video here:

Listen to the audio here:

Disclaimer: This show is for information and entertainment purposes only. While we will discuss publicly traded companies on this show, its contents should not be taken as investment advice.

Transcripts:

Robert Kramer: Hello and welcome to this week’s edition of Moor Insights and Strategy, the Enterprise Apps. Actually, we’re on episode two. I’m Robert Kramer, joined by Melody Brue, both vice presidents and principal analysts of the company. I handle ERP and enterprise data. Mel handles modern work HCM CX, and we cover everything else under the sun that relates to data, the glue that holds everything together. So just for disclaimer purposes, this show is for information and entertainment purposes only. Why we will discuss publicly traded companies on the show. Its content should not be taken as investment advice. So let’s get started. Mel, we had really a busy week last week. The year’s been off to a huge start. We’re going to talk about some great things in the first one. We’re going to kick it off with really in your neck of the woods with Salesforce, and we’re going to cover a bunch of other things. We’ll kind of touch on those as we go through. But what do you think?

Melody Brue: Yes, so it has been like a whirlwind kickoff to the year. Everybody just went from having a quick break to boom. Everything’s happening all at once. So this one’s interesting and I think this is going to start a very interesting conversation throughout sort of the thread of what we’re going to talk about today, which is one, how AI is being adopted, how it’s being priced, and just how people are looking at it in terms of jobs. At Salesforce, Mark Benioff, CEO recently just announced a pause in hiring any software engineers suggesting that AI could automate a growing portion of development tasks. So this is really motivated by potential cost savings and the productivity gains with their Salesforce agent force AI tool. So I think this raises a lot of questions. There’s the evolving role of tech jobs and how companies might manage this AI augmented workforce.

And then also, is this too soon? Can AI really replace human engineers? It also brings up this really important key challenge, which is who’s going to manage this digital labor? Is it IT or is it HR? And Cisco just had their big AI summit and they had, I think it was actually the CEO of Box, Aaron Levie was talking about this challenge of it becoming sort of an HR role when it comes to these AI agents, which is they’re not only responsible for the procurement, the hiring, the hiring of these ag agentic ai, meaning the creation of them, but they’re also responsible for their outcomes. So that creates a really interesting responsibility beyond what HR is typically responsible for in terms of like, okay, if you run marketing and you say, I need somebody to run this campaign for me, the HR department is responsible for finding that person and placing that person in that role. They’re not responsible for the output of what that person, what they create, who that person reports to is. So who do these agents agentic AI report to? Is it IT HR who’s managing this digital labor? That’s another point.

My last point on this is Salesforce showing the example of their AI is so good, they’re using it internally and Microsoft is doing the same thing. It’s like eating your own dog food that they’re highlighting, basically promoting their own AI adoption, demonstrating their own ROI through internal cost savings. So you want to sell something, show how well it’s working for you, right? I mean, Microsoft’s doing the same thing. So while I do think it’s really early to start saying that AI is going to be able to replace developers, I think it’s a smart strategy to show that Salesforce is using its own agentic AI to write code to really boost their own productivity. But I still think that we’ve got this digital work, digital labor management problem to solve.

Robert Kramer: Well, we definitely have that. What’s the setup required for these companies to be able to take advantage of this to see success? And like you said, the pricing is a big deal. How is that done? And then what’s the success that they’re actually having so far at Salesforce with not hiring those types of people and letting the agentic AI take advantage of consuming those roles? So I think it’s not put together yet. It’s not able to see the success or failure. My guess is it’s a bold comment from Benioff and if he can back it up, we’d all love that because companies would like to take advantage of how the agents can, I think they’re doing strategically handling tasks, not replacing, and he’s making an assumption that they’re going to be replacing. I don’t think, like you said, who’s managing these people? How is human resources interacting with it? This is not played out. There’s no governance around it. So there has to be some guardrails to show us how it’s all going to work.

Melody Brue: And I haven’t really seen anything. I know the World Economic Forum is going on right now, Pat’s in Davos, there’s been so much going on there. There’s usually a lot of conversation around this type of forward thinking stuff that happens there. He made a lot of bold statements a couple years ago about this digital labor and digital workforce. So we’ll see how this goes on. But interestingly enough, I was on, you may have been on this too, they didn’t show other participants, but Microsoft had a briefing this morning and gave some public facing ROI from case studies from their customers, which we don’t always get to see. We get sort of more anecdotal ROI, but they gave some kind of concrete numbers about marketing teams time to reduce content by 67%, seven months of work completed in seven weeks. They had sellers increased outbound calls by 40% due to process efficiencies, 50 million in annual revenue gains. So these are various companies, various ROI reports from using copilot, Microsoft’s 365 copilot in the Fortune 500. I appreciate them giving those concrete numbers and ROI because we do hear like, oh, it’ll make you more protective and all of this. So good to see those. I’m sure a lot of it’s, but still if these companies are seeing the value from it, that’s good to see that it’s doing what’s as promised.

Robert Kramer: Well, I think that the copilot, Microsoft is really weighing a lot of cards on that and specifically within enterprise data like you just mentioned, but also within the ERP world. And they’re utilizing that to actually filter the AI and AI agents to go through the ERP operational transactions that it can help with. And so this is a big deal and it’s something that they’re very emphatic about that it’s going to be kind of a game changer for them to help out with some of these tasks. And obviously the data has to be set up, but it can help with transactions within the financial area, within the inventory area, within the supply chain area to automate certain things to give the person who’s used to doing things on a spreadsheet, get them out of the loop because that’s a hazard to be more of a transaction helping situation with the agents. 20 years ago, these ERP systems, and not to get on a tangent, but there’s always people or a person who is doing something outside of the system. And when that happens, then that person feels they’re more reliable or they’re going to hinder the system in some way when they’re not there or something happens to that person. But at the same time, it’s not a global real time system because of the fact that it’s not present tense. So these agents, if it’s present tense, actionable data, these agents can really make a difference for these type of systems because of the fact that they’re helping these with the strategy. At the same time the nuance transactions are being done. So I’m still worried about the quality of the data, but if that is in gear, these types of copilot systems and automation purposes should be a big deal.

Melody Brue: Yeah, I think what I heard from Microsoft this morning on the briefing is just the agent versioning and updates that it can manage the lifecycle of the agents and update them over time. So it is going to be, this isn’t like a set it and forget it thing. There is going to be a lot of work that needs to be done to maintain them. Probably eventually it will be like a self-cleaning oven where the agents will be able to maintain themselves with the right amount of guardrails and the provisions. But I do think that there is probably with the right training, less risk of the human error that we have just from people entering, think if you enter from one thing to another, even copy paste, you can miss something, things can get transposed. So I like the oversight of humans on agents and the agent oversight on humans, but it goes both ways. It can’t just be one way or the other.

Robert Kramer: Yeah, I agree. And I also like that the agents can continuously learn and analyzing these workflows to up their logic to help out more as they get familiar with what those are so that the humans can interact a little bit better and trust the agents.

Melody Brue: Yeah, So on that, both Microsoft and Google made announcements to changes in their pricing for their AI. Microsoft added the consumption-based pricing, which we’ve talked about on our last podcast. And I think that as we were talking about, it’s just going to become necessary and it’s certainly necessary for people who aren’t going to be the power users that are going to be using AI all the time every day, but to get them to start using AI and start trying it out, I didn’t really see it. I saw that they’re doing kind of these batch pricing, but it didn’t see, and I don’t know if there was an answer to this of what happens when that batch runs out, if it can be topped up or what happens next. But I do like the controls on that because as we talked about on the last podcast, there has to be with this consumption based pricing, there has to be the ability to plan for how much you’re going to be spending on AI.

And that’s the real unknown when you have this consumption-based pricing. So I think the way that they’re pricing this allows for that planning, but it really, I think both with what Microsoft is doing and with what Google’s doing, either giving it for free, including it in certain level plans or consumption-based pricing is answering to two really big issues with AI adoption, which is one, adoption, getting it into the hands of people, and two, the cost barriers. So it’ll be really interesting to see how much usage we’ll see across enterprises and if those numbers on those case studies that they’ve shared, if those go up when you see it beyond just these test cases in particular lines of business.

Robert Kramer: I agree. I had a conversation about six months ago with John Case, he’s the CEO of Acumanica and he does consumption-based pricing for all of his customers. That’s all they do. And I asked him what was his method? They basically look and interview the teams that they work with at the customer and based upon what the use that they feel they’re going to get out of it and the users. And then they have a formula that they’re able to put together that has, this is what we are going to bill you and what you don’t use carries over. If you go over, we’ll bill you on that, but there’s expectations based upon those amounts that you gave us. If you over order, then you pay more. If you under order, you get it back. So the honesty helps the situation.

Melody Brue: So then each party knows what to expect. I mean, companies need to know what their billings are going to be too, and then the customer needs to know what their bill is going to be like. I think one last thing on this, and then we’ve got your Forbes article and big brainy stuff to talk about, Active Directory. I was like, wait, I need to study up on this before I can have an intelligent conversation with you on this. But I do also think on the pricing on this for both copilot and then Google’s workspace offerings is the parody of what’s available for the both free and consumption based versus a full subscriber plan in comparison, like Google is just throwing all of their AI offerings into their workspace offerings. I think that eventually what I would like to see is for customers to choose, do they want the full offering?

They want AI system all of it and be able to pay for it as they like and not have to have the full subscription? I like the flexibility of both plans, but I also think that Microsoft should allow for consumption-based pricing if they get people using it. Some of it is included in this consumption-based Microsoft Chat, copilot chat, but not all the full functionality is available to the new consumption based pricing. So I think eventually they might get to that point where they just say, okay, yeah, we can make all of it available because I think people are going to find that it’s really useful and they want it, but maybe not every single person in the organization needs that full subscription plan of 365.

Robert Kramer: So for your company, if you had to go one way or another, do you think consumption would be better for your company if you were the owner? You like the other-

Melody Brue: I think that most companies probably have different users in different groups of the business that are going to be kind of power users that you would say, yeah, give them the full, I want subscription. But these other ones, they may only have four or five tasks per month that they really need it for. So give them the consumption base, but they still want the full functionality. So that’s where I think there might be. The last thing I’ll say on this is I think in any scenario, giving people access, whether it’s the consumption based, subscription free, however they give it to them, having it in this controlled environment so people aren’t using their own ai, like using their own subscription to chat GPT or something that’s not controlled with their enterprise data is the best way to go. That is adding a security level beyond the security level that Microsoft already has, that Google already has to the enterprise so that they’re not having this bring your own AI to work because you know that employees are doing it. I mean, we saw it happen early on when companies were banning people using AI at work because information was getting leaked to outside models. So companies should be allowing, they should be turning this on immediately.

Robert Kramer: I agree.

Melody Brue: All right. That said, if you have nothing to add on that,

Robert Kramer: I’m good.

Melody Brue: Tell me about Active Directory.

Robert Kramer: Well, it’s a little technical, but at the same time it makes sense. So Active Directory is the core piece that really handles your authentications and your access to really your assets, your databases, your security, your apps, even the way that you walk through a door at the company that somebody works at. All those access controls are done through Active Directory. So this is a way that a lot of companies are getting breached because the attackers are getting in there at some point long time ago. I think that the Starwood hotels that breach with Marriott, they’re together a company now, but that whole was I think four or five years ago, but it started a dozen years ago. That was from an Active Directory breach. And Commvault has come with an announcement that introduces a backup and recovery for Active Directory. And this makes a lot of sense to actually put the pieces back together where it’s simplified, where it’s structured, where it’s not just a mayhem of day of time and confusion to put it together, which could take days. This is a automated process. So I wrote a Forbes article on it, and you can see that and the details around it, but I’m seeing a lot of activity within the data protection security industries because of the fact this is really a big problem. This is one of ’em. So that’s a really interesting subject, and I think a lot of the companies that are really involved in that are being innovative, and I think automation and AI is a big key in some of these companies in this field.

Melody Brue: So would you say that AI is hurting or helping companies in terms of their cyber risks?

Robert Kramer: I think AI is helping them because of the fact of what we talked about with the task and automating certain components and able to bring information quicker to the company and anomalies that are being pointed out at the time. Yeah, at the same time, I do think it’s all the technology in the world has its pros and cons. I mean, we didn’t have phones 30 years ago and it was probably an easier life, but the cyber attacks weren’t as prevalent, and AI is definitely pointing out those problems and correcting them a lot easier. The agents are pointing a role. The companies that are on the forefront are bringing automation, AI, and agents into the picture to bring things together quicker and more efficient just because of the bad actors are everywhere and they’re able to bring companies down to their knees pretty quickly. So Active Directories where a lot of it can start.

Melody Brue: Don’t you think it’s kind of just crazy when you see how quickly, and even just things like deep fakes, how quickly AI has evolved to be able to fool people. And I mean, you can upload a picture of somebody and the AI can make their mouth move and it looks like they’re saying something,

Robert Kramer: You don’t know if it’s real or not.

Melody Brue: Yeah. It’s actually kind of creepy how quickly that’s evolved from being used to be the little JibJabs where you could see it’s so clearly fake and now it almost looks real. I mean, it was, I don’t know, 30 minutes after the chiefs game the other night, and there were memes of, it was really bad, but memes of Mahomes making out with the refs, they were saying they were just favoring him so much with his flops and all the roughing, the quarterback calls and everything. But I mean, it’s crazy how I just feel like you’re not safe from anything with this.

Robert Kramer: No, they can really change the way things look very quick. I am a Kansas City Chiefs fan, just for the record, but I do like Buffalo a lot and would love to see Buffalo win, but Mahomes is a winner. I mean, you can’t be jealous of a winner, even though the refs might’ve been slightly in his favor, but I would say the same for Tom Brady. What’s the difference?

Melody Brue: Yeah, there was some meme about this with a dog talking about Patrick Mahomes, and he was like, he’s my homie too. He’s my homie too. I like him. But there was some, we can talk about this on a game time tech conversation, but those are the calls that are very, AI is not going to fix that for you, right? When a player people were accusing him of flopping and all of that, AI’s not going to fix that and say, oh, yeah, that was a flop, or was he really pushed hard enough or it was reffing the quarterback. But yeah. Anyway, we digress.

Robert Kramer: So AI is making a big influence on ERP and supply chain as well. And the reason why it is is because the companies that, and I’ll talk about SAP and IBM continuing their longstanding 50 year relationship with the new partnership, but companies need to modernize off of these on-premise. As I’ve talked about for a long time, ERP and IBM and SAP are solidified a new partnership to shift SAP S/4 HANA from on-premise to the cloud. So to facilitate this migration, the two of them are partnering on a way to move. There’s about 10,000 SAP customers that are on IBM power servers. So they’re collaborating. How do we get and modernize these customers to the SAP cloud S/4 HANA? But this is all to take advantage of what we just were talking about, which is AI and agents and automation. You can’t take advantage of that fully by being on premise because you’re on an older system, which is antiquated, which is not going to be set up for the modernization. And IBM is also, sorry to cut you off, but they’re also gearing up their consulting practice even more by purchasing, it’s called AST Application Software Technology, which is an Oracle Cloud House to help with all the migrations that deal with Oracle’s ERP. So you’re seeing this kind of a shift that these companies need to modernize to take advantage of this great technology that all the companies are talking about.

Melody Brue: What do you think? So we’ve seen SAP kind of deliver on a lot of the promises that they made more than a year ago. And I think actually it’s worth probably going back to the article that we co-wrote together along with Jason Anderson about all of Rise and Grow with SAP to kind of see where they were then, where they are now. Some of the things we brought up about continuing to support the on-prem customers, and there has been a bit of talk around about people staying on-prem or returning repatriating and why they would do that. And with Edge Computing becoming more powerful, why would they not? I mean, I understand the advantages to being in the cloud, but for say a smaller to midsize company, that’s a really expensive and long journey for them. Why would they not stay if they can and it’s still being supported and they can still have those same capabilities edge. Is there any reason why they really just should not?

Robert Kramer: I don’t think they’re, well, I don’t think it’s black and white. I think the companies are going to stay on premise. They have proprietary customizations that they’ve made, they will lose and they think they’re difference makers. I think that the companies that do move take advantage of modern technology, which is a big deal for long-term success. The companies do have an option, which is a hybrid approach where you can kind of keep some of those modules and then hybrid IT with the cloud modules and then have the best of both worlds. And the RISE program is a structured approach to the migration. I think it is a successful program, it’s just a matter of the companies getting on board. Like I said in the past, the companies that are on the SAP system are the complex ones, not necessarily SAP. So the companies have to be supported internally and externally because I believe SA going to cut the support off at some point. They talked about 2027, they’ve moved it out a little bit, but at the same time, lots of companies don’t want change and the change involves a gigantic approach of transformation. Your question was more or less, why would they modernize?

Melody Brue: No. Is there any reason why they wouldn’t?

Robert Kramer: The customizations are huge and they have their workflows and their processes and their proprietary ways that has to all go away. And the reason that the success of these companies like Infor or an SAP or an Oracle or Microsoft, they’ve made their systems industry specific. So for example, SAP might have a couple thousand chemical manufacturers on there. All those forms, all those workflows, all the ways that they’re doing business are inherently already in that system from the other ones. So you’re gaining knowledge by going to the new system, but you’re going to have to give up what you were on before. So if it’s proprietary, you might not want to do that, but unless it’s patented, I’d say, Hey, go. Because at the same time that you have to modernize, you’re stuck in time. And I think in 20 years ago, companies felt this was the way to do business.

Now you’re being passed up because you don’t have any of these new features and functionality. And I think it just starts with what we’re talking about. There’s enormous amount of capabilities within the supply chain within sustainability that we’ve talked about with iot getting this data from all over the place and be able to use that real time as I think it’s a huge difference maker. And I don’t know why you wouldn’t do it unless you didn’t have the resources internally to gear up and get it done. So there’s major benefits and they outweigh the ones that not staying on-prem, but you can go hybrid first to get there initially.

Melody Brue: Yeah. Alright,

Robert Kramer: Well no answer.

Melody Brue: We’re going almost at 30 minutes now, so we probably should wrap it up, but I think you’ve got a couple of trips coming up. I’m forgetting I saw some things on the calendar. I’m going to definitely see you at Zoho Day. What do you have before then?

Robert Kramer: So I’m going to Acumanica next week and Microsoft has an AI tour. Then after that I’ll be at Zoho and then going to NetSuite the same week. And then the following week I’ll be at SAP. So it’ll be a great time for us to revisit the discussion on SAP and then to write something new, but some great trips ahead and I’m excited. What about you? Where are you going besides Zoho?

Melody Brue: Right after Zoho, I actually don’t have my calendar in front of me, so I have to do this by memory. But I will be at, RingCentral is having their analyst event. Adobe Summit is coming up, enterprise Connect is coming up. There’s a bunch of other things that I’m not looking at, but a lot of travel and some good stuff. But I’m looking forward to Zoho Day, having some conversations with some customers and some executives there doing a couple of videos and looking at some new products. So we will have more on that after that trip. But for everybody who has joined us, thank you so much for joining us. This was great, great show. Lots of good information and thank you for everybody who joined us. If there’s anything that you’d like to hear us talk about in future episodes, please hit us up on X or LinkedIn. Let us know what you want to hear about. Also give us some comments. We really love feedback. We want to hear what you liked, what you didn’t like. And if you did like it, please hit that subscribe button and please join us for next time.

Robert Kramer: Thank you.

Melody Brue: Thanks so much.

The post The Enterprise Applications Podcast, Episode 2: Salesforce, Microsoft, SAP, IBM, & More appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Qualcomm’s Cristiano Amon https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-qualcomms-cristiano-amon/ Fri, 24 Jan 2025 19:32:40 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45308 Cristiano Amon, President and CEO at Qualcomm, joins host Daniel Newman to share his insights on the technological foresights and innovations discussed at Davos, highlighting Qualcomm's role in pushing the boundaries of connectivity and AI.

The post The View from Davos with Qualcomm’s Cristiano Amon appeared first on Moor Insights & Strategy.

]]>
What’s Qualcomm’s CEO Cristiano Amon saying from Davos?

He has great optimism for growth and the crucial role of collaboration between public-private partnerships in driving progress. Find out why below ⬇

Hosts Daniel Newman and Patrick Moorhead are back with another interview on The View From Davos. They met up with Qualcomm‘s Cristiano Amon, President and Chief Executive Officer, to discuss the latest tech advancements and market trends observed at this year’s WEF. Cristiano shares his valuable insights from the forum including his optimism for growth and the crucial role of collaboration between public-private partnerships in driving progress.

Check out the full interview for more on:

  • AI in real-world applications and tangible value creation with are top of mind for business and government leaders
  • Edge computing is key to unlocking AI’s potential: Latency, privacy, and cost are driving a shift towards distributed computing power
  • The lines between cloud and edge are blurring
  • Qualcomm’s role in powering AI innovation across industries, from mobile to automotive to industrial IoT
  • A new era of IoT is dawning: Advances in AI, edge computing, and connectivity are creating opportunities for a resurgence of the Internet of Things.

Learn more at Qualcomm.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. We’re here at World Economic Forum, having many awesome conversations. World Economic Forum is truly unique, it’s really the intersection of government, commerce and finance, and a lot of technology. And the conversations have been really insightful, haven’t they, Dan?

Daniel Newman: Yeah, Pat. This is the year that all the buildup that’s happened about AI and infrastructure and investment’s starting to come to fruition. The conversations I’m having have been really focused on how are we going to start to derive value from all these investments? How do we start to see this AI meaningfully impact the world in ways that can be counted, right?

Patrick Moorhead: Right.

Daniel Newman: That’s the World Economic Forum. The numbers lead the discussions here, but the underpinnings, it’s so much technology. And if you just walk up and down the promenade, you see how much technology and how much AI is going to shape the next few years.

Patrick Moorhead: Yeah, and obviously a major player in technology is Qualcomm, and we have CEO Cristiano Amon on the show. Welcome back.

Cristiano Amon: Very good, very happy to be here, it’s my favorite thing to do.

Patrick Moorhead: Yeah, I appreciate that. Thank you.

Daniel Newman: All right, and the beard.

Patrick Moorhead: You probably say that and be all the folks.

Daniel Newman: And the beard.

Patrick Moorhead: Yeah.

Daniel Newman: We’re going to have to ask him about that.

Patrick Moorhead: Exactly. No, but Cristiano, maybe we can start with… We’re obviously here at World Economic Forum, you’ve been a multi-year attendee. What do you like to accomplish here, and what kind of conversations are you having?

Cristiano Amon: Look, it has been very productive, and I think if you look at our presence is increasing, the conversations are becoming larger in terms of the number of companies. And I think it has 100% to do with the Qualcomm diversification and growth efforts, especially as you go to a number of different industries. I think if you look at some of the key meetings. I know there’s a lot going on with mobile, with AI, but there’s a lot of conversations about what we’re doing in automotive, and conversation about what we’re doing in PC. You saw a lot of the PC guys in here as well. You see many companies. There are system integrators for enterprises incredibly interested in AI PCs, and the ability to run models on the PCs and how that’s going to change economics. And then most important, we have been very clear about the role of Qualcomm and industrial and AI at the edge. This is a place that you have every single industry and we have been having conversations about how our technology at the edge can be transformative. And it has been a helpful event to continue to help Qualcomm in our growth and diversification as we take our technology to different industries.

Patrick Moorhead: A great answer.

Daniel Newman: Yeah, no, it was well-thought-out. Qualcomm though is historic around these parts for your nightcap events. Although now you’ve done two this year, because you did the red party which was really great. And then last night you had your big nightcap. And I’ve got some harder questions for you. But before I do, I was so impressed with the mix. We talk about public-private and you brought some just wonderful speakers on stage, really hearing how they’re thinking. And as I said, thinking how they’re leading of course with economic growth. But so much of what sits under the things that companies like Qualcomm are doing. Tell us about the event a little bit, and how this really brings that public-private partnership together.

Cristiano Amon: Yes, so the event is a cool thing. Actually, we started a few years ago, and it’s late. It was 10 PM, was Qualcomm nightcap. It’s becoming a tradition, there’s so many people. They’re coming in, it was good. I think you guys saw it, it’s well attended. You have, as you said, the right mix of CEOs in the government. I will say that with all humility, its company built itself on partnerships. Even when we started our licensing business, we develop a technology and license to everybody so everybody can build it. And I think it speaks to the role of Qualcomm in being a partner of both governments and companies, and I think that’s what you see in the nightcap. And we had, for example, the prime minister of Belgium. We have the finance minister of Germany, we have the AI Minister of France. It was good, having conversation as you saw about the importance of Europe, to understand the need for innovation and how they’re going to reshape their policies to enable innovation to happen, the excitement about AI for all industries, and it was a good discussion. I’m also really excited about having some of the major car company CEOs in here, the PC company CEOs in a number of other industries. And hopefully we’ll continue and be a tradition, and you guys will come in every year.

Daniel Newman: Of course.

Patrick Moorhead: We’d love to do that. Thank you for the invite for the next 10 years, I appreciate that.

Cristiano Amon: Oh, you have it, you have it.

Patrick Moorhead: So Cristiano, Qualcomm, it’s about compute and connectivity, and I think last night was a great example of connectedness in pulling that together. I did want to ask you about the general discussion. There’s so many changes right now, and I obviously have my opinions on what I’ve heard. But what’s your sense about the level of optimism for growth in the industry, the idea of regulation, maybe interactions between different geopolitical nations out here?

Cristiano Amon: Okay. Complicated question, I will do-

Daniel Newman: How much time do we have?

Cristiano Amon: Yeah, yeah, yeah.

Patrick Moorhead: A 10-part answer here.

Cristiano Amon: But I’ll do my best to give a simple answer. Look, there is optimism right now in the industry, and especially… By the way, just frankly speaking, there’s often-

Patrick Moorhead: Well, I heard it last night.

Cristiano Amon: Yes.

Patrick Moorhead: From people I wouldn’t normally expect, I heard optimism.

Cristiano Amon: That’s great. And I think there’s optimism, for example about the US economy. I think there’s expectation, there’s going to be growth. The combination of energy deregulation and technology innovation usually is a good combination. It drove a lot of the thought process about things that need to happen in Europe. And we see companies really thinking maybe this is the year that technology started to move into commercialization. Maybe there’s real value to be created. We obviously have a bias, because we think that the ability to do AI at the edge is going to be transformative. You can leverage all this computing that is going to be out there. Companies are really thinking about how technology is going to transform their company. I can speak all day about automotive companies thinking in a time of a crisis and accelerated competition, we need to innovate, we need to invest in technology, we need to become tech companies, and I think that dialogue is actually positive. So I think there is an expectation that we’re going to start to see some growth. I’ve been cautiously optimistic, but it was good. And if anything, I think there was a wake-up call I think in Europe, we’re here in Europe about needing to accelerate the ability to innovate, and maybe its regulation needs to be targeted towards that, towards innovation. And you heard that from the prime minister.

Patrick Moorhead: 100%.

Cristiano Amon: You’ve heard that, yeah.

Daniel Newman: It was good to hear it, and I had a good laugh. I’ve been telling this story of this funny post I saw online, Cristiano, which showed a US innovation and it shows Musk SpaceX catching the rocket out of midair. And it shows Europe innovation. It shows a guy lowering a bottle cap onto a water bottle with a little string connected to it, and he lets it go and they show the right and the left. So there’s obviously a need, and I think the US drives and I think Europe is seeing how fast we’re moving and hopefully they want to follow. And I know there’s a lot of great partnerships. And by the way, there’s a lot of good purpose in some of what they want to do with data and privacy. It’s not one is right and one is wrong, but we see in the US how fast things go. And now we’ve got even more deregulation.

I want to talk a little bit about Qualcomm specific and the edge. The edge and the AI opportunity. I see so much value from this side of vendors, from the side of states’ governments as it relates to everything from lowering the cost of energy, optimizing energy, bringing compute closer in automotive makes a ton of sense because it’s safe. You don’t want cars making decisions in the cloud. At the same time with things like LLMs though, we’ve seen consumers don’t necessarily care where it’s processed as long as the latency is low enough. How do you see that bifurcation of what gets driven to the edge, and making sure that consumers find value in edge applications that maybe they don’t directly feel, things like lowering costs of energy?

Cristiano Amon: No, look, you touch on a number of very important points.

Daniel Newman: Yeah, that was a long question, yeah.

Cristiano Amon: No, but you have all of the key aspects of it. I want to start by saying this. It’s a mistake to think about, oh, this is going to be cloud or edge, and one versus the other. I’ll give an example. Smartphones are the most cloud-connected devices of any devices. It’s 100% connected. If it’s not connected, it is like an iPod touch. So it is the most cloud-connected device. And at the end of the day, computing power on phones, generation after generation is going to continue to increase because the things that you do on your phone. And I think it’s not going to be different of any computing systems. And I think what’s happening right now is there are reasons that you’re starting to see some of those technologies becoming very relevant. Latency, you talk about that. And it’s funny, because I come, as you guys know, from the wireless industry. You used to talk on the phone. We used to talk on the phone. When we talk on the phone in the past, let’s go talk about the 3G era. Operators, they had an excellent network. There was a premium price for the operator, because voice is unforgiving. We start texting each other. If you have an operator not so good, you move around to still get your text, you equalize it.

But now if you are starting using voice, for example on the phone, to talk with AI, you want a response right away, latency becomes important. Voice becomes important again, so you start to see those technologies, you want a response. A lot of companies, when they start about agents in voice recognition, they say all of the LLM, they do audio to text, needs to be local on the device. Privacy. And the other thing which is interesting, is you mentioned cost. And the cost, you look… We’re starting to go from training to inference, and starting to go to real production and it’s starting to get to the end goal, real commercial value. So companies are coming to say every time somebody pressed the AI button in their application, whether it’s the phone and the PC, I run the stuff on the cloud, it costs me money. And why don’t I use the existing computer that’s available? So we had a number of conversations of ISVs that said, “AI PCs are open platform. I’m just going to push my model down, and I’m going to work on that and that’s free, because the computing is available.” So all of those things are starting to happen, and that’s a great opportunity for us.

Patrick Moorhead: As we bring this conversation to a close, I did want to ask you about priorities for 2025. Your diversification plan is working. Just looking at the numbers, you’d have to be under a rock to not see that, and you’ve extended this even farther to the industrial edge. There seems to be a lot of work, and impetus and effort going into that. What do you want to accomplish? When we’re back here in 2026, the day after your late night party, what do you want to have accomplished?

Cristiano Amon: Look, I’m going to answer this question this way. Because see, one thing about our company, we have to constantly be pushing. So it’s not because we succeeded in one area that we’re going to stop pushing in this area. So I’m going to divide the answer into two folds. Priorities right now is on mobile. We’re very, very aggressive. Now that we have the leadership position and performance, we’re going to drive a transition to AI phones. That’s going to create a lot of opportunities. On cars, our position is accelerating. And now there’s this view that GenAI in the car will create more value on those digital cockpit platforms. In this year of ’25, we’re going to launch our stack. They want to be jointly developing with BMW, so that’s a big priority for us. PC, we’re ramping. Our priority is we’re launching more markets, we’re more activity in the channel. We’re starting to see the designs. We talk about our designs went from 60 to 100.

So those things we’ll continue doing. But there is a new priority that we hopefully when we come here in ’26, we’re going to see traction. We’re super focused on industrial. We had made some acquisitions, we build a new product. We are announcing, we built a new software platform with a different solution for each vertical. Because we believe that there’s going to be incredible value of what I call the next wave of industrial high-performance computing, AI at the edge and connectivity. We build a platform for energy, for retail, for manufacturing, for warehousing. We’re building solutions across those, and hopefully we’ll get some scale when we think about this next year.

Patrick Moorhead: Excellent.

Cristiano Amon: Yeah, we have some other things we’re working on, but I can’t say it right now.

Patrick Moorhead: Ooh, do we hear that everybody?

Daniel Newman: I wish one of these days we could share some of the great conversations we have when we’re not on the record, but unfortunately we just can’t do that. I did want to say, and then I’ll run us out. But it sounds to me from conversations I’ve had, Cristiano, that we are ready for the next era of IoT. Literally, it had that dip. But it seems like with AI, with Edge, with all the compute processing we have now, networking that we’ve created, what we thought could have been IoT five, 10 years ago has a chance to come back really, really strong.

Patrick Moorhead: Yeah, for sure.

Cristiano Amon: 100%. Let me give you a parallel. Sorry, you want to finish your question?

Daniel Newman: No, no, go ahead. Let’s do it.

Cristiano Amon: I’ll give you a parallel.

Daniel Newman: I know we got to go. But cut it in, give the chance to say this part and then we’ll-

Cristiano Amon: No, because I want to give it a parallel. It’s actually very important. When you think about automotive before, what was the automotive platform before? It was microcontrollers not connected. All of a sudden, the car became advanced computing platform. This is now happening with all of the other things in the industry. Microcontrollers get replaced for high-performance computing with AI and connectivity. It’s the same exact transition.

Patrick Moorhead: Excellent.

Daniel Newman: Well, it’s very exciting. Everyone out there, thank you so much for joining The Six Five On the Road with a View from Davos. A great conversation as always with our friend Cristiano Amon, CEO of Qualcomm. Subscribe, be part of our community. Check out all the coverage we had here at this year’s World Economic Forum. Great year ahead, we’ll see you all soon.

The post The View from Davos with Qualcomm’s Cristiano Amon appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Meta‘s Yann LeCun – The Future of AI is Open and Human-Level Intelligent https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-metas-yann-lecun-the-future-of-ai-is-open-and-human-level-intelligent/ Fri, 24 Jan 2025 18:38:08 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45303 Yann LeCun, VP & Chief AI Scientist at Meta, joins Daniel Newman and Patrick Moorhead, sharing his insights on AI's transformative potential as seen at Davos. LeCun discusses Meta's AI advancements and their impact on global issues.

The post The View from Davos with Meta‘s Yann LeCun – The Future of AI is Open and Human-Level Intelligent appeared first on Moor Insights & Strategy.

]]>
How important is open source to the future of AI and are we at human-level intelligence yet?

Patrick Moorhead and Daniel Newman are joined by Meta‘s Yann LeCun , VP & Chief AI Scientist for a conversation on the latest AI developments and insights from WEF25 in this segment of The View From Davos.

Get their take on:

  • The importance of open source for accelerating AI development and
  • Going beyond LLMs: LeCun imagines future AI systems will understand the physical world, reason, plan, and have persistent memory
  • The role of AI in addressing global challenges
  • Insights into future AI projects at Meta
  • Yann LeCun’s perspective on ethical AI and its governance

Learn more at Meta.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View from Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. It’s been a great week so far. And World Economic Forum is this very unique combination, this melding of technology, regulation, talking about governments. And of course there’s a lot of discussion around AI.

Daniel Newman: Yeah, it’s been a really good week so far, Pat. And the opportunity to speak to many of the world’s leaders, both in enterprise and government, provides the chance for us to really share with the audience everything that’s going on in the market. We’re in this really interesting inflection. And we’re seeing AI accelerate this year. And Pat, the people we have on the show just bring so much new insights. So, hopefully everybody out there is spending some time with us here in Davos.

Patrick Moorhead: And I want to introduce a guest that actually probably doesn’t even need an introduction, really a champion of open source AI, real mover and shaker out there, and is not afraid to have conversations out in social media. Yann, welcome to the show.

Yann LeCun: A pleasure.

Patrick Moorhead: Yeah. I guess first and foremost, what are you trying to achieve at the show? There’s so many changes that have happened in the last year. What do you want to achieve here?

Yann LeCun: Okay. What I spend most of my time on, despite which might be misled into thinking because of my external activities, I’m really working on fundamental research to get to the next step in AI because current technology is very limited. Everybody’s excited about LLMs, and we should push them as far as we can and they’re super useful, but they’re not a path towards human-level intelligence. So, I’m really working on how can we fix that?

Daniel Newman: Well, Yann, you and your team are doing some really incredible work, especially around open. We hear open AI, and of course a lot of people will also argue that’s not actually open.

Yann LeCun: Not at all.

Daniel Newman: Meta has been really focused on bringing open source to the market and enabling so many people to use what you’ve built with Llama, expand upon it. Can you just talk a little bit about the thinking behind that? Because historically, Meta has not necessarily been all about open. But in AI, it seems like that is the strategy. And it’s really working well.

Yann LeCun: No. Actually, the whole openness story is really in the DNA of the company. When I joined Meta in late 2013, and I was talking with Mark Zuckerberg and Mark Schaefer, who was the CTO at the time, I said, “For me to join Facebook, to create a research lab, I have three conditions. The first one is I don’t move from New York. I don’t quit my job at NYU, so I’ll be part-time. And the third one is we need to do open research, publish everything we do, and open source our code.” And the answer from both of them was, “Oh, you don’t have to worry about this. It’s in the DNA of the company.” I quote and say, “We’re already open source, all of our infrastructure software.” And so, I found that very interesting and reassuring, some message I think I wouldn’t have gotten from any other player at the time. And as a consequence, we created the lab. We famously announced that we were going to do open research. And the consequence, other labs actually became more open, like Google. And they kind of rescinded this a little bit now, but… And then, OpenAI was created a couple of years later. And they were supposed to be open, but clammed up since then completely. Same with Anthropic.

So, we’re the only major player really to play an important role in open source, together with a few Chinese players who are really good. So, the advantage of this, I mean the reason why we’ve seen such big progress in AI over the last decade or so, is because of the openness. It’s because information circulates quickly and freely, and that’s what pulls everybody. If we start clamming up, progress is going to slow down inevitably. So, that’s one reason. The second reason is, if you want to attract the best scientists and researchers, and you tell them, “You can’t talk about what you’re doing,” you’re not getting the best people. Third, we get a lot of really interesting advances from the open source world: contributions, ideas, like how to accelerate inference with Llama and things like that. There’s a lot of really interesting work coming from academia, from startups, from independent researchers. A lot of applications are enabled by AI. I mean, basically Llama is the substrate on which the entire AI industry now is being built. Most startups use Llama. And a lot of large companies are now migrating from proprietary systems to Llama.

Patrick Moorhead: Yeah, it’s been a really impressive run. But Llama openness didn’t surprise me. Because if I look at the Open Compute Project that you did, PyTorch, you do have a history of enabling a lot of developers to make things happen. I want to ask you about the future. I know research should be measured in terms of years, but I’d like to ask you about what should we expect over the next two years. I know everybody’s got a different definition of AGI. What should we expect? And I know there’s no black and white answer here, but what are your thoughts about the future?

Yann LeCun: Okay. So, I don’t like the phrase AGI famously because human intelligence is very specialized in the first place, right? We know this because we have a lot of computer systems that can do much better than humans in narrow areas. That means we’re not so good at everything. So, at Meta we use the phrase AMI, Advanced Machine Intelligence.

Patrick Moorhead: Okay.

Yann LeCun: We pronounce it Ami because that means friend in French. And that’s the main mission of FAIR. So, FAIR is the Fundamental AI Research Lab. The F used to stand for Facebook, but now it’s fundamental. And the main mission is really to figure out the next generation AI system that is capable of doing things current systems can’t do: understanding the physical world, having persistent memory, and being able to reason and plan. Okay, those are the four things that LLMs really can’t do without added ingredients to it. So, what’s going to happen over the next two years is that there’s going to be progress using the current paradigm, LLMs with words with things bolted on it.

So, we can do a little bit of reasoning, it can understand images, and various things like this. But it is going to be a huge hack. And there is diminishing return in how much better they get with more data. We’re running out of data, so it’s saturating. So, we need this new paradigm for the next… After that, so I expect to see some early progress on this sort of new paradigm within three to five years. And perhaps in five years we’ll know if we’re on the good path towards something like human level intelligence. The idea behind this, I mean the reason we’re working on this, is because we see a future where everyone will wear one of those smart glasses and we’ll interact with them through voice or through bracelets with EMG and various other interfaces. And we need the system to have human level intelligence if you want them to basically act like human staff or assistant, right?

Daniel Newman: Yeah.

Yann LeCun: So, all of us would be a boss of a staff of virtual smart people.

Daniel Newman: Well, it’s a really exciting future, Yann. I want to thank you so much. By the way, the glasses look great. It’s come a long way.

Yann LeCun: Yes, yes.

Daniel Newman: Very stylish. That’s been sort of my inflection, is when they were stylish enough that I could actually pull it off and wear them.

Yann LeCun: Right.

Daniel Newman: And you’re wearing them very well. But thanks so much for opening up to us. This is definitely one of those conversations, Pat, that I would like to have spent maybe another 20, 30 minutes. But in Davos here, spending 20 or 30 minutes is like eight meetings, right?

Yann LeCun: Basically.

Daniel Newman: Speed dating. But Yann, thanks for joining The Six Five. Let’s have you back again sometime soon.

Yann LeCun: Thanks for having me on.

Patrick Moorhead: Thanks, Yann.

Daniel Newman: And thank you, everybody, for tuning in. What a fascinating conversation. We appreciate you joining the Six Five On the Road. It’s a View from Davos. Hit subscribe. Join us for all the great conversations here on the magic mountain for this episode. Time to say goodbye. We’ll see you all later.

The post The View from Davos with Meta‘s Yann LeCun – The Future of AI is Open and Human-Level Intelligent appeared first on Moor Insights & Strategy.

]]>
RESEARCH NOTE: Hearst Corporation Modernizes Oracle ERP with Strong Change Management and Data Management Practices https://moorinsightsstrategy.com/research-notes/hearst-corporation-modernizes-oracle-erp-with-strong-change-management-and-data-management-practices/ Fri, 24 Jan 2025 18:07:51 +0000 https://moorinsightsstrategy.com/?post_type=research_notes&p=45298 As more enterprises than ever embrace new AI-driven solutions, it’s worth remembering that technology modernization and transformation projects must address the human side of the process along with deploying advanced algorithms or adopting the latest tools. For years now, I’ve returned again and again in my writing to the two pillars of change management and […]

The post RESEARCH NOTE: Hearst Corporation Modernizes Oracle ERP with Strong Change Management and Data Management Practices appeared first on Moor Insights & Strategy.

]]>
Hearst has used thoughtful change management and data management processes to drive adoption of modern ERP functions as well as AI.

As more enterprises than ever embrace new AI-driven solutions, it’s worth remembering that technology modernization and transformation projects must address the human side of the process along with deploying advanced algorithms or adopting the latest tools. For years now, I’ve returned again and again in my writing to the two pillars of change management and data management, both of which are essential for turning technology aspirations into actionable outcomes.

Change management addresses the organizational and human aspects of transformation, while data management ensures that data completeness and quality are maintained so that insights drawn from data-driven systems are accurate, actionable, and timely. Long experience tells me that without both of these disciplines in place, enterprises will struggle to modernize applications, integrate workflows, or derive the technology-driven insights that will allow them to make informed decisions.

Recently I had the opportunity to talk about these principles with David Hovstadius, senior vice president of finance operations at Hearst Corporation. He paid close attention to both change management and data management when he led the company’s transition to Oracle Cloud ERP. Even though that ERP system was implemented some years ago, the groundwork laid then has continued to empower the media company to embrace the evolution of technology—and the continuous improvement of processes—as AI has come to the fore.

Hearst’s ERP Journey

Hearst began its modernization journey in November 2016 by transitioning to Oracle Fusion Cloud ERP, a SaaS application. The transformation was ambitious, encompassing 300 businesses across 60 locations and key functional areas such as finance, human resources, and procurement. The phased rollout started with a pilot in July 2017 and concluded in March 2018, providing a structured approach to managing complexity while maintaining operational continuity. When I talked with him, Hovstadius told me, “No one was talking about gen AI back in 2017. Our priority was to implement a SaaS platform and common data model that would allow us to adopt new capabilities and functionalities as they became available. At the time, we didn’t completely comprehend the value of this decision.”

Adopting the SaaS model was a strategic move, allowing Hearst to offload responsibilities such as system maintenance, updates, and testing to Oracle. This enabled Hearst’s finance operations and technology teams to leverage the system’s capabilities to enhance business outcomes rather than getting caught up in technical details.

The SaaS approach also marked a shift in mindset: Oracle evolved from being a service provider to a strategic business partner. This partnership allowed Hearst to integrate new tools, such as intelligent payment systems and automated financial processes, which proved crucial for navigating the complexities of the media industry. These tools simplified operations and enabled Hearst to adapt quickly to changing market demands and to promote innovation.

The Role of Change Management

The success of many ERP implementations depends on getting the organization and its employees to embrace change. For Hearst, transitioning to Oracle Cloud ERP meant moving away from legacy processes and adapting to technological advancements—as well as to organizational culture shifts. Hearst implemented change management strategies including early stakeholder engagement, structured training programs, and transparent communication to reduce staff resistance, minimize disruptions, and encourage employee buy-in.

Despite these proactive efforts, employee adoption of new processes remained a challenge, as is often the case when personnel have to navigate unfamiliar systems. To help smooth the process, Hovstadius and his team conducted roadshows, interactive workshops, and targeted training sessions. These initiatives went beyond merely teaching how system functionality works to helping users understand how the various ERP components are interconnected to support broader business objectives. Hearst achieved a smoother transition by helping employees see the system as an opportunity rather than a disruption. Hovstadius credits this approach as leading to minimal operational downtime and faster realization of the system’s value.

A key takeaway from Hearst’s experience is that effective change management extends well beyond technical implementation. It requires reshaping mindsets, improving collaboration, and building trust across teams. One effective strategy was appointing “change champions”—employees who advocated for the new system, addressed their colleagues’ concerns, and served as liaisons between leadership and end users. This collaborative approach reinforced the key insight that change management works best as a shared organizational effort rather than a top-down set of instructions.

Adapting to new processes and harmonizing data required employees to rethink their workflows. Hearst invested in comprehensive education, ensuring that employees were supported at every transition stage. Systematic sharing of best practices also played a critical role in easing the adjustment. Drawing on its experiences with change management during the ERP rollout, in 2023 Hearst used some of the same lessons—and further demonstrated its commitment to innovation—by establishing an AI working group to explore machine learning and AI technologies in the finance function.

The Significance of Data Management

Data management was another cornerstone of Hearst’s transformation. Accurate and consistent data flows are essential for any ERP system because they support decision making, operational efficiency, and long-term sustainability. Early in the ERP rollout process, Hearst extensively evaluated its data architecture to identify inefficiencies and establish a foundation for consistent monitoring and refinement. One example of an important outcome of this evaluation was the consolidation of 15 different charts of accounts into a single, standardized chart. Hovstadius said that this simplification revolutionized reporting processes, allowing senior leaders to access insights within minutes instead of hours or days.

In addition, integrating Oracle Cloud ERP and Oracle Fusion Cloud HCM systems with third-party platforms such as ADP payroll required introducing consistent data definitions across functions. This standardization enabled granular analyses of employee costs, categorized by dimensions such as function, region, and country. These detailed insights were transformative in themselves because they gave leaders a clearer view of operational performance and enabled data-driven decisions with unprecedented precision.

Hearst’s commitment to maintaining data quality was significant. Continuous monitoring practices were implemented to confirm that data remained reliable, actionable, and aligned with organizational goals. This emphasis on data integrity was more than a technical requirement; it was strategic. By reinforcing trust in the system’s outputs, Hearst ensured that the information resulting from the data was credible. And the company’s commitment to data quality continues today.

Building an AI-Ready Finance Platform

Hearst’s transformation extended beyond ERP-focused operational improvements to create a new foundation for integrating AI-driven innovations. The changes made back then proved to be valuable years later, when it became essential to understand AI and what could be achieved with it. As Hovstadius told me, “As we now look at AI/ML and what enables it, it is consistency in data and process, which means we are well positioned to continue to adopt these capabilities as they are released.” In other words, the hard work of data management and change management paid off in the successful ERP implementation years ago, but it has continued to pay dividends during the adoption of AI over the past couple of years.

Today the company is seeing significant efficiencies from AI-driven automation and standardization of Oracle ERP processes across eight business groups. These efforts have delivered tangible outcomes, such as automating discount proposals for early payment. That innovation alone has saved Hearst hundreds of thousands of dollars while strengthening supplier relationships. Another notable advancement was the implementation of intelligent document recognition, which modernized invoice matching processes and reduced manual errors.

Looking beyond these immediate applications, Hearst has also explored the potential of generative AI for financial narrative reporting. That capability promises to accelerate reporting processes while delivering deeper insights to support more strategic decision making. By continuing to adopt new AI capabilities in its ERP system, Hearst has positioned itself to remain responsive to evolving industry dynamics.

The Beneficial Outcomes of Good Change Management and Data Management

Hearst’s successful Oracle transformation reflected its deliberate focus on change management and data management. Operationally, Hearst integrated its 300 businesses, restructured finance, HR, and procurement processes, and standardized data structures to improve reporting timelines. More than that, it improved the company’s ability to innovate going forward. More recent AI-driven innovations have contributed to cost savings and better supply chain relationships. In the long run, practicing consistent data management and developing an AI-ready finance platform have enabled Hearst to keep adapting to the evolving media landscape.

Today, Hearst is seeing the dividends of its past transformation. Hovstadius said, “By developing a strong data strategy, creating a culture of innovation and experimentation, and instilling digital and technical skills in finance teams, we continue to embrace generative AI capabilities as they are introduced to improve productivity and insights. Change management will continue to be a priority as we adopt the latest AI features.”

For organizations pursuing similar transformations, Hearst’s experience offers a real-world example to follow. Effective change management must align people and teams, address resistance from affected stakeholders, and promote trust and collaboration. Emphasizing data quality from the outset is crucial, with early standardization laying the foundation for long-term success. Phased implementations can help manage complexity, while using integrated platforms facilitates cross-departmental alignment and operational efficiency.

By addressing both the technological and the human aspects of digital transformation, Hearst has demonstrated how organizations can achieve operational improvements while preparing for sustainable growth and innovation. Its experience underscores the importance of empowering teams, maintaining data integrity, and pursuing a well-defined strategic vision to achieve meaningful business outcomes. This is a great use case for enterprises starting their own transformation journeys.

The post RESEARCH NOTE: Hearst Corporation Modernizes Oracle ERP with Strong Change Management and Data Management Practices appeared first on Moor Insights & Strategy.

]]>
RESEARCH NOTE: Cisco AI Summit Highlights Need For Safe, Secure AI Enterprise Controls https://moorinsightsstrategy.com/research-notes/cisco-ai-summit-highlights-need-for-safe-secure-ai-enterprise-controls/ Fri, 24 Jan 2025 16:24:24 +0000 https://moorinsightsstrategy.com/?post_type=research_notes&p=45290 Cisco held its first AI Summit earlier this month in Palo Alto, California — and from my perspective, it stands out as the best AI event I have attended to date. The conversations featured a Who’s Who of tech innovators in the epicenter of AI innovation, serving as a dais for the company’s announcement of […]

The post RESEARCH NOTE: Cisco AI Summit Highlights Need For Safe, Secure AI Enterprise Controls appeared first on Moor Insights & Strategy.

]]>
CEO Chuck Robbins at the opening session of the Cisco AI Summit (Photo by Will Townsend)

Cisco held its first AI Summit earlier this month in Palo Alto, California — and from my perspective, it stands out as the best AI event I have attended to date. The conversations featured a Who’s Who of tech innovators in the epicenter of AI innovation, serving as a dais for the company’s announcement of a new security platform: AI Defense. In this research note, I will provide my insights on what was shared with a small group of industry and financial analysts and global enterprise executives. Most importantly, I’ll highlight what I found noteworthy about the content and Cisco’s newly minted approach to AI security, which is expected to be generally available later this spring.

Modern AI applications and use cases continue to be at the center of many technology event conversations — and for good reason. Gen AI is poised to transform whole industries and, in the process, deliver new levels of employee productivity and operational efficiency, along with improved cybersecurity posture. Cisco’s inaugural AI Summit brought together thought leaders, ecosystem solution providers, and customers for genuinely deep discussions related to silicon, data management, governance, security, developer operations, marketing, search, venture capital, and more. 

My Favorite Cisco AI Summit Sessions

Cisco chief executive Chuck Robbins kicked off the event, setting the context for the sessions that followed. The company continues to expend considerable effort in researching enterprise readiness to adopt AI solutions, as evidenced by its AI Readiness Index reports. Last year, Cisco published its findings across six pillars — strategy, infrastructure, data, governance, culture, and talent. Robbins drew from that data at the event to share that only 10% of its surveyed customers are fully ready to leverage the power of AI. That is a telling statistic, one that captures how fast modern AI applications and workloads are moving relative to previous technology sea changes such as cloud computing — and how unprepared enterprises are to leverage it.

After the Robbins keynote, three of the conference sessions stood out for me. The first focused on the power of silicon with Groq, Inc. Groq develops purpose-built silicon for AI — what it calls a language processing unit. The company’s LPU is designed to process the unique speed and memory requirements of large language models, the underpinning of gen AI workloads. My colleagues and I have written about the evolution of Ethernet and how companies including Marvell and AMD are leaning into custom silicon to unlock new value in AI with respect to faster, more performant networking and interconnect solutions for front- and back-end deployments. However, it’s not enough simply to provide lower latency and higher performance to address the demands of modern AI applications. In my opinion, the single biggest issue facing infrastructure providers is power consumption and the related need for more sustainable solutions. Those silicon providers like Groq that can unlock power efficiency alongside performance will reap the rewards in the form of market-share leadership.

The second session I found noteworthy was one focused on agentic systems with Box. Many corporate pros are familiar with Box, a cloud storage company that is innovating beyond its core product set with new tools that allow enterprises to build and manage custom AI agents through Box AI Studio. (My colleague Melody Brue wrote about this in detail a couple of months ago.) Agentic AI has great promise, and many industry pundits, including me, believe that it has the potential to birth a trillion-dollar industry. Agentic AI aims to build upon gen AI, moving from user prompts for content creation to the autonomous handling of complex tasks and workflows. The power of agentic AI lies in its ability to independently weave a set of separate tasks together, learning throughout the process. The uses cases are nearly limitless because AI agents can automate tasks that are tedious or prone to error, improve customer support, or transform both consumer and commercial applications and workflows in other ways. The session with Box went far to demonstrate agentic AI’s potential, as well as the opportunity for both Box and Cisco to facilitate its deployment.

However, both generative and agentic AI require safe and secure methods to ensure the integrity of how agents answer questions and generate content, which brings me to the third session I found especially impactful. Cisco’s Jeetu Patel, who now leads the company’s networking and security solution development efforts as chief product officer and served as the event’s master of ceremonies, spoke to the unique security challenges created by algorithmic models. Patel shared that, broadly speaking, both large and small language models are not deterministic in nature and can be easily compromised, especially as AI deployments become more hybrid, extending from the cloud to network edges and personal devices at massive scale. I wholeheartedly agree with his assessments. Safety and security will become paramount for both infrastructure protection and the use of modern AI. Patel used that context to announce Cisco’s latest AI security innovation — AI Defense.

Cisco Launches AI Defense to Protect Modern AI

Cisco says that AI Defense will allow enterprises to develop, deploy, and secure AI applications with confidence. It aims to provide an end-to-end set of capabilities that protect the use of modern AI applications, safeguard against AI’s misuse, prevent data leakage, implement access control, and leverage the power of network observability to dynamically protect companies. From an initial look, it is a compelling set of capabilities spanning AI application discovery, runtime security, and model validation.

To my mind, the model validation aspect is a potential game-changer. Cisco recently acquired Robust Intelligence, a company that specializes in automated AI security tools. Given AI Defense’s announced general availability in March, Cisco is quickly integrating Robust Intelligence’s automated red-teaming capability to enable dynamic guardrail protection. Historically, red-teaming has been a manual, labor-intensive process employing ethical hackers to break organizational security systems to expose vulnerabilities and inadequate security controls. Cisco claims that its automated red-teaming methodology is the first of its kind, and if that is the case it could command a significant advantage in delivering model validation more quickly and at scale. Only time can tell AI Defense’s overall value, but if Cisco can effectively move its customers and prospects into early preview, that could serve as a fast ramp to revenue for AI Defense.

Tipping the Scales in Favor of Defenders

Cisco’s AI Summit went a long way toward demonstrating many different business opportunities for Cisco tied to the transformative impact of modern AI within the enterprise. First and foremost, the event was highly educational, bringing industry luminaries together to share developments and lessons learned along the AI journey. It also highlighted Cisco’s capability to serve as a trusted AI advisor, and to provide solutions including AI Defense and the Hypershield security architecture launched last year. I expect that these two solutions may merge over time, but one thing is certain in my mind Cisco is doubling down on cybersecurity and is tipping the scales in favor of the defenders.

The post RESEARCH NOTE: Cisco AI Summit Highlights Need For Safe, Secure AI Enterprise Controls appeared first on Moor Insights & Strategy.

]]>
Data Protection In 2024 Brings Greater Automation And Resilience https://moorinsightsstrategy.com/data-protection-in-2024-brings-greater-automation-and-resilience/ Fri, 24 Jan 2025 15:00:21 +0000 https://moorinsightsstrategy.com/?p=45058 Data protection vendors are using AI alongside other technologies to identify threats, automate governance and maintain compliance for enterprise customers.

The post Data Protection In 2024 Brings Greater Automation And Resilience appeared first on Moor Insights & Strategy.

]]>
Data protection vendors are using AI alongside other technologies to identify threats, automate governance and maintain compliance for enterprise customers. Pixabay and Robert Kramer

In 2024, data protection software stayed front and center for enterprises as cyber threats became more advanced and regulations more demanding. AI-powered tools came to the fore as a practical way to automate governance, maintain compliance and identify threats. Vendors such as Cohesity, Commvault, Rubrik and Veeam improved their market presence through a variety of means including acquisitions, partnerships, going public and adding features to their platforms. The new features included immutable backups and AI-driven anomaly detection, giving enterprises faster and more reliable ways to spot disruptions and recover from them. Observability tools also became increasingly relevant, bridging the gap between system monitoring and data protection to provide more integrated and proactive solutions.

2024 Data Protection: Key Vendor Highlights

One of the biggest shifts came when Cohesity completed its $7 billion acquisition of (a significant part of) Veritas in December, merging operations with a promise to deliver comprehensive data management and protection solutions. Key offerings aim to improve threat detection, simplify compliance and reduce the inefficiencies and blind spots caused by data silos. Summarizing the logic behind the deal, Cohesity CEO Sanjay Poonen stated, “This deal combines Cohesity’s speed and innovation with Veritas’ global presence and installed base.” The merged organization, projected to generate $2 billion in fiscal 2025 revenue, will serve over 13,000 customers, including many Fortune 100 and Global 500 companies. For more details, see the research paper that I wrote on Cohesity and the merger a few months ago.

Commvault introduced its Cloud Rewind feature, leveraging technology from its acquisition of Appranix to automate cloud application recovery. This feature automates the rebuilding of cloud applications after cyberattacks, including ransomware attacks, to help enterprises recover faster without relying on time-intensive manual processes. Sanjay Mirchandani, Commvault’s president and CEO, stated, “We are taking resilience to the next level by combining Commvault’s extensive risk, readiness and recovery capabilities with Appranix’s next-generation cloud-native rebuild capabilities.” By addressing data recovery and the restoration of cloud apps that interact with that data, Cloud Rewind aims to help enterprises minimize downtime and resume operations quickly; according to the company, it can often achieve this in a fraction of the time traditional methods require. Additionally, Commvault used its fourth-quarter Shift event in London to introduce new solutions for data recovery and cyber resilience designed to integrate with AWS, Google Cloud and other environments.

In April, Rubrik raised $752 million in an IPO to support the growth of its Zero Trust Data Security solutions for multi-cloud data protection. Building on its partnerships with Microsoft and AWS, the company introduced tools such as Cloud Vault for air-gapped backups on AWS and Azure Blob Protection for compressed immutable backups across Azure storage tiers. It also integrated AI-driven technologies to address various governance, compliance and threat mitigation challenges. Key innovations include Ruby, an AI assistant using the Azure OpenAI Service for cyber threat response; Data Threat Engine, which applies machine learning to detect suspicious activities; and SentryAI, an in-house engine analyzing customer telemetry data. These tools, combined with features like immutable backups and anomaly detection, aim to protect against threats such as ransomware and support faster recovery.

In December 2024, Veeam Software announced a $2 billion secondary equity offering led by TPG, with participation from Temasek, Neuberger Berman Capital Solutions and others; the financing values the company at $15 billion. (The transaction is set to close in early 2025, subject to regulatory approvals.) As of September 2024, Veeam reported $1.7 billion in ARR for its software and SaaS subscription business. CEO Anand Eswaran explained that the funding was intended to diversify the company’s investor base and prepare for strategic opportunities, including a potential IPO.

Cloud Service Providers And Other Data Protection Vendors

Beyond the data protection providers already mentioned, there are a range of larger and smaller vendors offering solutions to meet their customers’ data protection needs. In 2024, the three largest cloud providers—Microsoft, AWS, and Google Cloud—prioritized advancements in integrated data protection tools to address the growing challenges of AI adoption and cybersecurity threats. Microsoft announced the general availability of its Purview Data Governance solution. It included new features such as an embedded Copilot experience for data governance and enhanced integration with Microsoft Fabric. AWS improved its Amazon Macie service by developing a proof-of-concept for automated discovery, allowing customers to validate its effectiveness in strengthening data protection strategies. And at its 2024 Cloud Security Summit, Google Cloud introduced Mandiant Custom Threat Hunt for targeted threat detection, plus upgraded its Security Command Center to identify complex vulnerability combinations.

Smaller data protection providers also made strides in 2024 by introducing innovative solutions. Barracuda focused on enhancing storage efficiency and recovery times through advanced deduplication. Quest Software expanded its suite of tools to encompass backup, disaster recovery, ransomware prevention and data management. N2WS enabled cross-cloud volume restores between AWS and Azure, allowing for rapid data recovery across different platforms. HYCU improved its offerings for Nutanix environments, adding database protection, enhanced remote office support and improved security measures. Plus, HYCU partnered with Anthropic to enhance AI workload protection. Acronis improved its cybersecurity offerings with its Extended Detection and Response solution (called XDR) and enhanced network visibility. Carbonite focused on cloud backup improvements, adding features to protect against ransomware and refine disaster recovery capabilities. These advancements demonstrate the industry’s broad commitment to addressing the challenges of multi-cloud environments and evolving cybersecurity threats.

Observability Makes Its Mark

The observability market also continued to evolve in 2024, largely due to the growing complexity of IT environments, the shift to cloud and microservices architectures and the need for more efficient IT operations. The market is estimated to grow from $2.4 billion in 2023 to $4.1 billion by 2028, reflecting steady growth and the importance of observability for effective IT management.

Cisco’s acquisition of Splunk in March for about $28 billion brought notable changes for both companies. Cisco combined Splunk’s log management and analytics tools with its own network monitoring and security solutions. This integration made it easier to provide a more complete view of IT systems, covering everything from applications to networks. It also improved collaboration between security and IT operations teams, helping organizations address issues more quickly and efficiently. Key updates included Splunk Log Observer Connect for Cisco AppDynamics and the integration of Cisco AppDynamics with Splunk IT Service Intelligence. For Splunk, the acquisition meant access to Cisco’s resources and broader customer base, helping its tools reach more users.

Other important observability vendors, including Datadog, Dynatrace, New Relic, AppDynamics, LogicMonitor, SolarWinds, Sumo Logic and Elastic, likewise enhanced their platforms in 2024. In general they focused on improving real-time system monitoring with precise metrics and automated alerts, enabling faster and more effective responses to performance issues. Distributed tracing became a key area, helping engineers understand the complexities of microservices and identify bottlenecks. Log analysis remained critical, with vendors incorporating AI-powered analysis and advanced search to improve log management. Open-source solutions also played a significant role, with Prometheus standing out as a scalable time-series database and Grafana providing highly customizable dashboards that can visualize data from various sources. The rise of OpenTelemetry, an open-source standard for instrumenting applications and collecting telemetry data, is also influencing the future of observability. Across the industry, vendors such as LogicMonitor are increasingly leveraging AI and machine learning to automate tasks such as anomaly detection and root cause analysis.

IBM also made progress in observability during the year, building on its AI-driven features to provide more efficient monitoring and problem-solving capabilities with Cloud Logs and Concert. Concert, a generative AI-powered tool, offers insights across enterprise application portfolios, while Cloud Logs provides efficient log data collection and analysis. Across the industry, there was a clear focus on improving tools that handle application performance monitoring, real user monitoring, infrastructure monitoring and log management. Many of these platforms added or expanded AI and machine learning features to help IT teams analyze data faster and address anomalies before they become major issues.

In November 2024, LogicMonitor received an $800 million investment from a group including PSG and Golub Capital, valuing the company at $2.4 billion. LogicMonitor provides an agentless monitoring platform for data centers, covering infrastructure including networks, servers, virtual machines and cloud services (AWS, GCP, Azure). Under CEO Christina Kosmowski, the company is enhancing its AI-driven tools for IT management, system performance and cost efficiency while targeting new industries for a broader customer base. LogicMonitor plans to use the recent influx of capital to support platform growth, strategic acquisitions and global expansion.

What’s Next In 2025: AI, Predictive Analytics And Unified Platforms

In 2025, the data protection landscape is expected to continue evolving, with a stronger emphasis on integrating observability systems with data protection platforms. This integration, supported by advanced AI and predictive analytics, will shift the focus from threat detection to proactive risk mitigation and the automation of compliance processes, including managing user consents and handling data subject access requests. Automation tools are also likely to improve their compatibility with data protection systems, allowing organizations to address risks preemptively and enhance overall resilience.

Privacy-enhancing technologies are also anticipated to see advancements, particularly in fully homomorphic encryption, which could transform how sensitive data is processed securely. These developments stand to be especially impactful in sectors such as finance and healthcare, where data security is so critical. Additionally, regulatory changes are likely play a key role, particularly around AI governance and data sovereignty. Organizations will need to adapt to stricter compliance requirements, especially in relation to AI systems and cross-border data transfers, and develop more transparent data governance frameworks to address these evolving challenges.

The integration of Cohesity and Veritas will likely deliver new capabilities in real-time recovery and proactive threat detection, potentially reshaping industry expectations and putting pressure on the combined company’s competitors. Enhanced AI-driven data classification tools should help enterprises prioritize protection for their most sensitive information while maintaining regulatory compliance. Vendors such as Microsoft and Amazon are also expected to advance their capabilities, offering deeper insights into data usage and potential security gaps.

More broadly, it’s likely that AI-powered simulations and threat modeling will get even better at enabling enterprises to refine their defenses against emerging risks. By combining these capabilities with the observability tools mentioned, enterprises should be able to optimize operational continuity, reduce recovery times and navigate increasingly complex digital environments.

Better Data Protection And Observability For A More Complex IT Landscape

Data protection and observability have come a long way in 2024, taking advantage of the meteoric rise in AI capabilities to address long-running trends of more pervasive cyber threats, stricter regulations and increasingly complex IT systems. Enterprises are turning to a mix of new tools and strategies to protect their data and keep systems running smoothly. Vendors are rolling out AI-powered features, automation and integrated platforms to handle everything from cloud-native setups to multi-cloud and on-premises environments. These advancements make it easier for enterprises to handle disruptions, stay operational and build trust in a world that relies heavily on data.

Blending data protection and observability tools signals a shift toward smarter, faster strategies that keep up with real-time operational demands. This evolution should help enterprises stay resilient, adapt to challenges and maintain continuity in today’s unpredictable digital landscape.

The post Data Protection In 2024 Brings Greater Automation And Resilience appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Fox Rothschild’s Matthew Kittay https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-fox-rothschilds-matthew-kittay/ Thu, 23 Jan 2025 19:49:32 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45287 Matthew Kittay, National Co-Chair, M&A Team at Fox Rothschild, joins Daniel Newman and Patrick Moorhead to share his insights on the global economic trends, M&A strategies, and key takeaways from the Davos meeting.

The post The View from Davos with Fox Rothschild’s Matthew Kittay appeared first on Moor Insights & Strategy.

]]>
Get the download from Davos on AI, Regulation, and the Future of Business

The convergence of finance, tech, government, and regulation is creating a pivotal moment placing AI, regulation, and the future of business at the forefront. Daniel Newman and Patrick Moorhead continue to bring exclusive WEF25 coverage with The View From Davos in this conversation with Fox Rothschild‘s Matthew Kittay, Partner; National Co-Chair, M&A Team. They share thoughts on the global economic outlook, key trends in mergers and acquisitions, and insights from the Davos meeting.

Key Takeaways:

  • AI’s impact and the need for thoughtful regulation are dominating discussions
  • Inflection Point: changes in leadership and societal priorities (think DEI and sustainability) are reshaping how businesses approach innovation, especially concerning AI
  • Where AI Meets Law: 66% of general counsels expect lawyers to use AI, but only 33% want AI used on their clients
  • Fiduciary Duty in the Age of AI: Boards and executives are grappling with how to integrate AI responsibly
  • Unprecedented Scale: While we’ve seen tech waves before (internet, social media, crypto), AI’s breadth and depth are unprecedented

Learn more at Fox Rothschild.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View of Davos. We are here at the World Economic Forum talking about this configuration of finance, tech, government, and regulation. Dan, good week so far.

Daniel Newman: It’s great. I mean, the coming together, we’re at a major inflection today, the inauguration of the 47th president, 47th different president. The numbers always confuse me because we’ve had the ones and twos, but I think one of the only times in history where we’ve had one take a break and then come back in short order. But I mean, all of this means different things to everybody here. It changes kind of the tone, the tenor of what’s going on. Like you said, we’re going to be thinking more about regulation. Of course, AI is in big focus right now.

Patrick Moorhead: That’s right.

Daniel Newman: We’re seeing some changes in terms of how companies are thinking about things like DEI now. We’re seeing changes in terms of how companies are thinking about sustainability. AI’s brought that back into the picture, but in a totally different way.

Patrick Moorhead: That’s right. I can’t imagine a better guest to have on to talk about the intersection between AI and regulation. Welcome, Matthew, to The Six Five.

Matthew Kittay: Thanks for having me, guys. It’s a pleasure to be here.

Patrick Moorhead: Probably a good place to start is just maybe talk a little bit about what you do and maybe also love to hear about what you want to achieve here at the show.

Matthew Kittay: Absolutely. My day job is as the co-chair of the M&A practice at Fox Rothschild. It’s a 30 office law firm and law hundred in the United States. In that role, I’m constantly advising boards of directors and startup companies and private equity funds on changes in regulation, corporate compliance, things like that. AI is obviously a major impact on all of that advising that we’re doing.

Patrick Moorhead: Excellent.

Daniel Newman: It’s a really, like you heard me kind of in the setup here, I mean, I want to unpack that a little bit. Today’s the day. We know that there’s a few jets that haven’t landed in Davos yet because they’re at the inauguration, but many are going to show up in the next few days. I think there’s a lot of interest, going to be a lot of sessions in the Congress, all about AI regulation, and of course, what does it mean for the business climate? Some of your, I know you don’t give free advice, no lawyer ever does, but in your world, what’s that kind of big overarching perspective that you’re giving to your clients right now?

Matthew Kittay: Well, companies are constantly looking for ways to integrate AI into their everyday solutions and into their big macro decision making, the kind of advising that companies are looking for. An interesting statistic published in a paper that I just completed with the American Bar Association-

Daniel Newman: I read that.

Matthew Kittay: Thank you, is that 66% of companies of general counsels polled expect lawyers to implement AI in their applications and advising companies, but only 33% of those companies want AI used on their clients. There’s this constant tension in the market between looking for efficiencies, looking for technical innovation, but also being of the technology and the kinds of outputs that it can create.

Patrick Moorhead: Hey, can you help our audience? I mean, AI is unique and first of all, it covers consumer, enterprise, government, and it strikes through so many data sharing, as an example. The whole regulation, let’s say about how you’re using it, how have you broken down or segmented AI regulation, and maybe it’s in your paper?

Matthew Kittay: There are layers of that exactly as you’re describing. From a lawyer’s perspective, you’re looking at a global platform, a dialogue, a commercial impact, but then you’re also looking at federal, within the United States, federal regulations, state regulations, ethical rules, bars, and all of these things are untested arenas. Everybody knows, like I was saying before, that they have to implement, but nobody really knows the intersection and there’s pressure for adoption. But at the same time, there’s caution because nobody really knows what the outcome. Where’s your data going to go? Who’s going to be able to use it? What critical secrets or proprietary information is accidentally going to get shared with who? A lot of that is covered in my paper actually.

Daniel Newman: Well, we all love the dry academic experience of writing research. We do it. That’s what we do too. No, it’s actually a really important thing, but we’ve actually just spent a bunch of time working on one of the largest surveys of its kind on CEOs and AI deployment. We talked to 213 CEOs spending over with more than a billion dollars a year of turnover.

Matthew Kittay: Wow.

Daniel Newman: We asked them about how they’re thinking about AI. Governance compliance is a big part of that. But one of the things I want to ask you is, I know you lead I think the private equity venture capital board or advisory board for the American Bar Association.

Matthew Kittay: Correct.

Daniel Newman: That means you must be advising a lot of big company CEOs and M&A leaders and also probably talking to a lot of exciting growth companies. How are you sort of bridging that for them right now in terms of how to think about implementing AI successfully and in compliance?

Matthew Kittay: I’m going to really attempt not to be too lawyerly about it, but since you guys wanted a little bit of free advice, I’m willing to give it.

Daniel Newman: All right. Hold on. Let me get my phone out. Just kidding.

Matthew Kittay: Start your clocks.

Daniel Newman: Oh wait, we got that on recording. We’ve got that guy. He’s recording it.

Matthew Kittay: At the end of the day, what we’re talking about from the perspective that I’m advising is meeting your fiduciary duties. If anybody is paying attention to what’s going on in the dialogues that are critical to everyone, what Mark Zuckerberg is thinking about this, what Elon Musk is thinking about this. They’re still challenged to basically make decisions, informed decisions, to discharge their fiduciary duties to their shareholders, to their boards of directors. The complication comes from part of discharging those fiduciary duties is that you have a duty to kind of engage, and lawyers have this too, engage technologies, use state-of-the-art, think about things in a critical way that represent kind of what’s going on right now. Use all the tools available. At the same time, you can’t outsource these decisions to computers. There are all of the same functions that are critical of AI in general are in play in this arena, hallucinations, self-enforcing data where you get bad data, garbage in, garbage out. It’s a completely untested world. You’re going to start seeing case law on this in the next couple of years. In the meantime, we have to do the best we can to advise our clients on these issues.

Patrick Moorhead: As we wrap here, I want to ask you one final question. There’s amplitude and there’s frequency, and I like to look at different industry events. I feel like we’ve got big amplitude and frequency. Looking back on your career and what you do, have you ever seen this type of thing before? I mean, did you see this, let’s say, in the cloud or local, mobile, social?

Matthew Kittay: You’ve seen kind of waves. I’ve been a practicing lawyer for 17 years. When I started, it was really kind of the inception of the internet in a meaningful way and kind of global e-commerce and things like that. I was talking to a gentleman inside. I was a research analyst at Harvard Business School in 2005, so I had one of the first 1,000 Facebook accounts because I had an HBS EDU. There’ve been little spikes since that. Maybe you look at crypto or maybe you look at other areas of technology, but I don’t think you’ve seen anything as grand and sweeping as AI. I think that’s going to really define the next generation of issues.

Daniel Newman: I’m dying to ask him about the Trump coin, but I’m going to avoid it right now, just overall, the whole, what’s going to happen with crypto? Gensler’s out and now we have no regulation and we’re seeing these crazy pumps. Where did I see something that was like you could have made more in 36 hours than 54 years of investing in the S&P 500, and it’s just like that can’t be normal.

Matthew Kittay: If you’re walking around Davos today and you don’t have a picture of what you bought at and what you sold at, then you’re not on the in-crowd.

Patrick Moorhead: That’s not-.

Daniel Newman: Is that happening?

Matthew Kittay: I’ve seen five people do it already since I got here.

Patrick Moorhead: I need to amend my bad. It would be zero and zero.

Matthew Kittay: I didn’t participate either.

Daniel Newman: I don’t talk in my book a lot, but I bought the stable stuff, Bitcoin, Solana, Ethereum. All right, Matt, thanks so much. Really appreciate you spending some time with us.

Matthew Kittay: Absolutely a pleasure. Thank you very much.

Daniel Newman: Thank you everybody for tuning in, being part of The Six Five On the Road. It’s a View from Davos and we’re getting a lot of viewpoints. It’s been a great day so far. Stick with us, subscribe, be part of our community. We appreciate you for tuning in. We’ll see you all later.

The post The View from Davos with Fox Rothschild’s Matthew Kittay appeared first on Moor Insights & Strategy.

]]>
The View from Davos with Workday CTO Jim Stratton https://moorinsightsstrategy.com/the-six-five/the-view-from-davos-with-workday-cto-jim-stratton/ Thu, 23 Jan 2025 18:03:01 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45283 Jim Stratton, CTO at Workday, shares his insights from the Davos Forum, discussing key technology trends and Workday's role in shaping the future of global business landscapes.

The post The View from Davos with Workday CTO Jim Stratton appeared first on Moor Insights & Strategy.

]]>
There’s no question about it – AI is changing the workforce across entire industries and ecosystems. Can employers build trust while implementing AI? Patrick Moorhead and Daniel Newman are joined by Workday’s Jim Stratton, Chief Technology Officer on this segment of The View From Davos, for a conversation on the transformative power of AI, responsible implementation, and the evolving landscape of work.

Tune in as they cover:

  • Key themes and technology trends emerging from Davos
  • How Workday is contributing to the global technology landscape
  • Jim Stratton’s vision for the future of work and innovation at Workday
  • The impact of global economic changes on technology strategies
  • The importance of transparency and human-machine to ensure responsible use of AI, especially in HR and talent management

Learn more at Workday.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. Conversations have been incredible here. World Economic Forum is a great event, its unique event in that it intersperses not only technology, government regulation, AI, really looking at this public-private investments and rollout. They’ve been pretty good so far, Dan?

Daniel Newman: Yeah, it’s been a great event, Pat, and we’ve been up and down the promenade inside and outside of the perimeter and there is just so much going on. I mean we’ve hit this massive inflection, right? We’ve got new administration. Obviously there’s a continuum of feelings about that. Some people are very inspired, some people are concerned, but this event is always one of those places where the future is discussed at scale. And when you say tech and regulation, policy, economics, all coming together, these things, it feels like in our world tech people that everyone gets it all. But I mean that place where the leaders have a chance to really sit down and understand what’s happening, how does this stuff work, how do we keep it safe? How do we make it drive economic value? This is where a lot of it gets done and by the way, as a couple of chip guys, best density on the planet as it pertains to meetings per hour with people you want to meet with.

Patrick Moorhead: Yeah, that’s right, Dan. And a lot of the conversations we’ve been having, it’s almost like the punchline and how to get the most out of AI ends up in a human conversation, getting people out of their comfort zone. How do people in the technology interact and I can’t imagine a better person to have this conversation, Jim. Workday, you’re CTO there. Thanks for coming on the show.

Jim Stratton: Thanks for having me.

Patrick Moorhead: Absolutely. Probably the best place to start here is, and I don’t know if this is your first World Economic Forum, maybe not, but what do you want to accomplish here? What are your objectives? Is this is primarily a customer-facing meeting, government officials, enterprises?

Jim Stratton: It’s all of the above actually. And we have a good diverse team here too, but we’re covering all those bases. It’s the government entities, the press, media, talking with customers, talking with prospects, talking with other technology companies, talking with our own partners so some of the large companies that we partner with. As you said before, just the number of meetings and the diversity of those meetings of people that you want to meet with, it’s pretty incredible what you can get done in a week here.

Patrick Moorhead: Yeah, the density is high, isn’t it?

Jim Stratton: It is.

Daniel Newman: Price performance in area. We’re covering it. But we’re actually, Jim, you lead a very, very important software company. You drive a lot of the biggest companies in the world, very familiar with what you’re doing, but really interesting if you heard what I was saying in the buildup about the people catching up part. We talked to 213 CEO, companies over a billion in revenue and one of the gaps that they all seem to be having is the tech is very accessible to implement AI, but they’re having a really hard time getting it implemented and getting culture to buy in and turning proofs of concept into scale deployments. I mean as CTO of Workday, this has to be something you and your team are thinking about both with your customers and of course within the company. Can you talk a little bit about how you’re thinking about the evolution of building culture, trust, skills for the AI future?

Jim Stratton: Yeah, it’s a really interesting topic and talk about it maybe on two different fronts. One, in the product and what we’re building there and how we work with customers to drive uptake and value realization out of that. Then also internally within our own operations, I think like a lot of other large companies, even in the tech industry, we spent a while trying to figure out how we were going to use it for our own internal operations and we started with some easier things like developer co-pilots and just trying them out, try a proof of concept and get some of your best developers in there and see how they like it and they liked it. They started using it. They started seeing that actually this is pretty good and I’m getting more efficient, I can get more done. And we’ve started that across different… We then set up internally a strategic walk across all the different parts of the business and the work that gets done and what are some best of class tools that we can use to get in the hands of users.

And we start with small pilots and then word of mouth grows and people start seeing. At this point now we leverage it across quite a lot of the business that we do internally and it’s great. I can be more efficient, I can get stuff done. There’s things that used to take me hours or I couldn’t even do at all that are very, very easy to get done. So I think it’s mostly about taking that first step and trying it out and seeing that actually this is not as scary as I was worried about and I can use it to actually get some productivity gains, but I’m still doing the same job that I was doing. I’m just able to do it more efficiently or to cover more conversations. You just get more efficient at what you’re doing. And the same is true in what we put in the hands of our customers and it is about taking that first step, trying it out, getting a small group of folks to use it, and then they start to see the real productivity gains.

So if you look at something like a recruiter agent, so that today is helping talent acquisition professionals source 30 to 35% more so they can get more done with their own time, with their own job. They’re filling seats 20 to 30% faster, so the timeline to actually fill a particular role. Then if you add in for us a product that we call Talent Optimization, that actually helps companies retain 20 to 30% higher retention rates, that’s money directly back into customer’s hands. So they start seeing actually we can be more efficient in our operations and that starts to build this flywheel momentum to start taking up more. So that’s how we’re building it. It takes time. People need to get their feet wet with it and it does take time to start building that momentum.

Patrick Moorhead: Yeah, one of the big conversations about AI obviously has been around trust and it’s evolved and has crescendo when it gets into things like agents that are not just, okay, I can look something up and it’s this intelligent chatbot, but actually does something. And trust can mean I’m getting the right results that I can trust, but can also mean only having the right people looking at certain things. And it’s funny whether it’s ERP, PLM, HRM, HCM, all the different M’s out there in the enterprise. It all needs to be trusted, but I believe there’s a higher bar when it comes to people and that’s what your company is all about here. How are you ensuring that that level of trust is high in what you do and what you’re offering to your customers so they can feel like they can use this with confidence and their employees can too? The last thing we need is somebody doing a query and pulling back what somebody makes when they’re not supposed to be looking at that, intelligently creating an agent to go do something nefarious, maybe give somebody a pay raise.

Jim Stratton: No, it’s a great question. I’ll come at that from a couple of different ways. So first we started from the very beginning with building out a responsible AI framework. So we published our own AI principles. We did that first in 2019 and revisited that actually just about 18 months ago. We put in place, we leveraged the NIST risk management framework and we leveraged that across our development for all of our AI products. So that helps us put frameworks and guardrails around how we develop and how we test. And before we deploy software out there, we also provide a lot of transparency and visibility in the product itself about why did the machine make this decision or make this recommendation really, and that’s going to continue to be true. Providing that transparency about what the AI is doing underneath is going to be critical for developing trust. But the other thing that’s really key is for us and for a long time to come, I think it’s about human machine teaming.

And coming back to that example of the recruiter agent, there’s still a human that’s in that workflow and the AI is developing recommendations and saying, “I think these are probably the 10 best candidates based on the job requisition criteria and your sourcing selection criteria and location and all the other things that go into it,” but you’re still putting that in the hands of a human to make the decision about actually we’re going to reach out to these three people and start scheduling interviews. And that still flows through humans making the decision about actually I think this person’s the best fit. And so it really is about that human machine teaming side of things. And I think that’s going to continue to be the case for quite a long time. And for us agents are, they’re not task-based agents, they’re really role-based agents. It’s a broader, more complex concept than just task-based agents that you hear a lot about. But those are going to develop more capabilities over time and eventually over time as they continue to grow in their skills and capabilities, they’re really going to become full-fledged members of teams. They’ll really start to become part of your digital workforce now.

Daniel Newman: Interesting. That’s really exciting. By the way, every time you say a human machine, you’ve got to pay me a quarter. I wrote a book called Human Machine, I published it five years ago, so I just want to make sure we get paid. Let’s do that after the show. But in all seriousness, to wrap this up, you started down a path I’d like to finish you there. So as you see these agents get smarter and we’re seeing it, we actually did a study, we see about 5 trillion in costs that can be pulled out of current workforces through the implementation of agents successfully over the next 18 months. It’s going to happen. You also have the culture. You’re basically saying, “I want the company to adopt this stuff. I want our employees to love all this great technology.” And in some ways they’re looking at it going, “Does this replace me?” How do you think… How does Workday think about the workforce in a way that they make sure that you can really build the culture, build the investment, build the buy-in, and at the same time they understand that if we do this well, we’re going to grow productivity, we’re going to up-skill the workforce and everybody’s going to have an opportunity to stay, or many.

Jim Stratton: No, I think it’s exactly that. So this particular AI revolution that we’re in the midst of right now, it’s an interesting double-sided coin in that it will displace certain types of workloads that, I mean, we all know that. We all see it already. There are certain types of work that are going to be easier to automate away, but it also provides the opportunity for re-skilling, for up-skilling. It makes it… If you focus on skills-based management of the workforce, which many of our customers are currently doing today, we do ourselves internally at Workday, but we believe enterprises are moving that direction in general. That allows you to focus on what skills gaps you have within the work that you need to get done. It allows you to also train your teams, how to up-skill, how to re-skill them and retain those folks even if the type of work that they were doing before got displaced, still other work that needs to get done. And so it is really this interesting two, two-sided thing where yes, it’s going to displace certain types of workload, but it also is the thing that is going to allow us to go learn to do the new thing that we need to do.

Daniel Newman: Well, Jim, every industrial revolution, I think I was saying to somebody just recently that when they got rid of the gas lighters, we use gas lighters differently now, but there used to be gas lamps and every night they got to climb the ladder, light the lamp, and people would say, “Electricity is going to take everyone out of the job.” I mean, we’ve come a long way from there, but I think with each of these revolutions we see that there is more productivity gain. Sometimes they’re prune first, and grow later. We’re testing that right now. But Jim, I’d love to take some more time in the future, talk more. I know Davos here, 15 minute blocks, you can at least have six meetings in that time and it can all be highly productive. And when we can’t meet, we’ll have our agents do it.

Jim Stratton: Perfect.

Daniel Newman: So all right, thanks for joining us at Six Five.

Jim Stratton: Thanks for having me.

Daniel Newman: And thank you everybody so much for joining The Six Five here with a View from Davos. We appreciate you being part of our community. Hit subscribe, watch all the covers. We have so many great interviews on the ground and we have so much more great insights that we unfortunately can’t share with you because that’s what you get when you’re in the room. All right, from Patrick and myself, time to say goodbye. We’ll see you all later.

The post The View from Davos with Workday CTO Jim Stratton appeared first on Moor Insights & Strategy.

]]>
How VCs Are Supporting AI & Sustainability – Helena Capital – The View From Davos https://moorinsightsstrategy.com/the-six-five/how-vcs-are-supporting-ai-sustainability-helena-capital-the-view-from-davos/ Thu, 23 Jan 2025 01:43:41 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45280 Sarah Antor, Founder & General Partner at Helena Capital, joins Daniel Newman sharing her insights on technology's pivotal role in investment and the emerging trends from the Davos conference.

The post How VCs Are Supporting AI & Sustainability – Helena Capital – The View From Davos appeared first on Moor Insights & Strategy.

]]>
Helena Capital aims to break barriers with its focus on female-led tech companies. Join Daniel Newman as he sits down with Sarah Antor, the powerhouse behind Helena Capital. Sarah isn’t just another VC – she’s leading the charge in a revolution that’s reshaping the future of investment.

💥 Why this is a must-watch:

  • AI & Sustainability Collide: Discover how Helena Capital is backing startups that are weaving AI and decarbonization technologies into the very fabric of global supply chains
  • Female Founders Take Center Stage: Sarah reveals why investing in women-led businesses isn’t just about social impact, it’s a smart financial strategy
  • Davos Decoded: Get an exclusive insider’s look at the key takeaways from WEF25 and how they’ll impact the future of innovation and investment

Stay tuned for more as we continue to decode key takeaways at Davos.

Learn more at Helena Capital.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you not treat us as such.

Transcript:

Daniel Newman: Hey everyone. The Six Five is on the road. We are here with a View from Davos. It’s been a great week so far. It is the beginning of the week, but we expect there to be a lot of exciting coverage coming from here. Think about it, we’ve got a massive inflection point, the inauguration of President Trump, number 47, happening this week. And of course that’s going to be a big talk in the town here. And we have major trends going on in the marketplace like AI and of course sustainability is still in focus and all kinds of evolution that’s taking place with agents and whatnot. So I’m excited here because we’ve been having conversations with CEOs, presidents, business leaders, people in government, politics, and so much more. And VCPE is another place. And Sarah, you’ve been in many parts of the business, but you’re here, you’re raising capital, you’re talking tech, you’re empowering women. I don’t know if I’m going to do your whole speech for you, but welcome to the show. Thanks for spending a little time with me.

Sarah Antor: No, thanks so much. It’s absolutely a pleasure to be here. I’m super honored to be part of the magical mountain and really getting the chance… I think Davos is one of the only places on the planet where you can really mix government, private sector, philanthropic, public sector, and I’ve just come from a really interesting roundtable where we talked about something very near and dear to my heart, which is actually encompassing our investment thesis and that is trying to focus on AI enabled software to future-proof our global supply chain, but at the same time embed those decarbonization technologies that we so need.

Daniel Newman: Yeah, it’s really interesting, everything that’s going on with AI enabled software. I’ve been saying the last year, maybe even two, with what’s happened with Nvidia and infrastructure, everybody’s kind of watching the AI data center build out. And of course, we know that the consumption is still that kind of layer that everyone’s like, “How much of it’s really happening? Is software being implemented into the industries, into enterprises, into businesses? Is it being used? Is it bringing value?” And I think we’re starting to see it. We did some research recently with Agentic and we were looking at the $50 trillion labor pool, Sarah, and we’ve gotten to the point where we estimate about 20% of the tasks by white-collar workers now can utilize Agentic or some of the power of AI. And we see about a $4 trillion economic impact that could happen over the next one to two years. So it is really happening, but at the same time, I’m still seeing things locked in POCs. I’m seeing things not really getting to market and figuring out the company. So maybe that’s an interesting point I’d like to get… In your eyes, what does software enabled AI mean? Is that Microsoft Copilot? Is that Salesforce Agent…? What is it and how are you thinking about investing in it?

Sarah Antor: Sure. So I see it a little bit as two buckets, and you’ll remember last year at Davos, AI was sort of the only topic that anyone’s talking about. And I think what we’ve seen over a 12-month period is that we’ve shifted really from this focus on ChatGPT and LLMs and how the future of work is going… What are our future jobs going to look like? And the average kind of white collar worker. And now we’re seeing a fundamental shift, I think, into what are the real applications for corporate strategics and not just getting their workforce kind of using AI on a regular basis, but how are they going to apply them in their actual, again for our purposes, global supply chain technologies?

Daniel Newman: Yeah, it’s a lot. And by the way, a lot of our studies are looking at how AI is being implemented, where are those sort of friction points and those choke points happening inside of organization and what is that sort of takeoff and when is that takeoff moment going to take place? Maybe just a little opportunity to share about Helena Capital. I know you’re raising it. Is it fund one?

Sarah Antor: Yes, we’re raising fund one. We’re emerging manager-

Daniel Newman: This is a big moment. I mean, I know you ran Raymond James… By the way, long time Raymond James client. I don’t know if I get any extra credit for that.

Sarah Antor: Amazing.

Daniel Newman: I know with Germany, I think you ran them there.

Sarah Antor: So I was heading up the German offices and I was COO of the European investment banking division. I stepped back at the end of 2021 when we moved our headquarters from Munich to London. And yeah, I’ve been working on this new fund. We are hoping to raise north of $200 million. It’s a series B fund, so we’re looking really at the missing middle on a couple of different fronts. So one, as three female general partners we’re very interested in female led tech. We’re also very interested in this kind of decarbonization technology piece. Also, again, AI enabled. You see a drop-off at the series B for early stage funding. They’re a little bit too, let’s say, a lot too risky for private equity, but too big for early stage VC. And that’s where we really hope to play.

Daniel Newman: Yeah, you’ve got to always find those white spaces. It’s a great place to take off. Of course, if it was easy, there’d be a lot of people doing it.

Sarah Antor: Correct.

Daniel Newman: So you know that sort of being able to make that assessment of what has real takeoff value versus what’s going to maybe drag, because when you’re in VC, the returns you’re going to expect have to be substantially higher because you’re going to get some of them wrong. That’s just kind of how that goes. I’d love to tell everyone that, “I bet Sarah’s going to get them all right,” but we all know that there’s no VCs in the world that get it right every single time. So maybe just dig in a little bit on the decarbonization. I’ve been, now… This is my second Davos, I’m a wily veteran, but three or four years ago when I tracked what was going on at Davos, it was all about sustainability and DEI. We’re seeing a pretty significant sort of climate change… About climate change.

Sarah Antor: Absolutely.

Daniel Newman: And I don’t think it’s disinterest. I mean, I still think there is very meaningful sustainability, meaningful decarbonization efforts that can be done. AI is sort of a diametrically opposed venture right now. Without fusion, without some sort of super clean tech, we’re taking the grid over.

Sarah Antor: Sure.

Daniel Newman: I mean, how are you thinking about that? Because as far as I’m concerned, if Nvidia keeps scaling like this and the infrastructure keeps scaling like this, we’re going to run out of power. I mean that’s got to be a big topic here by the way.

Sarah Antor: I think it is a big topic and I think it’s a real key point because there is this… You mentioned it yourself. There’s this shift and whether it’s sort of a shift towards apathy in this area or we just are looking constantly at the shiny objects and not sort of remembering what we learned a couple of years ago. What I like to say, and this might sound a little bit controversial, but it’s a little bit like feeding vegetables-

Daniel Newman: It’s fun for the camera, so go for it.

Sarah Antor: Feeding vegetables to your children,. We need to sort of cover up the decarbonization technologies with some returns and future proofing these legacy industries. I think those industries are ready now. They’ve kind of done away with some of their larger R&D over the past decade, and now we’re seeing them hungry again for new technologies post-Covid. And what I hope is that the younger generations are going to embed those new technologies with decarbonization technologies.

Daniel Newman: All right, well, we’ve got just a couple of minutes left. Let’s end a little bit with the female founder things. As an investor myself, I certainly love to see how companies can span that sort of diversity horizon. Let’s be candid. As a white male founder, I can certainly say that I have a certain viewpoint of the world and I think it’s important that diversity is there. Now, you can also get into also the kaleidoscope of the opposite, where you get too far down the path of picking a certain diverse group. How are you balancing, making sure that you lead and think about…? You know what I mean?

Sarah Antor: Yeah, it’s a great question. I think one thing that was… It’s a mission for me. I’m coming out of a very two law… Well, three law, business, and technology backgrounds. So very male-dominated. I think it was very much a mission for me to focus a little bit on female founders and the further along… Less than 5% of female founders make it to series B. So this was a little bit of a mission and purpose. But what’s cool, when we started to hone in on where we were really interested, our investment thesis, because it’s such a complex… In my opinion, it’s such a complex kind of space to play in, I was very happy to find out that sort of eight of 10 of those companies were either female founded, co-founded, or had a woman on the management team leading. And I think what we’re seeing again is women have had less funding. So as we all know, less than 2% of funding worldwide goes to female founders. They’ve had to be lean, they’ve had to be mean. We’re seeing… From our perspective, from an investor’s perspective, this is a real arbitrage opportunity. We can get in it at lower valuations because they’re not valued according to-

Daniel Newman: Traditional.

Sarah Antor: To traditional. And those lower valuations allow us to get in… On a ticket for us, we can get in an eight to $10 million, whereas a male-founded team is right around $12 million for that series B in Europe. And we actually can then exit those companies at a higher valuation than the male-dominated ones because the growth is exponential over a two to three year period.

Daniel Newman: Like a lesson my father taught me when I was young about real estate, you make your money in the buy. So to some extent you’re kind of combining some real personal ambitions and personal beliefs with also being somewhat opportunistic. And I think the word you used was arbitrage, which to some extent, like I said, in every investment you realize in the end it’s the where you bought it, and then of course your ability to identify the right investments, hold them through the right periods of time. Sarah, I want to thank you so much.

Sarah Antor: No, thank you. It was really great.

Daniel Newman: It’s great to spend some time here with you in Davos.

Sarah Antor: Appreciate it.

Daniel Newman: I hope you have a great event. I hope you find all your investors and you raise that fund.

Sarah Antor: So far so good.

Daniel Newman: I hope it’s the first of many. I think the next four years, there can be a lot of debate about the political landscape, but I do think very pro-business. I think-

Sarah Antor: Absolutely.

Daniel Newman: The coffers are going to open up. I think people are feeling optimistic right now. Keep doing what you’re doing.

Sarah Antor: Appreciate it. Thanks for having me.

Daniel Newman: And thank you so much for being part of The Six Five On the Road. We are a View from Davos. It’s been great having so many of these conversat

The post How VCs Are Supporting AI & Sustainability – Helena Capital – The View From Davos appeared first on Moor Insights & Strategy.

]]>
Talking AI, Cybersecurity, Ransomware & Deepfakes with Check Point Software – The View From Davos https://moorinsightsstrategy.com/the-six-five/talking-ai-cybersecurity-ransomware-deepfakes-with-check-point-software/ Wed, 22 Jan 2025 22:53:16 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45274 Rupal Shah Hollenbeck, President at Check Point Software, joins Daniel Newman and Patrick Moorhead to share insights on tackling new-age cybersecurity threats, focusing on AI's role, ransomware defense strategies, and the control over deepfakes.

The post Talking AI, Cybersecurity, Ransomware & Deepfakes with Check Point Software – The View From Davos appeared first on Moor Insights & Strategy.

]]>
When it comes to cybersecurity, AI may be a double-edged sword.

You may be used to news reporters or journalists covering this big event, but we are here to go one step further, connecting the dots between tech, finance, and policy.

Six Five Media’s “The View From Davos” brings you slightly different perspectives & analysis from technology industry analysts, Patrick Moorhead and Daniel Newman.

Today they are joined by Check Point Software President, Rupal Shah Hollenbeck, to talk on the evolving landscape of cybersecurity, focusing on current threats like AI, ransomware, and deepfakes.

Get their take on:

  • The impact of artificial intelligence on cybersecurity strategies
  • Cyberattacks on the rise with ransomware attacks skyrocketing by 90%
  • AI is accelerating innovation, but also creating certain openings for cybercriminals with the prevalence of deepfakes & ransomware
  • Check Point Software’s approach to combating these digital threats
  • Insights into future cybersecurity trends and preparedness

Learn more at Check Point Software.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View From Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road with a View from Davos. We’re having incredible conversations here. You can see the beauty in the background, but this is a conference, not about skiing and snowshoeing and doing all these things. The World Economic Forum is really this blend of technology of governments and, quite frankly, enterprises from around the world get together to talk about some of the most pressing items. And right now the conversations have really been around what does the new administration in the United States mean? Global regulation, what does it look like in 2025? And of course, all the implications of generative AI. And with that, I’d like to introduce Rupal from Check Point Software. Great to see you.

Rupal Shah Hollenbeck: Thank you. Nice to see you too.

Patrick Moorhead: Absolutely.

Daniel Newman: Well, good to be here. Rupal, why don’t we start with the… Pat, sort of set the stage a little bit. This event brings the leaders from all over the world, not just from industry, not just from business. It seems to be this private-public partnership. The Congress has probably some of the most noteworthy names in the world come here every year. What are the objectives you have coming here this year? What are you hoping to walk away from this year’s event?

Rupal Shah Hollenbeck: Well, the magic of Davos for us is really exactly what you just said. It’s public and private. It’s four corners of the planet. And so markets, big and small, organizations, big and small, innovators, big and small. And when that all comes together, it makes some fantastic conversation. And being in the business of cybersecurity, that comes together beautifully here in Davos.

Patrick Moorhead: Yeah, and Davos is also really looking forward to the year. I know all the plans have been put in place, and we all know what we’re going to do for 2025. But most recently you did a study on cybersecurity about what’s the outlook for 2025. And I’m curious if you can give us some highlights. What are the implications, why should people here at the show care?

Rupal Shah Hollenbeck: Yeah. Well, they have been caring and the Cybersecurity Centre is a major center here at Davos, and we’ve been fortunate enough to be a part of it. And we put out this report every year, and this report talks about the cybersecurity trends and what’s happening. And as you might imagine, cybersecurity attacks are on the rise. They’re on the rise year-on-year from 2023 to 2024 by 44%. And then when you drill down into that and you look at the kinds of attacks, they are increasingly ransomware. Ransomware attacks, you won’t believe it, on the rise, 90% increase year-on-year. And you can probably imagine what the cause of that is.

Daniel Newman: Sure, absolutely.

Rupal Shah Hollenbeck: Yeah. So it’s startling-

Daniel Newman: Wait, wait. Were we supposed to say AI?

Rupal Shah Hollenbeck: You are.

Daniel Newman: Okay. Hold on.

Rupal Shah Hollenbeck: You missed your cue.

Daniel Newman: No, I know. It was fine.

Rupal Shah Hollenbeck: But I’ll take it.

Daniel Newman: No, it was fine.

Rupal Shah Hollenbeck: It was fine.

Daniel Newman: I wasn’t sure if that was the goal. I want to give you the shout-out. You could say that again. Do you know what they are? And I’m going to say AI.

Rupal Shah Hollenbeck: That’s right. That’s right. And the thing about AI is that it exacerbates the good and the bad. And so when you take the streams of AI and AI innovation and you intersect that with cyber security, both the opportunity and the damage is seemingly endless. And that makes for really important conversation this week.

Patrick Moorhead: Nothing’s easy, okay. Is it as simple as staying ahead of the threats, putting in the proactive and defensive maneuvers to thwart the bad folks?

Rupal Shah Hollenbeck: As easy as that sounds, it really isn’t. However, this really is our reason for being, and it’s the reason for being in the cybersecurity industry. And so we’ve been using AI for 33 years, but it’s always changing and you never know what’s around the corner. And so the thing about AI is that right now, with the advent of the most sophisticated generative AI tools that we’ve ever seen, the bad guys get badder and they get faster, their pace increases. And so things like deepfakes, there could be a deepfake out there that would mimic the three of us, and it would take us hundreds of dollars and it would take us a couple weeks of R&D work to get that. But we could get it on the dark web. Guess what? Today with seven seconds of each of us speaking and with pennies on the dollar, we could get a deepfake in a day that would mimic the three of us.

Patrick Moorhead: And probably not saying anything that’s smarter than we say, although it wouldn’t be too hard, but it probably wouldn’t be something that’s flattering.

Daniel Newman: It’s super interesting though, because nowadays we create so much of the content, we enable these folks. I mean, a lot of it’s our behavior. I mean, everything from the way we don’t read our terms to the way we don’t update our apps, to the way we sort of nonchalantly do shadow IT on our devices and don’t necessarily take advantage of security that’s being offered from the corporate and CISOs and whatnot. We’re the perpetrators of a lot of our own problems, right?

Rupal Shah Hollenbeck: It’s cyclical. And then when you add on top of that new generations. I will admit that I’m an Xer. And as an Xer, I’ve got a certain point of view on my privacy. I’ve got a certain point of view about my data. Well, my kids who are Gen Z, absolutely have a different mindset. And so we have to operate in this world where built in there is an expectation and an acceptance frankly, that I am going to be less private with my data, that it’s going to be out there. And so the decisions that I make therefore need to be way more deliberate with intended and unintended consequences taken into consideration. And cyber security players like Check Point have to behave differently inherently because the decisions that were made by baby boomers and Gen Xers are now being made by millennials and Gen Z. and they fundamentally have a different view about their data.

Patrick Moorhead: Rupal, we weigh in on about 50 cyber security companies at different levels, and we’re hearing a lot, what we hear is, hey, we’re the best to do this at AI. We’re putting the most investment or it’s the best investment. What is the secret sauce at Check Point Software in this age of generative AI security that you think sets you apart?

Rupal Shah Hollenbeck: Well, a couple things. First, we’ve been at it for a while. We’ve been at it for 33 years. And having been at the beginnings of this industry, I think that long legacy, I think, helps propel and inform our future. And so the only thing we know is that we don’t know what’s around the corner. And when you’re grounded in that, it allows you to stay ahead and it allows you to always learn. And so while we don’t know what’s around the corner, we’re going to be agile enough and invest enough in research and R&D to be able to act swiftly. And that’s really what matters here is acting swiftly, acting with efficacy, acting with efficiency, so that we’re providing the world the best solution. And here’s the thing, cybersecurity is incredibly complex and we know that we are in a hyper connected world. And so it is more important than ever that we invest in providing the simplest way through for CISOs and security professionals to navigate that complexity.

Daniel Newman: Let me take it up to the CEO level. We actually partnered with Kearney, large management consulting firm, and did the largest of its kind CEO survey on AI, 213 CEOs running companies with over a billion dollars of revenue. One of the hot topics was security, but I often have said over the years that boards and CEOs tended to always kind of spend on security a little bit like insurance. How much is the least I can possibly spend to cover my bases-

Rupal Shah Hollenbeck: And get away with it, yeah.

Daniel Newman: In the AI era, to that audience… Because there were some indicators in the data in your survey that they’re seeing how important security, privacy, governance are. But how do you kind of advise that group that sometimes, because they can’t necessarily look at it from the lens of productivity gains, they don’t necessarily see it in efficiency numbers to get the CEOs to really buy in. How are you sort of positioning that?

Rupal Shah Hollenbeck: Well, the business of cyber is everything, and increasingly so with AI. So our report said for example, that 67% of organizations said, yes, AI will have the biggest impact to my security posture than ever. 67%. But only about 30% are doing anything about it. And part of it is because they don’t know what to do. It’s a complex world, and that’s where it’s really important to consult with and partner with the cyber security professionals around you. And it’s important to expect and demand simplicity in the rollout of solutions because we know the world of cyber is anything but simple.

Daniel Newman: I always love those reverse data points like the two thirds, one-thirds. There’s a one I call the 80/8 rule where 80% of brands think that they’re truly differentiated in the market, but only 8% of their consumers do.

Rupal Shah Hollenbeck: It’s exactly the same. It’s exactly the same. And so our job is to take that 67% and say, right, you get it. Now what do we do about it together? And we are increasingly having those conversations with boards.

Patrick Moorhead: Yeah. One of the ways that I’ve seen moving the needle in 35 years of security is finding any way in the world to quantify the benefit of a said product. In other words, put a metric on it because with infrastructure, they can do performance testing and there’s a certain way and potentially a certain payoff. I think that the security industry as a whole would be helping itself by quantifying not just the big risk and the threat, but specifically giving the customer a way to do that.

Rupal Shah Hollenbeck: You’re exactly right. And I sit on a public board in addition to being a part of Check Point, and I see this every day, and it’s really unfortunate, but the majority of public boards out there face the cold hard truth when there’s a breach and when they feel it in their wallet and their shareholders feel it. And so our aim is to quantify that every time it happens, to avoid the next time. And to consistently put that data into the hands like the NACD, and to put that in the hands of public boards.

Daniel Newman: It’s interesting. The TCO turns into the RCS study, the real cost of stupidity. Ignorance is not bliss with security. And of course, like I said, we’ll see if AI kind of pivots that sort of insurance thinking and really puts it more into a proactive strategy. But it sounds like you’ve got a lot of evidence and you’re seeing it in the market, Rupal, I just want to say thanks so much for spending some time here with Pat and I at Davos. It’s a great event. I hope you have a great week.

Rupal Shah Hollenbeck: Thank you. You guys too. Stay warm.

Patrick Moorhead: Thanks.

Daniel Newman: And thank you everybody for tuning in and being part of The Six Five On the Road, a View from Davos. So many great conversations here, subscribe, be part of our community. Listen to all of the different interviews we have. We are talking to tons of leading executives and those that are really thinking about what the future looks like right here at the World Economic Forum in Davos. See you all later.

The post Talking AI, Cybersecurity, Ransomware & Deepfakes with Check Point Software – The View From Davos appeared first on Moor Insights & Strategy.

]]>
Six Five Connected: The Need For High-Density Storage in The Age of AI https://moorinsightsstrategy.com/the-six-five/six-five-connected-the-need-for-high-density-storage-in-the-age-of-ai/ Wed, 22 Jan 2025 17:00:45 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45251 Avi Shetty and Geeta Vaghela join hosts Diana Blass, Keith Townsend, and Patrick Moorhead to share their insights on Solidigm's revolutionary 122 TB SSD and the critical role of high-density storage in the era of AI.

The post Six Five Connected: The Need For High-Density Storage in The Age of AI appeared first on Moor Insights & Strategy.

]]>
The age of AI is here, and data is its fuel. But could storage limitations be holding us back? Conversations at SC24 explored this challenge diving into the SSD vs. HDD debate, and the latest advancements like Solidigm’s 122TB SSD – a game-changer that can hold 4K copies of every ’90s movie 2.6 times over! 🤯

Tune in for more on this critical piece of the AI race as Diana Blass, Patrick Moorhead, and Keith Townsend break down the latest in solid-state drive technology and their importance in AI and big data landscapes with Solidigm‘s Avi Shetty and Dell TechnologiesGeeta Vaghela.

This episode covers👇

  • The unveiling of Solidigm’s 122 TB SSD
  • The growing necessity for high-density storage solutions in modern computing
  • How Solidigm and Dell Technologies are addressing the AI-driven demand for data storage
  • The potential impact of high-capacity SSDs on enterprise storage strategies
  • Insights into future storage technologies and trends

Learn more at Solidigm and Dell Technologies.

Watch the video at Six Five Media, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Connected is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Diana Blass: Welcome to the Age of AI, where data is a driving force behind innovation and storage is a refinery making it all happen. Consider this, breakthroughs like self-driving cars and personalized healthcare depend on vast amounts of data. But it’s not just about collecting it, it’s about ensuring that data is organized, secure, and accessible for tasks like training AI models, running predictions, and analyzing insights. That’s where storage comes in. And as AI grows, traditional storage methods just can’t keep up. Is the industry putting enough focus and investment towards their storage solutions right now?

Avi Shetty: Clearly not.

Patrick Moorhead: We’ve heard a lot about GPUs with generative AI. We’ve heard a lot about photonics, but there’s just not a lot of conversation about the storage element.

Avi Shetty: We believe that’s changing.

Diana Blass: Decades ago, the priority was simple. Store as much data as possible at the lowest cost, starting with punch cards, then magnetic tape and core memory. The arrival of hard dish drives in 1956 marked a turning point. HDDs revolutionized storage with faster random access to data, higher capacity, and greater durability. Over time, they became smaller and quicker, but the mechanical design couldn’t keep up with the speed and efficiency needed for today’s AI-driven workloads. Enter solid state drives or SSDs. Replacing the moving parts of HDDs with NAND flash memory, SSDs deliver faster speeds, higher reliability, and improved energy efficiency, and they can hold more data too.

Keith Townsend: So I looked this up the other day, the largest hard drive that you can get, spinning rust, 32 terabytes. You know what the largest physical SSD you can get now? I think you might have something in your hand that says it.

Diana Blass: Could it be this, 122?

Keith Townsend: 122.88 terabytes.

Diana Blass: Solidigm unveiled that SSD in November of 2024, noting that it’s enough storage to house 4K quality copies of every movie theatrically released in the 1990s 2.6 times over.

Keith Townsend: This is an insane amount of storage.

Geeta Vaghela: More than the Library of Congress or the UK library. I mean, this is huge amounts of data.

Diana Blass: But HDDs still hold value. They’re more affordable and are often thought to offer compliance advantages due to their suitability for read intensive workloads. However, Solidigm challenges this notion with its new 122 terabyte SSD, asserting that the QLC media used in its SSDs is specifically optimized for read intensive tasks. Solidigm claims this not only matches, but even outperforms traditional HDDs in such environments. So what’s next? Will SSDs ultimately replace HDDs or both continue to serve distinct roles in the evolving storage landscape? To find out, we took these questions to Supercompute, one of the world’s largest events for computing, networking, and storage. Our first stop, Dell Technologies. So when we talk about solid state drives versus hard disk drives, where do you see the market going?

Geeta Vaghela: So I think data center designs, power, cooling, performance, thermals, all of those things are coming into play. And when I think about solid state drives or HDDs, they’re sort of the foundation of a storage system. That’s what the data ultimately resides on. And so partnering with drive vendors, understanding what they’re doing, getting denser in their technologies, becoming more power efficient, that’s a huge part of how I see SSDs playing into this landscape. And we continue to see HDDs have their place as well, compliance archive type use cases. But AI right now is really pushing performance, density and capacity in terms of a data center design.

Diana Blass: It’s all to enable outcomes that can transform an enterprise. What’s that look like? Come with us. All right, we’re now outside the Dell AI factory with NVIDIA, which as you can see is a 50-foot semi truck that they somehow figured out how to get inside the Georgia World Congress Center here in Atlanta. So what’s inside? Well, let’s check it out. All right, outfit change. I made my way over to the mixed virtual reality station where I got to meet Andy again, this is Dell’s digital assistant. And what’s super cool about this headset, not only can I see a combination of real world and virtual world, but I also… Andy walked me through her, the latest server here, and then as she was showing me the device, I got to actually pick up part of it. Oh my gosh, that is so cool. Now, as we look at all those applications, think about the data, data that’s being collected, moved, processed, and trained in real time to enable those outcomes. All this happening inside that semi truck raising important questions like-

Keith Townsend: How much weight can I put inside of a truck? How much power and cooling can I put inside of a truck?

Diana Blass: Keith Townsend, CTO advisor at The Futurum Group makes a great point about storage. It’s not just about how much data you can store. Sure, AI needs systems that can handle massive amounts of data, but there’s another side to it. How much space and power these systems demand.

Avi Shetty: Think about if you’re solving for, again, an example, solving for a hundred petabyte server. If you did that with a hard drive, you would need 10 racks and consuming HDD power. With RS high-density SSDs, you can bring that down to two or maybe three racks. Thus reducing your infrastructure space, thus reducing your cooling needs and helps with overall TCR.

Diana Blass: And it comes at a pivotal moment as a tech industry and its partners grapple with how to officially power its AI solutions.

Avi Shetty: There’s one study which kind of summarizes this, I believe is from a Meta and Stanford, where they evaluated Meta’s recommendation engine and looked for storage efficiency, especially on the power side. And the study came back with 30% of power was allocated to storage in their HDD-based storage infrastructure, which essentially says that if you move from an HDD to an SSD in that domain, you had an additional 80% of power, which could be going back to your GPU to get its efficiency high.

Diana Blass: Now interestingly, 122 terabytes is so large that some vendors we spoke to say they’re not sure most clients have a real need for it now, as AI is still in its early days. Here’s sound from our interview with VAST Data, whose data platform helps all sorts of companies manage and process massive amounts of data.

Timer Hagy: We have a lot of customers that store hundreds of petabytes and sometimes at the exabyte level. Those customers can benefit a lot by shrinking that footprint. And so if you’re talking about moving from four rack units to two, maybe that’s not a lot. But if you’re talking about a room full of racks and you cut that by half, that’s significant savings.

Diana Blass: So let’s go back to our original question. Will SSDs replace HDDs? It’s always hard to predict the future, and right now it appears the consensus is that the market will be hybrid. As vendors say, it all comes down to the use case. HDDs remain a cost-effective option, especially for compliance needs. But SSDs are rapidly evolving, dropping in price, and quickly gaining market share. Now interestingly, that solid state drive that we saw released by Solidigm has already been sold out. So it’ll be interesting to see the innovation to come, especially as we look at this show floor where everybody is looking to innovate with AI. What’s on the horizon? Well, stay tuned and you’ll stay connected. I’m Diana Blass.

The post Six Five Connected: The Need For High-Density Storage in The Age of AI appeared first on Moor Insights & Strategy.

]]>
IBM’s Rob Thomas on Generative AI, LLMs, and the Path to Value Creation at the World Economic Forum in Davos, Switzerland – The View from Davos https://moorinsightsstrategy.com/the-six-five/ibms-rob-thomas-on-generative-ai-llms-and-the-path-to-value-creation-at-the-world-economic-forum-in-davos-switzerland-the-view-from-davos/ Wed, 22 Jan 2025 01:03:00 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45247 Rob Thomas, Senior VP and Chief Commercial Officer at IBM, joins hosts Daniel Newman and Patrick Moorhead, sharing his insights on IBM's strategies and the technological advancements shaping our future. A compelling overview from the heart of Davos.

The post IBM’s Rob Thomas on Generative AI, LLMs, and the Path to Value Creation at the World Economic Forum in Davos, Switzerland – The View from Davos appeared first on Moor Insights & Strategy.

]]>
New insights from Davos ❄ – the future of AI is about value creation.

Has AI reached a pivotal point? IBM’s Rob Thomas believes it’s time to move beyond experimentation and to focus on generating real value with AI.

Patrick Moorhead and Daniel Newman are joined by IBM’s Rob Thomas, SVP, Software and Chief Commercial Officer, for a conversation on IBM’s recent initiatives and his take on the future of AI in this installment of The View from Davos.

Tune in as they cover ⤵

  • IBM’s strategic priorities and how they are shaping the future of technology and business
  • Insights into the latest trends and innovations discussed at Davos
  • The power of speed: AI’s ability to drastically accelerate processes – cutting cycle times from 6 months to 4 hours
  • The role of Open Source & avoiding vendor lock-in to fostering collaboration for AI adoption
  • Their take on building sustainable, ethical, and secure technology solutions for the future

Learn more at IBM.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The View from Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On The Road with a View from Davos, as you can see in this amazing background here, but we’re not here just to be on the ski slopes or sled down the hill. We’re having some great conversations here. I mean, there’s so many things going on right now. In fact, a new administration is coming in the US. There’s a lot of discussion around this. We have this ever-present question about regulation, and there’s innovation discussion at the conference as well, particularly around generative AI, things like sustainability. With that said, I’d like to bring back on the show Rob Thomas, welcome back to The Six Five.

Rob Thomas: Great to be here. Great to see you both. And how about Davos? Great to be here.

Patrick Moorhead: I know it’s great to be in Davos. It feels good.

Daniel Newman: Just for the ski slopes, I didn’t do any skiing, man. I don’t know about you.

Patrick Moorhead: Just had to throw that in there.

Daniel Newman: It’s beautiful out there, it certainly is.

Patrick Moorhead: Did my skiing while I… So Rob, a lot of stuff going on here at the show. I’m curious, what are your objectives at the show, customer, governments, what do you want to get done here?

Rob Thomas: This is an incredible place to do a lot of meetings in a short period of time. We have a very global business as you know, so I’m able to meet with businesses and governments from Indonesia to Japan to the Middle East, to people based right in the US where I live or across Europe. So in terms of the quality of meetings you get, it’s incredible with just one trip as opposed to with 100 trips. And also I do sense something different happening in Davos over the last few years, I’d say, and it’s almost the integration of public and private, meaning there’s a lot of discussion that happened here about businesses impacting governments and vice versa, and I think that can be very healthy actually.

Daniel Newman: Yeah, it is really exciting times. I mean, we’re seeing rockets being shot into space regularly, and I mean the private sector, how much it can drive, and then of course partnering with public to your point. And I know one of your very own, Head Of Research, Dario Gil is going to be taking a role in the new administration. That’s super exciting. It’s indicative of the kind of research that’s going on at IBM and that he’s driving, since we’re talking a little bit about this, love to get your take. I mean, I know new administration, that often means a lot for someone like you driving and leading such a large company. Any big thoughts on this next four years and what you really hope to see out of a Trump administration that might be strong for AI or we hear about blockchain and crypto, we’re hearing about possible M&A, what are some of the things that you’re thinking about?

Rob Thomas: First of all, we’re super happy for Dario Gil. As you mentioned, he was nominated as undersecretary for Department of Energy, which runs all the national labs, that type of thing. So we will miss him, but super happy for him because it’s an exciting area to be on the, I’d say leading edge of what’s happening in science and technology in the US and around the world. I think the thing I’m the most hopeful on, as you look out the next four years, is less regulation, meaning a business-friendly environment. We want to be able to create growth, create jobs, do M&A, and I think the incoming administration gives us a lot of opportunity to do that. So really excited about that. Growth solves every problem, and so good for the US. If the US gets growing, the difference between 2% GDP versus two and a half or three, if we could possibly imagine that, it has a remarkable impact on a per capita basis, and so to me, this is all about growth.

Patrick Moorhead: Yeah. Speaking of growth, I mean one of the big theses of what could drive that growth is obviously AI, and the latest flavor being generative AI and even agents, which I think we all believe is an incredible economic opportunity and a growth driver. I want to ask you, you have a lot of meetings with your clients, you get a certain read of where they are. A lot of folks have done their experiments, they’ve done some POCs, and now it’s this next jump to really scaling it, enterprise-wide, government-wide. And I’m curious, what is the read you’re getting on what help they need the most, and what are you doing about it to help solve those?

Rob Thomas: I think 2025 will be about value creation and AI, and think of it as almost-

Patrick Moorhead: So actually creating it.

Rob Thomas: No creating value with AI.

Patrick Moorhead: Okay, okay.

Rob Thomas: So value creation, meaning can you return to the top line or the bottom line of your business? Think almost like a curve. We did a lot of experimentation. You acknowledge that. Then many started down this path of we’ll do RAG, we’ll do fine-tuning. I think we hit a tipping point somewhere last year, where companies could actually see a return on automation of technology and operations. Starting to see that happen. Then that goes into code, which is starting to happen. That goes into AI assistance, that eventually then gets to agents, so I’m not really a believer that the only answer from here is agents. I think this is a curve where you have to do a little bit at each point, but we have to get past this point of experimentation.

Patrick Moorhead: Sure.

Daniel Newman: Yeah, that seems to be the big opportunity. We’ve done some assessments of the market. We’ve looked at, and by the way, Arvind was one of the early ones that kind of called out, I call it the prune to grow thing, and I know it was sort of one of those things that I think a lot of the media kind of misunderstood, but at first when you look at any new technology, you start to immediately look at where do we take out cost? Where do we become more efficient? How do we immediately get some return? It makes it easier to invest. We’re seeing four trillion in opportunity from agents and generative AI to be pulled out. The next though is how it sounds like you’re kind of looking at Rob is how do you put those dollars back to work to create scale like every industrial revolution in history has. I think that’s where the message sometimes gets misinterpreted, like, oh, you’re just going to get rid of four trillion of labor. Well, we want to upskill, repurpose, improve. I mean, are you sort of starting to see that? It feels like the first generation was POC, but it was also what redundant, monotonous, repetitive things can we get out and use agents and use generative and use assistance for, just curious your take there.

Rob Thomas: I think time may tell that maybe the biggest benefit of AI becomes speed, and how companies can deliver. We had lunch right before this with a number of C-suites from around the world, and two examples. One was a pharma firm that said, “We’ve gone from, we have to put a thousand things in motion to get one to now we can narrow it down to 12.” Because AI is becoming more part of the process, so that drives a ton of speed. If you’re not doing a thousand experiments and instead you’re doing 12, that’s a dramatic difference.

Daniel Newman: The right 12 is what they’re saying, right? The right 12-

Rob Thomas: Yeah. There’s enough signal now from the analysis they can do to make sure that you’re going to have higher odds of impact. Another one was a manufacturing example that said, we can cut cycle times from six months to four hours. If you can cut cycle times, that drives massive leverage and speed in a business, and that’s because AI can do things in partnership with humans that humans cannot do on their own, and I think that’s probably the biggest promise of what can happen here.

Patrick Moorhead: Rob, I meet a lot of CIOs that say that they’re having a hard time doing that sorting mechanism, going from that 100 to that 10, and it’s really a conversation about ROI. What is your guidance to folks who are in that situation, maybe not as enlightened as the folks that you just had your lunch with.

Rob Thomas: You should not underestimate the cultural impact of trying to do something on AI. Because let’s be blunt, it will disrupt how you do things. It will disrupt some amount of your skills, some amount of your employees. If you are scared of that or let that be a deterrent, then you will quickly fall behind. I’d say the positive side of that is every time there’s been new technology in the world, ultimately it’s created more jobs. It’s really been the forcing function behind people building new skills. I think people have to lean into this, but I do still talk to people that are a little worried If I do this, then I actually need a smaller team. Maybe that’s okay in the short term because that way we have a bigger team in the long term.

Patrick Moorhead: Sure.

Daniel Newman: Yeah. It’s really interesting. Rob, a couple of things. I’m going to maybe try to thread a couple of needles here, but I know you’re working on a book on value creation. I think it’s with AI. I want you to talk about that a little bit and also where the role that open is. I know, gosh, IBM has put a ton of effort into small language models. We’ve seen in our lab, we’ve actually been testing and looking at some of the work that you’ve done with Granite, and I think SLMs might be the future. Tying together how you create the value measure ROI and do it with open source collaboration, both with software and sort ecosystem. Can you kind of put those things together and share a little bit about the book as well?

Rob Thomas: Yeah, thank you. So I have a book coming in May. It’s called AI Value Creators, and I’ve worked with three people, Dario Gil, who you mentioned, he’s helped a lot. Paul Zikopoulos, Kate Soule, so the four of us have partnered together on this. We thought there was a story that wasn’t being told in AI. The story that’s being told is using an LLM, use ChatGPT. Use a chatbot. We thought that underplayed what you described as the four trillion opportunity. Maybe it’s even more than that. So think of this as a playbook for anywhere from the C-suite to a developer for how to think about what’s happening in an AI, how you get value out of it. The connection between data and AI, it sounds obvious to you guys. It’s not obvious to every company I would say. Step one is often make your data ready for AI, I would say. And all these pieces together we’re trying to build, I guess I’d say kind of a playbook for the next three to five years on how you get value from AI.

Patrick Moorhead: Yeah, Rob recently, and I don’t know if it was just the fourth quarter of last year that these conversations started to pop up, but the CIOs were saying, “Okay, I got my data in alignment. I did some incredible magic tricks. I figured out RAG. I either did a lot of RAG or I did an SLM or some combination and this thing works, but I can’t get the people in the company to use it and adopt it.” Are you addressing that in your book, any tips and tricks for how that, because this is a new one that has popped up for me. In fact, a Fortune 50 tech company, a CIO I had a conversation with, “It works. It’s amazing. I can’t get my people to use this thing.”

Rob Thomas: I think it comes down to there’s the cultural issue we talked about, there’s also a skills issue, and there’s also just a change management issue. If you’re adopting this technology and using it, you actually don’t have to do everything you’re doing today. You can probably do something different. So I think we all have to be a little patient that this takes some time. One reason back to the question on open source, we’ve been so focused on making this happen in open source is that brings a groundswell of support from any skill level. Certainly starting with developers, anybody can contribute, anybody can participate in what’s happening. That kind of reduces, I’d say the barrier to adoption, getting started. And also solves the fear of many companies have this fear, we’re going to get locked in. If we go down the path with one large language model, we are forever stuck in that. Open source prevents lock-in, so that’s a good point of leverage.

Patrick Moorhead: Yeah. It’s funny, this reminds me a lot of the debates we had in the early days of hey, ERP, business BPO, business process optimization, right? Getting people to do something different. I’m pretty optimistic, because this industry’s been through this before. There are certain generations who haven’t, and I think there’s a lot we can learn from the past to chart out an amazing future.

Rob Thomas: To give you a sense though, in our own story in IBM, we shared last year that we’ve driven $2 billion in cost out of the business through using AI. So I do think it helps us credibility-wise to walk in and tell people, “This is how we’re doing it.”

Patrick Moorhead: Sure.

Rob Thomas: In the development of our own products, we now are generating 6% of the code that we write with generative AI, using watsonx Code Assistant. Now, I know other companies give huge numbers, to be honest I’m incredibly skeptical of that. We’re at 6%, I think we can get to 20, 25%, which would be amazing, but even 6% at this point, that is a huge leverage adding on to the super-talented engineering team that we have.

Patrick Moorhead: Sure.

Daniel Newman: You’ve been able to move really quickly. Maybe would like to end with a bit of a culture question since you’ve been talking a little bit about what makes it happen, and we just did a landmark study, probably one of the largest of its kind. Talked to 213 of your peers, CEOs of companies with more than a billion of turnover, and we did it in partnership with a management consulting firm Kearney, and we were trying to understand, one of the big things was the adoption, and we learned something that basically companies that are struggling to adopt and successfully deploy AI, Rob, are also companies where the CEOs and top executives have held it too close, meaning they’re trying to kind of joystick it is the way I’m explaining it. You’re seeing it from both sides. You are customer zero, and I know Arvind’s passionate. You’re passionate. You’re also, through the Consulting Advantage program and through your customers that deploy your software, you’re seeing these deployments go everywhere. Any thoughts on how AI proliferates in the organization successfully, because we’re seeing almost 80% of companies that are failing, it’s because they’re basically are saying the CEOs held it and the board held it too close.

Rob Thomas: I think the companies that hold it close, the C-suite is probably not using it themselves. So the one thing I talk to my team about a lot is how much are you using it in your day-to-day job? I use it a ton, but that’s not obvious. A lot of people project as I know this is happening that’s for other people, that doesn’t set tone for an organization, so you have to be a user before you can convince others to use it I would say. So that’s probably the biggest thing we’ve tried to drive in IBM is you have to lead by example on these topics.

Daniel Newman: Yeah. It feels a little bit like the historic digital transformation where it’s like, “Why is it not working? I told everybody to digitally transform.” It’s like, okay, what does that mean? But AI’s happening so fast. So anyways, Rob, I wanted to just say thank you so much for spending some time with us here. Congratulations on the book.

Rob Thomas: Thank you.

Daniel Newman: Congratulations on the progress. Of course, this should be a really big year for IBM. I mean all these POCs turn to mass deployments, Rob, this should start to show up in the numbers. I know you can’t talk about that, but we’ll keep our eyes on it. We’ll be in touch and we’ll talk more soon.

Rob Thomas: We are excited. One preview I’ll give you, a year ago I think we announced Concert, which was how do you apply AI for resiliency in technology systems? We’re super pleased with the progress there. I think we kind of hit a pain point at the right time on resiliency, which is, every company’s trying to say, “How do I have an infrastructure that can stand the test of time, is not going to be exposed to threats.” I think we hit something with Concert, so you’ll see more on that in May.

Patrick Moorhead: Looking forward to it. Nice little tip there.

Daniel Newman: Yeah. All right, Rob, thanks so much for joining us. I’m sure we’ll do it again sometime in 2025. Catch you soon, and thank you everybody for tuning in, being part of Six Five On The Road, a View from Davos. Subscribe, be part of all of the coverage that we had here this week at the World Economic Forum. It’s been a big week so far and it’s only the beginning. But for now we got to say goodbye. See you all later.

The post IBM’s Rob Thomas on Generative AI, LLMs, and the Path to Value Creation at the World Economic Forum in Davos, Switzerland – The View from Davos appeared first on Moor Insights & Strategy.

]]>
MI&S Weekly Analyst Insights — Week Ending January 17, 2025 https://moorinsightsstrategy.com/mis-weekly-analyst-insights-week-ending-january-17-2025/ Tue, 21 Jan 2025 15:52:36 +0000 https://moorinsightsstrategy.com/?p=45152 MI&S Weekly Analyst Insights — Week Ending January 17, 2025. A wrap up of what our team published during the last week.

The post MI&S Weekly Analyst Insights — Week Ending January 17, 2025 appeared first on Moor Insights & Strategy.

]]>
MI&S Logo_color

Welcome to this edition of our Weekly Analyst Insights roundup, which features the key insights our analysts have developed based on the past week’s events.

This week I’m in Davos, Switzerland, at the World Economic Forum, meeting with business leaders from around the globe. Moor Insights & Strategy is also co-sponsoring a special session on “Protecting Press Freedom and Democracy,” moderated by Axios Media. While we at MI&S are not journalists, we rely on the technology press for good information, and in turn are often quoted in press outlets as we contribute our own viewpoints to the public discourse on events unfolding in the tech world.

Axios Views - Twitter Moment

A free press — definitely including social media — is crucial for a thriving tech ecosystem because it ensures that individuals, businesses, and policymakers have access to the information they need to make informed decisions in an increasingly complex technological landscape. We’re proud to support this event as an expression of our deep-rooted commitment to maintaining the freedom of information flow in the tech sector and beyond.

If you or your company executives will be in Davos and you’d like to connect there, please reach out — we’d love to hear from you. 

Hope you have a great week,

Patrick Moorhead

———

Our MI&S team published 16 deliverables:

This past week, MI&S analysts have been quoted in the press about Biden’s AI restrictions, Google drones, international malware security issues, intelligent content management, and as usual, AI. Our insights were included in Fierce Network, Yahoo Finance, Ciso2Ciso, The Deccan Herald, and The Straits Times.

MI&S Quick Insights

I think everyone realizes the impacts AI is having on a wide range of business activities. So it should not be a surprise to anyone that The World Economic Forum’s 2025 Future of Jobs Report projects that almost 90% of companies expect that AI will redefine company operations by 2030.

AI is reshaping workplace dynamics. It is expected to create a net increase of 2 million jobs, resulting in 11 million new jobs while displacing 9 million. It is not surprising that the titles with the greatest job growth will be data specialists and AI/ML technologists.

What about people being let go from companies because they don’t have the necessary AI skills? It’s not as bad as expected because 75% of companies plan to upskill current employees for AI collaboration. That shows a focus on adapting to AI advancement instead of replacing staff. And 70% of companies plan to hire people who already have AI expertise. So, most companies will do a little of both actions. Along with those statistics, 50% of businesses say they will reorganize around AI opportunities, and 40% will use workforce reductions to handle AI expansion.

Just reading the news daily will tell you how quickly AI is being adopted. It’s an instance of the old “early bird gets the worm” saying: many companies believe that those who integrate AI fast and first will have a competitive advantage over those that don’t.

The message is clear: AI is coming, and it’s coming fast. It is expected to cause the largest workplace shift in decades. Management should establish AI priorities and get ready to implement them as necessary. Go AI, and go fast.

I think everyone realizes the impacts AI is having on a wide range of business activities. So it should not be a surprise to anyone that The World Economic Forum’s 2025 Future of Jobs Report projects that almost 90% of companies expect that AI will redefine company operations by 2030.

AI is reshaping workplace dynamics. It is expected to create a net increase of 2 million jobs, resulting in 11 million new jobs while displacing 9 million. It is not surprising that the titles with the greatest job growth will be data specialists and AI/ML technologists.

What about people being let go from companies because they don’t have the necessary AI skills? It’s not as bad as expected because 75% of companies plan to upskill current employees for AI collaboration. That shows a focus on adapting to AI advancement instead of replacing staff. And 70% of companies plan to hire people who already have AI expertise. So, most companies will do a little of both actions. Along with those statistics, 50% of businesses say they will reorganize around AI opportunities, and 40% will use workforce reductions to handle AI expansion.

Just reading the news daily will tell you how quickly AI is being adopted. It’s an instance of the old “early bird gets the worm” saying: many companies believe that those who integrate AI fast and first will have a competitive advantage over those that don’t.

The message is clear: AI is coming, and it’s coming fast. It is expected to cause the largest workplace shift in decades. Management should establish AI priorities and get ready to implement them as necessary. Go AI, and go fast.

Salesforce CEO Marc Benioff recently announced a pause in software engineer hiring for the company, suggesting that AI could automate a growing portion of development tasks. This move, likely motivated by potential cost savings and reported productivity gains via Salesforce’s Agentforce AI tool, raises questions about the evolving role of tech jobs and how companies might manage an AI-augmented workforce.

While some praise Salesforce’s innovative approach, many remain cautious about AI’s ability to completely replace human engineers soon. This decision also highlights a key challenge: Could existing departments like IT or HR oversee this new workforce, or will companies create new roles specifically to manage this digital labor?

Salesforce’s strategy serves as an interesting example of how AI might reshape business operations. It remains to be seen whether other companies will adopt similar strategies and how these trends could impact the tech job market over time. This potential shift in the tech landscape underscores the growing potential of AI to reshape industries and redefine workforce needs, with Salesforce highlighting a key strategy for companies promoting AI adoption: demonstrating its ROI through internal cost savings.

What to make of the outgoing administration’s restrictions on AI chips and models? There are so many different angles to consider. However, the sharing of AI model weights and export controls on semiconductors are the two biggies. While the U.S. government has billed this as diffusing AI innovation, it is at the same time restricting innovation of a couple of players that are on the leading edge of AI development.

Does this stifle AI innovation in the United States? I don’t believe so. Perhaps it reshapes some of our collaborative efforts on a global basis, but the semiconductor, hardware, and software ecosystems are going to continue to accelerate at seemingly exponential rates. I just don’t see that slowing down.

Here’s an interesting take: The primary target of these restrictions — China — has been leveraging open-weight models and is using these to try and gain a global footprint. Models like Alibaba’s Qwen have been showing good performance relative to what we have here in the U.S. — especially in multilingual support. And Qwen has found traction in many countries outside of the U.S. and western Europe. Just as Huawei pivoted after its U.S. blacklisting and gained such a strong global footprint in telecom, Alibaba and others can (with Huawei) deliver their own AI factories.

One of the questions we have to ask is whether we are truly protecting the U.S. and its allies with these protectionist measures. Or are we accelerating investments from adversary governments into AI that perhaps pay out in the longer term? It’s a tough question to answer.

The big news of the week was Lenovo’s announcement that it will acquire high-end storage provider Infinidat. While Lenovo has long been strong in the low end of the storage market, it has struggled to find a foothold in the high end of the market. Adding Infinidat to the portfolio solves for this challenge. However, it will take a bit of rationalization across product, marketing, and sales to find success and compete in the enterprise.

I believe that one of the most important assets to consider in this acquisition are the people who have developed, marketed, and sold Infinidat’s solutions to date. The high-end segment they have sold into — and that Lenovo desires to capture — works differently from the volume/transactional markets (commercial enterprise, SMB) where Lenovo has made its mark in storage. I think those same developers and go-to-market professionals will be essential for the success of this business combination. For more details, check out my full analysis of this deal on Forbes.

Active Directory (AD) is a core piece of enterprise IT, as it handles authentication and access to many important IT assets such as apps, databases, and security systems. Unfortunately, its importance also makes it a prime target for cyberattacks. That makes AD recovery after an attack a high priority, but that’s been a function in need of more innovation. “Recovering Active Directory is foundational to maintaining continuous business after a cyberattack, yet traditional methods are too complex and prone to error,” said Pranay Ahlawat, Commvault’s chief technology and AI officer.

To address this issue, Commvault has recently introduced Cloud Backup & Recovery for Active Directory Enterprise Edition, which aims to make AD forest recovery much simpler and more automated. Read more about this in my latest Forbes article.

Microsoft has introduced Microsoft 365 Copilot Chat, a new AI service for businesses that blends free chat features with consumption-based access to AI agents. This offering leverages AI technology to help users with tasks like document analysis and process automation. Costs vary depending on the complexity of the task, with simple web searches being free and more complex actions involving company data costing more. This flexible approach allows organizations to dip their toes into AI without a hefty upfront investment, scaling their usage as needed.

Hewlett Packard Enterprise Aruba Networking recently announced a portfolio of products tailored to brick-and-mortar retailers. The company’s retail portfolio includes a cellular bridge, a smaller-form-factor switch, and wireless access points that can support more sensors and devices — ultimately providing broader coverage. HPE is also partnering with retail device leaders including Zebra Technologies to ensure an ecosystem approach to its solution delivery. There is a tremendous opportunity in this market to delight customers with automated shelf replenishment and online-like experiences as well as to improve operational efficiency tied to better logistics and reduced shrinkage. From my perspective, the company’s retail portfolio and its AI-infused HPE Aruba Networking Central management console is well positioned to deliver value to retailers and customers alike.

SAP and IBM are continuing their 50-year relationship with a partnership to support the shift of SAP S/4HANA from on-premises to the cloud. This offering looks to facilitate the migration of SAP S/4HANA workloads from on-premises IBM Power Systems. For context, SAP has 10,000-plus customers running SAP on IBM Power servers. The collaboration of SAP and IBM focuses on helping organizations modernize their ERP environments and support AI-powered business processes. The RISE with SAP program provides a structured approach to cloud migration, offering outcome-driven services and platforms to assist organizations in reimagining their operating models.

The longstanding familiarity between SAP and IBM makes the shift less daunting, though adoption will depend on factors such as a given customer’s current SAP setup, budget, and readiness for cloud migration. I’ve talked a lot about modernization and the importance of change and data management, which will be key areas to address during these transitions. Transitioning systems isn’t easy, and any change can add complexity. Still, modernizing is crucial for businesses using ERP systems to stay competitive. This is a good opportunity for companies to make the most of their IBM Power server investments and use this collaboration to bring their ERP systems up to date.

IBM Consulting has announced plans to acquire Applications Software Technology LLC. AST brings expertise in Oracle Cloud applications, specifically with public-sector organizations in government and education and companies in manufacturing, energy, and CPG. AST specializes in implementations of Oracle ERP, HCM, Configure, Price, Quote (CPQ), Oracle Cloud Infrastructure (OCI), JD Edwards ERP, and NetSuite. This move fits with IBM’s strategy and builds on its recent acquisition of Accelalpha, which offers Oracle Cloud consulting services. My thought is that this year is the perfect time for ERP modernizations, especially with the AI craze. In that context, IBM Consulting has set itself up to help businesses transform and succeed.

Epicor Prism is bringing AI agents to the supply chain, making it easier for users to gain relevant insights. Integrated with Epicor Kinetic ERP, Prism uses AI agents to handle tasks such as data analysis, demand prediction, scheduling, inventory optimization, and updates. This should allow supply chain teams to save time and cut down on routine manual tasks so they can spend more time on strategic work. This is part of Epicor’s push to modernize its ERP systems in 2025 and could make life easier for businesses using Epicor in manufacturing, distribution, and retail. Definitely something to keep an eye on.

The Nintendo Switch 2 is precisely the device that I expected Nintendo would launch. It’s a combination of generations-old hardware with significantly improved user experience and UI. I think Nintendo understands clearly that it needs to hit the right balance between a certain price point and a certain game experience, which is what the Switch is all about. I think the people who expected the new model to be like a PC gaming handheld are living in an alternate reality. The handheld gaming market will always have the Switch at the entry level, while PC handhelds are distinctly premium products.

AT&T announced a fiber and wireless guarantee that compensates customers for downtime. I’m watching this strategy from my edge/IoT point of view because of its potential applicability in industrial IoT. Specifically, the business case for private 5G adoption rests on delivering reliable, predictable, scalable connectivity in enterprise and industrial settings. However, unlicensed spectrum alternatives (Wi-Fi et al.) are “good enough” for many use cases — at substantially lower costs. 5G’s advantages must deliver quantifiable ROI to justify the higher cost, and service-level guarantees help make the case for buying more 9s of guaranteed reliability.

The recent Sonos fiasco teaches a valuable lesson about what can go wrong with long-term support for complicated mashups of device firmware, cloud services, and phone apps. In this case, the company released a major app rewrite last May, resulting in usability issues and a cascade of serious bugs. Sonos could not simply revert to the old apps because upgrades to firmware and cloud services broke backward app compatibility. Among other consequences, this fiasco led to the departure of the company’s CEO.

Here’s my take from an edge / IoT perspective: Software-defined products, including vehicles (SDVs), create technical debt that extends throughout the product’s lifetime. Regression tests aren’t sufficient to catch real-world bugs and usability problems. (Last year’s Crowdstrike outage is another example of a catastrophic testing failure.) The lessons are simple:

  1. Don’t bet the farm on internal tests. Experiential tests on deployed products with real users must be part of the plan.
  2. Avoid forklift updates. If unavoidable, budget for significant testing, roll the update out slowly, and have a rollback strategy ready to go.


In another firmware-related incident affecting a software-defined product,
Tesla is recalling more than 239,000 vehicles for a condition where a computer circuit board short circuit causes problems, including loss of the rearview camera image. The fix is a software update that alters the power-up sequence to avoid a potential reverse-voltage situation that causes the short. (Transistors hate reverse voltage.) This is an excellent example of how SDVs can simplify maintenance because the fix is an OTA update that is transparent to the customer.

After 18 months of preparation, the FCC announced the launch of the U.S. Cyber Trust Mark label for IoT consumer devices. The voluntary security and privacy testing program requires eligible products to pass compliance testing by accredited (FCC-recognized) labs. “Voluntary” is the operative word here. Consumers will only look for the mark if it becomes widely used on mainstream products. That might happen, but I’m not holding my breath.

AT&T’s new service guarantee will fundamentally change how carriers operate over time as consumers start to expect actual service-level agreements with their carriers — and compensation when things go wrong. I expect that Verizon and T-Mobile will follow suit if AT&T’s move successfully retains customers, or takes customers away from competitors.

Samsung is teasing its next-generation smartphone — the Galaxy S25 line — this week. It will be really interesting to see how Samsung’s new flagship devices perform in the latest benchmarks against the iPhone as well as other Android phones with Snapdragon 8 Elite processors. I am excited to see what new AI features Samsung introduces to differentiate itself from the other Android OEMs and even Apple.

Microsoft has introduced a new consumption-based pricing model for its 365 Copilot Chat alongside its existing subscription-based option. This model allows organizations to experiment with and scale AI usage according to their needs and budget. The consumption-based pricing facilitates controlled experimentation and proof-of-concept projects. However, potential inconsistencies in functionality and updates across the two models in this tiered system could create user experience disparities.

This disparity may be a strategic move by Microsoft to incentivize customers to upgrade. Still, the consumption model’s flexibility could also attract customers who desire the full feature set of Copilot along with consumption-based pricing. The flexible pricing strategy could potentially drive wider AI adoption, but ensuring a consistent and valuable user experience across both models will be crucial for Microsoft. The offering addresses many of the barriers to AI in the enterprise, including cost and adoption, and it promotes better security for companies by discouraging BYOAI.

Google has announced changes to its Workspace offerings, integrating AI capabilities into its Business and Enterprise plans without requiring paying for additional add-ons. Effective last week, this update includes AI assistance in various Workspace applications such as Gmail, Docs, Sheets, and Meet. The new features encompass Gemini Advanced for complex tasks and NotebookLM Plus for research assistance. By incorporating these AI tools directly into existing plans, Google appears to be lowering barriers to entry for businesses interested in AI. This approach, similar to Microsoft’s recent consumption-based model, could facilitate wider AI adoption and allow Google to demonstrate the value of its AI to customers. The strategy may encourage users to engage more readily with AI features within familiar applications, potentially leading to increased productivity and improved work quality. Google states that it has implemented security measures and compliance certifications for these AI features, addressing potential concerns about data protection and information access control.

CES 2025 has been over for a week now, and it’s quite clear looking back that lots of PC OEMs refreshed their lineups to take advantage of the latest chips from AMD, Intel, and NVIDIA, almost all of which focus on AI performance and experiences. It remains unclear whether AI applications will actually take hold this year, but it’s quite clear that they did not in 2024.

Miami University and Cleveland Clinic have created a partnership that will strengthen Ohio’s efforts to become a leader in quantum computing. The partnership will create Ohio’s first college quantum computing degree program. The collaboration will integrate Miami University with the Cleveland Clinic’s on-site IBM Quantum System One, the first quantum computer fully dedicated to healthcare. (Readers with long memories may recall the Forbes article I wrote a couple of years ago about the debut of that computer.)

Miami University will develop bachelor’s, master’s, and doctoral programs in quantum computing. Cleveland Clinic, in turn, will offer internships and research opportunities for Miami students. Aligning a quantum curriculum with actual healthcare applications will open a pipeline that will probably boost Ohio’s economy.

On a personal note, I’m especially glad to see Miami University move into quantum because my three grown daughters all graduated from that institution.

Last week, Microsoft launched its Quantum Ready program to alert business leaders that quantum computing has made significant progress over the past few years — and that they should get ready to take advantage of that progress. In 2024, several significant quantum breakthroughs and important pieces of research moved the technology forward. The field has gone from theoretical mathematical concepts to an emerging technology on the cusp of making major breakthroughs in multiple modalities. These modalities include superconducting, trapped ions, neutral atoms, photonics, and topological quantum computing.

Other factors have also helped improve quantum. One important advance is that quantum processors have improved significantly over the past five years. Current quantum computers have higher-quality qubits, allowing computations that weren’t possible five years ago. Microsoft’s initiative urges business leaders to get ready to harness the transformative potential of quantum computing coupled with AI.

Microsoft’s commitment extends to the global stage. This year, it is partnering with the United Nations, the American Physical Society, and others to celebrate the 2025 International Year of Quantum Science and Technology. This initiative commemorates a century of quantum innovation while fostering awareness of how quantum applications will revolutionize industries. By leading these efforts, Microsoft aims to empower organizations and communities worldwide to embrace the quantum future effectively.

Cisco recently announced its AI Defense platform, which is slated to be generally available in March. One of the challenges associated with securing algorithmic models is that they are not deterministic and can be easily compromised. As modern AI workloads move from the cloud to network edges, attack surfaces will be greatly expanded, making safety and security more difficult. Time will tell if Cisco’s approach is effective, but I believe that AI Defense has the potential to address AI security at scale with automated validation techniques that can dynamically adjust guardrails to an ever-changing threat landscape.

Nokia is making progress towards its goal of becoming an enterprise network services provider. The company has had challenges broadening its reach beyond the cellular market, but its innovation in delivering autonomous networks has great promise. Last week I published my insights on this topic in a Moor Insights & Strategy research paper.

Research Papers Published

Citations

Biden’s AI Restrictions / Matt Kimball / Fierce Network
Here’s why Biden’s new AI restrictions could backfire

Box AI / Melody Brue / Box Investor Relations (picked up in multiple outlets)
Box Delivers Intelligent Content Management to the Enterprise with New Enterprise Advanced Plan

Google / Drones / Anshel Sag / Yahoo Finance
Google’s next big bet: Taking drone deliveries mainstream

PlugX / Security / Will Townsend / Ciso2Ciso
International effort erases PlugX malware from thousands of Windows computers

Rang Intelligent / AI / Matt Kimball / Deccan Herald
ByteDance’s AI makes tech tycoon Zhou one of Asia’s richest women 

Rang Intelligent / AI / Matt Kimball / The Straits Times
ByteDance’s AI push makes Chinese tycoon one of Asia’s richest women


TV APPEARANCES
AI, restrictions, Google UK Antitrust investigation / Patrick Moorhead / Yahoo Finance
US restrictions on AI chips are a ‘step in the wrong direction’
Watch the Yahoo Finance clip on X

New Gear or Software We Are Using and Testing

  • Kindle Colorsoft (Anshel Sag)
  • Google Pixel Buds 2 Pro (Anshel Sag)
  • XREAL One AR Glasses (Anshel Sag)
  • Google Pixel Watch 3, 41mm (Anshel Sag)
  • Insta360 Link2 4K AI Webcam (Anshel Sag)
  • Google TV streamer – Matter and Thread features (Bill Curtis)
  • Various Matter devices (Bill Curtis)
  • ASUS Zephyrus G16 Gaming Laptop (Anshel Sag)

Events MI&S Plans on Attending In-Person or Virtually (New)

Unless otherwise noted, our analysts will be attending the following events in person.

  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead) 
  • Samsung Galaxy Unpacked, January 22, San Jose (Anshel Sag) 
  • MIT Reality Hack, Boston, January 24-17 (Anshel Sag) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead) 
  • Samsung Galaxy Unpacked, January 22, San Jose (Anshel Sag) 
  • MIT Reality Hack, Boston, January 24-17 (Anshel Sag) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • ZohoDay25, February 3-5, Austin (Robert Kramer, Melody Brue)
  • Oracle NetSuite SuiteConnect, February 6, New York City (Robert Kramer)
  • Cisco Live EMEA, February 10-13, Amsterdam (Will Townsend)
  • SAP Analyst Innovation Council, February 11-12, New York City (Robert Kramer)
  • RingCentral Analyst Summit, February 24-26, Napa (Melody Brue)
  • Arm Analyst Summit, February 18-21, San Francisco (Matt Kimball)
  • Microsoft Threat Intel Summit, February 25, Redmond (Will Townsend)
  • Siemens Datacenter Analyst Summit, February 25-27, Zug, Switzerland (Matt Kimball)
  • EdgeAI Austin, February 25-27, Austin (Bill Curtis is a speaker)
  • Mobile World Congress, March 2-7, Barcelona (Will Townsend)
  • Susecon, March 10-14, Orlando (Matt Kimball)
  • Fastly Accelerate, March 12, Los Angeles (Will Townsend)
  • Synopsys Panel Moderation, March 15, San Jose (Matt Kimball)
  • Adobe Summit, March 18-20, Las Vegas (Melody Brue)
  • Extreme Networks Connect, May 19-22, Paris (Will Townsend)
  • Zendesk Analyst Day, March 25, Las Vegas (Melody Brue)
  • Oracle Database Summit, March 25, Mountain View (Matt Kimball)
  • IBM Infrastructure Analyst Summit, March 25, NYC (Matt Kimball, Melody Brue)
  • Microsoft FabCon March 31–April 2, Las Vegas (Robert Kramer)
  • Canva Create & Analyst Day, April 8-10, Los Angeles (Melody Brue)
  • NTT Upgrade, April 9-10, San Francisco (Will Townsend)
  • Google Next, April 9-11, Las Vegas (Robert Kramer)
  • Appian World, April 27-30, Denver (Robert Kramer)
  • RSA Conference, April 28-May 1, Las Vegas (Will Townsend)
  • Nutanix.NEXT May 6-9, Washington DC (Matt Kimball)
  • Informatica World, May 13-15, Las Vegas (Robert Kramer)
  • Dell Tech World, May 19-22, Las Vegas (Matt Kimball)
  • Zscaler Zenith Live, June 2-5, Las Vegas (Will Townsend)
  • Snowflake, June 2-5, San Francisco (Robert Kramer)
  • Cisco Live US, June 8-12, San Diego (Will Townsend)
  • HPE Discover, June 23-26, Las Vegas (Will Townsend)
  • Techritory, October 22-23, Riga (Will Townsend)

Subscribe

Want to talk to the team? Get in touch here!

The post MI&S Weekly Analyst Insights — Week Ending January 17, 2025 appeared first on Moor Insights & Strategy.

]]>
Google Takes A Big Step Into User-Driven Agents With Agentspace https://moorinsightsstrategy.com/google-takes-a-big-step-into-user-driven-agents-with-agentspace/ Tue, 21 Jan 2025 15:00:23 +0000 https://moorinsightsstrategy.com/?p=45055 Like its competitors, Google continues to experiment with low- and no-code tools that harness agentic AI for business productivity.

The post Google Takes A Big Step Into User-Driven Agents With Agentspace appeared first on Moor Insights & Strategy.

]]>
Google’s new Agentspace enables business users to create productivity-focused AI agents. Google

Google has just launched Agentspace, a no-code-ish environment geared towards personal work productivity. Many of Google’s competitors are also playing in this market, so it’s no surprise that Google is getting into it. At this stage, each of the major cloud vendors now has an AI development platform (in Google’s case Vertex AI) and multiple tools geared towards different types of developers. Google’s foray into no-code for power users is different than its competition’s, and quite interesting.

What’s Cool About Agentspace

Agentspace is an extension of Google’s viral NotebookLM offering. In fact, as part of the packaging, the enterprise-grade NotebookLM Plus is the base offering. NotebookLM has been in trial mode for a while now. I like it as a means for a user to create spaces around a topic or project and then leverage LLMs to search and create new content from that space. I know the “create a podcast” feature is what gets the clicks, but the whole concept is cool and useful.

The higher-end Agentspace offerings add in new capabilities including third-party integrations (think SharePoint, Jira and Salesforce) and team space sharing. These are all good nods to how enterprise users can work together and share best practices via agents.

Agentspace also has some prompt memory capabilities that enable agentic behaviors, such as prompt chaining. It’s not really a structured workflow like we are seeing with Salesforce’s and ServiceNow’s agent development tooling. However, if you are a typical business user, it is reflective of the less structured processes that people create for themselves while performing daily tasks.

What I Am Not Sure About With Agentspace

Above all, Agentspace is a prompt-driven experience. I know that this approach works for search—and is therefore a natural fit for Google—but I am not sure that a prompt-only agent will be accepted by most users. My issue is that sometimes I need to see something to act upon it. Is just a context window or a notification enough? Also, your mileage will vary based on how well you are writing prompts—and there is a vast range of skill in that area in the average workplace. Ask anyone who does not understand good prompting how their experience with AI is going. The response will probably not be very good.

As with other platform-centric agents, I remain concerned about Agentspace’s cross-platform integration and how it will scale. For low-scale scenarios (for instance if I do only a few prompts a day), it’s likely fine. But when a whole organization is hitting agents that then need to reach back to on-prem IT resources or to another cloud, how will it perform? And what will it cost? I am optimistic that over time we will start to see more of an integration between no-code tools and AI development frameworks such as Vertex AI or AWS Bedrock. In fact, we are starting to see this with Salesforce’s Agentforce, which leverages Salesforce’s own Mulesoft technology to provide high-scale integration connectors. I am hopeful that this becomes a trend rather than remaining a one-off.

Like its competitors, Google is putting a lot of different tools out there, and some of them overlap. For example, how does one compare Agentspace to Vertex AI Agent Builder? I appreciate that in the brave new world of AI, experimenting and failing fast is a good way to innovate. But I think that all of these AI vendors should be more declarative of their intent for each of these tools. Who was Agentspace designed for, and how will it improve those users’ lives or their work? I have this issue because I believe AI should elevate the capabilities of a business user to the point where we should see some sort of blurring or blending between no-code and low-code tools. But that may not be realistic yet, considering that Google has also announced a preview of a new tool called Jules, which seems kind of like AWS App Studio.

Looking For Disruptive Approaches In The No-Code Space

Google has made some pretty bold user experience decisions when it comes to no-code, which I respect. For example, I like the NotebookLM concept of creating a more freeform workspace rather than the more form-centric approach of other no-code tools. And extending the functionality with collaboration and third-party integrations provides a real opportunity to test whether that model can be a disruptor against SaaS no-code plays. Hopefully, Google has picked the right time to push against the convention with a solution that is pretty innovative.

The post Google Takes A Big Step Into User-Driven Agents With Agentspace appeared first on Moor Insights & Strategy.

]]>
Volygon Debuts Truly Photorealistic Gaussian Splats https://moorinsightsstrategy.com/volygon-debuts-truly-photorealistic-gaussian-splats/ Tue, 21 Jan 2025 15:00:17 +0000 https://moorinsightsstrategy.com/?p=45044 Volygon's Gaussian splat technology allows it to create highly accurate 3-D images for film production, VR, AR and other applications.

The post Volygon Debuts Truly Photorealistic Gaussian Splats appeared first on Moor Insights & Strategy.

]]>
A still of a video showing the scanned Amazon Studios set Tonaci Tran

You might have heard the term “splat” or “Gaussian splat” used recently in the XR space. Gaussian splatting is a rendering technique that has been around for a long time but has found a new application in XR when paired with neural radiance fields, commonly referred to as NeRFs. This technique blends many still images from different angles using neural networks and GPU-enabled AI acceleration to quickly create a 3-D model or 3-D scene. Sometimes, these can even be generated in real time.

These techniques help accelerate the creation of 3-D assets, whether they are real-world objects that need to be brought into digital media or full scans of an entire factory for the purpose of a digital twin. They are significantly less time-consuming and use less-expensive equipment compared to the old way of using lidar-based solutions that are costly to use and process. While lidar solutions are highly accurate and capture millions of data points for enterprise applications, they can be very limiting for anyone without the time and resources to use them. While neural radiance-assisted Gaussian splatting is still a form of photogrammetry, it is designed to deliver a cheaper and faster way to create 3-D assets. This addresses key factors inhibiting the growth of the spatial computing industry and is a critical technology for XR’s success.

Numerous papers have been published on Gaussian splatting, which is an AI-accelerated form of rasterization—a fairly quick and common way of rendering graphics. The latest generation of Gaussian splats can run on smartphones leveraging a hybrid of local and cloud computing, but many of these solutions still struggle with artifacts and quality issues. (Artifacts such as jagged edges around the subject are common with low-resolution or highly compressed videos or images.) Volygon’s Gaussian splat solution, like its volumetric video solution I have previously written about, solves some of these issues for commercial purposes and creates flawless 3-D scenes for film applications. Volygon is the new company name for HypeVR, one that I believe better matches the company’s technology, especially since that tech is applicable well beyond VR now.

The State Of Splats

There are two predominant ways to make a NeRF-assisted Gaussian splat. One is through a smartphone, usually assisted with a depth camera; this is commonly achieved on iPhones, starting with the iPhone 11 Pro. However, the technology has improved with time and doesn’t necessarily require depth data to create pretty accurate 3-D assets. The other way is to capture the images with a high-resolution camera and feed those images into tools such as Nvidia’s InstantSplat and NeRF studio. Nvidia also has its own NeRF models that it has created to make things easier for developers, including Nvidia Instant-NeRF and NeRF-XL.

Some of the most popular smartphone and web apps for this, such as Scaniverse, Luma AI and Polycam, are enabling people to easily create 3-D assets faster and cheaper than ever before. However, in my experience many of these have quality limitations, which is the compromise for being fast and cheap. It reminds me of the old adage: everyone would love to have a product that’s fast, cheap and good—but you can usually have only two of the three. While I do believe these apps will improve with time, they still do generate a meaningful amount of artifacts. This has prevented them from being used in film and other industries where things need to appear perfect. By contrast, Volygon’s depth-assisted Gaussian splatting can serve the film industry, and really any industry that needs super high-quality 3-D scans that aren’t expensive to produce.

Setting up the Volygon Gaussian splat with cameras, color checkers and drones Tonaci Tran

Volygon’s Depth-Assisted Gaussian Splatting

Volygon specializes in photorealistic real-time 3-D technologies. Its primary product is a 3-D volumetric video capture solution with a proprietary codec that enables extremely high-quality real-time volumetric video. With post-processing, it can achieve even higher-quality volumetric captures, which are among the best I have seen to date. Given this background, I was pleased when the company’s CEO invited me to see the latest thing Volygon has been working on with one of Hollywood’s biggest studios, Amazon.

Amazon is working with Volygon to accelerate and improve the efficiency of virtual production. Virtual production is a fairly new concept that’s been introduced in the last few years that takes advantage of many of the latest technologies available to the graphics industry to make production cheaper and faster for studios. The use of LED video walls has been one of the major enablers of this technology, paired with the ability to render scenes on those walls that is accurate to the environment that a director is trying to create. (The Mandalorian broke new ground in this use of technology over the past few years.)

Kenneth Nakada, head of virtual production operations at Amazon MGM Studios, said, “Working with [CEO] Tonaci [Tran] and his team at Volygon has been an exceptional experience. Their commitment to delivering extremely high-quality photorealistic scanning results, coupled with a professional team utilizing state-of-the-art equipment, has elevated our virtual production projects to new heights. Their expertise and dedication have made them a trusted partner for Amazon Studios.”

Volygon can create a scan of an interior or exterior set for Amazon Studios shows, then create a full photorealistic reproduction in 3-D of the set from the angles at which cameras have already captured footage. This enables the director and the studio to easily come back to any scene using that set and reshoot since the background is photorealistic and appears the same as if it were physically still there. This drives significant efficiencies for Amazon because it reduces the costs of reshoots, which are fairly common in the industry. It also means that Amazon can utilize the same physical space more efficiently since it doesn’t need to keep a specific set around for any longer than is needed and doesn’t need to rebuild that set if reshoots are required.

Behind the scenes with Volygon capturing the Amazon Studios set Tonaci Tran

When I went to check it out, I saw Volygon’s scan of an Amazon Studios set, and it was the most flawless photorealistic Gaussian splat I have ever seen. This is attributable to Volygon’s expertise in volumetric capture and stereo-pair capture, enhanced by the company’s proprietary AI-based depth estimation algorithm, which achieves sub-millimeter-level accuracy. Usually, Gaussian splats have some artifacts or blind spots that the camera didn’t perfectly see, but Volygon’s solution seems flawless and takes less than an hour to capture for a full set, which is still fairly quick—and much easier than using a lidar scan.

Going Beyond Film Production

While I believe Volygon’s Gaussian splat solution will be a huge advancement for the virtual production space, I also believe that digital twins could benefit heavily from it. The digital twin space is becoming ever more critical for cutting-edge applications in enabling autonomy for cars, automated factories, embodied AI robots and so much more. I could even see such a solution powering a robot in your home with hyper-accurate and up-to-date 3-D maps of the space.

I also believe that this level of 3-D image quality should enable more immersive education and collaboration uses for AR, VR and MR headsets like the Apple Vision Pro and the Meta Quest. Heck, even Snap’s Spectacles could benefit from high-fidelity scans like this—no great leap considering that Snap is already partnered with Scaniverse for 3-D Gaussian splats of smaller objects.

Regardless, Gaussian splats are continuing to gain steam across the industry. I even got to experience them recently at Google using the new Android XR operating system on Samsung’s Project Moohan MR goggles; a spatial version of Google Maps used Gaussian splats to create a 3-D scan of a restaurant I had just visited using only public photos from Google Maps. Now that this technology is becoming more refined thanks to the efforts of Volygon and others, I expect to see it deployed to address many more use cases.

The post Volygon Debuts Truly Photorealistic Gaussian Splats appeared first on Moor Insights & Strategy.

]]>
Cisco Desk Pro Review: A Collaboration Device For Hybrid Work https://moorinsightsstrategy.com/cisco-desk-pro-review-a-collaboration-device-for-hybrid-work/ Tue, 21 Jan 2025 15:00:11 +0000 https://moorinsightsstrategy.com/?p=45049 This all-in-one desktop video collaboration device is designed to boost productivity in hybrid work environments. Although pricey, it delivers on that promise.

The post Cisco Desk Pro Review: A Collaboration Device For Hybrid Work appeared first on Moor Insights & Strategy.

]]>
Cisco Desk Pro Cisco

With the explosive growth of hybrid and remote work models in recent years, videoconferencing has become an indispensable tool for businesses. As employees and clients are increasingly dispersed, reliable and user-friendly video collaboration is essential. Companies have embraced a diversified approach to videoconferencing, with the majority employing three or more platforms. This multiplatform strategy allows organizations to leverage the strengths of different tools, but critically, it demands the ability to seamlessly transition among the different platforms on a single device for optimal flexibility and efficiency.

In this context, the Cisco Desk Pro is an all-in-one desktop video collaboration device designed to enhance productivity and connectivity in hybrid work environments. Its adaptability empowers organizations to choose the meeting platforms that best suit their needs while maintaining the flexibility to connect users on different services. With a connected calendar, the Desk Pro makes joining meetings on any platform incredibly easy. Users can join Webex, Microsoft Teams, Google Meet and Zoom meetings with one click.

The Desk Pro can also be deployed with a Webex-centric approach, taking advantage of the “Cisco Rooms” functionality to optimize Webex Meetings and provide advanced cognitive collaboration features. Alternatively, the Desk Pro can be configured as a dedicated Microsoft Teams Rooms system, delivering a native Teams experience with optimized features and interface. While this configuration prioritizes Microsoft Teams, it still allows users to join Webex meetings with full functionality.

How does the Cisco Desk Pro stack up to the needs of today’s hybrid worker? In this review, I will examine the Desk Pro’s features, design, performance and overall value, focusing on its suitability for today’s distributed workforce.

Technical Specifications

  • Display: 27-inch 4K (3840 x 2160) touchscreen with anti-glare coating
  • Camera: 4K Ultra HD with 12 MP resolution, automatic framing and presenter tracking
  • Audio: Integrated speaker system with three directional speakers and a subwoofer, plus an advanced noise-canceling microphone array
  • Connectivity: Wi-Fi 6, Bluetooth 5.0, USB-C (with laptop charging capability), HDMI, Ethernet
  • Operating System: Cisco Webex OS
  • Other features: Integrated whiteboard, digital signage capabilities, proximity sensor

Design And Aesthetics

The Desk Pro has a sleek design suitable for both individual desks and conference rooms. The 27-inch touchscreen is prominent, providing ample space for video conferencing, content sharing and digital whiteboarding. The all-in-one design minimizes cable clutter and simplifies setup. Indeed, it’s clear that the system was designed with easy setup in mind: from opening the box to being fully set up took less than ten minutes once my company’s Webex administrator provided me with an activation code.

In my office, the Desk Pro serves as a third screen alongside my Apple MacBook Air and a Lenovo Qreator 4K UHD 27-inch monitor. With this setup, I can easily participate in meetings on the Desk Pro while accessing other applications and workflows on my other screens. I can also multitask directly on the Desk Pro, which is especially handy for in-meeting collaboration on whiteboards.

The Cisco Desk Pro (left) in the author’s home office Melody Brue

User Experience And Performance

The Desk Pro offers a very smooth user experience. The touchscreen is responsive, and the Webex OS provides a familiar interface for organizations already using Cisco’s collaboration ecosystem. The 4K camera delivers excellent video quality, and the automatic framing ensures that all participants are visible. When the auto-brightness feature is engaged, the camera can be slow to adjust to sudden or drastic changes in lighting. However, this feature can also be toggled off, which I recommend if you are using key lights or bright lighting for a presentation or podcast. One standout feature is the camera’s ability to track the user’s movement. This is particularly useful on a long call or when presenting. If I move around within my office, the camera tracks me, allowing me to get up and stretch or stand during a presentation without adjusting the camera. Camera settings can also be adjusted manually for pan, tilt and zoom. A self-view screen in the top right corner allows users to make any adjustments before joining a meeting.

During the WebexOne event in October, I sat down with Snorre Kjesbu, senior vice president and general manager of collaboration devices at Cisco. He explained that the tracking feature minimizes fatigue during long meetings, allowing presenters to stand up and move around naturally. “The Desk Pro reflects our focus on user experience,” Kjesbu said. “It’s designed to help people work more effectively. The minimalist design keeps the technology out of the way, allowing users to stay focused. We believe the Desk Pro is a great example of how thoughtful design can improve the modern workspace.”

With its three directional speakers and subwoofer, the Desk Pro delivers excellent audio quality, making it easy to distinguish voices in large meetings. The advanced noise-canceling microphone array effectively captures voices and minimizes background noise, which is particularly useful in busy environments. Microphone settings include neutral noise removal to eliminate background noise and an “optimize for my voice” feature to remove background noise along with background speech—a handy setting in an open-seating environment or shared workspace.

Collaboration Features

I found that the Desk Pro excels in its collaboration capabilities. For instance, the integrated whiteboard does a good job of facilitating real-time brainstorming. In my experience, this feature significantly improves engagement during online brainstorming sessions compared to traditional screen sharing.

The wireless screen-sharing feature is particularly convenient, and the ability to share content from a laptop or mobile device also enhances productivity during meetings. The Desk Pro also supports digital signage, allowing the display to show company information or announcements when not in use for video conferencing.

IT Management And Security Features

The Desk Pro integrates with Cisco Control Hub, simplifying device management for IT administrators. This allows for remote monitoring, configuration and troubleshooting across multiple locations, streamlining deployment for IT staff and ensuring consistent performance for end users.

The Cisco Desk Pro incorporates robust security measures to protect data and ensure a secure collaboration environment. It encrypts data transmitted during Webex and Microsoft Teams meetings to safeguard information. Access control features, including single sign-on and role-based access control, restrict unauthorized use and maintain privacy. The Desk Pro also benefits from Cisco network security measures such as firewalls and intrusion detection systems, providing an additional layer of protection.

The Desk Pro Value Proposition

The Cisco Desk Pro is a premium device with a price tag to match. However, its features, performance and all-in-one design make it a strong option for businesses investing in high-quality meeting and collaboration tools. Its integrated display sets it apart from alternatives such as the Logitech MeetUp and the Poly Studio X30, which require users to purchase and connect a separate display. This means that users considering one of those alternatives need to factor in the cost of a compatible screen.

I also happen to use the Poly Studio X30 with a television that I already had in my office. I find the Studio X30 is really useful for participating in and viewing larger-scale meetings. And, there are advantages to having a separate screen. For instance, users can select a display that perfectly fits their space requirements and budget, whereas the Cisco Desk Pro comes the way it comes—and with a price tag of $5,495. (The smaller 24-inch Cisco Desk starts at $2,995, and the 15-inch Desk Mini starts at $1,995; the Cisco website details the full feature sets for each device.) The best choice depends on whether the user prefers the convenience of an all-in-one solution like the Desk Pro or the flexibility of a modular system.

The intuitive interface and all-in-one design of the Cisco Desk Pro make it a useful tool for improving productivity and communication. Unsurprisingly for a device from an enterprise IT vendor of Cisco’s stature, the security functions and the simplified management offered through Cisco Control Hub make life significantly easier for IT departments. Although the price may be steep, I believe the Desk Pro’s comprehensive capabilities will make it a worthwhile investment for many organizations that want to support a productive and connected hybrid workforce.

The post Cisco Desk Pro Review: A Collaboration Device For Hybrid Work appeared first on Moor Insights & Strategy.

]]>
The Future of Work and Technology: Exclusive Insights from Davos, Switzerland https://moorinsightsstrategy.com/the-six-five/the-future-of-work-and-technology-exclusive-insights-from-davos-switzerland/ Tue, 21 Jan 2025 02:08:36 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45145 Daniel Newman sees 2025 as the year of agentic AI with the ability to take AI and create and hyperscale your business by maximizing and automating processes. Daniel relays to Patrick Moorhead that there's about $4 trillion of cost that can be taken out of the labor pool to drive the future of agentics.

The post The Future of Work and Technology: Exclusive Insights from Davos, Switzerland appeared first on Moor Insights & Strategy.

]]>
Curious about what the technology industry analyst angle is at Davos?

From AI and automation to sustainability and geopolitics, Six Five Media founders, Daniel Newman, CEO at The Futurum Group, and Patrick Moorhead, Founder & CEO at Moor Insights & Strategy, are on the ground at the World Economic Forum. They are capturing the key trends and insights shaping the future of business and technology, from a slightly different perspective than what you’re used to.

Watch the latest episode of this special coverage for exclusive interviews and analysis, focusing on the “re-ordering” of priorities like sustainability, agentic AI, automation, productivity, and more.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss our content.

Or listen to the audio here:

Disclaimer: The View from Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey, everyone, welcome to Davos. Daniel Newman here. Back again and joined, by the way, first time brought my bestie with me. Patrick Moorhead, you’re here first time, first experience. What are you thinking? Take it all in.

Patrick Moorhead: Yeah. So listen, Daniel, I know I have given this event, World Economic Forum, a lot of grief prior. But I have to tell you, when it comes to the combination of meeting with senior leaders in tech, with government and finance, this is the best event for that. We’re all in one place. We are literally on one street here in Davos, Switzerland to be able to do that, and a lot of geopolitical things happening. Today a new president’s getting sworn in. I want to read the room, read the room at Davos on that, on tech, generative AI, all the above.

Daniel Newman: Yeah, it’s a big transition here, Pat. Look, the last few years it’s been all AI. Remember this show was big on sustainability. We’ve seen a fairly significant rollback on these diversity programs that really had taken the helm of a lot of technology company strategies. And you walk down the street here and right behind us, you see Microsoft. We got consulting firms like Kearney right in front of us, SAP down the road, EY. And if you go down the road, it’s like every one of these houses, it’s Qualcomm, it’s Intel, it’s Dell, it’s Workday, it’s Facebook, Meta. Got to get that right. But this really does bring the constituency of tech, and basically all the companies that are driving the implementation of this AI transformation and bringing it to the industries.

Patrick Moorhead: Yeah. Dan, you hit it spot on. And it’s not that sustainability isn’t important, it’s really a reordering of the discussion. I saw a lot of overtones even so far, that it gets back to maybe where we were seven or eight years ago, which was look at all the money we can save in power. Oh, by the way, it’s amazing for the environment. And I think we’re going to see a lot of that here, a flipping of the order. If you look at the onerous energy costs, and then if you look at the demand for energy. Whether it’s from EVs, whether it’s from hyperscaler data centers to train the next two or three generations of models, we just don’t have enough electricity. And now water comes into play to be able to cool these data centers.

Daniel Newman: Yeah. There’s a lot of very interesting stuff, and we’ve done some really interesting research. You and I, both as analysts, we sometimes wear that media hat as analysts creating media, but we’re also researchers and we do a lot of that. We spend time with a lot of CEOs. We spend time doing a lot of research. Some of the research we’ve done is like this is going to be the year, at least as I see it, of agentic. We’re hearing all about agents. Well, what are agents? Basically, the ability to take AI and create hyperscale of your business by being able to maximize and automate all these processes. Pat, we identified that we think there’s about $4 trillion of cost that can be taken out of the labor pool to drive the future of agentics. That’s one of the things I’m really interested in listening to here. The other one here is hearing how CEOs are thinking about implementing. We’ve seen these historic transformations. Remember the big data one that went on?

Patrick Moorhead: Sure.

Daniel Newman: I don’t think it was a super successful one. Then there was digital transformation, analog transformation that happened.

Patrick Moorhead: Inside joke, folks.

Daniel Newman: Inside joke. You got to watch the show if you want to get all the way inside. But the digital transformation, again, board CEOs held onto things too tight. We’re seeing it again with AI. I want to hear a little bit about how are we going to make sure AI is successful? Because you and I both know chips, infrastructure, service providers, but where’s the consumption? I mean, big deal.

Patrick Moorhead: Yeah. And this is particularly on the enterprise side, let’s break up the market into a consumer, commercial and then government. We’ve seen a lot of AI interspersed into everyday applications like X, all the properties from Meta. Apple, even though it hasn’t had a great start, is injecting that as well. But we see that mass adoption of enterprise AI is probably a couple of years away. First implementations will be through enterprise SaaS, second applications will be homegrown. And Daniel, like we’ve seen in every huge wave, there’s this insatiable appetite to change something dramatically. And whether it’s agents replacing enterprise SaaS, which probably is a five to 10-year run, because data is the biggest issue. So I’m looking for affirmation, not in the nitty-gritty here because we have leaders here, but bouncing ideas across these leaders to get feedback from them.

Daniel Newman: Yeah. A lot of prune to grow is what I call like this $4 trillion of labor taken out. Ideally, like every revolution, you see an insertion of effort that creates more productivity. So we don’t only want to see cutting, but I think what we’re seeing is that there are these monotonous, redundant roles. And originally it was always blue-collar roles.

Patrick Moorhead: Right.

Daniel Newman: Now we’re seeing it in the white-collar roles, like these knowledge roles that can be replaced or augmented by agents creating huge amounts of scale. But now we want to hear, nobody wants to hear a whole bunch of CEOs talking about cutting costs. It doesn’t make it happen. We want to hear about creating more opportunity, more productivity, more jobs.

Patrick Moorhead: Yeah, and I think that is the reality. If I look back, even desktop publishing in the ’80s was going to get rid of all the creatives. Okay? And then the iPhone was going to eliminate the need for all of these jobs.

Daniel Newman: What about more cameras?

Patrick Moorhead: No, exactly.

Daniel Newman: More cameras.

Patrick Moorhead: There we go. Hello, crew. But if you look at it, what it did is it empowered more people to do more, but the pros did more professional stuff. They did less monotonous work. They focused on even stuff that they like to do, and that’s the way that I think that this one is going to play out. We do need to have reskilling. That is a basis, as we saw in the Midwest, hence the Rust Belt. If you don’t train people for the next wave of jobs, it does have some very dire consequences.

Daniel Newman: Yeah. So there’s a lot going on here, Pat. I think you and I are going to come back, we’re going to have a number of great conversations this week. We’re talking to Yann LeCun, who’s set up with us to talk at Meta. We’re going to talk to Chuck Robbins, CEO at Cisco. We got several other panels. We’re launching our big research in partnership with Kearney here at the event. We’re sitting on panels. You and I are going to divide and conquer some stuff. If you want to see everything, you’re going to have to stay with us throughout the week, everybody. Hopefully, you’ll subscribe, you’ll be part of the Davos journey. And the newbie here will have to come back and say, “Did he fall on his sword and say it’s worth coming?” He’s at least feeling it now because you had to go back. If you look back, maybe we can actually pull up some of those tweets. This guy said some really pretty spotty, some really questionable stuff.

Patrick Moorhead: He did. Look them up.

Daniel Newman: He’s coming around.

Patrick Moorhead: I even deleted them.

Daniel Newman: He’s seeing all the tech here. He knows it’s great to be here. It’s great that you’re here with us. Thanks so much for being part of The Six Five community, Futurum, Moor insights, and hanging out with Pat and I here in Davos. We’ll see you all in a bit.

The post The Future of Work and Technology: Exclusive Insights from Davos, Switzerland appeared first on Moor Insights & Strategy.

]]>
The Six Five: Talking Microsoft, AI Regulations, Google, Meta, and More https://moorinsightsstrategy.com/the-six-five/the-six-five-talking-microsoft-ai-regulations-google-meta-and-more/ Mon, 20 Jan 2025 20:41:19 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=45142 On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss Microsoft, Google, Meta, AI regulations and more!

The post The Six Five: Talking Microsoft, AI Regulations, Google, Meta, and More appeared first on Moor Insights & Strategy.

]]>
On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. Microsoft Wants to Drive Enterprise Trial With Copilot Chat
  2. Biden Leaves Trump an AI Regulation Love Note
  3. UK Leads in Antitrust Innovation, This Time With Google
  4. Meta Lays Off Low Performers – World Shrieks
  5. Will Lenovo’s Infinidat Drive Enterprise Value?
  6. Why World Isn’t Bored with TSMC Earnings

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Watch the episode here:

Listen to the episode on your favorite streaming platform:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey everybody, we are back, episode 246 of The Six Five Podcast. It’s Friday, Pat, it’s the best time of the week. It’s always the moment of rejoice, rejuvenation, enthusiasm, excitement, passion and conversation. How you doing buddy?

Patrick Moorhead: I’m doing great. It’s kind of weird, some weeks I’m just deluged with advisory calls and everybody’s like, “I need time, I need time.” And this week I had a bunch scheduled but probably half of them were for various reasons, but some pretty good lifts. I had a personal best on bench press, that’s a plus. I did have some really good advisory calls and just getting ready for Davos in Switzerland. I fly out on Saturday.

Daniel Newman: I know why those meetings got canceled by the way, it was you posted, I can’t believe you told everybody about your new G700 and then you posted about it and then I think some of the customers were like, “Clearly this guy’s making too much money and doesn’t need us anymore.”

Patrick Moorhead: That’s right. Your post is basically, “I’m going out in my G700, I’m going to eat a lot of ribeye and then I’m going to finger wag everybody who isn’t at WEF and tell them to reduce their carbon footprint, stop eating meat because it creates too much methane gas and eat some bugs while I’m there.” No, I mean I’m looking forward, I am now officially a globalist. So yeah, it’s going to be great.

Daniel Newman: Yeah, you live in your little orbit of pretending that you weren’t always a globalist and lying around. You lost me at ribeye though, come on, gently massaged wagyu, probably dry-aged, grass-fed, grass-finished of course.

Patrick Moorhead: Yes.

Daniel Newman: You’re probably one of those guys that actually goes out with the butcher and looks at the cow. You kind of massage like, “Yeah, this is the one. This is the one.”

Patrick Moorhead: By the way, the funny part was I got congratulations on the jet, by the way G700 is fully stocked as $100 million folks. And there were some people who responded to it with laughing emoji and simply were like, “Man, that’s great. Congratulations.” Folks, $100 million, think about this, not going to happen.

Daniel Newman: Hey, hey, don’t underestimate what’s possible, don’t underestimate what’s possible.

Patrick Moorhead: I’m just saying right now buddy.

Daniel Newman: It’s the world’s best technology podcast.

Patrick Moorhead: No, it is, it is. I mean, this thing could go crazy.

Daniel Newman: Well, and I don’t think we can fully start to roll our plans out to everybody, but after 246 episodes, The Six Five may be getting a bit of a face lift in the near future, I think we could disrupt ourselves. And that’s not because our show is stale, we know it’s amazing and awesome and every week you want to know about the six big topics, but as we’ve evolved our banter as you’ve gone from fat to skinny and now muscle-y… You’re like Jeff Bezos, man, I saw Bezos in the Polo launching the Blue Origin. I’m like, “Damn, that dude is ripped.”

Patrick Moorhead: I know, I know, guy is getting jacked.

Daniel Newman: I got down and started doing pushups when I looked at that picture, I’m like, “Crap.” It’s like if he can do it, I can do it. But I’m even going to update my logo on this stupid frame that this thing is in because I hate that logo and what it represents and the idiots that helped create it and all that goodness. But hey, yeah, we got a great show, Pat. I mean, we got a lot to cover today. We got some macros, some micros, some topics, some M&A and we got some big events. We are going to be in Davos, Six Five making its first presentation, Futurum’s launching its biggest CEO AI research study of its kind. The band is getting together, we’re going to have some great conversations out there.

And Pat, there’s just a whole lot more going on in the world in general. We got the inauguration of the president, number 47, also number 45. That’s going to be a big moment. It seems that there’s going to be no politicians showing up there, but there’s going to be a whole lot of CEOs. No Democrats are going to show up, but a lot of CEOs are planning to come. Very interesting times we live in, all the CEOs have now been to Mar-a-Lago to kiss the ring. And we’re going to get a Bitcoin Reserve, that’s pretty cool. So for those of us that bought crypto, it seems that Donald Trump’s administration’s going to drive parabolic ness to our portfolios. But also, there’s some big news. Microsoft made big news this week. There’s some new AI regulation. Biden leaves the White House with a love note, love that topic. Who came up with that crap?

Patrick Moorhead: Our producers are amazing. I want to thank the back office for all the hard work they do.

Daniel Newman: Yeah, they are really good at their job, I have to say. Next level, next level. And it’s really driving this podcast into the future. The UK is pushing antitrust regulation, there’s a shocker. One of my favorite topics, big companies laying off unproductive people, I know, seems insensitive. I’ve got an opinion on that. There was a little M&A this week in the storage space, I’m going to let you sort of take that one, but we’ll talk about it a little bit more. And then of course the big tech earnings wave is led by TSMC coming out first. TSMC always the great indicator of what’s to come, especially at the sort of tier zero one of chips and infrastructure. So there’s a lot to cover today, Pat. By the way, for everybody out there, this show is for information and entertainment purposes only. And while we will be talking about public companies, please don’t take anything we say as investment advice, Pat, so much to cover, don’t do it, don’t do it, just don’t. By the way, that QUBT company, I’m just going to say this right now, the one that was a QUBT, actually, the fraud charges have started to come out today. I saw them hit. You just don’t go from making energy drinks to quantum computers in short order, it doesn’t happen. I’m not saying they’re real, Hindenburg research actually shut down this week after.

Patrick Moorhead: I know.

Daniel Newman: They were so good, they were like 90%. I was looking down the list of all their short reports and half of them were down 99.9%, meaning they absolutely nailed it. I think about 10 of them they maybe got wrong, but what a batting average. That’s a really hard thing. And the guy actually said, Pat, he goes, it was like killing him. It’s like being the guy at the craps table that plays the don’t pass line. It’s like even though statistically probabilistically it’s maybe the second-best bet on the table besides the actual pass line, so if you’re wanting to play that off the odds, but you’re just the *censored* if you actually are the guy. And so being short and being publicly sort of that sort of short research, terrible place to be, especially with the bullishness that exists in our market, people want the market to go. Pat, let’s dive in though. Satya Nadella, I mean look, it’s an agent war now, we’re in agent wars. Everybody’s talking. I think there’s going to be a huge topic this week at Davos. Pat, what did Microsoft announce this week?

Patrick Moorhead: Yeah, so Microsoft brought out another flavor of Copilot, this time it’s Microsoft 365 Copilot Chat. I know I saw a lot of snarkasm out there saying, “Oh, it’s another week, another Microsoft Copilot.” No, I feel you. And first of all, when I look at the breadth, first of all, Microsoft has a consumer entity it needs to take care of and it has a commercial enterprise, medium business, small business. And then you have vertical forms, right? You’ve got, for instance, Copilot for Sales and things like that. So I get the snarkasm and the way that I like to measure things is on is it delivering unique value? And if I look back and we do a lot of research on this, what are the key inhibitors to enterprise AI adoption? And they shift from quarter to quarter, but not as much as you might think, right? You’ve got the data management piece, you have the, “What’s the ROI?” “What’s the skill sets?” And one of them is cost, okay?

So when you look at how do you get somebody excited to pay for $30 per month per seat, if you have half a million employees, it might look, “Oh my gosh, this is a ton of money.” Now let’s say that that’s saving you $20 million a year, that cost becomes negligible, right? It’s like, “Take my money type,” type of stuff. And I think as humans we’ve seen this, I would pay a lot more for perplexity than it charges. So you have to get the organization to try it out. You need trial to demonstrate the return on investment. And what this new Microsoft 365 Copilot Chat does, it adds this pay as you go mentality, not per token, let’s say if you were in the data center, but per message, okay? One penny per message. And I think Microsoft did a pretty good job showing, and you can hit up the screenshots on X that I get laid out here, how it compares to the full up Copilot versus Copilot Chat, what would a, I’ll call it, full rotation around the planet if you are actually trying to do something for an entire day?

So for instance, a sales performance agent, the estimated cost per day doing four generative answers, four tengraphs, grounding for messages, you’re looking at 12,800 messages and that is really cheap when you’re looking at a penny. Plus, you could buy these packs, which is, I’ll call it a tweener versus pay as you go message packs going versus a full up Microsoft 365 Copilot license. So listen, I get the criticism, but I think what they’re doing is the right thing and we’ll see if it generates usage. The other thing when I step back, Daniel, I was wondering, does this indicate that Microsoft’s invested the right amount of capital expense to be able to unleash this to the world and they actually have capacity now to do this? I think we’ll figure this out as we get into the months and the quarters, but I like it.

Daniel Newman: Yeah, I mean, we know the agentic movement is really substantial. Our team has done a pretty big TCO breakdown on it, we’ve got it pegged at about $4 trillion of labor that we believe can be augmented or completely displaced by agents in 2025. We just kind of looked across the labor pool and we looked at how much of the labor we estimate could be done currently by agents and then we basically took a percentage and a fraction of that. So that’s kind of the opportunity that’s at stake. And of course four trillion in direct labor doesn’t mean 4 trillion in spend on software, the idea would be that that labor could be done at some fraction of the cost of software. And then there’s always what I call the prune to grow, meaning you find those efficiencies, reinvest in productivity gains. But agents are going to totally change the calculus by which businesses are run.

Microsoft has a massive opportunity and a real challenge in front of it. I mean, the massive opportunity for Microsoft, is it literally has all the users. The install user base, I look at the Microsoft opportunity for agents is a lot like the Apple opportunity for on-device AI. When you have all the users already there, you build the right product and you’re going to get people to use it. The business model has been in question. Look, I mean the external sort of rumor mill on the street is that there’s some concerns about the investment where people have gone in on Copilot, they’re paying the money for it, they’re not sure they’re getting the value. That’s been anecdotal, you read it across the internet and I’ve actually talked to some CIOs and a lot of them are saying like, “Yeah, we’re using it, but we’re not sure.” So this is also really interesting, Pat, because it’s like a new way to think about how to charge rather than that sort of bigger per seat fee. Now I have to say, the $20 a head or $50 a head or any of those sort of numbers, if this stuff works well and is really deriving value, it’s nothing.

Patrick Moorhead: 100% Daniel. I mean, imagine if the Word document to PowerPoint actually worked well. I’ve tried it out a bunch of times, it doesn’t work great. But I got to tell you, I am signed up for the ChatGPT Pro, right. When I want a long reasoned answer, I am paying it and I’m probably going to keep paying it until something better comes along. I mean, how much would you pay for Perplexity? You know.

Daniel Newman: I mean, at this point I’m not.

Patrick Moorhead: Oh, funny you’re not using Perplexity Pro Search.

Daniel Newman: I am using it, I’m not using the paid version.

Patrick Moorhead: You can’t use Pro without paid.

Daniel Newman: No, I’m not using the pro version, the paid version, I’m just using the free version. I have a few of them. I have Gemini Pro. I’m using that one right now, I had ChatGPT, I canceled that one. We’re kind of a Google in our workspace as our corporate platform so I’ve rolled up a lot of Google features and functionalities.

Patrick Moorhead: I pay for all of them.

Daniel Newman: Well, that’s just you though. You and I are just different that way. You’re sort of the tear everything down, geek out on it, share benchmarks. I let Signal65 do that kind of work, that’s my take. Let the geeks do the geek stuff. I mean, Pat, we are the only lab out there that has, what, I think a 72 node H200 system up and running and a 64 node MI system? Just saying out there, there are no other labs out there that have the type of cool tech stuff that we do. So anyone out there that wants to know more about that, you got to check out our lab people. Sorry, no promotion on the show, Pat, stop it, stop promoting please. This is a organic show, it’s supposed to be here to entertain people. Anyways, the agent space is going to grow rapidly, Pat, we’re going to talk a lot more about this. It’d be very interesting to sort of see adoption. I mean, it’s all the opportunity comes in the outcomes. If people are getting value in the outcomes, this type of model, just like you mentioned with using Pro, you’re willing to pay for it because there’s a ton of value in it for you. People need to get the value, they need to see the agents work. Right now it feels like a little bit of architecture across the board, not just them, Salesforce, ServiceNow, Workday, all the companies talking to agents, but I think very quickly we’re going to see sort as the data and the application and the agentic workflows become well understood and accurate and trustworthy, that business is going to just explode. I’m super bullish about it for this year.

All right man, we got to get to the next topic. We can run with this all day long, but let’s talk about AI regulation, Pat. So, what, eight days, you and I last Sunday night, eight days before we’re getting kind of a brief overview on what’s going on the Biden administration decides to push forward on some rules to basically further limit the exporting of AI chips. Now, again, this is rules on top of existing rules and the rules on top of the existing rules that were designed for really calling what’s now not much more advanced than what’s going into some advanced GPUs for gaming are now being put on these restricted lists. And let me be upfront about this, we have a real sort of difficult needle to thread because China is getting access to our most advanced semis, they always have, they always will. And there’s a lot of sort of muckety muck about how has that continued to happen. You get all these export controls and oh, it’s shipping through Singapore, well how does that happen? TSMC is shipping things to China they don’t even know, people are tearing apart boards and finding hardware inside of equipment that’s not supposed to be able to get there. So that’s kind of at the core of why this goat rodeo that is import and export controls is such a big thing with AI chips.

To be clear, AI chips are going to be the foundation in which the world’s economic growth and basically security is going to be built on the winner of AI will be the winner of the economy. The winner of AI will be the winner of national defense. But now the thing is, unlike these really restrictive rules about China, Russia, Iran, North Korea and other adversarial nations, we’ve expanded that with these new rules that basically make it very difficult for companies in Switzerland, companies in Iceland, companies in places that we would typically consider to be neutral or friendly… You can’t be more neutral than Switzerland can you, Pat? And basically making it very difficult for them to purchase and to implement next generation AI technologies. I mean, NVIDIA of course one of the biggest and most outspoken on this particular one, it’s 120 days from now it’s supposed to go into effect. Pat, I think we could talk a lot about what’s in this rule, I want to use my little bit of time that I have left before passing it to you to just say I genuinely believe this was nothing but 11th hour grandstanding by the Biden administration, not even sure Joe Biden, President Joe Biden’s lucid enough to know what he signed into law at this point. I think that the idea is not on his administration’s watch. So meaning that if China does over the next four years move quicker, we’re seeing what they’re able to do in frontier models, we’ve seen at least rumors.

Again, nothing is ever confirmed as to how things work there, but the stories that have come out about their most advanced frontier models being trained on older chips being competitive with our newest frontier models probably worried some people. But I think Biden does not want the world to ultimately walk away thinking that his presidency and his policies enabled China, North Korea or other adversarial countries in the world to gain or become on parity with us in any way around AI. In the process though he is putting business at risk, he’s putting our largest enterprises at risk. There’s a better way to approach this, it’s something we could probably talk about at another time. I think Trump will reverse this or most of this because I feel like it’s a blunt instrument being deployed to solve a problem that requires a scalpel. And we can spend more time, Pat, maybe some time on how we could actually address the need for these things, but I’ll pass that back to you.

Patrick Moorhead: So first of all, hat tip to our producers, I don’t know if any of you read history, but ex-presidents leaving the new incoming president an envelope goes back to 1993 with George Bush and Clinton, and then we saw it with in 2001 Bill Clinton to George W and then George W to Barack and then Barack to Donald Trump, and then even Trump left a note in 2021 to Biden. So this is a love note that Biden is leaving Trump. And with that said, thank you production team.

Daniel Newman: Nice job production team. Production team’s getting a bonus, they’re going to get a quarter zip Six Five media shirt.

Patrick Moorhead: Exactly, exactly. But here’s the thing, here are the facts. The facts are there are certain set of regulations that is keeping China, Russia, and North Korea and others from gaining access to technology that they could use to hurt the United States and others. And the second fact is that China is getting around it and there are countries and people who are helping it. I mean, years ago when this first BIS thing came out, I said, “Hey, look at the size of an A series. Look at the size of a graphics card.” You can smuggle those in your luggage if you wanted to. I mean, my gosh, look at how arms get over to the Middle East. I mean, it’s just like this is really, really hard. So I think the intentions are good here and I personally agree this is essentially the modern day nuclear weapons capability and centrifuges. But I just have to question, again, I think you had a great analogy, a scalpel versus blunt instrument here, I mean, listen, the BIS couldn’t even get the link at the bottom of their website to work, okay? And this is going to be the traffic cop to authorize all of this. That is an absolute freaking disaster, okay? And I don’t know who puts this stuff together, because you might get ahead of this and say, “We are staffing up, we’re going to give a 48-hour turnaround.” Because Daniel, we’re in this age where trust in institutions is at an all-time low, whether it’s the press, whether it’s government, and this needed to come out. So I also think the industry should work together to come up with some suggestions on the best way to do this. In our conversation with NVIDIA, they had some interesting suggestions, but I think this is getting very serious now. Maybe the other way to look at it, and Dan I know we’re looking at remapping our show, maybe we have a segment where we play both sides of something but-

Daniel Newman: I think I kind of did that. I think I kind of was trying to explain what-

Patrick Moorhead: Yeah.

Daniel Newman: Yeah, we tried.

Patrick Moorhead: But other side of this, because this thing is nuanced, but yeah, let’s have the industry-

Daniel Newman: No, you don’t want to be the president that basically allowed, it’s like allowing nuclear technology in the past era, I mean effectively there is kind of the legacy here, “I was the president that enabled North Korea and China to become AI global leaders because we basically gave our best IP away.” And we also know what China does when it gets access.

Patrick Moorhead: That’s pure politics.

Daniel Newman: When China gets access to our IP, they build their own and they tend to do a pretty good job of it. So they may not be able to create, but they’re certainly able to copy very effectively and they’ve done so. What was the old joke, when you talk to a certain CEO about VMware, he said something along the lines of “You know how many copies of VMware they have in China, you know how many licenses?””

Patrick Moorhead: Like one?

Daniel Newman: One.

Patrick Moorhead: Hey, the final comment I want to make is in the other side-ism here is like we saw with Huawei where defense departments saying, “Hey, they’ve got back doors. We need to get Huawei out of all western technology including their phones and their infrastructure,” people are asking, “Okay, show me the evidence.” Does Portugal want it aired out that they were the country that let all this stuff go through? And by the way, there are countries that are setting up data centers that are way beyond the capacity of the EU, of any customer coming in. And you have a lot of stuff going on in the Middle East as well, who’s going to soak up all of those? So anyways, this is a nuanced discussion, it’s not black and white. What is black and white is that China is skirting regulations and people are helping them. And the third is that anything tied to the government has a level of bureaucracy and time that will slow down commerce. This will slow down commerce based on everything I understand about it, and prove me wrong, everybody out there, if I’m wrong on that.

Daniel Newman: Yeah, absolutely. Pat, we’ve got to keep moving. We are doing the 620 today, good on us. So let’s stay with regulation, nonsense, craziness. What’s going on in the UK, Pat? Because they’re not EU anymore, but I mean, they kind of are sometimes.

Patrick Moorhead: Yeah. So I got up I think Tuesday morning, something like that, I’m supposed to be talking chips on Yahoo Finance on their opening show and about five minutes beforehand it’s like, “Hey, can you talk about the Google antitrust situation?” Always fun. But you know what the easy part is Dan is we cover antitrust so much that it’s like a copy and paste. So essentially when the UK left the EU, they had to set up a lot of their own bureaucracy and one of them is around antitrust. And as we’ve seen through Adobe getting shot down with the major acquisition they were making, the UK is trying to flex its muscle and be the top innovator in an antitrust regulation. And this time it’s with Google. And the investigation is very similar to other investigations that we’ve seen around the world, which is the following, that Google uses its monopoly power in search to unfairly limit competition from either competitors to its own applications, which could be advertising in, let’s say, restaurants that could be Google Maps versus going to an OpenTable or the French version of the open table.

And this is very similar in line with the EU’s DMA, which says there are gatekeepers, okay? The UK has a different flavor of it, but it’s essentially the DMA. Google has what’s called AdWords, I think it’s called Google Ads now, came with an acquisition, Daniel, I think they made in like 2000 back when I was at Alta Vista, which essentially is the mechanism to bid on advertising. And Google has their own property. So this will be the debate. And it’s funny, when I got on my big quote out of Yahoo Finance was, “Just ignore this thing.” I think we’ve been trained in markets that when’s the last time we saw a big ruling come down from any antitrust other than a fine that actually made a difference? So I’m kind of other-siding this with, I mean, I guess what the EU forced Apple to do on certain things kicking and screaming, I mean, I don’t think that has any material change to their short-term or mid-term business. So yeah, the market yawned and I yawned and that was the pull quote, that Yahoo Finance pulled out.

Daniel Newman: Yeah, so I love this. Pat, I don’t know if you’ve watched the big catch of the rocket again yesterday, second time with the boosters and Elon. And there’s this great visual of the EU, and I know again UK is not EU, but a lot of its kind of lagging policies are still of its time as part of the EU and the EC. And it shows a picture of Musk’s SpaceX catching the rocket and then the next one is a guy with a water bottle and a little cap and the cap is disconnected, you know how now on their water bottles when you pull them off, you can’t actually remove the cap, they kind of hang, they dangle. And that’s the sort of span of innovation that goes on there. If you actually listen to the Rogan Pod with Mark Zuckerberg, Mark actually came out pretty clean for the first time. Again, new Zuck, I don’t know who this guy is. This is Red Pill Zuck, I think, the red pill Zuck.

Patrick Moorhead: Yeah, which Zuck?

Daniel Newman: New Zuck comes out and he’s kind of talking about how basically they just stifle innovation and they just fine people. So basically, just so you know, I can get done more quickly with this topic, not because it’s not a good one, but because you covered the technical part of it pretty well is that the US innovates and the EU fines, and they basically penalize because they haven’t figured out how to enable it. And by the way, maybe they do know how to do it, but they just don’t want to do it. And so alternatively, no companies are really being built there of substance. And then meanwhile, when US companies come there bring their innovation there, they sort of create these tolls and the tolls are in a version of fines and regulation and US companies pay it and it raises money for the EC and the EC, but in this case, in the UK… I mean, look, there’s some real fundamental things with Google that need to be looked at, just like the real fundamental things with Apple that need to be looked at. These companies have built these really, really deep interconnected modes. What I think-

Patrick Moorhead: Concentration is not good for innovation, it’s just not.

Daniel Newman: And the one thing I’ll leave this at is that effectively antitrust law needs to be redone. And I’ve said this and I’ll say it and I’ll keep saying it is that the fact is it was originally intended for two things, to increase competition and remove consumer harm. And the fact is in the process of reducing consumers aren’t always harmed. The consumer experience across the Apple ecosystem or the Google ecosystem or the Microsoft ecosystem is generally a better experience. So you have these completely unified connectors between different parts of the ecosystem connecting you between devices and locations and experiences and it’s good. And the thing is, by actually breaking that up to give people the opportunity to compete in parts opens all kinds of security vulnerabilities, it opens up more friction in the experiences to get between an app and a device and the security and the payment. So the antitrust rules just suck. They suck because they don’t work. If you want people to have great experiences, you actually somewhat are an enabler of anti-competitive situations because these companies build these motes and the motes are great experiences, people want that. So nobody cares. So just keep fining EU, that’ll get you where you want to go eventually, you can raise taxes to 120%, that’s the plan. You can tax everybody and then tax them again when they spend the money they don’t have.

All right, we got to keep moving, speaking of big tech and Mark Zuckerberg Meta puts out an announcement this week, and by the way, not the only one, we’ve heard some things from Microsoft about some layoffs, but Meta is reducing 5% of its workforce. But over the last few weeks, Pat, you’ve probably heard DEI programs being shuttered everywhere, basically almost every company except maybe Apple has decided, every major company, tech company has decided to either roll back or completely eliminate this effort. Now this to me, by the way, is just an indication of when you see a CEO that you think is really on board with your political view spectrum, right or left, don’t care, just realize that CEOs are really great chameleons for returning shareholder value. So if that’s the right trend line, they’re going to get on board. When you see the pivot though, don’t be surprised because right now what you’re getting is a Trump pivot, you’re getting a MAGA pivot and people may or may not love that but what’s happening right now is we’ve kind of gone from a very equity, equality, diversity driven sort of structure of hiring to a complete flip to meritocracy.

And so basically what Meta announced is 5% of the workforce, which isn’t a ton, but it’s a meaningful number of a company that size, are going to be let go strictly based on non-performance, something that hadn’t really happened much over the last four years because frankly it would’ve been very difficult, companies would’ve been chastised, canceled. If the thought process was that the elimination was being done for reasons that were just merit, especially if it affected what are going to consider to be protected groups. This is a really interesting tightrope we’re walking right now. As CEOs of companies, Pat, you and I, we constantly face the situation of trying to decide how to make sure we incent performance, how to make sure that we always are building a culture that rewards the highest performance company, doesn’t sort of cater to people who are dragging the organization along. And also Pat, and I’ll kind of just say this and maybe kick it your direction, but starting with Musk, starting with Hock Tan, forget even the whole trend lines of DEI and non-DEI, just CEOs that have basically come in and said, “We’re bloated. This company’s over-staffed, we don’t need this many people, we can run with less.” I know there was a lot of sort of we media destruction of X, but from what I can tell, X is going to be worth more than Musk paid for it, it’s going to end up being more valuable. We’ve seen with VMware, I think Hock Tan eliminated more than half the employees now that one’s rolling and running.

So anyway, I think this is great. I’m not going to say, I think companies need to always be thinking about diversity, they always need to be thinking about hiring the best people and of course from diverse backgrounds that can bring more insights to companies. But I think meritocracy wins. I think companies perform great when they’re built for performance. That’s what our great innovation that we just talked about in the last segment comes from. I encourage it. I think that companies, and I think with AI path, this is only going to accelerate. I think with agents and AI, 5% might become 20% for a lot of companies in the next year or two.

Patrick Moorhead: This gets me back to a quote that I’ve been seeing a lot, hard times create strong men, strong men create good times, good times create weak men, and weak men create hard times. When I came out of college in 1990, the job market was absolute *censored*, okay? It was hard to get a job and you were lucky to have a job, okay? I also grew up in the Midwest, which is the Rust Belt, which is the fentanyl capital of the world where a ton of people lost their jobs. And with this amazing increase in GDP per person, we essentially even globally have eliminated most of poverty. Have you ever seen the curve, dude? I mean, it’s unbelievable. So I think this cycle, we’re not even in a bus cycle, Dan. It’s like a little blip cycle. We over-hire during the pandemic and this is a rebalancing, and like you so astutely said, AI is kicking in this work from anywhere thing, companies allowed it and promoted it, got out of hand, they didn’t feel like they were able to manage their troops effectively enough. So we’re here. Yeah, you’re seeing the rubber banding. Everything is circular, folks. One of the reasons I’m a student of history is because these big things don’t surprise, this shouldn’t surprise anybody. So net-net he’s making a business decision and my gosh, have you ever thought that eliminating layers and eliminating people actually speed stuff up? I see it every day. I mean, I was in the most bloated company you could work for back in 1992, which was AT&T, fricking bloated and huge, right? This was post-breakup, but I see it and even, gosh, I think I managed more than a thousand people when I was at AMD at one point and it was like stuff sped up after we did mass layoffs when we changed the process and when we prioritized. So this is where we’re at, the new Zuck is interesting. I can’t believe he didn’t stop by when he was in Austin for Joe Rogan, I’m going to need to have a talk with him.

Daniel Newman: When we get the new format, I mean, I think we are going to be talking to his right hand in AI in the near future.

Patrick Moorhead: Yeah, we’re going to be talking to Yann LeCun, firebrand, Musk hater, but one of the most powerful people in artificial intelligence right now.

Daniel Newman: Yeah, I look forward to that one. Everybody out there, keep tuned in for that one, that’s going to be dropping on The Six Five network. Very exciting. We’ve got some great conversations next week. I’m jazz dude. I don’t like leaving on Saturdays, I’m not going to lie. It definitely makes for these weird weeks. By Wednesday you’re like, “Is it ever going to end, the week?” On Wednesday night. But this is a once a year pilgrimage, we’re going to make it rain. All right Pat, we’ve got a couple more topics and of course we’ve taken forever, fiveever, Six-Fiveever to get here. But there was an acquisition made this week, a very, very smart tactical one. I think one of your analysts, Matt Kimball, did a really nice teardown, breakdown, the producers are great producers with their new shirts, can put that in the show notes by the way, but what happened? Who acquired who this week, Pat?

Patrick Moorhead: Yeah, a little bit of a backdrop. Lenovo is the market share leader in PCs. They’re also the market share in lower end storage, and that’s under $25,000. But where Lenovo has struggled is in the enterprise data center for various reasons. They’re very competitive in small and medium businesses, they do pretty well with governments, but they just struggled in this area. And it’s a combination of product fit and go-to-market. I know from talking to the leaders at Lenovo, they’re doubling down on getting into the enterprise data center in a much bigger way. And one of the product gaps they had was high-end enterprise storage. And they’re currently partnering with NetApp on this, having a lot of success in China. I do believe that Lenovo is also the ODM on a lot of NetApp’s equipment. It’s not a challenge, again, in the enterprise for Lenovo and China, just in even ASEAN countries and the West and even South America. So what Infinidat brings, and it’s interesting, they’ve been around since 2011, they’re not this recent startup that came up, is they have a full range of enterprise storage solution and it’s a combination called the InfiniBox SSA. That’s an all flash option, they have a hard drive option. They even offer cyber resiliency, which is the thing that all storage providers are trying to get into, but it’s a rabid market at this point as well.

So we’ll have to see. I mean, there’s a lot to be determined on this, like, what happens with the NetApp-Lenovo relationship? I’m hearing back from unnamed sources that it’s a non-issue with them, and I don’t know if that’s corporate speak or not. But this does increase the chance that Lenovo can increase their level of success in the storage market. When it comes to the enterprise, I think it’s more of a go-to-market. I’ll say out of the other side of our mouth, it’s having the content, it’s having the case studies, it’s having the sales coverage that I think is going to make the difference because, man, competition is tough. You look at Dell, you look at HPE, even Cisco’s getting invigorated in the compute market here that comes with storage, it’s a really tough market.

Daniel Newman: Yeah, so good comments. Let’s add in the show notes too, Camberley Bates on my team wrote a really nice research notepad, this was all about addressing the higher end, addressing the fact that certain parts of their storage portfolio are really relabeled of other people’s storage technologies. This gives them that high-end sand, high-end data protection that has been missing from the portfolio rounds out the Lenovo ISG business. And I think it’s a good home for where Infinidat lays out eventually and ultimately. And the only thing I’ll add to that is this whole storage segment is going to change massively. So there’s sort of the ability, it’s kind of X86 and the CPU versus the GPU and the AI accelerator. There’s going to be a near term era of basically continuing to build out what is sort of really your traditional storage arrays, and then there’s this neo storage technology that’s being developed for the AI era and it’s going to look very different. So I think Lenovo is sort of a addressing and rounding out its core sort of X86 CPU and storage attached and networking business, and I think next they’re going to really need to be thinking about how to round out the future of this kind of neo data managed platform that you’re seeing with the vast Wekas and even the next generation higher end NetApp platforms and et cetera. So a good move, good landing for Infinidat to be very interesting. This should close, I don’t see any issues with this one.

All right buddy, let’s round this out, Pat, TSMC. So we are heading to Davos, the inauguration and next week will be the last sort of quiet earnings week before the tech boom for about four or five weeks after that just absolutely explodes. But every quarter, Pat, TSMC does a couple of things to sort of give us a glimpse into the future. First and foremost, they report every month. Because of where they’re traded, they report every month a revenue number so there’s really never a big surprise when TSMC reports, but then they come out with a quarter, and this is where they share some guidance, they tend to share a little bit more about node density, where buyers are buying in on which node, which interestingly enough I think we’ve seen now their 3, 4, 5 are now close to 70ish percent. The vast majority of the business now is on the leading edge. But I mean realistically, Pat, it was very, very strong in a very condensed, concentrated part of the business. So once again, high performance, which is inclusive of their AI, was the big, big winner for the company. It outperformed again. Probably one of the most interesting statements that was made by the TSMC CEO was his 40% estimated CAGR for AI accelerators providing a really bullish outlook, not just for Nvidia and the GPUs, but also the whole Broadcom, Marvell, what’s going on with Microsoft, Amazon, all the different accelerator manufacturers or the companies that are making custom AI chips. This seems to be a part where he really thinks it’s going to buoy the long-term growth of TSMC.

Ironically though, smartphones, Pat, is still down, higher single digits. I mean, we have not seen the recovery on devices that we expected. The inventory took longer to sell off, that number’s not really growing. So when you see TSMC crushing it is a one-to-one AI is crushing it. The AI business is just good. And now the company’s going to be making big CapEx investments to expand capacity, they’re trying to address a lot of packaging needs with CoWoS and CoWoS-L and the next generation of CoWoS. And of course I always say that even though they’re not doing the memory part, this is also really a good indicator of the future of HBM. I mean, if the AI business is this strong, the HBM business is going to follow this trendline, Pat. But my take is the chip companies that are heavily connected to the AI chipmaking, Pat, they’re going to have a good quarter.

Patrick Moorhead: So the world is not bored with TSMC earnings, but I am. I mean, there’s no insights. Everything continues, right?

Daniel Newman: I thought I gave some good insights.

Patrick Moorhead: No, no, you did. But I mean, NVIDIA is moving to CoWoS-L.

Daniel Newman: Yeah, I get it. I think it’s more it’s the leading indicator of what part of the semiconductor industry is doing well, and I think it’s when you get the kind of bubble bears out there saying AI is a bubble and CapEx is going to collapse and nobody’s buying AI, and then you look at these numbers, Pat, you’re kind like, “Yeah, but someone’s buying it. I mean, someone’s buying it.” By the way, I mean, I saw a number, look, I mean the number in this Yahoo note said Apple and Nvidia, they’re spending 38 to 42 billion on technology and capacity this year, up way more than expected. So I mean, the CapEx rolls on. And this also obviously has to fuel the chip equipment makers, right? ASML, Applied, all those companies have to be really feasting on this as well.

Patrick Moorhead: They have. What would be interesting, the only thing that would be interesting to me is they give some indication that Microsoft’s pulling back. Amy Hood talked about a leveling out in the second half, and then the question is what actually does that mean? To support what level of end user demand? Now that would be interesting to me. Or a spike up, we’ve had this permafrost on the industrial IoT including non-EV cars, right? That would be interesting. So I don’t know, man, it’s a big yawner to me, but I appreciate you bringing the juice on this.

Daniel Newman: I’m going to convince you this isn’t boring at some point that it’s not. But yeah, no.

Patrick Moorhead: No, it’s not always boring, it’s like this one was boring because it’s like okay, same thing I said last quarter and the month before and the month before that. Hey, hyperscaler enterprise data is big. Ooh, no way.

Daniel Newman: Well, Pat, when I was teasing out our new big, big survey that we’re doing on this AI and CEOs, I said, the big finding is CEOs think they’re going to use AI. I mean, so breakthrough kind of stuff. CEOs are going to use AI. Anyway, great show buddy. It’s a lot of fun. Like I said, as we wind down this exact format, it is just going to get better from here, it’s going to get better. We’re going to get more unfiltered. Pat’s going to get some packs finally, so he can start benching some real weight. I’m going to push him here, he’s going to get to 300 this year. He’s getting to 300 this year.

Patrick Moorhead: Maybe a derived 300, which means I got to push 250 for three and I just, dude, I don’t know, we’ll see, we’ll see.

Daniel Newman: We’re up in the creatine to 50 grams a day and I’m just going to start drinking TRT.

Patrick Moorhead: Yeah, yeah, I’m just wondering if I get my big bottle of blue anabolic steroids through customs.

Daniel Newman: Yeah, I don’t know if you’re going to ship yours. I generally stop over in Turkey before I go, get some from one of those local pharmacies. All right, everybody, I got to go, I got a meeting or something like that. Hit that subscribe button, join us for all the other episodes of The Six Five podcast. We appreciate you being part of the community. Look out for our great coverage from web each and every minute. Just during this show we booked two more amazing interviews, Pat, very exciting times.

Patrick Moorhead: I know.

Daniel Newman: Yeah, love you bestie. I’ll see you later.

Patrick Moorhead: See you, bro. Take care.

Daniel Newman: Bye.

The post The Six Five: Talking Microsoft, AI Regulations, Google, Meta, and More appeared first on Moor Insights & Strategy.

]]>
RESEARCH PAPER: AI in the Modern Enterprise https://moorinsightsstrategy.com/research-papers/research-paper-ai-in-the-modern-enterprise/ Fri, 17 Jan 2025 17:38:10 +0000 https://moorinsightsstrategy.com/?post_type=research_papers&p=45134 This report explores enterprise IT organizations’ challenges & how hybrid cloud environments with modern AI-ready infrastructure are a solution.

The post RESEARCH PAPER: AI in the Modern Enterprise appeared first on Moor Insights & Strategy.

]]>
We’re in perhaps the most dynamic era of enterprise IT. Modernization initiatives have been rescoped and accelerated to support generative AI projects, which have captured the attention of every executive with good reason. Gen AI promises to transform businesses in ways we haven’t witnessed.

However, as the need to accelerate and alter modernization efforts to support this new wave increases, IT budgets are only rising incrementally at best. Sustainability is another variable in the equation. While AI initiatives require more compute, storage, and other resources, CIOs are tasked with lowering power footprints to drive sustainability goals.

How can enterprise IT organizations simultaneously achieve modernization, AI, and sustainability goals, which seem to directly contradict one another? Moor Insights & Strategy (MI&S) sees the solution as rooted in infrastructure.

Outdated operating stacks powered by outdated hardware and processors unable to deliver the required performance, agility, security, and targeted acceleration are, in some cases, used as the building blocks for the AI-driven workloads running the modern business. This is a recipe for failure.

This research brief will explore enterprise IT organizations’ technical and operational challenges and how technology vendors are responding with hybrid cloud environments powered by modern AI-ready infrastructure. Further, it will evaluate how Nutanix, Dell, and Intel have partnered to deliver the Dell XC Plus running the Nutanix Cloud Platform (NCP) and GPT-in-a-Box powered by AI-accelerated Intel Xeon CPUs.

Click the logo below to download the report:

AI in the Modern Enterprise

 

Table of Contents

  • Summary
  • Modernization and AI — Complementary Yet Competing
  • Can IT Modernization and AI Operationalization Occur Simultaneously?
  • Where Do We Get Enough Power?
  • The Optimal AI Foundation Begins with the Cloud
  • Nutanix Cloud Platform — Simplicity Through Abstraction
  • Dell XC Plus — Performance and Security
  • Intel Xeon — Modernization Starts in Silicon
  • Managing the Modernization-Plus-AI Journey
  • Call to Action

Companies Cited:

  • Nutanix
  • Dell
  • Intel

The post RESEARCH PAPER: AI in the Modern Enterprise appeared first on Moor Insights & Strategy.

]]>
RESEARCH NOTE: Qualcomm’s Growth Targets and Execution https://moorinsightsstrategy.com/research-notes/qualcomms-growth-targets-and-execution/ Wed, 15 Jan 2025 22:49:02 +0000 https://moorinsightsstrategy.com/?post_type=research_notes&p=45115 Over the last few months, Qualcomm has held two major events to inform the press, consumers, and investors about the state of its business. At its Snapdragon Summit in Hawaii, Qualcomm announced the next generation of its mobile chipsets and automotive solutions, then it followed that up in New York City with its Investor Day. […]

The post RESEARCH NOTE: Qualcomm’s Growth Targets and Execution appeared first on Moor Insights & Strategy.

]]>
Qualcomm CEO Cristiano Amon at Investor Day

Over the last few months, Qualcomm has held two major events to inform the press, consumers, and investors about the state of its business. At its Snapdragon Summit in Hawaii, Qualcomm announced the next generation of its mobile chipsets and automotive solutions, then it followed that up in New York City with its Investor Day. I was fortunate enough to attend the Investor Day, where Qualcomm executives reviewed the company’s progress over the last three years since the last Investor Day it held.

New Products — Mobile and Auto

Every year, Qualcomm announces some of its upcoming products for the next year at its annual Snapdragon Summit, usually held in Hawaii. For the latest version of the event, the focus was squarely on the company’s biggest and fastest-growing businesses—mobile and auto. In each case, the big news came in the form of a major update to the CPU architecture thanks to the company’s new second-generation Oryon CPU cores. This marks the first time that Qualcomm has introduced Oryon to its mobile and automotive products, representing significant performance improvements.

Qualcomm’s Snapdragon 8 Elite mobile performance claims

These new high-performance, low-power CPU cores inside the Snapdragon 8 Elite have significantly improved performance over the last generation of Snapdragon 8 Gen 3. This is possible thanks to the second-generation Oryon architecture and a process node shrink, which enables one of the highest-clocked Arm mobile processors in the world—faster even than Apple’s A18 Pro series. Qualcomm claimed that against its own Snapdragon 8 Gen 3, the Snapdragon 8 Elite is 45% faster in single-threaded and multi-threaded applications and 44% better in power efficiency. The Snapdragon 8 Elite has already found its way into a bunch of OEM devices including the Xiaomi 15 Pro, ASUS RoG Phone 9, OnePlus 13, iQOO 13, Realme GT7 Pro, and Redmagic 10 Pro. I expect that it will become the default SoC for flagship Android phones, competing directly with MediaTek’s Dimensity 9400.

On the automotive side, these same cores also power Qualcomm’s new Snapdragon Ride Elite and Cockpit Elite platforms for future vehicles. Snapdragon Ride Elite is Qualcomm’s ADAS solution, which is custom-built using various pieces of Qualcomm IP and is an ASIL-D-compliant architecture. This ADAS platform is part of Qualcomm’s efforts to enable Level 3 autonomy with the potential for Level 4/5 in the future. Qualcomm uses the same SoC to power the Snapdragon Cockpit Elite for infotainment. These chips combine with others for functions including connectivity to create the complete Snapdragon Ride digital chassis solution. This is Qualcomm’s platform for the software-defined vehicle of the future and is at the heart of the company’s strategy in its ever-expanding automotive design pipeline.

Qualcomm Executives, left to right: Durga Malladi, Alex Katouzian, Cristiano Amon, Akash Palkhiwala, Nakul Duggal, and Alex Rogers

Investor Day

Speaking of the automotive design pipeline, Qualcomm updated investors on the progress of its automotive business, including yet another increase in its pipeline—upgrading it to $45 billion. This is up from $30 billion in 2022, showing the continued growth of the business. The company also talked about how it would further lean into its Snapdragon Ride Flex platform, which enables both ADAS and cockpit functions with a single chip for more cost-effective automotive solutions. Qualcomm expects Snapdragon Ride Flex to reach OEM start of production in 2026.

At the event, Qualcomm’s head of Automotive and IoT, Nakul Duggal, detailed that the company expects growth for the automotive business to come from ADAS solutions. Qualcomm then brought out a parade of partners including Mercedes-Benz, BYD, and many others to prove its industry strength among the world’s leading automotive OEMs.

PC and XR Growth Opportunities

During Investor Day, Qualcomm also touted a lot of its growth opportunities in non-smartphone markets such as PC and XR. Interestingly, these two markets are very different for Qualcomm. For XR, Qualcomm has nearly 100% chipset market share, but in a fairly new and small market. Meanwhile, the PC sector is extremely mature in terms of growth, but Qualcomm has a very small market share; it hopes to grow in PCs as that market transitions towards AI PCs and especially Copilot+ PCs.

Alex Katouzian talking about the Snapdragon X series of processors

Qualcomm’s Alex Katouzian addressed the AI PC opportunity by talking about the company’s efforts to democratize AI PCs with a new $600 entry point in addition to its existing good/better/best pricing structure for 8, 10, and 12 CPU cores. The company expects to be able to address 70% of notebook volume by 2026 and have more than 100 devices commercialized by then as well. This is part of Qualcomm’s assertion that roughly 100 million laptops per year will be Copilot+-capable PCs priced above $500 by 2029. Qualcomm is confident about its competitiveness in this space thanks to its ability to ship a 40 TOPS NPU within all of its AI PC processors regardless of price. Qualcomm also teased that its upcoming third-generation Oryon CPU will be even more powerful and efficient.

Qualcomm’s performance and power curves for the Oryon CPU

When it comes to XR and spatial computing, the company has dominated the industry and was even at the core of Google’s new Android XR announcement that I recently wrote about. In this market, Qualcomm is closely partnered with both Meta and Google, the only true challengers to Apple’s Vision Pro and XR aspirations. Qualcomm’s view is that XR’s growth—whether in VR, MR, or AR—is driven by generative AI. This aligns with Google’s and Meta’s visions as well. Unlike any of its rivals, Qualcomm can also enable XR growth with its other low-power compute products in smartphones and PCs, which could either act as additional compute or be leveraged to rightsize the chips for the solution. In the bigger picture, Qualcomm is uniquely positioned to take advantage of spatial computing as the next compute platform and to enable its partners to grow in this market.

IoT

Qualcomm has rebooted its approach to IoT many times, partially because of strategic acquisitions like NXP falling through, but also because the IoT market is always changing. To put it more bluntly, the company’s strategy has been all over the place during the 10 years that I’ve covered it as an analyst. But this time around, Qualcomm is focused on what it can enable thanks to AI. This means providing both AI hardware and sensors at the edge, but also solutions that combine them in a coherent strategy. This means bringing together its chipsets, software stack, Qualcomm AI hub, Aware platform, and other solutions in a single approach that has been developed by working closely with partners such as Saudi Arabia’s Aramco. These industrial-grade solutions will deliver up to 100 TOPS of AI performance with the IQ series of products and be edge-AI ready. Qualcomm is also focused on scale, which is one of the biggest challenges in IoT; to this end, the company is working with module makers, ODMs, OEMs, distributors, SIs, and end-user companies across different verticals.

Qualcomm CFO and COO Akash Palkhiwala

Financials and Outlook

Qualcomm’s CFO/COO Akash Palkhiwala wrapped up Investor Day with a very pointed World Series joke about the people in the room—many of whom came from San Diego or New York City—being united in their hatred of the Los Angeles Dodgers. More seriously, Palkhiwala talked about strong financial growth and execution, as shown by fiscal 2024 non-GAAP revenues being up 9% while non-GAAP EBT and EPS were up 20% and 21%, respectively, over 2023. He also focused on the last five years of growth in those areas, highlighting 2x revenue, 3x EBT, and 3x EPS growth over that same period. He used these as proof that Qualcomm has delivered on its diversification strategy while keeping its core business and growth steady.

Looking forward, Palkhiwala also focused on the automotive sector as a key driver for revenue and profit growth, with an expectation that ADAS will ramp starting in 2026, and the automotive business will deliver roughly $8 billion in revenue for Qualcomm by 2029. He also reaffirmed that PC and XR will also represent larger revenue for the company, with XR alone projected to bring in more than $2 billion in revenue by 2029.

In total, Qualcomm expects that by 2029 its whole IoT business (which by its definition includes PC and XR) will represent $14 billion in revenue, with $4 billion coming from PCs, $4 billion from industrial IoT, and $4 billion from other sources including networking, tablets, headphones, and smartwatches.

CES 2025

At CES 2025, Qualcomm rounded out its Snapdragon X line with its entry-level offering targeted at PCs $600 and up. We also saw Lenovo and Geekom announce Snapdragon X Elite desktops, the first of their kind for the industry, running Windows. The Snapdragon X also delivers on Qualcomm’s promise of maintaining 45 TOPS of AI performance across its entire lineup. This also means that Qualcomm will enable Copilot+ PCs as low as $600 and premium notebooks at lower prices, for example the Zenbook A14, which is ultra-light at under 1kg and features Copilot+ features with an OLED display, 32GB of RAM, and 1TB of storage for only $1,099.

Qualcomm also announced partnerships with Panasonic, Hyundai, Garmin, Leapmotor, and Desay SV for various automotive solutions. These announcements included digital cockpit, ADAS, and digital chassis solutions all around the vehicle. In addition to automotive, Qualcomm also announced an on-premise AI solution for edge AI and IoT solutions using the company’s AI100 accelerator PCIe card.

Considering Qualcomm’s execution and market momentum, barring a global recession I believe its projections to be pretty credible. In my view, Qualcomm’s projections for XR seem fairly conservative, while the company might be a tad optimistic on the PC side—though it could still hit those numbers considering global annual PC volumes. While I do believe that Qualcomm’s competition is fiercer than ever, CES 2025 showed that the company is very much on track to execute its strategy, and I expect that we’ll see even bigger announcements from it later this year.

The post RESEARCH NOTE: Qualcomm’s Growth Targets and Execution appeared first on Moor Insights & Strategy.

]]>
The Value of Custom Silicon in the AI Era https://moorinsightsstrategy.com/mis-insider-podcast/the-value-of-custom-silicon-in-the-ai-era/ Wed, 15 Jan 2025 22:05:28 +0000 https://moorinsightsstrategy.com/?post_type=mis_insider&p=45122 Will Townsend delves into this transformative topic, exploring the evolution, impact, and future of custom silicon with insights from Marvell Technology’s Sandeep Bharathi, Chief Development Officer, and Will Chu, SVP & GM of Custom Compute and Storage.

The post The Value of Custom Silicon in the AI Era appeared first on Moor Insights & Strategy.

]]>
Generative AI and next-gen applications are transforming the tech landscape, demanding unprecedented levels of innovation. As data grows exponentially, our tech stacks must evolve, from the cloud to data centers and network edges. To unlock the full potential of AI, custom, purpose-built silicon is essential—offering performance, power efficiency, and scalability that traditional solutions simply can’t match.

In an engaging conversation, Will Townsend delves into this transformative topic, exploring the evolution, impact, and future of custom silicon with insights from Marvell Technology’s Sandeep Bharathi, Chief Development Officer, and Will Chu, SVP & GM of Custom Compute and Storage.

💡 Key Highlights:

Why Custom Silicon?
Hyperscalers and enterprises are adopting custom silicon to keep pace with massive workloads. With Moore’s Law slowing, custom silicon enables leapfrog performance and power efficiency.

Defining Custom Silicon – Marvell outlines a spectrum:
– Greenfield designs with unique IP
– Optimized chips with enhanced cores
– New device lines supporting diverse customers (e.g., AI accelerators, CXL, NICs).

The Benefits of Custom Silicon
Investment protection, programmability, and unmatched performance per watt are just a few—especially critical for power-hungry AI workloads.

Marvell’s Vision for the Future
Cutting-edge advancements like 3.5D, optical integration, new materials, and multivendor chipsets are paving the way for the next wave of innovation.

Watch the episode here:

Or listen here:

Disclaimer: This show is for information and entertainment purposes only. While we will discuss publicly traded companies on this show. The contents of this show should not be taken as investment advice.

The post The Value of Custom Silicon in the AI Era appeared first on Moor Insights & Strategy.

]]>
RESEARCH PAPER: Nokia — Delivering Operational Cost Savings Through Autonomous Networks https://moorinsightsstrategy.com/research-papers/research-paper-nokia-delivering-operational-cost-savings-through-autonomous-networks/ Wed, 15 Jan 2025 16:07:08 +0000 https://moorinsightsstrategy.com/?post_type=research_papers&p=45110 This report explores how Nokia is positioned to deliver on the promise of intent-based autonomous networks designed to lower operational costs and unlock new monetization opportunities.

The post RESEARCH PAPER: Nokia — Delivering Operational Cost Savings Through Autonomous Networks appeared first on Moor Insights & Strategy.

]]>
Communications service providers (CSPs) face daunting economic and operational challenges. On one hand, capital expense levels and management complexity tied to next-generation mobile core, radio access network, and backhaul deployment continue to be significant despite the introduction of disaggregated infrastructure and Open RAN systems. The sky-high cost of licensed spectrum used for data transmission and service support also adds to this burden. On the other hand, operator monetization beyond access remains an elusive endeavor because of legacy constraints, persistent security challenges, and friction tied to a historical lack of developer tools to facilitate fixed, mobile, and converged network programmability.

AI and automation combined have great potential to help CSPs unlock significant operational cost savings, new monetization opportunities, and delightful subscriber experiences—and drive returns on upfront capital infrastructure investment. However, equally important are the strong analytics support, prescriptive telco security measures, and network programmability that are necessary for architecting and delivering fully intelligent autonomous networks at scale. The bar is high, and few network infrastructure providers have the depth and breadth of capabilities to deliver what is needed today.

Moor Insights & Strategy believes that Nokia is well positioned to deliver on the promise of intent-based autonomous networks designed to lower operational costs and unlock new monetization opportunities. The company has formulated a complete, end-to-end vision to address this opportunity. Consequently, Nokia’s approach to autonomous networks is supported by: significant investments in AI ops and generative AI as fundamental building blocks; a clear grasp of what is needed from an analytics perspective to refine operational controls; deep capabilities within its security portfolio to safeguard operators and end users; and the underlying orchestration required to facilitate developer innovation through network programmability.

Click the logo below to download the report:

Nokia

 

Table of Contents

  • Summary
  • AI’s Critical Role in Autonomy
  • The Importance of Analytics
  • Telco Security Considerations
  • Empowering Developer Innovation
  • Call to Action

Companies Cited:

  • Nokia

The post RESEARCH PAPER: Nokia — Delivering Operational Cost Savings Through Autonomous Networks appeared first on Moor Insights & Strategy.

]]>
Samsonite Chooses Fibbl For Interactive 3-D AR Shopping Tools https://moorinsightsstrategy.com/samsonite-chooses-fibbl-for-interactive-3-d-ar-shopping-tools/ Wed, 15 Jan 2025 15:00:35 +0000 https://moorinsightsstrategy.com/?p=45039 The big luggage maker is the latest consumer brand to use Fibbl for high-quality 3-D images of its products for e-commerce — a trend which is primed for rapid growth.

The post Samsonite Chooses Fibbl For Interactive 3-D AR Shopping Tools appeared first on Moor Insights & Strategy.

]]>
A 3-D model of one of the Samsonite suitcases supported by Fibbl Samsonite

Probably everyone in the world who has traveled anywhere knows the Samsonite brand, and luxury travelers are familiar with Samsonite’s Tumi bags. So, it comes as no surprise that one of the world’s leading travel consumer brands is teaming up with a leader in 3-D AR product tools for websites, Fibbl. Samsonite is partnering with Fibbl to use the company’s plug-and-play platform for creating and integrating 3-D models via AR into its e-commerce sites for its most prominent brands, including Samsonite, Tumi and American Tourister.

What Does Fibbl Do?

Fibbl is an end-to-end platform that enables brands to create and integrate high-quality 3-D models based on high-resolution scans into e-commerce sites. The company got its start by creating 3-D scans and models for shoe brands to host on their websites. These scans are extremely high-quality, highly automated and cheap to produce, and over time the company has built quite a library of scans of different shoes for retailers to use in e-commerce.

The company’s technology can easily be applied to any kind of product that people want to view beyond 2-D photos to get a better idea of its appearance and dimensions. That’s where Samsonite and its other brands fit in so well.

Why Is Samsonite Working With Fibbl?

As a consumer brand, Samsonite is always looking for ways to improve traffic and conversions on its website. Meanwhile, customers want to know whether a bag is going to look the same in person as it does online. With its hyper-photorealistic 3-D scans and models, Fibbl is enabling both the brand and consumers to get what they want. For some of its other customers, Fibbl has reported double the engagement with e-commerce content, which then drives double the time the average shopper spends on the site. Fibbl also reports “significantly improved” conversion rates, as well as reductions in product returns of as much as 28%.

A Tumi Backpack that can be viewed in 3-D either in the browser or from a smartphone Tumi

The Samsonite partnership is adding to Fibbl’s already impressive momentum, as Fibbl is reporting 10x growth in end-user interactions and 10x growth in revenue over the last year, across 60 different customers. Working with these customers, Fibbl has already added more than 10,000 products to its library of 3-D scans, with more added all the time.

Fibbl has enabled its growth with an extremely low barrier to entry for new customers: a three-month trial offer of $1,000 per month for 60 products, including scans and website integration. These assets can be reused across many platforms, allowing design and marketing teams at the customer companies to share 3-D assets more efficiently.

3-D Assets Are The Future

I believe that Samsonite may also turn around and ask its retailers to use Fibbl to integrate 3-D versions of its products into their e-commerce platforms. After all, these assets need not be unique to Samsonite’s website and could be used by retailers like eBags or even Amazon. In the bigger picture, I believe that most brands want to increase user engagement and the perception of their brands with improved product representations. Suitcases might not be the sexiest thing, but many people, including myself, are very particular about them. Some of Tumi’s products now sell for thousands of dollars, and people want to be sure of their purchases before they make a significant outlay.

A 3-D image of an American Tourister suitcase American Tourister

The idea that I can see a 3-D model of a piece of luggage in AR to get an idea of how big it is and how it looks — without having to go to a store — is my idea of an online shopper’s dream. I did this myself on Fibbl’s and Samsonite’s websites on my smartphone and was able to play around with a suitcase and a backpack, spinning them around as if the actual items were in front of me in real life.

There are also so many opportunities for reuse that I believe these assets will have a life across many different platforms and potentially even prepare companies for the future of AR shopping when everything needs to be in 3-D. I even noticed recently that Google has taken it upon itself to offer lower-quality 3-D models of certain shoes on Google Shopping. In short, I think 3-D models will be a big part of the future of e-commerce — and Fibbl is quickly making a name for itself in this niche.

The post Samsonite Chooses Fibbl For Interactive 3-D AR Shopping Tools appeared first on Moor Insights & Strategy.

]]>
RESEARCH NOTE: Applied Materials 2024 Results Showcase Continued Momentum in AI and Beyond https://moorinsightsstrategy.com/research-notes/applied-materials-2024-results-showcase-continued-momentum-in-ai-and-beyond/ Tue, 14 Jan 2025 23:33:16 +0000 https://moorinsightsstrategy.com/?post_type=research_notes&p=45081   Applied Materials (NASDAQ: AMAT) recently released its SEC filings for its fiscal year 2024, and the results are what we’ve come to expect from Applied: another year of solid performance and continued growth. That makes five consecutive years of growth, which is not the easiest thing to do given the volatility of the chip […]

The post RESEARCH NOTE: Applied Materials 2024 Results Showcase Continued Momentum in AI and Beyond appeared first on Moor Insights & Strategy.

]]>
 

Applied Materials (NASDAQ: AMAT) recently released its SEC filings for its fiscal year 2024, and the results are what we’ve come to expect from Applied: another year of solid performance and continued growth. That makes five consecutive years of growth, which is not the easiest thing to do given the volatility of the chip industry it serves. (Remember the supply-chain nightmares at the beginning of this decade, not to be outdone by the volatility in the memory market.)

The company’s Q4 performance was especially strong, with record revenue, record earnings, and plenty of exciting news about its semiconductor production equipment and especially its Applied Global Services unit. As I’ve previously stated on X, AI doesn’t happen without this company. Applied’s continued growth reflects how much chip manufacturers continue to rely on it for critical hardware, for AI chips and just about every other type of semiconductor.

Let’s get into the numbers.

Applied Materials’ Strong Financial Performance in 2024

It was another strong year from Applied, with growth in every major segment of the business. This culminated in net revenue of $27.2 billion, up 2% year-over-year, along with operating income of $7.9 billion (up 3% YoY) and diluted EPS of $8.61 (up 6% YoY). By far the largest component of revenue was the semiconductor systems business at $19.9 billion, up 1% YoY. The largest growth was seen in Applied Global Services, up 9% YoY to hit $6.2 billion in revenue. Rounding out the picture, the much smaller display segment was also up 3% YoY, to $0.9 billion.

How Applied Is Winning

The boom in AI chips benefits Applied. The industry continues to demand greater performance and greater efficiency from silicon, and Applied has the resources and expertise — not to mention the institutional heritage — to do the fundamental engineering and materials research necessary to advance the field. As president and CEO Gary Dickerson mentioned in the Q4 earnings call, advancements in the semiconductor industry “are increasingly enabled by innovations in materials science and materials engineering, where Applied has clear leadership.”

I’ve talked before about Applied’s ability to tackle the most arcane engineering problems that concern the chip industry. Although its work often escapes the headlines, this company produces some genuinely astounding technical developments. One notable recent fruit of its efforts, advanced gate-all-around nodes, was recently moved into high-volume production by leading logic companies. Production in that area has already generated about $2.5 billion in revenue, with expectations of around $5 billion in 2025, as Dickerson noted on the Q4 earnings call. He went further, stating that the transition from FinFET-based nodes to gate-all-around nodes “grows Applied’s available market from around $12 billion to approximately $14 billion for every 100,000 wafer starts per month of capacity.” If that weren’t enough, Applied anticipates capturing more than half of all the process equipment spending industry-wide for gate-all-around.

During 2024, Applied found growth in other areas as well. Integrated solutions are a growing part of semiconductor systems revenue, and are expected to become even larger in coming years. DRAM also saw massive gains, with revenues growing by more than 60% YoY; this is in line with a long-running trend, and Applied has grown its share of the DRAM market by ten points in as many years. Advanced packaging also saw large improvements, providing chip makers with important advances in performance, energy consumption, and cost for new chips.

It’s important to note that Applied’s display business did not have such remarkable results during the year. But I believe the company is well positioned to capitalize on any marketwide increase in demand for OLED displays, which I see as being likely over the next couple of years. Applied also saw some decline in revenue from China, which seemed to spook some investors. However, given ongoing tensions between the U.S. and China over semiconductors — and concerns about possible changes in trade policy by the incoming U.S. presidential administration — it’s not surprising to me that Applied has faced volatility among Chinese customers. It’s also worth noting that the downturn in Applied’s China performance did not stop the company from delivering a strong 2024 overall.

Record Performance for Applied Global Services

Notable in this year’s filings was a record year for AGS, rounded off by a record quarter in Q4. AGS is the part of the company that assists manufacturing customers with the upkeep and operation of their fabs, ensuring maximum uptime and performance. This is a critical service in a period when enterprise customers are demanding more AI-ready chips — by yesterday.

AGS is the fastest-growing segment of Applied in both relative and absolute terms, adding approximately $500 million in revenue YoY. The Q4 earnings report mentioned that AGS signed its first five-year service contracts during the year, raising the average length of AGS contracts to 2.9 years. To top it off, AGS maintained a contract renewal rate of over 90%, which tells me that it’s delivering on its promises for customers. With high renewal rates and growing contract tenures, I expect AGS to continue to provide great returns for Applied.

Applied Materials’ Pressures

Here’s a big question I’ve asked before in the context of chip manufacturers: When do enough companies start seeing a real return on investment from all the CapEx being dumped into generative AI? AI has been one driving factor in Applied’s business for the past few years. As enterprises demand more and better chips from manufacturers, those manufacturers in turn demand more and better manufacturing equipment and processes from Applied. While Applied does a very good job managing the inherent volatility faced by any chip equipment vendor, its fortunes are inevitably tied to the success of its manufacturing clients.

In the context of AI, this means that if real, widespread value cannot be realized from gen AI, Applied will take a hit alongside the manufacturers it serves. AI has been running so hot for the past couple of years that all kinds of AI vendors (not just in semiconductors) have been able to make money on it. But the pace of the past two years won’t last forever, and anything that slows down the AI train as a whole could potentially introduce additional uncertainty to the chip market — which can already be temperamental. As I said, Applied is well managed and tends to keep good hands on the tiller at all times to avoid the worst of this, but the risks inherent to the industry remain.

Applied Has Lots of Room to Run

As long as companies want gen AI and lots of it, Applied should keep doing very well for itself. And this trend is likely to go on for a while, because the industry keeps demanding higher quantities of better chips. In the words of Gary Dickerson, “Leading AI companies are talking about the need to drive a 10,000-times improvement in computing-performance-per-watt over the next 15 years.” That kind of four-orders-of-magnitude shift is not going to happen without Applied. Also, the market is trending toward a “more than shrinks” stage via lithography as well as technologies including GAA transistors and heterogeneous packaging — so the future looks bright for the company.

Barring a sudden meltdown in gen AI demand or some other unforeseen catastrophe, I expect that Applied will continue to grow steadily. I’ll be intrigued to see how it fares in displays (especially if OLEDs take off) and in the China market during the second Trump administration. Regardless, I don’t see anything on the horizon that’s likely to stop Applied from continuing its strong run.

The post RESEARCH NOTE: Applied Materials 2024 Results Showcase Continued Momentum in AI and Beyond appeared first on Moor Insights & Strategy.

]]>
Datacenter Podcast: Episode 35 – Talking Extreme Networks, OpenAI, Oracle, Microsoft, IonQ, Dell https://moorinsightsstrategy.com/data-center-podcast/datacenter-podcast-episode-35-talking-extreme-networks-openai-oracle-microsoft-ionq-dell/ Tue, 14 Jan 2025 16:18:14 +0000 https://moorinsightsstrategy.com/?post_type=data_center&p=45061 On episode 35 of the Datacenter Podcast, Moor Insights & Strategy co-hosts Matt, Will, and Paul talk Extreme Networks, OpenAI, Oracle, & more

The post Datacenter Podcast: Episode 35 – Talking Extreme Networks, OpenAI, Oracle, Microsoft, IonQ, Dell appeared first on Moor Insights & Strategy.

]]>
On this week’s edition of MI&S Datacenter Podcast, Moor Insights & Strategy co-hosts Matt, Will, and Paul analyze the week’s top datacenter and datacenter edge news. They talk Extreme Networks, OpenAI, Oracle, and more!

Watch the video here:

Listen to the audio here:

3:38 Can Extreme Networks Vie for Share in 2025?
12:51 Do We Really Know How To Do It?
19:21 Oracle Exadata X11M – The Real Data Platform
28:51 Microsoft Betting Big on AI Data Centers in 2025
36:50 Entangled Ambitions
42:23 Dell Embraces OCP
50:16 Getting To Know The Team

Can Extreme Networks Vie for Share in 2025?
https://www.extremenetworks.com/resources/blogs/introducing-extreme-platform-one

Do We Really Know How To Do It?
https://blog.samaltman.com/reflections

Oracle Exadata X11M – The Real Data Platform
https://www.oracle.com/news/announcement/oracle-introduces-exadata-x11m-platform-2025-01-07/%C3%82%C2%A0

Microsoft Betting Big on AI Data Centers in 2025
https://blogs.microsoft.com/on-the-issues/2025/01/03/the-golden-opportunity-for-american-ai/

Entangled Ambitions
https://investors.ionq.com/news/news-details/2025/IonQ-Completes-Acquisition-of-Qubitekk-Solidifying-Leadership-in-Quantum-Networking/default.aspx

Dell Embraces OCP
https://moorinsightsstrategy.com/research-papers/evaluation-of-open-compute-modular-hardware-specification/%C3%82

Disclaimer: This show is for information and entertainment purposes only. While we will discuss publicly traded companies on this show. The contents of this show should not be taken as investment advice.

The post Datacenter Podcast: Episode 35 – Talking Extreme Networks, OpenAI, Oracle, Microsoft, IonQ, Dell appeared first on Moor Insights & Strategy.

]]>
Moto Razr+ 2024: Improving On A Great Flip Foldable Smartphone https://moorinsightsstrategy.com/moto-razr-2024-improving-on-a-great-flip-foldable-smartphone/ Tue, 14 Jan 2025 15:00:52 +0000 https://moorinsightsstrategy.com/?p=45018 The Razr+ 2024 improves on the already-great 2023 model. Minor quibbles about camera performance aside, this is an excellent compact phone with a great user experience.

The post Moto Razr+ 2024: Improving On A Great Flip Foldable Smartphone appeared first on Moor Insights & Strategy.

]]>
The outside front screen of the Razr+ 2024 while folded Anshel Sag

I’ve been a big fan of Motorola’s Razr line since the 2023 model. This year’s model is a nice improvement on that design, with some major upgrades to the display, processor and AI capabilities. When it launched Gemini Live AI chat earlier this year, Google demoed the latest Gemini AI capabilities using this Moto Razr+ alongside the Samsung Galaxy S24 Ultra.

People have also been taking note of the Razr+ since Moto recently disclosed that 20% of its Razr customers are switching from the iPhone, which is nearly unheard of in the Android landscape. So, let’s explore why this phone might be so compelling to iPhone users and younger users.

Moto Razr+ Design And Specs

On paper, the Moto Razr+ is nearly perfect. However, some crucial design choices have been taken to make it affordable. Its launch price of $1,000 is competitive with Samsung’s Flip6 at $1,100, and the Razr+ is currently on sale for $749 at T-Mobile and $799 elsewhere, which is extremely competitive. The Razr+ is the higher spec of the Razr line, and it includes a bigger cover screen and faster processor. It features an internal foldable 6.9-inch 2640 x 1080 (413 PPI) LTPO AMOLED Dolby Vision 165-hertz display capable of 3000-nit peak brightness. The cover screen external display complements this with a 4-inch 1272 x 1080 (417 PPI) Dolby Vision 165-hertz 2400-nit peak brightness display with Gorilla Glass Victus. Moto excels against the competition with its cover screen specs, and here it has carefully matched refresh rates and pixel densities so that the user experience is the same regardless of the display used.

Moto has also paid careful attention to the processor, giving the Razr+ a Snapdragon 8s Gen 3, which is a slightly slower version of its flagship Snapdragon 8 Gen 3 that was popular in Android phones in 2024. I believe that this was mostly to save on cost without sacrificing too much on performance. That said, Samsung did go for the full Snapdragon 8 Gen 3 in its competing model, so there is a bit of a difference in performance there. Motorola also opted for a MediaTek Dimensity 7300X chip on the base-model 2024 Razr, which is a chip MediaTek specifically created for the foldable market.

For the Razr+, Moto defaulted to 256GB of storage and 12GB of RAM, which I believe pushed Samsung to match those specs in the last generation and this one. Motorola also invested heavily in wireless tech with Wi-Fi 7 support and AptX Lossless Bluetooth. I believe that this sets up users for a better experience in the long term, since Wi-Fi 7 improves performance, latency and battery life.

The Moto Razr+ 2024 with the camera lenses sticking out of the screen Anshel Sag

The two companies differentiate the most in the camera department, because while Motorola went for a dual-50 MP camera configuration, Samsung went with a 50 MP main camera and a 12 MP ultrawide camera. Motorola has a 50 MP main camera and a 50 MP 2x telephoto camera, which allows for better zoom performance. This comes from a fundamental difference in camera philosophy and what each company thinks is more important. I think that it would be better if flip-folds simply came with three cameras so people wouldn’t have to choose. But I respect Moto’s decision because telephotos are very useful very often.

Both the Flip6 and Razr+ have 4000 mAh batteries, but I believe battery life is a crucial area for innovation—and improvement—in foldables. That said, I do appreciate that the Razr+ supports 45-watt fast charging, which continues Moto’s lead in fast charging against Samsung’s 25-watt setup. The Razr+ also comes in four fun colors: Midnight Blue, Hot Pink, Spring Green and Peach Fuzz.

Moto Razr+ Software And User Experience

Motorola’s software has matured considerably over the years, and I would say it provides a mostly enjoyable experience. This is especially true for the camera software, which feels like it is very rich in capabilities. But realistically, a lot of the difficult software design work on foldables is done on the extra screen, whether it’s the inside screen on book foldables or the cover screen for flip foldables. Since you spend so much time on the outside screen of a flip foldable, it needs to be easy to use and capable of handling most of your apps.

This is where Moto has done the most work and enables the class-leading 4-inch display to be useful without needing to open the large (and more power-hungry) main display. That said, not all apps take full advantage of the 4-inch screen; for example, Google apps including Gmail, Photos and Messages use only about 80% of the screen. Moto does have some first-party apps including its camera, calendar, weather and some games that can take advantage of the full screen, but I believe there is room for improvement here in terms of user experience.

The experience of switching an app such as Photos from the front screen to the main inner screen is quite seamless and allows you to appreciate going from a 4-inch screen to a 6.9-inch screen. I also love the many camera features that are specific to using a flip-fold phone, for instance the ability to record photos and videos without holding the phone. I did this with my baby daughter, and it turned out to be a very fun time. There is also a not particularly new but still fun camcorder mode that automatically records a video when you pick up the phone half-folded, turn it sideways, and hold it like a camcorder.

The Razr+ has a feature that I also found useful on my Google Pixel 9 Pro Fold (review coming soon), which uses the outside screen to play a cartoon character to get a baby to look at the camera. It also uses this screen to give the subject of the photo a chance to preview the shot before it’s taken. This isn’t a particularly new feature, but I know there are a lot of people who could benefit from it, and I believe it is one of those features that could persuade someone to switch to this phone. I also think that Moto did a better job of integrating Google Lens into its camera than Google did; it’s much easier to access and faster to use than on the Pixel, in my experience.

All in all, I’ve been pretty satisfied with the camera quality and low-light performance of the Moto Razr+, although it’s still not quite a flagship-class camera when compared to the S24 Ultra, Pixel 9 or iPhone 16 Pro.

Connectivity And Performance

I was quite impressed with the Moto Razr+’s 5G performance considering that it’s foldable—and space is precious. On a T-Mobile-branded Razr+ (the pink one), I was able to get comparable speeds on its 5G UC network that I was getting on my Pixel 9 Pro Fold—in excess of 1,200 Mbps down and 60 Mbps up. These are impressive speeds considering they were measured inside a hotel lobby without line-of-sight for the cell tower. A great wireless front-end paired with a fast 5G network means better app experiences and better battery life, which I believe a lot of people forget when choosing their next phone. For comparison, my iPhone 16 Pro on Verizon’s 5G network got only 100 Mbps down and 1 Mbps up in the same spot.

In terms of processor performance, the Snapdragon 8s Gen 3 falls somewhere between the Snapdragon 8 Gen 2 and Gen 3. This gives it pretty competitive performance in benchmarks and still has it beating out Google’s latest Tensor G4 in the Pixel 9 Pro, though it doesn’t beat out the Snapdragon 8 Gen 3 in the Flip6 or the A18 Pro in the latest iPhones. That said, I don’t think many iPhone users care much about which processor is inside their phones as long as it is fast enough and gets the job done. In fact, I believe that better 5G and Wi-Fi performance is more important to users than processor benchmarks.

How Moto Razr+ Stacks Up To The iPhone

I have been enjoying using the Moto Razr+ 2024 since its launch in July, and it has stood the test of time and traveled with me quite a lot as my third smartphone. While it hasn’t quite elevated itself to the point of being my primary or secondary device, that’s mostly because the camera isn’t quite flagship-tier, and I am very picky about camera performance. Even so, it is so compact that it really is not difficult to carry around with me anywhere I go, so much so that sometimes I forget I have it on me.

For iPhone users looking to switch to Android, the flip features and external camera are so different and enjoyable compared to a traditional candybar phone that I can fully understand someone wanting to switch. Additionally, I believe that phones like the Razr+ satisfy a lot of people’s desire to have a smaller phone while still offering the ability to open the phone to a larger screen when that’s what you need. I don’t think Apple is going to ship a foldable for quite some time, so I believe that if you’re an iPhone user looking for something new and exciting in a foldable, the Moto Razr+ 2024 is indeed the right move.

The post Moto Razr+ 2024: Improving On A Great Flip Foldable Smartphone appeared first on Moor Insights & Strategy.

]]>
MI&S Weekly Analyst Insights — Week Ending January 10, 2025 https://moorinsightsstrategy.com/mis-weekly-analyst-insights-week-ending-january-10-2025/ Mon, 13 Jan 2025 22:05:05 +0000 https://moorinsightsstrategy.com/?p=44920 MI&S Weekly Analyst Insights — Week Ending January 10, 2025. A wrap up of what our team published during the last week.

The post MI&S Weekly Analyst Insights — Week Ending January 10, 2025 appeared first on Moor Insights & Strategy.

]]>
MI&S Logo_color

Welcome to this edition of our Weekly Analyst Insights roundup, which features the key insights our analysts have developed based on the past week’s events.

It’s no surprise that my colleagues and I spent much of last week focused on CES. In particular, Anshel Sag—who’s a heck of a device reviewer, besides being a savvy industry analyst—will be publishing a number of pieces this week covering the big PC OEMs, chip makers, and players in the XR industry. Many of my own thoughts from CES made it into Friday’s installment of The Six Five Podcast.

Cisco Desk Pro in Mel Brue office

The Cisco Desk Pro (left) is a slick — albeit somewhat pricey — tool for getting more out of your video meetings. Photo: Melody Brue

Plenty of the announcements at CES are about eye-popping (or wannabe eye-popping) consumer devices, but Melody Brue’s review of the Cisco Desk Pro last week is a good reminder of the difference that high-quality enterprise tech can make for individual productivity. This reality is only going to be reinforced by the increasing adoption of AI agents in 2025 to augment the work of corporate employees, from the shop floor to the C-suite.

If you have a piece of new technology that’s changing the way you or your team work in 2025, I’d love to hear about it. What’s your favorite new gadget that’s moving the needle?

This week, Will is at the Cisco AI Summit in Palo Alto and Mel is attending Zoom’s virtual Work Transformation Summit. The rest of us are busy writing, researching, and advising clients. If there is anything we can help you with to start your year off strong, please reach out.

Let’s do this, 2025!

Patrick Moorhead

———

Our MI&S team published 15 deliverables:

This past week, MI&S analysts have been quoted in multiple syndicated top-tier international publications including CIO, Computerworld, Fierce Electronics, Fierce Networks, InfoWorld, MIT Technology Review, TechTarget, Wired, and others. The media wanted our thoughts on AWS, CES, Dell, HPE, IBM, Intel, Nvidia, Oracle, WordPress, and of course AI and some 2025 predictions

MI&S Quick Insights

I was quite intrigued by the agentic blueprints that NVIDIA announced last week at CES. But it was not necessarily the use cases—which were pretty commonplace—that were the real story. Much more compelling was the vision of what agentic development could be. The first thing that stuck out was that these are partner-driven solutions. This is in contrast to what we have seen so far, which have been siloed and internally developed agentic solutions. With those, you basically have to use a homogeneous stack of technology to realize the value. And, yes, there is still a need to use the NVIDIA AI Enterprise platform to deploy the blueprints that were announced, but the simple thought of co-development is good to see—and hasn’t been highlighted enough.

Second, I am very interested in how NVIDIA is thinking about agentic AI in the physical world. That is what I consider a second leap from what we are seeing so far. Today’s agents are very much bound to a cloud or a platform. The first leap I am hoping to see is a leap to the on-premise compute world. This could mean collaboration between the AIs on devices such as an AI PC or an iPhone and the cloud (edge AI, so to speak). The second leap is the same idea but to physical devices and robotics. Again, it’s refreshing to see NVIDIA paint a picture of the agentic world that is so visionary.

Over the past month I have been researching AI development platforms, and I have some research coming out on that very soon. But the deeper I have gotten into the topic, the more I’ve realized that each platform regards different user roles with different priorities. It is almost as if each vendor started development from a completely different place, yet they all ended up close enough to each other that we now have a new category of solution. This is a very good thing early in a product lifecycle. By having a broad base of solutions to choose from, the market will have a better opportunity to judge what ends up being the best use of the technologies.

To that point, I want to mention that the newly released Azure AI Foundry from Microsoft is, to my mind, the first of these platforms to really take on the IT management aspects of the problem set. And while it may not have all of the very coolest developer features we see from the competition, it does highlight an under-represented set of requirements that will be essential for enterprise deployment and success.

Last week a member of the media reached out to me to discuss a topic that got me thinking—how value is being redefined in the age of generative AI. Here’s an example: for as long as we can all remember, having lots of data was critical to drive decision making. And if you had good and exclusive data, that was highly valuable. But now that AI can assemble and infer data so quickly and so well, has the value moved away from mere data possession towards reasoning and prediction? Or will people now go to greater lengths to hoard the best data? I am leaning towards reasoning winning the day, but I do think it’s a great topic for reflection. (I’ll let you know when the article comes out.) At the very least, I expect that we will see a more distinct break between data and reasoning in the business world—like we already do between training and inference within AI.

I was quite intrigued by the agentic blueprints that NVIDIA announced last week at CES. But it was not necessarily the use cases—which were pretty commonplace—that were the real story. Much more compelling was the vision of what agentic development could be. The first thing that stuck out was that these are partner-driven solutions. This is in contrast to what we have seen so far, which have been siloed and internally developed agentic solutions. With those, you basically have to use a homogeneous stack of technology to realize the value. And, yes, there is still a need to use the NVIDIA AI Enterprise platform to deploy the blueprints that were announced, but the simple thought of co-development is good to see—and hasn’t been highlighted enough.

Second, I am very interested in how NVIDIA is thinking about agentic AI in the physical world. That is what I consider a second leap from what we are seeing so far. Today’s agents are very much bound to a cloud or a platform. The first leap I am hoping to see is a leap to the on-premise compute world. This could mean collaboration between the AIs on devices such as an AI PC or an iPhone and the cloud (edge AI, so to speak). The second leap is the same idea but to physical devices and robotics. Again, it’s refreshing to see NVIDIA paint a picture of the agentic world that is so visionary.

Over the past month I have been researching AI development platforms, and I have some research coming out on that very soon. But the deeper I have gotten into the topic, the more I’ve realized that each platform regards different user roles with different priorities. It is almost as if each vendor started development from a completely different place, yet they all ended up close enough to each other that we now have a new category of solution. This is a very good thing early in a product lifecycle. By having a broad base of solutions to choose from, the market will have a better opportunity to judge what ends up being the best use of the technologies.

To that point, I want to mention that the newly released Azure AI Foundry from Microsoft is, to my mind, the first of these platforms to really take on the IT management aspects of the problem set. And while it may not have all of the very coolest developer features we see from the competition, it does highlight an under-represented set of requirements that will be essential for enterprise deployment and success.

Last week a member of the media reached out to me to discuss a topic that got me thinking—how value is being redefined in the age of generative AI. Here’s an example: for as long as we can all remember, having lots of data was critical to drive decision making. And if you had good and exclusive data, that was highly valuable. But now that AI can assemble and infer data so quickly and so well, has the value moved away from mere data possession towards reasoning and prediction? Or will people now go to greater lengths to hoard the best data? I am leaning towards reasoning winning the day, but I do think it’s a great topic for reflection. (I’ll let you know when the article comes out.) At the very least, I expect that we will see a more distinct break between data and reasoning in the business world—like we already do between training and inference within AI.

Sam Altman, CEO of OpenAI, made a very interesting post on his personal blog about when he believes OpenAI could achieve artificial general intelligence (AGI), an advanced level of AI that can perform at human levels. It was only two years ago that OpenAI made the historic launch of ChatGPT. Just two months after its release, ChatGPT had 100 million active users, heralding AI’s potential as one of the most powerful technologies ever created. Beyond that, the launch transformed OpenAI from a small research lab into a major AI player. Today, an advanced version of the GPT platform handles more than one billion queries daily.

In the blog post, Altman shared personal anecdotes, including his unexpected firing from the company and the governance issues that followed, offering lessons learned in leadership and company management. The rapid growth of OpenAI required him to build the company culture and infrastructure almost from scratch. That led to both successes and setbacks. Altman admitted to his own failures in governance, particularly around his firing. In another lesson learned, he emphasized the importance of having diverse and experienced board members.

Now, here’s the most interesting part of his post. Altman predicts that AGI will be achieved in 2025 in the form of AI agents that will impact workforce productivity. Altman said, “We are now confident we know how to build AGI as we have traditionally understood it. We believe that, in 2025, we may see the first AI agents ‘join the workforce’ and materially change the output of companies.” Moving beyond AGI, Altman talks about focusing on superintelligence, envisioning a future where AI could dramatically enhance human capabilities and societal prosperity.

From my perspective, the trajectory towards more incremental and advanced AI capabilities looks doable. However, AGI needs a level of human-like reasoning and adaptability that can be applied over a wide range of tasks. That is a very complex goal. We are not there yet for the strict definition of AGI, but AI agents in the workforce with a limited form of AGI might be doable this year.

I believe that superintelligence is not possible at this stage or at any time within several decades, if ever. While AI has made tremendous advancements over the past decade, superintelligence involves numerous unknowns, including abstract reasoning, creativity, conscious thoughts, and problem-solving at the highest level. None of those is on the horizon yet—let alone all of them.

Oracle has launched the Exadata X11M data management platform, with a focus on driving extreme performance across three key workloads—AI (vector search), online transaction processing (OLTP), and analytics. Exadata is a combination of tuned hardware and software that enables organizations to accelerate performance of these key workloads while enabling greater levels of consolidation in the datacenter.

I like what Oracle is doing. For decades, Oracle Database has been the data management platform of the enterprise (97% of Fortune 500 companies run Oracle). It only makes sense that the company would take its IP and better enable core workloads that power the enterprise. The numbers are quite compelling across the board: Oracle claims vector search performance increases of up to 55% on storage servers and 45% on compute servers, along with 25% faster OLTP and analytics performance relative to the X10M platform. And this performance is delivered on-prem, in any major cloud, or in hybrid environments.

It can be very difficult for legacy infrastructure companies to pivot and maintain relevance as the market shifts around them. Oracle is unique in how it has smartly pivoted and taken full advantage of its footprint in enterprise data.

The data protection software market grew in 2024, driven in no small part by advanced cyberthreats and stricter regulations. Unsurprisingly, AI tools became much more important for automating governance, ensuring compliance, and detecting threats. Vendors such as Cohesity, Commvault, Rubrik, and Veeam Software improved their market presence through acquisitions, partnerships, going public, and adding new features to their platforms. Observability tools also progressed, integrating system monitoring with data protection for proactive solutions. Read more in my latest Forbes article about what I see ahead for data protection in 2025.

Zoho Analytics has grown into a full-fledged, AI-driven business intelligence platform. Its September 2024 release included more than 100 updates, with a big emphasis on expanding access to data analysis across different job functions. Considering its advancements in AI and machine learning, Zoho Analytics now competes with established BI solutions, enabling a broad range of users in different industries to make more informed decisions. Check out the recent MI&S Research Brief about Zoho Analytics from Melody Brue and me for more.

Extreme Networks launched its Extreme Platform One in early December. Platform One is positioned to allow IT professionals to manage and secure networks faster and more efficiently. The company claims that the offering has been developed based on customer feedback and aims to unify connectivity experiences with a single composable workspace and high degrees of AI-powered automation, and to deliver a simplified licensing structure. I believe that when Platform One becomes available in the second half of 2025, it will allow Extreme to compete with the likes of Cisco, HPE, and others more effectively given its historic focus on providing commodity connectivity infrastructure.

At the National Retail Federation’s show, SAP rolled out some new features for the retail industry. These include the SAP S/4HANA Cloud Public Edition, designed for retail, fashion, and related businesses, as well as an AI-powered shopping assistant. The company also shared plans for a loyalty-management solution for retailers and consumer goods companies, which is set to launch in late 2025. The updates are geared toward helping retailers work more efficiently and better connect with their customers.

In 2025, AI agents are expected to change the game in retail by enabling personalized customer experiences, flexible shopping options, and sustainability initiatives. At NRF, Microsoft highlighted tools such as Copilot and Dynamics 365 ERP agents that can handle routine tasks, improve operations, and make real-time decisions. This gives employees more time to focus on what matters most while improving efficiency, reducing costs, and helping build relationships with customers and suppliers.

Key statistics from Adobe’s 2024 Holiday Shopping Report reveal significant trends in online retail during the holiday season. Online retail spending reached a record $241 billion, representing an 8.4% increase compared to 2023. Additionally, spending on buy now, pay later (BNPL) options grew substantially, exceeding $18 billion and peaking at $993 million on Cyber Monday, which set a new single-day record. Mobile revenue accounted for 53.2% of online shopping, totaling $128 billion. This shift towards mobile spending and the increasing popularity of BNPL options highlight changing consumer preferences in digital payments and financing.

Indeed has released its 2025 U.S. Jobs & Hiring Trends Report, which includes interesting data points and trends for the workplace and workforce in the new year. Two significant trends stood out to me as poised to reshape the workforce in 2025, presenting challenges and opportunities for businesses and workers alike.

1. Demographic shifts and labor shortages: The U.S. is experiencing a decline in its prime working-age population, a trend with profound implications for labor supply. This demographic shift suggests that future workforce growth could hinge on immigration, potentially leading to persistent labor shortages across various sectors. Companies may need to reevaluate their talent acquisition strategies to focus on upskilling existing employees, embrace remote work to access wider talent pools, and implement aggressive retention initiatives.

2. The dual nature of AI: Artificial intelligence is rapidly transforming the workplace, potentially automating existing jobs while creating new roles. While estimates suggest that AI could automate millions of jobs, it’s also projected to generate millions of new positions requiring a blend of technical expertise and uniquely human skills such as empathy, creativity, and critical thinking. This duality underscores the growing importance of adaptability and continuous learning for workers at all levels.

The convergence of these trends presents a complex landscape. A shrinking workforce may accelerate AI adoption to address labor shortages, potentially increasing productivity but also raising concerns about job displacement. To thrive in this evolving environment, businesses and individuals must proactively adapt, embrace learning, and cultivate a workforce equipped for the demands of the future.

After last year’s CES, I predicted that Matter would reach its tipping point in 2025, becoming the preferred connectivity standard for new smart home product designs. I’m doubling down on that prediction this year because Matter ecosystems (platforms) are maturing, and consumer adoption is finally taking off.

1. Matter ecosystems: Certifying products and developing product-specific apps is becoming much easier.

  • Easy product certification — “Works with” compatibility programs from Apple, Google, and Samsung agreed to accept Matter interoperability testing. Apple is already accepting these lab results, and Google and Samsung plan to do the same later this year. What about Amazon? Stay tuned. My take: This announcement validates Matter’s “universal interoperability” brand promise and encourages more device makers to get on board. Meanwhile, the economics are compelling—one interoperability test replaces three or four.
  • Easy app development — As promised earlier this year, Google is opening up Google Home as a developer platform. The company just announced a new set of Matter Home APIs for Android developers, with iOS support coming in a few months. These APIs link partner apps with Google Home hubs to control devices and automation experiences. The apps connect directly to the Google Home runtime package on local, on-premises hubs. The runtime controls Matter devices without a round-trip to the cloud, reducing latency while improving reliability and privacy. Google Home’s installed base is over 40 million hubs, including Nest, Chromecast, Google TVs, and some LG TVs. My take: This is a big deal. Today, Matter standardizes connectivity, but CE manufacturers often require product-specific features at the ecosystem level. Creating new ecosystems is complicated and costly, and consumers don’t want a separate ecosystem for each product, so Google is on the right architectural path here. Google Home hubs connect local devices without round trips to the cloud, and APIs let partners extend the ecosystem with product-specific features and experiences. Other ecosystems (Apple, Amazon, and Samsung) already have comparable APIs, and could add on-premises control logic to their hubs. I hope ecosystem companies consider standardizing APIs or at least using similar design patterns.

2. Matter products: The tech news outlets will review all the new Matter products that debuted at CES, but here are my short takes on a few that caught my eye.

  • Resideo (Honeywell Home) announced the Honeywell Home X2S Matter-enabled smart thermostat ($79.99 MSRP). My take: The low price point proves that adding Matter is cost-effective.
  • GE unveiled two new wall-mounted Matter-based “Cync” dimmer switches with several interesting innovations, including single-device three-way circuits ($44.99 and $25.99). My take: It’s great to see major consumer brands support Matter with innovative, mainstream, reasonably priced products.
  • LG’s over-the-range microwave oven with a 27-inch touchscreen and full Matter support created considerable interest at CES. It’s a smart TV, Matter hub, Thread border router, and home control panel. Oh yeah, it also microwaves food and has three cameras to show it cooking. This product is part of an industry trend to use touchscreens as the UI for appliances, from light switches to washing machines. My take: Some analysts dismiss this trend as silly, but we should take it seriously. LCD panels are inexpensive peripherals for smart appliances, so the question isn’t whether to use them, but how to use them. For instance, I see a rough road ahead for CE companies that envision these screens as advertising billboards or sales tools.
  • Aqara is going all-in with dozens of Matter products and variants. Examples include control panel hubs, dial-based touchscreen controllers, touchscreen switches, light switches, dimmer switches, presence sensors, climate sensors, a doorbell camera, and a Matter hub. My take: Aqara is beating established brand names to the punch with a broad Matter product portfolio.
  • Locks — Several companies introduced innovative Matter-enabled smart locks. Schlage’s first Matter product is the Sense Pro Smart Deadbolt. It uses UWB for hands-free unlocking. ULTRALOQ’s Bolt Fingerprint and Bolt Mission locks have Matter support, and the latter has UWB spatial awareness. My take: I’m pleased to see house locks get some of the great features we’ve had in car locks for years. I’m also bullish on UWB.

3. Industrial automation: There were hundreds of industrial announcements at CES. Here are three examples.

  • NVIDIA — For IIoT and edge tech, the quote of the week was from Jensen Huang: “The ChatGPT moment for general robotics is right around the corner.” He defined three kinds of robots that require no special accommodations to put them into service—agentic (because they’re information workers), self-driving vehicles (because roads are already in place), and humanoid (because they fit directly into our world). I think enterprise and industrial operations technology is a fourth AI embodiment. Industrial IoT systems are increasingly autonomous and adaptive but lack the uniform connectivity and interoperability needed to act “robotic.” This is the new definition of industrial IoT—enabling robotic physical infrastructure.
  • NXP — The company has agreed to acquire TTTech Auto in an all-cash transaction valued at $625 million. NXP plans to upscale TTTech MotionWise in its CoreRide software, accelerating the shift from hardware-based designs to software-defined vehicles (SDV). TTTech stands for time-triggered technology, a set of techniques for synchronizing and scheduling events across distributed systems. My take: This savvy acquisition ensures the CoreRide platform can use standard networks for tightly timed, safety-related distributed automotive applications. CoreRide and TTTech technologies could also apply to manufacturing and other industrial applications, but NXP hasn’t confirmed that.
  • Ceva-MediaTek collaboration — Imagine wearing VR headgear, turning your head, and the audio space remains fixed relative to the 3-D video space. Ceva’s RealSpace immersive spatial audio integrates this and other advanced audio processing techniques into MediaTek’s Dimensity 9400 mobile chipset. My take: Locking the audio space to the virtual visual world is very cool, and not just for gaming. For instance, spatial audio adds realism to Industrial digital twins.

IonQ recently completed its acquisition of Qubitekk, a quantum networking firm. The acquisition provides IonQ with advanced networking technology and a large number of new patents, bringing IonQ’s portfolio to over 600 patents. It also acquired an important networking asset in Qubitekk’s EPB Quantum Network, the first commercial quantum network in the U.S. That will enhance IonQ’s quantum networking capabilities and remote ion-ion entanglement. The integration of Qubitekk’s technology will likely provide IonQ with faster quantum network deployment, which will enhance its secure communications and distributed computing capabilities. This should push IonQ into a leadership position in quantum networking, which could result in new partnerships and/or contracts. Combining IonQ’s quantum expertise with Qubitekk’s networking experience could result in significant advancements in security and computational power. This acquisition is strategic for IonQ, given its dependence on networking for how it plans future scaling of qubits.

Recent negative news related to the Palo Alto Networks Expedition firewall migration tool may be overblown. The tool was offered as a free utility to migrate configurations from third-party firewalls to Palo Alto Networks’ next-generation firewall platform, but it was never intended for production deployments. There is no evidence of active exploitation, but even though the company retired the tool last year, patches have been issued and production migration tools have been provided to its customers.

In 2025, sports tech is sure to keep evolving as part of the ongoing transformation of how fans experience games and connect with their teams. For example, the platform Cosm and Meta’s Xtadium app are bringing sports into virtual reality. Meanwhile, streaming services—as we saw in 2024 with Peacock during the Summer Olympics, Netflix with boxing and the NFL, and AWS with the NFL—are expected to expand with AI features that include personalized highlights and real-time stats. This tech is also branching into other areas of entertainment and music, with AI shaping everything from songwriting to virtual concerts and even influencing events like the Grammys. We can expect platforms like TikTok and YouTube to continue blending sports, music, and entertainment, giving creators and fans new ways to connect and engage.

Last week, Dell announced substantial upgrades to its AI PC portfolio, highlighting enhancements in performance and sustainability. This initiative reflects the increasing importance of sustainability in business, a trend expected to continue influencing industry strategies through 2025. Dell’s approach includes implementing circular design principles, such as modular components and greater use of recycled materials, to extend product lifecycles and minimize e-waste. The company’s initiatives to improve energy efficiency, battery life, and repairability likewise underscore its commitment to addressing environmental concerns while catering to the performance demands of the AI PC market.

AT&T recently announced a customer guarantee for consumers and small businesses that use its wireless and fiber networks. Any customer who experiences a fiber outage of 20 minutes or more or a wireless outage of 60 minutes or more will receive compensation in the form of a billing credit. Additionally, the company is setting a goal for its customers to reach a call-center technical expert within five minutes or receive a callback at a chosen time, as well as a commitment to send a field technician the same day or next day for unresolved issues. AT&T Guarantee is a significant move for the operator, given that it’s the first of its kind for consumers, and I expect many of AT&T’s competitors will respond with similar commitments.

Research Papers Published

Citations

AI / Matt Kimball / AI Business
AI’s New Wave: Great Spaceships, Bumpy Runways

AI in 2025 / Anshel Sag / MIT Technology Review
The Download: our 10 Breakthrough Technologies for 2025

AWS / Graviton / Patrick Moorhead / Medium
AWS Graviton Adoption on the Rise: Half of All Instances Use Custom Silicon

Data Platforms & AI / Jason Andersen / Fierce Networks
Move over, data platforms – this is the dawning of the ‘Age of Intelligence’

Dell / AI PC / Patrick Moorhead / PRNewswire (picked up in several publications)
Dell Technologies Leads AI PC Movement with New, Redesigned PC Portfolio

Dell / AI PC / Patrick Moorhead / Investing.com
Dell unveils new AI-enhanced PC lineup for professionals

Dell / AI PC / Patrick Moorhead / IT Brief
Dell unveils streamlined AI PC portfolio with focus on productivity

C Code / Jason Andersen / InfoWorld
Researchers build a bridge from C to Rust and memory safety

CES 2025 / Anshel Sag / Wired
AI Hardware Is In It’s ‘Put Up or Shut Up’ Era

HPE / HPE acquisition of Juniper Networks / Will Townsend / SDX Central
Can HPE integrate Juniper opportunities?

IBM / RISE with SAP on IBM Power Virtual Server / Robert Kramer / CIO
IBM offers SAP-on-Power users a new way into the cloud

Intel / Company timeline, market issues, resolutions / Patrick Moorhead / Tech Target
Intel’s rise and fall: A timeline of what went wrong

Intel / 2025 Plans & Goals / Patrick Moorhead / Fierce Electronics
Intel takes deep breath, faces new year in upbeat showing at CES

NVIDIA / GenAI / Patrick Moorhead
Nvidia’s new model aims to move GenAI to physical world

Oracle / Data / Matt Kimball / InfoWorld
Oracle offers price-performance boost with Exadata X11M update

Oracle / Data / Matt Kimball / Oracle Blogs
Global Industry Analyst Perspectives on Oracle Exadata X11M

WordPress / Ongoing legal battle / Melody Brue / Computerworld
Matt Mullenweg: WordPress developer hours cutback may or may not slow innovation – Computerworld

New Gear or Software We Are Using and Testing

  • Kindle Colorsoft (Anshel Sag)
  • Google Pixel Buds 2 Pro (Anshel Sag)
  • XREAL One AR Glasses (Anshel Sag)
  • Google Pixel Watch 3, 41mm (Anshel Sag)
  • Cisco Desk Pro (Melody Brue)
  • OnePlus Buds Pro 3 (Anshel Sag)
  • Insta360 Link2 4K AI Webcam (Anshel Sag)
  • Google Pixel 9 Pro Fold (Anshel Sag)
  • Google TV streamer – Matter and Thread features (Bill Curtis)
  • Various Matter devices (Bill Curtis)
  • ASUS Zephyrus G16 Gaming Laptop (Anshel Sag)
  • iPhone 16 Pro (Anshel Sag)

Events MI&S Plans on Attending In-Person or Virtually (New)

Unless otherwise noted, our analysts will be attending the following events in person.

  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead)
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • Cisco AI Summit, January 15, Palo Alto (Will Townsend)
  • World Economic Forum, January 20-24, Davos, Switzerland (Patrick Moorhead)
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • Microsoft AI Tour, January 30, New York City (Robert Kramer)
  • ZohoDay25, February 3-5, Austin (Robert Kramer, Melody Brue)
  • Cisco Live EMEA, February 10-13, Amsterdam (Will Townsend)
  • SAP Analyst Innovation Council, February 11-12, New York City (Robert Kramer)
  • RingCentral Analyst Summit, February 24-26, Napa (Melody Brue)
  • Arm Analyst Summit, February 18-21, San Francisco (Matt Kimball)
  • Microsoft Threat Intel Summit, February 25, Redmond (Will Townsend)
  • Siemens Datacenter Analyst Summit, February 25-27, Zug, Switzerland (Matt Kimball)
  • Mobile World Congress, March 2-7, Barcelona (Will Townsend)
  • Adobe Summit, March 18-20, Las Vegas (Melody Brue)
  • Extreme Networks Connect, May 19-22, Paris (Will Townsend)
  • Zendesk Analyst Day, March 25, Las Vegas (Melody Brue)
  • Oracle Database Summit, March 25, Mountain View (Matt Kimball)
  • IBM event, March 25, NYC (Matt Kimball)
  • Canva Create & Analyst Day, April 8-10, Los Angeles (Melody Brue)
  • NTT Upgrade, April 9-10, San Francisco (Will Townsend)
  • RSA Conference, April 28-May 1, Las Vegas (Will Townsend)
  • Nutanix.NEXT May 6-9, Washington DC (Matt Kimball)
  • Dell Tech World, May 19-22, Las Vegas (Matt Kimball)
  • Zscaler Zenith Live, June 2-5, Las Vegas (Will Townsend)
  • Cisco Live US, June 8-12, San Diego (Will Townsend)
  • HPE Discover, June 23-26, Las Vegas (Will Townsend)

Subscribe

Want to talk to the team? Get in touch here!

The post MI&S Weekly Analyst Insights — Week Ending January 10, 2025 appeared first on Moor Insights & Strategy.

]]>
The Six Five: Talking Meta, NVIDIA, Qualcomm, AMD, Intel https://moorinsightsstrategy.com/the-six-five/the-six-five-talking-meta-nvidia-qualcomm-amd-intel/ Mon, 13 Jan 2025 19:40:34 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44915 On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss Meta, Qualcomm, Nvidia and more.

The post The Six Five: Talking Meta, NVIDIA, Qualcomm, AMD, Intel appeared first on Moor Insights & Strategy.

]]>
On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. Meta Goes Community Notes, Eliminates Moderators
  2. NVIDIA Outlines Its Next Big Strategic Opportunity Beyond Datacenter AI
  3. NVIDIA Widens Enterprise Software Mote With Agents
  4. NVIDIA Gets Into Developer Desktops: A Notebook Pipe-Cleaner?
  5. Qualcomm Expands Auto Empire & Lowers PC Entry Price Points
  6. AMD Goes Big With Dell & Targets Apple M4 With Max
  7. Intel Reiterates Go-Forward Strategic Priorities

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Watch the episode here:

Listen to the episode on your favorite streaming platform:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: Welcome back. The Six Five is back after a small two and a half week sabbatical. We were on vacation. We may or may not have been together in the Snowmass, Aspen area. Dan, is your brain back to work?

Daniel Newman: Yeah. We didn’t really get a chance to shut down that much, did we? We would ski in, ski out and yes, we were together. I would actually play that off a little bit more if there weren’t a whole bunch of digital evidence of pictures of us eating dinner in a, I don’t know what that’s called, like a teepee?

Patrick Moorhead: A yurt.

Daniel Newman: In a yurt in the middle of Aspen, or all the chairlift photos, mountain photos. It was a-

Patrick Moorhead: Steak photos. Half of our photos were steaks, I think.

Daniel Newman: It was a bestie fest.

Patrick Moorhead: Totally.

Daniel Newman: It was a bestie fest. And we hung out with Bezos a bit. Not really, but we were close enough to almost see him that one time.

Patrick Moorhead: Yeah. There was a bunch of superstars. J.Lo, we ran across in the Ralph Lauren store. There was a JD Vance spotting-

Daniel Newman: On the mountain.

Patrick Moorhead: … out there as well. And all the Kushners were in Snowmass.

Daniel Newman: Wasn’t Kurt Russell? For the older folk over at Kemo Sabe getting a new cowboy hat.

Patrick Moorhead: Yeah. Yeah. And I got a ridiculous picture of him. It looks like he’s smiling at me. I’m pretty sure he was smiling at Goldie Hawn, who was carrying a big huge hat bag. But we’re back and we started on Sunday night, you and I both flew to Las Vegas to attend the Consumer Electronics Show 2025. And I think we have a few topics related to that. So, what are we going to talk about? We’re going to talk about Meta going all community notes, eliminating most of their 14,000 moderators. We’re going to talk about the next big NVIDIA move beyond data center AI. They also announced some new enterprise software, which may or may not be widening their enterprise moat. NVIDIA also got into a developer desktop, but is it the notebook pipe cleaner? We’re going to see.

Qualcomm won a bunch of awards at the show, too many to talk about. But we’re going to focus on what they did to expand their auto empire, and also what they’re doing with the price points on their Copilot+ PC processors. And then we’re going to round it out with some AMD discussion, a big deal with Dell. They’re now targeting Apple’s M4 processors. And then we’re going to wind up with another chip company, Intel. We’re going to talk about what we found out related to their strategic priorities with the new leadership team, the Dynamic Duo. So, we’re going to kick this off with Meta, big announcement. The press is going absolutely bonkers. Essentially Zuckerberg and his three social media properties is going to go all community notes, very similar to X. Dan, is this the end of the world for everybody? Are we going to be crying memes out there?

Daniel Newman: There will be some crying memes, but Pat, let’s face it, there are always crying memes. When there’s too much moderation, people think that they’re having their free speech taken away. When they have too much free speech, people think it’s all hate speech. When I can’t tell you, Pat, that you still got food on your chin right now, it hurts your feelings. There’s a feeling hurting barometer that goes off and the expectation is you need a moderator to take that down. And then on top of it, there’s just the business and economics and politics. What we saw in November was a, we can span whether it was decisive, whether it was a landslide, whether it was a close defeat. There’s a lot of different metrics that could be used and describe what happened in the election. But what happened was there was a feeling that what would be considered to be strongly left or liberal politics needed to come back towards the center in some ways. I know, we’re not a politics show, so just put up with my crap and listen to me for a minute.

This is why you have Tim Apple, Tim Cook, Mark Zuckerberg, and basically every other CEO flying to Mar-a-Lago right now. Or if they’re invited and they’re writing their million dollar checks for the inauguration. Because the bottom line is, most of us feel that the politics that these folks articulate is some permanent state of their beliefs. But realistically, it’s a state of the climate of the economy and the world and the business, and their goal is to continue growing their companies in every different regime. I know that’s not Meta exactly, but let’s face what really brought the whole community notes and moderation to the forefront. Really goes back to some of the things that happened with early utilization of social platforms to drive election or opinion at the beginning of misinformation. This didn’t start in the Trump era. This goes back to the elections with Obama.

Patrick Moorhead: 2016.

Daniel Newman: What’s that?

Patrick Moorhead: It was 2016, I think.

Daniel Newman: That’s when it really started, its first Trump election. But it had been happening for some time. We got to this era where people figured out that they could use social media to move opinion. And by the way, people are not particularly able to discern between fact and fiction. They’re not able to always completely understand the nuance of information. We’ve seen how different outlets, media, different publications, this can span both of the sides of the political continuum, can take snippets of comments that are made out of context. I still think the best example of this is that Trump calling the far right extremists, very fine people. And that was debunked everywhere, including on Snopes. But I had an argument with my mother about this in the most recent election, because she still to this day believes he said that.

So, this is what’s going on. And what’s happening now is there’s still this bifurcated view of the world that X is a niche play, but it’s really become the number one news source for, I think Musk shared the other day, over 150 countries. It’s the number one news source. It is the first place, it is the fire hose of where people get information from all major media outlets. But also from these highly reliable citizen journalists that are out there that people trust, whether that’s you Patrick, about fitness and healthcare, or other people about technology. But in all seriousness, Zuck went the right way. And look, I do think there is a place for moderation, meaning when there is violent, horrific content, child exploitation that takes place. I think that is a real meaningful thing where maybe moderation should take it down before it becomes harmful and gets into the wrong hands.

Having said that, an opinion about healthcare, an opinion about a political bill of law, an opinion about an economic decision, an opinion about a political figure. We’ve seen over and over, Pat, on both sides of the political divide, information has come out, the media has basically done some type of censoring to provide the belief to the public that a story is wrong. And then we found out that that story turned out to be true. It’s happened a lot. And so the better way to do this is using real time community notes from sources that have to be validated, that have to be checked and have to be willing to be public about what the sources are, who the sources are. And that way people can read it, the community note appears. I don’t know about you, Pat, but every time I see a community note, I instantly go, well, maybe I shouldn’t have shared that, or maybe that is inaccurate.

So it works for me. I think it can work for others. So, I think Zuck is following the momentum. He understands that this country is politically divided, but that there are two very large swaths of potential consumers for the platform. There will always be people at the farthest edges of the fringes that are going to feel that they are not doing enough to support their political beliefs. But just like the way primaries have destroyed political parties, too much moderation can destroy social platforms. And we need more centrism. We need more town square, we need more debate in public, Pat, and I think this is a good thing. And I also think it’s good that he’s moving what’s left of the moderation to the great state of Texas. I believe I also heard that that’s happening.

Patrick Moorhead: Yeah. Good rundown there, Daniel. And really is a shift for Meta and whether it’s politically motivated or something different, it seems like we are moving to more of a centrist area. And just to be clear, it’s not removing all humans from the loop. In fact, the way to get reinstated, they’re actually increasing the amount of resources if you get banned or things like shadow-banned. And I think the biggest thing that came out, and I do appreciate the opacity of the blog. Zuck came out and said, “Hey, we’re going to loosen restrictions on some sensitive topics here.” As it relates to community notes, it’s just not one way or another. It actually, how many times has Elon been community noted?

Probably 5%, 10% of his posts get community noted, and I think that just shows that the system is working as it should. We will, and we’re already seeing examples in press stories about the type of content and how horrible, but I think I’d like for everybody to give it a chance. I also really like the focus on the high severity violations, like terrorism, things like child exploitation, stuff that I’m hoping that we can’t have a disagreement on. But all in all, I’m an old guy as you like to point out, and the pendulum always swings, and the pendulum is currently swinging in a different direction. So, I like X because of the freedom. That’s a personal statement. I don’t do a lot on Facebook, or Instagram, or I forget the text-based platform that I get bugged on, on Instagram.

Daniel Newman: Threads.

Patrick Moorhead: Yeah. But maybe this will give me a reason to go back to Threads.

Daniel Newman: And Pat, by the way, I don’t think, we have so much to talk about that we can’t spend too much more time on this one. But we are, I think today, we’re very close to a TikTok ban, so Zuck obviously sees a fairly significant reason to potentially try to win over more audience. That’s a big opportunity out there. I do want to say though, that watch he wore really showed that he’s a man of the people and I appreciate that.

Patrick Moorhead: Way to round it out and we can’t spend more time.

Daniel Newman: $900,000 watch dude, that’s epic. And by the way, such a baller move. We’re in the era where you could do that again.

Patrick Moorhead: Yeah. And the next thing he might do is show up on Rogan and go through this. That’d be crazy. So, let’s move to the next topic. So, NVIDIA stock is rocking because of what it’s doing in data center AI. But believe it or not, it’s actually still very engaged in elements like automotive, the industrial edge, and of course gaming. Dan, you and I attended CES where we had a couple engagements there at NVIDIA. I was fortunate enough to have a conversation with Jensen. It wasn’t scheduled, but it was very happenstance. I’ve known the guy since 1998, but you and I were both part of the Financial Analyst Day and the Q&A. And I think we both watched the keynote in the comfort of our own rooms. So, what is the next opportunity? And the company has been laying out cookie crumbs for a long time, and particularly on its financial analyst calls, in the amount of time it devotes to topics that are not data center AI.

So, this notion of primarily robots, we know that they’re in the automotive industry, but industrial robotics is the next big thing. And when I say industrial, that’s a loose term that can go even into healthcare. But essentially they introduced a product called Cosmos. And think of Cosmos as a world foundational model to be able to essentially train robots. And they’ve got different levels. And you might be asking, “Well, wait a second, isn’t that what Omniverse did?” Well, think of Omniverse as having very, very detailed and precise models in the physical world. It’s a little bit more of a challenge though to extend that into the virtual world and to be able to train to essentially make these robots a lot smarter. So, if you take Omniverse and you feed that data into Cosmos, the theory here is that the quality goes up.

And we’ve seen synthetic video worlds being created before. NVIDIA did it with Omniverse. And you see companies, like OpenAI with Sora, and you see companies like Adobe, they haven’t gone GA on this, but video Firefly to be able to create synthetic. And the challenge though has been if you’re trying to train a robot with a high degree of accuracy in what it does literally and how it interacts, the quality of the video output has to be higher. So, it’s clear that the next big planned growth area for NVIDIA is in robotics. You can use Cosmos for automotive. We’re going to see this, and let my imagination run wild, why wouldn’t we use Cosmos for something like entertainment, right? And the ability to be able to improve the output of entertainment content, basically a better Sora.

Daniel Newman: Is that it? Are you done?

Patrick Moorhead: Are you there, Dan? Tap the microphone. You there?

Daniel Newman: No, I was there, I was just-

Patrick Moorhead: Stock portfolio.

Daniel Newman: I’m not used to you talking that little. I’m just like, Lord, that was actually really fast. Pat probably has another workout to get to. When you’re getting huge, you just don’t have that much time for this stuff. No, in all seriousness, that was really a good breakdown. I’m going to try to tie some threads together. I know we have a seven five today with three NVIDIA tracks, and I know they’re different topics, but some of this is all the same to me. And so the thesis about NVIDIA has been that its business is basically valued for the number of data center GPUs it can sell, and then of course the attached software licensing. And it’s all really been attached to the amount of expected CapEx spent by the largest hyperscalers in the world. The antithesis of the business has been whether or not the hyperscalers of the world are going to continue to build all of the AI in the future on NVIDIA’s systems. Or are they going to vertically integrate with their own GPUs built with Broadcom or Bell? And by the way, these things are somewhat interdependent, but in some ways mutually exclusive, because I think the answer is going to be both.

Having said that, I said the entire CES keynote was all about NVIDIA showing that their AI extends beyond the data center, and here’s all the ways it extends beyond the data center. But, and it’s an important but, at the same time, all of this will create more demand in the data center. Because let’s face it, $30 $40,000 GPUs, sold 72 at a time with hundreds of CPU cores and licenses attached to that. That ultimately end up then creating more licenses for Omniverse, and then more licenses for edge based solutions and Orin in the vehicles and everything else that’s going on is a really, really lucrative business. And so the themes of this event, and I’m going to stay with the one theme here, but there were three. The three themes were, it was physical AI, it was automotive, and it was PCs and future of client. So, you hear basically talking about this physical AI thing. And the physical AI thing as I see it, is the next trillion dollar plus leg up for NVIDIA. This is where they get the next trillion dollars in market cap. Why? Because two things.
One, it’s really hard.

So, all the things that you’re hearing about ASICs and accelerators for the data center, what NVIDIA is doing here, none of that stuff can do that right now. Now they will be able to build that stuff. But the second thing is how this all threads together between that simulated world with synthetic data and the physical world with real data. We heard Elon Musk sees Tesla stock trade at something like five times higher than the average mag seven in terms of forward multiple. And the reason is, is because people believe he’s going to solve two major themes of the future around AI. One, autonomy and driving, which by the way, Waymo does better right now, but he has more cars, more data. There’s a lot of reasons why Tesla could win. The second thing is he’s going to solve humanoid robots. So he’s got, you remember the guy on stage in the suit two years ago, and then this year they had the ones pouring drinks that were semi, I think there was someone in the back probably with a remote control doing things. But we went to CES this year, you saw physical robots.

You saw robots that could move boxes and robots that could walk around with us, and you saw robots that could offer companionship. But the thing is, is that if you just get to the level of every human has a robot, just that, and you’re talking about something that costs as much as a Tesla 2 or 3 at 30 to 40,000, the economics of this become really powerful. Well, someone actually has to build the software stack to train that robot to be useful. Otherwise, you end up with the little things we have now that walk around the house and then fall over and set themselves on fire. The thing that walks around your pool that forgets to clean. There’s robots now. There’s robots in commerce, it’s everywhere. But Jensen is basically saying the mechanics of the human, the physical mechanics of the way we move, say walking up a hill or down a hill, is really hard. To simulate that you would need to literally have tons and tons of training data.

Well, with what they’ve built in the connective tissue between Cosmos, GR00T, Omniverse, you can basically simulate this stuff faster, build models, and then get us to humanoid robots a heck of a lot faster. So, it’s really cool, exciting stuff, Pat. And so I know we’re going to talk more NVIDIA, so I’ll end that theme on that one. But I really think the market misunderstood what happened. I think it was basically, and he said the ChatGPT moment for physical AI, but I just think it’s the stack. The stack that’s become the all in data center, digital AI stack that people are using for software. We’re seeing them build something very similar for the physical world, and that’s going to be really hard to replicate and it could be a really great long term moat for NVIDIA.

Patrick Moorhead: Good stuff, Dan. And by the way, one other thing, big announcement that came out of CES was NVIDIA announced yet another enterprise software stack that sits right on top of NeMo related to agents. Dan, what is it, and what’s the significance of it?

Daniel Newman: So, basically agentic AI for all of you out there, there’s these two schools of agentic. And so I’m going to talk about it a little bit theoretically here and, Pat, maybe you can add some color as well on what NVIDIA is specifically doing as well. But this has been the hottest trend in digital. So, we saw this evolution of, two years ago, the moment was about generative, text-based, and then you saw multi-model, multimodal, text, speech, video. And then you started to see smaller language models evolving, which is text language model for very specific use cases. And then you’re seeing the companies like ServiceNow, Salesforce, Microsoft building the next wave, which was assistance, which are basically AIs that could do a specific task.

And now you’re seeing agents, which is the idea of being able to basically deploy a force of assistance that can work on various tasks and actually co-communicate. They can intercommunicate between one another. And so you could essentially put forward someone to do a work stream on supply chain, or a work stream on operations, or a work stream on finance. And you can say, “We’re working on these 3, 4, 5 work streams,” and all the agents can work together. So, what needs to happen for this to work is there’s a lot of orchestration that is required. And so there’s these two consumption layers that are going to be really important for people out there. There’s the software ISV consumption, which is what we’ve heard a lot about.

So just use Agentforce as an example. Agentforce, the idea is you’re a Salesforce customer, you can get pre-built agentic AI that can then be put into your environment that can do sales motions. That’s very cool. Well, guess what? Someone has to develop all that stuff. It actually has to be developed and Salesforce’s platform itself doesn’t really develop agents. What happens is they train agents on a platform like what NVIDIA is building with Blueprint, that can basically take a bunch of pre-canned agentic capabilities, mix it with proprietary data sets, put it all together so then it can be deployed in software. And NVIDIA has been left out of the overall agentic conversation, believe it or not, despite the fact that they’re the hardware for training everything. You really didn’t hear about agents and NVIDIA until this event. And so I think that’s in my take, that agent and Blueprints to NVIDIA is for the developer what the ServiceNow Microsoft and Salesforce products are for the user, for the enterprise. Someone has to build the agents, NVIDIA is trying to make that easier.

Patrick Moorhead: Yeah. It’s pretty wild the amount of software that NVIDIA is generating. And I’ve been super disappointed with the enterprise, I’ll call it the on-prem enterprise capabilities to be able to do that. Really nobody has stepped forward yet. I think the closest example of it are bits and pieces of things that, let’s say IBM has done with their Watsonx platform. You’ve seen a little bit of the data management action from companies like Cloudera, but pretty much a little bit of model management with Cohere, but there hasn’t been a lot of action here. So, one thing that Jensen had said or reinforced, I’d heard this before, he reinforced, “Hey, we’re market makers, not market takers.” And I think this is a market making opportunity. And they had put forward different agents; a research assistant, a financial analyst, employee support, customer service, sales development, all the ones that Daniel, you and I have seen in our travels and travails to these enterprise, SaaS companies. And if they can light these things up for the enterprise, I think they’re shrinking the time to market that looks like a couple of years to really scale enterprise AI. And because they’re NIMs, you can run them anywhere. You want to run them in the public cloud, do it. You want to run it in your private cloud, run it. So, AI Blueprints sit on top of NVIDIA NeMo that I know it’s not per se a data management platform, but you’re evaluating, you’re customizing, you’re curating, there’s guardrails and it also includes a RAG capability on top of that.

Daniel Newman: And by the way, they’re doing a lot of interesting stuff with models, the Nemotrons, what they’re doing with Meta, Pat. You could say it’s not a data management platform, but I would argue how long do we need data management platforms in an era where AI should be able to discern different data format types? I don’t want to get too far ahead of ourselves. By the way, one other thing, I just got an email literally as we’re talking from NVIDIA about their AI Blueprint for retail shopping assistance. That’s the example though, is when you hear the stories from a Salesforce about what they’re doing with Nordstrom, someone has to actually build and train a model that can then be implemented into the SaaS platform.

Patrick Moorhead: That’s right.

Daniel Newman: That’s the TLDR here, right?

Patrick Moorhead: It is, it is. And it’s funny, I don’t applaud NVIDIA on a ton, but I think this is great because quite frankly, there’s nobody else doing this that you can apply this to on-prem capability. And Daniel, I have not talked to a single CIO or a single CDO that says, “Now, I’m going wholesale into the public cloud.” They want their strategic data for the most part to sit there. They’re afraid of the cost of the public cloud. Not everybody of course. Just look at the growth that the public cloud is experiencing, but I think this is a very positive move. So, let’s move from software and agentic services to NVIDIA getting into desktop. So, NVIDIA is not new to creating devices. In fact, nearly 15, 20 years ago, even when I was at AMD, they did a foldable chip for WinRT.

It was ARM-based, had a GPU on it, along with Qualcomm and Texas Instruments. And then they’ve also done these gigantic desk-side developer platforms as well. I believe it was a desktop desk-side DGX. It was a monster though. And what they introduced at the show was something called Project DIGITS. Think of it as a very compact desktop. And not too many things surprise me, but this one did. I knew because people are telling me, not NVIDIA, but everybody else, that they’re going to be bringing out a notebook processor that will have devices in it, the first half of ’26. So, when it relates to it being a, I like to call it a pipe cleaner, I really find this fascinating. The size of it is crazy. It’s literally like this big, which I-

Daniel Newman: It’s like a Mac mini.

Patrick Moorhead: Yeah. And it just has the coolest-looking design too. It looks like it fits in a 1U rack. Obviously it doesn’t, but that’s the feeling that you get. Has a new chip called the GB10. Not to be confused with the GB100, and GB is Grace Blackwell, but they partnered with MediaTek to make this happen. And I think they had to partner with MediaTek because low power is really not what NVIDIA is great at, and the ability to churn and burn an SoC in very quick time. We’ve seen this with Qualcomm. Their ability to churn out something so quickly is absolutely amazing. And the performance is really, we really don’t have a sense even though we know it has 20 ARM cores, and this is the new Blackhawk core that I wrote about last year. But we do know its AI performance, which are just absolutely kind of mind-blowing. I think it’s half of the performance of ADA out there at 500 teraflops, which is seriously, and that’s FP4 performance, which is absolutely insane.

So, priced at $3,000, available May of 2025. In the limited time that I had to talk with Jensen, I think the first thing I asked him is, “Hey, can I get Windows on this thing?” And I’ll note that there was a lot of confusion based on what Jensen said. So, this ships with Linux and not Windows. And I think when he talked about this on stage, he did not get his message across very well. And you saw Satya Nadella weigh in as we saw Pavan on X trying to clarify what was going on here. But essentially Windows has a Linux subsystem, and the TLDR is all the great software that you can run on DIGITS, because it comes with the full NVIDIA stack, you can run on Windows in what’s called WSL, and that is it. NVIDIA is not abandoning Windows and moving everything to Linux, but their software stack, their entire software stack is Linux optimized, because it’s for the data center.

Daniel Newman: And there you have it. I think some of it’s pretty provocative, Pat, and I am with you. I don’t see any massive immediate changes. Like I say, the end of enterprise SaaS is going to lead to these deprecated services and databases and abstractions and new toys. There might be a future in which a Linux Kernel and a chat interface is all you really get when you open the lid. Because you certainly aren’t going to see apps on your phone the same way in the long term, if agents actually do what they’re supposed to do. That’s the bottom line. If they actually do what they’re supposed to do. So it’s funny because all these companies are in conflict with themselves. It’s like they’re creating their own demise, but at the same time, if they don’t, they’re going to miss the trend and become the Kodaks and Blackberries of the next era of pundit stupidity. Anyway, I think you hit that one pretty good. I don’t have a lot to add to it. I think I did like the launch. I thought it was cool how he came out and took the big GB box and made it small. I think the way he told the story, I think the idea of having a high-powered AI machine at your desktop for developers, super, super-duper cool. I do want to just say before we leave the topic of NVIDIA and CES, that I did find it particularly funny that the CEO of CES kept calling it NVIDIA.

Patrick Moorhead: I was wondering if you’re going to bring that up. Gary Shapiro.

Daniel Newman: Yeah. The most consequential company of our time, NVIDIA. All right. I’m not going to say whether that really bothers me or not. I’m just going to say that if it is the most consequential company of its time, we should learn how to say its name. So, for everyone out there that’s listening that’s been unsure of how to say it, Pat, how do you say it?

Patrick Moorhead: NVIDIA.

Daniel Newman: Thank you. I didn’t want to say it wrong, so I put it back in your lap.

Patrick Moorhead: I appreciate that.

Daniel Newman: All right. That’s it. Let’s keep going. I don’t need, you nailed DIGITs.

Patrick Moorhead: Let’s get into Qualcomm. So, Qualcomm came in, did seven or eight announcements. They were primarily around automotive and an extension of their PC, SoCs, Dan.

Daniel Newman: Well, first of all, it’s really interesting, Pat. A year ago, you were on stage with a whole bunch of the executives of the OEMs on the devices side. It was the year of the AI PC. Being self-critical and going back in a year of review, I never said emphatically that we were going to have a super cycle, but I think I was probably more optimistic about how the year would go than it went. I will mostly blame software for being wrong because if software roadmaps had met what I’d expected, I think there would’ve been more allure to people buying AI PCs. The reason I’m pointing this out while we talk about Qualcomm is Qualcomm was obviously the new, exciting, disruptive, ARM-based entrant that was going to create the new future of these Copilot+ PCs. And the year in review is you ended up with a really exciting silicon year, a race and an ARMs between Intel, ARM-based Qualcomm, AMD, and now to a lesser extent, MediaTek and NVIDIA, all ripping and racing towards this era.

But here’s the question and the point, Pat, is Qualcomm comes out with a slightly lower cost, a mid-tier, to address probably a subpart of the market that their early Elite product was going to be too expensive for. Part of which is because we’re getting into this era of AI PCs or Copilot+ PCs. Part of it is because what’s ended up happening, at least so far, because of maybe software not quite meeting the moment yet, yet is an important word here, is we’ve ended up with needing to address price points again to be able to attract a bigger market to try to accelerate updates. Because what we do have, Pat, are really powerful next generation devices with really good battery life. We did end up getting that. And in the process we ended up getting also some of the best out of, and we’ll talk more about Intel later, but some of the best out of some of the chip makers to really get back to competing. Because in the beginning, Pat, it looked like Qualcomm was just going to run away in terms of specs.

And now we’ve got a three-headed race and we’ve got some really competitive parts, but the new X product is going to hit that middle price point, Pat. And I think that’s all about accelerating people moving over and buying in and utilizing, because I still think there’s a lot of questions on what software is going to drive people and how much on-device stuff happens in terms of it being a value add for the consumer.I get why cloud companies want it. It’s expensive to have all these things being done in the cloud, but do the buyers notice the latency and care about having to go to the cloud to do a query on ChatGPT, Perplexity, or any other language model? I think that’s something we got to discuss, Pat, and we’ll spend some more time talking about that, I’m sure over the course of the year.On the other side of this, Qualcomm has been an absolutely amazing journey in automotive, and the announcements keep ticking. And while we did hear NVIDIA seem to have a more bullish forecast, Intel has talked a little bit about some automotive progress that it’s making. Qualcomm has been over the last, I’d say three or four years, the de facto or undisputed winner of the automotive opportunity building a nearly $50 billion automotive pipeline. Business is growing every quarter. They’ve had more design wins.

And here at CES, they just furthered partnerships with Google, partnerships with Garmin for GPS and Unified Cabin. With Tata, which of course is a big India opportunity for the company and in other markets. Partnerships with Alpine, Amazon, Leap, Mahindra, I’m just reading this by the way, Hyundai, Royal Enfield all doing more with AI-powered Cabin. They have a new partnership with Panasonic for the Cockpit.And Pat, they’re also showing more and more how generative AI can work in the vehicle. And of course we know you can’t have full language models, and there’s a lot of work to be done to synthesize these models down to make them power efficient enough to be run at the edge. So, this is something Qualcomm is well-equipped for. The customers, the wins, the designs, the partnerships indicate its success. It’s continued progress as I see it. It was very a high volume of what I call small movers, that all show Qualcomm is moving in the right direction.

Patrick Moorhead: So, I like to look through old photos that I took at shows years ago, and literally four to five years ago, I didn’t know that Qualcomm was even looking at ADAS. And they asked me if I wanted to take a ride in their ADAS-infused car, and that was a huge surprise to me. And here we are, the backlog stands at $45 billion and in their Financial Analyst Day, they put 4 billion in automotive revenue by fiscal 2026. And then doubling that to 8 billion fiscal ’29, really driven by ADAS and even fully autonomous driving. And their announcements here were really just, I’ll call them tuck-in announcements, this Snapdragon Ride Flex SoC with Hyundai, important movement. It’ll be interesting to see on the software side, Qualcomm versus NVIDIA and what they can do. NVIDIA has made most of their moves, I would say in the autonomous driving. Qualcomm has taken most of NVIDIA’s dashboard design wins, and Qualcomm has always been a huge player. And obviously the connectivity, whether it’s 3G, LTE, or 5G inside of the car.

So, strategically, the company just gets moving forward here. One thing I want to focus here is, is on this new Snapdragon X. With adding X, now there’s three tiers. You go all the way from $600 to whatever, $2,000 with the entire Qualcomm Snapdragon platform. It’s going to be really important for Qualcomm this year to do a few things. So, first and foremost, they really have to nail education as it relates to performance. And what I mean by that is, is Qualcomm doesn’t win all the performance figures all the time, but what they do do is they win most of the speed races when it’s not plugged in, like you’d expect a notebook to do. Can they punch that through there? Qualcomm has to scale. It was a challenge to do that before because of a couple of things going on. First of all, price points. But now as low as 599, you’re going to see platforms out there. You also have a combination of desktops and notebooks that will be coming out. I did say last year that I expected the Super Sky goal to start sometime in the second half. What I didn’t factor in is that Microsoft would miss their software schedules of their Premier AI feature by eight months. So, Recall was supposed to be out May, June, GA, it is still not.

It is a very useful tool out there that I use every day and I think it’s pretty good, but it was severely delayed. And the second thing is, is to start getting traction inside of the enterprises to take advantage of the Windows 10 to Windows 11 transition. Qualcomm is investing a tremendous amount of money in there, but it’s going to be tough. AMD and Intel are running serious plays in the enterprise and don’t want to give up any real estate whatsoever. You have Intel playing defense and AMD playing hardcore offense. So, anyways, good stuff from Qualcomm. So, let’s talk a little bit about AMD and some of the announcements here. They made a ton of announcements. What? Is this your topic? Huh?

Daniel Newman: No, it’s all you buddy. Don’t let these gorgeous pearly whites distract you. Just keep going. I’m just smiling at my bestie.

Patrick Moorhead: Sorry, I just thought you were looking at me and laughing. That’s okay. So, AMD made a lot of announcements, but I think the two that we thought were the most impactful was they signed a go big. They announced a go big deal with Dell on their commercial lines that cut across desktops and notebooks. And it’s interesting, we’ve seen a lot over the years, I’ve seen many announcements. I’ve been involved in deals with Dell and really up to this point, Dell has not been able to effectively drive volume of AMD equipment, AMD desktops and notebooks. And it’s not just a Dell issue. AMD did not invest enough in the channel. I still don’t think they’re investing enough in content and programs. AMD really is about best product wins, but when it comes to enterprise desktops, that’s not just the case. That’s not the case. You’re not going to get somebody to pivot and go with you because your platform is $50 or $100 dollars less because it’s really a TCO. And quite frankly, people don’t want to change if they’ve been satisfied with what they’re using.

So, moving forward, I am hopeful that, and I’ve seen, I’ve talked to both companies, all the executives here, I’ve looked at how the two companies are positioning it. I’m looking forward to get a even deeper dive in the go-to market with both AMD and Dell. And made AMD stock rock, which good for them, but they’ve got to turn this into a reality here. Final thing I want to talk about is AMD brought out with what I think is a super differentiated product called Ryzen AI Max, lovingly referred to as Strix Halo. It’s the top of the line AI processor with 50 TOPS, but it’s not necessarily the TOPS. It has a big GPU maybe up there with the RTX 4060, a unique memory, a shared memory system. And it comes at a reasonable TDP between, I would say 45 watts is reasonable, but it goes all the way up to 120. This is a multitasking monster.And as you would expect, AMD outlined a lot of use cases related to content creation, which Apple has tended to dominate as of late. And I like that AMD went right after the M4 on this. So, probably the most differentiated product that AMD brought out the entire time. I believe that they’re partnering, at least publicly now, with HP and ASUS. I want one of these by the way, in a desktop format.

Daniel Newman: I want you to have one. I’m going to buy you one.

Patrick Moorhead: Thank you.

Daniel Newman: Merry holidays.

Patrick Moorhead: Thanks dude.

Daniel Newman: All right. Look, I will just add to this that I think it was a really big signal of intent that both AMD and Dell are finding a way to expand and grow that partnership. I think for AMD it’s an important growth area, especially in client. I like the work that Dell is doing to simplify its nomenclature or its design. Pat, you and I are both, I’m becoming a product marketing guy as I spend more time around you. And what I would say is I felt for a long time that Dell’s notebook lineups were really hard to discern and to understand, unless you were paying attention all the time. I think that Dell-AMD, this was probably the biggest go forward, I think the two have had so far in this space. And so I think it was a really strong moment for the company. In terms of what’s going on, I would also add that I think Dell, they did a bigger splash with Qualcomm when the first wave, they did that at Dell Tech World. We’re going to talk about Intel in a second, but let’s be really candid about it. Look, there’s some hope for Intel on the product side.

I’ve got some positive things I’ll say about it, but I also think if you’re Dell or you’re any of these companies, this is the hedge. You have to be hedging right now. There’s so much uncertainty. I think a lot of good work is being done to shore up some of the risk by Intel. But gosh, in AMD, Dell needs a stronger partnership. Dell and Qualcomm needs a partnership because until we really know for sure that Intel is out of the woods, these companies would be crazy to not be diversifying a little bit. And I think this was a signal of that intent. But at the same time, everyone out there knows Dell and Intel are like, and there’s a huge and very tight, very long and very deep partnership. So, we’re going to have to sit back a little bit, Pat, and wait and see if the numbers do the talking.

Patrick Moorhead: Let’s dive into our final topic. You and I both met with MJ at Intel, one of the co-CEOs. MJ is running products. We also attended Intel’s press and analyst conference as well. What were your takeaways, Dan?

Daniel Newman: Yeah. The conference was, it was a iterative moment of what’s going on. If you wanted to do the check, check, check, it was like Lunar doing really well, Arrow not as bad as people were saying. I think that was the takeaway is that there’s some of the kinks in Arrow Lake are being worked out. There’s a lot of focus on Panther Lake and what’s going to happen there. That seems to be a big moment for Intel, and that’s really the big ATNA moment that people are going to wait and see at a high volume part. But also, Pat, I think you said this earlier, maybe you said it online, maybe you said it offline. But I also think it was a really important moment for the company to refocus that its product first, process second. Which to me by the way, and I’m not starting any rumors, I don’t have any inside information. I’ve talked to the executives there for a long time. I think the separation/ or spin is getting closer and closer. I just think what I felt from that event was product, product, product. This company is going to get back to its roots of strength and product.

I also think it was pretty willing to acknowledge that a lot of its pain has less to do with the client and more to do with data center. The fact that it just doesn’t have a competitive AI part right now is just absolutely killing the company. Has to get that going. It has to get DCAI back on track. So, what I said in that last segment, I’m just going to weigh in a little bit more on it and then I’m going to pass this to you, because I’m also getting close to time and I want you to have some moments here. I got corporate BS meetings, everybody. Don’t you know how many corporate BS meetings I got to go to? So, I got to end this podcast with you. So much corporate crap, I can’t-

Patrick Moorhead: So sad.

Daniel Newman: When I’m old Pat, like you, I’m just going to pod all day. That’s all I want to do. Just podcast and crap talk on Twitter. But the other thing I would say is that the client’s side of the business has shored up. It just isn’t that bad. I think that they’ve made the big investments. They’ve kept the biggest customers in tow in many ways. They haven’t seeded as much market share as people have feared. The ARM stuff has done okay, but it hasn’t crushed Intel in any way more than Intel was already being crushed by its own challenges and problems. And I think MJ got up there and reiterated that a little bit, but I still say, Pat, until they get something figured out competitive in the data center beyond Xeon, it’s always going to be a problem for the product side. And I already said my piece about the foundry side. I really have a suspicion and a hunch there, but I’m going to let that play out.

Patrick Moorhead: So, from a priority standpoint, strategic priorities, this is really a big carry through all the way from the themes of Pat’s departure, the discussion that both you and I had with the co-CEOs. It’s subtle, but it’s important. Overall, for Intel to fully succeed, it has to have competitive products, competitive process, competitive foundry. And my thought was that 75% of the focus before this was on foundry, and now its products. Which it’s interesting, I think it’d be very hard for the company to have competitive products without a competitive process. But if you’re using somebody like TSMC, you could have much more competitive products.And I think this comes down to resource allocation and focus of the senior leadership team. And it seems like a subtle change here, but this is, I think Pat probably exited the company based on its competitive posture with its products, particularly on the data center side. Xeon has a lot of highlights, but AMD still continues to gain market share with the hyperscaler data centers, and it’s nearing double-digit with the enterprises as well.

And interestingly enough, doing really well with Dell on the EPIC side. But I think that the one thing that nobody can dispel is that Intel missed the boat on a data center AI GPU. But somehow in the same period, AMD was able to turn their 64 bit high precision FLOPs monster into a very competitive inference with low precision TOPS. And Intel will be coming out one to two years after AMD on this, and that’s probably a fireable offense, I guess, or I’m sure added to the momentum for a management change. Listen, the proof is in the pudding. I do believe that MJ is going to be listening more to customers, and she already has that reputation on the client side. Intel is a very large organization, where we see how quickly they can get this out. One thing that I find super interesting is, is that I do believe that the marketing and sales tactics are going to shift more back to where I remember Intel being strong. I think that Intel is going to price their products to not lose any market share, particularly on the client side. We will see. So, Dan, great show. Thanks for going over. I know that your corporate meetings are important to you now.

Daniel Newman: Well, you going to go to the gym again? What are you going to do the protein shake?

Patrick Moorhead: It’s super important. I’ve got six advisory calls.

Daniel Newman: You going to eat some eggs this morning?

Patrick Moorhead: Oh no, I’m absolutely going to go at steak eggs and avocado.

Daniel Newman: I just want to say this, I’m going to actually stay one more minute to say this to everybody. I walked around CES with you for three days. I had 50 strangers come off the street just to tap you on the shoulder and tell you how inspiring you have been. While I’ve had the joy of spending the entire process with you and being inspired by you, it was very inspiring and aspirational for me to see the difference that you can make in people’s lives. So, hopefully for everyone out there, we make a difference in the lives of the way they see tech, the way they think about what they buy, what they purchase. We certainly don’t give stock advice, so don’t ever make any decisions based on that. But Pat, way to keep crushing it, buddy. I’m going to drop.

Patrick Moorhead: Great show, Dan. Thanks for the thanks for the kind words. But listen, you were the guy who I was watching who could figure out how to stay healthy on the road. And so I think there’s a back at you bestie. When you used to tell me you would get up at 5:00 to go work out at 6:00, I thought you were nuts. And here we are both in the gym at 6:00 AM. By the way, we’d be there at 5:30 if it opened earlier in Las Vegas. But hey, thanks everybody for tuning in. We’re looking forward to a great year. We’re going to try to shift a little bit of the show. We’re going to give a little bit of details on that and we’re going to go broader, probably broader topics, talk about the impact to the overall. We’re still going to talk about companies, but we’re going to give broader impact. Spend a little bit more time on that. Give us some feedback. We’re always looking for feedback on the show. What do you like, what do you dislike? Thanks for tuning in. Take care. Have a great-

The post The Six Five: Talking Meta, NVIDIA, Qualcomm, AMD, Intel appeared first on Moor Insights & Strategy.

]]>
RESEARCH PAPER: Sustainable Performance in the Datacenter https://moorinsightsstrategy.com/research-papers/research-paper-sustainable-performance-in-the-datacenter/ Wed, 08 Jan 2025 13:00:46 +0000 https://moorinsightsstrategy.com/?post_type=research_papers&p=44874 This report explores datacenter power challenges and how Solidigm’s new D5-P5336 SSD helps solve for both performance and power consumption.

The post RESEARCH PAPER: Sustainable Performance in the Datacenter appeared first on Moor Insights & Strategy.

]]>
There is tension between a business’s need to maximize the value of AI across the organization and its need to drive down its energy consumption. On one side is the need for performance—powerful (and power-hungry) GPUs attached to highly performant storage. On the other side is a power budget that is expensive—in terms of carbon and datacenter footprint. Both sides contribute to significant financial costs.

While many datacenter professionals look to GPUs and CPUs as the key contributors, they often overlook the role of storage in this power consumption equation. This Moor Insights & Strategy (MI&S) pulse brief will explore this power challenge and how Solidigm’s new D5-P5336 SSD with a capacity of 122.88 TB helps datacenter operators solve for both performance and power consumption.

Click the logo below to download the research paper and read more.

Sustainable Performance in the Datacenter

 

Table of Contents

  • Summary
  • Sizing the Sustainability Challenge
  • The Storage Performance Tax
  • Solidigm Drives Sustainable Performance and Capacity
  • Call to Action

Companies Cited:

  • Solidigm
  • International Energy Agency
  • Dell Technologies
  • HPE
  • Lenovo

The post RESEARCH PAPER: Sustainable Performance in the Datacenter appeared first on Moor Insights & Strategy.

]]>
Six Five Connected – New Dell Product Lines and Announcements from CES 2025 https://moorinsightsstrategy.com/the-six-five/connected-with-diana-blass-dell-client-portfolio-innovation/ Tue, 07 Jan 2025 21:00:43 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44869 Diana Blass, host at Six Five Media, shares her insights on Dell Technologies & Alienware's groundbreaking CES product showcase, highlighting the future of computing.

The post Six Five Connected – New Dell Product Lines and Announcements from CES 2025 appeared first on Moor Insights & Strategy.

]]>
The PC is back in the spotlight 🔆 and Dell unveiled a slew of innovations at CES 2025. On this episode of Connected with Diana Blass, we get the inside scoop from several experts at Dell Technologies and Dell’s Alienware team on their CES 2025 product showcase.

Key takeaways:

  • Dell is bringing AI to the edge, integrating powerful MPUs from Intel, Qualcomm, and AMD into its new line of laptops, meaning faster, more efficient performance for demanding workloads
  • Simplified Portfolio: Dell has streamlined its PC lineup with three clear categories: Dell, Dell Pro, and Dell Pro Max
  • Dell is committed to using recycled materials and reducing its environmental impact
  • Insights on how these innovations address current tech trends
  • The impact of these products on the future of personal and professional computing
  • Their vision for the evolution of computing devices

Learn more at Dell Technologies.

Watch the full video at Six Five Media, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Connected with Diana Blass is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Diana Blass: Hey, guys. Welcome to Dell’s CES Preview. We’re getting a look at all of its latest innovation. We’re inside the living room, I guess you could say.

Kris Graft: Welcome.

Diana Blass: A monitor combining the powers of video and audio.

Kris Graft: And it feels like someone’s right beside you.

Diana Blass: Enabling us to get even more out of our PCs. Can I lift up the finished product here?

Amanda McNight: How much do you think it weighs?

Diana Blass: Four pounds?

Amanda McNight: 3.6?

Diana Blass: Oh, my gosh, I was pretty close. But not just any PC.

Michael Dell: These PCs are beautiful, they’re blazing fast, they’re powerful with all day life battery.

Diana Blass: And that calls for a big announcement.

Michael Dell: Today, we’re announcing a simplified portfolio that unifies our PC brand under the Dell name.

Diana Blass: A brand that goes back 40 years in the marketplace, all starting with the original PC that Michael Dell launched in 1985 and a lot has changed, so let’s go check it out. Dell’s DC portfolio has been simplified into three categories. Dell, designed for work, school, and play, Dell Pro, designed for commercial use, and Dell Pro Max, designed for maximum performance. Each category is even further broken down into specific tiers tailored to consumer’s needs. Inside the CS Preview, we got to look at some of these devices.

Amanda McNight: This is our new Dell Plus line. You guys heard about the new naming. So historically, Inspiron, we would’ve talked about in Inspiron 14 and 16 Plus. These are the follow-on devices that we’re launching. So there’ll be the Dell 14 and 16 Plus laptops and two-in-ones.

Diana Blass: These devices have been boosted in design.

Amanda McNight: So we got feedback from our customers that they wanted more viewing angles. So we’ve redesigned the hinge so it lays completely flat for you. So this is the plastic, we’ll call it the bottom cover that ships today, you can see Inspiron on the bottom. And so you can see the copper here for our thermal solution. With these devices, we’ve actually moved to aluminum, so you see the metal on the bottom. And then we’ve replaced the copper with graphite sheets, so we’re dissipating air faster with the graphite. The aluminum allows us to get a little bit thinner, so you increase the air gap between the motherboard and the bottom. So you’re pushing more of that cool air through.

Diana Blass: Over here, we got to see how the Dell Pro and select Dell Pro Max are the world’s first commercial notebooks built with a modular USB-C port.

Katie Green: The power port is the most commonly used. You plug in power all day in and out, and it can kind of wobble. So we wanted to make it extra durable, so it’s four times more durable than the previous designs, and then it’s attached with two screws instead of typically being soldered onto the board. There we are. From there, you can just pull the port straight up and you have the USB-C port, fully replaceable.

Diana Blass: And it’s not just about PCs. Dell is also redefining what monitors can do with innovations in color accuracy, eye comfort, and even audio.

Kris Graft: You’ll sit in front of this display and you get the beautiful 4K, which has amazing contrast ratio, really deep blacks, super smooth motion with 120 hertz. And it also has five speakers that are installed across the bottom. When you move your head around, it is precisely placing the audio, like wearing headphones without wearing headphones.

Diana Blass: But let’s get to the bigger story here. The PC is back in the spotlight, more powerful and more relevant than ever before, as AI shifts from the data center to the edge.

Jeff Clarke: Quite frankly, the PC is the closest thing to end users. It’s the closest thing to their data, it’s the closest thing to their intent.

Diana Blass: At Dell CES 2025 preview, the Dell team painted a picture as to why it’s uniquely positioned to deliver upon this craze. Much of it boiling down to innovation around MPUs, the chips that make AI magic happen.

Michael Dell: As a powerful source of highly efficient, low latency, low cost computing, it’s going to change the way work is done.

Diana Blass: Dell is also prioritizing an open ecosystem, integrating silicon with Intel, Qualcomm, and now AMD into its portfolio. Dell says the AMD chips will roll out first in Dell Pro and Dell Pro Max lines.

Kevin Terwilliger: What we’ve learned is actually those AI workloads are very diverse, and we want to meet customers where they are, similar to our server business, whereas we understood more of those workloads, we added more silicon options.

Diana Blass: It’s all part of Dell’s strategy to match the right performance to the workload. But how exactly does that work? Let’s dive into the Dell Pro AI Studio to find out.

Marc Hammons: So as an app developer, you might say, “Hey, I want to go and introduce some type of chatbot into my application, but I don’t necessarily want to have to track what the silicon is underneath that application. Is it going to be Intel? Is it going to be Qualcomm? Is it going to be an AMD processor? What is it?” And so what it’ll do is kind of figure out, “Okay, I know what I’m running on. I know which model is best suited to deliver performance on that device. So let me pull down that model and use it.” And that takes the burden off the developer. They don’t have to learn about the model, they don’t have to learn about the tool chain. They don’t have to worry about all of that complexity. It’s masked and hidden from them, and they can make use of the service and it just makes development a lot easier.

Diana Blass: Now, if you’re thinking AI PCs, didn’t we already hear about this last year? You’re not wrong. But this year, it’s hitting a whole new level. Why? Because the clock is ticking. Windows 10 end of life is just around the corner, meaning enterprises are gearing up for a massive device upgrade. Combine that with the leaps we’re seeing in hardware and sustainability, and suddenly, those futuristic applications we’ve been dreaming about, they become reality. So lots of news coming out here and more to uncover, so stay tuned and you’ll stay connected. Till next time, I’m Diana Blass.

The post Six Five Connected – New Dell Product Lines and Announcements from CES 2025 appeared first on Moor Insights & Strategy.

]]>
Six Five On The Road – Alienware Announces New Flagship Products at CES 2025 https://moorinsightsstrategy.com/the-six-five/six-five-on-the-road-alienware-announces-new-flagship-products-at-ces-2025/ Tue, 07 Jan 2025 19:03:51 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44860 Matt McGowan, General Manager at Alienware, joins Anshel Sag to discuss Alienware's exciting new launches at CES 2025, including the return of the iconic Area-51.

The post Six Five On The Road – Alienware Announces New Flagship Products at CES 2025 appeared first on Moor Insights & Strategy.

]]>
Area-51 Gaming PC’s are Back! 👽

Dell Technologies has showcased a ton of innovations at CES 2025. On this episode of Six Five On The Road, host Anshel Sag, Principal Analyst, Moor Insights and Strategy is joined by Alienware General Manager, Matt McGowan, for a conversation on the highly anticipated announcements from Alienware, a subsidiary of Dell Technologies, continuing its tradition of unveiling cutting-edge gaming technology.

Their discussion covers 👇

  • The reintroduction of the iconic Area-51 product line, its legacy, and its significance in the gaming community
  • How the feedback from the gaming community influenced the new generation of Area-51 desktops and laptops, emphasizing customization and upgradeability
  • The inspiration behind the new design language for the Area-51 laptops, and Alienware’s vision for the future of gaming products
  • An overview of Alienware Elite Care, a new customer support initiative, and the exclusive benefits it offers to ensure an unparalleled gaming experience
  • The unveiling of the Alienware 27 4KQD-OLED display alongside the Area-51 lineup and how this flagship monitor enhances the gaming experience

Learn more at Dell Technologies.

Watch the video below at Six Five On The Road, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Anshel Sag: Hey, everybody, I’m Anshel Sag. We’re here at Dell’s CES event and I’m joined by Alienware’s GM, Matt McGowan. Thanks a lot for joining me today.

Matt McGowan: Hey, thanks for having me.

Anshel Sag: Every year, CES proves to be a playground for Alienware and all of the latest things that you have in the gaming space, and this year is no different, but can you give us maybe a rundown of everything that the team is announcing this week?

Matt McGowan: Yeah, you’re absolutely right, this year is no different. We are relaunching Area-51, which is a storied sub-brand for us, but we have a brand new desktop, Area-51 full form tower. We have two new notebooks, Area-51 16 inch and 18 inch notebooks. We have a brand new Alienware 27-inch display, which we’ll talk about. And then we have Alienware Elite Care, which is a new service offering that we’re going to wrap around all of our product offerings.

Anshel Sag: Area-51 is a storied product, and for those of the people who are not familiar with it, could you possibly give us a little bit of background on the Area-51 brand and how it came to fruition?

Matt McGowan: Yeah. Area-51 has been part of the Alienware family of products for decades now. We’ve had multiple Area-51 towers over the years. That name really embodies the best of the best that we can provide from a gaming experience and a piece of hardware. There’s really a couple of things that every year that we’ve launched that product we’ve tried to do. One is around performance. We wanted to make sure it had a huge performance. And then the other one was around scalability. So all of the Area-51 products that we’ve had are larger products before that upgrade ability, more storage devices, bigger graphics, and as you know, the graphics aren’t going down in power, they’re actually going up in power, and those cards are getting bigger and heavier and more power-hungry, and you need the space to accommodate that. So we’ve taken that Area-51 name and we’ve extended it across desktops and notebooks. So whether you’re talking about our new 80-liter tower or whether you’re talking about the new 16 or 18 Area-51 notebooks, they are the best of the best from a gaming perspective and deliver that best experience.

Anshel Sag: And that’s part of the simplification around your gaming product line, right?

Matt McGowan: Yeah, correct. We want to make it really simple for customers to self-select in those products. If you’re an enthusiast gamer, if you’re somebody that loves technology and wants to tinker with the product and geeks out on things like we do and is really into gaming, Area-51 is the product for you and people that are willing to invest in that technology and those features and design, they’re going to be fantastic products for those users.

Anshel Sag: Great. And is there another place where gamers can go if they’re not going for Area-51?

Matt McGowan: Yeah, so over the course of next year, we’re going to be offering other products. We’re going to have other products. We do plan to adapt the Aurora name to introduce new products later in the year. TBD on what those look like and what’s in them, but suffice to say it will be a different tier of product that we have where we’ll be leveraging the Aurora name that we use on desktops over to the notebooks.

Anshel Sag: Great. Area-51 has been around since 1998, so could you maybe tell me why is now the time to bring it back?

Matt McGowan: For several years we’ve been talking about reintroducing a full-sized tower. So Area-51 is an 80-liter tower, it’s more than enough space to fit every component and then some into the future. There’s a lot of intersection points that happen for this launch. So we have next generation NVIDIA graphics. As you know, over the years, the graphics have become bigger and bigger and more powerful and comes with a lot of capability, which equates to better gaming experience. So this was the right point to reintroduce this product and come into the category in really a big way.

Anshel Sag: The new design language for the Area-51 laptops is unique and is inspired by aurora borealis and extraterrestrials. I’m curious if you could tell me how this reflects the future of Alienware’s gaming products.

Matt McGowan: Yeah, that’s a great question. Over the years, you can see that our products don’t look the same. We try to reinvent ourselves, and this one is no different. You mentioned some of the inspiration behind the notebooks. We call the design language AW30. It’s really an ode to our 30th anniversary upcoming. And we looked at a bunch of different sci-fi thematics, and what was really interesting to us was around aurora borealis and this idea of morphing colors and shapes and how we could apply that to a notebook. So you’ll notice on the Area-51 notebooks, we used anodized aluminum.

We have this metallic finish that we call liquid teal. We have gorilla glass on the bottom where you can see the inside of the product. We use translucent materials and we pipe light through it, and then we’ve created a new animated lighting effect that makes it look like aurora borealis. So we’ve made a concerted effort to apply that to the product. The other thing that happens when we embrace the fluid nature of those shapes is the product becomes very ergonomic. So using the product is really nice. We have a pillowed palm rest so as marathon gaming sessions, you don’t feel that in your palm.

Anshel Sag: I’ve had that line on my wrist before.

Matt McGowan: Yeah. So we’ve picked apart the notebook and paid attention to every single detail. We talked about the gorilla glass window on the bottom. We pay as much attention to the inside of the product as we do the outside of the product. Well, now customers can see that. And we actually illuminate the fan similar to how we do on a desktop computer, and they can be seen from the bottom, the top, and at a certain angle from the side. So it’s a really immersive design, so it’s really, really cool. We’re excited about it.

Anshel Sag: You know what’s funny? I was just thinking about this. It kind of reminds me if a stealth bomber had RGB lights.

Matt McGowan: Yes.

Anshel Sag: It’s really sleek. It has really nice aerodynamics, but it’s also really pretty. And I really love that window, and I think it’s great for thermals because it kind of creates that gap so you have a little bit more air and yeah, I think it looks great.

Matt McGowan: Yeah. You mentioned the innovative thermal design. We actually have a quad fan thermal design that’s embedded inside of that cryo chamber, and what that allows us to do is we can actually move more air through there, which allows for us to have quieter operation as well. So you’ll notice that these notebooks are actually quieter than other ones that we’ve done in the past.

Anshel Sag: Even though they’re a lot faster?

Matt McGowan: Even though they’re a lot more powerful, yes.

Anshel Sag: That’s great. So I’m super excited about the launch of Alienware Elite Care, specifically for gamers. So I’d love to hear about how Alienware Elite Care came to fruition and what you’re trying to achieve with it.

Matt McGowan: We wanted to develop a service offering that really catered to the gamer needs, which gamer needs are a bit unique.

Anshel Sag: Totally.

Matt McGowan: The devices are different, the use cases are different, so we built a tier of service offering that delivers the best of the best when it comes to service. So when you buy into Alienware Elite Care, you’re going to get people that understand gaming. They understand our products, they understand gaming, they’re able to optimize the product for certain games, and they’re able to set up parental controls that are specific to game settings. You get accidental damage included in that. Something new that we’re doing is you’re getting a battery replacement when you buy notebooks, that’s offered between year two and year four. Then we offer support assist on the product itself, which automatically tunes the product for best performance. So you’re getting quite a bit more that’s specifically targeted to gaming use cases to where we didn’t have that in the past. So we’re really excited about Alienware Elite Care.

Anshel Sag: I’m super excited about it as well because I think one of your strengths is your support structure, and I feel gamers generally are not getting the best support that they should be, but they’re also the most demanding customers. So by having a differentiated and extremely unique support offering, I think you guys are doing something special.

Matt McGowan: Yeah, we’re really excited about it. I couldn’t agree more. I think the needs of the gamers who are gaming on PCs is very different than your typical PC buyer.

Anshel Sag: Definitely.

Matt McGowan: So we had to develop that offering specifically for them that caters to them.

Anshel Sag: As a final question, alongside the Area-51 lineup, you guys also have the 27-inch 4K QD-OLED monitor. I’d love to hear what you think about that and what the ethos is behind that product.

Matt McGowan: Yeah, so QD-OLED, as a technology, we were actually the first to introduce that in gaming displays. And to this day, we’re heralded as the best gaming display out there with the QD-OLED technology. And we’ve had several versions of it. This year, we’re launching the 27-inch 4K QD-OLED at 240 hertz. So when it comes to high-speed gaming, high fidelity, high color and contrast ratio, this is going to become the new high standard. So we’re excited. Now, it does adopt the new AW30 design language that we talked about.

Anshel Sag: I saw that.

Matt McGowan: Yeah.

Anshel Sag: It’s got that nice little base.

Matt McGowan: Yeah, it’s very fluid and organic and really otherworldly. We’re excited about how that pairs with the Area-51 notebooks, for example.

Anshel Sag: Right. And the desktop too.

Matt McGowan: Yes, absolutely.

Anshel Sag: I think a lot of people don’t really know that you guys are the worldwide leader in displays, so there’s a reason why you have the QD-OLED before everybody else, but I think it’s unfortunately one of those things where you have to see it to really appreciate QD-OLED. I’ve been gaming on QD-OLED since the 34-inch came out and I’ve been gaming on the 32-inch after that. So I’m excited to play with the 27-inch and see how that experience improves specifically with FPS games where that visual acuity is super important.

Matt McGowan: Yeah, we hear that all the time from customers that once they go QD-OLED, they can’t go back. It’s very difficult because of that performance and for the various needs based off the game that you play.

Anshel Sag: Thank you very much.

Matt McGowan: Yeah, thanks.

The post Six Five On The Road – Alienware Announces New Flagship Products at CES 2025 appeared first on Moor Insights & Strategy.

]]>
Six Five On The Road: Dell’s Innovative Approach with AI PCs https://moorinsightsstrategy.com/the-six-five/six-five-on-the-road-dells-innovative-approach-with-ai-pcs/ Mon, 06 Jan 2025 20:00:37 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44765 Dive into Dell Technologies' strategic approach to AI PCs with Jeff Clarke, exploring market readiness, unique opportunities, and future AI innovations

The post Six Five On The Road: Dell’s Innovative Approach with AI PCs appeared first on Moor Insights & Strategy.

]]>
Despite recent concerns about the overall AI PC market, Dell Technologies is proving that the future of this tech is more than just hype. Host Patrick Moorhead is with Jeff Clarke, Vice Chairman and Chief Operating Officer of Dell Technologies on this episode of Six Five On The Road. They discuss how Dell is bolstering its AI PC endeavors through strategic partnerships, and what future AI advancements might look like.

Tune in for details on:

  • AI PCs progressing from a “what if” to a reality. They discuss whether the AI PC era has commenced and if the market is primed for these advances
  • A look at Dell’s partnerships across the technological spectrum, emphasizing collaboration in silicon innovation and AI PC market penetration
  • Dell’s strategic advantage in the burgeoning AI PC market. How their strong R&D, vast supply chain, and deep customer relationships may give them an edge
  • Future predictions for AI technology and its implications for consumers and businesses alike

Get the latest announcements from CES 2025 and learn more at Dell Technologies.

Watch the video below at Six Five On The Road, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is back, and we are talking about one of my favorite topics with one of my favorite people. We’re talking about AI PCs with Jeff Clarke. Jeff, it is great to see you, always enjoyable. And it’s just one of our conversations just with a camera here.

Jeff Clarke: I’m an engineer, I don’t like cameras. Good to see you, thanks for having me. And I do like AI PCs like you.

Patrick Moorhead: No, totally. And what I’d love to do is start off with, I don’t know, environmental analysis. We’ll call this the AI backdrop. What are you hearing for your customers regarding AI PCs? Maybe you can talk a little bit about the consumer side, but also on the enterprise side.

Jeff Clarke: Well, I think probably the more interesting thing from my seat is, it is not well-known what an AI PC is capable of doing. And the discussions I have with customers are in the what if. Why if your PC could do natural language search? What if your AI PC could take video and look for some particular snippet? What if an AI PC or your PC could write your text, write your email, respond for you? And you begin to engage in a conversation whether the consumer, small business, large business, of the what if. And then you eventually go, “Well, that’s tomorrow. The AI PC is here.” And they get better next year. They get better the year after, they get better the year after. And we’re now on a path where we have a dedicated processing unit, the NPU plant engineer with you, this pursuit of accelerated performance. And we now have an engine that is dedicated to do these tasks offloading the host, very much like we’ve done in servers for the past decade.

Patrick Moorhead: Right. And AI is not new on the PC. We’ve been doing different elements of it. A lot of the times we’ve done it on the GPU, now we’re moving it to the NPU. There’s various sizes of NPUs as well. And I’m curious, can you say do a flag plant and say, has the AI PC finally arrived?

Jeff Clarke: Well, to your point, I know Dell has been doing AI in our PCs for the better part of two decades. I go look at some of our battery optimization. You go look at how do you really maximize performance with a particular energy density in the algorithms that we built in, which is really forms of artificial intelligence and machine learning. You extrapolate that going forward, and you really like the notion of what was done in 24 really beginning to put a floor, if you will, of NPU performance. When I think of ladder 24 now 25, oh my gosh, we now have the capability to do the very many things I just talked about. And you think about what’s capable in the new modern microprocessors. And it doesn’t matter if it’s Intel, AMD, Qualcomm, and others that might come, they all have met a performance threshold that we now can do wonderful things. Or tracking, I know you are with your research. My gosh, 150 ISVs put in 300 specific types of applications and techniques and features that are dependent on an MPU? This thing’s coming.

Patrick Moorhead: Yeah, it’s pretty cool too. And sometimes this gets overlooked that what used to be done on a 250 watt graphics card a couple of years back can now be done on a very optimized battery powered experience with low latency. And I think that’s a game changer, because it really brings the opportunity to everybody. But Jeff, our consumers and business ready for it? Are they ready for the AI PC?

Jeff Clarke: Well, I would assert yes. And I’ll give you my following basic case. If you’re going to buy a new PC, why wouldn’t you buy an AI PC? The install base is one and a half billion units. Half of those are four years or greater. If you look at what we shipped in the first Covid year, 2020, 300-ish million PCs. They’re four years old, they’re ready to be refreshed. 2025 presents the case of 340 million PCs that are four years old in 2025. If you’re going to upgrade, why wouldn’t you buy one? Because it’s an investment protection. And if you’re going to buy a new one and you’re going to have it for four years or greater, you’re going to want AI or you’re going to be buying a new PC shortly thereafter. So, I think they’re ready because of that very reason.

Patrick Moorhead: Yeah, that’s a great point, Jeff. And then you factor in the Windows 10 to Windows 11 transition from enterprises, and enterprises like to have a very consistent rollout of platforms. It almost becomes a no-brainer. I need to watch myself. Every time I say a no-brainer, then something comes up there. The other thing that I think gets forgotten is, not only are these AI PCs great for AI, they’re great for everything else. Battery life is better, the non-AI performance is so much better. And when you package that all up, it’s a very compelling value proposition, I think.

Jeff Clarke: And I’m in my 38th year here at Dell, and I think about what we used to build and what we build today, and how much more efficient they are, what they’re capable of doing. We’re taking the world’s greatest productivity device. We’re making it efficient, more efficient than what it used to do, and we’re giving a new capability called an NPU that’s going to run new workloads. I think about our developers here at Dell, I can now take with smaller models in the quantization that’s occurred, and I can now do AI code assistance out on my PC. I’m not backhauling into a data center. I think about what we can do with smaller data sets across smaller models and getting, if you will, the PC close to the data where inference will be done, it is the device. So in my way of describing it, the PC is set up for this moment to really have people experience AI through the PC. More people will experience AI on the PC than they will through a data center.

Patrick Moorhead: And Jeff, I haven’t been doing it through 38 years, but I’m going on year number 35. And I joke with you all the time that I used to have a real job managing PC product lines, and managing PC processor product lines. And I can pretty much say this, as an analyst, we need to be very careful with the wording we use, but I do think this is the single biggest opportunity I have seen since the dawn of the internet. And yeah, there’s going to be bumps. No kidding. We saw those when the first internet PCs came. And there were discussions about costs. Oh my gosh, we had to have an RJ-11 plug on this thing, and that costs at least a 30 to $40 bomb cost. And then my gosh, wireless. How are we going to afford all of this? And then once people started using it, it literally became a no-brainer, but it set up the situation, which I think we will have, is there will be AI-enabled PCs and then they’re going to be everything else.

And we’ve had growing pains on when software was supposed to come out. One of the biggest things that on the commercial side that really gets me going is, some of the ISVs and even the GSIs who are putting small models, and you have financial institutions who want to run models on this to do credit scoring, not in their own cloud or in their data center, but actually running their own models. So, just seeing all this activity going on in this investment gives me a lot more confidence on where this is going. But I do have to hit you on this, surprise, Dell is not the only company doing AI PCs. There’s two, maybe three, plus more out there. What makes Dell unique with its AI PCs?

Jeff Clarke: Well, I think when you step back and look at our track record of innovation, our track record of market leadership, our track record of talking to more customers directly than anybody else, and then translating that into specific features that our customers value, what’s not to like, that’s what we do, that’s what we think is different. What’s different for us is we have a second to none go-to market reach. You look at our marketing, you look at our Salesforce, our direct coverage of accounts, that knowledge goes back into our R&D organization. We have the largest supply chain of everyone that you would want to mention in the marketplace. So we have scale, we service our products directly, the telemetry data that our systems are calling home or feeding back to our corporation, and we’re synthesizing that with actually AI tools to build better services into better experiences for our customers. And I’ll put our R&D up against anybody’s. We’re good. And that’s what makes our company different. Those are the four distinct differentiations. It started in the PC business, and it continues to be here.

Patrick Moorhead: I like those, and we’ve discussed those, and we’ve pulled these apart. And what’s really interesting about this, Jeff, is since the creation of Dell, you’ve stuck to these, many of these. And when a lot of other companies are moving around and maybe changing it, the other part is how you’ve augured in on the enterprise market where you’ve got the highest market share. And it’s not that you don’t love the consumer market, it’s just that you’re very focused on the enterprise.

Jeff Clarke: Well, and I think part of that, if you take what you described and extend that, one of our other differentiating capabilities, to your point is, these have been four consistent attributes of our company. We’ve invested in double down. Then you take AI from the PC all the way to the data center into the large format training that’s happening today in the large language models, none of my PC competitors are doing that. We’re taking the knowledge and understanding we have to build these incredible computational engines, how to cool them, how to power them. We distilled that down to how we build PCs in a very efficient way. We think about by the end of the decade roughly 90% of I believe AI will be inference. Inference out of the edge, inference in the data center closest to the data. Well, that’s what we do. And our ability to describe not only a consumer solution, but a small business, medium business, large, at the edge, all the way to the data center, I think is another distinguishing characteristic and attribute that we have in the AI era.

Patrick Moorhead: Yeah. So Jeff, you’ve always, it used to be in the spirit of open industry standards. You look back at your history, you’ve always leveraged the tech ecosystem. And it takes a village to pull all this together. And I’m curious, how are you doing this? How are you leveraging the tech ecosystem on AI PCs? I mean, you have got the silicon providers, you have the GSIs, you’ve got the ISVs. How are you looking at that right now?

Jeff Clarke: Well, it’s a little bit of secret sauce, but let’s see what I’m willing to expose here.

Patrick Moorhead: You can spill it on The Six Five, Jeff.

Jeff Clarke: Oh, I’m sure I can. Look, we’ve built an at-scale R&D organization. You have techniques. I mean, you and I know this. We’ve designed a long time ago and some of the attributes still exist today. Where do you draw the lines of modularity? How do you build complementary components? How do you do leverage and reuse in your design? We don’t do the same schematic over and over and over just because you can. If it can be leveraged, and reused, and it meets the need, we spend our engineering efforts elsewhere. So the ability to leverage, reuse, the ability to do modular designs, the ability to think forward, and the fact that we make conscious choices, we don’t build one of everything. And so we do fewer things. We like to think we do fewer things better by directing our engineering resources on the choices we’ve made for our customers, based again on what we think is a great set of insights, and that’s how we do it here.

Patrick Moorhead: Yeah. Can you talk about the partnerships, Jeff, a little bit?

Jeff Clarke: Well, the partnerships, clearly you start with Microsoft and what we do on the operating system side, and what’s exciting about Copilot, Copilot+, what is capable there, how we work closely with them and building the right foundational platforms. Then you extend some of the work that we’ve been doing with, whether it’s Hugging Face, whether it’s some of the GSIs where we have a complete dedicated go-to-market capability there to help customers sell with, sell to… I don’t know whether to describe it. There’s so many partnerships that we’ve built across our ecosystem to reach customers, to fulfill customers’ needs that are complementary to us. We clearly can’t do it all, we know that. We need a, if you will, the proverbial takes a village, and we want a village, and we’ve tried to build a village around our entire business, not just the PC business, but end-to-end from enterprise, infrastructure, storage, servers, networking, all the way down to the edge, which even the far edge out to the PC and with our telecom business. So stitching that fabric of partners together, building this village of, if you will, an ecosystem, that allows us to meet the needs in a very unique way, I think is something that differentiates us over others.

Patrick Moorhead: Yeah. Jeff, I talked to most all of your partners, and one thing they came back and they do tell me is they appreciate that with one company they can, it’s not just PCs, it’s not just data center, it’s not just servers or something like that. It’s a larger opportunity. And the ability, they see a benefit at the ability to leverage the entire Dell estate as opposed to one singular entity.

Jeff Clarke: Right, because one of the things that’s happening, Pat, certainly in our working together here, is, well, we get excited about the gear, and the device, and what it’s capable of doing, and what’s inside it as former designers, it’s about outcomes. Customers talk to us about outcomes. And we don’t know how to deliver outcomes for customers unless we talk about the end-to-end infrastructure that’s going to be deployed at an account. How are we going to secure things? How are we going to provide data? How do you get data to these AI PCs? How do you secure that? What are we going to do? So we’re taking the practices that we’ve built now in advance of our infrastructure business, and there’s light versions, if that makes sense to you, or to the audience, light versions of that for our PC teams.

Patrick Moorhead: Makes a lot of sense, Jeff. Hey, I want to wrap this conversation. I want to look to the future here, Jeff. Jeff, what is it that we can expect in the future next in AI? I know that’s a wide-open question, but I want to ask it, I don’t want to pin you into a certain corner.

Jeff Clarke: Well, I’m personally excited, and what I think happens is that Agentic technology takes off, and we’re all going to be a manager of many agents. We are going to have agents in our personal life, we’re going to have agents in our professional life. The agents are going to talk to one another. They’re going to make us more effective employees, they’re going to make us more effective consumers. I can imagine, if I wanted to play what if with you, what if I had an agent on my notebook, which we will, and we have an agent working with the operating system, and we have an agent working with a third-party software, they actually talk to one another, without any human intervention, and we actually know we’re at rev, we are secure. It does all of that stuff because it’s supposed to and they learn from one another.

Or you are working in, your agent sees, “God, Pat, we’re using way too much memory. I’m looking at your usage pattern and this PC is not capable enough. You need a new PC that has this memory footprint, this MPU capability.” “Oh, by the way, I’ve already done the search. I’ve already looked at it. Here’s the best price. It’s available tomorrow if you order by 3:00 P.M.,” blah, blah, blah, blah, blah. You’re going to have an agent do that. So, I think that’s the world we’re moving to. And whether that’s an agent on the smartphone that will work with the agent on the PC, that will work with the agents within your business or your agents in your personal life, what’s not to like? So we’re all going to become incredibly more productive, more efficient, and use this technology even at a greater extent than we do today. Does that make sense?

Patrick Moorhead: It totally makes sense. And everything you outline there is not some pie in the sky thing. The tech is there. It takes software a while to get there, but the plumbing has been installed. It’s there. And making it happen is a lot of times the harder part in the software side to replicate it, but it’s absolutely a possibility. But even more important question, Jeff, is, when is the Jeff bot and the Pat bot going to get together and have the conversations that we have?

Jeff Clarke: Well, the real-life versions do it frequently enough and we need to do it again. It’ll be interesting when the bots actually take over, but I think it’s an evolving from bots to agents that actually learn, and it can anticipate, and then ultimately can go, you and Pat tell us, they actually go to our calendar, go, “You two characters were saying these things you need to get together on Thursday at 2:00. I’ve looked at both of your calendars and on Thursday at 2:00 you’re going to sit down, and here’s the outline we built for you.” That’s what’s coming.

Patrick Moorhead: Again, another realistic thing that is there. Not just some made up thing 30 years from now. Because I was talking about agents 35 years ago when I was at AT&T, right? And General Magic, and all this stuff that was going to happen over the network. But none of it was there. And it’s all there right now. So exciting stuff, Jeff. Really appreciate the conversation. Hopefully this won’t be the last, but I’m excited about the future. Congratulations on all of the success, Jeff. I know that you rarely look in the rearview mirror, but I mean, your stocks are up 61% this year alone, and I always make fun of Wall Street and what they think, but I think they got it right this time. And it’s good to see the success that you’re having as a company, and the employees at Dell have got to be excited about this, that they absolutely have to.

Jeff Clarke: We’re excited about our opportunities. Thankful for our customers. And as I like to say, we’re going to get up tomorrow and get after it. That’s what we do.

Patrick Moorhead: I love it, Jeff. We are cut from a very similar cloth and I appreciate that. So, thanks for coming on the show, Jeff.

Jeff Clarke: Thanks for having me. I appreciate it. Thanks, Pat.

Patrick Moorhead: So this is Jeff Clarke signing out with Pat Moorhead on The Six Five. We hope you enjoyed this segment. Hit that subscribe button, be part of the community. I want to thank you for coming, and take care.

The post Six Five On The Road: Dell’s Innovative Approach with AI PCs appeared first on Moor Insights & Strategy.

]]>
Six Five Podcast: Exploring Micron G9 NAND – The World’s Fastest NAND https://moorinsightsstrategy.com/the-six-five/six-five-podcast-exploring-micron-g9-nand-the-worlds-fastest-nand/ Mon, 06 Jan 2025 18:21:18 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44771 Russ Meyer, Senior Vice President at Micron, shares his insights on pioneering the industry with the world's fastest NAND, the G9, and its real-world implications.

The post Six Five Podcast: Exploring Micron G9 NAND – The World’s Fastest NAND appeared first on Moor Insights & Strategy.

]]>
On this episode of the Six Five Podcast, host Daniel Newman is joined by Micron Technology‘s Russ Meyer, Senior Vice President of NAND Technology Development, for a conversation on the groundbreaking advancements in NAND technology. Micron, a leading name in memory and storage solutions, has recently unveiled a series of innovative products, including the world’s fastest data center SSD and the revolutionary G9 NAND technology, marking a significant leap in the storage industry.

Their discussion covers:

  • Micron’s back-to-back announcements of cutting-edge products and their impact on the technology ecosystem.
  • Insights into the Micron G9 NAND’s unique features and its positioning as the industry’s first in many aspects.
  • The rationale behind Micron’s shift in naming its NAND technology to reflect industry generation, moving away from the traditional layer count.
  • Micron’s strategy on leading NAND transitions, ensuring that announced NAND technology is not just a promise but a present reality in the market.
  • The significance of the 2650 SSD powered by G9 TLC NAND, highlighting its performance benefits and value proposition amidst industry competition.

Learn more about Micron G9 NAND at G9 NAND | Micron Technology Inc.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The Six Five Podcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey everyone. Welcome back to another episode of The Six Five podcast. I’m Daniel Newman, CEO of The Futurum Group and host here on The Six Five sitting in today for this conversation. We’re going to be talking to our friends at Micron about the G9 NAND, the world’s fastest. Now we’re going to talk about memory. By the way, talk about a renaissance. I mean in the era of AI, has there been a bigger moment for memory? Let’s talk about that. But let’s not talk about that with me alone. Let’s bring Russ Meyer. Russ is the Senior Vice President of NAND Technology Development at Micron. Russ, welcome to the show. How are you doing?

Russ Meyer: Hi, Dan. Yeah, great to be here. Thanks.

Daniel Newman: It is good to be here. So you hear my little introduction, all excited like. Sometimes it’s funny about the booms and busts of semiconductors. And of course those sometimes historically commoditized technologies like memory where people like to put it in a box, we’re going to take it out of the box today. We’re going to talk about some of that differentiation. But what a moment you’re having, right? I mean, are you feeling it right now? I mean, the demand is through the roof. A few really great earnings after a tough period of time. Everyone at Micron has got to be feeling pretty good, right?

Russ Meyer: Yeah, that’s a great call out, Dan. We’re coming off a very challenging year in 2023, and as you said, we’re looking forward to some, let’s say, strong secular growth driven by AI and the applications that that’s going to drive for the memory demand. So we’re excited about that and we’re excited about our string of recent product announcements.

Daniel Newman: Yeah, that’s been a big part of the story with Micron. It’s not just memory, it’s about innovation here. And as we’re seeing with high bandwidth memory and what’s going on with it, powering AI, not all memory and not any memory serves the need. And as we hear about the rate limiting innovation and the rate limiting manufacturing requirements, it’s like we can build all the GPUs in the world, but if you can’t get enough memory, we can’t build the systems. And so this is coming to a head right now. But I want to talk about G9 NAND and I want to talk a little about what you’re doing there. You’ve been on a roll, lots of announcements. You had the SSD, the 9550, and now the G9 NAND and 2650. Kind of run me through these. Just run me through the whole thing. And I mean, you’ve even got the, what’s, the industry’s first PCIe Gen6 SSD for ecosystem enablement that you guys showed off at FMS. Wow, that was a mouthful, but give me the rundown on all these announcements.

Russ Meyer: Yeah, yeah, Dan. Thanks for noticing. We are pretty excited. We are pretty excited about the string of announcements. We’ve been busy here at Micron building some great products and targeting these storage solutions. They’re going to accelerate AI. The announcements that you highlighted, they’re pretty strong announcements as far as industry leading products. But the third announcement, and the one that’s very close to my heart is that G9 NAND technology announcement. And that’s not only an announcement of the technology itself. It’s the announcement that we are shipping that technology in a drive, the 2650 client SSD drive.

Daniel Newman: So let’s double click on that a bit. You’ve sort of started to allude to it, but give us a bit of a deeper dive on the G9 NAND technology. What makes it unique? This is sort of a first, right, NAND in an SSD?

Russ Meyer: This is the first ninth generation NAND in SSD, and it’s really, this is the culmination of years of a multi-year development cycle. So the team’s been working on this for many years now from concept through development to now to productization. And there’s many milestones along the way as we go through that cycle. But this milestone is always a big one for us ’cause this is the culmination, this is the realization of shipping this new technology into an actual product so our customers and partners can start leveraging the advanced performance that it delivers.

One of the focus points on this Gen9 technology is focusing on the performance and the increased bandwidth. We delivered that with multiple, let’s say, parameters. One is the on fee interface speed. So this is going to be an industry leading 3.6 gigabytes per second interface speed, which is a 50% jump from any competitive NAND available and from our previous Gen8 or G8 solution as well. On top of that, the latencies enable a 99% better read and 88% better write bandwidth than what we see on competitive die that are available today. So these are strong, these are pretty strong numbers. So this is delivering really driving now into the client SSDs, but we’re going to be driving products through data center and automotive with these solutions as well.

The other thing I’ll highlight, Dan, is the density. So we’ve driven the density on this NAND. It’s 73% more dense than what we see in competitive NAND shipping and SSDs today. If you compare it to our previous G8, we drove it to a 44% higher density. So gets us to a staggering 21 gigabits per millimeter squared on that component die. So the other enabler that we brought in with this technology is we’re doing a six-plane architecture within the word line. So what that enables is that enables a much stronger bandwidth and improve read and write performance, which from the customer usage experience is a stronger or higher quality of service. So you have less collisions between read and write commands.

So all this pulled together and the density was also key in enabling us to maintain the footprint in the 11.5 millimeter package, which we’ve been targeting, which is 28% lower than what we see with most of the competitors in the industry as far as how much area it takes on that PCB board. So it gives the customers a lot more, let’s say, flexibility and signal routing, thermal performance and the PCB layout in general.

Daniel Newman: No, I mean that’s really encouraging and I’m glad you called out the generation item. I was twisted tongue there. When you start going through the specs and builds of memory, it’s moving so quickly and there’s so much innovation going on. And speaking of moving quickly and innovation going on, you have been historically focused on layer counts, 232-layer NAND. Now you’re sort of pivoting here in this particular with the ninth generation and you’re not using layer count. I like my cakes with layers, but maybe I prefer my NAND with generations. Talk a little bit about why you did that, because obviously that’s still an important technological differentiation.

Russ Meyer: Yeah, there’s a couple points here, Dan. One is primary reason for the change, it drives consistency with how the rest of the industry is communicating their generational transitions with NAND. So it gets us more consistent with the industry. It also helps drive better clarity with the customers as to what products we have and what the generation they belong to. Historically, the layer count or the tier counts, historically, it’s been a kind of good headline metric. It’s been one of the key metrics that we highlight as to signaling where we’re at with the technology cadence.

But it’s really, the reality is that layer count is just one of several vectors that we have available to us to drive the scaling and the performance roadmap. So it’s just one of those vectors. And sometimes Dan, it’s not even the best one. So what we’re seeing as we look forward, I mean the customers are most concerned about performance, bit density, cost efficiency, power efficiency and those considerations, and like I mentioned, this layer count in and of itself isn’t the best way to bring that package together.

Daniel Newman: Yeah, it’s kind of like in cars. Everyone likes to talk about horsepower, but sometimes that’s the best example. Sometimes it’s torque. Sometimes it’s the amount of efficiency you can get out of an engine. Different people care about different things. I think what you’re basically alluding to right now is that there are a number of key elements, that’s one of them, but this newest generation gives them many of these different key elements that they’re trying to optimize and build their products around.

Russ Meyer: That’s exactly right. And as a development team, one of the things we focus on is what’s the best overall solution to deliver the customer’s demands? And on the G9 for example, we put a lot of innovations and focus into compressing the array in an X-Y direction. So you get more pillars, more memory cells per unit area. And what that does is it allows us to get to those scaling numbers without driving tier count ever higher. And the other thing I point out is, and you’ll be seeing this across the industry, is layer count for the first decade of vertical NAND was a primary driver in the scaling. And what you’re going to see is there’s a paradigm shift happening as we go into the second generation of vertical NAND scaling, where there’s going to be a lot more pressure and focus on the other enablers for scaling above and beyond tier count or layer count.

Daniel Newman: Yeah, absolutely. And look, there’s growing competition of course here in the US, leader, manufacturer, innovator, critical part of our infrastructure. And that’s been really important, even just following what’s gone on with the CHIPS Act and now with AI and the desire to build more here. But you’re not only leading here in the States, you’re leading around the globe from a design standpoint. I’ve watched Micron be a first in a number of different areas, especially throughout this NAND transition, shipping them in SSDs straight out of the gate. And I think this goes back four or five years now. It seems that when y’all make an announcement, that’s my Texas coming out, when y’all make an announcement around NAND, it’s real. It’s not a horizon. It’s not this is what’s going to happen in two years, or this is the next decade. It’s now. And of course, seeing it consumed so heavily in SSDs. Talk a little bit more about that kind of leadership that the company’s been able to achieve.

Russ Meyer: Yeah, connecting these announcements with product delivery to our customers is important. It really allows them to have confidence and a reliable understanding of what our timeframes are. So when we make these announcements, we like to link it to actual shipping of the next generation SSDs to our partners and customers. It’s not an announcement of an upcoming product in the next year, or it’s not an announcement of introducing next gen NAND into manufacturing. It’s an announcement of delivering these drives to our partners and customers.

So it’s a very real milestone and it’s one that we prioritize. Like you said, we’ve focused over the last few years to drive our next generations into the SSDs first. So G7, and back in 2020, we were the lead SSDs delivering G7. A couple years later we were the lead company delivering G8 into the SSD solutions. And we’re continuing that now with this recent announcement where we’re driving the ninth generation or G9 into the world’s fastest TLC data or client SSD.

Daniel Newman: And by the way, it would’ve been easy given the turbulent economy, the market, the difficulty of memory for Micron to probably, it wouldn’t have been fun, but you could have kicked and pushed things out. You could have said, “Hey, we’re going to slow R&D spend and we’re going to slow certain investments.” Actually, throughout those years you made a number of big commitments to building further capacity to driving innovation, and that wasn’t necessarily true across the industry. I mean, do you see this opportunity to keep the leadership? I mean, we’re seeing big dollars doled out in CHIPS act, some to you, some to your competition. What do you see there? Is staying in front something you feel comfortable and confident in?

Russ Meyer: Yeah. Yes. We’ve been driving the technical leadership for the past few generations, and we do expect to maintain that cadence. And you do make a good point is when you have very challenging industry periods like where we are just finishing up, it can be a real, let’s say, issue or challenge for the team to keep the cadence going. And the fact that Micron is able to keep this cadence moving despite the challenging headwinds is really a testimony to how important it is for us to maintain that technical leadership so that our partners realize that they have a strong technical leader with us.

Daniel Newman: Yeah, it’s always such a push and pull with the markets, the economies, the pressures of Wall Street. You want to deliver. You want to make shareholders happy. And in this business, of course, it’s not like SaaS where it’s just kind of steady up into the right. It tends to be, there’s peaks and valleys. But any of us that’s been through these boom and bust cycles know that with every bust comes a boom and we’re back in it. But like I said, you’re kind of on the front edge. I want to make sure I’m getting this right about the way you’re approaching shipping the latest, so the 2650, for instance. First, it’s a client SSD. It’s one that OEMs will put into their system. It uses the G9, TLC 9 you’ve built, and it’s a value drive. So it’s leading, but it’s also being designed to be value, but therefore it’s still really highly performant. I’m curious. How are you tackling all those things? Usually it’s like that pyramid, you get two, but you give up something and it sounds like no sacrifice.

Russ Meyer: Yes. All the component enablers and the technology enablers that I walked through earlier, they’re all critical in enabling this total solution that we’re seeing on the 25 or 2650. And the component enablers are critical. They allow the realization on the drive itself to help improve real world user experience. And the 2650 does that. It does it very well. So if you look at it compared to current existing competitors, you’re looking at a 70% better sequential read, 100% better sequential write, 150% random read.

So all these show up as not only top level benchmark improvements, but they actually show up as improving the real world experience of our users and our customers. And there are some, let’s say, industry benchmark scores that target quantifying that and the G9 NAND on the 2650 in the PCMark 10, that score came at 38% better than what’s available in the market today. So the highlight there, Dan, is that these values or these performance numbers that we’re driving through the next-gen NAND are really coming out in the SSD for improved user experience. And we do expect that as we drive this G9 into data centers and automotive and other solutions, you’re going to see that benefit just carry forward.

Daniel Newman: Well, Russ, I want to thank you so much for spending some time. There’s a lot here to unpack. I hope everybody heard it, learned it, and if you didn’t get it all, of course you can listen again, but you can also check out the notes below. We put more information about the G9 and all the announcements we’ve mentioned here on the podcast in the show notes. Russ, you’re going to have to keep us up to date. Sounds like a lot of innovation going on. It’s really nice. Well, in 2019, I previewed the year 2020. Russ, I didn’t know some of the big things that were going to go on in the world in 2020, but I did say, I wrote a MarketWatch op-ed that said silicone will eat the world. And that was a big prediction that I made. Next five years it seems to have indeed done that. And it’s not just software. Try running software on air as I always like to say.

But anyways, it’s been a great run this last 12 months and I’m very encouraged about where things are headed. Continue to be impressed by the stuff that you and your team over at Micron are doing. Let’s have you back soon. Let’s do this again. Congratulations on all the success and keep on keeping on my friend.

Russ Meyer: All right, thanks Dan. I appreciate the time.

Daniel Newman: And everyone out there, hit that Subscribe button. Join us for all our episodes. Check out The Six Five, but I’ve got to go. So we will see you later.

The post Six Five Podcast: Exploring Micron G9 NAND – The World’s Fastest NAND appeared first on Moor Insights & Strategy.

]]>
MI&S Weekly Analyst Insights — Week Ending January 3, 2025 https://moorinsightsstrategy.com/mis-weekly-analyst-insights-week-ending-january-3-2025/ Mon, 06 Jan 2025 18:00:42 +0000 https://moorinsightsstrategy.com/?p=44775 MI&S Weekly Analyst Insights — Week Ending January 3, 2025. A wrap up of what our team published during the last week.

The post MI&S Weekly Analyst Insights — Week Ending January 3, 2025 appeared first on Moor Insights & Strategy.

]]>
MI&S Logo_color

Happy New Year from Moor Insights & Strategy!

Happy New Year from Moor Insights & Strategy!

(Photo by Jireh Foo on Unsplash)

Welcome to the annual tech trends edition of our Analyst Insights newsletter. 2024 was a year of rapid advances and unexpected developments across the technology landscape, and 2025 promises to bring even more surprises. As we embark on a new year, we’ve gathered insights from all of our MI&S analysts across their specialty areas to provide you with an overview of the key trends that shaped the past year and what our experts anticipate for the year ahead.

It was a pleasure working with you to navigate the transformative trends of 2024, and we look forward to providing valuable insights into the forces shaping the technology landscape in 2025 and beyond.

As always, if there is anything you would like to discuss as you plan for the year ahead, please reach out. Many of us will be in Las Vegas for CES next week—we’d love to connect with you there!

Patrick Moorhead

———

Our MI&S team published 22 deliverables:

Since our last newsletter, MI&S analysts have been quoted in top-tier international publications including OpenTools and Yahoo Tech. Reporters wanted our thoughts on AWS, Google Pixel 9, Nvidia, and smartwatch and wearable trends in 2025.

MI&S Quick Insights

My biggest surprise of 2024 was learning how developers have embraced AI assistance. Developers are a smart and often skeptical group of people. But time and again, I heard stories about devs paying out-of-pocket for assistant technologies to speed up their work. I expected more cynicism about AI’s ability to help with coding — which tells me that the technology must be pretty good.

I have two predictions for 2025:

  • Agentic development will continue to be big in the first half of the year, especially since we now have some highly viable agentic development platforms including Bedrock (AWS), AI Foundry (Azure), and Agentspace (Google). I also expect to see non-cloud competitors to these platforms this year. (Red Hat, can you hear me??)
  • AI governance and controls will be a massive challenge. We are already seeing technologists grapple with the implications of AI usage and apps. But once line-of-business professionals get comfortable with pervasive AI use, we will see IT and legal departments flex their muscles in a meaningful way.

My biggest surprise of 2024 was learning how developers have embraced AI assistance. Developers are a smart and often skeptical group of people. But time and again, I heard stories about devs paying out-of-pocket for assistant technologies to speed up their work. I expected more cynicism about AI’s ability to help with coding — which tells me that the technology must be pretty good.

I have two predictions for 2025:

  • Agentic development will continue to be big in the first half of the year, especially since we now have some highly viable agentic development platforms including Bedrock (AWS), AI Foundry (Azure), and Agentspace (Google). I also expect to see non-cloud competitors to these platforms this year. (Red Hat, can you hear me??)
  • AI governance and controls will be a massive challenge. We are already seeing technologists grapple with the implications of AI usage and apps. But once line-of-business professionals get comfortable with pervasive AI use, we will see IT and legal departments flex their muscles in a meaningful way.

I believe that in 2025, 5G will become an accelerator for AI and gain more prominence as a key component for enabling AI. While many vendors talk about edge AI and running models on devices, the reality is that many models simply cannot run on the device and that hybrid AI will remain in the future for a long time. The only way for hybrid AI to work effectively is with an always-on connection; this is really easy for smartphones but more challenging for PCs, and we might actually see 5G PCs grow as a result of that this year. Additionally, XR is an excellent interface for AI, and—conversely—AI is an accelerant for XR capabilities and growth. I believe we will see the new Android XR spatial OS as a proof point for that interconnection in both MR and AR products and solutions.

AI is a snowball that gathers larger amounts of material and grows bigger and more capable every day. In fact, it is accelerating in functionality and scope. Every morning when I open my inbox, it is filled with more new information about larger models, new features, larger funding, new funding, new startups, better reasoning, and so on.

I was curious about how much information is distributed about AI on a daily basis. I thought Google’s Gemini search might give me a general idea, but after spending a few paragraphs explaining why it couldn’t offer a hard number of publications about AI, Gemini said, “However, I can offer some informed speculation: Considering the immense volume of information Google indexes, the widespread interest in AI, and the constant stream of new content, it’s safe to say the number of publications is extremely large. We’re likely talking about millions, perhaps even tens of millions, of pages.”

So without using any specific numbers, here’s how I see AI’s growth. Most everyone has seen videos of the earth in comparison to the size of other objects in the universe. It starts like this: A basketball-sized Earth is initially shown next to a stadium-sized Sun. Then, the giant star Betelgeuse appears on screen, dwarfing the sun and the earth. Betelgeuse is as big as a city block. But Really Big is yet to come. When the massive star called VY Canis Majoris appears, Betelgeuse shrinks in comparison. What was once a giant star is now an insignificant sandbox compared to an entire beach. Finally, a supermassive black hole covers the screen. Relative to it, the Earth and Sun are nearly invisible specks.

Today, AI is like Earth in the video, but it will grow to the size of the bigger stellar objects over time. At least that’s how I envision its long term growth—AI of today is a speck compared to what it will likely become in 25 or 50 or 100 years. Let’s hope humanity has the wisdom and ability to use it wisely.

In 2024, despite the rise of AI, customer service saw a surprising trend: a renewed emphasis on human interaction. 77% of customers said they preferred an immediate connection with a person, and 81% would rather wait for a live agent than interact with a bot. While businesses strategically blend AI with human agents to enhance efficiency, customers overwhelmingly prefer connecting with real people for a more nuanced and practical experience. In 2025, AI-powered voice data analysis will become crucial, enabling hyper-personalized experiences by detecting emotions and predicting needs in real time. While omnichannel remains necessary, companies must prioritize voice interactions and leverage AI to extract valuable insights from this channel.

Meanwhile, CRM trends in 2024 revealed a shift towards user-friendly, self-service solutions, empowering businesses of all sizes. This trend will continue into 2025, with “CRM à la carte” and low-code/no-code platforms allowing for easier customization and simplified data entry. To combat data silos, companies are increasingly unifying teams under a single CRM system, streamlining communication, reducing errors, and enhancing data-driven decision-making.

2024 saw a dynamic in the compute silicon space that somewhat parallels the storage market: a bifurcation of silicon along AI versus non-AI lines. Bespoke silicon for bespoke workloads and functions has existed since semiconductors have been in existence. However, AI is different. The needs of AI have led to a renewed focus on semiconductors and startups such as Cerebras, Tenstorrent, Untether AI, and so many others. Further, the market has accelerated growth in the custom silicon space as companies like Broadcom and Marvell have benefited greatly from the needs of hyperscalers, which have very specific computational and power requirements around training, inferencing, moving, and securing data. So while NVIDIA has commanded the AI silicon market overall, it has been somewhat surprising to see the amount of VC funding that has gone into the silicon startup space.

I believe that 2025 will see this trend continue. AI inference will take center stage alongside AI training, with increasing focus on the many startups serving this market. Additionally, smaller functions along the AI data journey that currently add significant latency will spawn a new wave of silicon innovation to drive better performance and security. As in recent years, I expect to see significant VC funding going to seed startups that help in the collection, storage, preparation, and movement of data in the AI pipeline.

As in the silicon market, 2024 saw somewhat of a bifurcation of the storage market as high-performance storage vendors such as VAST Data, Weka, and DDN pivoted to address the AI data pipeline and data management. While storage is a critical element of the AI equation, gathering, cataloguing, and readying enterprise data is where the real complexity of AI becomes real for business and IT leaders alike as projects move from conceptual to operational. The early-mover status achieved by VAST and other high-performance computing storage players is logical, as these companies have been focused on the more advanced functions of storage systems for the sake of accelerating workload performance. This is why we saw VAST’s valuation skyrocket through 2024, at the same time the profiles of Weka and DDN rose considerably.

I believe 2025 will see the storage market shift in both technology and messaging as these upstarts continue to increase awareness, share, and valuation. We have already seen NetApp begin its evolution, and both Dell and HPE have quietly made moves that better position their respective portfolios. While AI-washing in terms of messaging is no surprise (because every vendor tries to exploit market trends), the investment in technology being made by these companies is the real tell; it signals that they see AI as fundamentally shifting enterprise IT organizations, in terms of both operations and technology consumption.

One partial outlier in this equation is Pure Storage. While the company continues to broaden its support for enterprise AI through its portfolio, it has not lost sight of the enterprise storage needs that exist outside of this one significant workload. However, the company seems to be taking a more measured approach in terms of allowing the market to come to it and meeting customers where they are. I believe it is this approach that has led to the company to regularly recognize roughly 10% year-over-year growth in its quarterly financial reports.

The enterprise application market experienced significant growth from 2023 to 2024, with the market size increasing from $335 billion to as much as $363 billion, depending on the source. This represents an approximate growth rate of 8.4% year over year. However, despite this growth, customer dissatisfaction with enterprise software vendors rose in 2024. This dissatisfaction primarily arose from perceived unfair pricing strategies and a lack of clear value delivered by vendors. This indicates a market shift in which customers are demanding greater transparency and a better ROI from their software purchases.

In 2025, I expect deeper integration of AI within application ecosystems. At the same time, customer companies will prioritize trust and demonstrable value, seeking clearer ROI and more flexible pricing models from vendors. Unlike the 2024 emphasis on feature expansion, in 2025 we will see more focus on efficiency, interoperability, and user-centric design. This shift reflects a maturing market where vendors must adapt to needs of discerning customers that have multiple buying personas and significant budget constraints.

ERP systems got a shake-up in 2024 with AI and vendor modernization efforts. But what really stood out was the shift in mindset—enterprises realized adopting ERP isn’t just about new tech. It’s also about getting their data organized and making sure their teams are ready for change. Functionality matters more than features. If anything, 2024 made it clear that ERPs aren’t just about keeping the lights on—they’re a critical tool for businesses to grow, adapt, and stay competitive.

The payoff for those who got it right was obvious. Modern ERPs centralize data across departments, automate routine tasks, and deliver sharper insights with improved analytics. Cloud technology—pushed hard by vendors—has made these systems more flexible, mobile, and user-friendly, while also being easier for vendors to support. Even so, with an estimated two-thirds of enterprises still using on-premises setups, there’s an emphasis on moving to hybrid or fully cloud-based systems.

Looking ahead, I see data strategies and ERP systems shaping up to be even more important in 2025. Cloud adoption will keep growing because it’s flexible, cost-effective, and lets enterprises keep their locations connected while enabling their people to work from anywhere. Managing data will still be a big deal, with businesses focusing on keeping the data clean, secure, and well-governed, with better tools for protecting sensitive info.

In many industries, supply chain management will also stay front and center. With IoT increasing in functionality and especially getting better at providing data, ERPs will get better at real-time tracking and analytics, making it easier to handle inventory, logistics, and demand planning. (See Bill Curtis’s “IoT and Edge” entry in this newsletter for more on how this part of the data landscape is changing.) I expect pricing to move toward consumption-based models (versus user-based) to make it easier to bring more employees onto each system.

We’ll also continue to see ERP systems designed specifically for different industries. But here’s the thing: all this technology only works if businesses manage change well. Teams need support to adapt to new workflows, or it won’t stick. And finally, sustainability will be a bigger part of the picture, with ERPs helping businesses track environmental goals and ethical sourcing.

In 2024, operational data was the unsung hero of digital transformation. While LLMs, generative AI, and agentic AI captured headlines, operational technologies (OT) quietly emerged as critical enablers of enterprise digital transformation. Enterprises with significant physical assets discovered that fusing OT and IT data into a company-wide, multimodal, real-time data estate transforms AI-enhanced ERP, SCM, and BI applications from reactive to proactive. (Robert Kramer’s entry on ERP and SCM elsewhere in this newsletter gives more perspective on trends affecting those software vendors and their customers.) This profound change upgrades decision-making, enhances process efficiencies, and provides a holistic context for advanced industrial automation. The rapidly growing ROI for OT-IT integration projects creates insatiable demands for OT data.

However, despite compelling integration business cases, most OT data remains inaccessible due to the complexity, cost, and security risks of connecting OT systems with mainstream enterprise applications. This is the OT-IT gap—the chasm between the uniform, managed world of IT and the heterogeneous, chaotic world of industrial IoT (IIoT).

Motivated by AI-driven demand for operations data, enterprise software suppliers that are scrambling to find more efficient ways to bridge the OT-IT gap are adopting a straightforward “data first” approach. Instead of trying to manage devices from end to end, just grab the data. Replace complicated, costly, hard-coded, application-specific device-to-cloud connectivity and device management solutions with simple cloud interfaces for data, events, and status. This approach provides immediate access to IIoT machine data and enables OT software to evolve independently from cloud-native IT systems. Multimodal AI applications can use many types of unstructured IIoT data as is, further reducing device-side software complexity.

Recent announcements from AWS, Google, Honeywell, Microsoft, Qualcomm, and other major cloud frameworks and ERP suppliers confirm this trend. The goal is clear: feed the rapidly growing market for AI-enhanced business transformation with massive amounts of OT data via standard protocols and simple APIs. In other words, simplify getting OT data from IIoT devices.

For 2025, I’m watching three enterprise edge trends and one consumer trend.

  1. CSP and ERM frameworks simplify and accelerate OT data collection, processing, and correlation for AI-powered enterprise applications. AI is now IIoT’s “killer app.”
  2. IIoT devices transition from customized, end-to-end mashups to scalable platforms supporting multiple enterprise frameworks via simple interfaces.
  3. Middleware companies fill the gaps, providing industry-specific connectivity, data, edge analytics, and device management services.
  4. For “smart home” consumer applications, 2025 is the year Matter reaches its tipping point, with significant design wins and increased adoption. Other vertical industries are carefully watching Matter’s standardization efforts, learning from its successes—and mistakes.

In 2024, platforms such as Zoom, Microsoft Teams, and Webex evolved into essential all-in-one communication and collaboration business tools, integrating features including whiteboarding, collaborative documents, and project management functions. This trend towards unified business platforms will accelerate in 2025, combining previously separate tools. Expect deeper integrations, such as what we’ve seen this year with Adobe Express within Box and the ability to create Jira tickets in the Grammarly extension.

Inclusivity will also be a significant focus in 2025, with accessibility features such as real-time translation and closed captioning becoming standard. Companies such as Ava are leading the way with tools specifically designed for individuals who are deaf or hard of hearing, while companies such as Google continue to prioritize accessibility.

As hybrid work persists and collaboration becomes more complex, security concerns remain paramount. Organizations will demand collaboration tools with robust security features, including end-to-end encryption and compliance with evolving data protection regulations.

With Qualcomm’s introduction into the PC market as a chipset vendor, we’ve seen new levels of competition in the space—something that I don’t think we’ve seen in probably the last 25 years. While the introduction of Copilot+ PCs with Qualcomm’s Oryon-based Snapdragon X Elite processors wasn’t necessarily the smoothest (lots of Arm app compatibility needed to get worked out), it did present an alternative offering that pushed the incumbents to accelerate their roadmaps and improve their execution; as a result, the PC market is now far more competitive and faster paced. I expect that this trend will continue to accelerate in 2025 as PC OEMs continue to negotiate with the chip vendors for better products and pricing, which I believe will ultimately benefit the consumer and accelerate the uptake of the AI PC.

Over the past five years, quantum computing has made significant progress—with 2024 as a big contributor to that progress. IonQ has become a public company. IBM has created a roadmap with corresponding technologies to push superconducting qubits past the 1,000 mark. Atom Computing has firmed up its neutral atom technology and has begun pushing aside barriers with its own 1,000-qubit machine. Quantinuum’s H-2 quantum processor has an unbelievably high quantum volume. Microsoft and Quantinuum are both advancing topological computing. Finally, the ecosystem has made several breakthroughs in quantum error correction. In fact, Google’s latest Willow chip actually reduces the error rate as more qubits are added.

In 2025, the trajectory of quantum computing will continue to be shaped by technological breakthroughs, increased investment, and the integration of quantum into broader technological ecosystems. We’ll also see IonQ begin to network its quantum processors together for increased power and scaling. IBM will continue to move forward with advancements in post-quantum cryptography. (More on that in this article.)

I also expect to see some early movement in using quantum for financial applications, such as applying QAOA (the Quantum Approximate Optimization Algorithm) for portfolio optimization and possibly some real-time analysis. JPMorgan Chase has a large portfolio of financial operations where quantum computing could replace parts of classical systems.

Meanwhile, PsiQuantum and Photonic are well on their way to creating photonic quantum computers. We will also see the beginning of real supercomputers that integrate AI, Quantum, and HPC.

Overall, I expect 2025 to be a year of early proofs-of-concepts.

The CrowdStrike global IT outage in 2024 was a seminal moment not only for the cybersecurity industry but also for developer operations in general. The million-dollar question—more likely a billion-dollar question, given the collateral damage—is what could have prevented such a devastating occurrence. Modern continuous integration and continuous delivery/deployment pipelines coupled with test environments are designed to provide a failsafe mechanism that catches bad code and allows rollback before catastrophe strikes. Integrations will continue among software platforms to provide the highest levels of endpoint security. I believe the CrowdStrike incident will serve as a learning experience for other IT solution providers.

Cybersecurity in 2025 will be defined by its ability to adapt to a rapidly evolving threat landscape. Identity access solutions will embrace zero trust architectures, automation, and seamless integration. Endpoint security will rely on AI-powered analytics and lightweight architectures. Cisco’s ongoing momentum with its security cloud, plus recent innovations from Microsoft, Okta, Palo Alto Networks, and others, demonstrate an industry move toward unified, scalable, and AI-enhanced platforms. Organizations must stay ahead of bad actors by investing in modern cybersecurity infrastructure, employing a culture of security awareness, and adopting an integrated approach to cyber defense that facilitates improved security outcomes.

2024 saw both progress and contradictions in tech sustainability. While green datacenters and energy-efficient AI emerged, the industry’s footprint remained significant. “Greenhushing” highlighted the need for transparency as companies became more cautious about publicizing their environmental efforts. In 2025, sustainability will shift from an optional good deed to a core business imperative driven less by a sense of virtue and much more by the energy demands of advanced technologies, regulatory pressures, and investor scrutiny. Companies must integrate sustainability into all operations, as it will become a key differentiator, separating leaders from laggards.

Citations

Arm / PC / Anshel Sag / PCWorld
Why 2025 will be the year Arm dominates PCs

AWS / CPUs / Patrick Moorhead / Network World
Graviton progress: 50% of new AWS instances run on Amazon custom silicon

AWS / Layoffs / Patrick Moorhead / Opentools
AWS Reshuffles with Layoffs in Tech Sales Division Amid Reorganization

Dell / AI / Patrick Moorhead / Yahoo Finance
Dell embodied 2 of the corporate world’s biggest themes in 2024: AI and RTO. It’s paying off.

NVIDIA / AI Chips / Matt Kimball / Singularity Hub
Here’s How Nvidia’s Vice-Like Grip on AI Chips Could Slip

AWS / Cut back on ZT Systems Spendings / Patrick Moorhead / OpenTools
AWS Trims Spending on ZT Systems Amid In-House Hardware Boom

Google / Pixel 9 / Anshel Sag / Yahoo Tech
2025 could be very different for Google and Samsung — here’s why

NVIDIA / EU clears acquisition of Run:ai / OpenTools
Nvidia Gets EU Thumbs Up for Run:ai Acquisition!

Smartwatches / Ansehl Sag / Yahoo Tech
Here’s everything we expect and want from wearables and smartwatches in 2025

New Gear or Software We Are Using and Testing

  • Kindle Colorsoft (Anshel Sag)
  • Google Pixel Buds 2 Pro (Anshel Sag)
  • XREAL One AR Glasses (Anshel Sag)
  • Google Pixel Watch 3, 41mm (Anshel Sag)
  • Cisco Desk Pro (Melody Brue)
  • OnePlus Buds Pro 3 (Anshel Sag)
  • Insta360 Link2 4K AI Webcam (Anshel Sag)
  • Google Pixel 9 Pro Fold (Anshel Sag)
  • Google TV streamer – Matter and Thread features (Bill Curtis)
  • Various Matter devices (Bill Curtis)
  • ASUS Zephyrus G16 Gaming Laptop (Anshel Sag)
  • iPhone 16 Pro (Anshel Sag)

Events MI&S Plans on Attending In-Person or Virtually (New)

Unless otherwise noted, our analysts will be attending the following events in person.

  • CES, January 7-10, Las Vegas (Patrick Moorhead, Anshel Sag, Will Townsend) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • CES, January 7-10, Las Vegas (Patrick Moorhead, Anshel Sag, Will Townsend) 
  • Acumatica Summit, January 26-29, Las Vegas (Robert Kramer)
  • ZohoDay25, February 3-5, Austin (Robert Kramer, Melody Brue)
  • RingCentral Analyst Summit, February 24-26, Napa (Melody Brue)
  • SAP Analyst Council, February, New York City (Robert Kramer)
  • Adobe Summit, March 18-20, Las Vegas (Melody Brue)
  • Zendesk Analyst Day, March 25, Las Vegas (Melody Brue)
  • IBM event, March 25, NYC (Matt Kimball)
  • Canva Create & Analyst Day, April 8-10, Los Angeles (Melody Brue)
  • Nutanix .NEXT May 6-9, Washington DC (Matt Kimball)

Subscribe

Want to talk to the team? Get in touch here!

The post MI&S Weekly Analyst Insights — Week Ending January 3, 2025 appeared first on Moor Insights & Strategy.

]]>
RESEARCH PAPER: Evaluating the New Zoho Analytics https://moorinsightsstrategy.com/research-papers/research-paper-evaluating-the-new-zoho-analytics/ Mon, 06 Jan 2025 16:17:46 +0000 https://moorinsightsstrategy.com/?post_type=research_papers&p=44758 This report explores the evolution of Zoho Analytics into a sophisticated, AI-powered business intelligence (BI) platform.

The post RESEARCH PAPER: Evaluating the New Zoho Analytics appeared first on Moor Insights & Strategy.

]]>
Since its 2009 launch as Zoho Reports, Zoho Analytics has evolved into a sophisticated, AI-powered business intelligence (BI) platform. Its September 2024 release marks a significant leap forward, showcasing Zoho’s commitment to democratizing data analysis through over 100 enhancements. By heavily investing in AI and machine learning, Zoho has become a strong competitor in the BI market, challenging established players and empowering users to make more informed, strategic choices.

Click the logo below to download the research paper and read more.

Table of Contents

  • Summary
  • Zoho Analytics Tools for Data Preparation
  • Data Management Features
  • Varied Uses Across a Range of Roles
  • Interconnectivity with the Zoho Ecosystem
  • Future Trends and Zoho’s Role in the BI Market
  • Zoho’s Broader Roadmap and Strategic Vision

Companies Cited:

  • Zoho

The post RESEARCH PAPER: Evaluating the New Zoho Analytics appeared first on Moor Insights & Strategy.

]]>
Elastic’s Generative AI Momentum – Six Five On The Road at AWS re:Invent https://moorinsightsstrategy.com/the-six-five/elastics-generative-ai-momentum-six-five-on-the-road-at-aws-reinvent/ Thu, 02 Jan 2025 17:41:14 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44752 Ashutosh Kulkarni, CEO of Elastic, joins Daniel Newman to share insights on Elastic's GenAI momentum and future outlook, highlighting the significance of partnerships and ecosystem in accelerating AI innovations.

The post Elastic’s Generative AI Momentum – Six Five On The Road at AWS re:Invent appeared first on Moor Insights & Strategy.

]]>
Enterprise RAG?

CEO of Elastic Ashutosh Kulkarni sat down with our host Daniel Newman at AWS re:Invent 2024. They touched on how Elastic is driving generative AI adoption by empowering developers with tools and partnerships, focusing on efficiency and integration within the AWS ecosystem, and aiming to lead the enterprise AI sector in 2025.

Specific highlights covered ⤵

  • Drivers behind Elastic’s strong momentum in generative AI adoption and their efforts to help customers accelerate their GenAI projects.
  • Insights into the newly announced Elastic AI Ecosystem and its role in aiding developers to navigate AI product choices and integrations more efficiently.
  • The significant influence of the AWS partnership on Elastic’s strategic directions and key takeaways from this collaboration.
  • Elastic’s achievements in 2024 within the enterprise tech landscape.
  • Future prospects for Elastic and the evolving enterprise AI sector in 2025.

Learn more at Elastic, The Search AI Company.

Watch the video below at Six Five Media and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey everyone. The Six Five is On The Road here at AWS re:Invent 2024 in Las Vegas. It’s been a great week. Lots of announcements. Heard so much about all the things that are happening in the AWS cloud ecosystem, and of course AI has been a central focus. One of the things that we’ve been spending a lot of time during this week doing has been talking to the partners. As we know, AWS has a rich and vibrant partner ecosystem, and I have the chance, the opportunity to have one of my friends back on the show, a partner, actually an award-winning partner this year at AWS re:Invent. I’ve got Ash Kulkarni, CEO, Elastic. Ash, welcome back. It’s been a minute, but it’s great to see you.

Ashutosh Kulkarni: It’s been great. Thanks for having me.

Daniel Newman: So we’re going to hit AWS re:Invent, but maybe before that you and I, you’ve been on my market show, we’ve talked a number of different times. It’s been great to watch the company’s growth. You had earnings what, just about two weeks ago?

Ashutosh Kulkarni: That’s right.

Daniel Newman: Q2 really good results. The market seemed to really like it. Give me the quick rundown. I have to imagine this enthusiasm has something to do with the AI strategy that you’ve been able to execute upon.

Ashutosh Kulkarni: Yeah, AI was absolutely a big part of it. So in terms of the numbers, we grew top line revenue by 18%. Our cloud business grew by 25% and we delivered a strong operating margin. So everything was greater than what the street was expecting. We were really happy with that performance. The team’s been executing really well on multiple fronts. One part of it has been on leading the charge on becoming the platform for what’s called retrieval augmented generation, RAG, as it’s often referred to. This is about building general AI applications. And then our go-to market engine has been really leaning into that. So our customer base is responding wonderfully. So absolutely, AI has been just this nice tailwind.

Daniel Newman: Yeah, absolutely. And it’s interesting, Ash, is I think a lot of investors have been searching for companies, pun intended, that they can attribute AI in a very meaningful way. And what do I mean by that is I think there was a lot for a period from about the time of ChatGPT’s onset, there’s been a lot of this AI washing, oh, we have an AI thing, and then you’d get earnings call to earnings call or you’d listen to the transcript or read the transcripts, listen to the conversations, and the question was how much is AI actually moving the needle for your business?

Now, you and I talked over the course of a couple of years and one of the things I said is you guys were sort of a company made for this moment. So when you think about what you’re doing with search and with RAG, these are two of the biggest problems is how do I actually extract meaningful data that helps me drive my enterprise? So it seems like you’ve met the moment. I am guessing you’re probably hearing that from some of your biggest investors and partners.

Ashutosh Kulkarni: We clearly knew that there was an opportunity because at the end of the day, if you think about what Elastic has always been about, we’ve been about helping our customers find insights in the messy, unstructured data, Word documents, machine generated log files that are everywhere within the organization. And the hardest part of all of this was how do you get insights in a meaningful way from all of that data without drowning in it? And large language models became that one missing piece that was the unlock. And when you take these large language models and combine them with a search AI technology like Elastic, you’re able to very quickly find just the right nuggets within your enterprise data and pass it to the large language model. And now that large language model can respond to you in a conversational style just like ChatGPT, but for meaningful business applications, business workflows. And our customers are loving it.

One of the stats that I gave in our Q2 earnings was we now have over 1,550 customers just on Elastic Cloud that are using us for various generative AI applications. Our customer commitments almost doubled quarter over quarter, Q1 to Q2. So all of those are really great signs. And of those commitments, three of those deals were million dollar plus deals. So we are seeing customers now really lean in and use these applications in ways that are ROI generating. They are value generating for them and these are enterprises. So it’s been great.

Daniel Newman: Yeah, you’re able to basically attribute growth to a specific set of solutions that are AI.

Ashutosh Kulkarni: That’s right.

Daniel Newman: And that’s that 1500 plus customers. But then on top of that, your RPO is growing as a by-product. And of course everybody that’s invested in that particular space cares a lot about is that backlog building, is that demand building. And so you’re putting this together. I think part of it too, we talked about you being a partner, award winner, ecosystem matters because not everybody is outgoing looking solution to solution to solution. The marketplace and things like that have been really successful because people are kind of building their enterprises in the control plane of the cloud and then they’re looking which solutions can I integrate. Talk a little bit about what you’re doing with the Elastic AI ecosystem because I think it sounds to me like you’re sort of accelerating the partner ecosystem to get more people adopting your solutions.

Ashutosh Kulkarni: The ecosystem is going to be incredibly important for multiple reasons. So first is this is a very, very new space and anytime you’re dealing with a new space like this, the innovation is unparalleled. It’s happening everywhere. Large organizations like the cloud hyperscalers that are building their own large language models, you have companies like Anthropic and Cohere and so on also adding to that mix open source companies like Mistral, but then there’s an entire ecosystem of things like how do you guardrail and protect the output of a RAG system. Companies that do LLM orchestration and all of these taken together is important for this ecosystem to move forward and really for customers to build complete applications that they can deploy in the enterprise. So the way we’ve looked at it is we want to be the most open player in this ecosystem, build really deep integrations with everybody that matters within this environment.

So our customers don’t have to do the integration. The integration just works out of the box. So it doesn’t really matter whether you want to use OpenAI’s LLM or Mistral or whether you want to send some queries here and some queries there. But all of them work really well with the Elasticsearch vector database and our platform to deliver the most optimized RAG solution with minimum hallucinations. And that’s working really well. Our customers are loving it.

Daniel Newman: So the API connectivity’s that can run on a combination of these kind of closed black box as well as in Open. But you’re still very open and you can expose what you’re doing to them. So if someone wants to use OpenAI for business, they can still get the benefit for their custom internal data, the high value data, because that’s really where it’s all at. I mean, to some extent, I don’t know how you feel about this, but I bounce from LLM to LLM. I play with them all. I use Perplexity, I use Gemini, I’ve played with the AWS models. My point though is sometimes you’ll find that one you’re like, oh, I like this one for this thing. I like this for research, I like this for code generation. I like this for getting financial data quickly. But there is a bit of a table stakes element and I think that acceleration to how do we get our data, whether it’s distilled models, whether it’s in RAG, whether it’s fine-tuning so that the outputs has something that’s unique, otherwise it’s all the same.

Ashutosh Kulkarni: It all comes down to your data. If you can help monetize your data, drive automation in your business processes, that’s the name of the game for enterprises. So the data itself is going to be most important. And that’s where we have an amazing advantage in terms of just the incumbency that we have. Out there Elasticsearch today is easily the most widely downloaded vector database. There’s so much data already sitting in Elasticsearch and we’ve made it extremely easy for customers just with a minor change to take everything that they’ve done in Elastic for search and now turn it into a RAG style application. And now with that, we give them the choice of using any LLMs and in my opinion, a few things are going to happen or need to happen for this space to truly become everything that it can be. First, the cost of inference and the cost of running these RAG applications needs to keep coming down because without that, you’re going to see things capping out.

The second thing is privacy and security are going to be incredibly important. Without that, no enterprise is going to really put sensitive data through these applications. And that’s where we are going with our solutions. It’s the most privacy-sensitive and secure solution out there. Everything from document-level permissions to very fine-grain, rule-based access control and then our ability to continually drive the price down for our customers just means that they’re doing more and more with us. And the last quarter, I talked about an automotive company that’s building 32 different chatbots for different use cases on Elastic. And to your point, they’re picking different LLMs for different use cases because I don’t believe there’s going to be one thing to rule it all.

Daniel Newman: If there is, then there’s a lot of companies that are going to lose badly at this point ’cause there’s so much attention. But I think there is enough distribution of use and it’s really interesting and I think it’s worth kind of just a quick double-click, is what you’re effectively saying is what you’re able to do with RAG and Vector in the Elastic database can allow companies to use substantially less compute networking and resources because you can get the right data exposed to the application in a way that’s available that doesn’t require so much horsepower.

Ashutosh Kulkarni: That’s right.

Daniel Newman: Because the bigger the model, the more compute required. So I always say Nvidia makes great stuff, but it’s an F1 vehicle. And so it’s great on the strip here, we’re in Vegas, but if the strip’s got traffic, it’s no fun driving an F1 car. It’s got to be open.

Ashutosh Kulkarni: You got it. You got to bring the price down and you got to make sure that the answers are accurate and you’ve got to reduce the risk.

Daniel Newman: That’s why there’s options for different solutions. It doesn’t mean that one won’t continue to be great, it just means that the market’s going to get bigger and there’s going to need to be different options. That’s why we’re hearing so much about smaller models, stilling models is because it’s not efficient and obviously we’ve got an issue. We only have so much power, all these liquid-cooled racks going in data centers, but sometimes you just need a very efficient architecture and it sounds like you’re getting there. So let’s talk a little more specific about AWS since we are here. How is the partnership, which seems to be very strong, influencing your strategy? What are you learning about this as AWS is going through their own transformation with AI?

Ashutosh Kulkarni: Yeah, so first of all, the news of the week for us was we won the AWS Global Partner of the Year for generative AI and data, which was great. So the recognition that-

Daniel Newman: Congratulations.

Ashutosh Kulkarni: Thank you. In their ecosystem, they see us as one of the absolute leaders in innovating in the area of vector databases, in RAG and so on. That’s been fantastic. And the way we have always approached AWS is first and foremost, they’re a wonderful marketplace and channel for us to work through because our customers, many of them want to run Elastic, our platform on AWS. So we make it possible for our customers to purchase Elastic through their commitments that they’ve already made through the marketplace to AWS. And I see it in three phases. I always think about it as build with, market with and then sell with. And you have to start with the customer experience. So that’s all about building these deep integrations and that’s what has really helped us absolutely improve our experience and our relationship with AWS like never before. This year, we had a prime spot when it came to the convention hall where we were exhibiting. We are a diamond sponsored this year, and it was just fantastic to see the foot traffic. I’m very excited about what this means for the future.

Daniel Newman: Other than that long walks around this event, which anyone that’s been to-

Ashutosh Kulkarni: I got my steps in.

Daniel Newman: Yeah, definitely one of the most compelling kind of partner exhibits out of all the events I attend every year is this one. The ecosystem AWS is built is second to none.

Ashutosh Kulkarni: It’s amazing.

Daniel Newman: It is really, really well done. So let’s kind of move a little bit more to the broad Elastic viewpoint here. You and I have talked, like I said a handful of times. I think you rose from the engineering and product side. In your last answer I could hear it, a salesperson might come from a different lens and you can definitely sense that, but how do you sort of now looking at it through the lens as CEO, you’re a couple years now, right?

Ashutosh Kulkarni: Three years.

Daniel Newman: Yeah. Gosh, it’s gone fast.

Ashutosh Kulkarni: Time flies.

Daniel Newman: Congratulations. But sort of this progress in transformation, how do you kind of evaluate it? How do you grade it? How do you characterize it?

Ashutosh Kulkarni: So I take a much broader perspective on the opportunity ahead of us. When I think about what we represent as a search AI platform, Elastic is all about helping our customers with unstructured messy data, like I said, and there’s multiple orders of magnitude of that unstructured data in most organizations than there is structured data that sits in databases that’s used for BI and so on. But the challenge ultimately has always been how do you find the right insights out of that data in real time? And that’s been Elastic. That’s been our core strength. When you add the power of large language models to that mix, now all of a sudden you can present a very conversational way for people to explore that data. And then it unlocks all kinds of fascinating use cases.

Observability, which now represents 40% of our business. This is all about very quickly finding what’s in your environment that might be slowing down your application. How do you monitor your systems, keep them running at the optimum speed and so on. Security, which now represents over 25% of our business. This is all about cyber security. How do we prevent threats? So for me, the opportunity landscape is just massive. My aspiration is that there will be a day when every single organization, big and small will need a search AI platform and we will be it. And that’s what we are going to work towards.

Daniel Newman: Yeah, it’s very aspirational and it’s also a massive opportunity. In the end, most people will consume AI through the lens of some sort of managed service, whether it’s the SaaS application itself or a managed service. I mean the DIY companies, we’re seeing them. We know who they are. They’re building these massive clusters, these racks, these tools. But most companies, the way SaaS had its own sort of revolution and even enterprise software did, it’s about making the business need available and making it simpler to consume. And so I think that’s where you’re heading. And I mean, I think that’s the opportunity.

Ashutosh Kulkarni: It’s a multi-billion dollar opportunity. So we are very excited about it.

Daniel Newman: So is that what you’re most excited about as we head into 2025? Give me across the tech and AI and Elastic landscape, give me the most exciting thing for the year ahead.

Ashutosh Kulkarni: From our perspective, the two things that are really driving our momentum, first and foremost, it’s genitive AI like we talked about, but it’s not just about customers going from search to semantic search to these RAG style conversational applications and then agentic workflows for entirely automating business processes. That’s one big pillar for us, and it’s been just fantastic to see the momentum. The second pillar for us is we are infusing AI into our observability and security solutions. And that’s helping us disproportionately win and take share both in security and observability. And we are seeing customers consolidate onto our platform, displacing incumbent vendors in those spaces. And so we are leaning in hard into this idea of a search AI platform. So that’s really exciting for me.

But long-term, what I also see is the industry is now getting a lot of things right. You are seeing this week at AWS at re:Invent, there were a lot of announcements around custom silicon. And when you see all of that, what’s going to happen is the price of compute is going to come down, the price of inferencing is going to come down, and that will benefit every layer of software that sits on top of this hardware. It’ll benefit companies like Elastic and effectively it’ll raise all tides. So I’m super excited about what this means for the whole AI ecosystem.

Daniel Newman: And I love that you called that out. We’ve been evaluating and watching that part of the market very, very closely. First of all, that whole segment’s going to grow a lot. And the idea that you can basically have workload-specific silicon, which you always could, but there was kind of idea that everything needs to be a giant GPU. And it’s like, no. I mean a lot of use cases, I mean, heck, you can still do a lot of inference on a CPU.

Ashutosh Kulkarni: Absolutely.

Daniel Newman: But I mean there are these very efficient inferencing in the cloud, chips, managed services, and it’ll certainly benefit Elastic and the industry as a whole.

Ashutosh Kulkarni: Effectively what you’re seeing here is things that, you’re not seeing displacement of activities that we were doing before. You’re seeing net new things happening. You’re automation of workflows that just were so tedious and so painful. Yeah, I talk about we have customers who use us to build chatbots for customer support and call center automation and so on. And yeah, I always ask this question, you talk to a call center representative that’s worked a shift and see their energy level at hour one and their energy level at hour seven, there’s a marked difference because nobody calls a call center to thank them. They usually call a call center because they’re upset about something. So if you can automate the grunt work there and allow that person to actually deal with the harder problems, that’s a huge unlock of human potential. And now you’re driving meaningful return on capital, it’s going to raise all tides. This is what really excites me about AI.

Daniel Newman: Yeah, nobody after an eight-hour shift of being yelled at that their dog food didn’t show up as feeling good about themselves. But yeah, absolutely. Ash Kulkarni, thank you so much for joining me here on The Six Five.

Ashutosh Kulkarni: Thank you.

Daniel Newman: It was a lot of fun to have you here. Congratulations on everything.

Ashutosh Kulkarni: Thank you.

Daniel Newman: And thank you so much for joining us for this episode of The Six Five. We’re On The Road here at AWS re.Invent 2024. Hit subscribe. Join us for all of our coverage here, and of course, be part of our community and watch all of our shows when you have time. But for this one, this episode, time to say goodbye. I’ll see y’all later.

The post Elastic’s Generative AI Momentum – Six Five On The Road at AWS re:Invent appeared first on Moor Insights & Strategy.

]]>
ASUS Zenbook S 16 Review — Ryzen AI 9 HX 370 Processor At Its Best https://moorinsightsstrategy.com/asus-zenbook-s-16-review-ryzen-ai-9-hx-370-processor-at-its-best/ Wed, 01 Jan 2025 22:01:44 +0000 https://moorinsightsstrategy.com/?p=45013 Sporting the fastest AMD processor available, this laptop offers a great balance of performance, entertainment and productivity, despite some sacrifices in battery life.

The post ASUS Zenbook S 16 Review — Ryzen AI 9 HX 370 Processor At Its Best appeared first on Moor Insights & Strategy.

]]>
The author using the ASUS Zenbook S 16 Anshel Sag

The ASUS Zenbook S 16 laptop, in my opinion, has been the flagship laptop using AMD’s Ryzen AI 300-series processors going back to this past June at Computex in Taipei, where I first set eyes on a preview version. AMD formally launched these processors, along with its other Zen 5 laptop and desktop processors, in July. While this specific laptop wasn’t launched until August, it did stir up quite a bit of excitement with its new CPU, GPU and NPU capabilities.

While Microsoft is enabling this laptop with new Copilot+ capabilities in preview thanks to the Ryzen AI 370 processor inside it, I wanted to write a review of this laptop and evaluate how the Zen 5 CPU adds to the experience. ASUS is able to pack such a powerful chip into such a sleek design thanks to the laptop’s advanced thermal design and “Ceraluminum” (sleek high-tech ceramic) materials, which make the laptop lightweight, thermally conductive and sturdy.

The AMD Ryzen AI 9 HX 370 Processor

The AMD Ryzen AI 9 HX 370 chip (admittedly a mouthful) is the powerhouse behind this laptop’s capabilities — and the fastest in AMD’s lineup available today. That said, it is likely that a thicker, more thermally robust laptop design could squeeze even more performance out of this chip. This is important to note because every laptop’s implementation of this chip will likely vary on thermals and power delivery and could affect peak performance. Additionally, battery life can be affected by different implementations of the same chip as well.

The HX 370 is a 12-core chip, featuring four Zen 5 cores and eight Zen 5c low-power cores. This is clearly a configuration designed for power efficiency without compromising too much on performance. That’s why it should be no surprise that single-core performance fell within the middle range of my GeekBench 6 benchmark results of Copilot+ PCs powered by Qualcomm’s Snapdragon X Elite processor. Where it faltered was in multi-core performance, where it fell towards the bottom of the stack. Cinebench performance was also towards the lower end of the range, which I was a little surprised by. GPU performance was great, especially in Counter-Strike 2, where I got a very stable and playable 60 FPS on medium settings, something that Qualcomm couldn’t claim.

AI performance was one of the hardest things to measure because there aren’t any benchmarks that can test AMD’s NPU, so all of them are forced to test the GPU. As such, AMD’s AI performance — at least from the NPU — is a mystery compared to the competition. This is an area where I believe AMD has fallen behind and is playing a fast-follower role, trying to follow Qualcomm’s and Intel’s lead and capitalizing on their successes and failures. That said, this approach does not really fit AMD’s current market-leader position, and I believe that the company should be investing more in developer access to its NPU.

Battery life is another difficult discussion to have for this laptop because, although it’s quite clear that the platform brings a significant improvement in battery life over the previous generation, that’s hard to quantify with a 3K OLED display and a 78 Wh battery. OLEDs are inherently power hungry, and a 78 Wh battery is quite large and designed to account for that hunger, which may negate any chip-level power savings. Even so, we’ve seen similar configurations from Lenovo using Qualcomm processors that get much better battery life. In my experience, I could get about eight hours of work time on this laptop, which is an improvement over the preceding generation but not as good as what I’ve seen from comparable platforms. In fact, I would assess that battery life was about on par with the Samsung Book4 Edge 16, which unsurprisingly has the same OLED panel — which I believe affects battery life the most.

The Zenbook S 16 stacked on top of the Samsung Galaxy Book4 Edge 16 Anshel Sag

The Zenbook S 16 Laptop

As a complete laptop in terms of design and functionality, the Zenbook S 16 is a stunning machine. I remember first seeing it at Computex and being enamored with the design and weight. And I was excited when AMD sent me one for review because it was one of my most anticipated laptops of the year. System performance aside, this device does feel quite thin and light — much like the Samsung Book4 I mentioned, although it has a very different design language.

The Zenbook S 16 also benefits from having a more mature x86 processor inside, meaning that application compatibility isn’t a problem. Additionally, most of ASUS’s choices with this system are overwhelmingly good: 32GB of RAM, 40 Gbps USB-C ports, and full-size USB, HDMI and SD card slots. I also don’t mind the OLED panel decision because OLEDs are simply superior to LCDs; still, there is a considerable battery-life penalty for OLEDs no matter what you do. The OLED also came in quite handy for movie watching and photo and video editing, which I did plenty of over the last few months.

I did, however, run into a long-time Windows issue on this machine that I had all but forgotten about once I had reviewed a few Arm-based windows PCs over the past few months. The issue is that the Zenbook S 16 woke up in my laptop bag overnight and drained its battery while overheating inside my bag. I truly thought that the industry had moved away from this problem, but it seems that x86 systems are still plagued by it, while Arm systems are not. Another oddity I experienced with this laptop was that the display would go dark for a second when switching between battery and wall power. I can’t say I’ve experienced that before.

Thinking of power issues, I also believe that ASUS should consider having USB-C ports on both sides of the laptop. Yes, there is a USB-A port on one side, but the reality is that people often need the option to charge from either side of the laptop. I hope that for the next generation ASUS considers this a necessity and puts at least a 10Gbps USB-C port on the right-hand side.

Copilot+ Is Coming Soon

While this laptop is not officially a Copilot+ PC yet (that update is available via Insider Preview Dev Channel), it will be mainstreamed to the broader public in 2025 once Windows Insider Preview users help work out any bugs. The entire situation with Windows Copilot+ updates and which features are available based on different silicon vendors is confusing. It also remains unclear how long it’ll be until x86 Copilot+ PCs reach true feature parity with their Qualcomm-based peers. Still, the Copilot+ capabilities, while limited and not fully ready, are a nice bit of futureproofing for added value.

With all that said, and with special thanks to the powerful Ryzen AI 9 HX 370, I am extremely comfortable recommending this laptop to anyone who wants a great balance between performance, entertainment and productivity.

The post ASUS Zenbook S 16 Review — Ryzen AI 9 HX 370 Processor At Its Best appeared first on Moor Insights & Strategy.

]]>
Unlocking Cloud Efficiency: AWS Reveals AI-Driven Operations – Six Five at AWS re:Invent https://moorinsightsstrategy.com/the-six-five/unlocking-cloud-efficiency-aws-reveals-ai-driven-operations-six-five/ Tue, 31 Dec 2024 16:46:10 +0000 https://moorinsightsstrategy.com/?post_type=six_five&p=44748 Nandini Ramani, Vice President of Search and Cloud Ops at AWS, joins Daniel Newman and Patrick Moorhead, sharing insights on AWS's innovative approaches to cloud operations, including AI-driven tools like Ops-i-tron and the Explore Related button.

The post Unlocking Cloud Efficiency: AWS Reveals AI-Driven Operations – Six Five at AWS re:Invent appeared first on Moor Insights & Strategy.

]]>
The cloud ops landscape is evolving at lightning speed ⚡ from simple web servers to serverless functions and now, the explosion of AI workloads. 🤯

Hosts Daniel Newman and Patrick Moorhead are joined by Amazon Web ServicesNandini Ramani, Vice President of Search and Cloud Ops, on this episode of Six Five On The Road at AWS re:Invent, for a conversation on AWS’s latest innovations in cloud operations, spotlighting the integration of AI and machine learning to elevate efficiency and performance.

Their discussion covers:

  • The evolution of cloud operations challenges over the years and AWS’s approach to addressing them
  • Lessons learned from AWS’s 17+ years in operation and their impact on the development of cloud services
  • Integrating AI and machine learning in cloud operations for enhanced efficiency and performance
  • An introduction to the Ops-i-tron concept and its significance in cloud operations
  • Exploring the new “Explore Related” button and its role in simplifying troubleshooting across interconnected services

Learn more at Amazon Web Services.

Watch the video at Six Five Media at AWS re:Invent, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On The Road here in Las Vegas. We are at AWS re:Invent 2024. Dan, the conversation has been unsurprisingly about AI, different ways to slice it, just different ways to operationalize it. Just getting ready for this gigantic enterprise swell of AI that I think both our firms have estimated is a little bit off.

Daniel Newman: Yeah. Well, I think a lot of people are starting to get the fatigue of hype. And what they’re starting to look for is pragmatism. They want to understand where does this technology really help drive the enterprise. We hear these astounding numbers, 20 trillion of economic opportunity, 25 trillion. I’ve heard some gigantic numbers. You’re hearing half a trillion dollars of spend just on the chips in the next handful of years. But in the end, a lot of this starts to be about how we experience things.

Patrick Moorhead: It is, and enterprises are trying to get their data estate in line, questions about security, questions about governance. And then doing AI at scale or any scale, you have to be able to operationalize it. IT ops we cannot forget because that basically keeps everything moving. And if you don’t put that into your strategy with AI, you’re going to not be able to do this. And I can’t imagine a better person to talk about this, Nandini, welcome to The Six Five.

Nandini Ramani: Thank you for having me. Looking forward to this chat.

Patrick Moorhead: Yeah, let’s talk about IT ops at scale.

Daniel Newman: Of course, it’s been a really exciting week. It’s a fire hose and everybody kind of knows that, that is the AWS, so much engineering prowess, so much pedigree in this particular space. But yeah, as we see all this data move to the cloud, we see all these workloads, you got to keep it up and running. And I know you gave a talk, you talked about … by the way, I think you had three challenges and I always love this. What keeps you up at night? CIOs, CTOs, CISOs-

Patrick Moorhead: Operators.

Daniel Newman: They are awake at night trying to figure out how do they make ll these cool features that we like to talk about work. So give a little bit of that background of what’s keeping them awake and how’s that changing in this era?

Nandini Ramani: That’s a great question. So when you think about the complexity these days, you started off with staggering numbers just in terms of compute, the spend that’s coming with Gen AI, et cetera. So if you think about the trajectory of where we started and where we are, we had simple days where you had one box, maybe one for load balancing redundancy, you ran a web server on it and you were off to the races. Then we got to EC-II as compute, then we went to EKS, ECS. And now serverless with ephemeral little workloads that spin up, disappear. You have no idea where things are running. But it lets you scale. It lets you do so much more than you could initially. So I always ask our customers, what keeps you up at night? And it’s a trifecta, if you will. Number one, they want their operations to scale as their business scales.

Patrick Moorhead: Makes sense.

Nandini Ramani: Without having to do anything, it just needs to work out of the box. Second, they want to be able to have insights into the data and all the telemetry that’s being emitted. Whether it’s on-prem, on their own, whether it’s on EC-II instances or in some cases, multi-cloud. No matter where it resides, you want to be able to gather insights without doing any heavy lifting on ETL.

Patrick Moorhead: Right. Understand.

Nandini Ramani: Maintaining pipelines, doing all of those complicated things. Third, everybody wants automation. Can you just automate it? Give me built-in controls, fully managed. Those are sort of the three things that we try to address, and that’s what most of our services do for them. The undifferentiated heavy lifting. So our customers can just focus on their business and their end customers.

Patrick Moorhead: No, it makes sense. I mean, listen, AWS was all about simplicity. Focused initially, actually still focused on developers and builders and letting a lot of the driving to somebody else that they just didn’t want to do because it didn’t add business value. So that makes total sense. So I want to drill down a little bit into the history and what you’ve learned over the past 17 years that have given input into the products that you’ve chosen and the services that you’ve chosen to deliver at scale operations.

Nandini Ramani: It’s exactly the 17 years of experience. And remember, we were the first cloud. So we’ve got 17 years to build things for ourselves. That’s how it started, AWS was born because Amazon was scaling. And we needed all the challenges that I outlined came from learning of 17 years of building it for ourselves. In fact, systems manager, which is one of our services, was built so we could maintain our own instances and keep Amazon retail running. That’s how it started out. And now we externalize it. And that’s typically what we do. In fact, if you take CloudWatch, our flagship observability product service, we use it heavily internally.

In fact, you can find developers both across Amazon and AWS, pouring over dashboards, troubleshooting to make sure that we are ready and always available for our end customers who rely on us. So it is that 17 years of experience that has helped us to it. And in fact, we ourselves use our tooling in a similar fashion internally. And the second thing I would say is, I have never found a company where we listen to our customers so closely. 90% of our roadmap is driven from customer requests. That’s what we do, we have these operating plans that we build. They’re entirely based off of the customer requests. I think it’s those two things, our own experience and what customers want.

Daniel Newman: So you heard us in the preamble talking about AI and the acceleration, and you sort of alluded to it, because you were giving a little bit of the histrionics of going from web server to container to serverless. And AI’s kind of doing the same thing. We’ve had this era of sort of data and data management, and then we had this machine learning era, and now we have the AI era. And cloud operations has to follow this. How are you integrating cloud operations into … I don’t know, the last two days of announcements which are almost all built on a combination of managed AI services and self-built AI services that enterprises are really just beginning to adopt?

Nandini Ramani: Yeah. And I think Swami said it in this keynote today, and it’s true. We’ve always had it, if you think about anomaly detection, we’ve always had the journey of AI ML. And now Gen AI, where it’s the ability to reason. So first of all, Amazon Q Developer, our flagship product, think of it as the one and only service that fully understands AWS. That is powerful. You don’t have to go to run books, you don’t need to call support. Like you can ask it any question about AWS and it gives you an answer. So that’s, in and itself, it’s already powerful. But on the journey of Gen AI, last year we released a natural language querying, also powered by Gen AI behind the covers. Because every tool has its own query language, SQL this, PPL that, and so on and so forth. So we’ve had that integrated in config, CloudTrail, CloudWatch, OpenSearch, all of them now support natural language. And we’ve received tremendous feedback, it saves a lot of time for developers instead of typing queries.

This year, Matt Garman announced in his keynote, ability to do operational investigations with Q Developer. So what it does, you ask Q Developer a question, it brings you to the CloudWatch console and you can start troubleshooting. We have 17 years of experience on our own services. So we built a knowledge graph based on those learnings of so many years of what customer behavior patterns, how do they use our services. And so it’s able to let you know … once you turn the investigation on, it’ll traverse and tell you, what the likely cause is. And the typical causes for things going wrong, deployment, configuration changes. Or in some cases, load balancers, auto-scaling, those sort of problems. So it can pinpoint, it builds the topology for you, pinpoints where the problem is. And if you accept it, you can even remediate it in place.

Now Matt also alluded to the fact that currently there are hallucinations, and it’s not a hundred percent yet. But we’re working hard with automated reasoning to make sure that it is solid and it can correct itself as it builds up the confidence. So that capability is available today for everyone to use in preview. And in fact, like I said earlier, we always eat … I like to say we sip our own champagne. But basically we use it internally. And in fact, Amazon Kindle support team has used Q Investigations and they have saved 65 to 80% in troubleshooting time. That is phenomenal. Think about the possibilities when people start using this at scale. And I truly believe, just like today, we don’t talk about anomaly detection as a thing. Gen AI will just be part of everything we do.

Patrick Moorhead: Yeah, so a couple themes I’m picking up so far here is first off, customer zero. Amazon, but also when it comes to cloud operations, your customer zero for all this … Q, by the way, when Matt got up on stage and showed all the operational stuff you could do with Q, I thought that was pretty cool and pretty amazing. And there’s also the at scale part. But I do have to ask you, in your talk, this word phraseology came up that I thought was pretty cool. Opsitron. I hope I’m saying it correctly.

Nandini Ramani: You are saying it right.

Patrick Moorhead: What is Opsitron?

Nandini Ramani: So I mean, it’s a pun on the fact that we run cloud operations. So Ops and cloud Ops, and we came up with … I didn’t, to be honest.

Patrick Moorhead: What’s the tron? Is it-

Nandini Ramani: It’s a made up thing for ourselves.

Patrick Moorhead: It’s like a verb. It’s a verb.

Nandini Ramani: It’s a verb. But now it’s going to be for us, anyway. But the idea is-

Patrick Moorhead: It’s a movie.

Nandini Ramani: We build individual services. Like I talked about Systems Manager, which can do node management, CloudWatch does observability. Individually these things are very powerful. Whether there are some metrics logs. So what we came up with is each of our individual services are powerful in themselves. I think it’s a quote from fourth century BC from Aristotle. The sum is greater than its parts. But I was like, I don’t want to use that analogy. So they came up with this fun new contemporary way of saying, individual bots. The whole theme for the talk was, we have a metrics bot and a logs bot and so on and so forth. But when they come together, they become even more powerful and help you troubleshoot much faster.

Patrick Moorhead: Yeah, it’s like the Wonder Twins unite. I’ve been here, I watch the cartoons.

Nandini Ramani: But it’s a theme for us because just like I said, we do the undifferentiated heavy lifting. This is another thing we want to do. We don’t want you … the customer shouldn’t have to stitch all this information together. We want to do it for them. So we thought it’d be fun and it seems to resonate, so we got the theme of Opsitron.

Daniel Newman: Tell us a little bit more about that though, the kind of explore-related button. The demo looks like it’s basically stitching services together and making the observability or observable nature of all-

Nandini Ramani: Much easier. So Gen AI is still early. And so we’ve also built a contextual graph within CloudWatch which pulls up, that’s the one you’re alluding to, so you don’t have to type anything. Just click, point and click and it guides you through the topology. It points to where the issue is and it takes you all the way from metrics to logs. Which is usually the hard part of doing troubleshooting. Like it’s like looking for needle in a haystack. Multiple times it’s looking for a particular needle in a particular haystack.

Or how many needles in a haystack. I can expand on those analogies, but that is the part that’s so hard for folks to do. So picture, you have the contextual thing guiding you through the telemetry and you have this investigation assistant. If this answer that you are deriving as humans, which is what we do today, aligns with what you’re seeing with Gen AI, it improves the confidence and it improves the learning capability of the service, the operational assistant. I think that combination is going to be amazing.

Patrick Moorhead: Yeah. So another scenario was troubleshooting. And I think it was this, now that we’re talking about mashups between CloudWatch and APM, can you talk us through a little bit about that? The need, the value, the benefit.

Nandini Ramani: Yeah. So while we live in infrastructure land all the time, and that’s our world.

Patrick Moorhead: Infrastructure’s cool.

Nandini Ramani: I think it’s very cool, but it’s not for everyone. But our customers want to focus on their business, their application. They need all the infrastructure, they need the nodes, they need the logs, they need the telemetry. But what they really care about is, is there any latency for my end customer? Are there packet losses? Did my latest deployment cause an issue? So if it’s … they need to start at the application and this is what we do internally. We take everything as a span from every web service, and we convert that into logs and that’s how we troubleshoot. So that is the feature we’ve launched now with application Signals, bringing it together with the service that many of our customers use called X-Ray. And the latest one where you can actually go from those spans, that was the demo in the innovation talk that David showed.

Daniel Newman: Yeah. So we’ve covered a lot of ground and kind of the history of AWS and re:Invent is all about this fire hose, this funnel of announcements. So you touched on a few. Let’s do the recap. Let’s kind of end this thing a little bit on the recap. Biggest announcements in your business, what are you most excited about? What do you want all the viewers out there to take away from this conversation? As your sort of big moments from this year’s re:Invent.

Nandini Ramani: Yep. So the biggest things for me, some of the launches I already talked about, the investigations assistant. Please kick the tires on it. The other thing I would say is fault injection service. Resiliency is so important. Think about Prime Day, I can’t think of anything that needs more high availability than Prime Day. So we from AWS help retail run over 700 experiments on fault injection service. So I would encourage viewers to give that a try because resiliency is as critical as observability.

For example, we’ve launched Database Insights and CloudWatch, we’ve launched … oh, here’s a big theme that I’m super excited about. Zero ETL, remember I said customers don’t want to move their data around. So we have Zero ETL between CloudWatch and OpenSearch service. And we’ve also extended that to Security Lake and OpenSearch service. So you can run analytics. OpenSearch has very rich analytics and you can run that no matter where your logs reside. Whether it’s in security use cases or in CloudWatch, it just works seamlessly. Containers, many of our customers run their applications on containers. So we’ve launched Enhanced Container Insights for EKS last year, and this year we launched it for ECS as well. We have two new preventative policies to help you prevent drift once you set your configuration. We’ve had enhanced node management capability and systems manager. I’m just like, as you can tell, I love this stuff. So we have so many launches.

Daniel Newman: Appreciate all of the children, as we like to say-

Patrick Moorhead: She does –

Daniel Newman: No favorites, right? You don’t want to upset anybody.

Nandini Ramani: I have no favorites, I love them all.

Daniel Newman: Any of your product leaders. You want them to know you love them all.

Nandini Ramani: I love them all.

Daniel Newman: You love them all and all those customers clearly, including Amazon as Customer Zero. Appreciate it so much you spending the time here with us at re:Invent. I’m sure it’s very busy, if your feet hurt a little bit like mine do from all the steps you’re getting in.

Nandini Ramani: Love the steps, though. But wear comfy shoes.

Daniel Newman: Yeah, you wear comfy, but they’re-

Nandini Ramani: Lesson one.

Daniel Newman: Still stylish. Y’all can’t see them, but I promise you they are. And for everyone out there, I want to thank you so much for joining us here. The Six Five is On The Road at AWS re:Invent 2024 in Las Vegas. Covered a ton of ground. Subscribe, join us for all of our other content and coverage here from Pat and I from the whole Six Five team, it’s been a busy week. But we got to go for now. So we’ll see you all later.

The post Unlocking Cloud Efficiency: AWS Reveals AI-Driven Operations – Six Five at AWS re:Invent appeared first on Moor Insights & Strategy.

]]>
Microsoft Fabric Simplifies Data Management For AI https://moorinsightsstrategy.com/microsoft-fabric-simplifies-data-management-for-ai/ Fri, 27 Dec 2024 22:25:12 +0000 https://moorinsightsstrategy.com/?p=45000 Microsoft has introduced new AI-driven features within the Microsoft Fabric data platform to accelerate application development and improve other enterprise functions

The post Microsoft Fabric Simplifies Data Management For AI appeared first on Moor Insights & Strategy.

]]>
Microsoft has introduced new AI-driven features within the Microsoft Fabric data platform to accelerate application development and improve other enterprise functions. Microsoft and Robert Kramer

As 2025 approaches, managing data effectively is becoming increasingly important for unlocking the potential of generative AI and AI agents. Over the past year, I’ve written extensively about the need for clear data strategies to make AI practical and impactful. At the 2024 Ignite event, held November 19 to 21, Microsoft shared updates on its work with data and AI, providing insights into how it addresses these challenges.

The event highlighted Microsoft’s efforts to integrate AI into its tools, tackle data management challenges, enhance security, expand cloud functionality and make its systems more practical for enterprise teams and frontline workers. Particularly important from a data perspective is the newly expanded Microsoft Fabric, and here I want to dig into Fabric’s role in data management, especially to enable faster and better AI development.

Microsoft Fabric offers new ways to manage and use data, helping streamline workflows, enhance productivity and improve AI integration. Microsoft

Key Components And Features Of Microsoft Fabric

Microsoft Fabric is a data platform that provides tools covering the entire data lifecycle, from integration and engineering to storage, analysis and reporting. Arun Ulag, corporate vice president for Azure data at Microsoft, described Fabric as “Office 365 for data,” highlighting the company’s intent for Fabric to offer a unified, comprehensive approach to solving data challenges.

Fabric combines several components into a cohesive platform. Data Factory is a tool for data ingestion, integration and orchestration, enabling the creation of data flows and data pipelines from multiple sources. Data Engineering focuses on preparation and transformation of that data to make it more usable. Scalable storage is handled through the Data Warehouse module, and the Data Science component allows users to perform various data science workflows, including data exploration and the creation of machine learning models. Real-Time Intelligence enables working with live data streams from IoT devices, applications and logs. (Databases is addressed in detail below.) Microsoft Fabric also provides industry solutions across sectors such as manufacturing, healthcare, financial and retail—with specialized functions available for sustainability as well. Customers can also use Power BI for robust data visualization and reporting.

Underlying these components, OneLake offers a universal data lake, enabling users to access and query data from various sources without needing to migrate it. It supports open data formats such as Apache Parquet, Delta Lake and Iceberg, which ensures compatibility and avoids vendor lock-in. Microsoft has also added the OneLake catalog within Fabric to simplify data management and governance. It includes key features such as the Explore tab, which helps users find and access data, and the Govern tab (slated to launch in preview soon), which provides tools for data owners to manage and protect their data. The catalog integrates with Microsoft 365 apps such as Excel and Teams, allowing users to access data directly within frequently used tools. Its connection with Microsoft Purview adds more governance capabilities, including global data catalogs, policy enforcement and data quality management, which are essential for managing diverse data sources effectively.

All in all, these capabilities simplify data discovery and governance, particularly for organizations that already use Microsoft tools. Arun Ulag stated, “Fabric is designed to meet the customers’ needs, to support their data journey.” Overall, Fabric is equipped to address data management challenges by improving data connectivity and reducing unnecessary complexities.

Microsoft Fabric Databases — Enabling AI Developers

The release of Microsoft Fabric Databases addresses a major pain point for developers: the complexity of integrating operational data with AI capabilities. Essentially, it makes it easier to work with data and AI models within the same platform, which should simplify the process of building AI applications. The integration of key functionality—such as native vector support, API connections to various AI models and compatibility with development environments such as Visual Studio Code and GitHub—means developers can focus more on building and less on managing infrastructure. The initial offering in Fabric Databases is Azure SQL, with plans to include Azure Cosmos DB and Azure Database for PostgreSQL in the future.

From a strategic standpoint, Microsoft is looking to reduce friction for developers by providing a unified environment where they can easily access both data and AI services. This also brings security into the fold with automatic features like cloud authentication and encryption, so developers don’t have to worry about setting up those aspects separately.

In terms of market impact, I think Microsoft is positioning itself to attract a broader audience by making AI development more accessible, even for those without deep expertise in data management. The idea is to lower the technical barriers to AI adoption and, in the process, build tighter integration into the Azure ecosystem. As developers can now replicate data from different sources directly into OneLake, it reduces the complexity of managing multiple data platforms, which could be a significant advantage for organizations looking to scale AI-driven applications more efficiently.

Ultimately, Microsoft seems to be betting on a more seamless, end-to-end solution that could drive adoption of both its AI tools and the Azure infrastructure. If successful, this approach could set the company apart from competitors by providing a smoother, integrated experience for developers working across both data and AI.

Microsoft Fabric Addresses Business Challenges

Microsoft Fabric is a valuable tool for organizations that are incorporating AI into their operations. Its integration with Azure AI Foundry and Copilot Studio supports the development of AI agents and applications, enabling developers to transform data into automated workflows. Fabric’s tools also simplify data preparation tasks such as classification, summarization and extraction with minimal code, making AI development faster and more accessible to a broader range of developers.

Fabric addresses key data and AI challenges businesses often face. OneLake can be used to eliminate data silos by providing access to data from various sources without requiring migration. The platform also simplifies complex data pipelines, further easing integration and transformation tasks. Its serverless compute model optimizes resource use, reducing infrastructure costs. Fabric also helps businesses gain insights faster by accelerating data analysis. In one example of its value provided by Ulag, a U.K. consumer goods company reduced its data spending from $165 million to $45 million after transitioning 15 products to Fabric over a year.

Microsoft plans to enhance Fabric with expanded database integrations, improved AI tool alignment and better scalability and governance—all of which should make it even more capable of addressing data management challenges and improving AI development processes. Staying informed about new features as they are released will be crucial for organizations and developers to utilize the platform fully.

Competitive Landscape

From a high-level perspective of the data-to-AI pipeline, Microsoft and its peers among the cloud service providers are positioning themselves to offer integrated platforms that enable enterprises to make the most of their data for driving AI-powered applications and insights. Microsoft Fabric is a critical piece of Microsoft’s data and AI strategy. It integrates a data lake house, data engineering, machine learning, analytics and business intelligence into a unified platform that connects important tools and services both inside and outside the Microsoft ecosystem.

AWS maintains a strong presence in this space with its own suite of services, including the Redshift cloud data warehouse for analytics, S3 for storage and the newest generation of SageMaker, which brings together machine learning and analytics capabilities. The new SageMaker Lakehouse unifies S3 and Redshift data to enable rapid AI development. Customers can also draw upon the AI modeling capabilities within AWS Bedrock.

Google Cloud has built its own ecosystem around the BigQuery data warehouse, complemented by tools such as Dataflow for streaming analytics and Dataproc for data processing, as well as the Vertex AI development platform. Meanwhile, IBM’s data fabric solutions provide a platform for managing and integrating data with AI capabilities. Cloud Pak for Data serves as the core, supporting data solution development and deployment. The platform also includes DataStage for data integration and transformation, Db2 for creating and managing data lake tables, and Watson Knowledge Catalog for organizing and governing data. Guardium Data Protection enforces security, while watsonx facilitates AI-driven analytics and machine learning. These components work together to ensure that data is accessible, secure and prepared for AI applications across hybrid cloud environments.

Microsoft Fabric also competes with smaller, more specialized vendors. One of these is Databricks, which offers a data lakehouse platform with ETL and governance features to support enterprise AI efforts. It enables collaboration between data scientists and engineers and provides tools for big data processing and advanced analytics. Another competitor, Snowflake, provides a cloud-native data platform with an architecture that separates storage and compute, enabling flexibility and scalability for large-scale analytics and AI applications. Snowflake also emphasizes data sharing and governance features, catering to businesses managing and analyzing extensive datasets across cloud environments.

Cloudera provides enterprise data cloud solutions designed for hybrid and multi-cloud deployments. Its platform supports the entire data lifecycle—from ingestion and processing to analysis and AI model deployment—and helps customers manage complex data environments across cloud and on-premises infrastructures. Informatica offers a distinct approach by providing generative AI blueprints for platforms such as AWS, Databricks, Google Cloud, Microsoft Azure, Oracle Cloud and Snowflake. This strategy aims to simplify and accelerate the development of enterprise-grade generative AI applications, with a strong emphasis on data integration and management.

The choice of platform ultimately depends on an organization’s specific needs. Factors such as the complexity of the data environment, machine learning requirements and the preference for integrated tools will guide decision making. While Microsoft Fabric could be ideal for many organizations already using Microsoft data products, other platforms may better address specialized or diverse use cases.

Harnessing Data To Streamline Enterprise AI

Microsoft Fabric addresses critical challenges that organizations face as they try to harness their data for meaningful uses of AI. By unifying data workflows and integrating tools such as OneLake and Fabric Databases, Microsoft seeks to simplify data access and management, enabling the creation of AI-driven solutions while reducing operational complexity. By integrating with tools like Power BI and Azure services, Fabric also offers a cohesive environment for data engineering, analytics and business intelligence. Its low-code/no-code interface broadens accessibility, allowing teams with diverse technical expertise to participate in data projects. The platform’s serverless computing model also helps manage costs effectively, making it an appealing choice for organizations already using the Microsoft ecosystem.

Microsoft has begun addressing multi-cloud use cases through initiatives such as Azure Arc, which extends Fabric’s reach to hybrid and multi-cloud environments. However, Microsoft could further enhance Fabric’s appeal by expanding support for native integrations with other major cloud providers and offering more versatile multi-cloud features.

For organizations considering Fabric, it’s crucial to evaluate how its capabilities align with existing systems and long-term AI goals. While the platform offers significant cost and workflow efficiencies, its reliance on the Azure ecosystem might not meet the needs of businesses requiring broader multi-cloud compatibility. Microsoft’s ongoing enhancements in this area could make Fabric an even more compelling choice in the future.

The post Microsoft Fabric Simplifies Data Management For AI appeared first on Moor Insights & Strategy.

]]>
Google Blends Its AI With XR To Build The Android XR Spatial OS https://moorinsightsstrategy.com/google-blends-its-ai-with-xr-to-build-the-android-xr-spatial-os/ Fri, 27 Dec 2024 21:35:45 +0000 https://moorinsightsstrategy.com/?p=44980 Android XR reflects Google's comprehensive approach to XR and delivers what the industry has needed for a long time: a platform to rally around for XR empowered by AI

The post Google Blends Its AI With XR To Build The Android XR Spatial OS appeared first on Moor Insights & Strategy.

]]>
Google’s new spatial operating system, Android XR Google

To say that the road towards a spatial computing future has been bumpy for Google is an understatement. Google has had a long and challenging history with XR, encompassing products such as Google Glass smart glasses and the Google Daydream VR platform. Despite facing numerous obstacles, the company has continued to invest in XR technologies, recognizing their potential to enable advancements in AI; it has also continued to build spatial features into its products across search, maps, and more.

The recent release of the Gemini 2.0 AI models and Project Astra multi-modal agents exemplifies Google’s leadership in AI and underscores its commitment to integrating XR and AI. I recall when Google first introduced Astra via a phone demo at Google’s I/O event this year: all the interactions had me begging for a headset . . . which the demo then transitioned to midstream.

Today Google launched Android XR, which is what the industry has needed for a long time from Google—a platform to rally around for XR empowered by AI. This is something that Meta has also developed over the years with a lesser focus on AI and hopes to spread in the form of its Horizon OS, which now stands as a competitor to Android XR.

My Experiences With Android XR

Having the opportunity to experience Samsung’s Moohan headset, complete with Gemini, affirmed Google’s comprehensive approach to XR for me. Google’s strategy spans from lightweight single-display smart glasses to full AR dual-screen glasses and includes the Moohan prototype MR goggles equipped with high-definition passthrough, eye-tracking and hand-tracking functionalities. While many people will compare Moohan to Apple’s Vision Pro and Meta’s Quest 3 or Pro, it physically felt like some kind of blend in between those products. I was pleased to know it uses both hand and eye tracking and that the passthrough quality was extremely high, with very low latency. Thanks to Gemini, the interface felt like it borrowed some familiar ideas from Apple while also being much more capable.

Project Moohan, a mixed reality headset by Samsung built with help from Qualcomm and Google Google

A big part of Moohan’s performance and appeal comes thanks to the three-way partnership among Google, Samsung and Qualcomm, which delivered the computing power through its XR2+ Gen 2 platform. This is another way of saying that Google is able to access the best chips and hardware to empower the best AI and XR experiences. Thanks to Google’s partnership with Qualcomm on Android XR, other OEMs, including Lynx, Sony and Xreal, will also have devices running the OS, ensuring a diverse ecosystem of experiences and capabilities. Google will also absorb Qualcomm’s work with Snapdragon Spaces and enable forward compatibility as developers transition to Android XR. Snapdragon Spaces was Qualcomm’s attempt to fill the hole that Google left by not launching something like Android XR. With the existence of Android XR, there should no longer be a need for Snapdragon Spaces. The new OS should soften the move from Snapdragon Spaces for those developers and OEMs that use it.

Although convincing developers to build for Android XR might be a considerable challenge considering Google’s past, its initial low-friction strategy appears to align closely with Apple’s approach for the Vision Pro. Both companies aim to simplify application support for 2-D applications using existing apps in their stores; however, Google’s approach to spatial XR apps differs because of its embrace of open standards like OpenXR and WebXR that are familiar to XR developers. Google’s approach to accessories on Android will also translate to Android XR, making support for things like keyboards, mice, controllers and headphones a breeze. Google has already tapped industry veteran developers, including Resolution Games, Virtual Desktop and Tripp, as early developers for Android XR, which indicates that it has already spread its reach across gaming, productivity and health/fitness. That said, I believe that Google needs to add considerable fuel to the developer ecosystem to improve developer excitement and engagement.

When I tried out the Moohan headset, it was evident that the version of Gemini being used was multi-modal—and that it significantly enhanced the user experience by facilitating ease of use and comprehension. Given that keyboards are unlikely to become a primary interface for XR, Gemini plays a vital role in providing a seamless and high-quality user experience based on voice, gaze and gesture commands.

Astra on glasses also offered an impressive experience, notably allowing multi-language interactions and the ability to recall overlooked details visually. While these technologies are currently in the prototype stage, Google’s consistent integration of AI across its XR platforms is apparent. Google’s AI capabilities in XR appear even more credible when compared to Meta’s AI on Ray-Bans, which is less advanced, and Apple’s seeming reluctance to fully integrate Apple Intelligence into VisionOS.

Google’s prototype glasses Google

Android XR’s Future

Google believes that now is the time to launch Android XR because AI tools like Gemini and Astra have matured enough that they can empower spatial computing in ways that weren’t possible before. I believe that Google’s development environment will be attractive to developers already familiar with XR and Android, and that it should make porting applications easy. I have long believed that AI and XR are complementary technologies, which is why I was truly surprised to see Apple bypass VisionOS with Apple Intelligence. Clearly, Google agrees because it is infusing AI everywhere within Android XR; I believe this is the right approach and will only increase the appetite for AI computing, whether in the cloud using Google’s Trillium silicon or on-device using Snapdragon. I expect Google’s launch of Android XR to be slow-rolled through 2025, with the Samsung Moohan starting the rollout but many other devices arriving throughout the year.

The industry has needed something like Android XR for years, and while I have said some less-than-nice things about Google’s role in XR in the past, I do believe Android XR’s deep integration with Gemini and Astra will be transformational for the industry. It was really powerful to experience the spectrum of XR from smart glasses up to a mixed-reality headset and understand how Android XR bridges all of those platforms in a way few companies could. It’s quite clear that Meta has some real competition from Google, and I’m genuinely glad to see that Google is back in the XR space with real gusto.

XR’s biggest problem is that the install base is too small for many developers to get on board; this was evident with Google’s earlier efforts, as it is today with the Vision Pro. Meta is the only company that has somewhat bucked that trend, but it has done so by spending tens of billions of dollars—far in advance of the XR revenue that would sustain that spending. Now, however, I believe that Android XR has the real potential to finally break the install-base problem with a single unified operating system for the XR ecosystem.

The post Google Blends Its AI With XR To Build The Android XR Spatial OS appeared first on Moor Insights & Strategy.

]]>
RingCentral And Verint Partner To Elevate CX With AI https://moorinsightsstrategy.com/ringcentral-and-verint-partner-to-elevate-cx-with-ai/ Wed, 25 Dec 2024 21:31:16 +0000 https://moorinsightsstrategy.com/?p=44967 The new collaboration aims to enhance RingCentral’s contact center solutions by using Verint AI bots injected into customer service workflows.

The post RingCentral And Verint Partner To Elevate CX With AI appeared first on Moor Insights & Strategy.

]]>
This new partnership aims to enhance RingCentral’s contact center solutions by using Verint AI bots injected into customer service workflows. BiancoBlue (ID 324922337) via Dreamstime.com

RingCentral and Verint have partnered to integrate Verint’s AI-powered workforce engagement management and CX automation products with RingCentral’s RingCX platform. This collaboration aims to enhance RingCentral customers’ contact center interactions and improve operational efficiency by using AI bots injected into workflows.

AI is increasingly critical in customer engagement solutions as businesses face pressure to meet customer expectations for personalized and efficient experiences. RingCentral’s offerings are designed to enhance customer support, increase productivity and reduce costs. RingCentral founder, chairman and CEO Vlad Shmunis said the integration with Verint is a “key component of delivering a complete customer experience platform” for RingCentral. Shmunis noted his company’s strong market traction and said that he sees the partnership as an opportunity to “further enhance the value proposition of RingCX in the mid-sized and enterprise segments.”

What Do Verint Bots Do?

Verint’s AI-powered bots are part of the Da Vinci AI and Analytics solution and are designed to automate routine tasks and support customer interactions across multiple channels. These bots use natural language processing and machine learning to respond to customer inquiries in various channels, including voice, chat, e-mail, and social media; this helps companies provide consistent responses across different platforms.

Each bot performs one specific task and is injected directly into workflows. Contact centers can use bots for individual tasks or layer them to weave together multiple workflows. Each bot is tied to an outcome and incorporates specific features, including contextual understanding and escalation capabilities. Bots can interpret customer intent and sentiment, which enables them to provide responses that align with the customer’s needs. If a query is too complex, the bots can transfer the interaction to a human agent and supply relevant context for a smooth handoff. Through machine learning, Verint bots adapt and refine their responses over time, learning from each interaction to improve performance.

Key Applications Of The RingCentral/Verint Partnership

Based on what the companies have said about this partnership and my own background in enterprise software and HRM tools, I see several areas where the newly combined capabilities could be useful for customers. These include:

  • Workforce Engagement Management — Verint’s extensive suite of bots includes tools for quality management, interaction analytics, forecasting and scheduling. Quality management features could be used to automatically score agent interactions based on predefined criteria, helping identify top performers and areas for improvement. Interaction analytics can help detect trends in customer inquiries, allowing managers to address recurring customer issues proactively. One of my favorite Verint WEM bots is the TimeFlex Bot, which gives contact center agents unlimited flexibility for schedule changes without going through approvals.
  • AI Automation — Routine administrative tasks such as updating customer profiles or processing simple refund requests can be automated, reducing the burden on human agents. In many cases, AI can also categorize and route inquiries to the appropriate department or agent based on message content. Verint’s Interaction Wrap-Up Bot, for example, can handle post-call summarization, freeing up agents to focus on more complex tasks.
  • Intelligent Virtual Agents — Verint’s IVA, powered by its Digital and Voice Containment Bots, could handle basic account inquiries, such as reviewing balances or recent transactions, without involving a human agent. Customers could also use an IVA to schedule appointments or make reservations outside regular business hours. For e-commerce businesses, an IVA could guide customers through the steps to track an order or initiate a return.
  • Knowledge Management — Verint’s Knowledge Automation Bot could help customer service agents quickly access a centralized knowledge base during customer calls, potentially improving response times and accuracy. Using a single search bar, agents could instantly access information across multiple knowledge sources, reducing or eliminating the need for multiple time-consuming searches. The bot can then use generative AI to analyze and summarize search results, delivering a single, concise answer curated explicitly for the contact center environment.

The Bot(tom) Line

The success of this partnership will depend on several factors, including ease of integration and demonstrable improvements in customer satisfaction and operational efficiency. Successfully integrating AI often requires significant changes to workflows and processes. While AI adoption is increasing rapidly, many organizations are still in the early stages of implementation and may lack the expertise to get the most out of their AI investments. A recent survey by IBM found that only 35% of companies currently use AI in their business operations, while another 42% are actively exploring its potential. This suggests that the widespread adoption of AI will require organizational change management, AI skills development and a principled approach to addressing concerns about data security and ethical implications.

The post RingCentral And Verint Partner To Elevate CX With AI appeared first on Moor Insights & Strategy.

]]>