AI is moving at 11/10 speed.
If you're not paying attention, you'll get left behind.
In our latest episode of Pivotal Clarity, Erik Salo joins us to unpack the intersection of high-performance computing (HPC), AI, and data storage.
In today’s conversation, Erik dives into:
You'll find this particularly valuable if you're:
[Rajan] (0:03 - 1:08)
AI is reshaping businesses, and ignoring it could leave you behind as your industry gets disrupted. I'm Rajan, and this is Pivotal Clarity. We talk to those building or using AI: founders and engineers with real-world experience.
Our aim is to cut through the hype and see where AI is truly making an impact. If you're a business or following tech trends, these conversations offer clearer insight than most of the press. Let's get into today's episode.
Welcome to Pivotal Clarity, an AI podcast. Today, I'm excited to have Erik Salo with us, who has over 30 years of experience in the tech industry. He's played multiple roles in strategy, in product management, and in marketing.
Currently, he's the vice president of product and marketing at VDURA. And prior to that, he worked at Seagate. Erik is looking at how high-performance computing, data storage, and AI are coming together and what kind of new products can emerge from that.
Erik, welcome to the show. Rajan, thank you. So, Erik, what does high-performance computing mean for a layman?
[Erik] (1:09 - 2:30)
Well, high-performance computing means computing that's fast. Ultimately, there's so much energy around AI right now. And AI and HPC are not exactly the same thing, but they're close.
And ultimately, it's when you need to do computation at a rate that is significantly faster than standard enterprise computing, you call that high-performance computing. And where does the AI layer come in right now? Well, you know, that's something that's really interesting.
People, I think, especially if you haven't been in the industry, you think AI is new. But actually, AI has been around for my whole career, right? We've just called it different things.
We call them machine learning and neural networks and things like that. But really, this fast computing, high-intensity computing that is designed to find patterns of data has been around for a long time. And we are seeing quite a bit of innovation.
I mean, look, the explosion in the last couple of years in the AI world has been, I don't know about you, I've never seen anything in my whole career with the velocity of change that we're seeing in AI. I mean, if you don't pay attention for a couple of weeks, you're behind the times. It's just amazing how that's going.
And, you know, one of the things that I think a lot about is there are some subtle differences between the legacy HPC market and the AI market. And that's something that we've spent a lot of time thinking about and making sure that our products are focused on because we see just tremendous investment and energy going into the AI proper kind of world right now.
[Rajan] (2:30 - 2:35)
What would you say are some of the changes in the AI market? Like, how has the landscape changed?
[Erik] (2:35 - 3:29)
I mean, I'd say there's two ways to look at it, right? So one difference, I think, is that even though this whole notion that computers can find patterns in data that are useful to people, that's been around for a while. But just recently, the capabilities, especially of these LLMs, have broken through some kind of barrier where everybody, you know, my mom, my mom can figure out what's going on in the technology market.
It's hit the mainstream, right? She still hasn't, you know, mastered the concept of a window, you know, like she closes one thing to open another one. But she's telling me all about AI.
And, you know, so from that mainstream perspective, you see that. The other thing that's been just a tremendous change is that the scale of computing is just mind-blowing, right? You know, we were doing these relatively modest, you know, training runs and things like that.
And now you see companies spending, you know, billions of dollars on these big models. And, at least for that one vector of innovation, I don't see it slowing down at all. It's just getting bigger and bigger.
[Rajan] (3:30 - 3:54)
Usually in tech, Erik, it's a game of strategy. There are, like, you know, specific choices that you're making. And you've seen multiple shifts in terms of platforms, whether it's old-school enterprise or cloud or mobile and now AI.
What do you think is important right now? And how do you actually think about making strategic choices with all these landscape changes?
[Erik] (3:54 - 5:52)
Well, that's a good question. And, you know, there's actually been one big fundamental change. And, you know, I think a lot about storage.
You know, my career, I spent about half my career in chips. I worked for AMD for a long time. Great company, by the way.
I loved them. I had a great run. Glad to see they're doing so well these last few years.
So half my career was in CPUs, computing. I was a chip designer. And then half my career has been in storage.
Storage and computing are related, but they're not the same. And the big change in storage over probably the second half of my career is that the world's gone from what's called scale-up to scale-out. And if you think about it, it wasn't too long ago that, you know, high-end enterprise storage was these kind of scale-up, in-the-rack, dual-controller, active-active systems.
You'd have a couple of servers, and they would each be able to talk to all the storage elements, usually RAID. And, you know, those systems reached a certain level of reliability, a certain level of performance, and kind of oddly, we're still using that in some of these newer applications, which is really puzzling to me. Because, you know, and geez, is it 15 years ago now? The cloud world got to this point where they realized that their scale was so big that no matter how reliable any of these single scale-up elements were, it wasn't reliable enough for the whole system.
And so what they did was they spread the reliability out, scale out across many nodes, right? And that was a fundamental change, right? Scale up to scale out.
And one of the things that's been really surprising to me is, while bulk storage has fully transitioned to scale-out object stores, things like that (I spent a ton of time on that at my last stop at Seagate), the high-performance world's still kind of stuck in these old-school scale-up architectures, and they're just not very good for the new workloads, right? They're not super reliable.
They lose your data all the time. They go offline, they're hard to manage. And that's actually the product problem that we set out to solve at VDURA: we saw that, hey, the whole world of big storage has moved to this scale-out architecture.
Why hasn't the high-performance world moved to this, right? And that's what we've done, right? It's a big shift.
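The scale-up versus scale-out distinction Erik describes can be sketched in a few lines of Python. This is a toy illustration only, not VDURA's or any vendor's actual design: the node names, replica count, and hash-based placement are all invented for the example. The point it shows is the one from the conversation: instead of trusting one very reliable box, a scale-out system spreads copies of each object across many ordinary nodes, so losing any single node never loses data.

```python
# Toy sketch of scale-out placement (illustrative only, not a real product's design):
# each object is replicated across several nodes, chosen deterministically by hash,
# so any single node failure leaves at least one live copy.
import hashlib

NODES = [f"node{i}" for i in range(8)]   # hypothetical cluster of 8 storage nodes
REPLICAS = 3                             # hypothetical replication factor

def place(key: str, nodes=NODES, replicas=REPLICAS):
    """Pick `replicas` distinct nodes for an object, deterministically from its key."""
    start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(replicas)]

def readable_after_failure(key: str, failed_node: str) -> bool:
    """Data survives as long as at least one replica sits on a healthy node."""
    return any(n != failed_node for n in place(key))

placement = place("genome-chunk-0042")
# Three distinct nodes each hold a copy; losing any one of the eight nodes
# still leaves the object readable from a surviving replica.
```

Real systems replace naive replication with erasure coding and smarter placement, but the reliability argument is the same: system-level durability comes from spreading risk across nodes rather than hardening a single scale-up box.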
[Rajan] (5:53 - 5:59)
How does one not get disrupted in all these big changes, whether it's a small company or it's a large company? How do you think about that?
[Erik] (6:00 - 7:27)
I mean, that's always the problem in product management, right? It's like you've got to recognize what's a shiny object and what's real. And I tell you, fundamentals always, always win, right?
And if you think about the fundamentals of the business, right? So storage, what are the fundamentals? Think about it.
You can tell me. Don't lose people's data, right? When they put the data on your product, don't lose it, right?
That's number one. Number two is make sure they can get to it. You don't want it so that the data's on your device, but they can't get to it, right?
So don't lose people's data. Make it available to them and make it fast and make it inexpensive, right? Like those fundamentals, they've always been true and they always will be true.
And so as we see this big shift in high-performance computing from, I'd say, the HPC-centric world, which is very CPU-centric, there are fundamentals, right? It's very write-intensive. It's mostly sequential.
The files are usually big. There's like millions of files. That's kind of the traditional HPC world.
AI is similar, but it's not exactly the same, right? There are differences. It's read and write-intensive.
There's a mix of sequential and random workloads. There's quite a bit, many more files, orders of magnitude more files, a lot of smaller files and things like that. So you're seeing a difference in the users, right?
The users are different. The profiles are different. But the fundamentals are exactly the same, right?
And so as a company, what do your customers actually want? And by the way, one of the best ways to figure out what they want is to ask them and listen to what they say. But work on those fundamentals and that as the market changes and as what's popular and this day moves up and down, those fundamentals are always going to be true.
[Rajan] (7:27 - 7:37)
You've also spent a lot of time on product marketing. How would you describe the role of product marketing for a layman, and how has that changed with the shift that we're seeing recently?
[Erik] (7:37 - 8:23)
I think of marketing, I'm an engineer. I started out as an engineer, and in my brain I'm still an engineer. So I look at the world through that lens, and marketing is really two things.
One is it's understanding the trends and the requirements in the marketplace early enough so that your engineering team can do something about it. That's important, right? You don't have to be a genius to figure out what happened after it happened, right?
It's like, you gotta watch out. You gotta figure it out ahead of time, that's one. And then the second thing is positioning and promoting your product in the market in an appropriate way.
And a good marketing strategy is really just a multiplier, right? It just takes what the engineers have done and what your customers have told you and it puts it in a clearer voice to new customers and to the market.
[Rajan] (8:24 - 8:38)
And how has that changed, Erik? Has it changed from how you would do product marketing in the high-performance or old-school market to how you would do it in enterprise, and how is it shaping up in the AI markets?
[Erik] (8:39 - 11:05)
You know, there's two things that are different. So the one that's really struck me is how fast things are changing right now. If I look back a long time ago, a year ago, right, which is not a long time, but in the AI world, it's a really long time.
I thought about, well, here's what we think's gonna happen and as a team, we talked about the trends. And looking back, I would say we were exactly right about what was gonna happen, but it happened more quickly than we thought. So that number one is just the speed.
Everything's at 11 right now. I've never seen anything innovate this quickly in the market. That's number one.
The other thing is that it's not a mature market. You think about it, the HPC world is a relatively mature market. The workflows are mature.
A lot of people you deal with have been in the market for decades. It's very optimized, it's very mature. Whereas the AI world is kind of like a gold rush right now.
It's like everybody's coming in and everybody's got a different idea and there are some trends emerging, but I spent a lot of time looking at the market. I don't think anybody exactly knows how it's gonna go. And so when that happens, what you have is you actually have a lot of inefficiency that is created so that you can have speed.
And so as a product, what do you do? Well, it's back to the fundamentals. I make a storage system.
I don't wanna lose your data. I wanna make sure you have access to your data. I wanna make sure it's fast.
I wanna make sure it's inexpensive. Those are the things. You just have to keep that in mind because the speed could distract you.
What are some of the misconceptions that people have with respect to AI? I'm not sure anybody really understands what's gonna happen with AI. I mean, it's amazing.
I'm using it every day now, but it's probably not gonna take over the world tomorrow. I think that it's probably not gonna do that. Probably the, I don't know if it's a misconception, but my biggest impression of AI is that it's gonna basically just accelerate things.
It's gonna make mundane tasks more efficient. It's gonna make people more efficient in their workflows. I just read an interesting study from Marginal Revolution.
It's a blog I follow, by Tyler Cowen. And it basically said, this is in materials science, that if you look at AI's impact, it's really helped the top material scientists and it hasn't helped the bottom tier of material scientists nearly as much, because the smarter people are clever enough, or I shouldn't say the smarter, the higher performers are clever enough to kind of filter what they're getting, and so they don't waste time on things that aren't gonna work. I think that's how it's gonna go. I think it's just gonna basically be another tool in the toolbox, and it just makes humans more efficient.
[Rajan] (11:06 - 11:11)
The ones that are doing well are gonna do even better, but the ones that are not doing well are gonna do worse.
[Erik] (11:11 - 12:13)
Right, they're still not gonna do well. One of the analogies, so I've been in the computing business for my whole career. And when I was in college, I don't think you could take computer programming.
You could take microprocessor design. That was computer programming. And my first class was writing assembler language and moving bits through a CPU, right?
And if you think about it, in the space of my career, that has gone from abstraction to abstraction to abstraction. So at the beginning, literally you were moving bits through the registers. And then all of a sudden, you came up with these, you know, languages like COBOL and FORTRAN, things where you could sort of talk to them, and then you had object-oriented languages like C++, and now you have these higher-level languages like Python.
And then, you know, today with the new LLMs, you don't have to know anything. I mean, you could just start typing and you can make a program, right? And so I think that's what we're going to see is that you're just, it's just going to be another abstraction to almost every workflow, right?
I don't think there's any industry or job that is completely immune to the effects of this really interesting part of the market.
[Rajan] (12:13 - 12:24)
Given that it is moving so fast, there seems to be some misconceptions. What are mistakes that you think industry players are making, whether it is big companies or startups, in approaching this market?
[Erik] (12:25 - 14:58)
Well, I think the thing that people need to pay more attention to is what's it for? It's great that, you know, an LLM can write an email for you, but what does it really do for you? You know, I've spent a lot of my career with the enterprise and, you know, so many great professionals, right?
You know, think about it, medicine, oil and gas, government, labs, genomics, all those sort of things. They all have a mission, right? They all have a mission.
I want to cure cancer. I want to cure this genetic disease. I want to be able to find energy reserves, things like that.
Those are very clear objectives, and I think that the mistake that people are making in AI right now is there's just so much explosion of innovation that I'm not seeing, like, the here's-a-thing-that-does-a-thing-and-it's-better, right? And I think that's really the key, and I think that's ultimately where the first really big innovations, or useful innovations, are going to land: when, you know, you use AI to do a better job at looking at X-rays or things like that. I think that's the real gold in the beginning.
What are your favorite AI tools that you're using? When ChatGPT first came out, for example, I'd write an email and then I'd ask ChatGPT to write it, and it would give me a couple of good ideas, but frankly, it wasn't as good as my email, right? So that would be good.
And then, you know, I forget what, you know, at what point this happened. Then I realized that it was just as good as everything I was writing. And now it's better.
Like, I still try to write all my own stuff, but my gosh, every time I use these LLMs, you know, it's really interesting. I subscribe to about half a dozen of them, and I'm just astonished at what kind of good output you can get for a relatively small amount of input. You know, type in a few sentences and it gives you this whole world of an answer.
And so it's helped me with coding. You know, I have a lot of hobbies. I'm a terrible coder.
It's made me a mediocre coder, right? Which is a pretty big increase. It's helped me a lot with organizing my thoughts and things like that.
And then just recently now, I'm starting to use it to help me to understand and visualize data. You know, look at our sales, look at our market, things like that. And all of a sudden, these tools now are actually, they're not, frankly, better, but they're faster.
You know what I mean? Like, if I could spend a couple of days looking at a dataset, I can get, I think, pretty good output or pretty good insight into what that data means. But I can do that in an hour or two with an LLM, and that's a big difference, right?
Just gives you much more range. Other than ChatGPT, what other tools do you use? I use ChatGPT, and then my second favorite, actually, my first favorite is the Anthropic one.
Anthropic's been my go-to. And then I use Copilot for coding. And then I've got a couple of the image ones, like MidJourney.
[Rajan] (14:59 - 15:15)
What are things that you would suggest or give as advice to a startup that is coming out in the AI infra space today? Like, you know, given that so many things have changed, if you were to advise startups on the AI infra side, what are some of the things that you would tell them? What should they do?
What should they not do?
[Erik] (15:15 - 17:04)
Well, you know, infrastructure ultimately powers everything, right? Like, you know, it powers the world. Solve a real problem.
That is the key, is solve a real problem. You know, now there's, you know, everybody wants to get into AI in some way, and so there's a ton of misinformation. There's a ton of just people just trying to do something, so it seems like they're doing what the other cool kids are doing.
But, you know, from an infrastructure point of view, and that's something we're super focused on at VDURA, is solve a real problem, you know? And the problem that we've thought about, I'm kind of going around your question, I'll come back to it in a second here, is that, you know, if I look at the storage for these fast computing systems, these high-performance computing systems, it's kind of a mishmash of different technologies, just because that's what you had to do, right? You have the scratch layer at the front, and it's unreliable and hard to use, but it's really fast, and then you've got these really inexpensive, really reliable data lake layers, and there's baling wire and string holding them together, and it's not a very good solution.
And so what we focused on at VDURA is, like, let's put this all together, right? Because people don't want to worry about the storage. They want to solve their problem, and they want to work on genomics or healthcare or whatever.
They don't want to think about it, right? So make it easy for them, and just do all that orchestration in the background for them. And from an infrastructure point of view, if you're a startup, find a problem and solve the problem, right?
Don't just be in the world. You know, actually go and find a pain point and solve it, because, you know, you think about all the people that are your customers, they're trying to solve a completely different set of problems, and they just don't want to worry about the infrastructure. They want the infrastructure to just work.
That's why clouds, even though they're expensive, they've been very popular, because they're an easy button. You push the button, it works. You don't have to worry about any of the messy infrastructure.
You pay them for it, but ultimately, you know, they make it easy. And, you know, if you're an infrastructure company in this market, make it easy for your customers. They've got a lot of problems they're trying to solve.
Don't make your product one of their problems. Make it a solution.
[Rajan] (17:04 - 17:37)
I agree with you that companies should start with, like, focusing on a customer problem. But this whole infrastructure space, if you really look at it like, you know, there are these observability companies, then they're also doing guardrailing, and then they are talking about things like, you know, how do you make it, like, you know, abstracted? And then how does one position this, even when you're solving a particular problem?
And that partly becomes, like, a product marketing challenge. When things are changing so fast, are you an observability offering? Are you a data storage offering?
And then, you know, this keeps changing. So how do you navigate those challenges in a fast-moving space?
[Erik] (17:38 - 18:16)
My biggest lighthouse for that has always been to go talk to people who are actually doing it. Actually, it's amazing to me how open customers will be. You go and say, will you tell me what you're doing and what your challenges are?
They will. I remember we were talking before the podcast show. I did a startup in San Francisco before Seagate, and one year I was on the road 50 weeks just talking to customers about what they needed.
And ultimately, to me, that's the key, is that go talk to people who are in the pits, who are in the, you know, in the thick of it and have their sleeves rolled up and are trying to solve problems and just ask them, like, what are you doing and what is hard? And they will tell you, and you go solve those problems, and that's how you make a successful entry.
[Rajan] (18:17 - 18:23)
Erik, is there something that I should have asked for our conversation that I missed that you think we should chat about?
[Erik] (18:24 - 19:19)
One of the things, I touched on this a little bit earlier, but, you know, like I said, we work in the storage infrastructure world, and I've been in storage for a long time. I was at AMD for about half my career, then Seagate for a long time, and now VDURA, which makes storage. And, you know, the question I always ask myself is, what's wrong with the product offerings in the market, right?
What's wrong? And the answer is they're too hard to use and they're not reliable enough. It's as simple as that, right?
Like, so for high-performance storage, it's fast. It's actually a pretty good value, but it is not reliable and it's hard to use, right? And so the lighthouse for us, and I think this works kind of more generally, is like, okay, that's great.
Is it technically possible to make a solution that addresses those problems? Yes, it is. Okay, go build that, right?
Go build that and people will want to buy it because, again, they're thinking about their problem. They're not thinking about the infrastructure in the background. They want that to be the easy button and we need to make it so that they can just, you know, they can just push a button and have everything work from the infrastructure side.
Makes sense.
[Rajan] (19:20 - 19:48)
Erik, thank you so much for taking the time. It was such a pleasure talking to you. Thanks, Rajan.
I had a great time. That's it for this episode of Pivotal Clarity. This is an Upekkha podcast.
Upekkha is an accelerator for global Indian founders building AI software companies. We're exploring the fast-changing world of AI together with our listeners. If you like this podcast, you can find more on our website and other popular podcast apps.
Subscribe if you want to keep up.