Mike Lenox ADT podcast cover
Episode: 102

Mike Lenox - Strategy in the Digital Age

Posted on: 24 Aug 2023

Mike Lenox is an award-winning professor, consultant, author and speaker who teaches at the University of Virginia's Darden School of Business.

In this episode, we discuss some of the key insights from Mike's latest book, Strategy in the Digital Age: Mastering Digital Transformation. We talk about why digital transformation is changing the nature of competition and why digital disruption is transformation's evil twin. We also emphasize the importance of addressing ethical and legal challenges for responsible digital transformation, and finish with a few key lessons for business leaders.


Links & mentions:


“Disruption, actually, shouldn’t be, in my viewpoint, a negative word necessarily. Industries are periodically disrupted by new technology, new business models. That is arguably a feature of our market-based economies here, and a good thing, cause that’s what allows us to continue to innovate and create anew.”

Welcome to the Agile Digital Transformation Podcast, where we explore different aspects of digital transformation and digital experience with your host, Tim Butara, content and community manager at Agiledrop.

Tim Butara: Hello everyone, thanks for tuning in. I’m joined today by Mike Lenox, award-winning professor, consultant, author and speaker who teaches at the University of Virginia’s Darden School of Business. Mike has just released a new book which actually comes out on the exact day that we’re recording this conversation, and it’s called Strategy in the Digital Age: Mastering Digital Transformation. And obviously today we’ll be talking about the book, discussing the key lessons and points from the book. Mike, welcome to the show.

Mike Lenox: I’m very excited. And, like you said, today is the big day, at least when we’re recording here, when the book is officially released. 

Tim Butara: Yeah, it’s a very cool little detail that we just talked about before I pressed record, that we’re recording this on the exact release date of the book. I don’t think that this has ever happened to us, so, yeah, very cool, awesome to be here.

Mike Lenox: Great. Well, thanks again.

Tim Butara: So, one of the key points and one of the first things that you talk about in your book is how digital transformation is inherently changing the nature of competition. And I really want to talk about this for a little bit; what do you mean by this, changing the nature of competition?

Mike Lenox: Yeah, I think for, especially a lot of established organizations, when they think about digital transformation, they think about, oh, do we need to be using the cloud, and how do we get our databases organized, maybe put them in a data lake? And all of those things are critical here, but I think what’s sometimes missed is that a lot of times, digital is having more fundamental impacts on, again, the nature of competition within different industries.

It’s changing your relationship with your customer, maybe changing the value proposition in some very significant ways. It’s allowing for a proliferation of new business models, of course you think about the advertising models of Facebook or Google, but others as well. You see the rise of platforms, in various ways, that create value but also create more competitive threats when these platforms become so ubiquitous. 

We talk about deconstructing the value chain, where now all these interesting opportunities open up for businesses that are digitally native to find niches that take away some of your traditional value. And so, all of these, they’re coming together in a way that is transforming industries, and for those firms who are proactive, creating new strategic positions that they might be able to achieve in those markets.

Tim Butara: And I guess when it comes to major players in tech, I’m talking, like, big tech giants, I’m guessing that one of the ways that they handle competition is just to kind of bypass it through acquisition, right? As we see with examples like OpenAI being backed so heavily by Microsoft really early, before it got super big and everything.

Mike Lenox: Yeah, it’s one of the things I worry about, is that the traditional model of grow, get that IPO, become another big player is being somewhat supplanted by this idea of grow, get acquired by one of the big tech companies, which might have longer term implications for innovation in tech.

But I think more broadly, what I think about is the primacy of data. Some people have said, data is the new oil. It is what’s being used to drive our algorithms, it’s being used to create this new value in many different ways. And so, the big tech companies, they’ve been very effective at gobbling up lots of data; a lot of it, obviously, gathered through our personal behavior and usage of technology and the like. And that allows them to improve their algorithms, improve their AI uses and the like, and that kind of feeds on itself.

And so, one of the things that’s happened with big tech, is they’re starting to put tentacles out in a whole bunch of different directions. We see it in terms of the entry into media and entertainment by Amazon and Apple and of course Netflix there. We see it in terms of the financial services sector with Apple Pay, and Google and the others kind of moving in that direction as well. So, healthcare, education, any number of sectors now, are finding tech, these big tech companies in particular, entering them.

Tim Butara: Yeah, that sounds exciting on the one hand, but a little bit scary on the other, as you just said.

Mike Lenox: Yeah, absolutely, absolutely.

Tim Butara: And the next thing that I wanted to talk about is digital disruption. And, particularly, I’m very interested in why you’re calling it the evil twin of digital transformation.

Mike Lenox: Yeah, you know, disruption, actually, shouldn’t be, in my viewpoint, a negative word necessarily. Industries are periodically disrupted by new technology, new business models. That is arguably a feature of our market-based economies here, and a good thing, cause that’s what allows us to continue to innovate and create anew. I say it’s the evil twin of digital transformation really for, again, those kind of incumbent firms who maybe have a solid position in the marketplace who suddenly find themselves wrecked by digital transformation, and that’s where the disruption side comes in.

And so in the book I use a lot of the classic examples of companies that were once highly successful who found themselves, when facing digital transformation, ultimately in some cases – in many cases – going out of business. So, the classic cases like a Kodak, or a Blockbuster Video. And history is littered with companies who failed to successfully digitally transform. So that’s the idea of digital disruption here, that it can be quite disruptive to the current status quo.

Tim Butara: So, it’s the evil twin because digital transformation should be something that propels you forward, but digital disruption, if not approached properly, can actually doom you in a way.

Mike Lenox: Exactly. For a lot of established companies, this is the threat and the thing they’re most worried about, as we’re seeing just this proliferation and growth in digital technology.

Tim Butara: But also, returning to an earlier point, we said that the days of the traditional IPO seem to be over in this context. Another thing that came to mind was, I’m sure there are cases where the main goal– like you said, where the goal is to get acquired by a larger company, right? And that kind of invalidates the entire purpose and the entire mission and vision that you set for yourself. It can’t be honest if the intent was never actually to do something meaningful with it, but just to swindle somebody into paying loads of money.

Mike Lenox: Yeah, maybe, I don’t know if I’d use the term swindle–

Tim Butara: Yeah, that was harsh.

Mike Lenox: That model of acquisition as an exit strategy for entrepreneurs has been around forever, it isn’t necessarily new. I think it’s just the balance of how many companies could we see emerging that are growing to be their own big tech companies versus ultimately getting acquired by the existing big tech companies.

And what I worry about, again, is that we have seen in other industries, when they mature, that might not be a great thing for further innovation. That innovation engine that has been so celebrated, especially in Silicon Valley, but in tech in general – will it start to dry up if we get a handful of big tech companies that are just so dominant that there’s no room for others to grow underneath them?

Tim Butara: Yeah, that’s definitely an important question and something that we’ll have to keep a closer eye on over the years as this progresses further.

Mike Lenox: Absolutely.

Tim Butara: Well, I guess this is the right place to talk about the most important ethical and legal challenges that businesses need to keep top of mind if they want to successfully digitally transform and do this properly.

Mike Lenox: Yeah, this is such an important topic, it’s one I put an entire chapter just on the social and political implications of digital technology today. I don’t have any right or wrong answers, I think, to what the right choices are. My main message is, you have to be intentional. As a company, you need to be thinking through what are you doing, what do you feel comfortable with, and what are the ultimate implications of that?

So I think something like data privacy; what do you feel comfortable using data, especially personal data, to do? And clearly there are positive uses of personal data to create value added services that customers love, we can’t deny that. But there’s a line there somewhere that we have to recognize that maybe begins to go too far.

Look what’s happening right now with generative AI. I think you see this almost play out in the public sphere with Google and Microsoft almost struggling to be like, we need to keep pushing on this technology, we need to keep pushing on these large language models; and we’re also simultaneously very afraid of where this technology could go.

And again, I wish I had the right answer, this is the regulation we need today, or this is what needs to happen. But I do think the intentionality is going to be really important. Really think through what are the impacts on different stakeholders, what are the possible negative consequences, how are you going to address those; and then ultimately, be very clear about what your values as an organization are, and again, are these actions ultimately aligned with your values?

I don’t know how much Google still touts their “do no evil”, but that was their original mission value statement. And I think, as vague as that is, it’s not a bad one; are they doing evil or not? They need to be asking themselves that every single day as they continue to push on AI.

Tim Butara: Man, yeah, that was very well put. And I don’t think it’s vague, I think it’s actually a very clear mission statement, just don’t do evil. But what’s vague are the concepts of evil and everything around that and how that changes. And another thing about AI, right, tying back to some of the points that we already discussed – and I’m sure that both you and listeners will have already heard this point numerous times on different generative AI related podcasts.

But it’s like, this paradox of AI innovation. We know that it’s dangerous, we know that it has risks, but other people are doing it, so if we don’t do it, then we’ll just be left behind even more. And if it really is such a dangerous thing, then we are the ones that should be kind of the stewards of it, or something like that.

Mike Lenox: That’s right. To be a little too cute, I like to say technology always wins. And what I mean by that is, I could think of very few cases where it was just the decision, we’re just going to stop this technology, we’re not going to advance it anymore. What more often happens is institutional structures evolve with the technology.

So, take a classic example. When the automobile, the original automobile, came around a hundred plus years ago, what didn’t we have? We didn’t have things like stop lights, we didn’t have rules of the road and speed limits and things like that; all of that had to evolve with the automobile to make it a safe technology to use.

We’re seeing a similar thing happen now with AV, with autonomous vehicles, as we move to autonomy, whole similar set of questions. I don’t think the answer is, let’s just stop doing autonomous vehicles. I think the answer is, we need to, again, evolve the institutions we have.

Same thing with generative AI. Obviously higher ed is an area that is very worried, and rightfully so, about what this is going to do to the way our students take exams and the like here. But I’ve been an advocate here at our school, saying, you just can’t say no. The answer can’t be, you can’t use generative AI, period. We have to work with our students to understand the positive use cases, and then set the boundaries about when you might not be allowed to use it, maybe on exams, for example.

And I think that’s kind of the natural trajectory for technology; doesn’t mean it’s not fraught. And again, I think with AI, the idea that this is a pretty powerful technology, and it has broad implications. So, this worry that there could be some real negative downsides to it, I think we’re right to be concerned in thinking about this.

Tim Butara: A lot of really great points here. And I especially love your approach, since you are a professor – it’s a first-hand take from a professor on how students should be using AI. Really, just using it properly, being aware of how you’re using it, not using it willy-nilly. And I’m guessing that as things progress, as things evolve – as you said, with stop lights and crossroads and stuff like that – there will be more stuff institutionalized as we go, and people will get more used to it, and that will just kind of feed into itself as we progress.

Mike Lenox: When it comes to AI, I always like to say, I have to be clear, I’m an economist, I’m not a computer scientist on the forefront of AI. But from those who are and whose opinions I trust, the idea that the AI is going to gain sentience and have the Terminator scenario is really unlikely. That seems, hopefully, the stuff of science fiction. But what should be equally scary is that our AIs are going to do exactly what we tell them to do. And that tends to be how it works, like, what’s the objective function we give?

So one that I think we’re already seeing and I worry about is creating a ChatGPT or chatbot whose whole goal is to keep you addicted to it for as long as possible, right. So it can sell ads in the background and the like. And this will be a conversational bot that will be honed to your particular interests and tastes, and form a relationship, if you will. I worry that could be coming down the pipeline pretty quickly. I think there are already people who are having lengthy conversations with ChatGPT and the like. Those are the types of things I worry about, less, like I said, the AI deciding to destroy humanity.

Tim Butara: Well, I think that now we need to talk about this. Because I just recently– I don’t think it’s fueled by ChatGPT, but I recently watched or encountered some content about these romantic chatbots. And because of the rise of loneliness, which has even been highlighted by Covid and everything, a lot of people are lonely, we’re basically in a loneliness epidemic. And the ability to have deep, personal connections and conversations with something that’s just programmed… Man, that’s super scary and super worrying.

Mike Lenox: Yeah. And, look, it’s a continuation of technology trends we’ve seen. You could argue social media has obviously some of those same elements. I think what the large language models are doing is they’re just raising that addictive quality even further, and being even just that much more personalized, and seemingly having this connection with what is a piece of technology. So, yeah, I think that it is something to worry about.

Tim Butara: Yeah, because with a person, with a friend, even when you’re talking online, the online communication still has value because it’s tied to in-person, real life communication, and to the real person behind it. Whereas with a chatbot, the digital communication with a digital entity that doesn’t really exist is all there is to it, there’s no substance behind it in this particular sense. So, yeah, wow.

Mike Lenox: Yeah.

Tim Butara: Ok, so, back to the main focus of our conversation today. I think this was a necessary detour. But the next thing that I really want to talk with you about today, Mike, is – why is it so important to have the big-picture perspective on digital transformation initiatives?

Mike Lenox: Yeah, I use this analogy, I’m sure I’m borrowing it from someone else. But you can imagine, if your AI efforts are about just creating efficiency or improving some of your business processes and operations, it could be a little bit like making the engines on the Titanic more efficient. And at the end of the day, you’re still going to hit the iceberg, it’s just a question of how quickly you’re going to hit the iceberg. And so, I’m a big believer, not surprising as a strategy professor, that strategy matters. And where you direct the ship is going to be critically important here.

And then through my own work, I’ve seen so many companies waste in many cases millions of dollars, maybe even more, on digital transformation efforts that lead to a lot of applications and the like that end up not really creating much value, or not really helping the company achieve the position that they want in the marketplace. So you really got to start with that strategic positioning question; where are you trying to go, where are you trying to position your company?

The other thing I hear a lot, and that I worry about and caution companies I consult with on, is they’ll say things like, we want to be the Google of X. And my reaction often is, well, Google’s going to be the Google of X. Google’s much better positioned than you are for achieving that position that you think is where you’re going to be going.

And so it causes you to take a step back and reflect on, what are our particular strengths? What do we bring to this competitive game that’s going to allow us to be successful? That’s a hard question for a lot of companies, especially those who have historically had success and maybe dominated a market space, but, again, are looking at a world that could be radically different than the one they’re currently operating in.

Tim Butara: Well, Mike, this has been such a great discussion. I’m super excited by all your insights. I think that we did an excellent job of uncovering and talking about insights that you don’t really get to hear everywhere during these discussions. And just before we say our goodbye and jump off the call, I want to ask you a last question that will be kind of practical for listeners right now; so, what should be the key lesson or the key lessons from Strategy in the Digital Age for business leaders who are listening right now?

Mike Lenox: Yeah, and it kind of relates back to the point I was just making. I talk in the book about the digital transformation stack. And what I propose is four levels to the stack. There’s your digital infrastructure – that’s everything from cloud computing and maybe building a data lake and your data into it, critically important. That then facilitates the next level of the stack, which is your data and analytic capabilities. And recognize that we all get excited, and I’m excited about generative AI and the like, but there are even basic things like dashboards and the like that can be very helpful within the organization.

And layered above that are what I would call digital applications. So these are, like, where you’re really creating artifacts, maybe internally, maybe an HR process system, or a way of engaging the customers and the like. But at that top level, once again, is your digital strategy. And that’s what really the first half of the book is about, how do you envision where you’re trying to go as an organization? 

And then everything else should flow from that. Once you decide, alright, here’s a value proposition. Now let’s think about the infrastructure we need, the analytic tools we need, the applications we need to develop. And then to really emphasize that this is a journey, not a destination. 

And strategy is not what’s sometimes thought of as, like, let’s set this direction and then never revisit it again. In fact, you’re re-steering all the time, and you might need to pivot as the world evolves, but you got to have some sense of where you’re going. And that’s really the core point of the book. And there are a whole bunch of frameworks and the like that try to help companies and organizations think through how they can position themselves and what it takes to kind of transform to that position they’re trying to get to.

Tim Butara: I think there’s an expression that might be a bit overused at times, but I think it applies really well here; basically, you’ve got to find your north star, right?

Mike Lenox: Exactly. Absolutely.

Tim Butara: Well, Mike, this is the perfect note to finish on. Just before we do, if people listening right now would maybe like to connect with you, like to learn more about you, or learn more about your book, Strategy in the Digital Age, and order it, where can they do all that?

Mike Lenox: Well, first with the book, it’s available through Stanford University Press, available on Amazon, Barnes & Noble, all your favorite online retailers. And if you want to connect with me directly, I’d encourage you to go to my personal website which is michael-lenox.com. Lots of information there, including my own podcast that I have called Good Disruption, where we tackle different technologies in each episode.

Tim Butara: Oh man, I’ll have to check that out as well. Thanks so much, Mike, this has been really awesome, thank you for sharing your insights with us today.

Mike Lenox: Thank you so much for having me.

Tim Butara: And, well, to our listeners, that’s all for this episode. Have a great day, everyone, and stay safe.

Thanks for tuning in. If you'd like to check out our other episodes, you can find all of them at agiledrop.com/podcast as well as on all the most popular podcasting platforms. Make sure to subscribe so you don't miss any new episodes and don't forget to share the podcast with your friends and colleagues.