024 Mark Graban: The Mistakes That Make Us
How much attention do you pay to the mistakes you make? Well, with today's guest, you will find out why you should be paying so much more.
Are you a leader trying to get more from your business and life? Me too. So join me as I document the conversations, stories, and advice to help you achieve what matters in your life. Welcome to Unbound with me, Chris DuBois.
Mark Graban is an author, speaker, consultant, podcaster, and entrepreneur. He works with leaders to improve and sustain performance. His background in engineering and healthcare gives him a fresh perspective on culture building, and his insights can be found in his new book, The Mistakes That Make Us. Mark, welcome to Unbound. Hey, Chris, it's great to be here. Thank you. Yeah, this is going to be a great episode, and I would love to start by just hearing your origin story.
Well, sure. I mean, I'm getting to be an old guy; I'm turning 50 in October. So I'm trying to think how far back to go and
give you not the super-long version of the story. But, you know, in a nutshell, my career has taken some directions I wouldn't have expected coming out of college. I was an industrial engineer, I started my career at General Motors, and I thought I was going to have a career in manufacturing, in different leadership roles along the way. I went to grad school with that focus. And, I don't know, there was something lacking. It's not that I was running away from manufacturing; it's probably more that I was running away from big companies, you know, General Motors. After grad school, Dell Computer was a tenth the size, employee-wise, but you know, 30,000 people, that's still a huge company. Then I went to work for a software startup, overcorrected and went to another huge company, Honeywell, and then I had an opportunity in 2005 to apply some of the things I had been doing and learning into healthcare. So that was a real curveball there. I've worked since then as a coach and consultant, almost exclusively in healthcare; sometimes I still coach and work with companies in other industries, and I enjoy that. I'm involved in a software company called KaiNexus that's been around for about 11 years, and there's a great growth story there. But, you know, the thing I've seen from getting to work in these different environments is that it comes down to people and leadership. People aren't that different in different countries or different professions; organizational dynamics are pretty similar. And whether the people, let's say in healthcare, will believe it or admit it or not, there are ideas you can bring in from other industries, and a lot of people in healthcare are open to that. But, you know, that's been kind of
the career arc that I wouldn't have expected 25 years ago. Right. Yeah, that is quite the journey. And the fact that Dell was at 10% of their current size...
Yeah. I want to get into
really talking about cultivating a culture of learning and innovation. Yeah. And one of the things we had initially talked about was how the blame game is counterproductive. And obviously, I feel like everyone knows this, right? Anyone who wants to argue against it, everyone kind of questions what their motive is.
But even with everyone hearing it, how can you actually get leaders to become aware of this within their organizations?
Yeah, I mean, there are connections between blaming and mistakes on an individual level. We hear people all the time say things like, some of our best lessons are learned from mistakes. I don't have polling data, but I'll make up a number; let's say, I don't know, 90% of Americans believe that to be true.
Somehow, though, what percentage of organizations really have that kind of culture? It's way lower than 90%. So how is it that you have companies full of people who might share this individual belief, but then somehow the culture and the leadership have gotten off track from that, into this punishment reaction? It's kind of ingrained, and maybe sometimes people just don't question it. As they start their career, they see leaders react that way, and they kind of follow the lead of their leaders. They don't always step back and challenge it and say, okay, well, if we're punishing somebody for that mistake,
what does that really lead to?
Is our goal to feel better because we've punished somebody? Or is our goal to have fewer mistakes occurring, because then we'll have better business success?
And so I don't know if it's so much that people share this assumption that punishing people reduces the number of mistakes. I think sometimes it's a deflection. It's not helpful, but by blaming someone else, then I'm not in trouble. So sometimes leaders may blame their employees and deflect to them. You know, one kind of famous example from recent years is Wells Fargo. There was this big, huge scandal where
lots of customers were getting signed up for accounts that they didn't authorize. And it was driven by people in the branches; they had an incentive system, right? And then people are saying, well, I've got to hit my incentive or I'm going to be fired, so I'm going to sign these people up for banking, credit card, and checking accounts, and hopefully they don't notice, and there might be fees, and I can hit my quota. Well, then this blew up to be a big, huge scandal. And at some point the CEO, the one before the current CEO, I forget his name, before he lost his job sort of said something like, oh, we've gotten rid of a couple thousand unethical employees. I'm like, come on. That kind of widespread blaming, and if it were true, and I don't believe it to be true, that they had a couple thousand unethical people working in their banks, that's a systemic problem of a different kind, right? I would hold the CEO accountable for hiring unethical people; you've got a hiring problem. But, you know, there's something kind of ingrained in organizations. I don't think it's so much that people think punishment really helps, because what happens is it drives mistakes underground; people get better at hiding mistakes. I don't think it's so much that leaders think the punishment route is more effective; that just might be what they were taught or what they were exposed to. Or they might somehow think it's necessary, because, well, if we don't punish people, then we're giving them permission to make more mistakes. And I don't think that's how it works either. People don't want to make mistakes. They feel bad when they do. And I think punishment just gets us off track from being able to learn and prevent future mistakes.

Right. And just to build on that, I think the concept of failing forward is probably misunderstood by a lot of people, where it's like, we want to make good mistakes, right, mistakes that we can learn things from. And so I think it's probably important to differentiate the mistake that happened through negligence, something that obviously should have been caught, from: hey, we're growing, and we're trying these new things, and we're going to keep learning because we haven't tried this before. Yeah, yeah. It's very situational. Right. Yeah. How often do you see that with the companies you work with, where it's like, you actually just need to appreciate that, hey, we're making mistakes, but it's because we're doing all these awesome things?

Yeah, I mean, I'll cite and credit Amy Edmondson, a professor from Harvard Business School, who's written a couple of great books about psychological safety and the workplace environment, and whether people feel safe to admit a mistake. That's the first part of being able to then do problem solving and improvement and prevention of repeat mistakes. But, you know, Edmondson puts out a framework I think is really helpful: three categories of mistakes. There's one category of mistakes that we could say, and I would add this in a non-judgmental way, are mistakes that should never happen, right? These are things where we know how to do the work a certain way, and we have procedures and structures in place.
Like when I was growing up, when I was a kid in Detroit, there was one of the catastrophic plane crashes of the era: on takeoff from the Detroit airport, a Northwest Airlines plane crashed and killed all but one person on board, I think because the pilots, the captain and the copilot, didn't set the flaps to a certain position. The plane couldn't get enough lift, hit a light pole, and ended up crashing. That's the type of mistake where, for one, you can't hide or cover up that mistake, unfortunately. And aviation has such a strong culture of learning not just from mistakes that are catastrophic, but from near misses, right, and putting procedures in place. Hopefully a plane never crashes again because of that failure mode, if you will. A lot of the medication errors and harm that occur in hospitals I would probably put in that same category: procedurally, these errors are preventable. Then
there's kind of a middle ground, the gray area, that I'll come back to. Then there's the other end that you're touching on, Chris, on the edges of innovation, what she calls, quote unquote, intelligent failures, where we can fail forward. I try to emphasize: if we're testing an idea, let's do a small test, so a small, quote unquote, failure, a small mistake, can prevent larger ones. That's part of the, you know, iterative entrepreneurship
and innovation cycle. So, I think of organizations I've been involved in: we're going to do an experiment, we're going to spend $500 on LinkedIn advertising. What's the worst that can happen? We waste $500, right? So at some point you've got to have a reasonable hypothesis, like, we're going to go test this idea, it could be a mistake, but no, the company is not going to go out of business. We tried some sort of innovative approach, and we know it could fail, and we have to accept that. Otherwise, we're not going to try anything other than the safest of boring approaches. Right. But then there's the middle ground, you know, the gray area, I guess, which is always the more difficult spot, where,
let's say there's a mistake that occurred because of a new, unique combination of events, something that hasn't happened before and we didn't really anticipate. But I think in either of those situations, even when it's a quote, unquote, preventable error, we still have the question of how do we respond, and I think the punitive approach is counterproductive. On some level, it might feel good. Like, you know, when a patient dies because of a medication error,
you know, society and people may want punishment, but that's maybe more for the purpose of justice or vengeance. It's not preventing future mistakes. Sure, that particular nurse won't make that mistake again.
But a lot of these mistakes are so systemic, they're going to happen again, and punishment just gets in the way.
And if anything, keeping that nurse now trains the rest of the organization, because they're going to share that every time something similar comes up. Well, and there's a better opportunity for hospitals to learn from each other. That's something healthcare does not do as well as aviation, and it's this huge opportunity, where there are certain types of mistakes, sort of like, you know, in aviation, we don't expect each airline to learn on their own about, for example, setting the flaps properly before takeoff. There's an FAA, there are structures to help encourage learning. In healthcare... well, aviation has an interesting dynamic with this national reporting system, a very specifically non-punitive reporting system. So you can report a near miss or a mistake that didn't lead to any harm, and you can't get punished for it, because the emphasis is on learning. And I wouldn't believe any argument that that breeds sloppiness or anything; I think it's just the opposite. It creates more awareness, because people are now able to focus on improvement and prevention instead of fearing punishment. In healthcare, there's still this wild range of nurses getting fired, losing their license, in some cases being prosecuted and convicted. And then there was a story in the news recently, down the road from me in Lexington, Kentucky, a tragedy that shouldn't have happened.
An 80-year-old patient died because they were given the wrong medication. Multiple nurses made the same mistake; they were arguably understaffed. There were all kinds of factors involved, and it seems like this hospital, very specifically, isn't punishing. They're focusing on the learning and the improvement. But my hope would be that every hospital, not just across the US but around the world, should be learning from that mistake.
Right. So that just got me thinking, too: a lot of this is placed on the organization, right, how the organization handles it. But I also wonder about the individual, how much goes unrecorded because of, not necessarily just the fear of making a mistake, but the shame of potentially making that mistake. And so they almost hide it, rather than sharing it so that people can learn from it. I think it's all interconnected. I mean, that's a great point, Chris. It might not be fear of punishment, but it might be fear of embarrassment and lower status. That's all certainly true. So one thing I think is really healthy within the culture at KaiNexus, as a 40-person software company, is an environment
that really starts with the CEO being very open in admitting mistakes. Now, he might do that after going through a cycle of feeling bad internally; there's the inner struggle, and then there's the leadership struggle. But, you know, it really comes down to setting an example.
In terms of sharing and admitting mistakes and then focusing on the constructive path forward, that creates cover for people to follow his lead in sharing a mistake for the benefit of others' learning. You know, this happens quite often on a weekly all-hands call. Nobody's shaming or mocking or teasing people for making a mistake; it's far more likely that someone says, you know what, I've made that mistake before, but here's what I do to try to prevent it. Now we're sharing and learning. Ideally, a lot of that happens proactively, but sometimes it's reactive.
I think that's positive. And then I got myself sidetracked from the core of your question, though, I think.
I think we're good because I got sidetracked too.
Friday afternoon, yeah, sorry.
But you've reminded me just now of a previous question I wanted to ask, from one of the earlier statements, specifically looking at the Wells Fargo incident. Yeah. Where, with the CEO, there is a systemic issue within the organization, and from the top they're passing blame off. Yeah. Versus the CEO you were just talking about, who, you know, is having meetings where they're actually fostering growth by talking about these mistakes, and everyone's willing to take ownership and accept it. What advice would you have for leaders to kind of set the tone of their organizations from a taking-ownership-of-mistakes kind of perspective? Yeah, I mean, I think a lot of it, and I think
I'll credit a couple of people that I've been able to learn from here; I like trying to cite my sources and show why I'm connecting the dots between people's ideas, so I'm not making the mistake of taking credit for someone else's ideas. But think of the reasons why people don't speak up, and then we'll come back to what leaders can do about it; this is where I'll get myself back on track. So the fear and shame factor is a big one, you're right. Ethan Burris, who's a professor at the business school at the University of Texas at Austin, has done great research around why employees choose to not speak up, for different reasons, including improvement ideas and pointing out problems. If it's not fear, it's futility. So there are situations where people would say, I'm not afraid to speak up, it's not a fear factor; it's the futility factor of, I've spoken up before, nothing changes, so why bother? And so I think we have to be mindful of both of those situations. We need high levels of psychological safety, but we also need good problem solving to make sure that speaking up doesn't become a futile approach.

And so then leaders can try to foster that openness. There are two key things, and I think I've touched on this before; I'll cite Timothy Clark, the author of another great book called The 4 Stages of Psychological Safety, who was kind enough to give an endorsement for my book. Tim Clark teaches that there are really two key things leaders can do: he talks about modeling certain behaviors and then rewarding those behaviors. So it's leading by example. Greg Jacobson, the CEO of KaiNexus, and other leaders there, he is modeling the behavior that we think is positive of admitting a mistake, and doing so in a constructive, problem-solving, let's-move-forward-better sort of way. He's modeling that, and that kind of gives permission for others: well, it's okay, Greg does it, and he's encouraging us to do the same. But then here's the moment of truth: when somebody else admits a mistake, Greg has to reward that same behavior. So this is the Tim Clark pairing of modeling these behaviors and then rewarding them. And I think rewarding is a different word than saying "not punishing." Rewarding doesn't mean literally, hey, here's $20, here's a gift card. But it's got to be a positive, like, thank you for sharing that, and let's move forward, right? If I end with just saying, well, it's okay, I know you didn't mean to do it, it's fine, and that's the end of the discussion, you're doomed to repeat the same mistake,
perhaps. Like, just being mindful, like, okay, well,
I clicked in the wrong place, and okay, I'll remember, I won't do that again. That doesn't last very long.
You know, we're human, we're fallible, we get distracted, we get tired. There are lots of reasons why human error is just a reality. But yeah, I mean, you've got to
not just encourage people to speak up, but actively kind of reward it. And that reward could be just a thank-you: now let's focus together on figuring out how to prevent that from happening again. That's very positive.
Right. So, man, I love getting into these conversations, because I start learning all this stuff that fits into other things I've learned, connecting the dots, like you just said. Yeah. Two of the questions that I'm constantly asking, in almost any scenario, for any of us, it's that reptilian brain always asking: am I safe, and do I matter? And so when we talk about fear and futility, they directly correlate. Yeah. And so if you can be working to make your team feel safe, and feel that they matter, you're generally going to be able to avoid a lot of these mistakes. Yeah, it's pretty powerful. Yeah. But that reptilian brain, that's part of our nature, right? When we're scared, it's hard to be in a creative, problem-solving kind of mode. So one thing I've learned along the way, and there's a story about this in the book from some things I was involved in at KaiNexus: because people do care about their job and the company, when someone's made a mistake, they feel bad. And sometimes you have to give some space and recognize that, and this is something I've tried to get better at, have some empathy, because my engineer brain wants to kick right into problem-solving mode, and sometimes people just aren't ready for that. You've got to be a little bit comforting and say, I know you didn't mean to do that, I know, and let that sit for a minute. But then, again, that can't be the end of the discussion. You might say, let's meet up this afternoon, like, soon enough, because one thing I've learned about the problem-solving side is that the fresher the situation, the more effective your problem solving is going to be. But if you rush it, people might not be in that place mentally, emotionally, to really bring their creative thinking. So I guess, how do you have a constructive response, and you've given some examples, but to address some mistakes without necessarily giving permission for them to make, like, the blatant mistakes?
Well, yeah, and I mean, "blatant mistake," that's a judgment call, right? Yeah. And a lot of times that's clear in hindsight, and even the person involved might admit, well, now I see I shouldn't have done that. But I think back to the question; there's a framework in healthcare, though actually it comes from aviation, and this is one thing that healthcare has been trying to adopt, not equally everywhere yet. It's this framework called Just Culture. And it basically gives a framework and even a flowchart, and as an engineer, I love that, right? Here's an algorithm for thinking through when some sort of mistake occurred:
Is that a systemic problem where, you know, punishment of the individual would be ineffective and unjust? That's where the Just Culture phrase comes from. And what are the situations where individual punishment is appropriate, if not necessary? Now, most of the time the algorithm goes through questions like, well, would another nurse in that same situation likely have made the same mistake? Not to pick on nurses, but in a scenario like that one in Lexington, the fact that a couple of different nurses made the same mistake of confusing one bottle with another, that screams systemic error. And somebody might look and say, well, that's a blatant mistake, they should have read the bleeping bottle. Well, sure. But we haven't walked in their shoes as a nurse with three ICU patients assigned to you, and that's far from ideal; one or two patients at a time is normally a full workload, and who knows what kind of stress and distraction and things were going on. You know, we're human. So we're not just shrugging our shoulders and saying, well, it's human error, what can you do? That's where you need good preventive measures. And some of those preventive measures weren't working well, like barcode scanning. And then, if you don't have a supportive culture where people can say, hey, the barcode scanning or something's not working, if they're left to fend
for themselves, well, then they end up, in this case, working around the system that was meant to prevent the error. And again, I don't think I'm being so loosey-goosey as to say nothing's ever your responsibility. But that all screams systemic error, where the solution for preventing it from happening again isn't going to fall back on punishment. Now, there are certain circumstances, for example, if somebody is intentionally hurting patients; I mean, there are podcasts about that different, disgusting type of situation. Clearly, when there's intent, and when people know they're doing something that is going to harm others, that's when punishment and the justice system should be involved. But I think there are times, unfortunately, where there's been great injustice, with pharmacists or nurses, you know, being convicted
of things that, you know, I would argue were systemic problems, or where that nurse was kind of in the wrong place at the wrong time.
You know, that, to me, is troubling. And then that causes all sorts of other problems.
Yeah. So let's talk about testing different ideas, which we've kind of touched on briefly already.
You said something important, I think, as far as how we go about actually testing and using a hypothesis, right? So we're not just blanket throwing stuff out there and saying go; we're actually saying, hey, this is what we think is going to happen. So now, as mistakes come up, they're a little more understood, almost acceptable, right, because of how we're coming about this. I mean, I have some ideas for how I've approached this in the past, but how do you work with teams to basically say, here's how you should set up some of your tests so that we know we're actually growing and learning from this? Yeah, I mean, there's this middle ground where you're right; this idea that we can iterate, learn from mistakes, and kind of work our way towards discovering a solution over time, instead of knowing the solution right away, doesn't give permission to, you know, throw spaghetti against the wall, or make all kinds of just reckless
assumptions or things that haven't been thought through. There's a middle ground where, you know, I think, when you have a team,
where people would share, you know, a reasonable hypothesis: if we do this thing, we're likely to get this outcome. Well, we want to be careful about groupthink. I mean, we could still, as a group, agree that something seems like a good idea, move forward, and learn it wasn't right. Yeah. But I think sometimes, when you've got more people involved, or let's say, in a healthy environment... it's a rhetorical question, but think of it as an executive,
and this may apply to some people listening: can you remember the last time an employee disagreed with you or said something was a bad idea?
Like, if you can't remember that, there might be a culture problem.
If that hasn't happened recently enough, people might be
just, you know, out of fear or futility, not speaking up, and then the boss might be going forward with a lot of ideas that turn out to be mistakes. And then they don't want to admit it was a mistake, and that doesn't seem like a path to success. But, you know, I think you get other people's input: do we agree that the hypothesis is solid? And then the other strategy is the small test of change. There's a story in the book about when I was coaching a team at a hospital where somebody had an idea that seemed reasonably good, something they wanted to buy: they would buy one for every hospital bed and install them on the bed rails, and it would be good for the patients and everything. So it wouldn't have bankrupted the hospital to spend that money and be wrong, but at the same time, we want to be effective stewards of the community's funds at this hospital in Iowa. And so part of the coaching was, well, instead of buying one for every hospital bed, because we're assuming that thing they wanted to buy was going to be effective and the right thing to buy, what's a smaller test of change? Like, well, we could start with one unit. Okay, and I kind of pushed, and we're talking, like, well, what's the smallest test of change? And we got to, well, one bed. Now, you know, there was a middle ground to be had, because you might not want to
put too much weight into one small test and one data point, right? So maybe, you know, the reality was in the middle. But
I think the point
is, you know, the smaller test of change is less risky. When we can learn from that small test of change, and when it was a small expense, you're less likely to be embarrassed or to double down. And how many times in corporate settings have we been pressured into, like, prove that was a good idea? Okay, we can torture the data till it says what we want it to say, or it's a sunk cost that we aren't willing to just throw aside. So I think when it's a smaller test of change, people are more willing, or it's safer, to say, hey, we tried that, and it didn't really work out, but we learned something; let's adjust the plan.
I think those are some helpful tactics. Definitely. So, something that I've done that seemed well received was thinking through the actual value of a mistake and what I was willing to accept for that level of risk. And it changed per team member, right, because their experience levels matter and stuff. For someone, I might say, hey, you can make $500 decisions: if you think this could cost us $500, okay; any more than that, come talk to me. And, you know, $5,000 for someone else, $50,000 for someone else. And now we're almost balancing that risk, and so mistakes just don't happen as frequently, or as large, for that individual. And I think it's reasonable to have some boundaries or guidelines. I think sometimes a helpful question can be: what's the worst that can happen? As we're evaluating a test, and again, thinking of the situation, if the worst that can happen is losing $500, fine. If we're doing some sort of marketing experiment that could really destroy the company's reputation, well, that's a risk we would be a lot more careful about.
I want to switch gears again now, going back to focus on the leader, where
this is something I have struggled with: finding the balance between taking too much ownership for mistakes versus ensuring your team understands the mistakes that they've made and then takes ownership of that. I've always looked at a situation and said, what could I have done differently so we didn't have this outcome? Even if it was, like, I could have onboarded that team member onto the company better a year ago. I'm still willing to accept that, and I'll look at how I could do it. But at some point, I've got to make sure I'm going and talking with the team, and helping them do that. How often do you see that balance as a struggle with leaders you're working with? I mean, I think it's one of these kind of yes-and situations, where I agree with your point of maybe it was an onboarding issue, a training issue, we've identified a training gap
that we can close, and we can onboard people better in the future. I think that's a healthy, constructive, kind of system-based view. And there are some things that only the CEO and the senior leadership team own and have responsibility for. At the same time, you can engage and involve, you know, the people that were, shall we say, involved in the mistake. Right. So I think there's this balance. You know, servant leadership doesn't mean we do it all for them, or that we do all the problem solving as the leader. I think there can be some collaboration and some middle ground, where the individual employee can own their action, their decision, and leaders can own some of the systemic factors, but then you can collaborate. The leader can say, I want your input on how we can help prevent that. And they might point back and say, well, you never trained me about this and that, and, okay, well, that's an onboarding question. I mean, I think it can be collaborative in a way where it's not,
you know, paternalistic or maternalistic, like, well, we're going to, you know,
not trust the employees to help us fix it. Okay, well, we're the leaders, we have to do it all. Like, that doesn't scale well. And, you know, some of the leaders from KaiNexus will talk about how they might have 15 people reporting to them; they can't micromanage what those 15 people are doing every day, and they can't necessarily be involved in micromanaging the reaction to every mistake that's made. So I think you find a balance of, what are you teaching and empowering your team to work on, versus what are they escalating up the chain because they do need help? You know, I think I've seen some situations where it goes to an unhelpful extreme, where everything's being escalated to the CEO, and the CEO can't keep up, even if they've worked 17 hours a day, which I don't recommend. But, you know,
everything going up the chain
that could be driven
by fear or people not being equipped with the problem solving skills
that they could and should develop.
Yeah, that makes sense, makes a lot of sense. Mark, this has been a great conversation, thank you. I have personally, again, gotten some great insights that I get to take back, so I'm sure the audience has as well. Appreciate that. Before we get into some of the closing questions here, I want to recommend that everyone grab a copy of your book, The Mistakes That Make Us,
available on Amazon or anywhere else that people can grab it. It's starting to trickle out. I know it's available through BarnesandNoble.com. It's available in Kindle format; that's the only ebook format right now. Paperback, hardcover, and the audiobook version is going to be available sometime in August. I've already recorded that, and it's just waiting on its arrival. So depending on when this gets released, it's either already available now or coming soon. But yeah, Amazon is probably the easiest way, or people can go to MistakesBook.com, that's my website, if they want to order a copy directly from me; they can do that. If they want to do a bulk order for their team, I can hook that up as well.
Awesome, which could be extremely valuable. I mean, I've done company book clubs, yeah, where we all just read, and it really helps with syncing that knowledge into the company way faster. Yeah, so I hope people would do that. There is one level of bulk-order purchase where I'll also come and do, like, a Q&A session for the book club, if people want. Sounds even more valuable.
So, separate from your book, what book would you recommend that everybody read?
I think I already mentioned The 4 Stages of Psychological Safety by Tim Clark; that book is fantastic. Amy Edmondson's book, The Fearless Organization,
is really good. And then, I love books. I mean, from my auto industry roots and learning from Toyota, there are books out there about Toyota; there's one called Toyota Culture
that is, I think, really good and really helpful, because, you know, it's not about copying the tools and techniques from Toyota; it's really more about culture, leadership, management systems, and mindsets, right? So yeah, I would check out the book Toyota Culture as well. So there are three, two of them very similar, on Toyota culture from different directions.
Awesome. It's a very selfish question I ask, so I can keep expanding my library, dare I say.
It's okay to be selfish that way.
Right? What's next for you professionally?
Well, you know, book launch mode is ongoing. It's a marathon, not a sprint, which is a funny analogy to use, because I don't run either of those
distances or speeds. But, you know, ongoing book launch mode, doing more speaking engagements related to the topics in the book. And I think there are two key ideas that are interconnected, and some organizations need more help with one or the other: psychological safety, how do we nurture and build that in a team, and then the problem-solving piece, how do we help make sure that when people speak up, something good and constructive and productive comes out of it? So, yeah, doing more of that, and not just with healthcare organizations, but doing sessions like that with people in other industries. And I'm going to continue podcasting; I'm going to keep doing the My Favorite Mistake podcast. Chris, I'm not putting you on the spot publicly, I guess, but I would love to have you come on my podcast if you have a favorite mistake story to share, and we can talk about some of the things you've learned.
I have lots of mistakes; sorry, let's sort through them. Yeah, well, I've learned it's a mistake: you don't want anyone to be surprised by that question. You've got to think about it, like, at least over a weekend or something.
Right.
Awesome. So where can people find you? So, MistakesBook.com is probably the easiest to spell and to get to for the book. My broader website is MarkGraban.com; if you can put a link in the show notes, it's G-R-A-B-A-N dot com. And I can be found on LinkedIn; that's kind of my main social media platform.
Those would be, I think, the best places to find me. I kind of joke around that I'm obnoxiously easy to find online, but probably the worst way to contact me is an Instagram message, because I think a couple of years went by, I'm not a heavy Instagram user, and I didn't know there was messaging in Instagram.
People had been trying to contact me there. Right, you just got the messages; so effective.
I didn't know I didn't have notifications on. I don't know. So please don't try to contact me that way.
Noted. Sure. Well, I do use Instagram for that. I'm getting better now; I know, I'm not repeating that mistake. Right.
Well, Mark, thank you for joining me. It's been an awesome conversation, Chris. I appreciate the questions and your enthusiasm for the topic, and I look forward to exploring this more on my show. So thank you.
If you enjoyed today's episode, I would love a rating and review on your favorite podcast player. And for more information on how to build effective and efficient teams through your leadership, visit leadingfor.com. As always, deserve it.
