What happens when your team doesn’t trust each other? How can you repair damaged trust at work? How has technology changed the way we decide who to trust? Author and trust researcher Rachel Botsman joins host Jeff Berman to tackle these questions and more.
About Rachel
- Leading global authority on trust and its impact on society and technology.
- Author of three influential books on trust, translated into 14 languages.
- Lecturer at Oxford University on trust, innovation, and social change.
- Advisor to organizations on designing trustworthy platforms and products.
- Pioneered frameworks on distributed and digital trust dynamics.
Transcript:
Build stronger trust on your teams
RACHEL BOTSMAN: The game changer is that in old forms of information and news, you could kind of take what I call a trust pause. You could slow down, and you could say, “Is this person, is this product, is it worthy of my trust?” And we don’t do that anymore because everything is designed around speed and efficiency. You talk to most entrepreneurs, it’s about taking out the friction.
JEFF BERMAN: Rachel Botsman knows it feels like the world just keeps getting faster and faster, and she wants us to take these pauses, trust pauses, more often. She’s an expert researcher on trust and on how technology is upending this bedrock element of being human.
BOTSMAN: For the first time in history, we could trust strangers mediated through technology in ways on a scale that we’ve never seen before.
[THEME MUSIC]
BERMAN: I’m Jeff Berman, your host. This week on the show, author and researcher Rachel Botsman. Her latest is the audiobook “How to Trust and Be Trusted.” We’ll dig into why there’s a crisis of trust in our world and how leaders can improve their trustworthiness. I can’t imagine a more important time to be discussing this topic. Rachel, welcome to Masters of Scale.
BOTSMAN: It’s great to be here. I listen to the show, so I’m pleased to be on as a guest.
Are we in a ‘trust crisis’?
BERMAN: I’m super excited to have you. Rachel, what is your basic definition of trust?
BOTSMAN: So my basic definition of trust is that trust is a confident relationship with the unknown.
BERMAN: And I’m curious, where are we in the arc of trust?
BOTSMAN: I’ve been studying trust for over 18 years, and I would say every single one of those years people talk about a trust crisis. So I think it’s a human obsession to focus on the decline of trust, and it’s a human obsession to focus on lines on a graph going upwards or downwards. And that’s not really the way trust works. So without a doubt, trust is in crisis in traditional institutions. We all know that.
But the reason why it’s in crisis, when you separate out our individual characters, is because many of those systems and many of those structures have a design problem when it comes to trust, in that they were built like pyramids. They’re very top-down, they’re overly hierarchical. They think of power in this “power over” form, and that is not how human beings relate anymore. We know this from the way we don’t look up to experts anymore; we’re not deferential to the scientists. Even CEOs and bosses, it’s not like that hierarchy that used to exist. And even the way we see influence and trust work in information, whether that’s the news or the way we buy a product or service, it flows sideways. And that’s largely because of the change in the nature of information and the channels.
BERMAN: I’m almost seeing a visual where we’re going from a pyramid to almost a scatter chart with lots of lines and crossing over nodes. Is that the right way to think about what’s happening to trust? It’s not a decline so much as it’s a redistribution?
BOTSMAN: Yeah. So it’s like if you summarize it, history has gone through three chapters of trust: local, institutional, and what we’re now in is distributed.
BERMAN: Can you just unpack those three?
BOTSMAN: So when you zoom out in history, trust has always existed, right? This is the social glue of relationships. It’s what enables innovation. It’s what enables commerce. And so I was like, “So how did this work over time before we had these technologies?”
And what you see is trust used to be local. So it used to be defined by proximity, physical distance. And largely we used to trust people that were close to us, people that we knew, and trust worked by reputation. Now, when we started to move, when we started to build cities, when we started to trade, this type of trust, it wasn’t effective. It no longer sort of held up to scale and growth. And it was one of the most incredible periods of innovation. So you think of all these trust mechanisms that were invented, whether that’s things like insurance or things like contracts or things like brands where you no longer had to trust another person, but you could trust through an entity. Even things like agents and brokers, these were all incredible innovations. And this was the era of institutional trust, which lasted for a long, long period of time.
And then, technologies came along, digital technologies, especially internet, social media, all these things that essentially allowed trust to distribute itself through networks and marketplaces and platforms. And so, what happened was for the first time in history, we could trust strangers mediated through technology in ways on a scale that we’d never seen before. This is what I call distributed trust, and it fascinated me because I was really studying the mechanisms of platforms. So I was with the entrepreneurs of the very early days of Etsy and Airbnb, and the number one question no one had studied is how does trust work on these platforms? So that’s what led to this three-chapter theory.
The risks of declining trust in experts
BERMAN: In the early days of Uber and Airbnb, there was a leap of faith, right? There just wasn’t enough objective data. But now if I’m looking at Airbnbs, I’m looking at the host rating, I’m looking at the guest comments, I’m looking at, is the Uber driver close to five stars? So there’s almost a crowdsourcing of trust here, because I can reasonably rely on these data points. So that makes sense to me.
But this decline in trust of experts, frankly, I’m scared by it. It strikes me as exceedingly dangerous that someone who spends five minutes doing internet research is as trusted on TikTok or YouTube or Instagram as someone who has spent 18 years studying a subject as you have with trust itself. What are the implications for us that this is where we are?
BOTSMAN: It’s huge. And it’s the thing that worries me, the thing I tried to work with news organizations around, because we focus on this question all the time, the number one question: “I want to build more trust in my platform or my organization.” But what no one has figured out is how do you really signal trustworthy information? And the reason why we don’t really signal that is because, first of all, trust can be subjective. And second of all, these platforms make a lot of money off people who are not trustworthy to give that kind of information. And you can see it in all different contexts. So last year I got into running, and I went down the rabbit hole of running influencers, and I properly injured myself following their advice. And I say that because I know this stuff inside and out, so you know how easy it is to fall for the trap.
BERMAN: I so appreciate the personal example as perhaps the world’s foremost expert on trust. You put trust in people who ended up giving you bad advice, bad input that didn’t work. What is it about just the nature of humanity that we are putting trust in people who don’t deserve it?
BOTSMAN: It’s part of what makes humans wonderful and also terribly stupid. So what makes us wonderful is most of us, 80% of us, are naturally trusting human beings. Our propensity is to give trust. Where we’re absolutely stupid is we like this illusion of control. So when we think we’re getting this information around our morning routine or our diets or our children or whatever it may be, we think in some way that particular person and the information they’re giving us is giving us a little bit more control over our lives. So we give our trust away really easily.
And the thing that is completely different and the game changer is that in old forms of information and news, you could kind of take what I call a trust pause. You could slow down and you could say, “Is this person, is this product, is it worthy of my trust?” And we don’t do that anymore because everything is designed around speed and efficiency. You talk to most entrepreneurs, it’s about taking out the friction. And so, we don’t realize that process of literally giving our trust away has been sped up and accelerated. And we don’t take those trust pauses to slow down and go, “Hang on, what is this person’s intentions and motives? And most of all, do they understand the context of my situation?” Because all trust is really dependent on context.
So you take the running example. That person doesn’t know if it’s raining and freezing in England. They don’t know if I slept that night. They don’t know if I’ve eaten. There’s no context in that situation, and that is also really dangerous when it comes to making trust decisions.
The emotional nature of trust in the age of AI & social media
BERMAN: This trust pause is a really interesting concept, and we’re actually in a moment where that trust pause is as short as it’s ever been. I have three teenagers; my youngest lives a good bit in the kind of Joe Rogan world, and he now knows that when he asserts something, I’m going to ask, “What’s your source?” And TikTok is not an acceptable answer. But we find ourselves going to an AI together to then go deeper on it. And I’m curious. We used to talk about this as media literacy. I don’t even know if that’s the right term anymore. What do you recommend we do so that we’re building a new generation of trust that is in the right things, not the wrong things?
BOTSMAN: So I think even making your kids aware that there is nothing wrong with opinion, but opinion is different from fact. It is really hard. They’re not really taught this in school, like what is a factual argument and what is opinion? I get to lecture at Oxford, and I even see this in graduate students where you’re still pointing out that distinction.
I was involved with this piece of research for Channel 4. I didn’t directly do the research; I was sort of unpacking it. It was the largest study done in the UK on Gen Z in particular, looking at what influences their information and decision-making. And this really stuck with me: what’s happened is that, in our generations, the trust hierarchy was based on who was delivering that information. So was this a newsreader that we trusted? Was this a journalist? Was this a scientist? And then the second was the what. So was it The New York Times? Was it the FT? The channel. And the thing that has changed is that they are now forming opinions or arguments based on how they feel in that moment. It’s not to do with what’s being said or who’s saying it; it’s based on how it makes them feel.
BERMAN: So it’s literally a gut reaction.
BOTSMAN: It’s like an emotional response. And it’s how they’re feeling in that moment as well, not even the bigger picture. And so, if you think about influencers, politicians, they understand this, right? So you think of the most salacious, the most extreme, the most funny; anyone that can prompt that emotional response, that is what they tune into. Now that is somewhat human nature, but the problem is the inversion of it: there is no slowing down to go, “The reason I’m listening to that person is because they are validating how I feel in this moment, versus the validity of what they’re saying.”
Building trust and accountability in the workplace
BERMAN: You’re kind of taking my breath away. I’m a little scared hearing this. What are the takeaways for this for people who lead teams? How should we take these data points and apply them to how we’re working with our colleagues?
BOTSMAN: So I think trust in the workplace is in one of its most complex stages in history. And the reason why is you have these conflicting societal changes going on. You’ve got obvious things like a change in the format of work, and this is what everyone’s fixated on: hybrid work, virtual work, in the office. So that is one. The physical office no longer means that trust is by proximity. It’s not like that local trust again.
The second thing, tied to this, is again a societal shift that people aren’t talking about, not just a trend. Never before have we spent more time alone at home across all activities: shopping, the gym, entertainment, even things you’d expect to be social, like religion and dating. Even clubs have moved online; people don’t go out anymore. And this ties to the workplace, Jeff, because if you spend all that time alone and then you go into the workplace, we don’t like that human friction, right? We can’t even sit in a meeting. We can’t make eye contact. So learning the basics of human connection, presence, showing up, these skills of leadership are really hard. So that’s the second thing.
And then the third thing, and this is something leaders talk about all the time: because the hierarchy doesn’t work anymore, because people want to be empowered and they want accountability and they want that sideways trust, how do you create clear boundaries around the personal and the professional? How do you create clear boundaries around things like reporting and accountability? How do you create all of that when these old systems of trust and power no longer feel relevant to different generations?
BERMAN: I was lucky early in my career to have a boss who worked this way, and I’ve tried to apply this ever since. And when we’re hiring at scale and we do kind of new hire breakfasts and what have you, one of the things that I’ll say is, “I’m wrong a lot.” And at most companies, I think they’re going to want you to sit in the room, hear the person at the top of the org chart say something really stupid. And they actually want you to keep your mouth shut, walk out of the room, roll your eyes, say, “The old man doesn’t know what he’s talking about,” and go do it anyway.
And I’ll fire you if you do that here. The way we work is you say it in the room, you push back, we have the argument. Ultimately, I’m either going to say, “Shoot, you’re right, we should do it that way,” or “I’m not sure. That’s interesting. Let’s pull some other people in and have a conversation about it,” or “I appreciate that perspective in this discussion. I’ve heard you. We’re going in this direction. My expectation is we’re all going to get in the boat and row together in that direction.”
I feel like maybe that last part is getting harder, that pact that we’re going to have an honest process and at the end of that process, a decision is going to be made and we’re all going to go with that decision. I’m very lucky we have a team that is really capable of working this way, but as I talk to friends in the industry, my sense is that that’s getting harder. Is that your sense as well?
BOTSMAN: Yeah. It’s like a decline in constructive disagreement.
BERMAN: Yeah.
BOTSMAN: And it’s because that deep trust and psychological safety, what Amy Edmondson would call the permission in the group to take risks, to be vulnerable, to admit failures, to talk about mistakes, is declining in organizations. And that is because it is harder to do in virtual environments. I’m a huge supporter of the hybrid model, but you do need that human energy, that face-to-face contact, to feel that other person, for that really to exist. So when people are coming in maybe twice a week and then they’re trying to have those difficult conversations and there is some kind of disagreement, the natural thing is not to walk away and go, “Oh, I feel good about that.” It’s to be quite protective, to withdraw.
BERMAN: Still ahead, more with Rachel Botsman on how to heal damaged trust on your teams.
[AD BREAK]
Welcome back to Masters of Scale. You can find this conversation and more on our YouTube channel.
The three phases of distrust
Next up, we dig into a key framework from Rachel’s work, the three phases of distrust.
BOTSMAN: So the first phase is defensiveness. You’d actually want people to be defensive, because they’re engaged, and that’s a sign of trust. But it’s when they move to the next phase, which is disengagement: “Oh, Jeff didn’t agree with me. He always thinks he’s right. You know what? I’m not coming to the next meeting, or I’m not going to do my best work.” Whatever the version of it is. And then the last phase is disenchantment: “Hey, have you worked with Jeff? He doesn’t know what he’s doing, right? We need to find someone else. He’s got to go, right?” So they have turned against you, and their only intent is to turn others against you. And what we’re seeing is that cycle from defensiveness to disenchantment goes a lot quicker without the physical face-to-face contact, because when you are removed from one another, you fill in the blanks.
BERMAN: You fill in the blanks, and also the protection of doing it through a screen on Slack or whatever, there’s a buffer there. So is part of the answer that we need to be in physical space together more? If you lead a team, how do you best deal with this?
BOTSMAN: So I’m not going to recommend that everyone should be physically together, but I think it’s just being very aware that if you are working predominantly in a virtual environment, you cannot rely on trust through proximity. It’s just gone. So trust has to come from somewhere else. And what we see is leaders who are really good at managing virtual teams, hybrid teams, are really good at two things. One, they are very, very good at managing expectations: “This is what you can expect of me. This is what I can expect of you.” Whether they do that in meetings or in how they use virtual tools, they are excellent expectation setters. And they also clarify what they can’t do, what they don’t have the time, the skills, or the resources for. And the second thing is they’re very consistent in the way that they show up. So there’s none of that erratic behavior, like, “Oh, they log on late at night,” or “I’m not sure they’re going to send it in the morning.” That consistency and expectation setting is really, really important when proximity goes out of forming trust.
The second thing is when people come together, you actually create the space for conversation, not agenda. So the number of leaders who say, “You know what? When they get to the office, all they do is chat about the weekend.” And I’m like, “Great, that’s what they should be doing.” You have to intentionally design those meetings and those days. If you invest in that time and design it intentionally, it can carry you through for weeks. It then allows the virtual work to happen. We’ve put so much thought into making the tools work and making it really efficient and making people productive, but not into how, in the absence of physical proximity, you really nurture that trust.
BERMAN: This arc of disengagement to disenchantment, I’m curious how we break that, how we stop that.
BOTSMAN: It’s a great question. You really have to try and catch it in the defensiveness phase. And it’s amazing how people don’t have the conversation, right? Well, to have that conversation is uncomfortable. But believe me, when you get to the next phase, it’s going to be more uncomfortable. So just saying, not in a public way, but saying to that person, “I noticed in that meeting that you were really invested and you were really engaged, but it came off as slightly defensive.” And it’s not accusatory. What you’re trying to do is understand where that person is coming from. Where is that emotion, where is that thought, coming from?
And they might have a very rational explanation, which could be, “Do you know what? I’ve been working on this for two years, and then Joe comes in with a completely new idea. You like Joe more than me.” We’re all children, basically. “And you listen to Joe, and I know what I’m talking about.” Right? So giving them the opportunity to explain where they’re coming from.
And it’s funny, when these moments are handled well, they become really, really deep trust moments, because then you go into validation: “I don’t like that feeling either. I’m really sorry that came across, blah, blah, blah, blah.” So you’ve got to catch it in that defensiveness phase. The disengagement phase requires quite hard feedback and data gathering: you didn’t show up, you weren’t reliable. Reliability is actually a really good trait to track around disengagement. And then, with disenchantment, personally, I think people need to go.
BERMAN: Yeah, that’s it. It’s like a tumor. You just have to get it out.
BOTSMAN: 80% of leaders regret how long they’ve kept someone on their team who distrusts them. Because they are causing not just damage; think about the mental consumption, the spiral. You’re worrying about them: what are they saying? And so we should deal with those situations, those people, a lot earlier. For both sides, it is not a good situation, and it’s unlikely to improve.
BERMAN: Why don’t we deal with those more quickly?
BOTSMAN: Because we think it’s going to take up time. It’s uncomfortable again. And then, if that person goes, they’re leaving a hole. Right? So you’ve got, then, a hiring decision under pressure. But there’s already a hole there, so I think it’s a belief that it’s going to get fixed or it’s going to go away, or someone else is going to deal with it. And it’s not. It’s just going to get worse.
BERMAN: It’s avoidant, is what I’m hearing.
BOTSMAN: Yeah.
Exploring AI’s impact on the next era of trust
BERMAN: We talked about AI earlier in the conversation. There are so many positives about AI. It’s astounding that we’re in a moment where literally the world’s repository of wisdom is at our fingertips. There’s a downside too. We live in an era of extraordinary disinformation and misinformation, and there are risks that AI amplifies that. How are you looking at this next era? Are we entering a fourth phase of trust? What’s your perspective on this?
BOTSMAN: Yeah, so my working title for this, and I’m not sure it’s right, is Auto Sapient Trust, because I want to avoid the word artificial intelligence or AI. I don’t trust anything that’s artificial. And I think if you imagine local, institutional, distributed, and then you’ve got auto sapient; and the shift is that what we’re essentially saying is the way you trust something and the way you trust someone is now the same.
And this is so important when it comes to technology, because for such a long period of history, we’ve trusted the technology to do something. And now we’re trusting technology to decide something and to create something. And the reason why this is so important for entrepreneurs and designers and business leaders to understand is that you’ve got two sides to trust. You’ve got capability and you’ve got character. And when a technology is doing something, you are focused on the capability side of trust; that it’s competent and reliable, it does what it says it’s going to do.
And what this auto sapient era does is it takes it into the character side of trust. So empathy, integrity, humility, very human qualities. And that’s the piece that worries me. And you can see where there are really, really serious trust issues because it isn’t trustworthy. So if you think about the copyright of artists and authors, which deeply bothers me, it’s because there isn’t an alignment in the business model with those creators’ IP and how it’s protected. And so I’m like, “We’ve made this mistake with platforms over and over again. Why can’t we get this alignment issue right?”
I actually think the empathy issue, it’s surprising when you look at the research, particularly in health and a medical diagnosis, that it’s really, really good at empathy. And it took me a long time to realize, Jeff, that its weakness is its strength. So the fact that it can’t absorb emotions, that it doesn’t get flooded, allows it to be empathetic in a way that humans can’t, like whether you’re a therapist or whether you’re a doctor.
And the interesting question is where it stops. So it’s really good at cognitive empathy, but it can’t do what we call somatic empathy. So there’s no physical response. Do we use that line in a really constructive way and actually say, “Ooh, it allows us to be more human in healthcare. It allows more time for that physical presence and the support that people need because actually it can do the cognitive empathy really well.”
It drives me nuts, if I’m honest, the conversation with artificial intelligence, because it’s so generalized, it’s so noisy, and there isn’t enough nuance in thinking about those qualities that we should trust AI to do. It’s more than the human in the loop. It’s like, what is that line of trustworthiness that is really, really important to figure out in any product or system that we’re designing?
What AI companies need to do differently
BERMAN: You get a phone call this afternoon from OpenAI or Inflection or Anthropic and they say, “Rachel, we really want our customers to trust us and for that trust to be earned, deserved, and fulfilled. What do we need to be thinking about differently than what we’re doing right now?” What are you telling them?
BOTSMAN: So first of all I’d say, “I don’t actually think, ‘how do you get people to trust us?’ is the problem. I think too much trust is the problem,” is the first thing I’d say. And it’s really like, “How can we make this product worthy of people’s trust in different contexts?” So I would start to really look at different areas of our lives and the way that we are using it, and where we actually think it’s high trust and high trustworthiness. That trust is deserved. That’s the good square.
Then you’ve got the design problem. So actually, it’s really trustworthy, but people aren’t trusting it. So it’s really good at diagnosis, reading x-rays, whatever it might be. “Okay, how do we solve that design problem?”
Then you’ve got a quadrant which is low trustworthiness, high trust. So actually, Sam’s going, “This is really unwanted trust that people are using us to solve this problem.” And that’s the danger zone. How do you stop people using it in that area because it’s not ready?
And then, you’ve got low trust, low trustworthiness, which is less of a problem because you’ve got less users in that space.
So I would literally be saying to them, “Think about this quadrant, high trust, high trustworthiness. That’s where you’re creating the social proof around this. Ooh. High trustworthiness, low trust, that’s the design problem. But the one I’m worried about is low trustworthiness, high trust. How do you stop people in that space?” So it’s coming back to that mapping, that context, understanding those usage scenarios.
And then, the other side of that is I’d really push them to say, “Where are your misalignment problems? So where as a company are you not aligned with the best interests of your users, of the people that you serve, of the people providing the inputs and the information? Because those misalignment problems, they are integrity issues. And they are not going to go away. They’re only going to get worse.”
And just look at what happened with social media. Meta is still dealing with the same problem of misalignment around the use of people’s information. So those would be my two things: where are your big misalignment problems, in terms of your interests, your intentions, and what really serves the public, and how are you really functioning across those quadrants?
BERMAN: Love that. Thank you so much for being with us. A lot to chew on here. Thank you.
BOTSMAN: It’s a pleasure. Nice to meet you.
BERMAN: In any and every business, trust is an essential ingredient for scaling. Rachel Botsman’s approach to thinking about trust is a captivating blend of historical context and actionable insights. To learn more, I hope you’ll check out her audiobook “How to Trust and Be Trusted.” We’re linking to it in the show notes and comments below. I’m Jeff Berman. Thank you for listening.