
The Elephant in the Org
The "Elephant in the Org" podcast is a daring dive into the unspoken challenges and opportunities in organizational development, particularly in the realm of employee experience. Hosted by the team at The Fearless PX, we tackle the "elephants" in the room—those taboo or ignored topics—that are critical for creating psychologically safe and highly effective workplaces.
The Spy Who Slacked Me: Slack Honeypots, Turf Wars, and the HR Tech Hunger Games
🎧 Special Edition – Oh Shit, What’s Happened Now?
This is one of those moments when we pause our scheduled programming because something in the news is just too wild to ignore.
In this special edition of The Elephant in the Org, we dive headfirst into the explosive lawsuit between HR tech giants Rippling and Deel — featuring Slack honeypots, alleged corporate spying, stolen sales data, and actual RICO charges (yes, the kind usually reserved for organized crime).
Rippling says Deel recruited a mole who conducted thousands of unauthorized internal searches to steal client data — all caught via a fake Slack channel set up as a digital trap. Deel says the whole thing is overblown. Meanwhile, the HR tech community is left wondering… who do we trust now?
We unpack:
- What this means for psychological safety in high-growth tech
- Whether surveillance culture is becoming the new norm
- How this could shake employee trust, not just in vendors — but in work itself
And of course, we reflect on what this kind of chaos means for managers, employees, and all of us caught somewhere between hustle culture and HR values. Is this just startup life? Or have we officially entered the HR Tech Hunger Games?
🐘 Connect with Us:
🚀 Follow The Fearless PX on LinkedIn: The Fearless PX
📩 Got a hot take or a workplace horror story? Email Marion, Cacha, and Danny at elephant@thefearlesspx.com
🎧 Catch every episode of The Elephant in the Org: All Episodes Here
🚀 Your Hosts on LinkedIn:
💬 Like what you hear?
Subscribe, leave a ★★★★★ review, and help us bring more elephants into the light.
🎙️ About the Show
The Elephant in the Org drops new episodes every two weeks starting April 2024.
Get ready for even more fearless conversations about leadership, psychological safety, and the future of work.
🎵 Music & Production Credits
🎶 Opening and closing theme music by The Toros
🎙️ Produced by The Fearless PX
✂️ Edited by Marion Anderson
⚠️ Disclaimer
The views and opinions expressed in this podcast are those of the hosts and guests, and do not necessarily reflect any affiliated organizations' official policy or position.
WEBVTT
1
00:00:03.310 --> 00:00:08.430
Danny Gluch: Welcome back to The Elephant in the Org. I'm Danny Gluch, and I'm joined by my co-hosts, Cacha Dora
2
00:00:08.430 --> 00:00:09.590
Cacha Dora: Hello!
3
00:00:09.590 --> 00:00:10.890
Danny Gluch: And Marion Anderson
4
00:00:10.890 --> 00:00:11.640
Marion: Hello!
5
00:00:12.380 --> 00:00:16.110
Danny Gluch: How are you ladies doing today? You ready for some espionage?
6
00:00:17.620 --> 00:00:21.679
Marion: I think we're doing better than our friends at Rippling and Deel.
7
00:00:21.680 --> 00:00:29.731
Cacha Dora: Oh, yeah, with some spy-theme music going on in the background. Everyone can plug in their own favorite genre in their head.
8
00:00:30.030 --> 00:00:31.260
Marion: Already James Bond.
9
00:00:31.260 --> 00:00:34.567
Cacha Dora: Yeah, Mission: Impossible. All of it. Yeah.
10
00:00:36.795 --> 00:00:37.700
Danny Gluch: So
11
00:00:38.890 --> 00:01:00.269
Danny Gluch: we wanted to talk about how this even happens. Like, first off, how does this decision get made, to find someone, recruit someone? This is legit MI5, Slow Horses, James Bond stuff here. Are our companies set up to do this, and I'm just unaware?
12
00:01:02.490 --> 00:01:20.153
Marion: There's a reason why we all have code of conduct training and go through all of that training in an organization. So I would like to think that it's not that common. But I think it's a bit of a shocker.
13
00:01:20.500 --> 00:01:20.949
Cacha Dora: Oh, yeah.
14
00:01:20.950 --> 00:01:22.680
Marion: In our own industry, you know.
15
00:01:23.160 --> 00:01:24.580
Cacha Dora: Well, and I think, like
16
00:01:24.800 --> 00:01:48.639
Cacha Dora: Danny, to your point: is it common? Is this something we should be seeing? I'm wondering if it's more about the startup fields, these unicorn fields. If it's less the established businesses and more the ones that are trying to really stake their claim, maybe that's more where we could see activity like this.
17
00:01:49.560 --> 00:01:54.379
Danny Gluch: Yeah. And I guess that brings up the question of, can you trust
18
00:01:54.820 --> 00:01:59.799
Danny Gluch: startups? Can you trust any organization that
19
00:02:00.130 --> 00:02:06.990
Danny Gluch: might be doing stuff like this, both as a partner or as an employee? Can they be trusted?
20
00:02:10.240 --> 00:02:36.833
Marion: Yes, I think this is an anomaly. You know, as far as I know, I've never experienced anything like this that I've been aware of. So I'd like to think that this is highly unusual, but it does call into question ethics and integrity and values, and I don't know what the values of that particular organization are. But
21
00:02:37.590 --> 00:02:46.690
Marion: you know, it's definitely not reflective of what they certainly aim to do with their clients and the audience that they're promoting to, right? So
22
00:02:47.940 --> 00:02:50.809
Marion: Oh, I don't know. I'd hate to be on that board
23
00:02:51.070 --> 00:02:51.390
Danny Gluch: Yeah.
24
00:02:51.390 --> 00:02:59.299
Cacha Dora: I mean, especially when the business itself, right? The business itself handles sensitive data. That's what they do.
25
00:02:59.470 --> 00:03:00.010
Marion: Hmm.
26
00:03:01.240 --> 00:03:07.509
Cacha Dora: And that's such a key component of it. So if what you're doing is holding and handling sensitive data,
27
00:03:09.320 --> 00:03:16.459
Cacha Dora: like there's almost this incumbent trust contract that you need going into a conversation because you're assuming
28
00:03:17.130 --> 00:03:18.530
Cacha Dora: my stuff's good
29
00:03:18.660 --> 00:03:19.450
Marion: Yeah.
30
00:03:19.450 --> 00:03:20.110
Danny Gluch: Yeah.
31
00:03:20.110 --> 00:03:29.600
Marion: Yeah. Now, obviously, when you're signing contracts with these vendors, those contracts are fairly robust and, you know, protect...
32
00:03:29.600 --> 00:03:36.790
Danny Gluch: So were the laws, right? And they clearly didn't mind stepping past that.
33
00:03:36.790 --> 00:03:49.849
Marion: That's true, that's true. I don't know, though, how far up the chain it went. Was it known all the way to CEO level, or was there someone acting in a silo? I don't know.
34
00:03:49.850 --> 00:03:51.690
Danny Gluch: We may never know, for
35
00:03:51.690 --> 00:03:52.910
Marion: Sure.
36
00:03:53.110 --> 00:04:00.630
Danny Gluch: I really hope to see some memos and some emails and text messages and stuff, though. That's gonna be juicy.
37
00:04:00.630 --> 00:04:01.120
Marion: Mobile.
38
00:04:01.120 --> 00:04:03.909
Cacha Dora: It'll turn into a movie. It'll turn into a movie.
39
00:04:04.410 --> 00:04:05.410
Marion: Probably not.
40
00:04:05.410 --> 00:04:07.889
Danny Gluch: Speaking of, have you guys seen the BlackBerry movie?
41
00:04:08.070 --> 00:04:09.389
Danny Gluch: That's a good one.
42
00:04:09.440 --> 00:04:13.609
Marion: Oh, no, I haven't seen it, but I've heard about it. Yeah.
43
00:04:13.610 --> 00:04:40.870
Danny Gluch: It's not entirely unrelated to this whole corporate espionage, unethical shenanigans thing. Both that movie and this, reading the news blurbs, reminded me of Adam Grant when he was talking about givers and takers. There's givers, there's matchers, there's takers, and a lot of organizations think, oh, it's okay to have takers in your organization as long as they're really productive.
44
00:04:41.260 --> 00:05:02.660
Danny Gluch: and he says, well, it's fine until their allegiances like don't align with the organization, and then you just can't trust them at all because they're gonna act in their own self interest. And I think organizations can have that kind of mentality, too, of yeah, deal might be really good at what they do, and and you might trust them for a bit.
45
00:05:02.900 --> 00:05:06.489
Danny Gluch: But if this is how they act, if they're willing to act like this,
46
00:05:07.000 --> 00:05:19.790
Danny Gluch: then at some point someone's gonna be like, hey, if we just didn't tell anyone and sold all of this information to third-party marketers, or China or Russia, or, you know, Kyrgyzstan,
47
00:05:20.020 --> 00:05:28.369
Danny Gluch: maybe we can make a little more money. And all of a sudden that agreement you made with them doesn't mean anything, because it benefits them more to break the agreement.
48
00:05:28.840 --> 00:05:29.380
Marion: And
49
00:05:29.950 --> 00:05:31.760
Danny Gluch: How can you trust that?
50
00:05:32.350 --> 00:05:32.880
Cacha Dora: At.
51
00:05:33.940 --> 00:05:53.000
Cacha Dora: And then there's also the articles. There's so many of them spinning around right now because it's so current in the news cycle. But Deel's turning around and pointing the finger right back at Rippling and being like, well, you're not being good either, you're also doing bad things. And it's like, we're not doing as bad of a thing as you said we did,
52
00:05:53.190 --> 00:06:05.923
Cacha Dora: you're blowing this out of proportion. And when the dust settles, the truth... I think that's a huge part, right? Trust and truth, I think that's the theme here, with what's happening.
53
00:06:06.600 --> 00:06:17.230
Cacha Dora: I think it'll be very interesting, not just for the HR professionals who use both of these platforms, but just people in this space. It'll be very interesting to see where the truth really ends.
54
00:06:17.230 --> 00:06:26.930
Marion: Yeah, I mean, I have a lot of thoughts. I mean, you both know how I feel about most HR platforms.
55
00:06:27.460 --> 00:06:38.469
Marion: I have always had a bias towards the smaller and more agile platforms, like Rippling, like BambooHR, just because they give you a lot more
56
00:06:38.530 --> 00:06:59.389
Marion: flexibility in a lot of ways. Certainly, when you're a startup, right? And you need a 1 stop shop for things the further into the enterprise solutions that you go. That's when I start to lose the will to live because they're typically awful. And and they're slow and they're cumbersome, and they don't give you what you need, and if you, if
57
00:06:59.390 --> 00:07:01.329
Cacha Dora: You're constrained in their utilization
58
00:07:01.330 --> 00:07:09.739
Marion: Yeah. And to get what you need, you have to buy extra add-ons. And it's a bit of a con most of the time. So yeah,
59
00:07:10.940 --> 00:07:18.890
Marion: as a as a bystander, but as a you know, a potential customer for a future customer. For these types of platforms.
60
00:07:19.560 --> 00:07:25.200
Marion: I feel sick to my stomach because I'm such an advocate for this
61
00:07:25.430 --> 00:07:41.260
Marion: this type of company, this type of forward-thinking, progressive platform. So that's a really interesting thing. But the other thing I was thinking about, as I was getting caught up in all of this news, was:
62
00:07:41.460 --> 00:07:51.369
Marion: when that all broke, when that all came out, and I woke up as a Rippling employee or a Deel employee and saw all of that in the media...
63
00:07:51.370 --> 00:07:52.080
Cacha Dora: Hmm.
64
00:07:52.080 --> 00:07:52.870
Marion: Like.
65
00:07:53.390 --> 00:08:01.672
Marion: I mean, I don't know anything about their companies or their company cultures, right? And I've never worked there, and I don't know anyone that works there, I don't think, directly.
66
00:08:02.250 --> 00:08:09.290
Marion: But I can't imagine waking up and being like, wait, what? You know?
67
00:08:09.690 --> 00:08:12.450
Marion: Who am I working for? What just happened?
68
00:08:12.550 --> 00:08:19.500
Marion: And I think things like that, when you're selling trust, which is effectively what these platforms are doing, right?
69
00:08:19.500 --> 00:08:20.270
Marion: A breach of
70
00:08:20.270 --> 00:08:39.270
Marion: trust like that must really shake you to your core, especially if you're an employee who's very values-led and, you know, a good human being, frankly. So I cannot even imagine what those employees are going through.
71
00:08:39.750 --> 00:08:58.820
Danny Gluch: Yeah, I think that's a really important thing to think about. A lot of these, you know, especially the younger generation, you have people who listen to our podcast and read books that we like to read, who are very interested in progressing their careers in ways that align with their purpose in life.
72
00:08:59.540 --> 00:09:12.379
Danny Gluch: And so often people are like, I want to help workers, I want to make the work experience better, I want to work for a place like Rippling, because that's what they're doing, or Deel, and it looks like that's what they're doing.
73
00:09:12.660 --> 00:09:22.650
Danny Gluch: And then they wake up and go, oh my goodness, what a breach of trust between employee and organization. Yeah.
74
00:09:22.770 --> 00:09:36.519
Marion: I mean, the only thing I can really compare that to is, you know, I've worked for brands that are very values-led, now much bigger to be fair, but I think Apple is a great example, and, you know, Apple's not
75
00:09:36.810 --> 00:09:46.860
Marion: perfect, and it's had some things in the press and what have you, but what company hasn't, right? But I'm trying to imagine, if I was an Apple employee and something like this happened, how would I feel, I think.
76
00:09:46.860 --> 00:09:54.779
Cacha Dora: I think "lied to" would be the first thing that comes to mind. Betrayal. Kind of like a cheater in a relationship. Yeah.
77
00:09:54.910 --> 00:10:10.180
Marion: I'd feel very cheated and lost and confused and disappointed, lots of things. Now, disclaimer, Apple have not done that, I'm just trying to use an example that relates to me. But it would shake me to my very core, I think.
78
00:10:10.724 --> 00:10:18.090
Marion: Almost like, you know, if you're dating someone and you're totally in love with them, and then you find out they've cheated on you, and you're just completely...
79
00:10:18.240 --> 00:10:24.419
Marion: Someone's pulled the rug out from under you. I think that's how I'd feel, and I would probably be quite
80
00:10:25.398 --> 00:10:31.019
Marion: mistrusting as I go into future relationships or future employment engagements so
81
00:10:31.310 --> 00:10:33.970
Cacha Dora: Yeah, one would call that trauma-informed, Marion.
82
00:10:33.970 --> 00:10:39.269
Marion: Yeah, for sure. I mean, definitely, I suspect that there may be some people that are feeling like that.
83
00:10:39.450 --> 00:10:41.770
Marion: But yeah, I think you know.
84
00:10:42.360 --> 00:10:51.239
Marion: trust, leadership, transparency, all of these things, when it's broken internally...
85
00:10:51.420 --> 00:10:57.180
Marion: Let's dial it back to what we talk about all the time. Psychological safety, you know.
86
00:10:57.350 --> 00:11:23.350
Marion: I do a lot of thematic analysis across my own research, and across other kinds of work-related projects, and I always come back to three core things that I think are, time and time again, consistent in most organizations as the key source of a lot of issues: leadership, transparency and trust, psychological safety, and manager capability. We talk about these things all the time, right?
87
00:11:23.750 --> 00:11:29.070
Marion: And the number one thing there, leadership, transparency and trust. When that's
88
00:11:29.340 --> 00:11:37.770
Marion: broken, when that's shattered, not just chipped away like the psychological contract in the early stages, but actually shattered,
89
00:11:38.560 --> 00:11:40.069
Marion: You don't come back from that
90
00:11:41.140 --> 00:11:43.330
Cacha Dora: And I think on top of it, like.
91
00:11:43.860 --> 00:11:56.810
Cacha Dora: if I were an employee at either one of those companies, and I did have a huge feeling of pride in product, pride in the company, I was really an active part of the culture,
92
00:11:56.810 --> 00:11:57.190
Marion: Hmm.
93
00:11:57.190 --> 00:11:59.879
Cacha Dora: Right, like our internal promoters, right?
94
00:12:00.720 --> 00:12:05.129
Cacha Dora: And I think there's a segment of people who are trying to find out if
95
00:12:05.550 --> 00:12:14.480
Cacha Dora: and trying to ask themselves the questions of, you know, was it just this small subset of the organization that did this, or was this the organization as a whole?
96
00:12:15.000 --> 00:12:43.669
Cacha Dora: Do I still belong here? Can I still have a place? And they start trying to rationalize things, and that takes its own energy away from the work that you're actually doing on the job and the work that you're investing in your coworkers and your peers. You know, I really empathize with the employee base, not just the user base of all these platforms, who have a whole bunch of other trust-based questions that are coming up.
97
00:12:43.670 --> 00:12:48.339
Cacha Dora: Yeah. But it's not their fault that this happened.
98
00:12:48.340 --> 00:12:48.810
Marion: No.
99
00:12:48.810 --> 00:12:54.309
Cacha Dora: You know, but their companies are in the news, and that means their friends and family and coworkers, and...
100
00:12:54.650 --> 00:12:55.290
Cacha Dora: yeah.
101
00:12:55.530 --> 00:13:00.499
Danny Gluch: You know what it reminds me of? The FAA layoffs.
102
00:13:01.230 --> 00:13:09.910
Danny Gluch: Now, just the environment, the message, the presence of a threat of future layoffs
103
00:13:10.030 --> 00:13:21.989
Danny Gluch: impacted performance to a degree where there was a spike in aviation accidents and near misses. There was another one yesterday, another near miss, and it's one of those things that's like,
104
00:13:22.400 --> 00:13:42.159
Danny Gluch: they're creating an environment where all of a sudden it went from safe, and maybe, you know, there were some layoffs and it's like, okay, well, maybe that was a good business decision. But now, all of a sudden, they're questioning again. It's like, oh, now it's really clear that I just can't trust
105
00:13:42.380 --> 00:13:49.219
Danny Gluch: what's going on. I thought this was a happy good place to be, but it is now very clear that
106
00:13:49.500 --> 00:13:53.979
Danny Gluch: this is cutthroat, and my neck might be on the line
107
00:13:53.980 --> 00:13:54.320
Marion: Yeah.
108
00:13:54.320 --> 00:13:57.079
Danny Gluch: It's hard to perform well when that's the case.
109
00:13:57.670 --> 00:14:04.309
Marion: You called out a really important word a few minutes ago, belonging. And you know, in this age of
110
00:14:04.888 --> 00:14:31.220
Marion: diversity, equity, and inclusion becoming bad words, there's often a shift in terminology to using belonging, which to me is the same thing, right? Like, if I belong, then I'm in a place where I feel included. And for me, that's a place where not everyone's homogenized, but everyone's different and there is equity, right? So for me, that is belonging.
111
00:14:32.340 --> 00:14:52.229
Marion: And I've been in situations where I've worked in organizations and I've questioned, do I belong here? Do my values mirror the values of what I'm seeing around me, either at organizational level or at leadership level, or even just at sub-leadership level, right? Because
112
00:14:52.510 --> 00:15:10.240
Marion: my own feeling is that as I mature in age and in my profession, my values have got louder and louder and louder, and when I was younger I couldn't afford to be so values-driven, because, you know, you've got to pay bills and put food on the table. But
113
00:15:10.500 --> 00:15:22.070
Marion: you know, as you get older, you obviously have a little bit more flexibility with what you spend and how you earn, etcetera.
114
00:15:22.580 --> 00:15:32.089
Marion: I think values start to become so much more important in how you factor in decisions about where you're going to work and where you're going to put your energy.
115
00:15:32.230 --> 00:15:33.460
Marion: Because
116
00:15:33.860 --> 00:15:39.369
Marion: in this day and age, I think we give a hell of a lot of ourselves to our jobs.
117
00:15:39.870 --> 00:15:59.379
Marion: And especially in our jobs, you know, where you're holding space for people all day, every day, and it is exhausting. So I think it does make you question. For me to give so much of myself, it has to be to a company that reflects my values, and where I feel like I belong. So...
118
00:16:00.290 --> 00:16:02.890
Marion: Wow, that's a big one.
119
00:16:03.230 --> 00:16:05.330
Danny Gluch: And so what I... oh, I was gonna...
120
00:16:05.330 --> 00:16:05.869
Cacha Dora: Go ahead, Danny!
121
00:16:05.870 --> 00:16:12.399
Danny Gluch: Do you think that both sides of this, Rippling and Deel, their employees are feeling that right now?
122
00:16:12.600 --> 00:16:15.089
Danny Gluch: Or is it just on Deel's side?
123
00:16:15.090 --> 00:16:17.499
Cacha Dora: I think it's both. I think it's both.
124
00:16:17.650 --> 00:16:28.800
Cacha Dora: I think it's also kind of interesting, because before we clicked record on this, we were really thinking about the people who are using the platforms. We were talking a lot about them. But I think the focus,
125
00:16:28.920 --> 00:16:45.329
Cacha Dora: and I haven't heard the focus in a lot of these articles being on the employees of the companies, it's all been on the user base. So being able to highlight the employees, I think, is such a key component to what this feels like, because this kind of activity is not unheard of in the business world.
126
00:16:45.480 --> 00:16:46.460
Cacha Dora: Right?
127
00:16:46.862 --> 00:16:56.409
Cacha Dora: And no one signs up to work for... I mean, I'm not gonna say no one, clearly, but the majority of people, 99.9% of people, don't sign up to join a company
128
00:16:56.540 --> 00:16:58.009
Cacha Dora: where this is the behavior
129
00:16:58.450 --> 00:17:19.410
Marion: No, I mean, if I signed up to be a trader in the City of London or the New York Stock Exchange, and it's all frenzied, you know, buy, sell, buy, sell, cocaine, fast, fast, right? I.e., like the TV show Industry. That's one thing; that's an environment where you anticipate maybe some weird shit going on.
130
00:17:19.740 --> 00:17:29.589
Marion: But if I work in the people industry, if I work in, you know, human resources, technology, or L&D, or any of those areas,
131
00:17:30.220 --> 00:17:32.599
Marion: I'm not expecting this shit. I'm expecting
132
00:17:32.600 --> 00:17:33.829
Cacha Dora: No, not at all.
133
00:17:33.830 --> 00:17:39.070
Marion: People that are reflective of my values, and that want to do a good job and want to help people.
134
00:17:39.340 --> 00:17:49.720
Danny Gluch: But I think this brings up what was mentioned at the beginning. Does the HR sort of purpose and values get trumped
135
00:17:49.910 --> 00:17:59.219
Danny Gluch: by the startup sort of environment and values? The, hey, we're going fast, we're going to break stuff, we're going to bend the rules, we're...
136
00:17:59.350 --> 00:18:05.279
Danny Gluch: you know, all of that right that goes along with what it means to be a startup and scrappy and
137
00:18:05.280 --> 00:18:11.419
Marion: Just laughing at the use of the word trumped, because in both senses it made absolute sense to what you were
138
00:18:11.420 --> 00:18:18.240
Danny Gluch: That's true. I was using it in the card sense. I play cards, trump cards, that's the... but yes.
139
00:18:18.240 --> 00:18:20.809
Cacha Dora: No one can see it, but Marion and I both made faces.
140
00:18:22.580 --> 00:18:23.580
Marion: Amazing.
141
00:18:25.860 --> 00:18:30.570
Marion: Ugh, yes, yeah, I mean.
142
00:18:30.570 --> 00:18:31.160
Cacha Dora: Yeah.
143
00:18:31.390 --> 00:18:32.325
Marion: Yeah.
144
00:18:33.260 --> 00:18:53.480
Danny Gluch: I mean, look at what happened to Uber, right? When I was living in San Francisco, Lyft had their corporate headquarters just a couple blocks from our apartment, and Uber's wasn't far from there either, and the difference in cultures was so apparent, right? Lyft, Uber, so many different values, yada yada yada.
145
00:18:53.660 --> 00:19:01.589
Danny Gluch: And then Uber was like, oh no, we had these huge scandals. And it turned out that Lyft wasn't that much of a better place to work for.
146
00:19:01.590 --> 00:19:02.500
Marion: Hmm.
147
00:19:02.660 --> 00:19:11.969
Danny Gluch: They were just as competitive. They were just a little bit better at hiding it, at not being overtly sexist and racist, and...
148
00:19:12.090 --> 00:19:28.916
Danny Gluch: but it was still there, even with their whole, yay, we're buddies, we're just trying to help the community and offer these rides. They pitched themselves that way. But the startup corporate culture of it, you know, trying to,
149
00:19:30.110 --> 00:19:34.570
Danny Gluch: well, how do I say, solidify themselves as billion-dollar companies,
150
00:19:35.720 --> 00:19:44.910
Danny Gluch: created an environment that actually wasn't such a great place, and didn't really reflect the values that they espoused externally.
151
00:19:45.550 --> 00:19:46.370
Cacha Dora: I think it.
152
00:19:46.820 --> 00:19:51.790
Cacha Dora: I think there's a huge point in that startup culture is also hustle culture
153
00:19:51.790 --> 00:19:52.130
Marion: Oh, yeah.
154
00:19:52.130 --> 00:19:55.399
Cacha Dora: And hustle culture is always going to be toxic,
155
00:19:55.770 --> 00:19:59.700
Cacha Dora: because it doesn't take into account the humanity that's powering the machine
156
00:19:59.700 --> 00:20:26.039
Marion: Yeah. But then there's also a lot of culture washing and values washing, right? Where companies understand the value of talking about values and culture, and wash all of their stuff with that language, but it's as transparent as a window. It's, you know, fur coat, no knickers, right? It's all on the surface, and there's nothing happening underneath. That is exactly it.
157
00:20:26.040 --> 00:20:26.370
Cacha Dora: Yeah.
158
00:20:26.370 --> 00:20:32.959
Marion: And I think that happens. Now, again, I do not know if that's the case here at all, I do not know these companies well enough. But...
159
00:20:33.120 --> 00:20:42.080
Marion: Yeah, it's... oh, where are the real ethical companies? Where are they? The ones that
160
00:20:42.420 --> 00:20:47.240
Marion: still give a damn about their people and their customers authentically.
161
00:20:47.530 --> 00:20:49.200
Marion: and not just for a quick buck
162
00:20:50.190 --> 00:20:52.561
Cacha Dora: Put their names in the comments below
163
00:20:52.900 --> 00:20:56.080
Danny Gluch: You want to work for them, I'd work with them
164
00:20:56.760 --> 00:21:08.010
Cacha Dora: And hear and learn from them, turn them into case studies. But I think company culture is so important to the psychological safety of the employee, right?
165
00:21:08.130 --> 00:21:12.219
Cacha Dora: And they are so intermingled.
166
00:21:13.630 --> 00:21:21.320
Cacha Dora: But it is 2025, and we are talking about corporate espionage, guys.
167
00:21:22.640 --> 00:21:24.050
Marion: In the HR space!
168
00:21:24.820 --> 00:21:25.892
Cacha Dora: Like I am.
169
00:21:26.250 --> 00:21:29.300
Cacha Dora: I just gotta call that back. Like, what?
170
00:21:29.800 --> 00:21:41.459
Marion: Hmm. So let's talk about the honeypot, because obviously they set a trap, right? Slack, with a fake Slack account. They set this kind of honey trap type thing.
171
00:21:43.120 --> 00:21:54.950
Marion: I mean, I can't imagine how quickly and how much that would undo any
172
00:21:55.250 --> 00:22:14.932
Marion: work that had gone into building internal trust and psychological safety. And we struggle in HR enough as it is, let's face it. How many times have you had the "I promise the engagement survey is confidential" conversation? Like, a million, right?
173
00:22:15.400 --> 00:22:21.170
Marion: And so shit like this doesn't help the cause.
174
00:22:21.330 --> 00:22:21.880
Marion: Yeah.
175
00:22:21.880 --> 00:22:25.790
Danny Gluch: No, no, I think that's such a great callout. Like, how
176
00:22:26.270 --> 00:22:43.869
Danny Gluch: is there even a way that you could counter-message and help people trust that? Like, no, no, no, we only did this because we thought it was a real external plant, spy kind of thing. We're not going to be watching you. We're not going to be setting a trap for, you know,
177
00:22:44.120 --> 00:22:44.460
Cacha Dora: Yeah.
178
00:22:44.460 --> 00:22:47.000
Danny Gluch: You! And oh, boy.
179
00:22:47.530 --> 00:22:51.281
Cacha Dora: Yeah, 'cause you always hear,
180
00:22:52.430 --> 00:22:57.400
Cacha Dora: in articles, when they're writing about how
181
00:22:58.320 --> 00:23:18.050
Cacha Dora: IT and/or HR are really only in it for the business, that mindset of, they're there to protect the business, careful what you write, everyone, Big Brother, everyone's watching you. And so for people who do have that mentality, or have worked at companies where how many times they move the mouse counts, yeah.
182
00:23:18.220 --> 00:23:25.290
Cacha Dora: And if they're working in an environment like that, again, it goes right back to trust, right? Like, well, should I send this message to a coworker?
183
00:23:25.530 --> 00:23:25.920
Marion: Yeah.
184
00:23:25.920 --> 00:23:39.810
Cacha Dora: Should I just text them on the side? Well, now you're having questions that should be happening at work happening on a phone, because they don't trust doing it on the tools that the business provides. It really does drive right back to trust.
185
00:23:39.810 --> 00:23:40.810
Marion: Absolutely.
186
00:23:40.810 --> 00:23:44.760
Danny Gluch: It really does. It's the digitized panopticon.
187
00:23:44.880 --> 00:23:52.430
Danny Gluch: You're always being watched. Everything that you're doing on your company computer could be used against you.
188
00:23:52.920 --> 00:24:00.680
Danny Gluch: And so what do you do? Do you have any trust? Can you be authentic? It's just...
189
00:24:01.050 --> 00:24:11.629
Danny Gluch: Just the fact that they did this destroys any sense that people can be challengers, which is at the core of psychological safety, right?
190
00:24:11.740 --> 00:24:31.399
Danny Gluch: Or are they just going to wait for me to challenge something so that they can, you know, oh, we're just gonna do this on a performance thing, and now you're gone. Or bring it up in a meeting and find a way to dismiss you because you're not rowing in the right direction. Like,
191
00:24:32.660 --> 00:24:36.560
Danny Gluch: boy, this sets a really scary precedent for everyone who works there.
192
00:24:37.250 --> 00:24:50.729
Cacha Dora: And the managers, those poor people managers, who are gonna be thrown around like trying to catch a salmon upriver or downriver, whatever the analogy is. Ignore me, I'm tired. I've had no coffee today, guys.
193
00:24:50.730 --> 00:24:53.479
Danny Gluch: Here's Fisher, Fisherman, are you
194
00:24:53.480 --> 00:24:58.081
Cacha Dora: Doing great. No, make me a nature duty, that is you.
195
00:24:59.440 --> 00:25:08.260
Cacha Dora: You're the nature guy. But I do think that these poor managers are just gonna get slapped around with these trust questions that they are questioning
196
00:25:08.930 --> 00:25:12.879
Cacha Dora: themselves, and they're probably being given guidance, I would hope.
197
00:25:13.450 --> 00:25:22.579
Cacha Dora: But these managers aren't just the people who are working at Deel and at Rippling. It's also the managers of all of their clients.
198
00:25:22.580 --> 00:25:25.000
Marion: Hmm, yeah.
199
00:25:25.000 --> 00:25:32.530
Cacha Dora: Like I feel so bad for managers just across the board in this particular situation, because it doesn't matter where you're working.
200
00:25:32.880 --> 00:25:47.909
Cacha Dora: The question of trust is gonna happen everywhere right now. What about my data? What about this? What about that? And, you know, are you guys gonna change platforms? Oh, I do not envy their plight right now.
201
00:25:47.910 --> 00:26:01.999
Marion: No, and I think it's compounded by a lot of other things. A question I ask a lot of people right now, whether it's through my PhD research or, you know, external consults, or whatever it is:
202
00:26:02.080 --> 00:26:18.079
Marion: I ask them how much the external environment is impacting their internal environment, because that's significant, right? And when I think about all of this that's going on, and then I think about the compounding of
203
00:26:18.310 --> 00:26:28.519
Marion: what we're feeling right now, with all of the federal layoffs and DOGE and all the other shit that's happening, which I think is really heightening
204
00:26:30.030 --> 00:26:48.120
Marion: that anxiety and fear over being constantly under surveillance. Not to mention, did my company have an RTO mandate, and am I now being policed as to whether I'm sat at my desk or not? All of that on top of this...
205
00:26:48.460 --> 00:26:54.989
Marion: I feel like we've just gone back a hundred years.
206
00:26:55.220 --> 00:26:56.770
Marion: You know, I'm
207
00:26:57.530 --> 00:27:09.879
Marion: I'm kind of lost for words as to how significant this moment is in employee psychological safety and employee voice and employee experience.
208
00:27:10.390 --> 00:27:12.239
Marion: I can't imagine
209
00:27:12.400 --> 00:27:23.980
Marion: working in an environment which is already fairly old school and now believes it has a license to go back to the Victorian era because of what's happening
210
00:27:23.980 --> 00:27:30.510
Danny Gluch: That's what's scary. It's the old school mentality with modern weapons of war.
211
00:27:30.750 --> 00:27:46.440
Danny Gluch: Right? With the actual ability to measure keystrokes and mouse movements, and your proximity around the building with tags and stuff. They used to not have that. They used to have a punch card that you would punch in, and that's what they'd be able to track.
212
00:27:46.560 --> 00:27:47.110
Marion: Like.
213
00:27:47.580 --> 00:27:49.430
Danny Gluch: Yeah, it's so different now.
214
00:27:49.430 --> 00:27:54.560
Marion: The most disturbing one I've heard is a heat sensor under your desk. That's disturbing.
215
00:27:55.010 --> 00:27:56.260
Danny Gluch: What?
216
00:27:57.380 --> 00:27:58.600
Cacha Dora: Oh!
217
00:27:58.600 --> 00:28:07.310
Marion: So it's icky. All of this is really icky. And yeah, trust works both ways.
218
00:28:07.680 --> 00:28:08.240
Marion: And I
219
00:28:08.240 --> 00:28:08.740
Cacha Dora: Really.
220
00:28:08.740 --> 00:28:14.449
Marion: And I think that you know employees
221
00:28:14.610 --> 00:28:17.319
Marion: are now not just dealing with
222
00:28:17.640 --> 00:28:25.580
Marion: salary inequity, or, you know, inclusion, but they're now dealing with this crap as well
223
00:28:25.760 --> 00:28:26.460
Cacha Dora: Yeah.
224
00:28:26.790 --> 00:28:28.220
Danny Gluch: So I've got a question.
225
00:28:29.820 --> 00:28:30.255
Cacha Dora: Okay.
226
00:28:31.200 --> 00:28:37.189
Danny Gluch: They were clearly justified in setting this honeypot and really trying to trap this person
227
00:28:37.410 --> 00:28:38.100
Marion: Of course.
228
00:28:39.940 --> 00:28:44.690
Danny Gluch: Should they have done it? Should they not have done it? What were their alternatives?
229
00:28:44.690 --> 00:28:47.768
Cacha Dora: Well, it's not like they could send a memo out warning the company they were gonna do it
230
00:28:47.940 --> 00:28:50.819
Marion: Yeah, I mean on their typewriter
231
00:28:50.820 --> 00:28:51.250
Danny Gluch: Thank you.
232
00:28:51.250 --> 00:28:59.740
Danny Gluch: No, like, you had to be close-lipped. This had to be, like, no one could know. You couldn't be like, hey, everyone that we still do trust, we're...
233
00:28:59.740 --> 00:29:00.250
Cacha Dora: Yeah.
234
00:29:00.250 --> 00:29:07.339
Danny Gluch: this one person, but don't worry, we will never do this to you. Like, what were they supposed to do? Honestly, I don't know.
235
00:29:07.340 --> 00:29:21.469
Marion: I don't know enough about how that would play out, I genuinely don't, so I don't feel I can comment on that much. But for decisions of this nature to be made, lawyers
236
00:29:21.610 --> 00:29:28.970
Marion: should have been involved. I don't know if they were, right? But if someone went rogue and did crap like this
237
00:29:29.970 --> 00:29:33.050
Marion: without any legal advice.
238
00:29:33.260 --> 00:29:34.865
Marion: God help them!
239
00:29:35.400 --> 00:29:44.979
Cacha Dora: Yeah. And Marion, to your point, there's so much that's going on in this particular story, right?
240
00:29:45.350 --> 00:30:06.969
Cacha Dora: It also involves everything from, like, your data privacy... a lot of these large businesses, they have a whole data privacy division, they've got a cybersecurity division and department, they have all these other departments that really work in tandem to help protect and help set these guardrails in place. So,
241
00:30:07.130 --> 00:30:08.050
Cacha Dora: like
242
00:30:09.100 --> 00:30:15.052
Cacha Dora: it's not just HR that's scratching their heads over this, I'm sure. In any other place,
243
00:30:15.730 --> 00:30:22.909
Cacha Dora: people in cybersecurity and people in data protection are probably going, what happened to get to that point?
244
00:30:22.910 --> 00:30:23.510
Marion: Yeah.
245
00:30:23.510 --> 00:30:24.110
Cacha Dora: Yeah. And
246
00:30:24.110 --> 00:30:29.479
Cacha Dora: and there's a lot that we're probably not gonna know, that'll only be really disclosed in the courts and stuff like that.
247
00:30:30.160 --> 00:30:31.080
Marion: It's need-to-know.
248
00:30:31.080 --> 00:30:39.010
Cacha Dora: In the movie. Yeah, which will be a version of, like, 12 truths. But...
249
00:30:40.050 --> 00:30:51.060
Cacha Dora: there's so many factors that really tie into this. It's not just. Hr, it's not just the employees like it. There's so many things like when you really start thinking about it. The factors at play like the variables
250
00:30:51.060 --> 00:30:51.690
Marion: Hmm.
251
00:30:51.690 --> 00:30:53.539
Cacha Dora: Oh, my God! The variables.
252
00:30:53.540 --> 00:30:54.620
Marion: Hmm, hmm.
253
00:30:54.620 --> 00:30:59.630
Danny Gluch: Yeah. And one of the variables I bet they didn't actually consider is,
254
00:31:00.070 --> 00:31:06.650
Danny Gluch: how is this going to affect the trust and psychological safety of the rest of our employees
255
00:31:06.960 --> 00:31:11.089
Danny Gluch: after we do this to someone who was an employee?
256
00:31:11.090 --> 00:31:11.790
Marion: Yeah.
257
00:31:11.790 --> 00:31:17.259
Danny Gluch: Right? Like, I bet they didn't ask that question, and if they did...
258
00:31:17.920 --> 00:31:22.289
Danny Gluch: I have no idea what alternative action they could have come up with.
259
00:31:22.290 --> 00:31:26.000
Marion: Yeah, I mean, we don't know, and we'll never know. And we're
260
00:31:26.000 --> 00:31:26.330
Cacha Dora: Right.
261
00:31:26.330 --> 00:31:38.160
Marion: really speculating, right? So I don't know. All I do know is that if I was an employee there and that happened around me,
262
00:31:38.700 --> 00:31:43.350
Marion: I'd already be looking for my next role. I'd be out the door. I wouldn't be...
263
00:31:43.350 --> 00:31:45.280
Danny Gluch: Yeah, and that
264
00:31:45.280 --> 00:31:52.369
Danny Gluch: that's just as much of a I'm looking to leave as if I was on the accused side and worked for for a deal
265
00:31:52.960 --> 00:31:58.370
Danny Gluch: Right. I would be like, it doesn't matter. If I were with either of those companies, I would be looking to leave
266
00:31:58.550 --> 00:31:59.570
Danny Gluch: ASAP.
267
00:32:00.090 --> 00:32:08.220
Cacha Dora: Yeah, I feel for them. Because you were both saying that, and I was like, God, I would not want to be in a town hall after this at either of those companies.
268
00:32:08.220 --> 00:32:09.750
Danny Gluch: I would love to be
269
00:32:10.130 --> 00:32:17.229
Cacha Dora: I don't know. Speaking of anxiety, just the idea of it made me internally cringe.
270
00:32:17.510 --> 00:32:22.090
Cacha Dora: Talk about being able to hear a pin drop, potentially, which would probably be over Zoom. But that's not the problem.
271
00:32:22.090 --> 00:32:27.940
Danny Gluch: I would love to ask the Rippling CEO, how can you guarantee you're not going to honeypot me?
272
00:32:28.690 --> 00:32:29.470
Marion: Yeah.
273
00:32:29.470 --> 00:32:30.719
Danny Gluch: I would love to ask that
274
00:32:31.700 --> 00:32:32.500
Marion: Yeah.
275
00:32:33.030 --> 00:32:34.659
Danny Gluch: I want assurances.
276
00:32:35.250 --> 00:32:40.349
Marion: Yeah, I know. And alright, Whoa.
277
00:32:40.350 --> 00:32:41.710
Cacha Dora: So complicated. It's so
278
00:32:41.710 --> 00:32:48.410
Marion: It really is. I can't really fathom how I feel about it, because it's
279
00:32:48.690 --> 00:32:50.400
Marion: it's not clear cut at all.
280
00:32:50.672 --> 00:32:57.210
Cacha Dora: And we don't have enough information, right? To quote another Danny that we all very much enjoy: it all depends, right?
281
00:32:57.210 --> 00:32:57.870
Marion: Perfect.
282
00:32:58.720 --> 00:32:59.970
Danny Gluch: Future episode.
283
00:32:59.970 --> 00:33:26.219
Cacha Dora: Yes, plug it in now, friends. But like I said, there's so many variables, there's so much we don't know. Like, how long was this going on before they made that decision, right? Was there no recourse for them? Have they already tried things that we don't know about? You know what I mean? There's so much of that that, again, we just don't know. All we're seeing is the very end of the tale.
284
00:33:27.580 --> 00:33:30.500
Marion: Yeah, absolutely. Oh, it's so juicy.
285
00:33:31.190 --> 00:33:48.555
Cacha Dora: It is, and there's so many articles about it, so many speculative articles, people trying to get things and quotes. But obviously neither company is gonna really do a huge amount of commentary on it, both going into legal proceedings. So it really is just this juicy WTF, oh my gosh, wow!
286
00:33:48.840 --> 00:33:49.460
Marion: Your holiday.
287
00:33:49.460 --> 00:33:51.060
Cacha Dora: Every three-letter acronym you can think of, guys.
288
00:33:51.060 --> 00:34:13.939
Marion: Yeah. And it happened right on the eve of Transform. The first day of Transform, I think, is when the news broke, or maybe the day before, I'm not sure. But everyone turns up, and someone posted a great photo on LinkedIn in our community where, at Transform, you've got the Rippling booth, and you've got Deel in the background, and I was like, oh, no.
289
00:34:14.530 --> 00:34:18.130
Cacha Dora: That had to have been tense, tense, and awkward.
290
00:34:18.480 --> 00:34:20.190
Cacha Dora: Yeah, good.
291
00:34:21.270 --> 00:34:24.310
Cacha Dora: The wait and see. The wait and see will be
292
00:34:24.310 --> 00:34:28.689
Danny Gluch: I'm really curious to see if Deel was doing similar things with other partners,
293
00:34:30.060 --> 00:34:39.140
Danny Gluch: and and how the industry is going to treat them. Are they going to treat them like lepers where they're just like, look, every contract is now in breach, and we're going to try to get out.
294
00:34:39.300 --> 00:34:41.840
Danny Gluch: Oh, they'll be out of business. They're not.
295
00:34:41.840 --> 00:34:55.089
Danny Gluch: I'm really curious to see where it goes from here. But I'm also really interested to see what you all think we can learn, as employees, as HR professionals, as organizational leaders. What can we learn from this?
296
00:34:55.770 --> 00:35:03.080
Marion: I'm going to go back to my transparency and trust comment. That has to be
297
00:35:03.930 --> 00:35:12.880
Marion: absolutely paramount for every leader, transparency and trust, transparency and trust, because when there are security breaches
298
00:35:14.210 --> 00:35:17.799
Marion: and it's covered up and not really, kind of,
299
00:35:19.070 --> 00:35:25.110
Marion: shared or discussed or acknowledged, and then people get the ick...
300
00:35:25.330 --> 00:35:33.649
Marion: Yeah, that just decimates psychological safety and culture in one go. At least if you screw up and you own it,
301
00:35:34.540 --> 00:35:43.400
Marion: you're showing integrity, vulnerability. These are excellent leadership qualities. It might suck in the short term, right? But, you know, people...
302
00:35:43.400 --> 00:35:45.830
Cacha Dora: What you build in the long term is massive
303
00:35:46.050 --> 00:36:03.839
Marion: Employees will respect you so much more than when you're trying to be shifty and hide things under the rug. And no, I don't want to work for a leader like that. I'll take someone who can own their shit all day, every day. But someone who
304
00:36:04.050 --> 00:36:08.300
Marion: is not transparent, and hides this type of thing?
305
00:36:09.192 --> 00:36:10.130
Marion: Not for me.
306
00:36:11.960 --> 00:36:13.369
Cacha Dora: Yeah, I think
307
00:36:14.400 --> 00:36:19.689
Cacha Dora: I think in a lot of ways, leaders can learn from this in the "what not to do" category.
308
00:36:19.690 --> 00:36:20.270
Marion: Okay.
309
00:36:20.780 --> 00:36:22.280
Cacha Dora: Right, and
310
00:36:22.920 --> 00:36:34.040
Cacha Dora: the need for communication will be paramount, for their direct reports, for their colleagues, and being able to lean into
311
00:36:34.350 --> 00:36:37.289
Cacha Dora: that transparency of communication right now.
312
00:36:37.610 --> 00:36:44.713
Cacha Dora: Because it is a giant unknown, right? And people are uncomfortable in an unknown. Ambiguity is not something that comes naturally to people.
313
00:36:45.633 --> 00:36:53.669
Cacha Dora: Dealing with change typically does not come naturally to people. And so when they're in that kind of uncertainty,
314
00:36:53.990 --> 00:36:56.239
Cacha Dora: what managers can learn is how to listen,
315
00:36:57.950 --> 00:37:02.330
Cacha Dora: and maybe not to provide a solution, because you're not gonna have one today, you're not gonna have one tomorrow.
316
00:37:03.390 --> 00:37:05.880
Cacha Dora: So listen, be present for your people,
317
00:37:07.090 --> 00:37:15.159
Cacha Dora: and be a voice to actually then work through your leadership, too. What can managers learn? How to speak to their superiors.
318
00:37:15.380 --> 00:37:22.400
Cacha Dora: I think it's a good growth opportunity, truthfully, for leaders, in how to navigate uncertainty.
319
00:37:23.710 --> 00:37:26.469
Danny Gluch: Yeah. Oh, boy, there's a lot of uncertainty
320
00:37:26.470 --> 00:37:30.400
Cacha Dora: Right? Yeah. 'Cause, I mean, if the public is speculative...
321
00:37:31.310 --> 00:37:38.749
Danny Gluch: Imagine what the employee base is feeling, and the conversations they're having, and what managers have to deal with. Yeah, that's a...
322
00:37:39.370 --> 00:37:41.876
Danny Gluch: Boy, do I feel for them right now.
323
00:37:42.330 --> 00:37:45.149
Danny Gluch: I'm really curious to see
324
00:37:45.310 --> 00:37:56.961
Danny Gluch: over the next couple months, how leadership, especially at Rippling, handles this. What messaging they do to try to repair the trust with their own employees.
325
00:37:58.040 --> 00:38:17.259
Danny Gluch: What policies they might put into place. You know, I'm always curious, like when you see a sign on a building, like, hey, don't grab this handle. It's like, well, someone grabbed that handle and something went bad, so now there's a sign for it. I'm really curious to see what policies, like, do they have any sort of ad hoc,
326
00:38:17.260 --> 00:38:39.439
Danny Gluch: reactionary policies of like who they do business with, or how they they go about hiring and vetting and things like that. I'm really curious to see if they're reactionary, or if they're like, you know what we actually did have a good process. This was a 1 off, of whatever reason, you know, that they found that we we actually don't have to make any big changes
327
00:38:39.650 --> 00:38:50.730
Marion: As part of the investigation process, that's a given, right? You're gonna dig into points of failure. What went wrong? What can be improved? And there will be things that can be improved. There always are, right?
328
00:38:50.730 --> 00:38:51.470
Danny Gluch: Yeah.
329
00:38:51.630 --> 00:39:01.619
Marion: It's never perfect. But I think the bigger issue is when, maybe, and again, I'm speculating and do not know if this is the case, so I'm speaking generally, but if
330
00:39:01.760 --> 00:39:08.870
Marion: people are aware that there's issues, but either don't fix them or can't fix them, and then shit happens. Now
331
00:39:08.870 --> 00:39:11.390
Cacha Dora: Or they don't feel safe to speak up in the first place.
332
00:39:11.390 --> 00:39:16.580
Marion: Yes, exactly. The psychological safety doesn't exist. That's the bigger issue.
333
00:39:16.730 --> 00:39:21.570
Marion: And yeah, that's a whole different kettle of fish. So...
334
00:39:23.720 --> 00:39:28.070
Danny Gluch: Yeah, I think the ability to listen to those employees, of, like...
335
00:39:28.730 --> 00:39:32.639
Danny Gluch: you know. I wonder if people were saying like, Hey, this guy's a little shady
336
00:39:33.380 --> 00:39:35.260
Danny Gluch: You know, and... oh!
337
00:39:35.260 --> 00:39:40.289
Cacha Dora: Some weird, off-color questions, like, I don't understand, that's out of context. Yeah, yeah.
338
00:39:41.760 --> 00:39:42.780
Marion: Who knows?
339
00:39:42.890 --> 00:39:44.949
Cacha Dora: We might never know
340
00:39:45.070 --> 00:39:45.890
Marion: Yeah.
341
00:39:46.090 --> 00:39:47.330
Danny Gluch: Until the movie comes out
342
00:39:47.330 --> 00:39:54.359
Cacha Dora: Until the movie comes out. But Marion, I do think you bring up a good point, right? When you're investigating points of failure,
343
00:39:54.550 --> 00:40:01.179
Cacha Dora: how often do organizations as a whole ask the question: did people feel safe enough to speak up in the first place?
344
00:40:01.550 --> 00:40:04.910
Marion: Yeah, how often do businesses ask that question? I know
345
00:40:05.020 --> 00:40:05.780
Cacha Dora: Yeah.
346
00:40:06.350 --> 00:40:10.760
Marion: Not enough. So...
347
00:40:10.760 --> 00:40:14.189
Danny Gluch: Yeah. And I mean, part of it is, do they?
348
00:40:14.360 --> 00:40:29.549
Danny Gluch: Do they respond to results, or do they respond to process? Right? Like, oh, if something resulted in this, let's do a patchwork change of policy here. Or do we look at the process, find the failures in the process, and fix that? I've
349
00:40:29.660 --> 00:40:35.874
Danny Gluch: been big on scene you know, especially with a whole signal gate also going on right now.
350
00:40:36.220 --> 00:40:38.059
Marion: Oh, God! Oh, God! That's
351
00:40:38.060 --> 00:40:39.449
Cacha Dora: A whole other conversation for another day.
352
00:40:39.450 --> 00:40:44.899
Danny Gluch: It is a different conversation. But it's also, you know, kind of similar, of,
353
00:40:45.020 --> 00:40:52.239
Danny Gluch: a lot of people aren't worried about it because it didn't result in any really horrific thing happening as opposed to.
354
00:40:52.480 --> 00:41:04.830
Danny Gluch: you know, her emails Benghazi stuff where the process was was fine. But something tragic did happen. And it's because the results were bad that they're like, Oh, no, we have to do investigations and changes
355
00:41:04.830 --> 00:41:05.640
Marion: Yeah.
356
00:41:05.931 --> 00:41:16.999
Danny Gluch: And I really wonder if they're gonna look at the process, or just be like, the result was this, so we need to respond to the result. And I don't think that's good long-term business.
357
00:41:17.000 --> 00:41:22.299
Marion: And that comes down to integrity and leadership, transparency and trust, and
358
00:41:22.300 --> 00:41:22.870
Danny Gluch: Yeah.
359
00:41:23.140 --> 00:41:24.220
Marion: All the things
360
00:41:25.500 --> 00:41:31.730
Danny Gluch: It really does. All right, any final thoughts, ladies?
361
00:41:33.360 --> 00:41:34.410
Cacha Dora: It all depends
362
00:41:34.770 --> 00:41:35.350
Danny Gluch: Yeah.
363
00:41:35.910 --> 00:41:44.940
Cacha Dora: Kidding. No, I will say, I think the thing that's really in my head is the wait-and-see. I'm so curious to see what's really going to happen. And
364
00:41:45.070 --> 00:41:50.590
Cacha Dora: in the the weird part with new cycles is things always tend to fall off and then pop back up again.
365
00:41:50.590 --> 00:41:50.990
Marion: Hmm.
366
00:41:50.990 --> 00:41:57.417
Cacha Dora: So like something else, and if anything this year has taught us is that the new cycle always has something very interesting.
367
00:41:58.257 --> 00:42:01.130
Cacha Dora: Whether or not we like it is a different conversation.
368
00:42:02.569 --> 00:42:05.809
Cacha Dora: I think with the wait-and-see, this might be a slow burn
369
00:42:06.030 --> 00:42:08.929
Cacha Dora: till we really learn what truly happened.
370
00:42:09.660 --> 00:42:12.869
Cacha Dora: And that means that what we learn might take a long time
371
00:42:13.250 --> 00:42:13.930
Marion: Hmm!
372
00:42:14.180 --> 00:42:22.430
Marion: Well, all I'm gonna say is, hopefully, this doesn't cause a rippling effect across the industry.
373
00:42:23.680 --> 00:42:25.060
Danny Gluch: That's so good
374
00:42:25.060 --> 00:42:30.949
Cacha Dora: Oh, you were holding that. You were holding that. Ten out of ten, Marion.
375
00:42:30.950 --> 00:42:33.289
Danny Gluch: Oh, we have an episode title.
376
00:42:37.183 --> 00:42:41.839
Cacha Dora: I'm so glad you went after me. I couldn't have followed that.
377
00:42:43.160 --> 00:42:46.920
Danny Gluch: Be sure to like and subscribe for more jokes like that.
378
00:42:48.970 --> 00:42:50.429
Marion: We're here all week. Tip your server!
379
00:42:51.830 --> 00:42:58.210
Danny Gluch: Please leave a five-star review, and your favorite dad joke as a comment.
380
00:42:59.880 --> 00:43:03.421
Danny Gluch: My Brendan's favorite dad joke:
381
00:43:04.600 --> 00:43:11.439
Danny Gluch: Why can't you hear a pterodactyl go to the bathroom? Because the P is silent.
382
00:43:13.160 --> 00:43:14.140
Cacha Dora: Oh, my God!
383
00:43:14.140 --> 00:43:17.410
Marion: And on that note, everyone, we'll see you next time!
384
00:43:17.890 --> 00:43:18.900
Danny Gluch: Bye.
385
00:43:18.900 --> 00:43:19.570
Cacha Dora: Bye.