The Elephant in the Org

Cute Survey. What Are You Going to Do About It? with Jeffrey Fermin

The Fearless PX Season 3 Episode 13

Send us a text

S3E13 — Cute Survey. What Are You Going to Do About It?
with Jeffrey Fermin (Host of People First, tech founder, advisor)

If your org runs surveys but never closes the loop, people don’t just get annoyed — they stop telling the truth. Silence becomes the culture.

In this episode, we get brutally practical about what happens after the survey:

  • Why surveys often become “listening theatre”
  • How dashboards replace decisions
  • The trust damage caused by silence (and how fast it spreads)
  • A simple five-step Close-the-Loop Ladder you can steal and use immediately:
    Acknowledge → Prioritise → Commit → Explain “Not Yet” → Report Back

If you’ve ever read survey results and thought, “Cool… now what?” — this is your playbook episode.

Link to Show Notes

🔹 Jeffrey Fermin: https://www.linkedin.com/in/jfermin/

📩 Got a hot take or a workplace horror story? Email us at elephant@thefearlesspx.com

🚀 Your Hosts on LinkedIn

💬 Like what you hear?
Follow/subscribe so you don’t miss an episode — and if this one hit home, leave a ★★★★★ review to help more people find the show.

🎙️ About the Show
The Elephant in the Org drops new episodes every two weeks starting April 2024 — fearless conversations about leadership, psychological safety, and the future of work.

🎵 Music & Production Credits
🎶 Opening and closing theme music by The Toros
🎙️ Produced by The Fearless PX
✂️ Edited by Marion Anderson

⚠️ Disclaimer
The views and opinions expressed in this podcast are those of the hosts and guests, and do not necessarily reflect any affiliated organizations' official policy or position.

Topics: employee surveys, listening culture, trust, people analytics, psychological safety, em...

S3 EP13 - Transcript Release 1/7/25


Cute Survey. What Are You Going to Do About It? — with Jeffrey Fermin


1

00:00:02.610 --> 00:00:08.650

Danny Gluch: Welcome back to The Elephant in the Org, everyone. I'm Danny Gluch, and I'm joined, as always, by my co-host, Marion Anderson.


2

00:00:09.720 --> 00:00:13.690

Marion: Really? Me? Hi, I'm so honoured. Nice to be here.


3

00:00:14.300 --> 00:00:15.890

Danny Gluch: And Cacha Dora.


4

00:00:15.890 --> 00:00:19.100

Cacha Dora: Hello! And Marion's the problem today.


5

00:00:19.100 --> 00:00:21.350

Danny Gluch: She, she is.


6

00:00:21.750 --> 00:00:28.100

Danny Gluch: But we've got a guest to help handle the problem. Jeffrey Fermin. Jeffrey, say hi to our guests.


7

00:00:28.310 --> 00:00:35.950

Jeffrey Fermin: Hey everybody, excited to be here. Judging… man, I wish we could have recorded the first, like, 20 minutes of us just…


8

00:00:36.320 --> 00:00:45.629

Jeffrey Fermin: That whole exchange was terrific. That should be the podcast. I don't know what we're doing today, but, like, can we go back and find a way to, like, get that footage and audio and bring it here?


9

00:00:45.890 --> 00:00:49.820

Danny Gluch: I'm sure Zoom was… Zoom was secretly recording, I'm sure.


10

00:00:49.820 --> 00:00:50.859

Jeffrey Fermin: I hope not to make us…


11

00:00:50.860 --> 00:00:51.470

Marion: Cause…


12

00:00:51.700 --> 00:00:53.290

Cacha Dora: Just sharing 80s tunes.


13

00:00:56.180 --> 00:00:56.899

Danny Gluch: We're gonna keep it to 80s.


14

00:00:56.900 --> 00:00:59.069

Cacha Dora: tunes. It's just gonna be nice and clean and.


15

00:00:59.070 --> 00:01:03.450

Jeffrey Fermin: Do I have to call… do I have to call Danny Papa? Is that a thing for this podcast?


16

00:01:03.450 --> 00:01:04.369

Marion: You gotta call him Popeye.


17

00:01:05.730 --> 00:01:08.539

Danny Gluch: Oh, no. I'm not gonna hold it together.


18

00:01:08.540 --> 00:01:10.410

Cacha Dora: Look what you did!


19

00:01:10.410 --> 00:01:11.789

Jeffrey Fermin: I made him break.


20

00:01:12.750 --> 00:01:14.610

Danny Gluch: It's one minute in.


21

00:01:14.740 --> 00:01:17.330

Danny Gluch: Oh, that's a… that's a good sign.


22

00:01:17.330 --> 00:01:18.510

Cacha Dora: of tidings.


23

00:01:18.510 --> 00:01:32.689

Danny Gluch: Well, we actually have a really, really interesting elephant in the org today. The fact that organizations have more data and more feedback from their employees than ever, but seemingly less action.


24

00:01:32.740 --> 00:01:50.829

Danny Gluch: And I love the idea, Jeff, that you brought up about performative listening, and how it's just such a big problem. So, we're gonna dig in, and you have a particular expertise. You've been working in HR tech for, would you say, 13, 14 years now?


25

00:01:50.830 --> 00:01:51.890

Jeffrey Fermin: It's up there.


26

00:01:52.090 --> 00:01:57.879

Danny Gluch: Yeah, so tell us how all of that history has brought you to…


27

00:01:58.050 --> 00:02:00.779

Danny Gluch: Wanting to talk about this elephant with us today.


28

00:02:01.170 --> 00:02:05.020

Jeffrey Fermin: Yeah, I… Yeah, early on, I started, actually, I…


29

00:02:05.560 --> 00:02:15.960

Jeffrey Fermin: I won't say I invented it, but we came up with one of the first people analytics platforms at Officevibe. Essentially, we were gathering information through pulse surveys, and every single week, we'd


30

00:02:16.260 --> 00:02:20.320

Jeffrey Fermin: Ask all these questions, and have all this data, and, you know.


31

00:02:20.780 --> 00:02:40.189

Jeffrey Fermin: these very impressive dashboards that would just showcase all this stuff. So I thought very early on in my career, especially in the era of big data, right, 2011, everyone wanted big data, it was the big buzzword, big keyword, and, you know, companies were investing heavily in finding out, like, all these issues, came up with this, this new concept.


32

00:02:40.190 --> 00:02:42.650

Jeffrey Fermin: And it worked. It was very effective.


33

00:02:42.650 --> 00:02:48.550

Jeffrey Fermin: At collecting data. Now, that's… that's the first part, right? But…


34

00:02:48.900 --> 00:03:13.019

Jeffrey Fermin: we really had to nudge people, not even as, like, within our product, but people that were using us, they would call us, like, our customer success was essentially HR, I guess, consulting. So we'd be like, oh, okay, well, we saw that we have a 4.0 in wellness, what should we do? And then we'd actually have to write blogs, write content, and then, you know, give them a little bit of knowledge on the subject, and then say, like, oh, you might want to


35

00:03:13.020 --> 00:03:30.079

Jeffrey Fermin: invest in XYZ Corporate Wellness Initiative and do this, and your numbers in Officevibe will improve. So, I don't know. I learned very early on that this problem that still persists to this day, you know, it's one that still hasn't really been solved, and I don't know.


36

00:03:30.080 --> 00:03:43.260

Jeffrey Fermin: I want to say it's gotten even worse, right? Like, we have all these systems spread out, giving us all this information, all these numbers, and, you know, we're supposed to kind of just look at it and make our decisions. So, I think that's the big elephant in the room, right?


37

00:03:43.260 --> 00:03:43.620

Danny Gluch: often.


38

00:03:43.620 --> 00:03:57.609

Jeffrey Fermin: or we have all the data, but what's going to be the next step, and what are we going to actually do to take action to make workplaces better? So, yeah, my rant's done. That's just why I've been so interested in this topic for so long, but yeah.


39

00:03:57.740 --> 00:03:59.120

Jeffrey Fermin: I now yield the floor.


40

00:03:59.300 --> 00:03:59.910

Danny Gluch: Sweet.


41

00:03:59.910 --> 00:04:00.270

Marion: Too many.


42

00:04:00.750 --> 00:04:04.929

Marion: So many thoughts, so many thoughts.


43

00:04:06.680 --> 00:04:21.510

Danny Gluch: But before you guys get going, I wanna… I want everyone's thoughts. Do you think that this is just over, like, paralysis by analysis? They've got too much information? Is it a lack of skill? What, like, a lack of…


44

00:04:22.730 --> 00:04:29.149

Danny Gluch: experience knowing what to do to get the outputs that they want? Like, what do you think is the core issue?


45

00:04:29.190 --> 00:04:31.489

Cacha Dora: I think it's death by politics.


46

00:04:32.000 --> 00:04:33.340

Jeffrey Fermin: Yes. Sorry.


47

00:04:33.930 --> 00:04:35.540

Cacha Dora: I think…


48

00:04:35.540 --> 00:04:36.719

Marion: on that, Cacha.


49

00:04:36.720 --> 00:04:40.270

Cacha Dora: Well, let me, let me double-click on that for you.


50

00:04:40.270 --> 00:04:44.690

Danny Gluch: No! You're in the corner! You're the double-clicking corner.


51

00:04:45.590 --> 00:04:46.780

Marion: Well, Tony was gonna be…


52

00:04:46.780 --> 00:04:48.309

Cacha Dora: I wasn't kidding.


53

00:04:48.310 --> 00:04:51.930

Marion: If you follow that up with a lean in, we're gonna fall out.


54

00:04:53.090 --> 00:05:09.269

Cacha Dora: No, I would never. Ew. But it was worth it to see your faces that no one else will get to see as I said that. But, I think that… I think that having all of that data is so important, and I think that, yes, you could look at it through the lens of.


55

00:05:09.270 --> 00:05:21.960

Cacha Dora: A lack of experience, or not sure, of what avenue to take, because there might be so many different things, depending on what the data's telling you, and depending on what narrative you're now throwing on top of that data.


56

00:05:22.020 --> 00:05:23.450

Cacha Dora: But ultimately.


57

00:05:23.850 --> 00:05:32.390

Cacha Dora: And I think we've talked about this in many podcasts, but the HR unit, whatever you're calling it in your organization, people, team, what have you, is not the one that makes the decision.


58

00:05:34.030 --> 00:05:47.360

Cacha Dora: They are the ones who do the recommending, and they might be waving every color flag under the sun to point out what some of these core issues are, but they don't have the power, and it ends up in a politics game.


59

00:05:48.200 --> 00:05:56.049

Cacha Dora: I'm sure, organizationally, we have better words for it, but I think it really just is a politics game of…


60

00:05:56.070 --> 00:06:14.270

Cacha Dora: what people do and don't want to address, and I think it gets even trickier when you're looking at organizations, depending on the stage of growth that they're in, and depending on if they're public or private. I think all of those things, when people start looking at data, those all factor into how intense that politicking


61

00:06:14.470 --> 00:06:15.790

Cacha Dora: Can evolve into.


62

00:06:15.790 --> 00:06:16.410

Danny Gluch: Hmm.


63

00:06:16.670 --> 00:06:20.470

Marion: I… I want to build on that. Like, I think when…


64

00:06:20.850 --> 00:06:22.250

Danny Gluch: You wanna circle back?


65

00:06:24.040 --> 00:06:26.860

Marion: I would never, Papa. Never.


66

00:06:26.860 --> 00:06:27.390

Cacha Dora: Hello!


67

00:06:29.050 --> 00:06:44.390

Marion: So, no, where I was going with this was, when you said that… when you said big data, right? Like, I… I felt like I was having PTSD as well, because it was such a phrase, it did have a moment, and…


68

00:06:44.390 --> 00:06:51.430

Marion: when I think about that evolution since then of people data and people analytics, you know.


69

00:06:52.840 --> 00:07:10.169

Marion: it's such a big industry, right? Like, there's so much tech, there's so much airtime given, there's so much discussion about all of the previous unconnected data points that no one was ever able to do anything with before, and all of a sudden we're able to synthesize this data, because we have all these great tools now, which is amazing.


70

00:07:10.950 --> 00:07:20.650

Marion: But here's the thing, and I have a very simplistic approach to this, and view of this, but when you look at pretty much any employee survey, right.


71

00:07:21.060 --> 00:07:23.450

Marion: Any company in the world, doesn't matter.


72

00:07:23.740 --> 00:07:26.549

Marion: After you've stripped out anything which is, like.


73

00:07:27.060 --> 00:07:35.830

Marion: compensation benefits related, like, the transactional stuff. Once you've taken all that transactional stuff out, the coffee and the tea rooms, crap, like, all of that's out.


74

00:07:36.670 --> 00:07:51.029

Marion: what's left, right? All the sentiment that's left goes back to three things, and I will die on this hill, right? When you look at all of the qualitative data that comes out, it goes back to either lack of leadership transparency and trust.


75

00:07:51.030 --> 00:08:02.389

Marion: Lack of psychological safety, and lack of manager capability and enablement. And you can map every single comment back to one or more of those three things, right?


76

00:08:02.770 --> 00:08:10.700

Marion: And I think that… because… Even with enhanced tech.


77

00:08:11.600 --> 00:08:28.749

Marion: We fail on the last one, we fail on the capability and enablement, so you can give all of these managers all of this incredible data, but they're so overwhelmed already, right? And no one's ever really trained them on how to use it effectively and what to do with it.


78

00:08:28.900 --> 00:08:37.209

Marion: And let's face it, most of them don't have the teeth to be able to make any decisions anyway. None of the stuff that people want is in their control, or very little.


79

00:08:37.330 --> 00:08:42.399

Marion: maybe there's an element of influence, but very little control. So, like, it's…


80

00:08:43.120 --> 00:08:47.170

Marion: It just becomes this, like, continual circus of


81

00:08:47.270 --> 00:08:54.199

Marion: we're gonna ask you what you think, and isn't that great? Because we're trying to be, you know, get good information, be transparent, and yet.


82

00:08:54.430 --> 00:09:10.000

Marion: when we get stuff back that, A, we don't know what to do with, or we can't do anything, and we just, like, brush it under the rug, and we'll spin it, and we'll make it sound like we've done something, and we've actually done fuck all, right? Like, that is the reality of what happens in a lot of organizations, not all.


83

00:09:10.000 --> 00:09:15.459

Marion: By any means, there's some really great ones out there, but certainly in my experience, that's…


84

00:09:15.550 --> 00:09:20.400

Marion: how it rides in the majority of companies. And so, for me.


85

00:09:21.650 --> 00:09:24.750

Marion: That's where the performative stuff really kicks in.


86

00:09:24.970 --> 00:09:31.649

Marion: And that's where the damage gets done. Because if you ask me 10 times to tell you what I think and you do nothing.


87

00:09:31.900 --> 00:09:39.500

Marion: Right. Then, all of that, that transparency and trust, my psychological safety, those are the three things we keep coming back to.


88

00:09:39.800 --> 00:09:41.050

Marion: That's gone.


89

00:09:41.250 --> 00:09:52.769

Marion: And my psychological contract is broken, and I'm peace out, and I'm gonna go to the next one, who'll probably do the same thing again. But that is the cycle. I'm sorry I sound like a real negative Nelly today, but that is the reality, right?


90

00:09:53.070 --> 00:09:57.110

Marion: Unless you can tell me something different, Jeffrey. Are there companies out there that are doing this really well?


91

00:09:57.110 --> 00:10:10.529

Jeffrey Fermin: I will disagree with you on one thing. I don't think you're a negative Nelly at all. Don't put that label on yourself. Now, a couple points, and I want to start off with some of the things that Cacha was, like, discussing, right?


92

00:10:10.670 --> 00:10:23.829

Jeffrey Fermin: Office politics plays a very large role in what decisions can be made. I've spoken to a couple folks, Kim Rohrer, Sarika Lamont, and it was specifically around, like, what to do with survey results.


93

00:10:23.920 --> 00:10:26.700

Jeffrey Fermin: And, yeah, they both kind of…


94

00:10:26.710 --> 00:10:35.900

Jeffrey Fermin: said the same thing, where, like, you know, you bring these results from a platform like an Officevibe, you know, Culture Amp, Lattice.


95

00:10:35.900 --> 00:10:59.980

Jeffrey Fermin: And then you say, like, here's what the employees are asking for. Like, there is a clear… there's clear data points that say, like, hey, they want better benefits. Everyone's asking for, I don't know, wellness stipends, or, you know, better healthcare. Why am I paying 50%? Healthcare in America is expensive. You know, can you do the full 100? And they have to take that back as a CHRO or consultant, whatever the role is.


96

00:11:00.020 --> 00:11:09.520

Jeffrey Fermin: and then go to the C-suite and say, like, hey, we need you to invest more in people, you know, can you do these things? Guess what happens right there?


97

00:11:09.660 --> 00:11:29.329

Jeffrey Fermin: in the C-suite's mind, the conversation becomes, well, like, what is the output? Like, what are we gonna expect from a business perspective? Like, what are we going to get in return? You know, do we… I don't want to fund that. They have enough already. That's most of the case. I'm not saying that's, you know, every organization, all that stuff, but traditionally here.


98

00:11:29.380 --> 00:11:32.679

Jeffrey Fermin: In the United States, that's… that's kind of how things are. Very…


99

00:11:33.070 --> 00:11:50.910

Jeffrey Fermin: horrible opinion, and you could call me a negative Nelly now, but more times than not, yeah, I feel like that is the conversation that's being had. And, oh god, I'm gonna go on a tangent, but I feel like that's… that's the problem with maybe that negative perception of HR. There's this whole thing, like, oh, they're just being the fun police. No, they're actually, like.


100

00:11:50.910 --> 00:11:53.570

Jeffrey Fermin: Doing things, collecting data to actually


101

00:11:53.570 --> 00:12:10.210

Jeffrey Fermin: advocate for people, for employees, and doing all these things so they can make workplaces better, and I feel like it's very unfair that HR folks have this negative light around them when they do so much to make workplaces better. So, yeah, that's the whole thing. I wanted to address that, like, the office politics part.


102

00:12:10.420 --> 00:12:17.199

Jeffrey Fermin: plays a very massive, probably the biggest role out there. And, Marion, you're absolutely right too, right? Like.


103

00:12:17.520 --> 00:12:26.100

Jeffrey Fermin: outside of those transactional, like, oh, I don't like, you know, the coffee that we use here, the grounds are awful, the beans suck, right?


104

00:12:26.190 --> 00:12:41.090

Jeffrey Fermin: A lot of the problems that stem in workplaces that you'll see in these surveys are things like psychological safety type of topics. They're, you know, relationships with peers and managers, and maybe you have, like, one manager that


105

00:12:41.320 --> 00:12:46.820

Jeffrey Fermin: that is a, I don't know, a micromanaging type, a… can I curse on here?


106

00:12:46.820 --> 00:12:47.860

Marion: Absolutely.


107

00:12:47.860 --> 00:13:07.320

Jeffrey Fermin: I was gonna say, maybe… I was just being polite. Yeah, maybe you have, like, an asshole manager, and you're just like, oh man, like, everyone is writing bad about it. So what's the next step? What's the follow-up? What can you do as an organization to say, like, oh man, well, everyone says this guy's an asshole, but his department produces well. Like, what can you do as a…


108

00:13:07.320 --> 00:13:10.689

Jeffrey Fermin: Higher up in the organization to make any kind of change.


109

00:13:10.690 --> 00:13:20.760

Jeffrey Fermin: And even if you have both quantitative and qualitative data, what's that next step? You really just have to nip it in the bud, sweep it under the rug, and that's just about it.


110

00:13:20.940 --> 00:13:33.639

Marion: And you also need to have an element of savvy with this data. Like, you know, let's think about the reality of where we are today, right? The job market sucks, particularly in certain sectors, right? Tech is a good example.


111

00:13:34.170 --> 00:13:34.890

Marion: And…


112

00:13:35.900 --> 00:13:46.820

Marion: I know just from reading a lot of industry commentary that engagement surveys are very conflicting this year, where


113

00:13:46.870 --> 00:14:02.439

Marion: employees are saying things like, you know, they're not happy, their engagement's low, they don't feel psychologically safe, yadda yadda yadda, but their intent to stay is high. And the company's like, well, that's great, they're going to stay, they must be fabulous. No!


114

00:14:03.460 --> 00:14:03.830

Danny Gluch: because.


115

00:14:03.830 --> 00:14:07.059

Marion: They are far from fabulous. They're staying because the job market's shit.


116

00:14:07.060 --> 00:14:09.180

Danny Gluch: And if they leave, if they…


117

00:14:09.220 --> 00:14:23.139

Marion: have a toddler tantrum and storm out, they could be unemployed for an extended period of time, particularly, again, in certain industries where there's compounding factors like AI and what have you, right? So.


118

00:14:23.570 --> 00:14:35.340

Marion: you have to have a little bit of savvy and an ability to question that data in a really analytical and thoughtful way, and not just to accept it at face value, and I don't think that


119

00:14:35.460 --> 00:14:45.799

Marion: Again, manager capability and enablement. Companies do not train managers on how to interpret this data properly, or certainly with an analytical or open mind.


120

00:14:45.920 --> 00:14:55.800

Marion: And I've seen it time and time again, where you'll get, like, paradoxes in data like that, but the company just puts a really positive spin on it, you know?


121

00:14:56.030 --> 00:15:00.569

Marion: Jazz hands, and it's not great, it's not fucking great, it's far from great.


122

00:15:01.120 --> 00:15:10.579

Marion: You know, it's like, why are we collecting this data if we're just trying to spin it into this narrative that actually doesn't exist, you know?


123

00:15:12.040 --> 00:15:17.789

Danny Gluch: Yeah, that… it's… it's not, like, ill-intentioned. I think it's…


124

00:15:17.890 --> 00:15:27.690

Danny Gluch: It's that misinterpretation, though, and not actually understanding the data. And this is one of my questions for Jeff, is…


125

00:15:27.930 --> 00:15:42.759

Danny Gluch: how often are these teams getting all of this data? You know, you had the big era of big data that I think is, like, softening a little bit, where it's not just, like, data, data, data all the time, it's,


126

00:15:43.240 --> 00:15:49.979

Danny Gluch: we're now using, like, AI, like, give the data to AI, because we don't know what to do with it.


127

00:15:50.150 --> 00:16:00.079

Danny Gluch: And I really do think that there was a skill issue in interpreting and in crafting narratives, and like you said, people were, like, seeing the numbers and being like, I got a 4.


128

00:16:00.260 --> 00:16:07.850

Danny Gluch: what do I do with that? Because I don't know, I'm a professional, but I'm not an expert in…


129

00:16:08.180 --> 00:16:11.660

Danny Gluch: Data analytics and, you know, strategy.


130

00:16:12.420 --> 00:16:14.550

Danny Gluch: So, what do people do?


131

00:16:15.660 --> 00:16:23.860

Jeffrey Fermin: And, oh god, there's, like, so many ways I want to answer everything you just said. I'll start with the latter. What do we do? Well…


132

00:16:24.230 --> 00:16:41.529

Jeffrey Fermin: I think of, like, the company where I'm currently employed, head of demand gen at AllVoices. You know, I'm very happy with what they've built. You know, it's an employee relations platform, but they also act as a whistleblower hotline, ethics hotline, all that fun stuff. Go check out the website. I'm a horrible salesperson for the company, to be honest.


133

00:16:41.530 --> 00:16:44.759

Jeffrey Fermin: But, I really love the fact that they…


134

00:16:45.800 --> 00:17:04.359

Jeffrey Fermin: did a great job of sort of weaving in AI to get all, like, all the issues that are going on in large corporations, where there's frontline employees, where there's, you know, let's say it's restaurants, you have, a plethora of workers, different, you know, level types, you have corporate employees, all that fun stuff, right?


135

00:17:05.060 --> 00:17:24.190

Jeffrey Fermin: we actually provide actions, and we use AI to do that. So if we're, like, you could upload your policies, your procedures, state laws, country laws, national laws, all that fun stuff, and then you can then do follow-ups, say, okay, there was… I'm gonna use sexual harassment as an example here, right? Like, there was an incident over here in the California office.


136

00:17:24.300 --> 00:17:38.910

Jeffrey Fermin: So what can I do to address this situation? Here are the people involved, the president, all this fun stuff. So you could have those follow-ups and kind of take action as an individual to ensure that you have the best possible outcome.


137

00:17:38.990 --> 00:17:49.769

Jeffrey Fermin: I don't know the current lay of the land for the people analytics space, or employee survey space, and what's being done there, but I have to assume that they're on there, and


138

00:17:50.210 --> 00:17:53.460

Jeffrey Fermin: If they haven't done it just yet, please…


139

00:17:53.580 --> 00:18:04.819

Jeffrey Fermin: contact me, I'll be your product consultant. Haven't done this stuff before, but that's where the opportunity really lies. You have all these large data sets, and you really don't want to act on it.


140

00:18:05.350 --> 00:18:15.409

Jeffrey Fermin: Like, come on, guys, like, get it together. You're… a lot of these companies have millions and millions of dollars going into R&D and, you know, sales…


141

00:18:15.740 --> 00:18:36.159

Jeffrey Fermin: God, I can't even begin to describe all the nonsense that these companies buy, and, you know, they might overspend on marketing. Reality is, if you build products that could genuinely affect people's lives and people's workplaces, everyone's gonna flock and gravitate towards you the same way back in 2011, 2013, people were coming to Officevibe.


142

00:18:36.270 --> 00:18:41.419

Jeffrey Fermin: I'm not gonna lie to you, I forgot the other points I wanted to make.


143

00:18:41.800 --> 00:18:46.409

Jeffrey Fermin: Because I got so passionate about that, I was like, man, we gotta do something, and then I forgot.


144

00:18:46.410 --> 00:18:48.570

Marion: It's… it's good!


145

00:18:48.900 --> 00:18:51.660

Marion: HR… HR tech needs that passion.


146

00:18:53.100 --> 00:18:55.110

Jeffrey Fermin: I just remembered the point I was gonna make.


147

00:18:55.230 --> 00:19:04.419

Jeffrey Fermin: I don't like the fact, and this might be a little bit of a tangent, but the same way we're kind of entering this era where the market kind of sucks.


148

00:19:04.890 --> 00:19:20.669

Jeffrey Fermin: employees aren't necessarily a major investment for companies, because you automatically, as a C-suite person of a not-so-good org, right, I'm putting that ICP out there, you're probably thinking, I'm going to replace them with AI, so you're… every employee feels like they're walking on eggshells.


149

00:19:20.670 --> 00:19:21.370

Danny Gluch: Yep.


150

00:19:21.370 --> 00:19:24.790

Jeffrey Fermin: Every organization right now, especially in the tech space, right?


151

00:19:25.070 --> 00:19:49.519

Jeffrey Fermin: why is there this ebb and flow of, like, oh, since it's an employer's market, we cannot necessarily invest in employees? I think if you really want to stand the test of time right now, you need to double down on your employee engagement initiatives. If you really want to build, like, satisfaction, you invest in tools like people analytics platforms, employee relations platforms, pulse surveys, you, you know, give your employees healthcare, you do all these things.


152

00:19:49.520 --> 00:20:13.670

Jeffrey Fermin: And if you really want to improve performance, loyalty, and satisfaction, I have no current stats to back this, by the way, but it's just a suggestion. I do think that is the way forward. I think as long as organizations are continually investing in employees, even in down periods, and not making employee engagement this fun-to-have thing that only matters when employees, rather than employers, dominate the market.


153

00:20:13.670 --> 00:20:24.400

Jeffrey Fermin: If we could change that mindset and have orgs invest in people, we can see very strong positive outcomes and long-term organizational success. So that's…


154

00:20:24.400 --> 00:20:24.880

Marion: Yeah.


155

00:20:24.880 --> 00:20:26.439

Jeffrey Fermin: That was the point I wanted to make.


156

00:20:26.440 --> 00:20:42.189

Marion: No, and it's a really good point. I think it's a balance, though, right, in my experience. You know, like, if you're in a larger organization and you have more resources to be able to do a lot of that, then that's phenomenal.


157

00:20:42.190 --> 00:20:53.930

Marion: if you're in a small to medium enterprise, if you're a startup, right, and you're maybe in the tech sector, and it is a very precarious market right now, you've got to try and hedge your bets a little bit, and I think that


158

00:20:54.080 --> 00:21:10.179

Marion: absolutely meeting those basic, you know, going back to, you know, Maslow's hierarchy of needs, right? But meeting those basic, basic things that humans need is absolutely essential, of which I would include healthcare as one of the most important.


159

00:21:11.950 --> 00:21:23.189

Marion: But there's something to… there's something in there about the balance between the transactional and the physical, as well as the emotional.


160

00:21:23.190 --> 00:21:38.410

Marion: And I think the emotional piece is really important. Like, if I think about, would I rather work for a company where my healthcare was really good, but every day I felt psychologically unsafe, I felt terrified to make a mistake or to take a risk?


161

00:21:38.510 --> 00:21:46.430

Marion: Or would I rather be in a company where maybe, yes, I'm paying more for my healthcare, but every day I leave work and I feel like I've done something a bit good.


162

00:21:46.760 --> 00:21:48.549

Marion: smile on my face, like.


163

00:21:48.790 --> 00:21:57.979

Marion: that's the trade-off that employees are having to make, and that's very specific to the United States, I think. Like, if I compare that to being in the UK,


164

00:21:59.110 --> 00:22:08.579

Marion: we have national healthcare. It may be crap at the best of times, because it's so over, so oversubscribed and under-resourced, and when I say it's crap, I don't mean


165

00:22:08.690 --> 00:22:11.630

Marion: In a negative way, it's phenomenal.


166

00:22:12.050 --> 00:22:22.579

Marion: But it doesn't compare to private healthcare, right? Like, if I needed an MRI in the UK, it might take me 18 months to 24 months. If I need one here, I'll get it next week, right? Because I'm paying for that privilege.


167

00:22:23.530 --> 00:22:30.900

Marion: But when I compare the psychological trade-off versus the, like, the mechanical, the transactional.


168

00:22:31.000 --> 00:22:47.090

Marion: you know, I think it can be a really difficult balance for companies to maintain, right? Other… in the other extreme, you've got companies that are, like, if you look at the FAANGs, right, those companies are chucking things at their employees, you know?


169

00:22:47.150 --> 00:23:02.970

Marion: RSUs, the golden handcuffs, like, the big salaries and all of that, but you just have to look at Blind and read Blind, they're all fucking miserable, right? They're absolutely miserable. They are stressed, they are burned out, they are in… in pieces.


170

00:23:03.580 --> 00:23:11.710

Marion: But yeah, you know, they're walking away with maybe $2 million in RSUs or whatever. So, like, it's like… it's such a difficult…


171

00:23:11.820 --> 00:23:17.520

Marion: balance to find and to maintain, and I think to kind of, like, be able to


172

00:23:17.940 --> 00:23:23.579

Marion: Do it in a way which reflects your… your pay philosophy, and your values, and your…


173

00:23:23.580 --> 00:23:25.140

Danny Gluch: company ethos.


174

00:23:26.010 --> 00:23:29.519

Marion: But still let people live in a… in our… in a good way.


175

00:23:30.270 --> 00:23:32.480

Jeffrey Fermin: Papa, I wanna ask if…


176

00:23:33.350 --> 00:23:35.350

Jeffrey Fermin: I had to do it. I had to do it, I'm sorry.


177

00:23:35.350 --> 00:23:36.050

Danny Gluch: But…


178

00:23:36.050 --> 00:23:37.690

Jeffrey Fermin: Papa, I had to ask,


179

00:23:38.200 --> 00:23:55.699

Jeffrey Fermin: I want to open this one up for everybody. Would you do that trade-off? Do you think, you know, it's worth kind of taking that extra dollar if it meant, like, oh, you know, I'm gonna make $200, $300, $400K, if you felt, like, not necessarily psychologically safe, or you're just constantly stressed, because


180

00:23:55.890 --> 00:24:13.089

Jeffrey Fermin: I say that because I do know folks that work at FAANG, and they're like, oh my god, my… you know, they tell me all the time, my job is awful, but I get to travel, I get to do this and that, but I have such high, like, expectations from leadership, and I have to do it every single quarter, and it sucks. So, just want to hear everyone's thoughts.


181

00:24:13.090 --> 00:24:15.079

Danny Gluch: That's a great question.


182

00:24:15.080 --> 00:24:16.320

Cacha Dora: This is a fantastic question.


183

00:24:16.320 --> 00:24:19.090

Danny Gluch: defer to my co-hosts, let them answer first.


184

00:24:19.090 --> 00:24:23.649

Cacha Dora: Oh, alright. Well, I'll say this, I feel like…


185

00:24:24.030 --> 00:24:30.530

Cacha Dora: I feel like this is a bit of a cop-out answer, but I actually think it's very nuanced: it almost comes down to the season of life that you're in.


186

00:24:30.530 --> 00:24:35.339

Jeffrey Fermin: In where that answer 100% could change.


187

00:24:35.410 --> 00:24:47.629

Cacha Dora: If you're trying to put money away for your kid's college fund, and you're like, I'm gonna make this sacrifice because I know what this end goal is 5 years down the line, or you're gonna give yourself a hard


188

00:24:48.020 --> 00:24:49.500

Cacha Dora: term limit?


189

00:24:49.690 --> 00:24:53.829

Cacha Dora: Not to make any puns here for other things going on in our country, but,


190

00:24:54.080 --> 00:25:04.319

Cacha Dora: you know, I think… I think it kind of depends on where you're at as far as, like, is it worth it, right? Like, are you even in a spot from where you've been in your career that you could mentally take that?


191

00:25:04.660 --> 00:25:14.390

Cacha Dora: And if you know that it would just turn you into sawdust, then you wouldn't even think about it, right? Like, your mental health is way more important. Your physical health is always more important.


192

00:25:14.540 --> 00:25:16.080

Cacha Dora: Because there's only one of you.


193

00:25:16.330 --> 00:25:27.169

Cacha Dora: But are there other things that kind of weigh out? Like, I've had some of those golden handcuff moments, and they're helpful until they're not, and suddenly those handcuffs mean jack shit.


194

00:25:28.510 --> 00:25:33.669

Cacha Dora: Because you, or the people around you who are witnessing what's happening to you mean way more.


195

00:25:33.900 --> 00:25:48.769

Cacha Dora: So it's like, I feel like it's a season of life dependent. If I knew I had a goal in mind, and I was like, oh, I need to save money for this, I would be like, I'd probably do it for a few years and apologize to everybody that loves me for that period of time, and then I would GTFO.


196

00:25:48.770 --> 00:25:49.390

Marion: Yeah.


197

00:25:49.390 --> 00:25:56.230

Cacha Dora: But it really just comes down to that season, and on if it's even… plausible, capable.


198

00:25:56.500 --> 00:26:01.049

Marion: No, you're absolutely right. Like, I am at that point in my…


199

00:26:01.160 --> 00:26:19.639

Marion: professional development and in my career and in my personal life, where I'm like, fuck that for a game of soldiers. I would… I value much more my mental health, my physical health, my ability to spend time with the people that I love, doing things that I enjoy, pottering about in my garden, like, doing.


200

00:26:19.640 --> 00:26:20.119

Danny Gluch: At this point.


201

00:26:20.120 --> 00:26:24.210

Marion: doing all the thought leadership stuff, my PhD, like, I value that.


202

00:26:24.390 --> 00:26:25.480

Marion: more?


203

00:26:25.720 --> 00:26:34.359

Marion: than, you know, extra zeros in my bank account. I mean, obviously, everyone loves money, right? Money, we need money. Money makes the world go round, yadda yadda.


204

00:26:34.490 --> 00:26:38.290

Marion: But I've also worked in those environments.


205

00:26:38.750 --> 00:26:54.199

Marion: And to what end? It really impacted my mental health, it impacted my physical health, it impacted my self-confidence, my ability to function as a well-rounded human being. Like, so for me.


206

00:26:54.370 --> 00:26:57.719

Marion: No, but I'm also at a stage in my life where I'm at the


207

00:26:57.960 --> 00:27:03.540

Marion: hardly dead, but I'm not at the beginning of my career, right? I'm… I'm… Hmm.


208

00:27:03.780 --> 00:27:08.940

Cacha Dora: You have a breadth of experience to speak to at this point, instead of trying to collect the experience, right?


209

00:27:08.940 --> 00:27:14.350

Marion: Exactly! You know, I'm getting towards the later stages of my career, and now, I think…


210

00:27:14.610 --> 00:27:18.559

Marion: As well, we talk about this a lot, working in human resources.


211

00:27:18.900 --> 00:27:30.590

Marion: your values get louder and louder and louder as you get older, and I cannot turn them down. And now I'm at a point where I use that for good, and I use that to advocate for others.


212

00:27:30.590 --> 00:27:39.000

Marion: And, you know, I guess that's kind of very much the basis of my research and what I'm trying to advocate for psychological safety in workplaces, right? I think it's really important.


213

00:27:39.000 --> 00:27:42.719

Marion: So, that's me, but if I…


214

00:27:43.620 --> 00:27:55.999

Marion: was in a very different financial situation, like, I don't have kids of my own, but if I did, and I was having to think about college, and, think about, you know, other, you know, like.


215

00:27:56.900 --> 00:28:04.330

Marion: parents in later stages of life needing care and all of that stuff in a highly capitalist society like the United States.


216

00:28:04.640 --> 00:28:20.650

Marion: my opinion would probably be very different, right? And I would probably be putting myself under those types of pressures. I have the capability, but do I have the mental capacity and the mental wellness to withstand that?


217

00:28:21.000 --> 00:28:25.780

Marion: I don't think I do, not for any sustained period of time, but unfortunately.


218

00:28:25.910 --> 00:28:28.659

Marion: places like the United States and others.


219

00:28:28.900 --> 00:28:40.059

Marion: put humans in very dire situations where they have to make those types of decisions, and so it's certainly not an easy path to walk.


220

00:28:40.350 --> 00:28:43.290

Marion: But yeah, it's very nuanced, as Cacha said.


221

00:28:44.880 --> 00:28:51.060

Danny Gluch: Yeah, it's… It is complicated, but it… for me, like…


222

00:28:51.290 --> 00:29:05.950

Danny Gluch: I think I would do well. You know, I was in sports and very competitive in games and whatnot, and I think I would do really well in environments like that. I would be a top performer even with the heat turned all the way up.


223

00:29:07.360 --> 00:29:08.390

Danny Gluch: But…


224

00:29:08.820 --> 00:29:18.629

Danny Gluch: I don't think I would last very long. I think I would see the pressure and the situation and how it affects people, and I can't stay silent.


225

00:29:18.700 --> 00:29:35.000

Danny Gluch: And especially if they're doing surveys of, like, hey, what's the, you know, employee experience like? I'm not gonna hold back, like, I'm gonna speak up. And so, like, I know my time there would be short-lived, because I would either completely burn out on…


226

00:29:35.670 --> 00:29:38.819

Danny Gluch: The, the emotional,


227

00:29:38.860 --> 00:29:56.890

Danny Gluch: ability to stay engaged, the ability to see a future, like, for myself with that company, would just vanish. It would immediately become short-term. And I'm pretty sure it would affect how I was around my kids and stuff, too, which I wouldn't want. But, like.


228

00:29:57.660 --> 00:30:00.030

Danny Gluch: You know, yeah, it would be nice for, like.


229

00:30:00.170 --> 00:30:08.989

Danny Gluch: 3 months? But I don't think I would make it to the 6th month, even if I was doing really well, and I suspect that I might.


230

00:30:09.280 --> 00:30:21.370

Marion: But you say that, right? But think about that 3-6 month mark, where you're not sleeping well, and you have anxiety attacks, and you're having, you know, panic attacks, and you're not eating properly.


231

00:30:21.370 --> 00:30:23.740

Danny Gluch: Yeah, no, that's why I don't think I would make…


232

00:30:23.740 --> 00:30:27.319

Marion: Yeah, you're shouting at your kids. It takes a… there's a ramp up to that.


233

00:30:27.650 --> 00:30:34.979

Marion: It doesn't just go great to shit the next day, like, it's a… there's a… there's a change. And… and having gone through…


234

00:30:35.180 --> 00:30:41.450

Marion: various points of that in my life and in my career, I would not wish that on my worst enemy.


235

00:30:41.450 --> 00:30:44.330

Danny Gluch: So… I mean, I would wish it on my worst enemy, but…


236

00:30:45.430 --> 00:30:51.120

Marion: I would wish it on people who say things like circle back, but other than that, no, I wouldn't wish it on my worst enemy.


237

00:30:54.690 --> 00:30:55.770

Danny Gluch: What about you, Jeff?


238

00:30:57.020 --> 00:30:58.339

Jeffrey Fermin: I'm just gonna ask.


239

00:30:58.460 --> 00:31:05.129

Jeffrey Fermin: No, I have… been at those workplaces, the bad ones and the great ones.


240

00:31:05.320 --> 00:31:13.650

Jeffrey Fermin: And I don't think I could do it again. I feel like, you know, you all touched on it, right? Like, you have these kind of…


241

00:31:15.070 --> 00:31:24.169

Jeffrey Fermin: you know, I don't even know how to sum it up into words. You have these anxious moments, you kind of turn, like, non-verbal after work, you know, the second 5 o'clock


242

00:31:24.170 --> 00:31:46.489

Jeffrey Fermin: hits, or 459 hits, you're looking at the clock, and it's almost like a blessing. You know, like, I've been in those places, and I really feel like having been there really drives my work, moving forward. Like, what I do now, I do my best to understand I've been in those places before, and I want to make sure that I can, like, eliminate those places before, and eliminate managers that are going to act like that, and stop giving, you know.


243

00:31:46.700 --> 00:31:59.629

Jeffrey Fermin: give the employees enough of a voice to not have to deal with those circumstances, especially in, like, today's day and age, right? Like, the market being the market and how things are, everything's kind of employer-centric, right? Like.


244

00:32:00.570 --> 00:32:07.490

Jeffrey Fermin: I feel like… God, I'm, like, so stressed just talking about it and thinking about my own personal experiences.


245

00:32:07.750 --> 00:32:10.989

Jeffrey Fermin: I would love to do whatever I can to ensure that


246

00:32:11.080 --> 00:32:23.270

Jeffrey Fermin: organizations are psychologically safe, and they're doing what they can for employees, and, you know, whether it's through data or just, you know, talking to people, HR becomes more of a function to


247

00:32:23.270 --> 00:32:48.109

Jeffrey Fermin: eliminate these types of environments and not have people make those choices, and like the ones we're describing, and we can just, like, all have a good workplace where we can be productive and find our flow states and enjoy doing what we're doing. So, very long-winded way to say I would not take the money, I've done it before, and that's what currently fuels my work now.


248

00:32:48.110 --> 00:32:56.149

Jeffrey Fermin: and the people that I talk to and the work that I do, it really stems from my hatred of bad workplaces, which is a very strong way of putting it.


249

00:32:56.150 --> 00:33:02.109

Cacha Dora: It's interesting that the negative experiences we have truly inform our emotional intelligence.


250

00:33:02.200 --> 00:33:04.370

Cacha Dora: And how we work


251

00:33:04.450 --> 00:33:23.639

Cacha Dora: each and every one of us, right, individually, I guarantee, works really hard to create a psychologically safe environment, because we've all experienced the antithesis of what that looks like, and… or feels like, right? Because it's not just how it looks or how it feels, it's also how you experience it and what that means, and I think the…


252

00:33:23.780 --> 00:33:29.110

Cacha Dora: One of… it's one of… it's an awful teacher, but one of the best teachers is the bad stuff.


253

00:33:29.830 --> 00:33:30.360

Marion: Yeah.


254

00:33:30.510 --> 00:33:35.520

Jeffrey Fermin: Danny, we talked a little bit about the Batman and Philosophy books, but that's kind of the…


255

00:33:35.520 --> 00:33:59.329

Jeffrey Fermin: the big thing, right? Like, those negative experiences either make you a good person or a bad person and turn you into either Batman or the Joker, right? How you overcome these situations is the true teller of, like, who you will become as an individual. So, I will say, I hope I'm doing a good job. I hope I could be… I'll make that my LinkedIn headline, right? Like, HR is Batman, and People Ops is Batman. That's very corny, please don't record that.


256

00:33:59.330 --> 00:34:00.000

Cacha Dora: Yeah, I love it.


257

00:34:00.000 --> 00:34:04.110

Danny Gluch: You may hit some trademark violations, I'm just gonna say.


258

00:34:04.110 --> 00:34:21.269

Jeffrey Fermin: Okay, perfect. But yeah, no, great points, and dude, please, for the love of God, don't work at these places. Let's not even buy from them as consumers. If you see them have, like, one-star Glassdoor ratings, let's not work with these folks, let's not give them platforms. So yeah, let's start a revolution.


259

00:34:21.620 --> 00:34:30.630

Marion: Yeah, and this is where I struggle, though, because here, like, let's get into the Amazon discussion, right? Like, you know…


260

00:34:30.889 --> 00:34:32.740

Marion: when I think about…


261

00:34:33.710 --> 00:34:45.720

Marion: how things have gone in that company in the last few years in particular, especially right now. 14,000 employees, flattening of structures, you know.


262

00:34:45.739 --> 00:34:56.470

Marion: Andy Jassy making some very questionable, non-data-driven decisions for the company that's the most data-driven in the world, she says, making inverted quote signs.


263

00:34:58.400 --> 00:34:59.440

Marion: And yet…


264

00:35:00.240 --> 00:35:11.489

Marion: none of us can give it up. It's, like, the worst addiction in the world, because it's so convenient, and it… in our very complex and busy lives, we're using tools like that to be able to find.


265

00:35:11.880 --> 00:35:22.089

Marion: and to keep going, and to… you know, I couldn't have decorated my basement into that speakeasy I was telling you guys about earlier without being able to use, you know,


266

00:35:22.390 --> 00:35:24.089

Marion: Resources like that.


267

00:35:24.290 --> 00:35:27.989

Marion: But… but as an HR practitioner, and just as a


268

00:35:28.610 --> 00:35:35.090

Marion: someone who I like to think is a relatively decent human being, when I think about how people are being…


269

00:35:35.230 --> 00:35:37.349

Marion: you know, treated?


270

00:35:38.500 --> 00:35:51.629

Marion: Oh, wow. So it gets really complex, right? Like, it's super complex, and… We have… Enabled these behaviours.


271

00:35:51.780 --> 00:36:11.119

Marion: And we've enabled these organizations, and we continue to enable, from a place of what convenience? Like, we're going off on a slightly different tangent, but you bring it back to the employee experience, bring it back to that person who is in the DCs, who's running logistics, who's, you know.


272

00:36:11.830 --> 00:36:19.940

Marion: uploading catalogue, merchandise, like, whatever that might be. Yeah.


273

00:36:20.590 --> 00:36:22.400

Marion: We're… we're complicit.


274

00:36:22.480 --> 00:36:25.510

Danny Gluch: I mean, it's hard, right? Like, there's…


275

00:36:26.680 --> 00:36:45.539

Danny Gluch: it would be great if there was a competitor who treated their employees well and also had that ecosystem built, and they just don't. And I want to say, I don't think that Amazon is in the position they're in because they treat their employees poorly. I think they're in that position despite treating their employees poorly.


276

00:36:45.560 --> 00:36:59.299

Danny Gluch: One of my… one of my curiosities is… right, because… and that's their people strategy. Their people strategy is to be cutthroat. And I think there's a lot of organizations who do that, and a lot of people read the Steve Jobs book, and…


277

00:36:59.460 --> 00:37:00.480

Danny Gluch: kinda…


278

00:37:00.850 --> 00:37:11.449

Danny Gluch: have this sort of pressure cooker mentality of how they want their teams and their organizations to run. And… you know what? Bless them, I… I hope that…


279

00:37:11.660 --> 00:37:14.529

Danny Gluch: people find better places to work.


280

00:37:14.750 --> 00:37:33.560

Danny Gluch: what I'm curious about are the people who unintentionally have these tough places to work, where they're… they think they're trying, or at least they're… they think they're fairly neutral. Like, they know they're not the best, but, like, they're… they're not trying to be bad, they're… you know, they might even have the intention of being a really good place to work.


281

00:37:34.060 --> 00:37:42.910

Danny Gluch: But they're failing at it. And the data, and the dashboards, and the culture, and everyone knows that they're failing at it. Like.


282

00:37:43.250 --> 00:38:02.429

Danny Gluch: How does that happen? How can you be, like, what are, in your experience, Jeff, with, you know, the data and your experience in HR tech specifically, looking at this, what are some of the things that companies get so wrong when they're not trying to be awful, but end up being just a really, really bad place to work?


283

00:38:03.320 --> 00:38:15.460

Jeffrey Fermin: That is a great question, and I wish I had, like, a very straightforward answer for it. I immediately think… my crutch is to say, like, they're not listening well enough, and…


284

00:38:15.460 --> 00:38:26.260

Jeffrey Fermin: I hate blaming senior leadership because that's who I directly talk to and say, like, hey, you know, if you take actions in your workplace off of, like, listening to your employees, you're gonna be able to make the changes to be great.


285

00:38:26.910 --> 00:38:42.529

Jeffrey Fermin: I feel like that could be the missing part there, where, you know, you have everything laid out, everyone kind of understands, and the company might be trying, but it might just be performative, and they might not be hearing from the employees to understand what's truly going on. So…


286

00:38:42.720 --> 00:38:48.630

Jeffrey Fermin: If that… in this particular situation, maybe that's what's happening. That's the perfect situation to…


287

00:38:48.760 --> 00:39:01.089

Jeffrey Fermin: Institute employee engagement initiatives, have Pulse surveys, or employee relations platforms to really understand why the, you know, the big thing is why.


288

00:39:01.360 --> 00:39:14.410

Jeffrey Fermin: why the place sucks, right? That's all we're trying to get down to the nitty-gritty. Like, is it managers? Is it, you know, benefits? Is it, I don't know, salaries? What can we be doing to be better?


289

00:39:14.750 --> 00:39:21.580

Jeffrey Fermin: I gave Marion an example of, like, during my tenure at Officevibe, Bose was one of our first enterprise clients.


290

00:39:21.650 --> 00:39:24.960

Jeffrey Fermin: And they were kind of in this situation where


291

00:39:24.980 --> 00:39:40.049

Jeffrey Fermin: you know, they were a good organization. They are obviously a world-known enterprise. I mean, if you have Bose headphones, it's like luxury, right? It's almost identical to having, like, a Louis Vuitton purse, I'd say. And…


292

00:39:40.050 --> 00:39:48.999

Jeffrey Fermin: Yeah, like, they just couldn't get it together, and they were trying to figure out why, and the second that first pulse survey came out, and all the responses started coming in.


293

00:39:49.160 --> 00:40:02.600

Jeffrey Fermin: they found that, you know, they had employees that were going in from Ireland to London every week, and they were doing hypercommutes from other countries, trying to get to the main HQ. This was back in 2011, when remote work was not, like, a thing at all.


294

00:40:02.860 --> 00:40:16.899

Jeffrey Fermin: But they were… they tried to come up with solutions. They spoke to their employees and said, hey, you know, what can we do? We see that this is one of the biggest pressing complaints on here. We genuinely want to get better. We, like, actually give a shit about you.


295

00:40:17.260 --> 00:40:25.359

Jeffrey Fermin: what can we do? And… Literally within 2 months, they instituted a remote work policy back in 2011.


296

00:40:25.360 --> 00:40:41.059

Jeffrey Fermin: probably one of the first to do it in London, at least, you know, one of our first customers to do it. They took action, and right afterwards, their scores, engagement scores, all across the board went up. They found out that they, like, you know, were… they had, like…


297

00:40:41.140 --> 00:40:56.270

Jeffrey Fermin: what's it called? One of our pillars was wellness. They saw, like, mental health go up. They saw manager-to-employee relations go up. Employee-to-employee relations, off of one little change here. Granted, that's back in 2012-ish. Now, fast forward.


298

00:40:56.790 --> 00:41:05.129

Jeffrey Fermin: I hate to be the negative Nelly again, but now you have people that don't give a damn, and they're just, like, trying to take away things like remote work, push RTO, and do all this stuff, but…


299

00:41:05.340 --> 00:41:13.040

Jeffrey Fermin: that's not the question that's being asked. I just wanted to vent a little and tell you guys, I don't like the world of work right now, and I don't know how I'm doing.


300

00:41:13.460 --> 00:41:25.609

Marion: But this is my hotspot, right, because this is the center of my research, so I do want to jump on that a little bit, and I think that this is where HR practitioners are really stuck between a rock and a hard place, right?


301

00:41:25.660 --> 00:41:36.739

Marion: We know what the research tells us, right? And if anyone comes at me and says, oh, the research says that you're better together, fuck off, it doesn't. There's no data that supports any of that.


302

00:41:36.740 --> 00:41:40.580

Cacha Dora: Define better together, because you can be better together anywhere.


303

00:41:40.580 --> 00:41:51.110

Marion: Exactly, right? Like, do not. And there's a lot of manipulation of data to try and form a narrative for companies, right? And that truly does not exist, right? Now.


304

00:41:51.300 --> 00:41:53.080

Marion: there's nothing…


305

00:41:53.630 --> 00:42:07.490

Marion: There's no data that supports or refutes that being together is any better or worse than being remote. It's not about being in office, it's not about being remote, and it's not about being hybrid. All of them on their own as individual constructs are pretty shit, right?


306

00:42:07.490 --> 00:42:14.050

Marion: It's about flexibility, it's about adult to adult, it's about treating people as humans, as adults, and letting them work how they best work.


307

00:42:14.150 --> 00:42:24.950

Marion: Some people work better in an office, some people don't. Some people have disabilities, and it works better for them not having to deal with whatever drama it is to get to work, and…


308

00:42:25.060 --> 00:42:27.190

Marion: Different, different strokes, right?


309

00:42:27.520 --> 00:42:34.279

Marion: And that's what people thrive in, and I think that where the world is right now is that we're…


310

00:42:34.540 --> 00:42:37.609

Marion: Weaponising data, in a lot of ways.


311

00:42:37.790 --> 00:42:40.730

Marion: To suit a narrative.


312

00:42:41.240 --> 00:42:46.880

Marion: and… That's almost worse than not collecting the data to begin with.


313

00:42:47.710 --> 00:42:52.160

Marion: And so, I… I have a lot of concerns about…


314

00:42:52.300 --> 00:43:02.459

Marion: how we spin a lot of the stuff that comes out of our surveys. Surveys are an important employee voice. This is my bread and butter. This is what I do all day, every day, and I think it's…


315

00:43:02.780 --> 00:43:04.739

Marion: It's hugely, hugely important.


316

00:43:05.620 --> 00:43:17.559

Marion: But if we're not really taking that data with the intention that it was given in, and the spirit it was given in, and not doing anything really meaningful with it, what a waste of time.


317

00:43:17.750 --> 00:43:23.480

Marion: And it just goes back to that whole thing of it… it… breaks trust.


318

00:43:25.170 --> 00:43:42.979

Marion: And it damages psychological safety. And so there's that whole question of: would it have been better to just do nothing at all than to do something that you're not doing anything with? Do you know what I mean? Like, it's… it's really messy, and I think that the return-to-office thing really underpins just how damaging that can be.


319

00:43:43.340 --> 00:43:45.110

Jeffrey Fermin: Yeah. Oh, go ahead.


320

00:43:45.110 --> 00:43:46.739

Danny Gluch: Oh, no, no, go, go ahead.


321

00:43:46.740 --> 00:44:01.489

Jeffrey Fermin: I was gonna say, like, did you all see that picture? I wanna say it's Goldman. They just, like, showed off their new office, and it's this open concept, and the desks and the computers are all, like, right next to each other. You have, like, one foot of separation between computers.


322

00:44:01.490 --> 00:44:13.699

Jeffrey Fermin: And all the comments were just insane. Everyone's like, what is… this looks dystopian. Is this what we're fighting? Like, this is what we're going… we're pushing all these return-to-office mandates so we could come back into a building and handle Zoom calls.


323

00:44:13.700 --> 00:44:16.079

Jeffrey Fermin: One foot away from the person next to you, like.


324

00:44:16.080 --> 00:44:16.619

Marion: Oh, it's true.


325

00:44:16.620 --> 00:44:17.530

Jeffrey Fermin: What the hell's going on?


326

00:44:17.690 --> 00:44:34.880

Marion: It's wild. Like, I have been in that situation where, you know, I'm sitting in a… I'm sitting on a Zoom meeting, and I'm looking round, and half the people that are in the Zoom meeting are sitting, like, 3 feet away from me. Are you kidding me with this, right? Like, how is that enhancing my life?


327

00:44:35.530 --> 00:44:36.340

Danny Gluch: Oh, man.


328

00:44:36.340 --> 00:44:39.880

Marion: It's wild. So, no, that is a reality, and…


329

00:44:41.240 --> 00:44:45.819

Marion: Man, like, I think that that just does even more damage, ultimately.


330

00:44:46.990 --> 00:45:02.550

Danny Gluch: Yeah, it's not a good look or feel, and I think that's the… I love the name Officevibe, because it's about the feel. What does it feel like to be a part of this company? And what I loved about the story of Bose was they asked…


331

00:45:03.160 --> 00:45:18.149

Danny Gluch: And then 2 months! Like, that's a big… I honestly don't understand how they pulled that off. But the level in… the pace of that reaction and response and movement and change…


332

00:45:18.300 --> 00:45:29.280

Danny Gluch: based on listening, boy, their employees, even if that didn't affect them, and I think this is the important part, because you're gonna take a survey, there's gonna be a lot of things, and you can't respond to all of them.


333

00:45:29.400 --> 00:45:44.260

Danny Gluch: But I feel like they moved, and moved big enough and fast enough to where everyone probably felt heard, even if their ask wasn't specifically actioned on. And feeling heard is one of the things that I think these…


334

00:45:44.260 --> 00:45:49.620

Danny Gluch: surveys do the opposite of, because oftentimes the action is


335

00:45:49.620 --> 00:45:58.009

Danny Gluch: two years down the road, if there's anything, and it's impossible to feel heard because you never even see it.


336

00:45:58.270 --> 00:46:09.189

Cacha Dora: There's always that lack of transparency that pops up sometimes, where it would be easier to just say something, I think, right? Like, speaking from my no-decision-making role in an organization.


337

00:46:09.190 --> 00:46:21.599

Cacha Dora: Like, I think it would make so much more sense to be like, we heard you. This… you don't realize how intense this change is, but we're trying to figure something out. It might take 2 years, but you've been heard.


338

00:46:21.600 --> 00:46:33.050

Cacha Dora: And I… I think that there is an immense pressure, kind of going back to what you were saying earlier, Marion, on our HR teams, to not say that, and to do the thing.


339

00:46:33.100 --> 00:46:39.499

Cacha Dora: But sometimes doing the thing is a very time- and labor-intensive thing to make sure that it does


340

00:46:40.190 --> 00:46:54.140

Cacha Dora: hit all of our criteria in an organization, to make sure that you're within policy and in compliance and all the things that you have to check all these boxes for. But so many businesses across the board… and there's so much


341

00:46:54.640 --> 00:47:07.330

Cacha Dora: so much academic research on the lack of transparency, right? The lack of communication and how, if we were to do that one simple thing of just, we're working on it, we heard you…


342

00:47:07.330 --> 00:47:11.400

Danny Gluch: One simple trick. It's just one of those little… it's a TikTok video.


343

00:47:11.400 --> 00:47:12.600

Cacha Dora: Crafts!


344

00:47:12.600 --> 00:47:19.439

Marion: Yeah, but let me lay it out really, really simple, right? Lack of transparency breaks trust.


345

00:47:19.440 --> 00:47:23.210

Cacha Dora: Transparency builds trust.


346

00:47:23.210 --> 00:47:31.149

Marion: Right? It's really that basic. But here's the thing where it starts to get complicated: it's scale.


347

00:47:31.390 --> 00:47:38.810

Marion: All of this stuff is fine if you work in a small to medium enterprise, and apart from Bose being a big brand, they're still a relatively small company.


348

00:47:38.810 --> 00:47:39.420

Danny Gluch: Smaller, yeah.


349

00:47:39.420 --> 00:47:43.560

Marion: in comparison to their peer group within consumer electronics, right?


350

00:47:43.820 --> 00:47:48.090

Marion: They're a luxury brand, so they're smaller, but high-end.


351

00:47:48.590 --> 00:47:58.450

Marion: But when you start working in Fortune 500, Fortune 100 companies, where you've got millions of employees, global employees.


352

00:47:58.830 --> 00:48:18.359

Marion: you know, the analogy I always use, it's like… it's like the biggest cruise liner or oil tanker or cargo ship you can think of, and it's gone off course, right? Like, it's not like a speedboat that you can just course correct within a few, you know, within a minute. You're less than a minute, you're back on course.


353

00:48:18.360 --> 00:48:35.719

Marion: If you're, like, a gazillion ton tanker, and you've gone 180 off course, it's going to take a long time to be able to course correct, especially if you've got headwinds and tides and what have you coming at you, right? You can't do that overnight. It's exactly the same in an enterprise organization.


354

00:48:35.720 --> 00:48:47.029

Marion: And so you can have these issues that kind of manifest and then get bigger and bigger and bigger, but it takes a long time to course correct, whereas if you're smaller and nimble, like your friends at Bose.


355

00:48:47.740 --> 00:48:50.219

Marion: you can course correct a lot quicker. Now.


356

00:48:50.470 --> 00:48:56.510

Marion: I do… I say that, and I believe that, but I also think that size gets used


357

00:48:56.990 --> 00:48:59.479

Marion: As a convenient blocker.


358

00:48:59.910 --> 00:49:09.679

Marion: to fixing problems, right? Well, we're big, it'll take too long, da-da-da, but that's just bureaucracy talking its nonsense in a lot of cases, and I think that there's probably enough


359

00:49:10.930 --> 00:49:22.679

Marion: Examples, case studies out there where large companies have made relatively nimble decisions, and it has had a positive result, but conversely, it's probably had a neg… there's been negative


360

00:49:22.820 --> 00:49:37.999

Marion: results too, right? Similar situations, and it hasn't gone well, because what works for a small to medium enterprise does not work in a big company. I mean, there's a reason why, and Jeffrey, you and I have had this conversation, and Kasha and Danny and I have definitely had this conversation.


361

00:49:38.000 --> 00:49:43.260

Marion: There's a reason why all the great HR tech that exists today


362

00:49:43.260 --> 00:49:53.510

Marion: is not built for enterprise solutions yet, because it doesn't work in a complex matrix organization. And there's a reason why these complex matrix organizations have all the


363

00:49:54.820 --> 00:50:00.740

Marion: shall we say, less enjoyable technology to use, not naming any names.


364

00:50:00.740 --> 00:50:02.929

Danny Gluch: I was gonna say baggage, but yeah.


365

00:50:02.930 --> 00:50:03.610

Marion: Yeah, but…


366

00:50:03.610 --> 00:50:04.360

Danny Gluch: That's enjoyable.


367

00:50:04.680 --> 00:50:08.549

Marion: The tech solutions that are open to enterprise companies.


368

00:50:08.590 --> 00:50:10.420

Danny Gluch: Are not the greatest.


369

00:50:10.430 --> 00:50:19.699

Marion: they're not the most UI/UX enjoyable, and they're certainly not EX enjoyable. And…


370

00:50:19.840 --> 00:50:32.920

Marion: But that's the best that we have, because they work for these very complex organisational structures. And so, you know, it really isn't that straightforward. Like, it's easy to sit and bitch and moan and be like, oh, well, you know.


371

00:50:33.060 --> 00:50:46.289

Marion: But it really isn't. When you work in an enterprise organization, you see just how difficult it is to course correct, and the tools that you have at your disposal are not necessarily able to help you do that in a quick way, so…


372

00:50:46.830 --> 00:50:51.689

Marion: Unfortunately, employees are the collateral damage of that.


373

00:50:53.340 --> 00:50:58.690

Jeffrey Fermin: I want to add on to that point that you just made around, like, enterprise tech looking a little…


374

00:50:59.190 --> 00:51:05.310

Jeffrey Fermin: not so nice to look at, from a UX standpoint, or just the whole entire interface is always trash.


375

00:51:06.810 --> 00:51:16.210

Jeffrey Fermin: I want to say I've consulted or worked at nearly 10 HR tech companies, and with the exception of AllVoices,


376

00:51:16.510 --> 00:51:34.260

Jeffrey Fermin: the other 9 have been: when you get that first enterprise client, you essentially become a dev workshop for them, especially if you're a smaller startup. So you have to build in features that kind of go towards that Fortune 500 company and make sure that you are keeping them happy, and that…


377

00:51:34.400 --> 00:51:48.069

Jeffrey Fermin: kind of… ugh, makes the product not so fun and not so engaging for anyone, which is such a shitty way of approaching design. I have to give kudos to AllVoices and the product team there. You know.


378

00:51:48.470 --> 00:51:56.269

Jeffrey Fermin: They do a lot to make sure that, you know, we put our product roadmaps out there, we, you know, make it very transparent with folks, but


379

00:51:56.420 --> 00:52:04.080

Jeffrey Fermin: we work with a lot of enterprises. I don't want to name names, because you never know, churn and legal stuff, right? But reality is, like.


380

00:52:04.140 --> 00:52:23.049

Jeffrey Fermin: We understand what it takes to manage employee relations at, like, a large enterprise level, so we're able to kind of, like, build features out that are gonna affect the whole entire market, or all of our users and the people that we want to eventually gain. That said, that was not the case elsewhere. With Officevibe, we were a dev shop. With, you know, Butterfly.ai…


381

00:52:23.050 --> 00:52:30.039

Jeffrey Fermin: We got, you know, the big Fortune 1, right? So, we essentially became… we had a massive contract. We became their dev shop.


382

00:52:30.080 --> 00:52:38.440

Jeffrey Fermin: God, all the other small ones, I've seen it happen every single time. The second you get… you see that check come in, or you see the deal about to be closed.


383

00:52:38.520 --> 00:52:55.950

Jeffrey Fermin: you are just getting customer success tickets asking you to build things out left and right. So I say that as a product person, and it's been stressful having to be on that part of things, and I'm more than happy just sticking to marketing and having fun creating content with brilliant folks like you, so…


384

00:52:55.950 --> 00:53:01.190

Marion: Oh, no, we hear you. And I, you know, like, I have this thing, right? Like, I…


385

00:53:01.560 --> 00:53:11.019

Marion: I'm… I talk about this a lot, like, I'm super passionate about HR tech, having been the poor sod trying to curate


386

00:53:12.310 --> 00:53:19.790

Marion: a tech stack, a people tech stack, which gives you everything that you need to give your employees an incredible experience.


387

00:53:19.790 --> 00:53:39.670

Marion: But also try to balance that with not killing them with tech fatigue, right? Because you can get one thing that does one thing well, and another thing that does something else well, and another thing that does something else well. And before you know it, people are like, wait, what? I go where for what? And they get confused, and it just becomes a whole thing. And then, you know, the usage of it.


388

00:53:39.670 --> 00:53:40.360

Marion: It just…


389

00:53:40.360 --> 00:53:45.000

Marion: drops, because people are like, oh, this is shit, I don't know where to go for anything. Or they're just so…


390

00:53:45.020 --> 00:53:57.380

Marion: burned out by jumping between platform to platform, and, you know, it's not always as simple as an open API and white labeling and all of that. As great as that sounds, the reality is very different. So, like.


391

00:53:58.900 --> 00:54:10.640

Marion: God, I would… I am such a cheerleader for some of these smaller tech platforms, and I presume, like, that's where AllVoices started and what they've evolved into. Like.


392

00:54:10.770 --> 00:54:21.900

Marion: the company that can figure out a one-stop shop, right, that really does tick all of the boxes, that does a good job of being HRIS, employee relations.


393

00:54:22.420 --> 00:54:33.659

Marion: talent, like, if you can get a one-stop shop that does all of it and does it well and works for enterprise, it's like the holy grail of HR tech, right? The person who figures that one out.


394

00:54:33.780 --> 00:54:42.699

Marion: will never work a day in their life again, right? They've nailed it. They've solved half the big problems in the industry today. But unfortunately, it just…


395

00:54:42.940 --> 00:54:48.509

Marion: We're not there, and we can't quite figure it out, and so the tech that is available


396

00:54:49.210 --> 00:54:59.539

Marion: It's good in some ways, and shit in others, but it certainly doesn't create this incredible employee experience that we really want our people to have.


397

00:55:00.900 --> 00:55:16.229

Jeffrey Fermin: Sad note and good note. Sad note is that, before we got acquired, that was the end goal for Officevibe, now Workleap, right? Like, we wanted to have this massive ecosystem where every HR person can kind of, like, handle everything, you know, end-to-end, right?


398

00:55:16.230 --> 00:55:21.889

Jeffrey Fermin: Good note, I'm gonna send this over to Brian, our CEO, see if we can make something happen now.


399

00:55:22.730 --> 00:55:25.540

Jeffrey Fermin: Oh, God, I hope no competitors are listening. I…


400

00:55:25.830 --> 00:55:28.280

Danny Gluch: I would…


401

00:55:28.970 --> 00:55:32.199

Marion: Hire us, because we have all the answers.


402

00:55:32.910 --> 00:55:36.210

Cacha Dora: Super smart peoples on this podcast.


403

00:55:36.210 --> 00:55:37.620

Marion: I won'.


404

00:55:37.620 --> 00:55:41.360

Jeffrey Fermin: I want to add one more thing on that Bose example.


405

00:55:41.420 --> 00:55:58.359

Jeffrey Fermin: you know, there's… I strongly believe that was a sign of the times, where, like, yeah, they're a smaller organization, but, you know, maybe there was less technology, less data going around, and since we were kind of the first in line with, you know, this people analytics solution.


406

00:55:58.380 --> 00:56:12.500

Jeffrey Fermin: they used us, and they were like, wow, okay, there's all this information. Big data is kind of this trend. We need to make decisions fast, because that's kind of in vogue with the market right now, and we have to, you know, empower our folks to do that.


407

00:56:12.540 --> 00:56:27.769

Jeffrey Fermin: back in 2012-ish, when that was a thing to do, you could make those decisions and not have too many folks kind of chirping, telling you, like, oh, that's not the right thing to do. I feel like maybe now, you know, it's paralysis through analysis to make these types of changes.


408

00:56:27.770 --> 00:56:50.379

Jeffrey Fermin: You look at it, you kind of have to worry about the numbers, you know, budget, and then somebody from compliance has to go in and give their opinion on that, and they probably bring in some more analytics, and then somebody else brings in another sheet that says, I don't know, Deloitte found that remote work isn't all that good, from all these studies, right? You have all these kinds of reports, and it makes it difficult to make a decision.


409

00:56:50.380 --> 00:56:58.189

Jeffrey Fermin: So, could… could that have been just a sign of the times, and less technology, and one source of data


410

00:56:58.190 --> 00:57:11.239

Jeffrey Fermin: encouraging a company to make a decision, and now it's a little bit more difficult, because you have so many more variables that you have to take into account, and more people that you have to kind of please, for lack of a better phrase, in order to make that decision, right?


411

00:57:12.790 --> 00:57:31.630

Danny Gluch: Yeah, absolutely. It goes back to the politics. I think it's more complicated now, you know, and one of the things about going remote during COVID was it really stripped out all of that bureaucracy and politics, even in the finance industry, because you just had to. So companies just had to make it work. There wasn't…


412

00:57:31.630 --> 00:57:48.849

Danny Gluch: you know, oh, well, let's make a decision on this, and can you research that, and, you know, let's ask some of our companies that are in a similar industry and of a similar size, and see how they handled it. It was like, no, you just have to go, go, go, go, go. So, on one hand, that was kind of a…


413

00:57:49.110 --> 00:58:03.080

Danny Gluch: a nice little blessing to cut through all that red tape. Not that a global pandemic was a blessing. But, to sort of smooth through, right, and as we're, like, wrapping up, is…


414

00:58:03.210 --> 00:58:18.510

Danny Gluch: I realize it's been almost an hour. It's been a great conversation. But, to smooth through the… all the data, all of the vibes that you're feeling in the office, right, you're getting the signals. How do you go from that to…


415

00:58:18.730 --> 00:58:38.199

Danny Gluch: not necessarily the right decision, because there is no perfect decision, but one that is going to be beneficial in the long run. Is there a process? What have you seen successful organizations do from deciding they want to collect data to making a decision that is beneficial to the employees?


416

00:58:38.490 --> 00:58:39.750

Jeffrey Fermin: Got it. So…


417

00:58:40.390 --> 00:58:43.480

Danny Gluch: Turning data into action, right? So…


418

00:58:43.480 --> 00:58:46.000

Jeffrey Fermin: Plain and simple, you want to spend around…


419

00:58:46.740 --> 00:59:09.019

Jeffrey Fermin: With pulse surveys, you get the data every single week, but I personally feel like there's a good 2-3 month window where you can really get accurate results and understand what's going on in your organization. You've probably heard phrases like survey fatigue, and all the different things, like maybe somebody could incentivize someone to take a survey, so it kind of influences answers, right?


420

00:59:09.020 --> 00:59:12.540

Jeffrey Fermin: You want to make sure that you have a dataset that is


421

00:59:13.270 --> 00:59:25.229

Jeffrey Fermin: You know, free from any type of bias. And if you give, like, that 2-3 month window of collecting information, you'll get a true understanding of how your workers feel, how your leadership feels, the whole lay of the land.


422

00:59:25.410 --> 00:59:30.029

Jeffrey Fermin: That said, the next thing then becomes, you know, collect… seeing that information.


423

00:59:30.310 --> 00:59:46.989

Jeffrey Fermin: trying to analyze all these things. Back in the day, we did not have AI or sentiment analysis. We would just kind of go in and say, like, alright, based on this, you want to do this. Nowadays, the platforms are a lot more advanced, so I would say, you know, look into the suggestions, or, you know, maybe bring in an…


424

00:59:46.990 --> 00:59:57.700

Jeffrey Fermin: outside source, or don't even bring in an outside source. You have talented HR folks that are looking at these things nowadays, so just kind of come up with ideas based on what the industry leaders are doing.


425

00:59:57.730 --> 01:00:07.980

Jeffrey Fermin: What can we do to train our managers to improve their soft skills, because we see leadership being down? All that fun stuff. You can use data to tell the story of what actions to take next.


426

01:00:08.080 --> 01:00:22.590

Jeffrey Fermin: So, I think after you decide on that, come up with an internal comm structure and say, hey, we noticed that these scores are down, we want to improve on all these things, we're here for you, here are some initiatives that we want to maybe propose.


427

01:00:22.590 --> 01:00:38.000

Jeffrey Fermin: Do you think they could work? Do you want to send out a follow-up survey and just make it, like, a multiple selection of things that would make employees feel better, and understand, like, what are the things that are scoring the highest? What are the things that people truly want?


428

01:00:38.000 --> 01:00:53.920

Jeffrey Fermin: Get both quantitative and qualitative data, and then start taking action with another internal comm and say, okay, based on whatever we approved with our C-suite, here's what we're gonna do for you, right? Here's what we worked for, here's what we negotiated.


429

01:00:53.920 --> 01:01:11.020

Jeffrey Fermin: Wham bam, thank you, ma'am, here's a 3-month plan to get that out there. And understandably so, like, Marion brought up a great point. With larger enterprises, this might stretch out a lot longer. It could be a whole year, two, maybe even three years of, you know, reshaping how your organization runs.


430

01:01:11.020 --> 01:01:33.459

Jeffrey Fermin: It could be one year of, like, internal communication letting people know, like, hey, we're gonna collect pulse survey information for 6 months and understand what's needed in every single country, or department, or office, whatever the case may be, right? You get all this information, and the process remains the same: you get all the information, you analyze, and you bring it to your C-suite.


431

01:01:33.460 --> 01:01:38.499

Jeffrey Fermin: You kind of come up with a resolution, and then you just communicate it with your employees and do a slow rollout.


432

01:01:38.500 --> 01:01:40.310

Jeffrey Fermin: Or quick rollout if you're a startup.
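A minimal sketch of the loop Jeffrey outlines above: collect a 2-3 month pulse window, surface the lowest-scoring themes, and draft the "we heard you, here's what's next" comms. This is not from the episode or any vendor's product; the theme names, scoring scale, and helper functions are illustrative assumptions only.

```python
# Illustrative sketch only -- not from the episode, not any vendor's API.
# Collect a pulse window, rank the weakest themes, draft close-the-loop comms.

from collections import defaultdict
from dataclasses import dataclass
from datetime import date, timedelta
from statistics import mean

@dataclass
class PulseResponse:
    submitted: date
    theme: str          # e.g. "wellness", "manager relations", "flexibility" (assumed themes)
    score: int          # assumed 1-10 pulse score
    comment: str = ""

def collect_window(responses, window_days=90):
    """Keep only responses inside the 2-3 month collection window."""
    cutoff = date.today() - timedelta(days=window_days)
    return [r for r in responses if r.submitted >= cutoff]

def lowest_scoring_themes(responses, top_n=3):
    """Average score per theme; return the weakest themes to prioritise."""
    by_theme = defaultdict(list)
    for r in responses:
        by_theme[r.theme].append(r.score)
    averages = {theme: mean(scores) for theme, scores in by_theme.items()}
    return sorted(averages.items(), key=lambda kv: kv[1])[:top_n]

def draft_close_the_loop_message(priorities):
    """Draft the internal comms: acknowledge, name the themes, commit to next steps."""
    lines = ["We heard you. Here's what the last pulse window told us:"]
    for theme, avg in priorities:
        lines.append(f"- {theme}: averaging {avg:.1f}/10, and we're proposing actions here")
    lines.append("A follow-up survey is coming so you can vote on the proposals.")
    return "\n".join(lines)

if __name__ == "__main__":
    sample = [
        PulseResponse(date.today() - timedelta(days=10), "flexibility", 4),
        PulseResponse(date.today() - timedelta(days=20), "wellness", 6),
        PulseResponse(date.today() - timedelta(days=30), "flexibility", 5),
    ]
    windowed = collect_window(sample)
    print(draft_close_the_loop_message(lowest_scoring_themes(windowed)))
```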


433

01:01:41.520 --> 01:01:58.719

Marion: But do meaningful stuff, right? Just do stuff that's meaningful, and be honest, be transparent, build trust. Don't do the quick gratification just for the sheer hell of it, because it's that Red Bull effect, right? It doesn't last, and it doesn't sustain, and


434

01:01:59.000 --> 01:02:03.339

Marion: Don't use phrases like low-hanging fruit, either, because that just, you know…


435

01:02:03.760 --> 01:02:07.359

Marion: really undermines, I think, the value of what it is that you're trying to do.


436

01:02:07.360 --> 01:02:07.830

Danny Gluch: That's not the.


437

01:02:07.830 --> 01:02:14.920

Marion: We're not here for quick wins, we're here to do something meaningful that's really going to change the employee experience over the long term.


438

01:02:15.250 --> 01:02:33.830

Cacha Dora: And if people have taken the time to give you their opinion, the most an organization, or maybe the least, I don't know, whatever's the right word to use here, can do is to tell them, we got this, and this is what we're doing, right? Like, because I feel like if you don't give some kind of transparent message, everything happens in a void.


439

01:02:33.970 --> 01:02:46.370

Cacha Dora: And that void is then the assumption of they're doing nothing. And that sucks, because you know that there is work that's being done, but since you didn't say anything, and they haven't seen a result of their feedback, they think.


440

01:02:46.530 --> 01:02:57.309

Cacha Dora: nothing's been done. And then that's just, like, detrimental to the people who are doing the hard work that might be taking 3 years, you know, like what Jeff and Marion were saying. So it's like, say something.


441

01:02:57.980 --> 01:03:07.470

Marion: Or you get the opposite, where it's all jazz hands, it's all fur coat and no knickers, as we say in Glasgow, right? It's all show, no substance. So it's trying to find that balance between…


442

01:03:07.470 --> 01:03:09.289

Danny Gluch: We now have to log into some app.


443

01:03:09.520 --> 01:03:10.170

Danny Gluch: That's…


444

01:03:10.170 --> 01:03:15.040

Marion: But it's trying to find that balance between that… that…


445

01:03:15.830 --> 01:03:31.430

Marion: visibility that stuff's happening, there's progression happening, but it's not just some, like, you know, lack of substance Red Bull effect that really means nothing and isn't going to move the needle on anything. So, you're always trying to walk a line, and that's where our poor HR friends


446

01:03:32.060 --> 01:03:33.260

Marion: It's, it's tough.


447

01:03:33.480 --> 01:03:39.529

Jeffrey Fermin: I have one more thing. I gave a long, drawn-out way of, like, telling people how to do your job. Plain and simple.


448

01:03:39.850 --> 01:03:43.950

Jeffrey Fermin: Make your workplace fucking cool. That's it. That's it.


449

01:03:44.270 --> 01:03:47.020

Marion: Everyone was…


450

01:03:47.020 --> 01:03:50.840

Cacha Dora: Throwing metal hands and horns, everyone, you just couldn't see it.


451

01:03:50.840 --> 01:03:54.000

Marion: Yeah, exactly. Make work suck less!


452

01:03:54.000 --> 01:03:54.380

Jeffrey Fermin: Sit.


453

01:03:55.500 --> 01:04:09.929

Danny Gluch: You know, it's one of those, I think, having a motto for why you're doing these surveys would be really beneficial, right? Do you want to make work suck less? Do you want this to be a workplace that everyone


454

01:04:09.980 --> 01:04:21.110

Danny Gluch: looks forward to coming to, right? Or, you know, they look forward to signing in Monday morning, or are they posting all weekend about already having the Sunday Scaries, right? Like, that's…


455

01:04:21.480 --> 01:04:28.370

Danny Gluch: That's the difference, and that's why you're gonna, you know, ask these questions, and that's why you're gonna make these decisions, and


456

01:04:28.580 --> 01:04:36.909

Danny Gluch: take these actions, because that's gonna fix culture. Just doing the surveys isn't going to fix the culture. In fact, it could make it worse. And…


457

01:04:37.140 --> 01:04:43.340

Danny Gluch: Yeah, thank you so much. This was such a fun conversation, Jeffrey. Where can people find you? What are you working on?


458

01:04:43.510 --> 01:04:58.160

Jeffrey Fermin: Oh my goodness, so you can find me on LinkedIn, and yeah, Jeffrey Fermin, J-E-F-F-R-E-Y, I don't know, common misspelling, people do E-R-Y, and Fermin is F-E-R-M-I-N. I'm on Instagram as well, Fermin Talks Work.


459

01:04:58.160 --> 01:05:09.669

Jeffrey Fermin: And Threads, that's my new favorite obsession. Fermin Talks Work as well, I'm owning that handle. TikTok, Threads, Instagram, meet me there. Get outta X, that place is a cesspool. I don't know if…


460

01:05:10.490 --> 01:05:11.790

Jeffrey Fermin: Alright, I don't even know.


461

01:05:11.790 --> 01:05:13.340

Danny Gluch: Sponsorship?


462

01:05:13.340 --> 01:05:20.369

Jeffrey Fermin: Sorry, no, no, no. Unless X sponsors this podcast and gives them millions and millions of dollars, that's the only way I'll like you guys.


463

01:05:20.990 --> 01:05:22.430

Jeffrey Fermin: And I am…


464

01:05:22.430 --> 01:05:46.670

Jeffrey Fermin: Currently working at AllVoices. I am their head of demand gen. I get to interview brilliant people all day, every day. And Emily Fennec and I, we are the two marketing people that fight for HR folks every single day. So, yeah, we love you guys, we're here for you, and yeah, give AllVoices a look if you want to get down to any employee relations issues.


465

01:05:46.670 --> 01:06:01.559

Jeffrey Fermin: Solve them, manage them, use AI to do them. I don't know, I'm not a good salesperson for the company. I will say, what we do, we actually impact workplaces, and we change people's lives, so that is a big mission statement, right there. And yeah, check us out. We love you guys.


466

01:06:01.740 --> 01:06:14.919

Marion: Yeah, links in the show notes, all of Jeffrey's contact details, and they should check out your podcast as well, which I've been on. I was honoured to be a guest, and it's a lot of fun.


467

01:06:15.080 --> 01:06:30.419

Marion: And yeah, there was a guest appearance from our cat Izzy as well, so, you know, if anyone wants to see Izzy clawing onto me while I'm talking psychological safety, you can find it on AllVoices. Yeah.


468

01:06:30.420 --> 01:06:45.100

Danny Gluch: Well, be sure to check out that podcast. Again, links in the show notes. Be sure to subscribe to this feed so you can get all of our episodes as they come out. We're trying to do them every two weeks, with a few breaks for the holidays and summer.


469

01:06:45.290 --> 01:06:53.080

Danny Gluch: Be sure to leave a 5-star review, that helps everyone be found in the algorithms of Spotify and Apple Podcasts, and even LinkedIn.


470

01:06:53.240 --> 01:06:56.080

Danny Gluch: Thank you all very much, we'll see you next time.