The Elephant in the Org

AI Will Save Us! The Great CEO Gaslight of the Modern Industrial Revolution with Felix Mitchell (AI Special - Part 2)

The Fearless PX Season 3 Episode 3

“AI will make your job easier.”
Translation: Here’s a shiny new tool — now do 10x more with half the resources.

In Part 2 of our AI Trilogy, we’re joined by Felix Mitchell (Co-CEO of Instant Impact, host of We Need to Talk About HR) to unpack how leaders are spinning AI as a miracle cure — while quietly restructuring work, cutting jobs, and pushing employees harder than ever.

We dig into:

  • The 3 phases of AI adoption (and why most companies are stuck in Phase 1).
  • Why CEOs love to announce bold AI visions but skip the messy middle.
  • How “just upskill” became the new corporate gaslight when no one can define the skills.
  • The growing tension between efficiency, engagement, and psychological safety.
  • Why mid-size companies might actually come out on top in this industrial revolution on fast-forward.

This isn’t another glossy keynote about innovation. It’s a raw look at the human cost of AI hype — and why psychological safety may be the most important currency of the future of work.

Link to Show Notes Here

🐘 Connect with Us:

🚀 Follow The Fearless PX on LinkedIn: The Fearless PX
📩 Got a hot take or a workplace horror story? Email Marion, Cacha, and Danny at elephant@thefearlesspx.com

🎧 Catch every episode of The Elephant in the Org: All Episodes Here

🚀 Your Hosts on LinkedIn:

🐘Marion Anderson

🐘Danny Gluch

🐘Cacha Dora

💬 Like what you hear?
Subscribe, leave a ★★★★★ review, and help us bring more elephants into the light.

🎙️ About the Show

The Elephant in the Org drops new episodes every two weeks starting April 2024.
Get ready for even more fearless conversations about leadership, psychological safety, and the future of work.

🎵 Music & Production Credits

🎶 Opening and closing theme music by The Toros
🎙️ Produced by The Fearless PX
✂️ Edited by Marion Anderson

⚠️ Disclaimer

The views and opinions expressed in this podcast are those of the hosts and guests, and do not necessarily reflect any affiliated organizations' official policy or position.

Felix Transcript



3

00:00:03.920 --> 00:00:09.830

Danny Gluch: Welcome to The Elephant in the Org, everyone. I'm Danny Gluch, and I'm joined, as always, by my co-hosts Cacha Dora


4

00:00:09.830 --> 00:00:10.920

Cacha Dora: Hello!


5

00:00:10.920 --> 00:00:12.339

Danny Gluch: And Marion Anderson.


6

00:00:12.340 --> 00:00:13.809

Marion: Good morning!


7

00:00:14.190 --> 00:00:21.909

Danny Gluch: And today we have a special guest. We have Felix Mitchell. Felix, why don't you introduce yourself to our listeners?


8

00:00:22.150 --> 00:00:33.379

Felix Mitchell: Thanks, Danny. It is fantastic to be here. I'm Felix Mitchell, co-CEO of Instant Impact. We are a


9

00:00:33.550 --> 00:00:55.210

Felix Mitchell: professional services business. We help companies with their talent acquisition function, kind of working as their in-house recruitment team. And we're more and more getting into helping with transformation as well, so helping businesses get to grips with how their existing workforce and teams need to change in this


10

00:00:55.480 --> 00:00:59.919

Felix Mitchell: wild world of AI, where everything seems to change.


11

00:01:00.180 --> 00:01:26.270

Felix Mitchell: It's wild. I get very excited about very, very mundane things, Marion. It's one of my superpowers. And I think the other thing is, I'm host of a podcast called We Need to Talk About HR. It's kind of my really lame hobby, but it is a lot of fun. I get to meet some amazing, amazing folks.


12

00:01:26.660 --> 00:01:33.380

Marion: No, we dig that one. We understand; it's exactly the same for us. It's a labor of love, but


13

00:01:33.380 --> 00:01:34.440

Marion: it's definitely


14

00:01:34.440 --> 00:01:53.249

Marion: worth it. And yeah, we're actually referencing, I think, one of your big episodes today, because that was one that really stood out to us. It was your 100th episode, a live episode hosted at LinkedIn. They have a fabulous new creative space in London, I believe, and.


15

00:01:53.250 --> 00:01:55.090

Felix Mitchell: They certainly do. It's amazing.


16

00:01:55.090 --> 00:02:07.579

Marion: Yeah, I was fascinated by, what was it, the thing the guy said about how the light flashes to represent the number of hires, like, per minute, or something like that. It sounded really cool.


17

00:02:08.039 --> 00:02:11.489

Felix Mitchell: It did, yeah. So their logo, as you go in, is sort of


18

00:02:11.490 --> 00:02:12.020

Marion: Hmm.


19

00:02:12.020 --> 00:02:19.090

Felix Mitchell: backlit by LED. And yeah, every time that flashes, someone's found a job through LinkedIn, which is pretty


20

00:02:19.300 --> 00:02:24.850

Felix Mitchell: interesting. Also interesting how heavily they index on being a job-finding platform. I wasn't expecting that.


21

00:02:25.160 --> 00:02:28.399

Marion: Yeah, it's definitely shifted from.


22

00:02:28.400 --> 00:02:32.498

Marion: Hmm, yeah, when it was the professional Facebook, right?


23

00:02:32.950 --> 00:02:40.517

Felix Mitchell: Yeah. Well, given all the videos that we all, like, put out on there, I think you guys are as guilty as I am on that.


24

00:02:41.160 --> 00:02:41.550

Cacha Dora: Yeah.


25

00:02:42.120 --> 00:03:03.800

Danny Gluch: I mean, it's like all social media. I don't know if you guys have seen The Social Network, the movie, where they talk about, like, no, first off, just "the Facebook," and that's cooler. Because a lot of the early social media was about being cool, about being a place to hang out, and it was great.


26

00:03:03.800 --> 00:03:16.070

Danny Gluch: And then all of a sudden they needed to make money, and it shifted, and a lot happened, and it became about, oh, well, it's actually about influencers, and how do we drive people towards commerce and advertising?


27

00:03:16.090 --> 00:03:43.790

Danny Gluch: And LinkedIn, I think, had a similar pivot towards job listings, job postings, job finding. And I think that's changing even more now with the introduction of AI. What we wanted to talk about a lot in this episode is how AI and the workforce are overlapping and shifting, and what leadership looks like, what recruitment looks like, what job searching looks like, what jobs will look like.


28

00:03:43.790 --> 00:03:49.519

Danny Gluch: And I think that's a really important topic for us to talk about, because everyone sees it coming.


29

00:03:49.600 --> 00:04:17.790

Danny Gluch: People are either, like, super uber excited for AI, or they are terrified. And honestly, I'm not sure where I am anymore. I was really excited. I was a really early adopter. I've got friends who work for OpenAI and things like that, and I've used it heavily. And then it kind of showed some warts, and I was like, maybe I should be pumping the brakes a little bit


30

00:04:17.860 --> 00:04:26.479

Danny Gluch: and then, listening to your podcast, Felix, I'm like, okay, maybe there is just a good blend. And I don't really know where I feel. But I feel like


31

00:04:26.590 --> 00:04:28.399

Danny Gluch: we need some people


32

00:04:28.570 --> 00:04:35.919

Danny Gluch: who are willing just to ask questions and talk and share honestly and openly about what's going on with AI in the workforce.


33

00:04:35.920 --> 00:04:43.150

Felix Mitchell: Hey, let me just inject here, because I think when you're paying attention to the capability of the technology,


34

00:04:43.150 --> 00:04:43.770

Marion: Okay.


35

00:04:43.770 --> 00:04:53.740

Felix Mitchell: you do spend a lot of time going, oh man, is this terrifying? Is it really exciting? What's my job going to look like? The thing that just gets me really excited about it, and


36

00:04:54.360 --> 00:04:57.099

Felix Mitchell: I think gets lost in the noise is


37

00:04:58.740 --> 00:05:12.980

Felix Mitchell: we're in this unique moment in human history, where there's this epoch-defining shift in the way that everything gets done, with the only limit on its adoption,


38

00:05:13.200 --> 00:05:19.240

Felix Mitchell: and how much it's going to change things being the pace at which humans and businesses can shift.


39

00:05:19.410 --> 00:05:38.570

Felix Mitchell: And so many of us are in positions where we're shaping the thinking around it. You guys, you're influencing the people leaders and organizations that are going to change the face of work. And those people who are leading businesses, they are going to create the new rules


40

00:05:38.730 --> 00:05:39.530

Felix Mitchell: of the workplace.


41

00:05:39.530 --> 00:05:40.220

Danny Gluch: Isn't.


42

00:05:41.190 --> 00:05:50.030

Felix Mitchell: Normally, change just happens to people. But we just happen to be in these worlds where we can point it and steer it. And how cool is that.


43

00:05:50.510 --> 00:06:00.810

Cacha Dora: Yeah, I love that, 'cause I think normally, when we think about change overall, it's an event, right? It's like a moment in time. It's something you can really pinpoint.


44

00:06:01.343 --> 00:06:19.830

Cacha Dora: Or it happens to you. Versus, at this point, we're talking about AI as a change point, as a tool. So the big moving thing isn't so much the moment in time as it is the tool, and how we interpret the tool, how we use the tool, how we, like you were saying, right, shape the tool,


45

00:06:20.345 --> 00:06:25.919

Cacha Dora: and I think that's such a different way of looking at change, because we're so used to, I mean,


46

00:06:26.270 --> 00:06:28.180

Cacha Dora: how many change theories exist?


47

00:06:28.180 --> 00:06:28.900

Marion: Mm.


48

00:06:29.190 --> 00:06:36.390

Cacha Dora: All of them deal with like human behavior or human psychology, and like how we respond to the moment.


49

00:06:36.730 --> 00:06:39.440

Cacha Dora: None of them talk about how we respond to a tool.


50

00:06:41.670 --> 00:06:42.140

Felix Mitchell: Yeah.


51

00:06:42.140 --> 00:06:44.160

Cacha Dora: It's new. It's completely new.


52

00:06:44.160 --> 00:06:51.520

Marion: Yeah, I mean, kind of piggybacking on what Danny said, I'm having a bit of a


53

00:06:52.450 --> 00:07:05.739

Marion: crisis of conscience, I think, around it, because I'm a heavy user, a heavy adopter. I'm doing a PhD, you know; I'm constantly embracing this technology, and in so many ways it's changing my life for the better.


54

00:07:05.870 --> 00:07:08.930

Marion: but from a practitioner standpoint


55

00:07:09.210 --> 00:07:14.385

Marion: I am scared about what I'm seeing around me. You know, one of the things


56

00:07:15.040 --> 00:07:36.009

Marion: that you called out in that episode, Felix, was, you know, we're at this inflection point in human history. Now, today, one developer can do in, like, three days what it used to take an entire team of developers six months, right? Which is phenomenal from a productivity standpoint, and cost, and all of that.


57

00:07:36.150 --> 00:07:47.649

Marion: But then there's the human piece. And there's two things that I see. I see companies throwing AI tools at their teams and going,


58

00:07:47.770 --> 00:07:53.510

Marion: have at it. There you go, there's the tools. Now go be fabulous and 10x your goals, right?


59

00:07:53.870 --> 00:07:58.439

Marion: And there's a lot of stealth, and not-so-stealth, pressure there.


60

00:07:58.560 --> 00:08:03.450

Marion: The other thing that I see is, we hear a lot of, you know,


61

00:08:03.620 --> 00:08:16.513

Marion: the world of work is changing with this technology, so we have to reskill and we have to upskill. Which is true, but no one can actually tell you what that means. What does that actually mean? And I feel like


62

00:08:17.280 --> 00:08:19.060

Marion: There's a lot of kind of


63

00:08:19.690 --> 00:08:25.450

Marion: I don't know, smoke and mirrors. There's a lot of concern. There's a lot of waffle. And underneath all of that,


64

00:08:25.610 --> 00:08:35.350

Marion: we're seeing steady job losses in technology. So it's a really weird time in our industrial revolution. And.


65

00:08:35.350 --> 00:08:36.020

Danny Gluch: Behave.


66

00:08:36.330 --> 00:08:41.529

Marion: I don't know. It's making me lose sleep. In a way, it's quite scary.


67

00:08:41.539 --> 00:08:43.398

Cacha Dora: Industrial Revolution 2.0.


68

00:08:43.770 --> 00:08:44.420

Marion: Yeah.


69

00:08:45.990 --> 00:08:47.410

Felix Mitchell: Yeah, it is.


70

00:08:47.680 --> 00:08:57.169

Felix Mitchell: It is. I mean, you guys know way better than I do, any change is scary, right? We're hardwired to reject change. I think the


71

00:08:59.040 --> 00:09:01.280

Felix Mitchell: do you want my perspective


72

00:09:01.650 --> 00:09:02.860

Felix Mitchell: on where we're at?


73

00:09:02.860 --> 00:09:03.440

Danny Gluch: 100 percent.


74

00:09:03.440 --> 00:09:04.550

Marion: Absolutely. Make it.


75

00:09:04.550 --> 00:09:05.685

Cacha Dora: Spicy.


76

00:09:06.820 --> 00:09:14.070

Felix Mitchell: I see. Cool. So, I don't know how to make it spicy; I'll work on it.


77

00:09:14.070 --> 00:09:14.780

Marion: Ha! Ha! Ha!


78

00:09:16.050 --> 00:09:23.409

Felix Mitchell: I think, to your first point, around everyone kind of throwing tools at everyone else, and everyone says that,


79

00:09:24.280 --> 00:09:27.400

Felix Mitchell: you know, these efficiencies will come, and they won't.


80

00:09:29.120 --> 00:09:30.600

Felix Mitchell: I think there's a lot of


81

00:09:30.880 --> 00:09:37.119

Felix Mitchell: leaders who kind of get the endpoint,


82

00:09:37.220 --> 00:09:40.469

Felix Mitchell: but none of the like 50 points in between


83

00:09:40.650 --> 00:09:47.479

Felix Mitchell: sort of where they get the destination, but do not understand what's involved in,


84

00:09:47.580 --> 00:09:51.760

Felix Mitchell: I mean, even the mode of transport to get there.


85

00:09:51.950 --> 00:10:00.540

Felix Mitchell: It's not even whether you're doing roads or train tracks or flying, or a combination of all of them.


86

00:10:00.880 --> 00:10:02.159

Felix Mitchell: I think the


87

00:10:02.690 --> 00:10:13.820

Felix Mitchell: thinker that I find best on this, or most useful to how we've conceptualized it at Instant Impact, is Josh Bersin. A lot of what we do is built on


88

00:10:14.270 --> 00:10:15.670

Felix Mitchell: his thinking.


89

00:10:16.420 --> 00:10:34.800

Felix Mitchell: I'm gonna butcher it, so I apologize. But essentially, there are three big phases of AI development within an organization. He does four, but I do three. The first being, I think, what you were pointing out before, Marion, which is


90

00:10:35.650 --> 00:10:52.179

Felix Mitchell: what he calls the assist phase. It's like, here's some tools, here's an LLM, here's a copilot, go figure it out. The vision around AI has been completely democratized, and the accountability has been democratized. I think where most companies are right now is, that's just not working,


91

00:10:52.290 --> 00:11:14.000

Felix Mitchell: because you've got a CEO who understands the destination is a complete rewiring of the business, and you've got people who are, quite understandably, using AI in their work and getting to, like, I don't know, 10% productivity improvement. But when you roll up all the personal improvement, it doesn't work at an organizational level.


92

00:11:14.030 --> 00:11:27.479

Felix Mitchell: So, getting between phase one and phase two is where most businesses are getting stuck and really frustrated right now. Phase two is what's called the augment phase. It's where you've got


93

00:11:27.600 --> 00:11:55.569

Felix Mitchell: really powerful AI agents that are augmenting existing processes and ways of working, and you're talking, like, 30 to 50% productivity improvement. And this is where we are at Instant Impact at the moment. We brought in some really cool AI, some of them agents, some of them smaller pieces of technology, and they're just building loads of efficiency, and they make our people better at what they're doing. We're not at the point at this stage where we're talking about an


94

00:11:55.830 --> 00:12:03.260

Felix Mitchell: organizational restructure yet. You know, we're just doing the same work better than we were before.


95

00:12:04.080 --> 00:12:14.149

Felix Mitchell: Which is really exciting. But as with any other technological change, it doesn't mean that people get any less busy. It just means that the output goes up.


96

00:12:16.960 --> 00:12:29.510

Felix Mitchell: The thing that people struggle with is that the democratized approach to AI adoption that works in phase one does not work in phase two. It doesn't get you to phase two, because you have to make big procurement decisions.


97

00:12:29.610 --> 00:12:52.310

Felix Mitchell: And you need to be really purposeful about what you're doing and why you're doing it. And, to the point you were talking about before, Danny, there's a big change element to this, right? You have to roll it out, you have to communicate it, you have to train people, you have to do X, Y, and Z. People need a really significant tweak to the way that they approach work. I think you were saying before,


98

00:12:52.310 --> 00:13:13.030

Felix Mitchell: you know, you're a heavy user. I think people need to change their way of thinking. Not, okay, I've got work to do, I'm going to go do the work. It's, I've got work to do; can I apply AI to this? Okay, how do I apply AI to this? Oh, wait, I can't apply AI to this, I'm just going to go and do it. So there's a whole other loop that's involved in the second


99

00:13:13.030 --> 00:13:24.869

Felix Mitchell: phase, and if you don't train people on it, you just get adoption that doesn't stick. I heard a stat the other day that with normal change programs, you've got about a 70% failure rate; with AI change programs, it's more like 80,


100

00:13:25.060 --> 00:13:40.210

Felix Mitchell: just because it requires such a rethinking. Then the next phase is when you start building in an orchestration layer, where you get AI really automating the work, and it's moving things between these different agents across the organization.


101

00:13:40.680 --> 00:14:01.650

Felix Mitchell: And the reason why companies don't get from phase two to phase three is because that requires, like, a full, big old piece of org design. That's where you're like, okay, well, how do we work? How do we staff projects if knowledge doesn't mean as much anymore? What does that mean for how we do our work at the end of the day? I think we're so far off of that conversation, and.


102

00:14:01.650 --> 00:14:02.149

Danny Gluch: I think that.


103

00:14:02.150 --> 00:14:03.300

Felix Mitchell: Why, there's this.


104

00:14:03.860 --> 00:14:14.179

Felix Mitchell: What was it that people talked about in Covid? There's this cognitive dissonance... I shouldn't be throwing around psychology words with you guys.


105

00:14:14.260 --> 00:14:17.323

Cacha Dora: You got a lot of head nods. We're tracking.


106

00:14:17.630 --> 00:14:18.290

Felix Mitchell: I can't.


107

00:14:18.290 --> 00:14:20.999

Felix Mitchell: That, like, red flag was, like, out of my depth, back off.


108

00:14:21.485 --> 00:14:21.970

Cacha Dora: No.


109

00:14:22.415 --> 00:14:23.760

Marion: No, spot on.


110

00:14:23.760 --> 00:14:26.350

Marion: Yeah, you're spot on. Excellent way to put it.


111

00:14:26.350 --> 00:14:30.589

Felix Mitchell: So it really feels like a kind of organizational dissonance between.


112

00:14:31.100 --> 00:14:31.770

Felix Mitchell: And


113

00:14:32.310 --> 00:14:41.239

Felix Mitchell: what the CEO wants. I'm being really reductive here. The CEO, because they really get the destination, they're super excited about it, but they think that's going to be in six months, not in


114

00:14:41.330 --> 00:15:03.440

Felix Mitchell: two or three years. Then you've got a bunch of people who've been asked to do something really, really complex and given zero training, zero resources, zero budget, and zero break from their actual job to be able to figure out how to do it. And then everyone's like, let's upskill! And no one knows the fucking skills in the first place, because this is new for all of us.


115

00:15:03.440 --> 00:15:06.110

Felix Mitchell: I think that's what's going on right now.


116

00:15:06.200 --> 00:15:07.460

Marion: Yeah, I


117

00:15:07.620 --> 00:15:18.219

Marion: yes, absolutely, we're in the same place. And what is, I think, exacerbating all of this, right, is our external environment right now, you know.


118

00:15:18.250 --> 00:15:30.860

Marion: I mean, I can't even remember where we were at last time I looked. It was like 90,000 tech layoffs since the beginning of the year; I'm sure it's more than that now. I think a stat that we pulled was something like 10,000


119

00:15:31.140 --> 00:15:49.399

Marion: of them have been directly attributed to AI, and I'm sure there's a lot more. But there's a lot of other things going on as well, right? Return to office, that stealth "if you don't want to come back to the office, you can leave" thing. There's all of that happening as well,


120

00:15:49.550 --> 00:15:56.139

Marion: and the job market is fucked. So what we've got is all of these competing


121

00:15:56.430 --> 00:16:24.740

Marion: threats, effectively, coming in, and employees who are just trying to keep the lights on, keep food on the table, and be able to pay the mortgage. And so they are trying to do everything that they're being asked to do, like embrace the technology, which is great, but again, without the tools, without the training, without the support, without the time to really investigate it, and still with the pressure to deliver.


122

00:16:25.140 --> 00:16:26.330

Marion: and so


123

00:16:26.800 --> 00:16:40.879

Marion: really concerned about the psychological state, the psychological safety, of our employees at this time, because we've literally whiplashed from being, you know.


124

00:16:41.010 --> 00:16:43.790

Marion: I hate this phrase, "candidate-driven market," but you know,


125

00:16:44.010 --> 00:16:48.299

Marion: the employee held all the cards not that long ago.


126

00:16:48.700 --> 00:16:51.349

Danny Gluch: We couldn't be further away from that now.


127

00:16:52.110 --> 00:17:00.380

Marion: I'm really worried about the legacy of this, and how we kind of continue to evolve past this. We didn't see


128

00:17:00.970 --> 00:17:05.879

Marion: we saw it coming, but I don't think we saw it coming like this. And so


129

00:17:06.140 --> 00:17:14.050

Marion: I don't know. The human piece in here is so important, and I think that we're doing some really sustained damage.


130

00:17:14.550 --> 00:17:16.969

Felix Mitchell: And have been for the last six years, right?


131

00:17:16.970 --> 00:17:17.630

Marion: Yeah, I guess.


132

00:17:17.630 --> 00:17:17.960

Cacha Dora: -


133

00:17:17.960 --> 00:17:20.740

Felix Mitchell: It's been really tough. Sorry, Danny.


134

00:17:20.740 --> 00:17:34.029

Danny Gluch: Yeah, no, I think it has been. For about that six years it's been rough. And on your podcast, on the 100th episode, you mentioned that there were 50% of US workers stressed about job insecurity


135

00:17:34.390 --> 00:17:38.119

Danny Gluch: or 54%, 56%.


136

00:17:38.230 --> 00:17:40.340

Danny Gluch: I wrote it down somewhere.


137

00:17:40.340 --> 00:17:40.670

Marion: I'm.


138

00:17:40.670 --> 00:17:41.210

Danny Gluch: But like.


139

00:17:41.210 --> 00:17:43.440

Felix Mitchell: That sounds about right. Yeah.


140

00:17:43.440 --> 00:17:55.190

Danny Gluch: Over 50%. And I just wonder how much of that is the actual, like, ask that Marion was talking about, of, I'm being asked to do more, and I'm not being given time to learn it.


141

00:17:55.400 --> 00:18:07.609

Danny Gluch: And how much is, I'm also hearing on social media and in the news from these tech leaders about how this is going to do everything, and this is going to replace my entire industry, and


142

00:18:07.620 --> 00:18:32.280

Danny Gluch: what am I going to do? There really is that sort of industrial-revolution feel. Should I just go start sculpting and making art or YouTube videos or video games? Because I don't know what I'm going to be doing in, you know, three years, let alone 10 years from now, and I'm not going to retire for another 20 years. And I think there's a little bit of that, like, actual concern.


143

00:18:32.567 --> 00:18:46.660

Danny Gluch: But I think the big concern is more the day-to-day of, just, I'm being asked to use this tool, and I don't really know how to incorporate it into my workflow. And, Felix, you gave a better description of how it should be used


144

00:18:46.660 --> 00:19:00.910

Danny Gluch: than I've ever heard. Which was the simple: instead of just, hey, I've been asked to do this thing, let me just go do it, let me spend a few minutes first and see if there is a role that AI can play in this.


145

00:19:00.910 --> 00:19:17.929

Danny Gluch: And this is where I actually, as an L&D professional, think that the upskill is less of an upskill and more of a mindset shift, a workflow change. The workflow just needs to change. You just need a sticker on your monitor that says: pause, can AI help?


146

00:19:18.230 --> 00:19:18.840

Marion: Hmm.


147

00:19:19.350 --> 00:19:37.159

Danny Gluch: And just look at that, like, 20 times a day, and eventually you'll find your own way. And I think that's where there's a lot of fear: people are looking for direction and to be told. And this happens in the US workforce; it's very much a do-what-you're-told


148

00:19:37.410 --> 00:19:48.069

Danny Gluch: mindset, you know. That's how our education system works, too. And when the young generation gets into the workforce, they're very much, like, let me do what I'm told to do, because that's what I've been trained to do.


149

00:19:48.930 --> 00:20:08.050

Danny Gluch: But now we're asking them to, like, take ownership. You find the edges, you find the efficiencies, you find the, you know, extra quality in your own work, on your own time, by using this tool. And it's just such a different mindset, really. I just want to put a Post-it note on everyone's monitors.


150

00:20:08.050 --> 00:20:15.459

Felix Mitchell: Well, I'm glad that you mentioned mindset, because I think one of the


151

00:20:15.950 --> 00:20:30.810

Felix Mitchell: most impactful shifts that we've made in our business over the last, I mean, probably our entire history, is this year we have really focused on, you know, Carol Dweck's growth mindset.


152

00:20:30.980 --> 00:20:32.350

Marion: And.


153

00:20:32.610 --> 00:20:47.390

Felix Mitchell: We did it because we're going through, as a lot of businesses are, a huge amount of change. You know, layer on technology, and the fact that recruitment and hiring, still the vast majority of what we do as an organization, is


154

00:20:47.510 --> 00:20:50.620

Felix Mitchell: in a more challenging place than it's been in.


155

00:20:51.170 --> 00:20:57.810

Felix Mitchell: I think, since Covid, since the blip of Covid, and before that, since 2008.


156

00:20:58.000 --> 00:21:19.969

Felix Mitchell: Layer on top of that everything that's going on in geopolitics, everything that's going on with the economy. So we need to be a business that's fast and agile and changes and does all the rest of it, and we realized that our mindset was not geared that way. And so we've done a big rollout of growth mindset, really, really focusing on the fact that


157

00:21:19.970 --> 00:21:29.870

Felix Mitchell: we don't just need to grow our top line and our bottom line. We also recognize that we don't know everything that we need to know to be a successful business. And so we're really


158

00:21:29.940 --> 00:21:38.200

Felix Mitchell: emphasizing the importance of experimentation and of learning and of quick failure, and of setting really high


159

00:21:38.310 --> 00:21:42.589

Felix Mitchell: bars and targets for ourselves, and just rethinking everything the whole time.


160

00:21:42.840 --> 00:21:46.950

Felix Mitchell: And that's been an incredibly energizing experience.


161

00:21:46.950 --> 00:21:47.510

Danny Gluch: Hmm.


162

00:21:48.680 --> 00:21:52.570

Felix Mitchell: And I don't know where we'd be without it.


163

00:21:53.020 --> 00:21:53.590

Marion: Hmm.


164

00:21:53.590 --> 00:21:57.380

Felix Mitchell: Just because there's been so much foundational shift.


165

00:21:57.890 --> 00:22:01.229

Marion: Yeah, yeah. You know,


166

00:22:02.230 --> 00:22:10.280

Marion: having worked in startups and in enterprise, you've got two real polarities.


167

00:22:10.410 --> 00:22:38.501

Marion: On the SME side, you are able to be much more nimble; you are able to really kind of embrace agile thinking and all of that stuff. In an enterprise, it's like, you know, a gazillion-ton oil tanker that's drifted 400-plus degrees off course, because it's more than 360, right? And you're trying to course-correct. And it's as slow as a week in jail, as we say in Glasgow.


168

00:22:40.070 --> 00:22:44.880

Marion: And I think that's where it can get really tough


169

00:22:45.060 --> 00:23:12.180

Marion: for the employees. Like, you know, I'm a huge proponent of the methodologies that you're applying, and I think that's such a great way to enhance your business but also nurture your talent, right? You're teaching them great habits, great life skills that will not only enhance what they do for you but will enhance their lives generally. But I think that isn't the reality sometimes in big orgs.


170

00:23:12.180 --> 00:23:14.400

Felix Mitchell: I mean, Microsoft did it, right?


171

00:23:14.840 --> 00:23:19.819

Marion: Yes, but also, look. I mean, again, where I was going was about engagement.


172

00:23:19.820 --> 00:23:20.180

Felix Mitchell: Sorry.


173

00:23:20.180 --> 00:23:22.940

Marion: Look how many layoffs Microsoft has had.


174

00:23:22.940 --> 00:23:24.180

Felix Mitchell: But huge numbers.


175

00:23:24.180 --> 00:23:30.870

Marion: This year alone, right? If you look at FAANG, or, you know, whichever tech giants you're paying attention to,


176

00:23:31.010 --> 00:23:34.740

Marion: layoffs have been significant, and


177

00:23:34.980 --> 00:23:57.429

Marion: the kind of parallel to that is the absolute, you know, demise of employee engagement. Employee engagement's in the toilet. Every time I pick up something by Gartner, it's like, you know, it's down, it's down, it's down. And you only have to spend time with anyone in a corporate setting and you can feel it; it's, like, permeating. So, again,


178

00:23:58.280 --> 00:24:00.290

Marion: pulling those threads together.


179

00:24:00.430 --> 00:24:03.470

Marion: Psychological safety, engagement,


180

00:24:03.750 --> 00:24:14.439

Marion: the, you know, rapid change, the need for more agility, and the external environment, geopolitics, all of that.


181

00:24:15.260 --> 00:24:19.790

Marion: It's like this, just this big powder keg moment.


182

00:24:20.220 --> 00:24:25.620

Marion: And I often feel like, you know, we're fairly good at reading tea leaves.


183

00:24:26.220 --> 00:24:36.990

Marion: I'm struggling with this one, because I just don't know. I cannot develop a gut feel as to what this is going to look like, you know, six months, 12 months,


184

00:24:37.320 --> 00:24:39.660

Marion: 18 months from now, because


185

00:24:40.420 --> 00:24:52.140

Marion: I couldn't have imagined, 18 months ago, that this is where we would be at this point, you know. So the evolution of change, and all of the other things that are compounding it, are just


186

00:24:52.380 --> 00:25:10.209

Marion: really hard to wrap my head around. Or maybe that's me being a Gen Xer, right? And if I was, you know, younger, and had a different sort of digital experience, maybe my mindset would be different. But where I am today, I'm like, oh, this is a lot.


187

00:25:10.210 --> 00:25:14.930

Felix Mitchell: Yeah, I hear you. I mean, and I am not


188

00:25:15.280 --> 00:25:18.577

Felix Mitchell: a fortune teller here at all. But my


189

00:25:19.830 --> 00:25:24.645

Felix Mitchell: instinct on this is that we've just yet to see the job creation that AI


190

00:25:25.170 --> 00:25:28.419

Felix Mitchell: will bring, and I don't know how long it will take. But


191

00:25:28.790 --> 00:25:33.040

Felix Mitchell: I would be really surprised if, in a decade, when it washes out, it hasn't been a net job creator.


192

00:25:33.400 --> 00:25:33.930

Marion: Hmm.


193

00:25:34.320 --> 00:25:41.580

Felix Mitchell: Those jobs might look really different, and they may be more transient, and there may be way more self-employed people, and there probably will be.


194

00:25:43.330 --> 00:25:46.199

Felix Mitchell: But a very wise


195

00:25:46.540 --> 00:25:54.679

Felix Mitchell: friend of mine called Ryan Jansen, who runs an AI startup in Miami, actually, he told me that


196

00:25:55.430 --> 00:25:58.840

Felix Mitchell: you should never trust anyone when they're talking about


197

00:26:00.010 --> 00:26:02.870

Felix Mitchell: big shifts in the world,


198

00:26:03.050 --> 00:26:06.169

Felix Mitchell: when they start any sentence with:


199

00:26:07.280 --> 00:26:09.480

Felix Mitchell: "It'll be different this time, because..."


200

00:26:09.875 --> 00:26:10.270

Marion: Because.


201

00:26:10.270 --> 00:26:13.779

Felix Mitchell: I mean, if you look at every single new technology


202

00:26:14.200 --> 00:26:18.530

Felix Mitchell: since we started tracking these things, it has been a net job creator.


203

00:26:18.750 --> 00:26:43.479

Felix Mitchell: Every single new technology has had a revolution against it. I mean, that's literally where the term Luddite comes from: people who were worried about the technology, or didn't want to bring it in, or whatever that looked like. I see no compelling reason why this would be different. I just think that we've got that problem of being


204

00:26:44.410 --> 00:26:59.520

Felix Mitchell: sort of an ant on a ball, right? It just keeps walking and thinks it's making progress, but it's just going round and round and round. I just think we can't see far enough into the future for this to have


205

00:26:59.860 --> 00:27:02.089

Felix Mitchell: meaning and make sense right now.


206

00:27:02.580 --> 00:27:12.789

Marion: Yeah, which I think is the irony here, because this is, you know, a tool that supports predictability. So there's the complete irony, right?


207

00:27:13.320 --> 00:27:15.319

Marion: I guess, just kind of, like,


208

00:27:15.720 --> 00:27:19.999

Marion: pulling back on a point we mentioned a minute ago.


209

00:27:20.940 --> 00:27:27.729

Marion: The big myth, or is it a myth, right, is that AI is here to help, not replace:


210

00:27:27.900 --> 00:27:29.099

Marion: people plus tech.


211

00:27:29.910 --> 00:27:30.530

Felix Mitchell: Hmm.


212

00:27:31.770 --> 00:27:41.600

Marion: Everyone keeps telling us that, and everyone's telling their employees that. But I think we're seeing that that's maybe not the truth. So how does that continue to play out?


213

00:27:42.560 --> 00:27:43.353

Felix Mitchell: Yeah, I


214

00:27:44.250 --> 00:27:59.399

Felix Mitchell: I agree with you. So in phase one and phase two, when we're in the assist and augment phases, I think that's going to be true. I think the moment we move into phase three, where you see businesses rewiring themselves as if they were AI native,


215

00:27:59.660 --> 00:28:00.330

Marion: Hmm.


216

00:28:00.880 --> 00:28:02.540

Felix Mitchell: It's


217

00:28:03.190 --> 00:28:13.179

Felix Mitchell: not going to be the case that it's here to help. It's going to fundamentally shift the way that we approach work. And I think that there are two things going on at the moment.


218

00:28:13.450 --> 00:28:31.169

Felix Mitchell: There are businesses like Instant Impact, and every established business out there, trying to figure out how their business works in an age of AI, right? So that's where we're doing all of this. That's where a lot of the stress sits, where a lot of the uncertainty sits, where a lot of the change sits,


219

00:28:31.550 --> 00:28:34.289

Felix Mitchell: and it's where the change is going to be slower


220

00:28:34.330 --> 00:28:55.029

Felix Mitchell: as ever, because you're changing legacy systems. But it's also where all the money sits, right? I think that's really worth bearing in mind. So one of the main limiting factors on the progress of AI is going to be those sized businesses' propensity to change, because when you're a business over a certain size, you can only do a couple of big change programs


221

00:28:55.030 --> 00:29:05.320

Felix Mitchell: every year, or else it just becomes completely unmanageable. Then you've got a whole generation of new businesses, and this actually might be where the job creation comes from:


222

00:29:05.510 --> 00:29:12.050

Felix Mitchell: it's become so easy to set up a business. The barriers to setting up a business in almost anything you can imagine have been


223

00:29:12.190 --> 00:29:20.239

Felix Mitchell: knocked down to almost zero. So you've got all of these new businesses springing up. So I think,


224

00:29:20.680 --> 00:29:28.529

Felix Mitchell: before everything stabilizes, you're going to have to have those businesses growing to maturity. And even in an AI world,


225

00:29:29.160 --> 00:29:48.989

Felix Mitchell: limited by that route-to-market piece, it's going to take years. And there'll be exceptions, of course there will, but there always have been. And then you've got the speed at which mid-sized and large companies can rewire themselves, and I think there'll be some movement amongst those. For some mid-sized companies, I think it's an amazing opportunity, companies in the sort of


226

00:29:49.900 --> 00:29:59.619

Felix Mitchell: 10 million to 100 million pounds or dollars revenue range, because, you know, you've got the budget, you've got clients, you've got route to market, you understand the levers of your business.


227

00:29:59.970 --> 00:30:13.289

Felix Mitchell: You can also massively outmaneuver much larger organizations. I think that's one of the really big opportunities. And I think you'll see some shifting of mid-sized companies to industry leaders, and industry leaders to nothing.


228

00:30:13.780 --> 00:30:14.310

Marion: Hmm! I.


229

00:30:14.310 --> 00:30:26.449

Felix Mitchell: Again, I think that happens every time there's a technological revolution; I just think it will happen faster. So I think those two things are going to happen, but it's all going to take time. And while all of this is going on,


230

00:30:26.910 --> 00:30:47.420

Felix Mitchell: we're gonna see companies, you know, I think we'll see industries that are government-supported in the UK and in the US start to get that removed, and there'll be movement, and companies going bust, and companies springing up. I just think it's gonna be much faster and much more extreme than it has been before.


231

00:30:47.800 --> 00:30:56.180

Marion: I think that's right. I mean, the pattern of industrial revolution doesn't change, but you're right, it's just getting faster. And


232

00:30:57.280 --> 00:31:01.999

Marion: yeah, I think, pulling it back to the human experience, what does that mean?


233

00:31:02.000 --> 00:31:03.010

Marion: And that's and.


234

00:31:03.270 --> 00:31:07.309

Danny Gluch: That's what I wanted to circle us back to. In


235

00:31:07.960 --> 00:31:18.879

Danny Gluch: this, you know, let's call it 15 years down the road, where some have successfully chosen the right path of, you know, now we're fully integrated, we are AI native:


236

00:31:19.300 --> 00:31:39.090

Danny Gluch: what does that look like for psychological safety? Because when I think of psychological safety, it is the ability to ask coworkers for help, to bring up issues and have them be listened to, to be able to try something and fail and have it be a learning experience.


237

00:31:39.640 --> 00:31:40.750

Danny Gluch: And if


238

00:31:41.180 --> 00:31:56.259

Danny Gluch: people are looking at their sticky note and their first reach-out isn't to their coworker, it's to the AI tool, and they're asking the AI tool for help, and they're working through problems and learning lessons through failures with their AI tool,


239

00:31:56.670 --> 00:32:00.529

Danny Gluch: Are we going to be losing that


240

00:32:00.880 --> 00:32:12.949

Danny Gluch: human connected experience that actually is the foundation for creating psychological safety in any size group, whether it's the four of us or a thousand people?


241

00:32:14.160 --> 00:32:16.899

Cacha Dora: You can't build trust if you can't talk to people.


242

00:32:17.350 --> 00:32:18.150

Marion: Hmm.


243

00:32:18.150 --> 00:32:18.570

Felix Mitchell: It seems.


244

00:32:18.570 --> 00:32:29.420

Danny Gluch: Well, and I mean, we've seen people try to use AI tools as therapists and things, and it's gone not well, like, significantly not well. And.


245

00:32:29.420 --> 00:32:33.519

Felix Mitchell: I think OpenAI just rolled out a new therapy


246

00:32:34.150 --> 00:32:37.039

Felix Mitchell: solution, like an agent, on that thing. So.


247

00:32:37.040 --> 00:32:37.466

Danny Gluch: Is it?


248

00:32:39.930 --> 00:32:45.629

Danny Gluch: Not a recommendation! But I would recommend not trying that as an early adopter.


249

00:32:46.090 --> 00:32:51.980

Cacha Dora: I think it's probably informed by people who've used it quite heavily prior to.


250

00:32:51.980 --> 00:32:59.989

Felix Mitchell: Here's the question that I would have for the whole of AI, and it's a question, not an answer. I think everything around


251

00:33:00.550 --> 00:33:11.350

Felix Mitchell: job losses, replacing human conversations, assumes that there is a finite demand


252

00:33:12.820 --> 00:33:17.420

Felix Mitchell: on the output of organizations,


253

00:33:17.560 --> 00:33:23.279

Felix Mitchell: for what we create, and a finite demand on the number of questions that we would ask one another.


254

00:33:24.360 --> 00:33:36.139

Felix Mitchell: And the reason why it requires that assumption is because, let's say AI can replace 80% of the jobs that we're doing right now, or 80% of the conversations, 80% of the times we ask one another for help.


255

00:33:37.450 --> 00:33:42.780

Felix Mitchell: My question is, are you sure that the 20% that's left isn't still gonna be enough


256

00:33:43.090 --> 00:33:47.489

Felix Mitchell: to keep us busy, or still be enough for us to ask each other questions?


257

00:33:47.720 --> 00:33:52.639

Felix Mitchell: I'm never going to be able to have an AI agent


258

00:33:53.130 --> 00:33:55.740

Felix Mitchell: and say, hey, look, you know, I'm


259

00:33:56.300 --> 00:33:59.658

Felix Mitchell: really struggling with


260

00:34:01.480 --> 00:34:06.189

Felix Mitchell: Cacha. And sorry, picking on you.


261

00:34:06.190 --> 00:34:08.339

Cacha Dora: It's okay, fair enough.


262

00:34:08.340 --> 00:34:09.546

Marion: Fairly standard.


263

00:34:10.159 --> 00:34:14.099

Cacha Dora: Yeah, I wanna ask Cacha for a raise.


264

00:34:14.209 --> 00:34:15.089

Felix Mitchell: And


265

00:34:16.520 --> 00:34:25.380

Felix Mitchell: I just want to do it at the right time, you know, based on your... essentially, how can I go and do this? Or, Danny really upset me yesterday,


266

00:34:26.199 --> 00:34:38.449

Felix Mitchell: and, you know, I think so many of the questions that we ask, so many of the questions we take to the people we trust, so many of those trust-building questions, and again, you guys will know this better than I will,


267

00:34:38.639 --> 00:34:49.650

Felix Mitchell: they're not... and I'm looking for some case law on this. How do you handle that?


268

00:34:50.179 --> 00:34:58.109

Felix Mitchell: You know, Cacha hasn't had lunch yet, how is it the right time to approach her? You know, those sorts of, like, weird, messy human things that


269

00:34:58.110 --> 00:34:58.560

Marion: Yeah.


270

00:34:58.560 --> 00:35:21.299

Felix Mitchell: are non-structurable, and actually, you kind of wouldn't want to ask an agent about. My guess, and I'm a massive optimist, as you can probably tell, my guess is that we'll always have things that AI can't have answers to, because they're so innately human, and there'll always be conversations that we'd rather have with a mate or a colleague than AI. And I also think that


271

00:35:21.590 --> 00:35:30.119

Felix Mitchell: the 20% that's left, the tasks that AI can't do, I actually think that would be plenty.


272

00:35:30.120 --> 00:35:30.520

Felix Mitchell: Yeah.


273

00:35:30.560 --> 00:35:33.819

Felix Mitchell: How many of us have a to-do list that doesn't last a day?


274

00:35:34.060 --> 00:35:34.750

Marion: Yeah.


275

00:35:34.920 --> 00:35:35.550

Cacha Dora: Oh, my God!


276

00:35:35.550 --> 00:35:36.029

Danny Gluch: I mean.


277

00:35:36.030 --> 00:35:36.510

Cacha Dora: Never.


278

00:35:36.510 --> 00:35:54.490

Danny Gluch: The meme! Have you seen the meme of, I don't want AI to do my laundry and dishes so that I can do work, I want AI to do my work and my laundry so that I can go, you know, enjoy time? I brutalized that meme; maybe it's not as good of a meme as I thought.


279

00:35:54.490 --> 00:35:54.890

Danny Gluch: But


280

00:35:54.890 --> 00:36:13.299

Danny Gluch: yeah, I think you're right that that 20% is gonna get filled up. And I think the question is, are organizations going to allow that to be meaningful? Or is it gonna be like, yeah, we've had all of these tools over the last 20 years, and we're busier than ever?


281

00:36:13.480 --> 00:36:19.410

Danny Gluch: Like, all these tools that we have been given since, you know, the dot-com era,


282

00:36:19.920 --> 00:36:31.716

Danny Gluch: it's made us busier. We have less time in our days, we're working more hours, and it hasn't exactly freed us up to have those connections.


283

00:36:32.330 --> 00:36:38.189

Danny Gluch: But I think that's the decision for business leaders: do we allow for


284

00:36:38.720 --> 00:36:53.630

Danny Gluch: that agility and the efficiency to open up space where I can turn to Cacha and say, man, did you hear what Felix said? That really hurt me when he singled me out, you know, and to have those conversations.


285

00:36:53.630 --> 00:36:55.189

Felix Mitchell: I did try and single both of you out.


286

00:36:55.190 --> 00:36:58.640

Felix Mitchell: So it was fair. And, Marion, something's coming later.


287

00:36:58.640 --> 00:37:02.459

Marion: I might feel left out. I feel excluded.


288

00:37:02.640 --> 00:37:09.920

Felix Mitchell: Book plug to do, from one of my previous podcast guests, Chris Lovett: Discovery of Less. It's all around,


289

00:37:10.150 --> 00:37:14.749

Felix Mitchell: both at home, like in your personal life, and at work, just


290

00:37:16.150 --> 00:37:24.439

Felix Mitchell: really taking yourself to task on whether or not you need stuff, whether you need a meeting, whether...


291

00:37:24.590 --> 00:37:33.319

Felix Mitchell: I've got three, four things that I'm drinking from on my table. I definitely don't need all of those. And just around, like, the cost of yes.


292

00:37:33.520 --> 00:37:39.530

Felix Mitchell: there are certain people who are just powered by liquids. I definitely fall into that.


293

00:37:40.050 --> 00:37:42.386

Felix Mitchell: I think that there is,


294

00:37:43.420 --> 00:37:46.959

Felix Mitchell: and I don't know whether it's human, and I don't know whether it's modern.


295

00:37:47.150 --> 00:37:50.660

Felix Mitchell: or whether it's just because of our sort of consumerist culture,


296

00:37:51.020 --> 00:37:54.759

Felix Mitchell: I think we'll always be looking for more: more to do, more to have.


297

00:37:54.760 --> 00:37:55.600

Cacha Dora: Yeah.


298

00:37:55.600 --> 00:38:03.069

Felix Mitchell: I don't see AI changing that, and I don't see businesses changing either. I think that this epidemic of busyness is just going to continue.


299

00:38:03.240 --> 00:38:16.760

Cacha Dora: Well, the AI, and how AI manifests... I think, Marion, to your point earlier, right, no one could foresee where we're at right now. But also, Felix, to your point, no one can really see where we're gonna go, because it depends on how organizations


300

00:38:16.760 --> 00:38:37.749

Cacha Dora: utilize it. Are they going through a chat agent? Are they gonna be using software that's using AI, that actually makes their work easier? Like, going into the learning and development space and writing courses, some of the software we use is actually getting much better because of AI. That doesn't mean the human quotient's going away. If anything, we're having more to talk about.


301

00:38:37.810 --> 00:38:43.940

Cacha Dora: So it's really interesting just to kind of see how AI is gonna manifest itself. But I really like


302

00:38:44.140 --> 00:38:49.500

Cacha Dora: you kind of going through those phases. I like Josh Bersin; I'm gonna have to go back, because it's been a while since I've read his stuff.


303

00:38:50.790 --> 00:39:06.750

Cacha Dora: But I think, going through those phases, it's really easy to see that we're in that pain point, right? We're in that space. And it's uncomfortable because it is a moment of change, but also because no one knows where it's going, because we haven't figured it out yet.


304

00:39:08.180 --> 00:39:28.329

Marion: Do you know what it reminds me of? Do you remember when the iPhone came out, the kind of very, very early days of the smartphone? I can remember seeing a news piece where they're like, by 20-whatever, you will not use laptops, everything will be done on your phone. And I remember going, that's bullshit. How can that be?


305

00:39:29.360 --> 00:39:42.370

Marion: And there's so many things like that over my life, my timeline, where I remember thinking, oh my God, how could that possibly be? That can't be a thing. And now it is. So, we know


306

00:39:42.830 --> 00:40:00.380

Marion: our brains aren't fully capable of seeing what's coming around the corner sometimes. But, to your point earlier, if we look at history, it always happens: something always comes around, and we adjust and we adapt. And again, that's part of evolution and revolution, right?


307

00:40:01.250 --> 00:40:01.960

Marion: But.


308

00:40:01.960 --> 00:40:14.959

Felix Mitchell: I guess the other thing that I would point to is, I think, even by conservative estimates in some of the sci-fi you read, it's nuts that we don't have flying cars now,


309

00:40:15.738 --> 00:40:20.280

Felix Mitchell: and that we're not all, like, Jetsons-ing into work.


310

00:40:20.630 --> 00:40:23.430

Felix Mitchell: But I mean, they exist.


311

00:40:24.010 --> 00:40:24.700

Felix Mitchell: Yeah. But


312

00:40:25.050 --> 00:40:44.340

Felix Mitchell: but I mean, personally, I would literally never get in one, no matter how safe it is. There's literally no way you're getting me in a flying car. But the sort of challenges of, you know, regulation and infrastructure, all of those elements, they get in the way. And I think that


313

00:40:45.380 --> 00:40:51.009

Felix Mitchell: even if something seems illogical, like, Marion, you were saying with the iPhone,


314

00:40:51.010 --> 00:40:51.690

Marion: Yeah.


315

00:40:52.040 --> 00:41:03.710

Felix Mitchell: it may well still happen. And if something seems logical, like, I would say, a flying car, it may well not happen. And it's just so interesting. Cacha, to what you were saying before,


316

00:41:04.490 --> 00:41:10.208

Felix Mitchell: no one knows where we're going to end up. We just know it's going to be really different. And I think


317

00:41:10.670 --> 00:41:13.179

Felix Mitchell: it was a... I wish I could remember what the book was.


318

00:41:15.890 --> 00:41:34.700

Felix Mitchell: Anyway, someone made the point, and it was someone smarter than me, but I'm so unsmart I can't even remember who it was, that there are already autonomous, non-human, decision-making organisms in the world. Companies make decisions without


319

00:41:34.870 --> 00:41:36.780

Felix Mitchell: a single human


320

00:41:37.300 --> 00:41:54.239

Felix Mitchell: interaction. They have their own agenda, they have their own way of working, they have negative influences. And look at, I mean, look at Meta and all of this with social media, and all of the awful societal implications that that's had. No one had that as an objective, right? So.


321

00:41:54.770 --> 00:42:05.000

Felix Mitchell: You know, things that aren't human have already proven themselves to be damaging on a scale that humans can be as well.


322

00:42:05.110 --> 00:42:07.510

Felix Mitchell: So I think


323

00:42:08.560 --> 00:42:15.150

Felix Mitchell: this is something that we have dealt with before. And it was interesting, the phrasing of your question, Danny, which was,


324

00:42:15.260 --> 00:42:16.700

Felix Mitchell: will companies


325

00:42:17.110 --> 00:42:23.676

Felix Mitchell: create that space. I think the one thing that we can do, or one of the main things that we can do, to


326

00:42:24.080 --> 00:42:30.870

Felix Mitchell: gear ourselves up for a better future is relook at


327

00:42:31.030 --> 00:42:40.099

Felix Mitchell: how our companies are set up. The US is way worse at this than the UK, but your companies are literally bound, some of them almost by law,


328

00:42:40.440 --> 00:42:42.869

Felix Mitchell: to maximize profit for shareholders.


329

00:42:42.870 --> 00:42:43.330

Danny Gluch: Yeah.


330

00:42:43.800 --> 00:42:45.880

Felix Mitchell: Let's change that first, because


331

00:42:46.490 --> 00:42:58.830

Felix Mitchell: that's the way the company is pointing, and that's the way the technology will point. So if the technology gets more powerful, that's what's going to happen to it. So I love these movements like B Corp, and we're starting our B Corp journey at the moment, where,


332

00:42:59.390 --> 00:43:05.709

Felix Mitchell: you know, you accept that there's other stuff that businesses are there for. And that will be a whole other rewiring, and we probably don't have time for that before AI.


333

00:43:05.710 --> 00:43:06.930

Danny Gluch: And no.


334

00:43:06.930 --> 00:43:16.440

Danny Gluch: that's such an interesting thought, just how you organize yourself as a corporation, as a firm.


335

00:43:16.540 --> 00:43:35.429

Danny Gluch: And you know, there's the "hey, we're not here for profits, we're here for this," and there's the sort of mixed model, like, for every shoe we sell, we're going to donate one. We're still going to make profits, but we're going to have that sort of mix. And then there was that big philosophical shift in the seventies into the early eighties


336

00:43:35.430 --> 00:43:46.900

Danny Gluch: toward the idea that the role of organizations is to maximize profits. And I think we're still dealing with the repercussions of that.


337

00:43:47.010 --> 00:43:54.420

Danny Gluch: But it's really interesting to think: what if there's another classification where people can organize their corporations as,


338

00:43:54.770 --> 00:44:01.299

Danny Gluch: we're going to use AI not to maximize profits, but to maximize,


339

00:44:01.440 --> 00:44:03.130

Danny Gluch: you know, employee benefit.


340

00:44:03.360 --> 00:44:07.090

Danny Gluch: And yeah, you know, we only work 20-hour work weeks,


341

00:44:07.220 --> 00:44:12.219

Danny Gluch: and, you know, whatever it is. I would work for them.


342

00:44:12.980 --> 00:44:21.840

Danny Gluch: Where do I sign up? Even if it was a 30-hour work week, like, you know, just give me one day. 32, 32 hours.


343

00:44:21.840 --> 00:44:27.819

Marion: But then the capitalist kind of inertia kicks in, and it's like, well,


344

00:44:28.490 --> 00:44:54.850

Marion: you know, you have to be present, bums on seats, to be productive. Do you know what I mean? And then that whole thing blows up, because that's what we're dealing with right now with the whole RTO thing, right? Presence does not equal productivity. I don't know how many times we need to say that, but we still have this deep-rooted belief system in capitalist society that it does. And so, Danny, what you're saying, like,


345

00:44:54.960 --> 00:45:03.330

Marion: I mean, it's kind of like an advancement of the four-day work week. Remember when we started talking about that, and people were like, what? They couldn't get their heads around it.


346

00:45:03.880 --> 00:45:20.299

Marion: But you know, Danny, I think that forward-thinking companies will embrace that, and they will embrace the ability to leverage technology to enhance the human experience. But I'm not sure it's going to be in a Fortune 100 anytime soon, so.


347

00:45:20.810 --> 00:45:21.950

Cacha Dora: Yeah, yeah.


348

00:45:22.200 --> 00:45:25.599

Danny Gluch: It'll be like the B Corps, the C Corps. It'll be AI Corps. Yeah.


349

00:45:25.600 --> 00:45:31.660

Cacha Dora: To Felix's point, it's gonna be the companies that can move, and can move quickly, where


350

00:45:31.660 --> 00:45:32.020

Marion: Yeah.


351

00:45:32.020 --> 00:45:38.310

Cacha Dora: our larger companies just can't, because they're so bureaucratic that you need 12 layers to make a decision.


352

00:45:38.310 --> 00:45:40.669

Marion: Can or won't, yeah, can or won't.


353

00:45:40.670 --> 00:46:02.179

Danny Gluch: Or they're just going to delegate those big decisions to AI at some point, and they're gonna be like, oh, it wasn't us, it was AI. And I actually see that happening in some industries already. You're like, oh, it wasn't us, it was the algorithm, you know. And it allows leaders oftentimes to wash their hands of decisions, which.


354

00:46:02.180 --> 00:46:07.829

Felix Mitchell: Everyone needs to remember that even the best of these are only 70% correct. That's not.


355

00:46:07.830 --> 00:46:08.440

Cacha Dora: Usually it's.


356

00:46:08.440 --> 00:46:09.530

Felix Mitchell: Decisions, on.


357

00:46:09.530 --> 00:46:09.905

Cacha Dora: Yeah.


358

00:46:11.060 --> 00:46:11.850

Danny Gluch: What do you mean? It's not.


359

00:46:11.850 --> 00:46:12.200

Danny Gluch: How can I.


360

00:46:12.200 --> 00:46:14.746

Marion: Give you the right reference every time.


361

00:46:15.110 --> 00:46:17.359

Felix Mitchell: No, believe it or not.


362

00:46:17.710 --> 00:46:18.060

Cacha Dora: Yeah.


363

00:46:18.480 --> 00:46:18.900

Danny Gluch: Yeah.


364

00:46:19.080 --> 00:46:43.060

Danny Gluch: it's one of those things, though, as organizations. And, Felix, I think you were so right in saying that there are going to be opportunities for these sort of mid-level companies to leverage AI and really make strides, if their goal is to grow that way. I think there also might be opportunities for organizations to zig when everyone's zagging.


365

00:46:43.060 --> 00:46:56.749

Danny Gluch: You know, when everyone is fishing with this certain bait, I'm gonna go fish with this other one. When everyone's using AI, I'm going to employ that one person to be a headhunter and just physically go out and investigate and find the best candidates,


366

00:46:56.870 --> 00:47:10.140

Danny Gluch: headhunt three of them, bring them in, and we've got the one. And we don't have to use this big AI net to get 10,000 candidates to maybe find one, you know. I think there is still an opportunity for organizations to go,


367

00:47:10.420 --> 00:47:18.920

Danny Gluch: you know, analog when everyone else is going to AI. Do you think that that is viable? You know,


368

00:47:19.080 --> 00:47:25.950

Danny Gluch: your business and what you guys do aside, do you think there's a viable niche strategy for people to do that?


369

00:47:27.060 --> 00:47:38.359

Felix Mitchell: I think it's already happening, to be completely honest with you. Let's look at buying decisions at the moment. This doesn't really have anything to do with AI, but AI will exacerbate it.


370

00:47:39.160 --> 00:47:47.299

Felix Mitchell: And I don't have stats on this one, but when I need something new for my business in an area that I'm not familiar with, I no longer Google.


371

00:47:47.930 --> 00:47:50.530

Felix Mitchell: I send three messages.


372

00:47:50.530 --> 00:47:51.240

Cacha Dora: Yep.


373

00:47:52.060 --> 00:47:52.590

Felix Mitchell: On WhatsApp.


374

00:47:52.590 --> 00:47:55.759

Danny Gluch: Wow! That's such a great example.


375

00:47:56.040 --> 00:47:57.790

Felix Mitchell: And Google's worthless now.


376

00:47:58.250 --> 00:48:05.309

Felix Mitchell: Well, I mean, there's just too much information. There was too much information before we started all this nonsense. And so


377

00:48:05.480 --> 00:48:07.119

Felix Mitchell: the power of networks,


378

00:48:07.430 --> 00:48:14.710

Felix Mitchell: and of relationships, is going to be the number one differentiator, because everyone's going to have the same knowledge.


379

00:48:15.390 --> 00:48:25.590

Felix Mitchell: And there's too much of it. Our tiny little human brains that still need to make decisions aren't going to be able to handle the amount of choice that we have.


380

00:48:25.730 --> 00:48:32.549

Felix Mitchell: So you just go on what's trusted. And everyone says that they can do everything at the moment, so you just go on something that's worked for someone else that you trust.


381

00:48:32.750 --> 00:48:35.010

Felix Mitchell: And so, as I said, it's already happening, Danny.


382

00:48:36.310 --> 00:48:50.569

Danny Gluch: And what a great tie-back to the conversation: you're still going to go talk to that person. Yeah, AI is going to do all this, Google is going to take care of all that, but I'm still just going to talk to this person about, hey, I'm having kind of a hard time.


383

00:48:50.740 --> 00:48:59.330

Danny Gluch: You're not going to outsource that part. And, in fact, the organizations that really encourage that, I think, are gonna do


384

00:49:00.230 --> 00:49:02.059

Danny Gluch: well. They're going to succeed.


385

00:49:03.650 --> 00:49:13.709

Danny Gluch: Well, a great ending. Thank you so much, Felix. Thank you, Marion, and thank you, Cacha. Thank you to all of our listeners. Felix, where can people find you if they're looking for you and your podcast?


386

00:49:14.030 --> 00:49:29.509

Felix Mitchell: Right. Well, we're on www, I still do that, instant-impact.com, and the podcast is in the resources section. And then I'm on LinkedIn, as we were talking about before.


387

00:49:29.510 --> 00:49:30.100

Marion: Yay!


388

00:49:30.100 --> 00:49:30.600

Danny Gluch: Awesome.


389

00:49:31.030 --> 00:49:43.720

Danny Gluch: Well, we will find you on LinkedIn. Everyone, please leave comments on LinkedIn, share, subscribe to the podcast, leave a review. We really appreciate that; it helps fight against the algorithms. Thank you all very much. Have a great day.


390

00:49:43.720 --> 00:49:45.040

Felix Mitchell: Thank you for having me.



