
The Elephant in the Org
The "Elephant in the Org" podcast is a daring dive into the unspoken challenges and opportunities in organizational development, particularly in the realm of employee experience. Hosted by the team at The Fearless PX, we tackle the "elephants" in the room—those taboo or ignored topics—that are critical for creating psychologically safe and highly effective workplaces.
Human in the Loop, AI in the Mix: Building Teams That Can Still Think — with Mark Evans (AI Special - Part 3)
When AI does the homework but skips the hard work, businesses will be the ones picking up the tab.
In this final episode of our AI trilogy, we’re joined by Mark Evans — serial entrepreneur, Fractional Chief AI Officer, and strategic advisor who’s spent 30 years building, scaling, and exiting tech ventures. Now, he helps leaders navigate AI adoption without outsourcing the very thing that makes us human: our ability to think.
From classrooms to boardrooms, Mark warns of a creeping danger — the illusion of mastery. Students, job-seekers, and even seasoned professionals are presenting AI-polished competence without the critical thinking, resilience, or authenticity to back it up. When that illusion flows into hiring and leadership pipelines, the long-term costs to businesses could be catastrophic.
We talk about:
- Why “competence signaling” through AI is a hiring manager’s nightmare.
- How academia and business are both scrambling to set guardrails.
- The mental health fallout of outsourcing, struggle, and resilience to machines.
- Why “human in the loop” isn’t optional — it’s essential for trust and capability.
- What leaders can do now to avoid building a workforce of imposters.
Top Takeaways:
- AI isn’t evil, but without guardrails, it creates dangerous illusions of mastery.
- Critical thinking is the new currency — leaders who protect it will win.
- Psychological safety erodes when people feel like imposters in their own roles.
- Guardrails in education, hiring, and leadership are non-negotiable.
Guest Bio:
Mark Evans is the founder of 360 Strategy, a Fractional Chief AI Officer, and serial entrepreneur. With three decades of experience scaling and exiting tech ventures, he now works with leaders to ensure AI augments rather than replaces human capability. Based in Glasgow, Mark is passionate about preparing the next generation of talent to enter the workforce ready to think, not just prompt.
🔗 Connect with Mark: LinkedIn
📄 Read Mark’s paper featured in this episode: AI and the Corrosion of Academic Rigour
🐘 Connect with Us:
🚀 Follow The Fearless PX on LinkedIn: The Fearless PX
📩 Got a hot take or a workplace horror story? Email Marion, Cacha, and Danny at elephant@thefearlesspx.com
🎧 Catch every episode of The Elephant in the Org: All Episodes Here
🚀 Your Hosts on LinkedIn:
💬 Like what you hear?
Subscribe, leave a ★★★★★ review, and help us bring more elephants into the light.
🎙️ About the Show
The Elephant in the Org drops new episodes every two weeks starting April 2024.
Get ready for even more fearless conversations about leadership, psychological safety, and the future of work.
🎵 Music & Production Credits
🎶 Opening and closing theme music by The Toros
🎙️ Produced by The Fearless PX
✂️ Edited by Marion Anderson
⚠️ Disclaimer
The views and opinions expressed in this podcast are those of the hosts and guests, and do not necessarily reflect any affiliated organizations' official policy or position.
3
00:00:34.990 --> 00:00:40.470
Danny Gluch: Welcome back to the Elephant in the Org, everyone. I'm Danny Gluch, and I'm joined, as always, by my co-host, Cacha Dora.
14
00:00:40.470 --> 00:00:41.740
Cacha Dora: Hello?
15
00:00:41.740 --> 00:00:43.150
Danny Gluch: And Marion Anderson.
16
00:00:43.580 --> 00:00:46.200
Marion: Clearly, I got bumped to second place this week.
17
00:00:46.390 --> 00:00:50.539
Danny Gluch: You know, if I could remember which one I do first, normally, I would.
18
00:00:50.890 --> 00:00:54.830
Marion: What is normal? We don't need normal here. You know what?
19
00:00:55.810 --> 00:01:01.150
Danny Gluch: And we're joined today by Mark Evans. Mark, introduce yourself to our listeners.
20
00:01:01.150 --> 00:01:06.950
mark evans: Yes, hello, my name's Mark Evans, I'm delighted to be here with you guys, …
21
00:01:07.170 --> 00:01:25.000
mark evans: I'm based in Glasgow, in Scotland. I am the founder of 360 Strategy, which is a management consultancy with a particular focus on AI, but I've had 30 years building, scaling, and exiting tech ventures
22
00:01:25.220 --> 00:01:30.540
mark evans: And, this is… this is now my sweet spot, giving advice and helping people make the right decisions.
23
00:01:31.130 --> 00:01:49.240
Danny Gluch: That's wonderful. And you're our guest today because we have the final elephant of our trilogy starting Season 3 on AI, and you have a paper that was just… when Marion read it, she immediately sent it to us, and honestly, it's…
24
00:01:49.290 --> 00:01:52.629
Danny Gluch: It's fantastic, because I think it gets to some…
25
00:01:53.030 --> 00:01:59.820
Danny Gluch: sort of, like, assumptions people had about the use of AI in organizations, in academia.
26
00:02:00.320 --> 00:02:06.529
Danny Gluch: But you, like, nailed it down and put ink to paper and said, this is what's actually happening. Tell us about this paper.
27
00:02:07.230 --> 00:02:23.939
mark evans: The paper… what happened was, in 2023, when I exited my final business, I did an MBA, completed an MBA at Strathclyde in Glasgow, the University of Strathclyde. It was a lifetime ambition, so I took a year out to do that, and…
28
00:02:24.040 --> 00:02:35.539
mark evans: The MBA was incredibly challenging, but what I did see around about me was the emergence and the sort of… start… AI starting to seep into academia.
29
00:02:35.610 --> 00:02:45.039
mark evans: And at the highest levels as well, but I saw a disconnect between students
30
00:02:45.310 --> 00:02:50.849
mark evans: Using it, experimenting with it, and the…
31
00:02:51.120 --> 00:02:54.850
mark evans: Lecturers, the academics, way behind the curve.
32
00:02:55.730 --> 00:02:56.650
mark evans: And…
33
00:02:57.090 --> 00:03:07.520
mark evans: I kind of… when I came out of… after really, really slogging to get the MBA, which is a tough, tough course, coming out the other end of it, I thought to myself, you know, I felt a bit of…
34
00:03:07.540 --> 00:03:23.509
mark evans: potential injustice in the sense that if this carries on, AI's growing at lightning speed, as we all know. We're way past that inflection point, you know, of adoption. It's in the system. It's… it's everywhere now, and…
35
00:03:23.510 --> 00:03:33.020
mark evans: what I thought to myself was, look, you know, I really need to dive in a wee bit more to see exactly where, how academia and how business, which are interrelated.
36
00:03:33.020 --> 00:03:50.909
mark evans: how that interplay is going to work over the next few years, when people start to wake up to AI and its adoption with these students that are going through university and at school, and it opened my eyes. So it started off as a sort of small research experiment.
37
00:03:50.930 --> 00:04:01.109
mark evans: And then before you know it, I had to sort of put it on paper. I thought to myself, look, people need to know about this, and the business community needs to know about this, because we're storing up
38
00:04:01.460 --> 00:04:05.570
mark evans: Something which business will ultimately pick up the tab for.
39
00:04:06.060 --> 00:04:07.420
Marion: Hmm,
40
00:04:07.420 --> 00:04:21.379
Danny Gluch: Oh, I love that analogy, that it really is, and there's a number of ways where the Western academic system really fails businesses. It's very individualistic, it's very, you know, …
41
00:04:21.519 --> 00:04:30.409
Danny Gluch: goal-focused rather than process-focused, and I think that's part of the issue with AI, is people see a goal, and they can take a shortcut to the goal.
42
00:04:30.510 --> 00:04:41.499
Danny Gluch: And it doesn't matter about gaining mastery, or competence, or understanding, or practicing critical thinking. It's, I got to the goal, I showed you the goal, I got my grade done. And…
43
00:04:41.500 --> 00:04:51.140
Danny Gluch: yeah, businesses are gonna pick up the tab, and it's gonna be widespread. I'm not sure if you saw the, the chart of AI usage from OpenAI,
44
00:04:51.140 --> 00:04:52.540
Danny Gluch: Yes, it dropped off in June.
45
00:04:52.540 --> 00:04:53.010
mark evans: And….
46
00:04:53.010 --> 00:04:54.220
Danny Gluch: I like a lot.
47
00:04:54.220 --> 00:05:00.739
mark evans: Yeah, yeah, it was… it was… I was actually… I was hoping you'd bring that up. It was a fascinating indication of how it's…
48
00:05:00.960 --> 00:05:04.439
mark evans: Of the use, you would say, during term time.
49
00:05:04.920 --> 00:05:11.760
mark evans: And then how it completely drops off the cliff, and that in itself is a signal, and it's a worrying signal.
50
00:05:11.760 --> 00:05:12.250
Cacha Dora: Yeah.
51
00:05:12.250 --> 00:05:15.449
mark evans: because it's like, it's Pandora's box.
52
00:05:15.630 --> 00:05:16.590
mark evans: Now, it's open.
53
00:05:17.020 --> 00:05:23.849
mark evans: And no… there's no one, let's say, in the ranks of influence
54
00:05:24.050 --> 00:05:33.489
mark evans: who seems to have any sort of guardrails in place. There's no checks, there's no sense checks, and it's… it's rampant, and it's only going to get worse.
55
00:05:33.740 --> 00:05:34.350
Marion: Oh.
56
00:05:34.350 --> 00:05:42.480
mark evans: And that is the… that's what I'm worried about. You know, as someone who's employed people in the past with companies, and who's used those signals
57
00:05:42.910 --> 00:05:45.290
mark evans: Of competence to hire.
58
00:05:45.910 --> 00:06:01.360
mark evans: And the reliance on it, and with the money that's invested by businesses to do that hiring, to go through the process, then the training, then the development, and then the trust that's put in that person to go and represent your business… there's a danger in that.
59
00:06:01.900 --> 00:06:09.719
mark evans: If you're hiring people who are signaling competence, who are polished but have no critical thinking capability, then
60
00:06:10.200 --> 00:06:19.920
mark evans: where is your business going to be? The whole hiring process is going to be in jeopardy. That, to me… that's going to affect businesses. That's what really, really worries me.
61
00:06:20.510 --> 00:06:24.430
Cacha Dora: There's a ton of things I want to unpack in there, but… Yes.
62
00:06:24.430 --> 00:06:37.419
Marion: The academic bit, I kind of want to touch on a little bit, just because, I mean, that's my world right now as well, finishing this PhD. Same. Cacha's doing a master's, and it has been really fascinating. So, when I did my
63
00:06:37.600 --> 00:06:49.319
Marion: bachelor's degree, like, way, way back in the aquarium days, Mark. That's another story. That's another story for another day. But…
64
00:06:49.430 --> 00:07:02.390
Marion: then it was very manual, right? You're going to the library, you're pulling journals, you're pulling periodicals, you're… it's very, very hands-on, because this is, like, the late 90s, you know, the internet was just sort of seeping in, right?
65
00:07:02.510 --> 00:07:08.230
Marion: And then, when I went back to do my master's, like, 2013?
66
00:07:08.870 --> 00:07:28.690
Marion: it completely changed. So, you know, I'm able to pull papers online, I'm able to, you know, do a lot more stuff, I'm not having to go to the library as much, you know, it was a real game changer. And that was quite a shift, even though, like, you know, I'm very, digitally comfortable. It was a real shift in mindset.
67
00:07:28.700 --> 00:07:33.630
Marion: And now, even just in the last 4 years of doing my PhD,
68
00:07:33.840 --> 00:07:46.540
Marion: the landscape has changed again with AI. And what's really interesting doing the volume of research that I'm doing right now, qualitative research, is that there's AI tools out there
69
00:07:47.170 --> 00:07:49.599
Marion: Specifically for this purpose.
70
00:07:49.960 --> 00:07:59.100
Marion: There's one I'm using right now, MAXQDA, which I never really know how to say it, but it's really good, and it has an AI part to it.
71
00:07:59.380 --> 00:08:06.460
Marion: But it doesn't actually speed the process up. Yeah, it'll identify themes, but you still have to qualify them, you still have to check it.
72
00:08:06.570 --> 00:08:13.990
Marion: And it's been a really interesting process to follow, but when I've had conversations with my supervisor about
73
00:08:14.510 --> 00:08:29.160
Marion: you know, I'm… I'm integrating AI into my… my project. Initially, she was like, oh, you know, because of that… because it's… it's not… nobody's… no one's really kind of figured out what the parameters are for using it in education.
74
00:08:29.160 --> 00:08:41.040
Marion: And so there's a concern that, you know, you're going to use it and churn out this work, which is not your own, and blah blah blah, but in fact, that's not the case, because the work is 100% my own, the research is unique.
75
00:08:41.330 --> 00:08:49.329
Marion: but I'm using maybe ChatGPT or whatever to kind of refine what I'm saying, or I'm using it to critique what I've written. Hey, you know.
76
00:08:49.330 --> 00:09:01.530
Marion: acting as Creswell and Creswell review this methodology chapter, give me, you know, critical feedback, tell me where I can strengthen it, improve rigor, all of that stuff, right? And it will do that, and then I'll go off and do that work.
77
00:09:03.120 --> 00:09:12.310
Marion: When you are, and you know this, when you're handing in your final thesis, like, you're obviously, highlighting any tools that you've used.
78
00:09:12.660 --> 00:09:22.380
Marion: And there's still real ambiguity around what you should and shouldn't say, about how you've used AI within your project.
79
00:09:22.480 --> 00:09:39.289
Marion: And that made me really think hard, because, you know, the program I'm doing is a professional doctorate, right? So it's meant for people in industry, it's meant for people that are doing the day job, and then they're building out the underpinning, the academic underpinning, to kind of lift up what they're doing day to day.
80
00:09:39.530 --> 00:09:48.750
Marion: And I'm like, there's a real disparity between industry and academia. And if… if academia is meant to be preparing.
81
00:09:48.870 --> 00:09:55.610
Marion: leaders of tomorrow, at whichever level. They have a responsibility to get this bit right.
82
00:09:55.880 --> 00:10:01.280
Marion: to then allow people to go forward. And we seem to be, going back to your point at this inflection point, where
83
00:10:01.390 --> 00:10:10.359
Marion: it's a bit of a no-man's land, because younger generations, AI native, to some extent, are taking it and running with it.
84
00:10:10.990 --> 00:10:22.040
Marion: critical thinking's out the window, because they've never had to do it. Whereas us, kind of like old schoolers, we've grown up with that, so we're better placed to be able to kind of rationalize
85
00:10:22.230 --> 00:10:28.290
Marion: what's fact that AI spits out, and what's fiction, and identify hallucinations and all of that stuff.
86
00:10:28.600 --> 00:10:30.880
Marion: I don't think… I don't think these guys can.
87
00:10:31.050 --> 00:10:38.820
Marion: And that is terrifying as a, as a hiring manager, as a, you know, a head of HR. That's actually terrifying.
88
00:10:39.100 --> 00:10:47.530
Cacha Dora: I… I can second that as well, even in my master's program. I… you know, it's, what, like a 10-course program?
89
00:10:48.360 --> 00:10:51.670
Cacha Dora: Every single syllabus I've gotten mentions AI,
90
00:10:52.040 --> 00:11:11.480
Cacha Dora: And the professor's stance on it is so varied, where some of them are like, absolutely not. And then they'll point back to the institution's, you know, policies, but it's very broad, it's very vague, it's not, like, a zero-tolerance policy.
91
00:11:12.120 --> 00:11:19.180
Cacha Dora: Going back to all of your points where, you know, our education systems, it's not based on comprehension, it's based on testing.
92
00:11:19.300 --> 00:11:25.450
Cacha Dora: So, if we have an education system that's based on testing, it does not necessarily reward comprehension.
93
00:11:25.620 --> 00:11:41.739
Cacha Dora: And these AI tools are taking prompts that are brilliant and make sense, and then making someone sound equally so, even if that comprehension isn't coming from that individual, specifically, right?
94
00:11:41.940 --> 00:11:42.930
Cacha Dora: …
95
00:11:43.050 --> 00:11:52.220
Cacha Dora: And then on the flip side, and Mark, you said it, and I was like, yes, because the correlation is not causation when it comes to the fact that
96
00:11:52.420 --> 00:12:09.449
Cacha Dora: our institutions as businesses, are putting up guardrails. They're the ones that are saying, we do have a zero tolerance policy. We do not understand this. Our IP is so important that we do not want you using AI, but education doesn't have the IP.
97
00:12:09.890 --> 00:12:13.199
Cacha Dora: The IP is from the authors that you're referencing.
98
00:12:13.320 --> 00:12:22.449
Cacha Dora: So, the framework doesn't exist because the systematic, framework is completely different between academia and
99
00:12:22.690 --> 00:12:29.319
Cacha Dora: your profit-bearing businesses. And… Even in a master's degree, 10 different courses
100
00:12:30.480 --> 00:12:34.590
Cacha Dora: I'm in two classes right now, two different syllabus answers when it comes to AI.
101
00:12:35.100 --> 00:12:41.210
mark evans: Yeah, yeah. The thing that scares me, and you've hit it on the head, is this illusion of mastery.
102
00:12:41.790 --> 00:12:55.440
mark evans: That what you're teaching, you know, what students are learning, and young people are learning, and we're talking about age groups as young as, you know, 9 years old. I mean, the evidence points to 9- to 14-year-olds, and the damage it does to the prefrontal cortex.
103
00:12:55.590 --> 00:13:00.509
mark evans: The damage to grey matter and, you know, the damage to memory.
104
00:13:00.650 --> 00:13:11.859
mark evans: And the ability to reflect on the process of problem solving, how they arrived at an answer. If you take that cognitive friction away
105
00:13:12.890 --> 00:13:25.110
mark evans: from thinking in children when their brains are developing, and students, you know, because we're still developing into early 20s, you know, the brain's still developing. You take that process of friction away.
106
00:13:25.230 --> 00:13:29.810
mark evans: Of understanding what critical thinking is, then what you're doing is you're outsourcing
107
00:13:30.490 --> 00:13:37.239
mark evans: Everything, the most intelligent parts of being a human, the most important parts of being a human, to a machine.
108
00:13:37.460 --> 00:13:44.560
mark evans: And what we're… what I'm hearing just now, and what I'm seeing from speaking to business owners is,
109
00:13:44.930 --> 00:14:03.770
mark evans: starting to see this coming through. First of all, we know that second-screen damage, you know, to a generation is already done, it's in the bag, and, you know, that damage is there. Low retention, low attention span, and, you know, low intellectual resilience.
110
00:14:04.390 --> 00:14:05.390
Marion: And….
111
00:14:05.390 --> 00:14:23.200
mark evans: Now, when you layer another level of technology on top of that that outsources the thinking process, you've got something which is genuinely going to… is going to damage education and business. And, you know, the point of what you said was absolutely valid, but so many of the academic institutions have got different rules.
112
00:14:23.630 --> 00:14:28.109
mark evans: You know, you can go from one, lecture and one
113
00:14:28.110 --> 00:14:49.589
mark evans: one course, one module, and they've got a completely different mindset when it comes to the use of OpenAI. I mean, I know that Strathclyde and Edinburgh are all quite forward-thinking on how you should actually use it: if you're going to use it for research, use it for your research. And, you know, when we did the governance module, we were free to use it for research.
114
00:14:49.810 --> 00:14:59.650
mark evans: But that's all well and good when you say to do that. What happens? Human nature will naturally make people, like water, find the easiest path.
115
00:15:00.300 --> 00:15:12.550
mark evans: And when the LLMs are so smart now, you see the… I mean, in the space of a year and a half, I've watched the LLMs accelerate in competence: Perplexity, Claude,
116
00:15:12.550 --> 00:15:27.939
mark evans: you know, GPT-5 came out, etc. When you see the quality of output, and the citations that they're putting in, and the references in there. Now, from a research point of view, it can be used very effectively, but when people are now starting to submit, and in my view, submitting.
117
00:15:28.500 --> 00:15:32.009
mark evans: work, and passing it off
118
00:15:32.110 --> 00:15:36.759
mark evans: as their work, when it's been completely done by an LLM,
119
00:15:37.030 --> 00:15:40.209
mark evans: You know, undergraduate, postgraduate, or otherwise.
120
00:15:40.420 --> 00:15:49.910
mark evans: What you're doing is you're actually just creating a hollow signal of mastery, but there's no competence behind it.
121
00:15:50.380 --> 00:16:03.949
Marion: Yeah, you know, I saw that real time, Danny. I don't know if you've experienced this with your kids, but, my partner's little boy, he's 9, so funny you said 9, right? And we had a thing earlier in the year when he was doing a book report.
122
00:16:04.270 --> 00:16:20.179
Marion: And I don't think he'd read the book, right? So, like, I think it was, like, The Lion, the Witch, and the Wardrobe, I can't remember, I can't remember. But he's, like, obviously… yeah, I know me too. But he'd, like, put it into whichever, you know, AI tool he was using.
123
00:16:20.410 --> 00:16:23.960
Marion: And then, like, handwrote it out, and then handed it in.
124
00:16:24.620 --> 00:16:27.260
Marion: It came back from the teacher saying.
125
00:16:27.480 --> 00:16:43.369
Marion: what is happening here, because actually what it spat out was complete nonsense, and it… it had the names of the characters wrong, and it, like… Now, this is probably, like, January or February, and even in that short time, like, the tech has really improved.
126
00:16:43.500 --> 00:16:48.259
Marion: But it just underpinned exactly what you said, and I… and I was trying to explain to him
127
00:16:48.720 --> 00:16:49.740
Marion: you know, mate.
128
00:16:50.520 --> 00:16:58.139
Marion: AI, you know, using ChatGPT, all of that, it's got a place, and it's a brilliant tool, right? And it's going to change the world, but…
129
00:16:58.530 --> 00:17:06.429
Marion: The reason why you need to be able to read the book, and then process it in your brain, and then write about it is because that's such an important skill.
130
00:17:06.430 --> 00:17:09.169
Danny Gluch: As you grow up, and as you become, you know, an adult.
131
00:17:09.170 --> 00:17:19.210
Marion: And he's looking at me like, but why? You know, like, this can do it. And I said, well, it can, but look at the crap it gave you, do you know what I mean? Like, it's not accurate. And so…
132
00:17:19.400 --> 00:17:25.290
Marion: that really… like, I talked about that for days afterwards, because it really put the wind up me when I was like.
133
00:17:25.740 --> 00:17:38.780
Marion: shit, this is really happening? You know, this is just an absolute example of that. And so now, as I think about, you know, employment and, you know, bringing in grads or interns or whatever.
134
00:17:40.180 --> 00:17:43.189
Marion: If we go back to the notion of psychological safety, right?
135
00:17:43.420 --> 00:17:51.269
Marion: do I feel safe to give feedback, to, you know, share ideas, to fuck up? Like, do I feel safe to do that?
136
00:17:51.550 --> 00:17:56.830
Marion: And… If you are kind of reliant on this tool.
137
00:17:57.050 --> 00:18:09.560
Marion: And you're incapable of that critical thinking, or being able to kind of, like, analyze, like, you know, if you think in tech root cause analysis, right? You're… all of that, cognitively, is a challenge for you.
138
00:18:10.230 --> 00:18:17.889
Marion: Then you're hiding. Then you're like, oh, you know, and you start to panic when you can't get to your keyboard, or you can't get to your phone.
139
00:18:18.090 --> 00:18:24.499
Marion: And then I start to think about the anxiety that that's gonna cause, and the burnout that that's gonna cause, and it's like…
140
00:18:25.150 --> 00:18:39.979
Marion: I don't hear people talking about that yet. I mean, you're talking about it in terms of the sort of rigor, but when I actually think beyond that and think about the long-term health impacts and mental health impacts, it really… it's really frightening.
141
00:18:40.290 --> 00:18:56.190
Cacha Dora: It's interesting when you think about it from the aspect of, you know, we always… we're all proponents of, like, being our true authentic selves, right? Like, and presenting yourself honestly. And there's been a lot of study and a lot of stuff that's come out in the last few years, especially because it's… I think it's become,
142
00:18:56.830 --> 00:19:01.820
Cacha Dora: A, a social catchphrase, right, is… is imposter syndrome.
143
00:19:01.960 --> 00:19:05.459
Cacha Dora: So now what happens when you actually are the imposter?
144
00:19:06.070 --> 00:19:08.070
Cacha Dora: And you aren't being authentic.
145
00:19:08.070 --> 00:19:11.420
Danny Gluch: It's like Among Us, but everyone's just the imposter.
146
00:19:12.010 --> 00:19:26.390
Cacha Dora: Yeah, it's like, you know, it's a really wild thing to really try… I'm, like, my… I'm going like this right now. You guys can see it, I'm, like, shaking my head, hands are around my head. I'm just like, what if you are the imposter? Like, what if you aren't authentic? Because, like, I can't…
147
00:19:26.530 --> 00:19:34.970
Cacha Dora: I can't not be that person. I have to be my authentic self, or I'm… I'm not okay, right? Marion, going back to my mental health, like, I have to be who I am.
148
00:19:35.380 --> 00:19:38.279
Cacha Dora: But what if all you're trying to do is get through the next door?
149
00:19:39.000 --> 00:19:40.190
Cacha Dora: class.
150
00:19:40.600 --> 00:19:41.170
mark evans: Mm.
151
00:19:41.290 --> 00:19:46.950
mark evans: That's such an important point, Cacha, and that goes back to when you look at some of the studies that go back to
152
00:19:47.120 --> 00:19:56.089
mark evans: sort of mid, you know, teenage education and that sort of later parts of their higher education. Then when they see
153
00:19:56.660 --> 00:20:00.649
mark evans: And they overuse LLMs to the point that it's cheating.
154
00:20:00.920 --> 00:20:03.690
Marion: Then there's a kind of moral drift.
155
00:20:04.600 --> 00:20:05.110
Cacha Dora: It comes in.
156
00:20:05.110 --> 00:20:12.939
mark evans: What is acceptable? Where are the boundaries of what is acceptable? And what we're doing is we're teaching, and we're allowing and enabling
157
00:20:13.660 --> 00:20:15.360
mark evans: This kind of…
158
00:20:15.760 --> 00:20:24.179
mark evans: Attitude where it's okay to do that, because everyone else seems to get away with it, and there's a lack of trust they have with each other.
159
00:20:24.210 --> 00:20:37.180
mark evans: But the most important thing, as you say, when they start to move and they're being tested on those critical thinking elements that you need when you start to move into employment, everything that they've achieved up to that point has been built on an illusion.
160
00:20:38.110 --> 00:20:56.010
mark evans: That has got to have an effect. There's no cognitive resilience built in there. There's nothing that helps you. We've learned as adults that, you know, you have to… there's an element of pain and failure
161
00:20:56.010 --> 00:21:06.700
mark evans: And trying and testing and getting back on the bike when you fall off it, that's part and parcel of learning. But when you take all of those… that framework away, that scaffolding away.
162
00:21:07.020 --> 00:21:17.560
mark evans: then you're going to be left with people moving into employment who really are going to have that imposter syndrome. It's a very, very…
163
00:21:17.780 --> 00:21:20.070
mark evans: A very, very interesting angle.
164
00:21:20.460 --> 00:21:21.530
Marion: Hmm….
165
00:21:21.530 --> 00:21:25.179
Danny Gluch: I feel like I've become my grandparents when…
166
00:21:25.340 --> 00:21:43.009
Danny Gluch: I was doing math homework, and I was like, why can't I just use the calculator? It's right in the drawer. And they were like, no, you need to learn how to do the process so that you understand what's going on, and how the numbers are interacting with each other, so that if you don't have a calculator, or even if you do.
167
00:21:43.010 --> 00:21:50.820
Danny Gluch: you can understand, like, wait, no, something went wrong, or, you know, I understand the process, and I can do it on my own. I'm not relying on the tool.
168
00:21:50.900 --> 00:21:58.310
Danny Gluch: And I hear us saying a very similar thing, of… you, you have to…
169
00:21:58.790 --> 00:22:06.319
Danny Gluch: dig in. You have to be good at problem solving. You have to understand what you're talking about, otherwise you're… you're…
170
00:22:06.540 --> 00:22:08.779
Danny Gluch: Incapable of critical thinking.
171
00:22:09.000 --> 00:22:27.099
Danny Gluch: And I think the incapability, the just absolute atrophy of organizations who use this tool so much, they're going to be atrophying the mastery, and the ability to think critically, and the ability to innovate. Like.
172
00:22:27.100 --> 00:22:28.000
Marion: Mmm.
173
00:22:28.000 --> 00:22:46.350
Danny Gluch: No AI is going to be able to innovate and push forward because they're at the precipice of a new technology, or the precipice of, you know, the industry. Like, it's never gonna be able to do that. It's just gonna mirror whatever language it can find online. It's never gonna push things forward.
174
00:22:46.910 --> 00:22:55.719
mark evans: One thing you just said, the calculator analogy is used quite a lot, but to use a calculator, you still have to understand the problem.
175
00:22:55.850 --> 00:22:58.680
Cacha Dora: That the calculator's going to solve.
176
00:22:59.610 --> 00:23:09.869
mark evans: So, the context of that neurocognition, that metacognitive process, that ability to understand the problem, what you've got now is
177
00:23:10.040 --> 00:23:15.989
mark evans: Not understanding what the problem is, but just firing a question, and there's… it's frictionless through to the answer.
178
00:23:16.480 --> 00:23:17.240
Danny Gluch: You're right.
179
00:23:17.240 --> 00:23:30.480
mark evans: And that's… it's a terrifying… you're so right with the… I hear it a lot, you know, people trying… Yes, Marion hit it perfect earlier on when she talked about the fact that
180
00:23:30.480 --> 00:23:49.650
mark evans: kids nowadays are, you know, they're tech-savvy. They understand tech, they've grown up with tech, they've been born into tech, digital savvy, tech savvy. Now, you know, you could say that we… I certainly wasn't born into that environment. I've grown into that environment, I've, you know, I've seen the good and the bad, and I've seen how…
181
00:23:49.650 --> 00:23:54.799
mark evans: The tech giants, as we put them, manipulate the narrative.
182
00:23:54.800 --> 00:24:06.189
mark evans: to create users, and create a user base. The danger with… the danger that we have just now is that they're all battling for a user base, and we're talking about outsourcing
183
00:24:06.240 --> 00:24:14.010
mark evans: educators for chatbots and digital avatars, to teach kids.
184
00:24:14.200 --> 00:24:15.769
Marion: Yeah. And at home.
185
00:24:15.770 --> 00:24:22.500
mark evans: And a lot… it's… that is a worry for me. I mean, I've seen this come on, because of course, there's a commercial aspect to that.
186
00:24:23.310 --> 00:24:35.119
mark evans: Teaching kids at a younger age to respond to avatars and digital learning. When there's no teacher, there's no connection with the teacher, there's no pain points involved in it.
187
00:24:35.120 --> 00:24:35.770
Cacha Dora: There's no chance.
188
00:24:35.770 --> 00:24:36.339
mark evans: That's interesting.
189
00:24:36.340 --> 00:24:36.740
Cacha Dora: Right?
190
00:24:36.740 --> 00:24:47.829
mark evans: a real challenge. And so, the understanding of the pro… it starts to become so diluted that that's a fear point for me, because where's it going to be in 10 years' time?
191
00:24:48.050 --> 00:24:51.699
mark evans: Where… where's it going to be in 50… where's it going to be in 5 years' time? We're moving that.
192
00:24:51.700 --> 00:24:52.300
Marion: quick, just….
193
00:24:53.150 --> 00:24:53.810
Marion: Yeah.
194
00:24:53.810 --> 00:24:55.959
mark evans: I'm worried about that, I'm deeply worried about that.
195
00:24:56.530 --> 00:25:02.949
Marion: Yeah, that's… that's a legit fear, and I think it's something that we all share, and there was something else that
196
00:25:03.090 --> 00:25:10.520
Marion: occurred to me when you were talking, it's like… and I… and I read something about this recently, but it really just hit home. You know.
197
00:25:11.200 --> 00:25:24.989
Marion: They… these companies, these AI companies, right now, it's really accessible, right? Because, again, they're vying for a business, they want to get you in, they want to get you hooked. And someone compared it to a gateway drug, right?
198
00:25:24.990 --> 00:25:25.530
mark evans: Hmm.
199
00:25:25.960 --> 00:25:37.259
Marion: just a wee bit of weed, you know, just have a wee smoke, it'll be fine, you'll enjoy it, and then before you know it, they're shooting up heroin. And it's kind of the same. Maybe not quite the same, it's not quite Trainspotting, but…
200
00:25:37.260 --> 00:25:39.029
Cacha Dora: Quite a progression.
201
00:25:39.030 --> 00:25:42.850
Marion: You get two Glaswegians on this, this is what's gonna happen.
202
00:25:43.300 --> 00:25:45.860
Danny Gluch: Scots' impulse to bring up Trainspotting.
203
00:25:46.070 --> 00:25:48.829
mark evans: I know, of course. Yeah, yeah, just be up.
204
00:25:49.100 --> 00:26:03.569
Marion: Of course, they can't help it, but it is true, right? Because, you know, the theory is that, you know, as we get more hooked on it, and we can't function without it, and it becomes so part of our lives as a mainstay.
205
00:26:03.870 --> 00:26:14.160
Marion: then they've got you. So they'll just start, you know, kicking up the price, and, oh, well, we've improved it, it's, you know, we're on, you know, version 45, and da-da-da, right?
206
00:26:14.470 --> 00:26:17.099
Marion: And that got me thinking, and I was like.
207
00:26:17.750 --> 00:26:30.420
Marion: That's scary, because I can exactly see those kids who, again, haven't necessarily got that full, kind of, development of critical thinking, and just become reliant on it.
208
00:26:30.820 --> 00:26:42.079
Marion: they're gonna be sucked in, and they're gonna be manipulated, and they're gonna continue to, you know, pay those prices, and that is where, morally, I start to really, really struggle.
209
00:26:42.780 --> 00:26:52.699
mark evans: I mean, I think the danger is that we don't… you know, I'm certainly… I'm no Luddite, you know, AI, I've built… I've been working with AI since 2012, building products from AI.
210
00:26:53.230 --> 00:26:55.600
mark evans: I see the wonderful things of AI, and we all can.
211
00:26:55.600 --> 00:26:56.050
Marion: Mm-hmm.
212
00:26:56.050 --> 00:27:05.040
mark evans: There's going to be things when you eventually have a convergence of AI, and I talk about this, in years to come, with quantum computing, etc.; it's going to open up
213
00:27:05.300 --> 00:27:14.790
mark evans: Knowledge and thought and intelligence that we as humans would not have the capacity to understand for the next thousand years.
214
00:27:14.940 --> 00:27:33.780
mark evans: We've kind of… we take time to innovate, we go through a process of invention, we collapse invention, we reinvent, and that takes time. You know, that invention and creation and thinking outside the human scope of things, that'll happen in microseconds with these… with the way it's going. So, you have to prepare
215
00:27:34.160 --> 00:27:36.420
mark evans: Your next generation of…
216
00:27:36.590 --> 00:27:50.200
mark evans: of employers, employees and business leaders and innovators to understand AI and the great things it can do. But the flip side of that is I feel that the business leaders
217
00:27:50.490 --> 00:27:57.019
mark evans: And educators are not equipped just now, and are flying blind.
218
00:27:57.150 --> 00:27:59.580
mark evans: So, they're way behind the curve.
219
00:28:00.030 --> 00:28:03.530
mark evans: The students are in one dimension.
220
00:28:03.530 --> 00:28:21.150
mark evans: And they're pushing ahead because they're hungry for the knowledge, and they'll use what tools are there. The big AI companies are pushing out at lightning speed. They don't even know what they're pushing out half the time, and they don't care about guardrails, and they don't care about the systems they're putting out there, and how they're used and abused.
221
00:28:21.150 --> 00:28:21.730
mark evans: I mean.
222
00:28:22.100 --> 00:28:33.689
mark evans: just been pulled up, you know, they've just been identified: hackers using their coding system, their, you know, their Claude Code, to create systems that hack
223
00:28:33.690 --> 00:28:47.059
mark evans: and are used for criminality, and they've just been made aware. So there's always going to be… there's going to be grey areas, so what we need, and what the system needs, and what I've tried to identify in the paper, is that there needs to be checks and balances put in now.
224
00:28:48.080 --> 00:29:02.509
mark evans: which allow AI to be used safely, securely, and ethically within a framework and a context that allows students to know where those barriers and boundaries are. So AI is absorbed and adopted positively within
225
00:29:02.620 --> 00:29:09.290
mark evans: business and within academic institutions, but it's done in such a way where
226
00:29:09.490 --> 00:29:29.060
mark evans: you know, like the Copenhagen Business School, they do everything on a viva, oral basis; everything's assessed, you know, you write a paper, Marion will know, in a PhD you'll have to do your viva at the end, you need to explain, you'll be questioned on it. That is a fundamental tool at the disposal of educators, that they can sit down with the students and say, okay, tell me about this reference.
227
00:29:29.990 --> 00:29:39.790
mark evans: Tell me about this citation. Tell me where this thought process has come from, that systems thinking, that executive thinking. Tell me about this.
228
00:29:40.180 --> 00:29:56.960
mark evans: And that is lacking. It's… and so you've got institutions way behind, and they're so locked in paralysis by analysis. They're so locked in the problems there. Oh, we know it's here, we're aware of it. But it… it's… it's washing over them.
229
00:29:57.010 --> 00:30:05.989
mark evans: And it needs to be acted on very quickly, and then the business leaders need to come in at the other side to actually stop burying their heads in the sand
230
00:30:06.240 --> 00:30:13.659
mark evans: and saying that AI's not our problem. Just now, it's a risk. It's a risk on human resources.
231
00:30:13.940 --> 00:30:25.160
mark evans: It's a risk on operations, it's a risk on scenario planning for the future, and for business strategy, and for competitive edge, and value creation, and profit.
232
00:30:26.280 --> 00:30:27.219
Marion: It's the competitive.
233
00:30:27.220 --> 00:30:32.660
Danny Gluch: edge and value creation that I think long-term has, you know.
234
00:30:32.910 --> 00:30:39.069
Danny Gluch: business leaders not known for their long-term thinking all the time, right? Like, that's… AI can do a lot of things now.
235
00:30:39.380 --> 00:30:48.030
Danny Gluch: But I worry, in the same sense where you're talking about, you know, the kids growing up, you know, I'm thinking of my kids and going through school.
236
00:30:48.330 --> 00:30:56.780
Danny Gluch: And if they're not developing the muscles, if they're not learning to think critically and to problem-solve, they're never going to have it.
237
00:30:57.020 --> 00:31:05.159
Danny Gluch: But organizations, if they're implementing this, and like, wow, look at these, like, you know, shortcuts and quick wins we can get with AI…
238
00:31:05.630 --> 00:31:16.719
Danny Gluch: And the more they use AI as their muscle and not their own, I really think that they're gonna lose some of the competitive advantage. It's the individuals who are the organization.
239
00:31:16.720 --> 00:31:17.530
Marion: Right? Like, that's….
240
00:31:17.530 --> 00:31:37.019
Danny Gluch: You know, we think of the organization, we think of human resources, but we forget, at the end of the day, it's individual people who make up, who actually innovate, who are the masters of their industry, who are the ones who are going to create the value propositions and the differentiating factors that allow businesses to create profit.
241
00:31:37.140 --> 00:31:41.080
Danny Gluch: If we're outsourcing the muscles that eventually get us there.
242
00:31:41.440 --> 00:31:49.459
Danny Gluch: then AI can just run all the industries, because if all the organizations are using… It's called the Matrix.
243
00:31:49.460 --> 00:31:53.970
Marion: Great movie!
244
00:31:53.970 --> 00:31:55.619
mark evans: But, like, it….
245
00:31:55.620 --> 00:32:03.130
Danny Gluch: The organization that allows AI to be a tool, but not a crutch. Because crutches…
246
00:32:03.130 --> 00:32:03.450
Cacha Dora: skinny.
247
00:32:03.450 --> 00:32:09.559
Danny Gluch: help you, but you're not gonna win a 100-meter dash if you've got a crutch. And…
248
00:32:09.940 --> 00:32:16.070
Danny Gluch: I think the organizations that use the crutch might get to the finish line, but they're not going to be leading the way.
249
00:32:16.070 --> 00:32:16.770
Marion: Mmm.
250
00:32:16.770 --> 00:32:21.109
Danny Gluch: The ones who use it as an aid to help, you know, fill in the gaps,
251
00:32:21.110 --> 00:32:21.480
Marion: released.
252
00:32:21.480 --> 00:32:28.860
Danny Gluch: rely on their people's ability to solve problems, critically think, innovate. I think that's gonna be…
253
00:32:29.210 --> 00:32:38.700
Danny Gluch: you know, an advantage, right? It's gonna create a competitive advantage if your, you know, your competitor is just doing whatever ChatGPT tells them to do.
254
00:32:39.090 --> 00:32:39.990
Marion: Yeah.
255
00:32:40.070 --> 00:32:40.890
Cacha Dora: Yeah, that…
256
00:32:41.330 --> 00:32:55.410
Cacha Dora: you saying that, Danny, really made me think, like, I was like, all I was… my head was just… the word trust just kept bubbling through my head. Because I think at the individual level, you have people who both trust and mistrust AI.
257
00:32:55.570 --> 00:33:02.740
Cacha Dora: But, systematically, the software that we're all using is competitively trying to use AI.
258
00:33:02.880 --> 00:33:10.239
Marion: So, a huge portion of the software we're using is Microsoft: you've got Copilot showing up in your email, and in Word, and showing up everywhere.
259
00:33:10.240 --> 00:33:27.570
Cacha Dora: And then insert software's AI competitive advantage. It's going… it's showing up everywhere, right? So even if you as a human do not trust AI, if your organization doesn't trust AI, the software that you're using is throwing AI at you.
260
00:33:27.740 --> 00:33:28.900
Marion: And….
261
00:33:29.230 --> 00:33:36.070
Cacha Dora: From… so we're seeing it, like, that's, like, your business side of things, but then when you look at it, it's like, if you're not trusting it.
262
00:33:36.420 --> 00:33:49.150
Cacha Dora: And then you start to see it, its impact, and you start to use it and be like, now my trust is building. We keep talking about psychological safety, that fear… the fear of taking a risk, the fear of speaking out.
263
00:33:50.550 --> 00:34:04.590
Cacha Dora: if we're seeing it at the… I'm gonna say at the adult level, versus, like, going through education, what… what are our kids seeing where AI is also doing the exact same thing and building their trust in it?
264
00:34:05.290 --> 00:34:05.880
Danny Gluch: Hmm….
265
00:34:07.610 --> 00:34:10.059
Danny Gluch: It's interesting you bring up trust, because…
266
00:34:10.429 --> 00:34:21.900
Danny Gluch: as you were talking, I was reminded of what Marion said. And the most innovative teams, the most successful teams, are going to have trust amongst each other. Individuals, people.
267
00:34:22.400 --> 00:34:23.550
Danny Gluch: And…
268
00:34:24.860 --> 00:34:36.169
Danny Gluch: do I just now need to trust the AI tool that they're using? I don't have to worry about your competence? And it goes back to that, like, can I be authentic? Can I show up? And…
269
00:34:36.500 --> 00:34:42.759
Danny Gluch: fail? Can I show up and try something new? Can I show up and just be my authentic self?
270
00:34:43.090 --> 00:34:46.809
Danny Gluch: Do I have the ability to…
271
00:34:47.179 --> 00:34:52.460
Danny Gluch: have psychological safety if I'm overly reliant on AI. I, I…
272
00:34:52.889 --> 00:34:58.109
Danny Gluch: I haven't heard that talked about, aside from Marion. Marion, that's why you're the best.
273
00:34:59.720 --> 00:35:03.139
Danny Gluch: But I don't see that in, you know.
274
00:35:03.260 --> 00:35:19.120
Danny Gluch: that's… that's a… that's something you practice. And I think it starts, like Mark was saying, in school, in the school age of going in and getting something wrong and having your teacher say, I understand where you went wrong, let's work on that.
275
00:35:19.890 --> 00:35:32.099
Marion: I want to ask this, I want to ask this. So around about the time that I'd read Mark's paper, I saw in the press that Microsoft were releasing this kind of really big
276
00:35:32.230 --> 00:35:41.929
Marion: initiative for teachers, and this was back in the summer, and it was basically… they were teaming up with the American Federation of Teachers.
277
00:35:41.930 --> 00:35:59.150
Marion: And they were launching training for K-12 educators in the US, and… I think it was something like $23 million or something, which actually to them is, like, a drop in the ocean, right? It's nothing. But they wanted to equip
278
00:35:59.270 --> 00:36:13.250
Marion: something like half a million teachers, with AI skills for this, but for what reason? To use it, or to spot it, or whatever, right? Obviously, you know, you can look at this in lots of different ways. I'm sure that there…
279
00:36:13.420 --> 00:36:23.100
Marion: is a huge element of corporate social responsibility in there, you know, it's the right thing to do, but I would, being the academic that I am, I would
280
00:36:23.250 --> 00:36:39.729
Marion: be curious to understand how it was being done and, you know, any other sort of motivations there. And then I think there was another thing in addition to that, they were developing an Elevate Academy or something like that, but again, they were aiming to
281
00:36:39.730 --> 00:36:48.439
Marion: I think it was, like, 20 million people that they were aiming to, kind of, train and expose and equip and skill with AI, and
282
00:36:49.650 --> 00:36:52.779
Marion: Yay! I mean, on the surface, doesn't that all sound great?
283
00:36:53.040 --> 00:36:54.749
Marion: But… I don't know.
284
00:36:55.040 --> 00:37:02.550
mark evans: Education, you see them all: OpenAI, Anthropic, Google, Microsoft.
285
00:37:03.370 --> 00:37:06.439
mark evans: They're only interested in user base and volume.
286
00:37:06.830 --> 00:37:13.670
mark evans: And education's almost become, like, a commodity within that. It's a… it's a trans… it's a currency.
287
00:37:13.860 --> 00:37:20.659
mark evans: So, we give this away free, we get something back in return: that's what they do. If you look at them all, they're all giving you free education
288
00:37:20.690 --> 00:37:34.190
mark evans: on their systems, on their LLMs, and they get you hooked, so you actually understand their system and the benefits of their system. So it's like live advertising being fed through education.
289
00:37:34.190 --> 00:37:42.270
mark evans: And there's no… there's no guardrails on that. If you get the teachers involved, and you look as if you're doing your…
290
00:37:42.440 --> 00:37:44.820
mark evans: Your worldly service for the greater good.
291
00:37:44.820 --> 00:37:45.290
Marion: Hmm.
292
00:37:45.290 --> 00:37:55.139
mark evans: and you give it away for free, then they're trained on Microsoft Copilot. And before you know it, there's an education… before they can start to
293
00:37:55.340 --> 00:38:01.940
mark evans: You know, from a business model point of view, you've got a massive untapped market to start releasing modules that they
294
00:38:01.940 --> 00:38:02.320
Marion: Yeah.
295
00:38:02.320 --> 00:38:08.799
mark evans: understand how to use, and they can start to commercialize education using chatbots and avatars.
296
00:38:08.800 --> 00:38:09.339
Marion: I was rolling.
297
00:38:09.340 --> 00:38:13.039
mark evans: Wonderful things, and start looking at age targeting from
298
00:38:13.040 --> 00:38:31.150
mark evans: pre-teen, you know, preschool, nursery, right through into primary, secondary, or higher education. So, it's just changing the mindset and trying to understand, from the commercial model point of view, what makes these guys tick, and it's about market share.
299
00:38:31.150 --> 00:38:39.510
mark evans: It's ultimately about the shareholders. It's… they've got… that's their responsibility. It's not for the world's good. We've seen that with social media. These guys don't
300
00:38:39.510 --> 00:38:39.950
Marion: look up.
301
00:38:39.950 --> 00:38:50.930
mark evans: get out of bed in the morning and say, what am I going to do to make the world a better place today? No, they don't. It's like, what can we do to make more money for our shareholders? Because that is our… that is our mission.
302
00:38:51.060 --> 00:38:56.769
mark evans: There's no other… there's no other reason for it. They have that… that's the drive. That's it.
303
00:38:56.770 --> 00:38:57.230
Marion: Right.
304
00:38:57.230 --> 00:39:12.029
mark evans: They can weaponize their system within cultures and communities and society, and then normalize that, and they can make money from it. That's the start of it. And that's what we're seeing just now. That's what I'm seeing, and I'm just maybe being a bit…
305
00:39:12.030 --> 00:39:17.640
mark evans: cynical through the commercial mindset, but I can see exactly the direction of travel with this.
306
00:39:17.920 --> 00:39:28.460
Marion: Yeah, do you know what just occurred to me when you were talking about schools and stuff as well? Like, and I don't think this is as much of a thing in the UK, but it certainly is here in the US, where
307
00:39:28.820 --> 00:39:29.730
Marion: you know.
308
00:39:29.790 --> 00:39:44.550
Marion: The disparity in… in resource and funding for schools can be quite significant, and, you know, you'll have… you'll hear countless, countless stories of inner-city schools, projects, whatever.
309
00:39:44.550 --> 00:39:55.090
Marion: Where poor teachers, very low-paid, very under-resourced, are actually buying their own school supplies out of their own pocket because the school can't afford it, right?
310
00:39:55.550 --> 00:40:08.120
Marion: which makes me then wonder about this, because, you know, is this like, you know, when we were kids and everybody wanted a pair of Nike or Jordans, but, you know, your mum and dad couldn't afford it, right? And you never… you could have that sort of, like.
311
00:40:08.290 --> 00:40:21.050
Marion: I don't have that type thing. And I'm starting to wonder, like, is this going to be the new Nike or Jordans? Do you know what I mean? Like, is this something where the disparity gets wider because of economic factors?
312
00:40:21.060 --> 00:40:34.020
Marion: Where some kids are really getting to use this and embrace it, and maybe some aren't. I… again, I don't know enough about that, but I'm just kind of, like, trying to read the tea leaves here based on what I do know from, certainly in the US.
313
00:40:35.190 --> 00:40:35.680
Danny Gluch: Yeah.
314
00:40:35.680 --> 00:40:38.379
mark evans: I think that there's definitely… sorry, sorry, Danny, when you…
315
00:40:38.380 --> 00:40:40.030
Danny Gluch: Oh, no, no, no, go ahead, Mark.
316
00:40:40.280 --> 00:40:50.459
mark evans: I was just going to say, I think if people kind of reframe the problem when it comes to schools, and I keep going on about schools, when the paper's about wider issues, but
317
00:40:50.460 --> 00:41:07.400
mark evans: When you look at where it all starts, we would… there would be public outrage if we had an alcohol company or a tobacco company standing in the playground selling or sampling their wares to kids on their phones, but yet we're allowing this generation of
318
00:41:07.470 --> 00:41:16.920
mark evans: kids to come through where they're at their most vulnerable. They not only deal with the complexities of social media and mobiles and smartphones and that ship sailed.
319
00:41:17.260 --> 00:41:17.730
Marion: Oh, you should….
320
00:41:17.730 --> 00:41:32.529
mark evans: See, politically, you know, there's a lot of people trying to do something about that, certainly for, you know, young children up to, sort of, 16, as it is in Australia. But when you start to add another layer into that, and watching
321
00:41:32.560 --> 00:41:45.470
mark evans: Rules being twisted, and governments sitting on their hands, and allowing this to start to infiltrate and feed through. I fear that we've lost the battle if we allow it to become…
322
00:41:45.550 --> 00:41:47.170
mark evans: unchecked.
323
00:41:47.980 --> 00:41:53.899
Danny Gluch: Yeah, I think guardrails are really important, and Marion, it's so interesting that you brought up the funding.
324
00:41:54.280 --> 00:41:54.730
Marion: I hate it.
325
00:41:54.730 --> 00:42:06.499
Danny Gluch: And that is definitely a very American problem. And in our conversation, one of the things that I think I would love our listeners to understand is that the commodity of
326
00:42:07.250 --> 00:42:11.670
Danny Gluch: Individuals of human resources going into the future is going to be critical thinking.
327
00:42:12.360 --> 00:42:15.639
Danny Gluch: And the question is gonna be, who actually…
328
00:42:16.430 --> 00:42:36.379
Danny Gluch: it goes through an education system that still allows for critical thinking to be practiced, right? It's hard to teach critical thinking. That's actually what I did in the university for quite a while. It's hard to teach. It really is. It's more of a practice. It's more like driving. You have to just learn by doing, right? You get better.
329
00:42:36.750 --> 00:42:39.530
Danny Gluch: Not by reading a book about it, but by practicing it.
330
00:42:40.080 --> 00:42:49.240
Danny Gluch: And I think it's gonna be the schools that are underfunded that don't have the ability to have teachers take the time to sit down and say, tell me how you got there.
331
00:42:49.740 --> 00:43:01.800
Danny Gluch: Right? Those are going to be the schools that have lots of funding, that have a low, you know, student-to-teacher ratio, and they can sit down and say, tell me how you got there as early as, you know, age 7, 8, 9.
332
00:43:02.020 --> 00:43:08.969
Danny Gluch: And it's gonna be the schools that aren't funded well, that's just like, hey, they turned something in at least, it looks good to me, go through.
333
00:43:09.300 --> 00:43:09.849
Marion: It's gonna….
334
00:43:09.850 --> 00:43:21.370
Danny Gluch: it's gonna create such a big divide. The one thing I'm seeing, in the US at least, is, like, a total reversion. It's, like, going back to the 1970s. They're using, like,
335
00:43:21.380 --> 00:43:33.520
Danny Gluch: you know, paper, tests, they're… they're not giving homework. There's… it's a lot more, like, stand and deliver, show me the stuff, as opposed to go home and do it, and do the.
336
00:43:33.520 --> 00:43:33.980
Marion: Yeah.
337
00:43:33.980 --> 00:43:44.430
Danny Gluch: there, because they can't trust it anymore. And I don't know if that's the right guardrail, but I think it's heading in the right direction to where it's…
338
00:43:44.840 --> 00:43:47.349
Cacha Dora: The institution isn't trusting the output.
339
00:43:47.570 --> 00:43:50.409
Danny Gluch: Why would you? I wouldn't.
340
00:43:50.710 --> 00:44:03.940
Marion: happening… that's happening at the job-seeker level, like, working in, you know, working in the tech environment, I hear these stories where, you know, people will be using, you know, you've got those
341
00:44:04.650 --> 00:44:09.900
Marion: sort of SaaS platforms that you can use to test, you know, technical ability, right?
342
00:44:11.130 --> 00:44:23.339
Marion: They've been trying to find all sorts of ways to circumvent that, down to having someone in a different country answer it and show it to them, and then they're, like, inputting it. Like, it's wild!
343
00:44:23.540 --> 00:44:24.080
mark evans: Hi, beautiful.
344
00:44:24.080 --> 00:44:29.920
Marion: people are trying to navigate and circumvent some of these tools, and so when you were saying that there, Danny, I was like…
345
00:44:33.150 --> 00:44:34.479
Marion: It's almost normalized.
346
00:44:35.510 --> 00:44:44.270
mark evans: So it's almost normalized, the… seeking to avoid effort, if that makes sense. It's almost normalized this
347
00:44:44.520 --> 00:44:54.800
mark evans: whole thing where people are always looking for the fastest, easiest way to obtain something, and that's part and parcel, I guess, of a cultural thing, which has gone on right across the world, this very
348
00:44:55.220 --> 00:44:57.909
mark evans: throwaway culture. It's just a version of that…
349
00:44:57.910 --> 00:45:00.900
Danny Gluch: ethical drift that you were talking about, right? Yeah.
350
00:45:00.900 --> 00:45:11.999
mark evans: But it's acceptable, yeah. I use that one: water will always find the easier path. And we've become… there's just a sense of, well, you know, first of all…
351
00:45:12.340 --> 00:45:14.750
mark evans: you know, I feel there's this…
352
00:45:15.180 --> 00:45:17.790
mark evans: You know, how can you say it without…
353
00:45:18.260 --> 00:45:29.270
mark evans: just a huge gap in people's ability to focus on problem solving, and that is… just now, if we don't put a stop to that,
354
00:45:29.270 --> 00:45:40.999
mark evans: it's almost as if we're just saying, well, you know, effort doesn't really matter. You know, it doesn't really matter. And how's that going to be when it gets… in the workforce, when you do make that important hire.
355
00:45:41.220 --> 00:45:45.509
mark evans: And they're… they've not got a computer at their fingertips, or they've not got
356
00:45:45.860 --> 00:45:48.750
mark evans: one at their fingertips, and they've got a problem to solve.
357
00:45:48.750 --> 00:45:49.819
Cacha Dora: And….
358
00:45:49.820 --> 00:46:07.799
mark evans: that's… that's what you're… you've then hired this person, taken them into the business, but they don't understand how to do that critical thinking. They've not got that resilience built in, and that's the part which is really, really dangerous for business. And I'm seeing it, and I'm hearing it. That seems to be the case.
359
00:46:08.850 --> 00:46:19.700
Danny Gluch: Yeah, it really does, and I'm terrified. I'm glad I'm not in recruiting, because I'm terrified of what it's gonna look like in 5 years, right? If it's this bad now, like.
360
00:46:20.140 --> 00:46:28.129
Danny Gluch: ChatGPT just got normalized, what, like, two and a half, three years ago? Yeah. And this is where we are now.
361
00:46:28.290 --> 00:46:29.440
Danny Gluch: And…
362
00:46:30.090 --> 00:46:47.629
Danny Gluch: So, as we wrap up, what are… what's the main lesson you would like, whether it's, you know, towards schools, or towards individuals, parents, or towards organizations? What's one thing you would love to see, or a message you'd like to give them that would help them
363
00:46:47.880 --> 00:46:51.350
Danny Gluch: not be floundering looking for talent in 5 years.
364
00:46:51.470 --> 00:46:56.209
mark evans: Do you know what the most important thing is? You know, there's a term called human in the loop.
365
00:46:56.660 --> 00:47:07.770
mark evans: And, you know, the human-in-the-loop term comes from a sort of military directive, where semi-autonomous drones were used, and any kill decision or any weapon decision
366
00:47:07.950 --> 00:47:21.229
mark evans: you always had to have, under the laws of war, and certainly under charters, that you have a human in the loop, so a machine doesn't decide to take the life of a human being on its own. That's an ethical thing, that's just a
367
00:47:21.230 --> 00:47:21.880
Marion: I think.
368
00:47:21.880 --> 00:47:26.959
mark evans: moral law, but obviously there'll be certain people that'll break that in the future, and…
369
00:47:26.990 --> 00:47:44.819
mark evans: But from a Western, from a culture that cares about rules and laws of the world, that's the starting point for me. We need to bring those human-in-the-loop guardrails into the systems of education, into the systems of business,
370
00:47:44.820 --> 00:47:55.049
mark evans: and hiring. And that we have to change: going back to written tests and verbal assessments and
371
00:47:55.050 --> 00:48:07.710
mark evans: testing. And that may be, when you're hiring someone, before you give them the job, you actually give them something which is work-related as an assessment. And the depth of investigation
372
00:48:07.880 --> 00:48:12.409
mark evans: to unearth the true person behind the CV,
373
00:48:12.530 --> 00:48:25.330
mark evans: the gloss of the CV, that's going to be super important. So I think my mantra is human in the loop all the time, because AI's here to stay, and it's going to grow, and it's going to have an effect on everyone's life,
374
00:48:25.330 --> 00:48:40.110
mark evans: whether you like it or not. It's preparing and making sure that the people who are educating, and the people who are leaders in the world, are aware of these risks, and planting humans around all the key decisions
375
00:48:40.320 --> 00:48:48.310
mark evans: to make sure the process is as humanized as possible, so AI becomes augmentation rather than replacement.
376
00:48:48.850 --> 00:48:50.270
Marion: I love it.
377
00:48:51.240 --> 00:49:03.089
Danny Gluch: Thank you so much. That was just a phenomenal conversation. I hope that everyone can take those lessons and start to think about and be strategic in how AI is working in the workforce, because…
378
00:49:03.090 --> 00:49:13.809
Danny Gluch: that human in the loop really does sound like it would solve just a lot of the concerns and red flags that people are experiencing right now. So, thank you for joining us, Mark. Thank you, everyone, for listening.
379
00:49:13.810 --> 00:49:14.590
mark evans: Great to be here.
380
00:49:15.100 --> 00:49:29.869
Danny Gluch: Please be sure to subscribe, leave us a 5-star review, leave us comments and notes on LinkedIn, be sure to message us if you have an idea for an elephant that you want us to talk about on another episode. Thank you all so much, have a great day.
381
00:49:30.290 --> 00:49:31.189
mark evans: Thank you.