The Elephant in the Org

Catfished by Ralph: What a LinkedIn Social Experiment Taught Us About Bias, Voice & Authenticity with Aubria Ralph (AI Special - Part 1)

The Fearless PX Season 3 Episode 2


What happens when ChatGPT decides you’re not a Black woman… but a 40-year-old white man named Ralph?

That’s exactly what happened to attorney and entrepreneur Aubria Ralph, who turned an AI glitch into a full-blown LinkedIn social experiment. She flipped her name, posted as “Ralph Aubria,” and watched the offers, DMs, and algorithm love roll in. The results? Both hilarious and deeply troubling.

In this episode, Marion and Danny unpack:

  • Why AI assumed “thoughtful leader” = white male.
  • How LinkedIn rewarded Ralph with more visibility than Aubria.
  • What happens when bias gets coded, amplified, and normalized online.
  • Why authenticity keeps getting buried under clickbait and copy-paste content.

This is Part 2 of our AI Trilogy — three back-to-back episodes exploring how AI is reshaping trust, work, and psychological safety.

🎧 Listen now to hear how a fake Ralph exposed real bias — and what it says about the future of voice and authenticity online.

Get the Show Notes here

🐘 Connect with Us:

🚀 Follow The Fearless PX on LinkedIn: The Fearless PX
📩 Got a hot take or a workplace horror story? Email Marion, Cacha, and Danny at elephant@thefearlesspx.com

🎧 Catch every episode of The Elephant in the Org: All Episodes Here

🚀 Your Hosts on LinkedIn:

🐘Marion Anderson

🐘Danny Gluch

🐘Cacha Dora

💬 Like what you hear?
Subscribe, leave a ★★★★★ review, and help us bring more elephants into the light.

🎙️ About the Show

The Elephant in the Org drops new episodes every two weeks starting April 2024.
Get ready for even more fearless conversations about leadership, psychological safety, and the future of work.

🎵 Music & Production Credits

🎶 Opening and closing theme music by The Toros
🎙️ Produced by The Fearless PX
✂️ Edited by Marion Anderson

⚠️ Disclaimer

The views and opinions expressed in this podcast are those of the hosts and guests, and do not necessarily reflect any affiliated organizations' official policy or position.

S3 Episode 2 Transcript
1

00:00:02.930 --> 00:00:08.939

Danny Gluch: Welcome back to The Elephant in the Org, everyone. I'm Danny Gluch, and I'm joined by my co-host, Marion Anderson.


2

00:00:09.310 --> 00:00:10.320

Marion: Hello!


3

00:00:10.790 --> 00:00:22.900

Danny Gluch: And today it's as much elephant in the org as it is elephant in LinkedIn, because we have a guest who did a little social experiment. Aubria Ralph. Aubria, introduce yourself.


4

00:00:23.130 --> 00:00:38.540

Aubria Ralph: Hi! My name is Aubria Ralph. I'm the CEO and founder of Scrappy Go Project. I am a dabbler, but ultimately I am an attorney, business owner, and recent AI enthusiast and ethicist.


5

00:00:38.540 --> 00:00:44.660

Danny Gluch: Oh, man! And one of the things is that people, you know, we're trying


6

00:00:44.820 --> 00:01:10.230

Danny Gluch: to progress our careers and make an impact in the business world, help people who maybe are more junior in their careers and their professional life than we are. And so we go to LinkedIn, and we post, and we try to encourage people, and part of that is trying to build a following so that the effort we're putting in is mirrored and matched by a community that we're reaching. And


7

00:01:10.990 --> 00:01:17.149

Danny Gluch: the problem is that there are computers and algorithms and all sorts of things behind that that


8

00:01:17.850 --> 00:01:42.090

Danny Gluch: determine whether or not you're going to be impactful. So tell us a little bit about this social experiment, what it was, what made you think of it. What was that like? Were you, you know, at a diner in, like, upstate New York, and you had, you know, some brilliant vision, and you wrote it on a napkin or something? What was the origin story of what made you want to do this, and tell us what it is you actually ended up doing.


9

00:01:42.790 --> 00:01:48.060

Aubria Ralph: You know, I wish I was that, like, dramatic in the way I was doing it.


10

00:01:48.830 --> 00:02:02.869

Aubria Ralph: I mean, that sounds really great. Maybe I'll have to come up with, like, a really awesome "I was writing on a napkin" story, you know, and then, you know. But ultimately, you know,


11

00:02:03.350 --> 00:02:16.890

Aubria Ralph: I spend a lot of time kind of tweaking and playing around with technology as a whole. And so, you know, one thing I've been kind of


12

00:02:17.380 --> 00:02:26.910

Aubria Ralph: making myself do every day, almost every day now, is spending a little bit of time on ChatGPT. Just,


13

00:02:27.180 --> 00:02:34.720

Aubria Ralph: you know how you just get random thoughts, or like you're in a monologue that just keeps going, and so,


14

00:02:34.940 --> 00:02:55.349

Aubria Ralph: so for me, it's like, it's a good place for me to kind of dump a lot of the stuff I can't talk to actual humans about, because then, you know, they would think I'm some kind of crazy person, probably. But ultimately, one morning, a few months ago, like early April,


15

00:02:55.590 --> 00:03:04.350

Aubria Ralph: I was working on ChatGPT, and I just got it in my head to ask, you know, chat, to say, like, Hey,


16

00:03:04.800 --> 00:03:07.610

Aubria Ralph: what do you think like? What do you think I look like?


17

00:03:09.810 --> 00:03:10.460

Aubria Ralph: What did they.


18

00:03:10.460 --> 00:03:11.540

Danny Gluch: Say.


19

00:03:11.540 --> 00:03:33.360

Aubria Ralph: And it generated an image of this Mediterranean-looking white man with glasses, like, middle-aged, so like 38 to 40. And what was funny about it was it didn't ask any follow-up questions, and so at first I thought, Oh, this is funny.


20

00:03:33.450 --> 00:03:46.990

Aubria Ralph: and part of it is, like, ChatGPT is always spitting out stuff that doesn't always jibe, right? And so I posted it on LinkedIn. And then people were like, Oh, my God, this is crazy! And I was like,


21

00:03:47.120 --> 00:03:55.689

Aubria Ralph: I guess it is. And then I thought I was like, I wonder what's the rationale for this? Because I hadn't at this point.


22

00:03:55.930 --> 00:04:25.730

Aubria Ralph: even still, to this day, I've never told ChatGPT my name. I've been very anonymous in my use. Like, I'm sure the engineers know who I am, just because my email's there, so you could probably find it. But it didn't have any of that. It doesn't know I'm a Black woman, it didn't know any of those things, it didn't know my name was Aubria or Ralph. It didn't know, you know,


23

00:04:25.730 --> 00:04:32.759

Aubria Ralph: kind of all these details which I found out later. A lot of people are very willing to just kind of dump


24

00:04:32.900 --> 00:04:55.939

Aubria Ralph: that information into ChatGPT, and we could talk about that a little bit later. But ultimately, you know, one of my friends, like, actual friends on LinkedIn, was kind of like, hey, you should ask it what the rationale is. And so I went back, and I was like, oh, why do you think I'm this, like, random white dude?


25

00:04:56.320 --> 00:05:18.849

Aubria Ralph: and ChatGPT was kind of like, Oh, well, you know, you're always asking about, like, leadership. You know, you're really thoughtful in your approach. You know, like, you have really brilliant ideas. And I actually, actually, you know,


26

00:05:19.990 --> 00:05:24.420

Aubria Ralph: essentially, I know.


27

00:05:24.570 --> 00:05:34.640

Danny Gluch: The amount of head shaking that just went on in the little video chat was just astronomical. I could not believe that's what it said.


28

00:05:34.989 --> 00:05:46.870

Aubria Ralph: And I'm not, like, telling you verbatim. But you can actually go on, like, I actually posted on LinkedIn the actual rationale, so people could see it. But ultimately, it


29

00:05:47.220 --> 00:06:11.820

Aubria Ralph: it really spat out what I think most people already believe, right? Because we all talk about all the potential bias of AI technology. Like, I feel like that's something even the people not using the technology on a regular basis are talking about, right? But to kind of see it so plainly done, I thought, Oh, this is very interesting!


30

00:06:12.080 --> 00:06:29.309

Aubria Ralph: And the other thing that was very interesting was, I decided to name him. I decided to name my new character Ralph Aubria, and so I just flipped my name. This is the beauty of having two first names,


31

00:06:29.800 --> 00:06:30.960

Aubria Ralph: you know.


32

00:06:31.140 --> 00:06:43.050

Aubria Ralph: But I'm not from the States originally. So when I first moved from the Caribbean, I remember, in 7th grade, my teacher, you know, in homeroom, the teacher going through the roster


33

00:06:43.570 --> 00:06:49.550

Aubria Ralph: and saying like getting to my name, and I knew she had gotten to my name


34

00:06:49.730 --> 00:06:53.223

Aubria Ralph: by the look of perplexity on her face.


35

00:06:54.930 --> 00:06:58.439

Aubria Ralph: And and then she goes, Ralph!


36

00:06:58.580 --> 00:06:59.360

Aubria Ralph: Ralph


37

00:06:59.870 --> 00:07:12.359

Aubria Ralph: and I thought that was so odd, that she didn't even try to, like, say my first name, which is phonetically spelled. Like, give me a break, it's not that hard, you just, like, read it from left to right, and it works,


38

00:07:13.000 --> 00:07:14.090

Aubria Ralph: but


39

00:07:14.520 --> 00:07:23.880

Aubria Ralph: I like that. That's like a chip I've always had on my shoulder of how people are more willing to think that there's a random mistake


40

00:07:24.390 --> 00:07:26.349

Aubria Ralph: in the list of names.


41

00:07:27.020 --> 00:07:33.816

Aubria Ralph: than for my name to be, like... it's like, Ralph can't be a last name, almost, kind of thing, right? Yeah.


42

00:07:34.380 --> 00:07:38.700

Aubria Ralph: Which is ironic, because it's, like, actually a popular last name in, like,


43

00:07:38.770 --> 00:08:07.889

Aubria Ralph: England and the Caribbean. But that's another story. And so I ended up, you know, doing that. And I was like, Oh, we're gonna call him Ralph. And I started playing with it and being very playful with the whole thing on LinkedIn. And then I asked people to go ahead and ask Chat, because I wanted to see if it was a glitch. Like, is this something that just happened with me? Or is this something that's, like,


44

00:08:07.910 --> 00:08:18.139

Aubria Ralph: they're just doing, right, with the technology, where it's just assuming. And what I noticed was, as more people


45

00:08:18.380 --> 00:08:22.329

Aubria Ralph: started asking ChatGPT to generate an image of them,


46

00:08:22.520 --> 00:08:25.090

Aubria Ralph: It started asking more questions.


47

00:08:25.120 --> 00:08:54.640

Aubria Ralph: which was very interesting, where it was like, Oh, I can't come up with an image of you unless you, like, tell me. And I thought, Oh, that's very interesting. Does ChatGPT know that I'm, like, running my experiment over here, you know? Like, is it keeping a record, almost? But I think it also plays into the idea that, like, the more we use the technology, the more it learns, and the more it, you know, adjusts itself, right? And so, you know, it's just


48

00:08:54.640 --> 00:09:01.599

Aubria Ralph: something I think we need to keep in mind. Because I wonder, like, if I never did something like that, and then I didn't have,


49

00:09:01.650 --> 00:09:04.440

Aubria Ralph: I think, something like 200 people


50

00:09:04.700 --> 00:09:09.559

Aubria Ralph: within a three-hour period that morning


51

00:09:09.740 --> 00:09:16.250

Aubria Ralph: did the same, like, literally put in the prompt. And so now I have, I actually have the


52

00:09:16.510 --> 00:09:30.409

Aubria Ralph: the actual post where, I was like, Oh, share your image, please! And then I started coming up with a story of who those people were, and how they were connected to Ralph.


53

00:09:30.869 --> 00:09:56.209

Aubria Ralph: And so, like, we created this whole alternate, you know, reality where Ralph, you know, is part of this, like, dramatic family where his dad cheated on his mom with his mom's sister. It's, like, so messy. You know, there's, like, a real estate developer that wants to steal their vineyard, you know,


54

00:09:56.520 --> 00:10:01.002

Aubria Ralph: in Italy like. And this was all just kind of improv


55

00:10:01.710 --> 00:10:10.995

Aubria Ralph: And then at one point, you know, somebody reached out to me via DM and was kind of like, hey,


56

00:10:12.670 --> 00:10:16.889

Aubria Ralph: You know, I I really love what you're doing with this.


57

00:10:19.220 --> 00:10:27.000

Aubria Ralph: is it a, you know, and I was thinking about using bits and pieces of your posts to create content on my page.


58

00:10:27.690 --> 00:10:29.389

Aubria Ralph: And I thought, well, that's odd.


59

00:10:30.580 --> 00:10:50.979

Aubria Ralph: And so I asked, oh, well, why wouldn't you just, like, do your own research and come up with your own content? And this woman decided to, like, lambaste me. And then she actually accused me of plagiarism, saying that I copied somebody else,


60

00:10:51.500 --> 00:10:54.300

Aubria Ralph: and so rewind a little bit.


61

00:10:54.450 --> 00:11:05.470

Aubria Ralph: And there's this woman, her first name's Aaliyah, but she's this content creator who actually, like, created a whole fake account


62

00:11:07.039 --> 00:11:10.719

Aubria Ralph: where she held herself out as a white woman


63

00:11:11.180 --> 00:11:13.559

Aubria Ralph: hadn't been able to get work


64

00:11:13.770 --> 00:11:32.480

Aubria Ralph: as a Black content creator, but now was just being offered roles. Like, not even "you need to apply for this director of blah blah," just, "Oh, here, we think you'd be good for this director of blah blah role." And she documented that stuff for, like, 8 months.


65

00:11:33.050 --> 00:11:34.630

Aubria Ralph: I had heard about it.


66

00:11:34.760 --> 00:11:39.249

Aubria Ralph: But my Ralph experiment was not that deep like I'm not.


67

00:11:43.090 --> 00:12:07.570

Aubria Ralph: I literally asked ChatGPT to come up with the image of me, because I wanted to see what it would say. Like, I didn't go into it thinking, Oh, I'm going to find, like, racism. You know, that wasn't my... I don't want to lie to people and say that was my goal, because it wasn't. Like, I was just having fun with the tech to see,


68

00:12:07.570 --> 00:12:08.080

Danny Gluch: Yeah.


69

00:12:08.080 --> 00:12:14.960

Aubria Ralph: To kind of see how ridiculous you know what image it would come up with. And just as a sidebar.


70

00:12:16.080 --> 00:12:20.189

Aubria Ralph: there's a running joke on Facebook


71

00:12:20.440 --> 00:12:32.169

Aubria Ralph: of, like my my family there, because, like, I'm only connected to family on Facebook, where every time I play a Facebook game and like, and we have like a record of 15 years of this


72

00:12:32.610 --> 00:12:37.759

Aubria Ralph: Facebook like, it's like, Oh, what are you gonna look like in a in 10 years?


73

00:12:38.060 --> 00:12:38.840

Marion: Oh, yeah.


74

00:12:38.840 --> 00:12:40.721

Aubria Ralph: You know, like those kinds of things.


75

00:12:41.430 --> 00:12:44.039

Aubria Ralph: It's always like a white woman or a white man.


76

00:12:45.080 --> 00:12:50.719

Aubria Ralph: and so like a little bit of me was expecting it to be a white person


77

00:12:50.970 --> 00:13:13.319

Aubria Ralph: just based on, I'm like, Well, Facebook has already established to the world that I'm either a white woman or a white man. Like, it's not certain of who I am. It's like, we know she's white, we know they're white, but we don't know if they're a man or a woman, right? And maybe that shows that


78

00:13:14.370 --> 00:13:27.893

Aubria Ralph: I don't really fit into, like, whatever stereotypes you might have for a man or a woman. Maybe that's what that says, based on the kind of things I consume on the Internet. It's hard to pin me down,


79

00:13:28.380 --> 00:13:32.119

Aubria Ralph: but what I didn't expect was the rationale.


80

00:13:32.420 --> 00:13:33.050

Marion: Yeah.


81

00:13:33.050 --> 00:13:39.020

Aubria Ralph: You know, it's like, why, that was the question in my head, like, why did you think


82

00:13:39.410 --> 00:13:48.260

Aubria Ralph: that, like? Because somebody's, quote unquote, I hate this word, but I have to say it, articulate. And, you know, yeah,


83

00:13:48.510 --> 00:14:01.079

Aubria Ralph: and actually like being thoughtful about updates to your, you know, whatever you're generating as a technology. Or if I you know, it's like, Oh, like even


84

00:14:01.500 --> 00:14:12.250

Aubria Ralph: the idea that I had to be a white person, a white man specifically, I should say, in order to have this, like, holistic view of life, too,


85

00:14:12.420 --> 00:14:23.510

Aubria Ralph: was to me very troubling. And so now I wanted to kind of play with whether or not people on Linkedin


86

00:14:24.740 --> 00:14:27.139

Aubria Ralph: had that kind of unconscious bias.


87

00:14:27.680 --> 00:14:28.960

Aubria Ralph: And so


88

00:14:29.070 --> 00:14:42.129

Aubria Ralph: it was right around, like, just before Easter happened. I decided, as part of my experiment, to make Ralph my profile photo


89

00:14:43.077 --> 00:14:45.410

Aubria Ralph: for 3 days.


90

00:14:47.740 --> 00:14:50.349

Marion: And that's right around the time that we.


91

00:14:50.350 --> 00:14:50.840

Aubria Ralph: Yeah.


92

00:14:50.840 --> 00:14:52.871

Marion: It was right around then. Yeah.


93

00:14:53.210 --> 00:15:03.900

Aubria Ralph: And so I flipped my name again on LinkedIn. And this, like, just opened a whole new can of worms for me. But, but what?


94

00:15:04.400 --> 00:15:06.870

Aubria Ralph: And and for me, it's like I'm


95

00:15:08.160 --> 00:15:27.489

Aubria Ralph: I ended up having a banner, I had a banner with, like, my actual photo and who I actually was there. It was very clear. And then in my headline I said, Hey, you know, something like, I'm cosplaying as Ralph for the next couple of days, because I just want to, like, check something out.


96

00:15:28.150 --> 00:15:31.330

Aubria Ralph: So I was very open about what I was yeah doing.


97

00:15:31.480 --> 00:15:39.940

Aubria Ralph: But what was really bizarre was, in a similar way to that woman Aaliyah,


98

00:15:40.060 --> 00:15:53.120

Aubria Ralph: I started getting people just slipping into my DMs that were, you know, mostly, like, older white men. And somebody reached out for coaching services from Ralph Aubria,


99

00:15:55.300 --> 00:16:03.300

Aubria Ralph: like, Okay, now, he's not a real person. And I was very clear about that.


100

00:16:03.490 --> 00:16:28.900

Aubria Ralph: But you know, and so now I had to... so now it's like I have, like, a string of DMs where I'm like, oh, I think there's some misunderstanding, this is what's happening. Like, if you go back and check my profile, you'll see, like, I've been upfront about this. And what was very interesting was people now no longer either wanting to collaborate or hire me, once they knew


101

00:16:29.470 --> 00:16:34.350

Aubria Ralph: that it's like my profile is good when you thought I was, Ralph.


102

00:16:34.720 --> 00:16:38.340

Aubria Ralph: But I told you I wasn't Ralph


103

00:16:39.200 --> 00:16:49.588

Aubria Ralph: in my headline. And so it's, like, really bizarre, that kind of mind game they played with themselves to make him be a real person.


104

00:16:50.320 --> 00:16:58.830

Aubria Ralph: And then it took LinkedIn almost six weeks to flip my name back.


105

00:17:00.400 --> 00:17:01.020

Danny Gluch: Really.


106

00:17:01.020 --> 00:17:05.779

Aubria Ralph: So yeah. And so I couldn't comment as myself.


107

00:17:06.130 --> 00:17:15.619

Aubria Ralph: People couldn't tag me so like even after I flipped it. And it looked like my name was back to normal. People couldn't tag me


108

00:17:15.730 --> 00:17:19.539

Aubria Ralph: in like in anything like if, like.


109

00:17:19.710 --> 00:17:32.439

Aubria Ralph: there were people who, like, I did podcasts with or whatever, you know, or, like, any kind of announcements where they wanted to tag me, they couldn't. And then, when I commented, it would always show up as Ralph Aubria.


110

00:17:34.940 --> 00:17:35.770

Aubria Ralph: it's so.


111

00:17:36.130 --> 00:17:46.619

Aubria Ralph: It was so messy. It was like very messy. And so that plays into what you were saying about the algorithm of how Ralph became prioritized


112

00:17:46.920 --> 00:17:53.139

Aubria Ralph: in the in this in the algo, because he was attached to this like white


113

00:17:54.010 --> 00:18:08.530

Aubria Ralph: face that I created, you know, that ChatGPT created, you know. And so, like, you know, we all have suspicions about the algorithm on LinkedIn, right? And so, did it confirm...


114

00:18:08.530 --> 00:18:09.810

Danny Gluch: Pandora's box.


115

00:18:10.140 --> 00:18:10.640

Danny Gluch: Really.


116

00:18:10.640 --> 00:18:11.440

Marion: Completely.


117

00:18:11.440 --> 00:18:23.919

Danny Gluch: One of the things that we talk about with the algorithm, and that's on both sides, right? All the large language models, they're just an algorithm that predicts language, right? It's not actual AI or anything.


118

00:18:23.920 --> 00:18:25.679

Aubria Ralph: No, it's not. No, it's not.


119

00:18:25.680 --> 00:18:39.180

Danny Gluch: Right. But it's this black box that we don't really know how it functions, so we're trying to do our best. And it was pretty obvious and very telling when you asked, like, can you give me your rationale, that, like,


120

00:18:39.500 --> 00:18:41.930

Danny Gluch: on one hand, it's


121

00:18:42.950 --> 00:18:53.839

Danny Gluch: it's not really giving its rationale, because it's not telling you, like, this part of the algorithm triggered this. But what it's doing, again, is predicting, based on the language input,


122

00:18:53.840 --> 00:18:54.460

Aubria Ralph: Oh!


123

00:18:54.460 --> 00:19:05.980

Danny Gluch: what will sound good linguistically. And it said, hey, well, I created this image. And it's so interesting that it chose a white


124

00:19:06.910 --> 00:19:10.260

Danny Gluch: man when you're none of those things.


125

00:19:10.260 --> 00:19:16.310

Aubria Ralph: No, it's it's really when I, you know, like I look back on that time. And I I


126

00:19:17.250 --> 00:19:22.380

Aubria Ralph: I think about the attention it got. But also, like, I think about


127

00:19:24.010 --> 00:19:41.420

Aubria Ralph: why we have to be so careful, even with our interactions with people online, too. Because then I like, although I went through the trouble of saying, Oh, this experiment's happening in my headline, and all of that. You know that, like


128

00:19:42.030 --> 00:19:44.170

Aubria Ralph: scammers are.


129

00:19:45.330 --> 00:20:09.220

Aubria Ralph: Putting, you know these like fake images up creating, you know, albeit very elementary. I'm like, if you're going to be a scam artist, I just feel like you need to be good, especially if you're gonna be on the Internet like that's just I don't know. I feel like, just go all in. Be good. You can't half ass it like, just don't do that.


130

00:20:09.990 --> 00:20:17.029

Aubria Ralph: But like you always find that where it's like, Oh, email me at, you know Bob, Bob James at


131

00:20:17.230 --> 00:20:25.930

Aubria Ralph: gmail.com, and it's like, Well, Bob, James, you say you're at, you know Lockheed Martin.


132

00:20:26.410 --> 00:20:37.129

Aubria Ralph: why don't you have a Lockheed Martin email. You know, if you're offering a job. And so like, there are a lot of like scam artists right now that are using this technology


133

00:20:37.250 --> 00:21:00.610

Aubria Ralph: for that purpose where it's like, oh, I can create this profile. Put it up there. You know, and like, if I you know, if I want to be nefarious like, are there people that are having fun, I'm sure, but, like, you know, there have been so many instances of like where you're like, wait like. Oh, no, you're a scam artist like.


134

00:21:01.450 --> 00:21:05.540

Marion: You know, and I've noticed as well like


135

00:21:06.020 --> 00:21:10.119

Marion: in the last, probably 6 months. Keep me honest here, Dan.


136

00:21:10.120 --> 00:21:10.610

Aubria Ralph: Favorite.


137

00:21:11.280 --> 00:21:16.600

Marion: Even we have noticed the amount of


138

00:21:16.740 --> 00:21:24.629

Marion: AI slash data scraping, and what that produces for us, like, literally on a daily basis.


139

00:21:25.180 --> 00:21:26.579

Marion: We get we use.


140

00:21:26.580 --> 00:21:27.100

Danny Gluch: Else.


141

00:21:27.100 --> 00:21:33.279

Marion: Yeah, we used to get one or two every now and again. We now get literally maybe two, three, four,


142

00:21:33.280 --> 00:21:33.630

Aubria Ralph: Andy.


143

00:21:33.630 --> 00:21:48.349

Marion: where it's, you know, a very AI-sounding generated email, where it's like, we listened to your recent podcast episode with Aubria Ralph, and we loved that you did that, right? So it's very much, like, a scraped synopsis.


144

00:21:49.350 --> 00:21:58.070

Marion: Then they're trying to suggest a guest that, more often than not, has nothing to do with what we do, what we,


145

00:21:58.070 --> 00:21:58.510

Marion: yeah, goodbye.


146

00:21:58.510 --> 00:22:04.849

Marion: But they're trying to and it it's very it just feels so insincere


147

00:22:05.980 --> 00:22:11.060

Marion: a few get through and a few of them are actually legit, and we try and sniff them out. But like


148

00:22:11.590 --> 00:22:15.990

Marion: it, it's it feels very icky.


149

00:22:15.990 --> 00:22:16.590

Marion: Yeah.


150

00:22:16.810 --> 00:22:28.190

Marion: And and it feels it just feels like, yeah, of course, we want to connect with great people. But this just feels like an uncomfortable onslaught. You know.


151

00:22:28.190 --> 00:22:29.450

Aubria Ralph: Yeah, yeah.


152

00:22:29.900 --> 00:22:36.309

Danny Gluch: But I think the Internet, and I think the algorithms, like the icky, I swear.


153

00:22:36.310 --> 00:22:36.860

Aubria Ralph: I do?


154

00:22:36.860 --> 00:23:06.500

Danny Gluch: They like the really fake-sounding fake. You know, it's one of the thoughts, you know. And I was actually talking to someone who works for one of the big AI things, in their development team. I used to work with them at an old startup in San Francisco, like, a dozen years ago. But he was saying that one of their problems is that, you know, the language models are reading stuff that people are generating with the language model.


155

00:23:06.770 --> 00:23:07.310

Danny Gluch: And it


156

00:23:07.310 --> 00:23:22.229

Danny Gluch: can't tell what is right from wrong. But it prefers things that sound like itself, it prefers the fake-sounding things. It prefers those emails that we, like, within two seconds know, this is fake, this is a scam bot.


157

00:23:22.770 --> 00:23:33.729

Danny Gluch: But the language models don't know that. But they like it. And so these algorithms, like LinkedIn, are pushing those. They're like, oh, yeah, people like this, I like this,


158

00:23:34.180 --> 00:23:40.630

Danny Gluch: things happening on, whether it's ChatGPT or whatever other one you're using. But it's,


159

00:23:41.340 --> 00:23:54.100

Danny Gluch: there is an extra barrier if you're trying to be authentic, and I think that's weird. And I think it gets doubled, maybe tripled, if you're


160

00:23:54.300 --> 00:23:57.170

Danny Gluch: trying to be authentic, and you're not a white man.


161

00:23:58.160 --> 00:23:58.510

Aubria Ralph: Yeah.


162

00:23:58.510 --> 00:24:08.250

Danny Gluch: There are all of these little checkboxes that I think make it a lot harder, and it's honestly easier to be found if you


163

00:24:09.150 --> 00:24:11.959

Danny Gluch: are Ralph Aubria, and aren't real,


164

00:24:12.640 --> 00:24:16.490

Danny Gluch: And don't post any actual like useful content.


165

00:24:18.160 --> 00:24:38.009

Aubria Ralph: No, I completely agree with that. And, you know, what was really funny about being, you know, Ralph Aubria for that brief time was, I actually went around commenting on posts that I would normally comment on as Aubria Ralph.


166

00:24:38.170 --> 00:24:41.930

Aubria Ralph: And what was really bizarre was when like.


167

00:24:43.240 --> 00:24:45.669

Aubria Ralph: it's just crazy to me how


168

00:24:46.330 --> 00:24:49.520

Aubria Ralph: everyone takes themselves so seriously on the Internet


169

00:24:49.690 --> 00:25:02.929

Aubria Ralph: and feels like they have a right to do a lot of things. And so, like, there were people that would be like, oh, well, what do you know about this? Because, you know, you're a cisgendered white man,


170

00:25:02.930 --> 00:25:25.120

Aubria Ralph: you know, talking about XYZ. And in my head I'm like, but he's spitting facts right here. Like, I know these are legit, like, I've made legitimate data points here, you know. And so it was one of those things where I was like, oh, wow! So if you don't look... so, talking about authenticity, like,


171

00:25:26.090 --> 00:25:39.439

Aubria Ralph: I don't have to be like. You don't have to look a certain way to be authentic and to be speaking the truth about reality right. And so I thought about like


172

00:25:39.760 --> 00:25:56.839

Aubria Ralph: this idea of like allyship right? And so that was something I was playing with without like, necessarily being intentional like, oh, I'm gonna comment as Ralph to, you know. Show allyship. No, it was like, Oh, this is interesting. I'm talking. And then like me realizing like, Oh.


173

00:25:57.250 --> 00:25:58.040

Aubria Ralph: this.


174

00:25:58.380 --> 00:26:04.389

Aubria Ralph: This is going sideways, because now I'm like taking heat as a white man


175

00:26:05.430 --> 00:26:17.409

Aubria Ralph: for sticking my nose in, you know, Black people's stuff or women's stuff. In fact, there was a post. Oh, I remember this post that drove me crazy. So


176

00:26:19.720 --> 00:26:31.299

Aubria Ralph: an entire family like there was like a plane crash, or like helicopter crash in New York City in the Hudson River. I don't know, like probably probably 6 weeks ago now.


177

00:26:31.690 --> 00:26:32.759

Marion: I remember that? Yeah.


178

00:26:32.760 --> 00:26:33.395

Aubria Ralph: And


179

00:26:34.180 --> 00:26:51.907

Aubria Ralph: in the news articles about it, it was like, doctor, and his wife, also a doctor, and his daughter, you know, kind of, like, showing the relationship between the people on the plane. But there was somebody, there was a woman.


180

00:26:52.820 --> 00:27:11.449

Aubria Ralph: Her image was a white woman, but, like, that could have been anybody. Now that we know that, like, people are cosplaying as other things, they could have been anybody. But basically her premise was, oh, why did "wife" have to be included? And I thought,


181

00:27:11.710 --> 00:27:16.470

Aubria Ralph: and, talking about the ickiness, it was like,


182

00:27:16.690 --> 00:27:20.739

Aubria Ralph: she was creating a problem where there wasn't a problem


183

00:27:21.250 --> 00:27:31.480

Aubria Ralph: under the guise of oh, I'm like a feminist, and I'm trying to like stand up for this woman. And I was like, wait, but they didn't exclude the fact that she was a doctor


184

00:27:32.440 --> 00:27:53.039

Aubria Ralph: like, they didn't exclude her identity. They were just showing that, no, these people were all related on this plane. And now you're turning people's tragedy into this, like, political statement that has nothing to do with anything, right? And I commented on that as Ralph.


185

00:27:54.670 --> 00:27:55.520

Danny Gluch: I bet that went well.


186

00:27:56.440 --> 00:27:58.400

Aubria Ralph: And she shredded me.


187

00:27:58.520 --> 00:27:59.460

Danny Gluch: Yeah.


188

00:28:00.430 --> 00:28:06.690

Aubria Ralph: She shredded me, you know, and then, like all the people, came in to like rescue her, and I thought.


189

00:28:07.090 --> 00:28:20.769

Aubria Ralph: like, am I, like, am I blind? And I think at one point I said, like, am I missing something here? Because I then went and, like, looked at all the articles that had been written about this tragedy,


190

00:28:21.290 --> 00:28:34.829

Aubria Ralph: and they all did a similar thing. And so I thought, like, am I missing something like. Am I wrong about this, you know, and I don't think I was wrong. I think she was crazy, but that's another.


191

00:28:34.830 --> 00:28:38.309

Danny Gluch: It wasn't. It seems like it wasn't accepted as well, because it was Ralph. Do you?


192

00:28:38.310 --> 00:28:39.600

Danny Gluch: Yeah, like I.


193

00:28:39.600 --> 00:28:48.540

Danny Gluch: Part of the premise is that if Aubria would have commented that same thing, people would have been at least a little like.


194

00:28:48.540 --> 00:28:49.730

Aubria Ralph: More open, to.


195

00:28:49.730 --> 00:28:50.670

Danny Gluch: Yeah. A little curious.


196

00:28:50.670 --> 00:28:51.920

Aubria Ralph: Sharing. Yeah.


197

00:28:51.920 --> 00:28:53.060

Danny Gluch: Exactly. Yeah.


198

00:28:53.060 --> 00:28:53.510

Aubria Ralph: Yeah.


199

00:28:53.510 --> 00:28:58.850

Danny Gluch: I think that's that's an interesting. I I think it's an interesting quirk, just sort of on the Internet. I know that.


200

00:28:58.850 --> 00:28:59.270

Aubria Ralph: Hmm.


201

00:28:59.270 --> 00:29:01.209

Danny Gluch: You know, as as a


202

00:29:01.380 --> 00:29:15.890

Danny Gluch: you know, straight white male who's got, what, I've got two degrees that centered around feminist ethics, and I've spent a lot of time in circles that, like,


203

00:29:16.020 --> 00:29:21.569

Danny Gluch: you wouldn't think I'd be really welcome in, I have been extraordinarily welcome.


204

00:29:21.570 --> 00:29:21.990

Aubria Ralph: But.


205

00:29:21.990 --> 00:29:24.409

Danny Gluch: There is a part of


206

00:29:24.620 --> 00:29:44.749

Danny Gluch: the Internet that's like, you know what, Danny, you shouldn't comment there, or you shouldn't comment this, because, you know, they're at sort of this understanding or this school of feminism, and you coming in and saying this other thing is just gonna ruffle feathers.


207

00:29:44.750 --> 00:29:49.299

Danny Gluch: It's going to make them go. Oh, that's because you're a man, not because I, you know.


208

00:29:49.860 --> 00:29:51.320

Aubria Ralph: Like, you actually like.


209

00:29:51.440 --> 00:29:58.430

Danny Gluch: Yeah, not not because, like, no, no, no, I actually like, just just think that there's another like wave of feminist thought.


210

00:29:58.430 --> 00:29:58.980

Aubria Ralph: That we should be.


211

00:29:58.980 --> 00:30:13.130

Danny Gluch: considering here. It's more like, oh, that's the white man talking, let's not do that. So yeah, there are some times where I won't comment on things like that, because I don't want to get the heat you took.


212

00:30:13.340 --> 00:30:15.170

Aubria Ralph: Yeah, no, it's not worth it.


213

00:30:15.170 --> 00:30:16.209

Danny Gluch: That it's not.


214

00:30:18.180 --> 00:30:19.090

Aubria Ralph: You know. No.


215

00:30:19.090 --> 00:30:24.449

Marion: Isn't that crazy? It's it's crazy, though, right? Because you talked about allyship.


216

00:30:24.740 --> 00:30:26.220

Aubria Ralph: And.


217

00:30:26.500 --> 00:30:45.579

Marion: The Internet is not necessarily the place for psychological safety. But, you know, at our core most people are good, and we want to be good allies, and we want to support and stand up for what's right, especially right now, where things are so inflammatory and


218

00:30:45.580 --> 00:30:46.140

Aubria Ralph: Hmm.


219

00:30:46.140 --> 00:30:48.020

Marion: And out of control.


220

00:30:48.430 --> 00:30:58.180

Marion: But then here's this other barrier to being able to do that. And again, a lot of that is behind the anonymity


221

00:30:58.520 --> 00:30:59.270

Aubria Ralph: Hmm.


222

00:30:59.450 --> 00:31:06.290

Marion: A profile right like. And so it's God. It's like one step forward, 10 steps back.


223

00:31:07.510 --> 00:31:11.630

Marion: And then the Internet, it's meant to be free speech, right?


224

00:31:11.630 --> 00:31:13.090

Marion: Hmm, hmm.


225

00:31:13.560 --> 00:31:16.649

Aubria Ralph: And no it. What I find really


226

00:31:17.240 --> 00:31:25.379

Aubria Ralph: interesting. And I was, I forget, like who I was talking to about this, whether it was like on a podcast or something. But there's just this


227

00:31:27.160 --> 00:31:36.530

Aubria Ralph: like resistance to having conversations with people who have differing opinions


228

00:31:37.680 --> 00:31:46.850

Aubria Ralph: which I think is why, like, we're so polarized because we're like, if if I never talk to people I don't agree with.


229

00:31:47.320 --> 00:31:50.719

Aubria Ralph: I never know why they think what they think.


230

00:31:51.490 --> 00:32:03.669

Aubria Ralph: you know, because I never have those conversations with them, and they never know why I think the way that I think. And I've found, I can't tell you how many times I've


231

00:32:03.950 --> 00:32:13.939

Aubria Ralph: had the hard conversations with people in person where they're kind of like. Oh, wait like when you put it that way. I


232

00:32:14.460 --> 00:32:29.519

Aubria Ralph: I kind of see what you're saying there or like. Oh, now I understand why you don't interact with that person who's toxic right? Because because, you know, it's like, oh, like you think this person is a good person. But


233

00:32:29.810 --> 00:32:36.209

Aubria Ralph: this is how they're like ravaging relationships underneath the current right.


234

00:32:36.660 --> 00:32:57.110

Aubria Ralph: And so, like, to me, if we never talk... like, this is why people think they can be explosive on the Internet, right? Like, I could do whatever I want on the Internet. And then they get upset when they have to deal with the consequences of their behavior, which, and we've seen it, is not always right away.


235

00:32:57.110 --> 00:32:57.590

Danny Gluch: Yeah.


236

00:32:57.590 --> 00:32:59.056

Aubria Ralph: You know


237

00:32:59.950 --> 00:33:10.232

Danny Gluch: I think there's a skill that maybe has atrophied since the Internet, or it just completely has gone away,


238

00:33:11.110 --> 00:33:13.689

Danny Gluch: but the you know, you. You see.


239

00:33:14.210 --> 00:33:34.960

Danny Gluch: news, television, sports commentary. It's all gone towards this sort of, like, yelling, screaming at each other of their perspective, as opposed to dialogue, or, you know, curiosity or empathy for another person's point of view at all. And


240

00:33:35.440 --> 00:33:50.888

Danny Gluch: it's actually sort of... that phenomenon was being reinforced in a lot of universities. So we're being taught, you know, whether it's rhetoric or critical thinking classes, we're being taught


241

00:33:51.430 --> 00:34:05.749

Danny Gluch: to bolster our arguments, to defend our arguments better, to, you know, give a better presentation of our positions, and how to criticize and tear down and poke holes in other people's.


242

00:34:06.320 --> 00:34:09.040

Danny Gluch: And I think that that is bad.


243

00:34:09.530 --> 00:34:31.170

Danny Gluch: So, long story short, that's why I built my critical thinking program at the university to do that. It was based on a book by Maureen Linker called Intellectual Empathy, and the idea is, well, let me really understand who this person is and where they're coming from and how they got to that position. And it's really fantastic. But I think the Internet


244

00:34:31.520 --> 00:34:34.230

Danny Gluch: fucks that up, because how am.


245

00:34:34.230 --> 00:34:35.670

Aubria Ralph: There's no room for that.


246

00:34:35.900 --> 00:34:46.380

Danny Gluch: Well, but there's no room for it. And how am I really supposed to understand where Ralph Aubria is coming up with these ideas, when Ralph Aubria is a figment,


247

00:34:46.389 --> 00:34:46.949

Aubria Ralph: It's not.


248

00:34:46.949 --> 00:34:56.829

Danny Gluch: Not a real person, and it's just it gets so exhausting, and no one tries, and if they do try like, the chances are they're not going to be really authentic. Anyways.


249

00:34:56.830 --> 00:34:57.500

Aubria Ralph: Yeah.


250

00:34:58.260 --> 00:35:04.430

Aubria Ralph: No, I think that's a fair point. And what you said, though, about like tearing down arguments. I mean.


251

00:35:05.110 --> 00:35:09.189

Aubria Ralph: I remember in I think I was in 9th grade.


252

00:35:09.360 --> 00:35:22.089

Aubria Ralph: and because my teacher knew that I was a born-again Christian at the time, I was like, you know, President of, like the Bible Club or something, and she gave me


253

00:35:22.420 --> 00:35:24.190

Aubria Ralph: an abortion


254

00:35:24.730 --> 00:35:32.620

Aubria Ralph: essay that I had to write, and then kind of like defend. But I had to go from the position of pro-choice.


255

00:35:35.350 --> 00:35:36.020

Aubria Ralph: now.


256

00:35:36.020 --> 00:35:37.669

Danny Gluch: That's a challenging assignment.


257

00:35:39.171 --> 00:35:49.768

Aubria Ralph: And so like. I remember that I actually like still have the essay, you know, like the good old 5 paragraph essay. And I remember when I did it, I was


258

00:35:50.420 --> 00:36:04.359

Aubria Ralph: I was a new big sister. My mom had my younger sister, like, 15 years later, right? And so I was really annoyed at the time about how


259

00:36:04.730 --> 00:36:16.330

Aubria Ralph: this was impacting my life, my social life, everything. And so I ended up writing. This is so terrible. I can't believe I'm sharing this with you. But I ended up


260

00:36:17.330 --> 00:36:21.820

Aubria Ralph: of writing this paper talking about how like


261

00:36:22.870 --> 00:36:50.779

Aubria Ralph: older parents, like, older people shouldn't have kids after, you know, like, after your kids are a certain age, and this is why. And this is how I, like, built my argument. And I remember it didn't go in the way that my teacher expected, but it was one of those things where I was, like, really shredding both ends of the thread, if you will. Like, okay, on the one hand, like, I get, like, yeah, it's, like, your choice.


262

00:36:50.830 --> 00:37:00.320

Aubria Ralph: But like this is why it needs to be like almost like this is why this is the only valid choice here, and it was just like me coming from a place.


263

00:37:00.670 --> 00:37:18.670

Aubria Ralph: I know why I'm sharing this now, where I was just, like, such an angry teenager, because now my life was being, like, reshaped by this, like, new kid that was just, like, taking over my life. And, whatever, I was a selfish child, like, I'm not gonna pretend I wasn't. But


264

00:37:20.400 --> 00:37:29.059

Aubria Ralph: you don't get to know that back like, unless you knew me and knew my life. You could read that paper today


265

00:37:29.870 --> 00:37:33.800

Aubria Ralph: and come up with this whole theory


266

00:37:34.110 --> 00:37:43.130

Aubria Ralph: about who I am as a person or who I was as a person. Because can we get to a place where people can like evolve


267

00:37:43.340 --> 00:37:48.230

Aubria Ralph: change their mind, you know? Like, I've changed my mind


268

00:37:48.880 --> 00:37:53.170

Aubria Ralph: having one conversation within an hour with somebody right.


269

00:37:53.520 --> 00:37:59.900

Aubria Ralph: And this is why I think conversations are so important because, like you like, if you're in your echo chamber all the time


270

00:38:00.250 --> 00:38:17.730

Aubria Ralph: you think. Oh, I'm always right, because, like the people that you're always talking to are always feeding you the same information. So it's like, you're just, you know, you're just regurgitating each other and stroking each other's egos right? But if I have to talk to somebody who


271

00:38:18.170 --> 00:38:27.779

Aubria Ralph: doesn't agree with me, and it doesn't have to be like polar opposite, just like no, I hear what you're saying here. But look at these other 5 things right, and I think.


272

00:38:27.780 --> 00:38:28.120

Marion: So.


273

00:38:28.120 --> 00:38:45.689

Aubria Ralph: that's what you were driving at, Danny, about, like, yes, like, I understand this argument as a feminist, but here are these other points that you're completely ignoring, because you're only looking at feminism in this very, like, traditional, you know, lens.


274

00:38:45.690 --> 00:38:52.069

Danny Gluch: Well, most of it on the Internet is what's called girl-boss feminism, or what I used to call Beyoncé feminism.


275

00:38:52.070 --> 00:38:53.060

Aubria Ralph: Yeah.


276

00:38:53.090 --> 00:38:58.359

Danny Gluch: It's great to be a billionaire. That's what all girls should want to be. And it's like, let's pump the brakes.


277

00:38:59.890 --> 00:39:19.650

Danny Gluch: I think part of that goes to, we're all just kind of angry teenagers posting on the Internet, right? Where you were, you were so honest about, like, this didn't come from, like, a well-thought-out, like, rational position. This was me posting something reactionary


278

00:39:19.880 --> 00:39:28.439

Danny Gluch: about my current circumstances, that older people shouldn't have goddamn kids because it makes my life inconvenient.


279

00:39:28.690 --> 00:39:31.870

Danny Gluch: And I think that's what a lot of posts are on the Internet.


280

00:39:31.870 --> 00:39:32.610

Aubria Ralph: -


281

00:39:33.060 --> 00:39:47.620

Danny Gluch: And they may as well come from an AI-generated profile and image, because people aren't sharing that background. We're not really connecting to where this is coming from, and they're not really being authentic about it.


282

00:39:48.220 --> 00:39:53.740

Danny Gluch: They're trying to say like, Oh, I've I've thought, and I've done my research. And Yada Yada. And this is why


283

00:39:53.890 --> 00:40:06.720

Danny Gluch: old people shouldn't have kids. But no, they're just an angry teenager whose little sister bugged them this week, so they think that kid shouldn't have been born, like,


284

00:40:06.720 --> 00:40:08.520

Aubria Ralph: Colored all over my homework like.


285

00:40:09.030 --> 00:40:31.270

Danny Gluch: And yeah, right? Oh, man, honestly, that would get me, you know. Like, people shouldn't have dogs, because, you know, my kid's dog ripped the nose off my favorite toy from when I was a kid growing up, last night, right? Like, that's the kind of thing we post on the Internet, because it's very reactionary.


286

00:40:31.770 --> 00:40:41.969

Danny Gluch: And I think the algorithm likes that. It likes an anonymous, bland person with reactionary, clickbaity


287

00:40:42.550 --> 00:40:43.620

Danny Gluch: Content.


288

00:40:43.770 --> 00:40:52.650

Danny Gluch: That isn't rooted in experience and isn't explaining where it's coming from, and I think when you do try to do that.


289

00:40:53.410 --> 00:40:53.930

Danny Gluch: it.


290

00:40:53.930 --> 00:40:55.509

Aubria Ralph: You get buried.


291

00:40:55.510 --> 00:41:07.939

Danny Gluch: You do, you get buried. And, you know, there's posts up there with, like, "Have a great day, everyone." 300 likes. Who the fuck is liking that? Who spent the energy


292

00:41:08.120 --> 00:41:12.490

Danny Gluch: to give that a thumbs up? What the hell is going on


293

00:41:12.630 --> 00:41:18.359

Danny Gluch: like? I know my posts don't take me super long, but like 5 min of of a thoughtful welcome.


294

00:41:18.360 --> 00:41:18.970

Aubria Ralph: Instructive.


295

00:41:18.970 --> 00:41:27.670

Danny Gluch: Little little thing, and it's like viewed by 7 people. It's like that's interesting, like.


296

00:41:27.670 --> 00:41:28.050

Marion: Yeah.


297

00:41:28.050 --> 00:41:34.080

Danny Gluch: Just kind of what the Internet is. They want us all to be Ralph Aubria, you know. Generic white person.


298

00:41:34.080 --> 00:41:34.740

Aubria Ralph: Hmm.


299

00:41:34.920 --> 00:41:36.540

Danny Gluch: Vineyard, like.


300

00:41:36.540 --> 00:41:56.330

Marion: I read a post on LinkedIn, just, I think, this morning or last night, it's very recent, saying exactly this. Like, someone basically saying, LinkedIn is fucked, you know, to this point. Like, it doesn't want thought leadership. It doesn't want, you know, diverse thought.


301

00:41:56.670 --> 00:42:08.400

Marion: It wants clickbait. It wants this stuff, and and it is burying these really important voices and really important messages. And and that then begs the question, well.


302

00:42:08.940 --> 00:42:19.330

Marion: if it's doing that, then maybe we should all, you know, move on with our lives to somewhere else. But then, where is that? And then does the same thing happen again? You know.


303

00:42:19.781 --> 00:42:27.160

Marion: I don't know. I think we are getting towards a tipping point. I think we are getting towards a place where.


304

00:42:27.160 --> 00:42:49.670

Marion: just as there's a real push for independent media, I think that there's a bigger push for independent thought. I think that there is a group of people who really want to know more, to know more deeply and more widely. And we do consume content through LinkedIn and TikTok and all of these things. But when the algorithm is filtering that out so much,


305

00:42:50.610 --> 00:42:53.679

Marion: There needs to be a new home. But where is that home?


306

00:42:55.280 --> 00:42:58.570

Aubria Ralph: You know, it kind of reminds me of.


307

00:42:58.970 --> 00:43:18.480

Aubria Ralph: Not that I was there, but, like, right after the scientific revolution, when all the Romantic poets, like, popped up and were like, Oh, we have to go back to nature, you know, we have to go back to, like, the wonder of life, you know. So the Wordsworths and the Blakes,


308

00:43:18.630 --> 00:43:21.410

Aubria Ralph: and I think


309

00:43:21.690 --> 00:43:34.410

Aubria Ralph: some of that is already happening where you have, like so many people rejecting, even like in workplaces this idea of like I need to. I need to be so immersed in this


310

00:43:34.560 --> 00:43:35.240

Aubria Ralph: like


311

00:43:35.540 --> 00:43:50.200

Aubria Ralph: cog in the wheel lifestyle. Right? And so I think ultimately, it's gonna go down to like individual choice. Right? But I want to go back to like the scraping


312

00:43:50.330 --> 00:43:54.419

Aubria Ralph: comment you made before about how like


313

00:43:54.740 --> 00:44:00.680

Aubria Ralph: these LLMs are essentially, like, going through and just, like, everything, right? And so,


314

00:44:01.370 --> 00:44:06.930

Aubria Ralph: I don't know. From time to time I get these like I call it like just


315

00:44:07.050 --> 00:44:13.019

Aubria Ralph: irrational paranoia about things, but like oftentimes like much later.


316

00:44:13.250 --> 00:44:19.119

Danny Gluch: After I'm like, Oh, you you were spot on there like you were not paranoid. You were discerning.


317

00:44:21.290 --> 00:44:22.409

Danny Gluch: It's a fine line.


318

00:44:22.580 --> 00:44:44.129

Aubria Ralph: You know, but, like, if you have to wait 10 years for people to know, like, oh, no, she was right 10 years ago, like, you get, you have to, you get to be crazy for 9 years, right? And so that's something I've been really thinking about, especially since I post a lot on LinkedIn,


319

00:44:44.580 --> 00:44:48.530

Aubria Ralph: and it's the only place I've posted I have, so I have.


320

00:44:49.070 --> 00:44:54.980

Aubria Ralph: I think, well, if I count my company profile, I have six newsletters on LinkedIn


321

00:44:55.380 --> 00:45:00.800

Aubria Ralph: that I've posted a tremendous amount of content, and


322

00:45:01.300 --> 00:45:03.449

Aubria Ralph: all of that has been scraped.


323

00:45:05.030 --> 00:45:15.639

Aubria Ralph: You know by by the platform. But then, like you think about all the people I like, I think of that woman who wanted to use


324

00:45:16.220 --> 00:45:24.410

Aubria Ralph: my very current post that I was like, I like literally said, I'm gonna be posting about this for like the next X amount of weeks.


325

00:45:24.600 --> 00:45:33.100

Aubria Ralph: and she wanted to just come and take that, right? And so recently, and this again is terrible,


326

00:45:33.530 --> 00:45:39.120

Aubria Ralph: I've been thinking, I'm like, you know, what. Maybe I need to stop posting


327

00:45:39.960 --> 00:45:46.990

Aubria Ralph: new, fresh content. Maybe I need to start posting AI stuff


328

00:45:47.440 --> 00:45:49.539

Aubria Ralph: so that when people steal it


329

00:45:49.760 --> 00:45:52.330

Aubria Ralph: I won't be mad because it's not mine, anyway.


330

00:45:52.800 --> 00:46:08.380

Aubria Ralph: you know. And that has been my kind of like. And so like that's the other end of the spectrum is like, you know, the people like, oh, I'm going to continue to do this because you have people that are like that like, oh, you know what I don't care what the world's doing. I'm going to keep


331

00:46:08.890 --> 00:46:17.479

Aubria Ralph: producing and giving this thing. And so like, there's this other flip side. And like, I have to say, I'm always in like this extreme outlier


332

00:46:17.620 --> 00:46:22.210

Aubria Ralph: mentality sometimes where I'm like, you know.


333

00:46:22.590 --> 00:46:24.550

Aubria Ralph: So he's gonna steal it, anyway.


334

00:46:25.060 --> 00:46:31.680

Aubria Ralph: why am I giving you my stuff? And what's crazy is, like, even, like, the recent cases


335

00:46:32.340 --> 00:46:35.579

Aubria Ralph: about like there have been like some really


336

00:46:36.020 --> 00:46:48.849

Aubria Ralph: crazy fair use cases, just the last couple of days, that we've gotten some rulings on, where I'm like, hey, judge, do you actually know copyright law?


337

00:46:50.860 --> 00:47:00.900

Aubria Ralph: you know. So like, do you actually know it, or or like, have you just taken this as an opportunity to like? Pretend, you know, copyright law, because, like


338

00:47:02.220 --> 00:47:10.870

Aubria Ralph: even, like, their mixed opinions about it, it doesn't make sense, where it's like, Oh, you've won on this fair use point. But, like,


339

00:47:11.060 --> 00:47:13.580

Aubria Ralph: you're stealing, and it's like, Well.


340

00:47:14.270 --> 00:47:21.680

Aubria Ralph: how did they win on these various points of oh, you can use. You can use this. You can use this novel.


341

00:47:22.160 --> 00:47:22.740

Danny Gluch: Yeah.


342

00:47:22.740 --> 00:47:38.660

Aubria Ralph: to train your, to be concrete, like, you can use this novel to train your LLM. But if we find out that you didn't buy the book, I guess, let's call it that, you didn't buy the book, now it's infringement. And it's like,


343

00:47:38.660 --> 00:47:43.370

Danny Gluch: Yeah, yeah, it's it's an interesting line, and


344

00:47:44.850 --> 00:47:52.620

Danny Gluch: I don't know. Part of me wonders if there is value in, sort of, the authentic posting, the name


345

00:47:52.620 --> 00:47:59.750

Danny Gluch: Aubria Ralph posting as opposed to the Ralph Aubria because it's.


346

00:48:00.120 --> 00:48:23.019

Danny Gluch: you know, is it even worth posting at all? Because the algorithm that would boost it and display it to people is like, this is not what I expected, so it's not picking it up. But then the scrapers are also like, that's not what I would have written, so that's not good either. So, like, they're not prioritizing it, it's not being displayed. And,


347

00:48:23.020 --> 00:48:24.829

Danny Gluch: you know, it almost feels like...


348

00:48:25.250 --> 00:48:28.100

Danny Gluch: how does unique thought get out now?


349

00:48:28.100 --> 00:48:28.940

Aubria Ralph: I wouldn't.


350

00:48:28.940 --> 00:48:32.249

Danny Gluch: Do you have to just spend a lot of money to like


351

00:48:32.380 --> 00:48:45.920

Danny Gluch: force the algorithm to put it out by, like, actually marketing it and putting it out, and being one of those promoted profiles? Like, that honestly seems the only way to push through something that isn't just milquetoast,


352

00:48:46.100 --> 00:48:49.670

Danny Gluch: you know, boring, AI-generated slop,


353

00:48:50.081 --> 00:48:52.730

Danny Gluch: or rage-based, or rage bait, yeah, right.


354

00:48:52.730 --> 00:48:53.170

Aubria Ralph: Like.


355

00:48:53.170 --> 00:48:53.570

Marion: Yeah.


356

00:48:53.570 --> 00:48:58.049

Aubria Ralph: It's like, either I'm getting this, like, generic, oh,


357

00:48:58.250 --> 00:49:05.509

Aubria Ralph: I didn't, you know, I really dropped a $50,000 bag this week.


358

00:49:05.630 --> 00:49:31.519

Aubria Ralph: because, like, I had the product, but I didn't sell it to my clients. And it's like, well, did you actually drop a 50, like? Did you drop the money? Because it sounds like you didn't have what you needed to actually make that money, like, if people wanted the product, right? But, like, even just some of the nonsensical things that people write, where I'm like...


359

00:49:33.204 --> 00:49:35.890

Danny Gluch: But they're gonna get 10,000 likes. It's gonna be...


360

00:49:35.890 --> 00:49:47.759

Aubria Ralph: Exactly, like, these are the, you know... or, like, I just saw this really popular... I'm not gonna give him, I'm not gonna say his name, because I'm like, you don't need me to give you a plug anywhere.


361

00:49:48.185 --> 00:50:07.180

Aubria Ralph: But he, like, I just watched a video of his where he was like, oh, you know, a lot of 20-something-year-olds are millionaires now. And what did he say? Like, millions of 20... not a lot, millions of 20-year-olds are millionaires from posting on TikTok. And I was like, really?


362

00:50:08.240 --> 00:50:10.150

Aubria Ralph: you just you just saying stuff


363

00:50:10.390 --> 00:50:14.159

Aubria Ralph: like, you know. But that's the other thing, it's like, people are living


364

00:50:14.550 --> 00:50:22.200

Aubria Ralph: like that. You know, speaking of, like, inauthenticity, it's like there are people just on the Internet just talking, where you're like, wait...


365

00:50:22.200 --> 00:50:22.910

Marion: I know.


366

00:50:23.170 --> 00:50:28.349

Marion: I know, and then the way that we consume it, we buy into it, like, even when you're...


367

00:50:28.350 --> 00:50:28.920

Aubria Ralph: It's so.


368

00:50:28.920 --> 00:50:29.280

Marion: Maybe.


369

00:50:29.280 --> 00:50:29.640

Aubria Ralph: B.


370

00:50:29.640 --> 00:50:53.570

Marion: Yeah, yeah, you're a little bit older, and you think you're wiser. But for some reason, if someone's talking to you through the medium of a screen, you lap it up. It's so weird. And, like, when you were talking about the algorithm and likes and stuff, Danny, that kind of whole thing of unique thought, right? And I think about the thought leadership


371

00:50:53.610 --> 00:51:02.629

Marion: that, you know, I try to... you know, I try to create and post about, about my research, or about what we do here, or about, you know, our work with ability, stuff like that.


372

00:51:03.150 --> 00:51:08.299

Marion: And, you know, a lot of thought and care and attention goes into that stuff.


373

00:51:08.580 --> 00:51:14.839

Marion: And, you know, so you get some, you get... you get likes, you get comments, you get... but not tons, right?


374

00:51:14.840 --> 00:51:15.550

Aubria Ralph: Hmm.


375

00:51:15.550 --> 00:51:18.130

Marion: But then I post a picture of my dog


376

00:51:20.520 --> 00:51:29.729

Marion: all the likes all day long. Like, I mean, she's very cute, to be fair, so she does deserve it. But it just makes me realize


377

00:51:29.900 --> 00:51:30.730

Marion: that.


378

00:51:31.370 --> 00:51:35.579

Marion: Why are we? Why am I doing this? Why am I


379

00:51:36.280 --> 00:51:42.310

Marion: using it? You know... yeah, could this, could this time be better used somewhere else?


380

00:51:42.770 --> 00:51:44.430

Marion: I think that there's a little bit


381

00:51:44.960 --> 00:51:51.869

Marion: in me that's like... that persists, one, because I don't necessarily know a better medium to do it,


382

00:51:52.270 --> 00:51:54.779

Marion: and two, like,


383

00:51:55.420 --> 00:52:04.180

Marion: there's maybe a little bit of me that's the eternal optimist, that's like, if this reaches one person and it helps them, then that's worth it. But...


384

00:52:04.960 --> 00:52:09.639

Danny Gluch: I don't know, like, maybe I'm starting to become even more cynical in my old age.


385

00:52:10.397 --> 00:52:13.689

Marion: I don't know. I want so much to.


386

00:52:13.690 --> 00:52:15.140

Danny Gluch: We call it, just like.


387

00:52:15.320 --> 00:52:21.239

Marion: Yeah, there's so much wisdom, there's so much like we want to share. We want to help shape and influence.


388

00:52:21.910 --> 00:52:23.020

Marion: But then.


389

00:52:24.230 --> 00:52:36.010

Marion: is this the way forward? Is this the right medium? You know, because the way it's being distributed and consumed, it's not necessarily having the impact that we really want it to.


390

00:52:36.600 --> 00:52:37.100

Danny Gluch: Yeah.


391

00:52:37.270 --> 00:52:47.680

Danny Gluch: Yeah. And it's all because the algorithms are making these assumptions about us and about what people want and about, you know, what is right, and, like, it'll,


392

00:52:48.510 --> 00:52:56.040

Danny Gluch: you know, it's been known to just make up facts and that's what's expected and it


393

00:52:56.880 --> 00:53:02.369

Danny Gluch: I think, you know, what you were talking about earlier... I really do think that, you know, there's these,


394

00:53:02.670 --> 00:53:23.750

Danny Gluch: you know, equal and opposite forces to these revolutions and these changes in the workforce, and these changes to technology that happen. And, you know, back in the day it was like, hey, go back to the wilderness, like, that's fine, we don't need this, we can go back and be the sort of self-sufficient farm worker. Like, that's not actually going to happen.


395

00:53:23.750 --> 00:53:32.370

Aubria Ralph: Trying to, like, be factory... like, you know, like the push to, like, have factories here? Like, ain't nobody in America trying to work in a factory.


396

00:53:32.370 --> 00:53:36.915

Danny Gluch: Yeah, I thought we were great because we didn't have to do that work.


397

00:53:37.530 --> 00:53:43.230

Danny Gluch: And I do think that there is space, even if it's not now. I mean,


398

00:53:43.230 --> 00:53:57.560

Danny Gluch: it's in 5 or 10 years, where, you know, just like how it's hard to find the right person if you're using the AI recruiting tools, and the people who are trying to get hired are great, but they're, like, passing each other in the night and not even getting interviews.


399

00:53:58.480 --> 00:54:26.139

Danny Gluch: There is value in going out and doing old school recruiting and finding people, and going to your network and saying, do you know anyone who can do this? Like, who's really standing out in the field right now, who's doing great work? Let me go see if that person wants an interview. Like, there is value in that, and I think there's little opportunities to do well and to exceed, or succeed, because you're not doing what everyone else is doing.


400

00:54:26.500 --> 00:54:39.749

Danny Gluch: But then, when you're on the Internet, it just wants Ralph Aubria up, because that's a welcoming face that people are gonna like. And it's gonna say some milquetoast thing like,


401

00:54:40.650 --> 00:54:45.839

Danny Gluch: good luck on Friday, everyone. Thousands of likes, right?


402

00:54:46.010 --> 00:54:55.239

Danny Gluch: And I do think that there's a space, and I don't know what the medium is going to be, Marion, like you said. But I think people do still want


403

00:54:55.600 --> 00:55:04.779

Danny Gluch: authentic, curious conversations that are gonna come up with new ideas, or give explanations to things they haven't understood before, and...


404

00:55:04.780 --> 00:55:05.170

Marion: No.


405

00:55:05.170 --> 00:55:16.850

Danny Gluch: I don't know if we just need to, like, go into parks wearing white robes and talking like the philosophers used to do, but we should, we should figure something out, because people need it. People need it.


406

00:55:16.850 --> 00:55:18.550

Aubria Ralph: Marion was not into the white robes.


407

00:55:18.550 --> 00:55:19.810

Danny Gluch: Oh, no, she wasn't.


408

00:55:19.810 --> 00:55:23.150

Marion: No, I just I well, I mean remember where the.


409

00:55:23.150 --> 00:55:24.100

Aubria Ralph: Him out!


410

00:55:24.770 --> 00:55:28.469

Marion: I'm like, remember where in the country I live. So I immediately thought...


411

00:55:28.470 --> 00:55:29.810

Danny Gluch: That's true. White robe.


412

00:55:29.810 --> 00:55:30.550

Marion: I was like.


413

00:55:30.950 --> 00:55:35.650

Danny Gluch: Yeah, you know, that might even, in America, be something else. I was thinking the old school toga.


414

00:55:35.650 --> 00:55:38.570

Aubria Ralph: I was thinking, pilgrims.


415

00:55:38.570 --> 00:55:39.210

Marion: No.


416

00:55:39.210 --> 00:55:44.230

Aubria Ralph: I was not thinking pilgrims. No, we don't...


417

00:55:44.770 --> 00:55:47.739

Aubria Ralph: I know what she was thinking, though. She was like, wait, what's happening?


418

00:55:48.080 --> 00:55:48.760

Aubria Ralph: What the.


419

00:55:50.095 --> 00:55:52.580

Danny Gluch: No, Aristotle, Plato!


420

00:55:52.580 --> 00:55:56.629

Danny Gluch: Or togas! Yes, those robes! Goodness gracious.


421

00:55:56.630 --> 00:55:58.189

Marion: Oh, guys! Oh, God!


422

00:55:59.880 --> 00:56:02.200

Marion: No one needs to see Danny in a toga.


423

00:56:02.200 --> 00:56:02.640

Danny Gluch: That's.


424

00:56:02.640 --> 00:56:03.990

Marion: Actually accurate, that is.


425

00:56:03.990 --> 00:56:05.450

Danny Gluch: That's the most factual


426

00:56:05.450 --> 00:56:06.839

Danny Gluch: thing anyone has said all this time.


427

00:56:07.970 --> 00:56:10.930

Aubria Ralph: No, I, I will say, like,


428

00:56:11.530 --> 00:56:17.999

Aubria Ralph: I... for me, like, I actually believe in just, like, putting stuff out there


429

00:56:18.290 --> 00:56:20.090

Marion: And letting people find it.


430

00:56:21.155 --> 00:56:33.340

Aubria Ralph: And that's kind of how I've done social media. Because the idea of me kind of, like, sitting around waiting for people to, like, like my stuff...


431

00:56:34.110 --> 00:56:36.059

Aubria Ralph: It's just daunting to me.


432

00:56:36.620 --> 00:56:36.940

Marion: Yeah.


433

00:56:36.940 --> 00:56:47.590

Aubria Ralph: And so, you know, if something blows up, I'm like, yay. And if people are like... oh, you know, if the algo decides, oh, this is not important enough,


434

00:56:48.187 --> 00:56:57.782

Aubria Ralph: I'm like, cool, somebody will be stalking my page one day and find this post and be like, this changed my life, right?


435

00:56:58.200 --> 00:56:58.830

Marion: Hmm.


436

00:56:58.830 --> 00:57:02.769

Aubria Ralph: But I will say part of the reason I said what I said about


437

00:57:03.110 --> 00:57:07.690

Aubria Ralph: just generating AI content from here on out is


438

00:57:07.810 --> 00:57:14.420

Aubria Ralph: because of how unbridled the access has been to people's data.


439

00:57:14.420 --> 00:57:15.000

Danny Gluch: Yeah.


440

00:57:15.360 --> 00:57:19.909

Aubria Ralph: You know, under the guise... it's like, even when you purchase


441

00:57:20.090 --> 00:57:23.969

Aubria Ralph: these tools right, that are supposed to help you.


442

00:57:25.780 --> 00:57:30.020

Aubria Ralph: You know, and God, I forgot her name. But, you know,


443

00:57:30.460 --> 00:57:44.229

Aubria Ralph: a woman was talking about this at some conference yesterday. Just, like, such bad citation, but whatever, it wasn't me that said it. But it's something I have been thinking about in my own research, which is:


444

00:57:44.730 --> 00:57:57.269

Aubria Ralph: yes, I want to use Copilot, and I want to use ChatGPT, and I want to use all of these different tools that are available, all these AI agents that are available to me.


445

00:57:57.430 --> 00:58:05.859

Aubria Ralph: But as I'm using them, they now have access to all of this data, right? And so it's like, and it's not just


446

00:58:06.000 --> 00:58:11.390

Aubria Ralph: my idea. Like, there has been such an overemphasis on IP.


447

00:58:12.320 --> 00:58:16.260

Aubria Ralph: But that is, in my opinion, the least of our worries,


448

00:58:16.590 --> 00:58:26.930

Aubria Ralph: because what I'm thinking of is the privacy aspect. Like, now we're using tools to manage our calendars,


449

00:58:27.310 --> 00:58:28.920

Aubria Ralph: book our flights.


450

00:58:29.370 --> 00:58:44.319

Aubria Ralph: How are you doing that if you don't have access to my credit card information and my home address? And, you know, if I'm buying multiple tickets, now you have access to all of these other people's data. And so, like, it just creates


451

00:58:44.630 --> 00:58:54.580

Aubria Ralph: this... it's like, when I, you know, click yes, I agree to the terms and conditions, like...


452

00:58:55.330 --> 00:58:57.439

Aubria Ralph: Am I really agreeing to that?


453

00:58:57.760 --> 00:58:58.450

Aubria Ralph: You know, like.


454

00:58:58.450 --> 00:58:59.060

Marion: We're not.


455

00:58:59.460 --> 00:59:01.009

Danny Gluch: You know, like we're not.


456

00:59:01.010 --> 00:59:02.659

Aubria Ralph: Like, am I really agreeing to that?


457

00:59:02.660 --> 00:59:03.140

Danny Gluch: So.


458

00:59:03.140 --> 00:59:03.540

Aubria Ralph: You know.


459

00:59:03.540 --> 00:59:05.670

Danny Gluch: It's coercion, we all know.


460

00:59:05.670 --> 00:59:19.670

Aubria Ralph: And so it's like, we're... it's pitched to us as convenience, right? So, going back to this idea of, you know, me as this kid, or, like, all of us on the Internet, kind of posting


461

00:59:19.830 --> 00:59:33.997

Aubria Ralph: in a way where it's like, well, this thing is inconveniencing me, and now I'm going to write this rage-bait type, you know, post, and the algo is going to reward me because of it, right?


462

00:59:34.790 --> 00:59:37.150

Aubria Ralph: but then, at the same time


463

00:59:37.380 --> 00:59:41.939

Aubria Ralph: we're kind of, like, stuck. I feel like that's the problem. It's like, on the one hand, like,


464

00:59:42.080 --> 00:59:48.090

Aubria Ralph: I want all of these conveniences. But with convenience,


465

00:59:48.530 --> 00:59:53.830

Aubria Ralph: I'm also kind of rescinding my power, I'm saying, hey, you know what


466

00:59:54.300 --> 00:59:58.329

Aubria Ralph: like. You can have access to everything there is to know about me


467

00:59:58.460 --> 01:00:10.209

Aubria Ralph: and use that to manipulate the advertising that I get, you know, and, like, you know, create... and not just to get all, like, woo-woo about it. But, like...


468

01:00:11.070 --> 01:00:14.999

Aubria Ralph: To me, it's like, am I, am I really having independent thoughts


469

01:00:15.290 --> 01:00:25.059

Aubria Ralph: if you're force-feeding me content based on my activity on your technology?


470

01:00:25.060 --> 01:00:26.549

Danny Gluch: Yeah, well, I think.


471

01:00:26.550 --> 01:00:27.540

Marion: Oh, my gosh!


472

01:00:27.540 --> 01:00:49.120

Danny Gluch: Yeah, I think one of the lessons is that conversations like this, actually sitting down and getting to know people, is the thing that we can do. And I think, if there's one thing I want our listeners to take away from this, it's that there's a lot of assumptions that can be made based on what people are posting, and we're probably no better at it


473

01:00:49.120 --> 01:01:02.430

Danny Gluch: than AI was at trying to figure out what you looked like. And so let's sit down and let's actually have conversations. Let's sit at a coffee shop and talk and get to know someone and be curious about where


474

01:01:02.900 --> 01:01:09.889

Danny Gluch: they're coming from, not just get angry because Ralph Aubria said something that would have been fine for Aubria Ralph to say.


475

01:01:10.040 --> 01:01:24.409

Danny Gluch: And let's get to know each other. And I think that's a really hard thing to do, and we might not have time for it. But maybe if we're swiping less on social media and scrolling less, we do have more time to actually get to know each other.


476

01:01:25.290 --> 01:01:25.950

Marion: And.


477

01:01:26.140 --> 01:01:47.619

Danny Gluch: But thank you so much for being with us, for doing this fun experiment and being playful with the technology. I really appreciate your insights, and just, you know, your mind, and where it goes, and what you want to try and do, and your perspective on things. Thank you all for listening. Marion, did you have any last words? I feel like...


478

01:01:49.280 --> 01:01:49.720

Marion: No, but.


479

01:01:49.720 --> 01:01:50.070

Danny Gluch: Be! Now.


480

01:01:50.070 --> 01:01:55.789

Marion: No, but I was just. I was curious. I was like, I've got some friends that Ralph might be interested in so like.


481

01:01:55.790 --> 01:02:00.667

Marion: Oh, my goodness, if he's free on Saturday night, let me know. We'll get him hooked up.


482

01:02:01.460 --> 01:02:02.680

Danny Gluch: Ralph's always... he's free.


483

01:02:05.973 --> 01:02:17.226

Danny Gluch: Listeners, if you have an AI-generated personality that you think Ralph would be a good partner for, I'm sure there's a dating app for AI bots at this point,


484

01:02:17.920 --> 01:02:22.650

Danny Gluch: just for them to meet. That's gotta be a thing. If not, you can...


485

01:02:22.650 --> 01:02:23.510

Aubria Ralph: Coming soon.


486

01:02:23.510 --> 01:02:37.719

Danny Gluch: Yeah, you can fund our idea at elephant@thefearlesspx.com. That is, that is our idea now. No one's gonna scrape that or use that idea for nefarious reasons...


487

01:02:37.720 --> 01:02:44.309

Danny Gluch: in front of an attorney. We did it in front of an attorney, so yeah, yeah, it's legit,


488

01:02:44.310 --> 01:02:46.589

Danny Gluch: her, or anything, or contract, but that's it.


489

01:02:46.590 --> 01:02:48.840

Marion: Give it a dollar, give it a dollar.


490

01:02:50.620 --> 01:03:05.309

Danny Gluch: Thank you all so much for listening. Thank you, Aubria, for joining us. Everyone, you can find her information and LinkedIn content, LinkedIn links, in the show notes. Please give a like, subscribe, and comment on this episode. We'll see you next time.



