
Genesis The Podcast
Genesis the Podcast is a new way to connect with Genesis Women’s Shelter and Support and expand your thinking about domestic violence and related issues that affect women. GTP is also a trusted source of information if you are in an abusive relationship and need safety, shelter or support. Listen every week for fresh content related to domestic violence, to connect with world-renowned professionals, participate in exclusive events and training opportunities, and take action against domestic violence.
Genesis The Podcast is hosted by Maria MacMullin, Chief Impact Officer of Genesis Women's Shelter & Support and the Host of the Podcast on Crimes Against Women.
About Genesis Women's Shelter & Support - Located in Dallas, Texas, Genesis provides safety, shelter and support for women who have experienced domestic violence, and raises awareness regarding its cause, prevalence and impact. Learn more at GenesisShelter.org
The Algorithmic Trap: How Misogyny Weaponizes Tech
The digital world has become a hunting ground where algorithms serve as silent recruiters, pulling vulnerable young men toward misogynistic ideologies at alarming speeds. This eye-opening conversation with experts Laura Frombach and Joy Farrow reveals the disturbing reality of how technology amplifies hatred against women and girls.
When researchers created a fake social media account for a 16-year-old boy, the algorithm began serving misogynistic content within just 23 minutes. This isn't coincidence – it's systematic grooming at an algorithmic level. While human predators might take months to isolate and indoctrinate victims, today's AI-powered platforms accomplish the same goal with frightening efficiency through data-backed feedback loops that constantly refine their effectiveness.
The most troubling aspect is how this online radicalization translates directly into real-world violence. We examine three chilling case studies where digital hate found deadly physical expression, including the notorious Isla Vista killings by Elliot Rodger and the Parkland high school shooting. These weren't isolated incidents caused by individual pathology alone – they represent the culmination of algorithmic radicalization pathways that validate and amplify harmful ideologies.
For parents, educators and concerned citizens, addressing this crisis requires immediate action. Our experts provide practical strategies for engaging with youth about their online activities, teaching comprehensive media literacy, and effectively intervening when someone shows signs of radicalization. Most importantly, they emphasize the need to "call out the content but call in the viewer" – recognizing that many drawn to toxic content are primarily seeking connection rather than hatred itself.
In this second part of our three-part conversation with Laura Frombach and Joy Farrow, we examine how misogyny has weaponized technology, and how online platforms have become training grounds for digital predators, transforming what used to be fringe misogyny into mainstream male supremacy through algorithmic amplification, monetized content and influencer ecosystems. This virtual content results in real-world, offline violence against women and girls. I'm Maria MacMullin, and this is Genesis the Podcast. Laura and Joy, welcome back to the show.
Speaker 2:Thank you for having us.
Speaker 1:Thank you so much. We're glad to be here. In our last conversation we discussed the weaponization of kindness, using metaphors specific to technology that translate how we, as women, are socially conditioned to ignore our own intuition and submit to the demands and even the abuse of men. Let's recap that concept as we move further into its consequences in talking about the weaponization of technology through misogyny. So Laura and Joy give us a quick recap of the concept of the weaponization of kindness.
Speaker 3:So the last time we talked about kindness, but not the kindness that gets nurtured, the kindness that gets exploited. Here's what we know: our instincts are hardwired. Like any other mammal, we are built by nature to sense danger. But somewhere along the way, for women especially, society installs a faulty app called Be Nice No Matter What, and it teaches us to override our natural instincts so that we stay quiet, stay small and stay agreeable. And that's what predators look for.
Speaker 2:Got it. They don't look for weakness, they look for kindness, the kind that's been programmed to ignore red flags. But here's the good news: the bad software can be overwritten, and your instincts were never broken, they were just muted. So today we're going to talk about how technology, like predators, has learned to weaponize that same social programming. It's time to take back that control.
Speaker 1:Yeah, and in a lot of ways it is possible to take some of that control back. Now help us understand where the idea that misogyny has weaponized technology came from.
Speaker 3:The concept that misogyny has weaponized technology comes from research. We'll use the Dublin study as an example. The researchers created fake accounts for 16-year-old boys and found that within 23 minutes those accounts were being fed misogynistic and anti-LGBTQ content. So this isn't just "what to watch next," it's grooming by algorithm, and here's why. Think of it like this: if a human predator slowly gained a teen's trust, isolated them and fed them toxic content over time, we'd call it what it is, right? Grooming. Now swap that human predator for an algorithm. Same strategy, same harm, just faster, slicker and with a data-backed feedback loop. The Dublin study showed that those fake accounts didn't just stumble into that content; the accounts were fed that information by the AI algorithm.
Speaker 3:The systems are designed that way. They're designed to keep people watching, and controversial, extremist content does exactly that. And now AI makes the algorithm worse, because it's self-tuning. The algorithm watches you as you watch the videos, so if you linger too long on a video that's pushing toxic content, the algorithm says, oh, you like that? Here's 10 more. And unlike a human, of course, it doesn't sleep, it doesn't stop and it certainly doesn't question the ethics of what it's feeding you. So what we're seeing is a high-speed conveyor belt pulling boys, especially those who are lonely, angry or just want to date, towards content that dehumanizes women, mocks LGBTQ people and normalizes violence. This is radicalization on an industrial scale, dressed up as entertainment.
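[Editor's note: the self-tuning loop Laura describes, watch time in, more of the same out, can be sketched in a few lines of Python. This is purely illustrative, with made-up topic names and a made-up linger threshold; it is not any platform's actual code, only a minimal model of an engagement-driven feedback loop.]

```python
# Illustrative sketch of an engagement feedback loop: the recommender
# tracks how long a viewer lingers on each topic and serves more of
# whatever held their attention, with no review of what the content says.

from collections import defaultdict


def update_scores(scores, topic, watch_seconds, threshold=30):
    """If the viewer lingers past the threshold, boost that topic's score."""
    if watch_seconds >= threshold:
        scores[topic] += 1.0   # "Oh, you like that? Here's 10 more."
    else:
        scores[topic] *= 0.9   # interest decays when you scroll past
    return scores


def next_recommendations(scores, catalog, k=3):
    """Serve the k topics with the highest engagement scores."""
    return sorted(catalog, key=lambda t: scores[t], reverse=True)[:k]


scores = defaultdict(float)
catalog = ["cooking", "fitness", "dating-advice", "gaming"]

# A viewer lingers on dating-advice videos a few times and skips cooking...
for _ in range(3):
    update_scores(scores, "dating-advice", watch_seconds=45)
update_scores(scores, "cooking", watch_seconds=5)

# ...and the loop now front-loads that topic, tuning itself to the viewer.
print(next_recommendations(scores, catalog))
```

After only a handful of interactions the hypothetical "dating-advice" topic dominates the feed; the loop never asks whether the content woven into that topic is benign or toxic, which is the neutrality-of-the-algorithm point the guests make next.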
Speaker 1:Yeah, I want to back up a minute from that part of it, because someone designed and decided that this algorithm needed to be in place to entrap young boys and young men, to socially condition them toward misogynistic thoughts. So this is kind of an architecture designed by patriarchal society. Is that correct?
Speaker 3:That said, I will say the algorithm does not care what content it serves up. Whether it's cooking, sewing, fashion, swimwear or misogyny, the algorithm does not care. Its sole goal in life is to feed you more of what you've already watched. The idea that someone designed it for that purpose, it's actually just not personal. The algorithm, designed by the corporations, just wants you to watch more content because, quite simply, and we'll talk about this further, that's what serves them ads. The ads are monetized, and the creator and the platform get more money from it.
Speaker 1:Well, who's designing the algorithm that decides young boys should get misogynistic content, then?
Speaker 3:It depends on what they're watching. So let's say they start off with a question on YouTube: how can I get a date, how can I meet women? And what happens, in the way that creators really game the algorithm, is they will say, this is how you get a date, and then, within that content, they start weaving in misogynistic content. So the algorithm itself is very neutral. It's got one goal, and one goal only: to feed you content. But it's the creators who game that algorithm and say, oh, you want a date, you want to learn how to meet women? Okay, I'll tell you this.
Speaker 3:You want to self-improve? I'll tell you how to make yourself better, I'll tell you how to work out and what to eat. But in the meanwhile, they are weaving in content that says, oh, a woman's trying to set up boundaries? Well, a guy like you can override those boundaries, because you know what? If a woman is really into somebody, she's not going to have any boundaries. So we believe the content creators are just as much to blame, because they've gamed the algorithm.
Speaker 1:Yeah, I've no doubt that that's true as well. It's a fascinating topic and, as simple as it sounds, I think there are a lot of complexities in how these algorithms are designed and in what content can be pushed. And I know at times on social media, I feel like I'm being tested by different businesses or tech companies or algorithms: well, what do you think about this? I know you look at all of this, and I have certain things that I'm always looking at, but what do you think about this? And you can ignore it or just keep scrolling. Joy, what do you have to say?
Speaker 2:I definitely agree with that, and I find that myself when I'm online: whoa, why am I being pitched all this? Over the last five years at least, it's just become a bombardment of all this horrible, toxic content. You almost can't get away from it wherever you are. No matter what you do, what you say, it still tends to pop up on your feed.
Speaker 1:Yeah, it is very hard to get away from it, and there's not a lot of legislation around it. I mean, they only just passed that recent legislation, the Take It Down Act. The Take It Down Act was inspired by deepfakes being created, especially of women, of nude images, and this has happened to lots of celebrities, and it happens to lots of women and young girls. I'm going to look it up so I can read exactly what it says: the Take It Down Act is a federal law that addresses the issue of non-consensual intimate images, including deepfakes, that are shared online. It criminalizes the publication of these images and requires platforms to remove them upon the request of the victim. The act also includes provisions for law enforcement to access this information for investigative purposes, and there's a lot more to it. It passed the House and was sent to the president. And while you're looking at that...
Speaker 3:I would love to see Congress do anything because, quite truthfully, here we are in 2025, and nothing has been done in Congress about making online spaces safer, even since the turn of the century, so 25 years. Ever since they last legislated, back in the late 1990s, because of lobbyists, because of the influence of the corporations, the social media corporations and whatnot, all of the proposed legislation has been defeated time and time and time again. No matter who testifies in front of Congress, no matter how many whistleblowers have come forward, they have yet to act.
Speaker 2:And how many people have sued and have not gotten their content taken down? And the longer something like that stays up there, it's ruining their reputation, and they're spending money on lawyers, and it just keeps going around and around.
Speaker 1:Yeah, it's a very murky place, right, the internet and posting stuff on social media, and there is not a lot of safety for any of us there, but there certainly is even less for children. And so we wanted to talk about an example from the Netflix series Adolescence.
Speaker 2:Right. I'm going to say that from my time in law enforcement, I've learned that being home doesn't always mean being safe. A closed door and Wi-Fi create a dangerous illusion where parents think their kids are safe because they're not out. But the threat, hate, doesn't need a front door; it finds them online. And that Netflix series, it's real: a boy alone in his room, radicalized by hate like Laura mentioned, fueled by messages of power and control, and it drove him to kill a classmate. This isn't rare. We need to disrupt the silence behind these screens, because the screens hide warning signs. Digital safety isn't passive, it's active. So we need to check in, stay curious, ask them questions, because we need real conversations and real awareness. Not because you don't trust your kid, but because the internet is full of voices trying to own them, and nobody owns your children.
Speaker 1:I want to add to that a couple of thoughts as well, because having conversations, especially with teenagers, can be challenging.
Speaker 1:To ensure that you're getting really honest answers and I'm not trying to imply that teenagers are intentionally dishonest what I'm trying to say is that you might ask them a question specifically about are you on this platform, do you play Roblox or you know one of these other types of games, which are equally as scary as social media platforms, and they may.
Speaker 1:They may tell you yeah, I play that, but I only play with my friends, or no, I don't use that service, but sometimes I get Snapchats, you know, sometimes people send them to me, and so there are vulnerabilities in their responses, and parents are not always educated enough to know what these responses mean and that's again not a criticism of a parent so much as we don't have the information we need and we don't have legislation that could even protect us or our children, should we find out that something is going terribly wrong. Now, lately, however, certain video games and other platforms have been exposed as having some serious vulnerabilities and very dangerous environments for children, and Instagram did make a move to require age limits for accounts. Now I think kids can't have an Instagram account, what I recall hearing last. But those are baby steps. They're not like really strengthening the safety net around technology in any way.
Speaker 2:It's still the conversation that sometimes I think parents just don't want to have. They may feel embarrassed to even talk about it because they think they're going to hear oh, I know about it, they talk about that at school and they feel that if they went on that website and you caught them, you know they don't know what to do and they're going to get angry. So, you know, these are probably just ways to talk to them so they're not embarrassed and they can say yep, I was on that website and I saw it. You know they talked about it at school. I just wanted to go on it. But let them know what is happening out there and how you can be lured into people catfishing you on there.
Speaker 1:Yeah, and I think this ties right back to the first part of this series, when we were talking about kindness. And I think here's a really dangerous scenario: you have a young girl who's on social media, she plays video games with her friends, she has her own account, her own phone, and she's been taught to be polite at all costs, so she's not listening to her instincts. Then a stranger reaches out and starts texting: they saw your profile picture, you're really cute, why don't you send me another picture? And she starts to be drawn into this, and then she doesn't want to say no, because now there's a bond, right? They've gotten you to feel safe and comfortable with them, and now, if you say no, you know what's going to happen: they're going to be mad, they're going to avoid you, and then you don't have this digital relationship any longer. This is one of the traps.
Speaker 2:Right, and I think sometimes, if parents are getting pushback like that, if they think maybe a relationship is starting, it might even be good to look up a recent case, and there are so many of them that have happened, and say, look, this girl was the same age as you. All she did was get online to play this game, and here's this person who pretended to be 14 or 15 years old, and they're really 30 or 40 years old or older. Maybe if they see that, kind of like a movie, they'll believe it. They need to see that these cases are real, that they do happen, and that children do get kidnapped.
Speaker 1:Yeah, I mean, that's one of the worst outcomes of those types of cases. Laura, what were you going to add to that?
Speaker 3:So often we think about young women, but I think it's important to think about young boys as well and how naive they are online because they fall into the same trap, and what we're starting to hear more about now is the suicides of young men who are caught in sexting traps where somebody pretends to be a young woman, pretends to send them you know a picture, and then tells the boy to take a picture of himself and everything that's involved with that and send it back, and then, once they get going in this loop that then they start hitting them up for money and we know people whose young men, whose young boys, have been caught in this trap and they say if you don't send us the money, then we're going to send these pictures to everybody on your social media.
Speaker 3:They just had to take the hit because what's more important? Your child's life, or some pictures that people are going to forget about in the next news cycle or the next 24 hours? But you know, because so much shame is involved, that so many young people are being taken in by this and you know, really it breaks our hearts.
Speaker 1:For sure, and that has been a popular news story lately; those sexting cases have been in the media, along with some of the really terrible consequences and outcomes from them. I suggest people look into some of these topics themselves, especially if you have children of your own, to understand the landscape of what's going on with our young people and technology. Let's turn our attention to a little bit of a different concept. Joy, can you give us some insight into how tech-driven misogyny is showing up in actual cases, whether it's domestic violence, school threats or other gender-based crimes? In other words, we experience this harassment or misogynistic content in technology when we're in that secluded space, alone with our Wi-Fi, with the bedroom door closed. What happens in real life, then, when we are out with people? What are some examples of how this can change people's attitudes, especially men's attitudes toward women?
Speaker 2:There have been plenty of cases that I have seen over the years in my career in law enforcement, a lot of the cases I studied. They are rooted in control and entitlement and these people have a deep resentment towards women. So that's misogyny in action and you know it's not about hateful words, but it's about their behavior. So, like the mass shooters and domestic abusers, they have clear histories of women hating and they justify this abuse by blaming the victim. And it shows up in schools and homes, especially online, and these extremist groups actively recruit young men into misogynistic thinking. Now I have three case studies that illustrate this behavior. Three case studies that illustrate this behavior and it's really mind blowing that at such a young age, this has happened and they've got sucked into this. The first one was a notorious one and he was 22 years old and he inspired others Elliot Roger, isla Vista, california, in 2014,. He killed six and he injured 14 in his misogyny-fueled rampage near UC Santa Barbara. He called it his day of retribution against women who rejected him and the men he hated, and his YouTube videos and manifesto showed years of growing resentment, violent fantasies, starting at age 17.
Speaker 2:The second one, nicholas Cruz, from Parkland, florida, the county I worked in and, as a matter of fact, it was just months after I retired and my sheriff's office responded to this at the high school in Parkland. At the high school in Parkland, Cruz was 19 years old. He opened fire at the high school with an AR-15, killing 17 and injuring another 17. And before the attack he posted violent messages, was active in the online extremist communities. He was another isolated, angry guy. And the third guy is very recent, from FSU in Florida, phoenix Eichner. 20 years old. Eichner targeted fellow college students, hate online and on the campus In his posts, also depostility towards women and minorities. These aren't isolated incidents. They're part of a digital echo chamber. This is tech-driven misogyny in action.
Speaker 1:Those are very interesting examples, horr horrifying, of course, but they are exactly what you just said. This is how tech-driven misogyny plays out in our day-to-day lives, and we see it all the time on media. Let's talk about what can be done. There are four bills before Congress to regulate algorithmic exposure and content targeting. Talk about what those are.
Speaker 3:First of all, the Kids Online Safety Act. Now that was passed last year, I believe, by the Senate, it was held up by the House and now it's back again with some revisions. So if this does pass, this will be as I mentioned before, this will be the first bill that passes our federal government regulating online content for children. The second one, called the Children and Teens Online Privacy Protection Act, and that's COPPA 2. And it expands protections for users ages 13 to 16, andans targeted advertising and mandates verified parental consent, and it also introduces an eraser button for teens and parents to delete personal data. The third one aims to tighten obligations for platforms around child sexual abuse material, and it's CSAM, and so it requires public reporting, mandatory reporting of suspected exploitation, and it introduces penalties plus a new exception to Section 230. That's the bill that was passed in 1998, the last time by Congress, I might add, to allow suits when providers facilitate child sexual abuse material. You would think that this would be a no-brainer Stop the exploitation, especially the exploitation of children, but it seems, as we talked about before, it seems to go round and round. And then the fourth one is the EARN IT Act, and that proposes stripping Section 230 prevention protections when platforms fail to prevent childhood sexual abuse material.
Speaker 3:So I'm going to summarize Section 230 real quick. It basically says that the platforms are not responsible for the content that users provide. So if Time Magazine or the New York Times publishes content, they are held liable for publishing factual content. Section 230 of the federal government says social media platforms do not have that responsibility. In fact, anyone can post anything they want and the only responsibility of the platform is to make the platform available and, as we just previously talked about, and to push content that users want to see more of. So we're hopeful that these regulations pass in Congress. But we also encourage your listeners to not just rely on legislators, because the news cycle, the cycle of predation on the internet, is way too fast. It takes legislators years and many of them have no idea about technology. By the way, it takes years for them to even consider something and in the meanwhile, things can go viral just in a few minutes, as we know.
Speaker 1:Yeah, and technology has changed so rapidly. I mean, even if there had been legislation a couple of years ago, it would have to be revised at this point, because there's just so much happening. I appreciate you bringing all of that to our attention. What else can parents, schools or even people listening to this podcast do to help stop the spread of this online radicalization?
Speaker 2:Well, I'm going to say for parents, it's not just about the screen time, it's about the screen content. Like we said, you can ask your kids what are they seeing, but, you know, not in an aggressive, gotcha tone, but with some curiosity. You know, who do they follow, what makes them feel seen or powerful online? You know that opens the door to talk about manipulation without triggering shame. And for schools, media literacy definitely needs to be taught like a life skill, because it is. Schools should learn how algorithms work and how online content can be engineered to manipulate them. And for the rest of us friends, neighbors, podcast listeners we need to call out the content but call in the viewers. So when we see someone getting pulled into that toxic content, don't write them off. You can try, you know, validating their feelings or insecurity while drawing a hard line against hate.
Speaker 1:Yeah, those are good points. Thanks for sharing that.
Speaker 3:The only thing I can add is that we say call out the content but call in the person. One of the things we talk about so often is involuntary celibates the acronym is incels and incels, like we talked about before, they're often young men who are lonely, who are angry and who are pulled into this content because they are angry and lonely and they don't really have anyone else to turn to, so they think that these influencers are their friends. So when we become aware of that, if we can be empathetic and really call those people in and let them know that they aren't alone and that that content is harmful but that there are better ways, I think that we can all benefit.
Speaker 1:Good advice. So thank you both for talking with me again and I will see you next time. Thank you.
Speaker 3:It's been wonderful.
Speaker 1:Thanks so much, maria. Genesis Women's Shelter and Support exists to give women in abusive situations a way out. We are committed to our mission of providing safety, shelter and support for women and children who have experienced domestic violence, and to raise awareness regarding its cause, prevalence and impact. Join us in creating a societal shift on how people think about domestic violence. You can learn more at GenesisShelterorg and when you follow us on social media on Facebook and Instagram at Genesis Women's Shelter, and on X at Genesis Shelter. The Genesis Helpline is available 24 hours a day, seven days a week, by call or text at 214-946-HELP 214-946-4357.