Tragedy Uncovered: The Devastating Story of Cimarron Thomas and the UK's Most Notorious Catfisher
Serial Napper | True Crime Stories | November 14, 2024 | 00:31:03 | 28.44 MB


In May 2018, the life of 12-year-old Cimarron Thomas from West Virginia was cut short in a tragic event that would leave her family and community shattered. This vibrant young girl, who loved playing the violin and chatting with friends on Snapchat, ended her life using her father's handgun. Her younger sister discovered her body, and the family was left with unanswered questions. Eighteen months later, Cimarron's father, a U.S. Army veteran, took his own life, consumed by guilt and grief, unaware of the true circumstances behind his daughter's death.

Years later, in 2021, a shocking revelation emerged. Police in Northern Ireland contacted the family, revealing that Cimarron had been a victim of extreme online abuse by Alexander McCartney, a notorious catfish who targeted thousands of children worldwide. The chat logs on McCartney's computer exposed the horrific blackmail and exploitation that led to Cimarron's tragic decision. McCartney, sentenced to life with a minimum of 20 years, had manipulated and coerced Cimarron into sending compromising images, threatening to share them with her family and friends if she didn't comply with his demands.

Join me as I delve into this heartbreaking case, exploring the devastating impact of online abuse and the importance of protecting children from predators like McCartney. Discover how this tragedy has sparked a global conversation about online safety and the need for vigilance in the digital age. Will justice ever fully be served for Cimarron and her family, or will this case remain a haunting reminder of the dangers lurking online?

Sources:

https://www.judiciaryni.uk/judicial-decisions/2024-nicc-30 

https://www.bbc.com/news/articles/c7897plqy3zo

https://www.dailymail.co.uk/news/article-14001169/Alexander-McCartney-catfishing-sentencing-Cimarron-Thomas.html

https://www.bbc.co.uk/news/articles/cm2yj24xge1o 

https://www.pewresearch.org/internet/2023/12/11/teens-social-media-and-technology-2023/

Follow me here:

► YouTube - https://www.youtube.com/@SerialNapper/

► Twitter - https://twitter.com/serial_napper

► Instagram - https://www.instagram.com/serialnappernik/

► Facebook - https://www.facebook.com/SerialNapper/

► TikTok - https://www.tiktok.com/@serialnappernik 



I will be in London at CrimeCon UK and I would love to meet you! Use my discount code NAPPER10 for 10% off the ticket price! Visit https://www.crimecon.co.uk/

Our Sponsors:
* Check out Mood and use my code SERIALNAPPER to get 20% off your first order at https://mood.com
* Head to http://www.Goli.com now and get an exclusive 42% off!


Advertising Inquiries: https://redcircle.com/brands

Privacy & Opt-Out: https://redcircle.com/privacy

[00:00:00] The case featured in this episode has been researched using police records, court documents, witness statements, and the news. Listener discretion is advised. All parties mentioned are innocent until proven guilty, and all opinions are my own.

[00:00:34] Hey everyone, my name is Nikki Young, and this is Serial Napper, the true crime podcast for naps. I'm back with another true crime story to lull you to sleep, or perhaps to give you nightmares.

[00:00:47] Cimarron Thomas was a beautiful, happy-go-lucky 12-year-old little girl from West Virginia. She was the kind of young lady who loved playing the violin, spending time with her family, and chatting with her friends on social media apps like Snapchat.

[00:01:01] However, in May of 2018, without any warning, she ended her own life using her father's legally owned handgun.

[00:01:10] Cimarron Thomas was a good one. Her little sister would be the one to find her body in her parents' bedroom. And tragically, only 18 months later, her father, feeling guilty and devastated over the loss of his daughter, ended his own life too.

[00:01:25] But what could be the cause of such a senseless tragedy resulting in the loss of two lives?

[00:01:31] No one in the family would have those answers for years, until 2021, when police forces in Northern Ireland contacted them.

[00:01:40] During one of their investigations, officers had made a shocking discovery: chat logs between Cimarron and a man they suspected of extreme sexual exploitation.

[00:01:51] The 12-year-old little girl was believed to be a victim of 26-year-old Alexander McCartney, who is alleged to have targeted as many as 3,500 children on 64 devices between 2013 and 2019.

[00:02:08] The heartbreaking reason she had taken her life would be found in chat logs on his computer, saved like some kind of sick trophy.

[00:02:18] So, dim the lights, put your phone down, and listen to the very important story of Cimarron Thomas and the other young victims who made a simple mistake on Snapchat and were robbed of their childhood by a disgusting monster located thousands of miles away.

[00:02:36] So let's jump right in.

[00:02:38] I don't think I've cried this much while researching a case in a very long time.

[00:02:43] I almost didn't cover it.

[00:02:45] I have a little girl who is just 8 years old right now.

[00:02:49] But she reminds me so much of Cimarron Thomas.

[00:02:52] It's the adorable smile with the gapped teeth, the freckles, the big eyes.

[00:02:57] There's nothing more heartbreaking in this world than the loss of pure innocence, the loss of a child who never gets the chance to grow up to be the person that they were meant to be in this world.

[00:03:08] Like I said, I cried so much that I wasn't sure I was going to be able to read about this case, let alone speak about it.

[00:03:16] But then I read a quote by Cimarron's grandparents that said,

[00:03:40] And so here we are.

[00:03:43] Parts of tonight's story will likely make you sick.

[00:03:46] It's shocking and it's difficult to accept that there are people so disgusting and perverse on the other side of our computers and our cell phones.

[00:03:56] That these vile monsters can find your children in the safety of their homes and bedrooms.

[00:04:03] Cimarron's story is really more important than ever.

[00:04:06] She was a typical 12-year-old girl who grew up living with her father, an Army veteran, her mom, and her little sister in West Virginia.

[00:04:15] They were an ordinary family, and Cimarron was an ordinary young lady who was on the cusp of becoming a teenager.

[00:04:23] While she was still a child at heart who loved elephants and playing the violin,

[00:04:28] she was also spending more time chatting with her friends on social media, particularly the app Snapchat.

[00:04:35] Truthfully, this is not an app that I am super familiar with.

[00:04:39] Back in my day, and I know that makes me sound old, we used Snapchat simply for the cool filters.

[00:04:45] They were one of the first apps that I can recall that had those puppy dog filters that everyone was using.

[00:04:52] Beyond that, I've never really understood the appeal of wanting your messages and photos to disappear after 24 hours.

[00:04:59] But those features are exactly why so many people use the app today.

[00:05:04] According to the stats, in 2023, 60% of teens reported using Snapchat,

[00:05:11] making it the third most popular social media platform for teens after YouTube and TikTok.

[00:05:18] Approximately 51% of teens use Snapchat every single day,

[00:05:23] with 20% of its users being between the ages of 13 and 17 years old.

[00:05:28] My son is 12, and it seems like just about every friend that he has is on Snapchat.

[00:05:35] Now, he's technically not allowed to have the app, but that certainly hasn't stopped him from trying.

[00:05:41] He's redownloaded it, and I've had to delete it a few times now.

[00:05:46] All this to say, as much as we can try to protect our children, technology certainly doesn't make it easy.

[00:05:52] In May of 2018, while 12-year-old Cimarron was chatting with her friends on Snapchat,

[00:05:58] she received a new friend request from a name that she didn't recognize.

[00:06:02] The username said Sarah, so Cimarron accepted,

[00:06:07] believing this person to be a young girl around her age.

[00:06:11] At first, they chatted back and forth about usual things that young girls would talk about.

[00:06:16] They spoke of their own body image issues,

[00:06:20] and somehow Sarah convinced Cimarron to send over a topless photo.

[00:06:25] As soon as she had done so, Cimarron learned the truth about who Sarah really was.

[00:06:31] Sarah was not a girl at all, but an admitted catfish,

[00:06:36] who said he now had her nudes, including photos of her face,

[00:06:40] all of which he was going to post online for the world to see

[00:06:43] if she didn't comply with his further requests.

[00:06:47] While Cimarron pleaded with the catfish not to do it,

[00:06:51] he continued to threaten her for almost two hours.

[00:06:54] If she didn't want him to post her nude photos to Instagram,

[00:06:58] he warned her that she better send him whatever other photos he wanted,

[00:07:03] including photos of her in her underwear, in different sexual positions.

[00:07:09] Cimarron complied, terrified about the possible consequences of not doing so.

[00:07:14] The catfish concluded the conversation by telling the young girl that

[00:07:18] she had done what she needed to do, so she could go ahead and block him now.

[00:07:23] He would delete the photos and all would be well.

[00:07:27] Cimarron went back to bed that night and to school the following morning.

[00:07:30] She didn't tell a soul about what had happened the evening prior.

[00:07:35] She thought that it was over, but it wasn't.

[00:07:39] Just two nights later, she was once again contacted by the catfish,

[00:07:44] who was now using a different profile, a different picture, and a different name.

[00:07:48] He said to her,

[00:07:50] I want to play one more time.

[00:07:53] Cimarron begged for him to leave her alone,

[00:07:56] but the catfish told her that if she truly wanted to be left alone,

[00:08:00] she'd have to send just a few more photos before he actually deleted the ones that he already had of her.

[00:08:07] This time, he wanted Cimarron to take photos of her nine-year-old sister.

[00:08:12] She refused, telling the catfish that she would rather kill herself than harm her little sister like that.

[00:08:19] And in turn, he sent a countdown clock and told her,

[00:08:23] Good luck and goodbye. You will see what happens by tonight. I tried to be nice by offering to delete them for you. End quote.

[00:08:32] Just three minutes after the final message had been sent,

[00:08:36] Cimarron walked into her parents' bedroom and closed the door behind her.

[00:08:41] Both of her parents were out for the evening.

[00:08:43] Her little sister rushed into the room after she thought she heard a balloon pop,

[00:08:48] but it wasn't a balloon at all.

[00:08:51] Cimarron had retrieved her father's gun, which he legally owned,

[00:08:55] and she shot herself.

[00:08:57] Her little sister called 911,

[00:09:00] and here's a short clip of that call.

[00:09:02] You can only hear one side, the side of the dispatcher,

[00:09:05] because her sister was a minor at the time.

[00:09:11] Crescent County 911, what's the address of your emergency?

[00:09:15] What happened?

[00:09:17] She fell?

[00:09:19] Shot herself?

[00:09:21] What's your address?

[00:09:22] Do you live on, uh, do you live on?

[00:09:25] Is she okay?

[00:09:27] Is she, are you next to her?

[00:09:29] No.

[00:09:30] I'm at a neighbor who's trying to run across the street.

[00:09:33] I need to, I know, a firearm and it's okay.

[00:09:37] I got nothing right now.

[00:09:38] They're outside.

[00:09:40] They're not on sync.

[00:09:42] Sir, are you there?

[00:09:44] I have a dump of creep.

[00:09:45] Okay.

[00:09:46] I need you to get in there and get some rags and put pressure on that wound,

[00:09:51] find out how to control the bleeding,

[00:09:52] and I need to secure the firearm.

[00:09:55] You're the mother, right?

[00:09:56] No.

[00:09:58] Twelve or...

[00:10:00] Twelve or nine.

[00:10:01] The girl who's shot is twelve?

[00:10:04] Are you there?

[00:10:06] No, she shot herself in the head.

[00:10:08] Shot herself in the head?

[00:10:14] Okay.

[00:10:14] Okay.

[00:10:15] And can you get pressure on that?

[00:10:22] I need, yeah, get her to get rags, put pressure on that,

[00:10:25] control the bleeding as much as you can.

[00:10:26] I need to secure the firearm.

[00:10:27] Is it close by?

[00:10:29] There is now.

[00:10:30] Neighbors in the house.

[00:10:37] Able to secure the firearm?

[00:10:50] Okay, you got rags and pressure on?

[00:11:02] Turn her to the side so she doesn't aspirate, okay?

[00:11:14] I'm coming.

[00:11:15] Cimarron was transported to the hospital, where she was pronounced dead shortly after midnight.

[00:11:20] Neither her parents, her sister, nor anyone close to her had any idea why this happy-go-lucky

[00:11:27] twelve-year-old little girl would want to take her own life.

[00:11:31] And they wouldn't know the truth of it all for several more years.

[00:11:35] Sadly, Cimarron's father would never have those answers.

[00:11:39] He was devastated by his daughter's death,

[00:11:43] completely grief-stricken and overcome with feelings of guilt

[00:11:46] as it was his gun that Cimarron had used that night.

[00:11:49] He also had a seizure and was diagnosed with epilepsy,

[00:11:54] losing his job and his source of income.

[00:11:57] Just 18 months after her death, he too took his own life,

[00:12:02] leaving behind his wife and youngest daughter.

[00:12:05] This family was completely destroyed,

[00:12:08] and no one was aware of the source of all this pain.

[00:12:12] Not until April 2021, about three years later,

[00:12:16] when they received a call from a police agency in Northern Ireland.

[00:12:21] It's time for a quick break and a word from tonight's sponsors.

[00:12:25] Hang on, I'll be back before you know it.

[00:12:29] Is your daily routine filled with hidden hormone disruptors?

[00:12:34] Believe it or not, there are over 1,000 of these sneaky culprits lurking in our environment.

[00:12:39] Everything from our food and water to the air we breathe,

[00:12:42] and even in our skincare products.

[00:12:44] And they wreak havoc on our hormones.

[00:12:47] But here's the good news.

[00:12:49] You don't have to put up with it any longer.

[00:12:52] Introducing Hormone Harmony,

[00:12:54] a game-changing formula crafted from all natural herbal ingredients

[00:12:58] designed to ease hormonal symptoms for women of all ages.

[00:13:02] Hormone Harmony isn't just for those going through menopause.

[00:13:06] It's for any woman navigating the ups and downs of hormone balance.

[00:13:10] Join the movement.

[00:13:12] Hormone Harmony has become a sensation.

[00:13:15] With a bottle flying off the shelves every 24 seconds,

[00:13:18] and over 17,000 rave reviews,

[00:13:22] it's clear that women everywhere can't stop talking about it.

[00:13:26] If you're fed up with unexpected monthly surprises,

[00:13:29] or menopause symptoms that zap your energy and joy,

[00:13:32] it's time to reclaim your vitality with Hormone Harmony.

[00:13:36] And here's a special treat.

[00:13:38] For a limited time, enjoy 15% off your first order at happymammoth.com.

[00:13:44] Just use the code SERIALNAPPER at checkout.

[00:13:47] That's right.

[00:13:48] Head to happymammoth.com and enter SERIALNAPPER for your discount today.

[00:13:54] It's time to feel like you again.

[00:13:58] Now back to our story.

[00:14:01] Officers told the Thomas family that they had been investigating a 26-year-old man named Alexander McCartney for crimes against children, and they had found information on his computer that led them to believe that Cimarron had been one of his victims.

[00:14:18] Alexander still lived at home with his parents in a city in Northern Ireland. He was known to police long before this.

[00:14:26] His history of sexual deviancy began when he was just 17 years old. His internet activity had been flagged for viewing illegal material, resulting in a raid of his childhood home and a large number of electronic devices being seized.

[00:14:42] At that time, police took eight computer towers, four laptops, eight tablets, and nine mobile phones. They found thousands of illegal images depicting children on four devices: a laptop, a computer tower, a tablet, and a phone. He was using all of these devices to view explicit material containing images of children.

[00:15:06] Previously deleted chat conversations were recovered that showed a disgusting pattern of manipulation. Alexander would pressure young girls into sending nude photos of themselves, but he never seemed to be satisfied with what he got. He would simply demand more and more, and when they eventually refused, he'd threaten to post the photos that he already had publicly to the internet.

[00:15:31] In one of the messages police were able to recover at that time, the young girl said that she felt like she was going to die, and he replied, I don't care.

[00:15:41] Alexander was brought in for questioning. During the search of his home, police had found a handwritten letter, supposedly written by him, that started with, Dear Mom and Dad, I would like to say sorry and explain.

[00:15:54] The note went on to say that when he was 15 years old, he was the victim of a catfish who manipulated him into sending nude photos, which were later used to threaten him. It also suggested that this catfish made him do the very same thing to other young girls to collect more nude photos. If he didn't comply with this catfish, the note alleged that the catfish would ruin his life by then posting his nude photos online.

[00:16:23] The end of the letter read, Since September, I've wanted to end it one way or another as he had the power to ruin my life. I shouldn't live with the guilt anymore, and I have been depressed ever since. I should have stopped long ago, and I was stupid, so it spiraled out of control, so I can't take it anymore. I am so sorry. Goodbye. I love you all.

[00:16:47] During his interview with the police, Alexander reiterated this story: how this catfish, who claimed to be named Sarah, had made him go out and get more nude images of these girls, that he was being manipulated, that he was being blackmailed to do this.

[00:17:03] When the police asked why someone would go this far to make him do all of that, he responded that the person was probably just lazy and wanted someone else to do the dirty work.

[00:17:15] Investigators didn't really believe his story, and they were absolutely right not to. He was arrested and then released on bail, but this did not stop him from continuing to harass children online for his own desires.

[00:17:30] While the police were building their case against him, he continued to use his laptop, his desktop, and his mobile phone to solicit young girls for nude photos. He had absolutely no desire to stop.

[00:17:44] If he was truly being blackmailed by this catfish to do this, and he had now told the police what was going on, then he would have been able to stop at this point. But no, these photos were for him.

[00:17:55] It was brought to the attention of the police again when a 13-year-old girl from Scotland reported to her local authorities that she had been a victim of this sextortion scheme. She claimed that she had been groomed by someone who had pretended to be a girl her age, but was actually an adult male. Surprise, surprise.

[00:18:17] They were able to trace all of these messages right back to Alexander McCartney, who had never stopped doing this.

[00:18:24] Once again, the police raided his home, where he was still living with his parents, and they seized 64 devices that contained tens of thousands of images of underage girls who were being blackmailed into performing sex acts in photos and videos.

[00:18:41] While most of these victims didn't report anything to the police for fear of repercussions, they identified around 3,500 victims in Belgium, Canada, Croatia, Colombia, Denmark, Germany, Italy, Latvia, Lithuania, Mexico, the Netherlands, Norway, New Zealand, Poland, Slovakia, Spain, Sweden, and the US, as well as the UK.

[00:19:08] This was a wide-scale investigation that took years to fully uncover, but it was clear that Alexander McCartney had basically built a pedophile enterprise.

[00:19:20] He had a system that he followed closely to obtain these images, and he used saved, pre-written canned messages that he could simply copy and paste to these children to get them to do what he wanted. It was an entire system that he had built.

[00:19:39] He targeted young girls between the ages of 12 and 16 years old, most of whom were questioning their own sexuality and whether or not they were attracted to other women. He set up numerous fake profiles pretending to be a girl himself, and he used names such as Chloe, Anna, and Hannah.

[00:20:01] Think about it for a second. The way that he targeted these girls who were questioning their sexuality is absolutely diabolical, because he knew that many of them were not out. So if they were to tell their parents what was going on, not only would these girls have to admit to taking these sexually explicit photos of themselves, but then they would have to out themselves to their parents when they might not be ready to do that.

[00:20:27] He would send them photos which were not actually of him; they were always photos of young girls that he was pretending to be. Some of the photos that he was using for profile pictures and things like that were actually of previous victims he had extorted. So he was getting these photos from other girls, and then he was pretending to be them online to further victimize more young ladies.

[00:20:50] And then he preyed on their insecurities by love-bombing them at first, telling them how attractive they are, grooming them for the next phase of his plan.

[00:21:00] Once he gained their trust, he pressured them to take topless photos of themselves, always showing their face. He made sure that their face was visible in both these photos and videos.

[00:21:13] Once the young girl had sent the photo, he would reveal his true nature, telling them that he was a catfish. Which, let's be real, catfishing is far too innocent of a term. He is a pedophile. He's a predator.

[00:21:29] Then he demanded more explicit photos and videos from them, threatening to post the nudes that he already had in his possession if they didn't agree. Sometimes he threatened to send the photos directly to their family members. Sometimes he would send them maps of their location, which was easily accessible on Snapchat, and then he would threaten to show up at their school.

[00:21:51] With one victim, he told her that if she didn't send the exact photos that he wanted, he would post her nudes online, and then he would have people go to her home to rape her.

[00:22:04] These are literally children that we are talking about here.

[00:22:08] Children who have been manipulated and groomed into doing things that they know they really aren't supposed to be doing. Now, they don't want to get in trouble. They don't want people they know seeing these photos and thinking they're, quote, bad. So, he has them right where he wants them, where he can convince them to continue doing as he demands.

[00:22:29] Now, I need to issue a trigger alert here because I'm going to be talking about some of the things that this monster made these young girls do in the photos and the videos. It's disgusting and it will make you sick to your stomach, so skip ahead five minutes if you need to. I'm not trying to be gratuitous here. This is just really important to know, especially if you're a parent of either a girl or a boy.

[00:22:55] This monster would have these young girls position themselves while nude into sexual poses and perform various sex acts, including having them masturbate. According to the court documents, many of these sexual acts were penetrative in nature.

[00:23:14] If he learned that they had a younger sibling, he would demand that they sexually abuse these siblings in the photos and videos as well. He wanted to make his victims the bad guy and feel so much shame that they would never tell a soul.

[00:23:31] Some of the siblings of these victims were as young as three and five years old. So now, not only have these young girls been victimized, but they also have to carry the guilt and shame of abusing their younger siblings. Now they're being threatened with the allegation that they are having sexual intercourse with their siblings.

[00:23:54] One father from New Zealand spoke about how his daughter was abused exactly this way. His name has been withheld in order to protect the identity of his daughter, who was only 12 years old at the time that Alexander McCartney contacted her through Snapchat. He had asked the girl for nude photos, and after she sent them, Alexander used them to blackmail her.

[00:24:18] The girl's father would say, quote, He then used that to manipulate and blackmail her into sending more photos, which ended up including our youngest daughter as well as part of the blackmail. And then, in time, through her contact list on Snapchat, he added Rebecca's cousin as well, who was older at the time, and then he tried to threaten her with getting more photos. Thankfully, she was mature enough and smart enough to reach out to my wife, and then we went straight to the police from there.

[00:24:48] As if this wasn't bad enough, he would further degrade some of the victims by demanding that they urinate and defecate on themselves or on the floor, and then handle the feces. All of this was recorded through photos and videos that he would then threaten to make public if they ever defied him.

[00:25:08] If they complied, he might tell them that they did good, that he would now delete these images, and that they would just disappear. But oftentimes, once the victim finally felt like it was all over and they could breathe, he would pop back up in a week or even a month later to further victimize them, to traumatize them, to terrify them.

[00:25:31] In response to Alexander's disgusting demands, these young girls would beg for him to stop. In many of the photos they would send, it was obvious that they had been crying and they were distraught.

[00:25:45] Even when some of the victims told him that they wanted to kill themselves because of what he was doing to them, his response was cold. He basically just told them to go ahead and do it because he wasn't going to stop.

[00:26:01] One of the victims repeatedly said that she had wanted to end her life over the threats and that her mother was dying from cancer. His response to her: I do not give a shit about you or your mom.

[00:26:15] It's believed that Alexander McCartney targeted as many as 3,500 children on 64 devices between 2013 and 2019.

[00:26:25] After he was arrested, he would admit to about 185 charges involving 70 child victims between the ages of 10 and 16, including a manslaughter charge for Cimarron Thomas' death.

[00:26:41] Police in America had initially wanted to extradite him to the States so that he could face serious charges there, but unfortunately, it just wasn't possible. Still, this case was one of the biggest criminal indictments in Northern Ireland's legal history.

[00:26:57] It was also a groundbreaking case, the first instance of a manslaughter charge where the perpetrator had never actually met the victim in person.

[00:27:07] Alexander McCartney would be sentenced to life in prison, with a minimum of 20 years, for his crimes against children.

[00:27:14] The judge presiding over the case, Mr. Justice O'Hara, would say, quote, To my knowledge, there has not been a case such as the present where a defendant has used social media on an industrial scale to inflict such terrible and catastrophic damage on young girls up to and including the death of a 12-year-old girl. The defendant was remorseless. He ignored multiple opportunities to stop. He ignored multiple pleas for mercy. He lied and lied and then lied again. In my judgment, it is truly difficult to think of a sexual deviant who poses a greater risk than this defendant.

[00:27:55] So how was this monster able to get away with this for so long?

[00:28:00] Alexander McCartney was described by his former friends as a bit of a loner and a weirdo who was super interested in computers, yet not someone who they would ever describe as having the characteristics of a predator. And that might be the problem. There's no specific checklist of markers that one must have to be identified as a pedophile.

[00:28:24] This was a person who started abusing young girls through the computer when he was still only in high school himself. He spent a lot of time on technology, but it was something that he was interested in, even going to university to study computer science. No one was any the wiser that he had created this elaborate system to violate thousands of child victims all around the world.

[00:28:50] When some of the victims asked him why he was doing what he was doing, sometimes he'd make up a complete lie, claiming that his own parents were in prison for sexually abusing him and that he had been adopted out as a young child, an allegation that is totally false.

[00:29:08] Other times, he would be a little bit more honest, telling them, I'm just messed up and into weird stuff, or, I just get this urge where I like to be in full control sexually.

[00:29:20] Technology and social media have made it easier than ever for predators to reach children in the one place where we feel like they're safe: in their own homes. Many times, while these victims were being violated and abused, their parents were in the room right next to them.

[00:29:39] This monster in particular was only able to abuse these young girls for so long because he made them feel ashamed and embarrassed. They were victims. Unfortunately, they were made to feel like they were the bad ones, which is why we need to talk about it.

[00:29:57] Not only in our homes with our own children, but this is the kind of thing that needs to be taught in school: digital literacy and how to protect yourself against the dangers of technology and the monsters behind the screens. Especially in the age of AI, things are only going to continue to get worse.

[00:30:18] Children just do not have the capacity to fully understand the implications of just this one mistake, like sending a nude photo, but it's something that could completely destroy their lives.

[00:30:31] As I wrap up tonight's story, let's take a hard look at the reality that we're facing today. The internet and social media have knocked down the boundaries and safety nets that we've built between our kids and the dangers of the world.

[00:30:46] The predators that we've feared no longer need to lurk in dark alleys or in the shadows. They can quite literally enter our children's lives uninvited and undetected.

[00:30:58] We're so focused on keeping our kids safe on the streets while we also let them roam free on their phones and their iPads.

[00:31:07] It's time to get involved, re-engage, and take control, with the understanding that the virtual world, the digital world, the internet, social media, can be far more dangerous than the world just outside our door.

[00:31:23] That's it for me tonight. If you want to reach out, you can find me on Facebook at Serial Napper. You can find my audio on Apple or Spotify or wherever you listen to podcasts.

[00:31:35] I post all of my episodes in video format over on YouTube, so go check it out. And if you are watching on YouTube, I would love it if you can give me a thumbs up and subscribe, because quite literally every little bit helps.

[00:31:48] I'm over on X for now, probably going to delete it eventually, but hey, if you're still over on X, formerly known as Twitter, I post things at serial_napper.

[00:31:58] I'm also on TikTok, and my username is SerialNapperNik, that's all one word.

[00:32:03] Until next time, sweet dreams, stay kind, especially in the comments. Bye.