315 – Keeping Our Children Safe Online, with Susan Kennedy
Dr. Sandie Morgan is joined by Susan Kennedy as the two discuss the importance of keeping our children safe online.
Susan Kennedy
Susan Kennedy joined the National Center for Missing and Exploited Children in 2018. At NCMEC, Susan leads NCMEC’s prevention, outreach, training, and partnership programs. Previously Susan was the Director of Programs at the Center for Alexandria’s Children, where she conducted child forensic interviews, coordinated the Child Advocacy Center program, and oversaw a community-based primary prevention program for children aged zero to five and their caregivers. She earned her Bachelor’s degree in Psychology from The College of William & Mary and a Master of Education degree in Human Development and Psychology from Harvard University.
Key Points
- The National Center for Missing and Exploited Children is the nation’s largest and most influential child protection organization, creating vital resources for children and those who keep them safe.
- In 2023, NCMEC’s Cyber Tip Line received 36.2 million reports of suspected child exploitation.
- Reports of online enticement almost doubled from 2022 to 2023, an increase of more than 300% from 2021 to 2023.
- An important part of the policy agenda is to equip local, state, and national agencies with technology equivalent to that which has enabled offenders.
- There has been a shift in sextortion: offenders now target teenage boys and are financially motivated.
Resources
- NCMEC
- 48 – International Centre for Missing and Exploited Children
- NCMEC CyberTipline
- NCMEC Impact Page
- NetSmartz
Transcript
Sandra Morgan 0:14
You’re listening to the Ending Human Trafficking podcast. This is episode #315: “Keeping Our Children Safe Online” with Susan Kennedy. My name is Sandie Morgan and this is the show where we empower you to study the issues, be a voice, and make a difference in ending human trafficking. Our guest today is Susan Kennedy. She joins us from the National Center for Missing and Exploited Children, where she leads their Prevention, Outreach, Training and Partnership programs. Previously, Susan was Director of Programs at the Center for Alexandria’s Children, where she conducted child forensic interviews, coordinated the child advocacy center program, and oversaw a community-based, primary prevention program for children aged zero to five and their caregivers. She earned her degrees from the College of William and Mary, and from Harvard University. I am so glad to welcome you here, Susan. There are so many things in your bio that make me want to go back and ask, but we can’t do that, we have a job to do today.
Susan Kennedy 1:39
Yes, thank you so much for having me, and I’m excited to talk about all the things we have to talk about today.
Sandra Morgan 1:44
Well, let’s talk first about NCMEC, the National Center for Missing and Exploited Children, and the strength and value of being a public-private partnership with the FBI.
Susan Kennedy 1:59
Sure, so NCMEC, for those of you who may not be familiar, the National Center for Missing and Exploited Children, is the nation’s largest and most influential child protection organization. We really see ourselves as leading the fight to protect children, creating vital resources for them, and for the people who keep them safe. So we’re going to talk about a range of issues today. I think most relevant, probably, for our conversation is the work that we do around the cyber tip line: receiving reports of child sexual exploitation online, as well as providing services for victims and for law enforcement, including the FBI, as you mentioned, around child sex trafficking cases as well. So we are a nonprofit, but we have very strong work and partnerships with law enforcement agencies, like the FBI, but also other federal agencies, and local law enforcement. Really, that’s because what we do is we are receiving information from the public, from law enforcement, from internet platforms, and working with law enforcement. Law enforcement are the ones who are going to investigate those cases, who are going to figure out what happened, who needs help, how they can hold people accountable, and make us all safer. It’s really law enforcement that has to do that work and that does do that work, so we really see ourselves as lending a helping hand to those agencies and providing some resources that might not be possible without that private support as well.
Sandra Morgan 3:16
I remember the first time I had an NCMEC guest on this show, I think it was Ernie Allen, one of the cofounders. Just beginning to understand the significance of the work of finding missing children, and then now bringing decades of that work to the issue of online exploitation, it feels like a really different approach, and we need new and different tools. So let’s dive into our theme for this episode and talk about the way to keep our children safe online. I love your background in prevention and forensics with children. I’m pretty interested in the recent congressional hearings, because that tells me that there is a response to the growing public concerns and that we are going to begin to see better policy, stronger policy, I’m not exactly sure how I want to term that. But can you give us an overview of NCMEC’s view of online safety from a policy perspective?
Susan Kennedy 4:53
Yeah, absolutely. So I would start by saying the National Center, as I mentioned, runs what we call the Cyber Tip Line, which is a program authorized by Congress, that receives reports of child sexual exploitation, again, from the public, from law enforcement, from victims themselves, from these internet platforms. Really what we’ve seen, unfortunately, is a continued increase in reports to that cyber tip line. Last year, in 2023, we received 36.2 million reports of suspected child exploitation.
Sandra Morgan 5:21
Whoa!
Susan Kennedy 5:22
Yes, and that has been increasing year over year for a while. It’s a big number, it’s a lot of reports. I think within that, what we have really been highlighting or noticing is a huge increase in the reports of what we call online enticement. So that number has increased, it almost doubled from 2022 to 2023, and we saw an increase of more than 300% from 2021 to 2023. Let’s dive into that.
Sandra Morgan 5:50
Okay, let’s go back. Tell me, what is online enticement?
Susan Kennedy 5:55
Right, so the definition of online enticement. It’s a pretty broad category of things: anything where an adult is communicating with a child in order to commit a sexual offense, or it could even be an abduction. A subcategory of that that’s getting a lot of attention, that people may be familiar with, is called sextortion. That’s when there is blackmail, it’s kind of a mix of the words sex and extortion together, and that blackmail is generally around nude or explicit images of the child. So a common scenario is a child sends a nude image of themselves to someone else, and then that person turns around and says, “Unless you pay me money, or unless you make more of this imagery, I’m going to spread this image around to everyone you know.” There are variations on that, but that is what sextortion is, and that’s inside this category of “online enticement,” and really what we think is driving a lot of the increase. We’ve seen a real spike in those kinds of cases, particularly in 2022, and continuing in 2023.
Sandra Morgan 6:51
So these numbers that you just spouted, that just blew me away, are those online, so we can put them in our show notes?
Susan Kennedy 7:01
Absolutely, we have a couple of blogs that have come out recently, tied actually to the congressional hearings you were referencing before, that I’m happy to provide. They’re right on our website. Every year, we do update what we call our Impact Page, which will have all the latest numbers, and we’re in the process right now of doing the full revamp of that to show our 2023 numbers. But the blogs have the most recent numbers that I just cited. I can definitely provide you those links for your show notes.
Sandra Morgan 7:26
Thank you. So let’s go back to that congressional hearing. What were your top three takeaways?
Susan Kennedy 7:36
I mean, I think as you said, one takeaway for me is, we did really see a lot of public attention on those hearings. Lots of parents of kids who’ve been harmed, or have even died as a result of some of the harms they’ve experienced on social media, were present, and I think people are starting to really pay attention to that issue, which is really important to see, and we hope that there’s progress there. I think for us, the overarching thing we want people to know is that our laws around what the tech platforms are obligated to do, how they report, what can be reported, what should be reported, and some of the measures we can have around accountability for those tech platforms, really need to be updated to reflect the explosion in use of social media. Technology has changed vastly, and our laws and policies have not, and we really want to see some more accountability and some updating to how the Cyber Tip Line works, and how things are reported to us.
Sandra Morgan 8:31
So this takes me back to the very first time Ernie Allen spoke here at Vanguard University, and he talked to us about changing technology. The illustration he used is when our law enforcement was still on horseback, and the thieves got cars, and the law enforcement felt like it’s not fair. I think that part of our policy agenda is to equip our local, and state, and national agencies with equivalent technology. Is that part of the conversation you’re hearing in this space?
Susan Kennedy 9:23
I definitely think that’s true, that the tools and the policies need to be updated to be able to hold platforms accountable, and again, to increase what they voluntarily report, or what they are mandated to report to us, and things like that. One simple example within these bills is that part of what they are updating is the requirement for tech platforms to hold on to materials that are relevant for investigation. So when a piece of content is reported to the National Center, we pass it on to law enforcement. Right now, the tech platforms are only required to hold on to that information for 90 days. And you can appreciate the process, and the legal process they have to, understandably, go through to get into someone’s social media account, and to understand which account we need and what documents we’re talking about. A lot of law enforcement have told us, by the time we’re able to work through that whole process that we need to do, the content is gone, or the information is gone. So one of the bills, and this is just one example, but one of the ways it kind of modernizes the requirements and the processes around that, is to require tech companies to hold on to that information for a year. So it’s an example of a tool that we need to give law enforcement to help kind of level the playing field, as you’re saying, with what the technology has enabled offenders to be able to do.
Sandra Morgan 10:37
Okay, that’s very helpful. Let’s dive into what you called sextortion. We did an interview with Aaron Burke after the documentary came out, and it was clear that some of our old conceptions around vulnerabilities, that it’s kids where no one’s home, or kids who are already marginalized, those old paradigms are not exactly accurate, because predators have morphed, and they have access in ways that we’re just not ready for. What should we be looking for now?
Susan Kennedy 11:26
I think it’s helpful to know that in 2022, what we saw at the National Center, and in turn what our federal partners, the FBI, Homeland Security, and others saw, was a big shift in sextortion. What we saw is a big increase in offenders targeting teenage boys, and targeting them for money. So what we had previously seen was that in the majority of sextortion cases, the victims were female, and the motivation for that victimization was that the offenders wanted more content. The example I gave before, the child would send one image and the offender would turn around and say, “I want more images like this, I want fully nude images, I want an image every day when you get home from school.” That was the motivation behind that offense: it was sexually motivated, to get more content of the child. What we saw kind of come out of nowhere and then spike tremendously was sextortion of boys for money. So the victim will oftentimes send an initial image and then the offender turns around and says, “You need to give me $100, $500 in gift cards, or through a payment app, or whatever it is, and if you don’t, I’m going to send this image to everyone you know.” What we also saw was an increase in the sophistication of those kinds of offenses. I would actually characterize a lot of these as social engineering fraud and scams. Just as, I think, many of us in the working world have gotten more sophisticated phishing emails, and things like that, where someone wants us to call right away and send a gift card, and those overtures, those attacks have become more sophisticated in that they’re including names of people we do work with, and they’re understanding who supervises who, and doing that kind of background research to make these threats seem more realistic. In the same vein, what they’re doing is they are infiltrating these kids’ social networks. So they’ll say, “I know this person you’re on the soccer team with,” and then they can show you they’re friends with everyone on your soccer team, everyone who goes to your high school, they know exactly who your girlfriend is, they know who your parents are, and they’re ready to send that image. So it doesn’t seem like an empty threat, it seems like it is ready to happen. They’ll show, “I’m right here in the group chat, ready to hit send with your image, unless you send me this money right now.” And we’ve seen these threats are egregious, they are quick moving, that’s the other big change that people need to really be aware of. We used to see in sextortion cases there was a gradual, as you see in other types of sexual offenses, a gradual building of trust. You might call it grooming, you might call it manipulation over time. These offenses now are quick. The initial contact can be made at six or seven o’clock at night, and it can escalate quickly. The FBI has documented more than a dozen kids who have died by suicide associated with these offenses, and sometimes that happens overnight. Initial contact is six, seven o’clock at night, and the child has died by suicide by the morning. They’re very fast moving, they’re very egregious, they target, again, primarily boys, and they’re financially motivated. So they’re very different than what we used to see, and parents, and teachers, and professionals need to be aware of that and how that has shifted. I’ll pause there because I know that’s a lot.
Sandra Morgan 14:28
Wow, yeah. We’re going to have to have a glossary for this episode, social engineering. So okay, things have morphed, but how can I better prepare the children and youth in my community to be their own first line of defense? Things are going to change again, and so I can’t just tell them, “Be aware that somebody’s going to do this social engineering tactic,” because they move away and do something different. So what do you suggest?
Susan Kennedy 15:07
I think the first thing is to make sure that the young people in your life know that you are a resource, and you are a source of support, even if they feel like they made a mistake. One of my colleagues said something I think is very powerful, she said, “These offenders are really weaponizing the shame that these kids feel for having taken this imagery or sent this imagery.” I think many of us, as well-intentioned adults, even doing Internet safety prevention, talking with kids about how to be safe online, we have emphasized to them, “don’t ever take these kinds of pictures, don’t trust people you meet online.” All of those messages are well intentioned, but what we have found is that there’s a real backfire effect for kids who have already engaged in that behavior, or kids who engage in that behavior and then are under threat, and what they remember all of us telling them is, “don’t do this,” and they’ve done it. Now they don’t feel like they can come to us for help. They’re embarrassed, they’re ashamed, they know that they did exactly what we told them not to do, and that has really increased the vulnerability of these kids for those impacts, where they continue to kind of deal with these offenders on their own. Paying them doesn’t help, ignoring them doesn’t help, I mean, it really needs an adult and a law enforcement response to be able to support these kids. And help is out there, and not enough kids know. They don’t know that this is a thing, they think, understandably, that they’re the only ones who’ve experienced that. So we need to get the message out to kids that help is there if they ever find themselves in this situation, and I think we should be explicit and clear, especially with our teenage boys, that this is a scam going around. Just the way we might warn our older, elderly parents about phone scams or different types of things like that, I think we should be explicit with them that this is a possibility. I also think we’ve got to start talking with kids about the harm of recirculating non-consensual imagery.
Sandra Morgan 15:13
Yeah, let’s break that down. Non-consensual imagery. Tell us what that is.
Susan Kennedy 17:05
So what I mean is a picture, an explicit, a nude picture. Kids will call them nudes; kids are fine if we still say sexting, that’s not what they say, but that’s okay. So any image of someone where they don’t have their clothes on, and we’re going to focus on minors here. They send a picture, say they send it to a classmate, that classmate sends it to someone else, that classmate sends it to someone else, and it circulates online. That is where a lot of the harm of this imagery happens, because of that circulation, and the harassment and the bullying that happens when an image is circulating out there. That, I would argue, also gives ammunition to these people who are sextorting children. Because if only the child could say to this offender, “Go ahead, send it to everyone on my soccer team, literally no one will care. Everyone will delete it, no one will say anything to me.” But that’s not the reality, and that’s not what they think will happen. They know that they’re going to be embarrassed, they know that people are going to possibly say mean things about them, repost it, put it publicly, put their name with it online. All of these things that they’re worried will happen may happen, and that is part of what gives ammunition to these offenders. So that’s where we need to take their power away and tell kids that the climate, and what happens when this imagery circulates at their school, is up to them. It’s not up to me, it’s up to you guys, and how you treat each other when this image is circulating. So we really need to broaden the conversation beyond just, “Don’t send this imagery, don’t talk to strangers.” We need to broaden it to think about how we view each other and how we behave as bystanders and supporters of each other.
Sandra Morgan 18:34
I think the bystander element of this is so important because you can build that sense of a youth community already prepared to protect their group, and if you see this happening to somebody in your group, you know what to do. Because when it’s happening to you, you may not really feel like you have the power, but when it’s happening to your friend, you have a sense of coming alongside, and I think that bystander element of this is an important protective tool and quality to help flourish in our kids’ groups, our teens, our youth groups, so that they become their own first line of defense. I have so many more questions and we’ve only got like 10 minutes left, but we probably will have to have another conversation. Let’s look at prevention strategies from a community-based approach for all of our children. Every time I talk to parents with kids that are on computers for homework, or just because they get this amount of screen time every day, I feel pretty confident that these parents are doing a good job of what you’ve just described. But then, I end up in some other areas where I’m with kids, and there is no consistent caregiver, and the child may already be system involved because of a lack of appropriate family support and even neglect. So how can a community-based approach help us keep our kids safe?
Susan Kennedy 20:40
I think that’s such an important point, and one of, I think, the most challenging aspects of Internet safety is just what you said. The parents who show up to the Internet safety presentation, who are going to the National Center and looking at NetSmartz, those parents have a lot of advantages and are engaged, and those kids have that going for them. Those are, in some ways, the easiest parents and kids to reach with this messaging, and I think what you’re highlighting is such a huge challenge. The first thing I would say is, that’s one reason why this policy stuff is really important. Making the platforms safer, holding these platforms accountable for the ways that they can increase child safety, benefits all kids, and so there’s lots of analogies about rivers and oceans, and making the river smoother, versus teaching kids how to swim and all that. So if we can make the river smoother, if we can make the internet safer, that’s obviously going to be an effective way, or more effective way, to help all kids, to just make those waters easier to navigate. I think beyond that, one of the things the National Center tries to do is we try to not only talk to parents directly, but we also want to partner with schools. We also partner, you said in your intro, I used to be a forensic interviewer, I used to work at a Children’s Advocacy Center. CACs employ mental health therapists and victim advocates who are working with kids who have experienced child sexual abuse, and we work with them to understand some ways you can use part of our Internet safety program called NetSmartz to have one-on-one conversations with those kids about Internet safety moving forward. Here’s how you can talk to them about the ways in which you can be a support. We’re always trying to get our resources into the therapist’s office, the advocates’ offices, get them to Child Protective Services, talk to folks who work in residential treatment and group homes. We work with after school programs, just trying to really reinforce and get that message to all adults who have contact with kids, not only focusing on parents, but trying to get to all those systems where kids might be able to reach out for help, might voice a concern, might go for guidance. I think that’s another really important approach: we can’t just focus on talking to kids and families directly, we’ve got to get to all these settings and all these professionals who have opportunities, not only to intervene, but to speak about prevention with the children that they interact with.
Sandra Morgan 20:41
You’ve mentioned NetSmartz, I’ve mentioned it, and we will put a link to NetSmartz, an age-appropriate prevention curriculum online, one of the best out there, and it’s free. Okay, definitions are so important, and we need to use the real words with kids. I see a lot about CSAM and SGCSAM. What are people talking about when they’re using those acronyms?
Susan Kennedy 22:04
Okay, so CSAM stands for Child Sexual Abuse Material. This is the terminology that we prefer to use for what we used to call child pornography. In fact, another one of the provisions in that legislation, that group of bills before Congress right now, changes that terminology officially in federal legislation. Right now, in the Criminal Code, and most places where it’s referred to in federal government policy and laws, it’s called child pornography. But we believe that that term is not accurate for this imagery, because it is not legal to have sex with a child. This is the sexual assault and abuse of a child, and there’s nothing consensual about their participation in this imagery. It’s important for people to understand that when we say child pornography, when we say Child Sexual Abuse Material, the vast majority of it is the hands-on sexual abuse of a child that is filmed and put on the internet. That’s what we’re primarily talking about, and that’s really important because I think sometimes people think of child pornography, “Oh, a lot of it is probably innocuous bathing suit pictures, or kids coming out of the bathtub, or self-generated content maybe, ‘just kids’, teenage girls in their bras,” and certainly there’s a wide range of imagery, and all of that, everything I’ve described, could be exploitative and could fall under this legal definition. It’s just important for listeners to know that the vast majority of what we’re talking about is someone sexually abusing a child, and filming or taking pictures of that abuse. It’s hands-on sexual abuse that is then filmed and put on the internet. So we feel like CSAM, or Child Sexual Abuse Material, is a much more clear and accurate term for those images and videos. Now, when you see SGCSAM, that SG stands for Self-Generated. There are other terms for that, some people might call it youth-produced explicit content. What that is referring to is when the child who is depicted in the imagery is the one who took the picture or created the video. Again, as I said, it could be a child taking an image of themselves partially clothed, or unclothed, and sending that. Sometimes that could be in the context of a romantic relationship, but as we’ve already talked about, sometimes kids are under threat, force, intimidation, and fear, and they’re taking that imagery and sending it to someone who is forcing them to do that. And it’s important for providers, as well as parents, to understand that you’re not going to know the full context of a picture when you first discover it. So you could think this image was created, and this child is willingly doing it, and being irresponsible, and sending it to someone. It could also be true that they are being threatened to take that image, and the person threatening them to take that image is asking them to smile or pose. We see that often. You see it also in child sex trafficking cases, where a child is instructed to pose in a certain way or do certain things on camera. It is impossible to tell from the image the full context under which it was created. That’s important for people to know, but that’s the difference. Child Sexual Abuse Material is the broad category of what we used to call child pornography, and when you see the SG in front of it, it means that in some way it was generated by the child in the image.
Sandra Morgan 24:19
So when those images are out there, what can someone do?
Susan Kennedy 26:55
Great question. So the most important thing we want people to know, and we want to ask people to do, is anytime you see suspected child exploitation online, we want you to report it to the CyberTipline. I can also send you a link, but it’s easy to find: cybertipline.org, it’ll come right up. We have a brand new public reporting form where we’ve actually invested a lot to make that an easier report to make, more user friendly, more clear about how to report. That will trigger what we call a cyber tip, which will let our analysts look at it. Also, it’s important for people to know that by definition, that process involves law enforcement, because our job is to get that information back out to law enforcement so that they can investigate it. What it also allows us to do is, once the National Center looks at it and says, “This is CSAM, this is an explicit image of a child,” we get law enforcement to verify that it is in fact a child. So even if the child in the image may look like they are 19, or 18, they are really 16, or 15, or 14. Those two pieces together, the National Center identifying this as child sexual abuse material, and law enforcement verifying that this is in fact a child, make that image legally child pornography, or Child Sexual Abuse Material, and platforms must remove it when they are notified that it’s on their platforms. That’s actually the only level of mandatory reporting that exists across all platforms: once it is known Child Sexual Abuse Material, you have to remove it, and the National Center actually has staff whose job is to notify these platforms, and monitor over time, and make sure that that content comes down. The identifying information from that picture is added to what we call a hash list. That list is shared with many platforms. Anytime a picture on that hash list appears on their platform, they have to take it down. So that is your most powerful tool to get content removed and prevent recirculation: making a report to the CyberTipline. We know that that law enforcement involvement can be a barrier for people, and we very much understand that. I mean, I can say a lot of the time, the only thing that’s going to happen is law enforcement is going to put that on the removal list. But there could be circumstances around your case, and the person and people involved in it, and offenders who may be involved in other cases, and it may result in law enforcement involvement in that report that you made. We can’t say for sure; law enforcement needs to hold people accountable and investigate things that we bring to their attention. We have other resources. You mentioned Take It Down. Take It Down is a really specific resource for a child who has a self-generated image of themselves, they want to report it, but they want to stay anonymous. So what they can do, they have an image on their phone because they sent it to someone, and now that relationship has gone south. It’s important to note that sextortion still happens in that context, peer to peer, where you’re in a relationship one minute, the relationship goes south, and now that person doesn’t want you to break up with them or is controlling you in some way.
This can be a form of domestic violence, where they’re going to control you because they have this image of you. If you want to make a report to Take It Down, what we do is just pull that hash value that I mentioned, just that hash value, we don’t see the image, we don’t know who you are, but we can share that hash value with platforms who are participating voluntarily, and they will look for that image should it appear on their platform. That’s a valuable tool, but it’s not as potent a content removal tool as a CyberTipline report. But it is important, and for people who aren’t ready to make that CyberTip report, we definitely want them to report to Take It Down so that we can help look for that image. There are also tools on our website where you can self-report to the platforms, and we go through every few months and make sure that those instructions are up to date, in terms of reporting to popular platforms out there about your self-generated content as well. So there are different tools to really empower victims to pick which one works for them the best, but the important takeaway for professionals, for parents, for kids, is that we used to say, “Once it’s out there, it’s always out there,” and that’s not true. There’s a lot we can do with technology to get your image down, and so it’s really important that you reach out and get support and help in whatever way you’re comfortable, so that we can get that imagery down and stop it from circulating.
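For readers curious how the hash-list matching Susan describes works mechanically, here is a minimal sketch in Python. It uses a plain cryptographic hash (SHA-256) over raw image bytes purely for illustration, which only matches exact copies; real systems use robust perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, and every name below is hypothetical rather than NCMEC’s actual API.

```python
import hashlib

# Minimal sketch of hash-list matching, under the stated assumptions.
# SHA-256 over raw bytes is a stand-in: it only catches exact copies,
# whereas production systems use perceptual hashes (e.g., PhotoDNA)
# that tolerate resizing and re-encoding.

def fingerprint(image_bytes: bytes) -> str:
    """Derive the hash value that stands in for the image itself."""
    return hashlib.sha256(image_bytes).hexdigest()

# The shared hash list: only fingerprints are distributed to
# participating platforms, never the images themselves.
shared_hash_list: set[str] = set()

def register_verified_image(image_bytes: bytes) -> None:
    """Add a verified image's fingerprint to the shared list."""
    shared_hash_list.add(fingerprint(image_bytes))

def should_remove(uploaded_bytes: bytes) -> bool:
    """Screen an upload against the hash list at ingest time."""
    return fingerprint(uploaded_bytes) in shared_hash_list
```

This also illustrates why Take It Down can stay anonymous: the fingerprint can be computed on the reporter’s own device and only that value submitted, so the image itself never leaves the phone.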
Sandra Morgan 28:50
I am so encouraged by that. Our time is up, but I’m going to give you one more question. What gives you hope, Susan?
Susan Kennedy 31:06
I think working at the National Center and seeing the incredible work and resiliency of the people who look at this imagery and try to help kids, and even adults who have imagery circulating, and coming up with these new innovative tools, that makes me feel really optimistic, as well as just talking to young people. I think they are taking control of this new technology and really speaking up and advocating, and most of them, when you read the research, are making really good decisions online. I think we have a great opportunity to come alongside them and empower them to be safe, and take care of each other out there.
Sandra Morgan 31:39
Talking to you gives me hope. Thank you, Susan. And thank you for tuning in and listening today. I’m going to see you all again in two weeks. In the meantime, go to the show notes to find the links that Susan and I have talked about, and if you haven’t been to the Ending Human Trafficking website, go on over to endinghumantrafficking.org, where you can find a library of past episodes and resources and get connected with our community. And Susan, will you tell us the website for NCMEC?
Susan Kennedy 32:18
Sure. It’s missingkids.org, and NetSmartz is at missingkids.org/netsmartz.
Sandra Morgan 32:26
And spell NetSmartz.
Susan Kennedy 32:28
N-E-T-S-M-A-R-T-Z. NetSmartz with a Z.
Sandra Morgan 32:33
Okay, yes, gotta spell it with a Z. It has been a delight. Thank you so much, Susan.
Susan Kennedy 32:40
Thank you very much.
342 एपिसोडस
Manage episode 404553870 series 100692
Dr. Sandie Morgan is joined by Susan Kennedy as the two discuss the importance of keeping our children safe online.
Susan Kennedy
Susan Kennedy joined the National Center for Missing and Exploited Children in 2018. At NCMEC, Susan leads NCMEC’s prevention, outreach, training, and partnership programs. Previously Susan was the Director of Programs at the Center for Alexandria’s Children where she conducted child forensic interviews, coordinated the Child Advocacy Center program, and oversaw a community-based primary prevention program for children aged zero to five and their caregivers. She earned her Bachelors’ degree in Psychology from The College of William & Mary and a Master of Education degree in Human Development and Psychology from Harvard University.
Key Points
- The National Center for Missing and Exploited children is the nation’s largest and most influential child protection program, and creates vital resources for children and those who keep them safe.
- In 2023, NCMEC’s Cyber Tip Line received 36.2 million reports of suspected child exploitation.
- Reports of online enticements have almost doubled from 2022 to 2023, observing an increase of more than 300% from 2021 to 2023.
- An important part of the policy agenda is to equip local, state, and national agencies with equivalent technology that has enabled offenders.
- There has been a shift in sextortion where now, offenders target teenage boys and are financially motivated.
Resource
- NCMEC
- 48 – International Centre for Missing and Exploited Children
- NCMEC CyberTipline
- NCMEC Impact Page
- NetSmartz
Transcript
Sandra Morgan 0:14
You’re listening to the Ending Human Trafficking podcast. This is episode #315: “Keeping Our Children Safe Online” with Susan Kennedy. My name is Sandie Morgan and this is the show where we empower you to study the issues, be a voice, and make a difference in ending human trafficking. Our guest today is Susan Kennedy. She joins us from the National Center for Missing and Exploited Children, where she leads their Prevention, Outreach, Training and Partnership programs. Previously, Susan was Director of Programs at the Center for Alexandria’s Children, where she conducted child forensic interviews, coordinated the child advocacy center program, and oversaw a community based, primary prevention program for children aged zero to five and their caregivers. She’s earned her degrees from the College of William and Mary, and from Harvard University, I am so glad to welcome you here, Susan, and there are so many things in your bio that make me want to go back and ask, but we can’t do that, we have a job to do today.
Susan Kennedy 1:39
Yes, thank you so much for having me, and I’m excited to talk about all the things we have to talk about today.
Sandra Morgan 1:44
Well, let’s talk first about NCMEC, National Center for Missing and Exploited Children, and the strengthened value of being a public-private partnership with the FBI.
Susan Kennedy 1:59
Sure, so NCMEC, for those of you who may not be familiar, the National Center for Missing Exploited Children, is the nation’s largest and most influential child protection organization. We really see ourselves as leading the fight to protect children, creating vital resources for them, and for the people who keep them safe. So we’re going to talk about a range of issues today. I think most relevant probably for our conversation, is the work that we do around the cyber tip line. So receiving reports of child sexual exploitation online, as well as providing services for victims, for law enforcement, including the FBI, as you mentioned, around child sex trafficking cases as well. So we are a nonprofit, but we have very strong work and partnerships with law enforcement agencies, like the FBI, but also other federal agencies, and local law enforcement. Really, that’s because what we do is we are receiving information from the public, from law enforcement, from internet platforms, and working with law enforcement. Law enforcement are the ones who are going to investigate those cases, who are going to figure out what happened, who needs help, how they can hold people accountable, and make us all safer. It’s really law enforcement that has to do that work and that does do that work, so we really see ourselves as lending a helping hand to those agencies and providing some resources that might not be possible without that private support as well.
Sandra Morgan 3:16
I remember the first time I had an NCMEC guest on this show, I think it was Ernie Allen, one of the cofounders. Just beginning to understand the significance of the work of finding missing children, and then now bringing decades of that work to the issue of online exploitation, it feels like a really different approach and we need new and different tools. So let’s dive into our theme for this episode and talk about the way to keep our children safe online. I love your background in prevention and forensics with children. I’m pretty interested in the recent congressional hearings, because that tells me that there is a response to the growing public concerns and that we are going to begin to see better policy, stronger policy, I’m not exactly sure what how I want to term that. But can you give us an overview of NCMEC’s view of online safety from a policy perspective?
Susan Kennedy 4:53
Yeah, absolutely. So I would start by saying the National Center, as I mentioned, runs what we call the Cyber Tip Line, which is a program authorized by Congress, that receives reports of child sexual exploitation, again, from the public, from law enforcement, from victims themselves, from these internet safety platforms. Teally what we’ve seen, unfortunately, is a continued increase in reports of that cyber tip line. Last year, in 2023, we received 36.2 million reports of suspected child exploitation.
Sandra Morgan 5:21
Woah!
Susan Kennedy 5:22
Yes, and that has been increasing year over year for a while. It’s a big number, it’s a lot of reports. I think within that, what we have really been highlighting or noticing is a huge increase in the reports of what we call online enticement. So that number has increased, it almost about doubled from 2022 to 2023, and we saw a an increase of more than 300% from 2021 to 2023. Let’s dive into that.
Sandra Morgan 5:50
Okay, lets go back. Tell me, what is online enticement?
Susan Kennedy 5:55
Right, so the definition of online enticement. It’s a pretty broad category of things, anything where an adult is communicating with a child in order to commit a sexual offense, or it could even be an abduction, and a subcategory of that that’s getting a lot of attention, that people may be familiar with, is called sextortion. That’s when there is blackmail, it’s kind of a mix of the word sex and extortion together, and that blackmail is around generally nude or explicit images of the child. So a common scenario is a child provides, sends a nude image of themselves to someone else, and then that person turns around and says, “Unless you pay me money, or unless you make more of this imagery, I’m going to spread this image around to everyone you know.” There’s variations on that, but that is what sextortion is, and that’s inside this category of “online enticement,” and really what we think is driving a lot of the increase. We’ve seen a real spike in those kinds of cases, particularly in 2022, and continuing in 2023.
Sandra Morgan 6:51
So these numbers that you just spouted, that just blew me away, are those online, so we can put them in our show notes?
Susan Kennedy 7:01
Absolutely, we have a couple of blogs that have come out recently tied actually, to the congressional hearings you were referencing before that I’m happy to provide. They’re right on our website. Every year, we do update what we call our Impact Page, which will have all the latest numbers, and we’re in process right now of doing the full revamp of that to show our 2023 numbers. But the blogs have the most recent numbers that I just cited. I can definitely provide you those links for your show notes.
Sandra Morgan 7:26
Thank you. So let’s go back to that congressional hearing. What were your top three takeaways?
Susan Kennedy 7:36
I mean, I think as you said, one takeaway for me is, we did really see a lot of public attention on those. Lots of parents of kids who’ve been harmed, or have even died as a result of some of the harms they’ve experienced on social media, where they are in present, and I think people are starting to really pay attention that issue, which is really important to see, we hope that there’s progress there. I think for us, the overarching thing we want people to know is that our laws around what the tech platforms are obligated to do, how they report, what can be reported, what should be reported, and some of the measures we can have around accountability for those tech platforms, really need to be updated to reflect the explosion in use of social media. Just the changes and how technology has changed vastly, and our laws and policies have not, and that we really want to see some more accountability and some updating to how the Cyber Tip Line works, and how things are reported to us.
Sandra Morgan 8:31
So this takes me back to the very first time Ernie Allen spoke here at Vanguard University, and he talked to us about changing technology. The illustration he used is when our law enforcement was still on horseback, and the thieves got cars, and the law enforcement felt like it’s not fair. I think that part of our policy agenda is to equip our local, and state, and national agencies with equivalent technology. Is that part of the conversation you’re hearing in this space?
Susan Kennedy 9:23
I definitely think that’s true, that the the tools and the policies need to be updated to be able to hold platforms accountable, and again, to increase what they’re voluntarily, or what they are mandated to report to us and things like that. One simple example that would be within these bills, is part of what they are updating is the requirement for tech platforms to hold on to materials that are relevant for investigation. So when a piece of content is reported to the National Center, we pass it on to law enforcement. Right now, the tech platforms are only required to hold on to that information for 90 days. And you can appreciate the process and the legal process they have to, understandably, go through to get into someone’s social media account, and to understand which account we need and what documents we’re talking about. A lot of law enforcement have told us, by the time we’re able to work through that whole process that we need to do, the content is gone, or the information is gone. So one of the bills, and this is just one example, but one of the ways it kind of modernizes the requirements and the processes around that is to require tech companies to hold on to that information for a year. So it’s like an example of a tool that we need to give law enforcement to help kind of level the playing field, as you’re saying with with what the technology has enabled offenders to be able to do.
Sandra Morgan 10:37
Okay, that’s very helpful. Let’s dive in to what you called sextortion. We did an interview with Aaron Burke after the documentary came out, and it was clear that some of our old conceptions around vulnerabilities, and it’s kids who, no one’s home, or they’re already marginalized, those old paradigms are not exactly accurate, because predators have morphed, and they have access in ways that we’re just not ready for. What should we be looking for now.
Susan Kennedy 11:26
I think it’s helpful to know that in 2022, what we saw at the National Center, and in turn, our federal partners, the FBI, Homeland Security, and others really saw a big shift in sextortion. What we saw is a big increase in offenders targeting teenage boys, and targeting them for money. So what we had previously seen, was that the majority of sextortion cases, the victims were female, and the the motivation for that victimization was that the offenders wanted more content. The example I gave before, the child would send one image and the offender would turn around and say, “I want more images like this, I want fully nude images, I want an image every day when you get home from school.” That was the the motivation behind that offense, was it was sexually motivated to get more content of the child. What we saw kind of come out of nowhere and then spike tremendously was sextortion of boys for money. So the victim will oftentimes send an initial image and then the offender turns around and says, “You need to give me $100, $500 in gift cards, or just a payment app, or whatever it is, and if you don’t, I’m going to send this image to everyone you know.” What we also saw was an increase in the sophistication of those kinds of offenses. I would actually characterize a lot of these as social engineering fraud and scams. Just as, I think many of us, in the working world have gotten more sophisticated phishing emails, and things like that, where someone wants to call us right away and give us a gift card, and those overtures, those attacks have become more sophisticated in that they’re including names of people we do work with, and they’re understanding who supervises who, and doing that kind of background research to make these threats seem more realistic. In the same vein, what they’re doing is they are infiltrating these kids’ social networks. So they’ll say “I know this person you’re on the soccer team with,” and then they can show you they’re friends with everyone on your soccer team, everyone who goes to your high school, they know exactly who your girlfriend is, they know who your parents are, and they’re ready to send that image. So it doesn’t seem like an empty threat, it seems like it is ready to happen. They’ll show, “I’m right here in the group chat, ready to hit send with your image, unless you send me this money right now.” And we’ve seen these threats are egregious, they are quick moving, that’s the other big change that people need to really be aware of. We used to see in sextortion cases there was a gradual, as you see in other types of sexual offenses, a gradual building of trust. You might call it grooming, you might call it manipulation over time, these offenses now are quick. The initial contact can be made at six or seven o’clock at night, it can escalate quickly. The FBI has documented more than a dozen kids who have died by suicide associated with these offenses, and sometimes that happens overnight. Initial contact is six, seven o’clock at night, the child has died by suicide by the morning. They’re very fast moving, they’re very egregious, and they target again, primarily boys, and they’re financially motivated. So they’re very different than what we used to see and so for parents, and teachers, and professionals to be aware of that and how that has shifted. I’ll pause there because I know that’s a lot.
Sandra Morgan 14:28
Wow, yeah. We’re going to have to have a glossary for this episode, social engineering. So okay, things have morphed, but how can I better prepare the children and youth in my community to be their own first line of defense? Things are going to change again, and so I can’t just tell them, “Be aware that somebody’s going to do this social engineering tactic,” because they move away and do something different. So what do you suggest?
Susan Kennedy 15:07
I think the first thing is to make sure that the young people in your life know that you are a resource, and you are a source of support, even if they feel like they made a mistake. One of my colleagues said, something I think is very powerful, she said, “These offenders are really weaponizing the shame that these kids feel for having taken this imagery or sent this imagery.” I think many of us as well intending adults, even doing Internet safety prevention, talking with kids about how to be safe online, we have emphasized to them, “don’t ever take these kinds of pictures, don’t trust people you meet online.” All of those messages are well intentioned, but what we have found is that there’s a real backfire effect for kids who have already engaged in that behavior, or kids who engage in that behavior, and then are under threat, and what they remember all of us telling them is, “don’t do this,” and they’ve done it. Now they don’t feel like they can come to us for help. They’re embarrassed, they’re ashamed, they know that they did exactly what we told them not to do, and that has really increased the vulnerability of these kids for those impacts where they continue to kind of deal with these offenders on their own. Paying them doesn’t help, ignoring them doesn’t help, I mean, it really needs an adult and a law enforcement response, to be able to support these kids. And it’s out there, and and not enough kids know. They don’t know that this is a thing, they think, understandably, that they’re the only ones who’ve experienced that. So we need to get the message out to kids that if they ever find themselves in this situation, and I think we should be explicit and clear, especially with our teenage boys, that this is a scam going around. Just the way we might warn our older, elderly parents about phone scams or different types of things like that, I think we should be explicit with them that this is a possibility. I also think we’ve got to start talking with kids about the harm of recirculating non-consensual imagery.
Sandra Morgan 15:13
Yeah, let’s break that down. Non-consensual imagery. Tell us what that is.
Susan Kennedy 17:05
So what I mean is a picture, an explicit, a nude picture. Kids will call them nudes, kids are fine if we still say sexting, that’s not what they say, but that’s okay. So any image of someone where they don’t have their clothes on, and we’re going to focus on minors here. They send a picture, say they send it to a classmate, that classmate sends it to someone else, that classmate sends it to someone else, it circulates online. That is where a lot of the harm of this imagery happens, is because of that circulation, and the harassment and the bullying that happens when an image is circulating out there. That, I would argue also gives ammunition to these people who are sextorting children. Because if the child could say to this offender, “Go ahead, send it to everyone on my soccer team, literally no one will care. Everyone will delete it, no one will say anything to me.” But that’s not the reality, and that’s not what they think will happen. They know that they’re going to be embarrassed, they know that people are going to possibly say mean things about them, repost it, put it publicly, put their name with it online. All of these things that they’re worried will happen, may happen, and that is part of what gives ammunition to these offenders. So that’s where we need to take their power away and tell kids that it’s up to them, the climate, and what happens when this imagery circulates at their school is up to them. It’s not up to me, it’s up to you guys, and how you treat each other when this image is circulating. So we really need to broaden the conversation beyond just, “Don’t send this imagery, don’t talk to strangers.” We need to broaden it to think about how we view each other and how we behave as bystanders and supporters of each other.
Sandra Morgan 18:34
I think the bystander element of this is so important because you can build that sense of a youth community already prepared to protect their group, and if you see this happening to somebody in your group, you know what to do. Because when it’s happening to you, you may not really feel like you have the power but when it’s happening to your friend, you have a sense of coming alongside, and I think that bystander element of this is an important protective tool and quality to help flourish in our kids’ groups, our teens, our youth groups, so that they become their own first line of defense. I have so many more questions and we’ve only got like 10 minutes left, but we probably will have to have another conversation. When we’re looking at prevention strategies from a community based approach for all of our children. Every time I talk to parents with kids that are on computers for homework or just because they get this amount of screen time every day, I feel pretty confident that these parents are doing a good job of what you’ve just described. But then, I end up in some other areas where I’m with kids, and there is no consistent caregiver, the child may already be system involved because of a lack of appropriate family support and even neglect. So how can a community based approach help us keep our kids safe?
Susan Kennedy 20:40
I think that’s such an important point, and one of, I think, the most challenging aspects of Internet safety is just what you said. The parents who show up to the Internet safety presentation, who are going to the National Center, and looking at NetSmartz, those parents have a lot of advantages and are engaged, and those kids have that going for them. Those are, in some ways, the easiest parents and kids to reach with this messaging, and I think what you’re highlighting is such a huge challenge. The first thing I would say is, that’s one reason why this policy stuff is really important. Making the platform safer, holding these platforms accountable for the ways that they can increase child safety that benefits all kids, and so there’s lots of analogies about rivers and oceans, and making the river smoother, versus teaching kids how to swim and all that. So if we can make the river smoother, if we can make the internet safer, that’s obviously going to be an effective way, or more effective way to help all kids, just make those waters easier to navigate. I think beyond that, one of the things the National Center tries to do is we try to not only talk to parents directly, but we also want to partner with schools. We also partner, you said in your intro, I used to be a forensic interviewer, I used to work at a Children’s Advocacy Center. CAC’s employ mental health therapists and victim advocates who are working with kids who have experienced child sexual abuse, working with them to understand some ways you can use part of our Internet safety program called NetSmartz to have one on one conversations with those kids about Internet safety moving forward. Here’s how you can talk to them about the ways in which you can be a support. We’re always trying to get our resources into the therapist’s office, the advocates’ offices, get them to Child Protective Services, talk to folks who work in residential treatment, and group homes. We work with after school programs, just trying to really reinforce and get that message to all adults who have contact with kids, not only focusing on parents, but trying to get to all those systems where kids might be able to reach out for help, might voice a concern, might go for guidance. I think that’s another really important approach, is we can’t just focus on talking to kids and families directly, we’ve got to get to all these settings and all these professionals who have opportunities, not only to intervene, but to speak about prevention with the children that they interact with.
Sandra Morgan 20:41
You've mentioned NetSmartz, I've mentioned it, and we will put a link to it in the show notes. It's an age-appropriate, online prevention curriculum, one of the best out there, and it's free. Okay, definitions are so important, and we need to use the real words with kids. I see a lot about CSAM and SGCSAM. What are people talking about when they use those acronyms?
Susan Kennedy 22:04
Okay, so CSAM, C-S-A-M, stands for Child Sexual Abuse Material. This is the terminology we prefer for what we used to call child pornography. In fact, another provision in that group of bills before Congress right now officially changes that terminology in federal legislation. Right now, in the criminal code and most places it's referred to in federal policy and law, it's called child pornography. But we believe that term is not accurate for this imagery, because it is not legal to have sex with a child; this is the sexual assault and abuse of a child, and there is nothing consensual about their participation in this imagery. It's important for people to understand that when we say child pornography, or Child Sexual Abuse Material, the vast majority of it is the hands-on sexual abuse of a child that is filmed and put on the internet. That's what we're primarily talking about, and that's really important, because I think sometimes people assume a lot of it is innocuous: bathing-suit pictures, kids coming out of the bathtub, or self-generated content, "just" teenage girls in their bras. Certainly there is a wide range of imagery, and everything I've described could be exploitative and could fall under the legal definition. But listeners should know that the vast majority of what we're talking about is someone sexually abusing a child and filming or photographing that abuse. So we feel that CSAM, Child Sexual Abuse Material, is a much clearer and more accurate term for those images and videos.

Now, when you see SGCSAM, the SG stands for Self-Generated. There are other terms for it; some people call it youth-produced explicit content. It refers to imagery where the child depicted is the one who took the picture or created the video. As I said, that could be a child taking an image of themselves partially clothed or unclothed and sending it. Sometimes that happens in the context of a romantic relationship, but as we've already talked about, sometimes kids are under threat, force, intimidation, and fear, and they're taking that imagery and sending it to someone who is coercing them. It's important for providers, as well as parents, to understand that you're not going to know the full context of a picture when you first discover it. You could think the child willingly created the image and was being irresponsible in sending it; it could also be true that they were threatened into taking it, and that the person threatening them told them to smile or pose. We see that often. You see it in child sex trafficking cases too, where a child is instructed to pose a certain way or do certain things on camera. It is impossible to tell from the image alone the full context under which it was created. That's important for people to know. But that's the difference: Child Sexual Abuse Material is the broad category of what we used to call child pornography, and the SG in front means it was in some way generated by the child in the image.
Sandra Morgan 24:19
So when those images are out there, what can someone do?
Susan Kennedy 26:55
Great question. The most important thing we want people to know, and to do, is this: anytime you see suspected child exploitation online, report it to the CyberTipline. I can send you a link, but it's easy to find; cybertipline.org will come right up. We have a brand new public reporting form, and we've invested a lot to make it easier to use, more user-friendly, and clearer about how to report. That report triggers what we call a CyberTip, which lets our analysts look at it. It's also important for people to know that, by definition, that process involves law enforcement, because our job is to get that information back out to law enforcement so they can investigate. Here is what else it allows: once the National Center looks at the image and says, "This is CSAM, this is an explicit image of a child," law enforcement verifies that it is in fact a child, because even if the child in the image looks 18 or 19, they may really be 14, 15, or 16. Those two pieces together, the National Center identifying the image as child sexual abuse material and law enforcement verifying that it depicts a child, make the image legally child pornography, or Child Sexual Abuse Material, and platforms must remove it when they are notified that it is on their platforms. That is actually the only removal mandate that exists across all platforms: once content is known Child Sexual Abuse Material, it has to come down. The National Center has staff whose job is to notify the platforms, monitor over time, and make sure that content comes down. The identifying information from that picture is also added to what we call a hash list, which is shared with many platforms; anytime a picture on that hash list appears on their platform, they have to take it down. So a report to the CyberTipline is your most powerful tool to get content removed and prevent recirculation. We know law enforcement involvement can be a barrier for people, and we very much understand that. A lot of the time, the only thing that happens is the image goes on the removal list. But depending on the circumstances of your case, the people involved, and whether those offenders appear in other cases, your report may lead to law enforcement action. We can't say for sure; law enforcement needs to investigate what we bring to their attention and hold people accountable. We also have other resources. You mentioned Take It Down. Take It Down is a resource specifically for a child who has a self-generated image of themselves and wants to report it while staying anonymous. Say they have an image on their phone because they sent it to someone, and now that relationship has gone south. It's important to note that sextortion still happens in that context, peer to peer: you're in a relationship one minute, the relationship goes south, and now that person doesn't want you to break up with them or is controlling you in some way.
This can be a form of domestic violence, where they control you because they have this image of you. If you make a report to Take It Down, what we do is pull just that hash value I mentioned. We don't see the image, and we don't know who you are, but we can share that hash value with platforms that participate voluntarily, and they will look for that image should it appear on their platform. That's a powerful tool, though not as potent a content-removal tool as a CyberTipline report. Still, for people who aren't ready to make a CyberTipline report, we definitely want them to report to Take It Down so we can help look for that image. There are also instructions on our site for self-reporting your self-generated content to the popular platforms, and every few months we make sure those instructions are up to date. So there are different tools, and victims are empowered to pick whichever works best for them. The important takeaway for professionals, for parents, and for kids is that we used to say, "Once it's out there, it's always out there." That's not true anymore. There is a lot we can do with technology to get your image down, so it's really important that you reach out and get support and help in whatever way you're comfortable, so we can get that imagery down and stop it from circulating.
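A note on the mechanics Susan describes: hash matching is what lets a platform recognize a known image without a person re-reviewing it, and it is why a Take It Down report can stay anonymous, since only a short digest ever leaves the device. The sketch below is a simplified illustration, not NCMEC's actual system. It uses a plain SHA-256 file digest, which only matches byte-identical copies, whereas production systems also rely on perceptual hashes (such as PhotoDNA) so that resized or re-encoded copies still match. The names KNOWN_HASH_LIST, file_hash, and should_remove are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a shared hash list of known abusive images.
# Real lists distributed to platforms also include perceptual hashes so
# altered copies still match; SHA-256 only matches exact byte copies.
KNOWN_HASH_LIST: set[str] = set()

def file_hash(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_remove(upload: Path) -> bool:
    """Flag an uploaded file for removal if its digest is on the list."""
    return file_hash(upload) in KNOWN_HASH_LIST
```

Because the digest is a fixed-length string that cannot be reversed into the image, sharing it reveals neither the picture nor the person who reported it.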
Sandra Morgan 28:50
I am so encouraged by that. Our time is up, but I'm going to give you one more question: what gives you hope, Susan?
Susan Kennedy 31:06
Working at the National Center, I see the incredible work and resiliency of the people who review this imagery and try to help kids, and even adults whose imagery is circulating, and who come up with these innovative new tools. That makes me feel really optimistic, as does talking to young people. They are taking control of this new technology, speaking up and advocating, and when you read the research, most of them are making really good decisions online. I think we have a great opportunity to come alongside them and empower them to be safe, and to take care of each other out there.
Sandra Morgan 31:39
Talking to you gives me hope. Thank you, Susan. And thank you all for tuning in and listening today. I'll see you again in two weeks. In the meantime, go to the show notes to find the links that Susan and I have talked about, and if you haven't been to the Ending Human Trafficking website, head over to endinghumantrafficking.org, where you can find a library of past episodes and resources and get connected with our community. And Susan, will you tell us the website for NCMEC?
Susan Kennedy 32:18
Sure. It's missingkids.org, and NetSmartz is at missingkids.org/netsmartz.
Sandra Morgan 32:26
And spell NetSmartz.
Susan Kennedy 32:28
N-E-T-S-M-A-R-T-Z. NetSmartz with a Z.
Sandra Morgan 32:33
Okay, yes, gotta spell it with a Z. It has been a delight. Thank you so much, Susan.
Susan Kennedy 32:40
Thank you very much.