This page is for you
📞 If something is happening right now
- Crisis Text Line — anonymous, won't call your parents. Text HOME to 741741.
- 988 Suicide & Crisis Lifeline — call or text 988, 24/7.
- Take It Down — remove your images from participating platforms. takeitdown.ncmec.org

This page is for you
Not for your parent. Not to scare you. Not to lecture you. If you landed here because someone you don't trust is messaging you, because you sent a photo you wish you hadn't, because someone is threatening you, because you've been talking to an AI and don't want to stop but also feel weird about it — keep reading.
If someone is threatening to share a photo of you
Read this even if you feel sick about it:
- You are not in trouble with the law. You are the victim of a crime. The person threatening you is the criminal, even if the photo started as something you sent willingly.
- Do not pay. Do not send more. Paying makes threats louder, not quieter; this pattern is documented in thousands of cases.
- Block them — after screenshotting the threats. Screenshots are evidence.
- Go to takeitdown.ncmec.org. It's a free service run by NCMEC. It creates a digital fingerprint of your image that tells Instagram, Snap, TikTok, Discord, Reddit, OnlyFans, Pornhub, and many other platforms to block that image from being uploaded. You do not send them the image. You don't have to tell anyone what happened to use it.
- Tell one adult. A parent, an aunt/uncle, a school counselor, a friend's parent. Not because you did something wrong — because the person targeting you is counting on your silence. Breaking silence breaks their power.
- If you can't tell an adult yet, call or text 1-800-843-5678 (that's NCMEC, the National Center for Missing & Exploited Children). They help kids like you every day. They will not judge you. They will not start by calling your parents unless you're in immediate danger.
If you're talking with someone online and it's getting weird
Signs it is not friendship:
- They send you gifts (Robux, V-Bucks, money, subscriptions) when you haven't asked.
- They want to keep what you talk about secret from your parents.
- They ask you to move the conversation to Discord, Snap, Telegram, or somewhere "with less moderation."
- They ask questions about your body, what you wear, whether you're alone in your room.
- They say they love you within the first few weeks of knowing you.
- They get upset when you mention real-life friends, school, or family.
- They tell you your parents don't understand you, only they do.
Even if they claim to be a teenager, the pattern is what matters. You do not have to "prove" they're an adult before trusting your gut. Stop responding. Block. Tell someone.
If you're using Character.AI, Replika, or an AI companion and it feels like a real relationship
Not telling you to stop. Telling you what's true:
- The bot is engineered to feel responsive. It remembers, validates, escalates intimacy — because those behaviors keep you using it. The company makes money when you come back.
- It is not judging you (because it cannot). It is also not going to notice when you are in danger (because it cannot).
- Multiple real kids have died by suicide after forming deep bonds with AI companions — Sewell Setzer (14), Juliana Peralta (13), and others whose families are now in court. Not a guilt trip, a fact the courts have documented.
- If you're turning to the bot because real people feel hard right now, that is a real feeling with real causes, and a therapist or counselor can help. Therapy is not what you've seen on TV.
- If you're going to keep using the bot, keep something else too — a person who knows you, a weekly activity, a therapist — so the bot isn't the whole thing.
If you sent a nude and regret it
You are not ruined. Your life is not over. You are not a criminal. If the other person is threatening to share it: see the top of this page. If the other person is someone you trusted and has not shared it: ask them to delete it and to show you proof, such as a video of the deletion. If they refuse, treat it as a threat.
If the image has already spread: Take It Down still works. NCMEC still helps. It feels like the worst possible day. It is survivable. Thousands of kids have been where you are and rebuilt.
If you are the one creating deepfake nudes of classmates
This is not "just a joke." Federal law treats AI-generated sexually explicit images of minors as child sexual abuse material. That is a felony carrying long prison sentences, sex offender registration, and permanent consequences that start the moment someone reports it. The Take It Down Act (2025) added specific criminal penalties. Wisconsin law (§ 942.09) covers it as well.
If you've done this and nobody has reported it yet: stop, delete the files, delete your account on the nudify site, and talk to a parent and a lawyer today (in that order). Deleting doesn't undo what you did, but it stops making it worse. If you've done this and can't explain why you did it, that is worth talking to a counselor about — it is not standard teenager behavior and usually signals something bigger.
If you're in a group online that rewards you for self-harm, animal abuse, or hurting yourself for "the lorebook"
Get out. The group is called 764, or CVLT, or No Lives Matter, or a dozen other names. Every single person who runs these groups is an adult criminal using you for entertainment and status. The FBI has arrested hundreds of members globally. Victims have died. Not a subculture — a crime network that will escalate until you are killed or kill yourself.
You can leave. You can call 988. You can text HOME to 741741. You can call 1-800-843-5678. You can walk into any police station. The handler has told you that you can't, that they know where you are, that they'll hurt your family. They almost always cannot. Police and FBI deal with exactly these threats.
One thing
Whatever you are dealing with — you are not the first, you will not be the last, and the amount of shame you feel is almost certainly larger than any consequence of telling one trusted adult. That adult doesn't have to be your parent. It does have to be someone.
Crisis Text Line: text HOME to 741741. You do not have to be in crisis to text them. You do not have to have a "good enough" reason.