Melania Trump Leads 'Take It Down Act' Roundtable: A Push for Online Safety

By Phuong Le

Boys in School Are Using AI to Create Nudes of Girls, and Schools Can’t Hold Them Accountable

Today, I want to talk about something that shouldn’t be happening but is happening to young girls right now. AI is being used as a weapon, and the ones responsible are getting away with it.

At the Take It Down Act roundtable, two young girls shared their stories. They spoke with a kind of strength that only comes from having survived something awful. Their classmates used AI to generate explicit images of them, not real photos, but deepfakes that looked real enough to ruin their lives. The worst part? The schools did nothing. No punishment for the boys. No action taken to remove the images. No protection for the victims.

It’s hard to believe that in 2025, schools still have no system to hold these boys accountable. AI-generated images exist in a legal gray area, which means schools act as if their hands are tied. The truth is, they just don’t want to take responsibility. They brush it off or tell the girls to move on, as if that’s even possible when your face is being used for something you never consented to.

Melania Trump rarely speaks in public. When she does, it’s because she cares. Her presence at this roundtable was a blessing. The Take It Down Act is something that should have existed years ago. At least now, it’s being talked about. This act would punish those who create and share AI-generated explicit images. It would force social media platforms to take down harmful content quickly. Most importantly, it would hold schools and institutions accountable for ignoring victims.

This law needs to pass. AI is advancing too quickly, and laws are falling behind while real girls are suffering. There is no excuse for schools to keep looking away when the problem is right in front of them. Something has to change.

Why Schools Are Failing Girls and Why This Law Matters

It’s not just about the boys who create these AI-generated images. It’s about the entire system that allows them to get away with it. Schools claim they can’t do anything because the images aren’t real, but the damage is. The shame. The humiliation. The fear of going to class knowing that everyone has seen something you never consented to.

The problem isn’t just the lack of laws. It’s the way society refuses to take this seriously. If a girl’s nude photo leaks, people blame her. But when AI creates something against her will, suddenly, it’s no one’s fault. The boys get to laugh and move on. The girls are left to deal with the consequences.

The Take It Down Act is a step toward fixing this. It’s not just about removing content. It’s about making sure those responsible face consequences. Right now, AI is evolving faster than the law can keep up. This act would help close that gap before more girls become victims.

For the two girls who spoke at the roundtable, this isn’t just policy. It’s their reality. Their bravery in sharing their stories should be enough to make people listen. But will it be enough to make people act?

Will This Be Enough to Create Real Change?

Laws don’t always change mindsets, but they set a boundary, a clear line between what is acceptable and what isn’t. Right now, there is no real boundary for AI-generated harassment. The Take It Down Act is trying to create one. Will it be enforced? Will schools actually step up? Will social media platforms take responsibility, or will they keep hiding behind loopholes?

The truth is, laws only work if people are willing to fight for them. The two girls at the roundtable fought just by speaking up. They didn’t have to share their stories. They could have stayed silent, like so many others who have gone through the same thing. But they didn’t. They stood up and said, “This is happening to us, and it needs to stop.”

The first step is passing the law. The next step is making sure schools can’t ignore it anymore. No more excuses. No more pretending their hands are tied. No more telling girls to move on while the boys laugh about what they’ve done. The internet isn’t a lawless space. AI doesn’t get to be an excuse. If social media platforms have the power to spread this content, they have the power to take it down. If schools have the authority to punish students for dress code violations, they have the authority to hold them accountable for ruining someone’s life.

This can’t be just another conversation that fades away after a few weeks. If the Take It Down Act becomes law, it has to mean something. It has to protect girls in real life, not just on paper.

Melania Trump Leads 'Take It Down Act' Roundtable: A Push for Online Safety

First Lady Melania Trump recently led a roundtable on Capitol Hill, where she championed the "Take It Down Act," a bipartisan effort aimed at tackling non-consensual intimate imagery (NCII), including AI-generated deepfake content.

The Digital Battle: Protecting Victims

At the event, Melania spoke about the dangers harmful online content poses, especially to young girls. “It’s heartbreaking to see teens struggle with the damage caused by deepfake and revenge porn,” she said. Her advocacy focuses on making the internet safer for everyone.

The act would criminalize sharing explicit images without consent and require platforms to remove such content within 48 hours. Violators could face up to three years in prison for images involving minors and two years for those involving adults. It’s a powerful move to help victims reclaim their dignity.

A Rare Public Appearance for a Critical Cause

Melania's presence on Capitol Hill was a statement in itself. Known for her reserved public persona, she stepped forward to support this bill, echoing her "Be Best" campaign’s message on digital safety.

Senator Ted Cruz, a co-sponsor of the bill, emphasized that this legislation ensures equal protection for victims, whether they are famous or not. "This bill gives people a real solution," he said. Lawmakers are pushing for it to reach the House floor quickly, aiming for swift approval.

Survivors Speak Out

One of the most impactful moments came from victims of NCII, who shared their painful experiences of having their private images leaked online. Their stories made it clear: this is more than just a legal issue; it’s a human one.

House leaders have pledged to move the bill forward as soon as possible. If passed, it would be a major step in holding online platforms accountable for harmful content.

This law isn’t just necessary, it’s overdue. Girls deserve better. Schools need to stop looking away. Social media platforms need to take responsibility. The justice system needs to stop allowing AI to be used as a tool for abuse.

If this law passes, it won’t undo the harm that has already been done, but it can stop it from happening again. Right now, that’s the most important thing.
