Hip-hop mogul Sean “Diddy” Combs was charged in a federal court with running a “criminal enterprise” that included
sex trafficking and racketeering conspiracy. Among other acts of horrific abuse, Combs is accused of coercing victims to appear in sexually explicit videos, which he then used as leverage to keep his victims from coming forward. In a culture of slut-shaming, these victims knew their accusations against Combs would not be considered credible if they were identified in sexually explicit videos.
This sordid story is another reminder that we need to protect victims of image-based sexual abuse (IBSA), the overwhelming majority of whom are girls and women.
IBSA does not deserve free-speech protections. It is a form of abuse. So what can we do about it?
You may be familiar with two varieties of IBSA: "revenge porn" and "deepfakes." Regarding deepfakes,
Google announced in July that it’s taking steps to prevent sexually explicit deepfake content from appearing in top search results and making it easier for victims to request the removal of this content.
But because of the enormous scope of the problem, Google’s efforts are inadequate. After all, why can’t sites that traffic in these images be removed from searches altogether?
And while there’s talk of technical interventions to flag fake content, like watermarking and protective shields, these are insufficient: even when an image or video is shown to be fake, it still has the power to destroy someone’s reputation and life.
Legal Relief
In the US, nearly every state, as well as the District of Columbia, has a law against "revenge porn," the sharing of private images without consent. A number of victimized women have prevailed in civil court.
But state laws are ineffective when online distribution crosses state lines. Additionally, while these victories enable victims to collect financial damages (if their perpetrators have the resources to pay), the abusive images remain online.
Online platforms such as Pornhub that sell sexual images and videos without the consent of those depicted are not held liable, thanks to Section 230 of the Communications Decency Act, which shields companies like Meta, X, and Pornhub from liability for harmful content posted by users.
Clearly, we need more protection—regulation with muscle, along with criminalization. Here’s a summary of recent efforts, along with information on action you can take to make a difference.
The SHIELD Act
The SHIELD (Stopping Harmful Image Exploitation and Limiting Distribution) Act is a federal bill that would criminalize the online distribution of private sexually explicit or nude images. The Senate has passed the bill, and it is now headed to the House (H.R. 3686).
Action you can take:
Reach out to your Representative and urge them to vote to pass the SHIELD Act (H.R. 3686).
The DEFIANCE Act
The first federal legislation giving survivors of deepfake abuse access to justice,
the DEFIANCE (Disrupt Explicit Forged Images and Non-Consensual Edits) Act is targeted legislation with bipartisan support. Like the SHIELD Act, it passed the Senate and is now in the House (H.R. 7569). This legislation is necessary to ensure that big tech companies and other online businesses do not profit from abuse.
Action you can take:
Reach out to your Representative and urge them to vote to pass the DEFIANCE Act (H.R. 7569).
Biden-Harris Administration Actions
Earlier this month, the White House celebrated the
30th anniversary of the Violence Against Women Act (VAWA), which Joe Biden wrote as a US Senator and which was signed into law in 1994. To mark the anniversary, the Biden-Harris Administration announced, among other actions, funding to help prevent, investigate, and prosecute the nonconsensual distribution of intimate images and other forms of technology-facilitated abuse.
The Biden-Harris Administration also secured commitments from several major tech companies, including OpenAI, Microsoft, and Adobe, to remove nude images from their AI training datasets.
Action you can take:
Vote for politicians who support these measures.
Key takeaway: Every single one of us is at risk of being harmed by image-based sexual abuse. I encourage you to take action, spread the word, and never, ever blame someone for “letting” this happen to them.
ICYMI
There’s also been recent US activity to try to protect minors from online content harms:
Instagram
Last week, under pressure from a lawsuit filed by 30 attorneys general, Instagram announced that it is
restricting the default settings of teen accounts. These accounts will now be set to “private” automatically, so only approved followers can see the account holder’s content.
KOSA
KOSA (the Kids Online Safety Act), bipartisan legislation introduced in Congress two years ago, is intended to protect children online. However, some LGBTQ+ and free-speech advocacy groups
warned that KOSA could lead to censorship of useful information, such as life-saving resources on sexual health care. A revised version of KOSA may be brought back to the House for a vote.