Elon Musk’s AI Is Generating Sexual Images Of Women And Girls. Here’s What To Do If It Happens To You.

By The Owner Press
January 11, 2026


Over the past few weeks, people on X ― the Elon Musk–owned social media platform ― have used the app’s chatbot, Grok, to generate sexual images of women and girls without their consent.

With a few simple instructions ― “put her into a very transparent mini-bikini,” for instance ― Grok will digitally strip anyone down to a bikini.

A report by the nonprofit AI Forensics found that 2% of 20,000 randomly selected images generated by Grok over the holidays depicted a person who appeared to be 18 or younger, including 30 young or very young girls or women in bikinis or transparent clothing. Other images depict women and girls with black eyes, covered in liquid, and looking afraid.

Despite receiving global backlash and regulatory probes in Europe, India and Malaysia, Musk first mocked the situation by sharing an array of Grok-generated images, including one depicting himself in a bikini, alongside laughing-crying emojis.

By Jan. 3, Musk commented on a separate post: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” (We’ll explain what constitutes illegal content later on.)

“What matters legally and morally is that a real person’s body and identity were used without consent to create a sexualized lie.”

– Rebecca A. Delfino, an associate professor of law who teaches generative AI and legal practice at Loyola Marymount University

Deepfake nudes are nothing new, but experts say it’s getting easier to create and publish them.

Deepfake nudes are nothing new. For years, apps like “DeepNude” have given people access to deepfake technology that lets them digitally insert women into porn or strip them naked without their knowledge. (Of course, men have been victims of sexualized deepfakes as well, but the research indicates that men are more likely than women to perpetrate image-based abuse.)

Still, Grok’s usage this week is different and arguably more alarming, said Carrie Goldberg, a victims’ rights lawyer in New York City.

“The Grok story is unique because it’s the first time there’s a combining of the deepfake technology, Grok, with an immediate publishing platform, X,” she said. “The instant publishing capability allows the deepfakes to spread at scale.”

“It should be underscored how bizarre it is that the world’s richest man not only owns the companies that create and publish deepfakes, but he’s also actively promoting and goading users on X to de-clothe innocent people,” Goldberg added. “Elon Musk feels entitled to strip people of their power, dignity, and clothes.”

What’s been happening the past few weeks is unfortunate, but none of it is a surprise to Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI. Her take: This problem will get worse before it gets better.

“Every tech service that allows user-generated content will inevitably be misused to upload, store and share CSAM (child sex abuse material), as CSAM bad actors are very persistent,” she said.

The upshot is that AI companies must figure out how best to implement robust safeguards against illegal imagery. Some companies may have a stronger culture of “CSAM/nonconsensual deepfake porn is not OK.”

Others will try to have it both ways, setting up loose guardrails for safety while also trying to make money from permissible NSFW imagery, Pfefferkorn said.

“Unfortunately, while I don’t have any direct insight, x.AI does not seem to have that strong of a corporate culture in that respect, going off Elon Musk’s dismissive reaction to the current scandal as well as previous reporting from a few months ago,” she said.

Victims of this kind of exploitation often feel powerless and unsure of what they can do to stop the images from proliferating. Women who are vocal online worry about the same thing happening to them.

Omny Miranda Martone, the founder of the Washington-based Sexual Violence Prevention Association, had deepfake nude videos and pictures of themselves posted online a few years back. As an advocate for legislation stopping digital sexual violence, Martone wasn’t exactly shocked to be a target.

“They also sent the deepfakes to my organization, in an attempt to silence me. I’ve seen this same tactic used on Twitter with Grok over the past week,” they said.

Martone said they’ve seen multiple instances of a woman sharing her opinion and men who disagree with her using Grok to create explicit images of her.

“In some cases, they’re using these images to threaten the women with in-person sexual violence,” they added.

One of the most persistent beliefs about deepfakes depicting nudity is that because an image is “fake,” the harm is somehow less real. That assumption is wrong, said Rebecca A. Delfino, an associate professor of law who teaches generative AI and legal practice at Loyola Marymount University.

“These images can cause serious and lasting damage to a person’s reputation, safety, and psychological well-being,” she said. “What matters legally and morally is that a real person’s body and identity were used without consent to create a sexualized lie.”

While protections remain uneven, untested and often come too late for victims, Delfino said the law is slowly beginning to acknowledge that reality.

“Stories like what’s happening with Grok matter because public attention often drives the legal and regulatory responses that victims currently lack,” she said. “The law is finally starting to treat AI-generated nude images the same way it treats other forms of nonconsensual sexual exploitation.”

What can be done if an AI-generated nude is posted of you?

Preserve the evidence.

If you identify deepfake content of yourself, screen-capture it and report it immediately.

“The most practical advice is to act quickly and methodically,” Delfino said. “Preserve evidence ― screenshots, URLs, timestamps ― before content is altered or removed. Report the image to platforms clearly as nonconsensual sexual content and continue to follow up.”

If you’re under 18 in a nude or nudified image, platforms should take that very seriously, Pfefferkorn said. Sexually explicit imagery of children under 18 is illegal to create or share, and platforms are required to promptly remove such imagery when they learn of it and report it to the National Center for Missing & Exploited Children (NCMEC).

“Don’t be afraid to report a nude image to NCMEC that you took of yourself while you were underage: there is also a federal law saying you can’t be legally punished if you report it,” Pfefferkorn added.

And if a minor is involved, law enforcement should be contacted immediately.

“When possible, consulting with a lawyer early can help victims navigate both takedown efforts and potential civil remedies, even where the law is still evolving,” Delfino said.

Know that there’s growing legal recourse.

The Take It Down Act, signed into law last May, is the first federal law that limits the use of AI in ways that can harm individuals. (Ironically enough, Grok gave someone insight about the Take It Down Act when asked about the legal consequences of digitally undressing someone.)

This legislation did two things, Martone said. First, it made it a criminal offense to knowingly publish AI-generated explicit videos and images without the consent of the person depicted. Second, it required social media sites, search engines, and other digital platforms to create “report and remove procedures” by May of 2026 ― still a few months away.

“In other words, all digital platforms must have a way for users to report that someone has posted an explicit video or image of them, whether it was AI-generated or not,” they said. “The platform must remove reported images within 48 hours. If they fail to do so, they face penalties from the Federal Trade Commission (FTC).”

Pfefferkorn noted that the law allows the Department of Justice to prosecute only those who publish or threaten to publish NCII (nonconsensual intimate images) of victims; it doesn’t allow victims to sue.

As it’s written, the Take It Down Act only covers explicit images and videos, which must include “the uncovered genitals, pubic area, anus, or post-pubescent female nipple of an identifiable individual; or the display or transfer of bodily sexual fluids.”

“A lot of the images Grok is creating right now are suggestive, and certainly harmful, but not explicit,” Martone said. “Thus, the case could not be pursued in criminal court, nor would it be covered by the new report-and-remove procedure that will be created in May.”

There are also many state laws that the nonprofit consumer advocacy group Public Citizen tracks here.

Remember that you’re not alone.

If this has happened to you, know it’s not your fault and you aren’t alone, Martone said.

“I recommend immediately contacting a loved one. Ask them to come over or talk with you on the phone as you go through the process of finding the images and choosing how to take action,” they said.

Once you have a loved one helping you, reach out to your local rape crisis center, a victims’ rights lawyer in your state, or an advocacy group to help you identify your options and navigate these processes safely, Martone said.

“Because there are so many variations in state laws, a local expert will ensure you are receiving guidance that is accurate and applicable to your situation,” they said.

Need help? Visit RAINN’s National Sexual Assault Online Hotline or the National Sexual Violence Resource Center’s website.


