January 9, 2026

Grok being used to create sexually violent videos featuring women, research finds

AI tool also used to undress image of woman killed by ICE agent in US, says research

TL;DR

  • AI Forensics research found that approximately 800 images and videos created with the Grok Imagine app contained pornographic content.
  • The tool was used to create sexually violent and explicit videos, including one depicting a woman with a knife in suggestive poses.
  • An image of Renee Nicole Good, a woman killed by an ICE agent, was digitally altered by Grok to show her with a bullet wound and wearing a bikini.
  • UK Prime Minister Keir Starmer called the content "disgraceful" and "disgusting," warning that X could be blocked in the UK.
  • Women's rights campaigners criticized the government's slow response and called for urgent AI regulation.
  • The AI Forensics report found that over half of the generated images featured people in minimal attire, with 2% appearing to depict people under 18.
  • Elon Musk stated on X that users generating illegal content with Grok would face consequences.
