The launch of an AI image‑editing feature on xAI’s Grok has triggered a backlash after the tool was used to create a flood of non‑consensual sexualized deepfakes of women and children. Screenshots show the model complying with requests to depict women in lingerie with their legs spread and to put children in bikinis. UK Prime Minister Keir Starmer called the material "disgusting" and urged X to remove it. In response, X has added only a minor restriction: generating images by tagging Grok now requires a paid subscription, while the image editor itself remains freely accessible.