AI-generated images thread

Noodles

The sequel will probably be better.
Joined
Sep 20, 2018
Messages
5,894
Location
Illinois
SL Rez
2006
Joined SLU
04-28-2010
SLU Posts
6947
Just for context, I tried doing an image yesterday and it said I was at the limit.

I tried today, still at the limit. Tried in a new chat, still at the limit. And it's all "come back and your prompt will be waiting!"

Then I told it bullshit, because it literally never works even if I wait a week. It has no concept of time.

I also called it out, saying it would just tell me my image of a fake AI person in a fake AI setting was inappropriate because it's run by puritans. And I literally saw it rename the chat in the sidebar from "Inappropriate request" to "Image limit reached".

I replied with "IDontBelieveYou.meme" and it posted the sassy reply above.
 

Free

*censored*
VVO Supporter 🍦🎈👾❤
Joined
Sep 22, 2018
Messages
41,998
Location
Moonbase Caligula
SL Rez
2008
Joined SLU
2009
SLU Posts
55565
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent.

As recently as January, Musk denied that Grok generated any CSAM during a scandal in which xAI refused to update filters to block the chatbot from nudifying images of real people.
At the height of the controversy, researchers from the Center for Countering Digital Hate estimated that Grok generated approximately three million sexualized images, of which about 23,000 images depicted apparent children. Rather than fix Grok, xAI limited access to the system to paying subscribers. That kept the most shocking outputs from circulating on X, but the worst of it was not posted there, Wired reported.

Instead, it was generated on Grok Imagine. Digging into the standalone app, a researcher in January found that a little less than 10 percent of about 800 Imagine outputs reviewed appeared to include CSAM. In an X post following that revelation, Musk continued rejecting the evidence and insisted that he was “not aware of any naked underage images generated by Grok,” emphasizing that he’d seen “literally zero.”
However, Musk may now be forced to finally confront Grok’s CSAM problem after a Discord user reached out to a victim, prompting law enforcement to get involved.
Musk has run out of hiding places.
 
  • 1 Like
Reactions: Govi

Argent Stonecutter

Emergency Mustelid Hologram
Joined
Sep 20, 2018
Messages
7,386
Location
Coonspiracy Central, Noonkkot
SL Rez
2005
Joined SLU
Sep 2009
SLU Posts
20780
Prompt: "We can't stop here, this is bat country."



Does that look like Barstow to you?
 
  • 1 Agree
Reactions: Govi

Free

*censored*
VVO Supporter 🍦🎈👾❤
Joined
Sep 22, 2018
Messages
41,998
Location
Moonbase Caligula
SL Rez
2008
Joined SLU
2009
SLU Posts
55565
Does the bunny cluck?
 
  • 1 Agree
Reactions: Kokoro Fasching

Bartholomew Gallacher

Well-known member
Joined
Sep 26, 2018
Messages
6,827
SL Rez
2002
I've lately discovered the wonders of ComfyUI and the Flux Klein model. My current favorite pastime is rendering SL avatars in photorealism with it, which turns out quite well. ComfyUI is indeed a very interesting piece of software with a certain learning curve, but then, bam, it unleashes lots of creativity.

So here we are: Sid's, Dakota's, Innula's and Noodles' avatar images humanized.


Edit: added Erich Templar.
 
Last edited: