Wait times & blacklists

#110
by deleted - opened

I wish I could use this like Dalle-Mini / Craiyon, honestly, without the queue times that take so long, only to find out your image was flagged as blacklisted by the AI, sometimes without an actual problem or any specific reason given to the user. I know the common explanation is that it costs a lot, but there are 10 team members on this project; Dalle-Mini has 12 known/shown ones on HuggingFace, if cost is even the issue. I just don't see how none of them can put together enough to make the queue no longer a problem. There are surely scalable GPU solutions out there that can handle this many users. Maybe I'm just biased in thinking it's an issue to have people wait so long for a single image that could just end up yeeted from your own personal viewing anyway.

Comparing craiyon and this model
[attached: a screenshot of this model's output and a Craiyon result for "A human next to a tent cooking sausages on a campfire on a snowy tall mountain"]

It takes about the same amount of time it took to download one real image in the 90s. I haven't been trying to generate porn, so I haven't had a single issue with the filter.

The team behind this is a university. Before hitting the go button, self-filter by asking whether any school you've attended would try to filter whatever you're typing in.

[attached: four "Pom in rain" images]

I'm having difficulty with the censorship myself, trying to get it to generate pictures to go with "Mary had a little lamb."
What is wrong with variations on this prompt?

"smiling teacher talking to children, outside of one room school house, girl in frilly dress petting a white sheep. Children's book art. Intricate details. smooth. sharp focus. illustration. art by Artgerm and Greg Rutkowski and Alphonse Mucha"

It won't even show the black picture, it just errors out. Kinda makes me nervous about what the AI is trying to show me xD

@restless Maybe it is because one of the definitions of the word "petting" is:

engage in sexually stimulating caressing and touching.

"couples necking and petting in cars"
https://www.dictionary.com/browse/petting

Nah, Composable Diffusion so far never really works at all for me. I'll stick to making random 2D, 3D-looking game art through Artbreeder Collage, which uses Stable Diffusion and runs quickly, around 10 seconds per generation.

@NatalieHugger
That would be dumb on the AI's part to make it sexual instead of the normal, common meaning.

And actually, even the harmless prompt "teacher talking to children" errors out as an unsafe prompt. So yeah, I guess it's just a really over-cautious CP filter going crazy. I opened a Google Colab with SD without the censorship and, as expected, my original prompt generates completely harmless pictures:

[attached: four uncensored generations from the original prompt]

So I would suggest the OP run SD in Google Colab instead. It's still free, you can make it uncensored, it's way faster, and you can use img2img. (The only limit is whether Google can give you a free GPU at the time you request one.)
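For anyone wanting to try the Colab route, here is a minimal sketch. The model id and the dummy-checker trick are my assumptions based on the 2022-era `diffusers` API, not something confirmed in this thread; the common community workaround was to swap the pipeline's safety checker for a function that flags nothing.

```python
# Sketch: Stable Diffusion in Colab without the demo's NSFW filter.
# Assumes the `diffusers` and `torch` packages and a GPU runtime;
# "CompVis/stable-diffusion-v1-4" was the current model id in 2022.

def dummy_safety_checker(images, **kwargs):
    # Pass every image through and report none of them as NSFW,
    # so nothing gets blacked out or errors.
    return images, [False] * len(images)

def generate(prompt, out_path="out.png"):
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "CompVis/stable-diffusion-v1-4",
        torch_dtype=torch.float16,
    ).to("cuda")
    # Replace the built-in checker with the permissive one above.
    pipe.safety_checker = dummy_safety_checker
    pipe(prompt).images[0].save(out_path)

# In a Colab cell you would then call, e.g.:
# generate("smiling teacher talking to children, outside of one room "
#          "school house, girl in frilly dress petting a white sheep")
```

The exact checker signature has changed across `diffusers` versions, so treat this as a starting point rather than copy-paste-safe code.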

I like the pictures, good work

I get some images filtered if I simply include the word "figure" in my prompt (as in "landscape with figure"). My guess is that a significant enough portion of the data set included naked people that it randomly may make any human naked unless specified otherwise, and all nude images are censored in the demo to avoid having to make moderating decisions.
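That guess matches how the filter appears to behave: generation succeeds, a classifier flags some outputs, and flagged ones are replaced after the fact. A toy sketch of that post-hoc step (function and variable names are mine; in `diffusers` the pipeline output exposes a similar per-image `nsfw_content_detected` list):

```python
def censor_outputs(images, nsfw_flags, placeholder="[image removed by filter]"):
    """Replace any image whose NSFW flag is set; keep the rest unchanged."""
    return [placeholder if flagged else image
            for image, flagged in zip(images, nsfw_flags)]

# If the word "figure" happened to trip the classifier on one output:
results = censor_outputs(["landscape", "landscape with figure"], [False, True])
# results == ["landscape", "[image removed by filter]"]
```

Under this design the user never learns *why* an image was flagged, which would explain the "no specific reasoning given" complaint at the top of the thread.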

For now, the only good free AI with freedom is Craiyon. No need to pay for credits, and the images aren't bad.
