---
license: cc-by-nc-4.0
language:
- en
library_name: allennlp
tags:
- CNN
- Admission
- Python
---
Welcome to Project Estallie, the official administration system of the United DeviantArt AIGC Art Alliance (UDAAA).
As UDAAA strives to keep everything open-source and transparent to its audience, we are hosting our administration model on Hugging Face so that individual users can compare their admission results with the model's output.
Project Estallie is a Convolutional Neural Network trained on 3000+ images from our UDAAA group galleries. It automatically classifies and tags images into the following categories:
- Safe
- R-Rated
- NSFW
- Unethical (Rejection)

It also classifies the type of detail in each image.
We will not share the dataset with any third party. However, since AIGC works do not hold proper copyright, we use our group gallery as the training set. We also acknowledge your rights: we will not use your work in any generative model, nor sell or claim it.
Okay, now let's get to the questions:
Q: How can I download the model?
A: You can download it directly from the 'Files' list; the file is named "NSFW_Classifier.h5". Place it in the same directory as Estallie-Inference.py and run the script.
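For reference, here is a minimal inference sketch. It assumes a Keras/TensorFlow model, a 224x224 RGB input, 0-1 rescaling, and a placeholder image path ("example.png"); the actual preprocessing and class order are defined in Estallie-Inference.py, so treat this only as a starting point.

```python
# Minimal inference sketch (input size, rescaling, and class order are assumptions;
# check Estallie-Inference.py for the exact preprocessing used during training).
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("NSFW_Classifier.h5")

img = tf.keras.utils.load_img("example.png", target_size=(224, 224))  # hypothetical image path
x = tf.keras.utils.img_to_array(img) / 255.0
x = np.expand_dims(x, axis=0)  # shape (1, 224, 224, 3)

scores = model.predict(x)[0]
print("Predicted class index:", int(np.argmax(scores)), "scores:", scores)
```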
Q: How can I train my own model?
A: Open Estallie-trainer.py, edit the Train and Validation folder paths and the model name, then run the program. Remember that the default batch size is 4, so you should have more than 4 images per class. Also note that only two classes are available in this basic model.
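If you want an idea of what such a trainer roughly looks like, here is a minimal sketch under the same assumptions (two classes, batch size 4, folders named "Train" and "Validation", 224x224 inputs). The layer stack here is a simple stand-in, not the actual architecture in Estallie-trainer.py.

```python
# Minimal training sketch matching the description above (batch size 4, two classes).
# Folder names, image size, and the layer stack are assumptions; adjust them to
# whatever Estallie-trainer.py actually uses.
import tensorflow as tf

BATCH_SIZE = 4
IMG_SIZE = (224, 224)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "Train", image_size=IMG_SIZE, batch_size=BATCH_SIZE)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "Validation", image_size=IMG_SIZE, batch_size=BATCH_SIZE)

# A small CNN stand-in; the real architecture lives in Estallie-trainer.py.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),  # two classes, as noted above
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
model.save("NSFW_Classifier.h5")
```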
Q: How do you interpret images that are neither simply NSFW nor SFW, such as Unethical, etc.?
A: Good point. For now we use separate models for those interpretations. We also have YOLOv5 and DeepDanbooru, but it's not as much fun when they are already trained for you. We have a backup program for automatically classifying subclasses, the "1c3a.oy" file, and we are working to complete it.
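To illustrate the two-stage idea (the main classifier screens the image first, a separate subclass model refines the label afterwards), here is a hypothetical pipeline sketch. The subclass model file name, class order, and input size are placeholders for illustration only, since the real subclass program ("1c3a.oy") is still being completed.

```python
# Hypothetical two-stage pipeline sketch. "Subclass_Classifier.h5", the class
# order, and the 224x224 input size are placeholders, not shipped files.
import numpy as np
import tensorflow as tf

main_model = tf.keras.models.load_model("NSFW_Classifier.h5")
# subclass_model = tf.keras.models.load_model("Subclass_Classifier.h5")  # hypothetical

def classify(image_path):
    img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
    x = np.expand_dims(tf.keras.utils.img_to_array(img) / 255.0, axis=0)
    main_scores = main_model.predict(x)[0]
    label = ["SFW", "NSFW"][int(np.argmax(main_scores))]  # assumed class order
    # A second model would run here to assign Safe / R-Rated / Unethical subclasses.
    return label
```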