MyprinterGoBrrrr (@Iammannwell): Software is like sex: it's better when it's free.

Iammannwell's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Homophobia
Matched Tweets
MyprinterGoBrrrr @Iammannwell
@rightsofrefusal @sesameseed_oil I have this blue and it’s fire broo you should definitely grab at least 1.


Use the Copy image icon, and the object-detection annotation toggles where present, to grab the toxic content. Some of these guesses will be wrong; computer vision is hard!

Image Guesses - 12
Object Detector 20221101171408 (everything); Classifier 20221028131014

Image 1: critters 58%, punisher 66%
Image 2: firearm 99%
Image 3: firearm 99%, firearm 94%, firearm 93%
Image 4: firearm 99%, firearm 92%, firearm 74%
Image 5: firearm 99%
Image 6: firearm 100%
Image 7: firearm 99%
Image 8: firearm 99%
Image 9: firearm 98%
Image 10: art_other 98%
Image 11: etc 46%
Image 12: art_other 91%
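
The guesses above are just label/confidence pairs per image, and the caveat earlier ("some of these guesses will be wrong") suggests thresholding them before treating any label as a finding. Below is a minimal sketch in Python of that kind of post-processing, assuming a plain in-memory copy of the list above; the data layout and the 0.90 threshold are illustrative assumptions, not something the original page defines.

```python
# Minimal sketch: keep only high-confidence classifier guesses per image.
# The dict layout and the 0.90 threshold are illustrative assumptions,
# not part of the original page.

from typing import Dict, List, Tuple

# label/confidence pairs, one list per image (values copied from the list above)
IMAGE_GUESSES: Dict[int, List[Tuple[str, float]]] = {
    1: [("critters", 0.58), ("punisher", 0.66)],
    2: [("firearm", 0.99)],
    3: [("firearm", 0.99), ("firearm", 0.94), ("firearm", 0.93)],
    4: [("firearm", 0.99), ("firearm", 0.92), ("firearm", 0.74)],
    5: [("firearm", 0.99)],
    6: [("firearm", 1.00)],
    7: [("firearm", 0.99)],
    8: [("firearm", 0.99)],
    9: [("firearm", 0.98)],
    10: [("art_other", 0.98)],
    11: [("etc", 0.46)],
    12: [("art_other", 0.91)],
}


def confident_labels(guesses: List[Tuple[str, float]], threshold: float = 0.90) -> List[str]:
    """Return only the labels whose confidence meets the threshold."""
    return [label for label, conf in guesses if conf >= threshold]


if __name__ == "__main__":
    for image_id, guesses in IMAGE_GUESSES.items():
        labels = confident_labels(guesses)
        status = ", ".join(labels) if labels else "no confident guess"
        print(f"image {image_id}: {status}")
```

With a 0.90 cutoff, the low-confidence guesses (e.g. "critters 58%" and "etc 46%") drop out and only the near-certain labels remain; raising or lowering the threshold trades missed detections against false positives, which is the usual tuning decision for classifier output like this.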