Virophagy ✳️🩸🌿 (@VaxSideEffects): Nutraceutical research, Ozarks

VaxSideEffects' neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers
Matched Tweets


Virophagy ✳️🩸🌿 @VaxSideEffects
@Obliquities just don't create an amoral ai and you should be fine 😂


Virophagy ✳️🩸🌿 @VaxSideEffects
@Project_Veritas @pfizer Same energy


Image Guesses - 11

Use the Copy image icon, and the object detection annotation toggles where present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!

Each image was run through Object Detector 20221101171408 (everything) and Classifier 20221028131014. The classifier's guesses for the 11 images:

1. stars and bars 94%
2. etc 65%; militarized 59%
3. militarized 64%; art_other 28%; gigachad 82%; gigachad 70%
4. militarized 26%
5. militarized 90%
6. art_other 89%
7. art_other 99%
8. tldr 100%
9. etc 99%
10. tldr 99%
11. tldr 99%
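
The classifier output above is just labels with confidence percentages per image. As a rough illustration only, here is a minimal Python sketch of how those guesses could be tallied against a confidence threshold; the list-of-pairs data structure and the 0.5 cutoff are assumptions for the example, not the monitoring tool's actual export format or API.

```python
from collections import Counter

# Classifier 20221028131014 guesses transcribed from the listing above.
# Each inner list holds (label, confidence) pairs for one of the 11 images.
# The structure and the 0.5 threshold are illustrative assumptions, not the
# tool's real output format.
IMAGE_GUESSES = [
    [("stars and bars", 0.94)],
    [("etc", 0.65), ("militarized", 0.59)],
    [("militarized", 0.64), ("art_other", 0.28), ("gigachad", 0.82), ("gigachad", 0.70)],
    [("militarized", 0.26)],
    [("militarized", 0.90)],
    [("art_other", 0.89)],
    [("art_other", 0.99)],
    [("tldr", 1.00)],
    [("etc", 0.99)],
    [("tldr", 0.99)],
    [("tldr", 0.99)],
]

def confident_labels(guesses, threshold=0.5):
    """Count how many images carry each label at or above the confidence threshold."""
    counts = Counter()
    for image in guesses:
        for label, confidence in image:
            if confidence >= threshold:
                counts[label] += 1
    return counts

if __name__ == "__main__":
    for label, n in confident_labels(IMAGE_GUESSES).most_common():
        print(f"{label}: {n} image(s)")
```

With the assumed 0.5 cutoff, low-confidence guesses such as "militarized 26%" drop out, which is one simple way to read a listing like this while keeping the caveat above in mind: the guesses can be wrong.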