@putzfimel Eric Ciaramella whatever

putzfimel's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers, Proud Boys, Patriot Front
Matched Tweets
Eric Ciaramella @putzfimel
@boofyre @IliaOrthodox no gay lifestyle before modern medicine was unviable.

Eric Ciaramella @putzfimel
Quoted tweet unavailable

Eric Ciaramella @putzfimel
he/him
Quoted tweet unavailable

Eric Ciaramella @putzfimel
@MattTho31313339 @TomRtweets this is what happens to puppets who "prepare" https://twitter.com/putzfimel/status/1633430594859450368/video/1

Eric Ciaramella @putzfimel
@TheClintGarner @curious6788 @aaroncarter you seem angry

Image Guesses - 14

Use the Copy image icon and the object detection annotation toggles, if present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!

Models: Object Detector 20221101171408 (everything), Classifier 20221028131014

swastika 74%
tldr 99%
anime 99%
toons other 99%
tldr 45%
skull mask 28%
tldr 80%
tldr 64%
etc 94%
hitler 29%
us flag 82%
face 86%
art_other 54%
art_other 55%, toons other 40%
skull 61%, etc 27%