@D4M_EN Damien #IStandWithHateSpeech https://t.co/Igr2DVHzkg Delaware

D4M_EN's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Asian Hate, Groypers
Matched Tweets / Image Guesses: 11
@MyBackupNinja @sameerbacha94 @Kronykal @taresamler @MrBeagleman @MyWhiteNinja_ @plout0n @MyWhiteNinja @scrowder @wigger

@cnustfupls @Southernmess_85 @gr86sweetheart The hotter one is the only one I get, the rest are just vinegar with flavor.

Do people actually fall for this? It must work because they wouldn't do it otherwise..

@D4M_EN: I was banned instantly for 12 hours for saying something. I mean within seconds of the word "die" the algorithm found me and deemed it inappropriate.

I need this in my life. It's not that expensive either but I can't afford it because alcohol is more important. My birthday is August 30th

For the people who are crying about baby formula. #BabyFormulaShortage #babyformula

@shadowpatriot81 @MarcLobliner Only a faggot would bring this up as a way to insult. It's okay dude the liberals will accept you no matter what.

Automated image guesses follow. Some of these guesses will be wrong; computer vision is hard!
Image guesses (Object Detector 20221101171408, Classifier 20221028131014); the annotated images themselves were not preserved, only the labels and confidence scores:

- hitler 68%
- hitler 29%, pepe 88%
- mspaint style 48%, pepe 36%, firearm 90%
- art_other 54%, etc 35%
- art_other 82%
- etc 70%
- tldr 99%
- map 92%
- etc 99%
- toons other 88%
- toons other 55%
- tldr 99%