CulturalDarkne1's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers, Proud Boys, Patriot Front
Matched Tweets / Image Guesses: 10
the crow @CulturalDarkne1
@w_terrence


the crow @CulturalDarkne1
@Cernovich I’m sure this is all coincidence too


the crow @CulturalDarkne1
@JesseKellyDC


the crow @CulturalDarkne1
@ViralNewsNYC


the crow @CulturalDarkne1
@ViralNewsNYC me hear for boost 7 pleez. need now and good helth no leave me.


the crow @CulturalDarkne1
@mask_bastard


the crow @CulturalDarkne1
@9sami98 @Breaking911


the crow @CulturalDarkne1
@OccupyDemocrats Do these dummies asking for ‘sources’ not have eyes? The source is right in front of you! I’m disturbed that Twitter isn’t doing more to combat such disinformation and allowing photoshop-truther conspiracy theories to fester in online safe-spaces.


I don’t want to live around all these people. It seems like they don’t have my best interests in mind. https://t.co/EMG1eAUBF5
the crow @CulturalDarkne1
Minorities along with a couple chosenites...what a surprise.


View @Lady_Astor

Use the Copy image icon, and the object detection annotation toggles where present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!

Object Detector 20221101171408 (everything)
Classifier 20221028131014

Classifier guesses for the 10 attached images (images not captured in this export):

1. militarized 58%
2. anime 53%, mspaint style 35%
3. anime 52%, mspaint style 36%
4. joker 43%
5. face 42%, militarized 30%
6. etc 71%
7. tldr 100%
8. mspaint style 88%
9. mspaint style 88%
10. mspaint style 99%