Alex franklin (@LennyLampshades). Bio: Headstrong, I'll take you on. Location: Michigan.

LennyLampshades' neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers
Matched Tweets
Alex franklin @LennyLampshades
@OrwellNGoode Alrighty then

Alex franklin @LennyLampshades
@NGunderland @xctlot Bas Rutten has the best daughter, but is unquestionably the most dangerous man there

Alex franklin @LennyLampshades
@MuIIetFan5 IT WAS ME ALL ALONG LAURA

Alex franklin @LennyLampshades
DON’T GIVE YOUR REAL PRONOUNS https://t.co/cyziS4edfk
Quoted tweet unavailable

Alex franklin @LennyLampshades
@SaucercrabZero >Half of the ANARCHISTS citing an unwillingness to break gun laws

Alex franklin @LennyLampshades
@Electrvm: @NoHighwayOption @Pozzidonius1 Batman Returns lookin ass mf

Image Guesses - 7

Use the Copy image icon and, if present, the object detection annotation toggles to grab toxic content. Some of these guesses will be wrong; computer vision is hard!
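The guesses below come from the site's in-house object detector and classifier, which are not public. As a rough sketch of how this kind of "label NN%" output can be produced, the following Python example runs off-the-shelf Hugging Face pipelines over a local image; the model names and the example file path are stand-ins for illustration, not the site's actual setup.

from PIL import Image
from transformers import pipeline

# Stand-in public models; the page's own "Object Detector 20221101171408" and
# "Classifier 20221028131014" are custom and not available.
classifier = pipeline("image-classification", model="google/vit-base-patch16-224")
detector = pipeline("object-detection", model="facebook/detr-resnet-50")

def guess_image(path):
    image = Image.open(path).convert("RGB")
    # Classifier guesses, printed like the chips below (e.g. "joker 76%").
    for guess in classifier(image, top_k=3):
        print(f"{guess['label']} {round(guess['score'] * 100)}%")
    # Detections carry a label, a score, and a bounding box that an
    # annotation overlay could draw before copying the image.
    for detection in detector(image):
        print(detection["label"], f"{round(detection['score'] * 100)}%", detection["box"])

guess_image("example.jpg")  # hypothetical local file

A model trained on the page's own label set would yield guesses like the ones listed below.
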
gigachad 84%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
critters 99%
joker 76%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
art_other 99%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
etc 61%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
hitler 29%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
face 84%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
tldr 99%

Object Detector 20221101171408 (everything)
Classifier 20221028131014
hitler 29%