Sirabhe (@sirabhe) · Unhinged tweetcel 🇪🇬

sirabhe's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers
Matched Tweets
Sirabhe @sirabhe
Soon as that niqqa stepped forward to the kid the dad should've squared up
Tom Hennessy @Tomhennessey69
A black verbally assaults a White family with a small child. Attempting to engage in a fight for unknown reasons. Subways and all forms of public transport are no-go zones, this family learned the hard way. #getawayfromthem https://twitter.com/Tomhennessey69/status/1638017933929508864/video/1


Sirabhe @sirabhe
Niggas going full dear diary mode on linkedin 🤢


Image Guesses - 9

Use the Copy image icon, and the object detection annotation toggles if present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!

All nine images were run through Object Detector 20221101171408 (everything) and Classifier 20221028131014. Top classifier guess per image:

swastika 76%
anime 98%
mspaint style 99%
tldr 94%
tldr 99%
tldr 100%
tldr 99%
art_other 75%
tldr 99%