Oliver Yoxall (@YoxallOliver) | Clown World

YoxallOliver's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers, Proud Boys
Matched Tweets | Image Guesses (5)

Oliver Yoxall @YoxallOliver
@LegendaryEnergy ‘You’re awful Jimmay. Playing my video, you just wanted to make fun of me’

TheEndOfEverything @EternalEnglish
What hellish vision is this? https://t.co/8BhSTYVfv9
Oliver Yoxall @YoxallOliver
Guy behind him is a (((pornographer))) called Richard Desmond

Way of the World @wayotworld
**NEW VIDEO** Early @bitchute release! WotW offers some belated thoughts on Joker and notes some encouraging signs in the mainstream media. Please share! https://t.co/RNDuRyoPQB
Oliver Yoxall @YoxallOliver
@bitchute Todd Phillips was on a podcast with Michael Moore recently. He spoke about how the movie is an indictment of the loss of compassion in Drumpfs America & boasted about how Hollywood and Joos have played a leading role in ‘shifting public attitudes’ towards liberalism & tolerance

Oliver Yoxall @YoxallOliver
@Urbanus_Crusade Is there an ((echo)) in here?

Use the Copy image icon, and the object-detection annotation toggles where present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!
Classifier 20221028131014 guesses (annotations from Object Detector 20221101171408, set to "everything"):

- joker 81%
- joker 60%
- militarized 99%
- militarized 53%, critters 38%
- etc 61%
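
The percentages above are classifier confidence scores. As an illustrative sketch only (the site's own models, "Classifier 20221028131014" and "Object Detector 20221101171408", and their custom label set such as "joker" and "militarized" are not shown here; the pretrained model, preprocessing, and guess_labels helper below are assumptions), this is roughly how label/confidence pairs like these are produced with an off-the-shelf image classifier:

    import torch
    from PIL import Image
    from torchvision import models

    # Stand-in for the page's classifier: a pretrained ImageNet ResNet-50.
    # The real model and its label vocabulary ("joker", "militarized", ...)
    # are custom and not reproduced here.
    weights = models.ResNet50_Weights.DEFAULT
    model = models.resnet50(weights=weights).eval()
    preprocess = weights.transforms()

    def guess_labels(image_path, top_k=2):
        # Return (label, confidence%) pairs, e.g. [("jack-o'-lantern", 81), ...]
        img = Image.open(image_path).convert("RGB")
        batch = preprocess(img).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(model(batch)[0], dim=0)
        conf, idx = probs.topk(top_k)
        return [(weights.meta["categories"][i], round(100 * c.item()))
                for i, c in zip(idx, conf)]

A display like the one above would presumably show only guesses over some confidence threshold, which may be why some images list one label and others two.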