@_neon_droid Carlos C 📍🇲🇽 | Director of Dexy LatAm | Business | Gastronomy | Culture | Mexico

_neon_droid's neighborhood:

Slurs & ToS Breaches, Nazi Dogwhistles, Antisemitism, Transphobia, Anti-Blackness, Homophobia, Misogyny, Groypers, Proud Boys, Patriot Front
Matched Tweets
Image Guesses - 9
Carlos C @_neon_droid
@Vic_S83 @ryanmjohnson


Carlos C @_neon_droid
Quoted tweet unavailable


Carlos C @_neon_droid
@PGeneracija @Tom_Rowsell Lol


Carlos C @_neon_droid
@PonchoGutz And yes


Carlos C @_neon_droid
Quoted tweet unavailable


Carlos C @_neon_droid
@marciooop In the US, more people die from stabbings or beatings than from rifle shootings. Nowadays it's pure alarmism, slacktivism, and zero research or neutrality.


Carlos C @_neon_droid
@JackBouroudjian



Use the Copy image icon, and the object detection annotation toggles if present, to grab toxic content. Some of these guesses will be wrong; computer vision is hard!


Object Detector 20221101171408 (everything)
Classifier 20221028131014
gadsden snake 99%
militarized 63%


Object Detector 20221101171408 (everything)
Classifier 20221028131014
critters 80%


Object Detector 20221101171408 (everything)
Classifier 20221028131014
art_other 99%


Object Detector 20221101171408 (everything)
Classifier 20221028131014
art_other 99%


Object Detector 20221101171408 (everything)
Classifier 20221028131014


Object Detector 20221101171408 (everything)
Classifier 20221028131014
map 98%


Object Detector 20221101171408 (everything)
Classifier 20221028131014


Object Detector 20221101171408 (everything)
Classifier 20221028131014
tldr 63%


Object Detector 20221101171408 (everything)
Classifier 20221028131014
tldr 91%