#the youtube algorithm has previously been shown to push far right and conspiracy content but that was proven through proper study
notforyov · 2 years
Expanding on the previous post, the biggest flaw I see is in how the experiment was set up. Creating a fresh account and immediately following specific creators whose content you're looking for builds in a confirmation bias from the start. To show that the algorithm aids radicalization, you'd have to start completely fresh and meander your way through the site for far longer than a couple of hours' worth of content. You'd have to behave like an actual person, not a bot seeking out specific content or beliefs.
"I'm trying to prove that this platform pushes Xphobic content, so I made a new account and followed exclusively Xphobic creators to see how the algorithm would radicalize me" you already showed the algorithm your account had been radicalized!!!!!!