cloudysfluffs · 1 month
Note
Don’t listen to haters, everything ever spread about Vivzie was disproven. Your art is cute.
LMAOOOOOOOO NO IT WASNT????????!!??!?!?
#WEIRD take man#first of all there are so many accusations about viv this is so unspecific#also. no they havent?!?!?!?!?!?!?!?!? ive seen so much proof. i see more every single day#i mean thank you. for the compliment.#but being critical about media (even media you enjoy) is a good thing.#its important to unpack how the creators beliefs influence the work they produce#disc horse#this is the first thing i saw when i woke up today and it baffled me so much that i couldnt sleep more like i planned lol#anyway. im not saying anyone cant enjoy the show(s). obviously i do A LITTLE if im making fanart#im not saying you have to drop a media if its creators are problematic. in fact i dont like that take#just remember you are not immune to propaganda and vivzies rac/ist/anti/semitic opinions are very much influencing these characters writing#and things like her (SELF ADMITTED) ra/pe fet/ish arent helping.#sorry. this is a rant ive been wanting to say for a while but have never got to lol#im just so confused by what this person even meant??? some of the bad shit shes done is IN THE SHOW. its in there#you can see it. with your eyes . help#anyway again this is literally the first thing i saw when i woke up LMAO if i completely misinterpreted this ask lemme know#the assumption that ive just taken the word of a few ''haters'' and havent done my own research into this topic is kind of insulting#what did you expect me to say....??? did you think id just be like 'oh ok :3' and blindly retract all negative statements
7 notes
vopium · 4 years
Text
A Framework for a Healthier YouTube
by Andrew Cherry
YouTube, the video-sharing platform owned by Google that we all rely on for one reason or another, is routinely cited as one of the most-visited websites in the world. There are few other places where you can get daily long-form vlogs, makeup demonstrations, cooking tutorials, cat video compilations, and television show breakdowns while being a click away from intense political analysis or video tutorials of open heart surgery. Part of the draw of this goliath website is the ever-expanding quantity and variety of videos available to the consumer for “free.” I put “free” in quotes because while users don’t have to pay money to access video content, there are major drawbacks to the site that end up negatively impacting both content creators and the general viewing public. My goal here, as an active user of YouTube, is to provide a set of recommendations that I believe would push back against some of the more harmful and toxic effects of the site. I hope not to change the fundamental structure of the site or stifle people’s creativity; rather, I hope these changes would make the site safer for everybody and work to ensure healthy discourse both online and in real life.
Re-Design the YouTube Algorithm
One of YouTube’s biggest issues is the way its video recommendation algorithm steers viewers toward increasingly extreme videos, in what some call the Alt-Right Pipeline. While the site has evolved to become much more than just a company in a lot of ways, it still strives to make money, and thus the algorithm suggests similar but more extreme videos knowing that this gradual escalation will keep people actively engaged and generating more ad revenue. This Vox article does a good job of laying out the issue, wherein people start watching conservative YouTubers like Ben Shapiro, who himself has explicitly disavowed the alt-right, but YouTube’s algorithm then continues to direct these viewers to further-right videos and cements their radicalization to the far right over time. While I do see the irony of linking to a YouTube video here to drive my point home, this video titled “The Alt-Right Playbook: How to Radicalize a Normie” takes a pretty comprehensive, step-by-step approach to demonstrating how this radicalization can happen online.
The Vox article above notes how persuasive these extreme YouTube videos can be and quotes sociologist Zeynep Tufekci as saying that “given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.” Given the long-term and deeply alarming implications of that quote, it’s necessary to unpack why the site wields such influence in the minds of its users. I would argue, in terms of Cialdini’s Six Principles of Persuasion, that authority and social validation are the two principles that contribute most to YouTube’s persuasiveness. With respect to his concept of authority, I argue that YouTubers like Ben Shapiro, Steven Crowder, and Joe Rogan become expert authority figures in the minds of their frequent viewers, which then makes these viewers more inclined to check out the videos produced by frequent guests on these shows. These guests, however, are often content creators even further to the right than the original hosts, which compounds with the perceived authority of the YouTube algorithm to push people down the Alt-Right Pipeline even faster.
Furthermore, I argue that YouTube ends up playing on people’s need for social validation to encourage them to stay on the site longer. This principle explains that as you see people around you doing something, you are more likely to also do that thing in order to fit in. In a time when people around my age are often incredibly anxious and feel isolated from the world at large, it can be more common for people to latch onto social groups where they feel somewhat understood. One such space is the alt-right comment sections on YouTube, or the followings of alt-right YouTubers on Twitter. By deriving their social validation from these circles and delving deeper into that online bubble, people are more likely to follow the examples set by these online agents, regurgitate their toxic talking points, and influence more people to go down that path.
In order to combat this issue, I recommend that YouTube take a serious look at how their algorithm is designed and make changes accordingly. It should no longer be the sole motive of the algorithm to keep people engaged to drive ad revenue. There should be a team of human moderators involved who understand the dangerous nature of online radicalization and work to prevent it from happening. This is not to say that conservative thinkers should be censored online, but rather to say that YouTube has an outsized influence on modern culture and should be aware of the role their site has played in disseminating fascist ideologies as simply free speech from right wing thinkers.
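To make the trade-off concrete, here is a toy Python sketch of how adding a second, human-set objective to an engagement-only ranker changes which videos surface. The field names, scores, and penalty weight are entirely hypothetical assumptions for illustration and do not reflect YouTube's actual system.

```python
def rank_videos(videos, extremeness_penalty=0.0):
    """Rank candidates by predicted engagement, minus an optional
    policy-set penalty on content flagged as escalating/extreme."""
    def score(v):
        return v["predicted_watch_minutes"] - extremeness_penalty * v["extremeness"]
    return sorted(videos, key=score, reverse=True)

# Hypothetical candidate pool: more extreme content predicts more watch time.
candidates = [
    {"title": "Mainstream commentary",   "predicted_watch_minutes": 8.0,  "extremeness": 0.1},
    {"title": "Edgier reaction video",   "predicted_watch_minutes": 9.5,  "extremeness": 0.6},
    {"title": "Fringe conspiracy video", "predicted_watch_minutes": 11.0, "extremeness": 0.9},
]

# Engagement-only ranking surfaces the most extreme video first:
print([v["title"] for v in rank_videos(candidates)])
# ['Fringe conspiracy video', 'Edgier reaction video', 'Mainstream commentary']

# A penalty term set by human moderation policy reorders the feed:
print([v["title"] for v in rank_videos(candidates, extremeness_penalty=6.0)])
# ['Mainstream commentary', 'Edgier reaction video', 'Fringe conspiracy video']
```

The point of the sketch is that nothing about engagement-driven ranking is inevitable; the objective function is a design choice, and a second term controlled by human moderators can be tuned without censoring any video outright.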
Make Monetization Practices More Transparent
In the past few years, YouTube has faced extensive criticism for the ways in which it controls monetization practices on the site. Content creators who upload videos sometimes come to find that their content has been age-gated, hidden, or simply demonetized. For people who rely on YouTube as their primary source of income, demonetization of popular videos means significantly less money coming in, and they are often not provided with any explanation from the company as to why a video was demonetized. This article on The Verge highlights a related problem: allegations that YouTube was automatically demonetizing videos from LGBTQ+ creators simply because of their identities. While the company denies these claims, YouTube has also been accused of showing anti-gay ads before LGBTQ+ videos, further contributing to the belief that the site says it supports these creators but then acts rather differently. It is important to note as well that even if a video is remonetized after YouTube reviewers check it out, the creator is not reimbursed for the revenue lost in the interim, and these practices can work to hide valuable information from the marginalized people who may need it most.
Thus, I argue that YouTube needs to make its monetization practices more transparent and provide YouTubers with more in-depth responses after demonetization happens that explain why it happened and how to quickly appeal if they feel it was unjust. If the company needs to hire more reviewers to engage personally with videos and decide more quickly whether they should be monetized, then so be it; the site surely has the money to do so. Not only would this help retain active YouTubers who are starting to feel sidelined, it would help bring in more users, because as authors Kraut and Resnick explain on page 199 of Building Successful Online Communities, “Providing potential new members with an accurate and complete picture of what the members’ experience will be once they join increases the fit of those who join.”
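As a sketch of what "more transparent" could look like in practice, here is a hypothetical structured demonetization notice. Every field name, reason code, and deadline below is invented for illustration; this is not an actual YouTube API, just the kind of information a creator would need to understand and appeal a decision.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DemonetizationNotice:
    """Hypothetical notice sent to a creator when a video is demonetized."""
    video_id: str
    reason_code: str            # e.g. "ADVERTISER_UNFRIENDLY_LANGUAGE" (invented)
    policy_excerpt: str         # the exact policy text the video allegedly tripped
    flagged_timestamps: list    # where in the video the issue occurs
    reviewed_by_human: bool     # automated flag vs. human decision
    appeal_deadline_days: int = 30
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def summary(self):
        reviewer = "human reviewer" if self.reviewed_by_human else "automated system"
        return (f"Video {self.video_id} was demonetized by a {reviewer} "
                f"for {self.reason_code}; appeal within {self.appeal_deadline_days} days.")

notice = DemonetizationNotice(
    video_id="abc123",
    reason_code="ADVERTISER_UNFRIENDLY_LANGUAGE",
    policy_excerpt="Content with repeated strong profanity may receive limited or no ads.",
    flagged_timestamps=["02:14", "07:51"],
    reviewed_by_human=False,
)
print(notice.summary())
```

Even a minimal schema like this would tell a creator which policy they tripped, where in the video, whether a human ever looked at it, and how long they have to appeal, which is far more than an unexplained yellow icon conveys today.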
Increase Child Protections and Age Restrictions
As mentioned above, setting 18+ age restrictions on LGBTQ+ content simply because of the creator’s identity is harmful because it prevents young, questioning individuals from viewing potentially validating and reassuring information that they could not get elsewhere. On the other hand, there needs to be a larger effort by the site to ensure that the videos allowed on the family- and kid-friendly side of YouTube are actually safe for children to watch. Last year, Wired reported finding videos “containing violence against child characters, age-inappropriate sexualisation, Paw Patrol characters attempting suicide and Peppa Pig being tricked into eating bacon,” discovered by following YouTube’s recommended section or simply letting the site’s autoplay function do its job.
These horrifying videos would be scarring even to an adult, and I cannot imagine the long-term psychological damage they could inflict upon children without their parents even being aware of what is happening. It is imperative, then, for YouTube to do a better job of ensuring that the content available to children is safe for their eyes by improving the age restriction settings and increasing human involvement in the scanning of these videos for child protection. This will have to be done carefully, however, because subjecting people to these types of videos all day long would likely have lasting negative effects on their mental health. I would recommend ensuring that the workers have proper mental health services to support them and allowing workers to rotate in and out of this type of moderation position.
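The rotation idea can be sketched as a simple scheduling rule: cap the number of consecutive days any moderator spends on the hardest review queue before moving them to lighter work. The queue names and the three-day limit below are illustrative assumptions, not a real policy.

```python
from collections import defaultdict

MAX_CONSECUTIVE_HARD_DAYS = 3  # illustrative limit, not an actual policy

def assign_queues(moderators, streaks):
    """Assign each moderator to the 'child_safety' or 'general' queue,
    rotating out anyone who has hit the consecutive-day limit."""
    assignments = {}
    for mod in moderators:
        if streaks[mod] >= MAX_CONSECUTIVE_HARD_DAYS:
            assignments[mod] = "general"      # rotate out to lighter work
            streaks[mod] = 0                  # reset their streak
        else:
            assignments[mod] = "child_safety"
            streaks[mod] += 1
    return assignments

# alice has already done three straight days on the hard queue; bob has done one.
streaks = defaultdict(int, {"alice": 3, "bob": 1})
print(assign_queues(["alice", "bob"], streaks))
# {'alice': 'general', 'bob': 'child_safety'}
```

A real system would pair a rule like this with counseling access and workload limits, but even this toy version shows the scheduling constraint is cheap to enforce in software.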
I am not naïve enough to think that these are simple solutions to such complicated and all-encompassing problems. Alt-right fascists will not simply disappear if we better regulate YouTube content and fix the radicalizing aspects of the recommendation algorithm. Certain videos will most likely always get demonetized or age-restricted even when their creators don’t think it’s necessary. It can be hard to catch every single disturbing video when over 500 hours of content are uploaded to YouTube every minute. But it is my hope that these recommendations are taken seriously and, at the least, start a conversation about how the site can do better, and at most, serve as a framework for how YouTube can become a better, healthier site for all parties.
35 notes