How aspiring influencers are forced to fight the algorithm

There are two ways to try to understand the impact of content moderation and the algorithms that enforce those rules: by relying on what the platform says, and by asking creators themselves. In Tyler’s case, TikTok apologized and blamed an automatic filter that was set up to flag words associated with hate speech—but that was apparently unable to understand context. 

Brooke Erin Duffy, an associate professor at Cornell University, teamed up with graduate student Colten Meisner to interview 30 creators on TikTok, Instagram, Twitch, YouTube, and Twitter around the time Tyler’s video went viral. They wanted to know how creators, particularly those from marginalized groups, navigate the algorithms and moderation practices of the platforms they use. 

What they found: Creators invest a lot of labor in understanding the algorithms that shape their experiences and relationships on these platforms. Because many creators use multiple platforms, they must learn the hidden rules for each one. Some creators adapt their entire approach to producing and promoting content in response to the algorithmic and moderation biases they encounter. 

Below is our conversation with Duffy about her forthcoming research (edited and condensed for clarity). 

Creators have long discussed how algorithms and moderation affect their visibility on the platforms that made them famous. So what most surprised you while doing these interviews? 

We had a sense that creators’ experiences are shaped by their understanding of the algorithm, but after doing the interviews, we really started to see how profound [this impact] is in their everyday lives and work … the amount of time, energy, and attention they devote to learning about these algorithms, investing in them. They have this critical awareness that these algorithms are uneven. Despite that, they’re still investing all of this energy in hopes of understanding them. It just really draws attention to the lopsided nature of the creator economy.

How often are creators thinking about the possibility of being censored or having their content not reach their audience because of algorithmic suppression or moderation practices? 

I think it fundamentally structures their content creation process and also their content promotion process. These algorithms change at whim; there’s no insight. There’s no direct communication from the platforms, in many cases. And this completely, fundamentally impacts not just your experience, but your income. 
