Clear Sky Science
When algorithms fail us: perceived algorithmic ineffectiveness, psychological reactance, and implicit personality as drivers of algorithm aversion behavior on short-form video platforms
Why this matters for your feed
Short video apps like TikTok and Douyin promise to show us exactly what we want, exactly when we want it. Yet many people still find themselves annoyed by their feeds, mistrustful of what they see, or constantly fighting the recommendations. This study asks a deceptively simple question with big consequences for our digital lives: what happens, psychologically, when people feel that the algorithm simply is not working for them?

When the feed feels off
The researchers focus on a key idea they call perceived algorithmic ineffectiveness: the sense that the platform keeps serving videos that are boring, unhelpful, or irrelevant. Instead of measuring how accurate the algorithm really is, they look at how accurate it feels to users. When people judge recommended clips as not worth remembering, not meaningful, or not convincing, they are more likely to push back against the system itself. In other words, disappointment with the feed becomes the starting point for a broader resistance to algorithmic guidance.
From irritation to pushback
The next step is psychological reactance – the unpleasant feeling we get when we think our freedom is being limited. On Douyin and similar apps, the “For You” page decides what appears first, quietly steering attention. When this stream clashes with what users think they should be seeing, they can feel nudged, crowded, or even watched. The study shows that such moments of mismatch do more than irritate. They spark a sense that the app is trying to tell users what to watch, which in turn fuels anger, impatience, and the urge to do the opposite. This emotional backlash becomes a powerful driver of what the authors call algorithm aversion.

How people fight back against the feed
Algorithm aversion shows up in subtle but important ways. Instead of passively scrolling, users begin to avoid recommended clips, search manually, or rebuild their playlists by hand. Some try to “retrain” the system by skipping, blocking, or rapidly swiping through unwanted videos. Others disengage for stretches of time or treat the platform with a kind of weary cynicism: they keep using it, but with low trust and low expectations. Using survey data from 733 Douyin users, the study finds that the more ineffective people feel the algorithm is, the more psychological reactance they report – and the more strongly they enact these small acts of resistance.
Personality and mindset in the algorithm age
Not everyone responds to bad recommendations in the same way. The authors examine a trait called implicit personality, which captures whether people believe traits and abilities are fixed or changeable. Those with a "fixed" mindset tend to hold a stable, skeptical attitude toward algorithms, whether the recommendations work well or poorly. Those with a "growth" mindset are more sensitive: they respond positively when the system seems helpful, but react more sharply when it fails. The study shows that among these growth-minded users, perceived algorithmic ineffectiveness more strongly inflames psychological reactance, which in turn drives stronger algorithm aversion behaviors.
What platforms can do differently
These findings suggest that the problem is not only whether recommendation engines are technically accurate, but whether people feel heard and in control. When users experience the feed as a one-way street, even small missteps can snowball into lasting distrust and avoidance. The authors argue that platforms should give people clearer ways to understand and influence recommendations, create real feedback loops when users push back, and design controls that respect different mindsets. Put simply, when algorithms fail us – or merely feel like they are failing – people do not just shrug and scroll. They adapt, resist, and sometimes quietly turn away from the very systems meant to serve them.
Citation: Zeng, R., Zhu, D. & Evans, R. When algorithms fail us: perceived algorithmic ineffectiveness, psychological reactance, and implicit personality as drivers of algorithm aversion behavior on short-form video platforms. Humanit Soc Sci Commun 13, 266 (2026). https://doi.org/10.1057/s41599-026-06573-w
Keywords: algorithm aversion, short-form video platforms, personalized recommendations, psychological reactance, user autonomy