Why the FTC's Attempt to Ban Buying Fake Followers Won't Work
The FTC's attempt to take on the internet's Unreal Problem is much more complicated when you realize the rot starts from the core.
The internet is big. Really big. You just won’t believe how vastly, hugely, mind-bogglingly big it is. Have you ever tried figuring out what, exactly, everyone is talking about? What’s everyone listening to? What is everyone watching? That doesn’t even touch on figuring out why, or how that became today’s event. Everyone wants to know what’s popular, and it feels harder than it should.
One growing problem with popularity that the FTC has pursued is fraudulence. The FTC recently denounced a number of shoddy practices, including buying fake reviews from people with no actual knowledge of the product, suppressing negative reviews, and publishing reviews from company insiders without proper disclosure. My favorite provision is the ruling that it’s an unfair trade practice to buy followers, likes, and other engagement boosters that suggest a product is more popular than it is. I like to refer to this as the Unreal Problem.
The Unreal Problem builds on my philosophy that as algorithmic fragmentation distances us from one another, we wind up searching for experiences and media that reconnect us. We seek out actual popularity instead of suggested popularity. This creates a barbell of the implausibly big and the intimately small. The latter is where we spend most of our time: reading books, watching TV shows, and listening to artists that feel designed for us, even if only a few thousand other people like the same thing. The former is where we draw comfort in connectivity. Taylor Swift. The NFL. Christianity. The more popular something or someone is, the more likely we are to engage with it. Psychologist Daniel Kahneman sums it up best: our brains run on two pathways, one fast, intuitive, and emotional, the other slow, logical, and conscious. Given the opportunity, we are naturally inclined to go with the former. Top 10 lists, Top 40 rankings, and “what others are listening to” playlists all feed that preferred thought pattern. As the quantity of consumable media increases every year, so does the importance of latching onto an artist everyone knows. Consider this point from Vox:
“In 2020, there were more songs on Billboard’s Hot 100 than any year since the 1960s, the last decade when singles, and not albums, drove the recording industry. In 2019, 40,000 songs were uploaded daily to Spotify, according to Music Business Worldwide; in 2021, that number has grown to 60,000.”
The harder attention is to capture, the more deceptive the tactics people use to project success. Anyone who works in entertainment, and especially in the music industry, has plenty of experience with this behavior. Fan armies have adopted militaristic tactics to push their favorite singers and artists to the top of YouTube’s “Trending” list and Spotify’s rankings. Fan-driven inflation became such a problem that in 2018 Billboard was forced to change how its methodology defines a stream to better reflect authentic behavior, weighting streams from distributors that require payment more heavily than those from services like YouTube and the ad-supported version of Spotify.
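The logic of Billboard’s change can be sketched as a toy model. The tier names and weights below are hypothetical, chosen only to illustrate the idea that a paid-subscription stream counts for more than an easily botted free one; Billboard’s actual multipliers differ.

```python
# Toy model of weighted stream counting, loosely inspired by Billboard's
# 2018 methodology change. Weights are illustrative, not Billboard's.

STREAM_WEIGHTS = {
    "paid_subscription": 1.0,  # streams from paid tiers count fully
    "ad_supported": 0.5,       # free, ad-supported streams count less
}

def chart_units(streams: dict) -> float:
    """Convert raw stream counts per tier into weighted chart units."""
    return sum(STREAM_WEIGHTS.get(tier, 0.0) * count
               for tier, count in streams.items())

# A track racking up a million plays on free tiers (where fan armies and
# bots operate most cheaply) earns fewer chart units than a track with
# the same raw total from paid subscribers.
organic = chart_units({"paid_subscription": 1_000_000})
inflated = chart_units({"ad_supported": 1_000_000})
print(organic, inflated)  # 1000000.0 500000.0
```

The point of the weighting is not to detect fraud directly but to price it: inflating a chart position through free-tier plays now requires roughly twice the volume.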
We don’t often talk about how rapidly our brains adapted to the internet, mastering its behavior and etiquette, and how quickly we learned to bend that manipulable technology into telling whatever story we wanted. The higher the number of comments, likes, and streams, the more authoritative a person or product seems, and the more likely other people are to buy into it. Or, from the FTC’s perspective, the more likely they are to just buy it outright.
“Fake reviews not only waste people’s time and money, but also pollute the marketplace and divert business away from honest competitors,” Lina Khan, chairperson of the FTC, said in the ruling.
Unsurprisingly, the companies that have built billion-dollar enterprises on user reviews and comments — Amazon, Yelp, Etsy — love the ruling. Just as Google found its dominance by perfecting the ingestion of gigantic heaps of data and the surfacing of the most relevant, authoritative results, so did Amazon, Yelp, and Etsy. At first. That changed when people could buy higher-ranking results and preferred placements. It got worse when the general public discovered review bombing. Algorithms don’t register nuance, nor do they respond to context. Platforms that try to scale subjectivity through objective machine learning were never going to execute perfectly. Add in organized human deception and harmful behavior, and you get a product that people trust less and less.
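Review bombing exploits exactly that lack of context: a ranking system that reduces sentiment to a raw average can be swung by a small coordinated batch of reviews. A minimal sketch, with all numbers hypothetical:

```python
# Why a naive average rating is vulnerable to review bombing.
# All review counts and scores here are hypothetical.

def average_rating(ratings: list[int]) -> float:
    """A context-free score: the plain mean of all submitted ratings."""
    return sum(ratings) / len(ratings)

# 200 genuine reviews averaging 4.5 stars...
genuine = [4, 5] * 100

# ...plus a coordinated bomb of fifty 1-star reviews from throwaways.
bombed = genuine + [1] * 50

print(round(average_rating(genuine), 2))  # 4.5
print(round(average_rating(bombed), 2))   # 3.8
```

Fifty accounts, a fifth of the genuine review count, drag the product from “great” to “mediocre,” and the algorithm has no way to know whether those fifty reflect a real quality collapse or an organized grudge. That is the nuance-and-context gap in one arithmetic step.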
I was out walking with a friend the other day who mentioned a terrible matcha drink she bought at a new cafe in Brooklyn. It immediately made her sick. That alone is a non-story, but what made her comment so interesting to me was how she framed it. “The reviews on Yelp were so positive! Another reason you should never trust reviews on Yelp.” We know these platforms get shittier and shittier each day, but we use them anyway. The FTC ruling is designed to protect consumers, but it could arguably help some of these companies (even slightly), too.
Two very different realities make enforcing this ruling supremely challenging. The first is detection: discovering that people are buying fake followers or positive coverage is itself tricky. Nearly 25% of “influencers” purchased fake followers on Instagram in 2022, according to IMAI. A Statista report from 2021 found that nearly 50% of influencers across multiple platforms had purchased fake followers. The more followers these influencers have, the more likely they are to receive brand deals, and the more eCommerce deals platforms like Instagram can theoretically make. Instagram, unlike Google, Amazon, and Yelp, depends on raw attention, not authenticity.
The larger problem is asking consumers to evaluate the validity of the comments and accounts they come across. Some are easier to spot than others. We’ve all seen comments on Amazon or Yelp so clearly fake that we naturally strayed away from those products. Others, however, are less obvious by design. When we’re trying to sort valid attention from invalid attention while simultaneously searching for connection to the greater world — experiencing the things other people are experiencing, posting about the places other people are going — the question of whether something is real matters less and less compared to feeling like you’re part of something.
Think of the popularity of physical goods. We used to judge the popularity of a toy or a book by the number of weeks it was sold out at Toys ‘R’ Us or backlogged at the local library. Those gauges are harder to use as brick-and-mortar stores disappear because we’re shopping on Amazon or Temu. The number of physical retail stores that closed in 2023 surged 80% compared to 2022, with more than 4,600 stores in the United States closing by the end of the year, according to Coresight Research. eCommerce continued to expand in the same period, growing 7.6% over 2022. While that’s an understandable decline from the 42.9% growth seen between 2019 and 2020, when the pandemic hit, the behavioral shift remained irrevocable: consumers were spending more on physical goods, but no longer reliant on stores. When everything is online and accessible, the question becomes one of curation, of mass demand, of personal taste — all of which are heavily influenced by easily gamified platforms.
It’s this type of manipulation that has furthered the enshittification of our communal internet. Any form of vibrancy found inside our most-used apps is built on the rickety foundation of fake followers, fueled by the willful disbelief that all of it is just real enough. The problem isn’t that people can buy fake followers. What is a fake following in 2024 if not a full-page ad in a catalog 30 years ago? That magazine placement was bought with the same intention as those collections of usernames. What’s different, however, is our relationship to an obvious ad versus a person on the other end of the screen. We understood there was a team of marketers behind the magazine advertisement. We may ultimately comprehend that an influencer isn’t as famous as their numbers make them appear, but there’s still something about a person talking about or wearing something that makes us feel more intimately connected. It’s no longer just an ad with bought reviews from an account with bought followers — it’s a fleeting moment of connection. Feeling trumps logic.
The enshittification of the internet happened because platforms needed to scale to superserve advertisers and, in turn, their own bottom lines. Companies made it easier for anyone to create an account, for anyone to advertise, and for anyone to sell. They turned digital hangout spots into planet-sized warehouses. That need to feel like we belong to the same world, that we’re participating in the same thing, never went away. It only got worse the more we drifted apart from one another without any control over where we landed.
The FTC may try to dissuade people from buying fake followers and positive reviews, or from suppressing negative comments, but the darker truth is that people will put up with a lot of inherently harmful behavior if it makes them feel okay. A shoddy desk chair benefits no one, but feeling like you’re part of a trend, even one made up of bots replying to other bots in the comment section, is much harder to discourage. Isn’t that the real problem? Rhetorical question; of course it is. It’s also much, much harder to solve. Maybe the FTC can save someone from a bogus $30 toaster.