Better than a TikTok Ban (2024)

(Programming note: I have temporarily re-titled this newsletter. For now, we’re “Another Way.” I’ll take suggestions, since this doesn’t feel permanent. Whether it’s this or something else, I’ll have a final name once I fully launch this newsletter, which should happen in a matter of weeks. Thanks for reading!)

A friend of mine in college once joked that you never go to Denny’s (the cheap breakfast-all-day chain); you end up at Denny’s. Likewise for TikTok. One chooses to watch that new series or play that video game that just came out. But one ends up scrolling TikTok or Twitter.

Unlike Denny’s, TikTok has come under fire from Congress recently. Part of the reason the right wants to “ban” TikTok (Congress is not actually seeking a ban; the bill in question would force a sale) is that they think it’s brainwashing their kids into left-wing causes, queer identity, etc. Mitt Romney recently opined that Palestine was suspiciously popular as a topic on TikTok. While Congress has primarily spoken of the ban in national security terms relating to China, “anti-woke” politics also seems to be a key motivator.

Centrist Democrats have a version of this too, motivated by a combination of post-Russiagate conspiratorial attitudes and establishment elitism. Hillary Clinton recently complained of “propaganda” on TikTok. Romney’s aforementioned opinion came in a conversation with Secretary of State Antony Blinken. This is a digital version of the same thinking behind NYC mayor Eric Adams’ (and many others’) false claims of “outside agitators” involved in campus protests. Unable to conceive of or unwilling to acknowledge protesters as legitimate, the political establishment has dismissed a mass movement against genocide as app-driven brainwashing by vague yet nefarious outsiders.

The right and center are targeting a social media platform dominated by a younger and more progressive user base precisely because those users have effectively used the service to their political and cultural ends. It is not a coincidence that the infamous hate speech purveyor “Libs of TikTok” targets (or at least began by targeting) libs of TikTok. Legislators oppose TikTok for reasons at best alarmist— what, exactly, could China do to America using TikTok?— and at worst openly censorious.

Despite this, I think there’s a case for anti-TikTok legislation, but for entirely different reasons, and in a different form. The problem with TikTok is algorithmic recommendation, not China. Let’s unpack some buzzwords here: an algorithm is simply a set of rules. In the case of social media, the underlying rule is to choose content based on what gets my attention, since advertising is where the revenue comes from. Let’s say I’ve liked several posts this week about FX’s recent series Shogun, with keywords like “pirate” and “rogue” and “delightfully exaggerated English accent.” Accordingly, the algorithm shows me more content featuring these words.
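The keyword-matching rule described above can be sketched in a few lines. This is a toy illustration, not any platform’s actual code; real recommenders use learned embeddings and engagement signals, but the basic logic (score posts by overlap with what you’ve engaged with, show the top scorers) is the same.

```python
# Toy sketch of keyword-based recommendation: posts sharing keywords
# with the user's recent likes get ranked higher. Field names and
# data are illustrative, not from any real platform.

def recommend(posts, liked_keywords, top_n=2):
    """Rank posts by how many of the user's liked keywords they contain."""
    def score(post):
        return sum(1 for kw in liked_keywords if kw in post["keywords"])
    return sorted(posts, key=score, reverse=True)[:top_n]

posts = [
    {"id": 1, "keywords": {"pirate", "rogue", "shogun"}},
    {"id": 2, "keywords": {"recipe", "pasta"}},
    {"id": 3, "keywords": {"pirate", "english accent"}},
]
liked = {"pirate", "rogue", "english accent"}
print([p["id"] for p in recommend(posts, liked)])  # → [1, 3]
```

Note that the rule never asks whether the user *wants* more pirate content, only whether pirate content has held their attention before, which is exactly the distinction the next paragraph turns on.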

One might say that our social media feeds simply reflect what we want to see. But social media algorithms optimize around what gets our attention and causes us to share things– which is distinct from what we enjoy (things that we would have a positive emotional reaction to), and still different from what we want to see (things that we would choose to see if asked). “The algorithm” doesn’t care why my eyeballs are stuck on something, only that they are. Those who produce and share “content” know this, and produce material on their respective platforms that will get maximum attention. At best this means shallow and repetitive content that might waste our time. At worst we’re talking about QAnon content.

Algorithms don’t just change what we see, but what we can see, what we want to see, what we can want to see, and what we can know we want to see. If we imagine the internet as a library, algorithmic recommendation isn’t a card catalog. It isn’t even a guided tour by a bad librarian who thinks they know what you want better than you do. Algorithmic recommendation is more like a publisher who determines what books can exist and make it to the library at all.

In fact, it is worse than that. Algorithms do not “decide” anything. Unlike human individuals or communities, they are not agents with beliefs or preferences. They are rules. These rules can and frequently do include and magnify the biases and flaws of their creators– racism in face detection software, for example. What they fail to contain are our values. Embedded in algorithms are the frustrations of human subjectivity without any of the mercies of human personality. They inherit our bigotry and our oversights, not our intentions or ethics.

The result is often harms no one even intended. Mark Zuckerberg and I probably wouldn’t get along, but I am relatively certain he never wanted to be complicit in Myanmar’s genocide. His algorithms alone were enough for that: dehumanizing calls for violence against the Rohingya minority were clearly shareable content. Amnesty International reports that the platform’s recommendation algorithms “substantially contributed” to mass murder; the UN likewise linked Facebook to the genocide. Facebook responded to the problem with an anti-violence sticker pack. The idea was that users could post “don’t spread hate” style stickers in response to violent content. But like the sorcerer’s apprentice, Facebook found it difficult to direct the forces it unleashed: their algorithms interpreted users’ engagement with violent posts as a reason to spread them further.
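The feedback loop described above, in which even condemnation spreads the content being condemned, comes down to a blunt scoring rule. A hypothetical sketch, assuming the simplest possible version of such a rule:

```python
# Hedged sketch of the engagement feedback loop described above:
# the ranking rule counts *any* interaction as engagement, so even
# a "don't spread hate" sticker boosts a violent post's distribution.
# This is an illustration of the dynamic, not Facebook's actual code.

def update_score(post, interaction):
    # The algorithm cannot distinguish condemnation from praise;
    # every interaction raises the post's distribution score.
    post["score"] += 1
    return post

post = {"text": "violent incitement", "score": 0}
for interaction in ["share", "angry comment", "anti-violence sticker"]:
    update_score(post, interaction)
print(post["score"])  # → 3: the condemnations boosted it too
```

The sticker campaign failed for exactly this reason: it added interactions to the very posts it was meant to counter.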

Here is a short and not in any way complete list of concrete harms that have plausibly resulted from algorithmic recommendation on social media: an intensification of genocidal violence in Myanmar, reduced vaccination rates, an increase in suicidal ideation, a decline in teenage girls’ body image, and arguably Donald Trump’s clumsy failed autogolpe. The internet has taken to calling Panera’s recently-discontinued (RIP) “Charged Lemonade,” which has been linked to two deaths, “the lemonade that kills you.” By the same token, we might, more clumsily, call apps built on algorithmic recommendation “apps that make life worse.”

More nebulously, algorithmic recommendation steals hours and hours of our lives. You have a right to stare at your phone for hours. I’m probably going to stare at my computer for several hours later today while playing video games. But you also have a right not to have that choice made for you. The important difference between watching a classic film every night and watching two hours of TikTok isn’t a highbrow/lowbrow distinction. It’s that the first decision takes effort and deliberation. Yes, good TV can be “addictive,” but in most cases you binge watch your favorite show because it’s good. You binge watch TikTok because it’s easy.

Opponents of TikTok are right to recognize that it changes our behavior, even though they are horribly wrong about who is responsible, what to do about it, and why we should care. I’ve just warned you of Facebook’s involvement in genocide, while some in Congress seem more worried that TikTok might get in the way of the genocide in Gaza. While these tools exist, of course we should use them. Campus protests and the left have used social media effectively. That doesn’t mean we need it, despite what Hillary Clinton and Mitt Romney say, and it certainly doesn’t mean we need social media in the form it exists right now. Those of us on the left can get our ideas out without algorithms. It’s worth recalling, too, that social media has been immensely important for the right as well.

Moreover, algorithmic recommendation is anti-democratic. We are all a messy constellation of impulses, instincts, intents, and convictions. Algorithmic recommendation reduces us to our immediate instincts and reactions. The parts of us that are least intentional thus determine the information environment we operate in, and ultimately help shape our worldviews and social connections. On social media, you neither fully choose what you will see, nor do you simply encounter a random slice of the world as it is– you see the world as a machine seeking to serve the ends of advertisers wants you to. This is antithetical to the intention, deliberation, and community at the heart of democracy.

It can be difficult to admit how much power recommendation algorithms exert over us. It’s genuinely embarrassing as an adult to recognize that you wasted an hour or more in a day doing something you might admit you don’t want to. An easy and obvious denial is to claim this was intentional— to identify with the choice that’s been made for you. This denial is made that much easier because of the manner in which algorithmic recommendation makes choices for us. Instead of using explicit coercion, algorithms wield our own instincts (five more minutes; scrolling delivers more interesting content while getting out of bed takes effort) against our intentions (I would like to spend more fully present time with my loved ones, engaging in my hobbies, and generally doing things I intend to today).

If you read those words with any of the recognition and shame that I write them with: it’s not your fault! Billions of dollars have been spent trying to optimize the theft of our attention. You don’t need to fight back alone with mere willpower. This is a classic case of big money against the rest of us. This is what government is for.

A ban on TikTok for being Chinese-owned, however, is like banning lead in gasoline, but not plumbing, and doing it because lead is mined in China rather than because it is toxic. The “TikTok Ban” isn’t simply a good policy for the wrong reasons— it’s a bad policy that misplaces blame. American-owned TikTok would be no better, because the primary problem with social media is what it is, not who runs it.

Instead, let’s ban algorithmic content recommendation, everywhere. Social media feeds would have to become chronological outputs of posts by users you have specifically chosen to follow. More general language than this will be necessary to keep big tech from reinventing the same problem in some other way. This has happened before with internet regulation: the European Union’s GDPR has required websites to let you reject certain cookies, but I’ve noticed (you probably have too) that most websites offer this choice in various misleading and confusing forms, making it easy to acquiesce, intentionally or simply out of fatigue, to something we don’t want.
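What would a compliant feed look like? A minimal sketch, assuming illustrative field names: posts only from accounts the user explicitly follows, ordered strictly by time, with no engagement-based reordering anywhere in the pipeline.

```python
# Minimal sketch of a post-ban feed: strictly chronological posts
# from explicitly followed accounts. No scoring, no ranking model,
# no engagement signals. Field names are illustrative.

def chronological_feed(posts, following):
    visible = [p for p in posts if p["author"] in following]
    return sorted(visible, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"author": "alice", "timestamp": 100, "text": "older post"},
    {"author": "mallory", "timestamp": 300, "text": "viral bait"},
    {"author": "bob", "timestamp": 200, "text": "newer post"},
]
feed = chronological_feed(posts, following={"alice", "bob"})
print([p["text"] for p in feed])  # → ['newer post', 'older post']
```

The “viral bait” post never appears, not because a moderator removed it, but because the user never chose its author. That is the structural difference: distribution follows explicit choice rather than measured attention.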

Regulation is in constant battle with market forces. Legal challenges will undoubtedly be lengthy and irritating, and in the long term, we’ll need continually updated laws. One can imagine an anti-algorithm bill of rights, or even an international convention like those against CFCs and chemical weapons. But national legislation targeted to the particular problems of algorithmic recommendation as it exists in the world now seems to me the most feasible immediate possibility.

Theoretically we could bamboozle the right and the center into passing legislation like this while they are hostile to big tech, but we will have to make sure the censorious impulses behind that hostility don’t actually make it into the law— a difficult task to say the least. Nonetheless, there are a variety of possible political coalitions here and ways to sell this policy: a public health perspective, pro-democracy rhetoric, economic arguments, anti-monopoly concerns.

So long as we keep in mind exactly why we are doing so, and ensure any legislation reflects our intent, we won’t regret trying to ban the apps that make life worse. We know a world without algorithmic recommendation is possible, because we recently lived in it. A new internet with slow social media feeds, niche forums, and little publications rather than content-guzzling juggernauts will be a better one.

One potentially damning objection to this whole program: you might say that markets themselves are an algorithm that does most of what I’ve just complained about. An erasure of intention is part of markets themselves. Market dynamics determine what films we get to see, what art we create, and how we live. Markets, like algorithms, don’t deliver what we want; they determine what we can want. And because we depend on market structures in our current world for food, shelter, and the rest of life’s necessities, they have coercive power social media doesn’t.

As a socialist, I happen to think we should limit the role of markets in our lives as much as possible. Unfortunately this is a much more difficult task than dealing with algorithms. Even a bluntly written and poorly implemented one-year phased-in ban on recommendation algorithms would cause severe but tolerable social and economic disruption. A one-year ban on market activity would be impossible, unthinkable even: markets are the entire structure of our political economy, and any effort to change that will take more time and more vision.

A strike against algorithms would be a step toward broader change, though. What algorithmic recommendation tells us by its very nature is: don’t think, feel. The algorithmic world is one in which we do not have meaningful thoughts to critically untangle; we have preferences that can be precisely calculated and expressed. Rejecting this paradigm will require embracing deliberate, collective intention. It will require gazing upon the dopamine-charged notification indicator, the endless short video feed, the demonic faces of YouTube thumbnails, and saying no, we will decide what world we want, and we will do it together.

(In the meantime, don’t hesitate to share this. I certainly can’t leave it to the algorithm.)

