A Little History
In the mid-19th century, British traders were making the people of China sick. They were importing countless tons of opium, grown in Britain’s newly acquired lands in India, and selling it to the Chinese people. Widespread addiction and heart-wrenching social issues ensued. At its height, as many as 1 in 10 Chinese were regularly using opium.
China’s then-rulers, the Qing dynasty, horrified by the societal harms caused by the potent narcotic, attempted to impose strict controls on its importation and sale. The British traders, however, weren’t eager to give up any access to their newfound Chinese market, and the British empire, suddenly cash-poor in the wake of the successful American Revolution, was eager to generate additional revenue.
Eventually, tensions spilled over into open conflict between the Qing dynasty and the British Empire, whose forces promptly beat the ludicrously overconfident but technologically primitive Chinese military into the dirt. A second conflict a decade later, this time involving the French, pushed the issue even further. The European powers won, at the barrel of a gun, the right to sell addictive poison to the Chinese. The period that followed, in which Western powers subjugated the Qing dynasty and did business as they wished, became known in China as “the century of humiliation.”
—
As you read this, there’s a kid with their eyes glued to a phone, watching TikTok videos. They watch a girl dance on screen, lip-syncing flirtatiously, then swipe it away. An adorable cat with cerebral palsy helplessly flops around in a plush bed to a sunny pop song.
They swipe again: a shirtless man with tattoos and a shaved head intones misogynistic bullshit over a menacing electronic beat. Swipe. Four Things You Didn’t Know About Taylor Swift. Swipe. A child dances along to a chipmunk-voiced rap song. Swipe.
The kid’s eyes are growing bleary with fatigue. The time is slipping by them: they could have been swiping through the little videos for five minutes, or it could have been half an hour. Some videos might make them laugh or gasp faintly with amazement, but as the minutes pass, they feel increasingly drained and empty.
Behind the screen, an algorithm is figuring out what exactly the child wants to see, what will give them the greatest rush of endorphins and keep them coming back for more. A multinational corporation with vast resources has dedicated teams of the brightest computer engineers and data scientists and product designers it can find to make this algorithm as powerful as possible, which is to say as addictive as possible.
Eventually, the child will be broken out of their reverie. Maybe Mom calls them down for dinner. The child will put away the phone, and patter down the stairs to join their family at the table. But now, the pull of the algorithm has them. As the family eats together, the child can feel their phone, heavy in their pocket. The child is still thinking about the videos.
—
A New Kind of Drug
The last few years saw us mobilize swiftly to deal with the growing threat posed by fruit-flavored Juul pods to the nation’s youth. Yet we struggle to fully grasp the effect that highly optimized social media algorithms, like the ones now being deployed to incredible effect by the Chinese-owned social media platform TikTok, are having on children. The addictive properties of these platforms are so extreme that we may need to redefine what it means for something to be a “drug.”
Youth mental health is at a wretched low point. Suicide has become the second leading cause of death for people aged 10 to 24, right behind unintentional injuries. Even pre-pandemic, more than 1 in 3 high school students reported persistent feelings of sadness or hopelessness (a 40% rise between 2009 and 2019). Self-harm rates (particularly among teenage girls) skyrocketed over the same period.
Now, two years of remote school and social isolation have tipped kids headfirst into their phones and iPads, no doubt worsening the problem. Multiple pediatric associations have declared states of emergency. Teachers are seeing the effects in the classroom, with one Department of Education survey showing that among 846 public schools, 83% reported stunted behavioral development among their students. The latest stats on youth mental health are shocking. A CDC study found that 45% of high school students were so hopeless or sad in 2021 that they couldn’t engage in regular activities. Nearly 1 in 5 seriously thought about killing themselves. 9% said that they’d tried.
Absenteeism and disruptive behavior have also skyrocketed, and burned-out teachers are leaving the profession in droves. God help us when these kids hit the workforce. Given the pandemic and current economic conditions, narcotic-like social media platforms are clearly only a part of the story of the ongoing mental health crisis, but it’s a very large part.
The fact that TikTok seems uniquely good at creating addiction and dependency will come as no surprise to anyone who has been around its users. A poll from earlier this year showed 67% of U.S. teens had used the app, while 16% said they’d used it almost constantly. Users frequently report accidentally spending hours on the app when it feels like only minutes, the time simply slipping away as they watch video after video.
Researchers have suggested that the “swipe down” movement required to refresh your screen mirrors pulling down the lever on a slot machine, and that the “variable pattern of reward” — getting to find another cutesy TikTok video, or disappointedly swiping to the next one — parallels the thrill of winning or losing on a slot machine. The short and sweet video format, of course, accommodates the tastes and abbreviated attention spans of children.
Political Backlash
TikTok, and by extension the Chinese government, seems keenly aware of TikTok’s more spiritually carcinogenic qualities. On the Chinese version of the app, called Douyin, the content stream pushes educational content, videos of science experiments, and patriotic propaganda. Young users are limited to 40 minutes a day.
In comparison, a New York Times report found that TikTok quickly pushed content promoting self-harm and eating disorders to new 13-year-old users. And there’s no question that TikTok as a company has an extraordinarily close relationship with the Chinese regime. TikTok and parent company ByteDance share an absurd number of former (and current!) employees with Chinese state media. It probably doesn’t constitute a massive plot twist, then, that the Chinese government has a well-documented habit of using the platform to push misinformation and propaganda.
Our politicians, intellectual slowpokes that they are, have finally figured out that TikTok Bad. As I am writing this, the U.S. Senate just unanimously passed a bill banning the app on federal employees’ devices, and multiple states, including Maryland, Georgia, and South Carolina, have issued similar bans for the phones of state employees.
Our politicians have largely fixated on the framing of TikTok as a security threat. And to be clear, TikTok’s data insecurity issues are ludicrous. The most recent action by our politicians seems to have been sparked by the revelation that Chinese TikTok employees might be able not just to access the private data of U.S. citizens, but to track their locations. However, some of them have picked up on the fact, evident to anyone who has watched the app in use, that it is wildly addictive. Wisconsin Rep. Mike Gallagher called it “Digital Fentanyl.”
It’s tempting to imagine this is a story about China intentionally poisoning us, the revenge of the Qing emperors from beyond the grave. That the cutesy dance videos and funny skits are a Trojan horse for CCP intelligence and disinformation operations. It’s the revenge for the Century of Humiliation, except now we Westerners are the ones having a nefarious narcotic peddled to us. It’s easy to see why this narrative appeals: it’s a simple us-versus-them story.
But the more honest reading of this situation is that TikTok, in an approach typical of Chinese tech companies, is directly imitating and refining the practices of Western tech companies. After all, rates of depression and self-harm have been trending upward since the late 2000s. TikTok has only been around since 2018.
In fact, TikTok is not unique in the kind of technology it uses to create addiction in users. The incredibly sharp algorithm for figuring out what you are interested in seeing and the engrossing user interface are just the latest developments in a decades-old arms race between tech companies to create the most addictive user experience possible. All this incredibly expensive dopamine-triggering artillery is aimed right at your child’s reward centers. No wonder the kids are miserable: their brains are being boiled like those of diplomats with Havana Syndrome.
Readers might recall that it was our own homegrown tech companies that pioneered the art of using addictive social media design to lure users onto their platforms, in order to harvest their data and sell it to advertisers. The difference is that if TikTok has your data, so does the Chinese government.
Maybe China isn’t in the role of the British opium traders at all. Instead, it’s tech companies: both our own homegrown Silicon Valley giants like Facebook and Snapchat and Apple, and those on the other side of the Pacific.
Welcome to the New Opium War
So what’s the answer to this disturbing quandary?
We all have a foreboding sense that TikTok is bad news. Just being around people using it is jarring, watching how totally engrossed they become in their phone screens as they swipe away at video after video. Our politicians have seized on this general sense of unease around the Chinese-owned app, and discovered that the idea that it is a security threat is the most effective narrative for creating legislative action. But banning TikTok won’t solve the core issues at work here (though it might be a good start), and it’s jingoistic and simplistic to pretend that it will.
One answer is that we need our legislators to start looking into these highly optimized, habit-forming algorithms and interrogate the way tech companies use them to hook users. We need to seriously consider regulation. We need to educate people on the risks that these algorithms, operating behind your screen, pose to the mental well-being of both adults and children.
We simply are not used to thinking of a social media algorithm as a drug, but future generations may very well look back on the way we exposed children to it the same way we think of Victorian-era parents dosing babies with laudanum. We need to hold companies accountable, no matter which side of the ocean they are on. This could prove a challenge, as our lawmakers may be reluctant to get off the gravy train of campaign contributions from Silicon Valley, or to cut into the revenue-generating abilities of companies that have become major engines of our economy.
The other answer is that we need to start taking personal responsibility. If you take your young children out to dinner, don’t hand them a phone to distract them. Push them into sports and music and other hobbies that tend toward collaboration, not isolation. There’s clear evidence that extended exposure to screens harms them; some studies even suggest it reduces gray matter. People working at the companies that make these products won’t let their own kids touch the noxious stuff, or heavily limit their access. Why should yours be any different?