I wonder if this will be an unpopular issue, but I think it's important. We as social media marketers want to be taken seriously, and much of the time, we do things that hurt our own credibility in the world. Today's issue tackles the misinformation that we as marketers perpetuate, when honestly, we should know better. We'll cover:
- If TikTok actually censored anyone's content last month
- Not letting our emotions turn into misinformation
- What TikTok does need to be better at
—Jack Appleby
How to pick an Influencer for your brand
The most basic question in influencer marketing remains the most important. I rather like this lil' reverse pyramid situation from Caliber's new 32-page social media report, The Drop.
The secret's in the subtext. That line about "find creators who have a unique perspective on topics that withstand the pendulum effect" is so important, but often forgotten. I'm a huge believer in Influencer Marketing, but you gotta work with Creators who actually have something to say, since they bring their audiences along for brand deals. |
Go grab Caliber's new 2026 social report for a deep dive, with: |
- 30+ examples of trending content formats with How-To's
- Breakdowns on Instagram + TikTok's latest features
- Creator x Brand Integration Strategies
Just grabbed dinner with the Caliber folks—they're very intent on making sure these reports are everything you need to learn to make better content, and they're coming every quarter now with fresh updates. It's a must-download.
Is TikTok censoring us?
Everyone's pissed off & hurting right now. The last few months have been a nonstop cycle of political tension, platform drama, and very real anger about what's happening in the world. So when a major social network starts malfunctioning at the exact moment people are trying to talk about emotionally charged issues, it's not surprising suspicion sets in fast. |
Over the past few weeks, TikTokers posting about ICE, Trump, Epstein, and other sensitive topics have reported their videos going nowhere. The natural assumption in a world where no one trusts Big Tech (or the White House): CENSORSHIP!!! And hey, I get it. |
But also… there's no evidence of TikTok censorship. And honestly, TikTok is one of the best places to find new information on real-time events now, including the ICE murders.
What we're seeing is a messy collision between broken systems, terrible timing, and our social media industry becoming far too comfortable making unfounded claims and spreading misinformation. |
What Actually Happened on TikTok |
In late January, TikTok began transitioning parts of its U.S. operations into a new entity as part of its ongoing regulatory process. Around the same time, winter storms caused power outages at data centers that support major parts of the platform. |
The effects were widespread and frustrating. New uploads stalled or failed, videos sat at zero views, analytics lagged or disappeared altogether. |
TikTok later said these issues were caused by infrastructure failures related to the transition and the outage, not by moderation changes or political enforcement decisions. They denied any policy shift tied to ICE or other political topics. |
And honestly, I believe them—during that same window, I couldn't publish any videos to my basketball account. TikTok wasn't suddenly censoring my jump shot—the platform just broke. |
But TikTok's also terrible at communicating with users. They didn't publish a detailed post-mortem, just press blurbs. And if you don't give people enough info, imaginations run wild. |
TikTok's actually the best place for hot political topics now, including the ICE murders and Trump criticisms. |
Every morning over the last few weeks, I'll wake up and scroll to see the latest with ICE and Minnesota—probably not the most mentally healthy habit, but it's felt important. And I've done allllll that scrolling on TikTok. |
Here's a screenshot of my TikTok search this morning.
TikTok's where the primary signals tend to surface first. First-person accounts, on-the-ground footage, local reactions, and raw documentation often appear on TikTok long before they're filtered through other social networks and media institutions, or flattened into headlines. Hell, the headlines are usually reactions to TikTok, not original reporting.
There's also a simple scale reality that matters here. TikTok's U.S. user base dwarfs Twitter's, especially among younger audiences who increasingly treat TikTok as their default place to process news. The platform's algorithm prioritizes relevance over who you follow, which means people actively engaging with ICE or Trump-related content are more likely to see a wide range of posts clustered around those topics, not just voices from a single ideological bubble. When a platform becomes that central to how millions of people make sense of political reality, any disruption to visibility feels enormous. That's exactly why recent technical failures felt so loaded, and why it's even more important to separate what feels like suppression from what we can actually prove. |
Why Broken Systems Feel Like Suppression |
When your content stops reaching people and no one tells you why, it's natural to assume intent. Platforms train users to think this way by offering almost no meaningful explanation when something goes wrong. |
A platform outage, an algorithm glitch, an automated moderation mistake, and a deliberate down-ranking decision all look the same from the outside. The result is silence, and silence invites interpretation. |
When the subject matter is political, that interpretation becomes loaded very quickly. People already don't trust platforms. They already assume bias. So a technical failure doesn't feel like a sufficient explanation, even if it's the most likely one. |
The problem is that feeling suppressed is not the same thing as being censored. Experience alone can't tell you which one you're dealing with, and treating it as proof is how misinformation takes hold. |
What Real, Proven Content Suppression Looks Like |
This isn't about pretending platforms have clean records. They don't. |
Meta has acknowledged over-enforcement of Palestinian human-rights content, supported by independent investigations. Twitter blocked links to the Hunter Biden laptop story in 2020, later reversed the decision, and admitted it was a mistake. TikTok has acknowledged technical issues in the past that temporarily suppressed Black Lives Matter content. |
In each of those cases, there was something concrete to examine. Policies were cited. Mistakes were acknowledged. Patterns could be documented. |
Right now, with TikTok, none of that exists. There are no internal documents, no confirmed policy changes, and no third-party audits showing intentional political targeting. What we have are user reports during a period of known instability. |
Those things shouldn't be treated as equivalent. |
The Rumor Economy, and Our Role in It |
This is the part where the social media industry needs to be a little uncomfortable. |
Creators, commentators, and social media managers are incentivized to move fast and sound confident. A post accusing a platform of censorship will always travel further than a careful explanation of infrastructure failures. Outrage performs better than restraint. |
So screenshots get treated like proof, speculation hardens into narrative, and suddenly a technical issue becomes a political scandal. By the time anyone asks for evidence, the damage is already done. |
If you work in social, you have a responsibility here. We can't complain about platform opacity while also amplifying claims we haven't verified. We can't ask for trust while modeling distrust as content. |
What TikTok Does Deserve Criticism For |
None of this lets TikTok off the hook. |
The platform communicates poorly during outages. It offers almost no tools to help users understand what's happening to their content. Its systems are designed in a way where failure feels indistinguishable from punishment. |
When platforms refuse to explain themselves, they create the conditions for rumors to thrive. That's a real problem, and it deserves sustained criticism. |
It just isn't the same thing as censorship. |
Evidence Has to Matter Again |
This isn't an argument for trusting TikTok or any platform unconditionally. It's an argument for not confusing bugs, algorithms, and frustration with political intent. |
Platforms have suppressed content before. That history is exactly why evidence matters now. If every outage becomes proof of conspiracy, real abuses become harder to identify when they actually occur. |
If the social media industry wants credibility with users, regulators, and the public, it has to stop treating suspicion as proof and outrage as insight. |
Skepticism is healthy. Misinformation isn't. |
And right now, the bigger problem isn't secret censorship. It's how quickly we're willing to claim it without evidence. |