The Context
TikTok’s core platform failed. Over the weekend of 25–26 January 2026, a power outage at a U.S. data centre triggered a cascading systems failure, days after the platform’s final transition to U.S. ownership. The For You Page algorithm reverted to generic content. Videos were stuck in review. User earnings appeared to vanish. Crucially, users reported that videos on sensitive topics, such as the Minneapolis ICE shooting, were flagged as “Ineligible for Recommendation.” The outage propelled a competitor, UpScrolled, to #2 in the U.S. App Store. This was not a simple glitch. It was a systemic rupture in the engine of user trust.
The Risk
For a director, this is a reputational and liability flashpoint. The Court of Public Opinion is now in session, and the narrative is shifting from a technical fault to a question of integrity: were those videos suppressed by a faulty algorithm, or by a new, politically aligned ownership group? The revised privacy policy, which collects immigration status data, sharpens that question. In New Zealand, this may point to a failure of directors’ duties under the Companies Act 1993 to act in good faith and with reasonable care. The Privacy Act 2020 principles around data integrity and purpose limitation are also in play. The psychosocial damage is real: creators lost income, and users lost faith. Your brand is now synonymous with instability and potential bias. Regulatory scrutiny from the FTC or Congress is a secondary threat. The primary crisis is the erosion of your social licence to operate.
The Control
You must separate the technical narrative from the trust narrative. Immediately commission an independent, transparent audit of the outage’s root cause and its impact on content visibility. Publicly map the failure to your business continuity and crisis management plans. For New Zealand operations, this demands a proactive review of data governance against the Privacy Act 2020, ensuring local user data and algorithmic outputs are demonstrably secure and unbiased. Control the story by owning the failure and the fix.
The Challenge
These are the critical questions you should be raising at the board table:
1. What is our forensic, public-facing plan to prove the content visibility issues were a technical failure and not a deliberate moderation policy under the new ownership?
2. How are we quantifying and addressing the psychosocial and financial harm to our creator community to prevent a permanent exodus to platforms like UpScrolled?
3. For our New Zealand operations, what specific, audited controls do we have to ensure local user data and algorithmic outputs are insulated from systemic failures and governance changes at the global parent level?