Imagine coming across a brutal video on X, showing someone's last moments. Shocking, isn't it? Now imagine that this person is someone close to you, and you have to fill in a form to request the video's deletion.

Since February 2025, this has been the reality on X, the platform run by Elon Musk, which introduced a new policy called the “Moment of Death Form.” The move sparked heated debate and raised many questions: Where does free speech end? Who decides what stays online? This article explores the controversial decision, its implications for users, and the future of the platform's approach to content moderation.

What is the Moment of Death Form on X?

The “Moment of Death Form” is not just a catchy term, but a new rule that has been incorporated into X’s violent content policy since February 18, 2025. This rule allows relatives or legal representatives of a deceased person to report a video showing their final moments. However, this is not an automatic process. In order for the request to be considered, X requires applicants to provide concrete evidence, such as a death certificate.

X's stated goal is to strike a balance between the dignity of the deceased and what the platform calls a "robust public record." That means if a video is deemed "historically significant" or "newsworthy," it could remain online, even if the family objects.

This policy raises a fundamental question: can a technology platform really reconcile ethics and total transparency?

Why did X adopt this new rule?

To understand this update, we need to look at previous events. In 2024, X refused to remove a video of a violent attack in Sydney, despite requests from Australian authorities. The platform then invoked freedom of expression as its main argument. This video, although it did not show any deaths, remained accessible, sparking a global debate.

Later, it was discovered that an individual responsible for a triple murder in the UK had watched this video before committing the crime. Coincidence or catalyst? It's hard to say, but this event certainly pushed X to adjust its rules while maintaining its core principles.

According to an excerpt from their violent content policy: "X values a robust public record, especially for significant historical or current events." This statement shows that X prefers a systematic approach, with clear forms and criteria, rather than immediate removal.

A complex administrative process

Filling out a form to request the removal of a video showing the death of a loved one may seem cold and bureaucratic. However, that is exactly what X is asking. Users must prove their connection to the victim, provide official documents, and wait for the platform to accept or reject their request. If the content is deemed “relevant” by moderators, it will remain online.

To many, this complexity seems deliberate. By imposing a structured process, X avoids mass deletions and maintains room for maneuver. But for users, it can feel like an additional ordeal at an already painful moment. A startup specializing in online reputation management could have plenty of work helping these families navigate this administrative process.

Freedom of expression versus human dignity

The conflict at the heart of this measure is philosophical: on the one hand, there is freedom of expression, dear to Elon Musk and his team, and on the other, the right to human dignity and privacy, especially after death. X seems to favor the former, but not without some concessions.

The broad outlines of this policy are as follows:

  • Videos deemed “historically significant” or “newsworthy” are prioritized for retention.

  • Only immediate relatives or legal representatives can request deletion.

  • The process is based on a subjective assessment by X.

This compromise does not satisfy everyone. Some accuse X of hypocrisy: claiming to defend freedom of expression while imposing barriers on grieving families.

What's the impact for tech companies?

This decision by X is a strong signal for startups and technology companies. Managing violent content is becoming a strategic issue. Here are the potential consequences:

  • Brand perception: A platform that keeps shocking videos online can put off some users.

  • Regulation: Governments could toughen their laws in response.

  • Innovation: Moderation AIs will need to become more sophisticated to identify this sensitive content.

Companies may need to rethink their own policies to adapt to these new expectations. An AI startup could develop tools that can detect these “death moments” automatically and in a more humane way.

Users at the heart of the debate

For users, the debate over the Moment of Death Form is deeply divisive. Some support X, arguing that full transparency is essential in a connected world. Others believe the approach is too cold and lacks human sensitivity.

“If a video can inform or educate, it should stay, no matter who requests it.” – Anonymous user on X.

But for every voice that defends this logic, there is a voice that cries indecency. This debate reflects a broader tension in digital society: how far can we expose reality, even the most raw?

An uncertain future for X and its policies

With this update, X is opening a Pandora’s box. What happens next will depend on how users, regulators, and competitors respond. If governments get involved, as Australia did last year, X may have to rethink its policies. And if other platforms adopt stricter approaches, X risks losing market share.

For entrepreneurs and marketers, it’s essential to keep a close eye on this development. This story shows how a simple rule can upend a platform’s strategy, influence public opinion, and redefine digital norms.

At ValueYourNetwork, we understand the importance of responding quickly to changes in digital policies and adapting communication strategies to protect and enhance our clients' brand image. Since 2016, we have been supporting companies with successful influencer marketing campaigns and social media crisis management expertise.