Cyberbullying and algorithmic surveillance: the urgency of digital ethics

In 2026, total digital immersion has erased the boundary between private and professional life, giving way to systemic cyberbullying fuelled by intrusive algorithms. From AI-assisted viral lynchings among adolescents to permanent micro-control via People Analytics at work, data has become a vector of psychological suffering. To counter this invisible epidemic, organizations must move beyond paper compliance to an ethics of responsibility: rigorous audits of their surveillance tools, anonymization of performance data, and technical enforcement of a right to digital darkness.

By
Guillemette Songy

In 2026, we no longer “connect” to the Internet: we live there. This total immersion has brought down the last barriers separating our private lives, our professional lives and the identity-building of our children. But behind the entertainment and the operational efficiency, a devastating phenomenon has taken hold silently: a systemic form of cyberstalking, fuelled by ever more intrusive algorithms and corporate surveillance mechanisms that turn against the humans they were meant to serve.

Algorithms and generative AI: the new faces of cyberbullying

For today's teens, school bullying never ends. The home used to be a refuge; now, torment follows the child into bed via notifications. In 2026, cases of “viral lynching” over a misinterpreted video or a stolen photo are no longer exceptions, but clinical realities.

Case in point: at the beginning of the year, the story of Lucas, 14, shook public opinion. The victim of a humiliating “deepfake” created by generative artificial intelligence and broadcast to a class group, he faced an avalanche of hate messages automated by bots. It was no longer a group of stalkers but a technological wave, impossible to stop. The result? Total school dropout and severe post-traumatic stress disorder.

From an ethical point of view, we allow developing brains to be shaped by platforms whose business model is the capture of attention through strong emotions. Since anger and outrage are the most powerful click drivers, the algorithm becomes, in effect, the stalker's unintended accomplice.

People Analytics and Data-Driven Harassment: Corporate Surveillance

The world of work is not immune, but there harassment takes a more insidious form, often tied to data collection. In 2026, employee surveillance is no longer limited to schedules. People Analytics software analyzes responsiveness to emails, the tone of Slack exchanges, and even the micro-movements of the mouse.

This surveillance creates an environment of permanent micro-control that can easily tip into bullying. When a manager uses real-time performance data to publicly single out the “weak links” of a team, they are not managing: they are organizing a digital pillory.

The toxic link: Harassing behaviors are fuelled by this data. A colleague may be marginalized because their internal “influence score” is falling, or because an algorithm has detected a sign of disengagement. People are no longer harassed for what they do, but for what the data says about them.

Impact on public health: the mental cost of constant comparison

Cyberbullying is not just a “click” problem. It is a major public health issue. Doctors are noticing an explosion of psychosomatic disorders linked to the use of networks: sleep disorders, behavioral addictions, and a dramatic drop in self-esteem.

Human experience is thus reduced to constant comparison. For adults, it is the “fable of the perfect life” on social networks that creates unbearable social pressure; for the youngest, it is the terror of digital exclusion. This climate of permanent tension strains the entire mental health system, saturating child psychiatry services and corporate support units.

Ethics and compliance audit: turning surveillance into protection

The countdown has begun. GDPR compliance and corporate conduct charters are paper shields against violent uses. The ethical challenge is to put humans back at the center of digital architecture:

  • For platforms: Civil liability for the damage caused by their recommendation algorithms
  • For companies: A right to digital darkness for employees where the data collected should never be used for human evaluation or psychological pressure
  • For all of us: Relearn analog empathy because the screen erases non-verbal signals, making the stalker blind to the pain he inflicts

Progress cannot be built on the ruins of an entire generation's mental health. Whether in a schoolyard or in an open space, digital technology must remain a link tool, not a leash or a whip. Protecting our digital privacy and our psychological balance is no longer an option, it is the fight for human dignity in the 21st century.

Concretely, solutions exist, and they deserve serious consideration.

For a company to stop being a digital panopticon and become a healthy workplace again, legal compliance is no longer enough. In 2026, the ethical audit of surveillance tools must be as rigorous as a financial audit. Here are the concrete levers:

1. The principle of surveillance sobriety

Ethics starts with a brutal question: is the tool necessary, or merely convenient for the manager?

  • Disable real-time presence indicators and keyboard typing time reports
  • Move from a culture of monitoring effort to a culture of trusting results

2. Mandatory anonymity in People Analytics

If the company uses AI to analyze the social climate, this data should never be linked to a name or a number.

  • Implement strict data aggregation to only see team trends
  • Ensure that no behavioral data can be used during an annual interview without prior human and factual proof
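To make the aggregation requirement concrete, here is a minimal sketch of what "strict data aggregation" could look like in practice. It is an illustration only: the record fields, the `aggregate_climate` function and the `MIN_GROUP` threshold are hypothetical, not part of any named People Analytics product. The key idea is that names never leave the aggregation step, and teams too small to guarantee anonymity are suppressed entirely.

```python
from collections import defaultdict

# Hypothetical k-anonymity-style aggregation: individual records are
# reduced to team-level averages, and teams with fewer than MIN_GROUP
# members are dropped so no score can be traced back to a person.
MIN_GROUP = 5

def aggregate_climate(records):
    """records: list of dicts like {"employee": ..., "team": ..., "score": ...}.
    Returns team-level trends only; employee names never leave this function."""
    by_team = defaultdict(list)
    for r in records:
        by_team[r["team"]].append(r["score"])
    return {
        team: round(sum(scores) / len(scores), 2)
        for team, scores in by_team.items()
        if len(scores) >= MIN_GROUP  # suppress identifiable small groups
    }

records = (
    [{"employee": f"e{i}", "team": "sales", "score": s}
     for i, s in enumerate([3, 4, 5, 4, 4])] +
    [{"employee": "e99", "team": "legal", "score": 1}]  # team of one: dropped
)
print(aggregate_climate(records))  # {'sales': 4.0}
```

The suppression threshold matters as much as the averaging: a "team average" over one or two people is just an individual score with a disguise.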

3. The right to digital darkness and technical disconnection

Professional cyberbullying often starts after 7:00 p.m. The audit must verify the technical “closure” of the systems.

  • Establish servers that block the sending of notifications between 8 p.m. and 7 a.m., except for a vital emergency
  • Protect rest time to preserve the cognitive and emotional capacity of employees
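A server-side quiet-hours gate of the kind described above can be sketched in a few lines. This is an assumption-laden illustration, not a reference implementation: the `may_deliver` function and the vital-emergency flag are invented here to show the logic, including the subtlety that the 8 p.m. to 7 a.m. window wraps around midnight.

```python
from datetime import datetime, time

# Hypothetical quiet-hours gate: notifications queued between 20:00 and
# 07:00 are held back unless explicitly flagged as a vital emergency.
QUIET_START = time(20, 0)
QUIET_END = time(7, 0)

def may_deliver(now: datetime, vital_emergency: bool = False) -> bool:
    """Return True if a notification may be sent at `now`."""
    if vital_emergency:
        return True  # the one exception named in the audit lever
    t = now.time()
    # The quiet window wraps around midnight, so it is a disjunction.
    in_quiet = t >= QUIET_START or t < QUIET_END
    return not in_quiet

assert may_deliver(datetime(2026, 3, 2, 14, 30))       # mid-afternoon: sent
assert not may_deliver(datetime(2026, 3, 2, 22, 0))    # 10 p.m.: held
assert may_deliver(datetime(2026, 3, 2, 22, 0), True)  # emergency passes
```

Enforcing this at the server rather than on the employee's phone is the point of the audit: rest time is protected by architecture, not by individual willpower.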

4. Shared governance via the data ethics committee

We cannot let the IT department and human resources decide on surveillance tools alone.

  • Create a joint body with a veto on any new data collection tool
  • Test each tool for human impact to see if it can be used to humiliate or discriminate

5. Distance management training

Harassment often results from technological clumsiness that escalates.

  • Teach managers that the tone of a written message is often perceived as more aggressive than it is
  • Train HR to detect anomalous data clusters so that the tool serves as an alert against the harassing manager and not against the employees
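One way to read "anomalous data clusters" as an alert for HR rather than a weapon against employees is simple outlier detection at the manager level. The sketch below is hypothetical: the metric (public negative call-outs per month), the `flag_outlier_managers` function and the z-score threshold are assumptions chosen for illustration. The output is an alert for human review, never a verdict.

```python
import statistics

# Hypothetical sketch: instead of scoring individuals, HR monitors a
# manager-level signal (public negative call-outs per month) and flags
# statistical outliers for human review.
def flag_outlier_managers(callouts: dict, z_threshold: float = 1.5):
    """callouts: {manager: public negative call-outs this month}.
    Returns managers more than z_threshold population standard
    deviations above the mean -- an alert, not a verdict.
    (With small samples the maximum possible z-score is low, hence
    the modest default threshold.)"""
    values = list(callouts.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # everyone identical: nothing to flag
    return [m for m, v in callouts.items() if (v - mean) / stdev > z_threshold]

data = {"alice": 2, "bob": 1, "carol": 3, "dan": 2, "eve": 25}
print(flag_outlier_managers(data))  # ['eve']
```

The design choice mirrors the article's inversion: the same analytics machinery that enables data-driven harassment can point the spotlight at the behavior of the powerful instead of the monitored.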

FAQ - Cyberstalking and surveillance

What is data-driven harassment in business?

It is a form of bullying using People Analytics tools to exercise permanent micro-control, such as the analysis of email reactivity or mouse movements, in order to marginalize an employee based on algorithmic scores.

How does generative AI facilitate cyberbullying?

AI allows the creation of humiliating deepfakes and the automation of attacks via bots, making waves of hate technically impossible for victims to stop manually.

How to conduct an ethical audit of surveillance tools?

An ethical audit must assess the need for the tool, impose the suppression of micro-tracking, ensure the strict anonymization of performance data and involve joint governance to validate the human impact of data collections.
