Wearable AI and third party consent: the challenges of the Friend.com necklace
Wearable AI, illustrated by devices like the Friend.com necklace, continuously captures data from the people around us. This evolution raises the challenge of AI third-party consent, since the voice is personal data protected by the GDPR. Compliance here rests on transparency, respect for the right to object, and rigorous management of data processed outside any prior framework (Shadow AI).

Artificial intelligence is no longer hidden behind our computer screens. With the arrival of wearable devices such as the Friend.com necklace, AI now joins our most spontaneous conversations, in cafés, on the street, or at dinners with friends. While these objects promise to become everyday companions capable of supporting us or easing our loneliness, they radically transform our relationship with others. At the heart of this evolution, an essential question arises: that of AI third-party consent.
The challenge of third-party consent in the face of wearable AI like Friend.com
We are already used to smart speakers and voice assistants in our homes. Until now, however, these technologies have remained sedentary, confined to a defined private space. Wearable AI marks a break: it becomes nomadic and relational.
Unlike traditional tools, these new systems no longer involve only the user who has chosen to wear them. They include everyone nearby. By becoming a silent witness that continuously records and analyzes exchanges, wearable AI draws our loved ones, colleagues and even strangers into a database, often without their knowledge. These people are what we call AI third parties.
From living room AI to companion AI: a new relational frontier
The transition from living-room AI to companion AI is a game changer. Because some devices, like the Friend.com necklace, are designed to be worn at all times, they multiply the points of contact with third parties who have never consented to being recorded or analyzed by an algorithm.
What is an AI third party and why is their consent key?
The term AI third party refers to anyone whose voice, speech, or image is captured by another person's device. In everyday life, consent is normally the foundation of privacy. But with a discreet object that analyzes discussions in the background, obtaining the agreement of each interlocutor becomes a real challenge.
The General Data Protection Regulation (GDPR) reminds us that the voice is personal data in its own right. When a device such as the Friend.com necklace captures a conversation, it processes information that belongs to all participants. Respecting privacy therefore cannot be a mere technical option; it must remain a right for every person encountered during the day.
Shadow AI: when the Friend.com necklace outpaces our vigilance
This use of artificial intelligence tools outside any framework, and without prior information to those affected, is a form of Shadow AI. In our social lives, it means capturing intimate or informal moments that end up stored on remote servers to be analyzed by algorithms.
Current reporting points to a growing trend of emotional dependence on these virtual assistants. This closeness to the machine can sometimes make us forget that our freedom to wear a connected object ends where the right to the image and voice of others begins. Mastering your data also means ensuring that you do not, despite yourself, become a vector of invisible surveillance for those around you.
Towards an AI label for a respectful daily life
The aim is not to reject innovation, but to integrate it with courtesy and transparency. For these technologies to be accepted, they must respect the main principles of personal data protection. Digital trust is based on the ability of the user to remain in control of the tool and to ensure that others remain in control of what they say.
AI third-party consent must become a social reflex. By adopting simple gestures and clear communication, it is possible to enjoy the benefits of AI while preserving what gives our human exchanges their value: spontaneity and confidentiality.
FAQ: understanding your rights in the face of wearable AIs such as the Friend.com necklace
How do I know if I am being recorded by someone else's AI?
That is the major difficulty. Unless the device displays a light signal or its wearer informs you, it is hard to tell. The GDPR's transparency principle normally requires the user to warn you.
Can I object to being recorded by a friend's AI necklace?
Yes, absolutely. The right to object is a pillar of the GDPR. In a context such as the use of the Friend.com necklace, you can ask your interlocutor to turn off their device or to delete the audio recordings that concern you.
Where does the data captured by these connected necklaces go?
Generally, recordings are sent to the cloud for analysis. It is essential to find out about the location of these servers and the security guarantees offered by device manufacturers like Friend.com.
Can AI use my voice to train?
It depends on the device's terms of use. Without your explicit consent as a third party, a company should not use your personal data to improve its artificial intelligence models.
Best practices for citizen use of AI
- Systematically inform your loved ones or your interlocutors if you use an AI assistant capable of listening in on the conversation
- Deactivate automatic recording in places where confidentiality is expected (medical appointments, confidences, private spaces)
- Take the time to read the manufacturer's privacy policy to find out how the data of the people you meet is handled
- Pay attention to the right to be forgotten: know how to quickly delete the data collected by your device
- Choose devices that offer local processing (“on-device”) without systematically sending audio to the Internet
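To make these practices concrete, here is a minimal, purely illustrative sketch of how a wearable device could gate its pipeline on explicit third-party consent and keep processing on-device. This is not Friend.com's actual implementation; the `ConsentLedger` and `process_clip` names and the whole data model are hypothetical assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class AudioClip:
    """A captured audio segment and the speakers it contains (hypothetical model)."""
    speakers: list  # names of the people whose voices appear in the clip

@dataclass
class ConsentLedger:
    """Tracks which third parties have explicitly agreed to be recorded."""
    consented: set = field(default_factory=set)

    def grant(self, person: str) -> None:
        self.consented.add(person)

    def revoke(self, person: str) -> None:
        # Right to object: a withdrawal must be honored immediately.
        self.consented.discard(person)

    def all_consented(self, speakers) -> bool:
        return all(s in self.consented for s in speakers)

def process_clip(clip: AudioClip, ledger: ConsentLedger, wearer: str) -> str:
    """Decide a clip's fate: process it locally, or discard it.

    The wearer's own consent is implied; every other speaker must have
    explicitly opted in, otherwise the clip is dropped and never uploaded.
    """
    third_parties = [s for s in clip.speakers if s != wearer]
    if ledger.all_consented(third_parties):
        return "process-on-device"  # local transcription only, no cloud upload
    return "discard"                # missing consent: delete immediately

# Usage: a friend has opted in, a stranger has not.
ledger = ConsentLedger()
ledger.grant("Alice")
print(process_clip(AudioClip(speakers=["wearer", "Alice"]), ledger, "wearer"))  # process-on-device
print(process_clip(AudioClip(speakers=["wearer", "Bob"]), ledger, "wearer"))    # discard
ledger.revoke("Alice")
print(process_clip(AudioClip(speakers=["wearer", "Alice"]), ledger, "wearer"))  # discard
```

The design choice worth noting is that "discard" is the default path: a clip is only kept when every third party is positively on the ledger, which mirrors the opt-in logic of the GDPR rather than an opt-out one.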


