Highlights
- LinkedIn AI training begins Nov 3, 2025, using EU, EEA, and Swiss user data.
- Data includes public profiles, posts, and job info, excluding private messages.
- The policy relies on GDPR’s “legitimate interest” principle.
- Users can opt out via LinkedIn’s Data Privacy settings before rollout.
- LinkedIn AI training aims to improve job matching, writing, and profile tools.
What You Should Know
LinkedIn, the world’s largest professional networking site, announced a significant policy change: starting on November 3, 2025, it will use data from European users — those in the EU, EEA, and Switzerland — to train its AI models.
This policy change makes LinkedIn the latest entrant in a growing trend among tech companies of using user-generated content to train generative AI systems. Until now, LinkedIn had excluded European user data from AI training due to stricter privacy laws under the General Data Protection Regulation (GDPR). However, it states it has now updated its privacy policy and data processing terms so that its AI training practices comply with European data standards.
What Data Will Be Used
LinkedIn said the data will largely consist of public and semi-public content already visible to other LinkedIn members.
This information includes:
- Profile data — name, headline, job title, education, skills, and profile photo.
- Public activity — posts, comments, articles shared, poll answers, and group activity.
- Employment-related information, such as CV details or job application responses submitted through LinkedIn’s job application tools.
- Interactions with LinkedIn’s AI tools — including writing assistants, search prompts, and automated job recommendations.
It is also worth noting that LinkedIn has excluded private messages (DMs) from AI training. Accounts belonging to minors (under 18) will likewise be excluded.


The company stated the purpose of this data is to help “improve generative AI features” that support users with their profiles, writing posts, and better job matching.
The Legal Basis: “Legitimate Interest”
LinkedIn is justifying this new use of personal data under a GDPR principle called “legitimate interest.”
This means that, rather than getting explicit consent from its users, LinkedIn argues that using data to train AI models serves a legitimate interest, is necessary for continuously improving its products and services, and is not overridden by users’ privacy rights. However, this justification is already receiving scrutiny.
The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) and other European privacy officials have expressed concerns that this may push the boundaries of “legitimate interest.” They contend that large-scale AI model training is data processing that users could not reasonably have foreseen when they first set up their LinkedIn accounts, which may have been years ago.
The Controversy and Privacy Issues
Privacy experts warn that once personal data is used to train a generative AI model, its inclusion is virtually irreversible. Unlike stored text or images, which can be deleted upon request, data incorporated into a trained model cannot easily be removed.


This means users lose some long-term control over how their information is used. It also raises the concern that sensitive data — for example, posts or comments that reveal political opinions, religious beliefs, or health status — could end up in the LinkedIn AI training dataset. The Dutch and Belgian data protection authorities have advised users to check their privacy settings before 3 November and to consider submitting an objection if they do not want their data included.
How to Opt Out
LinkedIn enrolls all European users in the data-sharing program automatically. The good news is that you can opt out manually by following these steps:
- On LinkedIn, go to Settings & Privacy.
- Choose Data Privacy → Data for Generative AI Improvement.
- Turn the setting off.
Alternatively, users can formally lodge a data processing objection through LinkedIn’s Data Processing Objection Form on the company’s Privacy Portal.
Authorities have suggested doing this before the November rollout, so that LinkedIn has time to process the objection.
Why LinkedIn Is Doing This
According to LinkedIn, using member data to train AI will mean a significantly more intelligent and contextually relevant experience on the platform.
It is designing generative AI features that will help users write job descriptions, tailor resumes, create posts, and even provide personalized career insights. LinkedIn argues that these features require real-world professional data to learn the subtleties of workplace communication, job trends, and industry language. “We want to make LinkedIn more helpful for every professional,” the company states. “Training our AI systems with anonymized and representative member data helps us improve these tools responsibly.”
Wider Implications
This move places LinkedIn at the center of the broader ethical and regulatory debate over the tension between innovation and privacy. It comes as European regulators examine whether LinkedIn’s practices align with the GDPR, and as similar discussions emerge at other tech companies, such as Google, Meta, and OpenAI, which are also updating their AI training policies.


For users, this development underscores a reality of the AI era: data shared years ago for professional purposes can be repurposed today through machine learning.
Although LinkedIn promises transparency and control over the process, it ultimately falls to users to understand what is happening and act to protect their data. As Europe strengthens its focus on digital rights and AI accountability, LinkedIn’s move will likely become a significant test case for how global platforms interact with and leverage user data in the age of generative AI.