LinkedIn Quietly Starts Training AI on User Data
LinkedIn has quietly opted its users' data into training for its generative AI models without prior notice. The platform has since introduced an opt-out option in users' account settings.
As reported by 404 Media and other outlets, LinkedIn appears to have begun collecting user data for AI purposes before updating its policies. The platform only revised its privacy policy shortly after it had started feeding user data into its AI models.
In LinkedIn’s updated privacy policy, under “How We Use Your Data,” the company says that “We may use your personal data to improve, develop, and provide products and Services, develop and train artificial intelligence (AI) models, develop, provide, and personalize our Services, and gain insights with the help of AI, automated systems, and inferences, so that our Services can be more relevant and useful to you and others.”
On a help page for this setting, LinkedIn explains that opting out prevents your data from being used by the platform and its affiliates to train AI models. However, this does not affect training that has already taken place.
Even after opting out, you can still use generative AI features on LinkedIn, such as communicating with its chatbot. The only difference is that your personal data won’t be used to train or fine-tune those AI models.
Due to local privacy laws, LinkedIn does not collect or use the data of users in the EU, EEA, or Switzerland for training AI. Consequently, users in these regions will not have this setting.
To change this setting, users can go to Account -> Settings & Privacy -> Data Privacy -> Data for Generative AI Improvement. Then, simply toggle off the setting “Use my data for training content creation AI models.”
In similar recent stories, Google will face trial over allegations that it misled users about its data collection practices. It might meet the same fate as Meta, which was forced to pay Texas $1.4 billion for privacy violations involving Facebook users’ biometric data.