Anthropic users face a new choice – opt out or share your chats for AI training

Anthropic users are now faced with a new default setting for their chat data. The AI company has announced that, unless users actively opt out, their conversations will be used to train and improve Anthropic’s AI models.

This policy change shifts the onus onto individuals to manage their data preferences. Previously, user data was not used for training by default. Now, to maintain privacy, users must navigate their settings and explicitly prevent their conversations from contributing to future AI development. The move underscores the ongoing industry debate over data usage, AI advancement, and user control.
