Slack Faces Criticism For Using Private Data To Train AI Without Clear Opt-Out

Reading time: 2 min

Written by Andrea Miliani, Tech Writer
Fact-checked by Kate Richards, Content Manager

Last week, Slack users complained on the social news website Hacker News and on X about the company’s AI training practices, which use private data without asking for explicit permission and make it difficult to opt out. The discussions went viral, raising awareness and concern and, in turn, prompting Slack to update its AI privacy principles.

Customers quoted and criticized the company’s requirements for opting out of data sharing for AI training: “To opt out, please have your Org or Workspace Owners or Primary Owner contact our Customer Experience team at feedback@slack.com with your Workspace/Org URL and the subject line ‘Slack Global model opt-out request.’”

“They want to make it as difficult and painful as possible,” said one user. “My paid workspace opt-out confirmation just came through. One down. Several to go,” said another. Users voiced concerns about how their private information was being used and about whether they could still trust Slack.

Slack’s clarification in an opt-out email response shared on Hacker News, which stated that customer data is used only for “machine learning models for things like channel and emoji recommendations and search results,” sparked even more debate. “How can anyone in their right mind think building AI for emoji selection is a remotely good use of time?” wrote one user.

After the backlash, Slack shared an announcement about updates to its AI privacy principles in a document titled “Privacy Principles: Search, Learning and Artificial Intelligence.”

In the document, Slack acknowledged customers’ concerns and explained that it uses “industry-standard, privacy-protective machine learning techniques” and does not train large language models (LLMs) with customer data. Slack emphasized that its data collection serves machine learning models rather than generative AI models, and assured users that private information is not shared with third parties and cannot leak across workspaces.

Slack provided details and examples of how its machine learning models use aggregated data for features like search ranking, emphasizing that they “do not access original message content in DMs, private channels or public channels to make these suggestions.” The company added that it uses LLMs only in Slack AI, a separate add-on product that does not use customer data.

However, using customers’ data to train AI and machine learning models remains the default setting. The opt-out process users quoted and criticized is unchanged: customers who want out must email Slack and explicitly request that it stop using their workspace data.
