Microsoft class action lawsuit overview:
- Who: Five Illinois residents filed a class action lawsuit against Microsoft Corp.
- Why: The plaintiffs allege Microsoft Teams illegally collects voice data in violation of the Illinois Biometric Information Privacy Act (BIPA).
- Where: The Microsoft class action lawsuit was filed in a Washington federal court.
A new Microsoft class action lawsuit accuses Microsoft Corp. of illegally collecting and analyzing the voice data of users of its Teams software without providing proper notice as required under Illinois law.
Plaintiffs Alex Basich, Kristin Bondlow and three others filed the class action complaint against Microsoft Feb. 5 in a Washington federal court, alleging violations of the Illinois Biometric Information Privacy Act.
The Microsoft class action lawsuit claims the software’s real-time transcription feature works by capturing speakers’ voices during online meetings and assessing qualities like pitch, tone and timbre to identify who said what.
While that alone isn’t illegal, Microsoft’s failure to inform users of how their voice data would be used violated BIPA, the Illinois residents claim.
The lawsuit argues that to legally obtain an Illinois user’s voiceprint, the company would first need to inform users how the data would be used and how long it would be stored. The user would then need to provide a written release allowing the collection, the plaintiffs claim.
Microsoft failed to inform users voice data was being collected, lawsuit claims
According to the lawsuit, the issue began with Microsoft Teams’ introduction of an automated transcription feature in 2021. The feature relied on a process called diarization, which differentiates voices based on their identifying characteristics and creates “individual speaker profiles in the form of voiceprints,” the plaintiffs say.
“Microsoft never informed Teams meeting participants that their biometrics, such as voiceprints, were being collected during Microsoft Teams Meetings,” the Microsoft class action says.
“Microsoft also failed to inform Teams meeting participants of the specific purpose for the collection or storage of their biometrics and failed to provide meeting participants with a schedule setting out the length of time which those biometrics would be collected, stored, used, and destroyed.”
The plaintiffs seek to represent a class of Microsoft Teams participants whose biometric information was captured by Microsoft’s transcription feature while they resided in Illinois, dating back to March 1, 2021.
The Microsoft Teams class action lawsuit demands actual damages or $1,000 per negligent violation, whichever amount is greater. If the BIPA violation was intentional or reckless, damages could rise to $5,000 per violation.
OpenAI and key investor Microsoft are also facing allegations, now consolidated in a federal multidistrict litigation, that the companies used the work of journalists and other copyrighted material to train the artificial intelligence program ChatGPT.
The plaintiffs are represented by Bradley S. Keller and Jofrey M. McWilliam of Byrnes Keller Cromwell LLP, Brian Levin and Nicholas Miranda of Levin Law P.A. and Jonathan D. Waisnor and James M. Fee of Labaton Keller Sucharow LLP.
The Microsoft BIPA class action lawsuit is Basich et al. v. Microsoft Corp., Case No. 2:26-cv-00422, in the U.S. District Court for the Western District of Washington.