Amazon has been targeted in a new class action lawsuit under an Illinois biometrics privacy law, with plaintiffs asking a court to order the e-commerce giant to pay potentially enormous damages for allegedly allowing its Alexa A.I. program to record the voices of children and other users, as well as the voices of bystanders speaking nearby while someone else uses Alexa.
On June 27, attorneys with the firm of Taxman Pollock Murray & Bekkerman, of Chicago, and KamberLaw, of Chicago and Denver, filed a complaint in Cook County Circuit Court against Amazon. The complaint lists three named plaintiffs: Bennett Wilcosky; Michael Gunderson; and Gunderson’s three-year-old child, identified only as E.G.
The lawsuit centers on the Alexa artificial intelligence platform created by Amazon, and specifically, the Seattle-based company’s alleged practice of using Alexa to record the “voiceprints” of Alexa users and non-users, alike.
The Alexa A.I. performs many of the same functions as Apple’s Siri and the Google artificial intelligence programs, answering users’ queries and helping them find information. The complaint notes Alexa is installed on more than 147 million devices in use in the U.S. and around the world, including more than 47 million Amazon Echo devices, which are installed primarily in users’ homes.
The Alexa program is activated when a user asks Alexa a question, such as “Alexa, what is the weather forecast today?”
At that point, the class action complaint asserts, Alexa begins recording, and “keeps a recording of the user speaking those words, as well as any follow-up statements or questions, and the statements of any other persons who just so happen to be within recording distance of the Alexa device.”
While such recording is seemingly not illegal in most jurisdictions, the complaint asserts the practice violates the Illinois Biometric Information Privacy Act, which forbids the “collection, retention, capture or purchase of biometric identifiers,” including people’s speech and speech patterns, or their “voiceprint.”
The complaint asserts that, to record someone’s voice in Illinois, Amazon must first secure their express written authorization, as well as provide a range of disclosures, including information on how Amazon would retain and ultimately permanently destroy the recordings.
According to the complaint, Gunderson is the owner of an Echo device and a regular Alexa user, as is his child, E.G. However, they claim they were never given any of the allegedly required disclosures or the opportunity to consent to the recordings.
Wilcosky, by contrast, says he is not an Amazon Alexa user, yet claims his voice has been recorded by Alexa as a bystander speaking while others were using the device.
The complaint asks the court to certify three classes of plaintiffs, including:
Everyone in Illinois from 2014-2019 who “spoke in the vicinity of an Alexa device and were recorded by the Alexa devices and … for whom Amazon created and stored their voice recordings;”
Everyone in Illinois from 2014-2019 who “do not have registered Alexa Accounts and … spoke in the vicinity of an Alexa device and … for whom Amazon created and stored their voice recordings;” and
“All minors in Illinois who … spoke in the vicinity of an Alexa device and were recorded by the Alexa device … and for whom Amazon created and stored their voice recordings.”
The complaint does not estimate the potential number of plaintiffs, other than to say “there are at least thousands of individuals” who could be included in the class action.
The complaint asks the court to award damages of $1,000-$5,000 per violation. The complaint does not specify the definition of “violation,” but others have defined violations under the BIPA law to include each instance in which a person uses a program that captures their biometric information.
With potentially “thousands” of Alexa users in Illinois, total damage demands could easily run into the billions of dollars.
The class action lawsuit comes amid a blizzard of other class actions under the BIPA law against businesses of all types based in Illinois or operating in the state. Most of the actions, to date, have targeted employers who use so-called biometric timeclocks, which typically require workers to scan fingerprints when punching in and out of work shifts.
Others have targeted photo sharing services and social media companies, like Facebook, Shutterfly and Google, for tagging the “faceprints” of people in photos, without their consent.