
Customer Sues McDonald’s Over Voice Recognition Technology


A new lawsuit against McDonald’s accuses the company of violating an Illinois privacy act by using voice recognition technology on customers without their consent. Shannon Carpenter reportedly had his drive-thru order taken by the artificial intelligence-based voice assistant program Apprente — an automated system being tested at 10 McDonald’s stores — when he visited the fast-food chain’s Lombard, Illinois location last year. According to Carpenter’s attorneys, who are seeking class-action status for the lawsuit, the technology violates the state’s Biometric Information Privacy Act (BIPA).

It is one of many lawsuits that have invoked the Illinois law, enacted in 2008, to challenge companies' use of such technology. Six Flags Great America in Illinois agreed earlier this month to pay $36 million to settle a class-action lawsuit over its use of fingerprint scanners, and in February, the social media company TikTok agreed to pay $92 million in a class-action settlement over its collection of users' biometric information.

Such lawsuits are likely to become increasingly common as more companies embrace the new technologies amid rapidly changing regulations, said David Derigiotis, Corporate Senior Vice President, National Professional Liability Practice Leader, Burns & Wilcox, Detroit/Farmington Hills, Michigan.

“This will certainly become much more of a pressing issue for all sorts of organizations,” he said, citing states and individual municipalities with tightening privacy laws and, in some cases, outright bans on facial recognition.1 “The pandemic has accelerated digitization and the use of technology; companies had to evolve in order to survive. With the increasing privacy litigation, it makes for a lot of change and compliance that organizations need to get their arms around.”

Companies should also understand how Cyber and Privacy Liability Insurance could help mitigate these risks, both through cybersecurity resources and through coverage for potential lawsuit-related expenses such as legal defense, regulatory penalties, and settlement amounts.

“If there is a lawsuit, the insurance is going to provide the litigation cost for a company to defend itself, first and foremost,” said Ryan Ascenzo, Broker, Professional Liability, Burns & Wilcox Brokerage, New York, New York. “When you have hundreds of thousands of individuals affected, that is when you can run into very high defense costs and judgment awards.”

Insurance can provide resources, cover fines and settlements

Biometric information refers to biological characteristics that can digitally identify an individual, such as a fingerprint, voiceprint, retina scan, or gait pattern. The technology is used across many industries, from airlines to banks and retail outlets, for both public-facing and employment-related purposes.2,3 In a 2018 report, 62% of companies said they used some type of biometrics, and an additional 24% planned to start within two years.4 Consumer attitudes remain mixed: in a 2020 survey, 31% of U.S. respondents said they preferred biometric authentication over manually entering a password, while another 2020 survey found that about 70% of consumers were wary of in-store facial recognition in retail settings.5,6

“Customers are being more diligent to understand the risk of their voice or facial features being put out there,” said Danion Beckford, Underwriter, Professional Liability, Burns & Wilcox, Toronto, Ontario. “We all have our unique identifiers and we do not want to lose that or have that readily available to others. From the business perspective, companies want to make things seamless and efficient for clients, but there is always the chance that their information could be taken.”

In March, Facebook made headlines with what may be the largest biometrics settlement to date when it was ordered to pay $650 million for violating BIPA through the social media site’s use of facial recognition technology without customer consent.7 Under the settlement, 1.6 million users from Illinois would each be paid $345. Illinois is currently “leading the country” with the stringency of its biometrics law, Derigiotis said, and the statutory penalties alone range from $1,000 per negligent violation to $5,000 per intentional or reckless violation. These fines, which can be covered by Cyber and Privacy Liability Insurance, are multiplied by the number of customers affected. In some cases, Commercial General Liability (CGL) Insurance or Technology Errors & Omissions Insurance could also cover this type of loss.
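The per-violation penalty structure described above compounds quickly with class size. A minimal sketch of that arithmetic, using the $1,000 and $5,000 per-violation figures cited in the article (the function name and the 100,000-customer scenario are illustrative, not drawn from any actual case):

```python
def bipa_exposure(affected_individuals: int,
                  per_violation_low: int = 1_000,
                  per_violation_high: int = 5_000) -> tuple[int, int]:
    """Return the (low, high) statutory-damage range, assuming one
    violation per affected individual."""
    return (affected_individuals * per_violation_low,
            affected_individuals * per_violation_high)

# Hypothetical scenario: 100,000 affected customers
low, high = bipa_exposure(100_000)
print(f"${low:,} to ${high:,}")  # $100,000,000 to $500,000,000
```

Even a mid-sized customer base can produce nine-figure exposure, which is why class-action status is so consequential in these suits.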

“The Cyber and Privacy Liability Insurance will specifically outline breach of regulations or privacy law; it is specifically defined within the policy,” Derigiotis said. “It is something a company will certainly want to have in place. CGL Insurance could possibly respond, but those policies are not built for these types of lawsuits.”

In Canada, where biometric privacy laws have been under debate, Cyber and Privacy Liability Insurance is likewise the most important type of policy for helping companies weather biometrics-related risks, Beckford said.8 “It is always a matter of taking a look at the coverages and the specifics of the case, but the cyber coverage is what you will most likely be looking at,” he said.

Companies may face consequences for collecting data, even if no breach occurs

Recent increases in cyberattacks have been seen in both the U.S. and Canada.9,10 In the event of a data breach, Cyber and Privacy Liability Insurance can help with expenses such as forensic investigations, customer notification, and data restoration, Ascenzo explained. “There are a multitude of vendors and parties that the policy can help pay to help mitigate and remedy the situation as quickly as possible,” he said. “The more records that a company may have exposed, the more costly the lawsuit can be.”

However, companies should know that they could face litigation over the collection of biometric data even in the absence of a breach. Under the BIPA law in Illinois, no breach needs to have taken place for a company to be in violation of the act. “Even if the company collected the information and protected it, if they did not do it in a lawful manner, you have a private right of action,” Derigiotis said. “That is what makes the Illinois law very unique; consumers can bring a lawsuit whether they were harmed in a breach or not.”

Other benefits of a Cyber and Privacy Liability Insurance policy include risk management resources, which can help organizations better understand their local laws and regulations on data collection, and assistance setting up internal policies and privacy notices. “Before there is any type of lawsuit, a policyholder can take advantage of the resources to become more compliant, to educate themselves, and to have the necessary documentation in place to avoid these types of lawsuits to begin with,” Derigiotis said. “That is an important part of a Cyber and Privacy Liability Insurance policy.”

This could include the implementation of consent forms or signage notifying customers about data the business may collect, Beckford added. “Companies have to be very vigilant,” he said.

Employers face additional liability when using biometrics with staff

According to a report in CIO Review, the use of facial recognition technology in the workplace is expected to be more common in the “post-pandemic world.”11 Using this technology can give workers a contact-free way to clock in and out, for example, and could also prevent time clock “buddy punching” and help reduce time spent on administrative tasks. This type of technology is already having an impact on the construction industry, where some companies are using facial recognition for digital check-in and health screenings.12 During the pandemic, these contactless interactions made possible with biometrics “absolutely did increase safety,” Ascenzo pointed out.

Still, the increased use of biometric information by employers could mean new types of fraud may be on the horizon. “It creates another record that is susceptible to being compromised,” Ascenzo said. “It is only a matter of time before cybercriminals are able to manipulate that piece of information, which could be used to hack into a company’s system.”

In December 2020, human resources solutions company ADP agreed to pay $25 million to settle a class-action lawsuit claiming it violated the Illinois BIPA law by supporting companies that had employees clock in and out with fingerprint scanning.13


“I think we are going to see a lot more companies adopting this technology for clocking in and out moving forward,” Derigiotis said. “Employers need to understand what laws are in place right now while they are operating, and what could be passed in the future. There is a lot of legislation that is pending in the pipeline right now.”

When a company is sued by an employee over their use of biometrics, the employer’s Employment Practices Liability Insurance (EPLI) could respond to the lawsuit in addition to its Cyber and Privacy Liability Insurance. “Whether it is a customer, vendor, or their own employees, the Cyber and Privacy Liability Insurance will typically respond, because it involves data,” Derigiotis said.

The use of facial recognition for clock-punching could also lead to potential wage and hour lawsuits over when an employee technically “begins” work. “A lot of the wage and hour litigation sweeping the country has to do with when you are or are not officially working,” Ascenzo said. “When is the face being scanned to punch in: when you sit down at your computer, or when you are supposed to start work? Facial recognition may be a great way to allow employees to work remotely, but it may also open other doors for possible litigation.”

Just as they should with customers, companies should be mindful of obtaining consent before using biometric technology on staff, Beckford noted. “Just because you are working somewhere does not mean you have to put your voice or face out there. When you no longer work there, your sensitive information may still be stored,” Beckford said. “At the end of the day, you want to make sure that your information is secure. It needs to be taken seriously.”

Laws changing rapidly, penalties for non-compliance could be ‘in the millions’

Despite receiving less attention in the press, the privacy aspect of cybersecurity is just as crucial as protecting against cyberattacks and ransomware, Derigiotis said. “Most organizations and clients just think about hacking when it comes to Cyber and Privacy Liability Insurance, but it goes so much further than that,” he said. “Even if you do not have a data breach, you could be found operating out of compliance with appropriate data collection practices which can be costly. Did the organization inform the individual in writing that biometric information was being collected? Did the notice include the specific purpose and duration for which it is being collected and was written consent received from the individual?”


Companies can prioritize data privacy by engaging in regular risk assessments, hiring or working with privacy specialists, and keeping up with federal, state, and local regulations. Strong policies should be in place for any biometric data collection, and companies should thoroughly research potential vendors, their compliance, and whether they have insurance in place. “Things are changing so rapidly,” Derigiotis said. “In the absence of sweeping federal regulation, more privacy laws are being passed at the state level and statutes are being enacted by city governments. It is certainly a growing issue and one that everyone needs to pay attention to.”

Individuals must be notified that their data is being collected and safeguards must be in place to protect that data, Derigiotis explained. “Anytime a business will be collecting sensitive information, there are specific things they have to do to be in compliance,” he said. “Proper compliance can be dictated by federal or state laws, industry specific regulation and city councils as we have seen with facial recognition technology bans being implemented across the country.”


Cyber and Privacy Liability Insurance itself is a key component of risk management, Ascenzo said. Even if a company believes it has followed all relevant data collection guidelines, the cost to defend against a lawsuit can be substantial. “When you look at the costs to defend a cyber incident or an employment-related event — and then you add in the potential for damages — it is a very important risk management tool to have in place,” he said.

Beckford agreed: “We live in a time and world right now where information can be stolen with a single click. A stolen credit card can be replaced, but our actual voice and likeness is irreplaceable,” he said. “The information is out there, and technology is going to get better and better, so we just have to be smart to ensure that everybody’s information stays safe.”

 

Sources



1Ng, Alfred. “Portland, Oregon, passes toughest ban on facial recognition in US.” CNET, September 10, 2020.
2ACI. “Making the passenger journey touchless: Biometrics on the rise.” Airports Council International, February 17, 2021.
3Kingson, Jennifer A. “Biometrics invade banking and retail.” AXIOS, February 18, 2020.
4Spiceworks. “Spiceworks Study Reveals Nearly 90 Percent of Businesses Will Use Biometric Authentication Technology by 2020.” Spiceworks, March 12, 2018.
5Nash, Jim. “Fed up with passwords and bad onboardings, consumers consider biometrics.” Biometric Update, November 19, 2020.
6Security Magazine. “Are consumers comfortable with facial recognition? It depends says new study.” Security Magazine, October 2, 2020.
7Hatmaker, Taylor. “Facebook will pay $650 million to settle class action suit centered on Illinois privacy law.” TechCrunch, March 1, 2021.
8Burt, Chris. “Canadian Privacy Commissioner says facial recognition risks not addressed in proposed law.” Biometric Update, May 12, 2021.
9Myre, Greg. “As Cyberattacks Surge, Biden Is Seeking To Mount A Better Defense.” NPR, June 4, 2021.
10The Canadian Press. “Cyberattacks on Canadian businesses up since remote work increased: report.” Global News, May 12, 2021.
11CIOReview. “How Facial Recognition Technology Can Benefit in the Workplace.” CIO Review, June 24, 2021.
12Ward, David Brian. “Construction Technology is Shaping the Post-Pandemic Workplace.” Occupational Health & Safety, June 21, 2021.
13Bilyk, Jonathan. “ADP agrees to pay $25M to settle worker fingerprint scan class actions; Lawyers could get 36%.” Cook County Record, December 9, 2020.
