Welcome to The John Marshall Law Review


AIVIA: A STEP TOWARDS PROTECTING DATA PRIVACY OR A CONTINUATION OF THE PUSH FOR INDIVIDUALS TO TRADE THEIR PRIVACY RIGHTS FOR EMPLOYMENT?

By Gabrielle Neace on Thursday, February 20th, 2020

Would you like your most cringe-worthy moment recorded and stored indefinitely? Under a new Illinois law, if your interview is reviewed by Artificial Intelligence (“AI”) and you do not request that the video be deleted, a company can keep it forever.

In a seemingly ground-breaking data privacy measure, Illinois became the first state to enact a statute that regulates an employer’s use of AI to analyze job applicants.[1] In late 2019, Illinois passed HB2557, “the Artificial Intelligence Video Interview Act” (“AIVIA”), which became effective January 1, 2020.[2] While AIVIA articulates no legislative purpose, “[t]he legislation is meant to bring transparency to how applicant videos are evaluated.”[3] In advocating for AIVIA on the Illinois House floor, Rep. Jaime Andrade said “what the applicant does not know, is that his motion is being…is being basically decided if he’s a loser, a CEO, or a winner, or if he’s getting hired.”[4]

Employers utilize AI technology in various ways: to interview job applicants, to track the movement of employees, to monitor employees’ emails and phone calls, to assess employees’ emotions and happiness through facial recognition, and to monitor employees’ social media outside of work.[5] AIVIA seeks to regulate employers that request pre-hire interview videos from applicants, which are then analyzed by “interview bots.”[6] In these videos, applicants respond to preset questions from the employer and then submit the video to a company that analyzes it for the employer.[7] AI analyzes a candidate’s body language, facial expressions, vocal tone, language patterns, and emotions.[8] Then, “machine learning algorithms” are used to evaluate “a candidate’s work style, how they work with people, and general cognitive ability,”[9] and ultimately predict the candidate’s success at the potential job.[10] The companies that process these videos generally claim that “[r]ecruiters, hiring managers, and candidates gain time and convenience, reduce time to hire, and increase quality of hire by focusing on potential and fit.”[11]

Requirements under AIVIA

Prior to the passage of AIVIA, the Illinois Senate made a few adjustments to the employer requirements.[12] Currently, AIVIA applies to employers that use AI to analyze applicant-submitted videos for jobs “based in Illinois.”[13] Under AIVIA, an employer must meet three requirements prior to obtaining an applicant’s video interview submission:[14]

(1) Notify each applicant before the interview that artificial intelligence may be used to analyze the applicant’s video interview and consider the applicant’s fitness for the position.

(2) Provide each applicant with information before the interview explaining how the artificial intelligence works and what general types of characteristics it uses to evaluate applicants.

(3) Obtain, before the interview, consent from the applicant to be evaluated by the artificial intelligence program as described in the information provided. An employer may not use artificial intelligence to evaluate applicants who have not consented to the use of artificial intelligence analysis.[15]

There are two other restrictions on employers under AIVIA. First, an employer can only share applicant videos “with persons whose expertise or technology is[16] necessary in order to evaluate an applicant’s fitness for a position.”[17] Second, if an applicant requests that an employer destroy the applicant’s interview video, then the employer must comply within 30 days.[18] Under that requirement, the employer must also “instruct any other persons who received copies of the applicant video interviews to also delete the videos.”[19] These requirements also apply to “all electronically generated backup copies.”[20]

Bill Amendment and Vague Language

While AIVIA aims to protect applicants, its language is sparse and vague.[21] The amendment made to AIVIA before the bill passed the Illinois Senate illustrates this vagueness.[22] For example, the word “written” was removed from the notice and consent requirements, broadening the ways an employer can obtain consent.[23] An employer could comply with AIVIA by simply giving a verbal explanation of the AI just prior to the interview.[24] The term “artificial intelligence” is also undefined, leaving the scope of covered technology unclear. In fact, AIVIA contains no definitions section at all.

Section 15 of AIVIA originally required an employer, and those with a copy of the video, to erase an applicant’s video interview within “30 days after completing the hiring process.”[25] While the phrase “hiring process” was a bit ambiguous, at least the employer would have borne the onus of deleting the video. Now, the only way for an applicant to ensure that the collected data is destroyed is for the applicant to explicitly request that the employer delete the video.[26]

While a candidate is free to refuse an interview based on the use of AI, an employer is not required to offer an alternative interview method.[27] The lack of an alternative, together with the video destruction policy, raises a question of bargaining power: who would make this request of an entity that has the applicant’s future at its disposal? There is no recourse for an applicant who refuses to trade privacy rights for an employment opportunity.

There is also no protection under AIVIA, or elsewhere, against discrimination resulting from AI analytics.[28] Facial recognition technology is already criticized for its “struggle to identify and characterize the faces of people with darker skin,[29] women,[30] trans and non-binary people,”[31] and those with certain medical conditions.[32] The underlying concern is that implicit bias pervades the data AI uses to assess the fitness of candidates.[33]

Given these known pitfalls of AI, individuals should be protected against employment discrimination resulting from the use of AI analyses. An employer could then be held accountable if its use of AI analytics produces a discriminatory impact that adversely affects applicants. When AIVIA was briefly debated on the Illinois House floor, Rep. Mary Flowers expressed concern that the technology could be used discriminatorily and suggested that passing the bill gives employers consent to use it.[34] In opposing AIVIA for its failure to address discrimination, Rep. Flowers said that:

[by enacting this bill] we’re now telling the artificial intelligent group of people they don’t have to worry about trying to incorporate or integrate other nationalities. You’re giving them a plan. You’re giving them a way out…all they will have to say is, we’re complying with current law and the law is make a telephone call [asking for consent]. That’s it.[35]

Missing Crucial Details and the Interplay of BIPA

There is no indication as to whether AIVIA preempts any aspect of Illinois’ Biometric Information Privacy Act (“BIPA”) or whether it extends the requirements BIPA places on employers. Enacted in 2008, BIPA regulates private entities that collect biometric data from individuals by imposing strict requirements for the collection, sharing, retention, and destruction of such data.[36] AIVIA could be viewed as creating an exception to BIPA for employers using AI, or it could be considered an added requirement that employers provide more “transparency” when collecting applicants’ data. Scans of face geometry and voiceprints are included as “biometric identifiers” subject to the requirements of BIPA.[37] Given that the AI regulated under AIVIA analyzes applicants’ faces and voices, it should comply with BIPA’s detailed protections as well.

However, AIVIA conflicts with BIPA in two respects. First, BIPA requires an entity to destroy the data “within 3 years of the individual’s last interaction with the private entity[,]” whereas under AIVIA the applicant must request destruction.[38] Second, BIPA requires written consent from the individual whose data is collected; AIVIA does not.[39]

AIVIA is silent on enforcement and on any applicable penalties, such as liquidated damages. By contrast, under BIPA, an individual may bring an action against an entity for violating the statute.[40] Potential damages under BIPA include $1,000 per negligent violation, $5,000 per reckless or intentional violation, or actual damages if they are greater than the applicable liquidated damages amount.[41] Injunctive relief may also be awarded, which could prevent a company from continuing to collect or use data.[42] It is unclear what remedies will apply to AIVIA violations and whether those remedies will mirror BIPA’s.

AIVIA states only that it applies to Illinois-based positions, which leaves ambiguity as to whether it reaches out-of-state employers and Illinois residents or applies only to in-state Illinois employers.[43] When AIVIA lawsuits arise, if courts interpret AIVIA the same way they interpret BIPA, the statute may apply to out-of-state employers hiring for Illinois-based positions or hiring Illinois residents, including for remote positions. For instance, a federal court held that Facebook, a California-based company, was subject to BIPA because of its interactions with Illinois residents.[44] Additionally, a technical violation of AIVIA’s notice, consent, or authorized third-party disclosure requirements would likely violate BIPA as well.[45] A party would not have to show any additional harm.[46]

While AIVIA may improve transparency, it seems to miss the mark in furthering the protection of data privacy rights, especially as they relate to the workforce. AIVIA gives no guidance as to the extent of information an employer must provide in its notice to an applicant. Notably, the phrase “facial expressions” appears nowhere in the statute, even though facial recognition is one of the main functions of the AI technology employers are using.[47] AIVIA does not require any specific advance notice before the interview, nor does it give the applicant a minimum timeframe in which to decide whether to consent to the collection. The video-sharing allowance also raises an unanswered question: who is liable if an employer shares the video with a third party and that third party does not delete it upon request?


Rep. Andrade contends that the law is a work in progress.[48] However, BIPA has also been called a work in progress, as shown by the many attempts to amend it; yet BIPA has not been amended since its enactment in 2008, and the Illinois Supreme Court did not interpret it until 2019.[49] Hopefully, AIVIA will expand to protect applicants who refuse to consent, as Rep. Andrade intends.[50] Otherwise, AIVIA simply highlights the privacy rights that individuals must trade in exchange for employment.


[1] Jackson Lewis P.C., Illinois Leads the Way on AI Regulation in the Workplace, JDSupra (Oct. 25, 2019), www.jdsupra.com/legalnews/illinois-leads-the-way-on-ai-regulation-12617/.

[2] The Artificial Intelligence Video Interview Act, 820 ILCS 42/ (2019).

[3] Abdel Jiminez, The ‘Illinois Artificial Intelligence Video Interview Act’ is a Real Law. Here’s why it may be Coming to a job Application Near You, Herald & Rev. (Jan. 26, 2020), www.herald-review.com/news/state-and-regional/govt-and-politics/the-illinois-artificial-intelligence-video-interview-act-is-a-real/article_51fdad99-114f-5208-93d6-445a1ad45015.html.

[4] H.R. 101st Gen. Assemb., 32d Sess., at 34 (Ill. 2019) (statement of Rep. Andrade), available at http://www.ilga.gov/house/transcripts/htrans101/10100032.pdf.

[5] Richard A. Bales & Katherine V.W. Stone, The Invisible Web of Work: The Intertwining of A-I, Electronic Surveillance, and Labor Law, 41 Berkeley J. Emp. & Lab. L. 1, 4 (forthcoming 2020); Taylor Bragg, AI can detect if your employees are happy, THQ (July 31, 2018), https://techhq.com/2018/07/ai-can-detect-if-your-employees-are-happy/.

[6] See, e.g., Amy Elisa Jackson, Popular Companies Using AI to Interview & Hire You, Glassdoor (Jan. 1, 2019), www.glassdoor.com/blog/popular-companies-using-ai-to-interview-hire-you/ (attempting to put a positive spin on the use of AI technology and predicting an increased use of AI in hiring).

[7] Bales & Stone, supra note 5, at 11; Paul R. Daugherty & H. James Wilson, Human + Machine: Reimagining Work in the Age of AI 51 (2018).

[8] Evan Nadel and Natalie Prescott, Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, Legal Implications of Using AI, Biometrics, or Bots in the Workplace, Bloomberg Law (Nov. 2019), www.mintz.com/sites/default/files/media/documents/2019-12-03/LegalImplicationsofUsingAI-ECO29364.pdf.

[9] HireVue, The Most Comprehensive Solution For Evaluating Today’s Talent, www.hirevue.com/products/assessments (last visited Feb. 2, 2020).

[10] Bales & Stone, supra note 5, at 11.

[11] HireVue, The Market Leader In Structured Interviewing, www.hirevue.com/products/video-interviewing (last accessed Feb. 4, 2020).

[12] For the full text, see Amendment to H.R. 2557, 101st Gen. Assemb., Reg. Sess. (codified as amended at 820 ILCS 42/ (Ill. 2019)), available at  www.ilga.gov/legislation/fulltext.asp?DocName=10100HB2557sam001&GA=101&LegID=118664&SessionId=108&SpecSess=0&DocTypeId=HB&DocNum=2557&GAID=15&Session=.

[13] 820 ILCS 42/5.

[14] Id.

[15] Compare 820 ILCS 42/5 with Amendment to H.R. 2557, 101st Gen. Assemb., Reg. Sess. (Mar. 16, 2019). The following changes were made from the original: The phrase “facial expressions” was changed to “video interview” here. Id. Presumably, the employer can evade stating that the AI technology focuses on the applicant’s facial expressions by only stating that the video is reviewed. The phrase “information sheet” changed to “information,” which suggests that the legislature originally intended some type of written correspondence describing the AI to applicants. Id. The amendment also changed “what” to “what general types of.” The addition of the word “general” narrows the scope of characteristics that employers need to disclose to only a broad overview. Id.

[16] Id. Here, “or technology” was added before the word “is,” which presumably allows the video to be shared with persons who use technology to process the videos.

[17] 820 ILCS 42/10.

[18] 820 ILCS 42/15.

[19] Id.

[20] Id.

[21] AIVIA is a little more than 250 words.

[22] Amendment to H.R. 2557, 101st Gen. Assemb., Reg. Sess. (codified as amended at 820 ILCS 42/ (Ill. 2019)).

[23] Id.

[24] H.R. 101st Gen. Assemb., 32d Sess., at 36 (Ill. 2019) (statement of Rep. Andrade).

[25] For the original wording of AIVIA see H.R. 2557, 101st Gen. Assemb., Reg. Sess. (Ill. 2019)(as originally introduced by Rep. Andrade Feb. 13, 2019), available at www.ilga.gov/legislation/101/HB/10100HB2557.htm (“Section 15. Destruction of videos. No later than 30 days after completing the hiring process for any position, employers and any other persons who received copies of applicant video interviews as provided in this Act must erase all applicant videos including any electronically generated backup copies.”).

[26] 820 ILCS 42/15.

[27] Jiminez, supra note 3.

[28] See Morgan Klaus Scheuerman, Jacob M. Paul, & Jed R. Brubaker, How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis and Image Labeling Services, 3 Proc. of the ACM on Hum.-Computer Interaction 1, 25 (Nov. 2019), available at https://doi.org/10.1145/3359246 (noting that policy makers should consider expanding the protection of certain classes against discrimination to facial analysis technologies).

[29] Lauren Katz, “I Sold my Face to Google for $5”: Why Google’s Attempt to Make Facial Recognition Tech More Inclusive Failed, Vox (Oct. 17, 2019), www.vox.com/recode/2019/10/17/20917285/google-pixel-4-facial-recognition-tech-black-people-reset-podcast.

[30] Nadra Nittle, Amazon’s Facial Analysis Tech Often Mistakes Dark-Skinned Women for men, Study Shows, Vox (Jan. 28, 2019), www.vox.com/the-goods/2019/1/28/18201204/amazon-facial-recognition-dark-skinned-women-mit-study.

[31] Scheuerman, Paul, & Brubaker, supra note 28; see also Amrita Khalid, Facial recognition AI can’t Identify Trans and Non-Binary People, Quartz (Oct. 16, 2019), www.qz.com/1726806/facial-recognition-ai-from-amazon-microsoft-and-ibm-misidentifies-trans-and-non-binary-people/.

[32] Rebecca Heilweil, Illinois Says you Should Know if AI is Grading Your Online Job Interviews, Vox (Jan. 1, 2020, 9:50 AM), www.vox.com/recode/2020/1/1/21043000/artificial-intelligence-job-applications-illinios-video-interivew-act; see also Matthew Jedreski, Jeffrey S. Bosley, and K.C. Halm, Illinois Becomes First State to Regulate Employers’ use of Artificial Intelligence to Evaluate Video Interviews, Davis Wright Tremaine LLP (Sept. 3, 2019), www.dwt.com/blogs/artificial-intelligence-law-advisor/2019/09/illinois-becomes-first-state-to-regulate-employers; see also Steve Lohr, Facial Recognition is Accurate, if You’re a White Guy, N.Y. Times (Feb. 9, 2018), www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html; see also Tom Simonite, The Best Algorithms Struggle to Recognize Black Faces Equally, Wired (July 22, 2019), www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/.

[33] Gerard Stegmaier, Stephanie Wilson, Alexis Cocco, & Jim Barbuto, Illinois Takes First Step to Combat Bias in Hiring Decisions With Artificial Intelligence Video Interview Act, Reed Smith (Jan. 21, 2020), www.employmentlawwatch.com/2020/01/articles/employment-us/illinois-takes-first-step-to-combat-bias-in-hiring-decisions-with-artificial-intelligence-video-interview-act/; see also Nicole Martinez-Martin, What Are Important Ethical Implications of Using Facial Recognition Technology in Health Care?, 21 AMA J. of Ethics 180, 181-82 (Feb. 2019), available at https://journalofethics.ama-assn.org/article/what-are-important-ethical-implications-using-facial-recognition-technology-health-care/2019-02 (describing the racial bias in using facial recognition to diagnose medical conditions); for a more detailed discussion of how AI creates bias, see Bales & Stone, supra note 5, at 20-23.

[34] Patrick Thibodeau, A backlash emerges over automated interviewing, TechTarget (Oct. 17, 2019), https://searchhrsoftware.techtarget.com/news/252472497/A-backlash-emerges-over-automated-interviewing.

[35] H.R. 101st Gen. Assemb., 32d Sess., at 39 (Ill. 2019) (statement of Rep. Flowers).

[36] Biometric Information Privacy Act, 740 ILCS 14/15 (2008).

[37] 740 ILCS 14/10.

[38] 740 ILCS 14/15.

[39] 740 ILCS 14/10.

[40] 740 ILCS 14/20.

[41] Id.

[42] Id.

[43] Nadel & Prescott, supra note 8.

[44] In re Facebook Biometric Info. Privacy Litig., 2018 U.S. Dist. LEXIS 81044, at *3-5, 14 (N.D. Cal. May 14, 2018).

[45] See Rosenbach v. Six Flags Entm’t Corp., 129 N.E.3d 1197, 1207 (Ill. 2019) (holding that “an individual need not allege some actual injury or adverse effect, beyond violation of his or her rights under the Act, in order to qualify as an ‘aggrieved’ person and be entitled to seek liquidated damages and injunctive relief pursuant to the Act.”).

[46] Id.

[47] Bales & Stone, supra note 5, at 11.

[48] Jiminez, supra note 3.

[49] Rosenbach, 129 N.E.3d at 1197.

[50] Jiminez, supra note 3.